Home
Welcome to the HaBench wiki!
At BelHac, we started the effort of constructing the framework. The basic idea is a framework driven by a configuration file that determines which benchmarks make it into a specific instance of the benchmark suite, making it easy to evolve suites. We will rely on cabal and Hackage to determine which libraries, tools and other dependencies are required to construct such an instance. Each benchmark is in turn described by its own cabal file, stating exactly which versions are required to build it. The actual build then happens in a sandbox, so that the right versions are compiled with the targeted compiler.
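To make the idea concrete, a per-benchmark cabal file along these lines could pin the exact build environment. This is only a sketch: the package name, module, and version bounds below are hypothetical, not actual HaBench files.

```cabal
-- Hypothetical benchmark description for illustration only.
-- Package name, dependencies and version numbers are made up.
name:               habench-example
version:            0.1
build-type:         Simple
cabal-version:      >= 1.10

executable habench-example
  main-is:          Main.hs
  -- Exact (==) constraints pin the dependency versions, so a
  -- sandboxed build reproduces the same environment everywhere.
  build-depends:    base == 4.3.1.0,
                    array == 0.3.0.2
  default-language: Haskell2010
```

The suite configuration file would then list which of these benchmark packages to pull in, and the sandboxed build (e.g. with a tool such as cabal-dev, as an assumption about tooling) keeps each instance's dependencies isolated from the globally installed packages.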
Reboot statement
As wikis (especially mine) tend to become unstructured very quickly, this one will likely follow the same pattern. But that need not mean we leave it unused.
A long time ago, in computer science time, we (Kenneth Hoste and I) proposed to start a new Haskell benchmark suite. The proof of this (smallish) effort can be found at the Haskell wiki.
Since then, little has happened. nofib is still around and heavily used. Many experiments (no offense to the experimenters; there is little else to use, after all) rely on small kernels or microbenchmarks.
I decided to reboot the HaBench effort, this time aiming to carry it through. The following mails on the Haskell-cafe list document what happened:
- original email
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76419
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76448
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76450
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76451
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76452
- http://article.gmane.org/gmane.comp.lang.haskell.cafe/76454
If you have benchmarks that you would consider candidates for the suite (at this point pretty much anything goes, as long as it compiles and runs without hassle), let us know.