
certainty committed 72c4bc0

fineshed up

  • Parent commits 3cc8b29


Files changed (1)

File _posts/2013-03-27-i_did_missbehave.md

 layout: post
 title: I did mis(s)behave
 tagline: Lessons learned from a failed project
-datestring: 2013-03-27
+datestring: 2013-06-19
 code:
   url: http://bitbucket.org/certainty/missbehave
   caption: missbehave
 Also, the CHICKEN CI expects the tests to work in a certain way, and without jumping through some hoops it was not possible to run missbehave in the
 context of [salmonella](http://tests.call-cc.org/). I added a way to do that later, as the following example shows:
 
-~~~
+~~~ clojure
 (use missbehave missbehave-matchers missbehave-stubs srfi-1)
 
 (run-specification
 Indeed, some of these things will make it into a new library that intends to honor the language more. It's a work in progress, but
 if you're curious you can take a peek at [veritas](https://bitbucket.org/certainty/veritas).
 
-#### The matcher abstraction
+#### 1) The matcher abstraction
 
 Missbehave introduced an abstraction called a matcher, which was used to verify expectations. A matcher, in missbehave, is
 a higher-order function that knows how to verify the behavior of the subject that is passed to it.
 This will automatically generate a message that uses the expression itself.
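 
 To make that concrete, here is a rough sketch of the idea; the names are made up for illustration and are not missbehave's actual API:
 
 ~~~ clojure
 ;; Sketch only: hypothetical names, not missbehave's real API.
 ;; A matcher is a higher-order function: given its parameters it
 ;; returns a procedure that verifies a subject.
 (define (greater-than expected)
   (lambda (subject)
     (> subject expected)))
 
 ;; a tiny expectation runner that applies a matcher to a subject
 (define (expect subject matcher)
   (if (matcher subject) 'pass 'fail))
 
 (expect 10 (greater-than 5)) ; => pass
 ~~~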
 
 
-#### Meta information and filters
+#### 2) Meta information and filters
 
 The library provided a way to attach meta-data to examples and contexts. The user could then use filters to run only examples that
 have corresponding meta-data. This is a valuable feature, as it gives you fine-grained control over which tests are run.
 So that's completely orthogonal to the notion and syntax of contexts and examples. Also, I want meta-data to compose in the way that
 nested meta-data "adds up", so that the innermost expression holds the union of all meta-data surrounding it.
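 
 A minimal sketch of that composition rule (a hypothetical helper, not the library's API), with meta-data represented as association lists where inner entries win over outer ones:
 
 ~~~ clojure
 (use srfi-1) ; for filter (CHICKEN 4)
 
 ;; Sketch only: merge outer and inner meta-data so that the inner
 ;; context holds the union, with inner entries overriding outer ones.
 (define (merge-meta outer inner)
   (append inner
           (filter (lambda (entry)
                     (not (assq (car entry) inner)))
                   outer)))
 
 (merge-meta '((speed . slow) (db . #t))
             '((speed . fast)))
 ;; => ((speed . fast) (db . #t))
 ~~~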
 
-#### Pending tests
+#### 3) Pending tests
 
 Pending tests are extremely valuable, and I don't quite understand why they are not supported by the test egg, or at least not directly.
 As the name suggests, you can temporarily disable the execution of tests by marking them pending. The point is that these tests aren't run,
 
 ### What now?
 
-As I wrote before, I have learned from my failures and work on a testing library that incorporates the good parts and throws away the bad parts.
-This library will be called veritas and is a work in progress. It will further more encourage the use of quick-check like
-automated value generators as well as using the REPL as a host to run tests interactively. To support this style of testing
-I currently work on a library that provides combinators to generate data. You can have a look here both projects here:
+Another framework? Yes, that's what I'm working on. I believe that diversity is a good thing, and having the choice
+between different tools for the same task is healthy. What I'm aiming for is a library that:
 
+1. embraces the host language
+2. focuses on value and state verification
+3. works nicely in the REPL
+4. is small, composable and works well
+5. fits in the existing infrastructure
+6. incorporates some ideas from missbehave
+7. enables quick-check like tests
+
+You can have a look at two projects in that direction:
 * [veritas](https://bitbucket.org/certainty/veritas)
 * [data-generators](https://bitbucket.org/certainty/data-generators)
 
+I'll blog about them once there is something to say.
+
 ### Wrap up
 
 I hope you enjoyed this little journey through all my failures. It has certainly been a pleasure for me and a healthy way to look at the "monster" I've made.
 I'm sure there is still much for me to learn and I'm open to it. I want to thank all the helpful people who provided valuable feedback for this post
-and for missbehave. I for one will continue to improve, which means I will continue to fail. Promised! ;)
+and for missbehave.
+
+<strong>I for one will continue to improve, which means I will continue to fail. Promised! ;)</strong>