
Friday, July 20, 2012

Grunt watch and the Node.js require cache revisited

Still inspired by James Shore's series "Let's Code: Test-Driven JavaScript", I've been continuing with my endeavors to get the Grunt watch stuff working flawlessly(?).

In my last post I mentioned some niggles that were remaining from my previous workaround.
  • The workaround only addresses the Mongoose issue
  • The workaround assumes intimate knowledge of Mongoose
  • Grunt watch still explodes silently when unhandled errors are encountered in tests
    • undefined references
    • nonexistent requires
    • etc.
The good news is that I think I have addressed all of these. In addition to that I've figured out some stuff about how to extend grunt and how to manipulate the Node.js require cache.

First off I thought I'd take a look at Mocha to see if it handled things better. After all Mocha also has a watch function.
  • Mocha watch does not explode on undefined references (which is nice)
  • Mocha watch does still explode on nonexistent requires (actually I didn't find this out till much later on when integrating with grunt)
  • Mocha watch still failed to handle my Mongoose issue
  • Unfortunately Mocha watch doesn't integrate with JSHint and actually I'd quite like to lint my code on file changes too
So, despite only having a small advantage in not falling over so much, I thought Mocha showed more promise than NodeUnit, and as James noted it is much more active on GitHub. In fact it's under the same banner as Express and Jade, which are definitely very popular and well maintained frameworks for Node.js.

The next thing was to integrate Mocha with Grunt so that I can use the Grunt watch function to both lint and run tests on file changes.

The nice thing about writing my own task to run Mocha instead of NodeUnit is that it was then quite easy to fix the issue of exploding on nonexistent requires... It just needed a try/catch around the mocha.run call. In retrospect I could probably have added this to the existing NodeUnit task but by the time I got to this point, I'd already ported all my tests to Mocha.
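The core of that fix is only a few lines. Roughly like this (a sketch - mocha here is a programmatic Mocha instance and done comes from Grunt's this.async(); the full task appears further down):

try {
  mocha.run(function (failures) {
    done(failures === 0); // report pass/fail back to Grunt
  });
} catch (e) {
  grunt.log.error(e.message); // a bad require no longer kills the watch silently
  done(false);
}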

[A short interlude on Mocha and Should...]

James noted in his videos that Mocha is targeted as a BDD test framework and as such he is not so keen on its verbosity. I can see what he means but, to be honest, I don't find it much of an issue and in fact quite like it, so for a while at least, I think I'll stick with it.

I also tried the should.js assertion library, which provides an interesting take on asserts by making them read a bit more like natural language. Things like: thing.should.have.property(things).with.length(5);

On first take I thought cool and went full steam ahead in making all my asserts like this. Currently though I'm not sure I like it.

For one, I keep thinking that I should be able to write something in a natural way but find that it's not really supported - it kinda feels like I'm being teased. This will lessen I guess as I really learn the idioms.

A more annoying problem, though, is related to the way JavaScript handles types and comparisons. I keep finding comparisons that I think should work and don't, and then comparisons that I think shouldn't work and do! I think this is made worse by hiding the comparisons inside assert functions. As a result I'm starting to come to the opinion that not only is the should framework more trouble than it's worth, but in fact any assert framework that hides comparison logic is not such a good idea to use in JavaScript tests. This includes very standard things like: assert.equal(object1, object2);
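A couple of examples of what I mean, using Node's built-in assert module (my own illustrations, not code from the project):

var assert = require('assert');

assert.equal(1, '1'); // passes - equal uses ==, so the string is coerced
assert.equal(0, ''); // also passes, for the same reason
assert.deepEqual({ a: 1 }, { a: '1' }); // passes - deepEqual is loose about types too

// while two distinct objects are never ==, however similar they look
assert.throws(function () {
  assert.equal({}, {});
});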

I may revert to just a single check function that will better reflect how comparisons would actually be written in production code, i.e.: assert(conditionalCodeThatResolvesToTrueOrFalse);
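Something along these lines, in other words (a sketch with a made-up object):

var assert = require('assert');

var user = { name: 'Bob', friends: ['Alice', 'Carol'] }; // stand-in for a real model instance

// the comparisons read exactly as they would in production code,
// so there is nothing hidden to be surprised by
assert(user.name === 'Bob');
assert(user.friends.length === 2);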

[...interlude over]

So there I have it: I can now run my tests as files change and rely on the watch task to keep going no matter what happens (so far!). Just the Mongoose problems to resolve then, and actually I added another.
  • If a unit test beforeEach function falls over then the after functions are not run
    • This means that, as I open a database connection in before and close it in after, once I get such an error I then continue to get failures on every file change because the database can't be opened anymore (it's already open) - see the sketch after this list
    • Not as serious as the silent failures as at least the watch process keeps pinging me and I can restart it. But still a little annoying
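For reference, the pattern in question looks roughly like this (a sketch - the suite, model and connection string are made up):

var assert = require('assert');
var mongoose = require('mongoose');
var User = require('../lib/user');

describe('User', function () {

  before(function (done) {
    // if anything in the suite setup blows up, after() never gets called...
    mongoose.connect('mongodb://localhost/5live-test', done);
  });

  after(function (done) {
    // ...so this doesn't run, the connection stays open and the next
    // watch-triggered run fails because it cannot connect again
    mongoose.disconnect(done);
  });

  it('greets by name', function () {
    var user = new User({ name: 'Bob' });
    assert(user.greeting() === 'Hello there, Bob');
  });
});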
This new issue got me thinking again about the require cache. My previous investigations here had proven fruitless, but then perhaps I had been led astray by some dubious comments on StackOverflow. Beware, this code does not work:

for (var key in Object.keys(require.cache)) {delete require.cache[key];}

(for...in over the array returned by Object.keys iterates over its indices - "0", "1" and so on - not the cached module paths, so nothing actually gets removed from the cache.)

So now I was thinking about the Mongoose module.
  • The problem isn't that the changed module is still in cache
  • The problem is that the Mongoose module is still in cache
  • In fact the problem is that any modules are still in cache
  • I must clear the cache completely before running my tests!
    • Actually I had tried this and it didn't seem to work
    • However I had tried it in my tests themselves, now I could try it in my new grunt task :)
      • I had already needed to add code that dropped all my own files from cache to make things work. It made sense to drop the rest too, now that I come to think about it.
So I fixed the code above:

for (var key in require.cache) {delete require.cache[key];}

I tidied up my Mocha task, adding support for options, and this is what I have in a new module...
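In outline it's something like this (a sketch rather than the exact code - it assumes the Grunt 0.3-era API, Mocha driven programmatically, and a task registered as test so that it replaces the built-in NodeUnit one; the files and options config names are my own choices):

var Mocha = require('mocha');

module.exports = function (grunt) {

  grunt.registerTask('test', 'Run tests with Mocha (replaces the built-in NodeUnit task).', function () {
    var done = this.async();
    var config = grunt.config('mocha') || {};
    var files = grunt.file.expandFiles(config.files || 'test/**/*.js');

    // clear the whole require cache so changed files, Mongoose and
    // everything else get loaded fresh on every run
    for (var key in require.cache) { delete require.cache[key]; }

    var mocha = new Mocha(config.options || {});
    files.forEach(function (file) { mocha.addFile(file); });

    try {
      mocha.run(function (failures) {
        done(failures === 0);
      });
    } catch (e) {
      // a nonexistent require used to kill the watch silently - now it just fails the task
      grunt.log.error(e.message);
      done(false);
    }
  });
};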


To use this I dropped it in a grunt tasks directory and updated my grunt.js file...
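The grunt.js ends up looking something like this (again a sketch; the file lists and Mocha options are only examples):

module.exports = function (grunt) {

  grunt.initConfig({
    lint: {
      files: ['grunt.js', 'lib/**/*.js', 'test/**/*.js']
    },
    mocha: {
      files: 'test/**/*.js',
      options: { reporter: 'dot' }
    },
    watch: {
      files: ['grunt.js', 'lib/**/*.js', 'test/**/*.js'],
      tasks: 'lint test'
    }
  });

  // load everything in the tasks directory, including the new Mocha-based test task
  grunt.loadTasks('tasks');

  grunt.registerTask('default', 'lint test');
};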


Note that the call to loadTasks takes the directory name. Also note that I overrode the built-in NodeUnit test task and that the options to pass into Mocha are given in the mocha config property.

So that's it: I no longer have to use my Mongoose workaround, as the Mongoose module is cleaned up along with everything else before I run the tests :)

I hope this will save me from similar gotchas in other modules too, but I guess I'll just have to code and find out :D

Wednesday, July 18, 2012

NodeUnit, Mongoose and Grunt watch

Edit: Although interesting to me as a history of my Node.js testing issues, this article is now pretty much superseded by this one, which better addresses all of the problems below

Now that James Shore's "Test-Driven JavaScript" series has kicked off I've been integrating unit tests into the 5Live hangout project. This has, for the most part, been simpler than I expected. NodeUnit is pretty easy to use, and from one of the comment threads I have been introduced to Grunt, which allows me to tie all my lint tasks and unit tests into a single automated script (James has been doing this himself in Jake but I figured I would give Grunt a try as it does some of the 'grunt' work for me :)).

Like I said, for the most part this has all been going swimmingly. One of the nice features of Grunt that I discovered is the watch task. This allows me to watch a list of files and when they change, automagically kick off my lint tasks and unit tests - very nice :D

There are some problems though. My application uses Mongoose to interact with a MongoDB database. As such I follow the standard mongoose pattern of using a singleton and defining my model schemas like this...
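Something like this (a reconstruction - the model and method names are invented to match the rest of the post):

// lib/user.js
var mongoose = require('mongoose');

var userSchema = new mongoose.Schema({
  name: String
});

// instance methods are added via Schema.methods in the standard pattern
userSchema.methods.greeting = function () {
  return 'Hello ' + this.name;
};

module.exports = mongoose.model('User', userSchema);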



As I'm doing TDD I actually start off with something like this in a separate test file...
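Again a reconstruction, but the NodeUnit test is along these lines:

// test/user.test.js
var User = require('../lib/user');

exports['greeting should greet by name'] = function (test) {
  var user = new User({ name: 'Bob' });
  test.equal(user.greeting(), 'Hello Bob');
  test.done();
};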



That's all hunkydory. I can run the tests and they pass. I can kick off grunt watch and leave it running while I start editing my files. Let's see what happens when I change my test, thusly...
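Say, by expecting a slightly different greeting (the exact wording here is made up):

exports['greeting should greet by name'] = function (test) {
  var user = new User({ name: 'Bob' });
  test.equal(user.greeting(), 'Hello there, Bob'); // the expectation has changed
  test.done();
};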



As expected grunt watch pings me to let me know that my test has failed :)

So I go back to my model and update the greeting function...
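Something like this, to match the new expectation:

userSchema.methods.greeting = function () {
  return 'Hello there, ' + this.name; // updated to match the changed test
};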



Gah, grunt watch pings me again to say that my test still fails. Puzzlement abounds!

If I stop grunt watch and run the tests manually they pass! So what's going on?

Well I wasted a lot of time messing around with the require.cache object, as I figured it was something to do with Node.js module caching, but that wasn't it at all. Either NodeUnit or Grunt is smart enough to remove the changed files from the module cache (I think it must be Grunt that does this but I didn't check).

Eventually I realised that it was the mongoose singleton that was causing the problem. After all, this only happened with my mongoose model tests. As the mongoose singleton persists between test runs, it doesn't matter that I change the methods on my models; the old versions also persist.

Again I tried a number of workarounds but so far the best seems to be the following.

First I created a wrapper for the mongoose singleton which allows me to reset the schemas...
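It was along these lines (a reconstruction - it pokes at Mongoose's internal model and schema caches, and those property names have changed between Mongoose versions, so treat them as illustrative):

// test/mongooseWrapper.js
var mongoose = require('mongoose');

// expose the usual singleton plus a reset() that empties Mongoose's internal
// caches, so models and schemas can be registered again from scratch
mongoose.reset = function () {
  Object.keys(mongoose.models).forEach(function (name) {
    delete mongoose.models[name];
  });
  Object.keys(mongoose.modelSchemas).forEach(function (name) {
    delete mongoose.modelSchemas[name];
  });
};

module.exports = mongoose;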



Next I integrated this wrapper into my tests (only the tests, I still use the mongoose singleton directly in the model implementations)...
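Roughly like this, at the top of each model test file (again a sketch):

// test/user.test.js
var mongoose = require('./mongooseWrapper');

// reset Mongoose before requiring the model under test, so the freshly
// edited schema and methods are the ones that get registered
mongoose.reset();
var User = require('../lib/user');

exports['greeting should greet by name'] = function (test) {
  var user = new User({ name: 'Bob' });
  test.equal(user.greeting(), 'Hello there, Bob');
  test.done();
};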



So why do I prefer this solution and what else did I try?

Well I also had another workaround which fixed the problems with method updates.
  • Instead of using Schema.methods to assign methods I used Model.prototype
  • Instead of using Schema.statics to assign static methods I just assigned them to the Model directly (sketched below)
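That alternative looked roughly like this (a sketch; findByName is just an example static):

var mongoose = require('mongoose');

var userSchema = new mongoose.Schema({ name: String });
var User = mongoose.model('User', userSchema);

// instance methods on the model's prototype instead of Schema.methods...
User.prototype.greeting = function () {
  return 'Hello there, ' + this.name;
};

// ...and statics assigned straight onto the model instead of Schema.statics
User.findByName = function (name, callback) {
  return User.findOne({ name: name }, callback);
};

module.exports = User;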
Why didn't I like this?
  • This solution meant a little rejigging of the code to what seemed like a non-standard pattern in the actual implementations
  • This did not fix a similar problem with updating the actual Schema - ie. adding or removing fields
I still don't much like my eventual workaround as...
  • it depends on knowledge of the internals of Mongoose (which might change)
But at least it's contained in my tests and seems to work for all changes in the model.

However, even with this workaround in place I'm still not fully happy with the way grunt watch works.
  • Annoyingly, it exits on a number of test failures, particularly when things are not yet defined. This happens a lot when doing TDD; it's how we write failing tests.
    • When it does exit it doesn't actually ping me. As such I have to keep looking to see if it stopped (if I have to do this all the time, it occurs to me that I may as well not use it and instead run my tests manually)
  • I'm now just waiting for the next gotcha as I only worked around a problem with Mongoose.
    • It seems to me quite likely that there will be other libraries that follow similar internal patterns and they are likely to trip me up in the same way
I have a solution to suggest though...
  • Spawn a new Node.js process, at least for every grunt watch event if not for every NodeUnit test (rough sketch below)
    • Wouldn't this fix the problem once and for all?
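Something like this, perhaps (a very rough sketch using child_process, assuming NodeUnit's command-line script is at its usual path under node_modules - I haven't actually wired this into Grunt yet):

var spawn = require('child_process').spawn;

// run the whole suite in a throwaway Node.js process, so no require cache,
// Mongoose singleton or anything else survives from one run to the next
function runTestsInFreshProcess(done) {
  var child = spawn('node', ['node_modules/.bin/nodeunit', 'test'], {
    stdio: 'inherit'
  });
  child.on('exit', function (code) {
    done(code === 0);
  });
}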