This is part II of my post on experiences from working with a Node.js and MongoDB solution. Read part I to learn more about OS support, Node.js guidelines, services, tools and more.

What web frameworks and Node.js modules to use

A lot of web frameworks are available for Node.js. Express is the most commonly used, but Restify, Geddy, Tako, Flatiron and others may also be viable options. Spend some time researching them, and choose what’s right for your needs. Personally, I’m not too keen on frameworks that focus heavily on building models. With MongoDB and JSV in place I find such models redundant, but this may be because we’re building a REST API.
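To give an idea of how little ceremony Express requires, here is a minimal sketch of a JSON endpoint; the route and response are illustrative, not taken from our actual API:

```
// Minimal Express sketch of a JSON REST endpoint (illustrative only).
var express = require('express');
var app = express();

app.get('/orders/:id', function (req, res) {
  // In a real handler the order would be looked up in MongoDB.
  res.json({ id: req.params.id, status: 'open' });
});

app.listen(3000);
```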

Through the Node Package Manager (npm) you can get a list of the most commonly used Node.js modules. When using others’ modules, it’s a good idea to select one that is high on that list and has a lot of activity on GitHub.

Originally, we used Jasmine for our tests since this is what we use for our client-side JavaScript. We’ve since switched to Mocha, however, partly because the asynchronous nature of Node.js code makes tests much easier to write in Mocha, and partly because Mocha, unlike Jasmine, gave us code coverage reports.
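To illustrate the asynchronous part: Mocha passes a done callback into each test, so the test only finishes when the callback is invoked. The test below is a sketch, and fetchOrder is a hypothetical helper used purely for illustration:

```
// Mocha signals test completion via the done callback.
var assert = require('assert');

describe('fetchOrder', function () {
  it('returns the requested order', function (done) {
    // fetchOrder is a hypothetical async helper, not part of our codebase.
    fetchOrder('42', function (err, order) {
      assert.ifError(err);
      assert.equal(order.id, '42');
      done(); // tells Mocha the asynchronous test has finished
    });
  });
});
```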

Our choice of JSV for JSON schema validation in the Node.js code has been extremely successful and is highly recommended. We use JSV to verify data sent to our API, and only indirectly to verify parts of the data stored in MongoDB. For our API documentation, we’re generating an example of the kind of JSON you can send to us based on the JSV schema.
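As a rough sketch of how the validation looks (the schema and payload here are illustrative, and assume JSV’s default draft-03 environment):

```
// Validating an incoming API payload against a JSON schema with JSV.
var JSV = require('JSV').JSV;
var env = JSV.createEnvironment();

var schema = {
  type: 'object',
  properties: {
    email: { type: 'string', required: true },
    age:   { type: 'integer', minimum: 0 }
  }
};

var report = env.validate({ email: 'jane@example.com', age: 34 }, schema);
if (report.errors.length === 0) {
  // the payload conforms to the schema
}
```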

MongoDB

It’s easy to develop with MongoDB. You can just start writing data, without any kind of initial configuration – and there is no DDL. Later, you can add indexes to enhance performance.
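For example, in the mongo shell (the collection and fields are made up for illustration):

```
// No schema or DDL is needed before the first write.
db.orders.insert({ customer: 'ACME', total: 120.5, status: 'open' });

// Later, add an index to speed up common queries.
db.orders.ensureIndex({ customer: 1 });
```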

Remember to protect against JSON injection and mass assignment attacks.
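One simple defence is to whitelist and type-cast the fields a client is allowed to set before anything is written to MongoDB. The sketch below uses made-up field names:

```
// Copy only whitelisted fields from the posted JSON, so extra fields
// such as { isAdmin: true } never reach the database. Casting to
// primitive types also blunts operator injection like { $gt: '' }.
function pickOrderFields(body) {
  return {
    customer: String(body.customer),
    total: Number(body.total)
  };
}

var incoming = JSON.parse('{"customer":"ACME","total":99,"isAdmin":true}');
var safeOrder = pickOrderFields(incoming); // -> { customer: 'ACME', total: 99 }
```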

In MongoDB there are no foreign key constraints, so you need to handle referential integrity yourself in the Node.js code. Be prepared to spend some time getting used to a schemaless document database if you’re used to relational databases.
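In practice that means checking references in application code before writing, along these lines (a sketch against the Node.js MongoDB driver; the collections, fields and open db connection are assumptions for the example):

```
// Verify that the referenced customer exists before inserting the order.
function insertOrder(db, order, callback) {
  db.collection('customers').findOne({ _id: order.customerId }, function (err, customer) {
    if (err) return callback(err);
    if (!customer) return callback(new Error('Unknown customer: ' + order.customerId));
    db.collection('orders').insert(order, callback);
  });
}
```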

MongoDB has atomic operations at document level, but no transactions across multiple documents. So if you have many large transactions involving many entities, MongoDB isn’t the right database for you.
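You can get quite far by embedding related data in one document, since a single update statement on that document is atomic. A mongo shell sketch with an illustrative document layout:

```
// Withdraw 100 and log it in one atomic, single-document update.
// The query only matches if the balance is sufficient.
db.accounts.update(
  { _id: 'acme', balance: { $gte: 100 } },
  { $inc: { balance: -100 }, $push: { log: { amount: -100, at: new Date() } } }
);
```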

Every time you write to MongoDB you can choose how certain you want to be that the data has been persisted. The options range from “fire and forget” to requiring that the data be written to disk on N servers in your replica set. You should select the strategy that suits the data you’re writing on a case-by-case basis.
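With the Node.js driver this is expressed as a write concern per operation, roughly like this (the collections and documents are illustrative, and db is assumed to be an open connection):

```
function handleResult(err) { if (err) console.error(err); }

// Fire and forget: no acknowledgement requested.
db.collection('metrics').insert({ hits: 1 }, { w: 0 }, function () {});

// Acknowledged by the primary.
db.collection('orders').insert({ customer: 'ACME' }, { w: 1 }, handleResult);

// Journaled and replicated to a majority of the replica set before the callback fires.
db.collection('payments').insert({ amount: 100 }, { w: 'majority', j: true }, handleResult);
```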

Be prepared to introduce other datastores for particular usage patterns. For instance, MongoDB is not the obvious choice for large-scale reporting; in that case you should probably pull the reports from another datastore.

You need to think carefully about security between Node.js and MongoDB, particularly if you want to use cloud hosting. The traffic between Node.js and MongoDB isn’t encrypted by default, and encryption isn’t supported directly in the MongoDB wire protocol. To secure your data, you need to establish a secure connection, e.g. using SSL. This also applies to the traffic between the nodes in a MongoDB replica set. Note also that MongoDB doesn’t come with a built-in audit trail.
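Assuming both the server and the driver are built with SSL support, the connection can be requested via the connection string, roughly like this (host, credentials and database name are placeholders):

```
// Connect to MongoDB over SSL with the Node.js driver.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://user:pass@db.example.com:27017/mydb?ssl=true',
  function (err, db) {
    if (err) throw err;
    // Traffic between Node.js and the server is now encrypted.
  });
```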

Our contacts at 10gen, the company behind MongoDB, have been supportive, and we have had a number of beneficial Skype calls with them. They have been very open about the things that MongoDB can’t do. They also provide hourly-paid consultants who can review your design, fine-tune your indexes and so on.

Hosting

It has proved difficult to find a single supplier that provides 24/7 support and hosting of MongoDB with replica sets and SSL as well as Node.js with load balancing and failover. So you need to spend some time finding a supplier if your hosting requirements are in any way enterprise-like. Also keep in mind that the Node.js servers should preferably be in the same data center as the MongoDB servers. Heroku and MongoHQ, for example, can run in the same data center, whereas Nodejitsu and MongoHQ aren’t a good fit.

If you do your own hosting, or run Node.js and MongoDB yourself on rented hardware, Node.js is easy to host, but MongoDB can take up some operations time in production. MongoDB has a few drawbacks; for one thing, it’s not easy to reclaim disk space after deleting data. Like all databases, MongoDB likes to have plenty of RAM.
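By way of illustration, reclaiming space typically means compacting a collection or repairing the whole database from the mongo shell, both of which are intrusive operations best run on a secondary or in a maintenance window (the collection name is illustrative):

```
// Defragment a single collection and rebuild its indexes.
db.runCommand({ compact: 'orders' });

// Or rebuild the whole database; this can return space to the OS,
// but requires free disk space and blocks while running.
db.repairDatabase();
```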

It was great to be able to test-deploy our application for free using Heroku.com and MongoHQ in half a day, as it allowed the decision-makers in our company to see the application running. This has a particularly strong impact if you’re in an organization that’s not in the cloud, where adding servers is usually expensive and takes a long time. Once the setup is in place, each deployment only takes a minute: you just push to a Git repository at Heroku with a single Git command (git push heroku master).

Feel inspired?

Do you feel inspired to try out Node.js and MongoDB? As mentioned earlier, Node.js and MongoDB are free to download and only take 10 minutes to install – and they run on Windows, Mac and Linux.