Node.js REST API Enhancement and VR Party

Let's get back to looking at Node.js, a REST API implementation, and a recent San Francisco virtual reality hackathon project:

Node.js REST API enhancement

My colleagues Cyrille and Philippe were not very impressed with one aspect of my previous node WebGL viewer server REST API implementation.

Both suggested that each API module implement and export its own router instead of just defining and exporting individual functions.

The main reason I was loath to do that was that Cyrille's initial suggestion required redefining the body parser middleware for each API module.

However, Philippe's sample shows that that is not necessary. He says:

Quickly taking a look at your node sample, here is a suggestion on how I manage the API routes:

Instead of specifying separate app.get and app.post functions for each individual API route, I use app.use('route...', export), which enables me to easily gather all my routes for a specific API in the same file with a single export. More importantly, if I add or modify any route for a specific API in the future, I don’t have to worry about modifying or adding routes to the server.

Here is an example that illustrates this in my server.js module:

var collectionsApi = require('./routes/api/collections');
var itemsApi = require('./routes/api/items');

var app = express();

// API routes

app.use('/node/mongo-admin/api/collections', collectionsApi);
app.use('/node/mongo-admin/api/items', itemsApi);

The API implementation in items.js, for example, looks like this:

var express = require('express');
var router = express.Router();

router.get('/:collectionName', function (req, res) {
  // ...
});

router.post('/:collectionName', function (req, res) {
  // ...
});

module.exports = router;

I find this approach quite elegant compared to separately exporting all the individual get, post and other handlers.

Just a personal feeling that I wanted to point out.

A sample that I’ve been working on recently, mongo-admin, provides a complete, concrete implementation of this concept.

I changed my NodeWebGL implementation accordingly.

The server module is simplified and just pulls in the API implementations like this:

var apiv1 = require('./routes/apiv1');
var apiv2 = require('./routes/apiv2');

app.use ('/api/v1', apiv1);
app.use ('/api/v2', apiv2);

The API v2 implementation now uses its own router instead of just exporting two standalone functions:

var express = require('express');
var router = express.Router();

router.get ('/', function(req, res) {
  res.send('API v2 GET: Here belongs a succinct '
    + 'explanation how to use The Building Coder '
    + 'WebGL Viewer REST API v2...');
});

router.post ('/', function(req, res) {
  console.log('API v2 POST: ' + JSON.stringify(req.body));
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.render('viewer', req.body);
});

module.exports = router;

The body parser middleware, however, is still defined globally in server.js for the whole app.

The complete node server implementation is available from the NodeWebGL GitHub repo, and the version discussed here is 0.2.9.

I used the unmodified existing interactive testing framework to test it both locally and live on Heroku.

It still seems to be working just fine.

Many thanks to Cyrille and Philippe for their critical review and valuable support!

Top ten nodejs mistakes

Talking about suboptimal implementations, here is a useful and illuminating article by Alexandru Vladutu describing the top ten mistakes node developers make.

I'm afraid I must admit to committing every single one of them, except the ones that my project is still too small for.

Kean's VR party

Before closing, a quick note on Kean Walmsley's recent vr-party project at the virtual reality hackathon in San Francisco May 22-24.

Quoting from Kean's report on cooling down after the SF VR hackathon:

vr-party makes VR a collaborative experience: one person curates and controls the VR session for a number of consumers.

Communicating design information is a really important activity for all parts of our industry, and I think VR could well become a great enabling tool.

We ended up with a 'presenter' page, which allows you to open and view models via the View and Data API, and an arbitrary number of 'participant' pages on any kind of device, ideally using Google Cardboard to see the page in 3D.

All the events you perform – apart from changing the viewpoint, which is something we want controlled locally – get propagated to all of the connected clients via Web Sockets. So, if you isolate geometry in the presenter window, all the viewers see the same thing; the same is true for exploding the model; and even for sectioning!

And still you control the camera yourself... The experience was actually really compelling.


Congratulations on winning the well-deserved award for the best web-based VR project!

Please refer to Kean's report for more pictures and the full story, and to the vr-party GitHub repository for the source.