Thursday, October 27, 2016

Node.js module.exports and singletons


Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.

Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.

If you want to have a module execute code multiple times, then export a function, and call that function.
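A minimal sketch of that advice (the `counter.js` module and its factory are hypothetical, invented for illustration — not from the Node docs):

```javascript
// counter.js (hypothetical): export a factory function so callers control
// when the body runs, instead of exporting one shared instance.
function createCounter() {
  var count = 0; // fresh private state for every call
  return {
    increment: function () { return ++count; },
    value: function () { return count; }
  };
}
module.exports = createCounter;

// A consumer would do: var createCounter = require('./counter');
// require() is cached, so every consumer gets the SAME factory object,
// but each call to the factory produces an independent instance:
var a = createCounter();
var b = createCounter();
a.increment();
console.log(a.value(), b.value()); // 1 0
```

Exporting the factory rather than `new createCounter()` is the whole trick: the module body still runs only once, but the state-creating code runs on every call.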

Module Caching Caveats

Modules are cached based on their resolved filename. Since modules may resolve to a different filename based on the location of the calling module (loading from node_modules folders), it is not a guarantee that require('foo') will always return the exact same object, if it would resolve to different files.

Additionally, on case-insensitive file systems or operating systems, different resolved filenames can point to the same file, but the cache will still treat them as different modules and will reload the file multiple times. For example, require('./foo') and require('./FOO') return two different objects, irrespective of whether or not ./foo and ./FOO are the same file.

Most Node developers have figured out that if two files in your project require the same module, like this:

// in a.js
var shared = require('shared');

// in b.js
var shared = require('shared');

... then they will get back the same object, because Node caches requests for the same module.
But you might not realize (I sure didn't!) that if your project depends on two npm modules that both list "shared" as a dependency, like this:

// in node_modules/moduleA/index.js
var shared = require('shared');

// in node_modules/moduleB/index.js
var shared = require('shared');

... and you install their dependencies with npm, there will be two copies of "shared/index.js" hanging about:
The cache mechanism of "require" does not consider these to be the same file. In my tests, the two modules get different objects back from require('shared').

Here's a simple example, taken from my uploadfs module, of an npm module that does this the right way:
// Export a convenience function that creates an instance
module.exports = function() {
  return new uploadfs();
};

// The actual constructor function, which is also a closure
// holding all the private state information

function uploadfs() {
  // Reference to "this" that won't get clobbered by some other "this"
  var self = this;
  // Private state variables
  var tempPath;
  var backend;
  var imageSizes;
  var orientOriginals = true;

  // Public method: initialize the object
  self.init = function(options, callback) { ... };

  // Private functions can exist in the closure too; they can see the state but
  // outsiders can't touch them

  function private() { ... }
}
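To see why the factory export protects you, here's a runnable sketch of the same pattern with the elided bodies filled in minimally (the `init` body and the `getTempPath` accessor are illustrative inventions — the real uploadfs module does much more):

```javascript
// Same shape as the uploadfs example: constructor-as-closure, factory export.
function uploadfs() {
  var self = this;
  var tempPath; // private state, separate for every instance

  self.init = function (options, callback) {
    tempPath = options.tempPath;
    callback(null);
  };

  // Hypothetical accessor, added here just to observe the private state:
  self.getTempPath = function () { return tempPath; };
}

var factory = function () { return new uploadfs(); };

// Two consumers — even if they got the factory from two different cached
// (or duplicated!) copies of the module, each gets its own instance:
var one = factory();
var two = factory();
one.init({ tempPath: '/tmp/one' }, function () {});
two.init({ tempPath: '/tmp/two' }, function () {});
console.log(one.getTempPath()); // /tmp/one
console.log(two.getTempPath()); // /tmp/two
```

Because no state lives in the module scope itself, it doesn't matter how many copies of the file npm installs: every consumer explicitly creates and configures its own instance.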

These problems may not be obvious in day-to-day work, but with event emitters they become critical: you can emit an event on one copy of a module and listen on another, so the listener in one place fires while the listener in another never does. Now you know why.