Tom MacWright

tom@macwright.com

Writing a small module in 2018

Rube Goldberg Machine

The controversy over ‘micro’ modules in the land of JavaScript has faded, arguably because they won. Heavily-used modules like npm itself use micro-dependencies like once, a 42-line module with over 8 million weekly downloads.

But simultaneously, the definition of a module has expanded, warped, and agglomerated complexity until it’s something quite different than it used to be - and arguably a lot harder to fulfill. When all we needed to support was Browserify and Node.js, a single CommonJS entry point would suffice. But now, module consumers might expect TypeScript definitions, a UMD entry point, an ES6 module entry point, and possibly even Flow definitions.

I recently updated a number of my modules to a ‘modern’ setup, advised the Turf project on a large refactor, and via my work at Observable have had additional exposure to the module-packing ecosystem. Here’s what I’ve been doing, and some thoughts.


Cheatsheet / covering the basics: just in case you’re unfamiliar with what each style of module looks like:

// CommonJS
module.exports = …; 

// ES Module
export default …; 

// UMD
(function (root, factory) {
  if (typeof define === 'function' && define.amd) {
    define(['b'], factory);
  } else if (typeof module === 'object' && module.exports) {
    module.exports = factory(require('b'));
  } else {
    root.returnExports = factory(root.b);
  }
}(typeof self !== 'undefined' ? self : this, function (b) {
  …
}));

As a basic rule of thumb:

  • People usually write code with ES6 or CommonJS modules. Rarely do projects write code in UMD style.
  • ES6 can be compiled to CommonJS and vice-versa. ES6 and CommonJS can be compiled to UMD. UMD can’t be compiled to anything else.

The barest minimum: just CommonJS

The barest of the bare minimum of entry points, arguably, is still a CommonJS module. With a CommonJS module, you can support Rollup users who are using rollup-plugin-commonjs, and Parcel, Webpack, and Browserify users without any additional plugins or configuration.

CommonJS modules can’t be consumed by browsers, no matter what you do. So if you publish a module as only CommonJS, people who want to load it into a browser and who aren’t using a bundler will have to use a bundler-as-a-service, like bundle.run. That isn’t ideal: bundlers aren’t simple or reliable things to run as a service, and even the best bundler-as-a-service is going to be less reliable and slower than a service like unpkg that only has to serve files verbatim, and can aggressively cache them in a CDN.

CommonJS modules are also less than ideal for people who are writing applications using ES6 exports & imports. The ability of tools like Rollup and Webpack to turn CommonJS into ES6 is spotty at best, and advanced techniques like tree shaking are easily broken by common code patterns in CommonJS.

Too bleeding edge: just ES Modules

Writing the source of a module with ES import/export is arguably the way of the future.

Let’s say you write a module using ES6 export and only a module entry. Where will it work?

  • It’ll work in 80% of browsers without any transpilation step.
  • It’ll work with Rollup, Parcel, and Webpack without any additional configuration.

But it’ll fall short with Node.js. The Node project has progressed toward module support… slowly. There’s experimental support in Node v11, but you’ll need an experimental flag and a special file extension. There isn’t enough support in Node.js yet to say that you can just ship an ES6 module and be done.

You can, though, use the excellent esm module to import ES modules in Node.js today - that’s what we do with our backend services at Observable. But for a small module, I’d wager that’s too much to ask, and runtime compilation, even when done extremely well like in the esm project, is still icky.

ES Modules transpiled to UMD and CommonJS: Just right

The approach that I landed on is to write my tiny modules with ES Module syntax - import and export – and then to use microbundle to build CommonJS and UMD entry points for them.

microbundle is a ‘zero configuration’ bundler: it transforms ES Modules into CommonJS and UMD modules. Under the hood, it uses Rollup, another bundler, but it makes Rollup easier to use for this particular task. Instead of configuring Rollup and installing Rollup plugins as you’d usually have to do, microbundle includes a reasonable basic configuration of Rollup and cleverly uses package.json, the module manifest you already have to write, as its configuration.

So taking as an example my wcag-contrast module, here are the relevant parts of package.json:

  "source": "index.js",
  "main": "dist/index.js",
  "umd:main": "dist/index.umd.js",
  "unpkg": "dist/index.umd.js",
  "module": "dist/index.m.js",

That’s a dense five lines: here’s what each does:

  • source: this is the input, the raw code that’s maintained in the GitHub repository, what I write.
  • main: the CommonJS entry point. This is what you get if you use Node.js or Browserify to consume the module.
  • umd:main and unpkg point to the same thing. I specify umd:main because that’s how you configure microbundle to output a UMD bundle, and then alias it with unpkg, because that informs unpkg.com to serve that file by default. That means that people who include the module with a script tag get the right thing, and so do consumers who use a UMD-compatible module loader, like d3-require, which is what Observable uses.
  • module: the ES Module entry point. This is what people who use Rollup or Webpack will get, and what you can use with JavaScript’s dynamic import() and the static import syntax, if you use <script type="module">.
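To make those entry points concrete, here’s a sketch of how a browser consumer might load each build from unpkg (the ?module query is unpkg’s experimental flag for serving the module field with bare imports rewritten; hex is the contrast function wcag-contrast exports):

```html
<!-- UMD build: unpkg serves the file named by the "unpkg" field,
     and the UMD wrapper attaches a global for script-tag consumers -->
<script src="https://unpkg.com/wcag-contrast"></script>

<!-- ES Module build: "?module" asks unpkg for the "module" entry -->
<script type="module">
  import { hex } from "https://unpkg.com/wcag-contrast?module";
  console.log(hex("#000", "#fff")); // contrast ratio of black on white
</script>
```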

I tell microbundle to run by calling it in the scripts field of package.json, followed by standard-version:

"scripts": {
  "release": "microbundle && standard-version"
}

But if you don’t use standard-version (that’s a whole different blog post), you can run it with prepublish:

"scripts": {
  "prepublish": "microbundle"
}

Finally, I instruct npm or yarn to include the dist files in the npm module by configuring the files field:

"files": [
  "index.js",
  "dist"
]

Are JavaScript modules getting worse? Or better?

I wrote something similar to this in 2013, which sounds simpler.1 In 2018, we’ve added moving parts to small modules, which does detract from their minimalism.

The transition to ES Modules has been rocky and isn’t yet complete. The transition to ES6 syntax has also been a bit of a rollercoaster: some people want to ship modules that rely on bleeding-edge syntax, and other people expect all modules to be compatible with Internet Explorer.

That said, as an eternal optimist 😉, I do see a silver lining to this situation.

For the first time, thanks to ES Modules, browsers can load JavaScript modules, and even do it asynchronously. This means that, at some point in the near future, you may be able to develop web applications without a bundler whatsoever as you’re working, and then only introduce a tool like Rollup or microbundle once you want to ship it to users. I’ve already started doing this with a few of Observable’s modules — testing changes in a browser without running any bundler, just using the native ES Module support.

After years of relying on scripts to make modules browser-compatible, it’s a pretty exciting future.

An extra note: how to play nice with tree-shaking

One of the much-hyped related benefits of ES Modules is tree-shaking: a sort of dead code elimination that’s able to determine which exports your code uses, and only include related code. So if you have a big module and someone only imports a single function from it:

import {someFunction} from "big-module";

Then only the code related to someFunction will be included, not the whole thing. When it works, it’s a very nice performance perk.

But it often doesn’t work. Tree-shaking isn’t magic, and there are lots of ways to write code that breaks it. Rollup’s notes are pretty good, and Webpack’s are also worth a read. But to summarize what I found from a recent misadventure, here are some lessons.

The good news is that the things that break tree-shaking were already code smells, patterns that people recommended against anyway.
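As a baseline, here’s a sketch of what a tree-shaking-friendly module looks like (double and triple are hypothetical examples): pure, independent named exports and no top-level side effects.

```javascript
// A tree-shakeable module: each export stands alone, and importing
// the module runs no code. If a consumer imports only `double`,
// Rollup can statically drop `triple` entirely.
export function double(x) {
  return x * 2;
}

export function triple(x) {
  return x * 3;
}
```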

First: don’t mess with globals

For example, let’s say you rely on the Number.MAX_SAFE_INTEGER constant, a value that has been implemented in everything… except for IE. MDN recommends this polyfill:

if (!Number.MAX_SAFE_INTEGER) {
  Number.MAX_SAFE_INTEGER = Math.pow(2, 53) - 1;
}

Don’t put this in your small module, or anything like it. This polyfill and others like it modify global variables. Rollup and other bundlers have to look at this file and think “well, maybe some consumer of this module is relying on this side-effect,” and Rollup is obliged to include it, even if the functions and values you imported don’t rely on it.

Instead, do something like this:

const MAX_SAFE_INTEGER = Math.pow(2, 53) - 1;

And just use that local variable in your module.
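Put together, the side-effect-free version might look like this as a complete module (isSafeInteger is a hypothetical example function, not part of any module discussed here):

```javascript
// Module-local constant: no global is touched, so a bundler can
// safely drop this whole module if nothing imports from it.
const MAX_SAFE_INTEGER = Math.pow(2, 53) - 1;

export function isSafeInteger(n) {
  return typeof n === "number" &&
    Math.floor(n) === n &&
    Math.abs(n) <= MAX_SAFE_INTEGER;
}
```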

Don’t put your module.exports in if

If you want to go the CommonJS route, it might be tempting to guard your module.exports statements so that you can test it in a browser:

if (typeof module !== 'undefined') {
  module.exports = something;
}

Unfortunately, this also breaks Rollup’s tree shaking, because it can’t statically figure out what’s exported, and remove the code that isn’t. Instead, rely on a CommonJS environment being present.

Truthfully, this is tricky territory: there currently isn’t any easy way to tell if you’ve broken tree-shaking. There should be - I welcome someone to implement that in microbundle or Rollup!2 So right now, tree-shaking is essentially something that happens somewhere, invisibly - you can’t see whether it worked.

Remaining points of controversy

I hope that I’ve made some of the fundamental mechanics of modern JavaScript modules clear. Assuming I have, let me muddy the waters a bit, again.

Should you bundle dependencies? I lied a bit, earlier - for wcag-contrast, I run

microbundle --external=none && standard-version

The external=none flag tells microbundle to bundle in wcag-contrast’s one dependency. This involves a series of tradeoffs:

  • It makes it safer to use my micro-modules, because consumers will no longer suffer if one of their dependencies is unpublished or becomes malicious. Dependencies are frozen when I publish.
  • It makes my micro-modules shallower in people’s dependency trees. They don’t have dependencies of their own.
  • But when people use several of my micro-modules - if someone relies on both wcag-contrast and its dependency, relative-luminance - they get two copies of the relative-luminance code, instead of letting yarn or npm do its magic to deduplicate that dependency chain.

These tradeoffs make sense to me and my niche, but whether they make sense for other people is an open question.

Should modules include TypeScript definitions? I don’t use TypeScript, and don’t write TS definitions for my modules. I think some users - TypeScript superfans - want definitions, and I accept their PRs to add those definitions, but I’m very divided about this addition. It’s very hard to tell whether TypeScript definitions are correct, to test them, and to maintain them when I don’t use TypeScript myself.

So, for now, the answer is no: I don’t usually include TypeScript definitions, and really don’t think it’s a safe thing to expect from new modules.


Footnotes

2: Rich Harris’s agadoo project lets you know if tree shaking works for a specific module, but doesn’t yet say why.