The primary purpose of updating a library is to fix things, not to break things.
How many of you have encountered the same dilemma I have: trying to upgrade a relatively old NPM package to keep up with the trend, only to hit an error saying I cannot use `require` to import an ESM module and should use `await import()` instead? Who wants to use `await import()`?
Many people coming from other languages have accused the Node.js ecosystem of lacking backward compatibility. A year ago I would have argued that this was the price of Node.js's rapid evolution, and that it was a good thing. But now my opinion has changed. It would be much nicer if we could just upgrade a dependency, get all the new features it brings, and not have to massively refactor our projects. (It's such a joke that I would have to refactor the whole project just so I can use the newest version of nanoid. BTW, I haven't.)
A complicated application isn't like a small library; we cannot afford so much meaningless refactoring just to use the new syntax. It would be irresponsible to our users and our team members. From my experience, sometimes CJS is even better than ESM, especially when hot-reloading is a primary feature of my program, which I can easily achieve by deleting the cache entry from `require.cache` and requiring the same file again. With ESM, that simply cannot be done.
So what should we do? Well, in my previous article, "Use URL imports before Node/TypeScript supports it", I talked about something called the import map. Now I'll be talking about its reversed version, the export map (a.k.a. the `exports` field in `package.json`), which allows us to choose which version of the code is exposed to the importer. I'll be using the same package, @hyurl/utils, for demonstration.
I wrote the code strictly in the ESM standard, arguably even more standards-compliant than most Node.js modules, since I use URL imports in the source code. Then I transpile the code to various output targets, including CJS and ESM for Node.js. So how do I guarantee that when this library is imported via the `require` keyword it resolves to the CJS version, and when it's imported via the `import` keyword it resolves to the ESM version? The answer is the `exports` field in the `package.json` file. (I also kept `main` and `module` for backward compatibility.)
This is how I configure my package.json:
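A minimal sketch of such a configuration (the exact file paths are assumptions for illustration; the real layout of @hyurl/utils may differ):

```json
{
  "name": "@hyurl/utils",
  "main": "./cjs/index.js",
  "module": "./dist/index.js",
  "types": "./index.d.ts",
  "exports": {
    ".": {
      "types": "./index.d.ts",
      "require": "./cjs/index.js",
      "import": "./dist/index.js",
      "default": "./index.ts"
    }
  }
}
```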
I don't need to explain `exports.require` and `exports.import` very much; people will immediately understand what they mean when they see them. However, I'll be talking about something more subtle.
`exports.require` works immediately, without any doubt, but `exports.import` requires an extra setup: an additional `package.json` file that sits in the `dist` directory and contains `"type": "module"`, in order to instruct Node.js that these `.js` files are in ESM format instead of CJS format. That is why, in my previous example, after replacing the URLs in the `.d.ts` files, I also emitted a `package.json` file to the `dist` directory with the content `{"type": "module"}`.
`exports.types`, apparently, is for TypeScript to link the type declarations of the target module. But it, too, won't work immediately. In an application, TypeScript won't honor `exports.types` unless `compilerOptions.module` is set to `NodeNext` in `tsconfig.json`. So if our application codebase is currently using `CommonJS`, change this option to `NodeNext`; the emitted code will still be in CJS format, and our program should work the way it did.
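The relevant part of the application's `tsconfig.json` would then look roughly like this:

```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext"
  }
}
```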
Or will it? It turns out there is a small difference between `NodeNext` and `CommonJS` in the generated code: the handling of dynamic import. With `CommonJS`, TSC will rewrite `await import()` to `require()`; with `NodeNext`, it will not (I guess it assumes dynamic import is now natively supported in Node.js and refuses to modify this code). So if our program previously used `await import()`, assumed it would be translated to `require()`, and relied on the goodies of `require.cache` (or other CommonJS specials), we need to modify our application code to use `require()` instead of `await import()`.
There is not much to say about `exports.default`; currently I haven't seen any runtime or bundler using it. Here I just filled it with the source file, and I'll talk later about why I publish the source code alongside the generated code, and even place it in the root directory of this library instead of a `src` folder.
There is something missing in this configuration: `exports.bun`, which instructs the Bun runtime to import the target file instead of using `exports.require` or `exports.import`. If you haven't heard of Bun yet, you should Google it. In my opinion, Bun is the future of Node.js.
Now you may wonder: if it's so important, why did I omit `exports.bun` from my `package.json`? Well, it turns out that Bun currently has a bug resolving import maps that contain URLs, for which I've already filed an issue on its GitHub page. Hopefully it will be fixed in the future.
This is just an edge case with Bun. If we don't have URL imports in our codebase, we should add the `exports.bun` option and point it to the source file. Why is this important? Won't importing and parsing the source file be slower than importing the already-generated code? Well, that should never be the primary concern: the source file is only imported and parsed once, and in production we should run a bundled version of the code in Bun, instead of using Bun like Node.js and importing every file from node_modules.
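With that caveat in mind, a configuration carrying the `bun` condition might look like this (conditions are matched in the order they appear, so `bun` sits before `require` and `import`; the paths are assumptions for illustration):

```json
{
  "exports": {
    ".": {
      "types": "./index.d.ts",
      "bun": "./index.ts",
      "require": "./cjs/index.js",
      "import": "./dist/index.js"
    }
  }
}
```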
I learned this practice from Golang. With Bun, we can finally publish source code to NPM and rely on Bun to compile/bundle it just once. In Golang, we don't publish built artifacts or provide an ABI; we just publish the source code, and it's the user's freedom to decide how the code should run.
That's why I now always write and place the source code in the root directory and compile it to a subdirectory for use in Node.js. The source code has become a first-class member of the library, and the compiled code is just for specific runtimes. That's also the reason I'm refactoring the library in Deno's style and providing native support for Deno. It just feels right.
Back to the `package.json`: there remains one last thing to talk about. The reason I added the `main`, `module`, and `types` options is not to support old Node.js versions (I don't think many people are still using a Node.js version older than v10). It's that these options work fine with `compilerOptions.module: "CommonJS"`, so if we just import the whole library through the entry file in our application, we don't need to modify `tsconfig.json` at all. These small settings give the library even more backward compatibility.
Oh, there is ONE MORE THING: don't use `"type": "module"` in the root `package.json` alongside `exports`. It confuses people, and since we don't have a `package.json` in the `cjs` folder, this setting will make `exports.require` stop working. It's better never to use `"type": "module"` in the root `package.json` at all, and always use `exports` instead.