µcdn: a live, bundlerless alternative

If there’s something annoying about JS modules in 2020, it’s that they don’t play very well with the millions of modules published on npm … but this is about to change, thanks to µcompress, “micro/you compress”, and µcdn.

A basic example

Explained in a repository dedicated to this purpose, the following is all it takes to get an idea of how both µcdn and µcompress work:

git clone https://github.com/WebReflection/ucdn-test.git
cd ucdn-test
npm i

That’s it: if everything installs fine, you’ll have a localhost:8080 to point at, where all requests related to files in the ./src folder are delivered already optimized. A simple refresh, and all assets become 304s.

Modify any file in the source, and within a second at most it will be served updated, and pre-optimized.

Run the following command to create a ./public folder with all assets already optimized, ready to be published on any static site host.

npm run build

What is this sorcery?

The latest µcompress added the ability to crawl any folder, ignoring the node_modules one, but using it to resolve dependencies automatically.

Check the top of the ./src/js/index.js file, as an example:

import { render, html } from 'uhtml';
import './my-counter.js';
// ...

The crawler will resolve the uhtml module, finding its dual-module compatible entry point, a technique already described a few days ago.

That’s it: if you have published an ESM-compatible module on npm, the crawler will look for either the "exports" property and its related "import" condition, or it will fall back to the "module" field, assuming this points at a standard JavaScript file.
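That lookup order can be sketched with a small helper. To be clear, resolveESMEntry is a hypothetical function, not µcompress’s actual API: it only mimics the "exports" → "import" → "module" fallback against an already-parsed package.json.

```javascript
// Sketch of the entry-point fallback: prefer the "exports" map
// (a plain string, a conditions object, or a "." subpath map),
// then fall back to the "module" field.
function resolveESMEntry(pkg) {
  const { exports: exp, module } = pkg;
  if (exp) {
    if (typeof exp === 'string') return exp;
    if (typeof exp.import === 'string') return exp.import;
    const main = exp['.'];
    if (main) {
      if (typeof main === 'string') return main;
      if (typeof main.import === 'string') return main.import;
    }
  }
  // "module" is assumed to point at a standard JS file
  return module || null;
}

// a uhtml-like dual-module package.json
console.log(resolveESMEntry({
  main: './cjs/index.js',
  exports: { '.': { import: './esm/index.js', default: './cjs/index.js' } },
  module: './esm/index.js'
})); // "./esm/index.js"
```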

Standard JS is key though: specialized syntax such as TypeScript or JSX, neither of which is standard, isn’t transpiled on the fly, as the toolchain required to handle this or that case is massive, and if included within µcompress, the “micro” prefix of the name would become instantly meaningless.

However, you can use all the tooling you like to pre-build standard JS from your source, and use that result as the source target for either µcdn or µcompress.

About multiple dependencies

Each module is resolved, and optimized, only once. As an example, the js/index.js file requires µhtml, but it also imports the js/my-counter.js component, which requires µce, which in turn uses µhtml to render its content. As a result, all imports from "uhtml" will point at the exact same file, unleashing the real power of ECMAScript modules.
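The resolve-once idea can be illustrated with a tiny cache. This is a sketch of the concept, not µcompress internals, and every name in it is made up:

```javascript
// Cache keyed by the bare specifier: every importer of "uhtml",
// direct or transitive, gets the exact same resolved path, and the
// real resolution work happens only once.
const resolved = new Map();
let lookups = 0; // counts actual resolutions performed

function resolveOnce(specifier, resolver) {
  if (!resolved.has(specifier)) {
    lookups++;
    resolved.set(specifier, resolver(specifier));
  }
  return resolved.get(specifier);
}

const toPath = name => `/node_modules/${name}/esm/index.js`;
const fromIndex = resolveOnce('uhtml', toPath); // via js/index.js
const fromUce = resolveOnce('uhtml', toPath);   // via µce, indirectly
console.log(fromIndex === fromUce, lookups); // true 1
```

Because both imports map to one URL, the browser’s own ESM module cache then guarantees a single shared module instance.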

Shared dependencies are then resolved through the node_modules folder, which must be included in the source because it has to be reachable from the root of the site, hence being part of the project once live. But fear not: only pre-optimized and resolved files will land in production, so everything is fine.

Are dynamic imports available?

Unfortunately, there is no way to know import targets composed at runtime. However, import("module") will work as expected, resolving "module" the same way any other static import is resolved.

Any other relative import will work too, as long as the file is reachable and not outside the source root boundaries, so the cases where dynamic import won’t work are reduced to things like:

// this doesn't work at all
import(condition ? "module1" : "module2");
import(`module-${thing}`);
// this works *only* if ./relative is part of
// the source and not 3rd-party module logic
import(`./relative/${file}.js`).then(console.log);

And what about IE?

Once again, there are plenty of tools able to transpile ESM into ES5-compatible syntax, as well as plenty of ways to conditionally load modern or transpiled code, where the one based on <script type="module"> and its <script nomodule> counterpart would be the easiest way to go.
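As a reminder, that pattern looks like the following, with illustrative file names:

```html
<!-- Modern browsers load the ESM build and ignore "nomodule";
     legacy browsers (IE) skip type="module" and load the ES5 bundle. -->
<script type="module" src="/js/index.js"></script>
<script nomodule src="/js/index.es5.js" defer></script>
```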

Using bundlers is still a very good practice, because as cool as standard JS modules are, the amount of requests, especially in big projects, can make your site slower than a pre-optimized bundle split into meaningful chunks that aggregate only the code that’s needed.

That being said, bundlers are incapable of making code really independent, or testable in isolation, because modules could be duplicated across chunks, even the basic helpers used to import already imported modules.

In this scenario, the processing done by µcompress makes any project instantly portable through shared, external, 3rd-party dependencies, so that any file can also be tested in isolation, through the public folder.

To make that possible, I’ve also implemented a --no-minify flag, so that the only thing that happens to your JS files is the eventual module resolution.

A new, refreshing, way to deploy JS 🎉

It’s been a very long time since refreshing a page after changing a standard JS file would produce instant feedback, and on demand:

  • complex projects won’t need to rebuild the universe on each file change
  • only browsed parts are eventually invalidated, and all dependencies cost one time only: happy blazing-fast browsing!
  • it works 100% offline: no need to use unpkg.com/uhtml?module or similar online helpers, which are amazing but always need bandwidth, and are usually slower than a 304 from your own machine 😉

I hope you’ll give µcompress and µcdn a chance to test, or deploy, pre-optimized assets, and eventually help me improve their potential, which is, in my opinion, even at this early stage, huge!

Literally … in 2 minutes!

The above video shows how trivial it is to bootstrap any project, based on npm modules or not, separating FE dependencies from BE ones. Isn’t this great?

Written by

Web, Mobile, IoT, and all JS things since the ’00s. Formerly JS engineer at @nokia, @facebook, @twitter.
