MDN doesn’t trust you, should you trust MDN?

Andrea Giammarchi
10 min read · Oct 14, 2024


Photo by Bernard Hermant on Unsplash

There are some lovely answers on Quora about this topic, and I'd like to break the ice right away: MDN has long been the go-to website for Web specifications, documentation, examples, and broader context around APIs and experimental features. So if you are here to "hate", let me be clear: MDN remains (imho) the most trustworthy reference for Web developers out there these days … until it isn't, hence this post.

some background

While many on X who don't know me at all have described the topic I am going to talk about as an "ego issue", here are some historical facts about me and my 25+ years of experience in the field:

  • I have been an Open Source and Open Web believer since my first days of Web development: I've learned from OSS and I've always tried to give back
  • I have written polyfills to move the Web forward since IE4 times (yes, I am 46 years old, and I have contributed as I can since I was 20 or even earlier). If you don't believe me: I wrote polyfills before the term even existed, for browsers like Firefox 1 and others around at that time … and it doesn't stop there, because me writing polyfills has been a thing up to today, and everyone from FAANG companies to startups has used my code in production too
  • I wrote some whole pages on MDN in the past, and in more recent times, due to time constraints, I have contributed by amending here and there on occasion … even recently
  • as annoying as I can be on (hopefully rare) occasions when I disagree about some topic, my work history from FAANG to startups can probably confirm that I am also perfectly capable of letting go, quickly moving past disagreements, or even suggesting the best outcome for everyone out of such a disagreement
  • I have sporadically contributed to TC39 (ECMAScript / JS) specifications and some WHATWG ideas, and I collaborated with Igalia to help bring in the CSS :has(...) selector (among other less popular topics)

"But dude, why should I care about all this?" You are right, I might be "Mr. Nobody" to you, but I think that in the specs and Web field in general I have probably (hopefully) gained some trust: not as the best dev, surely not as the best dev to deal with, but hopefully as one with a record of trusted contributions to the Web itself, without ever causing issues with his popular OSS ideas, some of which reach hundreds of millions of downloads per month.

And yet, MDN doesn't seem to care a bit about me, my previous work, my history there, my past contributions to their browser, or the fact that I also do OSS like they do, with tests, coverage, and cross-browser or cross-engine intent, because this is the answer I got from my recent (double) PR:

We generally reject links to one’s own work

If you are still reading: I've tried to explain that it's not like I want a special throne on the Web or anything, but between that and being a "generally rejected link" there are oceans:

  • what does "generally" mean there? Where is the explanation of what it takes to be excluded from that rule? … crickets!
  • why is a project born as a community project, one that doesn't even use my GitHub namespace, such as @ungap, considered "one's own work"?

Meet the @ungap project

The @ungap project is a community effort to bring mostly newly spec'd APIs to browsers in a "best effort" way that doesn't include all the bloat other polyfills would include by default.

Quoting the project’s goal:

Pragmatic is better than (im)perfect

There are parts of the specifications that are very hard, if not impossible, to polyfill. The main purpose of this project is to help developers move forward, and possibly without unnecessary bloat. This basically means that polyfills are written to support 99% of the use cases, without granting 100% spec compliance.

If you need that, which again is basically impossible in most of the cases, feel free to keep using whatever monolithic polyfill or approach you were using before.
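To make the distinction concrete, here is a minimal sketch of the two approaches (the names are illustrative, not @ungap's actual API): a polyfill patches a global that every script on the page then observes, while a ponyfill exports a standalone function and leaves globals alone.

```javascript
// Polyfill style: patch the global when the feature is missing.
// Every script on the page now observes the patched prototype.
// (A real polyfill would use Object.defineProperty to keep it non-enumerable.)
if (!Array.prototype.myAt) {
  Array.prototype.myAt = function (i) {
    return this[i < 0 ? this.length + i : i];
  };
}

// Ponyfill style: export a standalone helper, touch no globals.
// Consumers opt in explicitly by importing it.
const at = (arr, i) => arr[i < 0 ? arr.length + i : i];

console.log([1, 2, 3].myAt(-1)); // patched global in action
console.log(at([1, 2, 3], -1));  // same result, no global mutation
```

The ponyfill variant is also trivially tree-shakable and cannot conflict with a future native implementation, which is the "no bloat, no surprises" trade-off the project's goal describes.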

We all rant about JS bloat here and there, yet most don't even realize that the moment they use any transpiler, bloat is included by default to help them write code that would otherwise not necessarily be usable out of the box in the wild … and here @core-js plays a wonderful (no irony intended) role:

  • it's used by Babel and other transpilers, so even if you don't know about it, it's likely part of your code-base or toolchain
  • it lives under a single person's repository; it's not an organization or a collaborative place: you need to file issues against that original repo in order to get anything approved
  • it's paranoid about JS pollution, so it includes its internals all over the place. This is not a bad thing in general, but it's free repeated bloat for anything you need or use from that repository
  • because of the previous point, if you include 2 ponyfills separately from core-js, your bundle size will double out of the box for no reason and, most importantly, with no extra security guarantees

But here is the catch:

  • core-js is popular because popular bundlers use and trust it, and MDN promotes it
  • core-js's code-base is defensive by design, but it's objectively as vulnerable as anything else on the Web: the moment some evil script manages to pollute the environment, if core-js is loaded after it, in a module, or lazily, it's doomed like any other JS script that exists to date (until native import from 'esm:Object' lands in browsers and no bundler dares to touch it)
  • because of all the previous points, contributing to core-js is a giant effort: learning all the ways core-js works to grant that pseudo-feeling of security, and then delivering, requires a lot of time investigating or a lot of time fixing whatever PR lands in there. You know the code-base already? Maybe easy … you just want to move forward with some recent spec? Good luck there; I value my time more than ever these days (full-time employee + father of a 2.5-year-old kid)

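The load-order problem in the second point can be shown in a few lines (a hypothetical evil.js, runnable in Node): if a malicious script runs first, even code that later caches "trusted" built-ins only captures the already-wrapped versions.

```javascript
// --- evil.js: runs FIRST and wraps a built-in to leak every payload ---
const realParse = JSON.parse;
const leaked = [];
JSON.parse = function (text, reviver) {
  leaked.push(text);                    // exfiltration point
  return realParse.call(JSON, text, reviver);
};

// --- defensive.js: loads LATER and caches "trusted" built-ins ---
// Too late: it captures the already-compromised function.
const trustedParse = JSON.parse;
const data = trustedParse('{"secret":1}');

console.log(data.secret); // 1, everything seems to work as expected
console.log(leaked);      // ['{"secret":1}'], yet the payload leaked
```

No amount of defensiveness inside defensive.js helps here, because its "pristine" reference was never pristine; that is exactly why load order, not library choice, decides the outcome.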
So, besides the fact that I value my time, the time it would take me to change the current core-js polyfill around anything would probably mean a week of work, if I'm lucky, while solving the problem my way would take, depending on the task of course, half an hour to 3 hours for an MVP that already solves whatever I needed to solve …

Broken MDN metrics

We generally want some proof of popularity (downloads, stars…) before committing to suggesting it

Let's recap what happened here … I proposed an alternative ponyfill for a specification that nobody even knows exists, and the popularity argument was used as a reason to close my effort to contribute?

I smell a catch-22 situation here, or favoritism all over the place:

  • you promote only core-js there: how is anyone else even able to contribute by adding smaller, faster, yet just as safe alternatives that could possibly become more popular?
  • how can you dismiss, ignore, and erase the history of the person trying to contribute for the community: one who has already contributed a few times, one who has collaborated with standards bodies here and there, one who is trying to help the community back by saying "here's a ponyfill that you can try with ease, without breaking or slowing down everything around its usage"?
  • how can you base your decision, as a guideline for experimental-feature contributions, on popularity metrics for something that maybe landed the day before and that nobody knew existed?

So here I got triggered, because none of the above points make sense to me … maybe it's even OK that the reviewer had no idea who I am, but what is the nonsense around this popularity point, when MDN publishes features that don't (yet) exist in any standard and ignores anyone who might be excited about such a feature? Or someone like me who needs that feature daily, who even spent time providing, via a community project, a solution that doesn't cause bloat or a global slow-down for every JSON-related operation?

I believe the answer there, now and forever, would be crickets, because once again, while everyone on X was fast enough to blame my ego, few understood that there is an overall issue with all the reasons the PR was closed out of the box: no discussion ever started … "I suck", that's it!

The FUD around Security

We are particularly wary about links to polyfills, because (a) they will eventually be removed in favor of native solutions (b) they represent one of the most vulnerable attack vectors for supply chain attacks. Therefore, we have by convention decided that we will only include core-js polyfills.

This is the real reason I am here to talk about this review process:

  • polyfills will eventually be removed, and so will ponyfills. The PR in charge of doing that, once the time is right, is exactly one … so my extra link wouldn't really have bothered anyone in the future; it's not that I added a maintenance burden, a topic I would've paid more attention to had it been presented
  • they don't have any process in place to validate that the core-js polyfill actually works as the standards intend; they just accept without a doubt that core-js works … if they had such a process, other links would've been welcomed, or rejected by CI, because it would mean they tested the standard behavior and could make an informed decision about accepting, or rejecting, that PR. Here they just decided that any link which solves the issue, possibly in a better way, is not worth it, yet no process exists to make sense of that decision …
  • they are assuming that I, my account, my OSS, are vulnerable, without providing any explanation of "why is that" or "what can I do to make it less vulnerable" … have I learned anything from that PR and my effort to improve the DX around this newly introduced API? No, nothing at all; all I know is that my code is considered vulnerable out of the box, thank you MDN!
  • they brought up the security argument only later on, not while closing the PR, and that is even worse than the rest of this story … keep reading …
  • there is a convention that established that core-js is not vulnerable, and boy, do I have links around that …

First of all, I already wrote a post about the fact that nobody can trust JS execution; it's by JS design. It doesn't matter how paranoid the stack you are running is: the moment my evil.js script has a chance to run before your trusted core-js internals, you are doomed, trust me!

I wrote libraries that help mitigate this issue as long as they run before evil.js; that's the contract: first to run, first to be safe. There is no "but" here!
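A sketch of that contract (illustrative, not one of the actual libraries mentioned above): a guard module that runs first can capture pristine references, so later pollution cannot reach the code that relies on them.

```javascript
// guard.js: runs FIRST and captures pristine built-ins in a frozen namespace.
const safe = Object.freeze({
  parse: JSON.parse,
  stringify: JSON.stringify,
});

// Later, a hypothetical evil script pollutes the global.
JSON.parse = () => { throw new Error('hijacked'); };

// Code holding the cached references is unaffected by the pollution.
const obj = safe.parse('{"a":1}');
console.log(obj.a); // still served by the native parse
```

This is the mirror image of the earlier scenario: the exact same caching technique is either a real defense or pure theater, depending solely on who ran first.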

It's not that I am new to the world of security concerns either: I have worked with security-focused companies (as a security service) where I could taint a whole environment to surface security issues for production sites … again, not my ego; it's just a matter of fact that I know JS inside out and I have written about security concerns "forever".

Moreover, the moment you discard any community contribution in the name of security, implicitly implying that the contributor is not a trusted JS developer, is the moment you push away anyone out there who actually has a way to contribute to the community: by providing code that understands these concerns and offers better solutions that still work under the best conditions, just like core-js does if no evil.js file has landed on the page before its execution.

So here I might have lost my patience, answered badly, you name it, but if you make security a joke and you blame security issues on others, you are doing a disservice to anyone reading that thread, anyone believing core-js is infallible, and anyone trying to learn what security means on the Web.

Security means you are behind CSP, you trust and have validated every single script that lands on your page, including those 3rd-party ads-related scripts, and you are sure, through code validation before going to production, that no script, unless trusted, would ever try to pollute global prototypes ahead of time … now that's security, not a cheap and easy blame based on ad-hoc issues created for the sake of creating them. Once again, I have written "evil" code that can feel native, once introspected, by all means, and others did the same before me! core-js is not immune, and nobody should believe that using core-js means no security risks can possibly exist on the Web.
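As an illustration only, a strict policy along these lines is what actually prevents untrusted scripts from ever running in the first place (the nonce value below is a placeholder, and the exact directives depend on the site):

```http
Content-Security-Policy: default-src 'none';
  script-src 'nonce-R4nd0mV4lu3' 'strict-dynamic';
  connect-src 'self';
  base-uri 'none';
  object-src 'none'
```

With a nonce-based `script-src` plus `'strict-dynamic'`, an injected evil.js simply never executes, which is a categorically stronger guarantee than any runtime defensiveness inside a polyfill.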

As Summary

Don't trust MDN around security concerns: they have way more work to do before developers can be truly informed about it, and the fact that they promote only a single polyfill in their documentation is more of an increased security risk (an ad-hoc core-js-targeting evil.js as a single point of failure) than a guard for Web developers.

This was the post, this was my rant … security has no compromises. If you care about security, and you should, that Pull Request is not the place where you'll learn anything about it; quite the opposite: it makes you feel safer about their choice while you are not.

The rest of that discussion went from ugly to unbearable to me, so once again, I might have over-reacted, but I do take security concerns seriously and that was not the case there … and on top of that, my presented alternative's stats:

  • it doesn't patch the global JSON primitive, hence it doesn't make it slower by default; that's what a ponyfill does
  • it's as "secure" (according to those broken metrics used in there) as core-js
  • it’s 90% smaller as plain text, 60% smaller once minified
  • it's 3x to 5x faster than core-js once minified and gzipped
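To make the first point concrete, here is a toy sketch of the serialize-side rawJSON semantics (this is not @ungap/raw-json's actual implementation, and a real library needs collision-proof marker handling): the global JSON object is never patched, and exact numeric source text survives stringification.

```javascript
// A value wrapper carrying pre-serialized JSON source text.
class RawJSON {
  constructor(source) {
    JSON.parse(source); // throws early if the source is not valid JSON
    this.source = source;
  }
}
const rawJSON = (source) => new RawJSON(source);

// A stringify that inlines raw sources verbatim, without touching global JSON.
const stringify = (value) => {
  const raws = [];
  const marker = '\u2063'; // invisible separator, not escaped by JSON.stringify
  const json = JSON.stringify(value, (key, v) =>
    v instanceof RawJSON ? marker + (raws.push(v.source) - 1) + marker : v
  );
  return json.replace(
    new RegExp(`"${marker}(\\d+)${marker}"`, 'g'),
    (_, i) => raws[i]
  );
};

// A 64-bit id survives exactly, where a plain number would lose precision.
stringify({ id: rawJSON('12345678901234567890'), ok: true });
// -> '{"id":12345678901234567890,"ok":true}'
```

Because nothing global is replaced, every other `JSON.parse` / `JSON.stringify` call on the page keeps its native speed, which is exactly the "doesn't make it slower by default" claim above.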

This is raw-json, a ponyfill MDN doesn't want you to use or try … and this is the end. I hope I've given something back to OSS, one more time 💕

Update

MDN locked my issue and my account after also pointing me at 3 other locked discussions … and when did that happen? After I invalidated all the arguments in there, providing fixes for the mentioned "security" concerns, plus benchmarks about code size and performance where my ponyfill wins by far compared to the polyfill they propose to all Web developers.

That’s Open Web “at its best”, isn’t it?

https://github.com/mdn/content/pull/36294#issuecomment-2411767537


Written by Andrea Giammarchi

Web, Mobile, IoT, and all JS things since 00's. Formerly JS engineer at @nokia, @facebook, @twitter.
