Taming the JS Proxy API
With the plethora of libraries dealing with remote references, and the number of runtimes targeting WASM, I think it's time to explain how proxies should be used to mimic, in the best possible way, type discrepancies across programming languages and whatnot.
Rule #1 — the 3 kinds of Proxy
It doesn't matter what you hold as a proxy reference, but it does matter how you hold it. Native JS APIs are able to "drill" into proxied references, so that:

- `typeof proxy` should return either `object` or `function`
- `Array.isArray(proxy)` should return `true` if the reference is meant to be used, or behave, like a JS Array (tuples, lists, collections … in JS these are all likely handled as arrays)
- the `get`, `set` and `has` traps should all take `symbol` keys into account, instead of failing when symbols are checked or accessed. Please note that `Object.prototype.toString.call(proxy)` will implicitly access `Symbol.toStringTag`, as an example … be sure your proxies can handle these scenarios, or simply return nothing when `typeof key` is not `"string"`
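As a minimal sketch of that last point, here is a `get` trap that tolerates symbol keys instead of failing on them (the wrapped `{ ref }` value here is just a local object, standing in for whatever your real reference is):

```javascript
// a symbol-aware get trap: it answers Symbol.toStringTag
// and returns undefined for any other non-string key,
// instead of failing when symbols are accessed
const handler = {
  get(target, key) {
    if (typeof key !== 'string')
      return key === Symbol.toStringTag ? 'Object' : undefined;
    // string keys resolve against the wrapped reference
    return target.ref[key];
  }
};

const proxy = new Proxy({ ref: { answer: 42 } }, handler);

console.log(proxy.answer);                          // 42
console.log(Object.prototype.toString.call(proxy)); // "[object Object]"
```

Without that `typeof key` guard, a trap that assumes string keys would misbehave as soon as anything in the surrounding code probes the proxy with well-known symbols.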
Proxy Object
```javascript
const proxy = new Proxy({ ref }, {
  get({ ref }, key, receiver) {},
  set({ ref }, key, value, receiver) {},
  has({ ref }, key) {},
});
```

Assuming `ref` is a pointer to your real value, if such a pointer's goal is to mimic object literals, records, or dictionary-like references, proxying `{ ref: value }` guarantees the surrounding code will handle that reference as a plain object.
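Filling in those traps, a hypothetical sketch where `ref` is just a local object (in real FFI scenarios it would be a pointer or remote handle) could look like this:

```javascript
// hypothetical sketch: `ref` stands in for your real value;
// here it is just a local object for demonstration purposes
const ref = { name: 'example' };

const proxy = new Proxy({ ref }, {
  get({ ref }, key) {
    return ref[key];
  },
  set({ ref }, key, value) {
    ref[key] = value;
    // a set trap must return true to signal success
    return true;
  },
  has({ ref }, key) {
    return key in ref;
  },
});

proxy.version = 2;
console.log(proxy.name);       // "example"
console.log('version' in ref); // true
```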
Proxy Array
```javascript
const proxy = new Proxy([ ref ], {
  get([ ref ], key, receiver) {},
  set([ ref ], key, value, receiver) {},
  has([ ref ], key) {},
});
```

Any iterable should be referenced as such, to be sure that checks such as `Array.isArray(proxy)` return `true` instead of `false`, playing very well along with all regular JS libraries and expectations when lists, collections, tuples, call them as you like, are meant to be handled like arrays.
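A hypothetical sketch, where `ref` is just a local list: because the proxied target is `[ ref ]`, `Array.isArray(proxy)` is `true`, while the traps resolve against the real list:

```javascript
// the target is [ ref ] so Array.isArray pierces through
// the proxy and sees an array, while traps forward to ref
const list = ['a', 'b', 'c'];

const proxy = new Proxy([ list ], {
  get([ ref ], key) {
    return ref[key];
  },
  has([ ref ], key) {
    return key in ref;
  },
});

console.log(Array.isArray(proxy)); // true
console.log(proxy.length);         // 3
console.log(proxy[1]);             // "b"
```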
Proxy Function & Class
There are various ways to reference something that is meant to be invoked, but here's the tricky part: other programming languages might not have all the function variants present in JS (arrows, short-hand methods, legacy functions, modern classes), so that the "one solution to rule them all" is:
```javascript
// reusable for all function cases
function proxied() {
  'use strict';
  return this;
}

const proxy = new Proxy(proxied.bind(ref), {
  construct(target, args, newTarget) {
    const ref = target();
    // ... return a new instance of that ref,
    // or throw if that ref is not a class ...
  },
  apply(target, context, args) {
    const ref = target();
    // ... return the result of invoking that ref,
    // or throw (maybe?) if that ref requires `new` ...
  },
});
```

With the above traps, `typeof proxy` will return `function` as expected, and invoking with or without `new` will both be possible. Of course it's also possible to fine-tune and branch out specific cases: one where `new` is never desired, or one where `new` is always desired:
```javascript
// it fails with `new`
const arrow = new Proxy(() => ref, {
  apply(target, _, args) {
    const ref = target();
    // ... invoke the ref with args and no context
  },
  // the construct trap won't ever be invoked, even if defined
});

class Proxied {}

class ProxiedHandler {
  // the proxy handler constructor
  constructor(ref) {
    this.ref = ref;
  }
  // the actual Proxy trap for `new`
  construct(_, args, newTarget) {
    // use the handler's ref property
    const { ref } = this;
    // return a new instance for that ref
  }
  // the apply trap won't ever be invoked, even if defined
}

// it will fail without `new`
const Class = new Proxy(Proxied, new ProxiedHandler(ref));
```

I am letting you decide which approach is better or easier to reason about, but I usually handle things via `proxied.bind(ref)` because it always works, and if it needs to fail when `new` is used, or not used: let it fail!
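A runnable sketch of the `proxied.bind(ref)` approach, assuming here that `ref` is a plain arrow function (so invoking works and `new` would simply fail, as intended):

```javascript
// reusable carrier: the bound `this` smuggles the ref into traps
function proxied() {
  'use strict';
  return this;
}

// hypothetical ref: a plain arrow function, so apply forwards
// and construct would simply fail for it — let it fail!
const ref = (a, b) => a + b;

const proxy = new Proxy(proxied.bind(ref), {
  apply(target, context, args) {
    return Reflect.apply(target(), context, args);
  },
  construct(target, args) {
    return Reflect.construct(target(), args);
  },
});

console.log(typeof proxy); // "function"
console.log(proxy(1, 2));  // 3
```

Note the `'use strict'` in `proxied`: without it, a bound primitive or object would be coerced, while strict mode returns the bound `ref` exactly as it was.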
Rule #2 — don’t repeat proxy handlers
When proxies are used for WASM interoperability reasons, or for reflecting API purposes, or simply for any other FFI use case, it's very likely that the runtime will handle hundreds, if not thousands, of proxied references during the lifecycle of the program.
On top of that, things might easily become slower than they need to be, so keeping in mind that proxy handlers can be shared, or can at least share their traps if they are class instances rather than object literals, will save a lot of Garbage Collector work, and will be definitely faster and lighter.
I am guilty as charged in this post because, for brevity, context, and simplicity's sake, I have used object literals as handlers, but the truth is that none of my heavily Proxy-based libraries uses runtime object literals for handlers, as that is, 99.9% of the time, a slippery slope for performance, RAM usage, and GC pauses.
Accordingly, every time you write something like this that could occur more than once:
```javascript
const proxied = ref => new Proxy({ ref }, {
  get(target, key, receiver) {},
  // ...
});
```

You should rather refactor that as such:
```javascript
const objectHandler = {
  get(target, key, receiver) {},
  // ...
};

const proxied = ref => new Proxy({ ref }, objectHandler);

// ... or even ...

class ObjectHandler {
  constructor(ref) {
    this.ref = ref;
  }
  // traps
  get(_, key, receiver) {
    const { ref } = this;
    // ...
  }
  // ...
}

const proxied = ref => new Proxy(
  // good enough to mimic literals
  Object.prototype,
  // ref is handled directly
  new ObjectHandler(ref),
);
```

As a result, your code will be cleaner, faster, and less greedy on RAM; plus, each handler can itself carry more interesting data that its prototype can reuse across handlers, to change state, track things, or use the `this` context when needed to point at such handler, and so on.
Rule #3 — trap only what needs to be trapped
If you have a handler whose goal is to define an object or an array, and you have `apply` and `construct` traps in there, that's both confusing and wrong.
Proxies are special "beasts" in JS: despite their names, not all traps will ever be invoked, so that function-related traps, as an example, won't happen with object literal targets. We need to be smarter there: using the `proxied.bind(ref)` approach, it's possible to add all sorts of traps to make a proxy behave both as a literal and as a function, but that will make such proxy an "alien looking" kind of object, with a `typeof` of `function` yet both invokable and usable as an object literal. Well, you do you, but only if you really know what you are doing. It's important to learn the potential of each trap, when these are needed and when they are not, and to leave out all other traps, because the proxied object will already answer properly to its expected behaviors.
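A quick sketch of why those function-related traps are dead code on an object target: the proxy of a non-callable target is itself not callable, so invoking it throws before any trap could ever run:

```javascript
// apply and construct here are unreachable: the target is a
// plain object, so the proxy has no [[Call]] or [[Construct]]
const proxy = new Proxy({ ref: null }, {
  apply() { return 'never reached'; },
  construct() { return {}; },
});

console.log(typeof proxy); // "object"

try {
  proxy();
} catch ({ constructor }) {
  console.log(constructor.name); // "TypeError"
}
```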
… and be careful with Arrays
Proxied arrays are expected to always return at least `length` in their list of `ownKeys`, which is granted if you proxy `[ ref ]`; but when such `length` is retrieved, it cannot just be `1`: it has to reflect the real length, or size, of whatever it's pointing at.
Same goes for `Symbol.iterator`, which must be the one that iterates over the real `ref`, not the proxied reference, and so on … add tests to your proxies and be sure all edge cases, or at least most edge cases, are covered.
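A sketch of both points, again with a plain local array standing in for the real `ref`: `length` and `Symbol.iterator` reflect `ref`, not the single-entry `[ ref ]` target:

```javascript
// length, iteration, and ownKeys all reflect the real ref,
// not the one-element [ ref ] target being proxied
const ref = [1, 2, 3];

const proxy = new Proxy([ ref ], {
  get([ ref ], key) {
    if (key === Symbol.iterator)
      return ref[Symbol.iterator].bind(ref);
    return ref[key];
  },
  ownKeys: ([ ref ]) => Reflect.ownKeys(ref),
  getOwnPropertyDescriptor: ([ ref ], key) =>
    Reflect.getOwnPropertyDescriptor(ref, key),
});

console.log(proxy.length); // 3, not 1
console.log([...proxy]);   // [ 1, 2, 3 ]
```

The `getOwnPropertyDescriptor` trap accompanies `ownKeys` because key enumeration consults both; without it, keys reported by `ownKeys` but absent from the target would be silently dropped.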
And that's a wrap! I hope you've found something useful or new in this post, and that you'll manage to improve the proxies within your projects sooner rather than later, so that interoperability will improve, performance and RAM usage will be reduced, and "surprises" when dealing with proxied variables will be fewer.
If there's one extra point to make in this post, it is that proxies break the structured clone algorithm and cannot travel across workers, so having a way to recognize proxies and prevent them from being sent elsewhere is always a good idea, along with a way to serialize them so that they can be reflected elsewhere … and if you don't know how or where to start, remember that both coincident and its reflected-ffi use all these best practices to work seamlessly across tabs, workers, or even the server side of affairs 👋
