Brainstorming: What should JavaScript support look like?

I’m currently putting together a little wishlist of things we want to work on within the JavaScript SIG. Here’s what I have so far:


  • Have a dedicated repo to keep our work in (eg. auxolotl/javascript)
  • Three main classes of things in this repo: Runtimes, Package Sets, Builders


Runtimes

  • Runtimes are derivations for JavaScript interpreters/runtimes (eg. Node.js, Deno, Bun)
  • Distributed much like normal derivations - each derivation should provide the interpreter binary, as well as any supporting tools that come alongside it, such as package managers
  • Some Node-specific notes:
    • Multiple major versions of Node are actively supported at any given time - we should aim to track that support schedule identically, deprecating/potentially removing old versions when support ends
    • One main nodejs alias that always follows LTS, and nodejs-latest that follows the latest stable (ie. what Nixpkgs already does)
    • There are quite a few package managers for Node out there (npm, yarn (classic and berry), pnpm) - how should we support them?
      • also, Corepack? I think Nixpkgs has some integration for it, but I’m not super familiar with it
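
The alias convention described above could look something like this in practice. This is just a sketch; the concrete version numbers are illustrative and would track the Node release schedule:

```nix
# Sketch of the proposed aliases; version numbers are illustrative
# and would move as Node LTS/stable lines advance.
{
  nodejs = nodejs_20;        # always follows the current LTS line
  nodejs-latest = nodejs_22; # follows the latest stable release
}
```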

Package Sets

  • This would be our equivalent of nodePackages - a set of user-facing tools typically distributed through npm, packaged as derivations
  • Automatic CI-based updating of packages in nodePackages - ideally, we declare a list of packages to include in our set, and derivations will be created from there, as well as scheduled checks for new versions published to NPM
  • As I understand it, Bun is meant to be a Node-compatible runtime - but nixpkgs currently has no (clear) way to create a nodePackages set that uses Bun as its runtime. Is this something we want to investigate supporting?
  • There’s also nothing like this for Deno at the moment - is that a concept they even have? I’m not very familiar with it.
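
For the declarative package list, one possible shape is sketched below. The attribute names are hypothetical; for context, nixpkgs’ nodePackages is similarly generated from a checked-in list of package names today:

```nix
# Hypothetical declared package set; CI would periodically regenerate
# lockfiles/hashes for each entry from the npm registry.
{
  packages = [
    "typescript"
    "prettier"
    { name = "postcss"; version = "8.x"; }
  ];
}
```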


Builders

  • These will create derivations given a JavaScript project (ie. something with a package.json) as input
  • In Node land, each package manager has its own dedicated lockfile format for locking project dependencies (Bun too). We should aim to include builders for all formats:
    • npm package-lock.json: Already exists
    • yarn yarn.lock: Already exists
    • pnpm pnpm-lock.yaml: No support in nixpkgs, some prior art outside 1, 2
    • bun bun.lockb: No support anywhere, as far as I can tell
  • Over in the Deno side of things, I see no tooling in nixpkgs to build a Deno app - this is something we can investigate too
  • Functions for building development shells would be nice to have as well, to provide a slightly nicer UX than manually specifying everything in an mkShell call
    • Ideally, some sort of javascriptDevEnv function, where users specify a runtime, a package manager, whether they want to include eg. some language servers in the shell - things like that
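
As a sketch of what that interface might feel like - every attribute name here is hypothetical, nothing like this exists yet:

```nix
# Hypothetical javascriptDevEnv call; imagined as a thin wrapper
# over mkShell that wires up runtime, package manager, and tooling.
javascriptDevEnv {
  runtime = nodejs;
  packageManager = pnpm;
  languageServers = [ nodePackages.typescript-language-server ];
  extraPackages = [ nodePackages.prettier ];
}
```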

Are there any other suggestions or comments?


I have some thoughts on this (a lot of which align with what you’ve posted), but it’s starting to get a bit late here, so I’m going to save my full response for another time. I did want to call out that this looks very good, and I agree that the different components you mention need to be covered. Thanks for writing this!


Ideally, some sort of javascriptDevEnv function, where users specify a runtime, a package manager, whether they want to include eg. some language servers in the shell - things like that

Imagine it pulled language servers depending on the npm packages you had :>


Oh shit yeah, you could totally eg. parse the project’s package.json and add typescript-language-server if typescript is installed, svelte-language-server for svelte, etc.

The only issue I see is that I’m not 100% sure what a good heuristic for vscode-html-languageserver-bin and vscode-css-languageserver-bin would be. I feel most frontend projects would want those, but I’m not aware of a reliable way to detect a frontend project from package.json alone.
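
A rough sketch of that package.json heuristic - the detection logic is purely illustrative, and the language-server package names are the ones nixpkgs currently ships under nodePackages:

```nix
# Sketch: pick language servers based on a project's package.json.
# The heuristic (attr presence in deps) is illustrative, not a real API.
{ pkgs, projectRoot }:
let
  packageJson = builtins.fromJSON (builtins.readFile (projectRoot + "/package.json"));
  deps = (packageJson.dependencies or { }) // (packageJson.devDependencies or { });
  has = name: builtins.hasAttr name deps;
in
pkgs.lib.optional (has "typescript") pkgs.nodePackages.typescript-language-server
++ pkgs.lib.optional (has "svelte") pkgs.nodePackages.svelte-language-server
```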

Really solid idea though, will keep in mind!

Not JS-specific but kind of relevant: I’d like to have a discussion later/eventually about avoiding the whole python38/python39 thing. I think better solutions could exist.

We’ll probably still need the Node version-numbering thing in the meantime, so this is more just a heads up.


I think treating them like any other binary works. Is there anything that makes them not work?

I think this is a great line of thinking, but I actually think it should be a bit more generic so that a JavaScript+python+rust devshell is easy and possible.

The vast, vast, vast majority of my time on Nix has been spent working to make a better devshell. So at some point I’ll have a lot to say about that on Aux.

In the meantime though, I think it would be good to discuss what JS envs would need/want. I can ask the same of the other languages and see if there’s a common pattern.

Imagine it pulled language servers depending on the npm packages you had :>

Yeah! Stuff like this! I didn’t even think about that.

It would be really nice to have presets - like a TypeScript starter kit, a base Unix-tools starter kit, a Python venv starter kit, etc. - where those kits can be listed/combined, and the devshell just has all the expected CLI tools and makes-sense defaults.


If you mean package management stuff, we don’t really need any! This is why I use Deno :two_hearts:.

While not as aggressive as Nix, Deno has hash checking and pinned versions by default.

(Straight from their docs)

Let’s say your module depends on remote module https://some.url/a.ts. When you compile your module for the first time a.ts is retrieved, compiled and cached. It will remain this way until you run your module on a new machine (say in production) or reload the cache (through deno cache --reload for example). But what happens if the content in the remote url https://some.url/a.ts is changed? This could lead to your production module running with different dependency code than your local module. Deno’s solution to avoid this is to use integrity checking and lock files.

So there is a way to piggyback off of that.

deno vendor will build a local cache of all dependencies.

deno compile makes a standalone executable.

Oooh, you’ve got me thinking about module-based devshells now… that’s a fun one to explore maybe.

I was thinking more along the lines of “what does it take to make a derivation that packages an app that runs with Deno”, ie. a mkDenoPackage of sorts - AIUI that might mean some form of dependency vendoring, although I’m not familiar enough with Deno to say for sure.

Running bun bun.lockb outputs a yarn.lock-compliant file to stdout, but the dependency resolution is different somehow, and it often fails to install packages correctly. Without a pre-existing lockfile, bun and yarn generate massively different yarn.lock files.

Currently, bun doesn’t appear to have any form of offline caching, which breaks the sort of methods nixpkgs already uses with mkYarnPackage, etc. pnpm, by contrast, allows you to use only cached dependencies that already exist in the global store.

Yep. Nothing. A localhost:8080/hello.js is as valid a Deno module as a GitHub repo with a single JS file in it, or as a published package.

If it’s a JavaScript file accessible on the internet, it’s a Deno package.

Now, Deno 2.0, whenever it launches, is going to change that, and that is going to be a giant mess. But for now, Deno 1.0 is as simple as it could possibly be.

I guess for Nix it would be good to freeze everything all the way down. So I suppose mkDenoPackage could run deno vendor on the target JS file(s), which would build up a local cache. That cache could then be part of the derivation output.
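
A very rough sketch of that idea, under the assumption that the vendored dependencies can be captured as a fixed-output derivation. All names here are hypothetical, and the deno flags are from the Deno 1.x CLI as I understand it:

```nix
# Hypothetical mkDenoPackage sketch: vendor dependencies in a
# fixed-output derivation (so the network fetch is allowed but
# hash-pinned), then build against the local copies only.
{ stdenv, deno }:
{ pname, version, src, entrypoint, vendorHash }:

let
  vendored = stdenv.mkDerivation {
    name = "${pname}-vendor";
    inherit src;
    nativeBuildInputs = [ deno ];
    buildPhase = ''
      export DENO_DIR=$TMPDIR/deno
      # deno vendor writes dependencies plus an import_map.json
      # into the vendor/ directory.
      deno vendor --output vendor ${entrypoint}
    '';
    installPhase = ''
      cp -r vendor $out
    '';
    outputHashMode = "recursive";
    outputHash = vendorHash;
  };
in
stdenv.mkDerivation {
  inherit pname version src;
  nativeBuildInputs = [ deno ];
  buildPhase = ''
    export DENO_DIR=$TMPDIR/deno
    # --no-remote forces Deno to resolve only the vendored copies.
    deno compile --no-remote --import-map=${vendored}/import_map.json \
      --output ${pname} ${entrypoint}
  '';
  installPhase = ''
    install -Dm755 ${pname} $out/bin/${pname}
  '';
}
```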


One thing I’d like to do with Bun is make it easier to install the Baseline version, which supports computers without the needed AVX(2?) CPU extensions. I’ve seen a lot of people get tripped up by that.

On older systems, running standard bun at all, in any way, results in an illegal instruction almost immediately.

I do think it would be good to try to keep all of Aux separated into 1-deep (ex: node) and 2-deep derivations (ex: nodePackages.thing), rather than having anything that’s 3 deep (ex: javascript.nodePackages.thing).

There is this very, very difficult problem that I (and many others) have wasted weeks of my life on: nixpkgs is not a tree (DAG), it’s not just a tree with loops, and it’s not even just a tree with loops that can’t be detected. It’s a fractal with diverging looping patterns, and when exploring the fractal there’s no way to detect whether you’re in a loop or not. This makes it EXTREMELY difficult and resource-intensive for scrapers/indexers to find derivations that are 3 deep. And 4 deep (which nixpkgs has)? Well, good luck ever finding those. To be fair, it’s mostly a mess because of stuff like nixpkgs.nixpkgs == nixpkgs and perlPackages.perlPackages.

I was debating whether or not to join SIG JavaScript; I think I’ve decided yeah.

I can be the local Deno expert - just @ me if you’ve got Deno questions. I don’t know everything, but I know the Deno core team, so I can always get an official answer pretty quick.


Yeah, this was my concern - I assume dependencies would typically get pulled at runtime in a Deno app, which seems to run contrary to Aux’s reproducibility promises. Good to know that we do have options for vendoring dependencies, though.


Yeah, sorry I missed it in your message the first time.

Actually no, not when vendored! That’s the beauty of ECMAScript imports over nodejs require: all sync imports are statically analyzable. Deno even goes a step further: if a library is dynamically imported but uses a static string, it’ll vendor that too. However, if it’s a truly dynamic import, then yeah, there’s no way for it to know.

Relevant cursed information: Jake Hamilton: "Cursed fact I discovered with @essentialrandom@in…"