
Strategic initiative: future of Node.js build toolchain #901

Closed
mmarchini opened this issue Aug 4, 2020 · 100 comments

Comments

@mmarchini
Contributor

mmarchini commented Aug 4, 2020

This was brought up several times, most recently on the Build IRC channel. Our current build toolchain is based on GYP, which is not the default toolchain for V8 anymore and was discontinued by Google. As a result, we have to maintain our own GYP fork, and keeping dependencies (especially V8) up to date requires considerable manual labor. Furthermore, the build experience for Windows and Unix is extremely different, with different config files for those platforms (which makes it hard to keep them in sync feature wise).

Switching to a modern, widely used build toolchain like CMake has been brought up a few times. There were also suggestions to move to GN, which comes with a completely different set of tradeoffs. In the recent discussion on the Build chat I suggested we investigate a hybrid mode where V8 is built with GN and the rest of the project is built with CMake* (this also comes with its own set of tradeoffs). And this is only about building Node.js itself: we also need to evaluate how each toolchain will impact native modules, and we'll need to provide gyp + something else on the native-modules side for some time.

I suggest we create a strategic initiative to start planning, exploring options and working on the future of our build toolchain. This could potentially become a working group in the future depending on the amount of work we have ahead as well as the number of collaborators interested in helping. I'm willing to champion this initiative.

* I would really like to explore this one because if it works, we can provide pre-built V8 binaries which can be downloaded by collaborators, reducing significantly the build time of the project
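For a sense of what the consumption side of that idea could look like, here is a hypothetical CMake fragment that downloads a prebuilt V8 produced by a GN build and links it as an imported library. To be clear, this is a sketch: the URL, artifact layout, and target names are all made up.

```cmake
# Hypothetical CMakeLists.txt fragment: consume a prebuilt V8 built with GN
# instead of compiling V8 from source as part of the Node.js build.
include(FetchContent)

FetchContent_Declare(
  v8_prebuilt
  URL https://example.org/v8-prebuilt-${CMAKE_SYSTEM_NAME}-${CMAKE_SYSTEM_PROCESSOR}.tar.gz
)
FetchContent_MakeAvailable(v8_prebuilt)

# Expose the prebuilt static library as a normal CMake target.
add_library(v8 STATIC IMPORTED)
set_target_properties(v8 PROPERTIES
  IMPORTED_LOCATION "${v8_prebuilt_SOURCE_DIR}/obj/libv8_monolith.a"
  INTERFACE_INCLUDE_DIRECTORIES "${v8_prebuilt_SOURCE_DIR}/include")

# The rest of the tree then links against it like any other dependency.
target_link_libraries(node PRIVATE v8)
```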

@mmarchini
Contributor Author

cc @nodejs/build-files @nodejs/gyp @nodejs/tsc

@ryzokuken

Thanks for taking this up, @mmarchini. I suppose a good first step would be to list the pros and cons of each approach, so that everyone has a clear idea of exactly what tradeoffs are involved. This has always been a tough pill for me to swallow personally, because:

  1. Building V8 with CMake doesn't really fix the problem of "considerable manual labor"; we'd just end up maintaining CMakeLists for V8 instead of gypfiles. It might be easier for various reasons, but IIUC the work would not go away completely unless we switch to GN.

  2. Building with GN comes with its own set of problems, since IIUC GN supports a much smaller set of platforms than Node. I don't know for sure how the Chromium team feels about it, but adding support for the missing platforms in GN would be a significant undertaking in itself, with attached maintenance costs.

I had never considered your suggestion of using GN to build V8 and CMake to build everything else, but it might just work. That said, I am afraid it would still require us to add support for platforms like SmartOS to GN.

@mmarchini
Contributor Author

I'll need to dig deeper on the whole "GN unsupported platforms" story. My understanding is that, as long as we can target some of those platforms, we should be fine, right? Especially platforms which are more common on servers.

Totally agree with you that changing from gyp to cmake doesn't solve the issues with upgrading V8, which is why I would like to think of it as a last resort. I think there's a lot of investigation and experimentation to do before making any decisions on the path forward.

@ryzokuken

> I'll need to dig deeper on the whole "GN unsupported platforms" story. My understanding is that, as long as we can target some of those platforms, we should be fine, right? Especially platforms which are more common on servers.

I mean, if GN doesn't support a platform that we currently do, either we would have to consider dropping support or continue maintaining gypfiles, right?

> I think there's a lot of investigation and experimentation to do before making any decisions on the path forward.

Absolutely agreed. Let's figure this one out, shall we? 😄

@mmarchini
Contributor Author

> I mean, if GN doesn't support a platform that we currently do, either we would have to consider dropping support or continue maintaining gypfiles, right?

I mean, that's probably the case, but I'm not 100% sure 🤷‍♀️. From what I can tell, GN just generates Ninja files, so as long as we can use it to generate Ninja files to build on SmartOS and friends, we should be fine.

@mmarchini
Contributor Author

I wonder if we could, as a starting point, use CMake as a frontend for gyp, replacing configure and vcbuild (assuming we can use CMake to generate build files on Windows).
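To sketch that idea: everything below is hypothetical (`configure.py` stands in for the existing configure script, and a real version would need a Windows branch driving msbuild instead of make), but it shows how little a CMake "frontend" over gyp would actually have to do.

```cmake
# Hypothetical: use CMake only as a cross-platform driver for the existing
# GYP-based configure step, replacing ./configure (Unix) and vcbuild (Windows).
cmake_minimum_required(VERSION 3.16)
project(node-frontend NONE)  # NONE: no compiler checks; gyp handles the real build

find_package(Python3 REQUIRED COMPONENTS Interpreter)

# Run the existing gyp-based configure script at CMake configure time.
execute_process(
  COMMAND ${Python3_EXECUTABLE} ${CMAKE_SOURCE_DIR}/configure.py
  WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
  RESULT_VARIABLE configure_result)
if(NOT configure_result EQUAL 0)
  message(FATAL_ERROR "gyp configure step failed")
endif()

# `cmake --build .` then simply delegates to the gyp-generated build files.
add_custom_target(node ALL
  COMMAND make -C ${CMAKE_SOURCE_DIR} -j
  COMMENT "Delegating to the gyp-generated Makefiles")
```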

@devsnek
Member

devsnek commented Aug 6, 2020

+1 for cmake, we could vendor stuff so much more easily.

@mmarchini
Contributor Author

> +1 for cmake, we could vendor stuff so much more easily.

Except for V8 😅

@victorgomes

Regarding "GN unsupported platforms": GN is an extensible build system, even if there are platforms that currently unsupported, one can also add new scripts and config files to include a new toolchain. https://gn.googlesource.com/gn/+/master/docs/reference.md#func_toolchain

@richardlau
Member

Re. gn: it's not just whether gn supports all of our supported platforms, but by implication whether ninja support for those platforms is available.

@mhdawson
Member

Feedback from Google in the past was that they were not looking to support GN for use outside of V8/Chrome. They had other suggestions instead. I'm hesitant to bet on GN (except possibly for V8 itself) without Google recommending/supporting it as a good choice and being willing to help if we run into issues.

@mmarchini
Contributor Author

mmarchini commented Aug 11, 2020

@mhdawson yes, that's definitely going to weigh into our decision. The ideal build toolchain for us, IMO, is still GN for V8 and insert-new-tool-but-probably-cmake for everything else. The biggest challenge is keeping defines and compiler flags in sync between the V8 build and everything else; I have some ideas and am working on a proof-of-concept for a GN+GYP hybrid build with synced defines and without depot_tools. This hybrid approach might turn out to be unfeasible or too hacky, but it's hard to know without trying.

I'm also in contact with Google to come up with a solution that is good for both projects (since they also have CI bots for Node.js with updated V8), and I want to include embedders in the conversation to make sure whichever path we choose also works for them.

@gocarlos

Regarding V8 and CMake: I think the people at vcpkg and conan could join forces with Node.js to maintain the CMake script...

@mmarchini
Contributor Author

Thanks @gocarlos. If we move in the direction of porting V8 to CMake we'll loop those projects in, although I'm still hopeful that we can avoid it (porting V8 to any other build system creates a port that must be maintained, which is quite hard considering V8's development speed).

You might be interested in https://github.com/bnoordhuis/v8-cmake, which I think already works.

@devsnek
Member

devsnek commented Aug 20, 2020

I would like to vendor getdns, which is practically impossible because we don't support cmake. Additionally, most of our deps already have cmake configs. I think it would be unfortunate if we moved to some new system and still didn't support cmake.

@mmarchini
Contributor Author

Sorry, I should've been clearer: I think cmake is the way to go for everything except V8, and I'm still hopeful we can have a hybrid system that will work better than today's for 90-99% of use cases.

@anlexN

anlexN commented Sep 10, 2020

Could we use Gradle, like Android does?

@mmarchini
Contributor Author

> Could we use Gradle, like Android does?

I think it's unlikely we would use Gradle. Gradle is more popular in the Java ecosystem, and it doesn't give us an advantage over CMake in terms of reducing duplicate configuration for dependencies.

@ryzokuken

I think if people can use the "but it's Java" argument against Bazel, then Gradle would be an even worse option.

@codebytere
Member

codebytere commented Sep 14, 2020

Speaking for Electron: we build Node.js, V8, and all other deps with GN (see this patch for our config) and will likely continue to do so regardless of what choice is ultimately made here. We recognize that GN isn't blessed outside a Google context, and as such we would not try to push Node.js itself in that direction, insofar as our ability to build with GN is preserved by the outcome of this issue (I can't see why it wouldn't be).

+1 for building V8 with GN though, and I would say that we've had a good experience asking Google folks for help when we run into issues and receiving feedback/assistance.

@mmarchini
Contributor Author

@codebytere that's great info, thanks! I think GN is not completely off the table yet (I've been fiddling with it more lately), but the bad experience we had with gyp does make it harder for us to have confidence in migrating to GN.

Does Electron have a plan in place in case Google drops support for GN and moves the Chromium toolchain to something else?

@ryzokuken

I don't think Electron has much of a choice here. It's significantly more difficult to maintain build files for the entirety of Chromium than just V8, so I think ideally Electron would always want to stick to the same build toolchain.

@anlexN

anlexN commented Oct 8, 2020

We very much need an all-in-one toolchain.

@codebytere
Member

codebytere commented Oct 8, 2020

@mmarchini yep - we'd love GN since it's what Chromium uses at present but ultimately we're fairly beholden to whatever toolchain Chromium uses even if they change it - should that happen we'd adapt other dependencies accordingly.

@mhdawson
Member

mhdawson commented Oct 8, 2020

Since this is now in the list of strategic initiatives (https://github.com/nodejs/TSC/blob/master/Strategic-Initiatives.md), I think this issue could be closed?

@mmarchini mmarchini changed the title Proposed new strategic initiative: future of Node.js build toolchain Strategic initiative: future of Node.js build toolchain Oct 8, 2020
@mmarchini
Contributor Author

IMO it makes sense to keep it open as a tracking issue (we do that for other strategic initiatives as well). I changed the title for clarity.

@nornagon

nornagon commented Aug 5, 2022

@jviotti FWIW I don't think in Electron we would easily be able to import Ninja definitions directly. GN expects all build configuration to be in GN, not Ninja, and as far as I know has no support for depending on raw Ninja definitions generated by other tools.

@frank-dspeed

I am currently working on many related projects, like a Fuchsia Node.js runner, and I have come to a final conclusion: why does Node.js not build itself via Node.js?

I am also working on language implementations like TypeScript and PHP on GraalVM, and from a language implementer's view every good language builds itself, so why not Node.js? In GraalVM, Node.js is also handled as a language that is compilable to a single binary; this way I got successful runs on Fuchsia OS + FIDL.

But the graal-node method does not use V8, so it is clear why that is simple.

I am also working on the concept of a V8 embedder framework that allows building apps directly against V8; a small Linux-only implementation of that can be seen in this project: https://github.com/just-js/just/

The resulting applications, bound directly to the Linux epoll, outperform Node.js by up to 100 ranks; it is in first place in the composite score of a very reputable benchmark: https://www.techempower.com/benchmarks/#section=data-r21&test=composite

So it outperforms everything else.

At present my main project is to get a just-js-like SDK done for Fuchsia that easily allows binding the FIDL client directly into V8, with bundled JS as glue code. My PoC is using fuchsia.http.Loader, and the current state is adding a Fuchsia-based HTTP server implementation using the Cloudflare Rust version of the QUIC bindings.

A side effect of this is a Node.js implementation of the Fuchsia Component Manager for mocking. In the end, as a side project, I am porting all Fuchsia concepts to Node.js and similar platforms, as they overlap by design and these are low-hanging fruit.

cc: @hashseed

@eli-schwartz

> why does Node.js not build itself via Node.js?

> I am also working on language implementations like TypeScript and PHP on GraalVM, and from a language implementer's view every good language builds itself, so why not Node.js?

I'm not sure how you define "every good language" but I feel like I'd be hard pressed to find many languages where the build system is written in the language's own final runtime.

There's an obvious bootstrap issue there, so for starters if the runtime itself isn't written in that programming language, it seems highly strange to write the build system in it.

Also investing in a custom build system instead of an existing one is a bit of a hard sell to begin with. An (IMHO weak) argument could be made in favor of taking a language-specific build system + package manager that exists anyway and using it to compile the self-hosted language runtime. But this is very much not the case here anyway, because nodejs.exe's source code isn't written in nodejs!

@frank-dspeed

frank-dspeed commented Aug 14, 2022

@eli-schwartz I understand your point, but I will go for an ECMAScript build/make tool anyway and will use whatever workarounds are needed to get it working, e.g. FFI magic with SharedArrayBuffers and their memory locations.

I guess I can even translate GN and other build tools relatively simply, and even use the original source files.

Then I guess the following will happen:

People will find the project and use it anyway because it works, and then we call it a day.

Then Node.js can build itself via an older version of itself that works, using the language it's written in (C).

Node.js is the successor to Python, as Python will not upgrade; Python did nothing different, and Xmake with the Lua runtime does nothing different either.

OK, Node.js is maybe a bigger runtime, but who cares; it also offers more features out of the box. Combined with TypeScript, I guess this is a selling point, including the type definitions.

Also, the tooling to translate and express native stuff in JS will grow, generating the files needed for the other build tools: doing what GN does and handing over to Ninja.

All build systems share more or less a big common set of features; it will be easy to do mappings.

For performance there is a clear migration path:

ffi-napi: A successor to node-ffi compatible with modern versions of Node.js.
sbffi: This library.
napi-addon: A very simple/normal Node.js addon using N-API in C.
napi-addon-sb: An N-API addon using the same shared-buffer technique as sbffi, but with a hard-coded function call rather than a dynamic/FFI call.
wasm: The adding function compiled to WebAssembly.
js: Re-implementing the function in plain JavaScript.

Each function will be called 100,000 times, in 1 repetition, timed with console.time(). Here are the results on my machine (2019 Lenovo X1 Extreme, running Ubuntu, Node v12):

ffi-napi: 1419.159ms
sbffi: 29.797ms <= easily achievable
napi-addon: 3.946ms
napi-addon-sb: 3.717ms <= easily achievable
wasm: 0.871ms <= sometimes easily achievable
js: 0.090ms <= sometimes easily achievable

So there is no question: combined with multi-thread support and the Node.js VM module, for example, we could build maybe the fastest XMake ever.
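For context, micro-benchmarks like the one quoted above are typically produced with a loop of this shape. This is a generic sketch, not the actual sbffi benchmark code; the `bench` helper and its parameters are made up for illustration, with the plain-JS variant inlined.

```javascript
// Generic shape of a call-overhead micro-benchmark: invoke an "add"
// implementation N times and measure wall-clock time. The ffi/addon/wasm
// variants in the quoted table would each swap in their own `add`.
function add(a, b) {
  return a + b;
}

function bench(name, fn, iterations = 100000) {
  const start = process.hrtime.bigint();
  let acc = 0;
  for (let i = 0; i < iterations; i++) acc = fn(acc, 1);
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${name}: ${ms.toFixed(3)}ms`);
  return acc; // return the accumulated result so the loop can't be optimized away
}

bench("js", add);
```

Note that numbers from a harness like this mostly measure per-call overhead of the boundary crossing, not the work itself, which is why the plain-JS and N-API variants dominate the FFI ones.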

@eli-schwartz

eli-schwartz commented Aug 14, 2022

I'm not sure what point you're trying to make?

A build system has two components. A configuration handler, and a command runner.

Common configuration handlers are: meson, cmake, autotools ./configure, gn, gyp

Common command runners are: make, ninja, msbuild, xcodebuild

Some conflate the two: waf, scons, setup.py -- these typically tend to be a lot slower than anything else, as they need to reparse the build configuration, and everything is basically a cold build.

Generating ninja files is a good idea, and meson, cmake, gn, and gyp all do this (either exclusively or as one of several options). You will generally not get faster than running ninja which is heavily optimized for speed and doing as little work at build time as possible. Reimplementing an excellent tool in JavaScript for reasons you haven't really explained doesn't seem to make any sense and I can't possibly overstate how bad an idea I think this is.
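To make that split concrete, the entire interface between a configuration handler and ninja is a generated file like the following. This is a minimal hand-written example of the kind of output those generators emit; the file names and flags are illustrative.

```ninja
# Minimal build.ninja of the kind gyp/gn/cmake/meson emit: the generator
# resolved all configuration up front, so ninja only runs commands and
# tracks header dependencies via the .d files the compiler writes.
cflags = -O2 -Wall

rule cc
  command = gcc -MMD -MF $out.d $cflags -c $in -o $out
  depfile = $out.d
  deps = gcc

rule link
  command = gcc $in -o $out

build main.o: cc src/main.c
build node: link main.o
```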

Writing a build system in JavaScript that generates ninja files can be done, I guess, but you haven't really explained why that's the superior idea either. It's not something I'd be wary of to the same level as reimplementing ninja, but...

Honestly? Speed for the initial build configurator doesn't seem like the best thing to focus on. There's a bunch of reasonably fast options, and your bottleneck will basically always come from the time spent executing subprocesses such as compilers, system info scraping tools, and suchlike. Compiler tests are the big time hog in any build system as you're totally dependent on the compiler itself. The good thing is that once you did this all once, you don't need to do it again as you have the configuration saved to disk and now you just need to run the command runner (like ninja).

What speed goals are you trying to hit with nodejs build systems?

NodeJS is the successor over Python as Python will not upgrade and python did nothing diffrent also Xmake with the Lua Runtime does nothing diffrent.

I'm not sure what you're trying to get at with this language war, but I think you've misunderstood something about both Python and Lua.

@waruqi

waruqi commented Aug 14, 2022

Why not consider xmake? xmake matches ninja's build speed, and it doesn't regenerate the makefile/build.ninja file on configuration changes the way cmake/autoconf/meson do, which is very time-consuming.

Generally it is very fast to build projects directly with xmake, which also supports parallel builds and has built-in cross-platform build caching, similar to ccache but with msvc support. There is also built-in support for distributed builds and remote build caching, as well as unity builds to further speed up compilation.

If you use ninja directly, it is difficult to use distributed builds to speed things up further, and the external ccache does not support msvc and has a number of limitations.

There are also commands in xmake to generate build files such as build.ninja/makefile/vs/cmakelists, so even if we use xmake we can still switch to ninja at any time.

Finally, xmake also provides built-in package management and supports packages directly from any other package manager, such as conan/vcpkg/brew/apt/cargo/pacman/dub. In theory we can directly integrate over 90% of the packages in the C/C++ ecosystem, as long as any of the repositories has them.

@waruqi

waruqi commented Aug 14, 2022

> @eli-schwartz i understand your point but i will go for a ECMAScript Build / Make tool anyway […] we can build maybe the fastest XMake ever

JS may be fast, but the performance bottleneck in build systems comes mainly from the compiler. Compiler throughput can only be improved by scheduling it better, e.g. parallel compilation, distributed compilation, build caching, unity builds, etc.

Switching to JS doesn't fundamentally improve build performance; instead you waste a lot of time reimplementing the build system, only to find it doesn't perform as well as make, because build systems are very complex and there are a lot of details to consider. I don't think it makes sense to build a new build system just to build your own project, rather than using an existing mature one.

Also, xmake can use both Lua runtimes, LuaJIT and plain Lua. LuaJIT is much faster than Lua, but I have tested many projects and found build performance basically the same: LuaJIT doesn't make the build much more efficient, and I think switching to JS would be the same.

@eli-schwartz

eli-schwartz commented Aug 14, 2022

> Why not consider xmake? xmake has the same build speed as ninja, it's very fast and it doesn't regenerate the makefile/ninja.build file due to configuration changes like cmake/autoconf/meson, which is very time consuming.

Regenerating Makefile/ninja.build is not time-consuming. Reconfiguring on changes to the build system configuration may or may not be time-consuming, due to needing to re-run configure-time actions (some of which are cached, though).

It's entirely possible that xmake is very fast at processing the configuration, though!

> and has built-in cross-platform build caching optimisations, similar to ccache but with support for msvc. There is also built-in support for distributed builds, and remote build caching. There is also support for unity build to further speed up compilation.

  • Just wanted to comment on this... ccache 4.6 added support for MSVC so this is a non-issue.
  • And distributed builds aren't exactly a unique feature, people have been using distcc successfully for a long time.
  • Most modern build systems do support unity builds (cmake and meson both support this for example) but see also https://engineering.leewinder.co.uk/2009/12/15/the-evils-of-unity-builds/; anyway 🤷 build systems support it because users want it, xmake is doing the same thing all the other cool new build systems are doing.

Writing your own builtin version of ccache/distcc actually sounds bad. In fact, it reminds me of a recent xmake bug report in which xmake was erroneously failing to emit warnings for cached objects. ccache is carefully designed to be equivalent to actually running the compiler and covers a wide variety of edge cases, and therefore, among other things, emits the same warnings that actually running the compiler would emit.

ccache and distcc are well tested by a vast number of people; your builtin comparable features probably are not. So even projects that use xmake should, IMO, use e.g. CC="ccache gcc" and sidestep the home-grown caching.

> Finally, xmake also provides built-in package management

cool

> and also supports packages directly from any other package manager such as conan/vcpkg/brew/apt/cargo/pacman/dub. In theory, we are able to directly integrate over 90% of packages using the c++ ecology, as long as any of the repositories have it.

But this is just what literally every build system does, even the shell scripts that run hardcoded gcc $(pkg-config --cflags --libs libfoodep) myprog.c -o myprog, so I'm not sure that saying "xmake supports third-party package managers" provides significant insight into the state of the art in build systems?

@waruqi

waruqi commented Aug 14, 2022

> And distributed builds aren't exactly a unique feature, people have been using distcc successfully for a long time.

Right, but distcc does not support msvc or a remote build cache. Also, it has not been updated for a long time and does not support load-balanced scheduling between nodes.

> In fact, it causes me to remember a recent xmake bug report in which xmake was erroneously failing to emit warnings for cached objects

xmake has only recently gained build caching, so it may not be as mature as ccache, but I will continue to improve it. That issue was fixed quickly, and I believe that within a few releases it will be more or less as stable as ccache.

I don't think it's a big problem; you'll encounter issues with any tool, even ccache, as long as the maintainer can fix them quickly.

Also, even with xmake we can still use ccache and distcc; they do not conflict.

@waruqi

waruqi commented Aug 14, 2022

> But this is just what literally every build system does, even the shell scripts that run hardcoded gcc $(pkg-config --cflags --libs libfoodep) myprog.c -o myprog, so I'm not sure that saying "xmake supports third-party package managers" provides significant insight into the state of the art in build systems?

We just need to configure:

add_requires("conan::zlib 1.2.12")
add_requires("vcpkg::zlib 1.2.12")

and you can quickly switch between different package managers to use them.

xmake automatically calls vcpkg/conan to install packages and then automatically integrates their links/includedirs. It is not as simple as just calling pkg-config to find packages.

The user doesn't need to care about using conan and vcpkg, xmake will do everything.

If you use packages with xmake's built-in package manager, it can also provide even more useful features.

You can see https://github.com/xmake-io/xmake/wiki/Xmake-and-Cplusplus--Package-Management
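As a sketch of what that flow looks like in a full build file (the package names and versions here are illustrative, not a recommendation):

```lua
-- Hypothetical xmake.lua showing the package flow described above: xmake
-- resolves the packages (delegating to conan/vcpkg where requested) and
-- wires up includedirs/links automatically.
add_rules("mode.debug", "mode.release")

add_requires("zlib 1.2.12")                                  -- xmake's own repository
add_requires("conan::openssl/1.1.1q", {alias = "openssl"})   -- delegated to conan

target("demo")
    set_kind("binary")
    add_files("src/*.c")
    add_packages("zlib", "openssl")  -- flags/paths injected per package
```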

@frank-dspeed

My point was about reducing the learning curve and improving build flexibility.

I do not expect more speed during execution in general, but I expect many more coders who could improve the build.

For years I have hated that everyone used Python just because it's easy to package a .so file and do FFI.

I can now code polyglot and I know what I am doing; that took me only 30 years. We can save other people's time.

@eli-schwartz

> The plan is to steer away from custom solutions (like we currently have). That might mean an existing, well maintained toolchain that uses python

> I think the discussion about the future of Python is at least somewhat derailing from the matter at hand: irrespective of the future of Python, we should investigate an alternative to gyp-next.

@frank-dspeed I don't see the point of quibbling over some hatred of python, when the use of python in a build system has no effect on the learning curve of nodejs because it doesn't affect anyone, not even the people who build nodejs (because they will run a CLI tool and the language it is written in is totally irrelevant, whether it is written in python or ruby or golang or malbolge).

Improving the build would happen inside the build files. For example, by editing CMakeLists.txt (written in a language called cmake, not C++, even though CMake is written in C++), or by editing meson.build (written in a language called meson, even though Meson is written in python).

(Admittedly xmake seems to break this pattern by using lua files. That being said, the common syntax its docs recommend doesn't have any reliance on the lua language.)
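As a tiny illustration of that separation, a complete meson.build for a trivial C project exposes nothing of Python at all (the project and file names are placeholders):

```meson
# meson.build is written in Meson's own DSL; the fact that Meson itself
# is implemented in Python never surfaces in the build definition.
project('hello', 'c')
executable('hello', 'hello.c')
```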

@frank-dspeed

@eli-schwartz maybe you see it from another point of view, but since Node.js has existed I have had build problems with it on different devices, platforms, and architectures. That's nothing new, and I needed to learn a lot to solve any of those issues and to even understand how all these systems work.

If we coded most parts in JS, we would create something flexible that a JS developer could maintain, and we would even teach them how to integrate with other languages and software.

As you mentioned, every build system additionally introduces a DSL (domain-specific language) that you're free to learn on top :D, so I see no point in learning all those details.

That's my point: we can reduce a lot of complexity here and get it right once and for all, since we can easily write readable conditional code if the build logic is in ESM, or even use something like gulp and add plugins where they're missing.

Everything that reduces layers also makes it easier to switch to other build systems later without changing as much code, because we can then use all the JS techniques like injection and loader-interception patterns.

@waruqi

waruqi commented Aug 15, 2022

Whether we use Python or Node.js, build systems that depend on external runtimes can introduce more uncertainty and problems. Even if a project compiled successfully before, a Python version update that introduces a bug can break meson/scons, as in this case: xmake-io/xmake-repo#1013

But xmake has a lightweight built-in Lua runtime, so as long as the current xmake version is stable, builds will at least not break due to runtime version updates or stability issues.

@frank-dspeed

frank-dspeed commented Aug 15, 2022

@waruqi and we would have a lightweight V8 runtime; https://github.com/just-js/just is as light as that, for example, but with V8 builds for more platform/arch combos.

I already use JS for systems programming, and it is the best choice I have made in recent years. The V8/JS/ECMAScript event loop replicates exactly what Zircon (the Fuchsia OS kernel) does, and in general it matches what I do in the cloud. I know from experience that every big cloud architecture uses the event-bus pattern; none of these concepts are new.

But today we can see that they work. I am a big fan of the overall movement of the ecosystem, but there are some blockers, like the attachment to reusing existing stuff and the fear of reinventing it.

All these systems were designed from a 1970s (or older) point of view; it's time to change that. Today we have CPUs with 128+ cores, which is simply something totally different from 10 years ago.

@frank-dspeed

just-js, by the way, ranks #1 in a well-designed benchmark suite that tests frameworks: https://www.techempower.com/benchmarks/#section=data-r21&test=composite

I can tell you this is amazing: my build speed is good and the results are stable, and at least all of this is quickly understandable!

@eli-schwartz

That's my point: we can reduce a lot of complexity here and get it right once and for all, since we can easily write readable conditional code when the build logic lives in ESM, or if we even used something like gulp and added any missing plugins.

Everything that reduces layers also makes it easier to switch to other build systems later without changing as much code, because we could then use all the JS techniques like injection and loader-interception patterns.

In fact, by reducing layers you make it harder to switch to other build systems, and simultaneously move all complexity into your own project instead of letting a dedicated "build system" project handle the complexity for you.

This has been specifically rejected in this ticket. In fact, it's the entire point of the ticket.

The plan is to steer away from custom solutions (like we currently have).

It's eminently reasonable to want to avoid custom stuff you don't specialize in, and the nodejs team specializes in javascript runtimes, not cross-platform build systems.

I already use JS for systems programming

Build systems are not system programming. If build systems were system programming, the nodejs team would write their own build system where all configuration is done in config.cpp.

I am confident this is not what you want.

@frank-dspeed

@eli-schwartz I guess I understand now where your view comes from, but I do not agree. I know many people will agree with you, but in the end it does not matter; the future is clear, at least for me. I see the whole polyglot ecosystem moving in the right direction, I see overlap in functionality everywhere, and I see an OS finally getting it right by simply offering a universal ABI for hardcoded interfaces.

All these concepts solve so many problems that they will get adopted; it is only a question of time, and I only wanted to speed up the overall process. In the end it will never matter for the future what the nodejs core project does or does not do. There is a reason why NodeJS gets forked over and over, with nothing contributed back to NodeJS core. There are fundamental problems that people will address, and they will sidestep RFC processes, as they simply have the ability to do so.

That's all I have to say. In the end, you and I are correct at the same time:

  • Your point is: never touch a running system.
  • My point is: build for change, or be legacy even before you've created the PoC.

@ryzokuken

ryzokuken commented Aug 16, 2022

I need to dig deeper into the replies here, but just FYI, I don't think we're considering switching from ninja or make here. We're just considering switching from gyp-next to a better meta build system.

Another important thing to note is that speed is not our biggest priority here IIUC, our biggest priority is cutting down the work needed to:

  1. Maintain our own bespoke meta build system in Python.
  2. Translate V8's GN build files into gypfiles.

And while there are arguments to be made in favor of writing and maintaining our own in JavaScript, I don't believe it would make our lives much easier than the current situation, and it would come at a huge cost.

@frank-dspeed

@ryzokuken don't worry about that; once I am done I will open a PR. If you like it, take it; if not, that's totally OK.

I am coding a whole kernel in ECMAScript, so I need some tooling to do custom v8 builds (and optionally other builds) anyway. As part of that, I will come up with something that is gyp-compatible and a drop-in replacement.

I will take as much of the XMake implementation as needed, and incrementally come up with my own approaches.

I have already written a lot of parser code in ECMAScript; transpiling XMake's Lua code to ECMAScript is a no-brainer. That should be low-hanging fruit.

@waruqi

waruqi commented Aug 17, 2022

we would have a lightweight v8 runtime, for example one as light as https://github.com/just-js/just, but with v8 builds for more platform/arch combos

@frank-dspeed I saw it. I don't think it's lightweight as long as it still relies on v8, because no matter how you trim v8, it's still heavyweight: v8 is still 50M, and the whole just binary after linking is 17M.

But the whole xmake binary is only 1M; with LTO turned on, it is only 670K.

I have already written a lot of parser code in ECMAScript; transpiling XMake's Lua code to ECMAScript is a no-brainer. That should be low-hanging fruit.

I don't really think it's necessary to implement a build system based on js for this, and right now just-js only supports linux.

@frank-dspeed

frank-dspeed commented Aug 17, 2022

@waruqi you're partially correct. Out of the box, just-js does static linking; the resulting binary can be smaller if we do dynamic linking like XMake does. Overall, I think you're reading the situation a bit wrong, so let me explain more.

It is Linux-only because that kept it simple; there is no technical reason for it. The concepts are portable. In fact, it is already a composable, Linux-only NodeJS; there are repos that hold all the needed modules for that, and it is easy to add more modules, as this is simply C plus some conventions for building v8.

That's it!

NodeJS example:

  1. Download the source.
  2. Run our own make script that builds v8 (or downloads a prebuilt one).
  3. Build our build environment (generating a small amount of C code that loads our .js files into a v8::isolate), using fast v8 FFI calls to integrate with the host system. v8 includes its own FFI call engine, so no round trips are needed to the host environment that embeds v8 (filesystem, network access, exec, ...).
  4. Build NodeJS with the same v8.

Be happy! No extra dependencies and no extra wasted space, as the same v8 is reused in this scenario.

@eli-schwartz

I do not understand why we are 21 comments deep into a discussion about taking a bespoke python meta-build system and basically bespoke gyp meta-build files and... reimplementing it in bespoke javascript...

... when the topic of this discussion is about moving away from bespoke build systems on the grounds that it "doesn't make things better, but does add a huge cost".

Writing a nodejs-specific build system in any language does not sound like it's on the table. Writing one that depends on having a previously built copy of nodejs or v8 is just additional pain by way of a bespoke "own make script that builds v8", which is exactly the thing that was declared a problem to be moved away from.

I think it probably makes quite a bit of sense to stop talking about something that's known to be a non-goal and will not happen.

@Hi-Angel

We haven't reached a decision, so it's not late

@mmarchini is it too late now? If not, I'd in turn suggest looking into Meson. I find it odd that Meson is mentioned everywhere in the discussion, yet nobody has actually proposed it.

I have never worked with XMake, GN, or Bazel, so I can't say anything useful there. However, I see there is a suggestion for CMake, and I can say for sure that Meson is better. Basically, you can do all the same things, but it's more automated, and it has a way nicer syntax that is easier to read (IMO that's the most important difference: meson.build files are far more understandable than CMakeLists.txt ones). Also, it has an implicit type system, which allows it to give much more useful error messages when you screw something up in the build files.

Meson is very popular; these days it's used by lots of well-known projects: Gnome-related ones, the Mesa graphics drivers, XServer, libinput, i3wm, etc. Mesa is an interesting user because it has lots of code generators and mixes different languages (e.g. there is assembly code that has to be compiled without LTO, otherwise it won't link), so I'd say they use it in non-trivial ways.
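For readers who have not seen the syntax being praised here, a minimal meson.build sketch may help. This is hypothetical: the project name, source file, and optional zlib dependency are invented for illustration and are not taken from the actual Node.js build.

```meson
# Hypothetical meson.build sketch -- names are invented, not from Node.js.
project('node-sketch', 'cpp', default_options: ['cpp_std=c++17'])

# Optional dependency: Meson reports found()/not-found instead of
# failing, and dependency objects are typed values, not raw strings.
zlib_dep = dependency('zlib', required: false)

executable('node-sketch',
  ['src/main.cc'],
  dependencies: zlib_dep.found() ? [zlib_dep] : [],
  install: true)
```

The implicit typing mentioned above is visible here: `zlib_dep` is a dependency object, so misusing it (say, passing it where a string is expected) produces a targeted error message rather than a late, cryptic failure.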

@frank-dspeed

frank-dspeed commented Oct 25, 2022

@Hi-Angel I guess you're correct. At least I am already working on a Meson implementation written in ECMAScript, as the syntax is easy to parse and transpile. But I'll also throw GN into the mix, and then the user can choose what he wants to use to build his NodeJS distribution.

We are in a good time period where you can transpile anything quickly; whether it's GN, Meson, or Python (gyp) makes little difference.

In the end, the only important thing is feeding text into gcc, or cmake, or whatever esoteric compiler stack you choose, and calling it a day.

@eli-schwartz

I am already working on a Meson implementation written in ECMAScript, as the syntax is easy to parse and transpile.

That's super cool. Also, we (Meson) have intentionally designed the syntax to be capable of being reimplemented in other languages without undue difficulty; there's even a FAQ entry about it. :) So I'm happy to hear that that is working out for you.

Whenever you feel it's ready, I'd love to take a look at your implementation. We could list it in the Meson FAQ as well. Currently we list a couple alternative implementations of Meson, the most advanced one being a c99 version.

feeding text into gcc, or cmake, or whatever esoteric compiler stack you choose

CMake isn't a compiler stack! :p

But yes, indeed, build systems are really just a pile of conventions for calling a compiler, with a few frills added on at the end.

@frank-dspeed

Ah sorry, CMake somehow always gets automatically associated with LLVM in my head.
