Zig’s build system is great, but its documentation still leaves a lot to be desired. Hence, I am asking for help here.

I am trying to use `build.zig` to compile Python extensions implemented in C or Zig on Linux and macOS. The problem is exacerbated by the fact that each of my computers (and my colleagues’) has at least 6 Python installations: the original system-wide Python, several conda environments with different Python versions, as well as several virtual environments. Choosing the right include directories and libraries to link against becomes a nuisance. Fortunately, Python provides the `sysconfig` module, which contains all the information needed to compile that version of Python and/or the corresponding C extensions.

I am looking for a way for `build.zig` to run my Python script `get-config.py`, which returns all the needed build info as a JSON object, and pass that back to `build.zig` to be used in compile steps. This second part is where I am stuck. I am not even sure it is possible at all, since the Zig build system splits the build process into two phases: `build.zig` defines the execution graph, and then the build runner executes it. How can I pass information generated at `build.zig` runtime back to the build-step graph to define include/lib paths and compile/link options? Any help is greatly appreciated.
The recommended approach is to write scripts in Zig, then compile and run them when needed, but you can also use `b.addSystemCommand` to run any system tool.

Note that tools that interact with files should take arguments configuring their input/output files; this lets Zig cache them.
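For illustration, a minimal sketch of the `addSystemCommand` route (API as of roughly Zig 0.13), assuming a hypothetical `get-config.py` that writes its JSON to whatever output path it is given:

```zig
const std = @import("std");

pub fn build(b: *std.Build) void {
    // Spawn the script as a build step (make phase). Declaring the
    // output file as a formal argument lets the build runner cache it.
    const gen = b.addSystemCommand(&.{ "python3", "get-config.py" });
    const config_json = gen.addOutputFileArg("pyconfig.json");

    // config_json is a std.Build.LazyPath that later steps can depend
    // on, e.g. via addFileArg on another Run step.
    _ = config_json;
}
```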
This is the easy part. The hard part is making the data returned by the script available to a compile step. For example, the script returns the include directory; I then need to pass it to the compile step as `-I/path/to/include/dir` when running `zig cc`.
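For context, expressed in `build.zig` API terms the goal looks roughly like this (a sketch, API as of roughly Zig 0.13; the extension name, source path, and include directory are hypothetical placeholders):

```zig
const std = @import("std");

pub fn build(b: *std.Build) void {
    // A Python extension is a shared library.
    const lib = b.addSharedLibrary(.{
        .name = "myext",
        .root_source_file = b.path("src/ext.zig"),
        .target = b.standardTargetOptions(.{}),
        .optimize = b.standardOptimizeOption(.{}),
    });
    lib.linkLibC();
    // Same effect as passing -I/path/to/include/dir to the compiler;
    // .cwd_relative wraps an absolute path string as a LazyPath.
    lib.addIncludePath(.{ .cwd_relative = "/path/to/include/dir" });
    b.installArtifact(lib);
}
```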
build.zig

```zig
const std = @import("std");
const tool = @import("tool.zig");

pub fn build(b: *std.Build) void {
    const output = tool.run(b);
    // use output to configure steps
    _ = output;
}
```
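A hypothetical `tool.zig` to go with it, anticipating the `b.run` approach discussed below:

```zig
// tool.zig (hypothetical): helper code imported by build.zig and
// executed during the configure phase.
const std = @import("std");

pub fn run(b: *std.Build) []const u8 {
    // b.run spawns a child process at configure time and returns its
    // captured stdout; the build fails if the command exits nonzero.
    return b.run(&.{ "python3", "get-config.py" });
}
```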
Here are two good sources of documentation for the build system – particularly for running system utilities and capturing the output:
There is `b.run()`, which runs during the configure phase (unlike `b.addSystemCommand()`, which runs during the make phase). You should be able to accomplish this either way: `b.run()` is more straightforward, as its return value contains the captured stdout of the spawned process, whereas `b.addSystemCommand()` will only run if it’s depended on. Both methods have the ability to capture stdout.

Additionally, as @vulpesx pointed out, you could write your own tool, build it, and run it all from within `build.zig`.
I guess it boils down to two main scenarios:

- `b.run()` to collect output during the configure phase (see the sketch after this list)
- a `std.Build.Step.Run` step (`b.addSystemCommand()`, `b.addRunArtifact()`) during the make phase
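To make the first scenario concrete, here is a minimal configure-phase sketch, assuming a hypothetical `get-config.py` that prints a JSON object with an `include_dir` field:

```zig
const std = @import("std");

pub fn build(b: *std.Build) void {
    // Configure phase: spawn the script and capture its stdout.
    const json_text = b.run(&.{ "python3", "get-config.py" });

    // Hypothetical schema: {"include_dir": "/abs/path/to/include"}
    const Config = struct { include_dir: []const u8 };
    const parsed = std.json.parseFromSlice(Config, b.allocator, json_text, .{
        .ignore_unknown_fields = true,
    }) catch @panic("get-config.py produced invalid JSON");

    // The parsed string is now an ordinary configure-phase value and
    // can be fed to a compile step, e.g. via addIncludePath with
    // .cwd_relative as sketched earlier in the thread.
    _ = parsed.value.include_dir;
}
```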
Thanks @weskoerber. `b.run(...)` is exactly what I was missing.