Best practice for pre-installing an external tool used during normal build/run flow

working in the embedded domain, some sort of external “loader” is invariably required within the build-flow… while OSS tools like OpenOCD often fill this role, i sometimes must rely upon tooling specific to some silicon vendor…

while i don’t necessarily have sources for these tools, silicon vendors will generally let you re-distribute their (free) tools in a binary form… to that end, i’ve often used npm as a package manager for this purpose – simply collecting the tool’s runtime artifacts into a package which would then install its content in a well-known place…

[ nodejs was fundamental in my prior (non-Zig) world, hence leveraging npm was a no-brainer… ]

starting with a clean(er) slate in Zig, how should i best package/deliver these 3rd-party binaries in a build.zig flow – the idea being that installation of (say) a loader tool would be an infrequently executed step, whereas invocation of the loader would happen repeatedly when building/running a program…

while i can probably figure out how to specify these steps in a build.zig file, i’m equally interested in HOW y’all might distribute such binary artifacts… would i make it an “official” package described in build.zig.zon; or would i simply bundle my artifacts in a more ad-hoc way known to my build.zig file…

there’s an almost limitless choice of approaches here… if there is a “best practice” for handling 3rd-party binaries in the world of zig, i’m glad to follow suit…

any suggestions or links to existing zig projects???

This is an interesting question. I feel one of the big tenets of Zig is portability, and pre-built binaries are inherently non-portable. For instance, if there’s a pre-built binary for OS A, B, and C, how would you reflect that in a build.zig.zon?

Maybe a bigger question… Is it an appropriate use of Zig’s package manager to distribute pre-built binaries? There’s an alternative here, and that is checking for existence of said binary in your build.zig script. Here’s a snippet from a project where I rely on arm-none-eabi-gcc being installed somewhere in the system:

// Try to find arm-none-eabi-gcc at a user-specified path, or in PATH if none provided
const arm_gcc_pgm = if (b.option([]const u8, "armgcc", "Path to arm-none-eabi-gcc compiler")) |arm_gcc_path|
    b.findProgram(&.{"arm-none-eabi-gcc"}, &.{arm_gcc_path}) catch {
        std.log.err("Couldn't find arm-none-eabi-gcc at provided path: {s}", .{arm_gcc_path});
        std.process.exit(1);
    }
else
    b.findProgram(&.{"arm-none-eabi-gcc"}, &.{}) catch {
        std.log.err("Couldn't find arm-none-eabi-gcc in PATH, try manually providing the path to this executable with -Darmgcc=[path]", .{});
        std.process.exit(1);
    };

I allow the user to explicitly pass a path to where this binary resides via the armgcc build option; if they don’t specify one, I try to find it in PATH. This method, of course, just puts the burden of obtaining the correct binary for a given tool on the user. But it can at least produce a nice descriptive error explaining why this particular program was unable to build and what tool they need to go get.
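To show how the discovered path gets used afterwards, here’s a sketch of wiring the found program into a named build step so it only runs on demand. This assumes a recent std.Build API; the `flash` step name and the `--version` argument are illustrative placeholders, not from my actual project:

```zig
const std = @import("std");

pub fn build(b: *std.Build) void {
    // locate the tool as shown above (PATH-only variant for brevity)
    const arm_gcc_pgm = b.findProgram(&.{"arm-none-eabi-gcc"}, &.{}) catch {
        std.log.err("Couldn't find arm-none-eabi-gcc in PATH", .{});
        std.process.exit(1);
    };

    // invoke it from a named step so it only runs when requested
    const run_tool = b.addSystemCommand(&.{ arm_gcc_pgm, "--version" });
    const flash_step = b.step("flash", "Invoke the external toolchain");
    flash_step.dependOn(&run_tool.step);
}
```

With this shape, `zig build` alone never touches the external tool; only `zig build flash` does.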

Generally speaking, reliance on “system installations” like this is discouraged in favor of building said tool from source. This is because if Zig is the one building the tool, it sort of “inherently” makes that tool portable. However, you and I both know that closed-source pre-built binaries are inherent to the embedded industry.

So… apologies if this wasn’t a satisfying answer, but AFAIK Zig’s package manager is very much not designed to distribute binaries; it’s designed to distribute source code. So I don’t think there’s a “standard” way to do what you’re wanting, at least currently…

let me give you a very common scenario…

i’m currently working with TI’s CC2340 wireless MCU; and while it’s certainly possible to use “open” debug infrastructure via direct access to the MCU’s SWD pins, by far the simplest OOTB experience comes from using the emulation hardware included on your <$25 LaunchPad dev board…

TI offers their own UniFlash utility, which supports 1000s of distinct MCUs in their product line… past experience has shown, however, that written instructions like “install UniFlash on your PC” are easier said than done :wink:

the objective is for my users’ instructions to be:

  1. install zig (easy to do)
  2. clone one of my repos (easy to do)
  3. type zig build (easy to do)

the last step, of course, could download/unpack/install tools like UniFlash – ensuring everything is consistent and in a place where i can find it…

having already bundled many such tools as .zip files which themselves formed the payload of npm packages, i suppose i could simply fetch these bundles from some repo of mine and then inflate their content in a folder of my choosing…

this seems rather “low-tech” but workable; i certainly don’t want to shoe-horn this into zig packages… since step 2) above presumably gives my users a “consistent” slice of my software, having the requisite tool binaries either directly included in the repo or else indirectly fetched via the repo’s build.zig would preserve that consistency…
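thinking about the “indirectly fetched” option a bit more: one mechanism the toolchain already provides is letting the package manager fetch the bundle as an ordinary tarball dependency – it unpacks whatever the archive contains into the package cache, binaries included… a sketch, where the url, hash, and folder layout inside the tarball are all placeholders:

```zig
// build.zig.zon — hypothetical dependency entry pointing at a tarball
// of vendor tool binaries hosted in one of my own repos
.{
    .name = "my_firmware",
    .version = "0.1.0",
    .dependencies = .{
        .loader_tools = .{
            .url = "https://example.com/loader-tools-linux-x64.tar.gz", // placeholder
            .hash = "1220...", // obtained via `zig fetch --save <url>`
        },
    },
}
```

and then the build.zig side could reference the unpacked content lazily:

```zig
const std = @import("std");

pub fn build(b: *std.Build) void {
    // the fetched bundle is exposed like any other dependency
    const tools = b.dependency("loader_tools", .{});

    // run the bundled loader binary from inside the package cache
    const flash = std.Build.Step.Run.create(b, "run vendor loader");
    flash.addFileArg(tools.path("bin/loader")); // hypothetical layout inside the tarball
    flash.addArg("firmware.hex"); // placeholder argument

    const flash_step = b.step("flash", "Flash the target using the bundled loader");
    flash_step.dependOn(&flash.step);
}
```

this sidesteps hosting one giant archive per OS in the repo itself, at the cost of one dependency entry per host platform…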

just thinking out loud now, but bundling these adjunct binaries directly in my repo would probably do the trick… in my prior world (where installing something like arm-none-eabi-gcc was also a requirement) the binaries were indeed quite large… fortunately, tools like UniFlash or SEGGER J-Link are quite compact…

seem reasonable???

Seems totally reasonable to me! I would also potentially integrate OS checks into your build.zig as well to give nice informative errors if a user’s environment isn’t able to run the binary.
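For instance, something along these lines in your build.zig — the supported-OS set here is purely illustrative, and note that `builtin.os.tag` in a build script is the host the script was compiled for:

```zig
const std = @import("std");
const builtin = @import("builtin");

pub fn build(b: *std.Build) void {
    _ = b; // rest of the build graph elided

    // fail early, with a descriptive message, on hosts the vendor
    // binary doesn't support (supported set shown is illustrative)
    switch (builtin.os.tag) {
        .linux, .windows => {},
        else => {
            std.log.err(
                "the bundled loader has no {s} build; see the README for alternatives",
                .{@tagName(builtin.os.tag)},
            );
            std.process.exit(1);
        },
    }
}
```

That way the user sees “no macOS build of this loader” rather than a cryptic exec failure deep in the flash step.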