I’m excited to present the pre-alpha release of Zaica, an AI coding assistant.
Does Zaica interfacing with Ollama mean that you could configure it to run with a local LLM only, so that you can train it to give responses for a particular Zig version?
Currently, if you try to get Zig code examples from a search engine, there is a good chance of getting a perfectly good example that only compiles with some older Zig version.
Assuming it is possible to run this locally and train the LLM, that might help newbies and irregular Zig coders (those who may have skipped a minor version or two since last using Zig and didn’t read the intermediate release notes). It would require combining Zaica with a description of the steps to set up Ollama, which model to use, and a list of compatible source code repositories (the compiler itself, but also other projects checked to build with that version), plus instructions on how to use them to train/extend the model.
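To sketch the first step of such a setup: Ollama lets you derive a model with a baked-in system prompt via a Modelfile. The base model choice and the prompt below are my own assumptions for illustration, not anything the Zaica project ships or recommends, and a system prompt only steers the model toward a Zig version; actually teaching it new APIs would still need fine-tuning or retrieval over up-to-date repositories.

```shell
# Assumes Ollama is already installed and its daemon is running.
# Pull a code-oriented base model (codellama is just an example choice).
ollama pull codellama

# Write a Modelfile that pins the assistant to one Zig version.
cat > Modelfile <<'EOF'
FROM codellama
SYSTEM """You are a Zig coding assistant. Target Zig 0.14.x only.
If an API changed in recent releases, use the current form and say so."""
EOF

# Create the derived model and try it out.
ollama create zig-assistant -f Modelfile
ollama run zig-assistant "How do I read a file line by line in Zig?"
```

Zaica could then be pointed at the local `zig-assistant` model instead of a remote endpoint.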
Such a setup might not be as responsive (depending on the hardware available to you), but I for one would rather have a slow response that compiles than a fast one that requires Zig 0.12.
Nice work!