In this blog, we’ll look at the benefits of using AI in Zed, the open source code editor, and explore how a Raspberry Pi can help you write better code, faster.
For the AI backend we don’t use Ollama; instead we run Llamafile on the Raspberry Pi.
Llamafile was created by Mozilla to make LLMs much more accessible to both developers and end users. It’s also often faster than Ollama and has a more performant server. That matters when running an LLM on relatively low-end hardware like the Raspberry Pi 5.
So SSH into your Pi 5 and start up Llamafile:
dev/llamafile/Llama-3.2-1B-Instruct.Q6_K.llamafile --server --v2 -l 0.0.0.0
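Before pointing Zed at the Pi, it’s worth a quick smoke test of the endpoint. Llamafile exposes an OpenAI-compatible API, so a minimal sketch like the following can exercise it (the hostname raspberrypi.local, port 8080, and model name rpi are assumptions carried over from the setup above; adjust for your network):

```python
import json
import urllib.request

def build_chat_request(base_url: str, prompt: str, model: str = "rpi") -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for the local Llamafile server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The local Llamafile server ignores the key, so any value works
            "Authorization": "Bearer sk-bogus",
        },
        method="POST",
    )

# To actually send it (requires the Pi to be reachable on your network):
#   with urllib.request.urlopen(build_chat_request(
#           "http://raspberrypi.local:8080/v1", "Say hello")) as resp:
#       print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If the request succeeds, the completion comes back under choices[0].message.content, the same response shape Zed expects from the OpenAI API.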
To make Zed use your Pi, add this to your settings file (open it with Ctrl+,):
The important thing is to set the provider name to openai and add an api_url pointing at your local endpoint. That way Zed will talk to the OpenAI-compatible API on your Pi instead of the cloud.
"language_models": {
  "openai": {
    "api_url": "http://raspberrypi.local:8080/v1"
  }
},
"assistant": {
  "version": "2",
  "provider": "openai",
  "type": "openai",
  "default_model": {
    "provider": "openai",
    "model": "rpi"
  }
}
For the rest, just add a bogus key in Zed’s Assistant configuration for OpenAI, and you’re ready to go.
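Zed wants an API key for the OpenAI provider even though the local Llamafile server never checks it. One way to supply a placeholder (assuming Zed, like the official OpenAI clients, picks up the OPENAI_API_KEY environment variable; alternatively, paste any value into the Assistant panel) is:

```shell
# Placeholder key; the local Llamafile server does not validate it
export OPENAI_API_KEY="sk-bogus-local-key"
# then launch Zed from the same shell, e.g.: zed .
```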
Llama 3.2 runs fine on a Raspberry Pi 5; it’s a good trade-off between quality and speed on lower-end hardware. You could also try Qwen2.5-Coder, though it is a bit slower.
Just run another Llamafile on your Pi; there is no need to reconfigure the settings in Zed. That’s what makes this solution so fun and easy.