With the 0.9 release, you can unshackle yourself from cloud LLM providers, at least for code completion. We now support CodeLlama-7B-QML and DeepSeek Coder V2 Lite. You can run either model locally with Ollama, a self-hosting tool for LLMs: install it on your computer with a few clicks, then start a model with a single CLI command.
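A minimal sketch of that CLI workflow, assuming Ollama is already installed (the model tag below is an assumption; check the Ollama model library for the exact tags Qt documents for these models):

```shell
# Download DeepSeek Coder V2 Lite to your machine (tag is an assumption,
# consult the Ollama library for the exact name):
ollama pull deepseek-coder-v2

# Start the model; Ollama also serves it on localhost:11434 so that
# tools can connect to it for code completion:
ollama run deepseek-coder-v2
```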
