I picked Gemini mainly because of its long context window. Many Gemini models support up to 1M tokens, which is a game-changer for Motoko development since it allows ICP Coder to load:
- Entire Motoko docs
- Multiple code samples
- Developer queries

…all in one shot, without having to aggressively trim context.
With other providers (like OpenAI), context is much smaller and API usage costs are higher, which makes continuous RAG-powered coding less sustainable.
That said, ICP Coder is model-agnostic — Gemini is just the best fit right now for long-context coding support.
If you check our `env.example` file on GitHub, we also show other providers:
```
OPENAI_API_KEY=your-openai-key-here
GEMINI_API_KEY=your-gemini-key-here
CLAUDE_API_KEY=your-claude-key-here
```
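To illustrate what "model-agnostic" can look like in practice, here is a minimal sketch of provider selection driven by those environment variables. This is a hypothetical helper, not ICP Coder's actual code; the priority order (Gemini first, for the long-context reasons above) and the `pick_provider` name are assumptions.

```python
import os

# Hypothetical: pick the first provider whose API key is set,
# preferring Gemini for its long context window.
# Key names mirror env.example; the order is an assumption.
PROVIDER_KEYS = [
    ("gemini", "GEMINI_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("claude", "CLAUDE_API_KEEY".replace("EE", "E")),  # CLAUDE_API_KEY
]

def pick_provider(env=None):
    """Return (provider_name, api_key) for the first configured provider."""
    env = env if env is not None else os.environ
    for provider, key_name in PROVIDER_KEYS:
        key = env.get(key_name)
        # Skip unset keys and the placeholder values from env.example.
        if key and key != f"your-{provider}-key-here":
            return provider, key
    raise RuntimeError("No provider API key configured; see env.example")
```

Because the fallback logic lives in one place, swapping providers is just a matter of which key is present in the environment.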