ICP Coder is an AI-powered coding assistant specialized in Motoko for the Internet Computer (ICP). It lowers entry barriers for new developers and accelerates productivity for experienced ones by providing context-aware code generation, search, and explanations directly inside IDEs like Cursor and VS Code.
Target users include ICP developers, students, and teams looking to onboard quickly to the Motoko language while leveraging modern AI workflows.
Features
Retrieval-Augmented Generation (RAG) pipeline for Motoko code search and generation
Full MCP (Model Context Protocol) server to stream Motoko context into IDEs (Cursor, Claude Desktop, VS Code, etc.)
Vector embeddings stored in ChromaDB for fast context retrieval
Google Gemini integration for context-aware code completions
REST API with authentication and key management
CLI and direct API examples for quick prototyping
Automated ingestion job to refresh Motoko docs and project samples monthly
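To illustrate the retrieval step of a RAG pipeline like the one above, here is a simplified sketch. It is not the production implementation: the real system uses model-generated embeddings stored in ChromaDB, while this toy version hard-codes tiny vectors and ranks them by cosine similarity.

```python
import math

# Toy stand-in for real vector embeddings: in the actual pipeline these
# vectors come from an embedding model and live in ChromaDB.
DOCS = {
    "actor basics":     [0.9, 0.1, 0.0],
    "stable variables": [0.1, 0.8, 0.2],
    "http outcalls":    [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k doc snippets most similar to the query vector."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2, 0.1]))  # → ['actor basics', 'stable variables']
```

The retrieved snippets are what gets injected into the LLM prompt; ChromaDB performs the same nearest-neighbor ranking at scale.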
How to install
```shell
git clone https://github.com/Quantum3-Labs/icp-coder
cd icp-coder
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
Roadmap
Rust support. Extend ICP Coder to generate, debug, and explain Rust-based canisters.
Full DApp support. Expand beyond single canisters to full-stack apps (canister + frontend + integrations).
Sustainable free tier. Explore support from the DFINITY Foundation to help sustain the free base model, ensuring developers always have access to core features.
I picked Gemini mainly because of its long context window. Many Gemini models support up to 1M tokens, which is a game-changer for Motoko development since it allows ICP Coder to load:
Entire Motoko docs
Multiple code samples
Developer queries
…all in one shot, without having to aggressively trim context.
With other providers (like OpenAI), context windows are much smaller and API usage costs are higher, which makes continuous RAG-powered coding less sustainable.
That said, ICP Coder is model-agnostic — Gemini is just the best fit right now for long-context coding support.
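As a rough illustration of why the window size matters: with a small context budget you must trim retrieved context before every call, while a 1M-token budget usually fits everything. A toy sketch (the token counts below are illustrative; a real pipeline would measure them with the model's tokenizer):

```python
def fit_context(chunks, budget_tokens):
    """Greedily keep chunks (already sorted by priority) until the
    token budget is spent; everything that doesn't fit is dropped."""
    kept, used = [], 0
    for name, tokens in chunks:
        if used + tokens <= budget_tokens:
            kept.append(name)
            used += tokens
    return kept

# Illustrative sizes, not real measurements.
context = [
    ("developer query", 500),
    ("motoko docs", 300_000),
    ("code samples", 80_000),
]

print(fit_context(context, 128_000))    # small window: docs get dropped
print(fit_context(context, 1_000_000))  # 1M window: everything fits
```

With the large window, the trimming logic effectively becomes a no-op, which is the practical benefit described above.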
Our `env.example` file on GitHub also lists other providers.
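For illustration, provider entries in such a file typically look like the following. The variable names here are hypothetical; the actual `env.example` in the repo is authoritative.

```shell
# Hypothetical sketch of provider configuration — check the real
# env.example in the repo for the exact variable names.
LLM_PROVIDER=gemini          # e.g. gemini, openai, anthropic
GEMINI_API_KEY=your-key-here
OPENAI_API_KEY=your-key-here
ANTHROPIC_API_KEY=your-key-here
```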
Well, I use Cline daily, and I did try Gemini vs. Anthropic. Gemini just blows through the budget very quickly compared to Anthropic. I do understand the context-length issue. So you're saying a 1M context length is easier to deal with, even though Gemini Pro is around 10-15 USD per million tokens? I'm very new to RAG, so I have no idea about context trimming and budgeting.
In other cases I have used GLM 4.5 Air and the big guy. They performed much better than Anthropic, to be honest, and are very budget friendly. Just my personal usage opinion. I'm curious why tools pick closed-source models over open-source ones. In your experience, why is that?
Well… when Gian and I started this project, we wanted to make it accessible to as many developers as possible. The problem with open-source models is that you sometimes have to run them locally or rely on a third-party service that provides an API. Hence, it's more practical for everyone to get a free API key, as Gemini and other services provide, and use that LLM through our RAG + MCP setup when vibe-coding with our framework. Gemini's free API tier is quite generous, so many people have access to it.
Hi everyone — excited to share that ICP Coder is now live and ready to use!
ICP Coder is an AI assistant for Motoko developers that provides context-aware code generation, search, and explanations right inside IDEs (Cursor, VS Code).
The team has launched a public service with user registration, API keys, and an NPM package for easier IDE setup — so you no longer need to run the project from localhost.
Just follow the Quickstart in the repo README and start using the tool right away.
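For context, MCP-capable IDEs such as Cursor and Claude Desktop register servers through a JSON config, and an entry looks roughly like this. The server name, package name, and env key below are illustrative only; the Quickstart in the README has the actual values.

```json
{
  "mcpServers": {
    "icp-coder": {
      "command": "npx",
      "args": ["-y", "icp-coder-mcp"],
      "env": { "API_KEY": "your-api-key" }
    }
  }
}
```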