Use an LLM to help build on ICP [Feedback appreciated!]


The ICP developer docs (trained on the wiki, this forum, and other docs) now have an LLM to help developers build ICP dapps (and answer other questions).


Ahoy Folks,

This morning, the ICP developer docs website released an LLM chat widget. You can see it here: ICP developer docs.

Many folks have worked on updating documentation, unblocking developers, and creating examples. A common piece of feedback we have received is that it is still not quite easy enough, so we wanted to give developers a better experience.


1. What is this LLM trained on?

It is trained on the following:

  • ICP Wiki
  • ICP developer docs
  • ICP developer forum

and fine-tuned manually as folks give us feedback.

2. Where does the LLM live?

The LLM is powered by some nice folks who made it easy for us to ingest the data (all of which is public) and train on it.

This is an important note because we should be clear: while the ICP website is hosted on-chain, the LLM itself is hosted off-chain. We don't want folks to get the wrong impression about where the LLM is hosted.

3. If I find an incorrect answer or want to recommend something, where can I give feedback?

You can give feedback on this very forum, but you can also use Discord, where folks like @domwoe and @kpeacock usually hang out.

4. What kind of questions should I ask it?

Anything. Here are some examples:

  1. What is a Candid file?
  2. How do I start building stuff?
  3. What is the maximum memory of a canister?
  4. What is the expiration time of an ingress message?
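As a quick illustration of the first example question: a Candid file (`.did`) describes a canister's public interface in ICP's interface description language. A minimal, generic sketch (the `greet` method is just an illustration, not from any specific project):

```candid
// A minimal Candid interface: one query method
// that takes a text argument and returns text.
service : {
  greet : (text) -> (text) query;
}
```

A client in any language can use this interface to call the canister without knowing how it is implemented.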


Special props to @dfx-json and @domwoe for getting this across the line.



Did you train it also on The Azle Book, The Kybra Book, and the examples in their repos?


No, but that’s a good idea.

We originally started with the dev forum, and then added the dev docs and wiki after. So the tale grew in the telling.

We’ll add those as well. Good idea, @lastmjs.


I will try it out. Very cool news!

Can the icpp-pro docs also be added to the training data?

I have three more questions:

  • how often is the LLM retrained?
  • are you considering adding public code and demo repos?
  • which LLM is the basis for the fine-tuned LLM?

Can the icpp-pro docs also be added to the training data?

Yes, we'll add it.

  • how often is the LLM retrained?

My understanding is that it is retrained on a regular schedule, but a manual retrain can be done more often. I need to double-check this.

are you considering adding public code and demo repos?

which LLM is the basis for the fine-tuned LLM?

I believe the service uses OpenAI under the hood (but I also need to double-check this).


This stuff is RAD. Huge props!

ICP is amaaaaaaazing, I just recently got into the ecosystem and I already don’t know how I lived without all this :slight_smile:
Truly the next gen.


I’m not sure if the small black widget on the bottom right corner is eye catching enough. It took me several seconds to find the widget on the page.


Agreed. Will improve it.


Are there any parallel projects for Motoko code generation, other than (Microsoft) Copilot? I’ve only come across MotokoPilot, which isn’t released. Maybe it’s time for the IC community to train its own Motoko model?


Thanks, DFINITY. One unique thing I love about DFINITY is that it has always innovated amazing things. ChatGPT has always given me wrong Motoko code snippets, until I stopped using it. I hope Azle and Kybra are added; it will do us well.



Being relatively new to this forum, I am thrilled to delve into the ICP ecosystem with the newly launched LLM chat widget on the ICP developer docs website. However, during my interaction with the LLM, I have observed a few areas that could be fine-tuned to deliver an even more enriching user experience:

  1. Chat History Retention:

    • Currently, the chat history disappears once the chat is closed, which can be inconvenient, especially when a user accidentally clicks outside the chat box, causing it to close. A feature to retain chat history or an option to save the chat would be a beneficial addition.
  2. Auto-Scrolling During Answer Generation:

    • The auto-scroll feature that moves the user along with the generated content can be disruptive, especially when trying to read a particular line. It would be great to have control over the scrolling, allowing users to stay on the line they were reading without being auto-moved.
  3. Response Limit:

    • It’s noticeable that upon engaging with the LLM, one can quickly hit a limit on the number of responses. It would be helpful to either increase this limit or provide a clear notification about the limitation so users can tailor their questions accordingly.
  4. Expanded Resource References:

    • While the current resources are quite informative, adding references to external resources like the Motoko Bootcamp, Juno, among others, would broaden the spectrum of learning materials and examples available to developers. This could foster a richer understanding and exploration of ICP and dapp development.

Great feedback. Thank you, @timiv1. Let me see what we can do on this.


Can it be trained on this as well? GitHub - dfinity/examples: Example applications, microservices, and code samples for the Internet Computer


Yes, good call, Timo. Will add those.


Update on adding sources:

Added:

  1. The Kybra Book
  2. The Azle Book

Not currently possible:

  1. GitHub - dfinity/examples: Example applications, microservices, and code samples for the Internet Computer - not possible; only Markdown files are supported for ingestion, not Rust, C, or Motoko code.
  2. Code examples from the Kybra and Azle repos (only those in the docs)
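Since only Markdown can be ingested, one conceivable workaround (purely a sketch; this is not how the actual pipeline works, and the extension-to-language mapping below is an assumption) is to wrap each source file in a fenced Markdown code block before handing it to the ingester:

```python
from pathlib import Path

# Assumed mapping from file extensions to Markdown fence language tags.
FENCE_LANG = {".rs": "rust", ".mo": "motoko", ".c": "c", ".py": "python"}

def source_to_markdown(path: Path) -> str:
    """Wrap a source file in a fenced Markdown code block so that a
    Markdown-only ingestion pipeline can index its contents."""
    lang = FENCE_LANG.get(path.suffix, "")
    code = path.read_text(encoding="utf-8")
    # Use the file name as a heading, then fence the code verbatim.
    return f"# {path.name}\n\n```{lang}\n{code}\n```\n"
```

Running this over a repo and feeding the resulting `.md` files to the crawler would at least get the code text indexed, though without any of the surrounding documentation context.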

@diegop ,

Will the system crawl the docs site during training or do we need to provide the full list of pages?


It crawls all the pages. I manually verified the pages within the domain.


Wow, it is working great!

I need to learn the details of the HTTP Gateway Protocol, and talking to the LLM to figure it out is so much better than reading the docs page directly. But the LLM does point me to that docs page for reference, so that was great too.



I wanted to personally thank you for leading this effort to add this AI Chat functionality to the documentation site of the IC. It helps a lot to find disparate pieces of information, and has already helped me!

Thanks, man, to you and to the folks who built it.

Could you share some of the story of how this happened, and what it can ingest? I was wondering if we could add relevant open-source material such as the Rust Programming Guide, or other Rust-related resources that would complement the good stuff we already have in the IC Docs.
