The ICP developer docs now have an LLM (trained on the wiki, this forum, and other docs) to help developers build ICP dapps (and answer other questions).
Summary
Ahoy Folks,
This morning, the ICP developer docs website released an LLM chat widget on the developer docs. You can see it here: ICP developer docs.
Many folks have worked on updating documentation, unblocking developers, and creating examples, but a common piece of feedback we have received is that it is still not quite easy enough, so we wanted to give developers a better experience.
FAQs
1. What is this LLM trained on?
It is trained on the following:
ICP Wiki
ICP developer docs
ICP developer forum
and fine-tuned manually as folks give us feedback.
2. Where does the LLM live?
The LLM is powered by the nice folks at Kapa.ai. They made it easy for us to ingest the data (all of which is public) and train on it.
This is an important note: while the ICP website is hosted on-chain, the LLM is hosted off-chain on Kapa.ai. We don't want folks to get the wrong impression about where the LLM is hosted.
3. If I find an incorrect answer or want to recommend something, where can I give feedback?
You can give feedback on this very forum, but you can also use Discord, where folks like @domwoe or @kpeacock usually hang out.
4. What kind of questions should I ask it?
Anything. Here are some examples:
What is a Candid file?
How do I start building stuff?
What is the maximum memory of a canister?
What is the expiration time of an ingress message?
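To give a flavor of that first example, here is a minimal Candid (.did) interface file, the sort of thing the widget should be able to explain. This snippet is just an illustrative sketch, not taken from the docs:

```candid
// A minimal Candid interface: a canister exposing one
// query method that takes a text argument and returns text.
service : {
  greet : (text) -> (text) query;
}
```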
Kudos
Special props to @dfx-json and @domwoe for getting this across the line.
Are there any parallel projects in Motoko code generation, other than (Microsoft) Copilot? I've come across only MotokoPilot, which isn't released. Maybe it's time for the IC community to train its own Motoko model?
Thanks DFINITY! One thing I love about DFINITY is that it has always innovated amazing things. ChatGPT kept giving me wrong Motoko code snippets until I stopped using it. I hope Azle and Kybra are added as well; it will do us well.
Being relatively new to this forum, I am thrilled to delve into the ICP ecosystem with the newly launched LLM chat widget on the ICP developer docs website. However, during my interaction with the LLM, I have observed a few areas that could be fine-tuned to deliver an even more enriching user experience:
Chat History Retention:
Currently, the chat history disappears once the chat is closed, which can be inconvenient, especially when a user accidentally clicks outside the chat box, causing it to close. A feature to retain chat history or an option to save the chat would be a beneficial addition.
Auto-Scrolling During Answer Generation:
The auto-scroll feature that moves the user along with the generated content can be disruptive, especially when trying to read a particular line. It would be great to have control over the scrolling, allowing users to stay on the line they were reading without being auto-moved.
Response Limit:
It’s noticeable that upon engaging with the LLM, one can quickly hit a limit on the number of responses. It would be helpful to either increase this limit or provide a clear notification about the limitation so users can tailor their questions accordingly.
Expanded Resource References:
While the current resources are quite informative, adding references to external resources like the Motoko Bootcamp, Juno, among others, would broaden the spectrum of learning materials and examples available to developers. This could foster a richer understanding and exploration of ICP and dapp development.
I need to learn the details of the HTTP Gateway Protocol, and talking to the LLM to figure it out is so much better than reading the docs page directly. But the LLM does point me to that docs page for reference, so that was great too.
I wanted to personally thank you for leading this effort to add this AI Chat functionality to the documentation site of the IC. It helps a lot to find disparate pieces of information, and has already helped me!
Thanks man, to you and to the folks at Kapa.ai.
Could you share some of the story of how this happened, and what can it ingest? I was wondering if we could add relevant open-source material such as the Rust Programming Guide, or other Rust-related resources that would complement the good stuff we already have in the IC Docs.