I found this package on npm, but I have no idea how it works. Locally I have to install Ollama and pull a model before I can use it, but how the heck does it actually work on mainnet? Does it have, or connect to, an AI model that is already deployed in a canister? There is no explanation there.
Hi @AliSci, if you’d like to try the LLM canister with no setup, you can try the example on ICP Ninja here: https://icp.ninja/projects/llm-chatbot. Note that for now we only have Rust and Motoko examples on ICP Ninja.
If you need the JavaScript example, you can find it here.
No, I want to deploy it on my own canister on mainnet.
The LLM canister is already deployed on mainnet under the canister ID w36hm-eqaaa-aaaal-qr76a-cai. Do you want to deploy your own canister as a client of the LLM canister?
Yes, how do I do that?
You can check out the examples I shared above to see how to make your canister talk to the LLM canister. You can then download the code from ICP Ninja, or clone the repo to deploy the client to your own canister. Do you have a more specific question, or does that cover it?
As far as using the LLM canister on mainnet, it should “just work” when you deploy the available examples. If you’re curious how it works behind the scenes on mainnet, check out the “How Does it Work?” section in the README here. Note that the canister used for local development is slightly different from the binary actually used on mainnet.
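For reference, the client side of those examples boils down to a single call from your canister to the LLM canister. Below is a minimal Rust sketch, assuming the ic-llm crate’s prompt helper and the Llama3_1_8B model variant as used in the repo’s Rust example; the exact API may differ between crate versions, so check the crate docs.

```rust
use ic_llm::Model;

// Minimal update method that forwards a prompt to the LLM canister
// (w36hm-eqaaa-aaaal-qr76a-cai on mainnet) and returns the model's reply.
// Assumes the ic-llm crate's `prompt` helper; see the dfinity/llm examples
// for the chat-based variant with message history.
#[ic_cdk::update]
async fn prompt(prompt_text: String) -> String {
    ic_llm::prompt(Model::Llama3_1_8B, prompt_text).await
}
```

Deployed to mainnet, this call goes straight to the LLM canister mentioned above; in local development it goes to the llm canister defined in your dfx.json, which is what forwards requests to your local Ollama.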
From my local environment, can I connect directly to the deployed AI canister instead of my local Ollama?
I want to test it without downloading Ollama, because that may take a long time, and I may decide later not to use Ollama with the DFINITY LLM anyway.
I am using the following in my dfx.json:

```json
"llm": {
  "candid": "https://github.com/dfinity/llm/releases/latest/download/llm-canister-ollama.did",
  "type": "custom",
  "specified_id": "w36hm-eqaaa-aaaal-qr76a-cai",
  "remote": {
    "id": {
      "ic": "w36hm-eqaaa-aaaal-qr76a-cai"
    }
  },
  "wasm": "https://github.com/dfinity/llm/releases/latest/download/llm-canister-ollama.wasm"
},
```
and my backend canister entry is set to:

```json
"backend": {
  "candid": "src/backend/backend.did",
  "package": "backend",
  "type": "rust",
  "dependencies": [
    "llm"
  ]
},
```
But I got this error:
However, I tried to follow the extra step of adding

```json
"type": "pull",
"id": "w36hm-eqaaa-aaaal-qr76a-cai"
},
```

to my dfx.json, but I got