Browser-based AI Chatbot Served From The IC

I am actually also looking at running Llama 2 locally before tinkering with it and trying to run it in a canister, since it has already been done (see the "Llama2.c LLM running in a canister!" thread). But I will take a different approach.