@Gamris ,
Actively working on it!
- see icpp-llm, which lets you run the llama2.c model in a canister. I've tested it with the TinyStories model so far, and I'm working on larger models.
- see also this forum post
Is this what you’re looking for?