Hi everyone, thank you for today’s call (2025.02.20). Special thanks to @ielashi for sharing the LLM Canister update. This is the generated summary (short version; please find the long version here):

The ICP LLM Canister was introduced as a new LLM service on the Internet Computer, currently supporting Llama 3.1 (8B parameters) with Rust and Motoko libraries. While the AI worker is centralized for now, decentralization is a key future goal. The system is stateless and uses the IC’s random beacon to add variability to responses; scalability improvements are being explored. Future plans include expanding language support, integrating Anthropic’s MCP for tool calling, and strengthening security and privacy. At ETH Denver, ICP teams will showcase live AI demos, take part in hackathons, and explore cross-chain AI collaborations.
Links shared during the call:
- New ICP LLM Canister: Introducing the LLM Canister: Deploy AI agents with a few lines of code
- MCP standard: Introduction - Model Context Protocol
- vLLM is the current industry standard for serving batch requests: Welcome to vLLM — vLLM
- ETH Denver ICP Events: ICP Events · Events Calendar
- Agents Day at ETH Denver: Agents Day - AI x Web3 | 🇺🇸 Denver 2025 · Luma
- Another Agents event: Agents Unleashed: Builders Night @ETHDenver · Luma
- ICP Telegram group for ETH Denver: Telegram: Join Group Chat
- ICP Projects at ETH Denver: ICP @ ETH Denver 2025 - Projects View - Google Sheets
- Outlier Ventures event at ETH Denver on the Post Web: Outlier Ventures' Open House presents The Post Web | AI x Web3 · Luma
- Environment for creating Fetch.AI agents: GitHub - JupiterM/Fetch-Ai-Vagrant: Code repository for a Vagrant box which installs a development environment for creating Fetch.AI agents.