OpenChat doesn’t provide a .did file for the canister, so you need to call the corresponding interface in query mode.
The Vly project is working on this front. It will gradually release several features:
- An API that enables an AI Agent to send any token to anyone on social networks like X.com and TikTok. The key feature here is that recipients don’t need to own a crypto wallet in advance. The API is already publicly available, and it is being integrated into the Eliza repo as a plugin for any team to use.
- An authentication mechanism within the Vly wallet that lets a user approve an AI Agent to spend money on their behalf, up to an allowed amount. This will be showcased soon: you can tag the Vly agent and it will tip other people on X.com using crypto on your behalf.
- Another Eliza plugin that allows an AI Agent to control a Vly wallet. The agent will be able to send any token on any chain through a unified API while acting under the identity of an X account owner.
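To make the first feature concrete, here is a hypothetical sketch of what an agent plugin might send to the Social API. Vly’s real endpoint, payload shape, and auth scheme are not documented in this thread, so every name below (the URL, fields, and `buildTipRequest` helper) is an assumption for illustration only.

```typescript
// Hypothetical request shape for tipping a social-media user who may
// not own a crypto wallet yet. All names here are illustrative, not
// Vly's actual API.
interface TipRequest {
  platform: "x.com" | "tiktok"; // recipient's social network
  recipientHandle: string;      // e.g. "@someuser" -- no wallet needed
  token: string;                // token symbol, e.g. "ICP"
  amount: string;               // decimal amount as a string
}

// Build the JSON body an Eliza-style plugin might POST to the API.
function buildTipRequest(
  platform: "x.com" | "tiktok",
  recipientHandle: string,
  token: string,
  amount: string,
): TipRequest {
  if (!recipientHandle.startsWith("@")) {
    throw new Error("recipient handle must start with '@'");
  }
  return { platform, recipientHandle, token, amount };
}

// The plugin would then send it with the secret token, e.g.:
//   fetch("https://api.vly.example/tip", {        // placeholder URL
//     method: "POST",
//     headers: { Authorization: `Bearer ${secret}` },
//     body: JSON.stringify(buildTipRequest("x.com", "@alice", "ICP", "0.5")),
//   });
```

Keeping amounts as decimal strings rather than floats is a common choice in token APIs to avoid floating-point rounding, but whether Vly does this is, again, an assumption.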
We’d love to get involved in the ICP community’s work on contributing to Eliza, and to hear from you if you wish to use the Vly project in any other way.
Vly’s Twitter account is here.
You could also add ICP as a database “adapter”, so that all of the data is stored in some sort of canister instead of SQLite, Postgres, etc.
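The adapter idea above could be sketched as follows. This is a minimal illustration, not Eliza’s actual `DatabaseAdapter` interface (which has many more methods): the `Memory` shape, the method names, and the `MemoryCanister` actor interface are all assumptions, and an in-memory stub stands in for a real canister call.

```typescript
// Sketch: an Eliza-style database adapter whose storage backend is an
// ICP canister instead of SQLite/Postgres. All interfaces here are
// illustrative assumptions.
interface Memory {
  id: string;
  roomId: string;
  content: string;
}

// Stand-in for a canister actor (in practice this would be generated
// from the canister's Candid interface, e.g. via agent-js).
interface MemoryCanister {
  insertMemory(m: Memory): Promise<void>;
  getMemoriesByRoom(roomId: string): Promise<Memory[]>;
}

class IcpDatabaseAdapter {
  constructor(private actor: MemoryCanister) {}

  // Persist one memory record to the canister.
  async createMemory(m: Memory): Promise<void> {
    await this.actor.insertMemory(m);
  }

  // Fetch all memories for a conversation/room.
  async getMemories(roomId: string): Promise<Memory[]> {
    return this.actor.getMemoriesByRoom(roomId);
  }
}

// In-memory stub so the sketch runs without a live canister.
function makeStubCanister(): MemoryCanister {
  const store: Memory[] = [];
  return {
    async insertMemory(m) { store.push(m); },
    async getMemoriesByRoom(roomId) {
      return store.filter((m) => m.roomId === roomId);
    },
  };
}
```

In an agent flow this would back the memory/conversation store, which answers the question below about which part of the flow an adapter serves.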
What would the database adapter be used for? For which part of an AI agent flow?
Great that you are contributing to Eliza! How would I get a secret token? Would love to try out the Social API to see how it works.
I suggest that you update the Social API README on how to obtain one.
DMed the secret token and updated the GitHub page. Thanks!
Any good news about ICP AI agents?
Can you elaborate more on what you mean or what you are looking for in an “ICP AI agent”?
Is there any plan from DFINITY to incorporate existing AI Agent frameworks into ICP canisters?
If not, are there ongoing efforts to build ICP’s own agent framework?
What is DFINITY’s strategy regarding the AI Agent sector within the next 2 months?
Would love to see these questions answered, thanks!
I believe that James (Lucid) and @realdanmccoy are working on having an on-chain version of Eliza run in an ICP canister built on Azle.
What would you like to see?
There are a few areas that come to mind that the Developer Relations Team, at least, could prioritize a bit more, but I’m more than happy to hear your thoughts first.
Would love to see more info on this if it can be shared.
I understand that running a full-blown LLM like ChatGPT in a canister is still not feasible. Maybe we start with something simple.
We know an ICP canister can host the wallet of an AI Agent. At the same time, that same canister could access the API of, e.g., ChatGPT. That way, we can use ChatGPT as an external brain for the canister while keeping the wallet securely on-chain.
It would be great if there is some demo/example to showcase this.
At Aikin we are working on AI, including AI support that helps writers on Nuance, as well as an agent factory that will soon come out of stealth! We currently have an LLM running fully in a backend canister using onicai’s C++ framework, a client-side AI solution that runs in the browser using WebLLM, and a solution for accessing LLMs such as ChatGPT using HTTP outcalls.
Yes, however, you can run a smaller LLM as exemplified by projects such as Devinci and Onicai.
I think having an AI agent interacting with an ICP canister would be valuable. As for a canister accessing the API of ChatGPT: would developers be turned away by the cost of HTTPS outcalls? I think we’ve also talked about timeouts as well.
Do you have a timeline for when this will launch? Let us know and we’ll promote it more!
Hello all, I recently published a short article on AI agents on ICP:
The goal of these shorter articles is to capture SEO and drive more conversations in the wider community about AI agents.
A huge thanks to @Mar, @branbuilder, and the aaaaa-terminal and Omnity teams for giving us visibility into the work you are doing, either in this forum or elsewhere.
For part two, I think we can expand a bit more:
We think in about 3 weeks. In about a week we will start the comms to promote it. Exciting times!
Does the foundation have a plan to implement regular HTTP calls that are very cheap? I would imagine not every use case requires everything to go through consensus. There are many use cases, especially around social media, that are low-value enough that normal HTTP would be just fine.
I’ll have to ask. Do you mean the ability to make HTTP calls that don’t go through consensus? Ideally, the calls would be free to make?