Call for Participation: NFT Token Standard Working Group [Status Updated]

NFT Token Standard Working Group

We at DFINITY are dedicated to providing better developer experiences and driving adoption. Together with members of our great community, we created the first fungible token standard, ICRC-1. The standard is seeing increasing adoption and is paving the way for easier integrations. We’re looking to further these efforts by creating a working group for an NFT standard, and we invite all members of our community to participate and contribute.

Summary

  • Call for participation open to everyone
  • Starting with weekly meetings:
    Tue 29th Nov @ 17:00 CET (calendar)
    The meetings will be bi-weekly after the initial phase.
  • GitHub will be used as the main discussion board. A link will follow shortly.

Goal

The goal of this working group is to create a baseline NFT standard to facilitate development and integration. Extensions are foreseen for the future.

Organisation

This working group is derived from the Ledger & Tokenization Working Group, but we decided to form a new working group because this standard potentially appeals to a different audience. We will follow the format of the Ledger & Tokenization Working Group.

17 Likes

That’s very interesting! What can we do to participate?

We would like to contribute to making physical assets tokenizable under the Romano-Germanic civil law that is currently being amended in some countries.

Hi, you can participate by joining the WG meetings.

Regarding Transaction History

The working group is currently assessing the metadata to be associated with transaction history. Here I advocate that we strive for feature parity with CAP, the incumbent with overwhelming market capture.

The fact that Psychedelic is stepping back at the same time this group attempts to standardize transaction history presents an opportunity to ship an implementation with very broad near-term impact. As a caveat, CAP isn’t perfect, of course, and we should be comfortable diverging from it.

General Lessons From CAP

  • Motoko and Rust libraries make it trivial to add tx history to your canister.
  • A hugely flexible data model is future-proof and extensible, but it also means unreliable data.

Transaction History Events

CAP employs an incredibly flexible data model, allowing developers to store basically anything they want:

public type Event = {
    time : Nat64;
    operation : Text;
    details : [(Text, DetailValue)];
    caller : Principal;
};
public type DetailValue = {
    #I64 : Int64;
    #U64 : Nat64;
    #Vec : [DetailValue];
    #Slice : [Nat8];
    #Text : Text;
    #True;
    #False;
    #Float : Float;
    #Principal : Principal;
};

Developers made typos and forgot to add certain values, so the data ended up very inconsistent.

The flexible model also provides extensibility, which may be a critical property. Consider the exercise of trying to determine a finite set of operations: ICRC has mint, transfer, and burn, but what about sale and list? Can we reliably predict future use cases, such as rent, delist, fractionalize, and so on? Should the standard leave room to support innovative tx history use cases?

Some Real CAP Event Types

I would recommend adding the “sale” event type to ICRC, considering the importance of marketplaces to the NFT ecosystem (sorry if this is already included; I couldn’t find the reference). Here’s an example CAP event:

// Sale (BTC Flower)
{
  "time": 1670588669701,
  "operation": "sale",
  "details": [
    [
      "to",
      {"Text": "58a0fcbd3ae8d5589f9b7e3208b979867b44d04b8d4d1bfe2ede9f641973243f"}
    ],
    [
      "from",
      {"Principal": "jqazm-bdzbj-hnq3o-yexug-vqu3v-jonju-7ngcp-4yzbw-jyux4-lqejy-mae"}
    ],
    [
      "price_decimals",
      {"U64": 8}
    ],
    [
      "price_currency",
      {"Text": "ICP"}
    ],
    [
      "price",
      {"U64": 18000000000}
    ],
    [
      "tokend_id",
      {"Text": "jz2z7-3akor-uwiaa-aaaaa-beaag-maqca-aaaq7-a"}
    ]
  ],
  "caller": "s4d6r-c23m5-5gapy-myo3t-7pgtm-pdf6g-3nepw-hfcnc-2ohyg-p6qqq-rae"
}

A “mint” event can also have a purchase price associated with it.

Some projects opted to add the “memo” field:

// Transfer (Cronics)
{
  "time": 1670345256752,
  "operation": "transfer",
  "details": [
    [
      "token",
      { "Text": "oz5mx-cykor-uwiaa-aaaaa-b4aaq-maqca-aacq3-a" }
    ],
    [
      "to",
      { "Principal": "icdsn-5xpay-zrct4-of2q2-hyhrs-ftoyx-se5w7-xhzri-psrh4-ksiyr-bqe" }
    ],
    [
      "from",
      { "Principal": "tetgr-gbsy6-6hauq-s64fd-4qbcs-tkzqq-gmn26-3u5am-6bmn5-g2wi4-xqe" }
    ],
    [
      "memo",
      { "Slice": { "0": 0, "1": 0, "2": 0, "3": 0, "4": 0, "5": 0, "6": 0, "7": 0, "8": 0, "9": 0,"10": 0,"11": 0,"12": 0,"13": 0,"14": 0,"15": 0,"16": 0,"17": 0,"18": 0,"19": 0,"20": 0,"21": 0,"22": 0,"23": 0,"24": 0,"25": 0,"26": 0,"27": 0,"28": 0,"29": 0,"30": 0,"31": 0 } }
    ],
    [
      "balance",
      { "U64": 1 }
    ]
  ],
  "caller": "tetgr-gbsy6-6hauq-s64fd-4qbcs-tkzqq-gmn26-3u5am-6bmn5-g2wi4-xqe"
}

Looking forward to the next discussion! I hope this provides some useful reference.

4 Likes

As the NFT Working Group is coming close to consensus on the minimal standard, I invite all members of the community to review and post comments here on the forum, as well as under the GitHub issue.

The draft text of the standard is here: ICRC/ICRC-7.md at repo-init · dfinity/ICRC · GitHub

Latest slide on pending issues: NFT WG meeting 20230214 - Google Slides

The next WG meeting is on Feb 28, please participate if you’re interested.

5 Likes

Hi IC NFT People! I’d like to reiterate some points we brought up during the recent working group meeting. There are a few things we’d like to explore for the ICRC-7 standard:

  1. A GraphQL interface for canister metadata: We propose creating a GraphQL interface for canister metadata to provide a flexible and efficient way for developers to query information about a canister’s state. This is important since NFT metadata can be quite large.

  2. JSON HTTP interface for the canister: We suggest implementing a JSON HTTP interface for canisters to enable developers to interact with them over the web without needing agent-js. This would allow developers to use a wide range of programming languages and tools to build applications for NFTs on the IC (see the sketch after this list). References here and here.

  3. Transaction history similar to ICRC-3 and compatible with CAP if possible.

  4. Permit transfer similar to ICRC-2.
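
As a rough illustration of point 2, here is a minimal Motoko sketch of a canister serving hand-rolled JSON over the IC’s HTTP gateway interface; the hard-coded token and the routing-free handler are placeholders, not part of any proposal:

import Text "mo:base/Text";

actor {
  // Types of the IC HTTP gateway interface.
  type HeaderField = (Text, Text);
  type HttpRequest = {
    method : Text;
    url : Text;
    headers : [HeaderField];
    body : Blob;
  };
  type HttpResponse = {
    status_code : Nat16;
    headers : [HeaderField];
    body : Blob;
  };

  // Serve JSON for a hard-coded token so web clients can read it without
  // agent-js; real routing and metadata lookup are omitted.
  public query func http_request(_req : HttpRequest) : async HttpResponse {
    let json = "{\"token_id\": 1, \"name\": \"Example NFT\"}";
    return {
      status_code = 200;
      headers = [("Content-Type", "application/json")];
      body = Text.encodeUtf8(json);
    };
  };
};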

If you have specific comments or insights regarding those points please don’t hesitate to contribute.

6 Likes

Hi there, when will we be able to deploy NFT canisters based on the ICRC-7 standard on the IC? :grinning:

Soon™ :grinning:
Realistically, the next steps are: a reference implementation, potentially some modifications to the draft standard along the way, and a vote on the standard.

4 Likes

Hi everyone
What’s the status of the standard, reference implementation, etc.? What are the main obstacles, problems, and blockers?

1 Like

I’m sorry for the delay on this one…I was able to take some time off with my youngest son while my older kids were away at camp, and then managed to get in a week away with my wife, which is a bit of a rarity these days…thanks for the patience.

I took a stab at ICRC8, as I promised the working group a month or so ago: a more robust NFT standard that aims to use the power of the IC to fix some of the inherent problems in “standard” NFTs that we’ve inherited from Ethereum.

So far I’ve just tackled the query side of things, as I think that gives us plenty to talk about in this week’s meeting.

3 Likes

Hi everyone!
Just added a few considerations/issues on the standard in the GitHub issues page (ICRC-7: Minimal Non-Fungible Token (NFT) Standard · Issue #7 · dfinity/ICRC · GitHub). I think there are still some open issues to think about.
At the moment I’m working on a Motoko implementation of the ICRC-7 standard (well, the draft). I hope to make it available to everyone as soon as possible.

2 Likes

I assume TransferError.TemporarilyUnavailable can be used to protect against DoS attacks?
The implementation could, for example, allow only X transfers per hour and return a TemporarilyUnavailable error if that limit is exceeded.
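
For example, a minimal Motoko sketch of that idea, assuming a TransferError variant like the draft’s and an illustrative per-hour limit:

import Time "mo:base/Time";

actor {
  public type TransferError = {
    #TemporarilyUnavailable;
    #GenericError : { error_code : Nat; message : Text };
  };

  // Illustrative limit; not part of the draft standard.
  let maxTransfersPerHour : Nat = 100;
  let hourNanos : Int = 3_600_000_000_000;

  var windowStart : Int = 0;
  var transfersInWindow : Nat = 0;

  // Returns ?#TemporarilyUnavailable once the hourly budget is used up;
  // a transfer implementation would call this before doing any other work.
  func checkRateLimit() : ?TransferError {
    let now = Time.now();
    if (now - windowStart > hourNanos) {
      windowStart := now;
      transfersInWindow := 0;
    };
    if (transfersInWindow >= maxTransfersPerHour) {
      return ?#TemporarilyUnavailable;
    };
    transfersInWindow += 1;
    null
  };
};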

1 Like

Hey all,

I made some changes and added the updated functions to the ICRC8 proposal…not sure if there will be time to talk about it tomorrow or not.

I decided to include a general mirror of the ICRC7 transfer, transferFrom, and approve flow even though, in my opinion, they break NFTs. :grimacing: It will be up to the implementation to decide whether to implement these.

1 Like

A few notes on our implementation of ICRC7 for the origyn nft here: ICRC-7: Minimal Non-Fungible Token (NFT) Standard · Issue #7 · dfinity/ICRC · GitHub

ICRC7 Implementation Notes

Last week we implemented the current draft of the ICRC7 standard into the ORIGYN NFT.

While we have been working with the working group to help develop ICRC7, ORIGYN Foundation has taken the stance since the beginning that the general principles behind ICRC7 exacerbate the inherent issues with NFTs that arose out of the limitations of the Ethereum architecture, and that the group should instead focus on an NFT standard that utilizes the full potential of the IC ecosystem.

The group is now discussing that with ICRC8 and we’re progressing well. Since the group wanted a more ETH-like standard in ICRC7, we participated and contributed to its development as best we could. We have every intention of supporting whatever standards are approved, just as we already support the EXT standard and the DIP721 standard. In that spirit, we undertook to implement the ICRC7 standard on top of the existing implementation of the origyn_nft standard. As is usually the case, some issues with a system won’t be seen until you actually try to implement it. With that in mind, the following is our implementation report of ICRC7, which we hope will be helpful in finalizing the standard.

icrc7_owner_of query - It seems odd that this does not have a result response. Since it is possible to request the owner of a token_id that does not exist, the best we could do was trap if the token doesn’t exist. It may make more sense for this function (and perhaps the other query functions that let you request a response for a token id that may not exist) to have a return type of:

{
  #Ok : return_type;
  #Err : {
    #NotFound;
    #GenericError : {
      message : Text;
      error_code : Nat;
    };
  };
}

Metadata Limitations - We’ve proposed elsewhere the ICRC16 standard for recursive, Candid-like, extensible metadata. We’d repropose it be considered here. We recently updated it so that it can be a supertype of the ICRC3 transaction event type metadata, and it makes sense to do the same here. We didn’t really have the control we wanted here (for example, we have a manager array, and with no way to return a list of managers we were faced with having to use “com.origyn.manager.1”, “com.origyn.manager.2”, “com.origyn.manager.X…”). There is also no Principal type, which is odd considering how much principals are used on the IC (you can convert to Blob, but it is going to be really messy to read in transaction logs; Text is another option but takes more space). So we chickened out and have one “metadata” field that is the JSON text representation of our nested, recursive metadata.

Versions evolve - icrc7_supported_standards returns a vec of (Name, URL). It may be that some standards evolve and thus end up with a version number (we are already on origyn_nft v0.1.5, which does have some differences). We think this schema can support that with different URLs, but a version field might be worth a second consideration.

No way to get all token IDs - The collection metadata query does not return the token IDs, and there is no other query function to do so. We’d recommend an icrc7_token_ids(opt {skip: Nat; take: Nat}) query to allow paginating IDs. I think there is an assumption that token IDs will be sequential, from 0 to total_supply-1, but this is not how we implement it. origyn_nft has text-based token_ids for readability and human recognisability. We convert these strings to a Nat when we implement ICRC7 or DIP721, and they are not sequential.
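
For illustration, a rough Motoko sketch of such a pagination query; the names and the in-memory list are made up, not part of the draft:

import Array "mo:base/Array";
import Nat "mo:base/Nat";

actor {
  // Illustrative in-memory list of token IDs; a real ledger would use its own storage.
  stable var tokenIds : [Nat] = [];

  // Suggested pagination query: return a page of token IDs, or everything
  // when no paging argument is given.
  public query func icrc7_token_ids(page : ?{ skip : Nat; take : Nat }) : async [Nat] {
    let (skip, take) = switch (page) {
      case (?p) { (p.skip, p.take) };
      case null { (0, tokenIds.size()) };
    };
    let end = Nat.min(skip + take, tokenIds.size());
    if (skip >= end) { return [] };
    return Array.tabulate<Nat>(end - skip, func(i : Nat) : Nat { tokenIds[skip + i] });
  };
};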

Is_atomic in transfer args - We do a number of potentially async functions when transferring around NFTs, including KYC, paying royalties, recognizing deposits, etc. Atomicity is not something we can support, and we’d argue that any sufficiently interesting or complex system with real-world utility is going to have the same problem on the IC. Encouraging that atomicity is possible/reliable is creating a very hard-to-untie knot for developers. This can become especially troublesome for generic 3rd party services that may assume you have atomicity when you actually don’t. We chose to reject any request for transfer where atomicity is requested.

Multiple token transfers but singular response - We keep separate ledgers for each token id. We could support batch transfer requests (and in fact do support this in the origyn_nft standard), but each transfer will have its own response. The fact that the standard assumes a singular response from this function is odd. As a result, we are currently rejecting any request for more than one token_id at a time.

Approve - Origyn NFTs treat an existing escrow of tokens as the approval of a transfer in a market transaction. As a result, this function doesn’t have enough information in the event args to make sense to an origyn_nft, and we trap automatically.

Metadata unification - The Collection metadata is hard coded to an object type, but the NFT metadata is extensible. Perhaps the Collection metadata should use the same format as NFT metadata?

Royalties - Royalties were awkward to implement. We support a menu of royalty recipients and distinguish between primary and secondary markets, while ICRC7 lets you pick one pay-to account and one amount. We chose to sum up the secondary royalties and created a sub-account on the canister to direct these to. We can, in the future, create a system for distributing anything sent to this account, but given that all transfers happen through our in-NFT market mechanism, the royalties would be auto-distributed; as a result, anything sent here was probably done by mistake. We’ll propose a better way to report this info in ICRC8, though there may not be a solution here because the demand is for simplicity. One option might be to just move this to collection metadata, if that is adjusted to be extensible, considering this standard is only asking marketplaces to honor it and there is no actual assumed enforcement mechanism.

Supply Cap - We don’t really have this concept (supply is controlled by the minter, so if you wanted some kind of capped supply you’d need to program that into the minting contract). We return null.

3 Likes

A few notes on what we want the spec to contain:

Add: icrcX_metadata_bulk([nat]), in case a canister receives 100 NFTs and wants to fetch all of them; having to make 100 queries instead of 1 makes that hard to accomplish. Actually, scratch that, it’s not enough.

In general, I believe it’s a good idea to allow other canisters to “sync” the NFTs they want and work with them locally. What will that look like:

icrcX_tokens_meta_of(Account, last_checkpoint, limit) : (vec (nat, Metadata)) query;
Notice you get the metadata too, not only the token id. last_checkpoint points to an index of last_updated+id. You get only limit results; if you receive fewer than the limit, it’s the last page, and if not, you keep on pulling more. This means that if an account has 1000 NFTs and you send it one, it doesn’t have to fetch them all again; it will get only what changed. That kind of sync is used by RxDB and is pretty easy to handle on the IC. It’s easy to implement that index with RxMoDb.

Similarly:
icrcX_metadata_sync(last_checkpoint, limit) : (vec (nat, Metadata)) query; returns a list of all NFTs and their Metadata ordered by last updated. Another canister can use that to copy the whole NFT memory and keep it synced cheaply. Once everything is copied, it will only get what’s changed and won’t have to fetch everything again.
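
To make the pull model concrete, here is a rough Motoko sketch of a consumer canister syncing from such a ledger; the interface follows the suggestion above, and the checkpoint bookkeeping is deliberately simplified:

import Array "mo:base/Array";

actor Syncer {
  type Metadata = Text; // placeholder for whatever metadata encoding gets standardized

  // Interface of the hypothetical NFT ledger we sync from.
  type Ledger = actor {
    icrcX_metadata_sync : shared query (Nat, Nat) -> async [(Nat, Metadata)];
  };

  stable var localCopy : [(Nat, Metadata)] = [];
  stable var lastCheckpoint : Nat = 0;

  // Pull pages ordered by last_updated until a short page signals we are caught up.
  public func sync(ledgerId : Text) : async () {
    let ledger : Ledger = actor (ledgerId);
    let limit : Nat = 100;
    label pull loop {
      let page = await ledger.icrcX_metadata_sync(lastCheckpoint, limit);
      // Naive merge: append everything; a real sync would upsert by token id and
      // derive the next checkpoint from the last item's last_updated index.
      localCopy := Array.append(localCopy, page);
      lastCheckpoint += page.size();
      if (page.size() < limit) { break pull };
    };
  };
};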

Not 100% sure about the next one:
icrcX_metadata_all_indexes() : [Text]
icrcX_metadata_indexed(index_name:Text, ?lastid, limit) : (vec (nat, Metadata)) query; Will do something similar, but allow the NFT canister to keep custom indexes. This is helpful for UIs when they want to get all NFTs with a certain ordered attribute.

Metadata:
Since we can now serialize from and to Candid, I wonder if we shouldn’t just keep normal Candid data structures inside the canister and, when giving them to the outside world, have the option to serialize them to a blob. UIs in JS can easily handle a dynamic meta schema. Canisters can’t, so for them blobs are better: they will be able to store the metadata without knowing what it is, and if they know what the schema is, they can parse and work with it.

An NFT can actually be split into multiple meta schemas [(MicroSchemaId, Blob)]. Micro schema IDLs would be stored on-chain and also standardized. This way a group of games can agree on a micro schema for various item types like avatars, swords, etc. A canister that wants to work with these will fetch them, add them to its code, decode the blobs, and use them. Additionally, NFT collections can add micro schema blobs to their NFTs without the need to update canisters.

That’s something Candy solves partially right now. I guess what it doesn’t solve is this: if you receive Candy and you don’t know what’s inside - perhaps you don’t have the schema for working with game items for a certain game - then you still can’t really work with the metadata. You can traverse it, but not do anything meaningful; you can only work with the fields you know about. Multiple micro schemas will let you work with what you know and ignore what you don’t know, the same way as Candy. You can also have a Candy micro schema and just put it all inside. I guess it’s something that needs experimenting with. I may create another ICRC standard with all these ideas. Maybe we will converge on a later version.

If we add a [MicroSchemaId] field to the above queries, the client will be able to request exactly what it wants. A game can do something like this:
icrcX_tokens_meta_of(Account, ['author', 'managers', 'vehicle-v3'], last_checkpoint, limit) : (vec (nat, Metadata)) query;
And receive:
“author”, { author: …, license: … },
“managers”, [ {principal, permissions}, {principal, permissions} ],
“vehicle-v3”, { model: …, year: …, parts: … ,attachments:… }
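
As a tiny illustration of the serialize-to-blob idea using Motoko’s to_candid/from_candid, with a made-up vehicle-v3 micro schema:

actor {
  type MicroSchemaId = Text;

  // Hypothetical micro schema a group of games might agree on.
  type Vehicle = { model : Text; year : Nat };

  // Pack typed metadata into an opaque blob; canisters that don't know the
  // schema can still store and forward it.
  func packVehicle(v : Vehicle) : (MicroSchemaId, Blob) {
    ("vehicle-v3", to_candid (v))
  };

  // A canister that does know the schema can decode it again; from_candid
  // yields null if the blob doesn't match the expected type.
  func unpackVehicle(b : Blob) : ?Vehicle {
    from_candid (b)
  };
};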

1 Like

I think ICRC8 is pretty powerful and allows the NFT canister to do more than just be a ledger. It’s what I am doing with Anvil too: the marketplace functionality is inside, not in another canister.

How will this functionality get handled in what I have in mind:
Let’s take the Dutch auction functionality in ICRC8 for example:

It adds two new features (shown as images in the original post) to AskFeature (which is something like the micro schema). As far as I can tell, it adds transaction types and actor functions too (also shown as images).

But then that’s not the only auction type; there are many different ones.

On one side, it’s good to focus on something that works and let it spread across the whole ecosystem. On another, it’s disabling innovation. You will need another standard to add another type of auction.

One way to solve this is by keeping the ledger inside one canister and also providing other auxiliary canisters which sync the ledger and provide these functions. They can still be owned by the creator of the ledger canister, but someone who doesn’t want to develop their own Dutch auction can also buy a license for a canister doing it and attach it to their ledger. (We will have to check implementation specifics to know whether it will be secure and fast, though.) Then you will need standards for each auction canister too.

Another way - have micro schemas/ micro standards that work inside IcrcX (not sure what number to put) and provide:

  • additional transactions
  • additional functions
  • additional metadata

IcrcX is made so it can contain and work with these without knowing them. The canister implementation can know their schemas and work with them, but the standard doesn’t need to. For the standard, they are blobs and pointers to what these blobs mean.

So in theory you should be able to split ICRC8 into one IcrcX and ~10 micro schemas and get the same functionality inside one canister.

With something like what Neutron is doing - compiling multiple modules inside one canister - you can probably offer a good dev experience (devs won’t need to reimplement everything on their own), and it will allow devs to provide modules and upgrades. If these aren’t pluggable modules, the implementations may differ, and even though there is a standard, things may work differently. But then you will have to build these systems for each IC language.

If the functionality is added in auxiliary canisters, then you have to deal with asynchronicity and multiple canisters, but your NFT can be a mix of Rust, Motoko, and Azle.

If the functionality comes from 3rd-party canisters, then again you deal with asynchronicity, but you also add another party that needs trust and takes a cut from the tokens going through it. These contracts can be immutable and take fees from usage, or only add NFT collections which pay a fee, but if things change or there are bugs, you will need to spawn another canister. Your immutable canister can’t benefit from IC CDK improvements. You can’t handle the whole ecosystem with one canister, so you will need a swarm. If you want to make that small piece of functionality SNS-controlled, then we are going to end up with hundreds of small-cap DAOs which someone can take over to break the whole thing.
A lot of strategic trade-offs to be made. Let me know what you prefer.

2 Likes

We talked a lot about this on the last call. The main point of contention was around whether ICRC7 should have approvals or not. For Fungies we separate this into ICRC1 and ICRC2 (mostly as a legacy of the ICP ledger). You can go a bit overboard with this, as I could argue that transfer is not needed for soulbound NFTs. So should we split everything into different numbers?

Well, there are no standards police, and I’d argue that the end state justifies whatever we want to do. The suggestion was to leave ICRC7 as it is and define some “supports” hints in the standard that a developer can deliver to wallets that might want to know what is supported and what isn’t, i.e.:

icrc7_no_transfer
icrc7_no_approval

If you query ‘ICRC7_supports’ and you get these, you should not make the transfer and approve options available.
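
A minimal sketch of that hint mechanism; the method name and entries here are just illustrative, nothing is settled:

actor {
  // Hints a wallet could query before deciding whether to show transfer/approve
  // UI for this collection.
  public query func icrc7_supports() : async [Text] {
    ["icrc7_no_transfer", "icrc7_no_approval"]
  };
};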

I’d say the same is true for ICRC8 (which we are just at the beginning of). Likely we’d need a few support entries for different marketplace features.

ICRC8_wait_for_quiet
ICRC8_dutch
ICRC8_royalties

ICRC8 also calls for some definition around metadata…we have opinionated metadata entries in origyn_nft (like how royalties are defined, and other system variables) and those likely need to be defined as well, so you can tell a canister or service whether they should be looking for these items (maybe we can use the ICRC16 schema that @ZhenyaUsenko has been developing for part of this).

So a standard is more than just the definitions of the actor functions and types. We may need to define metadata schemas, support definitions, etc.

As far as those other sale types go, we can add them…maybe they aren’t ICRC8, but a set of them may be ICRC9, and with the ask system using variants, they can just be additions to the variants without having to add any more functions. A dev will, I think, be able to use aggregated types:

public type AskFeature = ICRC8.AskFeature and ICRC9.AskFeature (do ands combine variants as well as properties? I hope so).
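
If I remember Motoko’s type operators right, 'and' intersects while 'or' unions, so combining two variant AskFeatures would probably need 'or'; a toy sketch of that assumption:

// Toy sketch, assuming Motoko's 'and'/'or' type operators:
// 'or' unions variant cases, while 'and' keeps only the common ones.
type AskA = { #dutch; #fixed };
type AskB = { #fixed; #waitForQuiet };

type Combined = AskA or AskB; // { #dutch; #fixed; #waitForQuiet }
type Common = AskA and AskB;  // { #fixed }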

As far as your microschema metadata-of function goes, there is a bit of this already in the icrc8_nft_info:

public type NFTFieldRequest = {
    #current_ask;
    #current_offers : (?Nat, ?Nat); //skip, take
    #metadata: Text; //candypath
    #transaction_count;
    #transactions: (?Nat, ?Nat);
    #owner;
    #managers;
    #allocated_storage;
    #available_storage;
    #minted_date;
    #burned_date;
  };

The fields that need pagination have it…and as for metadata, it allows a candypath to select different things, so likely the standard would need to define the paths that should be available, which are optional, etc…likely they should have corresponding names and can be added to the ‘supports’ query as well. (There may be a better solution for supports than just a vec of Text…one thing I’d been thinking of was versioning…some of these ICRCs may need to evolve…or maybe we need to be hardened and, instead of replacing 8 with an 8-1, you go grab the next ICRC available (say 325) and don’t do versioning.)

There is definitely a path forward here that is Neutron-y…but maybe not full Neutron? If these features can be built OpenZeppelin-style then I could see a tool where you just check the features you need and it spits out some starter code. But I do also like the thought of being able to construct with features…we’d have to think about the kernel a bit and how data sharing works.

1 Like

Ok, this will work for Metadata I suppose, if a canister doesn’t know about all of the features, since they are variants.
Adding other icrc* functions will also work.
But having variants in TransactionRecord that the client doesn’t know about, won’t that produce an error in Motoko?
Maybe if
#transactions: (?[TransactionRecord], Nat); becomes
#transactions: (?[?TransactionRecord], Nat);
The ones you don’t know about will probably become null after deserialization.
If all of these work, I suppose there’s no need to have blobs around.

I need to look at the new stuff that they did for ICRC3 and likely adopt that syntax…I think it avoids this issue.

1 Like

github - ICRC-7.md

If a tokenId doesn’t exist or if the caller principal is not permitted to act on the tokenId, then the tokenId would be added to the Unauthorized list. If is_atomic is true (default), then the transfer of tokens in the token_ids list must all succeed or all fail.

What should the default value of is_atomic be? I think true is safer, so the default should be true.
However, considering the following scenario, it might be better if the default were false.
What do you think?

https://forum.dfinity.org/t/call-for-participation-nft-token-standard-working-group-status-updated/16566/14

Is_atomic in transfer args - We do a number of potentially async functions when transferring around NFTs, including KYC, paying royalties, recognizing deposits, etc. Atomicity is not something we can support, and we’d argue that any sufficiently interesting or complex system with real-world utility is going to have the same problem on the IC. Encouraging that atomicity is possible/reliable is creating a very hard-to-untie knot for developers. This can become especially troublesome for generic 3rd party services that may assume you have atomicity when you actually don’t. We chose to reject any request for transfer where atomicity is requested.