Hey folks,
This thread has been a great source of inspiration and I’ve learned a lot following your code and examples. Clearly there’s a lot of community effort going on… many different wallet implementations, many different tokens coming out, NFTs, etc. It’s super exciting to see.
I think the discussion of a standard remains very important, so I decided to start working on a draft post which I hope to distribute more widely. But before that I wanted to share it with this forum and get some feedback and thoughts. The way I see it, a token standard belongs to the community; the Foundation’s role is to help facilitate the conversation, and that’s what this post hopes to encourage. Let me know if I’ve left something out, if things need more clarification, or if I should dive deeper on any specific points.
The IC Token Standard Discussion
The Internet Computer developer community has been having ongoing discussions about defining a token standard fit for purpose for the IC blockchain. While other blockchain ecosystems have demonstrated a clear product/market fit for tokens, the IC provides a new paradigm for blockchain computation, and as such there is a strong desire to build a native token standard that can in time scale to the demands of millions of users.
This document will attempt to catalog some of the existing discussions around a standard, highlight key considerations, and generally serve as a resource for the early reference implementations of tokens on the Internet Computer.
All credit is due to the members of the IC developer community. While it would be too difficult to list every contribution exhaustively, we attempt to recognize those who have contributed to the conversation. Special thanks to: senior.joinu, ICVF, hackape, stephenandrews, harrison, geokos, Hazel, dostro, paulyoung, dmd, skilesare, claudio, jzxchiang, Jessica, wang, Ori, flyq, PaulLiu, witter, stopak, quinto, …
Introduction
What is a Token?
A token is a type of digital asset that is native to a blockchain ledger. ICP is an example of a token, and serves as a utility token for the Internet Computer blockchain.
Given that blockchains provide general purpose execution environments, developers use tokens as a foundational building block for building their decentralized applications — not only for bootstrapping funding, but also for community engagement and decentralized control of the project.
Why a Standard?
The topic of token standards has a storied history going all the way back to the days of colored coins on Bitcoin. With the advent of Ethereum smart contracts, the token standard discussion really gained prominence because for the first time a general purpose scripting environment was available to developers.
Arguably the most successful token standard is known as ERC-20, which was the initial catalyst for broad token interoperability on the Ethereum blockchain. The standard defines an interface for basic functionality to transfer tokens, as well as standard token metadata such as balances, token name, symbol, etc.
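For reference, the surface defined by EIP-20 looks roughly like the following, rendered here as a Rust trait rather than Solidity so that it matches the other examples in this document. The function names mirror the standard, but the Address and u128 types (and the trait itself) are illustrative stand-ins:

// Illustrative Rust rendering of the EIP-20 interface. ERC-20 itself is a
// Solidity interface, so the types used here are stand-ins.
type Address = [u8; 20];

trait Erc20Like {
    // Standard token metadata
    fn name(&self) -> String;
    fn symbol(&self) -> String;
    fn decimals(&self) -> u8;
    fn total_supply(&self) -> u128;

    // Balances and transfers
    fn balance_of(&self, owner: Address) -> u128;
    fn transfer(&mut self, to: Address, value: u128) -> bool;
    fn transfer_from(&mut self, from: Address, to: Address, value: u128) -> bool;

    // Delegated spending
    fn approve(&mut self, spender: Address, value: u128) -> bool;
    fn allowance(&self, owner: Address, spender: Address) -> u128;
}
// Note: the Solidity standard also defines Transfer and Approval events,
// which have no direct analogue in a Rust trait and are omitted here.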
Tokens are generally considered to be primitives for a blockchain’s community. They act as coordinating mechanisms for projects, and enable many add-on ecosystem services such as decentralized exchanges, lending platforms, marketplaces, launchpads, DAOs, and so forth. A token standard interface allows any token that implements the interface to be re-used by other tools and platforms such as exchanges and wallets.
Design Considerations
PubSub
Most popular token implementations that pre-date the Internet Computer were designed for single-threaded execution environments such as the Ethereum Virtual Machine. Given the sequential nature of such blockchains, these tokens expose rather simple interfaces and rely on the blockchain’s native consensus mechanism (generally Proof of Work) to order transactions, execute state transitions, and produce blocks.
Because the Internet Computer provides a truly distributed compute environment compared to other blockchain systems, developers can find more expressive ways to structure their software architectures. PubSub (also called “notifications,” “subscriptions,” “topics,” or “events”) is a common pattern that reduces coupling and makes code simpler and easier to extend.
Forum user senior.joinu provides an example PubSub interface (here named subscribe):
fn subscribe(
    from: Option<Option<Principal>>,
    to: Option<Option<Principal>>,
    callback: String
);
The semantics of this style of subscription are that it adds a callback to the subscription list. From then on, whenever there is a transfer from → to, the callback method is invoked asynchronously, without awaiting the result.
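A minimal sketch of how a token canister might dispatch such notifications follows, assuming the ic_cdk and candid crates. The Subscription struct, the (from, to, amount) payload, and the reading of the doubled Option (outer None meaning “no filter”) are all illustrative interpretations rather than part of any agreed standard:

use candid::Principal;
use ic_cdk::api::call::notify;

struct Subscription {
    from: Option<Option<Principal>>, // outer None = match any sender
    to: Option<Option<Principal>>,   // outer None = match any recipient
    subscriber: Principal,           // canister to notify
    callback: String,                // method name to invoke on the subscriber
}

fn filter_matches(filter: &Option<Option<Principal>>, actual: &Option<Principal>) -> bool {
    match filter {
        None => true,           // no filter: match every transfer
        Some(f) => f == actual, // filter present: require an exact match
    }
}

// Called after a transfer has already been applied to the token's own state.
fn publish_transfer(
    subs: &[Subscription],
    from: Option<Principal>,
    to: Option<Principal>,
    amount: u64,
) {
    for sub in subs {
        if filter_matches(&sub.from, &from) && filter_matches(&sub.to, &to) {
            // One-way call: fire the notification without awaiting the result,
            // so a slow or faulty subscriber cannot block the token canister.
            let _ = notify(sub.subscriber.clone(), &sub.callback, (from.clone(), to.clone(), amount));
        }
    }
}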
Whether PubSub should be considered a standard way of implementing tokens on the Internet Computer is still being discussed, as is the best way to execute on this pattern if so (naming conventions, extensibility, etc.). It is worth noting that the ICP ledger canister implements some methods in the PubSub pattern.
Atomicity
The canister messaging model creates important differences compared to the Ethereum/EVM messaging model when it comes to the atomicity of transactions.
In the EVM (and other similar blockchains), if there is an exception when processing a transaction, the entire call stack of that transaction is rolled back. That is to say, while an Ethereum transaction is being processed, it holds a global lock on the entire state of the EVM and can call into other smart contracts in order to execute complex logic. If the transaction succeeds, the state transitions are applied to all contracts that were invoked. If it fails, the entire call stack is rolled back as if the transaction had never happened in the first place.
Within the canister messaging model, no such rollback guarantees are provided. If an exception occurs within a canister, it only rolls back that canister’s state, not the entire state of the environment.
This difference means that token implementations on the Internet Computer need to consider designing for atomic, cross-canister transactions within the application logic, whereas token implementations on Ethereum get this atomic behavior “for free” (which is why such use cases as Flash Loans are popular within the Ethereum ecosystem).
Any token standard on the Internet Computer should leave room to enable cross-canister transactions.
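To make the difference concrete, here is a sketch of the kind of defensive pattern an application might use when a token operation spans two canisters: debit locally, make the remote call, and explicitly compensate if the call is rejected. The escrow canister and its `deposit` method are hypothetical names, and this is an illustration of the logic the EVM would otherwise handle via rollback, not a prescribed standard:

use candid::Principal;
use ic_cdk::api::call::call;
use std::collections::HashMap;

type Balances = HashMap<Principal, u64>;

// Hypothetical example: move `amount` from a local balance into an account on
// a separate "escrow" canister. If the remote call is rejected, the local
// debit is compensated by hand; nothing rolls it back automatically.
async fn transfer_to_escrow(
    balances: &mut Balances,
    owner: Principal,
    escrow: Principal,
    amount: u64,
) -> Result<(), String> {
    // 1. Debit locally first. This change is committed when execution yields
    //    at the inter-canister call below; it will NOT be undone on failure.
    let balance = balances.get(&owner).copied().unwrap_or(0);
    if balance < amount {
        return Err("insufficient balance".to_string());
    }
    balances.insert(owner.clone(), balance - amount);

    // 2. Call the other canister (`deposit` is a hypothetical method).
    let result: Result<(), _> = call(escrow, "deposit", (owner.clone(), amount)).await;

    // 3. Compensate explicitly if the remote call was rejected.
    if let Err((code, msg)) = result {
        let current = balances.get(&owner).copied().unwrap_or(0);
        balances.insert(owner, current + amount);
        return Err(format!("deposit rejected ({:?}): {}", code, msg));
    }
    Ok(())
}

In a real canister the balances map would live in canister state (for example a thread_local) and be re-read after the await rather than passed in as a parameter; the point here is only that the compensation in step 3 has to be written by hand.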
Extensibility
In their simplest form, tokens are used for value transfer: sending the value of a digital asset from A to B. But tokens can also have much richer functionality, and some functionality that is taken for granted elsewhere must be built explicitly on the IC. For instance, developers on the Internet Computer who want to maintain the entire ledger of a given token’s transaction history need to implement that functionality themselves, whereas developers on other blockchains such as Ethereum get it for free as part of the platform’s native functionality.
From a standards perspective, it would be useful to agree on what the most basic form of a token API would look like, and what sorts of extension mechanisms could be implemented on top of that API (a sketch of this layering follows the list below). Some potential token extensions might include (h/t to Hazel & Toniq Labs):
- Burning
- History
- Allowances
- Batching
- Extended Metadata
- Fees
- Etc.
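As an illustration of this layering, here is a sketch of what a minimal core API plus opt-in extensions could look like, expressed as Rust traits. The names and signatures are illustrative and are not drawn from any of the proposals listed in the appendices:

use candid::Principal;

// Illustrative core surface: the minimal API every token would share.
trait BasicToken {
    fn name(&self) -> String;
    fn symbol(&self) -> String;
    fn decimals(&self) -> u8;
    fn total_supply(&self) -> u128;
    fn balance_of(&self, owner: Principal) -> u128;
    fn transfer(&mut self, to: Principal, amount: u128) -> Result<(), String>;
}

// Opt-in extensions layered on top of the core. A wallet or exchange could
// probe for these capabilities and degrade gracefully when they are absent.
trait Burnable: BasicToken {
    fn burn(&mut self, amount: u128) -> Result<(), String>;
}

trait Allowances: BasicToken {
    fn approve(&mut self, spender: Principal, amount: u128) -> Result<(), String>;
    fn allowance(&self, owner: Principal, spender: Principal) -> u128;
    fn transfer_from(&mut self, from: Principal, to: Principal, amount: u128) -> Result<(), String>;
}

trait History: BasicToken {
    // Returns a page of past transfers; the (index, from, to, amount) tuple
    // is a stand-in for a richer transaction record type.
    fn transactions(&self, start: u64, limit: u64) -> Vec<(u64, Principal, Principal, u128)>;
}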
The fact that a token can be extended (and upgraded!) is a compelling and novel addition to the blockchain landscape that is not easily replicable on other blockchain environments.
Scalability
In other blockchain ecosystems, tokens are as scalable as their underlying systems. Users who are willing to pay larger gas fees generally have their transactions prioritized (edge cases involving concepts such as Miner Extractable Value are outside the scope of this document). As a token’s transaction history expands, the underlying blockchain hosts the token state at no additional fee to users, but at the cost of expanding the global state and therefore limiting the scalability of the system (proposals regarding State Fees are outside the scope of this document).
Canisters on the Internet Computer are given no such “free lunch.” If a token canister needs to maintain its entire transaction history, it is the responsibility of the application to implement that functionality. This can be achieved by building the functionality directly into the canister logic or via extensions, as previously discussed.
The design of the ICP ledger canister includes a mechanism for scaling transaction ledger storage beyond the limit of a single canister. The mechanism is implemented by maintaining many archival node canisters. As the current “tip” of the ledger approaches the limit of a single canister, a new canister is created as the tip and the existing canister is added to the collection of archives.
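Below is a minimal sketch of one way to realize this pattern in plain Rust, in which the ledger hands older blocks off to archive canisters once its in-memory “tip” passes a threshold. The Block, Archive, and spawn_archive names are illustrative stand-ins rather than the actual ICP ledger types, and canister creation is deliberately left as a placeholder:

// A sketch of the archival pattern; not the actual ICP ledger implementation.
struct Block {
    index: u64,
    payload: Vec<u8>,
}

struct Archive {
    canister: String,       // stand-in for the archive canister's id
    first_index: u64,
    last_index: u64,
}

struct Ledger {
    tip: Vec<Block>,        // recent blocks held by the ledger canister itself
    archives: Vec<Archive>, // older blocks, spread across archive canisters
    max_tip_len: usize,     // threshold before blocks are archived
}

impl Ledger {
    fn append(&mut self, block: Block) {
        self.tip.push(block);
        if self.tip.len() >= self.max_tip_len {
            self.archive_tip();
        }
    }

    fn archive_tip(&mut self) {
        // Drain the tip and hand the blocks to a freshly created archive.
        let blocks: Vec<Block> = self.tip.drain(..).collect();
        let first = blocks.first().map(|b| b.index).unwrap_or(0);
        let last = blocks.last().map(|b| b.index).unwrap_or(0);
        let canister = spawn_archive(blocks);
        self.archives.push(Archive { canister, first_index: first, last_index: last });
    }
}

// Placeholder for canister creation and block upload, which would normally go
// through the management canister and inter-canister calls; omitted here.
fn spawn_archive(_blocks: Vec<Block>) -> String {
    "archive-canister-id".to_string()
}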
Immutability (Read-Only Code)
Because canisters can be upgraded over time, both a canister’s API and its underlying implementation may change at any time. This is different from other blockchain environments, where contracts are immutable upon deployment and upgrade paths require re-deployment and migration of state.
As a result, a malicious token implementer could deploy a canister that seems benign at first and later upgrade it to an implementation that does not meet end users’ expectations, or which steals funds or does other harm to consumers of the canister.
This risk is unique to the Internet Computer, and as such a token standard on the IC will require mitigations to prevent malicious implementers from changing APIs or behavior in a manner that harms consumers of the token.
One potential workaround is an open internet service that provides “verified source” functionality for canisters. The well-known blockchain explorer Etherscan has a “contracts with verified source code” feature that is an example of this pattern. If such an open internet service existed, token canisters could be certified by it, and any token that changed its interface would need to recertify or risk being delisted from the “accepted” registry. Alternatively, token canisters may adopt tools like Black Hole in order to make the canister public and immutable.
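As a thought experiment, such a registry might key off a canister’s installed Wasm module hash: upgrading the canister changes the hash, so the new code would need to be re-certified before the token is considered verified again. The registry type below is entirely hypothetical:

use std::collections::HashSet;

// Hypothetical registry of audited Wasm module hashes. In practice this would
// itself be an open internet service (a canister), queried over the IC.
struct VerifiedRegistry {
    audited_module_hashes: HashSet<Vec<u8>>,
}

impl VerifiedRegistry {
    // A token canister counts as "verified" only while its currently installed
    // module hash matches one that has been audited; an upgrade changes the
    // hash, so the new code must be certified again.
    fn is_verified(&self, current_module_hash: &[u8]) -> bool {
        self.audited_module_hashes.contains(current_module_hash)
    }
}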
In any event, it’s worthwhile for the community to have a robust conversation about this feature of the Internet Computer and how it differs from other blockchain environments.
Rosetta API Integration
Rosetta is an open standard designed to simplify blockchain deployment and interaction. Many exchanges around the world, such as Coinbase, use the Rosetta API and expect blockchain projects to implement the API as a part of an integration and onboarding process.
The design of the ICP ledger canister implements the Rosetta API, and so there already exists a token canister with Rosetta API integration in the Internet Computer ecosystem.
It is worth noting that simply implementing the Rosetta API does not guarantee that a token will be listed on any given centralized exchange platform. However, having a well tested, off-the-shelf implementation of the Rosetta API for token canisters may be a boon for the token ecosystem, as tokens may be more readily supported by third-party tools and platforms.
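For a flavor of what the integration entails, the sketch below mirrors the request and response shapes of Rosetta’s POST /account/balance endpoint (part of the Rosetta Data API). The structs are abbreviated and illustrative, assuming the serde crate, and are not a complete binding:

use serde::{Deserialize, Serialize};

// Abbreviated shapes for Rosetta's POST /account/balance endpoint.
#[derive(Serialize, Deserialize)]
struct NetworkIdentifier {
    blockchain: String, // e.g. "Internet Computer"
    network: String,
}

#[derive(Serialize, Deserialize)]
struct AccountIdentifier {
    address: String,
}

#[derive(Serialize, Deserialize)]
struct AccountBalanceRequest {
    network_identifier: NetworkIdentifier,
    account_identifier: AccountIdentifier,
}

#[derive(Serialize, Deserialize)]
struct Currency {
    symbol: String, // e.g. "ICP"
    decimals: u32,
}

#[derive(Serialize, Deserialize)]
struct Amount {
    value: String, // integer amount encoded as a string, per the Rosetta spec
    currency: Currency,
}

#[derive(Serialize, Deserialize)]
struct AccountBalanceResponse {
    balances: Vec<Amount>, // the full response also carries a block_identifier
}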
Other Considerations
Rust vs Motoko (vs other Wasm-compatible languages)
The two primary programming languages for development on the Internet Computer (as of Summer 2021) are Motoko and Rust. While Motoko has been developed specifically for the Internet Computer, Rust is also a popular choice due to its robust community and extensive collection of libraries.
In an ideal world, the two programming languages would offer near parity in performance characteristics, and the choice would ultimately come down to developer preference. In practice, there may be tradeoffs between the two languages. Further benchmarking may be required to fully understand the performance characteristics of tokens implemented in various languages.
Principal Identifiers vs Ledger Account Identifiers
Unfortunately, the ICP ledger canister uses a different cryptographic scheme for its account identifiers than the Internet Computer proper uses for its principal identifiers. The reason for the two different schemes is mostly historical: the keys used in the seed round were secp256k1 keys and, as a result, needed to be supported by the ledger canister at genesis.
Through the development phases of the Internet Computer, it was decided that Ed25519 would be used as the main signature scheme for the IC; this made sense as an isolated decision but unfortunately created the current conflict.
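For concreteness, the ledger’s account identifiers are commonly documented as being derived from a principal plus a 32-byte subaccount via SHA-224, with a CRC-32 checksum prepended. A sketch of that derivation, assuming the sha2 and crc32fast crates, follows; treat it as illustrative, with the authoritative definition living in the ledger specification:

use candid::Principal;
use crc32fast::Hasher as Crc32;
use sha2::{Digest, Sha224};

// Sketch of the documented ledger account identifier derivation:
//   hash       = SHA-224(0x0A . "account-id" . principal . subaccount)
//   account_id = CRC32(hash) . hash            (4 + 28 = 32 bytes)
// The default subaccount is 32 zero bytes.
fn account_identifier(principal: &Principal, subaccount: &[u8; 32]) -> Vec<u8> {
    let mut hasher = Sha224::new();
    hasher.update(b"\x0Aaccount-id"); // domain separator: length byte + label
    hasher.update(principal.as_slice());
    hasher.update(subaccount);
    let hash = hasher.finalize();

    let mut crc = Crc32::new();
    crc.update(&hash);
    let checksum = crc.finalize().to_be_bytes();

    let mut account_id = Vec::with_capacity(32);
    account_id.extend_from_slice(&checksum);
    account_id.extend_from_slice(&hash);
    account_id
}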
There is no clear way to unify these schemes in the near future: the roadmap for doing so involves many components, and there simply isn’t enough bandwidth on the Foundation’s roadmap to prioritize such an effort.
Since most industry standards follow secp256k1 (including hardware wallets), this is perhaps a vote in favor of moving in that direction for canister development in general.
Security Considerations
This section remains a big to-do. An exhaustive list of security considerations for IC-style tokens needs further exploration from the community and from security experts. A few high-level topics to consider include:
- Re-entrancy
- Double spend
- Canister upgrade rug pulls
Appendix I - Existing Implementations
- Senior.joinu’s forum implementation of IFungibleToken
- SuddenlyHazel’s Internet Computer Token Standard
- Toniq Labs’ Extendable Token
- Fleek (Psychedelic)’s Token Interface
Appendix II - Existing Proposals
- ICIP1 - Sailfish
- Toniq Labs - IC Fungible Token
- EIP-20: ERC-20 Token Standard (Ethereum)
- Deland Labs - Dfinity Fungible Token Standard
- ERC20 Re-implementations