Prepared by Connor McCormick and Tim Lloyd
March 2023
Introduction
With more than $100M USD in venture funding secured by 2018, Dominic Williams and the early DFINITY team members assembled a world-class roster of cryptography researchers, blockchain architects and software engineers. Over the next two and a half years, this “NASA for decentralized computing” realized its vision for a fully decentralized, globally distributed cloud computing network called the Internet Computer (IC). Then, with the “Genesis Launch Event” in May of 2021 – which saw the release of previously locked Internet Computer Protocol (ICP) governance tokens – the DFINITY Foundation began to hand the future evolution of the Internet Computer over to a decentralized global network of node operators, application developers and networked multi-party service communities.
Numerous groundbreaking innovations are wrapped up in the events of those years - and the pace of innovation only continues to accelerate. Cultivated by the DFINITY Foundation, the Internet Computer ecosystem has created an unparalleled developer experience, with the functional scope and capability for rapidly deploying fully decentralized “Web3” applications (Dapps) that require no other cloud infrastructure or services. Central among these accomplishments – and emblematic of blockchain technology’s promise – is the distribution of “tokens” for broadly decentralizing the governance of this powerful innovation infrastructure – an unstoppable, cryptographically secured, open and permissionless, censorship-resistant cloud computing network.
The Token Conversation
As the IC ecosystem accumulates experience and insight regarding the wide and growing variety of “token”, “tokenization” and “token economics” use-cases and design patterns, a process of standardization in token system design has naturally emerged. In April 2022, the Ledger and Tokenization Working Group was launched to lend basic collaboration and decision-making regularity to this still-immature standardization process. In consideration of this early stage of standardization, and acknowledging accepted norms for how technical standards are composed, this report generally uses the terms “token system” or “design” and reserves the attribution of “standard” for the expectations and results normally associated with explicit standardization efforts.
It seems reasonable to acknowledge that at least some aspects of the recent NFT market exuberance reflect the sheer novelty value of these first digital “things” to capture the imagination of the general (non-developer) public. Indeed, some of the NFT project creators interviewed began their efforts in the context of “digital twins” - virtual objects trustable as remaining true to the contours and measures of some corresponding “real world thing”. Fascination with novelty aside, these early creations also reflect incredible creativity in the collaborations emerging within the IC ecosystem. This exuberance and creativity will pay dividends through 2023 and beyond, as novel token-enabled integrations and compositions - rendering all manner of utility, ownership, agreement, access, governance and more - are constituted on the Internet Computer.
This research is proving worthwhile for gaining a more complete and nuanced sense of where tokens and token standards stand within the IC ecosystem and beyond. While still quite early in development, the available technologies and the social imagination and acceptance of these cryptographically concretized “things” suggest that the greater part of token system innovation and impact is yet to come. And whatever things you may be conjuring into the metaverse, the Internet Computer comes through in our analysis as offering one of the richest spell books available - a secure, feature-rich and reliable decentralizing network on which to advance the state of the art in token-enabled use-cases.
About us and our approach
We gratefully acknowledge the hard-won and generously shared insights of:
Bob Bodily, Ph.D., Co-Founder & CEO of Toniq Labs, a company focused on NFT tooling for the Internet Computer
Hazel R., Founder of Departure Labs, a firm building Web3 Infrastructure on the Internet Computer.
Ted Reinhardt, Internet Computer developer community organizer, admin for independent developer-community Discord server, IC bootcamp mentor. In early experiments with NFTs, Ted deployed the first Internet Computer NFT composed of virtual reality artwork. (@tedreinhardt)
J. H., Co-creator of the Legends series of Tarot-card-inspired NFTs for the Internet Computer.
Austin Fatheree, CTO of the ORIGYN Foundation, pushing the technical boundaries of NFT design on the Internet Computer, and Executive Director at ICDevs.org, a grass-roots Internet Computer developer collective.
This work would not have been possible without the support of a small grant from the DFINITY Foundation.
Everything we produced as part of creating this review is available for you to share and remix under a CC BY 4.0 License.
Implications of the Internet Computer’s Unique Architecture
Anyone approaching token systems and standards development on the Internet Computer — especially those familiar with the evolution of the slightly older Ethereum network — must appreciate a few unique architectural features of the Internet Computer.
Ability to mutate: On first-generation Layer 1 networks like Ethereum, smart contracts are immutable: once a contract is launched it cannot be changed. This (defining) blockchain-enabled feature serves as a primitive basis for “trustless” assumptions in service design on the network. In contrast, Internet Computer canisters (the equivalent of Ethereum’s smart contracts) are mutable by default and may optionally be rendered immutable (a so-called “blackholed” canister) or partially and conditionally mutable, under a variety of mechanisms - token-enabled and otherwise. Because of this additional layer of control, creating a token on the Internet Computer means implementing your own suitable ledger design and your desired combination of governance and decentralization promises (a minimal sketch of “blackholing” a canister follows this list).
Extensibility: First-generation, Layer 1 networks such as Ethereum serve only a singular blockchain ledger and rely on centralized public or hybrid clouds for most of their “Web3” implementation. In contrast, the Internet Computer employs an extended repertoire of cryptographic primitives to fully decentralize the provisioning of a complete cloud computing experience - a “World Computer” - comprising all of the “data”, “network and control-plane” and “presentation layer” capability of a fully functional “cloud stack”.
Chain Agnostic: Recent integrations that enable Internet Computer canisters, wallets, accounts and other token-enabled services to directly hold and transact on both the Ethereum and Bitcoin networks will have immense implications for token system designs and related standards on the IC. Apart from the obvious benefits of extending network control to these highly capitalized networks, these integrations create an opportunity for the Internet Computer community to anticipate the power of many-chain token systems and demonstrate leadership in related network-spanning standards.
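Returning to the mutability point above: the following is a minimal, hedged sketch of one way a controlling canister might “blackhole” another canister by clearing its controller list via the IC management canister’s update_settings method. The record and field names follow the public management-canister interface, but the blackhole function itself, its lack of access control and its trap-on-failure error handling are illustrative assumptions, not a vetted production pattern.

```rust
use candid::{CandidType, Nat, Principal};
use serde::Deserialize;

#[derive(CandidType, Deserialize)]
struct CanisterSettings {
    controllers: Option<Vec<Principal>>, // Some(vec![]) => no controllers remain
    compute_allocation: Option<Nat>,
    memory_allocation: Option<Nat>,
    freezing_threshold: Option<Nat>,
}

#[derive(CandidType, Deserialize)]
struct UpdateSettingsArgs {
    canister_id: Principal,
    settings: CanisterSettings,
}

// Only a current controller of `canister_id` may call update_settings;
// clearing the controller list is irreversible ("blackholing").
#[ic_cdk_macros::update]
async fn blackhole(canister_id: Principal) {
    let args = UpdateSettingsArgs {
        canister_id,
        settings: CanisterSettings {
            controllers: Some(vec![]), // empty list: nobody can upgrade it again
            compute_allocation: None,
            memory_allocation: None,
            freezing_threshold: None,
        },
    };
    let _: () = ic_cdk::call(Principal::management_canister(), "update_settings", (args,))
        .await
        .expect("update_settings call failed");
}
```

Because an empty controller list cannot be undone, real deployments would typically gate such a call behind some governance mechanism (for example, an SNS or NNS proposal) rather than exposing it as an open update method.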
Typical Token Systems Uses and Category Distinctions
The most common distinction made among tokens is on the basis of their fungibility - that is, whether they are each unique and indivisible like a work of art (non-fungible) or identical and divisible, as suited to use as something like a currency (fungible). And, no less complex than their metaphorical counterparts in the “real world”, a corresponding variety of attributes and measures lend value and utility to these digital tokens: rarity, authenticity and merit (artistic or otherwise), use-value in transacting club goods or gaining access or decision-rights, or constituting digital identities and intellectual property (IP) rights - the list goes on.
From a design perspective, a fungible token is almost entirely embodied in the specification of its ledger, whereas a non-fungible token typically also entails additional components - non-ledger control canisters, “asset” and “content” payloads, etc., and their associated metadata. Many novel NFT use-cases arise from creative definitions and uses of this associated “content”, “payloads” or “assets”.
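To make the contrast concrete, the hypothetical sketch below shows the minimal state a fungible-token design must track versus the additional components an NFT design typically carries. All type and field names are invented for illustration and correspond to no particular IC standard.

```rust
use std::collections::HashMap;

type AccountId = String;
type TokenId = u64;

/// A fungible token is essentially its ledger: balances and a total supply.
struct FungibleLedger {
    balances: HashMap<AccountId, u128>,
    total_supply: u128,
}

/// A non-fungible token adds per-token ownership, metadata and a content payload.
struct Nft {
    owner: AccountId,
    metadata: HashMap<String, String>, // e.g. name, traits, content-type
    asset: Vec<u8>,                    // the "content" or "payload" itself
}

/// An NFT collection is a registry of unique tokens rather than a balance sheet.
struct NftCollection {
    tokens: HashMap<TokenId, Nft>,
}
```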
With new designs and standards, these simple categorical distinctions have blended together in “multi-token standards”. Ethereum standards-making processes are spinning out tokens like ERC-1155, which synthesizes fungibility and its antithesis in a single token standard. ERC-777 remained backwards-compatible with ERC-20 while adding hooks so that contracts and accounts can react automatically when they receive tokens. ERC-4626, the Tokenized Vault Standard, establishes the essential requirements of yield-bearing vaults representing shares of a single underlying ERC-20 token, with optional extensions for depositing, withdrawing and reading balances of ERC-20 tokens. ERC-1337 is a proposed token standard designed to accommodate blockchain-based recurring subscription models. And EIP-5192 - Minimal Soulbound NFTs - aims to standardize a minimal extension of EIP-721 that renders tokens “soulbound” (non-fungible and inextricably bound to a single person’s digital identity) using the feature-detection functionality of EIP-165.
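A hedged sketch of the core idea behind a multi-token design such as ERC-1155: a single ledger keyed by (token id, account), where a token id minted in many interchangeable units behaves fungibly while a token id capped at a single unit behaves as an NFT. The types and the transfer method are again invented for illustration.

```rust
use std::collections::HashMap;

type TokenId = u64;
type AccountId = String;

#[derive(Default)]
struct MultiTokenLedger {
    // Balance of each account in each token id. A token id with total
    // supply 1 is effectively non-fungible; larger supplies are fungible.
    balances: HashMap<(TokenId, AccountId), u128>,
}

impl MultiTokenLedger {
    fn transfer(
        &mut self,
        id: TokenId,
        from: &AccountId,
        to: &AccountId,
        amount: u128,
    ) -> Result<(), String> {
        let from_key = (id, from.clone());
        let balance = self.balances.get(&from_key).copied().unwrap_or(0);
        if balance < amount {
            return Err("insufficient balance".to_string());
        }
        self.balances.insert(from_key, balance - amount);
        *self.balances.entry((id, to.clone())).or_insert(0) += amount;
        Ok(())
    }
}
```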
New categories and combinations of token design will proliferate in the coming years as we anticipate intricate and powerful combinations of increasingly varied standard token types. Non-fungible tokens gained popularity in 2021 - “the year of the NFT” - and gave way to the (once again) token-enabled “year of the DAO” in 2022. In 2023, an intensifying pace of token-enabled decentralized service innovation will offer new levels of capability, but will be doing so under much greater regulatory scrutiny and outright (sometimes over-reaching or incoherent) legislation and regulation of the digital and blockchain realms. Existing legal precedent appears to be a widely-cited conceptual anchor - to begin with, anyway. Additional degrees of design freedom furnished by the Internet Computer should prove valuable for negotiating the challenges and opportunities arising with increasing regulatory oversight.
Internet Computer Token Systems and the Standardization Process
Where Internet Computer token systems are concerned, we presently remain very early in the process of standardization. The first token systems deployed in the Internet Computer ecosystem were modelled after, and named for, token standards developed in and for the Ethereum blockchain. Discussion of Internet Computer token standards can therefore fall into confusion if we do not keep in mind that:
The term “standard” does not properly apply to early IC token design patterns, as the necessary standards development processes are only now beginning to materialize within the IC ecosystem.
Early IC token implementations transliterated design decisions made for a very different network (Ethereum), transplanted from a standardization regime largely unconnected to the development, operation and utilization (“consumer”) ecosystems of the Internet Computer.
Unlike globally recognized and harmonized standards and standards development processes, such as those maintaining the safety and interoperability of electrical power systems, blockchain component design standardization typically does not yet extend or harmonize beyond network “border-lines”.
Following these “early days” of digital token standards development, we can anticipate significant growth and maturation of Internet Computer token standardization efforts throughout 2023 and beyond.
Mature standardization regimes typically embody a few essential features and capabilities, such as [1]:
Identifying the need for the standard and reviewing the existing standards landscape
Notifying and engaging affected stakeholders and the public when standardization is being considered, is underway, or is subsequently published and monitored for compliance and enforcement.
Sustaining the participation of technical experts qualified to develop understanding, prioritize objectives, negotiate trade-offs and decide the precise details of the standard.
Consulting widely on the proposed standard, subjecting it to public scrutiny and developing consensus among a balanced committee of stakeholders affected by standardization.
Publishing and making standards readily available in languages and formats accessible to the affected stakeholders.
Ensuring that standards are consistent with (or incorporate) existing international and pertinent foreign standards and are not acting as a barrier to trade or interoperability.
Maintaining standards through periodic review, or as changes are needed.
In order to convene collaboration across organizational boundaries and among diverse commercial, government, non-profit and civil-society sectors, standards development organizations (SDOs) must also address legal and regulatory matters such as intellectual property agreements and anti-trust laws.
Like the early Internet Computer token designers, we can also point here to the example of the Ethereum network, which documents its own emerging standardization process in its Ethereum Improvement Proposal (EIP) repository. The Ethereum network has, in turn, drawn on examples like the Internet Engineering Task Force’s Request for Comments (RFC) process [2] and acknowledges basing the Ethereum Improvement Proposal process on the Bitcoin Improvement Proposals (BIPs) process, which is itself based on the Python Enhancement Proposals (PEPs) process.
Borrowed Standards and Early Internet Computer Token System Designs
Token standards are scoped to address only the minimal interfaces and operations needed for the token’s core transactional promises and security. Beyond conforming with minimum standards, token system designers will typically also add data structures, data, methods and events in order to differentiate their offerings (traditionally, in ways often depending on Web2 public cloud infrastructure). To this, the spectrum of mutability afforded by the Internet Computer architecture adds some new requirements for token system designs and standards.
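One way to picture this layering, sketched below with invented trait and method names: the standard fixes a minimal required surface, and implementations add differentiating methods on top. This is a sketch of the pattern only, not of any actual IC standard’s interface.

```rust
use candid::{Nat, Principal};

/// The hypothetical "minimum standard" surface every conforming ledger exposes.
trait MinimalLedger {
    fn balance_of(&self, owner: Principal) -> Nat;
    fn transfer(&mut self, to: Principal, amount: Nat) -> Result<Nat, String>;
}

/// Vendor-specific extensions layered on top of the standard core, added to
/// differentiate an offering without breaking standard-conformant clients.
trait LedgerExtensions: MinimalLedger {
    fn transfer_with_memo(
        &mut self,
        to: Principal,
        amount: Nat,
        memo: Vec<u8>,
    ) -> Result<Nat, String>;
    fn transaction_history(&self, owner: Principal) -> Vec<String>;
}
```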
This report is not intended to advocate for any particular token system or collection – nor does it concern itself with building consensus toward one or another standard. We aim simply to summarize useful context, observations, insights and information, refer interested readers to some of the notable IC token system designs, and document the innovation and standardization emerging there.
ICP (Ledger standard - “NNS ledger token standard”): The native utility token of the Internet Computer network, facilitating network governance (via the Network Nervous System) and convertible to the “cycles” required for running services on the network. While it pre-exists any IC token standard, it nevertheless represents a central reference point for IC token system creators by virtue of its critical role within the network.
DIP721: Implemented by Psychedelic and referencing Ethereum’s naming convention, DIP721 furnishes a non-fungible token (NFT) standard mirroring its Ethereum counterpart (ERC-721) and adapting it to the Internet Computer while maintaining the same interface. By conforming with the Ethereum precursor, the DIP721 designers hope to make it easier to port Ethereum contracts onto the IC – presumably accelerating IC token innovation and jump-starting experimentation as direct IC-Ethereum integration arrives.
EXT: Developed by Toniq Labs, this system implements a multi-token design, extending an ERC-20-inspired fungible token implementation pattern with non-fungible capabilities – capable of mirroring Ethereum’s ERC-1155 token capabilities and supporting entirely novel extensions from that foundation. The complexity introduced by its extensibility may warrant a slightly longer learning on-ramp.
Also notable is one of the first projects to reference the ICRC token standard produced by the Internet Computer community’s early ledger and token standards development efforts.
These and other early token implementations and standards emerging on the Internet Computer are listed for comparison in the accompanying Token Standards Database.
As an indication of overall NFT production and market activity, as of 2022-11-24 there were 128,717 holders of 1,078,047 NFTs across 344 collections, with a 24-hour market volume of 1,616.0427 ICP on a total market value of 1,262,038 ICP. The most recent peak in activity occurred on February 9th, 2022, with almost 5,000 transactions totaling almost 50,000 ICP. Up-to-date, on-chain records of overall NFT market activity on the Internet Computer are publicly browsable.
Internet Computer Token Standards Development Progress
Standardization efforts can significantly aid industry innovation, growth and market expansion. Such benefits can be greatly amplified for goods and services subject to “network effects.” The emergence of standardization and standards development activity within an industrial ecosystem also creates valuable opportunities for professional learning and development and for collaboration and socializing among diverse stakeholders. This process is only just beginning for the Internet Computer, but the same is relatively true of the larger blockchain industry as a whole when compared with well-established industries and disciplines.
The IC Ledger & Tokenization Working Group was announced in Q1 of 2022 and, in the months following, brought together some of the early token system innovators to develop the basic ICRC-1 standard, which was formally accepted by NNS vote on August 14th, 2022.
The ICRC-1 standard specifies a token and ledger combination supporting the essential transactions for fungibility, and designates a minting account that is unique in being able to create new tokens and act as the receiver of burnt tokens. An ICRC-2 standard proposal is in draft that would extend the basic operations of ICRC-1 to enable account owners to delegate token transfers to a third party acting on the owner's behalf.
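For orientation, here is a hedged Rust sketch of a canister paying out from its default account by calling a ledger’s icrc1_transfer method. The Account, TransferArg and TransferError shapes follow the published ICRC-1 Candid interface; the pay helper, its trap-on-failure error handling, and the use of default fee and memo values are illustrative assumptions.

```rust
use candid::{CandidType, Nat, Principal};
use serde::Deserialize;

#[derive(CandidType, Deserialize)]
struct Account {
    owner: Principal,
    subaccount: Option<Vec<u8>>, // 32-byte subaccount; None = default account
}

#[derive(CandidType, Deserialize)]
struct TransferArg {
    from_subaccount: Option<Vec<u8>>,
    to: Account,
    amount: Nat,
    fee: Option<Nat>,
    memo: Option<Vec<u8>>,
    created_at_time: Option<u64>,
}

#[derive(CandidType, Deserialize, Debug)]
enum TransferError {
    BadFee { expected_fee: Nat },
    BadBurn { min_burn_amount: Nat },
    InsufficientFunds { balance: Nat },
    TooOld,
    CreatedInFuture { ledger_time: u64 },
    Duplicate { duplicate_of: Nat },
    TemporarilyUnavailable,
    GenericError { error_code: Nat, message: String },
}

// Helper meant to be called from within a canister's update method.
async fn pay(ledger: Principal, to: Account, amount: Nat) -> Result<Nat, TransferError> {
    let arg = TransferArg {
        from_subaccount: None, // spend from this canister's default account
        to,
        amount,
        fee: None,  // ledger applies its default fee
        memo: None,
        created_at_time: None,
    };
    let (result,): (Result<Nat, TransferError>,) =
        ic_cdk::call(ledger, "icrc1_transfer", (arg,))
            .await
            .expect("call to ledger failed");
    result // Ok(block index) on success
}
```

Setting created_at_time would additionally enable the ledger’s transaction-deduplication window; it is left unset here for brevity.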
A major token system development and standardization milestone was achieved in 2022 with the deployment of the Service Nervous System (SNS). The SNS serves open infrastructure supporting near-turnkey minting and issuance of tokens for governing the deployment, operation and maintenance of decentralized services deployed on the Internet Computer. In the spirit of experimentation, the SNS infrastructure was employed shortly thereafter to launch the SNS-1 service and its corresponding SNS1 governance token. The experiment tested the limits of the Internet Computer and proved valuable for the early identification of fixes and adjustments of default configuration parameters; a retrospective report of this groundbreaking event has since been published.
As such, SNS-1 is the first SNS DAO to be created on the Internet Computer. Initially, the SNS-1 DAO controls only a blank canvas - consisting of a quote and a letter - awaiting a shared vision for something more.
On November 29th, 2022, all 3,141 of the available SNS1 tokens sold out in the first few hours. What happens next will be up to the SNS1 token holders.
Originating Token Innovation in the Internet Computer Community
In the course of this research, we met some early Internet Computer token system designers stretching the imagination of possible token concepts and use-cases. Those with an appetite for invention may wish to learn more about these projects.
Ntagle: A Near Field Communication-enabled NFT producing a liminal (so-called “phygital”) object bridging the real and the virtual realms, created by Isaac Valadez, a mechanical engineer, blockchain developer and serial entrepreneur from Houston, Texas. Isaac is an active contributor across a variety of IC community endeavors and was instrumental in organizing the first Motoko Bootcamp.
Ntagle creates a unique URL binding the proven possession of a physical object to some digital asset or canister controller - essentially “soul-bound” to the physical item and transferrable between parties, but only when there has been a transfer of the physical object. Through a layer of indirection, the Ntagle NFT owner is still empowered to delegate, lend, or rent the powers conferred by their possession of the phygital NFT. Isaac continues to explore possible use-cases and required token system extensions - for instance, will successive owners of an Ntagle phygital NFT be identified to each other, or share some continuing benefit by virtue of the physical object once passing through their hands (along with its corresponding metaverse or extended reality (XR) powers)?
ORIGYN Foundation’s Perpetual Operating System and sNFT (Sovereign NFT): Austin Fatheree’s creativity and Internet Computer innovation leadership push the conceptual boundaries of “token” entirely. The perpetualOS is extensible, supporting third-party development of entertainment, financial, social and other capabilities, with the sNFT specifying a Data API, Market API, Identity API, Social API and Object API (these NFTs actually have their own wallets). “Sovereign” being the operative word here, the sNFT and perpetualOS aim to furnish sNFT holders more inventor- and creator-focused alternatives for “peer-to-peer” market coordination, serving the interests of creators and preserving for them some rights and controls presently centralized with marketplaces and exchanges. ORIGYN envisions a future of tokens which are much more context-aware (of chain-state, say) and much more adaptable and dynamic in fulfilling your creative vision.
Wallet creators and providers equip the Internet Computer token ecosystem with essential functionality, are motivated to contribute significantly to standards development, and must conform with token standards as they are formulated and published. Some notable wallet providers on the Internet Computer include:
Stoic Wallet - a self-custodial wallet for the Internet Computer that can store, mint and distribute tokens, stake ICP, top up canisters, log in to Internet Computer Dapps, and more.
Volt smart wallet canisters, by Toniq Labs, take token innovation to the next level by extending the possibilities for automation and interoperability across NFTs, collections and marketplaces. By its nature, this project offers fertile ground for exploring the range of interoperability protocol standards required for tokens and token-enabled use-cases to achieve product-market fit and mature. Integrating with your own wallet canister, Volt introduces new levels of wallet automation for participation in binding offers, auctions and secure transfers (approved by the receiver), which also interoperate across marketplaces.
Exponent token platform, by Toniq Labs, expands the development experience and possibilities with a variety of tooling comparable to that of Enjin (ENJ) on Ethereum (ETH). The main components of Exponent are EXT - the extendable token standard for the Internet Computer; Spatial - a headless wallet for invisible token management; the eDex protocol - a built-in exchange protocol; and the Toniq SDK - tooling for developers.
Registries, Marketplaces and Exchanges
Crowdfund NFT
DAB is an open internet service for NFT, Token, Canister, and Dapp registries that apps can consume to auto-surface a user’s assets, the names of canisters they are interacting with, and metadata in their user interface. It makes assets and canisters easily discoverable, with metadata that renders them descriptive, human-readable and safer to interact with.
Entrepot by Toniq Labs - By 2022, Entrepot had grown to support over 20,000 active users, 1.3 million ICP in secondary market volume and over 300K transactions [3].
Toniq Labs is also building the Exponent token platform described above, along with Entrepot “Launchpad” services.
As in the broader blockchain industry, attention is on extending token content, behavior and interoperability beyond Decentralized Finance (DeFi) and toward SocialFi, GameFi and “the metaverse” however that may manifest, materially, in years to come.
Please refer to Appendix 2 for a survey of notable IC NFT Collections.
Token-related Innovations and Standards Transcending a Single Network
As distinct blockchain network architectures and networks individuate, complementary integrations are emerging that span cryptographic boundaries and boundary nodes. This opens up an expansive array of potential multi-chain, omni-chain, or even “meta-chain” token system design patterns and standards. As such, we take a brief look at token systems and standards that transcend a single blockchain network’s standards-making machinery.
Token Taxonomy Framework: Taxonomy initiatives, taking shape in various collaborations, will prove helpful in navigating the future menagerie of token species. Launched in 2019, the Token Taxonomy Framework (TTF) “empowers organizations to adopt and use token-powered services in their day-to-day operations, across use cases and networks, bringing inclusivity to globally distributed applications.” It may be at this level that standardization first transforms into “certification”, as higher-reliability signals and assurances become necessary across unrelated blockchain ecosystems and standardization regimes. The TTF articulates an intricate conception of essential token patterns, relationships and behaviors.
Rosetta API: Another relevant cross-chain standardization effort is found in the Rosetta API standard. Coinbase initially developed the Rosetta API as middleware to securely integrate blockchains into its platform and to “simplify the integration of blockchain-based tokens in exchanges, block explorers, and wallets”. The Data API provides the ability to access blocks, transactions, and balances of any blockchain in a standard format. The Construction API enables developers to write to a blockchain (i.e., construct transactions) in a standard format. To meet strict security standards, implementations are expected to be stateless, operate entirely offline, and support detached key generation and signing. The Rosetta API is specified in the OpenAPI 3.0 format, the most broadly adopted industry standard for remote, machine-readable API descriptions accessible through HTTP or HTTP-like protocols.
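As a flavor of the Data API, here is a hedged sketch of querying an account balance from a Rosetta node, written in Rust using the reqwest crate (with its blocking and json features) and serde_json. The /account/balance endpoint and request shape come from the public Rosetta specification; the host, port, blockchain and network values below are placeholders to be replaced with those of an actual deployment.

```rust
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // All Rosetta endpoints are HTTP POST with JSON bodies.
    let body = json!({
        "network_identifier": {
            "blockchain": "Internet Computer",  // placeholder value
            "network": "<network-id>"           // placeholder value
        },
        "account_identifier": { "address": "<account-address>" }
    });
    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://localhost:8081/account/balance") // placeholder host/port
        .json(&body)
        .send()?
        .json()?;
    // The response carries a block identifier plus a list of balances.
    println!("balances: {}", resp["balances"]);
    Ok(())
}
```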
A number of other groups and collaborations are advancing the state of the art for tokens in exceptionally novel and powerful directions. One such effort equips would-be creators of token-enabled multi-party services, communities and ecosystems with open-source modeling and simulation tools; its agent-based foundations and complex adaptive systems perspective are well suited to analyzing and anticipating how the “tokenomics” might unfold in any given token-enabled cyber-physical system. Collaborating with these, and truly transcending the bounds of popular imagination, is the Active Inference Institute - advancing a mind-bending vision that sees tokens and token standards imbued with sufficient adaptive intelligence to essentially “sort it all out amongst themselves”. Equipped with Active Inference, the differentiation, individuation and evolution of tokens goes on auto-pilot - expanding our view of token-related standardization in ways that connect with parallel work being undertaken to establish norms of data ownership, custody and rights. Purposefully directed toward decentralized science (DeSci), the Active Inference Institute conceives of ActiveBlockference as instrumental in animating this vision.
Finally, it may be instructive to look for the dark matter in the distributed hash table innovation universe where, for many use-cases, tokens are undetectable. The Holochain project - in particular the hREA effort implementing the agent-centric ValueFlows vocabulary - reveals the negative space surrounding tokenization generally, as these projects aim to securely and reliably decentralize multi-party service and community networks in the absence of any singular network-wide blockchain and associated tokenization approaches. One avenue for resolving the outlines of what can be done with tokens may be to appreciate what can be done without them.
Intensifying efforts to identify and standardize universal interoperability, regulatory compliance and other conformance requirements for accounting, reporting, taxation, data privacy, intellectual property and automated decision-making, combined with intensifying regulatory influence and oversight, stand to accelerate boundary-spanning token system cooperation and wide-scale collaboration on recommended design patterns, operational norms, standards and certifications.
Discussion and Next Steps
A salient sentiment emerging in several conversations is that the compounding complexity and sophistication of token system designs is rapidly expanding and shifting the focal point of standards activity well past the ideas expressed by the term “token”. These radical extensions to early token conceptions, enabled by the Intelligent Internet of Things (IIoT) and artificial intelligence technologies, nudge standards development attention toward assuring interoperable behavior of token-enabled compositions and assemblies at higher levels of abstraction. The demands of regulation will naturally pull attention in this direction as well - expanding the perspective, metaphorically speaking, from the security threads woven into dollar bills toward something closer to Generally Accepted Accounting Principles. Reflection on this eventuality may suggest a variety of next steps for this research, exploring standards reuse and interoperability for higher-abstraction behaviors built upon the present token design primitives and standards. Given the experience of the electronics industry, we should not be surprised if such efforts lead to digital objects that resemble present-day tokens as much as the iPhone resembles a transistor.
The IC Token Standards Database is only a small step in assembling and automating some reference information and tools for token standard analysis and experimentation. It will offer continuing benefit only if appended, extended and maintained through the cooperation of IC community contributors. A number of useful extensions are conceivable - for instance, equipping the database with code-pattern generation templates, partially automating deployment of standard token system starter-code.
The Token Standards Database can also be extended with data about the status, versions and other features of standards, and can link to other references and application guidance aiding selection among the available token systems and promoting good implementation and integration practices.
We hope this research helps eliminate ambiguity and clarify important ideas and developments influencing the evolution of token systems and standards on the Internet Computer, contributing to the realization of more accessible, enjoyable and rewarding token-enabled tools, experiences, services and communities.