The Broken Model
We just hit 15 years since the Bitcoin White Paper came out, spawning millions of digital tokens in its likeness. And yet, as crazy as it is, the economic benefit or even the nature of these tokens is misunderstood by the majority of the industry, let alone the broader public! Why so, one might ask?
We blame it on broken incentives – poorly designed digital tokens still generate serious returns from speculative demand. No need to worry too much about the longer-term drivers; better to focus on marketing and liquidity to produce a quick fix.
This logic has led to:
a) projects producing vaporware technology to mask their real purpose of dumping tokens on buyers who are often equally culpable;
b) the select few projects that actually do have merit still treating their token as an afterthought. Everyone copies whatever mechanism is newest (most of the time a new wrapper around the same constant, which is “ponzi-nomics”). This leads to very slow, step-change improvement at the cost of tens of decent projects per step (where each failed design causes the next generation of projects to iterate further);
c) the industry still struggling to find product-market fit for a technology that even detractors (e.g. central bankers) admit is exceptionally valuable – because the founders are too busy optimizing for token-market fit instead.
Basically, all three of the above points mean one thing – innovation in token design and actual utility is extremely slow because it’s rewarding to keep things simple (and dumb) in the short term. To be a “pro” in crypto, whether on the sell-side (founders) or the buy-side (institutional or retail), is to anticipate this pattern by rinsing and repeating while getting numb to the consequences.
Obviously, this is unsustainable. TradFi investors and the broader public – i.e. the people you’d want to “buy your bag” beyond the 8 months of bubbly frenzy – have caught on to it. For the projects that make sense and make money (mostly DeFi), but not so much on the token side, insiders have resorted to just hoping that regulatory clarity from the likes of the US lets them turn on the “fee switch”. This is akin to surrendering to the status quo, in our view.
Somehow, we collectively ignore the fact that tokens are as programmable as the software underlying them and that the design space is limitless. A good token design that captures the value of the application or protocol, we’d argue, can defy pump-and-dump physics, as well as avoid the regulatory burdens of instruments like equity (even if those are adjusted to something lighter post-Trump election).
What we want is a token that best captures the value its underlying product generates for users. We also want it in the hands of anyone who derives value from it – and not just US-based investors. Consider a hypothetical: the core team behind $UNI is based in that jurisdiction, the token is considered a “crypto security” there, and thus it can only be traded on US-licensed “crypto security” DEXes like Uniswap itself (the irony makes this outcome even more plausible). This example is extreme, but that’s where things are headed if we’re lazy and the “fee switch” is the best we can do.
In our fund we certainly do want tokens that work out over the long term because there is a compounding nature to capturing value while expanding network effects – which is ideally what a well-designed token does for a project. And we advise portfolio companies to want the same. Projects ignoring good token design for the sake of a quick cash out are leaving millions and even billions of dollars on the table. Ok, that’s what we want, but does the world need it?
Why Do We Need Tokens
We get this question a lot... until the bull market hits, which then makes everyone just rush a token out without ever questioning the motives. Of course, that lasts until the next downturn comes along and founders are fearful of the market gyrations again.
But the reason for a token on a blockchain to exist, in our view, is of course not to make anyone rich or poor. And your bags will stay cursed (i.e. die over time, dragging even the best projects down with them) for as long as you think it is.
In the simplest terms, the token should exist to guarantee 24/7 global access to the product, the value it generates, or the governance thereof. This point is very important – especially since it gets overlooked.
If you live in a developed economy and have hardly ever been banned from using any internet service other than a crypto exchange, you may not appreciate the fundamental value of accessing your finances, friends, work resources or even entertainment. But many residents of African, Asian or CIS countries are officially cut off from digital platforms and services because they lack banking or accepted social credentials, or because of restrictions imposed by their local government or by the service provider’s government on serving those users.
This is obviously not future-proof in a world where internet access is universal and economic activity is no longer tied to a person’s country of origin or residence. That’s why Web3 is a rising developing-nation phenomenon – well in line with Clayton Christensen’s widely accepted classification of disruptive technological shifts, which always start in smaller, on-the-fringe markets before they catch up with incumbent tech in the broader marketplace.
Ensuring that your Web3 app cannot be banned in any country and can always be accessed or empowered by the token you issue (instead of e.g. credit card payments) is an increasingly important consideration given the geopolitical backdrop. US-China tensions, and their first-order derivatives (Russia-EU), have separated the markets. This trend is not going away soon. The biggest internet businesses are no longer trying to become the global monopolies everyone was so worried about and are essentially compartmentalized into groups of regional champions. No internet application sector has a clear global winner.
Figure 1. Regional Winners of Web2 (Simplified)
This is exactly how we win over those incumbents with Web3 tech: globally accessible, autonomous and hard to censor. The winning Web3 application or protocol is a natural global monopoly in its respective niche. The moat is its network effects, and the vehicle is the token. There is simply no other way for an internet technology to achieve this. So it can’t be overstated how important it is for a digital service to opt into a token and ensure sustainable growth by leveraging its properties1.
It can also be argued that some applications can just exist on a blockchain without a token and collect fees in said blockchain’s native asset (e.g. $SOL or $ETH). If you as a builder don’t need additional contributors or an explicit vessel for network effects to achieve stated goals – by all means pursue such a strategy instead of issuing another short-lived token.
But most of the time the token is the instrument that ties multiple stakeholders/contributors into a decentralized enterprise, as well as a vehicle to capture and redistribute value in a manner that does not involve the collection and distribution of fees – which, as we noted earlier, would make it just a fancy equity-like instrument.
What is Value and What Captures It
Speaking abstractly, value is the benefit a certain product or service gives to its buyer (i.e. cost savings, time savings, a higher level of satisfaction, etc.). Expressed in economic terms, it can often be measured directly in currency. The most straightforward way of capturing the value is by charging customers for the benefit they get from your product/service. As long as price < benefit, the customers will keep paying.
The goal of the product/service provider is to charge as much as possible without affecting demand – that’s value capture. Different businesses approach this differently, so some are better at capturing most of the value they produce and some aren’t as good, which is usually reflected in the price of their shares. This is an oversimplification, and the correct view is much more nuanced2, but the basic principle holds.
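To make the trade-off concrete, here is a toy calculation with purely illustrative numbers of our own (not data from any specific business):

```python
# Toy value-capture arithmetic: a customer buys only while price < benefit,
# and the provider captures price minus cost. All numbers are illustrative.

benefit_to_customer = 100.0   # what the product is worth to the buyer, in $
price = 70.0                  # what the provider charges
unit_cost = 20.0              # what it costs the provider to deliver

customer_surplus = benefit_to_customer - price   # 30.0: why the customer buys
value_captured = price - unit_cost               # 50.0: what the provider keeps
print(customer_surplus, value_captured)
```

Raising the price toward the full benefit increases the value captured, but shrinks the surplus that keeps the customer buying – which is exactly the balancing act described above.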
In the last 20-30 years of the internet boom, which transformed the way business is done, the value creation and capture have been separated in time beyond the usual business cycle, as well as abstracted away and re-hypothecated.
The goal of time separation is usually to outgrow the competition and benefit from the so-called “economies of scale” – a cheaper per-user cost of creating the product. The textbook example is Uber, which turned a profit only some 10 years into its existence while never really benefiting from its scale other than, maybe, through brand recognition.
An example of re-hypothecation of value creation/capture dynamic is Instagram. The perceived customer of Instagram is the user who can access their friends’ feed and post their own content for “free”, but what is actually happening is the company selling their users’ time and attention to advertisers, who are the real customers.
Because of enormous success stories like the examples above, engineering value capture into an internet-based product or service has become a common practice: founders typically take upfront financial leverage from investors in order to expand the addressable market as fast as possible and only then prove their business model.
This approach often leads to large capital dislocations and value capture fading out of relevance as company appreciation driven by market expectations replaces focus on profit margins of the business model. The market tends to correct for such imbalances with violent swings in the opposite direction.
And crypto, which has benefited more than any other tech sector from this perceptual shift, is at the forefront of the trend and thus under the highest risk of its reversal. In the Web3 space, the gap between value creation and value capture is widened even further, with the token being the cause. This disconnect manifests through market pricing – i.e. marginal buyers’ willingness to pay X per token, which is imprecise to say the least.
The value inflow into a Web3 business (i.e. its revenue) can then be expressed as the product of all the financial inflows into the circulating supply of tokens: the $ value of assets exchanged for the said token, times all tokens in circulation. However, if the token is later sold back into $ by, for example, the network’s service providers to cover their costs, it creates a value outflow. So the “capture” in such a scenario is the difference between the value in- and outflows of a token, and ideally you want to maximize the former and minimize the latter.
Figure 2. Token Value Flows and Their Effect on Value Capture (Reflected in Market Capitalization)
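As a rough illustration of the flows in Figure 2, here is a minimal sketch – a toy model with hypothetical monthly figures, not a metric any project publishes:

```python
# Net value capture of a token over a period: inflows are $ exchanged for the
# token by users, outflows are tokens sold back into $ (e.g. by service
# providers covering their costs). All figures below are hypothetical.

def net_value_capture(inflows_usd, outflows_usd):
    """Difference between value flowing into and out of the token, in USD."""
    return sum(inflows_usd) - sum(outflows_usd)

inflows = [1_200_000, 900_000, 1_500_000]    # $ paid into the token by users
outflows = [700_000, 800_000, 650_000]       # $ sold out by Keepers/providers
print(net_value_capture(inflows, outflows))  # 1450000: positive capture
```

Maximizing the first list and minimizing the second is, in this framing, the whole game of token design.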
The reason a token is an imprecise instrument of value capture in Web3 is not just the often-arbitrary pricing based on expectations of the supply/demand dynamic on the market, but also the effects of its liquidity profile. Designing a token that captures value well in this context means creating a clear value proposition for an “organic” buyer of the product/service, who is always joined by speculators – altogether creating a better liquidity profile for the token in question. There is always going to be a bid on a token that reflects organic demand for its underlying product well, provided such demand is present.
If you are a long-term project founder trying to “time the cycle” – there’s no need for that. You can make your token appreciate in price with organic growth of your product if the relationship between the two is direct. This is crucial to making the token work.
What Has Been Working So Far
The digital token that’s worked out the best to date is obviously $BTC. We believe the main reason Bitcoin succeeded where its many predecessors like eCash, b-money, RPOW or Bit Gold failed was the combination of an explicit supply cap and a fully predictable distribution schedule to reach the said cap.
While much has been said about it in relation to the Austrian School of Economics – as in: “the Bitcoin creator(s) wanted to have the hardest possible currency in existence and bring us back to asset-based rather than credit-fueled economic expansion” – that misses the main point. The more practical reason for the hard-capped supply of $BTC is often underappreciated: there is no better way of instilling credibility in a newly created asset than a fixed supply that can’t be tampered with. Add a distribution schedule fully known in advance and you have what it takes. $BTC simply couldn’t have achieved any adoption if it weren’t for these two elements.
Frontloading the majority of supply to reward early miners is the third important component of Bitcoin’s steadfast adoption, but we’d give it a secondary role. Bitcoin could have been less aggressive with the early distribution and still gotten enough traction, given the cap and predictability. Once Bitcoin earned credibility, the second generation of digital asset systems no longer needed to box themselves into as hard a rule set to instill confidence.
The only token model that has sustainably caught on and captured value since $BTC, in our view, is that of the so-called “Layer-1” tokens, a.k.a. access tokens to decentralized cloud platforms for applications. While similar to Bitcoin in many respects, these networks offer a completely different service from what Bitcoin does. Their purpose is to host a wide variety of applications in a decentralized and censorship-resistant manner (as opposed to the previous generation of centralized internet platforms and services).
Bitcoin, on the other hand, is a pure store-of-value (SoV) asset. All of its features – the simple architecture, strong security guarantees, uncensorable and credibly neutral nature, backward-compatible and slow improvement process – serve this one single purpose. And thus, the value that $BTC captures is almost 100% correlated with demand for its SoV properties. It is also a medium of exchange (MoE), of course, but it gains this property after the SoV demand is established (as opposed to L1 tokens, which are used to price assets on their respective blockchains).
The reason for tokens like $ETH or $SOL to exist, on the other hand, is to prevent users from spamming or overloading the censorship-resistant “blockchain computer” behind them – by pricing the network’s compute and storage resources in a way that punishes heavy use. The creators of Ethereum opted to put technical constraints on their version of the globally distributed computer to ensure it remains decentralized and can run on as many nodes as possible, so its capabilities are fairly limited (it’s more of a sophisticated calculator than a modern computer, really). This makes the pricing of access very high relative to functionality, but also provides the best security. That’s why applications on Ethereum are mostly limited to simple arithmetic-based smart contracts for Decentralized Finance (DeFi) and tokenized high-value assets (liquid instrument portfolios, premium digital art, TradFi securities, etc.).
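For a concrete sense of how this resource pricing punishes heavy use, here is simplified Ethereum fee arithmetic (post-EIP-1559 fee structure; the gas and fee levels below are illustrative, not live market rates):

```python
# Users pay per unit of compute ("gas"); heavier transactions consume more gas
# and therefore cost proportionally more. Fee levels below are illustrative.

def fee_in_eth(gas_used, base_fee_gwei, priority_fee_gwei):
    return gas_used * (base_fee_gwei + priority_fee_gwei) * 1e-9

print(fee_in_eth(21_000, 30, 2))    # ~0.000672 ETH: a plain transfer
print(fee_in_eth(300_000, 30, 2))   # ~0.0096 ETH: a heavier DeFi interaction
```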
Solana, on the other hand, optimizes for cheaper resource pricing by imposing higher tech requirements on its node operators, thus limiting their set. This theoretically lowers the network security threshold, but practically keeps it decentralized enough until a very hard test breaks it (at least temporarily) – rendering all assets and applications on the network inaccessible. The most plausible scenario, besides outright network failure (which has happened before), would be the shutdown of a majority of Solana nodes due to a cross-data-center outage, an outright ban, or, more likely, a high-cost compliance regime imposed on validators by the jurisdictions where the majority of Solana-hosting data centers are located (i.e. the West3).
While indeed mostly theoretical, this risk is reflected in the cheap network resource pricing, which in turn affects the applications on it. That’s why we see most of the low-value retail Web3 activity happening on Solana, but not much DeFi value locked on that chain.
Both $ETH and $SOL have solidified the “access token” design as the next evolutionary step from $BTC. Bitcoin has similar optics, because you do need $BTC to transact on it, but only because it’s a simple one-asset system, not an explicit access token. The new L1 designs, on the other hand, are explicitly access instruments, so in order to keep the pricing of resources predictable in native-asset terms over a long time horizon (the “plateau of productivity” after the speculative volatility of the first few years), they need to expand the token supply along with growth in usage. Hence they no longer require a hard cap on supply, given that the nature of their service is to continuously grow by offering more applications to more users in perpetuity.
The best analogy for L1 tokens would be sovereign currencies: the network imposes mandatory use and pricing of its resources in its native currency just as a government would, while also inflating the supply in perpetuity to get itself going. Demand for the currency is high as long as the economy it powers is strong and growing, which allows it over time to acquire medium-of-exchange and store-of-value properties – but those are first-order effects of the economic growth of the network, not its token’s primary function.
We do believe it is best to view L1 tokens as the native currencies of digital economies, where each respective network acts as a digital sovereign with its own laws (system rules), hosting different businesses within this economy according to those rules and imposing a token-denominated tax on them and their customers. While we will see experiments with issuance mechanics going forward, this rough framework should hold – with some of the more prevalent L1 tokens even acquiring “digital global currency” status based on their network’s size and prevalence in the global cyber economy (the digital-world analogs of USD, EUR, RMB or JPY).
So, when designing an L1 token or trying to evaluate one, you must consider the relationship between the “GDP” of the underlying network, the token’s circulating supply and its issuance schedule. We are still in the experimentation phase of L1 monetary policy, with the Ethereum Foundation leading the way with upgrades like EIP-1559, which has been an imprecise attempt to re-capture value lost to the fragmentation we discuss in the next section. We are still far from the optimal form, in our view, but given that L1s’ price action is the most closely tied to organic usage, we find their model to have proven itself sufficiently successful.
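One rough way to formalize the relationship between network “GDP”, circulating supply and issuance is a sketch borrowing the classical equation of exchange (MV = PQ) – our framing, not something these projects publish, and all inputs below are hypothetical:

```python
# Equation-of-exchange sketch: network_gdp_usd (P*Q) is the $ value of resources
# the chain sells per year; velocity (V) is how many times a token changes hands
# in that period. Implied token value = GDP / (supply * velocity).

def implied_token_price(network_gdp_usd, circulating_supply, velocity):
    return network_gdp_usd / (circulating_supply * velocity)

print(implied_token_price(5_000_000_000, 120_000_000, 8))    # ~5.2 $/token
# Doubling usage with flat supply doubles the implied price; doubling issuance
# with flat usage halves it - which is why the supply schedule matters so much.
print(implied_token_price(10_000_000_000, 120_000_000, 8))   # ~10.4 $/token
```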
Now let's discuss what hasn’t worked so far, in our view, before offering some principles to avoid five major pitfalls examined in the corresponding sections below.
Five Major Ways to Destroy the Token’s Value
1. Taking Too Many L’s
The aggregate value that Ethereum brought into the world as of now is the combination of all Ethereum Virtual Machine (EVM) blockchains, Layer 2s and beyond. Even if one considers only the top EVM implementations: Binance Smart Chain, Tron, Optimism, Arbitrum and Polygon – their total market capitalization is around 30% of Ethereum’s market cap at the time of writing (and usually goes up during market expansion).
If one also adds formally distinct smart contract networks like Avalanche, Polkadot, Fantom, and Zero-Knowledge-Proof-based rollups (ZKSync, Scroll etc.) – the biggest usage for which also comes from EVM compatibility, we may see that up to 40% of Ethereum’s value isn’t captured by its token. Now add here a longer tail of smaller networks and private/enterprise implementations and you get the actual aggregate value of Ethereum.
The reason this fragmentation happened is clear: the Ethereum network could not scale up to accommodate that many transactions originally – so the limitation was unavoidable. But the problem, in our view, is exacerbated by the decision to allow different “sidechains”, “rollups”, “application chains” and other forms of standalone networks on top of Ethereum to settle onto it just as any regular user would.
Moreover, Ethereum developers made it even cheaper for rollups to settle on it with a recent upgrade. This means Ethereum lets other networks benefit from its security without explicitly pricing it for them, under the false pretext of growing adoption.
The L2s, on the other hand, have full autonomy in designing their ruleset, which is how it should be. But it also means they are free to use their own native assets as access tokens to their networks: $BNB, $TRX, $POL. Even if they stick to charging users in $ETH, the revenue they collect is orders of magnitude higher than their cost of settling onto Ethereum. There is no requirement for them to hoard $ETH in order to cover security costs.
L2s running on $ETH for the moment but also having or entertaining their own tokens “for governance” (e.g. Arbitrum, Optimism) are almost certain to embed those tokens for network access over time. They are incentivized to do so to avoid an equity-like classification, which puts jurisdictional and other limitations on growth. Also, to keep growing their user base they may end up settling onto a variety of L1s, as alternatives like Solana inevitably eat into EVM’s market share. Essentially, by issuing their own tokens they create an implicit incentive for themselves to expand beyond Ethereum over time, regardless of how “aligned” with it they are at the moment.
This last argument also breaks the “$ETH is money” thesis, which anticipates a constellation of sidechains/L2s/L3s, or any platform leveraging Ethereum’s security, using $ETH as their access token. Given that the genie is out of the bottle and the temptation to realize quick gains via a token launch of one’s own is so high, it’s hard to imagine L2s sticking to $ETH as their access token when the only value-capture mechanic left for their own tokens is the “fee switch” – which, as we noted above, may be sub-optimal for them.
Could Ethereum have avoided losing its value to other EVM chains while still keeping itself open-source? Sure! It should have created a more tightly knit techno-economic framework for L2s and sidechains, e.g. by making them stake a certain amount of $ETH to secure a corresponding level of settlement bandwidth. The Ethereum Foundation has instead been driving development in the opposite direction.
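A minimal sketch of what such a framework could look like (entirely hypothetical mechanics and numbers on our part; nothing like this is specified in the Ethereum protocol):

```python
# Hypothetical stake-for-bandwidth rule: an L2 must lock $ETH proportional to
# the settlement bandwidth (e.g. data slots per epoch) it wants to consume.

STAKE_PER_SLOT_ETH = 100.0    # assumed lock-up per settlement slot per epoch

def required_stake(slots_per_epoch):
    """ETH an L2 would need to lock to reserve the given settlement bandwidth."""
    return slots_per_epoch * STAKE_PER_SLOT_ETH

def can_settle(staked_eth, slots_requested):
    """Settlement is accepted only if the rollup's stake covers its bandwidth."""
    return staked_eth >= required_stake(slots_requested)

print(required_stake(12))     # 1200.0 ETH to reserve 12 slots per epoch
print(can_settle(800, 12))    # False: an under-staked rollup gets throttled
```

The point is simply that anyone benefiting from Ethereum’s security would have to hold and lock $ETH in proportion to that benefit, instead of paying a marginal settlement fee.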
In the section below we explore why this model – offering your tech for free to parties that derive economic benefit from it, while expecting it to grow network effects for your otherwise non-enforced token – is flawed.
2. Open-Sourcing Own Demise
A textbook example of how an absolute category winner in Web3 lost a big portion of the value it created (and, most likely, will lose even more in the years ahead) is the IPFS/Filecoin story. We strongly believe that Protocol Labs, the team behind the distributed storage protocol InterPlanetary File System and the corresponding economic layer on top of it, Filecoin, is probably the strongest Silicon Valley-style team in Web3. They are brilliant engineers who understand distributed systems exceptionally well and can build great protocols and products on top of them.
And yet they approached their project like a typical legacy open-source endeavor: “let’s release the software into the world without imposing any economic rent for using it, build adoption that way, and then create an (un-enforceable) token layer on top of it”.
Obviously, the thinking was that a token can capture more value than, say, classic “enterprise solutions” on top of the open-source software, which had been the business model for such projects to date. To illustrate just how leaky a value capture model that is: Red Hat, one of the leading Linux contributors, was acquired by IBM in 2019 for $34bn. This may seem like a hefty sum, until one considers that Linux runs in all the world’s data centers and Red Hat had been its biggest contributor for over a decade by that point. The global data center market was generating around $300bn per year at the time, and that figure excludes a big chunk of the server costs incurred by the likes of Google and other “hyperscalers”.
It may be a hypothetical, given we’ve never experienced a world where Linux wasn’t open source, but if it were a Web3 protocol and you needed a digital token to run it on your servers, it would probably be valued in trillions of dollars of market capitalization (given the “decentralization premium” a digital common good should have over a single tech company, for the geopolitical reasons outlined earlier).
So, we assume the intention of the team behind IPFS – an absolute winner in decentralized storage, simply owning that category – was to ensure fast adoption first before trying to monetize it.
However, it ended up creating an opening for other competitors to run with IPFS but come up with economic models that are different from that of Filecoin – and thus eat into the share of the market that Protocol Labs created. Arweave is the biggest of them, currently sitting at around 15% of Filecoin’s market value, but there is also a long tail of smaller ones. And we would argue that if Protocol Labs aren’t on their “A game” for the next 10 years, there will be more to come.
They themselves gave their competition an opening. The real winner from legacy open source turned out to be AWS, which added hundreds of billions of dollars to Amazon’s capitalization. Let this be a lesson to our industry as well. A token captures all of these third-party incentives – it’s available to anyone who’s excited about a technology and wants to build it into a business of their own.
IPFS did not have the scalability constraints Ethereum did – it could have accommodated any level of demand from the start. We would argue that if it had had an access token closely tied to it from the start, it might have captured more value over time than $FIL has so far. Way more. And it would have created an impenetrable moat for all token holders around the network effects of IPFS.
Open-sourcing your code while keeping the “reference” implementation mechanics closely tied to your token – which is really the community’s asset rather than the core team’s – is a much better model, in our view, and does not contradict the ethos of openness. It’s the best way to monetize open-source technology, and the fact that developers have gotten away with being lazy about it so far doesn’t mean they can afford to going forward.
Now let’s discuss another common mistake in Web3 token design, which also gave birth to the other two that follow it, before we summarize this essay with a brief overview of the “do-s” derived from these examples.
3. Making Money on the Side (Not the Token)
DeFi applications are the biggest product category that has worked in Web3 so far. They have the most evolved set of smart contracts built on top of L1 chains, which means those projects “outsource” the core backend and operational costs to the blockchain in exchange for a one-time deployment fee. After that, the contracts work autonomously, so ideally, as a decentralized app, you only need to maintain user-friendly access to the application (via a UI) and work on code upgrades if needed. This significantly shortens go-to-market time and reduces team overhead, while also benefiting from the 24/7 global distribution provided by the L1 – making DeFi a very attractive business model for financial applications.
However, the same cannot be said about their tokens. The main issue lies in the incentive misalignment among token stakeholders. Let’s dissect it.
The original DeFi coin, in our view, is $MKR of Maker Protocol, which served two main purposes:
- Application buildout via direct distribution into the hands of the most active capital or labor contributors.
- Risk management asset, a.k.a. a buffer of perceived value, standing to protect the application from bad debt.
The early community around Maker Protocol, which originated as a money market application where anyone could provide their $ETH as collateral to borrow a dollar-pegged stablecoin against it, was able to grow organically into a well-functioning risk management group behind the application’s key parameters. Those contributors were directly rewarded in $MKR token from the project’s treasury, managed by the core team of developers, and could propose and vote on proposed changes using said token.
Thus, the Maker team outsourced ongoing risk management to the community, while controlling the software implementation and the token treasury. They also originally sold tokens to investors via batches in private transactions before the direct-listing-style token generation event (TGE) that allowed retail users to get access to it almost a year into development.
Given that the original distribution was tied to actual contributions, this did place the tokens into real community members’ hands and attached a floor value equal to the time/money they had spent. However, using the token as a risk management tool, with very little time to establish its value before big tests emerged, proved to be suboptimal for its performance.
The governance of Maker, while one of the most well-established and sophisticated in the space, settled primarily with the team + long-term members and biggest private backers, while the market was just using the token to speculate on Maker’s growth prospects, without an explicit value proposition for the token itself. So, while it started quite organically, the actual distribution of governance hit a ceiling at a certain point as the team did not do much to further it.
The risk management buffer role of the token essentially didn’t work when the application got stress-tested during the 2019-2020 drawdown in $ETH, culminating in the March ’20 crash. And while it didn’t kill the token, it killed the mechanism, and Maker had to resort to using the dollar-hard-pegged $USDC as collateral, losing the value proposition of its “decentralized” stablecoin. $MKR did accrue a small portion of the fees generated by the app, but most of the value was captured by the so-called liquidators – third parties providing liquidity when borrowers went underwater. The liquidators had no explicit use for $MKR other than to arbitrage its price (i.e. dump it) by buying it at a discount from Maker whenever the application accrued bad debt.
The second big iteration for token design in DeFi happened as two liquidity-dependent applications – a trading venue, Balancer, and a collateralized lending platform, Compound – decided to incentivize quick growth of their respective capacities by giving out tokens to liquidity providers. This helped both projects to quickly boost user-owned liquidity on their platforms, so the model got expanded onto multiple other projects of the same cohort with the ultimate one being a retroactive airdrop to reward both users and liquidity providers of Uniswap, the biggest decentralized exchange.
What this model of aggressive token distribution did not consider was the incentive misalignment between liquidity providers (LPs) and the applications’ long-term growth. Specifically, the fact that LPs retained ownership of the provided liquidity even after having been rewarded for it allowed them to earn token incentives without committing to the application. This obviously led to multiple “liquidity wars” in the space and little stickiness of capital or users to the actual applications. The only products that survived this wave were the ones that, we’d argue, would have done so without a token, because they had organic liquidity and usage to begin with. But they too, e.g. Uniswap, were forced to issue one in order not to lose out to competitors with tokens.
Since very little thought was given to most of these rushed releases (as there was a fear of losing market share), most of that generation of DeFi tokens ended up being simple voting instruments with no profit rights. Some tweaks to the incentive mechanism design were made later – e.g. Olympus DAO acquiring liquidity from providers rather than incentivizing it – but nothing radical enough to stop DeFi tokens from bleeding out amid unclear value-capture mechanics, given the regulatory fears around the straightforward “fee switch”.
The teams that didn’t fear regulation (mostly non-US-based) ended up distributing application revenues to token holders, which did support their tokens’ prices even during a bear-market drawdown (e.g. $GMX during 2022). But those designs remained relatively straightforward and thus leaked a lot of the value the application generated, since in this decentralized context a big portion of DeFi revenues flows directly to liquidity providers and similar application maintainers (we broadly call them “Keepers”).
So, while some DeFi protocols are making tens of millions of dollars in daily revenues, the majority of it isn’t captured by the token holders, but by direct contributors instead. The thinking of the pro-“fee switch” camp, which does hold merit in our view, is that even if the application itself or its token holders capture a smaller portion of those revenues, the global, decentralized nature of the app makes the total addressable market so much bigger that it would still imply a very high value.
But we believe that embedding the token more closely into the Keeper-related mechanics (e.g. liquidators, liquidity providers etc.) would tap directly into the source of value a DeFi app can generate instead of letting them make most of the money while token holders are pushed to the side.
4. Play-to-Churn
A similar story to that of DeFi tokens has played out across other sectors, primarily Web3 games, as applications adopted the DeFi model of aggressive incentives to attract usage. However, no one to date has been able to balance aggressive distribution with enough demand. Most solutions come down to creating artificial “sinks” that look something like “stake your token to get more tokens, but in X time” – which is obviously still net inflationary to the supply; the expectation is to grow enough demand by the time such distribution happens (a.k.a. a ponzi scheme).
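To make the “still net inflationary” point concrete, here is a toy calculation with made-up numbers:

```python
# "Stake your tokens to get more tokens later" delays new supply, it does not
# remove it. All numbers below are made up for illustration.

supply = 1_000_000          # circulating tokens today
staked_share = 0.60         # fraction of supply locked in the staking "sink"
staking_apr = 0.40          # rewards paid out in newly minted tokens

new_tokens = supply * staked_share * staking_apr
print(supply + new_tokens)  # 1240000.0: ~24% more supply once rewards unlock
```

Unless demand grows at least as fast by the time those rewards unlock, the extra supply simply hits the market.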
Gaming is just the biggest sector applying these mechanics; almost every other Web3 app is trying the same formula, always with the same result. Focusing on the incentives attracts mercenary users (just as it attracted mercenary capital in the case of DeFi), who only care about getting a quick payout. Those users by default wouldn’t be interested in spending in the game other than to guarantee themselves an even bigger payout.
This creates a very quick churn of users in Web3 games and similar consumer apps – a new “meta” gets discovered, quickly exploited to the hilt and abandoned as soon as incentives dry up (whether because it becomes crowded, or because the fast-depreciating token price no longer justifies the time/effort spent on the app).
All the biggest success stories of the sector that actually had meaningful user adoption – “Axie Infinity”, “Stepn”, and recently “Hamster Kombat” – have precisely the same token trajectory: extreme growth and an even faster death spiral, brought about by aggressive distribution of a token without a clear value proposition to the user.
We strongly believe that in Web3 your token’s price chart is part of the product and if it’s really bad, it may drag the project down with it. It’s very hard to un-fuk a project with a fuked up chart.
Figure 3. Effective Lifespan of Two of the Biggest Web3 Consumer Apps to Date
The main concern in these kinds of projects is balancing supply-demand dynamics. However, as we mentioned above, the incentive addiction has taken its toll on designers – most projects can’t escape the loop of quick “dopamine hits” and just try to incentivize everything they want from a user by printing more tokens for them. This is always inflationary to the supply and attracts the “wrong” kind of users, no matter how brilliant the token “sinks” may be – hence we’re yet to see one of these models work.
The main issue in Web3 gaming, and in the consumer application space generally, arises from the self-imposed pressure on teams to list tokens ASAP and “monetize” their apps by dumping them under the pretext of “marketing spend”, “ecosystem development” and whatever other meaningless term they come up with to dupe the public as to its true meaning.
Most gamers feel comfortable buying in-game currency directly from the developer. Then why dilute everyone’s value, including your own, by trying to create a market long before the said value for the asset is established?
5. What’s the Point?
Fed up with selling-pressure risks, but stuck with the mindset that incentives drive usage, some Web3 teams have recently resorted to the ultimate “solution” – points. Instead of committing explicitly to a certain amount of tokens to be distributed as incentives, they offer users an internal leaderboard system that rewards the activities the application deems valuable. There is an implicit expectation that points will correspond to a certain allocation in the token distribution, but the team never explicitly commits to it, which, as the thinking goes, should give the project more flexibility.
The understanding among these teams is that it helps with adoption of their product as much as token distribution would, but also helps reduce the final allocation to users and extend it in time – so that when it finally happens the application is established in its niche and usage is sticky. The pioneer on that front was the NFT marketplace Blur, which drove significant volume onto its platform as speculative traders raced to offer their liquidity in exchange for points.
However, the approach does not take into account that it still primarily attracts users who want to benefit from the airdrop, and hence creates little stickiness for the token itself, regardless of how separated in time the promise and the actual delivery are – which is perfectly reflected in the $BLUR price. A distribution deemed unfair by those “airdrop farmers” will have an effect just as negative as if the entire “community allocation” had been dumped at once under the previous model of direct distribution, since they still receive the majority of the tokens in existence at the time of the airdrop, regardless of how small a percentage of the total supply their allocation may be.
One may offer a counter-argument that Blur still benefited from growing its user base and expanding liquidity in the NFT market by introducing points. We are certain that, given the lack of serious competition and a good enough user experience tailored specifically to NFT speculators, that would have happened anyway. Point-based distribution only left a bad taste in those users’ mouths and slowed down the token’s momentum.
Figure 4. Blur's Token Price Chart, Source: CoinMarketCap
The team and investors have essentially squandered a big chunk of their upside with this approach, as it didn’t really change much about the original distribution mechanics of DeFi applications. It’s essentially the same incentive structure that doesn’t attract sticky users and ends up negatively affecting the token if the distribution is deemed unfair. Which it always will be – especially if the program is successful and attracts many users, which lessens every individual user’s payout. Given that the payout is often their only goal, the less they get, the more viciously they will dump it. So it’s almost always a negative for the token given such baked-in incentives.
How to Design a Token
We have covered multiple examples of how not to design a token, but what does it tell us about optimal design? We won’t share all of the “secret sauce” just yet, given we need to prioritize portfolio company work, but here are the core principles derived from the examples above, which contain all of the necessary knowledge:
- Token incentives are overhyped. They frontload adoption, but also attract mercenary users. A good product can do without aggressive incentives and won’t shoot itself in the foot by adding them early. Aggressive incentives almost always end with a short-lived project. Incentives only work as a defensive strategy – when your market share is being attacked. In that case it’s most important not to box yourself in with too high a distribution or overly specific mechanics, as those will most likely need to be adjusted afterwards.
- A token as an access instrument has been proven to work (as long as there is demand for the underlying product/service). Access tokens aren’t reserved only for L1s – you can enforce token-gated access in a smart contract just as well (see the sketch after this list). It doesn’t have to contradict offering your service for free early on, which has been a winning adoption strategy for consumer internet services for over 20 years – you can still reward early adopters or retroactively airdrop to them, but only as much as they need to maintain access.
- Some tokens do not require external markets, at least initially. The only reason for an early listing is to allow both the team and other stakeholders to sell their allocations freely into an expected inflow of capital. However, the inflow is often much smaller than the outflow in a project’s early stages, which leads to fuking up the chart, which in turn fuks up the project – and the dynamic is hard to un-fu*k. As we noted a couple of times across this essay, creating organic demand and expanding on it will always attract speculators wanting to frontrun it, even in bear markets. So a product with an inflow into its token, based on the token being tightly knit into it, will also benefit from additional inflow almost by default. But this requires establishing the demand before establishing broader market access, and never the other way around.
- Value capture can be programmed into mechanics. Given the simple yet powerful concept of value in- and outflow illustrated in Figure 2, one can see how, by placing the token at the right spot in the mechanism design of the application or protocol, one may achieve even greater value capture than in a traditional business, given that Web3 is mostly automated and even autonomous. The goal is to identify the root of the economic relationship a product or service represents and make the token a focal point of it without negatively affecting the user experience. Ideally you’d want the token abstracted away from the user or much simplified.
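To illustrate the access-enforcement idea from the list above, here is a minimal sketch with entirely hypothetical names and logic (not any specific protocol’s design): the service can only be consumed by locking the native token, so every unit of usage creates direct demand for the token rather than a side payment.

```python
# Token-gated access enforced inside the application's own logic: no token,
# no service. Hypothetical class and parameters, for illustration only.

class AccessGatedService:
    def __init__(self, tokens_per_request: float):
        self.tokens_per_request = tokens_per_request
        self.locked = {}            # user -> tokens locked for access

    def deposit(self, user: str, amount: float) -> None:
        """User locks native tokens to pre-pay for access."""
        self.locked[user] = self.locked.get(user, 0.0) + amount

    def serve(self, user: str) -> bool:
        """Serve a request only if the user has enough locked tokens."""
        if self.locked.get(user, 0.0) < self.tokens_per_request:
            return False            # no token, no access
        self.locked[user] -= self.tokens_per_request
        return True

svc = AccessGatedService(tokens_per_request=2.0)
svc.deposit("alice", 5.0)
print(svc.serve("alice"))           # True: usage consumes the token
print(svc.serve("bob"))             # False: access denied without the token
```

In an actual deployment this logic would live in the application’s smart contracts; the point is simply that access, not an optional fee switch, is what ties usage to the token.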
We are at the cusp of a new era where tokens are accepted more broadly. The winning designs will determine how the Web3 innovation cycle plays out – whether we will revert to slightly improved status quo or create truly new forms of internet businesses, with new distribution and access dynamics, as well as bigger moats. It’s up to us to shape it and we’re here for it!
1. We intuitively sense that a correct business-oriented (rather than monetary) token design, which is still to be discovered and solidified, is very much in line with the economist Brian Arthur’s theory of Increasing Returns, more commonly referred to as “network effects” in our industry. The latter term has been somewhat watered down, stripping the phenomenon in question of much of the insightful context one may find in Arthur’s work. ↩
2. Specifically, there is academic literature covering the strategies businesses may implement when charging customers. For example, “charge as much as you can” can reduce the overall lifetime value (LTV) of a customer and may not always be optimal. By way of example, TSMC, which holds high pricing power over its customers, deliberately does not max it out, which, as some argue, is the cornerstone of its long-lasting reign over the chip fabrication industry. Numerous supply chains are critically dependent on TSMC in no small part because of the company’s approach to pricing. ↩
3. Solana’s node hosting used to be based almost entirely in one cloud provider’s (Hetzner’s) German data centers a few years ago, so things have been improving ;-) ↩