Tokenomics Modeling for Tokenomics Designers
Space and Time added Space and Tokens, formerly Cenit Finance, to the SXT product suite in August 2024. Space and Tokens is a cutting-edge tokenomics platform designed to help protocols optimize their token utilities and distribution strategies. With this integration, protocols can simulate tokenomics, better anticipate token performance and market risks, and build comprehensive analytics and dashboards around their token economy, all powered by SXT's verified onchain data. This content was originally published on the Cenit Finance website.
Designing a token economy is a complex task. There are many pieces to assemble and numerous factors that can lead to a quick meltdown. A deep understanding of your value proposition and how you accrue value is key to determining your token economy. The task becomes even more complex when your promises to early investors and the community must be fulfilled according to your vesting schedule. Add speculation effects, liquidity management, and you have a complex mix of hundreds of unexpected corner cases that are impossible to keep straight in your head. This is why it's so important to model your token economy while designing it. Making a digital version of your token plan helps you see potential problems before they happen.
At Space and Tokens, we've built a top-notch tool for tokenomics modeling. This tool helps token designers double-check their ideas and answer big questions about their token economy. Questions like what market share they need to hit a certain value, or when is the best time to use tokens for funding, are made simpler with our platform. Our tool uses simple forms to help you map out your tokenomics plan, like filling in a template.
Tokenomics Modeling: Set up a Token Economy
We have come up with a list of steps to structure your tokenomics modeling in our platform.
1. Token Allocation
Although the token allocation and vesting schedule should ideally be among the last pieces of your economy, defined only after your business model, activity, and token dynamics, for many projects they are among the first pieces already in place, so we can start with them.
We will start by introducing the allocation parameters into our token simulation platform, although we will surely need to tune them later. When defining the vesting schedule, there are five key elements for each agent that receives an allocation:
- Allocation: how many tokens an entity on the vesting schedule receives.
- Vesting start: when the vesting process starts.
- Cliff: a period during which no tokens are distributed, to ensure recipients are committed for the long term; the amount that would have vested during the cliff is distributed all at once when it ends.
- Duration of the vesting: the time over which an entity receives all of its tokens.
- Agent type: the behavior each agent exhibits in your token economy. Stakeholders are those who at some point invested in your project and will look for an economic reward, such as the team, investors, etc.; these agents sell their tokens over time. The treasury funds operating costs, incentives, rewards, and giveaways at specific times. Finally, the market maker provides liquidity to the market.
By filling in this information, we will be able to determine the first approximation of our token selling pressure schedule.
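As a sketch of how those five parameters produce an unlock schedule, the snippet below encodes them directly. The `Vesting` dataclass and `unlocked_at` helper are hypothetical names (not the platform's API), assuming linear vesting with a lump-sum release at the end of the cliff, as described above:

```python
from dataclasses import dataclass

@dataclass
class Vesting:
    allocation: float  # total tokens granted to this agent
    start: int         # month the vesting process starts
    cliff: int         # months after start with no distribution
    duration: int      # months from start until fully vested

def unlocked_at(v: Vesting, month: int) -> float:
    """Tokens unlocked by a given month under linear vesting with a cliff."""
    if month < v.start + v.cliff:
        return 0.0  # still inside the cliff: nothing distributed yet
    elapsed = min(month - v.start, v.duration)
    return v.allocation * elapsed / v.duration

# Hypothetical team grant: 20M tokens, 12-month cliff, 36-month vesting.
team = Vesting(allocation=20_000_000, start=0, cliff=12, duration=36)
# When the cliff ends, 12/36 of the allocation unlocks at once.
```

Summing `unlocked_at` across all agents whose type sells over time gives the first approximation of the selling pressure schedule.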
2. Value Propositions, drivers of the economy
The success of a company depends on the success of its value proposition, and the same applies to token success. The value proposition is the means by which the protocol generates value, while the token utilities are how the token accrues it, and the forecasted growth or volume in that value proposition determines how much value the token accrues. Thus, for modeling purposes, having a clear value proposition and a business plan for it is key.
In our platform, we define the value proposition growth KPIs mainly as volume, the number of users, or the number of transactions, but there might be other metrics for your specific case.
Each protocol might have one or more value propositions, so we should indicate each of them.
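For instance, a single growth KPI such as monthly volume can be projected with a simple compound-growth assumption. The function and figures below are illustrative, not part of the platform:

```python
def project_volume(initial: float, monthly_growth: float, months: int) -> list[float]:
    """Project a value-proposition KPI forward with compound monthly growth."""
    series = [initial]
    for _ in range(months - 1):
        series.append(series[-1] * (1 + monthly_growth))
    return series

# Hypothetical DEX trading volume: $2M/month growing 8% per month for a year.
volumes = project_volume(2_000_000, 0.08, 12)
```

Each value proposition in the model would carry its own KPI series like this one, which later feeds the buying-pressure calculations.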
3. Token Utilities and Their Effects
Some token utilities might be linked to a specific value proposition (for instance, a service fee that goes towards token stakers), while others might be linked to the protocol as a whole (such as governance or incentives).
When defining the value proposition, we will be able to specify each of its associated utilities.
To model the tokenomics, we need to understand the effect of the utilities on the token buying pressure, which is different for each.
- For some token utilities, such as fee burning or fees sent to the treasury, a directly proportional amount of buying pressure is generated.
- For other token utilities, such as staking, the amount of buying pressure generated is proportional to the first derivative of the volume in the utility (i.e., how the volume changes over time). Think about this for a second: if my protocol grows the revenue shared with stakers to $1M, and stakers expect an APY of 10%, more and more stakers will come until $10M has been staked. However, once that has happened, no more stakers will come unless my revenue also increases; otherwise the APY would fall below 10%, and some stakers would no longer find it attractive relative to alternative investments and would leave over time as their tokens unlock.
Something similar happens to those protocol members that stake to become premium members. As long as the protocol keeps growing, more value will be stored as memberships, but once the protocol does not grow anymore, there will be no more staking demand.
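The two regimes above can be sketched in code. `burn_buy_pressure` and `staking_buy_pressure` are hypothetical helpers, assuming fee burning scales with volume itself while staking inflows scale with the change in revenue at a target APY, per the reasoning above:

```python
def burn_buy_pressure(volume: float, fee_rate: float) -> float:
    """Fee-burn utility: buying pressure proportional to volume itself."""
    return volume * fee_rate

def staking_buy_pressure(revenue_now: float, revenue_prev: float,
                         target_apy: float) -> float:
    """Staking utility: the stake equilibrium is revenue / APY, so new
    buying pressure is proportional to the *change* in revenue."""
    return (revenue_now - revenue_prev) / target_apy

# Revenue shared with stakers grows $1.0M -> $1.2M; at a 10% target APY,
# stakers absorb an extra $2M of stake. Flat revenue attracts no new stake.
```

Note that when revenue is flat, `staking_buy_pressure` returns zero while `burn_buy_pressure` keeps generating demand, which is exactly the contrast drawn above.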
As a result, we can see that some utilities let the token accrue value even without growth in usage (the token benefits from the existing volume alone), while others accrue value only while the protocol keeps growing. There is no single best choice; it will depend on your specific case. We are now aware of the key elements we need to model and introduce as inputs to generate a full picture of buying and selling pressure. Now, let's see how to analyze our modeled tokenomics and improve them.
Analyze and Improve Your Tokenomics Model
Once we fill in all the information about the project, the simulator will run the tokenomics model and produce a results dashboard.
1. Simulation Result Analysis
While the token price is usually what everyone pays attention to first, the buying and selling pressure charts are even more informative for understanding a token economy in depth.
Let’s see how the simulation works:
From the value propositions, vesting schedule, and token utilities, we can determine the token buying and selling pressure over time using the mechanisms described above. The token price then changes over time with the net buying pressure: if the net buying pressure is negative, more tokens are sold than bought, and the price goes down.
Notice that for modeling purposes, we do not take into account speculative movements, so here we analyze the token's organic price. Since the buying/selling pressure is what determines the token price, by analyzing the contribution of the different agents in the graph we will know who is impacting more positively or negatively on the economy.
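A minimal sketch of that price mechanism, assuming a hypothetical `depth` parameter that stands in for market depth (larger depth means less price impact; this is not the platform's actual pricing model):

```python
def simulate_price(p0: float, buys: list[float], sells: list[float],
                   depth: float) -> list[float]:
    """Toy organic price path driven by net buying pressure each period."""
    prices = [p0]
    for b, s in zip(buys, sells):
        net = b - s  # net buying pressure; negative means net selling
        prices.append(max(prices[-1] * (1 + net / depth), 0.0))
    return prices

# Three periods of hypothetical pressure against $10M of market depth.
path = simulate_price(1.0, buys=[3e5, 4e5, 2e5], sells=[5e5, 3e5, 2e5],
                      depth=1e7)
```

Because the path is driven purely by the modeled pressures, decomposing `buys` and `sells` by agent shows exactly who is pushing the price up or down, as described above.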
2. Improve your Token Economy
Now, if we want to improve our token economy, we can do so by augmenting or reducing the contribution of each agent. This can be done by adjusting the parameters of the token economy, such as service fees or vesting schedules, and analyzing the effect of those changes.
In the example above, for a DEX, our modeling shows that at the beginning a large amount of incentives is sold by liquidity providers and traders, probably coming from an airdrop. If we reduce that airdrop or introduce a time-lock so those tokens are not dumped so early, our system will improve considerably.
We might also want to increase the amount of tokens that stakers are buying to compensate for the selling pressure from stakeholders (team, investors, etc.), so based on the amounts given in the simulation, we can calculate exactly how much the revenue share should be increased. This is an iterative process, since in a dynamic system the pieces inform each other. At the end of it, we will have a robust token economy, properly modeled and easily explainable to other stakeholders or investors.
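As an illustration of that calculation, under the same staking assumption as before (staking inflows equal the change in shared revenue divided by the target APY), the revenue share needed to offset a given stakeholder selling pressure can be solved directly. All names and figures here are hypothetical:

```python
def required_revenue_share(stakeholder_sell_pressure: float,
                           revenue_growth: float,
                           target_apy: float) -> float:
    """Share of revenue growth that must go to stakers so that staking
    inflows (share * revenue_growth / APY) offset a given selling pressure."""
    return stakeholder_sell_pressure * target_apy / revenue_growth

# $500k/month of unlock selling, revenue growing $400k/month, 10% target APY:
share = required_revenue_share(500_000, 400_000, 0.10)  # 12.5% of revenue growth
```

Feeding the adjusted share back into the simulation and re-checking the pressure charts is one turn of the iterative loop described above.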