Modeling & Optimization - Tokenomics Design Course (Part 9)
Modeling and Optimization in Tokenomics
Introduction to Modeling and Optimization
- The video covers Step 6 of the tokenomics design process: modeling and optimization.
- This step is crucial, and additional resources are provided to aid understanding.
Importance of Market Comparisons
- Best practices involve analyzing market comps (comparable projects), as they provide market-validated models rather than theoretical ones.
- Utilizing real-world data from similar projects helps create more realistic growth projections for network modeling.
Purpose of Modeling
- Modeling is not intended for predicting token prices but for analyzing risks associated with different scenarios.
- Scenario analysis becomes vital when assessing collateral value within a system, highlighting the need for risk management.
Objectives of Effective Modeling
- Publicly releasing modeling and scenario analyses demonstrates thorough risk consideration, distinguishing professional teams from amateurs.
- Key objectives include identifying critical assumptions that must hold true, quantifying potential breaking points, and optimizing incentive mechanisms.
Risk Analysis through Practical Examples
- Examples such as wash trading exploits illustrate how specific emission schedules can lead to identifiable risks that need quantification.
- Historical cases like Luna and UST show that proactive risk identification could have mitigated severe outcomes; however, insights must be acted upon by builders.
Deterministic vs. Stochastic Modeling
Understanding Deterministic and Stochastic Modeling
Overview of Deterministic Modeling
- Deterministic modeling involves making predictions based on fixed inputs, allowing users to estimate growth, pricing, or user numbers without needing coding skills.
- This approach is accessible for end-users, enabling them to manipulate data in a spreadsheet format and derive insights independently.
- While deterministic models can be informed by empirical data (e.g., market comparisons), they lack comprehensive risk analysis capabilities.
- Users must manually consider risks, which can lead to overconfidence as only known risks are assessed.
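A deterministic model of this kind can be sketched in a few lines of Python: fixed inputs, one fixed output path, no randomness. All parameters below are hypothetical placeholders, not figures from the course.

```python
# Deterministic projection: fixed inputs produce a single fixed output path.
# All parameters are hypothetical placeholders.
monthly_users_start = 1_000
monthly_growth = 0.10        # assumed 10% month-over-month user growth
avg_fee_per_user = 5.0       # assumed USD fees generated per user per month

users = monthly_users_start
revenue = []
for month in range(1, 13):
    revenue.append(users * avg_fee_per_user)
    users *= 1 + monthly_growth

print(f"Month 12 projected revenue: ${revenue[-1]:,.2f}")
```

Because every input is fixed, rerunning the model always yields the same answer — exactly what makes it spreadsheet-friendly, and exactly why it cannot quantify risk on its own.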
Transitioning to Stochastic Modeling
- The speaker advocates using both approaches iteratively: start with deterministic models, then advance to stochastic ones for deeper insights.
- Stochastic modeling requires coding skills and uses Python libraries such as cadCAD and TokenSPICE, or GUI tools such as Machinations.
- Unlike deterministic models that use fixed inputs, stochastic models generate random paths based on observed data, simulating various outcomes through multiple iterations.
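A minimal sketch of the stochastic approach, assuming a small hand-written sample of daily returns standing in for real observed data: each run bootstraps (resamples) the returns into a random price path, and many runs produce a distribution of outcomes instead of one estimate.

```python
import random

random.seed(42)

# Hypothetical sample of observed daily returns
# (in practice this would come from real market data).
observed_returns = [0.01, -0.02, 0.005, 0.03, -0.015, 0.0, 0.02, -0.01]

def simulate_path(start_price: float, days: int) -> list[float]:
    """Generate one random price path by bootstrapping observed returns."""
    path = [start_price]
    for _ in range(days):
        path.append(path[-1] * (1 + random.choice(observed_returns)))
    return path

# Many iterations -> a distribution of outcomes instead of a single estimate.
paths = [simulate_path(100.0, 30) for _ in range(1_000)]
final_prices = sorted(p[-1] for p in paths)
print("5th percentile final price:", round(final_prices[50], 2))
```

Percentiles of the final-price distribution (like the 5th percentile above) are what turn a price model into a risk model.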
Advantages and Disadvantages of Stochastic Modeling
- This method allows for comprehensive risk analysis by assessing numerous variables simultaneously (e.g., Bitcoin price fluctuations, user growth rates).
- It helps identify non-obvious system flaws but demands technical expertise for building, running, and interpreting the results.
Key Considerations in Modeling
- Effective modeling replicates the system's functionality; it can range from simple spreadsheets to complex Python models.
- Inputs should include both controllable factors (like token supply and fees) and uncontrollable factors (market prices and third-party oracle inputs).
Stress Testing Framework
- Varying inputs helps assess vulnerabilities within the model. For example, testing how quickly a marketplace could become undercollateralized if market prices fluctuate rapidly.
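One way to run that stress test, sketched below with hypothetical parameters: vary the speed of a price drawdown and count how many days pass before a loan becomes undercollateralized.

```python
# Stress test (hypothetical parameters): how fast can a loan go underwater?
loan_value = 70.0          # outstanding debt
collateral_price = 100.0   # starting oracle price of the collateral

results = {}
for daily_drop in (0.05, 0.10, 0.20):   # 5%, 10%, 20% daily price declines
    price, days = collateral_price, 0
    while price > loan_value:           # undercollateralized once price <= debt
        price *= 1 - daily_drop
        days += 1
    results[daily_drop] = days
    print(f"{daily_drop:.0%} daily drop -> undercollateralized after {days} day(s)")
```

Sweeping one uncontrollable input (the drawdown speed) while holding controllable ones fixed shows how much reaction time a liquidation mechanism actually has.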
Marketplace Dynamics: Borrowing Against NFTs
Overview of Borrowers and Lenders
- Borrowers possess NFTs but prefer to borrow against them rather than sell, seeking liquidity without losing ownership.
- Lenders have available funds but lack income; they aim to earn by lending against NFT collateral.
Assumptions in Minting Loans
- The system assumes that the oracle's price accurately represents the value of the NFT collateral.
- To make this valuation more conservative, the protocol could use the minimum of the current price and an N-block moving average.
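That conservative valuation rule can be written directly; the window size N is a design parameter, and the prices below are made up for illustration.

```python
def conservative_price(price_history: list[float], n: int) -> float:
    """Value collateral at min(current price, N-block moving average).

    Taking the minimum means a sudden upward price spike cannot
    immediately inflate borrowing capacity.
    """
    current = price_history[-1]
    window = price_history[-n:]
    moving_avg = sum(window) / len(window)
    return min(current, moving_avg)

# A spike to 150 is discounted back toward the recent average.
print(conservative_price([100, 102, 98, 101, 150], n=5))  # → 110.2
```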
Risk Control Measures
- Implementing controls can mitigate risks associated with rapid price fluctuations leading to under-collateralization.
- If the NFT's value is overestimated, setting a lower maximum loan-to-value (LTV) ratio can limit potential losses for lenders.
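A sketch of how a maximum LTV caps lender losses, with hypothetical figures: even if the oracle overestimates the NFT's value by 25%, a 50% LTV keeps the loan well below the true realizable value.

```python
def max_loan(oracle_price: float, max_ltv: float) -> float:
    """Largest loan the protocol will mint against the collateral."""
    return oracle_price * max_ltv

oracle_price = 100.0   # oracle's valuation (hypothetical)
true_value = 80.0      # actual realizable value: oracle overestimates by 25%

loan = max_loan(oracle_price, max_ltv=0.5)
print(loan, "loan vs", true_value, "true value")  # loan of 50.0 still < 80.0
```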
Stochastic Modeling for Damage Control
- By varying parameters in stochastic models, one can assess potential losses during extreme market conditions.
- This modeling helps identify necessary risk control measures and their effectiveness in maintaining system stability.
Lender Considerations When Lending Against NFTs
Confidence in Liquidation Value
- Lenders must believe they can find buyers at reasonable prices if liquidation becomes necessary.
- Allowing pre-bidding on NFTs during liquidations provides lenders with assurance regarding sale prices.
Maintenance Margin as a Safety Net
- Setting a higher maintenance margin increases safety: liquidation is triggered earlier, leaving more buffer before lenders take a loss.
- A larger margin allows the system to withstand greater drops in NFT value without lenders incurring losses.
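The safety buffer can be expressed as a simple liquidation condition; the thresholds below are illustrative, not recommendations.

```python
def should_liquidate(collateral_value: float, debt: float,
                     maintenance_margin: float) -> bool:
    """Trigger liquidation once collateral no longer covers
    debt * (1 + maintenance_margin)."""
    return collateral_value < debt * (1 + maintenance_margin)

# With a 20% margin, a loan with 70 of debt is liquidated once collateral
# value falls below 84, leaving room to sell the NFT before lenders lose.
print(should_liquidate(collateral_value=90.0, debt=70.0, maintenance_margin=0.2))  # False
print(should_liquidate(collateral_value=80.0, debt=70.0, maintenance_margin=0.2))  # True
```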
Economic Security Framework
- This framework aids in understanding economic risks within the lending system and informs decisions on what parameters to model and test for effectiveness.
Token Modeling Template Insights
Exploring NFT Marketplace Examples
Modeling Wash Trading Risks in NFT Marketplaces
Understanding Wash Trading and Token Emissions
- The discussion begins with the concept of wash trading, where individuals trade NFTs back and forth to earn token emissions from a platform. This practice poses risks if it remains profitable for traders.
- To model this risk, it's essential to analyze factors such as expected value of emissions, fees incurred during trading, and overall token value.
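Those factors combine into a simple profitability condition. A sketch with hypothetical numbers, assuming emissions are paid pro-rata to trading volume (and ignoring the wash trader's own volume inflating the total):

```python
def wash_trade_profit(volume: float, fee_rate: float,
                      emissions_usd: float, total_volume: float) -> float:
    """Expected profit of wash-trading `volume` when emissions are paid
    pro-rata to trading volume. Positive profit -> wash trading pays."""
    emissions_earned = emissions_usd * (volume / total_volume)
    fees_paid = volume * fee_rate
    return emissions_earned - fees_paid

# Hypothetical month: $500k of emissions, $10M total volume, 2% trading fee.
profit = wash_trade_profit(volume=100_000, fee_rate=0.02,
                           emissions_usd=500_000, total_volume=10_000_000)
print(f"Profit per $100k wash-traded: ${profit:,.0f}")
```

A positive result means every wash-traded dollar earns more in emissions than it costs in fees, so wash volume would be expected to grow until that gap closes.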
Approaches to Modeling NFT Marketplace Performance
- A sample market comparison approach is introduced using real-world data from existing NFT marketplaces. This involves analyzing days since launch and their corresponding trading volumes.
- The natural logarithm is utilized in modeling because financial data often grows at a logarithmic or square root rate rather than linearly, enhancing predictability.
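A sketch of fitting that relationship with ordinary least squares on log-transformed days since launch; the comp data points are made up for illustration.

```python
import math

# Hypothetical comp data: (days since launch, daily volume in ETH).
data = [(1, 120), (7, 310), (30, 480), (90, 600), (180, 680)]

# Fit volume = a + b * ln(days) by ordinary least squares.
xs = [math.log(d) for d, _ in data]
ys = [v for _, v in data]
n = len(data)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
     / sum((x - x_bar) ** 2 for x in xs))
a = y_bar - b * x_bar

# Project volume at day 365 along the fitted log curve.
projection = a + b * math.log(365)
print(f"Projected day-365 volume: {projection:.0f} ETH")
```

The log transform captures the decelerating growth typical of such data: each doubling of age adds a roughly constant amount of volume, rather than volume growing linearly with days.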
Analyzing Trading Volume and Market Cap
- Actual trading activity, denominated in ETH, is examined for comparable marketplaces to establish a baseline for analysis.
- The fully diluted market cap is also measured in ETH, so comparisons are apples-to-apples and not distorted by USD price swings.
Valuation Metrics: Market Cap vs. Trading Volume
- The ratio of market cap to trading volume serves as a key valuation metric, akin to price-to-earnings ratios used in traditional finance.
- A simple linear regression explains roughly 34% of the day-to-day variability in this competitor's valuation metric (R² ≈ 0.34), giving it modest but usable predictive power.
Visualizing Data Trends and Predictions
- A rolling cumulative minimum analysis shows that worst-case scenarios can explain over 90% of variance in valuation metrics, providing insights into potential downturn risks.
- Graphical representations illustrate raw market cap versus trading volume alongside median values and prediction lines, aiding comprehension of trends over time.
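The worst-case floor referenced above is a running minimum of the valuation ratio over time. A minimal sketch, with hypothetical ratio values:

```python
import itertools

# Hypothetical daily market-cap / trading-volume ratios for a comp.
ratios = [12.0, 9.5, 11.0, 8.0, 8.5, 10.0, 7.5, 9.0]

# Rolling cumulative minimum: the worst valuation multiple seen up to each day.
worst_case = list(itertools.accumulate(ratios, min))
print(worst_case)  # → [12.0, 9.5, 9.5, 8.0, 8.0, 8.0, 7.5, 7.5]
```

Because the floor only ever ratchets down, it tracks downturns much more tightly than the noisy raw ratio — which is why it can explain far more of the variance in worst-case scenarios.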
Implications for New NFT Marketplace Launches
- These models help predict expected market cap to trading volume multiples post-launch for new NFT marketplaces based on historical data from competitors.
- Insights gained from these analyses inform expectations regarding project market caps and token prices while assessing potential wash trading impacts on emissions' value and fee structures.
Conclusion: Practical Applications of Analysis
Understanding Emissions and Trading Dynamics
Connection Between Emissions Data and Supply Charts
- The emissions tab is linked to the supply charts, automatically feeding data from actual emissions established in previous discussions.
- It calculates the expected USD value of tokens emitted each month, which determines how much organic trading activity is required before wash trading becomes unprofitable.
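That breakeven volume falls out directly from the fee rate; the figures below are hypothetical.

```python
def breakeven_volume(monthly_emissions_usd: float, fee_rate: float) -> float:
    """Organic monthly volume above which wash trading is unprofitable:
    past this point, fees paid exceed the emissions a wash trader could
    capture even with 100% of the volume."""
    return monthly_emissions_usd / fee_rate

# $500k emitted per month at a 2% fee -> $25M of monthly volume needed.
print(f"${breakeven_volume(500_000, 0.02):,.0f}")
```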
Analyzing Wash Trading Risks
- The model assesses how much trading activity is necessary for wash trading to be profitable, considering fees versus rewards.
- Initial predictions indicate that less than 10% of trading activity will involve wash trading; however, this percentage could rise significantly over time.
Implications of Increased Wash Trading
- By months 18 to 24, projections suggest that up to 25-30% of trading may be classified as wash trading, indicating a potential risk for the protocol.
- This does not guarantee that such levels will occur but highlights a concerning trend that necessitates strategic adjustments.
Strategies for Mitigating Wash Trading
- To combat high levels of wash trading, options include altering emission schedules or increasing transaction fees to reduce profitability.
- A deterministic model has been used to understand how revenue capture and token emissions affect future wash trading activity.
Advancing Modeling Techniques
Transitioning from Deterministic to Stochastic Models
- The next step involves implementing stochastic modeling for more sophisticated predictions beyond fixed outcomes.
- A Python library has been developed to facilitate simulations based on real-world data without requiring extensive coding knowledge.
Utilizing Real Market Data in Simulations
- The library uses actual traded prices of Bitcoin and Ethereum to generate price fluctuations over a simulated period (90 days).
- It creates multiple scenarios (1,000 simulations), allowing analysis of potential market behaviors under varying conditions.
Assessing Risk Exposure Through Simulations
- Each simulation provides insights into risk exposure by mapping price paths and identifying probabilities of adverse outcomes.
- This approach quantifies risks associated with assumptions made during modeling and helps determine when those assumptions might fail.
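Putting this together, a minimal sketch of the workflow (not the course's actual library): bootstrap 90-day paths from a small hypothetical return sample standing in for real BTC/ETH history, and estimate the probability that collateral value ever falls below the outstanding debt.

```python
import random

random.seed(0)

# Hypothetical daily returns standing in for real BTC/ETH price history.
observed_returns = [0.02, -0.03, 0.01, 0.04, -0.05, 0.0, 0.015, -0.01]

def breach_probability(start_value: float, debt: float,
                       days: int = 90, n_sims: int = 1_000) -> float:
    """Fraction of simulated paths where collateral value ever drops
    below the outstanding debt (i.e., the modeling assumption fails)."""
    breaches = 0
    for _ in range(n_sims):
        value = start_value
        for _ in range(days):
            value *= 1 + random.choice(observed_returns)
            if value < debt:
                breaches += 1
                break
    return breaches / n_sims

p = breach_probability(start_value=100.0, debt=60.0)
print(f"P(undercollateralized within 90 days) = {p:.1%}")
```

The output is exactly the kind of number the section calls for: not a price prediction, but a probability that a stated assumption breaks under simulated conditions.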
Conclusion: Best Practices in Modeling
Summary of Key Takeaways