Replies: 5 comments 5 replies
-
Minimum Viable Oracle (MVO)
Can you be specific on which oracles you have in mind to start off this system? For example: "we need the following markets' data in order to create a minimum viable oracle with this design."

Other Considerations
We discussed Credit redemption and how it could make the token inflationary at a variable rate, depending on the health of our economy. Perhaps this is out of scope for this TWAP oracle, but I know you did some research on it, and I'm not sure if we should have a separate discussion for it or include those considerations here.

Since I'm already talking about it here, I want to add that we should also look at the total circulating supply of Dollars and distinguish which are in control of the protocol and which are not. If the Dollars are in control of other protocols/whales, then we should always plan for the worst-case scenario of them dumping 100% and still be able to pay off those liabilities (given our current liquidity and/or price + collateral in the bank). This means that we should always be coasting along over-collateralized (unless, perhaps while bootstrapping things, Ubiquity controls over half of the circulating supply of Dollars; then we could be under-collateralized, because we wouldn't dump on ourselves after everybody else dumped on us).
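As a worked illustration of that worst-case check, here is a minimal sketch with entirely made-up numbers; the function and its inputs are hypothetical placeholders, not protocol parameters:

```python
# Hypothetical worst-case solvency check: can the protocol absorb a 100% dump
# of all externally held Dollars with its current collateral + liquidity?

def is_over_collateralized(total_supply: float,
                           protocol_controlled: float,
                           collateral_value: float,
                           liquidity_value: float) -> bool:
    """True if liabilities from a full external dump are covered."""
    external_dollars = total_supply - protocol_controlled  # worst-case sell pressure
    liabilities = external_dollars                          # assume a $1 redemption target
    assets = collateral_value + liquidity_value
    return assets >= liabilities

# Example with made-up numbers: 10M supply, 6M protocol-controlled.
print(is_over_collateralized(10_000_000, 6_000_000,
                             collateral_value=3_000_000,
                             liquidity_value=1_500_000))  # True: 4.5M >= 4M
```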
-
Markets to track for the MVO
Curve markets
Uniswap markets by TVL
Median of Chainlink Data Feeds
These 5 will work as a baseline when we start to use data from Chainlink; as volume and volatility grow, more data feeds are added, similar to how we would take all USD pairs from Curve. Note that for this baseline we won't be using our own price as a reference, as that would result in unwanted feedback loops.
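A minimal sketch of that median-of-feeds baseline, assuming the prices have already been fetched and normalized off-chain; the feed names and values are illustrative:

```python
from statistics import median

# Hypothetical, already-normalized USD prices for the Dollar token from
# several sources (Curve pools, Uniswap pools by TVL, Chainlink feeds).
feeds = {
    "curve_pool":     0.998,
    "uniswap_pool":   1.002,
    "chainlink_feed": 1.000,
}

# The baseline oracle value is simply the median, so a single outlier or
# manipulated source cannot move the reported price on its own.
baseline_price = median(feeds.values())
print(baseline_price)  # 1.0
```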
-
MVO spec v1
The initial controller will be a simple P controller; we should look into the possibility of forking RAI, as the math should be the same (if not simpler). The variables are P (polls per unit of time), B (the baseline), and R (the feedback output), driven by the measured volume V; see the expanded spec below.
The governing equation for the controller will be V - B·P = R. A controller always wants to be at 0; as such, every time volume goes up, so will the amount of polls the oracle makes, which can be simplified to P = V/B at the setpoint. Our current volume will be considered the baseline (1). Ideally the volume variable will be logarithmic; however, for simplicity of v1 I have 2 options.
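A minimal sketch of this P-controller behaviour under those definitions; the gain `kp` and the optional logarithmic volume transform shown here are illustrative assumptions, not part of the spec:

```python
import math

def poll_adjustment(volume: float, baseline: float, polls: float,
                    kp: float = 1.0, log_volume: bool = False) -> float:
    """One proportional step: R = V - B*P, and the controller drives R toward 0."""
    v = math.log1p(volume) if log_volume else volume  # optional log-scaled volume
    error = v - baseline * polls                      # this is R in the spec
    return kp * error                                 # proportional change applied to P

# Current volume is treated as the baseline (1): at V=1, B=1, P=1 the error is 0.
polls = 1.0
for v in [1.0, 2.0, 2.0, 2.0]:                        # volume doubles and stays there
    polls += poll_adjustment(v, baseline=1.0, polls=polls, kp=0.5)
    print(round(polls, 3))                            # 1.0, 1.5, 1.75, 1.875 -> toward V/B = 2
```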
Markets
The following list is all of the markets that v1 should track; the order depends on the gas cost of each source:
Curve markets
Uniswap markets by TVL
Median of Chainlink Data Feeds
Since Chainlink already does medians on their oracles, this section is probably overkill, but depending on gas cost it can be a nice-to-have.
-
Core concepts
The TWAP price will be the median of various data sources, grouped depending on their gas cost and origin. These data sources will be called based on volume (in future versions also the amount of tx and volatility): when volume is low, only 1-2 of the cheapest price feeds will be called; as volume rises, more data sources are polled. This will make our TWAP as efficient as possible by reducing gas when there is not a lot going on, while maintaining a robust infrastructure when shit goes down.
These calls will be determined by a proportional controller (this will ideally evolve into a full PID in future versions) based on volume per block.

Equations
V - B·P = R (the full definitions of V, B, P, and R are in the reply below).
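A minimal sketch of the tiered polling described under Core concepts, assuming the controller output has already been mapped to a whole number of polls; the source names and their gas ordering are illustrative:

```python
# Data sources ordered from cheapest to most expensive to query (illustrative names).
SOURCES_BY_GAS = ["curve_pool", "uniswap_pool", "chainlink_feed", "uma_oracle"]

def sources_to_poll(net_polls: float) -> list[str]:
    """Poll only the cheapest 1-2 feeds when quiet, everything when volume is high."""
    n = max(1, min(len(SOURCES_BY_GAS), round(net_polls)))
    return SOURCES_BY_GAS[:n]

print(sources_to_poll(1.2))  # ['curve_pool'] -- low volume, cheapest source only
print(sources_to_poll(4.0))  # all four sources -- high volume, full robustness
```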
Graphs/Simulations
In this example, when volume rises, the difference between the base polls (B·P) and the volume (V) increases, causing the poll rate (R) to increase; since the controller always wants to be at 0, the net polls (P) increase. An example of the functionality can be found here; the general idea is to move the value sliders around while trying to maintain the blue line at 0.

Markets
The following list is all of the markets that v1 should track; the order depends on the gas cost of each source:
Curve markets
Uniswap markets by TVL
Median of Chainlink Data Feeds
Since Chainlink already does medians on their oracles, this section is probably overkill, but depending on gas cost it can be a nice-to-have.
-
So the overall idea is to create an efficient TWAP based on:
Questions:
-
The TWAP price we will target is an aggregation of various data sources, grouped depending on their gas cost and origin. The current groups are:
ChainLink price feeds
Uma optimistic oracles (prob will get axed)
Median aggregation of dexes to make various sources
These data sources will be called based on market conditions. When there is not a lot of volume, the amount of transactions is minimal, and volatility is low, only the cheapest price feeds will be called; as conditions change to a harsher, more lively market, data sources will progressively be added until the TWAP becomes an aggregation of all possible data sources.
The purpose of this is to make our oracle calls as efficient as possible, reducing the gas used to determine the TWAP by shrinking our set of data sources in "safer" market conditions, while at the same time being able to ramp up robustness during higher-volatility periods.
These calls will be determined by a proportional controller (this will ideally evolve into a full PID in future versions) based on:
Volume per unit of time (daily, hourly or block based)
Amount of tx per unit of time (daily, hourly or block based)
The controller will work based on 3 variables; a simplified version of them is:
P is the actual amount of polls per unit of time
B is a baseline variable: a function that will increase or decrease over time to change the net polls done, and is affected by the third variable
R is generated at the output; it's the feedback element of the system
The resulting equations will then be:
V - B·P = R; since the controller always wants to be at 0 and V is what we're measuring, at the setpoint V = B·P.
As B is an independent function of time, only P can be changed as a result of an R change.
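A quick numeric check of these relations, using made-up values:

```python
B, P = 1.0, 1.0     # baseline and current polls per unit of time
V = 1.5             # measured volume jumps 50%
R = V - B * P       # R = 0.5, so the controller is no longer at 0
# B is an independent (slow) function of time, so only P can absorb the change:
P_new = V / B       # P must rise to 1.5 for V - B*P to return to 0
print(R, P_new)     # 0.5 1.5
```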
In this simplified example, when volume rises, the difference between the base polls (B·P) and the volume (V) increases, causing the poll rate (R) to increase; since the controller always wants to be at 0, the net polls (P) increase.
As the difference goes back to 0, due to the base polls increasing in tandem with volume, the baseline (B) increases as an effect of the R increase, allowing the net polls to go down while the baseline stays at its new level, as this new volume is deemed the "new normal".
When volume decreases, P has to go down to bring the difference back to 0, as R is a negative number. This negative R causes B to also go down over time, increasing the net polls again after a given amount of time.
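A minimal simulation of this rise-then-settle behaviour; the gains `kp` and `kb`, and the choice of letting the baseline drift toward the measured volume, are illustrative assumptions, one possible concrete reading of the dynamics described above rather than the spec itself:

```python
def simulate(volumes, kp=0.5, kb=0.2):
    """Trace R, P (net polls) and B (baseline) as volume steps up and back down."""
    B, P = 1.0, 1.0
    rows = []
    for V in volumes:
        R = V - B * P            # the difference the controller drives to 0
        P += kp * R              # fast response: polls chase volume
        B += kb * (V - B)        # slow response: baseline drifts toward the "new normal"
        rows.append((round(V, 2), round(R, 2), round(P, 2), round(B, 2)))
    return rows

# Volume doubles for a while, then falls back to the old level.
for row in simulate([1.0] * 3 + [2.0] * 10 + [1.0] * 10):
    print(row)
# P jumps when volume rises, then eases back as B absorbs the new level; when
# volume drops, P dips below its old value and recovers as B drifts back down.
```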
Baseline function
This will change what the controller defines as "normal volume"; a simple way to see it is that, over a unit of time, the controller wants it to be equal to R.
For example: "it should go from 1 to 3 in a week."
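A minimal sketch of such a rate-limited baseline using the "from 1 to 3 in a week" example; the 12-second block time and the linear ramp shape are assumptions for illustration:

```python
BLOCKS_PER_WEEK = 7 * 24 * 60 * 60 // 12     # ~50,400 blocks at an assumed 12s block time

def baseline_at(block: int, start: float = 1.0, target: float = 3.0,
                ramp_blocks: int = BLOCKS_PER_WEEK) -> float:
    """Linearly ramp what the controller treats as 'normal volume' over a fixed window."""
    t = min(1.0, block / ramp_blocks)
    return start + (target - start) * t

print(baseline_at(0))                     # 1.0
print(baseline_at(BLOCKS_PER_WEEK // 2))  # 2.0, halfway through the week
print(baseline_at(BLOCKS_PER_WEEK))       # 3.0
```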
Contract structure
The current contract structure is based on the way I design electronics; this means it will probably be shit in some aspects. For a more exact spec, look at the RAI controller contracts.
An individual contract for each individual data source; in this version that means 4
Individual interfaces to normalize the data sources to a common format for the controller to read; another 4
A controller contract
This puts the total at 9 contracts in this current iteration.
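A structural sketch of that 4 + 4 + 1 layout, written as Python stand-ins for the eventual contracts; all class and method names are placeholders:

```python
from abc import ABC, abstractmethod
from statistics import median

class DataSource(ABC):
    """One contract per raw data source (Curve, Uniswap, Chainlink, UMA): 4 in total."""
    @abstractmethod
    def raw_price(self) -> int: ...

class Normalizer(ABC):
    """One interface per source, converting its output to a common format: another 4."""
    def __init__(self, source: DataSource):
        self.source = source
    @abstractmethod
    def price_1e18(self) -> int: ...

class Controller:
    """The single controller contract: picks how many sources to poll and aggregates them."""
    def __init__(self, normalizers: list[Normalizer]):
        self.normalizers = normalizers                 # ordered by gas cost, cheapest first
    def twap_update(self, net_polls: int) -> int:
        selected = self.normalizers[:max(1, net_polls)]
        return int(median(n.price_1e18() for n in selected))
```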