Fix #400: Re-merge YAML & CLI (#403)

* First stab at porting various functions over to polars... lots to go

* TOHLCV df initialization and type checking added. 2/8 pdutil tests passing

* black formatted

* Fixing initialization and improving test. datetime is not generated if no timestamp is present

* Restructured pdutil a bit to reduce DRY violations and utilize the schema more strictly.

* test initializing the df and datetime

* improve init test to show exception without timestamp

* fixing test_concat such that it verifies that schemas must match, and how transform handles datetime

* saving parquet enforces datetime and transform. updated test_load_append and test_load_filtered.

* black formatted

* data_eng tests are passing

* initial data_eng tests are passing w/ black, mypy, and pylint.

* _merge_parquet_dfs updated and create_xy test_1 is passing. all data_eng tests that are enabled are passing.

* 2exch_2coins_2signals is passing

* Added polars support for fill_nans, has_nans, and create_xy__handle_nan is passing.

* Starting to deprecate references to pandas and csv in data_factory.

* Black formatted

* Deprecated csv logic in DataFactory and created tests around get_hist_df() to verify that it's working as intended. I believe kraken data is returning null at the moment.

* All tests should be passing.
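
The polars port described in the bullets above revolves around a strict TOHLCV schema enforced at initialization and when saving parquet. A minimal sketch of the idea; column names and dtypes are assumptions drawn from the commit messages, not the actual pdutil/plutil code:

```python
import polars as pl

# Assumed TOHLCV schema: timestamp in unix ms, then open/high/low/close/volume
TOHLCV_SCHEMA = {
    "timestamp": pl.Int64,
    "open": pl.Float64,
    "high": pl.Float64,
    "low": pl.Float64,
    "close": pl.Float64,
    "volume": pl.Float64,
}

raw = {
    "timestamp": [1672531200000, 1672531500000],
    "open": [16500.0, 16510.0],
    "high": [16520.0, 16530.0],
    "low": [16490.0, 16500.0],
    "close": [16510.0, 16525.0],
    "volume": [12.3, 9.8],
}

df = pl.DataFrame(raw, schema=TOHLCV_SCHEMA)  # construct with explicit dtypes
assert df["timestamp"].dtype == pl.Int64
df.write_parquet("/tmp/tohlcv.parquet")  # parquet round-trips with the same schema
```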

* Fix #370: YAML & CLI (#371)

* Towards #232: Refactoring towards ppss.yaml part 3/3
* move everything in model_eng/ to data_eng/
* Fix #352: [SW eng] High DRY violation in test_predictoor_agent.py <> test_predictoor_agent3.py
* Deprecate backend-dev.md (long obsolete), macos.md (obsolete due to vps), and envvars.md (obsolete because of ppss.yaml).
* Rename BaseConfig to web3_pp.py and make it yaml-based
* Move scripts into util/, incorporate them into pdr cli, some refactoring.
* Revamp READMEs for the CLI, and tighten up text for getting OCEAN & ROSE
* Deprecated ADDRESS_FILE and RPC_URL envvars.
* Deprecate Predictoor approach 2; it was a pain to maintain


Co-authored-by: trizin <[email protected]>

* Update CI to use pdr instead of scripts/ (#399)

* Update check script CI

* Update cron topup

* Workflow dispatch

* Nevermind, revert previous commit

* Run on push to test

* Pass ppss.web3_pp instead of web3_config

* Don't run on push

* Replace long try/except with _safe*() function; rename pdutil -> plutil; get linters to pass

* Update entrypoint script to use pdr cli (#406)

* Add main.py back (#404)

* Add main.py back

* Black

* Linter

* Linter

* Remove "switch back to version v0.1.1"

* Black

* make black happy

* small bug fix

* many bug fixes. Still >=1 left

* fix warning

* Add support for polars where needed

* tweak docstring

* Fix #408: test_sim_engine failing in yaml-cli2, because hist_df is in seconds not milliseconds. Proper testing and documentation were added as part of the fix

* BaseContract tests that Web3PP type is input

* goes with previous commit

* tweak - lowercase

* Bug fix - fix failing tests

* Remove unwanted file

* (a) better organize ppss.yaml for usability (b) ensure user isn't annoyed by git with their copy of ppss.yaml being my_ppss.yaml

* add a more precise test for modeling

* make black happy

* Small refactor: make transform_df() part of helper routine

* Fix #414: Split data_factory into (1) CEX -> parquet -> df (2) df -> X,y for models

* Fix #415: test_cli_do_dfbuyer.py is hanging

* test create_xy() even more. Clarify the order of timestamps

* Add a model-building test, using data shaped like data from test_model_data_factory

* Fix #416: [YAML branch] No Feeds Found - data_pp.py changes pair standards

* For barge#391: update to *not* use barge's predictoor branch

* Update vps.md: nicer order of operations

* For #417, #418 in yaml-cli2 branch. publisher TUSD -> USDT

* remove default_network from ppss.yaml (obsolete)

* Fix #427 - time now

* Fix #428: test_get_hist_df - FileNotFoundError. Includes lots of extra robustness checking

* remove dependency that we don't need, which caused problems

* Fix #421: Add cli + logic to calculate and plot traction metrics (PR #422)

Also: mild cleanup of CLI.

* bug fix: YAML_FILE

* fix breaking test; clean it up too

* add barge-calls.md

* Fix #433. Calculate metrics and draw plots for epoch-based stats (PR #434)

#433 : "Plot daily global (pair_timeframe x20) <average predictoors> and <average stake>, by sampling slots from each day."

* Tweak barge-calls.md

How: show origin of NETWORK_RPC_URL

* Tweak barge-calls.md: more compactly show RPC_URL calc

* update stake_token

* bug fix

* Update release-process.md: bug fix

* Tweak barge-calls.md

* Tune #405 (PR #406): Update entrypoint.sh script to use pdr CLI

* Update vps.md: docker doesn't need to prompt to delete

* Update vps.md: add docker-stop instrs

* allow CLI to have NETWORK_OVERRIDE, for more flexibility from barge

* fix pylint issue

* Update barge-calls.md: link to barge.md

* Update release-process.md: fix typo

* touch

* Update vps.md: more instrs around waiting for barge to be ready

* add unit tests for cli_module

* Towards #437: [YAML] Publisher error 'You must set RPC_URL environment variable'

* Bug fixes

* refactor tweaks to predictoor and trader

* Clean up some envvar stuff. Document ppss vars better.

* publish_assets.py now supports barge-pytest and barge-predictoor-bot

* bug fix

* bug fix the previous 'bug fix'

* Clean up how dfbuyer/predictoor/trader agents get feeds: web3_pp.query_feed_contracts() -> data_pp.filter_feeds(); no more filtering within subgraph querying; easier printing & logging. Add timeframestr.Timeframe. Add feed.mock_feed. All tests pass.
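
The new timeframestr.Timeframe mentioned above is, conceptually, a small wrapper around strings like "5m" or "1h". A rough sketch under that assumption; the real class's API may differ:

```python
class Timeframe:
    """Illustrative wrapper for timeframe strings like "5m" or "1h"."""

    _S_PER_UNIT = {"m": 60, "h": 3600}

    def __init__(self, tf_str: str):
        self.tf_str = tf_str  # e.g. "5m", "1h"

    @property
    def s(self) -> int:
        """Duration in seconds."""
        return int(self.tf_str[:-1]) * self._S_PER_UNIT[self.tf_str[-1]]

    @property
    def ms(self) -> int:
        """Duration in milliseconds."""
        return self.s * 1000


assert Timeframe("5m").s == 300
assert Timeframe("1h").ms == 3_600_000
```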

* fix breaking subgraph tests. Still breakage in trader & dfbuyer (that's next)

* Fix failing tests in trader, dfbuyer. And greatly speed up the tests, via better mocking.

* Fix bugs for failing tests of https://github.com/oceanprotocol/pdr-backend/actions/runs/7156603163/job/19486494815

* fix tmpdir bug

* Fix (hopefully) failing unit test - restricted region in querying binance api

* consolidate gas_price setting, make it consistent; set gas_price to 0 for development/barge

* fix linter complaints

* Fix remaining failing unit tests for predictoor_batcher

* Finish the consolidation of gas pricing. All tests pass

* Update vps.md: add debugging info

- Where to find queries
- Key docker debugging commands

* add to/from wei utility. Copied from ocean.py

* tweak docs in conftest_ganache

* tweaks from black for wei

* Make fixed_rate.py and its test easier to understand via better var naming & docs

* Make predictoor_contract.py easier to understand via better var naming & docs

* test fixed_rate calcBaseInGivenOutDT

* Refactor predictoor_contract: push utility methods out of the class, and into more appropriate utility modules. And, move to/from_wei() from wei.py to mathutil.py. Test it all.
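
The to/from-wei helpers referenced above (copied from ocean.py, then moved into mathutil.py) boil down to scaling by 10**18. A sketch with assumed signatures, not the exact code:

```python
from typing import Union


def to_wei(amt_eth: Union[int, float]) -> int:
    """Convert a token amount (18-decimal, ETH-style) to wei."""
    return int(amt_eth * 10**18)


def from_wei(amt_wei: int) -> float:
    """Convert wei back to a float token amount."""
    return float(amt_wei / 10**18)


assert to_wei(1.5) == 1_500_000_000_000_000_000
assert from_wei(to_wei(1.5)) == 1.5
```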

* Tweak docstrings for fixed_rate.py

* Improve DX: show dev what the parameters are. Improve UX: print when done.

* Improve DX & UX for predictoor_contract

* Tweak UX (prints)

* Update vps.md: export PATH

* Logging for predictoor is way better: more calm yet more informative. Predictoors only do 1 feed now.

* TraderAgent -> BaseTraderAgent

* Rename parquet_dfs -> rawohlcv_dfs; hist_df -> mergedohlcv_df; update related

* apply black to test_plutil.py

* apply black to test_model_data_factory.py

* apply black to ohlcv_data_factory.py

* refactor test_ohlcv_data_factory: cleanup mocks; remove redundant test; pq_data_factory -> factory

* Fix #443: [YAML] yaml timescale is 5m, yet predictoor logs s_per_epoch=3600 (1h)

* Update feed str() to give full address; and order to be similar to predict_feeds_strs. Show all info used in filtering feeds.

* Small bug fix: not printing properly

* Tweak: logging in predictoor_contract.py

* Tweak: logging in trueval_agent_single.py

* Two bug fixes: pass in web3_pp not web3_config to PredictoorContract constructor

* enhance typechecking

* tweak payout.py: make args passed more obvious

* fix broken unit test

* make black happy

* fix breaking unit test

* Tweak predictoor_contract DX & UX

* Improve trueval: Have fewer layers of try/except, better DX via docstrings and more, better UX via logs

* Rename TruevalAgentBase -> BaseTruevalAgent

* (a) Fix #445: merge 3 trueval agent files into 1. (b) Fix #448: contract.py::get_address() doesn't handle 'sapphire-testnet' etc. (c) test_contract.py doesn't test across all networks we use, and it's missing places where we have errors. (d) Clean up trueval agent testing. (e) Move test_contract.py into test_noganache since it doesn't use ganache

* Fix #450: test_contract_main[barge-pytest] fails

* renaming pq_data_factory to ohlcv_data_factory

* Removing all TODOs

* Fix #452: Add clean code guidelines README

* removing dangling _ppss() inside predictoor_agent_runner.py

* Fixing linter

* Fix #454: Refactor: Rename MEXCOrder -> MexcOrder, ERC721Factory

* Fix #455: Cyclic import issue

* Fix #454 redux: the first commit introduced a bug, this one fixes the bug

* Fix #436 - Implement GQL data factory (PR #438)

* First pass on gql data factory

Co-authored-by: trentmc <[email protected]>

* Fix #350: [Sim] Tweaks to plot title

* make black happy

* Fix #446: [YAML] Rename/move files & dirs for proper separation among lake, AI models, analytics (#458)

* rename data_eng/ -> lake/
* rename model_factory -> aimodel_factory, model_data_factory -> aimodel_data_factory, model_ss -> aimodel_ss
* for any module in util/ that should be in analytics/, move it
* for any module in util/ that should be in lake/, move it. Including get_*_info.py
* create dir subgraph/ and move all *subgraph*.py into it. Split apart subgraph.py into core_subgraph.py and more
* apply mathutil.to_wei() and from_wei() everywhere
* move contents of util/test_data.py (a bunch of sample predictions) into models/predictions.py. Fix DRY violations in related conftest.pys
* note: there are 2 failing unit tests related to polars and "timestamp_right" column. However they were failing before. I just created a separate issue for that: #459

* Fix #459: In CI, polars error: col timestamp_right already exists (#460)

Plus:
* remove datetime from all DFs, it's too problematic, and unneeded
* bug fix: wasn't mocking check_dfbuyer(), so CI was failing

* Fix #397: Remove need to specify 'stake_token' in ppss.yaml (#461)

* Docs fixes (#456)

* Make Feeds objects instead of tuples. (#464)

* Make Feeds objects instead of tuples.
* Add namings for different feed objects.
* Move signal at the end.

* Move and rename utils (#467)

* Move and rename utils

* Objectify pairstr. (#470)

* Objectify pairstr.
* Add possibility for empty signal in feeds.
* Move and add some timeframe functions.
* Move exchangestr.

* Towards #462: Separate lake and aimodel SS, lake command (#473)

* Split aimodel and lake ss.
* Split data ss tests.
* Add aimodel ss into predictoor ss.
* Remove stray data_ss.
* Moves test_n to sim ss.
* Trader ss to use own feed instead of data pp.
* Remove data pp entirely.
* Correct ohlcv data factory.
* Add timeframe into arg feeds.
* Refine and add tests for timeframe in arg feed.
* Remove timeframe dependency in trader and predictoor.
* Remove timeframe from lake ss keys.
* Singleify trader agents.
* Adds lake command, assert timeframe in lake (needed for columns).
* Process all signals in lake.

* [Lake] integrate pdr_subscriptions into GQL Data Factory (#469)

* first commit for subscriptions

* hook up pdr_subscriptions to gql_factory

* Tests passing, expanding tests to support multiple tables

* Adding tests and improving handling of empty parquet files

* Subscriptions test

* Updating logic to use predictSubscriptions, take lastPriceValue, and to not query the subgraph more than needed.

* Moving models from contract/ -> subgraph/

* Fixing pylint

* fixing tests

* adding @enforce_types

* Improve DRY (#475)

* Improve DRY in cli module.
* Add common functionality to single and multifeed entries.
* Remove trader pp and move necessary lines into trader ss.
* Adds dfbuyer filtering.
* Remove exchange dict from multifeed mixin.
* Replace name of predict_feed.
* Add base_ss tests.
* Adds trueval filtering.

* Add Code climate. (#484)

* Adds manual trigger to pytest workflow.

* Issue #483: move the logic from subgraph_slot.py (#489)

* Add some test coverage (#488)

* Adds a line of coverage to test.
* Add coverage for csvs module.
* Add coverage to check_network.
* Add coverage to predictions and traction info.
* Adds coverage to predictoor stats.
* Adds full coverage to arg cli classes.
* Adds cli arguments coverage and fix a wrong parameter in cli arguments.
* Adds coverage to cli module and timeframe.
* Some reformats and coverage in contract module.
* Adds coverage and simplifications to contracts, except token.
* Add some coverage to tokens to complete contract coverage work.

* Fix #501: ModuleNotFoundError: No module named 'flask' (PR #504)

* Fix #509: Refactor test_update_rawohlcv_files (PR #508)

* Fix #505: polars.exceptions.ComputeError: datatypes of join keys don't match (PR #510)

* Refactor: new function clean_raw_ohlcv() that moves code from _update_rawohlcv_files_at_feed(). It has sub-functions with precise responsibilities. It has tests.
* Add more tests for merge_raw_ohlcv_dfs, including one that replicates the original issue
* Fix the core bug, now the new tests pass. The main fix is at the top of merge_df::_add_df_col()
* Fix failing test due to network override. NOTE: this may have caused the remaining pytest error. Will fix that after this merge
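
The #505 failure above is an instance of a generic polars pitfall: joining on a key whose dtype differs between dataframes. A toy reproduction and the kind of cast that resolves it; this is not the actual merge_df code:

```python
import polars as pl

a = pl.DataFrame({"timestamp": [1, 2, 3], "close": [10.0, 11.0, 12.0]})
b = pl.DataFrame({"timestamp": [1.0, 2.0, 3.0], "volume": [5.0, 6.0, 7.0]})

# a.join(b, on="timestamp") raises ComputeError: datatypes of join keys don't match
b = b.with_columns(pl.col("timestamp").cast(pl.Int64))  # align the key dtype first
merged = a.join(b, on="timestamp", how="left")
print(merged)
```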

* Fix #517: aimodel_data_factory.py missing data: binance:BTC/USDT:None (PR #518)

Fixes #517

Root cause: ppss.yaml's aimodel_ss feeds section didn't specify signals, e.g. "c" or "ohlcv"; it assumed they didn't need to be specified. That was an incorrect assumption: aimodel_ss needs them. The aimodel_ss class supports these signals, but the yaml file didn't include them.

What this PR does:
- add a test that the aimodel_ss class constructor complains if signals aren't specified
- do specify signals in the ppss.yaml file (see the illustrative fragment below)

Note: the PR pytest failed, but for an unrelated reason. Just created #520 for follow-up.
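
An illustrative (not actual) ppss.yaml fragment for the fix: the aimodel_ss feeds entries name their signals explicitly, e.g. "ohlcv". The exact keys and feed strings below are assumptions based on this commit log, shown via a quick parse:

```python
import yaml  # pip install pyyaml

fragment = """
aimodel_ss:
  feeds:
    - binance BTC/USDT ohlcv
    - binance ETH/USDT ohlcv
"""

print(yaml.safe_load(fragment)["aimodel_ss"]["feeds"])
# -> ['binance BTC/USDT ohlcv', 'binance ETH/USDT ohlcv']
```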

* Towards #494: Improve coverage 2 (#498)

* Adds some coverage to dfbuyer agent.
* Add dfbuyer and ppss coverage.
* Adds predictoor and sim coverage.
* Adds coverage to util.
* Add some trueval coverage.
* Adds coverage to trader agents.
* Add coverage to portfolio.
* Add coverage to subgraph consume_so_far and fix an infinite loop bug.
* More subgraph coverage.

* Fix #519: aimodel_data_factory.py missing data col: binance:ETH/USDT:close (#524)

Fix #519

Changes:
- do check for dependencies among various ppss ss feeds
- if any of those checks fails, give a user-friendly error message
  - greatly improved printing of ArgFeeds, including merging across pairs and signals. This was half the change of this PR
- appropriate unit tests
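
Conceptually, the dependency check described above amounts to a set comparison with a friendly error message. A hypothetical sketch (a verify_feed_dependencies helper is mentioned in later commits, but its real signature is not shown in this log):

```python
def verify_feed_dependencies(lake_feeds: set, aimodel_feeds: set) -> None:
    """Fail with a readable message if the model needs feeds the lake lacks."""
    missing = aimodel_feeds - lake_feeds
    if missing:
        raise ValueError(
            f"aimodel_ss needs feeds that lake_ss doesn't have: {sorted(missing)}. "
            "Add them to the lake feeds in ppss.yaml."
        )


# passes: every model feed is also being collected into the lake
verify_feed_dependencies(
    lake_feeds={"binance BTC/USDT ohlcv", "binance ETH/USDT ohlcv"},
    aimodel_feeds={"binance ETH/USDT ohlcv"},
)
```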

* Replace `dftool` with `pdr` (#522)

* Print texts: dftool -> pdrcli

* pdrcli -> pdr

* Fix #525: Plots pop up unwanted in tests. (PR #528)

Fix by mocking plt.show().

* Issue 519 feed dependencies (#529)

* Make the missing-attributes message more friendly and integrate the ai ss part into multimixin.

* Update to #519: remove do_verify, it's redundant (#532)

* Fix #507: fix asyncio issues (PR #531)

How fixed: use the previous asyncio version.

Calina: Asyncio has some known issues, per their changelog, namely issues with fixture handling etc., which I believe cause the warnings and test skips in our runs. They recommend using the previous version until those are fixed. It is also why my setup didn't throw any warnings; my asyncio version was 21.1.

https://pytest-asyncio.readthedocs.io/en/latest/reference/changelog.html

* #413 - YAML thorough system level tests (#527)

* Fix web3_config.rpc_url in test_send_encrypted_tx

* Add conftest.py for system tests

* Add system test for get_traction_info

* Add system test for get_predictions_info

* Add system test for get_predictoors_info

* Add "PDRS" argument to _ArgParser_ST_END_PQDIR_NETWORK_PPSS_PDRS class

* Fix feed.exchange type conversion in publish_assets.py

* Add print statement for payout completion

* Add system level test for pdr topup

* Add conditional break for testing via env

* Add conditional break for testing via env

* Black

* Add test for pdr rose payout system

* System level test pdr check network

* System level test pdr claim OCEAN

* System level test pdr trueval agent

* Remove unused patches

* Fix wrong import position in conftest.py

* Remove unused imports

* System level test for pdr dfbuyer

* System level tests for pdr trader

* System level tests for publisher

* Rename publisher test file

* Add conditional break in take_step() method

* Update dftool->pdr names in system tests

* Refactor test_trader_agent_system.py

* Add mock fixtures for SubgraphFeed and PredictoorContract

* Add system tests for predictoor

* Black

* Refactor system test files - linter fixes

* Linter fixes

* Black

* Add missing mock

* Add savefig assertion in test_topup

* Update VPS configuration to use development entry

* Patch verify_feed_dependencies

* Refactor test_predictoor_system.py to use a common test function

* Refactor trader approach tests to improve DRY

* Black

* Indent

* Ditch NETWORK_OVERRIDE

* Black

* Remove unused imports

* Adds incremental waiting for subgraph tries. (#534)
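
"Incremental waiting" here is a retry loop whose delay grows with each failed subgraph attempt. A generic sketch only; function and parameter names are assumptions, not the merged code:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def query_with_backoff(
    query_fn: Callable[[], T], max_tries: int = 5, base_delay_s: float = 1.0
) -> T:
    """Retry query_fn, waiting a little longer after each failure."""
    for attempt in range(max_tries):
        try:
            return query_fn()
        except Exception:
            if attempt == max_tries - 1:
                raise
            time.sleep(base_delay_s * (attempt + 1))  # 1s, 2s, 3s, ...
    raise RuntimeError("unreachable")
```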

* Add publisher feeds filtering. (#533)

* Add publisher feeds filtering.

* Pass the ppss.web3_pp instead of web3_config into WrappedToken class (#537)

* Fix #542: Add code climate usage to developer flow READMEs

* #538 - check network main subgraph query fails (#539)

* Use current time in seconds utc

* Remove unused import

* Fix the check_network test

* Divide current_ut by 1000

* test_check_network_without_mock, WIP

* Add missing import

* Implement current_ut_s

* Use current_ut_s

* Update tests

* Formatting

* Return int

* Remove balanceOf assert

* Remove unused import

* current_ut -> current_ut_ms
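
The fix above hinges on keeping unix-time units explicit: the check_network subgraph comparison needs seconds, while other code works in milliseconds. A sketch of the two helpers named in the commits; the bodies are assumptions:

```python
import time


def current_ut_ms() -> int:
    """Current unix time, in milliseconds."""
    return int(time.time() * 1000)


def current_ut_s() -> int:
    """Current unix time, in seconds (what the subgraph check compares against)."""
    return int(time.time())


ms = current_ut_ms()
print(ms, ms // 1000)  # same instant, millisecond vs second resolution
```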

* #540 - YAML CLI topup and check network actions require address file (#541)

* GH workflow: Fetch the address file and move it to contracts directory

* Fetch and move the address file to address dir

* Remove predictoor2 ref from pytest

---------

Co-authored-by: idiom-bytes <[email protected]>
Co-authored-by: trizin <[email protected]>
Co-authored-by: Idiom <[email protected]>
Co-authored-by: Călina Cenan <[email protected]>
Co-authored-by: Mustafa Tunçay <[email protected]>
6 people authored Jan 16, 2024
1 parent e93c4be commit 3544c73
Showing 322 changed files with 17,528 additions and 10,635 deletions.
12 changes: 7 additions & 5 deletions .github/workflows/check_mainnet.yml
@@ -17,18 +17,20 @@ jobs:
with:
python-version: "3.11"

- name: Fetch the address file and move it to contracts directory
run: |
wget https://raw.githubusercontent.com/oceanprotocol/contracts/main/addresses/address.json
mkdir -p ~/.ocean/ocean-contracts/artifacts/
mv address.json ~/.ocean/ocean-contracts/artifacts/
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Notify Slack
env:
RPC_URL: "https://sapphire.oasis.io"
SUBGRAPH_URL: "https://v4.subgraph.sapphire-mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph"
PRIVATE_KEY: "0xb23c44b8118eb7a7f70d21b0d20aed9b05d85d22ac6a0e57697c564da1c35554"
run: |
output=$(python scripts/check_network.py 1 | grep -E 'FAIL|WARNING|error' || true)
output=$(python pdr check_network ppss.yaml sapphire-mainnet | grep -E 'FAIL|WARNING|error' || true)
fact=$(curl -s https://catfact.ninja/fact | jq -r '.fact')
if [ -z "$output" ]; then
echo "No output, so no message will be sent to Slack"
12 changes: 7 additions & 5 deletions .github/workflows/check_testnet.yml
@@ -17,18 +17,20 @@ jobs:
with:
python-version: "3.11"

- name: Fetch the address file and move it to contracts directory
run: |
wget https://raw.githubusercontent.com/oceanprotocol/contracts/main/addresses/address.json
mkdir -p ~/.ocean/ocean-contracts/artifacts/
mv address.json ~/.ocean/ocean-contracts/artifacts/
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Notify Slack
env:
RPC_URL: "https://testnet.sapphire.oasis.dev"
SUBGRAPH_URL: "https://v4.subgraph.sapphire-testnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph"
PRIVATE_KEY: "0xb23c44b8118eb7a7f70d21b0d20aed9b05d85d22ac6a0e57697c564da1c35554"
run: |
output=$(python scripts/check_network.py 1 | grep -E 'FAIL|WARNING|error' | grep -v "1h" || true)
output=$(python pdr check_network ppss.yaml sapphire-testnet | grep -E 'FAIL|WARNING|error' | grep -v "1h" || true)
joke=$(curl -s https://official-joke-api.appspot.com/jokes/general/random | jq -r '.[0].setup, .[0].punchline')
if [ -z "$output" ]; then
echo "No output, so no message will be sent to Slack"
21 changes: 13 additions & 8 deletions .github/workflows/cron_topup.yml
@@ -3,7 +3,7 @@ name: Topup accounts
on:
schedule:
- cron: "0 * * * *"

jobs:
topup-mainnet:
runs-on: ubuntu-latest
@@ -15,18 +15,20 @@ jobs:
uses: actions/setup-python@v2
with:
python-version: "3.11"

- name: Fetch the address file and move it to contracts directory
run: |
wget https://raw.githubusercontent.com/oceanprotocol/contracts/main/addresses/address.json
mkdir -p ~/.ocean/ocean-contracts/artifacts/
mv address.json ~/.ocean/ocean-contracts/artifacts/
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Set env variables
run: |
echo "SUBGRAPH_URL=http://v4.subgraph.sapphire-mainnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph" >> $GITHUB_ENV
echo "RPC_URL=https://sapphire.oasis.io" >> $GITHUB_ENV
echo "PRIVATE_KEY=${{ secrets.TOPUP_SCRIPT_PK }}" >> $GITHUB_ENV
- name: Run top-up script
run: python3 scripts/topup.py
run: python3 pdr topup ppss.yaml sapphire-mainnet

topup-testnet:
runs-on: ubuntu-latest
@@ -41,10 +43,13 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Fetch the address file and move it to contracts directory
run: |
wget https://raw.githubusercontent.com/oceanprotocol/contracts/main/addresses/address.json
mkdir -p ~/.ocean/ocean-contracts/artifacts/
mv address.json ~/.ocean/ocean-contracts/artifacts/
- name: Set env variables
run: |
echo "SUBGRAPH_URL=http://v4.subgraph.sapphire-testnet.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph" >> $GITHUB_ENV
echo "RPC_URL=https://testnet.sapphire.oasis.dev" >> $GITHUB_ENV
echo "PRIVATE_KEY=${{ secrets.TOPUP_SCRIPT_PK }}" >> $GITHUB_ENV
- name: Run top-up script
run: python3 scripts/topup.py
run: python3 pdr topup ppss.yaml sapphire-testnet
8 changes: 6 additions & 2 deletions .github/workflows/pytest.yml
@@ -29,7 +29,6 @@ jobs:
name: Checkout Barge
with:
repository: "oceanprotocol/barge"
ref: "main"
path: "barge"

- name: Run Barge
@@ -57,5 +56,10 @@ jobs:
- name: Test with pytest
id: pytest
run: |
coverage run --omit="*test*" -m pytest
coverage run --source=pdr_backend --omit=*/test/*,*/test_ganache/*,*/test_noganache/* -m pytest
coverage report
coverage xml
- name: Publish code coverage
uses: paambaati/[email protected]
env:
CC_TEST_REPORTER_ID: ${{secrets.CC_TEST_REPORTER_ID}}
5 changes: 4 additions & 1 deletion .gitignore
@@ -164,9 +164,12 @@ cython_debug/
.test_cache/
.cache/

# predictoor dynamic modeling
# predictoor-specific
out*.txt
my_ppss.yaml
csvs/
parquet_data/

# pdr_backend accuracy output
pdr_backend/accuracy/output/*.json
# pm2 configs
3 changes: 2 additions & 1 deletion .pylintrc
@@ -119,7 +119,8 @@ disable=too-many-locals,
consider-using-dict-items,
consider-using-generator,
dangerous-default-value,
unidiomatic-typecheck
unidiomatic-typecheck,
unsubscriptable-object

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
73 changes: 54 additions & 19 deletions README.md
@@ -5,44 +5,79 @@ SPDX-License-Identifier: Apache-2.0

# pdr-backend

⚠️ As of v0.2, the CLI replaces previous `main.py` calls. Update your flows accordingly.

## Run bots (agents)

- **[Run predictoor bot](READMEs/predictoor.md)** - make predictions, make $
- **[Run trader bot](READMEs/trader.md)** - consume predictions, trade, make $


(If you're a predictoor or trader, you can safely ignore the rest of this README.)

## Settings: PPSS

A "ppss" yaml file, like [`ppss.yaml`](ppss.yaml), holds parameters for all bots and simulation flows.
- We follow the idiom "pp" = problem setup (what to solve), "ss" = solution strategy (how to solve).
- `PRIVATE_KEY` is an exception; it's set as an envvar.

When you run a bot from the CLI, you specify your PPSS YAML file.

## CLI

(First, [install pdr-backend](READMEs/predictoor.md#install-pdr-backend-repo).)

To see CLI options, in console:
```console
pdr
```

This will output something like:
```text
Usage: pdr sim|predictoor|trader|..
Main tools:
pdr sim YAML_FILE
pdr predictoor APPROACH YAML_FILE NETWORK
pdr trader APPROACH YAML_FILE NETWORK
...
```

## Atomic READMEs

- [Get tokens](READMEs/get-tokens.md): [testnet faucet](READMEs/testnet-faucet.md), [mainnet ROSE](READMEs/get-rose-on-sapphire.md) & [OCEAN](READMEs/get-ocean-on-sapphire.md)
- [Envvars](READMEs/envvars.md)
- [Predictoor subgraph](READMEs/subgraph.md)
- [Dynamic model codebase](READMEs/dynamic-model-codebase.md)
- [Static models in predictoors](READMEs/static-model.md)
- [Claim payout for predictoor bot](READMEs/payout.md)
- [Predictoor subgraph](READMEs/subgraph.md). [Subgraph filters](READMEs/filters.md)
- [Run barge locally](READMEs/barge.md)

## Flows for core team

- **Backend dev** - for `pdr-backend` itself
- [Main backend-dev README](READMEs/backend-dev.md)
- Backend-dev - for `pdr-backend` itself
- [Local dev flow](READMEs/dev.md)
- [VPS dev flow](READMEs/vps.md)
- [Release process](READMEs/release-process.md)
- [Run barge locally](READMEs/barge.md)
- [Run barge remotely on VPS](READMEs/vps.md)
- [MacOS gotchas](READMEs/macos.md) wrt Docker & ports
- **[Run dfbuyer bot](READMEs/dfbuyer.md)** - runs Predictoor DF rewards
- **[Run publisher](READMEs/publisher.md)** - publish new feeds
- **[Scripts](scripts/)** for performance stats, more
- [Clean code guidelines](READMEs/clean-code.md)
- [Run dfbuyer bot](READMEs/dfbuyer.md) - runs Predictoor DF rewards
- [Run publisher](READMEs/publisher.md) - publish new feeds
- [Run trueval](READMEs/trueval.md) - run trueval bot

## Repo structure

This repo implements all bots in Predictoor ecosystem.

Each bot has a directory:
- `predictoor` - submits individual predictions
- `trader` - buys aggregated predictions, then trades
- other bots: `trueval` report true values to contract, `dfbuyer` implement Predictoor Data Farming, `publisher` to publish
Each bot has a directory. Alphabetically:
- `dfbuyer` - buy feeds on behalf of Predictoor DF
- `predictoor` - submit individual predictions
- `publisher` - publish pdr data feeds
- `trader` - buy aggregated predictions, then trade
- `trueval` - report true values to contract

Other directories:
- `util` - tools for use by any agent
- `models` - classes that wrap Predictoor contracts; for setup (BaseConfig); and for data feeds (Feed)
Other directories, alphabetically:
- `accuracy` - calculates % correct, for display in predictoor.ai webapp
- `data_eng` - data engineering & modeling
- `models` - class-based data structures, and classes to wrap contracts
- `payout` - OCEAN & ROSE payout
- `ppss` - settings
- `sim` - simulation flow
- `util` - function-based tools

93 changes: 0 additions & 93 deletions READMEs/backend-dev.md

This file was deleted.

41 changes: 41 additions & 0 deletions READMEs/barge-calls.md
@@ -0,0 +1,41 @@
### Barge flow of calls

From getting barge going, here's how it calls specific pdr-backend components and passes arguments.

- user calls `/barge/start_ocean.sh` to get barge going
- then, `start_ocean.sh` fills `COMPOSE_FILES` incrementally. Eg `COMPOSE_FILES+=" -f ${COMPOSE_DIR}/pdr-publisher.yml"`
- `barge/compose-files/pdr-publisher.yml` sets:
- `pdr-publisher: image: oceanprotocol/pdr-backend:${PDR_BACKEND_VERSION:-latest}`
- `pdr-publisher: command: publisher`
- `pdr-publisher: networks: backend: ipv4_address: 172.15.0.43`
- `pdr-publisher: environment:`
- `RPC_URL: ${NETWORK_RPC_URL}` (= `http://localhost:8545` via `start_ocean.sh`)
- `ADDRESS_FILE: /root/.ocean/ocean-contracts/artifacts/address.json`
- (many `PRIVATE_KEY_*`)
- then, `start_ocean.sh` pulls the `$COMPOSE_FILES` as needed:
- `[ ${FORCEPULL} = "true" ] && eval docker-compose "$DOCKER_COMPOSE_EXTRA_OPTS" --project-name=$PROJECT_NAME "$COMPOSE_FILES" pull`

- then, `start_ocean.sh` runs docker-compose including all `$COMPOSE_FILES`:
- `eval docker-compose "$DOCKER_COMPOSE_EXTRA_OPTS" --project-name=$PROJECT_NAME "$COMPOSE_FILES" up --remove-orphans`
- it executes each of the `"command"` entries in compose files.
- (Eg for pdr-publisher.yml, `"command" = "publisher ppss.yaml development"`)
- Which then goes to `pdr-backend/entrypoint.sh` via `"python /app/pdr_backend/pdr $@"`
- (where `@` is unpacked as eg `publisher ppss.yaml development`) [Ref](https://superuser.com/questions/1586997/what-does-symbol-mean-in-the-context-of#:).
- Then it goes through the usual CLI at `pdr-backend/pdr_backend/util/cli_module.py`


### How to make changes to calls

If you made a change to pdr-backend CLI interface, then barge must call using the updated CLI command.

How:
- change the relevant compose file's `"command"`. Eg change `barge/compose-files/pdr-publisher.yml`'s "command" value to `publisher ppss.yaml development`
- also, change envvar setup as needed. Eg in compose file, remove `RPC_URL` and `ADDRESS_FILE` entry.
- ultimately, ask: "does Docker have everything it needs to successfully run the component?"

### All Barge READMEs

- [barge.md](barge.md): the main Barge README
- [barge-calls.md](barge-calls.md): order of execution from Barge and pdr-backend code
- [release-process.md](release-process.md): pdr-backend Dockerhub images get published with each push to `main`, and sometimes other branches. In turn these are used by Barge.