
ricochet-keeper's People

Contributors

khaeljy, mikeghen, mrsquiggles13, nikramakrishnan, samirsalem, shreyaspapi


ricochet-keeper's Issues

Automatic Wallet Refill

This task involves creating a DAG that will swap the inbound RIC stream into MATIC and distribute it to the addresses the Ricochet Keeper requires.
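The distribution step can be sketched in plain Python first. Everything here is an assumption for illustration (the function name, the target-balance policy, the address keys); the real DAG would wrap a DEX swap and MATIC transfers in Airflow tasks:

```python
def plan_refills(balances, target_balance):
    """Given each keeper address's current MATIC balance, return how much
    MATIC to send to each address to bring it back up to target_balance.
    Addresses already at or above the target receive nothing."""
    return {
        address: target_balance - balance
        for address, balance in balances.items()
        if balance < target_balance
    }

# Example: two keeper addresses below a hypothetical 10-MATIC target.
refills = plan_refills(
    {"0xkeeper1": 2.5, "0xkeeper2": 10.0, "0xkeeper3": 7.0},
    target_balance=10.0,
)
```

The swap task would then convert just enough RIC to cover `sum(refills.values())` before distributing.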

REX Bank Keeper Workflows - Liquidator

This issue is to create a workflow that checks for borrowers that are in liquidation.

There's an existing workflow for updating the bank price. I haven't done much with it but you can use it as a reference for how to interact with the bank. https://github.com/Ricochet-Exchange/ricochet-keeper/blob/v2-workflows/dags/rex_bank_keeper.py

I got started on these bank workflows on the v2-workflows branch, so base your branch off that and PR into v2-workflows when you're done.

Your keeper address will need to be approved to submit prices and liquidate, so hit me up and I'll get you allowlisted.
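The core of the workflow might start as a helper like the one below. The `isLiquidatable` call and the borrower list are assumptions for the sketch; the real bank contract's ABI (see rex_bank_keeper.py above) may expose this check differently:

```python
def find_liquidatable(bank, borrowers):
    """Return the subset of borrowers the bank reports as liquidatable.
    `bank` is any object exposing isLiquidatable(address) -> bool,
    e.g. a web3.py contract wrapper around the REX bank (hypothetical)."""
    return [b for b in borrowers if bank.isLiquidatable(b)]


# Stub bank for illustration only: addresses ending in "bad" are underwater.
class StubBank:
    def isLiquidatable(self, address):
        return address.endswith("bad")


targets = find_liquidatable(StubBank(), ["0xaaa", "0xbad", "0xccc"])
```

A follow-on task would then submit a liquidate transaction for each address in `targets`.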

CI mechanism for keepers

A mechanism was needed to test changes to DAGs and upgrades of Airflow. This mechanism is described in the PR below:
#39

Secure the keeper

Secured install of the keeper:
Customized passwords for the Airflow and Postgres users
Autogeneration of the Fernet key for metadata encryption
Nginx entry point with a self-signed SSL certificate to secure the GUI
Introduced fail2ban to prevent SSH brute-force attempts
All changes can be found here:
#39

Create `BaseContractInteractionOperator`

All the operators that interact with smart contracts share the same basic functionality. This issue is to reduce the amount of duplicated code by creating a BaseContractInteractionOperator that handles the basic functionality for interacting with a smart contract.
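The shared flow is roughly build transaction → sign → send → wait for receipt. A framework-free sketch of that base class is below; in the real operator it would subclass airflow.models.BaseOperator, and the web3 handle is assumed to look like a web3.py v5 instance (matching the `buildTransaction` spelling used elsewhere in this repo):

```python
class BaseContractInteractionOperator:
    """Sketch of the shared build -> sign -> send -> wait flow that the
    contract-interaction operators duplicate today. Subclasses only need
    to say which contract function to call."""

    def __init__(self, web3, contract, account, private_key):
        self.web3 = web3
        self.contract = contract
        self.account = account
        self.private_key = private_key

    def _function(self):
        """Subclasses return the bound contract function to call,
        e.g. self.contract.functions.distribute()."""
        raise NotImplementedError

    def execute(self, context=None):
        # Build the transaction from the subclass-provided function.
        txn = self._function().buildTransaction({
            "from": self.account,
            "nonce": self.web3.eth.get_transaction_count(self.account),
        })
        # Sign locally and broadcast the raw transaction.
        signed = self.web3.eth.account.sign_transaction(txn, self.private_key)
        tx_hash = self.web3.eth.send_raw_transaction(signed.rawTransaction)
        # Block until the transaction is mined and return the receipt.
        return self.web3.eth.wait_for_transaction_receipt(tx_hash)
```

Each concrete operator then shrinks to a `_function` override plus any arguments it needs.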

Create a staging keeper

In order to test DAG execution, and in general that the keeper still works when changes are made, I'm working on a staging keeper to test those changes; this is part of the CI stack.
All changes will be executed against this new keeper.

Keep on multiple networks

We need to be able to keep on multiple networks. Superfluid is deployed on Gnosis Chain and Avalanche. Getting REX setup on these networks will increase volume.

As a first step, the existing DAGs should be changed so that if there are multiple networks, there will be multiple DAGs.

A DAG script usually creates one DAG; the pattern is one file, one DAG. However, a single file can also create multiple DAGs.

One solution is to add an Airflow Variable networks = ['polygon', 'gnosis', 'avalanche'] and then at the top of the DAG files use:

for network in Variable.get("networks", deserialize_json=True):
   dag = DAG(f"ricochet_distribute_{network}", ...)
   globals()[dag.dag_id] = dag

A single DAG file will then make multiple DAGs:

ricochet_distribute_polygon
ricochet_distribute_gnosis
ricochet_distribute_avalanche
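The naming scheme above can be checked with plain Python (the helper name here is made up; in the DAG file each generated id would be passed to DAG(...) and the object registered in globals() so Airflow discovers it):

```python
def dag_ids_for(networks, prefix="ricochet_distribute"):
    """One DAG id per network, matching the per-network list above."""
    return [f"{prefix}_{network}" for network in networks]


ids = dag_ids_for(["polygon", "gnosis", "avalanche"])
```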

Test Transaction Status before Sending

Many distribute calls fail with BAD_EXCHANGE_RATE, which just means the exchange price on Sushi/QuickSwap is too far from the "global" price reported by Tellor. Right now we just send these transactions and let them fail, wasting gas.

This issue is to add a pre-send check in the ContractInteractionOperator to verify that a transaction won't fail, something like:

txn = self.function(...).buildTransaction(...)
if not confirm_success(txn):
   return False
...

where confirm_success will use web3.py to verify the txn will succeed (without executing it).

How to setup a keeper video tutorial

This bounty is for making a video tutorial for how to set up the keeper on your localhost for local development. We need a video like this for the keeper developers who may not run all the workflows and just need to set things up on their localhost to test.

Make new Docker Image

Keepers have been using my tellor-airflow image, but we need to incorporate a Docker image here and build it ourselves, without relying on the one I pushed to Docker Hub months ago.

This issue is to make a Dockerfile to use instead of my Docker Hub image. A Dockerfile to reference can be found here: https://github.com/tellor-io/airflow

Use Variables to set Schedules

This task involves converting the schedules for each DAG to use a Variable. Create variables like:

distribution-schedule-interval -> "0 * * * *"
harvest-schedule-interval -> "0 0 * * *"

Then in the DAGs:

schedule_interval = Variable.get("distribution-schedule-interval", default_var="0 * * * *")

dag = DAG("ricochet_stream_watch",
          max_active_runs=1,
          catchup=False,
          default_args=default_args,
          schedule_interval=schedule_interval)
