Directional Trading Using External (Binance) Data

Background

Dojo provides easy access to, and interaction with, decentralised protocols. However, a trading strategy will likely want to pull in external data to inform its decisions. The data used in this example is price data from a centralised exchange (CEX); specifically, Binance.

External data can come from anywhere: any market with correlated pricing information, or internal signals published by your own institution.

As long as there is a way to read the data into Python, it can be included in the simulation.

[Screenshot: the trade_towards_cex example on the Compass Labs dashboard, https://compasslabs.ai/dashboard?example=trade_towards_cex]

Loading The Data

Binance publishes historical market data. We can load this data in Python using the requests module and parse each row into a data type. This provides an interface we will later use in the simulation to fetch Binance prices for the ETH/USDC market.

binance_data.py
import csv
import io
import zipfile
from dataclasses import dataclass
from datetime import datetime
from typing import List

import requests
from pytz import UTC


@dataclass
class Binance_data_point:
  """Represents a single line in the binance data."""
 
  open_time: datetime
  open_: float
  high: float
  low: float
  close: float
  volume: float
  close_time: datetime
  quote_asset_volume: float
  number_of_trades: int
  taker_buy_base_asset_volume: float
  taker_buy_quote_asset_volume: float
  ignore: int
 
 
class Binance_data:
  """Represents an entire binance data file."""
 
  def __init__(self, data: List[Binance_data_point]):  # noqa: D107
      self.data = data
 
  def find_nearest(self, target_time: datetime) -> Binance_data_point:
      """Slow method of finding the data point closest in time to target_time.

      Discards all data whose close time is prior to target_time.
      """
      target_time = target_time.replace(tzinfo=UTC)
      while True:
          datum = self.data[0]
          if datum.close_time < target_time:
              self.data = self.data[1:]
          else:
              return datum
 
 
def load_binance_data(year: int, month: int) -> Binance_data:
  """Load binance data for a particular month."""
  url = f"https://data.binance.vision/data/spot/monthly/klines/ETHUSDC/1m/ETHUSDC-1m-{year}-{month:02}.zip"
  file_name = f"ETHUSDC-1m-{year}-{month:02}.csv"
  zipfile_data_raw = requests.get(url)
  zipfile_data = zipfile.ZipFile(io.BytesIO(zipfile_data_raw.content))
  csv_data_raw = zipfile_data.read(file_name).decode("UTF-8")
  csv_data = list(csv.reader(io.StringIO(csv_data_raw)))
 
  def binance_data_point_of_list(args: List[str]) -> Binance_data_point:
      return Binance_data_point(
          UTC.localize(datetime.fromtimestamp(float(args[0]) / 1000.0)),
          float(args[1]),
          float(args[2]),
          float(args[3]),
          float(args[4]),
          float(args[5]),
          UTC.localize(datetime.fromtimestamp(float(args[6]) / 1000.0)),
          float(args[7]),
          int(args[8]),
          float(args[9]),
          float(args[10]),
          int(args[11]),
      )
 
  return Binance_data([binance_data_point_of_list(datum) for datum in csv_data])
 

This data loading is done when the program is initialised. There are multiple ways to reduce the overhead that comes from loading the data over the web and then marshalling it into the correct format.

We will discuss optimisations below.
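One optimisation applies to find_nearest itself: it scans and re-slices the list linearly. Because Binance klines are already time-ordered, a binary search over the close times finds the same point much faster. A minimal sketch, assuming a plain sorted list of close times (the function name here is illustrative, not part of the example code):

```python
from bisect import bisect_left
from datetime import datetime, timezone


def find_nearest_index(close_times: list, target_time: datetime) -> int:
    """Index of the first data point whose close_time is >= target_time.

    Assumes close_times is sorted ascending, as Binance kline data is.
    """
    return bisect_left(close_times, target_time)


# Five one-minute bars closing at 0:00:59 .. 0:04:59 UTC.
times = [datetime(2023, 6, 1, 0, i, 59, tzinfo=timezone.utc) for i in range(5)]
idx = find_nearest_index(times, datetime(2023, 6, 1, 0, 2, 30, tzinfo=timezone.utc))
# idx == 2: the 0:02:59 close is the first one not before the target
```

Unlike the mutating loop above, this leaves the data intact, so it also works if a policy ever needs to look backwards in time.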


Using The Data

Within the agent based simulation model that Dojo is built on, each agent instance has a single policy instance which determines which actions it will take. The policy decides which actions to take by reading in information from its observation of the environment at each step of the simulation.

External data should be thought of as an observation from outside the environment. This means that the user must both provide the data to the policy and make sure that the policy only reads the correct data at the correct time.

First, we load the data and pass it to the policy:

run.py
binance_data = load_binance_data(year, month)
 
# Policies
arb_policy = TradeTowardsCentralisedExchangePolicy(
  agent=cex_directional_agent, binance_data=binance_data
)
policy.py
def __init__(
  self, agent: UniswapV3Agent, binance_data: Binance_data
) -> None:  # noqa: D107
  super().__init__(agent=agent)
  self.binance_data = binance_data
  self.block_last_trade: int = 0
  self.state = State.NOT_INVESTED

Now when our policy is asked to make a prediction (that is, to produce a list of actions from an observation about the environment, which will then be used to run the agent within the simulation), it can also consult its copy of the external data.

policy.py
# inside [ def predict(self, obs: UniswapV3Observation) -> List[UniswapV3Action]: ]
binance_data_point = self.binance_data.find_nearest(date)
cex_usdc_per_eth = (binance_data_point.open_ + binance_data_point.close) / 2.0
policy.py
if diff < -0.025:
  # Buy WETH
  if self.state in [State.NOT_INVESTED, State.IN_TOKEN0]:
      token0_amount = self.agent.erc20_portfolio()[token0]
      self.state = State.IN_TOKEN1
      self.block_last_trade = block
      return [
          UniswapV3Trade(
              agent=self.agent,
              pool=pool,
              quantities=(Decimal(token0_amount), Decimal(0)),
          )
      ]

In this example, we compare the price of ETH in USDC on Binance and on the Uniswap V3 pool. We have to select the Binance data based on the time within the simulation, which we do using the dojo.network.block_date module.

We then trade in the pool when the Binance data indicates price movements larger than a constant percentage threshold.
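The signal behind the threshold check shown above can be expressed as the relative difference between the pool price and the Binance mid price. A sketch follows; the function name and the exact form of diff are assumptions consistent with the `diff < -0.025` branch, and may differ from the example's actual code:

```python
def price_signal(dex_usdc_per_eth: float, cex_usdc_per_eth: float) -> float:
    """Relative difference between the DEX price and the CEX mid price.

    A value below -0.025 means the pool prices ETH at least 2.5% below
    the Binance mid price, which the policy treats as a buy signal.
    """
    return (dex_usdc_per_eth - cex_usdc_per_eth) / cex_usdc_per_eth
```

With a pool price of 97 USDC per ETH against a Binance mid of 100, the signal is -0.03, which crosses the -0.025 threshold and triggers the WETH purchase.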

This example selects data from slightly ahead of the simulation time, using the TIME_ADVANTAGE constant, and averages the open and close prices of a one-minute interval. This is done for demonstration purposes, to give the agent an artificial advantage. Real strategies should only use data that would actually be available to the agent at the simulated point in time.


How To Run

Installation

Follow our Getting Started guide to install the dojo library and other required tools.

Then clone the dojo_examples repository and go into the relevant directory.

Terminal
git clone https://github.com/CompassLabs/dojo_examples.git
cd dojo_examples/examples/trade_towards_cex

Running

Download the dashboard to view the simulation results. To view example simulation data, download the trade_towards_cex.db file from here and click 'Add A Simulation' on the dashboard.

To run the simulation yourself, use the following command.

Terminal
python run.py

This command will set up your local blockchain, contracts, accounts and agents. You can then access your results on your Dojo dashboard by connecting to a running simulation.


Conclusion

The Dojo library provides the user with tools to build and run simulations of agents interacting with decentralised exchanges. We provide useful data about the simulated decentralised exchanges and the agent positions to enable users to write high-level simulation code.

External data sources from outside the environment can be useful for trading strategies. Dojo allows its users to write arbitrary Python code to drive their policies. This means there is no restriction on using external data. This example uses Binance historical pricing data to demonstrate that.

There are multiple possible optimisations to this example. These include caching the external data on disk, which would remove the need for a network request on repeated simulations, and using asynchronous functions to either prefetch data (such as the block times) or load data on demand (such as the Binance data).
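The disk cache could look like the following sketch. fetch_with_cache and the cache path layout are assumptions, not dojo APIs, and the standard library's urllib is used here in place of requests to keep the sketch dependency-free:

```python
import os
import urllib.request


def fetch_with_cache(url: str, cache_path: str) -> bytes:
    """Return the raw bytes at url, reusing a local copy when present."""
    if os.path.exists(cache_path):
        # Cache hit: no network request at all.
        with open(cache_path, "rb") as f:
            return f.read()
    with urllib.request.urlopen(url) as response:
        data = response.read()
    with open(cache_path, "wb") as f:
        f.write(data)
    return data
```

load_binance_data could then fetch the zip through this helper instead of calling requests.get directly, so only the first simulation run pays the download cost.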

These optimisations would decrease the running time of the simulation. The Compass Labs team is happy to give advice on this sort of strategy to help you make the most of our framework.

The team is also open to discussions on expanding the Dojo framework to provide first-class support for more data sources/data fetching methods as part of our continuing mission to provide a comprehensive, high-level, backtesting framework for DeFi.

Results

You can download the results of this example below.

We offer a dashboard desktop application for visualizing your simulation results. You can download the file for the desktop application here, or just open the results in our hosted dashboard.