
example-scalping's Introduction

Concurrent Scalping Algo

This Python script is a working example of a scalping trading algorithm for the Alpaca API. The algorithm uses real-time order updates as well as minute-level bar streaming from Polygon via WebSockets (see the documentation for Polygon data access). One contribution of this example is to demonstrate how to handle multiple stocks concurrently as independent routines using Python's asyncio.

The strategy holds positions for a very short period and exits them quickly, so you must have more than $25k equity in your account to run this example, due to the Pattern Day Trader rule. For more information about the PDT rule, please read the documentation.

Dependency

This script needs the latest Alpaca Python SDK. Please install it using pip:

$ pip3 install alpaca-trade-api

or use pipenv with the Pipfile in this directory.

$ pipenv install

Usage

$ python main.py --lot=2000 TSLA FB AAPL

You can specify as many symbols as you want. The script is designed to be kicked off while the market is open. Nothing will happen until 21 minutes after the market open, as the strategy relies on the 20-minute simple moving average for its buy signal.

Strategy

The idea is to buy the stock on the buy signal (a 20-minute moving average crossover), spending up to the lot amount in dollars, then immediately sell the position at or above the entry price. The assumption is that the market bounces upward for a short period when this signal occurs. The buy signal itself is extremely simple; what this strategy adds is a quick reaction to exit the position as soon as the buy order fills. There is a reasonable probability that you can sell the position within this small window at a price better than or equal to your entry. We send a limit order at the last trade price or the position entry price, whichever is higher, to avoid unnecessary slippage.
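As a minimal sketch of that exit-price rule (the names `entry_price` and `last_trade_price` are illustrative, not taken from the script):

    def exit_limit_price(entry_price, last_trade_price):
        # Sell limit at the entry price or the last trade price,
        # whichever is higher, so we never undercut our own entry.
        return max(entry_price, last_trade_price)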

The buy order is canceled after 2 minutes if it does not fill, on the assumption that the signal is no longer effective. This can happen in a fast-moving market. The sell order is left open indefinitely until it fills, but this may cause a loss greater than the accumulated profit depending on the market situation. This is where you can improve the risk control beyond this example.
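A sketch of the two-minute cancellation, assuming an alpaca-trade-api REST client `api` and an `order` whose `submitted_at` is already a timezone-aware datetime (in practice the SDK may hand it back as a string that needs parsing first):

    from datetime import datetime, timedelta, timezone

    def cancel_if_stale(api, order, max_age=timedelta(minutes=2)):
        # If the buy order has been open longer than two minutes,
        # the signal is assumed stale; cancel the order.
        age = datetime.now(timezone.utc) - order.submitted_at
        if age > max_age:
            api.cancel_order(order.id)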

The buy signal is calculated as soon as a minute bar arrives, which typically happens about 4 seconds after the top of every minute (this is Polygon's behavior for minute-bar streaming).

This example liquidates all positions it is watching with market orders at the end of market hours (03:55pm ET).

Implementation

This example relies heavily on Python's asyncio. Although it runs in a single thread, we handle multiple symbols concurrently using this async loop.

We keep each symbol's state in a separate ScalpAlgo class instance. That way, everything stays simple, without complex data structures, and the code stays easy to read. The main() function creates an algo instance for each symbol and creates a streaming object to listen for bar events. As soon as we receive a minute bar, we invoke the event handler of the matching symbol, as sketched below.
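A minimal sketch of that fan-out, mirroring the wiring in main() (`api`, `symbols`, and `args` come from that context; the handler signature follows the StreamConn-style API this example uses):

    fleet = {symbol: ScalpAlgo(api, symbol, lot=args.lot)
             for symbol in symbols}

    async def on_bars(conn, channel, data):
        # Route each minute bar to the algo instance that owns the symbol.
        if data.symbol in fleet:
            fleet[data.symbol].on_bar(data)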

The main routine also starts a periodic check routine to do some work in the background every 30 seconds. In this background task, we check the market state with the clock API and liquidate positions before the market closes.
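A sketch of that background task, reusing `api` and `fleet` from above; each algo's checkup() receives its current position, if any:

    import asyncio

    async def periodic():
        while True:
            # Stop once the market is closed.
            if not api.get_clock().is_open:
                break
            positions = api.list_positions()
            for symbol, algo in fleet.items():
                pos = [p for p in positions if p.symbol == symbol]
                algo.checkup(pos[0] if pos else None)
            await asyncio.sleep(30)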

Algo Instance and State Management

Each algo instance initializes its state by fetching the day's bar data so far, along with the position and orders from the Alpaca API, so it stays synchronized even if the script restarts after some trades. There are four internal states, with transitions as events happen (a minimal sketch follows the list).

  • TO_BUY: no position, no order. Can transition to BUY_SUBMITTED
  • BUY_SUBMITTED: buy order has been submitted. Can transition to TO_BUY or TO_SELL
  • TO_SELL: buy is filled and holding position. Can transition to SELL_SUBMITTED
  • SELL_SUBMITTED: sell order has been submitted. Can transition to TO_SELL or TO_BUY
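A minimal sketch of this state machine, with the allowed transitions encoded as a table (the real class simply mutates a state attribute in its handlers):

    TO_BUY, BUY_SUBMITTED = 'TO_BUY', 'BUY_SUBMITTED'
    TO_SELL, SELL_SUBMITTED = 'TO_SELL', 'SELL_SUBMITTED'

    TRANSITIONS = {
        TO_BUY: {BUY_SUBMITTED},
        BUY_SUBMITTED: {TO_BUY, TO_SELL},
        TO_SELL: {SELL_SUBMITTED},
        SELL_SUBMITTED: {TO_SELL, TO_BUY},
    }

    def transition(state, new_state):
        # Guard against illegal transitions, e.g. TO_BUY -> SELL_SUBMITTED.
        assert new_state in TRANSITIONS[state], f'{state} -> {new_state}'
        return new_state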

Event Handlers

on_bar() is the event handler for bar data. Here we calculate the signal that triggers a buy order in the TO_BUY state. Once the order is submitted, the instance moves to the BUY_SUBMITTED state.

If the order is filled, the on_order_update() handler is called with event=fill. The state transitions to TO_SELL, a sell order is submitted immediately, and the state moves on to SELL_SUBMITTED.
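A hedged sketch of that fill handling; on_order_update() and its (event, order) arguments match the example, while _submit_sell() and _state are illustrative names:

    def on_order_update(self, event, order):
        if event == 'fill' and self._state == 'BUY_SUBMITTED':
            self._state = 'TO_SELL'
            # Immediately work the exit: a limit order at
            # max(entry price, last trade price).
            self._submit_sell()
            self._state = 'SELL_SUBMITTED'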

Orders may also be canceled or rejected (either by this script or because you cancel them manually from the dashboard). In these cases, the state transitions to TO_BUY (if not holding a position) or TO_SELL (if holding a position) and waits for the next event.

The checkup() method is the background periodic job that checks several conditions; this is where we cancel open orders and send a market sell order if there is an open position (a sketch follows below).

It exits once the market closes.
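A sketch of the near-close part of checkup(), assuming illustrative attributes _api, _symbol, and _order on the algo instance (Python 3.9+ for zoneinfo):

    from datetime import datetime, time
    from zoneinfo import ZoneInfo

    def checkup(self, position):
        # Near the close (03:55pm ET), cancel any open order and
        # liquidate the remaining position with a market order.
        now = datetime.now(ZoneInfo('America/New_York'))
        if now.time() >= time(15, 55):
            if self._order is not None:
                self._api.cancel_order(self._order.id)
            if position is not None:
                self._api.submit_order(
                    symbol=self._symbol, qty=position.qty,
                    side='sell', type='market', time_in_force='day')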

Note

Each algo instance owns a child logger prefixed by the symbol name. The console log is also written to a file, console.log, in the same directory for your later review.
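A sketch of that logging setup, under the assumption that a root handler mirrors output into console.log and each instance asks for a child logger named after its symbol:

    import logging

    logging.basicConfig(level=logging.INFO)
    # Mirror console output into console.log for later review.
    logging.getLogger().addHandler(logging.FileHandler('console.log'))

    # Per-symbol child logger; records carry the symbol in their name.
    logger = logging.getLogger('main').getChild('AAPL')
    logger.info('state -> TO_BUY')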

Again, the beauty of this code is that there is no multithreaded code; each algo instance focuses only on its own bar, order, and position data, yet the program handles multiple symbols concurrently and runs the background periodic job in the same async loop.

The trick to running an additional async routine alongside the stream is as follows.

    loop = stream.loop
    # Run the websocket subscription and the background periodic job
    # together on the stream's event loop.
    loop.run_until_complete(asyncio.gather(
        stream.subscribe(channels),
        periodic(),
    ))
    loop.close()

We use asyncio.gather() to run the bar handler, the order-update handler, and the periodic job in one async loop indefinitely. You can kill it with Ctrl+C.

Customization

Instead of the built-in buy signal (a 20-minute simple moving average crossover), you can use your own. To do so, extend the ScalpAlgo class and override the _calc_buy_signal() method.

    class MyScalpAlgo(ScalpAlgo):
        def _calc_buy_signal(self):
            '''self._bars has all minute bars in the session so far.
            Return True to trigger a buy order.'''
            return False

And use it instead of the original class.
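For instance, here is a hedged sketch of a crossover signal in that shape, assuming self._bars is a pandas DataFrame of minute bars with a close column (an illustration, not the repository's exact built-in logic):

    class MyScalpAlgo(ScalpAlgo):
        def _calc_buy_signal(self):
            '''Buy when the close crosses above its 20-minute SMA.'''
            closes = self._bars.close
            if len(closes) <= 20:
                return False  # not enough bars yet
            sma = closes.rolling(20).mean()
            return (closes.iloc[-2] <= sma.iloc[-2]
                    and closes.iloc[-1] > sma.iloc[-1])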

example-scalping's People

Contributors

camelpac, macarse, umitanuki

example-scalping's Issues

stock prediction

I saw that you liked my work, and I'm glad you have looked at it: https://github.com/Leci37/LecTrade/tree/develop

My name is Luis. I'm a big-data machine-learning developer and a fan of your work, and I usually check your updates.

I was afraid that my savings would be eaten by inflation, so I created a powerful tool based on past technical patterns (volatility, moving averages, statistics, trends, candlesticks, support and resistance, stock index indicators): all the ones you know (RSI, MACD, STOCH, Bollinger Bands, SMA, DEMARK, Japanese candlesticks, Ichimoku, Fibonacci, Williams %R, balance of power, Murrey math, etc.) and more than 200 others.

The tool creates prediction models of correct trading points (buy and sell signals, so each stock is traded at the right time and in the right direction). For data collection and calculation I used big-data tools like pandas and technical-pattern libraries such as tablib, TAcharts, and pandas_ta, along with powerful machine-learning libraries such as Sklearn.RandomForest, Sklearn.GradientBoosting, XGBoost, Google TensorFlow, and Google TensorFlow LSTM.

With models trained on a selection of the best technical indicators, the tool is able to predict trading points (where to buy, where to sell) and send real-time alerts to Telegram or Mail. The points are calculated from the correct trading points of the last 2 years (including the change to a bear market after the rate hike).

I think it could be useful to you. I would like to share it with you, and if you are interested in improving it and collaborating, who knows, we could make beautiful things together.

Thank you for your time
I'm sorry to contact you here, via issues; I didn't know how else to reach you.
mail : [email protected] or https://github.com/Leci37/stocks-Machine-learning-RealTime-telegram/discussions

Not all trade updates get reflected in the algo

This issue is not particularly related to the example provided. My concern is that when I run this code, some trade updates don't get reflected in the algo via trade_updates. For example: a trade gets executed, but the trade_update doesn't show up in the feed, so the algo thinks the trade never took place, and 2 minutes later it tries to cancel it.

Why do certain trade_updates not show up? Please note that I saw this behavior in my paper account.

Forbidden URL: Doesn't work with new API

Hello,

I can't get the script to run due to the upgrade to the API:
When I run python main.py --lot=2000 TSLA FB AAPL, I get:

requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://data.alpaca.markets/v1/aggs/ticker/TSLA/range/1/minute/2021-03-01/2021-03-02

Signal Loss Threat

Hey, saw your Medium article. Wonderful project.

Just wanted to point out a danger of the algorithm around the line of code self._bars.rolling(20).mean().close.values.

If the internet connection is lost for a couple of minutes and then reconnected, the algorithm will take the last 20 values regardless and may calculate over the wrong timeframe.

To be overprotective here, the collected bar data should always carry a timestamp; check that all bars of the last 20 minutes are actually available, and only calculate the signal under that condition, to avoid producing any wrong signals. That makes your algo safer. :)
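Something like this sketch would do (assuming self._bars is a pandas DataFrame indexed by minute-bar timestamps; names are illustrative):

    import pandas as pd

    def _has_full_window(bars, window=20):
        # Only trust the moving average when the last `window` bars are
        # consecutive minutes, i.e. no gap from a dropped connection.
        if len(bars) < window:
            return False
        recent = bars.index[-window:]
        return recent[-1] - recent[0] == pd.Timedelta(minutes=window - 1)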

Keep up the good work and good luck making money with this.

My kindest regards,

V2 Changes are returning 404 errors

I didn't modify any of the code and ran this script as is. Here is the output.

https://stream.data.alpaca.markets
2021-03-04 10:18:23,356:stream.py:295:INFO:alpaca_trade_api.stream:started trading stream
2021-03-04 10:18:23,371:stream.py:196:INFO:alpaca_trade_api.stream:started data stream
2021-03-04 10:18:23,464:stream.py:213:WARNING:root:websocket error, restarting connection: server rejected WebSocket connection: HTTP 404
2021-03-04 10:18:23,606:stream.py:282:INFO:alpaca_trade_api.stream:connected to: wss://paper-api.alpaca.markets/stream/
2021-03-04 10:18:26,569:stream.py:213:WARNING:root:websocket error, restarting connection: server rejected WebSocket connection: HTTP 404
2021-03-04 10:18:29,657:stream.py:213:WARNING:root:websocket error, restarting connection: server rejected WebSocket connection: HTTP 404
Traceback (most recent call last):
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\alpaca_trade_api\stream.py", line 201, in _run_forever
    await self._start_ws()
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\alpaca_trade_api\stream.py", line 179, in _start_ws
    await self._connect()
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\alpaca_trade_api\stream.py", line 50, in _connect
    extra_headers={'Content-Type': 'application/msgpack'})
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\websockets\client.py", line 547, in __await_impl__
    extra_headers=protocol.extra_headers,
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\websockets\client.py", line 296, in handshake
    raise InvalidStatusCode(status_code)
websockets.exceptions.InvalidStatusCode: server rejected WebSocket connection: HTTP 404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\alpaca_trade_api\stream.py", line 397, in _run_forever
    self._data_ws._run_forever())
  File "D:\Users\nickjohnson\PycharmProjects\alpaca\venv\lib\site-packages\alpaca_trade_api\stream.py", line 208, in _run_forever
    raise ConnectionError("max retries exceeded")
ConnectionError: max retries exceeded

Process finished with exit code 1

1006 disconnect

@umitanuki

How do you handle 1006 disconnects in this paradigm?
{"message": "code = 1006 (connection closed abnormally [internal]), no reason", "time": "2020-09-17T15:50:36.513913"}

A question

First off, I must say I have gone through 50+ tutorials on Alpaca and algo trading, and by far this is one of the most helpful and realistically useful implementations I came across.

I do have a question that I have been struggling with: how does one go about implementing and testing this code during after-market hours? I am not talking about backtesting a strategy (I have done that manually and with readily available libraries like backtrader); I am talking about being able to stream dummy data, etc.

Any guidance would help!

Trying to set up for paper trading

In an effort to start using this for paper trading, I have modified the main function to stream ticker data using the live credentials and to stream trade data using the paper credentials. However, I have yet to hit a breakpoint set on stream2 under any circumstance.

def main(args):
    api = alpaca.REST(dev_key, dev_secret, base_url=dev_endpoint)
    stream1 = alpaca.StreamConn(live_key, live_secret)
    stream2 = alpaca.StreamConn(dev_key, dev_secret, base_url=dev_endpoint)

    fleet = {}
    symbols = get_tickers()
    for symbol in symbols:
        algo = ScalpAlgo(api, symbol, lot=args.lot)
        fleet[symbol] = algo

    @stream1.on(r'^AM')
    async def on_bars(conn, channel, data):
        if data.symbol in fleet:
            fleet[data.symbol].on_bar(data)

    @stream2.on(r'trade_updates')
    async def on_trade_updates(conn, channel, data):
        logger.info(f'trade_updates {data}')
        symbol = data.order['symbol']
        if symbol in fleet:
            fleet[symbol].on_order_update(data.event, data.order)

    async def periodic():
        while True:
            if not api.get_clock().is_open:
                logger.info('exit as market is not open')
                sys.exit(0)
            await asyncio.sleep(30)
            positions = api.list_positions()
            for symbol, algo in fleet.items():
                pos = [p for p in positions if p.symbol == symbol]
                algo.checkup(pos[0] if len(pos) > 0 else None)

    channels1 = ['AM.' + symbol for symbol in symbols]
    channels2 = ['trade_updates']

    loop1 = stream1.loop
    loop2 = stream2.loop
    loop1.run_until_complete(asyncio.gather(
        stream1.subscribe(channels1),
        periodic(),
    ))
    loop2.run_until_complete(asyncio.gather(
        stream2.subscribe(channels2),
        periodic(),
    ))
    loop1.close()
    loop2.close()

The following, however, does seem to work in testing:

import alpaca_trade_api as alpaca
from auth import dev_key, dev_secret, dev_endpoint

stream2 = alpaca.StreamConn(dev_key, dev_secret, dev_endpoint)


@stream2.on(r'trade_updates')
async def on_trade_updates(conn, channel, data):
    print('data', data)


stream2.run(['trade_updates'])

Do you have any thoughts on how I might be able to modify this code to allow trade updates to come from the paper account?

Thank you,
Nick

status:active but still "Failed to authenticate"

I really like your scalping strategy and Python implementation!

I have a live account and a paper account, and regularly work with Polygon. However, when I try to run your code with my paper account, I get the following error:

c:\users\gil\envs\utils\lib\site-packages\alpaca_trade_api\stream2.py in _connect(self)
     44             status = msg.get('data').get('status')
     45             if status != 'authorized':
---> 46                 raise ValueError(
     47                     (f"Invalid Alpaca API credentials, Failed to "
     48                      f"authenticate: {msg}")

ValueError: Invalid Alpaca API credentials, Failed to authenticate: {'stream': 'authorization', 'data': {'action': 'authenticate', 'message': 'access key verification failed', 'status': 'unauthorized'}}

-- this is despite the fact that my account is active:

ipdb> api.get_account()
Account({   'account_blocked': False,
    'account_number': 'PA28K7UTZQSB',
    'buying_power': '160000',
    'cash': '40000',
    'created_at': '2020-12-23T01:29:09.496085Z',
    'currency': 'USD',
    'daytrade_count': 0,
    'daytrading_buying_power': '160000',
    'equity': '40000',
    'id': 'e3d0cd29-078a-484e-856b-6819912ca25d',
    'initial_margin': '0',
    'last_equity': '40000',
    'last_maintenance_margin': '0',
    'long_market_value': '0',
    'maintenance_margin': '0',
    'multiplier': '4',
    'pattern_day_trader': False,
    'portfolio_value': '40000',
    'regt_buying_power': '80000',
    'short_market_value': '0',
    'shorting_enabled': True,
    'sma': '0',
    'status': 'ACTIVE',
    'trade_suspended_by_user': False,
    'trading_blocked': False,
    'transfers_blocked': False})

Can you please advise?

Thanks,
Gil

"Cannot do slice indexing"

I wanted to see what I could do to build off of the base code, but I'm unable to get it to run. Can you think of any reason why I keep getting this error?

TypeError: cannot do slice indexing on <class 'pandas.core.indexes.numeric.Int64Index'> with these indexers [2020-03-13 09:30:00-04:00] of <class 'pandas._libs.tslibs.timestamps.Timestamp'>

Thanks for your help
