
scrapes names and tickers from magicformulainvesting.com every quarter, adds info to a google sheet which includes stock prices and a link to the first result of a google search of the company name

Home Page: https://ginglis.me/magic-formula/

License: MIT License

Topics: google-sheets gspread scraper google-worksheet selenium chrome-browser python3 investing automation automation-selenium


magic-formula-scraper

Python script for scraping magicformulainvesting.com and appending the data to a Google Sheet, using Selenium, the Google Sheets API, and gspread.

My brother and I invest by following Joel Greenblatt's Magic Formula. The site above applies this formula and lists the top X companies that fit its criteria. However, the site does not let a user copy the companies' information directly from the webpage. Manually typing out the names of 30+ companies and their information is a time-suck, so I created this script to scrape it instead.

Example GIF

Here is the script running with a headless version of the Google Chrome browser, i.e. one without a GUI. It is also running with my credentials already supplied, so no interaction between the program and the user is needed.

Features

  • opens a Chrome browser to the Magic Formula login page, then uses Selenium's Keys and the getpass library to enter login information
  • once logged in, selects the number of stocks to view and clicks the corresponding button to display them
  • scrapes information about the listed companies and writes it to a CSV file titled 'companies.csv'
  • appends the data to a spreadsheet using the Google Sheets API and gspread
  • optional: can be run as a cron job, instructions below
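The CSV-writing step in the third bullet can be sketched as a small standalone function (a minimal illustration; `write_companies_csv` and the header row are my additions, not part of scraper.py, which writes rows inside its main loop):

```python
import csv

def write_companies_csv(rows, path="companies.csv"):
    """Write (name, ticker) pairs to a CSV file, one company per row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        # header row is an illustration; the script itself writes data rows only
        writer.writerow(["company", "ticker"])
        writer.writerows(rows)
```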

Main Loop

This is where the data is both written to a CSV file and added to a Google worksheet:

# find all table rows in the screening results table
trs = driver.find_elements_by_xpath('//table[@class="divheight screeningdata"]/tbody/tr')

for tr in trs:
    td = tr.find_elements_by_xpath(".//td")
    # company name and ticker are the first two cells in each row
    # (no .encode() here: concatenating bytes with str below would fail on Python 3)
    company_name = td[0].get_attribute("innerHTML")
    company_tikr = td[1].get_attribute("innerHTML")
    # write to csv file
    writer.writerow([company_name, company_tikr])
    # append row to worksheet; value_input_option="USER_ENTERED" makes Sheets
    # evaluate the GOOGLEFINANCE formula so the price is pulled live
    worksheet.append_row([company_name, company_tikr, '=GOOGLEFINANCE("' + company_tikr + '","price")'], value_input_option="USER_ENTERED")

driver.quit()
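The formula string built inline in the `append_row` call can be factored out. A small hypothetical helper (not in scraper.py) makes the pattern explicit:

```python
def googlefinance_price_formula(ticker):
    """Build the GOOGLEFINANCE price formula appended alongside each ticker."""
    return '=GOOGLEFINANCE("{0}","price")'.format(ticker)
```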

Usage

  1. Create a Google Developer Account. This allows access to Google's Drive and Sheets APIs, as well as a ton of other resources. Signing up gives the user $300 in credit!

  2. Read the gspread docs on how to generate credentials. This will help with linking your worksheet to the script. Make sure you put the path to the JSON file on line 74!

  3. Some parts of the script will have to be personalized by the user. These sections of scraper.py are listed below.

Add Oauth Credentials

credentials = ServiceAccountCredentials.from_json_keyfile_name('/path/to/your/credentials', scope)
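For context, here is a fuller authorization sketch around that line. The scope URLs are the ones gspread's documentation has historically used with oauth2client; verify them against the current gspread docs, and note the credentials path is a placeholder:

```python
import gspread
from oauth2client.service_account import ServiceAccountCredentials

# scopes gspread needs to read/write sheets through the Drive and Sheets APIs
scope = ['https://spreadsheets.google.com/feeds',
         'https://www.googleapis.com/auth/drive']

credentials = ServiceAccountCredentials.from_json_keyfile_name('/path/to/your/credentials', scope)
gc = gspread.authorize(credentials)
```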

Add URL to Your Spreadsheet

# access the sheet by url; get_worksheet(1) opens the second worksheet (0-indexed)
worksheet = gc.open_by_url('URL_TO_YOUR_SPREADSHEET').get_worksheet(1)

Cron Job

I have set up my script to run as a cron job every 3 months, on the first day of the month at 1 am.

Edit lines 31-35 if you wish to hardcode your login credentials

# enter email and password. uses getpass to hide the password (i.e. it is never echoed in plaintext)
your_email = input("Please enter your email for magicformulainvesting.com: ")
your_password = getpass.getpass("Please enter your password for magicformulainvesting.com: ")
username.send_keys(your_email)
password.send_keys(your_password)

To run Selenium from a cron job, the browser must be headless. I am using Chrome with the headless option in my personal script. The Chrome webdriver must also be installed:

brew install --cask chromedriver

Add these lines to scraper.py in place of the current 'driver = ...' line:

options = webdriver.ChromeOptions()
options.add_argument('--headless')

# declare driver as a headless Chrome instance
driver = webdriver.Chrome(executable_path="path/to/chromedriver", chrome_options=options)

Below is my cron job, accessed on Mac or Linux by running 'crontab -e' in the terminal. I first had to give iTerm and the Terminal apps permission to read from and write to my disk.

SHELL=/bin/bash
PATH=/usr/local/bin/:/usr/bin:/usr/sbin
0 1 1 */3 * export DISPLAY=:0 && cd /path/to/scraper && /usr/bin/python scraper.py

A cron job cannot read standard input, so a script that prompts for input will hit an end-of-file error. For the cron job I have therefore hardcoded my username and password, which is bad practice; since this site doesn't really contain sensitive information, I'm okay with that. The script provided in this repository still uses the secure getpass prompt for the user's password.
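If you'd rather not hardcode credentials for the cron run, environment variables set in the crontab are a middle ground. A sketch, with hypothetical variable names MF_EMAIL and MF_PASSWORD:

```python
import os

def credentials_from_env():
    """Read login credentials from the environment instead of stdin or hardcoded strings."""
    email = os.environ.get("MF_EMAIL")
    password = os.environ.get("MF_PASSWORD")
    if not email or not password:
        raise RuntimeError("Set MF_EMAIL and MF_PASSWORD in the crontab environment")
    return email, password
```

In the crontab, `MF_EMAIL=...` and `MF_PASSWORD=...` can be set above the schedule line, the same way SHELL and PATH are set.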

Features to Implement

  • keep a file of companies already researched/invested in, and check it before writing to the CSV or updating the Google worksheet
  • add a blank row before appending each batch of company info to the Google worksheet
  • maybe scrape company descriptions and add them to the spreadsheet
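The first feature above could start as a simple set-difference check (a hypothetical helper; the researched-companies file path is whatever you choose):

```python
import csv

def filter_new_companies(scraped, seen_path):
    """Drop companies whose ticker already appears in the researched/invested CSV."""
    with open(seen_path, newline="") as f:
        seen = {row[1] for row in csv.reader(f) if row}
    return [(name, tikr) for name, tikr in scraped if tikr not in seen]
```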

magic-formula-scraper's People

Contributors

bji219, ginglis13


magic-formula-scraper's Issues

Scraper Paste Offset

The scraper pastes the data offset by a few columns in Google Sheets when run.

Chromedriver/Chrome Browser version mismatch

The code chunk below downloads the latest chromedriver and extracts it, which is great, but if the user's Chrome browser is at a lower version, the code will fail.

# Get latest chromedriver zip file for mac, extract into the same folder
try:
    version = requests.get('https://chromedriver.storage.googleapis.com/LATEST_RELEASE').text
    url = 'https://chromedriver.storage.googleapis.com/{0}/{1}'.format(version, 'chromedriver_mac64.zip')
    r = requests.get(url, allow_redirects=True)
    with open('chromedriver.zip', 'wb') as f:
        f.write(r.content)
    with zipfile.ZipFile("chromedriver.zip", "r") as zip_ref:
        zip_ref.extractall()
except Exception:
    # swallowing every error here also hides download failures
    pass

Error text:

selenium.common.exceptions.SessionNotCreatedException: Message: session not created: This version of ChromeDriver only supports Chrome version 111 Current browser version is 110.0.5481.77 with binary path C:\xxxxx

Proposed solution: check whether a chromedriver download is actually necessary, i.e. whether the installed driver already matches the browser's major version, before executing the try/except block.
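A hypothetical helper for that check, comparing major versions (ChromeDriver's compatibility guarantee is per major version, as the error message above shows):

```python
import re

def major_version(version_string):
    """Extract the leading major version number, e.g. '110.0.5481.77' -> 110."""
    match = re.match(r"(\d+)\.", version_string)
    return int(match.group(1)) if match else None

def driver_matches_browser(driver_version, browser_version):
    """Only download a new chromedriver when the major versions diverge."""
    return major_version(driver_version) == major_version(browser_version)
```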
