
ethereum-nft-activity's Introduction

ethereum-nft-activity

How much energy does it take to power popular Ethereum-backed CryptoArt platforms? And what emissions are associated with this energy use?

These questions do not have clear answers for two reasons:

  1. The overall energy usage and emissions of Ethereum are hard to estimate. I am working on this in a separate repo: kylemcdonald/ethereum-energy
  2. The portion for which a specific user, platform, or transaction might be considered "responsible" is more of a philosophical question than a technical one. Like many complex systems, there is an indirect relationship between the service and the emissions. I am working on different approaches in this notebook: Per-Transaction Models

This table represents one method for computing emissions, as of March 5, 2022. The methodology is described below.

| Name | Fees | Transactions | kgCO2 |
| --- | ---: | ---: | ---: |
| Art Blocks | 12,006 | 244,594 | 21,531,626 |
| Async | 224 | 27,403 | 332,657 |
| Foundation | 8,602 | 661,074 | 14,568,164 |
| KnownOrigin | 507 | 64,326 | 904,455 |
| Makersplace | 1,840 | 144,163 | 3,010,383 |
| Nifty Gateway | 1,621 | 151,950 | 2,385,675 |
| OpenSea | 314,515 | 20,012,086 | 551,268,013 |
| Rarible | 20,930 | 1,802,971 | 27,706,539 |
| SuperRare | 2,215 | 320,697 | 3,172,169 |
| Zora | 532 | 21,660 | 721,254 |

Updates:

  • March 5, 2022: Missing contracts were added to Art Blocks and OpenSea, massively increasing their totals. Duplicate contracts were removed from Nifty Gateway, roughly halving its totals. The contracts were duplicated because they were found both when scraping Nifty Gateway and when pulling labeled contracts from Etherscan.

Preparation

First, sign up for an API key at Etherscan. Create env.json and add the API key. It should look like:

```json
{
    "etherscan-api-key": "<etherscan-api-key>"
}
```
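In code, the key can then be read back with Python's standard json module; this sketch parses the same structure from a string rather than opening env.json:

```python
import json

# Parse the structure shown above; in the project itself this would be
# json.load(open("env.json")) instead of a string literal.
env = json.loads('{"etherscan-api-key": "<etherscan-api-key>"}')
api_key = env["etherscan-api-key"]
```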

Install dependencies:

```shell
pip install -r requirements.txt
```

Note: this project requires Python 3.

contracts_footprint.py

This will pull all the transactions from Etherscan, sum the gas and transaction counts, and do a basic emissions estimate. Results are saved in the /output directory as JSON or TSV. Run the script with, for example: python contracts_footprint.py --verbose --tsv data/contracts.json data/nifty-gateway-contracts.json.

This may take longer the first time, while your local cache is updated. When updating after a week, it can take 5 minutes or more to download all new transactions. The entire cache can be multiple gigabytes.

This script has a few unique additional flags:

  • --summary to summarize the results in a format similar to the above table, combining multiple contracts into a single row of output.
  • --startdate and --enddate can be used to only analyze a specific date range, using the format YYYY-MM-DD.
  • --tsv will save the results of analysis as a TSV file instead of JSON.

contracts_history.py

This will pull all the transactions from Etherscan, sum the transaction fees and gas used, and group by day and platform. Results are saved in the /output directory as CSV files. Run the script with, for example: python contracts_history.py --verbose data/contracts.json data/nifty-gateway-contracts.json

The most recent results are cached in the gh_pages branch.

Additional flags

Both scripts have these shared additional flags:

  • --noupdate runs from cached results. This will not make any requests to Nifty Gateway or Etherscan. When using the Etherscan class in code without an API key, this is the default behavior.
  • --verbose prints progress when scraping Nifty Gateway or pulling transactions from Etherscan.

Helper scripts

  • python ethereum_stats.py will pull stats from Etherscan like daily fees and block rewards and save them to data/ethereum-stats.json
  • python nifty_gateway.py will scrape all the contracts from Nifty Gateway and save them to data/nifty-gateway-contracts.json

Methodology

The footprint of a platform is the sum of the footprints for all artwork on the platform. Most platforms use a few Ethereum contracts and addresses to handle all artworks. For each contract, we download all the transactions associated with that address from Etherscan. Then for each day, we take the sum of all fees paid on all those transactions divided by the total fees paid across the whole network for that day. This ratio is multiplied by the daily Ethereum emissions estimate to get the total emissions for that address. Finally, the total emissions for a platform are equal to the emissions for all addresses across all days.
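The calculation above can be sketched in a few lines; every number here is invented purely for illustration (fees in ETH, emissions in kgCO2):

```python
# Per-day attribution: the platform's share of daily fees, multiplied by
# the daily network emissions estimate, summed over all days.
platform_fees = {"2022-03-01": 120.0, "2022-03-02": 95.0}     # fees paid by platform contracts
network_fees  = {"2022-03-01": 6000.0, "2022-03-02": 5000.0}  # total fees across the network
network_kgco2 = {"2022-03-01": 2.0e7, "2022-03-02": 1.8e7}    # daily Ethereum emissions estimate

total_kgco2 = sum(
    (platform_fees[day] / network_fees[day]) * network_kgco2[day]
    for day in platform_fees
)
```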

Sources

Contracts are sourced from a combination of personal research, DappRadar, and Etherscan tags.

When possible, we have confirmed contract coverage directly with the marketplaces. Confirmed contracts include:

  • SuperRare: all confirmed
  • Foundation: all confirmed
  • OpenSea: some contracts on DappRadar have not been confirmed
  • Nifty Gateway: all confirmed

How to add more platforms

To modify this code so that it works with more platforms, add every possible contract and wallet for each platform to the data/contracts.json file, using the format:

```json
"<Platform Name>/<Contract Name>": "<0xAddress>"
```

Then submit a pull request back to this repository. Thanks in advance!
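As a rough sanity check before submitting, an entry in that format can be validated with a short sketch; the platform name and zero address below are placeholders:

```python
import json

# Hypothetical entry following the "<Platform Name>/<Contract Name>": "<0xAddress>"
# format described above.
entry = '{"Example Platform/Example Contract": "0x0000000000000000000000000000000000000000"}'

for name, address in json.loads(entry).items():
    platform, contract = name.split("/", 1)                  # platform and contract name
    assert address.startswith("0x") and len(address) == 42   # 20-byte hex address
```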

Contracts and Addresses

Contracts and addresses used by each platform can be found in data/contracts.json and are also listed here using python print_contracts.py to generate Markdown. Nifty Gateway contracts are listed separately in data/nifty-gateway-contracts.json.

Art Blocks

Async

  • ASYNC 2020-02-25 to 2021-12-21
  • ASYNC-V2 2020-07-21 to 2022-01-09

Foundation

KnownOrigin

Makersplace

Nifty Gateway

OpenSea

Rarible

SuperRare

Zora

ethereum-nft-activity's People

Contributors

jamiew, kylemcdonald, pippinlee


ethereum-nft-activity's Issues

Swap old Digiconomist model for new ethereum-emissions model

ethereum_footprint.py needs to be almost completely replaced:

  1. In __init__, instead of loading the Digiconomist energy consumption and Etherscan gasUsed data, load the ethereum-emissions file daily-ktco2.csv, which is regularly updated on the gh-pages branch. Also load an instance of the EthereumStats class, which gives access to the tx_fees dict (mapping dates to total tx fees).
  2. Remove kgco2_per_gas() from EthereumFootprint.
  3. Remove get_etherscan_data() from EthereumFootprint.
  4. sum_kgco2() should run a different calculation. First, sum all transaction fees (using tx.get_fees()) grouped by the date of the transaction (using tx.get_datetime()). Then, for each day, divide the total fees from those transactions by the total fees on that day (using tx_fees from the EthereumStats instance). Finally, sum all values and convert from kilotons to kilograms.
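Under the issue's assumptions, the proposed sum_kgco2() calculation might look like this sketch; the transaction list and daily figures are invented for illustration (tx_fees and daily-ktco2.csv are the names used above):

```python
from collections import defaultdict

daily_ktco2 = {"2022-03-01": 20.0}    # ktCO2 emitted per day (ethereum-emissions)
tx_fees = {"2022-03-01": 5000.0}      # total network tx fees per day (EthereumStats)
transactions = [("2022-03-01", 50.0), ("2022-03-01", 25.0)]  # (date, fee) pairs

fees_by_day = defaultdict(float)
for date, fee in transactions:        # sum fees grouped by date
    fees_by_day[date] += fee

kgco2 = sum(
    (fees_by_day[day] / tx_fees[day]) * daily_ktco2[day] * 1e6  # kilotons -> kg
    for day in fees_by_day
)
```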

Separately:

  • In contracts_footprint.py: add fees to each row and remove gas. Then re-run contracts_footprint.py and update the table in the README.

Note: any server that regularly computes these numbers is going to need to run ethereum_stats.py on a daily basis to update the total transaction fees. Also, any emissions estimated for the current day should be treated with suspicion, as different data sources may be out of sync with each other.

fix 8GB RAM use

The script has been failing for the last month because it hit memory limits. Using a database can fix this.

switch from json to a database

The files are big enough now to make this worth doing: not so much for speed, because Etherscan is the true bottleneck, but to maintain a low memory footprint (and maybe a low disk footprint too).

This could probably be switched over quickly with sqlite3 if de-duplication is handled in Python.
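One possible sketch, pushing de-duplication into sqlite3 itself via a primary key rather than handling it in Python (the table and column names are hypothetical):

```python
import sqlite3

# Duplicate transactions are dropped at insert time because the hash
# column is the primary key and conflicting rows are ignored.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txs (hash TEXT PRIMARY KEY, fee REAL)")
rows = [("0xabc", 1.0), ("0xabc", 1.0), ("0xdef", 2.0)]  # one duplicate
con.executemany("INSERT OR IGNORE INTO txs VALUES (?, ?)", rows)
count = con.execute("SELECT COUNT(*) FROM txs").fetchone()[0]
```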

Suggestion: report the total gas of Ethereum as well, and percentages

I think it would be great to see what percentage of the total Ethereum network these NFTs are taking up; it could help create some context.

143,595,457,124 is a very big gas number, but what does that mean? Offhand, I think OpenSea is something like 0.5-2% of Ethereum's gas usage for the day, depending on the day; you should show how much of the network it's using.

do not report statistics from current day

When the stats are computed, the current day is just starting and has few or no transactions. This makes it look like the graph is always heading downward, which is inaccurate.

build in 5 call/s rate limit

Etherscan has a 5 calls per second rate limit for the free tier. Instead of pausing every time we hit a rate limit, we could rate limit the fetch_transactions_in_range function itself.
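A minimal client-side limiter along those lines might look like this sketch (the class and method names are hypothetical):

```python
import time

# Space calls at least 1/5 s apart to stay under the 5 calls/s limit.
class RateLimiter:
    def __init__(self, calls_per_second=5):
        self.min_interval = 1.0 / calls_per_second
        self.last_call = float("-inf")

    def wait(self):
        # Sleep just long enough to keep the minimum spacing between calls.
        remaining = self.min_interval - (time.monotonic() - self.last_call)
        if remaining > 0:
            time.sleep(remaining)
        self.last_call = time.monotonic()

limiter = RateLimiter(calls_per_second=5)
start = time.monotonic()
for _ in range(3):
    limiter.wait()  # would wrap each fetch_transactions_in_range call
elapsed = time.monotonic() - start  # first call is free, next two wait ~0.2 s each
```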

Digiconomist kWh data accuracy

I've brought this up elsewhere but I'd love your perspective.

Since you're relying on Digiconomist energy estimation maybe you can help me get to the bottom of this:

Digiconomist says they:

  1. Calculate total USD Mining revenues
  2. Estimate what part is spent on electricity
  3. Find out how much miners pay per kWh
  4. Convert costs to consumption
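For concreteness, those four steps amount to the following arithmetic; every number below is invented for illustration, not taken from Digiconomist:

```python
# Top-down estimate: revenue -> electricity spend -> kWh.
daily_revenue_usd = 1_000_000.0  # 1. total USD mining revenue (invented)
electricity_share = 0.6          # 2. fraction assumed spent on electricity (invented)
usd_per_kwh = 0.10               # 3. assumed average electricity price
kwh = daily_revenue_usd * electricity_share / usd_per_kwh  # 4. implied consumption
```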

You can see their source of data at https://digiconomist.net/ethereum-energy-consumption

However for those four steps:

  1. There's no source for the mining revenue for Ethereum
  2. There's no source for how much of that revenue is spent on electricity
  3. There's no source for their estimate that "miners are assumed to be paying 10 cents per kWh on average"
  4. There's no source for how they get the kWh consumption at the end of these layers of assumptions

I think it's really problematic that all these calculators like carbon.fyi, cryptoart.wtf, and now yours are being built on top of these layers of assumptions and "energy assumed backwards from profit with no sources".

I've seen a handful of other Bitcoin studies that are more "bottom up" than "top down", like https://www.sciencedirect.com/science/article/pii/S2542435119302557, but their estimates are on the low end of Digiconomist's, and they say at length:

Previous academic studies, such as predictions of future carbon emissions or comparisons of cryptocurrency and metal mining, are based on simplistic estimates of power consumption and lack empirical foundations. Consequently, the estimates produced vary significantly among studies

I don't think you should use Digiconomist as a source of data if they have no sources. If you can find a more rigorous estimate of Ethereum network energy usage, I think it would really be a great resource. Digiconomist lacks credibility even though it's so widely cited.
