
chainhammer's People

Contributors

drandreaskrueger


chainhammer's Issues

Failure to deploy chainhammer using vagrant-based quorum 7nodes example

Hello,
Could someone help me fix an issue I am facing while trying to use chainhammer to measure the performance of a Quorum network based on the vagrant 7nodes example? Below is the output shown when I run the ./scripts/install.sh and virtualenv -p python3 py3eth commands.

----------------------------- ./scripts/install.sh --------------------------
vagrant@ubuntu-xenial:~/CH$ ./scripts/install.sh

=========================================
Install ChainHammer dependencies, and clone network starter repos
version v59

Please report any issues IF this script is NOT ending with:
... with exit code 0.

Warning: No guarantees!!
Executing this script has consequences for this machine.
Better only use on a disposable/cloud/virtualbox machine,
and NOT on your main work machine!!

Is this the "chainhammer" folder where you want everything installed?
/home/vagrant/CH

Press enter to continue

must install some tools on system level, please input your sudo password:
scripts/install-docker-compose.sh scripts/install-geth.sh scripts/install-initialize.sh scripts/install-packages.sh scripts/install-virtualenv.sh
scripts/install-docker.sh scripts/install-go.sh scripts/install-network-starters.sh scripts/install-solc.sh
got sudo, thanks.

========================================
scripts/install-packages.sh

sudo apt-get update

Hit:1 https://deb.nodesource.com/node_12.x xenial InRelease
Hit:2 http://ppa.launchpad.net/openjdk-r/ppa/ubuntu xenial InRelease
Hit:3 http://archive.ubuntu.com/ubuntu xenial InRelease
Get:4 http://security.ubuntu.com/ubuntu xenial-security InRelease [109 kB]
Hit:5 http://archive.ubuntu.com/ubuntu xenial-updates InRelease
Hit:6 http://archive.ubuntu.com/ubuntu xenial-backports InRelease
Fetched 109 kB in 1s (66.0 kB/s)
Reading package lists... Done

installing wget htop jq apt-transport-https ca-certificates wget software-properties-common python3-pip python3-venv libssl-dev expect-dev dbus-x11 terminator build-essential automake libtool pkg-config libffi-dev python-dev libsecp256k1-dev
Reading package lists... Done
Building dependency tree
Reading state information... Done
Note, selecting 'expect' instead of 'expect-dev'
E: Unable to locate package libsecp256k1-dev

"echo got sudo, thanks." command failed with exit code 0.

--------------------------------------virtualenv -p python3 py3eth --------------------------
vagrant@ubuntu-xenial:~/CH$ virtualenv -p python3 py3eth
Traceback (most recent call last):
  File "/usr/local/bin/virtualenv", line 7, in <module>
    from virtualenv.__main__ import run_with_catch
  File "/usr/local/lib/python3.5/dist-packages/virtualenv/__init__.py", line 3, in <module>
    from .run import cli_run, session_via_cli
  File "/usr/local/lib/python3.5/dist-packages/virtualenv/run/__init__.py", line 13, in <module>
    from .plugin.activators import ActivationSelector
  File "/usr/local/lib/python3.5/dist-packages/virtualenv/run/plugin/activators.py", line 6, in <module>
    from .base import ComponentBuilder
  File "/usr/local/lib/python3.5/dist-packages/virtualenv/run/plugin/base.py", line 9, in <module>
    from importlib_metadata import entry_points
  File "/usr/local/lib/python3.5/dist-packages/importlib_metadata/__init__.py", line 88
    dist: Optional['Distribution'] = None
        ^
SyntaxError: invalid syntax
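A possible workaround (my suggestion, not from the repo): the installed third-party virtualenv imports an importlib_metadata release that uses variable annotations, which require Python >= 3.6, while this xenial box runs Python 3.5. The stdlib venv module sidesteps the third-party virtualenv entirely:

```shell
# sketch: create the environment with the stdlib venv module instead of
# the broken third-party virtualenv
python3 -m venv py3eth
. py3eth/bin/activate
python -c 'import sys; print(sys.prefix)'   # should point into py3eth/
```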

security alert: upgrade requirements.txt to requests 2.20.0 or later

security alert

1 requests vulnerability found in requirements.txt 8 days ago
Remediation
Upgrade requests to version 2.20.0 or later. For example:
requests>=2.20.0
Always verify the validity and compatibility of suggestions with your codebase.
Details
CVE-2018-18074 More information
moderate severity
Vulnerable versions: <= 2.19.1
Patched version: 2.20.0
The Requests package through 2.19.1 before 2018-09-14 for Python sends an HTTP Authorization header to an http URI upon receiving a same-hostname https-to-http redirect, which makes it easier for remote attackers to discover credentials by sniffing the network.
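The affected range from the advisory can be encoded as a simple version check; a minimal sketch (the helper name is mine, not from the repo):

```python
# CVE-2018-18074: requests <= 2.19.1 forwards the Authorization header on a
# same-hostname https->http redirect; 2.20.0 is the patched release.
def is_vulnerable(version: str) -> bool:
    """Return True if this requests version is in the affected range."""
    parts = tuple(int(p) for p in version.split("."))
    return parts <= (2, 19, 1)

print(is_vulnerable("2.19.1"))  # True  (vulnerable)
print(is_vulnerable("2.20.0"))  # False (patched)
```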

RPCaddress; how to talk to an Ethereum node, tutorial resources

Hello Dr. Krueger,
I installed chainhammer on my machine as you advised, using
scripts/install.sh docker Ubuntu
and ran the first test:
CH_TXS=1000 CH_THREADING="sequential" ./run.sh $HOSTNAME-TestRPC testrpc
Everything went fine.
Then I set up the 7nodes Quorum example locally on my machine:
OS: Ubuntu 20.04 virtual machine
consensus algorithm: clique
transaction: send a private transaction between node1 and node7, as follows:
mohamed@mohamed-VirtualBox:~/quorum-examples/examples/7nodes$ ./runscript.sh private-contract.js
Contract transaction send: TransactionHash: 0xf734748a660ada751edf7736311137b0eac76dcebab71d8889e9928cd6059751 waiting to be mined...
true
Now, how do I use chainhammer to test the performance of my local network (the 7nodes example)? Thanks!
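To point chainhammer at the 7nodes network, the RPC endpoint has to match one of the Quorum nodes. A config-fragment sketch (assumptions: the variable is named RPCaddress as in the issue title above, and node1 of the quorum-examples 7nodes setup serves JSON-RPC on its customary port 22000 — verify both against your checkout):

```python
# config fragment (sketch, values are assumptions, not from the repo):
# node1 of the quorum-examples 7nodes network usually serves JSON-RPC on 22000
RPCaddress = "http://localhost:22000"
```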

Getting several errors running scripts/install-virtualenv.sh

I ran install.sh nodocker, but that failed because pandas could not be built.

Then I followed the instructions in #21 (comment), but I am still getting errors when running scripts/install-virtualenv.sh.

My environment:

$ cat /etc/*release*
ID="ec2"
VERSION="20230124-1270"
PRETTY_NAME="Debian GNU/Linux 11 (bullseye)"
NAME="Debian GNU/Linux"
VERSION_ID="11"
VERSION="11 (bullseye)"
VERSION_CODENAME=bullseye
ID=debian
HOME_URL="https://www.debian.org/"
SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"

Output of uname -a

$ uname -a
Linux ip-172-31-44-113 5.10.0-21-cloud-amd64 #1 SMP Debian 5.10.162-1 (2023-01-21) x86_64 GNU/Linux

Error trace (some snipping done by me to fit the GitHub character limit):

$ scripts/install-virtualenv.sh

create chainhammer virtualenv

after possibly removing a whole existing env/ folder !!!

the new virtualenv will be installed below here:
/home/admin/chainhammer

Think twice. Then press enter to continue

++ rm -rf env
++ python3 -m venv env
++ echo

++ set +x
+++ source env/bin/activate
++ echo

++ python3 -m pip install --upgrade pip==18.1
Collecting pip==18.1
  Using cached pip-18.1-py2.py3-none-any.whl (1.3 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 20.3.4
    Uninstalling pip-20.3.4:
      Successfully uninstalled pip-20.3.4
Successfully installed pip-18.1
++ pip3 install wheel
Collecting wheel
  Using cached https://files.pythonhosted.org/packages/bd/7c/d38a0b30ce22fc26ed7dbc087c6d00851fb3395e9d0dac40bec1f905030c/wheel-0.38.4-py3-none-any.whl
Installing collected packages: wheel
Successfully installed wheel-0.38.4
You are using pip version 18.1, however version 23.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
++ pip3 install --upgrade py-solc==3.2.0 web3==4.8.2 'web3[tester]==4.8.2' rlp==0.6.0 eth-testrpc==1.3.5 requests==2.21.0 pandas==1.1.2 matplotlib==3.3.2 pytest==4.0.2 pytest-cov==2.6.0 jupyter==1.0.0 ipykernel==5.1.0
Collecting py-solc==3.2.0
  Using cached https://files.pythonhosted.org/packages/47/74/d36abca3f36ccdcd04976c50f83502c870623e5beb4a4ec96c7bad4bb9e8/py_solc-3.2.0-py3-none-any.whl
Collecting web3==4.8.2
  Using cached https://files.pythonhosted.org/packages/84/7b/8dfe018c0b94a68f88d98ff39c11471ac55ffbcb22cd7ab41010c1476a75/web3-4.8.2-py3-none-any.whl
Collecting rlp==0.6.0
Collecting eth-testrpc==1.3.5
  Using cached https://files.pythonhosted.org/packages/bc/9a/8a8c90b8ed4db0afc39bc7b67b52aa8cbbc9c08bbd93f7ca92719e3493a3/eth_testrpc-1.3.5-py3-none-any.whl
Collecting requests==2.21.0
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting pandas==1.1.2
  Using cached https://files.pythonhosted.org/packages/64/f1/8fdbd74edfc31625d597717be8c155c6226fc72a7c954c52583ab81a8614/pandas-1.1.2.tar.gz
  Installing build dependencies ... done
Collecting matplotlib==3.3.2
Collecting pytest==4.0.2
  Using cached https://files.pythonhosted.org/packages/19/80/1ac71d332302a89e8637456062186bf397abc5a5b663c1919b73f4d68b1b/pytest-4.0.2-py2.py3-none-any.whl
Collecting pytest-cov==2.6.0
  Using cached https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl
Building wheels for collected packages: pandas, pillow
  Running setup.py bdist_wheel for pandas ... error
  Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-2ez98py3 --python-tag cp39:
  /tmp/pip-install-t6avqa7g/pandas/setup.py:45: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    _CYTHON_INSTALLED = _CYTHON_VERSION >= LooseVersion(min_cython_ver)
  /tmp/pip-install-t6avqa7g/pandas/setup.py:491: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    if np.__version__ < LooseVersion("1.16.0"):
  /tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/setuptools/__init__.py:85: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated. Requirements should be satisfied by a PEP 517 installer. If you are using pip, you can try `pip install --use-pep517`.
    dist.fetch_build_eggs(dist.setup_requires)
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-x86_64-cpython-39
  creating build/lib.linux-x86_64-cpython-39/pandas
 [...]
  
  UPDATING build/lib.linux-x86_64-cpython-39/pandas/_version.py
  set build/lib.linux-x86_64-cpython-39/pandas/_version.py to '1.1.2'
  running build_ext
  building 'pandas._libs.algos' extension
  creating build/temp.linux-x86_64-cpython-39
  creating build/temp.linux-x86_64-cpython-39/pandas
  creating build/temp.linux-x86_64-cpython-39/pandas/_libs
  [...]
  building 'pandas._libs.writers' extension
  x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/writers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/writers.o -Werror
  pandas/_libs/writers.c: In function ‘__pyx_f_6pandas_5_libs_7writers_word_len’:
  pandas/_libs/writers.c:5099:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations]
   5099 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
        |     ^~~~~~~~~
  In file included from /usr/include/python3.9/unicodeobject.h:1026,
                   from /usr/include/python3.9/Python.h:97,
                   from pandas/_libs/writers.c:35:
  /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here
    446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) {
        |                          ^~~~~~~~~~~~~~~~~~~~~~~~~~
  pandas/_libs/writers.c:5099:5: error: ‘PyUnicode_AsUnicode’ is deprecated [-Werror=deprecated-declarations]
   5099 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
        |     ^~~~~~~~~
  In file included from /usr/include/python3.9/unicodeobject.h:1026,
                   from /usr/include/python3.9/Python.h:97,
                   from pandas/_libs/writers.c:35:
  /usr/include/python3.9/cpython/unicodeobject.h:580:45: note: declared here
    580 | Py_DEPRECATED(3.3) PyAPI_FUNC(Py_UNICODE *) PyUnicode_AsUnicode(
        |                                             ^~~~~~~~~~~~~~~~~~~
  pandas/_libs/writers.c:5099:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations]
   5099 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
        |     ^~~~~~~~~
  In file included from /usr/include/python3.9/unicodeobject.h:1026,
                   from /usr/include/python3.9/Python.h:97,
                   from pandas/_libs/writers.c:35:
  /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here
    446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) {
        |                          ^~~~~~~~~~~~~~~~~~~~~~~~~~
  cc1: all warnings being treated as errors
  error: command '/usr/bin/x86_64-linux-gnu-gcc' failed with exit code 1
  
  ----------------------------------------
  Failed building wheel for pandas
  Running setup.py clean for pandas
  Running setup.py bdist_wheel for pillow ... error
  Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-t6avqa7g/pillow/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-svlivdp3 --python-tag cp39:
  running bdist_wheel
  running build
  running build_py
  Generating grammar tables from /usr/lib/python3.9/lib2to3/Grammar.txt
  Generating grammar tables from /usr/lib/python3.9/lib2to3/PatternGrammar.txt
  creating build
  creating build/lib.linux-x86_64-3.9
  creating build/lib.linux-x86_64-3.9/PIL
 [...]
  no previously-included directories found matching '.ci'
  writing manifest file 'src/Pillow.egg-info/SOURCES.txt'
  running build_ext
  
  
  The headers or library files could not be found for jpeg,
  a required dependency when compiling Pillow from source.
  
  Please see the install instructions at:
     https://pillow.readthedocs.io/en/latest/installation.html
  
  Traceback (most recent call last):
    File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 993, in <module>
      setup(
    File "/home/admin/chainhammer/env/lib/python3.9/site-packages/setuptools/__init__.py", line 162, in setup
      return distutils.core.setup(**attrs)
    File "/usr/lib/python3.9/distutils/core.py", line 148, in setup
      dist.run_commands()
    File "/usr/lib/python3.9/distutils/dist.py", line 966, in run_commands
      self.run_command(cmd)
    File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command
      cmd_obj.run()
    File "/home/admin/chainhammer/env/lib/python3.9/site-packages/wheel/bdist_wheel.py", line 325, in run
      self.run_command("build")
    File "/usr/lib/python3.9/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command
      cmd_obj.run()
    File "/usr/lib/python3.9/distutils/command/build.py", line 135, in run
      self.run_command(cmd_name)
    File "/usr/lib/python3.9/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.9/distutils/dist.py", line 985, in run_command
      cmd_obj.run()
    File "/home/admin/chainhammer/env/lib/python3.9/site-packages/setuptools/command/build_ext.py", line 84, in run
      _build_ext.run(self)
    File "/usr/lib/python3.9/distutils/command/build_ext.py", line 340, in run
      self.build_extensions()
    File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 808, in build_extensions
      raise RequiredDependencyException(f)
  __main__.RequiredDependencyException: jpeg
  
  During handling of the above exception, another exception occurred:
  
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/tmp/pip-install-t6avqa7g/pillow/setup.py", line 1010, in <module>
      raise RequiredDependencyException(msg)
  __main__.RequiredDependencyException:
  
  The headers or library files could not be found for jpeg,
  a required dependency when compiling Pillow from source.
  
  Please see the install instructions at:
     https://pillow.readthedocs.io/en/latest/installation.html
  
  
  
  ----------------------------------------
  Failed building wheel for pillow
  Running setup.py clean for pillow
Failed to build pandas pillow
eth-utils 1.10.0 has requirement eth-hash<0.4.0,>=0.3.1, but you'll have eth-hash 0.5.1 which is incompatible.
eth-rlp 0.3.0 has requirement eth-utils<3,>=2.0.0, but you'll have eth-utils 1.10.0 which is incompatible.
jupyter-console 6.6.3 has requirement ipykernel>=6.14, but you'll have ipykernel 5.1.0 which is incompatible.
Installing collected packages: semantic-version, py-solc, eth-typing, pycryptodome, eth-hash, toolz, cytoolz, eth-utils, six, parsimonious, eth-abi, lru-dict, eth-keys, eth-keyfile, hexbytes, attrdict, rlp, eth-rlp, eth-account, websockets, idna, urllib3, chardet, certifi, requests, web3, MarkupSafe, Werkzeug, click, PyYAML, repoze.lru, pbkdf2, scrypt, bitcoin, pyethash, pysha3, pycparser, cffi, secp256k1, ethereum, json-rpc, eth-testrpc, python-dateutil, pytz, numpy, pandas, kiwisolver, pillow, pyparsing, cycler, matplotlib, py, atomicwrites, attrs, more-itertools, pluggy, pytest, coverage, pytest-cov, pickleshare, backcall, ptyprocess, pexpect, asttokens, pure-eval, executing, stack-data, decorator, pygments, parso, jedi, traitlets, wcwidth, prompt-toolkit, matplotlib-inline, ipython, zipp, importlib-metadata, tornado, pyzmq, platformdirs, jupyter-core, jupyter-client, ipykernel, jupyterlab-widgets, widgetsnbextension, ipywidgets, terminado, argon2-cffi-bindings, argon2-cffi, pyrsistent, jsonschema, fastjsonschema, nbformat, Send2Trash, jinja2, defusedxml, nbclient, soupsieve, beautifulsoup4, mistune, pandocfilters, jupyterlab-pygments, webencodings, bleach, packaging, tinycss2, nbconvert, nest-asyncio, sniffio, anyio, websocket-client, jupyter-server-terminals, prometheus-client, rfc3986-validator, python-json-logger, rfc3339-validator, jupyter-events, jupyter-server, notebook-shim, ipython-genutils, nbclassic, notebook, jupyter-console, qtpy, qtconsole, jupyter
  Running setup.py install for pandas ... error
    Complete output from command /home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-record-aq3utvkw/install-record.txt --single-version-externally-managed --compile --install-headers /home/admin/chainhammer/env/include/site/python3.9/pandas:
    /tmp/pip-install-t6avqa7g/pandas/setup.py:45: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
      _CYTHON_INSTALLED = _CYTHON_VERSION >= LooseVersion(min_cython_ver)
    /tmp/pip-install-t6avqa7g/pandas/setup.py:491: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
      if np.__version__ < LooseVersion("1.16.0"):
    Compiling pandas/_libs/algos.pyx because it changed.
    Compiling pandas/_libs/groupby.pyx because it changed.
    Compiling pandas/_libs/hashing.pyx because it changed.
    Compiling pandas/_libs/hashtable.pyx because it changed.
    Compiling pandas/_libs/index.pyx because it changed.
    Compiling pandas/_libs/indexing.pyx because it changed.
    Compiling pandas/_libs/internals.pyx because it changed.
    Compiling pandas/_libs/interval.pyx because it changed.
    Compiling pandas/_libs/join.pyx because it changed.
    Compiling pandas/_libs/lib.pyx because it changed.
    Compiling pandas/_libs/missing.pyx because it changed.
    Compiling pandas/_libs/parsers.pyx because it changed.
    Compiling pandas/_libs/reduction.pyx because it changed.
    Compiling pandas/_libs/ops.pyx because it changed.
    Compiling pandas/_libs/ops_dispatch.pyx because it changed.
    Compiling pandas/_libs/properties.pyx because it changed.
    Compiling pandas/_libs/reshape.pyx because it changed.
    Compiling pandas/_libs/sparse.pyx because it changed.
    Compiling pandas/_libs/tslib.pyx because it changed.
    Compiling pandas/_libs/tslibs/base.pyx because it changed.
    Compiling pandas/_libs/tslibs/ccalendar.pyx because it changed.
    Compiling pandas/_libs/tslibs/dtypes.pyx because it changed.
    Compiling pandas/_libs/tslibs/conversion.pyx because it changed.
    Compiling pandas/_libs/tslibs/fields.pyx because it changed.
    Compiling pandas/_libs/tslibs/nattype.pyx because it changed.
    Compiling pandas/_libs/tslibs/np_datetime.pyx because it changed.
    Compiling pandas/_libs/tslibs/offsets.pyx because it changed.
    Compiling pandas/_libs/tslibs/parsing.pyx because it changed.
    Compiling pandas/_libs/tslibs/period.pyx because it changed.
    Compiling pandas/_libs/tslibs/strptime.pyx because it changed.
    Compiling pandas/_libs/tslibs/timedeltas.pyx because it changed.
    Compiling pandas/_libs/tslibs/timestamps.pyx because it changed.
    Compiling pandas/_libs/tslibs/timezones.pyx because it changed.
    Compiling pandas/_libs/tslibs/tzconversion.pyx because it changed.
    Compiling pandas/_libs/tslibs/vectorized.pyx because it changed.
    Compiling pandas/_libs/testing.pyx because it changed.
    Compiling pandas/_libs/window/aggregations.pyx because it changed.
    Compiling pandas/_libs/window/indexers.pyx because it changed.
    Compiling pandas/_libs/writers.pyx because it changed.
    Compiling pandas/io/sas/sas.pyx because it changed.
    [ 1/40] Cythonizing pandas/_libs/algos.pyx
    [ 2/40] Cythonizing pandas/_libs/groupby.pyx
    warning: pandas/_libs/groupby.pyx:1134:26: Unreachable code
    [ 3/40] Cythonizing pandas/_libs/hashing.pyx
    [ 4/40] Cythonizing pandas/_libs/hashtable.pyx
    [ 5/40] Cythonizing pandas/_libs/index.pyx
    [ 6/40] Cythonizing pandas/_libs/indexing.pyx
    [ 7/40] Cythonizing pandas/_libs/internals.pyx
    [ 8/40] Cythonizing pandas/_libs/interval.pyx
    [ 9/40] Cythonizing pandas/_libs/join.pyx
    [10/40] Cythonizing pandas/_libs/lib.pyx
    [11/40] Cythonizing pandas/_libs/missing.pyx
    [12/40] Cythonizing pandas/_libs/ops.pyx
    [13/40] Cythonizing pandas/_libs/ops_dispatch.pyx
    [14/40] Cythonizing pandas/_libs/parsers.pyx
    [15/40] Cythonizing pandas/_libs/properties.pyx
    [16/40] Cythonizing pandas/_libs/reduction.pyx
    [17/40] Cythonizing pandas/_libs/reshape.pyx
    [18/40] Cythonizing pandas/_libs/sparse.pyx
    [19/40] Cythonizing pandas/_libs/testing.pyx
    [20/40] Cythonizing pandas/_libs/tslib.pyx
    [21/40] Cythonizing pandas/_libs/tslibs/base.pyx
    [22/40] Cythonizing pandas/_libs/tslibs/ccalendar.pyx
    [23/40] Cythonizing pandas/_libs/tslibs/conversion.pyx
    [24/40] Cythonizing pandas/_libs/tslibs/dtypes.pyx
    [25/40] Cythonizing pandas/_libs/tslibs/fields.pyx
    [26/40] Cythonizing pandas/_libs/tslibs/nattype.pyx
    [27/40] Cythonizing pandas/_libs/tslibs/np_datetime.pyx
    [28/40] Cythonizing pandas/_libs/tslibs/offsets.pyx
    [29/40] Cythonizing pandas/_libs/tslibs/parsing.pyx
    [30/40] Cythonizing pandas/_libs/tslibs/period.pyx
    [31/40] Cythonizing pandas/_libs/tslibs/strptime.pyx
    [32/40] Cythonizing pandas/_libs/tslibs/timedeltas.pyx
    [33/40] Cythonizing pandas/_libs/tslibs/timestamps.pyx
    [34/40] Cythonizing pandas/_libs/tslibs/timezones.pyx
    [35/40] Cythonizing pandas/_libs/tslibs/tzconversion.pyx
    [36/40] Cythonizing pandas/_libs/tslibs/vectorized.pyx
    [37/40] Cythonizing pandas/_libs/window/aggregations.pyx
    [38/40] Cythonizing pandas/_libs/window/indexers.pyx
    [39/40] Cythonizing pandas/_libs/writers.pyx
    [40/40] Cythonizing pandas/io/sas/sas.pyx
    /tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/setuptools/__init__.py:85: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated. Requirements should be satisfied by a PEP 517 installer. If you are using pip, you can try `pip install --use-pep517`.
      dist.fetch_build_eggs(dist.setup_requires)
    running install
   [...]
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/testing.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/testing.cpython-39-x86_64-linux-gnu.so
    building 'pandas._libs.window.aggregations' extension
    creating build/temp.linux-x86_64-cpython-39/pandas/_libs/window
    x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -Ipandas/_libs/window -I./pandas/_libs -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/window/aggregations.cpp -o build/temp.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.o -Werror
    x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/window/aggregations.cpython-39-x86_64-linux-gnu.so
    building 'pandas._libs.window.indexers' extension
    x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/window/indexers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/window/indexers.o -Werror
    x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-39/pandas/_libs/window/indexers.o -L/usr/lib -o build/lib.linux-x86_64-cpython-39/pandas/_libs/window/indexers.cpython-39-x86_64-linux-gnu.so
    building 'pandas._libs.writers' extension
    x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -ffile-prefix-map=/build/python3.9-RNBry6/python3.9-3.9.2=. -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNPY_NO_DEPRECATED_API=0 -I/tmp/pip-build-env-wb4l_khi/lib/python3.9/site-packages/numpy/core/include -I/home/admin/chainhammer/env/include -I/usr/include/python3.9 -c pandas/_libs/writers.c -o build/temp.linux-x86_64-cpython-39/pandas/_libs/writers.o -Werror
    pandas/_libs/writers.c: In function ‘__pyx_f_6pandas_5_libs_7writers_word_len’:
    pandas/_libs/writers.c:5092:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations]
     5092 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
          |     ^~~~~~~~~
    In file included from /usr/include/python3.9/unicodeobject.h:1026,
                     from /usr/include/python3.9/Python.h:97,
                     from pandas/_libs/writers.c:38:
    /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here
      446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) {
          |                          ^~~~~~~~~~~~~~~~~~~~~~~~~~
    pandas/_libs/writers.c:5092:5: error: ‘PyUnicode_AsUnicode’ is deprecated [-Werror=deprecated-declarations]
     5092 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
          |     ^~~~~~~~~
    In file included from /usr/include/python3.9/unicodeobject.h:1026,
                     from /usr/include/python3.9/Python.h:97,
                     from pandas/_libs/writers.c:38:
    /usr/include/python3.9/cpython/unicodeobject.h:580:45: note: declared here
      580 | Py_DEPRECATED(3.3) PyAPI_FUNC(Py_UNICODE *) PyUnicode_AsUnicode(
          |                                             ^~~~~~~~~~~~~~~~~~~
    pandas/_libs/writers.c:5092:5: error: ‘_PyUnicode_get_wstr_length’ is deprecated [-Werror=deprecated-declarations]
     5092 |     __pyx_v_l = PyUnicode_GET_SIZE(__pyx_v_val);
          |     ^~~~~~~~~
    In file included from /usr/include/python3.9/unicodeobject.h:1026,
                     from /usr/include/python3.9/Python.h:97,
                     from pandas/_libs/writers.c:38:
    /usr/include/python3.9/cpython/unicodeobject.h:446:26: note: declared here
      446 | static inline Py_ssize_t _PyUnicode_get_wstr_length(PyObject *op) {
          |                          ^~~~~~~~~~~~~~~~~~~~~~~~~~
    cc1: all warnings being treated as errors
    error: command '/usr/bin/x86_64-linux-gnu-gcc' failed with exit code 1
    
    ----------------------------------------
Command "/home/admin/chainhammer/env/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-t6avqa7g/pandas/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-record-aq3utvkw/install-record.txt --single-version-externally-managed --compile --install-headers /home/admin/chainhammer/env/include/site/python3.9/pandas" failed with error code 1 in /tmp/pip-install-t6avqa7g/pandas/
You are using pip version 18.1, however version 23.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
++ echo

++ ipython kernel install --user --name=Python.3.py3eth
scripts/install-virtualenv.sh: line 36: ipython: command not found
++ echo

++ set +x


Hyperledger Besu / Pantheon testing

Any plans to include and test the Hyperledger Besu/Pantheon client with Clique and/or IBFT?
I'd be very curious to see how it compares to the others.

ValueError: max() arg is an empty sequence if there is only 1 filled block

when there are very few transactions and they all fit into one block:

Send 200 transactions 
...
block 3 | new #TX 200 / 4000 ms =  50.0 TPS_current | total: #TX  201 /  4.6 s =  43.5 TPS_average (peak  is  43.5 TPS_average)
block 4 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 /  8.0 s =  25.1 TPS_average (peak  is  25.1 TPS_average)
block 5 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 11.9 s =  16.8 TPS_average (peak  is  16.8 TPS_average)
block 6 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 16.2 s =  12.4 TPS_average (peak  is  12.4 TPS_average)
block 7 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 19.8 s =  10.1 TPS_average (peak was  12.4 TPS_average)
block 8 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 24.1 s =   8.3 TPS_average (peak was  12.4 TPS_average)
block 9 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 28.0 s =   7.2 TPS_average (peak was  12.4 TPS_average)
block 10 | new #TX   0 / 4000 ms =   0.0 TPS_current | total: #TX  201 / 32.0 s =   6.3 TPS_average (peak was  12.4 TPS_average)
...
=============================
= blocksDB_diagramming.py
=============================
...
from block 3 to block 3, with 10 empty blocks afterwards

then there is a problem:

./blocksDB_diagramming.py:422: RuntimeWarning: invalid value encountered in double_scalars
  tps=(txs/duration)

Traceback (most recent call last):
...
    blMin, blMax = min(dfs["blocknumber"])+1, max(dfs["blocknumber"][:lastFilledBlock_index])
ValueError: max() arg is an empty sequence

details:

peak TPS single block:
    blocknumber  TPS_1blk  TPS_3blks  TPS_5blks  TPS_10blks  txcount   size  gasUsed  gasLimit   timestamp  blocktime
2             5       0.0        NaN        NaN         NaN        0    586        0  40000000  1551190348        4.0
3             6       0.0        NaN        NaN         NaN        0    586        0  40000000  1551190352        4.0
4             7       0.0        0.0        NaN         NaN        0    586        0  40000000  1551190356        4.0
5             8       0.0        0.0        NaN         NaN        0    586        0  40000000  1551190360        4.0
6             9       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190364        4.0
7            10       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190368        4.0
8            11       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190372        4.0
9            12       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190376        4.0
10           13       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190380        4.0
0             3       NaN        NaN        NaN         NaN      200  27462  5354823  40000000  1551190340        NaN

peak TPS over ten blocks:
   blocknumber  TPS_1blk  TPS_3blks  TPS_5blks  TPS_10blks  txcount   size  gasUsed  gasLimit   timestamp  blocktime
0            3       NaN        NaN        NaN         NaN      200  27462  5354823  40000000  1551190340        NaN
1            4       NaN        NaN        NaN         NaN        0    586        0  40000000  1551190344        NaN
2            5       0.0        NaN        NaN         NaN        0    586        0  40000000  1551190348        4.0
3            6       0.0        NaN        NaN         NaN        0    586        0  40000000  1551190352        4.0
4            7       0.0        0.0        NaN         NaN        0    586        0  40000000  1551190356        4.0
5            8       0.0        0.0        NaN         NaN        0    586        0  40000000  1551190360        4.0
6            9       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190364        4.0
7           10       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190368        4.0
8           11       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190372        4.0
9           12       0.0        0.0        0.0         NaN        0    586        0  40000000  1551190376        4.0

Single block, vs averaged over 10 blocks:
peak( TPS_1blk) = 0.00 
peak(TPS_10blk) = nan

./blocksDB_diagramming.py:422: RuntimeWarning: invalid value encountered in double_scalars
  tps=(txs/duration)
second to last experiment block, averaging:
blocks 3-3, timestamps 1551190340-1551190340, duration 0 seconds, txcount 0, tps nan

Traceback (most recent call last):
  File "./blocksDB_diagramming.py", line 713, in <module>
    load_prepare_plot_save(*params)
  File "./blocksDB_diagramming.py", line 633, in load_prepare_plot_save
    emptyBlocks=EMPTY_BLOCKS)
  File "./blocksDB_diagramming.py", line 562, in diagrams
    tpsAv = tps_plotter(axes[0,0], dfs, blockFrom, blockTo, emptyBlocks)
  File "./blocksDB_diagramming.py", line 488, in tps_plotter
    avgLine(ax, dfs, emptyBlocks, avg, avgTxt)
  File "./blocksDB_diagramming.py", line 451, in avgLine
    blMin, blMax = min(dfs["blocknumber"])+1, max(dfs["blocknumber"][:lastFilledBlock_index])
ValueError: max() arg is an empty sequence

"./blocksDB_diagramming.py $DBFILE $INFOWORD ../$INFOFILE" command filed with exit code 0.
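One possible guard (a suggestion, not the author's fix): Python's built-in max() accepts a default= keyword, which avoids the ValueError when the slice is empty. The variable names below mirror the traceback, but the data is purely illustrative.

```python
# Illustrative data: one filled block (number 3) followed by empty blocks,
# as in the report above; names mirror blocksDB_diagramming.py.
blocknumbers = [3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
lastFilledBlock_index = 0                      # only block 3 carried transactions

tail = blocknumbers[:lastFilledBlock_index]    # empty slice -> bare max() would raise
blMax = max(tail, default=blocknumbers[0])     # default= falls back to the first block
print(blMax)  # 3
```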

A few ideas to possibly improve performance

  1. Use WebSockets instead of HTTP. When I was doing benchmarking with the Node Web3 library, I saw huge gains when I switched to WebSockets.

  2. Have you tried using multiprocessing instead of threading? You may already know this, but Python threads can still block each other due to the global interpreter lock. I'm not sure if this is happening, but it would be interesting to see how multiprocessing compares.

  3. Send from multiple addresses at once. The theory here is that the node won't have to wait for the previous transaction to complete before it can calculate the next transaction's nonce.

python version

Hi, can I check what Python version you are using for your benchmarks? I run into many errors when installing the dependencies in the virtualenv. It failed with Python 3.5.2, and I am currently also failing with Python 3.6.6 when running ./deploy.py and also the other .py files.

Parity Benchmarking

Hi all, I am quite impressed from your work here. It is very nice to have such a tool.

I have just a simple question on Parity benchmarking (it is the one I am planning to use so far): why are you sending the entire txs workload to just ONE node, exposed on port 8545? What if we distribute the load over more than one node?
Your actual configuration has 4 validators (3 + 1 exposed to the "outside"), and then you are loading transactions onto that one node. This could be a bottleneck.

Moreover, why do you not use one machine for the blockchain network and another for the workload?

Thank you
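To illustrate the commenter's point (this is not part of chainhammer): the workload could be spread round-robin over several nodes' RPC endpoints instead of a single port 8545. The URLs below are hypothetical.

```python
from itertools import cycle

# Hypothetical RPC endpoints of the 4 validators; in the setup described
# above, only the first one is exposed to the load generator.
ENDPOINTS = cycle(["http://127.0.0.1:8545",
                   "http://127.0.0.1:8546",
                   "http://127.0.0.1:8547",
                   "http://127.0.0.1:8548"])

def next_endpoint():
    """Pick the next node, so consecutive transactions hit different nodes."""
    return next(ENDPOINTS)

print(next_endpoint())  # http://127.0.0.1:8545
```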

waiting for 10 empty blocks - how to disable

Hello there,
I am trying to set up the quorum 7 nodes example using docker-compose, and trying to use the chainhammer quorum scripts. Whenever I try to start a listener, it says connection refused. Is that because of using docker as the tool?
How do we use the chainhammer scripts for a quorum 7 nodes example set up using docker rather than vagrant?

Cannot find img or diagrams folder

Operating System: Debian 9

Details

After running chainhammer i.e.

CH_TXS=40000 CH_THREADING="threaded2 300" ./run.sh "Quorum_t2.xlarge_8Mgas_5s_Blocktime"

the program says the diagrams should be available here:
img/Quorum_t2.xlarge_8Mgas_5s_Blocktime-20191001-1020_blks219-237.png
but I cannot seem to find them.

I have also checked under reader but can't find a diagram file as the README.md suggests

'method handler crashed' in deploy.py

Hello there,
When I tried running ./run.sh for my vagrant-based quorum setup, I saw the following error:
    contract_CompileDeploySave(contract_source_file=FILE_CONTRACT_SOURCE)
  File "./deploy.py", line 120, in contract_CompileDeploySave
    contractAddress = deployContract(contract_interface)
  File "./deploy.py", line 70, in deployContract
    tx_hash = w3.toHex( myContract.constructor().transact() )
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/utils/decorators.py", line 14, in _wrapper
    return self.method(obj, *args, **kwargs)
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/contract.py", line 842, in transact
    return self.web3.eth.sendTransaction(transact_transaction)
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/eth.py", line 262, in sendTransaction
    get_buffered_gas_estimate(self.web3, transaction),
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/utils/transactions.py", line 84, in get_buffered_gas_estimate
    gas_estimate = web3.eth.estimateGas(gas_estimate_transaction)
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/eth.py", line 303, in estimateGas
    [transaction],
  File "/home/kor11137702/testbed/chainhammer/chainhammer/env/lib/python3.6/site-packages/web3/manager.py", line 110, in request_blocking
    raise ValueError(response["error"])
ValueError: {'code': -32000, 'message': 'method handler crashed'}

The log file using tail -f logs/deploy.py.log:

tail: logs/deploy.py.log: file truncated
versions: web3 4.3.0, py-solc: 2.1.0, solc 0.4.25+commit.59dbf8f1.Linux.gpp, testrpc 1.3.4, python 3.6.9 (default, Oct 8 2020, 12:12:24) [GCC 8.4.0]
web3 connection established, blockNumber = 556, node version string = Geth/v1.9.7-stable-9339be03(quorum-v2.6.0)/linux-amd64/go1.13.10
first account of node is 0xed9d02e382b34818e88B88a309c7fe71E65f419d, balance is 1000000000 Ether
WARN: raft consensus did report timestamps in nanoseconds. Is that still the case?
nodeName: Quorum, nodeType: Geth, nodeVersion: v1.9.7-stable-9339be03(quorum-v2.6.0), consensus: raft, network: 10, chainName: ???, chainId: -1
unlocked: True
unlock: True
Is there something I am missing?
I have modified the RPC address as discussed in the previous #23.
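The traceback shows the crash happening inside the eth_estimateGas call (get_buffered_gas_estimate). One workaround worth trying, purely as a suggestion: in web3.py v4, sendTransaction only calls estimateGas when no "gas" field is present, so passing an explicit gas limit skips the crashing RPC call. The value below is an arbitrary guess, not a recommended limit.

```python
# Arbitrary fixed gas limit; with "gas" present in the transaction dict,
# web3.py v4 does not call eth_estimateGas before sending, which is
# where the "method handler crashed" error occurred.
tx_params = {"gas": 3_000_000}
# tx_hash = w3.toHex(myContract.constructor().transact(tx_params))
print(tx_params["gas"])  # 3000000
```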

ubuntu

while running

scripts/install.sh

getting error

E: Unable to locate package libsecp256k1-dev

and installation stops...

Need to improve to a more realistic Web3 transaction

Chainhammer is a gem! Thank you so much, @drandreaskrueger

I am adapting the bits and pieces of Chainhammer to do a TPS measurement of a Polkadot Moonbeam parachain node. Currently, chainhammer relies on the concept of an unlocked account, where transaction signing happens in a validator node. In real usage, however, transaction signing happens in a wallet. Doing it that way would, in my opinion, make Chainhammer usable on other chains too.

On the Moonbeam chain, however, I am facing an issue with the call nonce = w3.eth.get_transaction_count(account). I now need to call tx_receipt = w3.eth.wait_for_transaction_receipt(txh) next, and this resulted in a very low 8 TPS on Moonbeam. Without the wait call, I get a "nonce too low" error. Note that this is not a predicament in Chainhammer itself, because Chainhammer uses an unlocked node account and therefore avoids the need for nonce retrieval. The concept of an unlocked account is absent in a Moonbeam parachain, so I have to sign the transaction in the Web3 client and send it via send_raw_transaction().

Also, in my initial examination of Moonbeam, it does not seem to have a transaction queue like openethereum does. Hence, nonces must be correctly sequenced for transactions to succeed. I hope to hear your comments and tips on how to tackle this, to improve Chainhammer and bring its use to the Polkadot ecosystem. Thanks.
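One common pattern for the raw-transaction path, sketched here under assumptions (web3.py v5+ method names, a single sending account, no competing senders): fetch the on-chain nonce once, then increment locally, so wait_for_transaction_receipt is no longer needed between sends.

```python
from itertools import count
from threading import Lock

class LocalNonce:
    """Hands out sequential nonces without querying the node each time."""
    def __init__(self, start):
        self._counter = count(start)
        self._lock = Lock()          # safe if several sender threads share it

    def next(self):
        with self._lock:
            return next(self._counter)

# Usage sketch (web3.py names assumed, not verified against Moonbeam):
# nonces = LocalNonce(w3.eth.get_transaction_count(account))
# tx["nonce"] = nonces.next()
# signed = w3.eth.account.sign_transaction(tx, private_key)
# w3.eth.send_raw_transaction(signed.rawTransaction)

n = LocalNonce(7)
print(n.next(), n.next(), n.next())  # 7 8 9
```

The caveat: if any transaction is dropped by the node, the local counter drifts ahead of the chain and must be re-synced from get_transaction_count.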

Comments on new tool.

Hi,

I am trying to reach you for comments on a tool we have built for testing blockchains. We have integrated chainhammer into it and would love your comments. I do not want to litter your issues page with links to another repo, and would therefore be deeply appreciative of a means of reaching you.

some patched files to run it on Ubuntu 20

Hello everyone
I am running the 7nodes example locally on an Ubuntu virtual machine, and I need to test performance using chainhammer.

When I run ./tps.py get:
alzuharey@alzuharey-VirtualBox:~/CH/hammer$ ./tps.py get

Dependencies unavailable. Start virtualenv first!

and then run alzuharey@alzuharey-VirtualBox:~/CH$ scripts/install.sh docker ubuntu
compile and install geth v1.9.6
Press enter to continue
go version go1.13.8 linux/amd64
get, compile and install geth - patience please
download repo into /home/alzuharey/go

cd /home/alzuharey/go/src/github.com/ethereum/go-ethereum; git pull --ff-only

You are not currently on a branch.
Please specify which branch you want to merge with.
See git-pull(1) for details.
git pull
package github.com/ethereum/go-ethereum: exit status 1
"install_chapter scripts/install-go.sh" command filed with exit code 0.
alzuharey@alzuharey-VirtualBox:~/CH$

When I run ./tps.py again, I get the same error:

lzuharey@alzuharey-VirtualBox:~/CH/hammer$ ./tps.py

Dependencies unavailable. Start virtualenv first!

Can anyone help me? Thanks.

Script hangs indefinitely

Operating System and the app: Ubuntu 18.04 LTS

I have commented out the bits of the code that run geth, as this crashes, leaving me just with Parity.

Command: ./run-all_small.sh
Expected behaviour: script runs for parity
Actual Behaviour: Script hangs at

=============================
= is_up.py
=============================
Loops until the node is answering on the expected port

Full Output:

"CH_TXS=2000 CH_THREADING="sequential" ./run.sh "$CH_MACHINE-Parity-aura" parity" command filed with exit code 0.
ubuntu@ip-172-31-17-232:~/CH$ ./run-all_small.sh


@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@ TEST: runs on ALL networks, but SMALL number of transactions
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

machine name: t2.micro

Skipping Quorum, good on small machines, if you do want it, set CH_QUORUM=true




@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@ t2.micro-TestRPC
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

=============================
= chainhammer v52 - run all =
=============================

infoword: t2.micro-TestRPC
number of transactions: 400
concurrency algo: sequential

infofile: hammer/last-experiment.json
blocks database: temp.db
log files:
logs/tps.py.log
logs/deploy.py.log
logs/send.py.log

=============================
= start
=============================
Started network, call this command for watching the log file:
tail -n 10 -f ./logs/network.log

=============================
= activate virtualenv
=============================

Python 3.6.8

=============================
= is_up.py
=============================
Loops until the node is answering on the expected port.
Great, node is available now.

=============================
= tps.py
=============================
start listener tps.py, show here but also log into file logs/tps.py.log
this ENDS after send.py below writes a new INFOFILE hammer/last-experiment.json

=============================
= sleep
=============================
to have tps.py say its thing before deploy.py starts printing

versions: web3 4.8.1, py-solc: 3.2.0, solc 0.4.25+commit.59dbf8f1.Linux.gpp, testrpc 1.3.5, python 3.6.8 (default, Jan 14 2019, 11:02:34) [GCC 8.0.1 20180414 (experimental) [trunk revision 259383]]
web3 connection established, blockNumber = 0, node version string =  TestRPC/1.3.5/linux/python3.6.8
first account of node is 0x82A978B3f5962A5b0957d9ee9eEf472EE55B42F1, balance is 1000000 Ether
WARN: TestRPC has odd timestamp units, check 'tps.timestampToSeconds()' for updates
nodeName: TestRPC, nodeType: TestRPC, nodeVersion: 1.3.5, consensus: ???, network: 1, chainName: ???, chainId: -1

Block  0  - waiting for something to happen
(filedate 1565918812) last contract address: 0xc305c901078781C232A2a521C2aF7980f8385ee9

=============================
= deploy.py
=============================
Deploy the smartContract, deploy.py will then trigger tps.py to START counting.
Logging into file logs/deploy.py.log.

(filedate 1565919300) new contract address: 0xc305c901078781C232A2a521C2aF7980f8385ee9

blocknumber_start_here = 0
starting timer, at block 0 which has  1  transactions; at epochtime 1565919300.8035884

=============================
= send.py
=============================
Send 400 transactions with non/concurrency algo 'sequential', plus possibly wait 10 more blocks.
Then send.py triggers tps.py to end counting. Logging all into file logs/send.py.log.

block 1 | new #TX   1 /   54 ms =  18.6 TPS_current | total: #TX    2 /  0.7 s =   2.8 TPS_average (peak  is   2.8 TPS_average)
block 15 | new #TX  14 /  634 ms =  22.1 TPS_current | total: #TX   16 /  1.7 s =   9.6 TPS_average (peak  is   9.6 TPS_average)
block 42 | new #TX  27 / 1410 ms =  19.2 TPS_current | total: #TX   43 /  3.1 s =  13.9 TPS_average (peak  is  13.9 TPS_average)
block 82 | new #TX  40 / 2049 ms =  19.5 TPS_current | total: #TX   83 /  5.1 s =  16.2 TPS_average (peak  is  16.2 TPS_average)
block 134 | new #TX  52 / 2790 ms =  18.6 TPS_current | total: #TX  135 /  7.7 s =  17.6 TPS_average (peak  is  17.6 TPS_average)
block 198 | new #TX  64 / 3473 ms =  18.4 TPS_current | total: #TX  199 / 10.7 s =  18.7 TPS_average (peak  is  18.7 TPS_average)
block 275 | new #TX  77 / 3644 ms =  21.1 TPS_current | total: #TX  276 / 14.1 s =  19.6 TPS_average (peak  is  19.6 TPS_average)
block 364 | new #TX  89 / 3039 ms =  29.3 TPS_current | total: #TX  365 / 17.7 s =  20.7 TPS_average (peak  is  20.7 TPS_average)
block 400 | new #TX  36 / 1229 ms =  29.3 TPS_current | total: #TX  401 / 18.4 s =  21.8 TPS_average (peak  is  21.8 TPS_average)

=============================
= sleep 2
=============================
wait 2 second until also tps.py has written its results.

Received signal from send.py = updated INFOFILE.
Experiment ended! Current blocknumber = 400
Updated info file: last-experiment.json THE END.

=============================
= blocksDB_create.py
=============================
read blocks from node1 into SQL db
reading from DBFILE  temp.db
reading from INFOFILE  ../hammer/last-experiment.json
versions: web3 4.8.1, py-solc: 3.2.0, solc 0.4.25+commit.59dbf8f1.Linux.gpp, testrpc 1.3.5, python 3.6.8 (default, Jan 14 2019, 11:02:34) [GCC 8.0.1 20180414 (experimental) [trunk revision 259383]]
web3 connection established, blockNumber = 400, node version string =  TestRPC/1.3.5/linux/python3.6.8
first account of node is 0x82A978B3f5962A5b0957d9ee9eEf472EE55B42F1, balance is 1002005 Ether
WARN: TestRPC has odd timestamp units, check 'tps.timestampToSeconds()' for updates
nodeName: TestRPC, nodeType: TestRPC, nodeVersion: 1.3.5, consensus: ???, network: 1, chainName: ???, chainId: -1

Writing blocks information into temp.db
Reading blocks range from ../hammer/last-experiment.json

 1 ****************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
400 blocks took 0.93 seconds

Creating new DB temp.db

last one was:  INSERT INTO blocks VALUES (400,6882815.146341464,612,26755,4000000,1);


execute & commit 400 SQL statements (where 0 duplicates) into DB took 0.00 seconds

TABLE blocks has 400 rows
MIN(blocknumber), MAX(blocknumber) = [(1, 400)] 
done.

=============================
= blocksDB_diagramming.py
=============================
make time series diagrams from SQL db
using  DBFILE=temp.db  NAME_PREFIX=t2.micro-TestRPC
reading blocks range from ../hammer/last-experiment.json
from block 1 to block 400, with 0 empty blocks afterwards

sqlite3 version 2.6.0
pandas version 0.23.4
numpy version 1.17.0
matplotlib version 3.0.2
matplotlib backend agg

Reading blocks table from temp.db
DB table names:  ('blocks',)
TABLE blocks has 400 rows
MIN(blocknumber), MAX(blocknumber) = [(1, 400)] 
len(blocknumbers)= 400

complete between blocks 1 and 400.

txcount_sum 400
blocksize_max 612
txcount_max 1
txcount average per block 1.0
blocks_nonempty_count 400
txcount average per NONEMPTY blocks =  1.0


Columns added. Now:  ['blocknumber', 'timestamp', 'size', 'gasUsed', 'gasLimit', 'txcount', 'blocktime', 'TPS_1blk', 'TPS_3blks', 'TPS_5blks', 'TPS_10blks', 'GUPS_1blk', 'GUPS_3blks', 'GUPS_5blks', 'GLPS_1blk', 'GLPS_3blks', 'GLPS_5blks']

peak TPS single block:
     blocknumber   TPS_1blk  TPS_3blks  TPS_5blks  TPS_10blks  txcount  size  gasUsed  gasLimit     timestamp  blocktime
220          221  29.285715  18.636364  18.636364   18.636364        1   610    26691   4000000  6.882809e+06   0.034146
351          352  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
293          294  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882812e+06   0.034146
33            34  29.285715  18.636364  21.808511   20.098039        1   608    26691   4000000  6.882799e+06   0.034146
263          264  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882811e+06   0.034146
381          382  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882815e+06   0.034146
322          323  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882813e+06   0.034146
14            15  29.285715  21.206897  21.808511   20.918367        1   608    26691   4000000  6.882798e+06   0.034146
280          281  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882811e+06   0.034146
286          287  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882811e+06   0.034146

peak TPS over ten blocks:
     blocknumber   TPS_1blk  TPS_3blks  TPS_5blks  TPS_10blks  txcount  size  gasUsed  gasLimit     timestamp  blocktime
324          325  29.285714  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882813e+06   0.034146
298          299  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882812e+06   0.034146
358          359  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
356          357  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
355          356  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
354          355  29.285714  29.285714  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
353          354  29.285714  29.285715  29.285714   29.285714        1   611    26755   4000000  6.882814e+06   0.034146
352          353  29.285714  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
351          352  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882814e+06   0.034146
293          294  29.285715  29.285715  29.285714   29.285714        1   612    26755   4000000  6.882812e+06   0.034146

Single block, vs averaged over 10 blocks:
peak( TPS_1blk) = 29.29 
peak(TPS_10blk) = 29.29

second to last experiment block, averaging:
blocks 1-400, timestamps 6882796-6882815, duration 18 seconds, txcount 399, tps 21.8

averaged over whole experiment: 21.8 TPS
averaged (    "    ) blocksize: 610 bytes

diagrams saved to:  img/t2.micro-TestRPC-20190816-0135_blks1-400.png
Done.

=============================
= page_generator.py
=============================
Reading from INFOFILE ../hammer/last-experiment.json and from TPSLOG ../logs/tps.py.log
Page saved to:  ../results/runs/TestRPC_20190816-0135_txs400.md
Page saved to:  ../results/runs/TestRPC_20190816-0135_txs400.html

=============================
= stop
=============================
Stopping PID 1824
sleep 2
networks/testrpc-stop.sh: line 10:  1824 Killed                  unbuffer testrpc-py > $LOG 2>&1

Stopped network.

=============================
= Ready.
=============================
See that image, and those .md and .html pages.



@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@ t2.micro-Quorum
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Skipping Quorum, see very beginning of output of this script.



@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@ t2.micro-Parity-aura
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
No arguments supplied, assuming parity v1.11.11
Custom parity build set: v1.11.11
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  3856  100  3856    0     0   3910      0 --:--:-- --:--:-- --:--:--  3906
Release selected is: v1.11.11
Parity 1.11.11 already installed
patched docker-compose.yml file to use parity v1.11.11
patched for issue 60: stepDuration to 5
patched for issue 92
copied password

=============================
= chainhammer v52 - run all =
=============================

infoword: t2.micro-Parity-aura
number of transactions: 2000
concurrency algo: sequential

infofile: hammer/last-experiment.json
blocks database: temp.db
log files:
logs/tps.py.log
logs/deploy.py.log
logs/send.py.log

=============================
= start
=============================
Started network, call this command for watching the log file:
tail -n 10 -f logs/network.log


=============================
= activate virtualenv
=============================

Python 3.6.8

=============================
= is_up.py
=============================
Loops until the node is answering on the expected port.
