
py-scale-codec's Introduction

Polkascan Open-Source Application

Quick deployment (Use hosted Polkascan API endpoints)

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: Then build and start the Docker containers:

docker-compose -p kusama -f docker-compose.kusama-quick.yml up --build

Use public Substrate RPC endpoints

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: During the first run, let MySQL initialize first (wait about a minute):

docker-compose -p kusama -f docker-compose.kusama-public.yml up -d mysql

Step 7: Then build and start the other Docker containers:

docker-compose -p kusama -f docker-compose.kusama-public.yml up --build

Full deployment

The following steps will run a full Polkascan-stack that harvests blocks from a new local network.

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: During the first run, let MySQL initialize first (wait about a minute):

docker-compose -p kusama -f docker-compose.kusama-full.yml up -d mysql

Step 7: Then build and start the other Docker containers:

docker-compose -p kusama -f docker-compose.kusama-full.yml up --build

Links to applications

Other networks

Add custom types for Substrate Node Template

Cleanup Docker

Use the following commands with caution to clean up your Docker environment.

Prune unused data (stopped containers, unused networks, dangling images)

docker system prune

Prune all unused images as well (force)

docker system prune -a

Prune volumes

docker volume prune

API specification

The Polkascan API implements the https://jsonapi.org/ specification. An overview of available endpoints can be found here: https://github.com/polkascan/polkascan-pre-explorer-api/blob/master/app/main.py#L60

Troubleshooting

When certain blocks (or no blocks at all) are not being processed, the most likely cause is a missing or invalid type definition in the type registry.

To pinpoint which types are failing to decode, you can dive into Python:

import json
from scalecodec.type_registry import load_type_registry_file
from substrateinterface import SubstrateInterface

substrate = SubstrateInterface(
    url='ws://127.0.0.1:9944',
    type_registry_preset='substrate-node-template',
    type_registry=load_type_registry_file('harvester/app/type_registry/custom_types.json'),
)

block_hash = substrate.get_block_hash(block_id=3899710)

extrinsics = substrate.get_block_extrinsics(block_hash=block_hash)

print('Extrinsics:', json.dumps([e.value for e in extrinsics], indent=4))

events = substrate.get_events(block_hash)

print("Events:", json.dumps([e.value for e in events], indent=4))

py-scale-codec's People

Contributors

andelf, arjanz, codeforcer, dependabot[bot], emielsebastiaan, erussel, thewhaleking, valentunsergeev, wuminzhe


py-scale-codec's Issues

Unable to get events

Calling get_events with block_hash = '0x1e5a94476abf191c4c54a47d20744cc684d0a21ae6c2c51085774ead4f5cac39' (block number = 7708105), results in the following error:

../env/lib/python3.7/site-packages/substrateinterface/base.py:1344: in get_events
    storage_obj = self.query(module="System", storage_function="Events", block_hash=block_hash)
../env/lib/python3.7/site-packages/substrateinterface/base.py:1316: in query
    obj.decode()
../env/lib/python3.7/site-packages/scalecodec/base.py:658: in decode
    self.value_serialized = self.process()
../env/lib/python3.7/site-packages/scalecodec/types.py:796: in process
    element = self.process_type(self.sub_type, metadata=self.metadata)
../env/lib/python3.7/site-packages/scalecodec/base.py:741: in process_type
    obj.decode(check_remaining=False)
../env/lib/python3.7/site-packages/scalecodec/base.py:658: in decode
    self.value_serialized = self.process()
../env/lib/python3.7/site-packages/scalecodec/types.py:2500: in process
    value = super().process()
../env/lib/python3.7/site-packages/scalecodec/types.py:471: in process
    field_obj = self.process_type(data_type, metadata=self.metadata)
../env/lib/python3.7/site-packages/scalecodec/base.py:741: in process_type
    obj.decode(check_remaining=False)
../env/lib/python3.7/site-packages/scalecodec/base.py:658: in decode
    self.value_serialized = self.process()
../env/lib/python3.7/site-packages/scalecodec/types.py:2434: in process
    arg_type_obj = self.process_type(arg_type)
../env/lib/python3.7/site-packages/scalecodec/base.py:740: in process_type
    obj = self.runtime_config.create_scale_object(type_string, self.data, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <scalecodec.base.RuntimeConfigurationObject object at 0x7f473c2b3dd0>
type_string = 'xcm::latest::Outcome'
data = <ScaleBytes(data=0x2000000000000000c0d786090000000002000000010000002c01d0070000cd5f6d68c0e6aeae6e09a9821b9f0f699f065de...fe5f85a2fe671ba028b35be25c83fd58758ef5642051113c018d8360400000000000000000000000000020000000000000e270700000000000000)>
kwargs = {}, decoder_class = None

    def create_scale_object(self, type_string: str, data=None, **kwargs) -> 'ScaleType':
        """
    
        Returns
        -------
        ScaleType
        """
        decoder_class = self.get_decoder_class(type_string)
    
        if decoder_class:
            return decoder_class(data=data, **kwargs)
    
>       raise NotImplementedError('Decoder class for "{}" not found'.format(type_string))
E       NotImplementedError: Decoder class for "xcm::latest::Outcome" not found

../env/lib/python3.7/site-packages/scalecodec/base.py:158: NotImplementedError

DigestItem should be `type mapping`

In Substrate:

pub enum DigestItem {
	/// A pre-runtime digest.
	///
	/// These are messages from the consensus engine to the runtime, although
	/// the consensus engine can (and should) read them itself to avoid
	/// code and state duplication. It is erroneous for a runtime to produce
	/// these, but this is not (yet) checked.
	///
	/// NOTE: the runtime is not allowed to panic or fail in an `on_initialize`
	/// call if an expected `PreRuntime` digest is not present. It is the
	/// responsibility of a external block verifier to check this. Runtime API calls
	/// will initialize the block without pre-runtime digests, so initialization
	/// cannot fail when they are missing.
	PreRuntime(ConsensusEngineId, Vec<u8>),

	/// A message from the runtime to the consensus engine. This should *never*
	/// be generated by the native code of any consensus engine, but this is not
	/// checked (yet).
	Consensus(ConsensusEngineId, Vec<u8>),

	/// Put a Seal on it. This is only used by native code, and is never seen
	/// by runtimes.
	Seal(ConsensusEngineId, Vec<u8>),

	/// Some other thing. Unsupported and experimental.
	Other(Vec<u8>),

	/// An indication for the light clients that the runtime execution
	/// environment is updated.
	///
	/// Currently this is triggered when:
	/// 1. Runtime code blob is changed or
	/// 2. `heap_pages` value is changed.
	RuntimeEnvironmentUpdated,
}

but in py-scale-codec:

class LogDigest(Enum):

    value_list = ['Other', 'AuthoritiesChange', 'ChangesTrieRoot', 'SealV0', 'Consensus', 'Seal', 'PreRuntime']

    def __init__(self, data, **kwargs):
        self.log_type = None
        self.index_value = None
        super().__init__(data, **kwargs)

    def process(self):
        self.index = int(self.get_next_bytes(1).hex())
        self.index_value = self.value_list[self.index]
        self.log_type = self.process_type(self.value_list[self.index])

        return {'type': self.value_list[self.index], 'value': self.log_type.value}

Decoding "AccountInfo<Index, AccountData>"

Hello!
Why could this issue appear? I forked the substrate interface, and I am running into this:
scalecodec.exceptions.RemainingScaleBytesNotEmptyException: Decoding "AccountInfo<Index, AccountData>" - Current offset: 69 / length: 80

Getting KeyError: '0000' when attempting to decode events vector

What I'm trying to do

Decode events after making RPC call to state_getStorageAt

What's happening?

Getting KeyError: '0000' at the time of decoding.

Something to help reproduce.

result = "0x64000000000........ ........" # This is the output of `state_getStorageAt` call, which is correct. I verified it on https://polkadot.js.org/apps/

# Loaded the necessary types before this as I learned from README
events = runtime_config.create_scale_object(
    "Vec<EventRecord<Event, Hash>>", ScaleBytes(result), metadata=metadata,
)
print(events.decode())

Here's the full traceback

../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/base.py:660: in decode
    self.value_serialized = self.process()
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/types.py:787: in process
    element = self.process_type(self.sub_type, metadata=self.metadata)
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/base.py:743: in process_type
    obj.decode(check_remaining=False)
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/base.py:660: in decode
    self.value_serialized = self.process()
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/types.py:2504: in process
    value = super().process()
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/types.py:462: in process
    field_obj = self.process_type(data_type, metadata=self.metadata)
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/base.py:743: in process_type
    obj.decode(check_remaining=False)
../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/base.py:660: in decode
    self.value_serialized = self.process()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <GenericEvent(value=None)>

    def process(self):
    
        self.event_index = self.get_next_bytes(2).hex()
    
        # Decode attributes
>       self.event_module = self.metadata.event_index[self.event_index][0]
E       KeyError: '0000'

../../../.pyenv/versions/3.8.1/envs/substrateutils/lib/python3.8/site-packages/scalecodec/types.py:2432: KeyError
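The traceback itself shows where the lookup key comes from: the decoder reads two bytes at the current offset and uses their hex string as a key into the metadata's event index, so a KeyError for '0000' usually means the loaded metadata does not match the runtime that produced the storage value. A minimal sketch of that lookup (the input bytes are hypothetical):

```python
# Hypothetical two bytes read at the current decoding offset; their hex
# string mirrors what get_next_bytes(2).hex() returns in the traceback above.
data = bytes.fromhex("0000")

event_index_key = data[:2].hex()          # key used against metadata.event_index
module_index, event_index = data[0], data[1]

print(event_index_key, module_index, event_index)
```

If the key is missing from the metadata's event index, the metadata and the queried block belong to different runtime versions.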

Kusama type registry needs update

Hello. I believe the Kusama type registry lacks some new data. I'm trying to get block #9438247 (for example) and get an error.

from substrateinterface import SubstrateInterface
substrate = SubstrateInterface(
    url='wss://kusama-rpc.polkadot.io',
    ss58_format=2,
    type_registry_preset='kusama',
)
block = substrate.get_block(block_number=9438247)

Traceback (most recent call last):
  File "/home/substrate/venv/lib/python3.8/site-packages/scalecodec/types.py", line 1024, in process
    enum_type_mapping = self.type_mapping[self.index]
IndexError: list index out of range

I believe Kusama's type registry lacks new data from here (https://github.com/polkadot-js/api/blob/master/packages/types-known/src/spec/kusama.ts#L194).
For example, if I add the following types to the Kusama type registry, everything works and I get the desired block:

"MultiLocation": "MultiLocationV0",
"MultiAsset": "MultiAssetV0"

Problem adding custom types

If I want to create a new type registry file (e.g. joystream-new.json) to define custom types for a new chain, should I also create custom classes for my chain in types.py manually? Or is there a tool that can generate such custom classes into types.py?

No Type Registry for Karura Network

I am trying to use the py-substrate-interface module for interfacing with the Karura network.

I am unsure, however, how to create a type registry JSON set up in scale-codec's format.

Does anyone know how to find this type registry for Karura or how to generate from somewhere?

V14 Metadata decoder?

Any news on the implementation of a v14 metadata decoder? I've got a couple of implementations that are already breaking down due to:

  File "substrateinterface/base.py", line 801, in get_block_metadata
    metadata_decoder.decode()
  File "scalecodec/base.py", line 434, in decode
    self.value = self.process()
  File "scalecodec/metadata.py", line 35, in process
    self.version = self.process_type('Enum', value_list=[
  File "scalecodec/base.py", line 497, in process_type
    obj.decode(check_remaining=False)
  File "scalecodec/base.py", line 434, in decode
    self.value = self.process()
  File "scalecodec/types.py", line 969, in process
    raise ValueError("Index '{}' not present in Enum value list".format(self.index))
ValueError: Index '14' not present in Enum value list

'Set' needs 'bytes_length' to indicate the value length

The Set type is fixed at 8 bytes, but the byte length may be 1, 2, 4, 8 or 16.
So maybe add a bytes_length field to tell the decoder how to decode:

"WithdrawReasons": {
  "type": "set",
  "value_list": {
    "TransactionPayment": 1,
    "Transfer": 2,
    "Reserve": 4,
    "Fee": 8,
    "Tip": 16
  }
},

to

"WithdrawReasons": {
  "type": "set",
  "bytes_length": 8,
  "value_list": {
    "TransactionPayment": 1,
    "Transfer": 2,
    "Reserve": 4,
    "Fee": 8,
    "Tip": 16
  }
},
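A set value is a little-endian bitmask whose width the decoder has to know in advance, which is exactly what the proposed bytes_length conveys. A minimal, self-contained sketch of such a decoder (not py-scale-codec's actual implementation; the value list is the WithdrawReasons entry above):

```python
def decode_set(data: bytes, value_list: dict, bytes_length: int) -> list:
    """Decode a SCALE set: a bytes_length-wide little-endian bitmask."""
    bitmask = int.from_bytes(data[:bytes_length], "little")
    return [name for name, bit in value_list.items() if bitmask & bit]

withdraw_reasons = {
    "TransactionPayment": 1,
    "Transfer": 2,
    "Reserve": 4,
    "Fee": 8,
    "Tip": 16,
}

# 0x03 = TransactionPayment | Transfer, read as a 1-byte set
print(decode_set(b"\x03", withdraw_reasons, bytes_length=1))
```

With a hard-coded 8-byte width, the same decoder would consume seven bytes that belong to the next field, which is how the misdecoding described above manifests.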

Incorrect decoding of compact Perbill type

The Compact<Perbill> type is not being parsed correctly: it always expects 4 bytes instead of a variable number.

As a result, only values large enough to be encoded in 4 bytes are read correctly from chain, while the rest fail with Error: Decoding <U32> - No more bytes available (needed: 4 / total: 1).

This can be reproduced by querying the Staking.Validators map from Polkadot and printing out the type and value for each (key, value) tuple:

substrate = SubstrateInterface(url="wss://rpc.polkadot.io")
query = substrate.query_map("Staking", "Validators")
for key, value in query:
    print(type(key), type(value))
    print(key, value)

Sample of correct and incorrect decoding:

<class 'abc.scale_info::0'> <class 'NoneType'>
1zugcanAMwTjGakJthb9GNWpfesdFxpcZXW7GP94dgkS56u None
<class 'abc.scale_info::0'> <class 'abc.scale_info::153'>
1265yHoixkAt2GfEkYqP8nQ5gRCwMw6xGSscPQGM349t573q {'commission': 10000000, 'blocked': False}

By looking at the Polkadot metadata, the following information can be extracted:

ValidatorPrefs has type id 153; its commission field has type id 154.
Type id 154 is a Compact of type id 110.
Type id 110 is Perbill, which is a U32.

So type id 154 should be Compact<Perbill>, which is decoded the same way as Compact<U32>.
It is of note that the Compact<U32> type exists and has id 107.
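For reference, SCALE compact integers are variable-length: the two low bits of the first byte select single-byte, two-byte, four-byte, or big-integer mode. A self-contained sketch of a compact decoder (an illustration of the encoding, not py-scale-codec's actual code) shows why assuming a fixed 4 bytes breaks:

```python
def decode_compact(data: bytes) -> int:
    """Decode a SCALE compact-encoded unsigned integer."""
    mode = data[0] & 0b11
    if mode == 0b00:                 # single byte, value in the upper 6 bits
        return data[0] >> 2
    if mode == 0b01:                 # two bytes, little-endian, upper 14 bits
        return int.from_bytes(data[:2], "little") >> 2
    if mode == 0b10:                 # four bytes, little-endian, upper 30 bits
        return int.from_bytes(data[:4], "little") >> 2
    length = (data[0] >> 2) + 4      # big-integer mode: payload length follows
    return int.from_bytes(data[1:1 + length], "little")

print(decode_compact(bytes([0x04])))                    # 1 (one byte only)
print(decode_compact(bytes([0x02, 0x5A, 0x62, 0x02])))  # 10000000
```

Small commission values are encoded in one or two bytes, so a decoder that unconditionally reads four bytes runs out of input, producing the "needed: 4 / total: 1" error above.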

Updating polkadot runtime to v28

Thanks for all the work put into this repository.
Would it be possible to update to v28?

Separately, in general, what are the steps necessary to generate the JSONs, in case one wants to generate them manually?

Asking because we keep running into issues when decoding extrinsics from newer blocks that use v28.

ValueError: Pallet for index "45" not found

Unable to retrieve a block (only for recent blocks).

Code to reproduce:

from substrateinterface import SubstrateInterface

s = SubstrateInterface('wss://westend-rpc.polkadot.io')
print(s.get_block(s.get_block_hash(7762486)))

Traceback:

Traceback (most recent call last):
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 1878, in get_pallet_by_index
    return self.pallets[index]
IndexError: list index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "gt.py", line 85, in <module>
    block = s.get_block(s.get_block_hash(7762486))
  File ".../venv/lib/python3.8/site-packages/substrateinterface/base.py", line 2333, in get_block
    return self.__get_block_handler(
  File ".../venv/lib/python3.8/site-packages/substrateinterface/utils/caching.py", line 36, in wrapper
    return cached_func(*args, **kwargs)
  File ".../venv/lib/python3.8/site-packages/substrateinterface/base.py", line 2292, in __get_block_handler
    return decode_block(response['result']['block'])
  File ".../venv/lib/python3.8/site-packages/substrateinterface/base.py", line 2223, in decode_block
    extrinsic_decoder.decode()
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 658, in decode
    self.value_serialized = self.process()
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 2354, in process
    self.value_object.update(self.process_type('Inherent', metadata=self.metadata).value_object)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 741, in process_type
    obj.decode(check_remaining=False)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 658, in decode
    self.value_serialized = self.process()
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 471, in process
    field_obj = self.process_type(data_type, metadata=self.metadata)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 741, in process_type
    obj.decode(check_remaining=False)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 658, in decode
    self.value_serialized = self.process()
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 1286, in process
    self.call_module = self.metadata.get_pallet_by_index(pallet_index.value)
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 1880, in get_pallet_by_index
    raise ValueError(f'Pallet for index "{index}" not found')
ValueError: Pallet for index "45" not found

AttributeError: 'NoneType' object has no attribute 'get' running binary compiled via pyinstaller

I've compiled my project; it uses only the following imports (before that, I updated substrate-interface with all its dependencies):

import requests
import time
import sys
from substrateinterface import SubstrateInterface, Keypair

It compiled successfully as a standalone binary via:

pyinstaller -F ./dutch.py

When I run the binary on the same machine, I get the following error:

Traceback (most recent call last):
  File "dutch.py", line 239, in <module>
    substrate = init_network_params(is_prod)
  File "dutch.py", line 36, in init_network_params
    substrate = SubstrateInterface(
  File "substrateinterface/base.py", line 479, in __init__
  File "substrateinterface/base.py", line 2919, in reload_type_registry
  File "scalecodec/base.py", line 254, in update_type_registry
AttributeError: 'NoneType' object has no attribute 'get'
[465234] Failed to execute script 'dutch' due to unhandled exception!

init_network_params(is_prod) function:

def init_network_params(is_prod):
    if is_prod:
        wss_url = "wss://kusama-rpc.polkadot.io/"
    else:
        wss_url = "wss://127.0.0.1/"
    substrate = SubstrateInterface(
        url=wss_url,
        ss58_format=2
    )
    return substrate

Do you have any proposals for how to solve this, or best practices for building a wheel or binary that includes scalecodec -> substrate-interface?
P.S. I've also checked with the Nuitka project and got the same error for the binary.

Unable to decode DigestItems

Running the following code fails, for digest1 (PreRuntime) or digest2 (Seal), both with a NotImplementedError for that class:

from scalecodec.base import RuntimeConfiguration, ScaleBytes, ScaleDecoder
from scalecodec.type_registry import load_type_registry_preset

RuntimeConfiguration().update_type_registry(load_type_registry_preset("default"))
RuntimeConfiguration().update_type_registry(load_type_registry_preset("kusama"))
digest1 = "0x0642414245b5010106000000f88fc50f00000000046590b535259799f100e3b19a1d1c6f47037fcc53a919e3350dcace2bf9ec3bc50d4012ce978dca1460e44ba7ee5cda3ed9ce1d491c5d4f35ed8c9876200f0ba6d2d2c6dd4398357fa090c1689214a58ae430f984d29c9504c49a23f1e6040c"
digest2 = "0x05424142450101d45d7983639ca8f153877e254805f70f6b9f8522058c7a72888b3bccf14223622e82a7161f2568fdb3d56f2508c4ebe14547b57b9f8adcc4fde85539c9dbff87"
obj = ScaleDecoder.get_decoder_class('DigestItem', ScaleBytes(digest1))
obj.decode()
print(obj.value)

Results in:
NotImplementedError: Decoder class for "PreRuntime" not found (for digest1)
NotImplementedError: Decoder class for "Seal" not found (for digest2)

Seems that v0.11.24 is also not backward compatible

Just noticed that v0.11.24 removed the scalecodec.block reference. From earlier release notes of v1, it appears that was a breaking change and will only be part of a major version change.

New tag deployed by mistake?

How to connect other Substrate chains

Sorry, I have a question: I want to connect to a Substrate chain, but I only have the official types.json. Can it work in Python? Or, what should I do?

Thank you very much.

Inconsistent parsing of Balances.Transfer events

Hello, I've found that the simple code

s = SubstrateInterface('wss://westend-rpc.polkadot.io', type_registry_preset='westend', ss58_format=42)
evts = s.get_events(block_hash=s.get_block_hash(x))

for evt in evts:
    event = evt.value
    if event['event_id'] == 'Transfer' and event['module_id'] == 'Balances':
        print(event['attributes'])

does produce different results for different block numbers. E.g. for the block with id 7862514 it produces

('5Ehed9osHF5LJ76SNVPh23FxZ6wqYyonDufVkWJBzvmf4eT2', '5FqbxNQkmrsVPPYfAgVDqFvMozyV9MmneJ1kZ3RkDK5ky842', 100000000000)

but for the block with id 7655084:

[{'type': 'AccountId', 'value': '5HpLdCTNBQDjFomqpG2XWadgB4zHTuqQqNHhUyYbett7k1RR'}, {'type': 'AccountId', 'value': '5HbwGTqpEw8LeUaCMtunWmCfFnJSN1gcnVjuqxAoDyuSc2rE'}, {'type': 'Balance', 'value': 1000000000000}]

So we have a tuple of values in the first case and a list of dictionaries in the second. Is this deliberate?

Adapt to the new runtime

Hi again, it seems like the Westend runtime was updated to version 50, which brought some API-breaking changes. For example, they finally migrated AccountInfo to TripleRefCount.

Use struct instead of tuples for variants with named fields

For variant type definitions in the PortableRegistry with named fields, for example:

{'name': 'Deposit', 'fields': [{'name': 'who', 'type': 0, 'typeName': 'T::AccountId', 'docs': []}, {'name': 'amount', 'type': 6, 'typeName': 'T::Balance', 'docs': []}], 'index': 7, 'docs': ['Some amount was deposited (e.g. for transaction fees).']}

were until now converted to tuples, discarding the field names.

The proposal is to use structs instead of tuples when names are defined.

For example the attributes of the event Balances.Deposit:

was ('F3opxRbN5ZbjJNU511Kj2TLuzFcDq9BGduA9TgiECafpg29', 33599708)

and will now be {'who': 'F3opxRbN5ZbjJNU511Kj2TLuzFcDq9BGduA9TgiECafpg29', 'amount': 33599708}
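The mapping described above can be sketched as follows (the function name and the already-decoded values are illustrative assumptions, not py-scale-codec's actual code; the field definitions are the Deposit variant quoted earlier):

```python
def shape_variant_value(fields: list, decoded_values: list):
    """Return a dict when every field in the variant definition is named,
    otherwise fall back to the old tuple representation."""
    if fields and all(f.get("name") for f in fields):
        return {f["name"]: v for f, v in zip(fields, decoded_values)}
    return tuple(decoded_values)

# Field definitions from the Balances.Deposit variant shown above
deposit_fields = [
    {"name": "who", "type": 0, "typeName": "T::AccountId"},
    {"name": "amount", "type": 6, "typeName": "T::Balance"},
]
values = ["F3opxRbN5ZbjJNU511Kj2TLuzFcDq9BGduA9TgiECafpg29", 33599708]

print(shape_variant_value(deposit_fields, values))
# {'who': 'F3opxRbN5ZbjJNU511Kj2TLuzFcDq9BGduA9TgiECafpg29', 'amount': 33599708}
```

Unnamed fields (plain tuple variants) keep the tuple form, so only named-field variants change representation.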

How to convert AccountID to SCALE encoded?

When querying the storage, params should be SCALE encoded.

I saw many encoding tests for int types. But the AccountId (Alice) 5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY should be converted to 0xd43593c715fdd31c61141abd04a99fd6822c8558854ccde39a5684e7a56da27d

How can I convert an AccountId to its SCALE-encoded form? Could you please show an example?
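py-scale-codec ships an `ss58_decode` helper in `scalecodec.utils.ss58` for this; under the hood the conversion is just base58 decoding plus a blake2b checksum check. A self-contained sketch (assuming the simple SS58 format with a one-byte prefix and a 32-byte public key):

```python
import hashlib

B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58decode(s: str) -> bytes:
    """Decode a base58 string into bytes (Bitcoin alphabet)."""
    n = 0
    for c in s:
        n = n * 58 + B58_ALPHABET.index(c)
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    pad = len(s) - len(s.lstrip("1"))  # leading '1' characters encode zero bytes
    return b"\x00" * pad + raw

def ss58_to_pubkey(address: str) -> str:
    """Extract the hex public key from a simple-format SS58 address."""
    raw = b58decode(address)
    # layout: 1 prefix byte + 32-byte public key + 2-byte checksum
    prefix, pubkey, checksum = raw[:1], raw[1:33], raw[33:]
    expected = hashlib.blake2b(b"SS58PRE" + prefix + pubkey).digest()[:2]
    assert checksum == expected, "invalid SS58 checksum"
    return "0x" + pubkey.hex()

print(ss58_to_pubkey("5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY"))
```

The resulting 32-byte public key is already the SCALE encoding of an AccountId (fixed-size byte arrays encode as-is), so it can be used directly as a storage-query parameter.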

AttributeError when trying to decode an event from Westend

Version of scalecodec: 1.0.1
Version of substrate-interface: 1.0.0

Use-case (how to reproduce):

from substrateinterface import SubstrateInterface

s = SubstrateInterface('wss://westend-rpc.polkadot.io')
events = s.get_events('0x767855b1bde5bcd5921ba67e0e34d292e0e3cef33c9152a7871c0aaf3a55b15c')

Expected behavior:

A list of events for the block, with null extrinsic_idx. CODE_LINK

Real behavior:

Stack trace

Traceback (most recent call last):
  File "...", line 17, in <module>
    evetns = s.get_events('0x767855b1bde5bcd5921ba67e0e34d292e0e3cef33c9152a7871c0aaf3a55b15c')
  File ".../venv/lib/python3.8/site-packages/substrateinterface/base.py", line 1344, in get_events
    storage_obj = self.query(module="System", storage_function="Events", block_hash=block_hash)
  File ".../venv/lib/python3.8/site-packages/substrateinterface/base.py", line 1316, in query
    obj.decode()
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 658, in decode
    self.value_serialized = self.process()
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 796, in process
    element = self.process_type(self.sub_type, metadata=self.metadata)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 741, in process_type
    obj.decode(check_remaining=False)
  File ".../venv/lib/python3.8/site-packages/scalecodec/base.py", line 658, in decode
    self.value_serialized = self.process()
  File ".../venv/lib/python3.8/site-packages/scalecodec/types.py", line 2499, in process
    'extrinsic_idx': self.value_object['phase'][1].value,
AttributeError: 'str' object has no attribute 'value'

Final thoughts

Seems like self.value_object['phase'][1] == 'n' and the code needs to be updated to take account of such cases.

Unable to create as_multi calls due to encoding rules of OpaqueCall's

Currently the encoding rules for as_multi require the call to be serialised:

as_multi = ScaleDecoder.get_decoder_class("Call", metadata=kusama.metadata)
transfer = ScaleDecoder.get_decoder_class("OpaqueCall", metadata=kusama.metadata)
transfer.encode(
    {
        "call_module": "Balances",
        "call_function": "transfer",
        "call_args": {"dest": "CofvaLbP3m8PLeNRQmLVPWmTT7jGgAXTwyT69k2wkfPxJ9V", "value": 10000000000000},
    }
)
as_multi.encode(
    {
        "call_module": "Multisig",
        "call_function": "as_multi",
        "call_args": {
            "call": str(transfer),
            "maybe_timepoint": {"height": 3012294, "index": 3},
            "other_signatories": sorted(['D2bHQwFcQj11SvtkjULEdKhK4WAeP6MThXgosMHjW9DrmbE', 'CofvaLbP3m8PLeNRQmLVPWmTT7jGgAXTwyT69k2wkfPxJ9V']),
            "threshold": 2,
            "store_call": True,
            "max_weight": 10,
        },
    }
)

However, presumably due to a metadata change, the same parameter is now only accepted in dictionary format:

  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/ksmutils/helper.py", line 161, in as_multi_signature_payload
    as_multi.encode(
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/base.py", line 298, in encode
    self.data = self.process_encode(value)
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/types.py", line 1180, in process_encode
    data += arg_obj.encode(param_value)
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/base.py", line 298, in encode
    self.data = self.process_encode(value)
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/types.py", line 1188, in process_encode
    return super().process_encode(str(call_obj.encode(value)))
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/base.py", line 298, in encode
    self.data = self.process_encode(value)
  File "/Users/nathan/.pyenv/versions/3.8.1/envs/ksmapi/lib/python3.8/site-packages/scalecodec/types.py", line 1157, in process_encode
    if call_module.name == value['call_module'] and call_function.name == value['call_function']:
TypeError: string indices must be integers
FAILED

I submitted a PR with one patch for the error, but ideally this could be handled at the point of as_multi encoding.

Unable to decode the event

from substrateinterface import SubstrateInterface, Keypair
from scalecodec import ScaleBytes

substrate = SubstrateInterface(url='ws://127.0.0.1:9944', type_registry_preset='default')
event_data = "0x2000000000000000b0338609000000000200000001000000000080b2e60e0000000002000000020000000003be1957935299d0be2f35b8856751feab95fc7089239366b52b72ca98249b94300000020000000500be1957935299d0be2f35b8856751feab95fc7089239366b52b72ca98249b943000264d2823000000000000000000000000000200000005027a9650a6bd43f1e0b4546affb88f8c14213e1fb60512692c2b39fbfcfc56b703be1957935299d0be2f35b8856751feab95fc7089239366b52b72ca98249b943000264d2823000000000000000000000000000200000013060c4c700700000000000000000000000000000200000005047b8441d5110c178c29be709793a41d73ae8b3119a971b18fbd20945ea5d622f00313dc01000000000000000000000000000002000000000010016b0b00000000000000"
metadata = substrate.get_block_metadata()

system_pallet = [p for p in metadata.pallets if p['name'] == 'System'][0]
event_storage_function = [s for s in system_pallet['storage']['entries'] if s['name'] == "Events"][0]


event = substrate.runtime_config.create_scale_object(
    event_storage_function.get_value_type_string(), data=ScaleBytes(event_data), metadata=metadata
)
print(event)
print(event.decode())

Returned message:

None
Traceback (most recent call last):
  File "c:\Users\lizhaoyang\Desktop\demo.py", line 711, in <module>
    print(event.decode())
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\base.py", line 666, in decode
    self.value_serialized = self.process()
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\types.py", line 792, in process
    element = self.process_type(self.sub_type, metadata=self.metadata)
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\base.py", line 749, in process_type
    obj.decode(check_remaining=False)
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\base.py", line 666, in decode
    self.value_serialized = self.process()
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\types.py", line 2545, in process
    self.phase = self.process_type('Phase')
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\base.py", line 748, in process_type
    obj = self.runtime_config.create_scale_object(type_string, self.data, **kwargs)
  File "C:\Users\lizhaoyang\AppData\Local\Programs\Python\Python39\lib\site-packages\scalecodec\base.py", line 160, in create_scale_object
    raise NotImplementedError('Decoder class for "{}" not found'.format(type_string))
NotImplementedError: Decoder class for "Phase" not found

self.contains_transaction: bool = False ^ SyntaxError: invalid syntax

Hey!

Very happy to find your library.

However, when installing, I encountered this error:

python3 ./setup.py install

/usr/lib/python3.5/distutils/dist.py:261: UserWarning: Unknown distribution option: 'long_description_content_type'
  warnings.warn(msg)
running install
running bdist_egg
running egg_info
writing top-level names to scalecodec.egg-info/top_level.txt
writing requirements to scalecodec.egg-info/requires.txt
writing scalecodec.egg-info/PKG-INFO
writing dependency_links to scalecodec.egg-info/dependency_links.txt
reading manifest file 'scalecodec.egg-info/SOURCES.txt'
writing manifest file 'scalecodec.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/exceptions.py -> build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/types.py -> build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/block.py -> build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/base.py -> build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/metadata.py -> build/bdist.linux-x86_64/egg/scalecodec
copying build/lib/scalecodec/__init__.py -> build/bdist.linux-x86_64/egg/scalecodec
byte-compiling build/bdist.linux-x86_64/egg/scalecodec/exceptions.py to exceptions.cpython-35.pyc
byte-compiling build/bdist.linux-x86_64/egg/scalecodec/types.py to types.cpython-35.pyc
byte-compiling build/bdist.linux-x86_64/egg/scalecodec/block.py to block.cpython-35.pyc
  File "build/bdist.linux-x86_64/egg/scalecodec/block.py", line 43
    self.contains_transaction: bool = False
                             ^
SyntaxError: invalid syntax

byte-compiling build/bdist.linux-x86_64/egg/scalecodec/base.py to base.cpython-35.pyc
byte-compiling build/bdist.linux-x86_64/egg/scalecodec/metadata.py to metadata.cpython-35.pyc
byte-compiling build/bdist.linux-x86_64/egg/scalecodec/__init__.py to __init__.cpython-35.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying scalecodec.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying scalecodec.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying scalecodec.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying scalecodec.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying scalecodec.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating 'dist/scalecodec-0.1.0-py3.5.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Processing scalecodec-0.1.0-py3.5.egg
Removing /env/lib/python3.5/site-packages/scalecodec-0.1.0-py3.5.egg
Copying scalecodec-0.1.0-py3.5.egg to /env/lib/python3.5/site-packages
scalecodec 0.1.0 is already the active version in easy-install.pth

Installed /env/lib/python3.5/site-packages/scalecodec-0.1.0-py3.5.egg
Processing dependencies for scalecodec==0.1.0
Searching for more-itertools==7.0.0
Best match: more-itertools 7.0.0
Adding more-itertools 7.0.0 to easy-install.pth file

Using /env/lib/python3.5/site-packages
Finished processing dependencies for scalecodec==0.1.0

Thanks

Bytes offset error

Hi Arjan, I came across a new error today and not sure how that happened:

Traceback (most recent call last):
  File "test_node_connection.py", line 70, in <module>
    extrinsic = substrate.create_signed_extrinsic(call=call, keypair=keypair)
  File "/Users/admin/.pyenv/versions/3.8.6/lib/python3.8/site-packages/substrateinterface/base.py", line 1457, in create_signed_extrinsic
    nonce = self.get_account_nonce(keypair.public_key) or 0
  File "/Users/admin/.pyenv/versions/3.8.6/lib/python3.8/site-packages/substrateinterface/base.py", line 1377, in get_account_nonce
    account_info = self.query('System', 'Account', [account_address])
  File "/Users/admin/.pyenv/versions/3.8.6/lib/python3.8/site-packages/substrateinterface/base.py", line 1258, in query
    obj.decode()
  File "/Users/admin/.pyenv/versions/3.8.6/lib/python3.8/site-packages/scalecodec/base.py", line 357, in decode
    raise RemainingScaleBytesNotEmptyException('Current offset: {} / length: {}'.format(self.data.offset, self.data.length))
scalecodec.exceptions.RemainingScaleBytesNotEmptyException: Current offset: 69 / length: 80

The code itself is:

from substrateinterface import SubstrateInterface, Keypair
from substrateinterface.exceptions import SubstrateRequestException

substrate = SubstrateInterface(
    url="wss://charlie-node.dev.gridsingularity.com",
    ss58_format=42,
    type_registry_preset='rococo',
    type_registry=custom_type_registry  # custom_type_registry defined elsewhere
)

keypair = Keypair.create_from_mnemonic('MNEMONIC')

call = substrate.compose_call(
    call_module='Balances',
    call_function='transfer',
    call_params={
        'dest': '14E5nqKAp3oAJcmzgZhUD2RcptBeUBScxKHgJKU4HPNcKVf3',
        'value': 1000
    }
)

extrinsic = substrate.create_signed_extrinsic(call=call, keypair=keypair)

try:
    receipt = substrate.submit_extrinsic(extrinsic, wait_for_inclusion=True)
    print("Extrinsic '{}' sent and included in block '{}'".format(receipt.extrinsic_hash, receipt.block_hash))

except SubstrateRequestException as e:
    print("Failed to send: {}".format(e))

I'm wondering if it could have something to do with the key encoding? The key format is apparently different for parachains than for the relay chain, if I understand correctly.

ImportError: cannot import name 'LogDigest' from 'scalecodec.types'

I am deploying a project in a Docker container on a Raspberry Pi 4. During Docker build I get a warning:

WARNING: The candidate selected for download or install is a yanked version: 'scalecodec' candidate (version 1.0.20 at https://files.pythonhosted.org/packages/7c/fc/0d63ed8f097514e48eaa1d3df0c5a31783dad9dc48e774adf5ff6a57d2b5/scalecodec-1.0.20-py3-none-any.whl#sha256=baf3bbb28a8d17ddbb9c97dfff866e2a10791230b19a9171ea95a5142d033d59 (from https://pypi.org/simple/scalecodec/) (requires-python:>=3.6, <4))
Reason for being yanked: Extrinsic V14 support broken 

However, the build finishes with no errors.
The project then fails to start with the following error:

Traceback (most recent call last):
  File "/usr/local/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/uvicorn/main.py", line 371, in main
    run(app, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/uvicorn/main.py", line 393, in run
    server.run()
  File "/usr/local/lib/python3.9/site-packages/uvicorn/server.py", line 50, in run
    loop.run_until_complete(self.serve(sockets=sockets))
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.9/site-packages/uvicorn/server.py", line 57, in serve
    config.load()
  File "/usr/local/lib/python3.9/site-packages/uvicorn/config.py", line 318, in load
    self.loaded_app = import_from_string(self.app)
  File "/usr/local/lib/python3.9/site-packages/uvicorn/importer.py", line 25, in import_from_string
    raise exc from None
  File "/usr/local/lib/python3.9/site-packages/uvicorn/importer.py", line 22, in import_from_string
    module = importlib.import_module(module_str)
  File "/usr/local/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/code/src/./app.py", line 13, in <module>
    from feecc_workbench.WorkBench import WorkBench
  File "/code/src/./feecc_workbench/WorkBench.py", line 11, in <module>
    from .IO_gateway import generate_qr_code, post_to_datalog, print_image, publish_file
  File "/code/src/./feecc_workbench/IO_gateway.py", line 9, in <module>
    from robonomicsinterface import RobonomicsInterface
  File "/usr/local/lib/python3.9/site-packages/robonomicsinterface/__init__.py", line 1, in <module>
    from .RobonomicsInterface import RobonomicsInterface
  File "/usr/local/lib/python3.9/site-packages/robonomicsinterface/RobonomicsInterface.py", line 4, in <module>
    import substrateinterface as substrate
  File "/usr/local/lib/python3.9/site-packages/substrateinterface/__init__.py", line 17, in <module>
    from .base import *
  File "/usr/local/lib/python3.9/site-packages/substrateinterface/base.py", line 33, in <module>
    from scalecodec.types import GenericCall, GenericExtrinsic, Extrinsic, LogDigest
ImportError: cannot import name 'LogDigest' from 'scalecodec.types' (/usr/local/lib/python3.9/site-packages/scalecodec/types.py)

After that, the container gets stuck in an infinite restart loop, as dictated by its restart policy.
What should I do?
The project
Dockerfile used for building the image

NotImplementedError: Decoder class for "scale_info::XXX" not found

After upgrading to the latest v1 releases, we are facing issues decoding a bunch of calls for blocks, events, transfer payloads, etc.

Usually, all the error messages relate to missing types.

NotImplementedError: Decoder class for "scale_info::194" not found

It happens when we try to encode the call.

call = runtime_config.create_scale_object(
    "Call", metadata=metadata
)
call.encode(
    {
        "call_module": "Balances",
        "call_function": "transfer",
        "call_args": {"dest": address, "value": value},
    }
)

We use this in production to let users transfer funds. Is there any way to fix this ourselves, maybe by loading a custom type registry?

Using ss58_decode for validating an address

Hi arjan!

I'm using ss58_decode to validate that an address (a str) has the right ss58 format. For instance, not the right format:

ss58_decode(
    address='14mB8stSf1vdP7WzbVr82YPgGGF7cBK9N7KxiVEac9UQgYj7',  # DOT address, ss58 1
    valid_ss58_format=2,
)

It has the right format:

ss58_decode(
    address='GLVeryFRbg5hEKvQZcAnLvXZEXhiYaBjzSDwrXBXrfPF7wj',  # KSM address, ss58 2
    valid_ss58_format=2,
)

The recent changes introduce a no-op when the address starts with 0x, which alters its functionality as a validation function (unless I first strip the 0x prefix from the address string). What is the reasoning behind this? Should I use another way to validate addresses, e.g. Keypair?

Thanks!

More duplicated keys :)

Hi again! Thanks for your very quick response on the previous issue.

I found several other duplicated type definitions:
DisputeStatementSet
MultiDisputeStatementSet
DisputeStatement
ValidDisputeStatementKind
InvalidDisputeStatementKind
ExplicitDisputeStatement
GlobalValidationData

AccountId types not automatically converted to SS58 format

AccountId related types are not presented in the SS58 format, but as the internal public key (0x8055e5e58e32701f75127a7fea2094f0eaa61964f0bc9c87082b211620606d5a instead of FUbAUCPrWS2Gvf6BdphztUvjhqgqNc7883pXe5FdxjJmpT4).
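The conversion between the two representations is mechanical: prefix the public key with the ss58 format byte, append a 2-byte blake2b-512 checksum over b'SS58PRE' plus that payload, and base58-encode. A minimal stdlib sketch, assuming the simple single-byte-prefix form (ss58 formats below 64) and a 32-byte AccountId; an illustration of the format, not a replacement for scalecodec's ss58 utilities:

```python
import hashlib

# Base58 alphabet used by SS58 (same as Bitcoin's)
_B58 = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz'


def _b58encode(data: bytes) -> str:
    n = int.from_bytes(data, 'big')
    out = ''
    while n:
        n, r = divmod(n, 58)
        out = _B58[r] + out
    pad = len(data) - len(data.lstrip(b'\x00'))  # leading zero bytes -> '1's
    return '1' * pad + out


def _b58decode(s: str) -> bytes:
    n = 0
    for c in s:
        n = n * 58 + _B58.index(c)
    raw = n.to_bytes((n.bit_length() + 7) // 8, 'big')
    pad = len(s) - len(s.lstrip('1'))
    return b'\x00' * pad + raw


def ss58_encode(public_key: bytes, ss58_format: int = 42) -> str:
    """Encode a 32-byte public key as an SS58 address (formats < 64 only)."""
    payload = bytes([ss58_format]) + public_key
    checksum = hashlib.blake2b(b'SS58PRE' + payload).digest()[:2]
    return _b58encode(payload + checksum)


def ss58_decode(address: str):
    """Return (ss58_format, public_key); raise ValueError on a bad checksum."""
    raw = _b58decode(address)
    payload, checksum = raw[:-2], raw[-2:]
    if hashlib.blake2b(b'SS58PRE' + payload).digest()[:2] != checksum:
        raise ValueError('invalid SS58 checksum')
    return payload[0], payload[1:]
```

Per the mapping above, encoding that public key with ss58_format=2 should yield the FUbAU… Kusama address.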

Chain context can be retrieved from the RuntimeConfigurationObject

decode block events error after kusama upgraded runtime

After Kusama upgraded its runtime yesterday, I can't decode block events using scalecodec==0.9.35:

from substrateinterface import *

substrate = SubstrateInterface(
    url="wss://kusama-rpc.polkadot.io/",
    address_type=2,
    type_registry_preset='kusama'
)

# block_hash = "0xb8e6b05f3036fa7c9be06d321718380e2642bbbb708d1280c65d2f030c67c125"   # block 2076530
block_hash = "0xef10c239a0c0f71cbfb21d808fc75edc5e550fa6956d847587605cda8c386dcd"   # block 2070349
metadata = substrate.get_block_metadata(block_hash)
block = substrate.get_chain_block(block_hash, metadata_decoder=metadata)
block_events = substrate.get_block_events(block_hash, metadata)

error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/substrateinterface/__init__.py", line 402, in get_block_e

  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 255, in decode
    self.value = self.process()
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/block.py", line 291, in process
    element = self.process_type('EventRecord', metadata=self.metadata)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 329, in process_type
    obj.decode(check_remaining=False)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 255, in decode
    self.value = self.process()
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/block.py", line 328, in process
    self.event = self.metadata.event_index[self.type][1]
KeyError: '54bf'


if use block_hash = "0xb8e6b05f3036fa7c9be06d321718380e2642bbbb708d1280c65d2f030c67c125" # block 2076530

error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/substrateinterface/__init__.py", line 402, in get_block_events
    events_decoder.decode()
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 255, in decode
    self.value = self.process()
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/block.py", line 291, in process
    element = self.process_type('EventRecord', metadata=self.metadata)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 329, in process_type
    obj.decode(check_remaining=False)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 255, in decode
    self.value = self.process()
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/block.py", line 332, in process
    arg_type_obj = self.process_type(arg_type)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 329, in process_type
    obj.decode(check_remaining=False)
  File "/Users/donglongtu/Work/blockchains/polkadot/py-substrate-interface/.env/lib/python3.6/site-packages/scalecodec/base.py", line 262, in decode
    'No more bytes available (offset: {} / length: {})'.format(self.data.offset, self.data.length))
scalecodec.exceptions.RemainingScaleBytesNotEmptyException: No more bytes available (offset: 54 / length: 37)

Cannot retrieve event_idx after upgrading to major version

Hi,
I've recently upgraded from 0.11.23 to 1.0.4, and it seems that the value that previously was the event_idx field is no longer available. For instance (screenshot):
I've used the 1 appearing there as the event_idx. For the same event, the event_index field value is "0502".
Where can I retrieve it from now?

Is it possible to distinguish between types with the same name from different pallets?

Hi.
In our application, we rely heavily on your JSON files with type definitions. It works really well, providing all the necessary information to construct dynamic types. However, we recently found that there are two different types named Phase: the first is from the System pallet and corresponds to the block execution phase (link); the second is from the election-provider-multi-phase pallet and corresponds to the election phase (link). The problem is that default.json contains only one Phase type, which corresponds to the System one. While it is easy to add a new type for the election-provider-multi-phase Phase in the JSON file, it is not clear to me how to distinguish between them when parsing, for example, a metadata storage return type (it is Phase for both). In the Polkadot JS API, they use an additional config file for this purpose. Is that the only way to do this?
Thanks for your help in advance

scalecodec.exceptions.RemainingScaleBytesNotEmptyException: No more bytes available (offset: 72 / length: 69)

Hello!
Why could this issue appear? I forked the substrate interface, and I am running into this:

  File "/Users/lana/github/d3a/src/d3a/models/market/blockchain_interface.py", line 132, in track_trade_event
    extrinsic = self.substrate.create_signed_extrinsic(call=call, keypair=keypair)
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/substrateinterface/__init__.py", line 1019, in create_signed_extrinsic
    nonce = self.get_account_nonce(keypair.public_key) or 0
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/substrateinterface/__init__.py", line 951, in get_account_nonce
    response = self.get_runtime_state('System', 'Account', [account_address])
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/substrateinterface/__init__.py", line 876, in get_runtime_state
    response['result'] = obj.decode()
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/base.py", line 280, in decode
    self.value = self.process()
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/types.py", line 431, in process
    result[key] = self.process_type(data_type, metadata=self.metadata).value
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/base.py", line 325, in process_type
    obj.decode(check_remaining=False)
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/base.py", line 280, in decode
    self.value = self.process()
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/types.py", line 431, in process
    result[key] = self.process_type(data_type, metadata=self.metadata).value
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/base.py", line 325, in process_type
    obj.decode(check_remaining=False)
  File "/Users/lana/Envs/d3a/lib/python3.6/site-packages/scalecodec/base.py", line 287, in decode
    'No more bytes available (offset: {} / length: {})'.format(self.data.offset, self.data.length))
scalecodec.exceptions.RemainingScaleBytesNotEmptyException: No more bytes available (offset: 72 / length: 69)

Thank you!

`Text` type on Substrate

Hey, I found this great library and I'm trying to adapt it to our Substrate chain, Polymesh.
I'm facing some troubles; maybe you could help me:

  1. The Text type from schema.json in Substrate: how should I represent it? Should it be some kind of Vec<>?
  2. It doesn't recognize [u8; 32] in my JSON, even though it is present in the default.json file and works fine there. Is there anything else I should add to make it work?
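For reference, SCALE encodes Text/String exactly like Vec<u8>: a compact length prefix followed by the raw UTF-8 bytes. A minimal sketch of the encoding, with the longer compact modes omitted:

```python
def encode_text(s: str) -> bytes:
    """SCALE-encode a Text/String value: compact length prefix + UTF-8 bytes."""
    raw = s.encode('utf-8')
    n = len(raw)
    if n < 64:
        # single-byte compact mode: value shifted left two bits, low bits 00
        prefix = bytes([n << 2])
    elif n < 2 ** 14:
        # two-byte compact mode: little-endian, low bits 01
        prefix = ((n << 2) | 0b01).to_bytes(2, 'little')
    else:
        raise NotImplementedError('longer compact modes omitted in this sketch')
    return prefix + raw
```

For example, encode_text('abc') yields b'\x0cabc' (length 3 encoded as 0x0c).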

Thanks in advance!

Packaging with pyinstaller does not include the json metadata

Hey guys,

This is Lefteris from rotki. We use py-substrate-interface in rotki. Rotki generates a binary with the use of pyinstaller. Our latest release included the first version of kusama balance queries but we hit an unexpected bug: rotki/rotki#2116

The bug only happens in packaged mode and not in normal development mode. This quickly led to figuring out that pyinstaller does not have a proper hook to include the py-scale-codec metadata when packaging. By metadata I am referring to all the JSON files.

We added a hook in our project as the fix of the problem in this commit.

The reason I am making this issue is that, the way pyinstaller works, downstream projects should not have to add hooks for the packages they use. Instead, the upstream projects, in this case py-scale-codec, should provide the pyinstaller hook. That way, projects using it don't need to deal with anything.

There is documentation on how to do it here: https://pyinstaller.readthedocs.io/en/stable/hooks.html#provide-hooks-with-package
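The hook itself is small. For illustration, an upstream hook could look roughly like this (the file name hook-scalecodec.py and the glob pattern are assumptions based on the PyInstaller hook docs, not the actual committed hook):

```python
# hook-scalecodec.py -- PyInstaller hook (a sketch; the shipped hook may differ)
from PyInstaller.utils.hooks import collect_data_files

# Bundle the package's JSON type-registry files alongside the code so that
# scalecodec can load its type presets from inside the frozen application.
datas = collect_data_files('scalecodec', includes=['**/*.json'])
```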

StakingLedger is outdated

Both Polkadot and Kusama have moved to Vec<EraIndex> for the claimedRewards field, but seemingly the default presets have not been updated here.

I wasn't sure how to update the JSON files; for anyone who has the same issue, this is a fix:

substrate = SubstrateInterface(
    url="ws://localhost:9944",
    address_type=0,
    type_registry={'types': {
            "StakingLedger<AccountId, BalanceOf>": {
                "type": "struct",
                "type_mapping": [
                    ["stash", "AccountId"],
                    ["total", "Compact<Balance>"],
                    ["active", "Compact<Balance>"],
                    ["unlocking", "Vec<UnlockChunk<Balance>>"],
                    ["claimedReward", "Vec<EraIndex>"]
                ]
            },
        }
    },
    type_registry_preset='polkadot',
)

Exception decoding extrinsics on joystream

I am trying to use the harvester to parse and store data from the Joystream blockchain.

Things work fine, but every few blocks I get an exception. I wonder if there is a condition we need to handle when decoding extrinsics from Joystream, similar to this one: https://github.com/polkascan/polkascan-pre-harvester/blob/master/app/processors/converters.py#L645

The block as json is:

{'block': {'extrinsics': ['0x280302000b7077ca656e01', '0x140308007d1a', '0xd101030a009f0600008c881220713f8a2e4dadbe7420c781b0f58e99d32a70221e7ded05aa3c37feb46f426aa7001100000000000000e229dccd4ad4c4d7c2888a8a419b546c78b8a9d3637597c31435520fe8ac7b2c30429186a9e11bf33e094db68faf347a003f614a7ef9b8e0b955709e2554218b'], 'header': {'digest': {}}}, 'justification': None}

The metadata is V7 and I have created a gist with it: https://gist.github.com/ksinghf/e8a4fba3f526b2675d2365a897cda592#file-joystream-metadata

The exception I get is:

[2019-11-13 18:33:18,819: ERROR/ForkPoolWorker-1] Task app.tasks.accumulate_block_recursive[c0f7cfe5-6441-4744-ba8e-38f8e755739c] raised unexpected: HarvesterCouldNotAddBlock('0x7f89b58497ea3f90709b85a6438cedd20203f3a0da468f521f4da09bec624daa',)
Traceback (most recent call last):
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/harvester/app/tasks.py", line 106, in accumulate_block_recursive
    block = harvester.add_block(block_hash)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/harvester/app/processors/converters.py", line 651, in add_block
    extrinsic_data = extrinsics_decoder.decode()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 185, in decode
    self.value = self.process()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/block.py", line 175, in process
    arg_type_obj = self.process_type(arg.type, metadata=self.metadata)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 248, in process_type
    obj.decode(check_remaining=False)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 185, in decode
    self.value = self.process()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/types.py", line 305, in process
    result[key] = self.process_type(data_type, metadata=self.metadata).value
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 248, in process_type
    obj.decode(check_remaining=False)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 185, in decode
    self.value = self.process()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/types.py", line 305, in process
    result[key] = self.process_type(data_type, metadata=self.metadata).value
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 248, in process_type
    obj.decode(check_remaining=False)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 185, in decode
    self.value = self.process()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/types.py", line 505, in process
    element_count = self.process_type('Compact<u32>').value
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 248, in process_type
    obj.decode(check_remaining=False)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/base.py", line 185, in decode
    self.value = self.process()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/types.py", line 77, in process
    self.process_compact_bytes()
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/scalecodec/types.py", line 33, in process_compact_bytes
    byte_mod = compact_byte[0] % 4
IndexError: bytearray index out of range

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/celery/app/trace.py", line 375, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/harvester/app/tasks.py", line 66, in __call__
    return super().__call__(*args, **kwargs)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/.venv/lib/python3.6/site-packages/celery/app/trace.py", line 632, in __protected_call__
    return self.run(*args, **kwargs)
  File "/nvme/projects/jsgenesis/polkascan/polkascan-pre/harvester/app/tasks.py", line 133, in accumulate_block_recursive
    raise HarvesterCouldNotAddBlock(block_hash) from exc
app.processors.converters.HarvesterCouldNotAddBlock: 0x7f89b58497ea3f90709b85a6438cedd20203f3a0da468f521f4da09bec624daa
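The IndexError at the bottom of the first traceback comes from the compact-integer decoder running out of input bytes, which usually means an earlier type definition consumed the wrong number of bytes. For reference, a stdlib sketch of SCALE compact-integer decoding (the two low bits of the first byte select the mode):

```python
def decode_compact(data: bytes, offset: int = 0):
    """Decode a SCALE compact-encoded integer; return (value, new_offset)."""
    if offset >= len(data):
        # this is the condition that surfaces as the reported IndexError
        raise ValueError('no more bytes available for compact integer')
    mode = data[offset] & 0b11
    if mode == 0:  # single byte, value < 2**6
        return data[offset] >> 2, offset + 1
    if mode == 1:  # two bytes little-endian, value < 2**14
        return int.from_bytes(data[offset:offset + 2], 'little') >> 2, offset + 2
    if mode == 2:  # four bytes little-endian, value < 2**30
        return int.from_bytes(data[offset:offset + 4], 'little') >> 2, offset + 4
    # big-integer mode: upper six bits give the byte count minus four
    length = (data[offset] >> 2) + 4
    value = int.from_bytes(data[offset + 1:offset + 1 + length], 'little')
    return value, offset + 1 + length
```

For example, the byte 0x04 decodes to 1 and 0x28 decodes to 10; if the decoder's cursor drifts past the end of the extrinsic bytes, this is where it fails.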
