
slpdb's Introduction


SLPDB Readme

Last Updated: 2021-01-03

Current SLPDB Version: 1.0.0

1. What is SLPDB?

SLPDB is an indexer service that stores all data related to the Simple Ledger Protocol (SLP) and provides real-time transaction and block notifications. Users can build block explorers (e.g., https://simpleledger.info), track token burn and mint history, track mint baton status, generate token holder lists at any block height, and easily determine state for script-based smart contracts. Websites and services can easily create new routes for SLP data by using the SlpServe and SlpSockServe HTTP gateways.

SLPDB records all SLP token data, but it can be easily configured to only look at a specified subset of tokens using the token filtering feature. Filtering for your specific needs can drastically improve realtime notification speed, reduce initial db sync time, and reduce the db footprint.

Live status of nodes running slpdb can be found at: https://status.slpdb.io.

2. Do you need to install SLPDB?

Most likely you do not need to install SLPDB. Most users will be better off using someone else's publicly shared SLPDB instance like https://slpdb.fountainhead.cash or https://slpdb.bitcoin.com. You only need to install SLPDB, SlpServe, and/or SlpSockServe if any of the following is true:

  • You cannot rely on a third-party for your SLP data.
  • The rate limits imposed by slpdb.fountainhead.cash or slpdb.bitcoin.com are too restrictive for your needs.
  • Realtime event notifications available at slpsocket.fountainhead.cash are not fast enough for your needs.

NOTE: If you are going to operate your own SLPDB instance you should join the telegram group for help and updates: https://t.me/slpdb

3. How do I query for SLP data?

Queries into SLPDB data are made using bitquery, which allows MongoDB and jq queries over HTTP. Here are some example SLPDB queries:

  • Get details of all token IDs (example)
  • Get token details for single token ID (example)
  • Get addresses for a single token ID (example)
  • Get token balances for an address (example)
  • Get utxos for a single token ID (example)
  • Get transaction history by token ID (example)
  • Get transaction history by address (example)
  • Get transaction history by address and token ID (example)
  • Get all invalid token transactions (w/ SLP op_return) (example)
  • Get transaction counts for each token (w/ time range) (example)
  • Get SLP usage per day (w/ time range) (example)
  • List input/output amount total for each valid transaction (example)

Users should use the SlpServe and SlpSockServe projects to conveniently query the SLP data produced by SLPDB.
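Below is a minimal sketch of sending a bitquery query over HTTP. It assumes a bitdb-style /q/<base64-encoded query> route as exposed by SlpServe (the public slpdb.fountainhead.cash instance is used as an example) and Node.js 18+ for the global fetch API; the symbol filter is purely illustrative:

// Sketch: send a bitquery query to an SlpServe endpoint.
// Assumptions: a bitdb-style "GET /q/<base64 query>" route and Node.js 18+ (global fetch).
const query = {
  v: 3,
  q: {
    db: ["t"],                                   // "t" = tokens collection
    find: { "tokenDetails.symbol": "EXAMPLE" },  // hypothetical symbol filter
    limit: 10,
  },
};

async function main() {
  const b64 = Buffer.from(JSON.stringify(query)).toString("base64");
  const res = await fetch(`https://slpdb.fountainhead.cash/q/${b64}`);
  console.log(JSON.stringify(await res.json(), null, 2));
}

main().catch(console.error);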

3.1. Working with Large Numbers (Decimal128 and BigNumber)

Some of the values used in SLP require 64 or more bits of precision, which is more than the JavaScript number type can provide. To ensure precision is maintained, values are stored in collections using the Decimal128 type. Decimal128 allows users to make database queries using query comparison operators like $gte.

The SlpServe and SlpSockServe services return query results as a JSON object with Decimal128 values converted to the string type. This is more readable for the query consumer than the awkward $numberDecimal JSON object, and the string type still maintains the original value precision. If you want to perform math operations on these string values, first convert them to a large-number type like BigNumber or Decimal128 (e.g., Decimal128.fromString("1000.123124"), or new BigNumber("1000.00000001") using the bignumber.js npm library).
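For example, here is a sketch of converting returned string amounts with the bignumber.js library mentioned above (the amounts are illustrative):

// Sketch: convert string amounts returned by SlpServe/SlpSockServe into BigNumber
// before doing math, so the 64+ bits of precision are not lost to JavaScript numbers.
import BigNumber from "bignumber.js";

const a = new BigNumber("1000.00000001");   // string value taken from a query result
const b = new BigNumber("0.00000002");

console.log(a.plus(b).toFixed());           // "1000.00000003"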

4. Installation Instructions

4.1. Prerequisites

  • Node.js 12
  • MongoDB 4.4+
  • Bitcoin Cash Node, BitcoinUnlimited, BCHD, or other Bitcoin Cash full node with:
    • RPC-JSON (or gRPC) and
    • ZeroMQ event notifications

4.2. Full Node Settings for bitcoin.conf

The following settings should be applied to your full node's configuration. NOTE: The settings presented here match the default settings in config.ts; modify these settings and use environment variables (shown in config.ts) if you need a custom setup.

  • txindex=1
  • server=1
  • rpcuser=bitcoin
  • rpcpassword=password
  • rpcport=8332
  • rpcworkqueue=10000
  • rpcthreads=8
  • zmqpubhashtx=tcp://*:28332
  • zmqpubrawtx=tcp://*:28332
  • zmqpubhashblock=tcp://*:28332
  • zmqpubrawblock=tcp://*:28332
  • Optional: testnet=1

4.3. MongoDB Configuration Settings

MongoDB will consume a large amount of memory and can completely fill up a system with 16 GB of RAM. To prevent this from happening you should set a limit on the WiredTiger maximum cache size. Refer to the MongoDB documentation for how to configure your specific version of MongoDB. On a Linux-based system, add wiredTigerCacheSizeGB=2 to /etc/mongodb.conf (or, for YAML-format config files, set storage.wiredTiger.engineConfig.cacheSizeGB: 2).

4.4. BCHD & gRPC Support

High speed gRPC is supported with BCHD 0.15.2+ full nodes in place of JSON RPC and incoming ZMQ notifications. To enable, add the environment variables grpc_url and grpc_certPath. See the example.env file in this project and the BCHD documentation for more details. For instructions on installing a self-signed certificate see guidance here.

4.5. Testnet Support

To use SLPDB with Testnet, simply set your full node to the testnet network (e.g., set testnet=1 in bitcoin.conf) and SLPDB will automatically use the proper database names for the network. For informational purposes the database names are as follows:

  • Mainnet
    • Mongo db name = slpdb
    • LevelDB directory = ./_leveldb
  • Testnet
    • Mongo db name = slpdb_testnet
    • LevelDB directory = ./_leveldb_testnet

4.6. Running SLPDB

1) Run MongoDB (config.ts default port is 27017)

2) Run Bitcoin Cash full node using bitcoin.conf settings from above.

3) Install SLPDB dependencies using npm install at the command-line

4) Start SLPDB using npm start at the command-line and wait for sync process to complete (monitor status in the console).

  • SLPDB will need to crawl the blockchain to save all previous SLP transaction data to MongoDB

  • After crawling SLPDB will build token graphs for each token using either the raw transaction data or a previously saved token graph state.

5) Install and run SlpServe and/or SlpSockServe to access SLP token data and statistics

4.7. Updating SLPDB

1) Execute git pull origin master to update to latest version.

2) Execute npm install to update packages

3) Execute npm run migrate up to run latest migrations.

4) Restart SLPDB.

4.8. Filtering SLPDB to specific Token IDs

Modify the example-filters.yml file to suit your needs and then rename it to filters.yml to activate filtering. Currently, include-single is the only filter type available; reference the example file for usage requirements.

4.9. Pruning

Pruning removes totally spent and aged transactions from the global transaction cache, the token graph, and the validator cache. Pruning occurs after a transaction has been totally spent and is aged more than 10 blocks. At this time there is no custom configuration available for pruning.

5. Real-time Notifications

5.1. ZeroMQ (ZMQ)

SLPDB publishes notifications via ZeroMQ, bound to 0.0.0.0:28339 by default. Subscribers can connect a ZMQ SUB socket to this port and subscribe to the following topics:

  • mempool
  • block

Each notification is published in the following data format:

{
    "tx": {
        h: string; 
    };
    "in": Xput[];
    "out": Xput[];
    "blk": { 
        h: string; 
        i: number; 
        t: number; 
    };
    "slp": {
        valid: boolean|null;
        detail: {
            transactionType: SlpTransactionType;
            tokenIdHex: string;
            versionType: number;
            symbol: string;
            name: string;
            documentUri: string; 
            documentSha256Hex: string|null;
            decimals: number;
            txnContainsBaton: boolean;
            txnBatonVout: number|null;
        } | null;
        outputs: { address: string|null, amount: Decimal128|null }[]|null;
        invalidReason: string|null;
        schema_version: number;
    };
}
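A minimal subscriber sketch is shown below; it assumes the zeromq npm package (v5-style socket API) and the default publish port 28339 noted above:

// Sketch: subscribe to SLPDB's "mempool" and "block" ZMQ topics.
// Assumptions: the `zeromq` npm package (v5-style API) and the default port 28339.
import * as zmq from "zeromq";

const sock = zmq.socket("sub");
sock.connect("tcp://127.0.0.1:28339");
sock.subscribe("mempool");   // unconfirmed SLP transactions
sock.subscribe("block");     // transactions confirmed in a block

sock.on("message", (topic: Buffer, message: Buffer) => {
  const notification = JSON.parse(message.toString());
  console.log(`${topic.toString()} notification for txid ${notification.tx.h}`);
});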

5.2. HTTP Gateways

Realtime SLP notifications can also be consumed over HTTP server-sent events (SSE) by using SlpSockServe. A good alternative to SLPDB-based realtime notifications is SlpStream, which utilizes the gs++ backend.
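A hedged sketch of consuming SSE notifications follows; it assumes a bitsocket-style /s/<base64-encoded query> route (as exposed by SlpSockServe, e.g. slpsocket.fountainhead.cash) and the eventsource npm package for Node.js:

// Sketch: listen for realtime SLP notifications over server-sent events.
// Assumptions: a bitsocket-style "GET /s/<base64 query>" route and the `eventsource` npm package.
import EventSource from "eventsource";

const query = { v: 3, q: { find: {} } };   // subscribe to all SLP transactions
const b64 = Buffer.from(JSON.stringify(query)).toString("base64");

const sse = new EventSource(`https://slpsocket.fountainhead.cash/s/${b64}`);
sse.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log("SLP notification:", data);
};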

6. MongoDB Collections & Data Schema

Three categories of information are stored in MongoDB:

  1. Valid and invalid SLP token transactions,
  2. Statistical calculations about each token, and
  3. Token graph state

6.1. DB Collections

Four MongoDB collections are used to store these three categories of data:

  • confirmed = c and unconfirmed = u

    • Purpose: These two collections include any Bitcoin Cash transaction that contains the "SLP" Lokad ID and passes all filters set in filters.yml. The collection used depends on the transaction's confirmation status. Both valid and invalid SLP transactions are included. Whenever new SLP transactions are added to the Bitcoin Cash network they are immediately added to one of these collections.

    • Schema:

    {
        "tx": {
            h: string; 
        };
        "in": Xput[];
        "out": Xput[];
        "blk": { 
            h: string; 
            i: number; 
            t: number; 
        };
        "slp": {
            valid: boolean|null;
            detail: {
                transactionType: SlpTransactionType;
                tokenIdHex: string;
                versionType: number;
                symbol: string;
                name: string;
                documentUri: string; 
                documentSha256Hex: string|null;
                decimals: number;
                txnContainsBaton: boolean;
                txnBatonVout: number|null;
            } | null;
            invalidReason: string|null;
            schema_version: number;
        };
    }
  • tokens = t

    • Purpose: This collection includes metadata and statistics about each token. Each time SLPDB has finished updating a token graph the associated items in this collection are updated.

    • Schema:

    {
        "tokenDetails": {
            transactionType: SlpTransactionType;
            tokenIdHex: string;
            versionType: number;
            timestamp: string|null;
            timestamp_unix: number|null;
            symbol: string;
            name: string;
            documentUri: string; 
            documentSha256Hex: string|null;
            decimals: number;
            containsBaton: boolean;
            batonVout: number|null;
            genesisOrMintQuantity: Decimal128|null;
            sendOutputs: Decimal128[]|null;
        };
        "tokenStats": {
            block_created: number|null;
            approx_txns_since_genesis: number;
        }
        "pruningState": TokenPruneStateDbo;
        "mintBatonUtxo": string;
        "mintBatonStatus": TokenBatonStatus;
        "lastUpdatedBlock": number;
        "schema_version": number;
        "nftParentId?": string;
    }
  • graphs = g

    • Purpose: This collection contains an item for each valid SLP transaction (GENESIS, MINT, or SEND).

    • Schema:

    {
        "tokenDetails": { 
            tokenIdHex: string 
        };
        "graphTxn": {
            txid: string;
            details: SlpTransactionDetailsDbo;
            outputs: GraphTxnOutputDbo[];
            inputs: GraphTxnInputDbo[];
            _blockHash: Buffer | null;
            _pruneHeight: number | null;
        };
    }
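Most consumers should query through SlpServe/bitquery, but the collections can also be read directly with the Node.js MongoDB driver. The sketch below assumes the mainnet database name slpdb from section 4.5 and assumes the single-letter aliases map to the full collection names tokens and graphs; the token ID is a placeholder:

// Sketch: read the tokens ("t") and graphs ("g") collections directly.
// Assumptions: database name "slpdb" (mainnet) and full collection names
// "tokens"/"graphs" behind the single-letter bitquery aliases.
import { MongoClient } from "mongodb";

async function main() {
  const client = await MongoClient.connect("mongodb://127.0.0.1:27017");
  const db = client.db("slpdb");

  const tokenIdHex = "<token id hex>";   // placeholder token ID

  const token = await db
    .collection("tokens")
    .findOne({ "tokenDetails.tokenIdHex": tokenIdHex });

  const graphTxns = await db
    .collection("graphs")
    .find({ "tokenDetails.tokenIdHex": tokenIdHex })
    .limit(10)
    .toArray();

  console.log(token?.tokenStats, graphTxns.length);
  await client.close();
}

main().catch(console.error);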

7. Test Harnesses

SLPDB is backed by three distinct test harnesses, they include (1) OP_RETURN message parsing unit tests and differential fuzzing, (2) graph input unit tests, and (3) end-to-end regression testing.

7.1. Parser Tests

SLPDB leverages the SLPJS npm library, which has been tested using differential fuzzing and passes all SLP message parser unit tests for Token Type 1 and NFT1. You can learn more about this testing at the following locations:

7.2. Input Tests

Graph validation typically requires checking that a transaction's valid inputs cover the outputs specified in the SLP OP_RETURN message. The SLPJS npm library also passes all unit tests covering the specified input requirements for Token Type 1 and NFT1, and you can learn more about these types of tests at the following location:

7.3. End-to-End Tests

A set of end-to-end tests has been created to ensure the expected behavior of SLPDB using the bitcoin regtest network. These tests simulate actual transaction activity on the regtest network, check for proper state in MongoDB, and check that ZMQ notifications are emitted. The tests directory contains the end-to-end tests, which can be run by following the instructions provided in the regtest directory.

slpdb's People

Contributors

blockparty-sh, damascene, dependabot[bot], imaginaryusername, jcramer, lpinca, notation


slpdb's Issues

slpMempoolIgnoreList is limited to 10,000 items

slpMempoolIgnoreList is limited to 10,000 items; with blocks larger than 2 MB this number may need to be larger. When this limit is exceeded, every non-SLP ZMQ transaction notification (prior to the block) will cause an RPC call to getRawTransaction(). To fix this, we will want to switch to using raw transaction hex notifications rather than txn hash notifications.

SLPDB/bit.ts

Line 148 in 894d147

if(this.slpMempoolIgnoreList.length > 10000)

James' To Do List:

  • Add query for fetching transaction history by address (see below)
  • Add query for fetching all token balances by Address (see below)
  • Add SLP transaction details to "confirmed" and "unconfirmed" collections
  • Add query for fetching transaction details by TXID
  • Add query for fetching a list of all valid tokens
  • null token details removal for send txs
  • Use ZMQ socket to publish real-time SLP events (new token, new mint, anything else?)
  • Make simple testnet experience
  • Add missing token stats
  • Complete documentation
  • create example queries like in bitplaylist (maybe merge these) and write a doc page describing the format and differences
  • slpsocketd - port from bitsocketd to work with the new token events
  • slpserve - port from sockserve

Refine status states

Currently, if a reorg happens the status would just say RUNNING. We can add more fine-grained statuses, like "REORG_CORRECTION_IN_PROGRESS".

Bitdb2.0: Accessing BCH balances and UTXOs

Is it possible to use this library for accessing BCH balances and UTXOs ?

I thought I read somewhere that SLPDB only syncs data starting at the block where SLP tokens were first created. If I just change this number, will I be able to use this lib in websockets with BitDB and bitcoin queries?

I can't get any of unwriter's bitdb 2.0 queries to work against slpdb, even though they work on his explorer.

slpdb stops syncing because of tx 3c7048e40d81239accf6b682420154cc792df7621674d764acf585e1332deb48

error log:

[ERROR] unhandledRejection Error: Graph item cannot have inputs less than outputs (txid: 3c7048e40d81239accf6b682420154cc792df7621674d764acf58
5e1332deb48, inputs: 0 | 0, outputs: 9992400000000 | 2).
    at SlpTokenGraph.<anonymous> (/var/viabtc/SLPDB/slptokengraph.js:476:27)
    at Generator.next (<anonymous>)
    at fulfilled (/var/viabtc/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:93:5)
[INFO] Block checkpoint retrieved:  643464 000000000000000001e4f0cf837432589bbfb1b551758d7901473f54e19d01cd
[INFO] JSON RPC: getMempoolInfo
[INFO] Sending telemetry update to status.slpdb.io for unknown-96400...
Error: Graph item cannot have inputs less than outputs (txid: 3c7048e40d81239accf6b682420154cc792df7621674d764acf585e1332deb48, inputs: 0 | 0,
 outputs: 9992400000000 | 2).
    at SlpTokenGraph.<anonymous> (/var/viabtc/SLPDB/slptokengraph.js:476:27)
    at Generator.next (<anonymous>)
    at fulfilled (/var/viabtc/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:93:5)
[INFO] Shutting down SLPDB... Sun Jul 12 2020 08:14:16 GMT+0800 (China Standard Time)

resolve bitcoin-rpc-promise gist dep

Currently I have a personal gist for 'bitcoin-rpc-promise': 'gist:7c752c62fdae80fc7e2125a6016cbf26'.

We should create a permanent home for this on npm or with a formal GitHub repo.

Account for token burns caused by spending in non-SLP transaction

This is a lower priority item.

The individual token UTXO status stored within the token graph object property _graphTxns does not properly label the outputs if they were involved in a token burn incident caused by spending in a non-SLP transaction. For complete token graph labeling accuracy, this should eventually be fixed.

However, the token graph object's tokenStats and tokenUtxos properties do account for these burned tokens. Hence why this is a low priority item.

Syncing to testnet: Received an invalid Bitcoin Cash address as input.

When trying to sync with testnet, my copy of SLPDB keeps crashing with this error:

[Query] getMintTransactions(a184e3715b124c6641da4eb0f2940f578c1f6a3525646b8d9a0b919b9b83898e)
STR =  [
  {
    "slp": null,
    "txid": "de66d78081b92a9913dcea43943ac2b82ec8962550fe62710e68d61ee2e03d05",
    "versionTypeHex": "01",
    "block": 1293084,
    "timestamp": "2019-03-16 15:08:28",
    "batonHex": "02",
    "quantityHex": "00000000000004d2"
  }
]

{ InvalidAddressError: Received an invalid Bitcoin Cash address as input.
    at new InvalidAddressError (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:518:15)
    at decodeAddress (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:177:9)
    at Object.toSlpAddress (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:133:17)
    at Function.Utils.toSlpAddress (/home/safeuser/SLPDB/node_modules/slpjs/lib/utils.js:37:24)
    at SlpTokenGraph.<anonymous> (/home/safeuser/SLPDB/SlpTokenGraph.js:296:46)
    at Generator.next (<anonymous>)
    at /home/safeuser/SLPDB/SlpTokenGraph.js:7:71
    at new Promise (<anonymous>)
    at __awaiter (/home/safeuser/SLPDB/SlpTokenGraph.js:3:12)
    at asyncForEach (/home/safeuser/SLPDB/SlpTokenGraph.js:285:77)
  name: 'InvalidAddressError',
  message: 'Received an invalid Bitcoin Cash address as input.',
  stack:
   'InvalidAddressError: Received an invalid Bitcoin Cash address as input.\n    at new InvalidAddressError (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:518:15)\n    at decodeAddress (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:177:9)\n    at Object.toSlpAddress (/home/safeuser/SLPDB/node_modules/bchaddrjs-slp/src/bchaddr.js:133:17)\n    at Function.Utils.toSlpAddress (/home/safeuser/SLPDB/node_modules/slpjs/lib/utils.js:37:24)\n    at SlpTokenGraph.<anonymous> (/home/safeuser/SLPDB/SlpTokenGraph.js:296:46)\n    at Generator.next (<anonymous>)\n    at /home/safeuser/SLPDB/SlpTokenGraph.js:7:71\n    at new Promise (<anonymous>)\n    at __awaiter (/home/safeuser/SLPDB/SlpTokenGraph.js:3:12)\n    at asyncForEach (/home/safeuser/SLPDB/SlpTokenGraph.js:285:77)' }
trout@p2pvps:~$ 

Update returned data type for properties in token and addresses collections from string to number

Add baton status to token stats

The token statistics should have a new boolean or enum field indicating the status of the baton. The field could be something like mintingBatonExists: boolean, or provide more detail on baton status with an enum.

Probably can just start with a boolean to keep things simple.

Use a logging system such as winston to simplify debugging

This way we can easily provide warnings, errors, or verbose output. Plus, easy configuration of log file storage. If a user doesn't want it, they could disable saving of it (such as if they already have their own logging system).

Broken version specific 'grpc' dependency in package.json for 'grpc-bchrpc-node'

Running npm install yields error

../deps/grpc/src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc:30:10: fatal error: address_sorting/address_sorting.h: No such file or directory
 #include <address_sorting/address_sorting.h>
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.

due to grpc-bchrpc-node being version locked to grpc@1.23.1, for which pre-built binaries are not available

node-pre-gyp WARN Tried to download(404): https://node-precompiled-binaries.grpc.io/grpc/v1.23.1/node-v79-linux-x64-glibc.tar.gz                                                                                                                            
node-pre-gyp WARN Pre-built binaries not found for [email protected] and [email protected] (node-v79 ABI, glibc) (falling back to source compile with node-gyp)

and compiling yields above error, a known issue that has been fixed in later releases:
grpc/grpc-node#952 (comment)

Please fix the dependency so npm can install the correct package version.

MISSING TXN INPUT

Overview

Syncing a new database from scratch, I've been keeping an eye on error messages. This Issue is intended to report what I've found to provide feedback as to possible areas of concern. Feel free to close this issue if the error message below is known and not worth worrying about.

System Info:

Mongo DB: v4.2.0
Node.js: v10.16.3
npm: v6.9.0
TypeScript: v3.5.3

Error message:

[INFO] JSON RPC: getBlockCount
SLPJS Validating: b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f
SLPJS Result: true (b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f)
[INFO] Valid txns 4351
[INFO] JSON RPC: getTxOut b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f 1 true
[INFO] JSON RPC: getTxOut b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f 2 true
SLPJS Validating: 3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22
SLPJS Result: true (3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22)
[INFO] Valid txns 4352
[INFO] JSON RPC: getTxOut 3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22 1 true
[INFO] JSON RPC: getTxOut 3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22 2 true
[INFO] JSON RPC: getTxOut 3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22 3 true
[Query] queryForTxoInputAsSlpSend(3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22,3)
STR =  [
  {
    "txid": "b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f",
    "block": 597769,
    "timestamp": "2019-08-28 15:28:05",
    "tokenid": "0df768b5485c72645de069b68f66d02205c26f827c608ef5ffa976266d753d50",
    "slp1": "0000000005f5e100",
    "slp2": "00000000e90ddd80",
    "slp3": null,
    "slp4": null,
    "slp5": null,
    "slp6": null,
    "slp7": null,
    "slp8": null,
    "slp9": null,
    "slp10": null,
    "slp11": null,
    "slp12": null,
    "slp13": null,
    "slp14": null,
    "slp15": null,
    "slp16": null,
    "slp17": null,
    "slp18": null,
    "slp19": null,
    "bch0": 0,
    "bch1": 2150,
    "bch2": 546,
    "bch3": 3958938,
    "bch4": null,
    "bch5": null,
    "bch6": null,
    "bch7": null,
    "bch8": null,
    "bch9": null,
    "bch10": null,
    "bch11": null,
    "bch12": null,
    "bch13": null,
    "bch14": null,
    "bch15": null,
    "bch16": null,
    "bch17": null,
    "bch18": null,
    "bch19": null
  }
]

SLPJS Validating: b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f
SLPJS Result: true (b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f)
[INFO] JSON RPC: getBlockCount
START f8b42574344527dd817d90e373aea2043125e56df5a7d8ca51c87f5699f22a70
START 3c4102978391486747ab3ed06f2d5c2dc509ff499eb294f9b4187febbd32ef22
FROM b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f
MISSING TXN INPUT

Behavior

After displaying the output above, SLPDB exited with no further information. I was able to restart the process and continue syncing.

SLPDB has tx which does not exist

SLPDB Hanging

This Issue was initially reported in the SLPDB Operators Telegram channel.

I've been seeing a 'hanging' behavior in SLPDB where it simply stops processing. It doesn't error, and it doesn't exit. I managed to reproduce this state on two separate machines overnight:

  • A dedicated machine running SLPDB in the docker container (the Mint box running Linux Mint).
  • In a Virtual Machine running on my dev box.

I had both syncing all night, both running off commit fe3f8df

Dev box output

...
[INFO] JSON RPC: getRawTransaction 3728841607b5559c2691ab24914f6c7be2ca3ddd207af5460070e1362384d8ca
SLPJS Result: false (3728841607b5559c2691ab24914f6c7be2ca3ddd207af5460070e1362384d8ca)
SLPJS Invalid Reason: Token outputs are greater than possible token inputs.
[INFO] Updating confirmed TNATxn SLP data for 279350c56bac24e422c0a1d625f1abd29ccafee3e10787443469105577fc7565
[INFO] JSON RPC: getRawTransaction 279350c56bac24e422c0a1d625f1abd29ccafee3e10787443469105577fc7565
[INFO] Updating confirmed TNATxn SLP data for 16a40eca8bf6a1d4b913820718db2361686a9371e4b4ad82998c0566cf7a3052
[INFO] JSON RPC: getRawTransaction 16a40eca8bf6a1d4b913820718db2361686a9371e4b4ad82998c0566cf7a3052
SLPJS Validating: 16a40eca8bf6a1d4b913820718db2361686a9371e4b4ad82998c0566cf7a3052
SLPJS Result: false (16a40eca8bf6a1d4b913820718db2361686a9371e4b4ad82998c0566cf7a3052)
SLPJS Invalid Reason: Token outputs are greater than possible token inputs.

Mint box output

...
[INFO] JSON RPC: getRawTransaction eda7665cc377faaad4a29182726a3ce6b7e06c6bdf735224eff7dff58b869290
SLPJS Validating: eda7665cc377faaad4a29182726a3ce6b7e06c6bdf735224eff7dff58b869290
SLPJS Result: true (eda7665cc377faaad4a29182726a3ce6b7e06c6bdf735224eff7dff58b869290)
SLPJS Result: true (83f62d17516c05e684ea742a12c41941b6d4dda8fe6cf54038ab3e7d263148f2)
SLPJS Result: true (0860168b6ce410800e5d0a12d96534bbdd0aad7f7e416523bb0304524740f4ad)
[INFO] Updating confirmed TNATxn SLP data for 1a07d6d3fb321fbb6c80b5b6ba26d86ec8bc1733edb05b5ffccd2e7b9fb9a3f9
[INFO] JSON RPC: getRawTransaction 1a07d6d3fb321fbb6c80b5b6ba26d86ec8bc1733edb05b5ffccd2e7b9fb9a3f9
[INFO] Updating confirmed TNATxn SLP data for 83f62d17516c05e684ea742a12c41941b6d4dda8fe6cf54038ab3e7d263148f2
[INFO] JSON RPC: getRawTransaction 83f62d17516c05e684ea742a12c41941b6d4dda8fe6cf54038ab3e7d263148f2
SLPJS Validating: 83f62d17516c05e684ea742a12c41941b6d4dda8fe6cf54038ab3e7d263148f2
SLPJS Result: true (83f62d17516c05e684ea742a12c41941b6d4dda8fe6cf54038ab3e7d263148f2)

It just looks like normal output, but there is no more console output. I sent some token transactions and these instances of SLPDB do not register it. They aren't registering any SLP token traffic or doing any further processing. They are just hanging. No errors. It didn't exit.

The RAM isn't very high. The processor is basically idle.

Both systems are running:

  • node.js v10+
  • Typescript v3.5.3
  • MongoDB 4+

The only similarity in the outputs that I'm seeing between the two hanging machines is that they both appear to have finished the same processing loop at the same point before hanging.

The SLPJS Result: is the last line for both. I'm assuming the SLPJS Invalid Reason: is unique to the particular token that was being processed at the time on the dev box (probably bread token).

Tx recorded as invalid when it isn't, but its children are recorded as valid

https://slpdb.fountainhead.cash/explorer/ewogICJ2IjogMywKICAicSI6IHsKICAgICJmaW5kIjogewogICAgICAiZ3JhcGhUeG4udHhpZCI6ICI0ZmUzNzFmM2NiYzViMjJmYjBjYzQ3OTA1M2MxYzgyNDEyZjFhYjNkNTVmYTA2MDRjZTAzNjMxYTg1Mjg5YTgxIgogICAgfSwKICAgICJsaW1pdCI6IDEwCiAgfQp9

{
"slpAmount": "100000",
"address": "simpleledger:qrs77l23pmclphnsjhd82dhxu2nvqssyluugg0jtn5",
"vout": 1,
"bchSatoshis": 546,
"spendTxid": "5f91233a5efc7b81f100d922b3967c82db27187bce22eb2f8a5df434d37dde77",
"status": "SPENT_INVALID_SLP",
"invalidReason": "Output burned in an invalid SLP transaction"
},

Looking up that txid with tx.h shows it's not in the slpdb confirmed collection either:

https://slpdb.fountainhead.cash/explorer/ewogICJ2IjogMywKICAicSI6IHsKICAgICJmaW5kIjogewogICAgICAidHguaCI6ICI1ZjkxMjMzYTVlZmM3YjgxZjEwMGQ5MjJiMzk2N2M4MmRiMjcxODdiY2UyMmViMmY4YTVkZjQzNGQzN2RkZTc3IgogICAgfSwKICAgICJsaW1pdCI6IDEwCiAgfQp9

Broken dependency with node-jq and big-jq

node-jq/bin/jq is not found
I had an issue trying to install one of the dependencies:
unwriter/bigjq#1

This might have been particular to my machine, but I found a solution.

Solution:
Solution:
In this lib, you can fix this by running npm install --save --unsafe-perm node-jq@1.5.0; npm install --save --unsafe-perm --no-bin-links

The package maintainer can fix this by adding "node-jq": "=1.5.0" to the package.json. I'm not sure why --no-bin-links is needed, but it was necessary to do the final install on my machine.

Inconsistent format in ZMQ publishing for mempool and blocks

A user pointed out that slp.outputs uses type of string in block notification and $numberDecimal in mempool notification.

Conversation:
in the "block" message the outputs are similar to this format:

    "outputs": [{
      "address": "slptest:xxx",
      "amount": "1000000"
    }]

while in the "mempool" message the format is:

  "outputs": [{
    "address": "slptest:xxx",
    "amount": {
      "$numberDecimal": "1000000"
    }
  }]

i.e., the "amount" is a number (as a string) in the first one, while it's an object with a property named "$numberDecimal" in the second one.

slpdb stops syncing at block number 643464

When I start the slpdb program, it stops with the log below:

SLPJS Result: true (3c7048e40d81239accf6b682420154cc792df7621674d764acf585e1332deb48)
[INFO] Unpruned txn count: 116 (token: c96d703453a3c66e03c395619f824b1d69d1ee2c1a858c898b409483c94041cb)
[ERROR] unhandledRejection Error: Cannot have a SEND or MINT transaction without any input.
    at SlpTokenGraph.<anonymous> (/root/bitcoin_cash/SLPDB/slptokengraph.js:414:27)
    at Generator.next (<anonymous>)
    at fulfilled (/root/bitcoin_cash/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:94:5)
[INFO] Block checkpoint retrieved: 643464 000000000000000001e4f0cf837432589bbfb1b551758d7901473f54e19d01cd
[INFO] JSON RPC: getMempoolInfo
[INFO] Sending telemetry update to status.slpdb.io for unknown-72059...
Error: Cannot have a SEND or MINT transaction without any input.
    at SlpTokenGraph.<anonymous> (/root/bitcoin_cash/SLPDB/slptokengraph.js:414:27)
    at Generator.next (<anonymous>)
    at fulfilled (/root/bitcoin_cash/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:94:5)
[INFO] Shutting down SLPDB... Sun Jul 12 2020 08:50:01 GMT+0800 (GMT+08:00)

I don't know why, so I deleted my slpdb directory and cloned the project again. When I restarted slpdb, it began syncing from block 543375..., so it will take almost 10 hours to catch up to the mainnet height.

Help me, thanks.

Feature: "blocks" collection

It would be cool to be able to have a blocks collection which stored some metadata such as:

  • height
  • blocksize
  • block version
  • previous block hash
  • merkle root
  • time
  • bits
  • difficulty
  • chainwork
  • nonce
  • coinbase input data
  • transaction count
  • total inputs
  • total outputs
  • fee total

Can we recover from MongoDB query timeouts?

Below are stack traces from the QA server I run to test releases of SLPDB. I intentionally run SLPDB on the machine with 2GB of RAM to see how it will respond in a less-than-optimal environment. I apply the same environment to each release I test, to look for changes.

SLPDB appears to consistently crash due to timeouts when trying to talk to MongoDB. The error is being caught appropriately in query.js, but then gets re-thrown, which ultimately causes MongoDB to crash and the Docker container to restart.

Is there some way the error could be handled? Like maybe with p-retry?

These errors were first seen in build commit 7675d9293ad43a6b7d017cbae8b56db76cf30e9c. I reverted the machine back to a known-good commit (42dc8b7cad3845629095ca72ae210af05e04227d), but saw the same errors. These crashes do not appear to be due to anything new that has been added to SLPDB.

Here are the stack traces:


    "[Thu, 12 Dec 2019 03:41:46 GMT] Error: _dbQuery failed with query {\"v\":3,\"q\":{\"db\":[\"c\"],\"find\":{\"out.s3\":\"MINT\",\"out.b4\":\"npnWPXX/PNynTJdnduUm0q/1ZUFggKh7pCf9gjA437c=\",\"out.b1\":\"U0xQAA==\"},\"sort\":{\"blk.i\":-1},\"limit\":1},\"r\":{\"f\":\"[ .[] | { block: (if .blk? then .blk.i else null end)} ]\"}} has response {\"errors\":[\"MongoNetworkError: connection 25 to 172.17.0.1:12301 timed out\"]}\n    at Function.<anonymous> (/home/safeuser/SLPDB/query.js:48:31)\n    at Generator.next (<anonymous>)\n    at fulfilled (/home/safeuser/SLPDB/query.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)",

    "[Thu, 12 Dec 2019 03:37:11 GMT] Error: _dbQuery failed with query {\"v\":3,\"q\":{\"db\":[\"c\"],\"find\":{\"out.s3\":\"MINT\",\"out.b4\":\"/QHYZHYbgHIdKJ9+fVD3VJz020FGZXRyVreDIEWJx04=\",\"out.b1\":\"U0xQAA==\"},\"sort\":{\"blk.i\":-1},\"limit\":1},\"r\":{\"f\":\"[ .[] | { block: (if .blk? then .blk.i else null end)} ]\"}} has response {\"errors\":[\"MongoNetworkError: connection 26 to 172.17.0.1:12301 timed out\"]}\n    at Function.<anonymous> (/home/safeuser/SLPDB/query.js:48:31)\n    at Generator.next (<anonymous>)\n    at fulfilled (/home/safeuser/SLPDB/query.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)\n    at runNextTicks (internal/process/task_queues.js:62:3)\n    at processTimers (internal/timers.js:472:9)",

    "[Thu, 12 Dec 2019 03:37:11 GMT] Error: _dbQuery failed with query {\"v\":3,\"q\":{\"db\":[\"c\"],\"find\":{\"out.s3\":\"SEND\",\"out.b4\":\"SDIm1m4tbASaHGXwUjzWyMiVxqYTrW6k7QrM80AlrgU=\",\"out.b1\":\"U0xQAA==\"},\"sort\":{\"blk.i\":-1},\"limit\":1},\"r\":{\"f\":\"[ .[] | { block: (if .blk? then .blk.i else null end)} ]\"}} has response {\"errors\":[\"MongoNetworkError: connection 25 to 172.17.0.1:12301 timed out\"]}\n    at Function.<anonymous> (/home/safeuser/SLPDB/query.js:48:31)\n    at Generator.next (<anonymous>)\n    at fulfilled (/home/safeuser/SLPDB/query.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)",

    "[Thu, 12 Dec 2019 03:30:23 GMT] Error: _dbQuery failed with query {\"v\":3,\"q\":{\"db\":[\"c\"],\"find\":{\"out.s3\":\"SEND\",\"out.b4\":\"vGHUkpc09vBJwoO/ya+j8ra+HuNuDgrEh/OfYkWetIE=\",\"out.b1\":\"U0xQAA==\"},\"sort\":{\"blk.i\":-1},\"limit\":1},\"r\":{\"f\":\"[ .[] | { block: (if .blk? then .blk.i else null end)} ]\"}} has response {\"errors\":[\"MongoNetworkError: connection 24 to 172.17.0.1:12301 timed out\"]}\n    at Function.<anonymous> (/home/safeuser/SLPDB/query.js:48:31)\n    at Generator.next (<anonymous>)\n    at fulfilled (/home/safeuser/SLPDB/query.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)\n    at runNextTicks (internal/process/task_queues.js:62:3)\n    at processTimers (internal/timers.js:472:9)",

    "[Thu, 12 Dec 2019 03:25:11 GMT] Error: _dbQuery failed with query {\"v\":3,\"q\":{\"db\":[\"c\"],\"find\":{\"out.s3\":\"MINT\",\"out.b4\":\"a0bKzzca2B+7QjHX4/XmHnHQ9JYjldKCRveiJ1sVtCk=\",\"out.b1\":\"U0xQAA==\"},\"sort\":{\"blk.i\":-1},\"limit\":1},\"r\":{\"f\":\"[ .[] | { block: (if .blk? then .blk.i else null end)} ]\"}} has response {\"errors\":[\"MongoNetworkError: connection 24 to 172.17.0.1:12301 timed out\"]}\n    at Function.<anonymous> (/home/safeuser/SLPDB/query.js:48:31)\n    at Generator.next (<anonymous>)\n    at fulfilled (/home/safeuser/SLPDB/query.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:93:5)\n    at runNextTicks (internal/process/task_queues.js:62:3)\n    at processTimers (internal/timers.js:472:9)"

have mint transactions include the tokenDetails from genesis

unit tests using regtest network

  • make SLPDB code changes to allow regtest network to function with SLPDB
  • document general process for adding new tests
  • double spends: create regtest code for creating the transactions
  • double spends: check the expected state of all collections
  • block reorgs: create regtest code for making reorg
  • block reorgs: check the expected state of all collections

slpdb exit due to memory and bug

slpdb version is

commit b362fd2a803ac42e72b538c0939f89003a24a394
Author: James Cramer <[email protected]>
Date:   Sat Feb 15 11:00:15 2020 -0500

    clarify realtime options

My slpdb exit log is below.
It may be because of a memory leak and an error getting a tx's vin (a previous tx in the same block? or the Bitcoin Cash RPC returning an error?).

grep -B 10 "Shutting down SLPDB" -r slpdbnew.log*
slpdbnew.log-[INFO] JSON RPC: getBlockInfo/getBlockHash 622741
slpdbnew.log-[INFO] JSON RPC: getBlockInfo/getBlockHeader 0000000000000000017d4bf6850e8b1043ba507f8c4550ebb3035ea4f2ba3e6c true
slpdbnew.log-[INFO] Sending telemetry update to status.slpdb.io for unknown-91237...
slpdbnew.log-[INFO] Crawling block 622741 hash: 0000000000000000017d4bf6850e8b1043ba507f8c4550ebb3035ea4f2ba3e6c
slpdbnew.log-Error: Graph item cannot have inputs less than outputs (txid: bf5b67d8efa1ef145b4781236933bec2dcb51d9edf676da5798b2f776114001d, inputs: 0 | 0, outputs: 200191563 | 2).
slpdbnew.log-    at SlpTokenGraph.<anonymous> (/var/viabtc/SLPDBNEW/slptokengraph.js:508:27)
slpdbnew.log-    at Generator.next (<anonymous>)
slpdbnew.log-    at fulfilled (/var/viabtc/SLPDBNEW/slptokengraph.js:5:58)
slpdbnew.log-    at runMicrotasks (<anonymous>)
slpdbnew.log-    at processTicksAndRejections (internal/process/task_queues.js:93:5)
slpdbnew.log:[INFO] Shutting down SLPDB... Tue Feb 18 2020 07:12:44 GMT+0800 (China Standard Time)
--
slpdbnew.log.1-[DEBUG] graphItemsUpsert - inserted: 6bf8a1f48b2e776a3ed09b51b08f39ac9be84608b84750b6881c8f7f44bb72cd
slpdbnew.log.1-[DEBUG] graphItemsUpsert - inserted: 7ec6b90df4eba41b44a62ed39ece95f101483f1b6389b78d081e37cddceca3aa
slpdbnew.log.1-[DEBUG] graphItemsUpsert - inserted: 8b8107d2f669d19b4435508e407380e5bb323b6d9debd518555db6801c39c67b
slpdbnew.log.1-[INFO] Sending telemetry update to status.slpdb.io for unknown-91237...
slpdbnew.log.1-Error: _dbQuery failed with query {"v":3,"q":{"db":"c","find":{"blk.h":"000000000000000000fc1dcbf5d3dc73a2fe76ce6658ec979f386ca742ba2e3d"},"limit":10000000},"r":{"f":"[ .[] | { txid: .tx.h, timestamp: (if .blk? then (.blk.t | strftime(\"%Y-%m-%d %H:%M:%S\")) else null end), slp: .slp } ]"}} has response {"errors":["Error: spawn ENOMEM"]}
slpdbnew.log.1-    at Function.<anonymous> (/var/viabtc/SLPDBNEW/query.js:50:31)
slpdbnew.log.1-    at Generator.next (<anonymous>)
slpdbnew.log.1-    at fulfilled (/var/viabtc/SLPDBNEW/query.js:5:58)
slpdbnew.log.1-    at runMicrotasks (<anonymous>)
slpdbnew.log.1-    at processTicksAndRejections (internal/process/task_queues.js:93:5)
slpdbnew.log.1:[INFO] Shutting down SLPDB... Tue Feb 18 2020 04:06:43 GMT+0800 (China Standard Time)

Cannot read property 'outputs' of undefined

This issue is related to testing out a patch for Issue #40.

SLPDB exited with the following output:

SLPJS Validating: ed1edc9dca2b6573d85d9bfc944f26b7c9e8d256e8f39552c7dea8ec953ea287
SLPJS Result: true (ed1edc9dca2b6573d85d9bfc944f26b7c9e8d256e8f39552c7dea8ec953ea287)
[INFO] JSON RPC: getTxOut f8b42574344527dd817d90e373aea2043125e56df5a7d8ca51c87f5699f22a70 2 true
[Query] queryForTxoInputAsSlpSend(f8b42574344527dd817d90e373aea2043125e56df5a7d8ca51c87f5699f22a70,2)
STR =  [
  {
    "txid": "b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f",
    "block": 597769,
    "timestamp": "2019-08-28 15:28:05",
    "tokenid": "0df768b5485c72645de069b68f66d02205c26f827c608ef5ffa976266d753d50",
    "slp1": "0000000005f5e100",
    "slp2": "00000000e90ddd80",
    "slp3": null,
    "slp4": null,
    "slp5": null,
    "slp6": null,
    "slp7": null,
    "slp8": null,
    "slp9": null,
    "slp10": null,
    "slp11": null,
    "slp12": null,
    "slp13": null,
    "slp14": null,
    "slp15": null,
    "slp16": null,
    "slp17": null,
    "slp18": null,
    "slp19": null,
    "bch0": 0,
    "bch1": 2150,
    "bch2": 546,
    "bch3": 3958938,
    "bch4": null,
    "bch5": null,
    "bch6": null,
    "bch7": null,
    "bch8": null,
    "bch9": null,
    "bch10": null,
    "bch11": null,
    "bch12": null,
    "bch13": null,
    "bch14": null,
    "bch15": null,
    "bch16": null,
    "bch17": null,
    "bch18": null,
    "bch19": null
  }
]

SLPJS Validating: b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f
SLPJS Result: true (b53aa58e93fbc68bdcba65a2c24dd55e99ccf48454b9fbaa962bb937797ebc9f)
TypeError: Cannot read property 'outputs' of undefined
    at SlpTokenGraph.<anonymous> (/home/trout/work/slpdb/slpdb/SlpTokenGraph.js:383:32)
    at Generator.next (<anonymous>)
    at fulfilled (/home/trout/work/slpdb/slpdb/SlpTokenGraph.js:4:58)
    at process._tickCallback (internal/process/next_tick.js:68:7)

I was running this commit to the master branch:
083f462

Loops over warning when restarted

SLPDB loops over the following warning when restarted:

[INFO] JSON RPC: getBlockchainInfo
[WARN] bitcoind sync status did not change, check your bitcoind network connection.

Certain txn causes sync to hang

6f5f9a096a9824c2a031ddc6ed8281cb0065340ba64111077dbd1e753f4d9558 causes SLPDB to hang on the initial sync. Need to look into this further.

Simply stop / restart SLPDB to keep going.

Send queries to SLPDB example.

Hi Everyone,

I am quite new to blockchain, jq and slpdb. From the documentation, I understand that for a local installation of SLPDB, we can connect to the API endpoint at http://localhost:3000/q.

However, I do not understand how a query can be sent to the endpoint to get results from this local installation. Right now I have slpdb, slpserve and slpsockserve installed and running on my machine. Can you please let me know how I can send a query of the format, say,
{ "v": 3, "q": { "db": ["t"], "find": { "$query": { "tokenDetails.tokenIdHex": "959a6818cba5af8aba391d3f7649f5f6a5ceb6cdcd2c2a3dcb5d2fbfc4b08e98" } }, "project": {"tokenDetails": 1, "tokenStats": 1, "_id": 0 }, "limit": 1000 } }
to the endpoint?

Please let me know if you need any other information. I apologize for posting this as an issue as I feel this is mostly my ignorance.

utxos collection should be able to be searched by txid only

If, for instance, one were to create an SLP splitter, it could be useful to see how many utxos from a prior tx are remaining. This currently has to be done by doing something like:

$in: [
'txid:vout1',
'txid:vout2',
'txid:vout3',
...
]

I think there are two options possible for this:

  • txid and vout could be added; this keeps backwards compatibility at the expense of extra disk usage and bigger documents. In a future version utxo could be deprecated.
  • utxo could be full-text indexed; this would probably slow down Mongo processing and make people use regex to search.

On reorg event, delete graph collection items

In the event of a blockchain reorg, the graph collection needs to be rebuilt.

The process goes as follows:

  1. Full node block hash for the current block does not match the current SLPDB tip.
  2. Update status to "REORG_CORRECTION_IN_PROGRESS"
  3. SLPDB deletes all collection items related to the current SLPDB tip
  4. Rollback SLPDB tip.
  5. Fetch previous block header from the full node, this becomes "current block"
  6. Check previous block hash of "current block" with current SLPDB tip
  7. Repeat 1-6 until SLPDB tip is equal to the previous block hash of the current block.

Duplicate records in graphs

Somehow the graphTxn collection contains duplicate records. This shouldn't be happening since each token graph is of type Map<txid, graphTxn>. It may have something to do with the record deletion process being slow in mongodb.

Either way it needs to be handled. @blockparty-sh has recommended utilizing Mongo transactions to guarantee the order of db operations as a possible solution.

Note: It was observed that the duplicate records count increased when restarting SLPDB.

Previous outputs statuses changed incorrectly after being inputs into a burn transaction

A user reported a bug related to improper output status being set after a burn transaction occurs.

An actual token burn occurred in this transaction, caused by "outputs > inputs", 339fc548bd2496dfa8247c7e7357bae3733ed4f3c10f9ef86135fc1d8dc04b4d.

Once this happened SLPDB improperly updated the input transaction status to MISSING_BCH_VOUT in 6ae170ac91ffc54fd47e470bca770b57235f2d359eaceb69b03b3d71235822c0:1 and fb0d969340e46b4be418f7741804a51ec8a2fefb4501aae5f3023279a35244f2:2. Instead, the status should have been set as SPENT_INVALID_SLP.

Note: Both status types: BATON_SPENT_INVALID_SLP and SPENT_INVALID_SLP are commented out in code, so these should be used to fix this problem.

I believe the code needs to be improved to handle this edge case for the previous transaction (inputs) that are MINT transactions at this location:
https://github.com/simpleledger/SLPDB/blob/master/slptokengraph.ts#L292

Unable to sync from scratch anymore

It seems commit deb8f20 introduced a regression as I'm unable to sync a fresh SLPDB instance anymore (I did so successfully a few weeks ago).

https://github.com/simpleledger/SLPDB/blame/deb8f203299fe44a63d6c880b20da2dc1bed9503/slptokengraph.ts#L440

[INFO] Unpruned txn count: 101 (token: 9cc1cf24e502554d2d3d09918c27decda2c260762961acd469c5473fbcfe192e)
[INFO] addGraphTransaction: update the status of the input txns' outputs
[INFO] Used bit._spentTxoCache spend data 9574e0c0978cb1228f9ede38eda7497aa90be8f5460ffb66fb98b3462147fcab 3
SLPJS Validating: 74bc94b431096e392952ed4cdff1611f67675b253e86680d20a25874418106de
SLPJS Result: true (74bc94b431096e392952ed4cdff1611f67675b253e86680d20a25874418106de)
[ERROR] unhandledRejection Error: Cannot have a SEND or MINT transaction without any input.
    at SlpTokenGraph.<anonymous> (/var/lib/slpdb/SLPDB/slptokengraph.js:414:27)
    at Generator.next (<anonymous>)
    at fulfilled (/var/lib/slpdb/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
[INFO] Block checkpoint retrieved:  574745 00000000000000000531437b7d87b6de4c9321ec2745995b834923814006938e
[INFO] Sending telemetry update to status.slpdb.io for xxx...
Error: Cannot have a SEND or MINT transaction without any input.
    at SlpTokenGraph.<anonymous> (/var/lib/slpdb/SLPDB/slptokengraph.js:414:27)
    at Generator.next (<anonymous>)
    at fulfilled (/var/lib/slpdb/SLPDB/slptokengraph.js:5:58)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
[INFO] Shutting down SLPDB... Thu Jul 09 2020 08:06:11 GMT+0000 (Coordinated Universal Time)

BTW it would be nice to have commit messages a bit less vague than just "updates" ;)
