
Lisk Service


Lisk Service is a web application middleware that allows interaction with various blockchain networks based on the Lisk protocol.

The main focus of Lisk Service is to provide data to UI clients such as Lisk Desktop and Lisk Mobile. It allows accessing live blockchain data similarly to the regular Lisk SDK API, albeit with more comprehensive features. Furthermore, Lisk Service provides additional endpoints with more detailed information, such as geolocation and network usage statistics.

The project is implemented as a set of microservices, each responsible for a particular functionality. The data is served in JSON format and exposed by a public RESTful API or a WebSocket-based RPC API.

Available Services

Lisk Service comprises multiple microservices that can operate independently of each other. The Gateway is required to expose the APIs provided by the specific services.

Every microservice is independently managed and placed in a separate directory under the services directory. Each contains its own package.json and Dockerfile, which are used to run the application.

Service Description
Gateway The Gateway exposes the API for Lisk Service users to access and use over HTTP and WS protocols. Its main purpose is to proxy API requests from users to the concerned Lisk Service microservices. It provides the users with a central point of data access that ensures existing application compatibility.
Connector The Blockchain Connector connects with the node running a Lisk protocol-compliant blockchain application. It is primarily responsible for data transformation and caching, thus reducing the number of calls made to the node.
Coordinator The Blockchain Coordinator service is primarily responsible for ensuring the completeness of the index. It performs periodic checks for any gaps in the index and schedules tasks to update it, along with the latest block updates.
Indexer The Blockchain Indexer service, in the indexing mode, is primarily responsible for updating the index based on the jobs scheduled by the Blockchain Coordinator. In the data service mode, it serves user request queries made via the RESTful API or WebSocket-based RPC calls. It can run both modes simultaneously, which is enabled by default.
App Registry The Blockchain Application Registry service is primarily responsible for regularly synchronizing and providing off-chain metadata information for known blockchain applications in the Lisk ecosystem. The metadata is maintained in the Lisk Application Registry repository.
Fee Estimator The Fee Estimator service implements the dynamic fee system algorithm to offer users transaction fee recommendations based on the network traffic.
Transaction Statistics The Transaction Statistics service, as the name suggests, is primarily responsible for computing various transaction statistics to offer users real-time network insights.
Market The Market service allows price data retrieval. It supports multiple sources to keep the current Lisk token price up-to-date and available to the clients in real time.
Export The Export service enables users to download the transaction history as a CSV file for any given account on the blockchain.
Template The Template service is an abstract microservice from which all Lisk Service services are inherited. It allows all services to share a similar interface and design pattern. Its purpose is to reduce code duplication and increase consistency between each service, hence, simplifying code maintenance and testing.

Remarks

  • By default, Lisk Service attempts to connect to a local node via WebSocket on port 7887 or via IPC on ~/.lisk/lisk-core.
  • The default installation method is based on Docker.
  • Some token conversion rates in the Market service require dedicated API keys.
  • For the events information to be always available in the API, please set system.keepEventsForHeights: -1 in the Lisk application node config.
  • It is highly recommended to NOT enable any plugins on the Lisk application node when running Lisk Service against it. Enabling them can cause performance issues in Lisk Service.

Architecture Diagram

Inter-microservice communications are enabled with a message broker, typically an instance of Redis or NATS.

Lisk Service Architecture

API documentation

The Gateway service provides the following APIs, which all users of Lisk Service can access and use.

API Description
HTTP API HTTP API is the public RESTful API that provides blockchain data in standardized JSON format.
WebSocket JSON-RPC API The WebSocket-based JSON-RPC API provides blockchain data in standardized JSON format. The API uses the Socket.IO library and is compatible with JSON-RPC 2.0 standards.
Subscribe API The Subscribe API is an event-driven API. It uses a two-way streaming connection, which can notify the client about new data instantly as it arrives. It is responsible for updating users regarding changes in the blockchain network and markets.

Installation

The default port for REST API requests and Socket.IO-based communication is 9901. The API is accessible through the URL http://127.0.0.1:9901 when running locally. The REST API is accessible via HTTP clients such as Postman, cURL and HTTPie.

WebSocket-based APIs can be used through the Socket.IO library available for many modern programming languages and frameworks.

To continue the installation, ensure that you have the following dependencies installed:

Follow the instructions listed below to acquire detailed information regarding the installation of the required dependencies for various operating systems.

Retrieve the latest release from the official repository.

Unpack the source code archive by executing the following commands:

tar -xf lisk-service-x.y.z.tar.gz
cd lisk-service

Although the above commands retrieve the entire source code, this instruction does not cover building a custom version of Lisk Service. For more information refer to this document: Building Lisk Service from source

Docker image build (Optional)

If you wish to build a local version of Lisk Service, execute the following command:

make build-images

This step is only necessary if you wish to build a custom or pre-release version of Lisk Service that does not have a pre-built Docker image published on Docker Hub. The installation script uses the latest available stable version from Docker Hub unless a locally built image is present. If you are unsure about any local builds, use the make clean command to remove all locally built Docker images.

System requirements

The following system requirements are recommended to start Lisk Service:

Memory

  • Machines with a minimum of 16 GB RAM for the Mainnet.
  • Machines with a minimum of 16 GB RAM for the Testnet.

Storage

  • Machines with a minimum of 40 GB HDD.

Configuration

The default configuration is sufficient to run Lisk Service against the local node.

Before running the application, copy the default docker-compose environment file:

cp docker/example.env .env

In the next step, set the required environment variables.

$EDITOR .env

The example below assumes that the Lisk Core (or any Lisk protocol-compliant blockchain application) node is running on the host machine, and not inside a Docker container.

## Required
# The local Lisk Core node WebSocket API port
export LISK_APP_WS="ws://host.docker.internal:7887"

When running a node inside a Docker container, the variable needs to refer to the container: LISK_APP_WS="ws://<your_docker_container>:7887".

Configuration options are described in this document.

Optional: Check your configuration with the command make print-config

Management

To run the application execute the following command:

make up

To stop the application execute the following command:

make down

Optional: It is possible to use regular docker-compose commands such as docker-compose up -d. Please check the Makefile for more examples.

Benchmark

Assuming lisk-service is running on 127.0.0.1:9901 and you are in the root of this repo, you can run the following:

cd tests
LISK_SERVICE_URL=http://127.0.0.1:9901 yarn run benchmark

Further development

Customizing and building Lisk Service from a local source is described in the document Building Lisk Service from source. It may also be useful for PM2-based installations.

Contributors

https://github.com/LiskHQ/lisk-service/graphs/contributors

License

Copyright 2016-2024 Lisk Foundation

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.


lisk-service's Issues

Prepare 0.1.0 for production

The list of tasks to deploy the open-source Lisk Service into production:

  • Additional input field for custom Lisk Service URL in Lisk Desktop
  • Deploy Lisk Cloud to cloud.lisk.io
  • Deploy Lisk Service mainnet to mainnet-service.lisk.io and service.lisk.io
  • Deploy Lisk Service testnet to testnet-service.lisk.io
  • Deploy Lisk Service betanet to betanet-service.lisk.io
  • Redirect service.lisk.io/api/v1/market to cloud.lisk.io

Tasks

Sprint 8

  • Enable CI/CD #5
  • Release first alpha version #60
  • Update README file #64

Sprint 9 & 10

  • Peer list cache #42

Sprint 10

  • Migrate core service to framework #41
  • Update integration tests #59
  • Keep compatibility layer for Lisk Desktop #88

Sprint 11

  • Update README documentation #68
  • Update dependencies #61
  • Auto-generated API documentation #62
  • Check compatibility with Lisk Core v3 #95
  • Fix SDKv3 compatibility for accounts #97
  • Relax JSON-RPC API envelope format #102
  • Update status endpoint #103

Update README documentation

The README documentation needs the following updates:

  • The PM2-specific run-time options reference is missing - update the ecosystem.* filenames
  • REDIS_PORT no longer exists; it was replaced by SERVICE_BROKER and SERVICE_CORE_REDIS, which point to two different Redis databases
  • The note in the Postgres section of the Ubuntu prerequisites is identical to the Redis note; it should refer to Postgres instead
  • The runtime reference Postgres URL needs an update

Migrate core service to framework

In order to keep the dependencies minimal, the core service needs to be migrated to the framework.

During the code review a lot of redundant code across all modules was found. Most of the code shares similar logic and provides similar functionality. Migrating various Core imports to the framework project reduces the amount of code and helps with standardisation of the framework API.

Replace request library with axios

The request library is going to be deprecated soon.

  • Replace it with axios
  • Provide a unified HTTP request library with a library-agnostic API across the whole project

Change API to support Lisk SDK v4 API

Make Lisk Service compatible with the SDK v4. Address all upstream API changes made by the SDK maintainers, including new features. Create the persistent store for frequently used data such as blocks and transactions.

Tasks

Sprint 14

  • Migrate transaction statistics to embedded database #131
  • Data transformer: empty object not removed #143

Sprint 14 & 15

  • Block finality status #39
  • Add persistent block data storage #132
  • Migrate accounts to blockchain data storage #136
  • Migrate transactions to blockchain data storage #137

Sprint 15

  • Migrate delegate cache to data store #134
  • Migrate peer cache to blockchain data store #135

Sprint 16

  • Initiate SDKv4 support #133
  • Implement SDKv4 multisignature API #142
  • Implement blockchain data storage data retention #138
  • Keep list of pending transactions up-to-date #140
  • SDK version 4 delegate support #179
  • Delegate username resolver #181
  • Support for sent votes #188
  • Support for received voters #189
  • Fix RPC calls registration issue #193
  • Round forgers support #182

Sprint 17

  • Evaluate SDKv4 compatibility of Lisk Service #141
  • Make integration tests schema-based #150
  • Move event triggers to SDK compatibility layer #186
  • Implement storage data retention for accounts #157

Implement fee estimation service

Description

Provide user interfaces with a fee estimate service that conforms with LIP-0016

This LIP proposes a fee estimation algorithm that suggests a fee depending on the priority of the transaction for the user and the blockchain's recent history. The algorithm is based on an Exponential Moving Average (EMA). It is implemented in a node of the Lisk network, and the wallet queries its output to get the transaction fee estimation for the user.

The protocol will define a minimum fee for every transaction. It will be up to each user to set the fee for each transaction, but any transaction received with a fee below that minimum fee will be considered invalid by the protocol.
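
The minimum-fee rule above can be sketched as follows; the per-byte constant and the function name are illustrative assumptions, not protocol values:

```javascript
// Sketch of the minimum-fee validity rule described in LIP-0016.
// MIN_FEE_PER_BYTE is an assumed illustrative constant, not the protocol value.
const MIN_FEE_PER_BYTE = 1000n; // beddows per byte (assumption)

// A transaction whose fee is below size * minFeePerByte is considered invalid.
function isFeeValid(txSizeInBytes, feeInBeddows) {
  const minFee = MIN_FEE_PER_BYTE * BigInt(txSizeInBytes);
  return feeInBeddows >= minFee;
}
```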

Tasks

Sprint 12

  • Implement fee estimation algorithm #35
  • Implement estimated moving average algorithm #106
  • HTTP endpoint for fee estimation algorithm #36
  • JSON-RPC endpoint for fee estimation algorithm #37
  • Artificial block generator #112

Sprint 13

  • Quality assurance of dynamic fee algorithm #105

Peer list cache

Description

In this scenario, the peer list is updated by the core-component and stored in the Redis database. Modify the core-component to write to Redis and configure the gateway-component to make use of that data.

Note that the data has to be collected in several requests, as the Lisk Core node does not maintain a complete list of peers.

Motivation

  • Retrieving the peer data takes too long to perform exclusively during the client request
  • Data collected in several requests cannot be passed directly to the client and needs to be processed
  • Geolocation data cannot be retrieved instantly

Acceptance Criteria

  • The Core Service collects peer data and keeps it for 5 minutes
  • The peer list is retrieved from Lisk Core at a 30-second interval
  • To reduce response time, the controller serves only cached data
  • Geolocation data is refreshed in the background; requests use cached data only
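
The acceptance criteria above amount to a TTL cache that is refreshed in the background. A minimal in-memory sketch (a plain object stands in for the Redis store used by Lisk Service):

```javascript
// In-memory stand-in for the Redis-backed peer cache described above.
// In Lisk Service the store is Redis; the object here only illustrates the pattern.
class PeerCache {
  constructor(ttlMs = 5 * 60 * 1000) { // keep peer data for 5 minutes
    this.ttlMs = ttlMs;
    this.entry = null;
  }

  // Called by the background refresh job (e.g. every 30 seconds).
  update(peers) {
    this.entry = { peers, storedAt: Date.now() };
  }

  // The controller serves cached data only; it never hits the node directly.
  get() {
    if (!this.entry || Date.now() - this.entry.storedAt > this.ttlMs) {
      return []; // expired or empty: wait for the next background refresh
    }
    return this.entry.peers;
  }
}
```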

Support transition to open source - Quality assurance

The list of tasks to support the transition to open source - Code refactoring:

  • Move existing functional tests to the new repository
  • Update broken tests
  • Add custom npm commands to run tests
  • Update Jest library
  • Update Jest configuration
  • Add missing tests

Update npm dependencies

Description

Update dependencies with critical and high security issues.

Motivation

  • The release version cannot have any dependency-based security issues

Acceptance Criteria

  • npm audit shows no critical and high security issues

Support transition to open source

The list of tasks to support the transition to open source:

  • Run copied code
  • Migrate to Node 12.x
  • Update dependencies to latest versions
  • Add support for the controllers based on the framework and Moleculer
  • Add Gateway based on Moleculer for HTTP
  • Add Gateway based on Moleculer for JSON-RPC
  • Add Mapper and copy mapping files from Lisk Cloud Gateway
  • Add PM2 configuration
  • Add Dockerfile

The list of tasks to support the transition to open source - Quality Assurance:

  • Move existing functional tests to the new repository
  • Update broken tests
  • Add custom npm commands to run tests
  • Update Jest library
  • Update Jest configuration
  • Add missing tests

Tasks

Sprint 1

  • Migrate to Node 12.x #10
  • Update project dependencies #11

Sprint 2

  • Add HTTP API #12
  • Migrate core service to framework #15

Sprint 3-4

  • Add PM2 configuration #14

Sprint 5

  • Add Mapper and copy mapping files from Lisk Cloud Gateway #25
  • Add Docker file #26
  • Project-wide ES linting #48
  • Develop integration test to support open source #51

Sprint 6-7

  • Develop missing functional test to support open source #52
  • Add JSON-RPC API #13

Sprint 7

  • Auto-pagination for Lisk Core endpoints #46

Sprint 11

  • JSON-RPC multi-request support #87

Update delegate tests

Description

Make sure that delegate-related API endpoints work properly.

Motivation

Making sure that delegate API endpoints are working is essential to the future Lisk Service release.

Acceptance Criteria

  • Tests are enabled
  • Any issues with the API are registered in GitHub

Auto-pagination for Lisk Core endpoints

Description

Some operations related to Lisk Core need to collect all information in one request.

Motivation

  • Most Lisk Core HTTP API endpoints - such as the peer list - limit the response to 100 items per request.
  • Retrieving the required information from these endpoints within their original limits is difficult.

Acceptance Criteria

  • The shared request library already supports developer API requests
  • The implementation should be based on the HTTP API from service-framework
  • Make sure that the Lisk Core API endpoints work with auto-pagination
  • Make sure that a limit is implemented (default: 1000)
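
The auto-pagination described above can be sketched as a loop that keeps requesting the next page until the endpoint is exhausted or the overall limit is reached. `fetchPage` is a hypothetical stand-in for an HTTP call to a Lisk Core endpoint that returns at most `pageSize` items:

```javascript
// Sketch of auto-pagination over a 100-item-limited endpoint.
// `fetchPage` is a hypothetical function, not part of the actual codebase.
async function fetchAll(fetchPage, { pageSize = 100, limit = 1000 } = {}) {
  const results = [];
  let offset = 0;
  while (results.length < limit) {
    const page = await fetchPage({ offset, limit: pageSize });
    results.push(...page);
    if (page.length < pageSize) break; // last page reached
    offset += pageSize;
  }
  return results.slice(0, limit); // enforce the overall default limit of 1000
}
```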

Additional Information

Project-wide ES linting

Description

Add ES linting to the project.

Motivation

Keep high coding standard across all Lisk projects.

Acceptance Criteria

  • Project linting works properly
    • npm script command
    • VS Code

Block finality status

Description

The Service Blockchain API returns the number of confirmations but does not associate it with any information on whether the block has reached finality.

This was previously up to the UI client based on the number of confirmations - see this snippet.

Make sure that there is a reliable way to return that information from the API.

Motivation

With the implementation of the BFT protocol in the core product, the way a block is considered final has changed. This means that the arbitrarily chosen number of confirmations (101) is essentially incorrect and can be misleading.

Acceptance Criteria

  • Lisk Service API returns a transaction status based on the information provided by the research team.

Additional Information

Update integration tests

Description

The integration tests need to be updated.
Tests need to be run against a real blockchain.

Motivation

  • There is a need for proper tests against a real blockchain

Acceptance Criteria

  • Test data reflect the testnet environment
  • Make sure that all tests are working
  • Update JSON-RPC tests

Fix broken peer API

Actual behavior

The peers API returns no data.

Expected behavior

Make sure that the API endpoint returns correct data.

Steps to reproduce

Call /api/v1/peers

HTTP endpoint for fee estimation algorithm

Description

Implement the HTTP endpoint for the fee estimation algorithm.

/api/v1/fee_estimates

Motivation

  • The UI clients require an endpoint to retrieve data with fee estimations

Acceptance Criteria

  • /api/v1/fee_estimates returns the following structure:
{
  "data": {
    "feeEstimatePerByte": {
      "low": "<FEE_IN_BEDDOWS>",
      "medium": "<FEE_IN_BEDDOWS>",
      "high": "<FEE_IN_BEDDOWS>"
    }
  },
  "meta": {
    "updated": "<ISO_DATETIME>",
    "blockHeight": "<LAST_BLOCK_HEIGHT_USED>"
  },
  "links": {}
}

Additional Information

Make core service standalone

The list of tasks to make the core service standalone:

  • Copy the microservice framework based on Lisk Cloud to the Lisk Service repository
  • Add tests to the microservice framework
  • Add a README file
  • Publish the original microservice framework to npm

Tasks

Sprint 1

  • Make microservice framework production-ready #9

Sprint 2

  • Add readme file for micro service framework #24
  • Publish original microservice framework as npm #23

Sprint 3-4

  • Add tests to microservice framework #22

Sprint 4

  • Replace request library with axios #21

Develop integration test to support open source

The list of tasks to support the transition to open source:

  • Move existing functional tests to the new repository
  • Update broken tests
  • Add custom npm commands to run tests
  • Update Jest library
  • Update Jest configuration

Add HTTP API

To ensure API stability and availability, a proper API gateway needs to be established.

Add HTTP API so there is a way to access the Lisk Service from a client app.

Acceptance criteria

  • Swagger YAML file contains all needed endpoints
  • Account-related endpoints are available
  • Block-related endpoints are available
  • Delegate-related endpoints are available
  • Peer-related endpoints are available
  • Transaction-related endpoints are available
  • Network-related endpoints are available
  • Search API is available

Implement fee estimation algorithm

Description

This issue concerns an implementation of the fee estimator in Lisk Service as described in LIP-0016.

Motivation

  • The UI products (Lisk Desktop & Lisk Mobile) require fee recommendations for users' transactions
  • The estimated moving average needs to be provided by the backend service

Implementation steps

  1. Retrieve last 20 blocks
  2. Retrieve all transactions for the last 20 blocks
  3. Calculate size of block payload (sum of transaction sizes)
  4. Calculate EMA based on the size of the blocks
  5. Store the average values in the temporary storage (Redis)
  6. Make sure that the average is recalculated on each new block
  7. Create an RPC method that allows accessing that data in real time
  8. Attach the metadata: updated and blockHeight
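
Steps 3 and 4 above can be sketched as a fold over the recent block payload sizes; the smoothing factor `alpha` is an illustrative assumption, not the value recommended by LIP-0016:

```javascript
// Sketch of the EMA over recent block sizes (steps 3-4 above).
// `alpha` is an assumed smoothing factor, not the LIP-0016 constant.
function emaOfBlockSizes(blockSizes, alpha = 0.1) {
  // Seed the average with the first block size, then fold in the rest:
  // ema = alpha * size + (1 - alpha) * previousEma
  return blockSizes.reduce((ema, size) => alpha * size + (1 - alpha) * ema);
}
```

Recomputing this on every new block (step 6) only requires the previous EMA value and the latest block size, which is why the intermediate average can be kept in Redis.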

Acceptance Criteria

  • The core microservice is able to return a JSON object with fees.
  • The block cache data is accessed so that the data is not requested twice
  • The estimated moving average is applied with recommendations from the research team

Migrate core service to framework

Description

To avoid repeating the code for all components based on the template, several parts can be moved to a common library shared among all projects.

This issue concerns migrating shared code of core-component to the new inter-component framework.

Motivation

Keeping all shared code in a separate npm library is beneficial for both the Lisk Service and Lisk Cloud projects.

Acceptance Criteria

  • The framework API is stable
  • All services use the framework API

Add JSON-RPC API

Add JSON-RPC API so external clients can access it.

Requirements

  • Use the library called moleculer-io
  • Customise it to support JSON-RPC 2.0 if needed
  • Make the configuration similar to the HTTP REST API

Fix broken transaction statistics

Actual behavior

The transaction aggregator API returns no data.

Expected behavior

Make sure that the API endpoint returns correct data.

Steps to reproduce

Call /api/v1/transactions/statistics/day

Implement multisignature transaction notifications

Description

As a multisignature API user, I want to be able to:

  • Be notified about a pending transaction
  • Be notified about a collected signature
  • Be notified about a pending transaction that expires soon

Motivation

Provide user interfaces with notifications of multisignature transaction events (new accounts, transactions, signatures, rejections).

Tasks

TBD

Copy over project files from Lisk Cloud repo

Requirements

  • copy over project files from Lisk Cloud repository
  • update package.json
  • update the directory structure, including:
    • services containing particular microservices
    • shared containing microservice framework
    • shared/bin containing CLI tools
    • tests containing tests

Fix broken next forgers

Actual behavior

The next forgers list is not being updated and the API returns an empty dataset.

Expected behavior

Make sure that the API endpoint returns correct data.

Steps to reproduce

Call /api/v1/delegates/next_forgers

Fix broken peer statistics

Actual behavior

Peer statistics cannot be retrieved by a client.

Expected behavior

Peer statistics can be displayed in a client.

Steps to reproduce

Go to the network stats page.

Which version(s) does this affect? (Environment, OS, etc...)

1.3.0

Quality assure 0.1.0 alpha version

The scope of this issue is to test the alpha release candidate of 0.1.0 and fix the identified bugs.

Task

Sprint 9

  • Fix broken peer statistics #38
  • Fix broken peers API #71
  • Fix broken block API #66

Sprint 10

  • Fix broken next forgers #70
  • Transaction API: Invalid input data throws server error #79
  • Update delegate tests #78
  • Fix broken transaction statistics #76
  • Gateway readiness reporting #86
  • Fix accounts API #81

Add Dockerfile

Description

Add the Dockerfile(s) required to run Lisk Service in a Docker environment.

Motivation

A Docker environment is needed to deploy Lisk Service to production.

Acceptance Criteria

  • Create Dockerfile with a valid config
  • Create docker-compose configuration

Release first alpha version

Description

Release the first alpha version on the development environment.

Motivation

  • In order to test Lisk Service in the pre-production environment, the whole deployment procedure has to be carried out

Acceptance Criteria

  • Version 0.1.0-alpha.0 is deployed in the development environment
  • All bugs are collected and issues created accordingly

Update README file

Description

Update the documentation.

Motivation

  • A proper README file is required by the marketing team to continue work on the Lisk Service documentation.

Acceptance Criteria

  • README.md file is updated
  • Running in Docker
  • Running in local env
  • Environment variables list

Framework: Add README file

Add a README file in Markdown format for the microservice framework.

Acceptance criteria

  • README in Markdown format
  • Project description
  • Sample app (template)
  • Licence information

Provide the list of latest votes

Description

As a user, I'd like to see the latest votes.

Motivation

Help delegates have a better understanding of the changes in the list of their voters.
Help voters get notified if the community is revoking trust from a delegate or increasing attention towards another.

Acceptance Criteria

Should list the following information:

  1. Date and time
  2. Transaction details
  3. Sender ID
  4. Current balance
  5. Round
  6. Votes

Fix broken block API

Actual behavior

The /api/v1/blocks endpoint does not work.

Expected behavior

The /api/v1/blocks endpoint provides data about blocks.

Steps to reproduce

Requesting data from the /api/v1/blocks endpoint fails.

Which version(s) does this affect? (Environment, OS, etc...)

Current development

Implement job queue in the framework

Description

Implement a Redis-based job queue in the framework, so all projects can benefit from the implementation.

Update: this was requested in the 2020Q1 audit report.

Motivation

  • Shared code is needed for future projects

Acceptance Criteria

  • Bull library is used
  • The Redis server stores the queue information
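
A Bull queue follows an add/process pattern backed by Redis. As a minimal sketch of that pattern without a Redis dependency, the stand-in below keeps jobs in a plain array; the class and method names mirror Bull's shape but are otherwise assumptions:

```javascript
// In-process stand-in for the Bull-based job queue: same add/process shape,
// but jobs are kept in a plain array instead of Redis.
class JobQueue {
  constructor() {
    this.jobs = [];
    this.handler = null;
  }

  // Register the worker, like Bull's queue.process(handler).
  process(handler) {
    this.handler = handler;
    this.drain();
  }

  // Enqueue a job, like Bull's queue.add(data).
  add(data) {
    this.jobs.push(data);
    this.drain();
  }

  // Run all pending jobs once a worker is registered.
  drain() {
    if (!this.handler) return;
    while (this.jobs.length) this.handler(this.jobs.shift());
  }
}
```

With Bull itself, the queue name and a Redis connection string would be passed to the `Queue` constructor, and jobs would persist across restarts.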

Enable CI/CD

Description

The last stage of the open source transition - continuous integration and continuous delivery.

Motivation

  • Run the tests on every PR that is created
  • Ensure the product has no regressions

Acceptance Criteria

  • Set up Jenkins for CI/CD purposes
  • Add tests to run on Jenkins against testnet
  • Enable ESLint
  • Enable functional tests
  • Enable integration tests

JSON-RPC endpoint for fee estimation algorithm

Description

Implement the JSON-RPC 2.0 endpoint for the fee estimation algorithm.

get.fee_estimates

Motivation

  • The UI clients require an endpoint to retrieve data with fee estimations

Acceptance Criteria

  • get.fee_estimates returns the following structure:
{
    "jsonrpc": "2.0",
    "result": {
        "data": {
            "feeEstimatePerByte": {
                "low": "<FEE_IN_BEDDOWS>",
                "medium": "<FEE_IN_BEDDOWS>",
                "high": "<FEE_IN_BEDDOWS>"
            }
        },
        "meta": {
            "updated": "<ISO_DATETIME>",
            "blockHeight": "<LAST_BLOCK_HEIGHT_USED>"
        }
    },
    "id": 1
}

Implement multisignature pooling service

Description

In line with the network consensus proposals, enable user interfaces to create multisignature groups, initiate transactions, and sign pending multisignature transactions; store transactions with an incomplete/complete set of signatures; retrieve pending multisig accounts and transactions; mark rejected transactions; and remove a transaction once it is confirmed or expired.

Task

Sprint 36

Sprint 37

Sprint 40

Sprint 41

Setup new open-source repository

The list of tasks to set up the new open-source repository:

  • Create a new repository: lisk-service
  • Copy over the files
  • Update Licence to Apache 2.0
  • Update file headers

Task

Sprint 1

  • Copy over project files from Lisk Cloud repo #7
  • Update project licence #8

Auto-generated API documentation

Description

In order to support marketing, the API needs to have proper documentation in a Swagger/OpenAPI format.

Make sure that the reference documentation is generated from the validation specs in the Gateway and that it is in proper YAML format.

For example:

{
	version: '2.0',
	swaggerApiPath: '/peers',
	params: {
		...
		sort: { optional: true, type: 'string', enum: ['height:asc', 'height:desc', 'version:asc', 'version:desc'] },
	},
	source: peersSource,
	envelope,
};

translates to:

  /peers:
    get:
      tags:
      - Peers
      summary: Retrieves network peers.
      operationId: getPeers
      description: |
        Retrieves network peers with details based on criteria.
        Supports pagination.
        ##
        **Example**
          `/api/v1/peers`
      parameters:
      ...
        - in: query
          name: sort
          description: Fields to sort results by
          required: false
          type: string
          enum:
            - height:asc
            - height:desc
            - version:asc
            - version:desc
          default: height:desc
      produces:
      - application/json
      responses:
        200:
          description: array of peers
          schema:
            $ref: '#/definitions/PeersWithEnvelope'
        400:
          description: bad input parameter

Use the file /services/gateway/apis/http-version1/swagger/version1.yaml as an example.

Motivation

  • Marketing needs a proper YAML file to generate documentation in the Lisk Docs.

Acceptance Criteria

  • The spec is generated in the OpenAPI format
  • Source data is the HTTP API definition
