
strest's Introduction

🚀 Flexible REST Tests

🔗 Connect multiple requests: for example, automatically embed an authorization token returned by a login request in your subsequent requests

📝 YAML Syntax: Write all of your tests in YAML files

🎉 Easy to understand: You'll understand the concept in seconds and be able to start instantly (seriously!)

Try it with Gitpod

Open in Gitpod

Run some Tests

npm i -g @strest/cli
strest tests/success/postman.strest.yml

Getting Started in your own environment

# Via Yarn
yarn global add @strest/cli
# Via npm
npm i -g @strest/cli
# Via Docker
# The image contains everything in the tests directory
docker run -it eykrehbein/strest:latest strest tests/success/chaining/

# Bring your own test and environment
docker run -it --env STREST_URL=https://jsonplaceholder.typicode.com -v ${PWD}:/app/data eykrehbein/strest:latest strest /data/tests/success/Env/

We'll be using the postman-echo test API in this tutorial.

To get started, we'll use the following file (the file extension needs to be .strest.yml or .strest.yaml)

version: 2                            # only version at the moment

requests:                             # all test requests will be listed here
  testRequest:                        # name the request however you want
    request:
      url: https://postman-echo.com/get  # required
      method: GET                       # required
      queryString:
      - name: foo1
        value: bar1
      - name: foo2
        value: bar2
    # log: true # uncomment this to log the response

To run the test, open your terminal and type

strest tests/success/postman.strest.yml

You may also run multiple test files at the same time by pointing to the directory where the files are stored

strest tests/success/chaining
# or
strest # this will recursively search for all .strest.yml files in the cwd and its subdirectories

Success! If you've done everything correctly, you'll see output like this

[ Strest ] Found 4 test file(s)
[ Strest ] Schema validation: 4 of 4 file(s) passed

Executing tests in ./
✔ Testing login succeeded (0.463s)
✔ Testing verify_login succeeded (0.32s)
✔ Testing verify_login_chained succeeded (0.233s)
Executing tests in: ./var/
✔ Testing chaining_var1 succeeded (0.128s)
✔ Testing chaining_var2 succeeded (0.131s)

[ Strest ] ✨  Done in 1.337s

Writing .strest.yml test files

The examples in tests/success are used for testing this library. Read through the examples to see what is possible.

VS Code extension

Send requests directly from the yml file.

source

extension


Documentation

Using & Connecting multiple requests

With traditional tools like Postman or Insomnia it's common to perform only a single request at a time, and you have to trigger each request yourself with a click of a button.

With Strest you can predefine a well-structured test file once, and every time you change your API you can test it with a single command in your terminal. Additionally, you can add hundreds or thousands of requests and endpoints, which will run sequentially, one after the other.

To create multiple requests, simply add multiple entries into the requests yaml object.

version: 2

requests:
  requestOne:
    ...
  requestTwo:
    ...
  requestThree:
    ...

Running this will result in something like

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✔ Testing requestOne succeeded (0.1s)
✔ Testing requestTwo succeeded (0.32s)
✔ Testing requestThree succeeded (0.11s)

[ Strest ] ✨  Done in 0.62s
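
For reference, a concrete version of the skeleton above could look like the following sketch; the endpoints are illustrative and simply reuse the public test APIs from this README.

version: 2

requests:
  requestOne:
    request:
      url: https://jsonplaceholder.typicode.com/todos/1
      method: GET
  requestTwo:
    request:
      url: https://jsonplaceholder.typicode.com/todos/2
      method: GET
  requestThree:
    request:
      url: https://postman-echo.com/get
      method: GET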

Chaining multiple requests

What is meant by chaining multiple requests?

Chaining means that each request can use and insert any of the data returned by the requests that ran before it.

Each response is stored as a dictionary for later requests to use. The format is HAR (HTTP Archive), the same format browsers use to store request and response history.

{
  "login": {
    "status": 200,
    "statusText": "OK",
    "headers": {
      "content-type": "application/json; charset=utf-8",
      "date": "Mon, 12 Nov 2018 19:04:52 GMT",
      "vary": "Accept-Encoding",
      "content-length": "22",
      "connection": "Close"
    },
    "content": {
      "authenticated": true
    }
  }
}

Chaining Example

requests:
  login: # will return { authenticated: true }
    ...
  authNeeded:
    request:
      ...
      headers:
      - name: Authorization
        value: Bearer <$ login.content.authenticated $>  # It's possible to use the status code, headers, and status text from previous calls.

As you can see, the usage is simple: use <$ requestName.content.jsonKey $> to access any of the JSON data retrieved from a previous request. If you want to use the raw response data, use <$ requestName.content $> without any keys.

You can use this syntax anywhere, whether inside a string like https://localhost/posts/<$ postKey.content.key $>/... or as a standalone term like Authorization: <$ login.content.token $>

This can also be used across files as demonstrated here
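
To make the pattern concrete, here is a minimal runnable sketch against postman-echo. The request names and the token value are illustrative; postman-echo simply echoes query parameters back under content.args, which the JsonPath example below also relies on.

version: 2

requests:
  produceToken:
    request:
      url: https://postman-echo.com/get
      method: GET
      queryString:
      - name: token
        value: abc123                                  # hypothetical token, echoed back under content.args.token
  consumeToken:
    request:
      url: https://postman-echo.com/get
      method: GET
      queryString:
      - name: chained
        value: <$ produceToken.content.args.token $>   # value pulled from the previous response
    validate:
    - jsonpath: content.args.chained
      expect: abc123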

JsonPath

Use JsonPath to extract specific data from previous responses. This library is used.

version: 2
requests:
  set_JsonPath:
    request:
      url: https://jsonplaceholder.typicode.com/posts
      method: POST
      postData:
        mimeType: application/json
        text:
          firstName: John
          lastName: doe
          age: 26
          address:
              streetAddress: 'naist street'
              city: Nara
              postalCode: 630-0192
          phoneNumbers:
              - {type: iPhone, number: 0123-4567-8888}
              - {type: home, number: 0123-4567-8910}
  JsonPath:
    request:
      url: https://postman-echo.com/get
      method: GET
      queryString:
      - name: foo
        value: <$ JsonPath("set_JsonPath.content.phoneNumbers[?(@.type == \"home\")].number") $>
    validate:
    - jsonpath: content.args.foo
      expect: 0123-4567-8910

Practice here

Using random values with Faker

If you need to generate some random values, you are able to do so by using Faker API templates.

Example - Faker

version: 2

requests:
  fake:
    request:
      url: https://postman-echo.com/get
      method: GET
      queryString:
        - name: first
          value: <$ Faker("name.firstName") $>
        - name: first_last
          value: <$ Faker("name.firstName") $> <$ Faker("name.lastName") $>
    log: true

Visit Faker.js Documentation for more methods

Replacing values with predefined environment variables

Example - Environment Variables

export STREST_URL=https://jsonplaceholder.typicode.com
strest tests/success/Env/environ.strest.yml

version: 2
# ensure the ENV var is set: `export STREST_URL=https://jsonplaceholder.typicode.com`
requests:
  environment:
    request:
      url: <$ Env("STREST_URL") $>/todos/1
      method: GET

Replacing values with predefined custom variables

Example - User Defined Variables

version: 2

variables:  # Define variables here
  testUrl: https://jsonplaceholder.typicode.com/todos/1
  to_log: true

requests:
  my_variable_request:
    request:
      url: <$ testUrl $>
      method: GET
    log: <$ to_log $>

Only Execute If

With Strest you can conditionally skip a request by setting a match criterion

version: 2

requests:
  if_Set:
    request:
      url: https://jsonplaceholder.typicode.com/posts
      method: POST
      postData:
        mimeType: application/json
        text:
          foo: 1
  skipped:
    if:
      operand: <$ if_Set.content.foo $>
      equals: 2
    request:
      url: https://jsonplaceholder.typicode.com/todos/2
      method: GET
  executed:
    if:
      operand: <$ if_Set.content.foo $>
      equals: 1
    request:
      url: https://jsonplaceholder.typicode.com/todos/2
      method: GET

Use the strest file name as a parameter in the tests

You can use the strest file name as a parameter in the tests.

Note that the .strest.yml suffix is removed.

Usage: the file name for this example is postman-echo.strest.yml

version: 2                            
requests:                             
  test-file-name:                       
    request:
      url: https://<$ Filename() $>.com/get  
      method: GET                       
    validate:
    - jsonpath: status
      expect: 200

Using dates and date formats

You can insert dates and times and format them using the custom nunjucks date filter. Under the hood it is a wrapper around Moment.js, so all of its formatting tokens are supported.

Usage: you can use the date filter inside nunjucks brackets, both in the request and inside the validate parts.

requests:
    moment-in-request:
      request:
        url: https://postman-echo.com/get
        method: GET
        queryString:
        - name: foo
          value: <$ now | date('YYYY') $>
      validate:
      - jsonpath: content.args.foo
        expect: "<$ '2019-10-10' | date('YYYY') $>"
    moment-in-validate:
      request:
        url: https://postman-echo.com/time/format?timestamp=2019-10-10&format=YYYY
        method: GET
      validate:
      - jsonpath: content.format
        expect: "<$ '2019-10-10' | date('YYYY') $>"

Sending JSON requests from external files

If you have a JSON file that represents the body of your request, you can use the json option.

Strest will read the JSON file you specify and add it to the body of the request. You don't even need to worry about the Content-Type header; Strest takes care of that for you.

version: 2

requests:
  jsonfile:
    request:
      url: https://postman-echo.com/post
      method: POST
      json: tests/success/jsonfile/data.json  # the path is relative to the current directory from which you run strest
    log: true
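
The referenced file is a plain JSON document that becomes the request body. Its real contents live in the repository; a hypothetical data.json could look like this:

{
  "userId": 1,
  "title": "hello from a file",
  "completed": false
}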

Sending files and form data

Sending files and form data is easy: use the params type in the postData prop.

version: 2
requests:
  postwithfile:
    request:
      url: https://postman-echo.com/post
      method: POST
      postData:
        mimeType: multipart/form-data
        params:
          - name: userId
            value: "1"
          - name: avatar
            value: <$ file("tests/strest.png") $>

Response Validation

The immediate response is stored in HAR format.

With Strest you can validate responses with expect, type, regex, and jsonschema, as described in the subsections below.

Read jsonpath for more info and see this file for a more complex example.

Expect

requests:
  example:
    ...
    validate:
    - jsonpath: content
      expect: "the response has to match this string exactly"

Type

version: 2

requests:
  typeValidate:
    request:
      url: https://jsonplaceholder.typicode.com/todos
      method: GET
    validate:
    - jsonpath: headers["content-type"]
      type: [ string ]
    - jsonpath: status
      type: [ boolean, string, number ]
    - jsonpath: content.0.userId
      type: [ number ]

Regex

Regex can be used to validate the status code or any other returned value

version: 2

requests:
  codeValidate:
    request:
      url: https://jsonplaceholder.typicode.com/todos
      method: GET
    validate: # Multiple ways to use regex to validate status code
    - jsonpath: status
      regex: 2\d+
    - jsonpath: status
      regex: 2[0-9]{2}
    - jsonpath: status
      regex: 2..
    - jsonpath: status
      regex: 2.*

jsonschema

Validate the response against a specified JSON (YAML) schema. The schema can be defined in the variables section or inline within the request.

version: 2
variables:
  schemaValidate:
    properties:
      fruits:
        type: array
        items:
          type: string
      vegetables:
        type: array
        items:
          "$ref": "#/definitions/veggie"
    definitions:
      veggie:
        type: object
        required:
        - veggieName
        - veggieLike
        properties:
          veggieName:
            type: string
          veggieLike:
            type: boolean

requests:
  jsonschema1:
    request:
      url: https://postman-echo.com/post
      method: POST
      postData:
        mimeType: application/json
        text:
          fruits:
            - apple
            - orange
            - pear
          vegetables:
          - veggieName: potato
            veggieLike: true
          - veggieName: broccoli
            veggieLike: false
    validate:
    - jsonpath: content.data
      jsonschema: <$ schemaValidate | dump | safe $>
  jsonschema2:
    request:
      url: https://postman-echo.com/post
      method: POST
      postData:
        mimeType: application/json
        text:
          fruits:
            - apple
            - orange
            - pear
          vegetables:
          - veggieName: potato
            veggieLike: true
          - veggieName: broccoli
            veggieLike: false
    validate:
    - jsonpath: content.data
      jsonschema:
        properties:
          fruits:
            type: array
            items:
              type: string
          vegetables:
            type: array
            items:
              "$ref": "#/definitions/veggie"
        definitions:
          veggie:
            type: object
            required:
            - veggieName
            - veggieLike
            properties:
              veggieName:
                type: string
              veggieLike:
                type: boolean

Retry until validation succeeds

A request can be retried until its validation passes by combining the delay and maxRetries options:

requests:
  waiter:
    request:
      url: https://postman-echo.com/time/now
      method: GET
    delay: 900
    maxRetries: 30
    validate:
    - jsonpath: status
      expect: 200
    - jsonpath: content
      expect: "Tue, 09 Oct 2018 03:07:20 GMT"
export STREST_GMT_DATE=$(TZ=GMT-0 date --date='15 seconds' --rfc-2822 | sed "s/+0000/GMT/g")
strest tests/success/validate/maxRetries.strest.yml

Reusing Objects

Strest uses nunjucks to parse everything inside <$ $>.

This allows passing complex objects between requests using the dump filter:

version: 2
requests:
  objectSet:
    request:
      url: https://postman-echo.com/post
      method: POST
      postData:
        mimeType: application/json
        text:
          foo: bar
          baz: 1
    log: true
  objectReset:
    request:
      url: https://postman-echo.com/post
      method: POST
      postData:
        mimeType: application/json
        text:
          new: <$ objectSet.content.data | dump | safe $>
    validate:
      - jsonpath: content.data
        expect: {"new":{"foo":"bar","baz":1}}
    log: true

Errors

Strest is a testing library, so of course you'll run into a few errors when testing an endpoint. Error handling is made very simple so you can instantly see what caused an error and fix it. If a request fails, the process exits with exit code 1 and no further requests are executed.

Example

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✖ Testing test failed (0.2s)

[ Validation ] The required item test wasn't found in the response data

[ Strest ] ✨  Done in 0.245s

Allow Insecure certs

Boolean to allow:

  • insecure certificates
  • self-signed certificates
  • expired certificates

Example - Allow Insecure certs

allowInsecure: true
requests:
  someRequest:
    ...
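
As a fuller sketch (the request name and endpoint are illustrative; self-signed.badssl.com serves a deliberately self-signed certificate, and the usual version header is assumed):

version: 2
allowInsecure: true

requests:
  selfSigned:
    request:
      url: https://self-signed.badssl.com/   # endpoint with a self-signed certificate
      method: GET
    validate:
    - jsonpath: status
      expect: 200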

Print out the equivalent curl commands

To print out the equivalent curl commands for each request, add the following flag to the command

strest ... --output curl

Exiting on a failed request

By default, Strest exits the process with exit code 1 if any request fails. You can change this by adding the -n or --no-exit flag to the command, which instructs the program to continue with the following requests even if a request fails.
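
For example (the test directory is illustrative):

strest tests/success/ --no-exit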

Bulk tests

Specify a list of tests or directories to execute.

strest tests/success/bulk.yml -b

Contents of bulk.yml:

---
- tests/success/postman.strest.yml
- tests/success/two.strest.yml
- tests/success/chaining/

Configuration

You can create a file called .strestConfig.yml in your home directory, which will serve as the custom config for Strest.

config:
  primaryColor: "#2ed573" # Hexadecimal Color Code (don't forget the quotation marks)
  secondaryColor: "#ff4757" # Hexadecimal Color Code
  errorColor: "#576574" # Hexadecimal Color Code

License

Strest is MIT Licensed

strest's People

Contributors

adiman9, curtisgibby, dungahk, eykrehbein, jayden-chan, jgroom33, lpmi-13, philippevk, shrikster, volvofixthis


strest's Issues

Reusable tests

I think it is a good idea to be able to extract repeatable tests in dedicated files and import them.

Ex: I want to modularize some tests by domain, and each domain needs to have an initial authentication request that is common to all tests.

Stress testing ?

Why not add the possibility of stress testing, letting the test writer define how many requests per second to simulate?

Error response data

For failed responses, it would be useful for debugging to print the response data, including headers, to the console. This usually includes a request id and validation messages in the case of 4XX status codes.

log_to_file

  logger:
    url: foo.com
    method: GET
    log_to_file: true

This would result in a file called logger.json that contains the output of the request

Add Dockerfile

installing node/npm can be avoided for those using Docker

Environment Variables

Use Environment Variables to replace {{}}

version: 1                            # only version at the moment
requests:                             # all test requests will be listed here
  foo:
    url: "{{MY_ENV_VAR}}/api/v1/bar"
    method: GET                       # required

Curl to request

Add feature that converts a curl command to a request in yaml syntax

error code when test failing

Hi, thanks for writing this! I like the clean yaml approach to writing tests. So I started to do a little testing to see how this would fit in our processes for API validation / testing.

When running a simple test when I get an HTTP 400+ response I would expect this would result in setting the exit code in terminal.

Success

version: 1

requests:
  get-test-todo:
    url: https://jsonplaceholder.typicode.com/todos/1
    method: GET

$ strest test.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✔ Testing get-app-version succeeded

[ Strest ] ✨  Done in 0.131s

$ echo $?
0

Failure

version: 1

requests:
  get-test-todo:
    url: https://jsonplaceholder.typicode.com/todos/230
    method: GET

$ strest test.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✔ Testing get-test-todo succeeded
✖ Testing get-more-todo failed

Error: Request failed with status code 404

[ Strest ] ✨  Done in 0.453s

$ echo $?
0

If there are other good ways to act on a failing test, I am open to suggestions.

Thanks again for this project.

Issue with Env() on Mac OS and node 8

On Mac OS 10.13.6, with node 8.11.3, I have a test such as
requests:
  login:
    url: Env(URL)
    method: POST
    ...

I set URL in my env to http://foo.com/bar, but when I run the test it errors saying it can't talk to 127.0.0.1 for the url. Exact error:

strest ./post.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed

✖ Testing login failed (0.041s)

Error: connect ECONNREFUSED 127.0.0.1:80

I took the exact same version of strest, the same strest test file, and the same env variable set, on a Linux box with node 6.14.0, and the test works correctly (uses http://foo.com/bar). It seems as though the Env call on node 8 doesn't recognize my environment variable on node 8 + Mac OS but does on node 6 + Linux. I did some googling but couldn't find whether there were any changes or issues with getting the env in node 8 vs. 6. On node 8 I made a one-liner node app to print process.env and I see my environment (including URL), so I don't think there is. Maybe an issue with the templating/yaml parsing between versions where the Env() call exists?

Repeat until

Repeat a request until a criteria is met.

requests:
  userRequest:
    url: http://localhost:3001/user 
    method: GET
    data:
      params:
        name: testUser
    max_retries: 3
    delay: 3000
    validate:
      json:
        somethingCompleted: "true"

The above request executes a maximum of 3 times before the failure is propagated

Support BDD test

Hi guys,

With Postman, I follow BDD style Given...When...Then for naming request in a flow. Example:

Given user has a balance 
When user topup +$10
Then user balance will be increased by $10

I suggest adding a new field to describe the above style.

If possible, please let me know the fastest way to add a new field; I can help by contributing a PR.

Thanks for your awesome efforts.

WIP: Log params

  my_request:
    url: foo.com
    method: GET
    log_request: true

This would result in the request object and resolved uri being logged. This would be helpful to have a copy of the json and to show the URI with environment vars and values populated.

It might be good to create a log object as:

  my_request:
    url: foo.com
    method: GET
    log:
      output: true
      file: true
      request: true

Execute an array of folders or tests

strest execute_all_this.yml

where execute_all_this.yml contents are:

---
- path/to/folder/with/strest_dot_yml_files
- path/to/folder/my.strest.yml
- path/to/folder/with/more/strest_dot_yml_files

Feature: Set execution order

Option to add a parameter in each test file which defines in which order the files should be executed.

# File 1
version: 1
order: 1 # will be executed first
requests:
   ... 
# File 2
version: 1
order: 2 # will be executed after file 1
requests:
   ... 

Nested functions

I haven't tested this, so maybe it is already possible.

Support n levels deep of this:

Value(foo.Env(BAR))

Feature: Response code validation

Add feature to validate the response status code

version: 1

requests:
  test:
    url: https://echo.getpostman.com/status/400
    method: GET
    validate:
      code: 2xx # status code needs to be in range of 200 - 299

Support for form-data

In my cases my API takes files and performs some operations with them, after which some new file is returned.
It would be good if strest supported form-data, which would work the same as with other data.
One would provide the name of the parameter and then the filepath (either absolute or relative to the directory where strest is executed).

Feature: Logfiles

Add an option to the CLI command to create a log file, formatted as a JSON array with an entry for each request where the response code, data, headers, and a possible error are logged?

Support Value() across files

When running strest against a directory, it would be beneficial to use response data from file 1 as a Value() in file 2 request.

allow self signed cert

error:

/ # strest foo.yml

[ Strest ] Found 1 test file(s)
[ Strest ] Schema validation: 1 of 1 file(s) passed


✖ Testing login failed (0.434s)

Error: self signed certificate

[ Strest ] Failed before finishing all requests

solution:
add a param to allow self-signed certs:
axios/axios#535

Define Environment

This would allow the defined env vars to be stored in an initial test file.

version: 1
environment:
  STREST_URL: https://jsonplaceholder.typicode.com
requests:
  environment:
    url: Env(STREST_URL)/todos/1
    method: GET

UI

Create a UI

  • strest-ui /tests loads all requests into a visualization (use a similar layout to the Insomnia UI)
  • Allow stepping through requests and displaying results
  • Requests can be modified and saved back to disk
  • Values of variables are displayed alongside the 'code', i.e. Env(FOO)bar

Schema validation fails silently

Describe the bug
I've written a request that fails schema validation but the strest run does not fail.

To Reproduce

  • create a request yml that is invalid
  • run it
  • the run passes

Expected behavior

  • the run should fail with a non-zero exit code

Additional context

[ Strest ] Found 3 test file(s)
[ Strest ] Schema validation: 2 of 3 file(s) passed

Executing tests in ./
Executing tests in: /app/tests/
Executing tests in: /app/tests/tokens/
✔ Testing userToken succeeded (0.573s)

[ Strest ] ✨  Done in 0.634s

The only indication that something went wrong is 2 of 3 file(s) passed at the top, which is easy to miss, especially in longer scripts.

unit testing / CI

So far I see no unit tests for strest and no Travis CI / AppVeyor setup.

We should have this to catch any bugs.

Improved response validation

Thanks for this nice tool!

When writing tests it's also important to test failures, e.g. making sure that submitting invalid credentials does not give an auth token. It would be nice if you could do more advanced assertions on the response like so:

requests:
  login: # a successful login
    url: http://localhost:8080/auth
    method: POST
    body:
      json:
        username: alice
        password: secret
    response:
      code: [200-300) # expect a response code in range 200 (inclusive) to 300 (exclusive)
      body: 
        type: Base64
  login using params:
    url: http://localhost:8080/auth
    method: POST
    params:
      username: alice
      password: secret
    response:
      code: [200-300) # expect a response code in range 200 (inclusive) to 300 (exclusive)
      body:
        type: Base64
  login wrong password:
    url: http://localhost:8080/auth
    method: POST
    body:
      json:
        username: alice
        password: 123
    response:
      code: [400-500) # expect a response code in range 400 (inclusive) to 500 (exclusive)
      body:
        json:
          error: wrong username or password

Verify against returned arrays

Say the returned response data has the following format,

[
  {
    "id": "5b9763b3ef7c4000018387cd",
    "from": "Bank",
    "to": "Bank",
    "amount": 0,
    "gameID": "SomePreDefiniedValue"
  }
]

How do I verify that "gameId" == "SomePreDefiniedValue"?

Extract from array if

Assume a request returns:

[ {"foo":1, "bar": 1},  {"foo":2, "bar": 2} ]

It would be beneficial to extract the value of key bar given a value for foo.

This is typically handled in code using loops or find functions.

Log output invisible on Solarized color scheme

It took me a few minutes to realize that the log output was not missing, but simply colored in such a way that makes it invisible in terminals using the Solarized color scheme. Selecting the text shows the log output.

Before: (screenshot: solarized-colors-bad-fs8)

After: (screenshot: solarized-colors-highlited-fs8)

Abort directory execution on failure

When executing strest foo_dir/, file2 will still run its tests even if a request in file1 fails.

Option 1

Support strest --abort foodir/

Option 2

Change default behavior to abort directory execution if a request fails.
This might also require the ability to override this behavior in a per file basis:

version: 1
continue: true
...

Codegen

Provide a codegen function.
strest --output curl
would result in the equivalent curl commands to perform the actions

WIP: validate pointer

Support this:

version: 1

requests:
  todoOne:
    url: https://jsonplaceholder.typicode.com/posts
    method: POST
    data:
      json:
        myArray:
        - foo: 1
          bar: 1
        - foo: 2
          bar: 2
    validate:
      jsonpath:
        myArray.1.foo: 2

Postman to stREST

strest foo.postman.json --postman

This would convert a postman collection into a set of directories and files.

There's no reasonable way to convert all the functionality (JavaScript test scripts), so this should focus on creating the requests only.

execution order

execution order should be:

test_dir
  A_subdir
    3.strest.yml
    4.strest.yml
  Z_subdir
    5.strest.yml
    6.strest.yml
  1.strest.yml
  2.strest.yml

Executing strest tests/success/ results in the following execution order:

✔ Testing todoOne succeeded (0.188s)
✔ Testing todoTwo succeeded (0.107s)
✔ Testing todoOne succeeded (0.221s)
✔ Testing todoTwo succeeded (0.102s)
✔ Testing postman succeeded (0.293s)
✔ Testing my_variable_request succeeded (0.102s)
✔ Testing environment succeeded (0.101s)
✔ Testing fake succeeded (0.195s)
✔ Testing arr succeeded (0.571s)
✔ Testing arr1 succeeded (0.106s)
✔ Testing value1 succeeded (0.215s)
✔ Testing value2 succeeded (0.184s)
✔ Testing basic_auth succeeded (0.19s)
✔ Testing logging succeeded (0.197s)
✔ Testing if_Set succeeded (0.372s)
✔ Testing skipped skipped (0.001s)
✔ Testing executed succeeded (0.105s)
✔ Testing codeValidate succeeded (0.102s)
✔ Testing code404 succeeded (0.671s)
✔ Testing headersValidate succeeded (0.124s)
✔ Testing jsonValidate succeeded (0.1s)
✔ Testing jsonpath succeeded (0.264s)
✖ Testing maxRetries failed to validate. Retrying... (1.192s)
✔ Testing maxRetries succeeded (0.193s)
✔ Testing raw succeeded (0.191s)
✔ Testing login succeeded (0.188s)
✔ Testing verify_login succeeded (0.41s)
✔ Testing verify_login_chained succeeded (0.282s)
✔ Testing chaining_var1 succeeded (0.096s)
✔ Testing chaining_var2 succeeded (0.108s)

Based on current tests, this should be the order:

✔ Testing todoOne succeeded (0.188s)
✔ Testing todoTwo succeeded (0.107s)
✔ Testing postman succeeded (0.293s)
✔ Testing todoOne succeeded (0.221s)
✔ Testing todoTwo succeeded (0.102s)
✔ Testing my_variable_request succeeded (0.102s)
✔ Testing basic_auth succeeded (0.19s)
✔ Testing login succeeded (0.188s)
✔ Testing verify_login succeeded (0.41s)
✔ Testing verify_login_chained succeeded (0.282s)
✔ Testing chaining_var1 succeeded (0.096s)
✔ Testing chaining_var2 succeeded (0.108s)

color highlighting in vscode

Possibly add this to a readme.

Assuming Dark+ theme

using this plugin:
https://github.com/fabiospampinato/vscode-highlight

add these settings:

    "highlight.regexes": {
      "(Value\\(.*?\\))": {
        "regexFlags": "g",
        "filterFileRegex": ".*\\.strest\\.yml",
        "decorations": [
          { "color": "#9CDCFE" }
        ]
      },
        "(Env\\(.*?\\))": {
          "regexFlags": "g",
          "filterFileRegex": ".*\\.strest\\.yml",
          "decorations": [
            { "color": "#C586C0" }
          ]
        }
      }

Release 2.0

  • API
  • use HAR format for request
  • nunjucks templating

Output response

Assume 2 files where the goal is to output an env var from the request in file 1 and use it in file 2

File 1:

requests:
  test:
    url: https://postman-echo.com/get
    method: GET
    data:
      params:
        foo: bar
    output_env_vars:
      FILE_1_OUTPUT: args.foo

output:

{"args":{"foo":"bar"}}

File 2:

requests:
  test:
    url: https://postman-echo.com/get
    method: GET
    data:
      params:
        foo: Env(FILE_1_OUTPUT)

Feature: Basic Auth

support this:

version: 1

requests:
  login:
    url: https://postman-echo.com/basic-auth
    Auth:
        basic:
            username: postman
            password: password
    method: GET

This would result in a header:
Authorization: Basic cG9zdG1hbjpwYXNzd29yZA==

Filename()

Assume the filename is postman-echo.strest.yml

Access the filename in the request:

version: 1                            # only version at the moment

requests:                             # all test requests will be listed here
  testRequest:                        # name the request however you want
    url: https://Filename().com/get  # required
    method: GET                       # required

Add filename to log output

When executing against a directory, showing the filename (and possibly the sub-directories' names) would be beneficial
