
event-pipeline

(Architecture diagram: event pipeline)

  1. Create the Kinesis stream
  • Name: 'event-pipe'
  • Only 1 shard
  • Create the Kinesis stream (a scripted equivalent is sketched below)
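
If you'd rather script the console steps, here is a minimal boto3 sketch (assuming AWS credentials are already configured; the stream name and shard count match the steps above):

```python
import boto3

# Assumes AWS credentials are configured; region matches the rest of this guide
kinesis = boto3.client("kinesis", region_name="us-east-1")

# Create the stream with a single shard, as above
kinesis.create_stream(StreamName="event-pipe", ShardCount=1)

# Block until the stream is ACTIVE before using it
kinesis.get_waiter("stream_exists").wait(StreamName="event-pipe")
```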
  2. Create the API Gateway
  • Name: "event-pipe"
  • Create the API
  3. Create a role that allows API Gateway to write into Kinesis
  • Create role -> API Gateway -> name "api-gateway-pipeline"
  • Attach the "AmazonKinesisFullAccess" policy and save the role ARN (a scripted sketch follows)
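
The same role can be created with boto3; a sketch, using the standard API Gateway trust principal and the AWS-managed Kinesis policy (the role name matches the step above):

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets API Gateway assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "apigateway.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName="api-gateway-pipeline",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the AWS-managed Kinesis full-access policy
iam.attach_role_policy(
    RoleName="api-gateway-pipeline",
    PolicyArn="arn:aws:iam::aws:policy/AmazonKinesisFullAccess",
)

# Save this ARN; the API Gateway integrations below need it
print(role["Role"]["Arn"])
```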
  4. Add resources and methods in API Gateway

    • Create a URL path for the REST API
    • Name: "streams"
    • Create the resource
    • Add a GET method:
      - Integration Type: AWS Service
      - Region: us-east-1
      - Service: Kinesis
      - HTTP method: POST
      - Action: ListStreams
      - Execution role: ARN of the role we created
      - Save
    • Check the Integration Request (this tells Kinesis what kind of data we are dealing with; in our case we want JSON data):
      - HTTP Headers: Name "Content-Type", Mapped from: 'application/x-amz-json-1.1'
      - Mapping Templates (select the recommended option): Content-Type: application/json, template body: { }
      - Save
    • Go back to Method Execution and Test: the GET method should list all available Kinesis streams (a quick scripted sanity check follows this list)
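
The console test should return the same result as calling Kinesis directly; a quick boto3 sanity check:

```python
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# The GET method wraps ListStreams; 'event-pipe' should appear here
print(kinesis.list_streams()["StreamNames"])
```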
  5. Create a 2nd resource

    • Resource Name: KinesisStream
    • Resource Path: {stream-name}
    • Create
    • Add a GET method:
      - Integration Type: AWS Service
      - Region: us-east-1
      - Service: Kinesis
      - HTTP method: POST
      - Action: DescribeStream
      - Execution role: ARN of the role we created
      - Save
    • Check the Integration Request:
      - HTTP Headers: Name "Content-Type", Mapped from: 'application/x-amz-json-1.1'
      - Mapping Templates (select the recommended option): Content-Type: application/json, template body: { "StreamName": "$input.params('stream-name')" }
      - Save
    • Go back to Method Execution and Test: this time add the name of the stream (a scripted check follows this list)
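
Likewise, the expected DescribeStream response can be checked against Kinesis directly:

```python
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# DescribeStream is the action the GET method invokes behind the scenes
desc = kinesis.describe_stream(StreamName="event-pipe")["StreamDescription"]
print(desc["StreamStatus"])  # expect 'ACTIVE'
print([s["ShardId"] for s in desc["Shards"]])
```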
  6. Create another resource (this path will write records into Kinesis, which will eventually be transferred to an S3 bucket)

    • Resource name: "record"
    • Enable API Gateway CORS
    • Create the resource
    • Add a PUT method:
      - Integration Type: AWS Service
      - Region: us-east-1
      - Service: Kinesis
      - HTTP method: POST
      - Action: PutRecord
      - Execution role: ARN of the role we created
      - Save
    • Check the Integration Request:
      - HTTP Headers: Name "Content-Type", Mapped from: 'application/x-amz-json-1.1'
      - Mapping Templates (select the recommended option): Content-Type: application/json, template body:
        { "StreamName": "$input.params('stream-name')", "Data": "$util.base64Encode($input.json('$.Data'))Cg==", "PartitionKey": "$input.path('$.PartitionKey')" }
      - Save
      1. Data carries the actual payload we want to pass through to the S3 bucket; Kinesis requires it base64-encoded, and the appended "Cg==" is a base64-encoded newline so the records end up line-separated in S3
      2. StreamName comes from our URL path
      3. PartitionKey is a string that Kinesis hashes to arrive at a shard number
      4. Example: with multiple shards, records sharing a partition key always land on the same shard, while different keys spread records across shards
    • Go back to Method Execution and Test with the body { "Data": "testing", "PartitionKey": 1 } (a sketch of what the template produces follows this list)
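
To see what the mapping template actually hands to Kinesis, here is a small Python reproduction of the Data field, using the test payload above:

```python
import base64
import json

payload = {"Data": "testing", "PartitionKey": 1}

# $util.base64Encode($input.json('$.Data')) encodes the JSON value of Data
# ($input.json returns it as JSON, quotes included)
encoded = base64.b64encode(json.dumps(payload["Data"]).encode()).decode()

# The template then appends "Cg==", a base64-encoded newline, so each record
# ends with '\n' once decoded. (This simple concatenation is only valid
# base64 when the first chunk has no '=' padding, as with this payload.)
record_data = encoded + "Cg=="

print(record_data)
print(base64.b64decode(record_data))  # b'"testing"\n'
```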
  7. Since Kinesis Streams can't write directly to S3, we need to create a delivery stream in Kinesis Firehose

    • Service: Kinesis
    • Create Delivery Stream
    • Delivery stream name: "event-pipe-delivery"
    • Source: Kinesis stream; select the stream we created -> Next
    • Leave Lambda transformation and record format conversion disabled (for this project)
      - We'll use Lambda next time to see what we can do to the data before it reaches the S3 bucket
    • Destination: S3 bucket
    • Name: "event-type-data-XXXX"
      - The prefix will partition objects by year, month, day, and hour -> Next
    • Buffer conditions: change the buffer interval to 60 seconds for our small project
    • Compression: select GZIP
    • Create a new role
  8. Next & create the delivery stream (a scripted equivalent is sketched below)
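
For reference, a boto3 sketch of the same delivery stream. The ACCOUNT_ID, role names, and bucket name are placeholders to substitute with your own values:

```python
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="event-pipe-delivery",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        # ARN of the 'event-pipe' stream created in step 1
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:ACCOUNT_ID:stream/event-pipe",
        "RoleARN": "arn:aws:iam::ACCOUNT_ID:role/firehose-read-role",  # placeholder
    },
    ExtendedS3DestinationConfiguration={
        "BucketARN": "arn:aws:s3:::event-type-data-XXXX",  # placeholder bucket
        "RoleARN": "arn:aws:iam::ACCOUNT_ID:role/firehose-write-role",  # placeholder
        # 60-second buffer interval, as set in the console above
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
        "CompressionFormat": "GZIP",
    },
)
```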

  9. Finally, deploy the API for testing

    • API Gateway
    • Select "event-pipe"
    • Actions -> Deploy API -> deployment stage "test" -> Deploy (a scripted equivalent follows)
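
Deployment can be scripted too; a sketch that looks the REST API up by name (the "test" stage is what produces the /test URLs below):

```python
import boto3

apigw = boto3.client("apigateway", region_name="us-east-1")

# Find the REST API we named 'event-pipe'
api_id = next(
    a["id"] for a in apigw.get_rest_apis()["items"] if a["name"] == "event-pipe"
)

# Deploy it to a stage called 'test'
apigw.create_deployment(restApiId=api_id, stageName="test")

print(f"https://{api_id}.execute-api.us-east-1.amazonaws.com/test")
```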

We get the invoke URL. Test it:

https://wle4ucytwl.execute-api.us-east-1.amazonaws.com/test

First Glance

List the streams, then pick one:

https://wle4ucytwl.execute-api.us-east-1.amazonaws.com/test/streams

Get details about the stream:

https://wle4ucytwl.execute-api.us-east-1.amazonaws.com/test/streams/event-pipe

Write into this stream (using HTTPie; installation link below):

http PUT https://wle4ucytwl.execute-api.us-east-1.amazonaws.com/test/streams/event-pipe/record Data="testing" PartitionKey=1

If it works, you'll see a response containing a ShardId and a SequenceNumber.


After 60 seconds (once the buffer interval elapses and the batch is collected), the data should appear in S3 as a GZIP file. A verification sketch follows.
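
A boto3 sketch for verifying delivery, assuming the bucket name from step 7 (Firehose writes GZIP objects under a year/month/day/hour key prefix):

```python
import gzip
import boto3

s3 = boto3.client("s3")
bucket = "event-type-data-XXXX"  # placeholder: the bucket created in step 7

# Firehose keys look like YYYY/MM/DD/HH/event-pipe-delivery-...
for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
    body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
    # Each object holds GZIP-compressed, newline-separated records
    print(obj["Key"], gzip.decompress(body).decode())
```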

HTTPie installation: https://httpie.org/docs#installation
