
Streaming Games

Streaming Games is the most fun application you will ever see while applying principles of streaming analytics using Confluent Cloud. Built around different games, this application ingests and stores game events in Kafka topics and lets you process them in near real time using ksqlDB. To keep you focused, the application is based on fully managed services in Confluent Cloud for both the Apache Kafka cluster (using the Confluent Cloud Kora engine) and ksqlDB.

2048

To implement streaming analytics in the game, we built a scoreboard using ksqlDB. The scoreboard is a table containing aggregated metrics for the players, such as their highest score, the highest level achieved, and the number of times each player has lost. As new events arrive, the scoreboard is instantly updated by the continuous queries that keep processing those events as they happen.

What are you going to need?

  • Java and Maven - The UI layer of the application relies on two APIs that are implemented in Java, therefore you will need Java 11+ installed to build the source code. The build itself is implemented using Maven, and it is triggered automatically by Terraform.

  • jq - Install jq on your local computer.

  • Confluent Cloud CLI - During the demo setup, a bash script will set up the whole Confluent Cloud environment for you. To do that, it needs the Confluent Cloud CLI installed locally. You can find instructions about how to install it here. v1.7.0 or later is required, and you must be logged in with the --save argument, which saves your Confluent Cloud user login credentials or refresh token (in the case of SSO) to the local netrc file.

  • Confluent Cloud account - If you do not have a Confluent Cloud account, you can create one with the instructions below.

  • Terraform (0.14+) - The application is created automatically using Terraform. Besides having Terraform installed locally, you will need to provide your cloud provider credentials so Terraform can create and manage the resources for you.

  • AWS account - This demo/workshop runs on AWS.

Workshop or Demo?

Choose your story! If you prefer to follow the different steps in a workshop style, follow the instructions here. Otherwise, for a more traditional demo, keep reading below!

Pre-requisites

Important

Follow this part carefully before the demo/workshop!

Step 1. Install workshop dependencies

Run the following command to install workshop dependencies

brew install terraform \
  maven \
  jq \
  awscli \
  confluent-cli
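
Once the install finishes, you can sanity-check that the main tools are on your PATH (the version flags below are the standard ones for each tool):

java -version        # Java 11+ is required to build the APIs
mvn -version
terraform -version   # Terraform 0.14+ is required
jq --version
aws --version
confluent version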

Step 2. Sign up for Confluent Cloud

You can create a Confluent Cloud account following this link: Try Confluent Cloud For Free (Games Workshop)

If you want to create a Confluent Cloud account from the AWS Marketplace, follow this link: Try Confluent Cloud For Free - Sign up from AWS Marketplace (Games Workshop)

Step 2a. Create a Cloud API Key

  1. Open the Confluent Cloud Console, open the Cloud API keys section, select the Granular access tab, and then click Next.

  2. Click Create a new one, enter the new service account name (tf_runner), then click Next.

  3. The Cloud API key and secret are generated for the tf_runner service account. Save your Cloud API key and secret in a secure location. You will need this API key and secret to use the Confluent Terraform Provider.

  4. Assign the OrganizationAdmin role to the tf_runner service account by following this guide.

[Screenshot: assigning the OrganizationAdmin role to the tf_runner service account]

Step 2b. Configure the Confluent CLI

  1. Log in to the Confluent Cloud CLI:

confluent login --save

The --save flag will save your Confluent Cloud login credentials to the ~/.netrc file.
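
Before moving on, you can confirm the CLI session works by listing a couple of resources (both commands exist in the current confluent CLI):

confluent environment list
confluent iam service-account list   # should include the tf_runner account created in Step 2a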

Step 3. Clone this repo

First things first, clone this GitHub repo to your local machine (use your terminal):

git clone https://github.com/gianlucanatali/streaming-games.git

Step 4. Configure the deployment

The whole workshop creation is scripted. The script will leverage Terraform to:

  • Provision the Confluent Cloud resources using the Confluent Cloud CLI. As mentioned before, the application uses a Kafka cluster running in a fully managed service for Apache Kafka. If you are interested in how you can create a cluster in Confluent Cloud via the Web UI, have a look at our Quick Start for Apache Kafka using Confluent Cloud.

  • Spin up the other resources needed in AWS.

Run the following steps to configure the workshop

  1. Create the demo.cfg file using the example provided in the config folder

    cp config/demo.cfg.example config/demo.cfg
  2. Provide the required information in the demo.cfg file

    export TF_VAR_aws_profile="<AWS_PROFILE>"
    export TF_VAR_aws_region="eu-west-2"
    export TF_VAR_schema_registry_region="eu-central-1"
    export TF_VAR_confluent_cloud_api_key="<CONFLUENT_CLOUD_API_KEY>"
    export TF_VAR_confluent_cloud_api_secret="<CONFLUENT_CLOUD_API_SECRET>"

    We advise using the gimme-aws-creds utility if you use Okta to log in to AWS. You can also use the granted CLI for AWS credentials. Amend any of the configuration as you see fit (like the AWS region or the Schema Registry region).

  3. If you are not using gimme-aws-creds, create a credentials file as described here. The file ~/.aws/credentials should look like this (example below)

    [default]
    aws_access_key_id=AKIAIOSFODNN7EXAMPLE
    aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

    You can set TF_VAR_aws_profile="default" in the demo.cfg file
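
Whichever way you created the AWS credentials, a quick way to confirm that the profile Terraform will use actually resolves to a valid identity is an STS call (the profile name below matches the example above):

aws sts get-caller-identity --profile default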

Step 5. Deploying the application

The application is essentially a set of HTML/CSS/JS files that form a microsite which can be hosted statically anywhere. But for the sake of coolness, we will deploy this microsite to an Amazon S3 bucket. This bucket will be created in the same region selected for the Confluent Cloud cluster to ensure that the application is co-located. The application emits events that are processed by an event handler implemented as an API Gateway with a Lambda function as the backend. This event handler API receives the events and writes them into Kafka using ksqlDB.

[Architecture diagram: S3 microsite, API Gateway + Lambda event handler, Confluent Cloud (Kafka and ksqlDB)]
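
For illustration only, the request below sketches what an event posted by the game to the event handler could look like; the exact API Gateway path and payload fields are defined by the Lambda code in this repo and may differ:

# Hypothetical sketch: the endpoint path and JSON fields are illustrative, not the exact contract
curl -X POST "https://<api-gateway-id>.execute-api.<aws-region>.amazonaws.com/<stage>/<path>" \
  -H "Content-Type: application/json" \
  -d '{"user": "player1", "game_name": "2048", "score": 1024, "level": 7, "lives": 0}'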

Please note that during deployment, the script takes care of creating the required Kafka topics and also the ksqlDB queries. Therefore, there is no need to manually create them.

  1. Start the demo creation

    ./start.sh
  2. At the end of the provisioning, the output with the demo endpoint will be shown. Paste the demo URL in your browser and start playing!

    Outputs:
    
    Game = https://d************.cloudfront.net/
  3. Wait for the content to be available

Note: It can take a while for the content to become available via CloudFront. If you see an error message like the one below when accessing the link returned by the script, don’t worry: just give it a few more minutes and try the link again. Make sure you are not hitting refresh, as CloudFront might have sent you to a different URL. It can take up to 1 hour for the CloudFront distribution to be available.

[Screenshot: CloudFront error page]

You can try to speed up this process using the trick explained in this medium article: Is your CloudFront distribution stuck “in progress”?
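
If you close the terminal and need the demo URL again later, it is stored in the Terraform state. Assuming you run the command from the Terraform working directory used by start.sh (an assumption about this repo's layout), you can re-print the outputs:

terraform output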

Step 6. Check the scoreboard

First things first: play the game and share your game link with your friends to populate data! You can make sure the data is flowing into Confluent Cloud by following the steps below:

  1. In the Confluent Cloud UI, go to the environment created by the Terraform script (its name should start with streaming-games) and to the cluster within it

  2. Click on Topics and choose the USER_GAME topic

[Screenshot: the USER_GAME topic in the Confluent Cloud Topics UI]
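
If you want to look at the raw messages rather than the aggregated scoreboard, the ksqlDB editor in the same cluster can also print the topic directly:

-- Inspect a few raw game events straight from the Kafka topic
PRINT 'USER_GAME' FROM BEGINNING LIMIT 5;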

As users engage with the 2048 game, two types of events are generated. The first is referred to as the "User Game" event and includes information about the user’s current game state, such as their score, level, and remaining lives. This event is triggered every time the user’s score changes, the user advances to a new level, or the user loses a life.

The second type of event is called the "User Losses" event which, as the name suggests, captures data related to the user losing the game. This event is triggered when the player reaches the game-over state. The scoreboard can be visualized in real time by clicking the SCOREBOARD link in the 2048 game (top right corner). It is also available in the other games.

[Screenshot: the in-game scoreboard]

To build a scoreboard out of this, we created a streaming analytics pipeline that transforms these raw events into a scoreboard table that is updated in near real time.

[Diagram: the streaming analytics pipeline]

ksqlDB supports pull queries, which let you get the result of a query in a more traditional request/response fashion (as opposed to push queries, which continuously stream updates).

A query against the STATS_PER_USER table is sent to ksqlDB to get all of the players' scores for the selected game.

SELECT
  USER_KEY->USER,
  HIGHEST_SCORE,
  HIGHEST_LEVEL,
  TOTAL_LOSSES
FROM STATS_PER_USER
WHERE GAME_NAME='2048';
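
Behind the scenes this is just an HTTP request to ksqlDB's REST API. Below is a minimal sketch of an equivalent call, assuming a Confluent Cloud ksqlDB endpoint and a ksqlDB API key/secret for basic authentication (all placeholders are assumptions):

# Run the same pull query over the ksqlDB REST API
curl -s -X POST "https://<ksqldb-endpoint>/query" \
  -u "<KSQLDB_API_KEY>:<KSQLDB_API_SECRET>" \
  -H "Content-Type: application/vnd.ksql.v1+json" \
  -d '{"ksql": "SELECT USER_KEY->USER, HIGHEST_SCORE, HIGHEST_LEVEL, TOTAL_LOSSES FROM STATS_PER_USER WHERE GAME_NAME='\''2048'\'';", "streamsProperties": {}}'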

The ksqlDB queries that build this streaming pipeline

To implement the pipeline we used ksqlDB. You can see the queries below for your reference.

LOSSES_PER_USER Table

A table to count the number of losses for each player.

CREATE TABLE LOSSES_PER_USER AS
SELECT
  USER_KEY,
  USER_KEY -> USER AS USER,
  USER_KEY -> GAME_NAME AS GAME_NAME,
  COUNT(USER_KEY) AS TOTAL_LOSSES
FROM
  USER_LOSSES
GROUP BY
  USER_KEY;

STATS_PER_USER Table

A table aggregating, per player and game, the highest score, the highest level achieved, and the total number of losses.

CREATE TABLE STATS_PER_USER AS
SELECT
  UG.USER_KEY AS USER_KEY,
  UG.USER_KEY -> USER AS USER,
  UG.USER_KEY -> GAME_NAME AS GAME_NAME,
  MAX(UG.GAME -> SCORE) AS HIGHEST_SCORE,
  MAX(UG.GAME -> LEVEL) AS HIGHEST_LEVEL,
  MAX(
    CASE WHEN LPU.TOTAL_LOSSES IS NULL THEN CAST(0 AS BIGINT) ELSE LPU.TOTAL_LOSSES END
  ) AS TOTAL_LOSSES
FROM
  USER_GAME UG
  LEFT JOIN LOSSES_PER_USER LPU ON UG.USER_KEY = LPU.USER_KEY
GROUP BY
  UG.USER_KEY;
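
Besides the pull query used by the scoreboard page, you can also watch the table update live from the ksqlDB editor with a push query over the same columns:

-- Watch the scoreboard change as new game events are processed
SELECT
  USER_KEY->USER,
  GAME_NAME,
  HIGHEST_SCORE,
  HIGHEST_LEVEL,
  TOTAL_LOSSES
FROM STATS_PER_USER
EMIT CHANGES;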

Step 7. Destroy the resources (save money!)

The great thing about cloud resources is that you can spin them up and down with a few commands. Once you are finished with this workshop/demo, remember to destroy the resources you created today to avoid incurring charges if you are not planning to keep using them. You can always spin everything up again anytime you want (uncomment the run_as_workshop variable in the config file if you want to automate the creation of the ksqlDB queries, so you can demo the app without any manual effort)!

Note: When you are done with the application, you can automatically destroy all the resources created using the command below:

./stop.sh
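
After the script finishes, you can double-check that nothing was left behind that could keep generating charges (the commands below are generic checks, not part of the script):

confluent environment list     # the streaming-games environment should be gone
aws s3 ls --profile default    # the demo bucket should no longer be listed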

Troubleshooting

License

This project is licensed under the Apache 2.0 License.

Contributors

gianlucanatali, ahmedszamzam
