cloudcheflabs / dataroaster
License: Apache License 2.0
Trino can be used for ETL, interactive (ad hoc) queries, scheduled jobs, and other workloads.
We can create separate Trino clusters for each of these usages. To load-balance the Trino clusters and to support Trino coordinator HA, a Trino gateway should be used. We can extend this concept to a Trino ecosystem.
The Trino ecosystem consists of the Trino gateway, an authenticator, an admin component, the Trino clusters, and so on.
We can think of the following scenario, which can happen in the Trino ecosystem.
Airflow is one of the popular workflow schedulers for data lakehouses.
Airflow should support the following list:
The Trino controller needs to control the entire Trino ecosystem.
To create a Spark Thrift Server, several things must be created beforehand.
The DataRoaster operator needs to create the NFS server, RBAC resources, the S3 secret, and the PVC automatically before the Spark Thrift Server is created.
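The S3 secret and PVC prerequisites mentioned above could look roughly like the following manifests. This is a sketch only: the resource names, the namespace, the storage class, and the size are illustrative assumptions, not the operator's actual defaults.

```yaml
# Illustrative only: a PVC backed by the NFS provisioner and an S3 credentials
# secret, created before the Spark Thrift Server itself.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: spark-thrift-server-pvc   # hypothetical name
  namespace: dataroaster          # hypothetical namespace
spec:
  accessModes: ["ReadWriteMany"]
  storageClassName: nfs           # assumes the NFS server's storage class
  resources:
    requests:
      storage: 10Gi
---
apiVersion: v1
kind: Secret
metadata:
  name: s3-secret                 # hypothetical name
  namespace: dataroaster
type: Opaque
stringData:
  accessKey: "<S3_ACCESS_KEY>"
  secretKey: "<S3_SECRET_KEY>"
```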
Please share the link to download spark-thrift-server-4.8.0-SNAPSHOT-spark-job.jar to run the Spark application.
If no action is notified for a long time, the watcher will be closed with a message like this:
com.cloudcheflabs.dataroaster.operators.trino.handler.TrinoClusterWatcher: close watcher
Hello, I want to try running the Spark demo but cannot download spark-thrift-server-4.8.0-SNAPSHOT-spark-job.jar. Where can it be downloaded? Thanks!
Specific APIs for the data platform components need to be created so that these components can be provisioned automatically.
For instance:
To create the Hive metastore, a MySQL server needs to be installed before installing the Hive metastore.
Add a REST API to the DataRoaster operator to create the MySQL server and Hive metastore automatically.
To install Trino with a different version, the Trino Docker image tag needs to be parameterized in the REST API currently provided by the Trino controller.
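A parameterized cluster-creation request could look like the sketch below. The JSON field names and the image repository are assumptions for illustration; they are not the controller's current API.

```java
// Sketch of a cluster-creation request body once the Trino controller's REST
// API accepts an image tag. Field names and the image repository below are
// assumptions, not the controller's actual contract.
public class CreateClusterRequest {
    public static String toJson(String clusterName, String imageTag, int workers) {
        return String.format(
            "{\"name\":\"%s\",\"image\":\"trinodb/trino:%s\",\"workers\":%d}",
            clusterName, imageTag, workers);
    }

    public static void main(String[] args) {
        // e.g. POST this body to the controller's cluster endpoint
        System.out.println(toJson("etl", "425", 5));
    }
}
```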
The Trino operator provides JMX APIs to retrieve the many MBean metrics exposed by the Trino coordinator and workers.
Using this API, the Trino controller could probably detect an exhausted Trino cluster automatically.
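An exhaustion check over such JMX metrics could be as simple as the sketch below. The choice of metrics (heap usage ratio, queued queries) and the thresholds are assumptions for illustration, not values the controller actually uses.

```java
// Illustrative exhaustion check over metrics the Trino operator's JMX API
// could return. The metric choices and thresholds are arbitrary examples.
public class ExhaustionCheck {
    static final double HEAP_THRESHOLD = 0.9; // assumed: 90% heap used
    static final long QUEUE_THRESHOLD = 100;  // assumed: 100 queued queries

    public static boolean isExhausted(double heapUsedRatio, long queuedQueries) {
        // Either sustained memory pressure or a deep query queue counts.
        return heapUsedRatio > HEAP_THRESHOLD || queuedQueries > QUEUE_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(isExhausted(0.95, 10));  // heap pressure
        System.out.println(isExhausted(0.40, 250)); // deep queue
    }
}
```

A real implementation would poll these values periodically and only flag a cluster after several consecutive violations, to avoid replacing workers on a momentary spike.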
The DataRoaster Operator is used to control all the data platform components running on Kubernetes.
It consists of a MySQL server, a Helm operator, and the DataRoaster server, which provides a REST API to clients.
Most of the components provided by DataRoaster are based on Helm charts. With the Helm operator, these components can be installed on Kubernetes easily.
Hello, where is the Spark 3 Dockerfile, e.g. for cloudcheflabs/spark:v3.4.0? Thanks!
Every time the Trino configuration is updated, the Trino coordinator and workers in the dedicated Trino cluster are rollout-restarted.
At that point, all the endpoints of the Trino coordinator and workers get new pod addresses, so these changed endpoint addresses must be written back into the Prometheus ConfigMap as updated Prometheus scrape jobs.
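Regenerating the scrape job from the current pod endpoints could look like the following sketch. The job name and the scrape-config layout are illustrative; a real implementation would write the result into the Prometheus ConfigMap via the Kubernetes API.

```java
import java.util.List;

// Sketch: rebuild a Prometheus static scrape job from the pod endpoints
// observed after a rollout restart. Job name and endpoints are illustrative.
public class PrometheusTargets {
    public static String scrapeJob(String jobName, List<String> endpoints) {
        StringBuilder sb = new StringBuilder();
        sb.append("- job_name: ").append(jobName).append("\n");
        sb.append("  static_configs:\n");
        sb.append("  - targets:\n");
        for (String ep : endpoints) {
            sb.append("    - ").append(ep).append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // new pod addresses after the rollout (illustrative values)
        System.out.println(scrapeJob("trino-etl",
                List.of("10.0.1.12:8081", "10.0.1.13:8081")));
    }
}
```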
The Trino operator will be used to create, update, and delete Trino clusters.
In addition, it should also monitor the Trino clusters and, when exhaustion is detected, replace the exhausted Trino workers with new ones.
We can also consider supporting graceful shutdown, so that queries already being executed are allowed to finish before the Trino cluster shuts down.
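Trino workers expose a graceful-shutdown endpoint for exactly this: a `PUT /v1/info/state` with the JSON string `"SHUTTING_DOWN"` tells the worker to finish its active tasks and then exit. The sketch below builds (but does not send) such a request; the host, port, and admin user name are illustrative values.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds the graceful-shutdown request that Trino workers accept:
// PUT /v1/info/state with the JSON string "SHUTTING_DOWN".
// Host, port, and user below are illustrative, not real cluster values.
public class GracefulShutdown {
    public static HttpRequest shutdownRequest(String workerHost, int port) {
        return HttpRequest.newBuilder()
                .uri(URI.create("http://" + workerHost + ":" + port + "/v1/info/state"))
                .header("Content-Type", "application/json")
                .header("X-Trino-User", "admin") // assumed admin user
                .PUT(HttpRequest.BodyPublishers.ofString("\"SHUTTING_DOWN\""))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = shutdownRequest("10.0.1.13", 8081);
        System.out.println(req.method() + " " + req.uri());
    }
}
```

The operator would send this request to each worker being replaced before deleting its pod, instead of killing the pod mid-query.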
Add monitoring of Trino clusters to the Trino operator.
To monitor Trino clusters, the following functionalities can be added to the Trino operator (#1):
Custom resources for data platform components like the Hive metastore, Redash, etc. need to be added.