- Point the DC/OS CLI at your cluster
dcos cluster setup <cluster url>
For example:
dcos cluster setup http://egoode-no-elasticl-aod2qsebzlxm-1269016722.us-west-2.elb.amazonaws.com
- Install the Enterprise CLI subcommands
dcos package install dcos-enterprise-cli
- Set up a service account and a secret
dcos security org service-accounts keypair /tmp/spark-private.pem /tmp/spark-public.pem
dcos security org service-accounts create -p /tmp/spark-public.pem -d "Spark service account" spark-principal
dcos security secrets create-sa-secret --strict /tmp/spark-private.pem spark-principal spark/secret
- Grant permissions to the Spark service account
dcos security org users grant spark-principal dcos:mesos:agent:task:user:root create
dcos security org users grant spark-principal "dcos:mesos:master:framework:role:*" create
dcos security org users grant spark-principal dcos:mesos:master:task:app_id:/spark create
dcos security org users grant spark-principal dcos:mesos:master:task:user:nobody create
dcos security org users grant spark-principal dcos:mesos:master:task:user:root create
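The five grants above can also be generated in a loop. This sketch only prints the commands rather than executing them; pipe the output to `sh` to run them against a configured cluster:

```shell
# Print one `dcos security org users grant` command per Mesos permission
# needed by the Spark service account. Nothing is executed here.
for perm in \
  "dcos:mesos:agent:task:user:root" \
  "dcos:mesos:master:framework:role:*" \
  "dcos:mesos:master:task:app_id:/spark" \
  "dcos:mesos:master:task:user:nobody" \
  "dcos:mesos:master:task:user:root"; do
  # Quote the permission so the wildcard role survives a later `sh` pass.
  echo "dcos security org users grant spark-principal \"$perm\" create"
done
```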
- Grant permissions to Marathon so that it can run the Spark dispatcher as root
dcos security org users grant dcos_marathon dcos:mesos:master:task:user:root create
- Create a configuration file /tmp/spark.json that sets the Spark principal and secret
cat <<EOF > /tmp/spark.json
{
  "service": {
    "name": "spark",
    "service_account": "spark-principal",
    "service_account_secret": "spark/secret",
    "user": "root"
  }
}
EOF
- Install Spark using the options file
dcos package install --options=/tmp/spark.json spark --yes
It might take a couple of minutes for Spark to come fully online. You can verify that the CLI is ready and see the available options by running:
dcos spark
- Submit an example job (SparkPi)
dcos spark run --verbose --submit-args=" \
--conf spark.mesos.executor.docker.image=mesosphere/spark:2.3.1-2.2.1-2-hadoop-2.6 \
--conf spark.mesos.executor.docker.forcePullImage=true \
--conf spark.mesos.containerizer=mesos \
--conf spark.mesos.principal=spark-principal \
--class org.apache.spark.examples.SparkPi \
https://downloads.mesosphere.com/spark/assets/spark-examples_2.11-2.0.1.jar 30"
- Check the status and logs of the submitted driver
dcos spark status <driver id>
dcos spark log <driver id>