
ambari-zeppelin-service's People

Contributors

abajwa-hw, ameetp, bbusse, ngalstyan4, prabhjyotsingh, r-kamath, seanorama, triggerboom, uprush


ambari-zeppelin-service's Issues

Suse Linux Enterprise 11.4 - gcc-gfortran package not found

Hey guys,

can you help me with this?
gcc-gfortran cannot be found. Is this a metainfo.xml dependency problem? (On SUSE Linux the package is called gcc-fortran, not gcc-gfortran.)

Thank you for helping!


Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 216, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 54, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 392, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py", line 70, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm gcc-gfortran' returned 104. Package 'gcc-gfortran' not found.
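Since the service scripts install packages by name, one way to handle the SUSE naming difference is a small per-OS lookup before calling `Package(...)`. The helper below is a hedged sketch for illustration, not code from this repo; the `fortran_package_for` name and the OS-family keys are assumptions:

```python
# Hypothetical helper: map an OS family to its Fortran compiler package name.
# On SUSE the package is gcc-fortran; on RHEL/CentOS it is gcc-gfortran.
def fortran_package_for(os_family):
    names = {
        "suse": "gcc-fortran",
        "redhat": "gcc-gfortran",
        "ubuntu": "gfortran",
    }
    # Fall back to the RHEL name when the family is unrecognised.
    return names.get(os_family, "gcc-gfortran")
```

In Ambari stacks the cleaner fix is usually `osSpecifics` entries in metainfo.xml, one `<osFamily>` block per package name, but the lookup above shows the same idea in script form.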

Getting pyspark to work on HDP 2.3.4

For %pyspark, Zeppelin did not find the module 'pyspark' (Error from python worker: /usr/bin/python: No module named pyspark).
I had to add

export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip"  
export SPARK_YARN_USER_ENV="PYTHONPATH=${PYTHONPATH}"

to the zeppelin-env template configuration.

(Thanks to http://mail-archives.apache.org/mod_mbox/incubator-zeppelin-users/201506.mbox/%3CCAFX-kxKR+1R11WoG69sa-63nwTcfT1VYFocJBywh6m0q72WqBg@mail.gmail.com%3E)
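The exported PYTHONPATH above hardcodes the py4j version (0.8.2.1), which breaks on Spark upgrades. A hedged sketch of building the same path by globbing for the zip instead; `build_pyspark_pythonpath` is a hypothetical helper, not part of this service:

```python
import glob
import os

def build_pyspark_pythonpath(spark_home):
    """Assemble the PYTHONPATH Zeppelin needs for %pyspark.

    Globs for the py4j zip under $SPARK_HOME/python/lib so the py4j
    version is not hardcoded into the configuration.
    """
    py4j_zips = sorted(
        glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")))
    return ":".join([os.path.join(spark_home, "python")] + py4j_zips)
```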

CentOS 7 PID_Dir issue: service won't start due to mkdir /var/run/zeppelin-notebook: permission denied

Even though CentOS 7 is not yet supported by this service, there is an easy solution for this issue.

Careful: make sure you stop the Zeppelin service before making the changes below!

Go to the config page of the Zeppelin service in Ambari and search for

'zeppelin_pid_dir' 

(It's under Advanced zeppelin-env).

Change the value to

/var/run/user/<zeppelin user id>/zeppelin-notebook

where <zeppelin user id> is the UID of the zeppelin user (find it with id -u zeppelin).
For me, the end result was /var/run/user/1008/zeppelin-notebook

The service should start normally now on CentOS 7.
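The steps above can be sketched in one line; `zeppelin_pid_dir` is a hypothetical helper for illustration (the real fix is just editing the config value in Ambari):

```python
def zeppelin_pid_dir(uid):
    """Build the per-user runtime directory used in the workaround above.

    On CentOS 7, /var/run/user/<uid> is writable by that user, so the
    zeppelin user can create its pid directory there without root.
    """
    return "/var/run/user/%d/zeppelin-notebook" % uid
```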

Error installing on ubuntu

Hi,

I'm getting the following error when performing the install:

`resource_management.core.exceptions.Fail: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install zeppelin' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
zeppelin is already the newest version.
0 upgraded, 0 newly installed, 0 to remove and 92 not upgraded.
2 not fully installed or removed.
After this operation, 0 B of additional disk space will be used.
Setting up zeppelin-2-4-0-0-169 (0.6.0.2.4.0.0-169) ...
update-alternatives: error: alternative path /etc/zeppelin/conf.dist doesn't exist
dpkg: error processing package zeppelin-2-4-0-0-169 (--configure):
subprocess installed post-installation script returned error exit status 2
dpkg: dependency problems prevent configuration of zeppelin:
zeppelin depends on zeppelin-2-4-0-0-169; however:
Package zeppelin-2-4-0-0-169 is not configured yet.

dpkg: error processing package zeppelin (--configure):
dependency problems - leaving unconfigured
No apport report written because the error message indicates its a followup error from a previous failure.
Errors were encountered while processing:
zeppelin-2-4-0-0-169
zeppelin
E: Sub-process /usr/bin/dpkg returned an error code (1)`

Any ideas how I can fix this? It looks like a circular dependency issue, perhaps? I tried installing the zeppelin package before the zeppelin-2-4-0-0-169 package, but it didn't help.

M
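One workaround for the "alternative path /etc/zeppelin/conf.dist doesn't exist" failure is to create the missing path and let dpkg finish configuring the half-installed packages. The sketch below only assembles the shell commands (run them with sudo after reviewing); it is a hedged suggestion, not an official fix from this repo:

```python
def conf_dist_repair_commands(conf_dir="/etc/zeppelin/conf.dist"):
    """Return shell commands that create the missing alternatives path and
    let dpkg finish configuring the packages left in a half-installed state.
    Only builds the command strings so they can be reviewed before running."""
    return [
        "mkdir -p %s" % conf_dir,
        "dpkg --configure -a",
        "apt-get -f install",
    ]
```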

zeppelin 0.6.0 integration

Hello,
is there any way to deploy the latest version of Zeppelin with this plugin? I scanned the code and it doesn't seem to be obvious.

Zeppelin processes remain active

When running a Hive paragraph in Zeppelin two things happen:

  • Processes become zombies at the system level after displaying the output.
  • Paragraphs cannot be removed.

Any solution for this? Thank you.

Unknown error when deploying

I'm trying to install Zeppelin according to this guide: https://community.hortonworks.com/articles/34424/apache-zeppelin-on-hdp-242.html, but the deploy process fails and shows the following error:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 236, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 65, in install
    recursive_ownership=True                  
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 113, in __new__
    cls(name.pop(0), env, provider, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 146, in __init__
    raise Fail("%s received unsupported argument %s" % (self, key))
resource_management.core.exceptions.Fail: Directory['/var/run/zeppelin-notebook'] received unsupported argument create_parents

I've been searching for a solution without success. I'm using Ambari 2.2.1.1, HDP 2.4.3.0-227, Python 2.6 and CentOS 7. I hope someone can help me, thanks.

Public Name with Space

Stumbled upon an issue with a leading space while processing config -> public host name. It should be trimmed to avoid a urllib2 open error.
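The fix amounts to one `strip()` before the host name reaches the URL; a minimal sketch, with `clean_public_hostname` as a hypothetical helper name:

```python
def clean_public_hostname(raw_name):
    """Trim stray whitespace from the configured public host name before it
    is interpolated into a URL, so urllib2.urlopen doesn't fail on it."""
    return raw_name.strip()
```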

Hardcoded hdfs user doesn't work in clusters with personalized usernames

The install script runs some commands as a hardcoded hdfs user:

Execute('hadoop fs -mkdir -p /user/' + user, user='hdfs', ignore_failures=True) 
Execute('hadoop fs -chown ' + user + ' /user/' + user, user='hdfs') 
Execute('hadoop fs -chgrp ' + user + ' /user/' + user, user='hdfs') 

Execute('hadoop fs -mkdir -p ' + spark_jar_dir, user='hdfs', ignore_failures=True) 
Execute('hadoop fs -chown ' + user + ' ' + spark_jar_dir, user='hdfs') 
Execute('hadoop fs -chgrp ' + user + ' ' + spark_jar_dir, user='hdfs') 

This won't work if the cluster uses a different user (in our organization we prefix the username with the cluster name: <clustername>-hdfs):

Skipping failure of Execute['hadoop fs -mkdir -p /user/zeppelin'] due to ignore_failures. Failure reason: Execution of 'hadoop fs -mkdir -p /user/zeppelin' returned 1. 16/09/05 15:15:47 WARN ipc.Client: Exception encountered while connecting to the server : 
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
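The fix would be to take the HDFS superuser from cluster configuration instead of the literal 'hdfs'. A hedged sketch that only builds the (command, run_as) pairs, so the parameterisation is visible; `hdfs_setup_commands` is a hypothetical helper, not the repo's actual code:

```python
def hdfs_setup_commands(service_user, hdfs_user):
    """Build the (command, run_as) pairs for the HDFS home-dir setup,
    taking the HDFS superuser as a parameter instead of hardcoding 'hdfs'."""
    home = "/user/%s" % service_user
    return [
        ("hadoop fs -mkdir -p %s" % home, hdfs_user),
        ("hadoop fs -chown %s %s" % (service_user, home), hdfs_user),
        ("hadoop fs -chgrp %s %s" % (service_user, home), hdfs_user),
    ]
```

In an Ambari service script, `hdfs_user` would typically come from `config['configurations']['hadoop-env']['hdfs_user']` rather than a literal.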

access error

Environment
14.04.1-Ubuntu
HDP-2.3.4.0-3485

I am getting the following access error when running install; it looks like an access-rights problem on /var/log/zeppelin/zeppelin-setup.log:

ls -al /var/log/zeppelin
total 8
drwxr-xr-x  2 zeppelin zeppelin 4096 Feb 24 12:47 .
drwxrwxr-x 36 root     syslog   4096 Feb 24 12:47 ..
-rw-r--r--  1 zeppelin zeppelin    0 Feb 24 12:47 zeppelin-setup.log
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 301, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 58, in install
    Execute('echo spark_version:' + params.spark_version + ' detected for spark_home: ' + params.spark_home + ' >> ' + params.zeppelin_log_file)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
2016-02-24 12:47:51,605 - File['/var/log/zeppelin/zeppelin-setup.log'] {'content': '', 'owner': 'zeppelin', 'group': 'zeppelin', 'mode': 0644}
2016-02-24 12:47:51,638 - Writing File['/var/log/zeppelin/zeppelin-setup.log'] because it doesn't exist
2016-02-24 12:47:51,661 - Changing owner for /var/log/zeppelin/zeppelin-setup.log from 0 to zeppelin
2016-02-24 12:47:51,662 - Changing group for /var/log/zeppelin/zeppelin-setup.log from 0 to zeppelin
2016-02-24 12:47:51,684 - Execute['echo spark_version:1.5 detected for spark_home: /usr/hdp/current/spark-client/ >> /var/log/zeppelin/zeppelin-setup.log'] {}

Zeppelin not restarting

I keep getting the next message when trying to restart Zeppelin service:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/master.py", line 312, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/master.py", line 185, in start
    + params.zeppelin_log_file, user=params.zeppelin_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/hdp/current/zeppelin-server/bin/zeppelin-daemon.sh restart >> /var/log/zeppelin/zeppelin-setup.log' returned 1. mkdir: cannot create directory ‘/var/run/zeppelin’: Permission denied
/usr/hdp/current/zeppelin-server/bin/zeppelin-daemon.sh: line 187: /var/run/zeppelin/zeppelin-zeppelin-XXXXXXXXXX.pid: No such file or directory
cat: /var/run/zeppelin/zeppelin-zeppelin-XXXXXXXXXX.pid: No such file or directory

The important part is here: mkdir: cannot create directory ‘/var/run/zeppelin’: Permission denied

To solve it:

FOLDER=/var/run/zeppelin

sudo mkdir -p $FOLDER
sudo chown -Rf zeppelin:hadoop $FOLDER

Will you fix it for the next release? Thank you.

Zeppelin deployment in the host that has no internet or git support

Hi,

I would like to deploy the zeppelin service in the HDP 2.4 platform.
My data platform has no internet connection, and git is not installed on that host. How do I deploy this on that platform?

These steps are not correct for that case, or I am missing something:

VERSION=$(hdp-select status hadoop-client | sed 's/hadoop-client - \([0-9]\.[0-9]\).*/\1/')

sudo git clone https://github.com/hortonworks-gallery/ambari-zeppelin-service.git /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/ZEPPELIN
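For an air-gapped host, one approach is to download the repository as a tarball on a machine with internet access, copy it over, and extract it into the stacks tree in place of the `git clone`. A hedged sketch; `install_service_from_tarball` is a hypothetical helper and the directory layout mirrors the clone target above:

```python
import os
import tarfile

def install_service_from_tarball(tarball_path, stacks_dir, version):
    """Extract a pre-downloaded repo tarball into the Ambari stacks tree,
    as an offline alternative to the `git clone` step."""
    dest = os.path.join(stacks_dir, "HDP", version, "services", "ZEPPELIN")
    if not os.path.isdir(dest):
        os.makedirs(dest)
    with tarfile.open(tarball_path) as tar:
        tar.extractall(dest)
    return dest
```

Note that GitHub release tarballs nest everything under a top-level `<repo>-<branch>/` directory, so you may need to move the contents up one level after extracting.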

Local variable 'response' referenced before assignment

File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 230, in post_request

jsonresp = json.loads(response.decode('utf-8'))
UnboundLocalError: local variable 'response' referenced before assignment

The assignment above should only happen if no exception occurred.
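The shape of the fix: decode the response only when one was actually received, so a failed request returns None instead of tripping the UnboundLocalError. A minimal sketch with a hypothetical helper name, not the repo's actual `post_request`:

```python
import json

def parse_post_response(response_bytes):
    """Decode the POST response only when one was actually received; when
    the request raised, the caller passes None and we return None instead
    of referencing an unassigned `response` variable."""
    if response_bytes is None:
        return None
    return json.loads(response_bytes.decode("utf-8"))
```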

start error

After installing ambari-zeppelin-service, when I start the service, it pops up the following error:

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 279, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 252, in start
pidfile=glob.glob(status_params.zeppelin_pid_dir + '/zeppelin-'+params.zeppelin_user+'*.pid')[0]
IndexError: list index out of range

It seems master.py is not correct. When I change pidfile=glob.glob(status_params.zeppelin_pid_dir + '/zeppelin-'+params.zeppelin_user+'*.pid')[0] to pidfile=glob.glob(status_params.zeppelin_pid_dir + '/zeppelin-'+status_params.zeppelin_user+'*.pid')[0], Zeppelin starts successfully in Ambari.
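Independent of which params module supplies the user, indexing `[0]` on an empty glob result raises the IndexError seen above whenever no pidfile exists yet. A hedged sketch of a guard; `find_zeppelin_pidfile` is a hypothetical helper, not the repo's code:

```python
import glob
import os

def find_zeppelin_pidfile(pid_dir, zeppelin_user):
    """Return the first matching pidfile, or None when no pidfile exists
    yet (e.g. first start), instead of letting [0] raise IndexError."""
    pattern = os.path.join(pid_dir, "zeppelin-%s*.pid" % zeppelin_user)
    matches = glob.glob(pattern)
    return matches[0] if matches else None
```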

Procedure to update Zeppelin

Zeppelin changes fast; it would be good to know how to update it easily. Maybe a small button or a shell script?

Thanks,

Authentication

I installed the Zeppelin service through Ambari, but it does not authenticate users. Do we need to enable any settings to make Zeppelin authenticate users in Ambari?

Run button

Hello all
I want to know: is it possible to run paragraph code with an Angular button?
thx :)

Service install wizard

OS : Centos 7
Cluster of 5 servers + 1 Ambari server (2.1.1)

Hi there,

To begin, thank you for your work on the integration in the Ambari UI.

My issue:

After cloning the repo into the services directory and restarting the Ambari server, when I add the service the wizard stays on step 4 with "Loading". I tried twice more, with the same result.

Do you have any suggestion ?

I checked ambari-server.log and found no clue...

Have a nice day :)

change service configure

hello, I have a custom service in Ambari. When I changed its configuration via the Ambari UI it reported success, but the value didn't change on the actual Linux machine. Can you give me some help? Thanks a lot.

Install fails with 'UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 1033: ordinal not in range(128)'

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 230, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 101, in install
Execute(service_packagedir + '/scripts/setup_snapshot.sh '+params.zeppelin_dir+' '+params.hive_server_host+' '+params.hive_metastore_host+' '+params.hive_metastore_port+' FIRSTLAUNCH ' + params.spark_jar + ' ' + params.zeppelin_host + ' ' + str(params.zeppelin_port) + ' '+ str(params.setup_view) + ' >> ' + params.zeppelin_log_file, user=params.zeppelin_user)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 258, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 290, in _call
err_msg = Logger.filter_text(("Execution of '%s' returned %d. %s") % (command_alias, code, all_output))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 1033: ordinal not in range(128)
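The crash happens while formatting the error message: the command output contains a non-ASCII byte (0xe2 typically starts a UTF-8 curly quote) and Python 2 falls back to the ascii codec. A hedged sketch of a defensive decode; `to_text` is a hypothetical helper, not the resource_management API:

```python
def to_text(raw, encoding="utf-8"):
    """Decode command output defensively. Byte 0xe2 is typically the first
    byte of a UTF-8 curly quote; decoding with errors='replace' keeps the
    log line readable instead of crashing the whole install step."""
    if isinstance(raw, bytes):
        return raw.decode(encoding, "replace")
    return raw
```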

Unable to install on Ubuntu LTS

Hello,
This is the error I receive when trying to install on Ubuntu LTS.

Ambari 2.2
HDP 2.3
Ubuntu 14.04.4 LTS

resource_management.core.exceptions.Fail: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install zeppelin' returned 100. Reading package lists...
Building dependency tree...
Reading state information...

E: Unable to locate package zeppelin

setup_snapshot.sh

setup_snapshot.sh returns "${SETUP_VIEW,,}: bad substitution"; commenting that line out solves the issue. It also tries to execute
"/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/zeppelin-stack/package/scripts/setup_snapshot.sh /opt/incubator-zeppelin HOST 9083 HOST 9995 True", but the validation compares against lowercase "true", so it never configures the Ambari view.
Tested on
OS: SUSE Linux Enterprise Server 11 SP3
HDP: 2.3.0.0-2557
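The `${SETUP_VIEW,,}` expansion is a bash 4 lowercase feature, which the older bash on SLES 11 lacks. A hedged sketch of doing the same normalisation on the Python side before the shell script is invoked; `should_setup_view` is a hypothetical helper name:

```python
def should_setup_view(setup_view):
    """Normalise the flag in Python instead of relying on bash 4's ${VAR,,}
    lowercase expansion (which SLES 11's bash lacks): Ambari passes 'True',
    while the script compares against lowercase 'true'."""
    return str(setup_view).strip().lower() == "true"
```

A portable shell-side alternative would be piping the value through `tr '[:upper:]' '[:lower:]'` before the comparison.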

ambari-zeppelin-service install error

Scenario: installing Zeppelin on a node. Maven is configured on the node hosting Zeppelin, but the installation fails.
My Maven configuration (below):


Apache Maven 3.2.2 (45f7c06d68e745d05611f7fd14efb6594181933e; 2014-06-17T09:51:42-04:00)
Maven home: /usr/bin/maven
Java version: 1.7.0_85, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"

Error output (see below):
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 221, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package/scripts/master.py", line 77, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install apache-maven' returned 1. Error: Nothing to do
stdout:
2015-08-16 06:34:48,266 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-08-16 06:34:48,269 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jce_policy-8.zip')}
2015-08-16 06:34:48,269 - Not downloading the file from http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
2015-08-16 06:34:48,270 - Group['spark'] {'ignore_failures': False}
2015-08-16 06:34:48,271 - Group['zeppelin'] {'ignore_failures': False}
2015-08-16 06:34:48,271 - Group['hadoop'] {'ignore_failures': False}
2015-08-16 06:34:48,272 - Group['users'] {'ignore_failures': False}
2015-08-16 06:34:48,272 - Group['knox'] {'ignore_failures': False}
2015-08-16 06:34:48,273 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,274 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,275 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,276 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,277 - User['atlas'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,278 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,279 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,280 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,281 - User['zeppelin'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,282 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,283 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,284 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-08-16 06:34:48,285 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,286 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,287 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,288 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,289 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,290 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,291 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,292 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,293 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-08-16 06:34:48,294 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-08-16 06:34:48,296 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-08-16 06:34:48,312 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-08-16 06:34:48,313 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-08-16 06:34:48,318 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-08-16 06:34:48,320 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-08-16 06:34:48,335 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2015-08-16 06:34:48,336 - Group['hdfs'] {'ignore_failures': False}
2015-08-16 06:34:48,337 - User['hdfs'] {'ignore_failures': False, 'groups': ['hadoop', 'hdfs']}
2015-08-16 06:34:48,338 - Directory['/etc/hadoop'] {'mode': 0755}
2015-08-16 06:34:48,364 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-08-16 06:34:48,387 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.0.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-08-16 06:34:48,403 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2015-08-16 06:34:48,405 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-08-16 06:34:48,410 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2015-08-16 06:34:48,411 - Package['unzip'] {}
2015-08-16 06:34:48,599 - Skipping installation of existing package unzip
2015-08-16 06:34:48,599 - Package['curl'] {}
2015-08-16 06:34:48,658 - Skipping installation of existing package curl
2015-08-16 06:34:48,658 - Package['hdp-select'] {}
2015-08-16 06:34:48,716 - Skipping installation of existing package hdp-select
2015-08-16 06:34:48,717 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-08-16 06:34:48,718 - File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] {'content': DownloadSource('http://ip-172-31-40-144.us-west-2.compute.internal:8080/resources//jdk-8u40-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'}
2015-08-16 06:34:48,733 - Skipping File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] due to not_if
2015-08-16 06:34:48,734 - Directory['/usr/jdk64'] {}
2015-08-16 06:34:48,735 - Execute['('chmod', 'a+x', '/usr/jdk64')'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java', 'sudo': True}
2015-08-16 06:34:48,749 - Skipping Execute['('chmod', 'a+x', '/usr/jdk64')'] due to not_if
2015-08-16 06:34:48,750 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java'}
2015-08-16 06:34:48,764 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] due to not_if
2015-08-16 06:34:48,765 - File['/usr/jdk64/jdk1.8.0_40/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2015-08-16 06:34:48,767 - Execute['('chgrp', '-R', 'hadoop', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-08-16 06:34:48,815 - Execute['('chown', '-R', 'root', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-08-16 06:34:49,216 - Execute['echo User selected spark_version:1.3'] {}
2015-08-16 06:34:49,231 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2015-08-16 06:34:49,250 - Execute['hadoop fs -mkdir -p /user/zeppelin'] {'ignore_failures': True, 'user': 'hdfs'}
2015-08-16 06:34:52,983 - Execute['hadoop fs -chown zeppelin /user/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:34:56,878 - Execute['hadoop fs -chgrp zeppelin /user/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:00,728 - Execute['hadoop fs -mkdir -p hdfs:///apps/zeppelin'] {'ignore_failures': True, 'user': 'hdfs'}
2015-08-16 06:35:04,464 - Execute['hadoop fs -chown zeppelin hdfs:///apps/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:08,144 - Execute['hadoop fs -chgrp zeppelin hdfs:///apps/zeppelin'] {'user': 'hdfs'}
2015-08-16 06:35:11,941 - Directory['/var/run/zeppelin-notebook'] {'owner': 'zeppelin', 'group': 'zeppelin', 'recursive': True}
2015-08-16 06:35:11,943 - Directory['/var/log/zeppelin'] {'owner': 'zeppelin', 'group': 'zeppelin', 'recursive': True}
2015-08-16 06:35:11,945 - Execute['touch /var/log/zeppelin/zeppelin-setup.log'] {'user': 'zeppelin'}
2015-08-16 06:35:12,031 - Execute['rm -rf /opt/incubator-zeppelin'] {'ignore_failures': True}
2015-08-16 06:35:12,045 - Execute['mkdir /opt/incubator-zeppelin'] {}
2015-08-16 06:35:12,060 - Execute['chown -R zeppelin:zeppelin /opt/incubator-zeppelin'] {}
2015-08-16 06:35:12,074 - Execute['echo Processing with zeppelin tar compiled with spark 1.3'] {}
2015-08-16 06:35:12,086 - Execute['echo Installing packages'] {}
2015-08-16 06:35:12,100 - Package['git'] {}
2015-08-16 06:35:12,284 - Skipping installation of existing package git
2015-08-16 06:35:12,285 - Package['java-1.7.0-openjdk-devel'] {}
2015-08-16 06:35:12,345 - Skipping installation of existing package java-1.7.0-openjdk-devel
2015-08-16 06:35:12,346 - Package['apache-maven'] {}
2015-08-16 06:35:12,405 - Installing package apache-maven ('/usr/bin/yum -d 0 -e 0 -y install apache-maven')
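The SLES failure described above comes down to a hard-coded package name: the service declares `gcc-gfortran`, which is the RHEL/CentOS name, while SUSE ships the same compiler as `gcc-fortran`. A minimal sketch of a distro-aware lookup (the names of the dict and function are illustrative, not the actual metainfo.xml/master.py change):

```python
# Hypothetical sketch: pick the Fortran compiler package name per OS family,
# since SLES ships it as "gcc-fortran" while RHEL/CentOS use "gcc-gfortran".
FORTRAN_PACKAGE = {
    "redhat": "gcc-gfortran",   # RHEL, CentOS, Fedora
    "suse":   "gcc-fortran",    # SLES 11.x naming
    "ubuntu": "gfortran",       # Debian/Ubuntu naming
}

def fortran_package_for(os_family):
    """Return the distro-specific package name, defaulting to the RHEL name."""
    return FORTRAN_PACKAGE.get(os_family, "gcc-gfortran")
```

With something like this, the install step could pass `fortran_package_for(os_family)` to `Package(...)` instead of a literal string.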

UBUNTU 14.04 Current Version: 2.3.2.0-2950 ambari-zeppelin-service install error

2016-06-07 19:54:27,248 - Group['spark'] {}
2016-06-07 19:54:27,249 - Group['zeppelin'] {}
2016-06-07 19:54:27,249 - Adding group Group['zeppelin']
2016-06-07 19:54:27,285 - Group['hadoop'] {}
2016-06-07 19:54:27,285 - Group['users'] {}
2016-06-07 19:54:27,286 - User['storm'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,287 - User['zookeeper'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,288 - User['spark'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,288 - User['oozie'] {'gid': 'hadoop', 'groups': [u'users']}
2016-06-07 19:54:27,289 - User['ambari-qa'] {'gid': 'hadoop', 'groups': [u'users']}
2016-06-07 19:54:27,290 - User['hdfs'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,291 - User['zeppelin'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,292 - Adding user User['zeppelin']
2016-06-07 19:54:27,512 - User['yarn'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,513 - User['mapred'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-06-07 19:54:27,513 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-07 19:54:27,515 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-06-07 19:54:27,522 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-06-07 19:54:27,522 - Group['hdfs'] {'ignore_failures': False}
2016-06-07 19:54:27,523 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2016-06-07 19:54:27,523 - Directory['/etc/hadoop'] {'mode': 0755}
2016-06-07 19:54:27,536 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-06-07 19:54:27,537 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-06-07 19:54:27,546 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu14/2.x/updates/2.3.4.7', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-06-07 19:54:27,552 - File['/tmp/tmpeQHze_'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu14/2.x/updates/2.3.4.7 HDP main'}
2016-06-07 19:54:27,553 - Writing File['/tmp/tmpeQHze_'] because contents don't match
2016-06-07 19:54:27,561 - File['/tmp/tmphOycwy'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
2016-06-07 19:54:27,562 - Writing File['/tmp/tmphOycwy'] because contents don't match
2016-06-07 19:54:27,562 - File['/etc/apt/sources.list.d/HDP.list'] {'content': StaticFile('/tmp/tmpeQHze_')}
2016-06-07 19:54:27,563 - Writing File['/etc/apt/sources.list.d/HDP.list'] because contents don't match
2016-06-07 19:54:27,564 - checked_call['apt-get update -qq -o Dir::Etc::sourcelist=sources.list.d/HDP.list -o Dir::Etc::sourceparts=- -o APT::Get::List-Cleanup=0'] {'sudo': True, 'quiet': False}
2016-06-07 19:54:36,343 - checked_call returned (0, '')
2016-06-07 19:54:36,344 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu14', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-06-07 19:54:36,346 - File['/tmp/tmpYbbAZu'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu14 HDP-UTILS main'}
2016-06-07 19:54:36,347 - Writing File['/tmp/tmpYbbAZu'] because contents don't match
2016-06-07 19:54:36,347 - File['/tmp/tmpSTZKqf'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
2016-06-07 19:54:36,347 - Writing File['/tmp/tmpSTZKqf'] because contents don't match
2016-06-07 19:54:36,348 - File['/etc/apt/sources.list.d/HDP-UTILS.list'] {'content': StaticFile('/tmp/tmpYbbAZu')}
2016-06-07 19:54:36,348 - Writing File['/etc/apt/sources.list.d/HDP-UTILS.list'] because contents don't match
2016-06-07 19:54:36,349 - checked_call['apt-get update -qq -o Dir::Etc::sourcelist=sources.list.d/HDP-UTILS.list -o Dir::Etc::sourceparts=- -o APT::Get::List-Cleanup=0'] {'sudo': True, 'quiet': False}
2016-06-07 19:54:49,422 - checked_call returned (0, '')
2016-06-07 19:54:49,423 - Package['unzip'] {}
2016-06-07 19:54:49,439 - Skipping installation of existing package unzip
2016-06-07 19:54:49,440 - Package['curl'] {}
2016-06-07 19:54:49,455 - Skipping installation of existing package curl
2016-06-07 19:54:49,455 - Package['hdp-select'] {}
2016-06-07 19:54:49,470 - Skipping installation of existing package hdp-select
2016-06-07 19:54:49,652 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2016-06-07 19:54:49,660 - Execute['echo platform.linux_distribution:Ubuntu+14.04+trusty'] {}
2016-06-07 19:54:49,664 - Package['liblapack-dev'] {}
2016-06-07 19:54:49,681 - Installing package liblapack-dev ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install liblapack-dev')
2016-06-07 19:55:06,178 - Package['gfortran'] {}
2016-06-07 19:55:06,194 - Installing package gfortran ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install gfortran')
2016-06-07 19:56:11,435 - Package['python-dev'] {}
2016-06-07 19:56:11,454 - Installing package python-dev ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install python-dev')
2016-06-07 19:57:04,434 - Package['g++'] {}
2016-06-07 19:57:04,452 - Installing package g++ ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install g++')
2016-06-07 19:57:58,456 - Package['python-pip'] {}
2016-06-07 19:57:58,474 - Installing package python-pip ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install python-pip')

2016-06-07 19:58:19,340 - Package['zeppelin'] {}

2016-06-07 19:58:19,358 - Installing package zeppelin ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install zeppelin')
2016-06-07 19:58:19,729 - Execution of '['/usr/bin/apt-get', '-q', '-o', 'Dpkg::Options::=--force-confdef', '--allow-unauthenticated', '--assume-yes', 'install', u'zeppelin']' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package zeppelin
2016-06-07 19:58:19,730 - Failed to install package zeppelin. Executing /usr/bin/apt-get update -qq
2016-06-07 19:58:51,124 - Retrying to install package zeppelin
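The log above shows Ambari's package provider retrying after a refresh: install fails, `apt-get update -qq` runs, then the install is attempted again. A testable sketch of that flow (the `run` callback is injected here for illustration; the real provider shells out directly) makes clear why the retry cannot help when the configured HDP repo for this Ubuntu version simply contains no `zeppelin` package — the real fix is pointing apt at a repository that ships it:

```python
def install_with_refresh(package, run, retries=1):
    """Mirror the retry-after-update pattern from the log.

    run(cmd_list) -> exit code; injected so the flow can be exercised
    without touching apt. On failure, refresh the package index once,
    then retry. If the repo never contains the package, both attempts
    fail and the function returns False.
    """
    for attempt in range(retries + 1):
        if run(["apt-get", "install", "-y", package]) == 0:
            return True
        if attempt < retries:
            run(["apt-get", "update", "-qq"])
    return False
```

This is a sketch of the observed behavior, not the resource_management implementation itself.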

Zeppelin install failed on Azure HDP 2.5 standard 3 master 3 node cluster

master1.t53hiwxtr3xunlr5uyfutxkiaa.bx.internal.cloudapp.net
Tasks
Copy Open Zeppelin Notebook Install
stderr: /var/lib/ambari-agent/data/errors-221.txt

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 52, in install
Execute('sudo yum install -y epel-release')
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'sudo yum install -y epel-release' returned 1. sudo: sorry, you must have a tty to run sudo
stdout: /var/lib/ambari-agent/data/output-221.txt

2017-04-21 02:55:53,398 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2017-04-21 02:55:53,398 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2017-04-21 02:55:53,398 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2017-04-21 02:55:53,420 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2017-04-21 02:55:53,420 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2017-04-21 02:55:53,441 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2017-04-21 02:55:53,441 - Ensuring that hadoop has the correct symlink structure
2017-04-21 02:55:53,442 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-04-21 02:55:53,443 - Group['spark'] {}
2017-04-21 02:55:53,444 - Group['zeppelin'] {}
2017-04-21 02:55:53,444 - Group['hadoop'] {}
2017-04-21 02:55:53,444 - Group['users'] {}
2017-04-21 02:55:53,444 - Group['knox'] {}
2017-04-21 02:55:53,445 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,445 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,446 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,446 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,447 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,447 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,448 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,448 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,449 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,450 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,450 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,451 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,451 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,452 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,453 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,453 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,454 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,454 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,455 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,455 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,456 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-21 02:55:53,457 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-04-21 02:55:53,462 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-04-21 02:55:53,462 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-04-21 02:55:53,463 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-21 02:55:53,463 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-04-21 02:55:53,467 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-04-21 02:55:53,467 - Group['hdfs'] {}
2017-04-21 02:55:53,468 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-04-21 02:55:53,468 - Directory['/etc/hadoop'] {'mode': 0755}
2017-04-21 02:55:53,481 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-04-21 02:55:53,482 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2017-04-21 02:55:53,493 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-04-21 02:55:53,500 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-21 02:55:53,501 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-04-21 02:55:53,503 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-21 02:55:53,504 - Package['unzip'] {}
2017-04-21 02:55:53,579 - Skipping installation of existing package unzip
2017-04-21 02:55:53,579 - Package['curl'] {}
2017-04-21 02:55:53,589 - Skipping installation of existing package curl
2017-04-21 02:55:53,589 - Package['hdp-select'] {}
2017-04-21 02:55:53,598 - Skipping installation of existing package hdp-select
2017-04-21 02:55:53,768 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2017-04-21 02:55:53,775 - Execute['echo platform.linux_distribution:CentOS Linux+7.2.1511+Core'] {}
2017-04-21 02:55:53,777 - Execute['echo Installing python packages for Centos'] {}
2017-04-21 02:55:53,780 - Execute['sudo yum install -y epel-release'] {}
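The failure here is that master.py line 52 shells out to a literal `sudo yum install -y epel-release`, and the host's /etc/sudoers has `Defaults requiretty`, so sudo refuses to run without a terminal. Two plausible fixes: drop `requiretty` for the agent user in sudoers, or let Ambari's own `Execute(cmd, sudo=True)` handle privilege, since the agent normally already runs as root and its sudo wrapper skips sudo entirely in that case. A hypothetical sketch of that wrapper behavior (helper name is illustrative):

```python
import os

def privileged_cmd(cmd):
    """Sketch of an ambari-sudo.sh-style wrapper: when the caller is
    already root (the usual case for the Ambari agent), run the command
    directly and skip sudo entirely, sidestepping 'Defaults requiretty';
    otherwise fall back to non-interactive sudo ('-n'), which fails fast
    instead of prompting."""
    if os.geteuid() == 0:
        return list(cmd)
    return ["sudo", "-n"] + list(cmd)
```

Used as `privileged_cmd(["yum", "install", "-y", "epel-release"])`, this avoids invoking the `sudo` binary at all in the root-agent case.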
