ambari-nifi-service's People

Contributors

abajwa-hw, aperepel, oluies, veteranbv, zacblanco

ambari-nifi-service's Issues

'ascii' codec can't decode byte 0xe2 in position 191: ordinal not in range(128)

Env: Ubuntu 14.04.1 HDP 2.4

I got this error:

  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 163, in run
    Logger.info("Skipping failure of %s due to ignore_failures. Failure reason: %s" % (resource, str(ex)))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 191: ordinal not in range(128)

Lang settings

$ sudo -u nifi sh
sh-4.3$ echo $LANG
en_US.UTF-8
xadmin@c3gw:~$ sudo -u root sh
sh-4.3# echo $LANG
en_US.UTF-8
sh-4.3# python
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> s = '(\xef\xbd\xa1\xef\xbd\xa5\xcf\x89\xef\xbd\xa5\xef\xbd\xa1)\xef\xbe\x89'
>>> s1 = s.decode('utf-8')
>>> print s1
(。・ω・。)ノ
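The failure can be reproduced outside Ambari. Below is a minimal Python 2 sketch (an illustration, not the project's code) of what appears to go wrong in environment.py: when the % formatting mixes a unicode value with a byte string containing UTF-8 bytes, Python 2 promotes the result to unicode and implicitly decodes the byte string with ASCII. The mv error message with curly quotes is only an assumption about what str(ex) contained; 0xe2 is the first byte of a UTF-8 curly quote.

# -*- coding: utf-8 -*-
# Sketch of the failing Logger.info formatting in environment.py (Python 2).
resource = u"Execute['mv flow.xml.gz ...']"                   # unicode repr of the resource (assumed)
err = "mv: cannot stat \xe2\x80\x98flow.xml.gz\xe2\x80\x99"   # assumed shell message; \xe2 = UTF-8 curly quote

try:
    "Skipping failure of %s due to ignore_failures. Failure reason: %s" % (resource, err)
except UnicodeDecodeError as e:
    print "reproduced:", e   # 'ascii' codec can't decode byte 0xe2 ...

# One possible workaround (hypothetical, not the upstream fix): decode the byte
# string explicitly before formatting.
msg = u"Skipping failure of %s due to ignore_failures. Failure reason: %s" % (
    resource, err.decode("utf-8", "replace"))
print msg.encode("utf-8")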

Full logs:


Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/NIFI/package/scripts/master.py", line 201, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/NIFI/package/scripts/master.py", line 109, in install
    self.configure(env, True)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/NIFI/package/scripts/master.py", line 145, in configure
    Execute(format("cd {params.conf_dir}; mv flow.xml.gz flow_$(date +%d-%m-%Y).xml.gz ;"), user=params.nifi_user, ignore_failures=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in **init**
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 163, in run
    Logger.info("Skipping failure of %s due to ignore_failures. Failure reason: %s" % (resource, str(ex)))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 191: ordinal not in range(128)
stdout:   /var/lib/ambari-agent/data/output-2802.txt

2016-03-21 17:50:08,974 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2016-03-21 17:50:08,974 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2016-03-21 17:50:08,975 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-03-21 17:50:09,003 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2016-03-21 17:50:09,003 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-03-21 17:50:09,031 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2016-03-21 17:50:09,031 - Ensuring that hadoop has the correct symlink structure
2016-03-21 17:50:09,031 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-03-21 17:50:09,033 - Group['spark'] {}
2016-03-21 17:50:09,034 - Group['zeppelin'] {}
2016-03-21 17:50:09,034 - Group['hadoop'] {}
2016-03-21 17:50:09,035 - Group['nifi'] {}
2016-03-21 17:50:09,035 - Group['users'] {}
2016-03-21 17:50:09,035 - Group['knox'] {}
2016-03-21 17:50:09,035 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,036 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,037 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,037 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,038 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-03-21 17:50:09,039 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,039 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-03-21 17:50:09,040 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-03-21 17:50:09,041 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,041 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,042 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,043 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,043 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,044 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-03-21 17:50:09,044 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,045 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,046 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,046 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,047 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,048 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,048 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,049 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,050 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-03-21 17:50:09,050 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-03-21 17:50:09,054 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-03-21 17:50:09,059 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-03-21 17:50:09,059 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-03-21 17:50:09,060 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-03-21 17:50:09,061 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-03-21 17:50:09,066 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-03-21 17:50:09,067 - Group['hdfs'] {}
2016-03-21 17:50:09,067 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-03-21 17:50:09,068 - Directory['/etc/hadoop'] {'mode': 0755}
2016-03-21 17:50:09,083 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-03-21 17:50:09,083 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-03-21 17:50:09,097 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu14/2.x/updates/2.4.0.0/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-03-21 17:50:09,106 - File['/tmp/tmpnWfyts'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu14/2.x/updates/2.4.0.0/ HDP main'}
2016-03-21 17:50:09,106 - Writing File['/tmp/tmpnWfyts'] because contents don't match
2016-03-21 17:50:09,107 - File['/tmp/tmpdRtnXR'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
2016-03-21 17:50:09,107 - Writing File['/tmp/tmpdRtnXR'] because contents don't match
2016-03-21 17:50:09,108 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu14', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-03-21 17:50:09,110 - File['/tmp/tmpfGdQL8'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu14 HDP-UTILS main'}
2016-03-21 17:50:09,110 - Writing File['/tmp/tmpfGdQL8'] because contents don't match
2016-03-21 17:50:09,110 - File['/tmp/tmpIqqJht'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
2016-03-21 17:50:09,110 - Writing File['/tmp/tmpIqqJht'] because contents don't match
2016-03-21 17:50:09,111 - Package['unzip'] {}
2016-03-21 17:50:09,146 - Skipping installation of existing package unzip
2016-03-21 17:50:09,146 - Package['curl'] {}
2016-03-21 17:50:09,187 - Skipping installation of existing package curl
2016-03-21 17:50:09,187 - Package['hdp-select'] {}
2016-03-21 17:50:09,241 - Skipping installation of existing package hdp-select
2016-03-21 17:50:09,442 - Execute['cp /etc/sudoers /etc/sudoers.bak'] {}
2016-03-21 17:50:09,449 - Execute['echo "nifi    ALL=(ALL)       NOPASSWD: ALL" >> /etc/sudoers'] {}
2016-03-21 17:50:09,452 - Execute['echo Creating /var/log/nifi /var/run/nifi'] {}
2016-03-21 17:50:09,456 - Directory['/var/run/nifi'] {'owner': 'nifi', 'group': 'nifi', 'recursive': True}
2016-03-21 17:50:09,456 - Creating directory Directory['/var/run/nifi'] since it doesn't exist.
2016-03-21 17:50:09,456 - Changing owner for /var/run/nifi from 0 to nifi
2016-03-21 17:50:09,456 - Changing group for /var/run/nifi from 0 to nifi
2016-03-21 17:50:09,457 - Directory['/var/log/nifi'] {'owner': 'nifi', 'group': 'nifi', 'recursive': True}
2016-03-21 17:50:09,457 - Execute['touch /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2016-03-21 17:50:09,474 - Execute['rm -rf /opt/nifi-0.5.1.1.1.2.0-32'] {'ignore_failures': True}
2016-03-21 17:50:09,478 - Execute['mkdir -p /opt/nifi-0.5.1.1.1.2.0-32'] {}
2016-03-21 17:50:09,482 - Execute['chown -R nifi:nifi /opt/nifi-0.5.1.1.1.2.0-32'] {}
2016-03-21 17:50:09,486 - Execute['echo Compiling nifi from source'] {}
2016-03-21 17:50:09,490 - Execute['cd /opt; git clone https://git-wip-us.apache.org/repos/asf/nifi.git /opt/nifi-0.5.1.1.1.2.0-32 >> /var/log/nifi/nifi-setup.log'] {}
2016-03-21 17:50:37,284 - Execute['chown -R nifi:nifi /opt/nifi-0.5.1.1.1.2.0-32'] {}
2016-03-21 17:50:37,355 - Execute['cd /opt/nifi-0.5.1.1.1.2.0-32; mvn -T C2.0 clean install -DskipTests >> /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2016-03-21 17:58:23,924 - File['/opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/nifi.properties'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-03-21 17:58:23,924 - Writing File['/opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/nifi.properties'] because contents don't match
2016-03-21 17:58:23,925 - Changing group for /opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/nifi.properties from 1003 to nifi
2016-03-21 17:58:23,925 - Execute['echo "First time setup so generating flow.xml.gz" >> /var/log/nifi/nifi-setup.log'] {}
2016-03-21 17:58:23,930 - File['/opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/flow.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-03-21 17:58:23,931 - Writing File['/opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/flow.xml'] because it doesn't exist
2016-03-21 17:58:23,931 - Changing owner for /opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/flow.xml from 0 to nifi
2016-03-21 17:58:23,931 - Changing group for /opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf/flow.xml from 0 to nifi
2016-03-21 17:58:23,931 - Execute['cd /opt/nifi-0.5.1.1.1.2.0-32/nifi-assembly/target/nifi-0.6.0-SNAPSHOT-bin/nifi-0.6.0-SNAPSHOT/conf; mv flow.xml.gz flow_$(date +%d-%m-%Y).xml.gz ;'] {'ignore_failures': True, 'user': 'nifi'}

Howto: Upgrade to 0.5.0

Hi,

I've installed NiFi as a service through Ambari on my cluster, which works great, but I want to upgrade to the latest release, 0.5.0. What steps do I need to take in order to do this?

Thanks,
Mike

Service install issue

When trying to add the service, I get the following error: 500 status code received on GET method for API: /api/v1/stacks/HDP/versions/2.4/recommendations

new version of nifi

Hi all,
do you plan to upgrade the NiFi Ambari service to support NiFi 1.0?

Thanks,
Matteo.

how to specify latest nifi from source?

Hi,

In the README it says: "By default, downloads the current GA version - HDF 1.2.0.0 package (nifi 0.6.0) - but also gives option to build the latest Nifi from source instead"

When installing, what are the specific steps required to build from the latest NiFi source?

Thanks,
Mike

Why does the NiFi service require Slider and Spark 2.0?

Hi Ali,
I'm trying to use your code to set up NiFi on HDP 2.5 with Ambari 2.4, configured with Spark 1.6.
After importing your service package into Ambari and selecting it for installation, it requests adding Slider and Spark2, which are not needed for my demo.
Can you please advise how I can disable this dependency? Is it because your script uses HDF 2.0, which internally references Slider and Spark2?

pid file

I got this during a restart. Shouldn't the "pid" file be stored under "/var"?

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/NIFI/package/scripts/master.py", line 210, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/NIFI/package/scripts/master.py", line 185, in start
    Execute('cat '+params.bin_dir+'/nifi.pid'+" | grep pid | sed 's/pid=\(\.*\)/\\1/' > " + status_params.nifi_pid_file)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'cat /opt/nifi-0.5.1.1.1.2.0-32/bin/nifi.pid | grep pid | sed 's/pid=\(\.*\)/\1/' > /var/run/nifi/nifi.pid' returned 1. /bin/bash: /var/run/nifi/nifi.pid: No such file or directory
grep: write error: Broken pipe
 stdout:
2016-04-22 10:21:06,528 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2016-04-22 10:21:06,528 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2016-04-22 10:21:06,528 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-04-22 10:21:06,557 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2016-04-22 10:21:06,557 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-04-22 10:21:06,587 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2016-04-22 10:21:06,587 - Ensuring that hadoop has the correct symlink structure
2016-04-22 10:21:06,588 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-04-22 10:21:06,870 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2016-04-22 10:21:06,870 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2016-04-22 10:21:06,870 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-04-22 10:21:06,898 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2016-04-22 10:21:06,899 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-04-22 10:21:06,927 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2016-04-22 10:21:06,927 - Ensuring that hadoop has the correct symlink structure
2016-04-22 10:21:06,927 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-04-22 10:21:06,929 - Group['spark'] {}
2016-04-22 10:21:06,930 - Group['zeppelin'] {}
2016-04-22 10:21:06,930 - Group['hadoop'] {}
2016-04-22 10:21:06,930 - Group['nifi'] {}
2016-04-22 10:21:06,931 - Group['users'] {}
2016-04-22 10:21:06,931 - Group['knox'] {}
2016-04-22 10:21:06,931 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,932 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,932 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,933 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,934 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-22 10:21:06,934 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,935 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-22 10:21:06,936 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-22 10:21:06,936 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,937 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,938 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,938 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,939 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,940 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-22 10:21:06,940 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,941 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,941 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,942 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,943 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,943 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,944 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,945 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,945 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-22 10:21:06,946 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-04-22 10:21:06,947 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-04-22 10:21:06,953 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-04-22 10:21:06,953 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-04-22 10:21:06,954 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-04-22 10:21:06,955 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-04-22 10:21:06,960 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-04-22 10:21:06,961 - Group['hdfs'] {}
2016-04-22 10:21:06,961 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-04-22 10:21:06,962 - Directory['/etc/hadoop'] {'mode': 0755}
2016-04-22 10:21:06,979 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-04-22 10:21:06,979 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-04-22 10:21:06,996 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-04-22 10:21:07,003 - Skipping Execute[('setenforce', '0')] due to not_if
2016-04-22 10:21:07,004 - Directory['/var/log/hadoop'] {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
2016-04-22 10:21:07,006 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
2016-04-22 10:21:07,006 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
2016-04-22 10:21:07,011 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-04-22 10:21:07,013 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-04-22 10:21:07,014 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-04-22 10:21:07,026 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2016-04-22 10:21:07,027 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-04-22 10:21:07,032 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-04-22 10:21:07,036 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2016-04-22 10:21:07,338 - File['/opt/nifi-0.5.1.1.1.2.0-32/conf/nifi.properties'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-04-22 10:21:07,343 - File['/opt/nifi-0.5.1.1.1.2.0-32/conf/bootstrap.conf'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-04-22 10:21:07,348 - File['/opt/nifi-0.5.1.1.1.2.0-32/conf/logback.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-04-22 10:21:07,349 - Execute['echo pid file /var/run/nifi/nifi.pid'] {}
2016-04-22 10:21:07,352 - Execute['echo JAVA_HOME=/usr/jdk64/jdk1.8.0_60'] {}
2016-04-22 10:21:07,356 - Execute['export JAVA_HOME=/usr/jdk64/jdk1.8.0_60;/opt/nifi-0.5.1.1.1.2.0-32/bin/nifi.sh start >> /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2016-04-22 10:21:12,364 - Execute['cat /opt/nifi-0.5.1.1.1.2.0-32/bin/nifi.pid | grep pid | sed 's/pid=\(\.*\)/\1/' > /var/run/nifi/nifi.pid'] {}
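The command fails because /var/run/nifi/nifi.pid cannot be written, i.e. the /var/run/nifi directory is missing at start time. On Ubuntu, /var/run is typically tmpfs, so the directory created during install does not survive a reboot. Below is a minimal sketch (not the project's actual master.py) of how start() could recreate it before extracting the pid, using the same resource_management resources that appear in the install log above. params.nifi_user, params.bin_dir and status_params.nifi_pid_file are the names visible in the traceback; params.nifi_pid_dir is assumed here for illustration.

from resource_management import Script, Directory, Execute

class Master(Script):
    def start(self, env):
        import params, status_params
        env.set_params(params)

        # Recreate the pid directory: /var/run is commonly tmpfs, so the
        # directory created at install time is gone after a reboot.
        Directory(params.nifi_pid_dir,             # e.g. '/var/run/nifi' (assumed name)
                  owner=params.nifi_user,
                  group=params.nifi_user,          # install log shows user and group both 'nifi'
                  recursive=True)

        # Start NiFi as the nifi user (command abbreviated from the log above)...
        Execute(params.bin_dir + '/nifi.sh start >> /var/log/nifi/nifi-setup.log',
                user=params.nifi_user)

        # ...then extract the pid into the Ambari status pid file, as in the traceback.
        Execute('cat ' + params.bin_dir + '/nifi.pid' +
                " | grep pid | sed 's/pid=\(\.*\)/\\1/' > " + status_params.nifi_pid_file)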
