Continuous Self-Service Integration Deployment and Validation
ambari-cassandra-service's Introduction
ambari-cassandra-service's Issues
Fails on HDP 2.5: "cannot import name format_hdp_stack_version"
I am trying to install this on HDP 2.5, and I get:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/CASSANDRA/package/scripts/clients.py", line 38, in <module>
clients().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/CASSANDRA/package/scripts/clients.py", line 32, in install
import params
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/CASSANDRA/package/scripts/params.py", line 16, in <module>
from resource_management.libraries.functions.version import format_hdp_stack_version, compare_versions
ImportError: cannot import name format_hdp_stack_version
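A workaround that has helped on newer Ambari agents is a compatibility import at the top of params.py. This is a sketch under the assumption that the agent renamed format_hdp_stack_version to format_stack_version; the final stub only exists so the module can still be imported outside an Ambari agent:

```python
# Compatibility shim for params.py -- a sketch, assuming newer Ambari
# renamed format_hdp_stack_version to format_stack_version.
try:
    from resource_management.libraries.functions.version import \
        format_hdp_stack_version
except ImportError:
    try:
        from resource_management.libraries.functions.version import \
            format_stack_version as format_hdp_stack_version
    except ImportError:
        # Not running under an Ambari agent (e.g. local testing);
        # fall back to a trivial formatter so the import still succeeds.
        def format_hdp_stack_version(value):
            return str(value)
```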
Facing error: resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install dsc21' returned 1. Error: Nothing to do
While deploying via Ambari, I am facing the error below.
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/CASSANDRA/package/scripts/cassandra_master.py", line 60, in <module>
Cassandra_Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 216, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/CASSANDRA/package/scripts/cassandra_master.py", line 30, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 364, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 43, in action_install
self.install_package(package_name, self.resource.use_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 50, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 68, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 86, in checked_call
return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path, sudo, on_new_line)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 204, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install dsc21' returned 1. Error: Nothing to do
stdout:
2017-03-09 16:22:10,777 - Group['hadoop'] {'ignore_failures': False}
2017-03-09 16:22:10,778 - Group['users'] {'ignore_failures': False}
2017-03-09 16:22:10,778 - Group['spark'] {'ignore_failures': False}
2017-03-09 16:22:10,779 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2017-03-09 16:22:10,779 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,780 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2017-03-09 16:22:10,781 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,781 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,782 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,782 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,783 - User['smoke'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,784 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,784 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,785 - User['cassandra'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,786 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2017-03-09 16:22:10,786 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,787 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,787 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,788 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,789 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,789 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,790 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,791 - User['atlas'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-03-09 16:22:10,792 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-09 16:22:10,796 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-03-09 16:22:10,825 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-03-09 16:22:10,825 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-03-09 16:22:10,826 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-09 16:22:10,828 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-03-09 16:22:10,834 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-03-09 16:22:10,834 - Group['hdfs'] {'ignore_failures': False}
2017-03-09 16:22:10,834 - User['hdfs'] {'ignore_failures': False, 'groups': ['hadoop', 'hdfs']}
2017-03-09 16:22:10,836 - Directory['/etc/hadoop'] {'mode': 0755}
2017-03-09 16:22:10,851 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-03-09 16:22:10,869 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-LABS/Projects/Dal-Preview/2.3.0.0-3/centos6/', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-03-09 16:22:10,879 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('/usr/lib/ambari-server/lib/resource_management/libraries/providers/../data/repo_suse_rhel.j2')}
2017-03-09 16:22:10,880 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-03-09 16:22:10,883 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('/usr/lib/ambari-server/lib/resource_management/libraries/providers/../data/repo_suse_rhel.j2')}
2017-03-09 16:22:10,884 - Package['unzip'] {}
2017-03-09 16:22:11,837 - Skipping installation of existing package unzip
2017-03-09 16:22:11,837 - Package['curl'] {}
2017-03-09 16:22:12,010 - Skipping installation of existing package curl
2017-03-09 16:22:12,010 - Package['hdp-select'] {}
2017-03-09 16:22:12,116 - Skipping installation of existing package hdp-select
Install
2017-03-09 16:22:12,254 - Package['dsc21'] {}
2017-03-09 16:22:12,425 - Installing package dsc21 ('/usr/bin/yum -d 0 -e 0 -y install dsc21')
Cassandra 3.0 support
Will it support Cassandra 3.0 (by changing the package name to dsc30)?
DataStax repo must be added on RHEL/CentOS before DataStax Cassandra 2.1 can be installed with yum
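A minimal sketch of automating that step: write the DataStax community repo definition before Ambari runs yum install dsc21. The repo id, file path, and baseurl here are assumptions based on DataStax's public community RPM repository; gpgcheck is disabled only for brevity:

```python
# Sketch: create /etc/yum.repos.d/datastax.repo so that
# `yum install dsc21` can resolve the package. The baseurl is an
# assumption based on DataStax's public community RPM repo.
DATASTAX_REPO = """\
[datastax]
name=DataStax Repo for Apache Cassandra
baseurl=http://rpm.datastax.com/community
enabled=1
gpgcheck=0
"""

def write_datastax_repo(path="/etc/yum.repos.d/datastax.repo"):
    with open(path, "w") as f:
        f.write(DATASTAX_REPO)
    return path
```

Run once per node (as root) before retrying the install, or call it from the service's install hook.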
WebUI shows no nodes live when they're actually up and pass health checks
I was able to get the plugin working. I'm using this on CentOS, and I first had to install the DataStax yum repo before anything would work (can this be automated?). My main issue now is that the UI is reporting inconsistent information.
The health checks for the "Cluster Nodes" is working (why is it called this? shouldn't they be more descriptive like "C* Nodes"?), but the Ambari UI shows the following:
(ignore the 4 warning alerts, they're not related to Cassandra)
When I run nodetool status, you can see all my nodes are up:
Datacenter: DC1
===============
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
-- Address Load Tokens Owns (effective) Host ID Rack
UN 10.147.0.23 87.84 KB 256 51.4% 300c7c50-e1ca-4979-8fc4-0d7bf48e766b RAC1
UN 10.147.0.22 192.27 KB 256 52.0% 521ffe0d-4a32-4e29-8862-d9297c53e8d2 RAC1
UN 10.147.0.21 234.35 KB 256 48.9% 3c1f75d3-c111-45f0-85bc-cc0a795c5cad RAC1
UN 10.147.0.24 241.48 KB 256 47.7% 24b59f0b-24d4-4322-900c-4657f37e05af RAC1
Changes to the "data_file_directories" parameter are not taken into account.
Apparently, changes to the data_file_directories parameter are not really taken into account (as opposed to those to saved_caches_directory and commitlog_directory).
I am using HDP 2.4.
Thanks in advance for your help.
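One way to confirm whether a change actually reached the node is to inspect the rendered cassandra.yaml on the host rather than the Ambari UI. A minimal sketch, assuming the usual config path and the standard "- /path" list layout under the key (both may differ on HDP):

```python
# Sketch: read the data_file_directories entries out of a rendered
# cassandra.yaml without requiring PyYAML. Assumes entries follow
# the key as "- /path" lines, as in the stock config layout.
def read_data_dirs(path="/etc/cassandra/conf/cassandra.yaml"):
    dirs, in_block = [], False
    for line in open(path):
        stripped = line.strip()
        if stripped.startswith("data_file_directories:"):
            in_block = True
        elif in_block and stripped.startswith("- "):
            dirs.append(stripped[2:].strip())
        elif in_block and stripped and not stripped.startswith("#"):
            break  # next top-level key ends the block
    return dirs
```

If the directories reported here differ from what the Ambari config screen shows, the template for cassandra.yaml is likely not substituting that parameter.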
Add support for metrics and widgets
Error when installing Cassandra 2.1 on HDP 2.6
Hello.
I'm trying to use Cassandra with ambari-server.
I installed ambari-server and some nodes on real machines instead of using the HDP Sandbox.
I'm using Ubuntu 16.04 as my OS. The cluster version is HDP 2.6.3.0.
I followed your instructions to install Cassandra. At the final step, the installation failed with the following error during the Cluster Nodes install:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/CASSANDRA/package/scripts/cassandra_master.py", line 60, in <module>
Cassandra_Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/CASSANDRA/package/scripts/cassandra_master.py", line 30, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 813, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 53, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 75, in wrapper
return function_to_decorate(self, name, *args[2:])
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 376, in install_package
self.checked_call_with_retries(cmd, sudo=True, env=INSTALL_CMD_ENV, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 251, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 268, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install dsc21' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package dsc21
stdout: /var/lib/ambari-agent/data/output-191.txt
2017-12-04 09:59:06,860 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2017-12-04 09:59:06,865 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-12-04 09:59:06,866 - Group['hdfs'] {}
2017-12-04 09:59:06,866 - Group['hadoop'] {}
2017-12-04 09:59:06,867 - Group['users'] {}
2017-12-04 09:59:06,867 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,868 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,868 - User['cassandra'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,869 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,869 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-12-04 09:59:06,870 - User['smoke'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,870 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2017-12-04 09:59:06,871 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,871 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-12-04 09:59:06,872 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-04 09:59:06,873 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-12-04 09:59:06,876 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-12-04 09:59:06,877 - Group['hdfs'] {}
2017-12-04 09:59:06,877 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2017-12-04 09:59:06,878 - FS Type:
2017-12-04 09:59:06,878 - Directory['/etc/hadoop'] {'mode': 0755}
2017-12-04 09:59:06,891 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-12-04 09:59:06,891 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-12-04 09:59:06,905 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-12-04 09:59:06,912 - File['/tmp/tmp6cMkrY'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0 HDP main'}
2017-12-04 09:59:06,912 - Writing File['/tmp/tmp6cMkrY'] because contents don't match
2017-12-04 09:59:06,912 - File['/tmp/tmpgvtRaA'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2017-12-04 09:59:06,913 - Writing File['/tmp/tmpgvtRaA'] because contents don't match
2017-12-04 09:59:06,913 - File['/etc/apt/sources.list.d/ambari-hdp-1.list'] {'content': StaticFile('/tmp/tmp6cMkrY')}
2017-12-04 09:59:06,913 - Writing File['/etc/apt/sources.list.d/ambari-hdp-1.list'] because contents don't match
2017-12-04 09:59:06,913 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdp-1.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}
2017-12-04 09:59:07,040 - checked_call returned (0, '')
2017-12-04 09:59:07,041 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-12-04 09:59:07,043 - File['/tmp/tmpPBUb6S'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16 HDP-UTILS main'}
2017-12-04 09:59:07,043 - Writing File['/tmp/tmpPBUb6S'] because contents don't match
2017-12-04 09:59:07,044 - File['/tmp/tmpbGWklO'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2017-12-04 09:59:07,044 - Writing File['/tmp/tmpbGWklO'] because contents don't match
2017-12-04 09:59:07,044 - File['/etc/apt/sources.list.d/ambari-hdp-1.list'] {'content': StaticFile('/tmp/tmpPBUb6S')}
2017-12-04 09:59:07,044 - Writing File['/etc/apt/sources.list.d/ambari-hdp-1.list'] because contents don't match
2017-12-04 09:59:07,045 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdp-1.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}
2017-12-04 09:59:07,178 - checked_call returned (0, 'W: http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16/dists/HDP-UTILS/InRelease: Signature by key DF52ED4F7A3A5882C0994C66B9733A7A07513CAD uses weak digest algorithm (SHA1)')
2017-12-04 09:59:07,178 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-04 09:59:07,194 - Skipping installation of existing package unzip
2017-12-04 09:59:07,194 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-04 09:59:07,209 - Skipping installation of existing package curl
2017-12-04 09:59:07,209 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-04 09:59:07,224 - Skipping installation of existing package hdp-select
2017-12-04 09:59:07,228 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2017-12-04 09:59:07,232 - Skipping stack-select on CASSANDRA because it does not exist in the stack-select package structure.
Install
2017-12-04 09:59:07,378 - Package['dsc21'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-04 09:59:07,394 - Installing package dsc21 ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install dsc21')
2017-12-04 09:59:08,682 - Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install dsc21' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package dsc21
2017-12-04 09:59:08,682 - Failed to install package dsc21. Executing '/usr/bin/apt-get update -qq'
2017-12-04 09:59:10,008 - Retrying to install package dsc21 after 30 seconds
2017-12-04 09:59:40,218 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2017-12-04 09:59:40,223 - Skipping stack-select on CASSANDRA because it does not exist in the stack-select package structure.
Command failed after 1 tries
Can someone help me fix it?
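On Ubuntu the analogous step is adding DataStax's Debian community repo (and importing its signing key) before apt-get tries to install dsc21. The URL and suite name below are assumptions based on DataStax's public Debian repository; a sketch:

```python
# Sketch: write an apt source for the DataStax community repo so
# `apt-get install dsc21` can locate the package on Ubuntu. The
# repo URL and suite name are assumptions.
import subprocess

DATASTAX_SOURCE = "deb http://debian.datastax.com/community stable main\n"

def write_datastax_source(path="/etc/apt/sources.list.d/datastax.list",
                          update=False):
    with open(path, "w") as f:
        f.write(DATASTAX_SOURCE)
    if update:
        # Refresh the package index. The repo key should be imported
        # first (e.g. fetch repo_key from debian.datastax.com and pass
        # it to apt-key add), or apt will flag the source as untrusted.
        subprocess.check_call(["apt-get", "update"])
    return path
```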
Repository URL in instructions doesn't work.
Under the Setup section, you indicate the repository to clone from as:
https://github.com/ajak6/ambari-cassandra-service.git
but this should be Symantec instead of ajak6: https://github.com/Symantec/ambari-cassandra-service.git
Trying to start Cassandra service over Ambari server Web UI
dsc21 : Depends: cassandra (= 2.1.13) but 3.0.4 is to be installed E: Unable to correct problems, you have held broken packages.
Hello,
Could you please guide me on configuring the preferred Cassandra v3? The following conflict occurs during installation.
Thank you very much
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/CASSANDRA/package/scripts/cassandra_master.py", line 60, in <module>
Cassandra_Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/CASSANDRA/package/scripts/cassandra_master.py", line 30, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 404, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 53, in wrapper
return function_to_decorate(self, name, *args[2:])
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 97, in install_package
self.checked_call_until_not_locked(cmd, sudo=True, env=INSTALL_CMD_ENV, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 72, in checked_call_until_not_locked
return self.wait_until_not_locked(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 80, in wait_until_not_locked
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install dsc21' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
dsc21 : Depends: cassandra (= 2.1.13) but 3.0.4 is to be installed
E: Unable to correct problems, you have held broken packages.
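The dsc21 metapackage depends on cassandra 2.1.x, so apt refuses when a newer cassandra (here 3.0.4) is the candidate from another repo. If 2.1 is really what you want, one approach is an apt preferences file pinning cassandra to the 2.1 series. A hedged sketch; the file name and priority value are arbitrary choices:

```python
# Sketch: pin cassandra to the 2.1 series so dsc21's dependency
# (cassandra = 2.1.13) can be satisfied. A priority above 1000
# allows apt to select 2.1.x even when a 3.x candidate exists.
PIN = """\
Package: cassandra
Pin: version 2.1.*
Pin-Priority: 1001
"""

def write_cassandra_pin(path="/etc/apt/preferences.d/cassandra"):
    with open(path, "w") as f:
        f.write(PIN)
    return path
```

Alternatively, request the versioned pair explicitly at install time, e.g. apt-get install cassandra=2.1.13 dsc21 (exact package revisions vary by repo).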
Connection failed: [Errno 111] Connection refused to sandbox.hortonworks.com:7000
I followed your instructions to set up Cassandra on my Hortonworks Sandbox (HDP_2.6_vmware_19_04_2017_20_25_43_hdp_ambari_2_5_0_5_1) and got the connection failed error. I googled it and found the same error reported for HDP 2.4: https://community.hortonworks.com/questions/104925/ambari-cassandra-serviceconnection-failed-errno-11.html.
Thank you in advance for your reply.
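Port 7000 is Cassandra's inter-node (storage) port, so this usually means Cassandra is not listening on the address the Ambari alert probes, e.g. listen_address in cassandra.yaml still bound to localhost while the alert resolves sandbox.hortonworks.com. A quick probe mirroring what the alert does, as a sketch:

```python
# Sketch: check whether anything is listening on a host/port pair,
# mirroring the Ambari alert's probe of sandbox.hortonworks.com:7000.
import socket

def port_open(host, port, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If port_open("sandbox.hortonworks.com", 7000) returns False while the service shows as started, compare listen_address in cassandra.yaml with the hostname Ambari resolves for the node.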
Heartbeat Lost during Installation of Cassandra Service.
Greetings of the day!
First of all, your tutorial and the work done on GitHub are much appreciated and very interesting.
But I have a problem installing Cassandra. The installation details are as follows.
- Virtualbox v.5.1.22
- hortonworks hdp 2.3.2
I successfully ran the first command, i.e.
- VERSION=`hdp-select status hadoop-client | sed 's/hadoop-client - \([0-9]\.[0-9]\).*/\1/'`
Up to this command everything works fine.
But as soon as I run the following two commands as the root user
- git clone https://github.com/Symantec/ambari-cassandra-service.git /var/lib/ambari-server/resources/stacks/HDP/$VERSION/services/CASSANDRA
- service ambari-server
and log in to the Ambari server at 127.0.0.1:8080 (username: admin, password: admin), all the services show the following status:
"HEARTBEAT LOST"
Please help me with this issue, as I want to install cqlsh/Cassandra on HDP 2.3.2.
Regards,
Niraj Bhagchandani
Calling cassandra nodes "Cluster Nodes" in UI is confusing
Just about all technologies provisioned through Ambari are "Clustered". The Cassandra service plugin will reference Cassandra nodes simply as "Cluster Nodes", which is confusing within the context of Ambari. I propose changing it to "C* Nodes".