Fixing an Ambari Installation Error

Error message

stderr: /var/lib/ambari-agent/data/errors-66.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 32, in hook
    setup_java()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 216, in setup_java
    raise Fail(format("Unable to access {java_exec}. Confirm you have copied jdk to this host."))
resource_management.core.exceptions.Fail: Unable to access /usr/java/jdk1.8.0_131/bin/java. Confirm you have copied jdk to this host.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-66.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-66.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout: /var/lib/ambari-agent/data/output-66.txt

2018-06-09 09:51:34,066 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-06-09 09:51:34,074 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-06-09 09:51:34,076 - Group['livy'] {}
2018-06-09 09:51:34,078 - Adding group Group['livy']
2018-06-09 09:51:34,101 - Group['spark'] {}
2018-06-09 09:51:34,101 - Adding group Group['spark']
2018-06-09 09:51:34,118 - Group['hdfs'] {}
2018-06-09 09:51:34,119 - Adding group Group['hdfs']
2018-06-09 09:51:34,136 - Group['zeppelin'] {}
2018-06-09 09:51:34,137 - Adding group Group['zeppelin']
2018-06-09 09:51:34,155 - Group['hadoop'] {}
2018-06-09 09:51:34,157 - Adding group Group['hadoop']
2018-06-09 09:51:34,174 - Group['users'] {}
2018-06-09 09:51:34,175 - Group['knox'] {}
2018-06-09 09:51:34,175 - Adding group Group['knox']
2018-06-09 09:51:34,193 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,194 - Adding user User['hive']
2018-06-09 09:51:34,234 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,235 - Adding user User['infra-solr']
2018-06-09 09:51:34,274 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,275 - Adding user User['atlas']
2018-06-09 09:51:34,307 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,308 - Adding user User['ams']
2018-06-09 09:51:34,337 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,338 - Adding user User['falcon']
2018-06-09 09:51:34,367 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,367 - Adding user User['accumulo']
2018-06-09 09:51:34,396 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,397 - Adding user User['spark']
2018-06-09 09:51:34,424 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,425 - Adding user User['flume']
2018-06-09 09:51:34,451 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,451 - Adding user User['hbase']
2018-06-09 09:51:34,480 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,480 - Adding user User['hcat']
2018-06-09 09:51:34,511 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,511 - Adding user User['storm']
2018-06-09 09:51:34,538 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,539 - Adding user User['zookeeper']
2018-06-09 09:51:34,566 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,566 - Adding user User['oozie']
2018-06-09 09:51:34,595 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,595 - Adding user User['tez']
2018-06-09 09:51:34,620 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-06-09 09:51:34,621 - Adding user User['zeppelin']
2018-06-09 09:51:34,648 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,649 - Adding user User['livy']
2018-06-09 09:51:34,678 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,679 - Adding user User['mahout']
2018-06-09 09:51:34,702 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,703 - Adding user User['druid']
2018-06-09 09:51:34,731 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,731 - Adding user User['ambari-qa']
2018-06-09 09:51:34,761 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,761 - Adding user User['kafka']
2018-06-09 09:51:34,790 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-06-09 09:51:34,791 - Adding user User['hdfs']
2018-06-09 09:51:34,818 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,818 - Adding user User['sqoop']
2018-06-09 09:51:34,849 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,849 - Adding user User['yarn']
2018-06-09 09:51:34,883 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,883 - Adding user User['mapred']
2018-06-09 09:51:34,913 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,913 - Adding user User['knox']
2018-06-09 09:51:34,942 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,947 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2018-06-09 09:51:34,947 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2018-06-09 09:51:34,948 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-06-09 09:51:34,953 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-06-09 09:51:34,955 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-06-09 09:51:34,958 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
2018-06-09 09:51:34,958 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2018-06-09 09:51:34,958 - Changing permission for /tmp/hbase-hbase from 755 to 775
2018-06-09 09:51:34,960 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,962 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,962 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-06-09 09:51:34,969 - call returned (0, '1009')
2018-06-09 09:51:34,970 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-06-09 09:51:34,979 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2018-06-09 09:51:34,979 - Group['hdfs'] {}
2018-06-09 09:51:34,979 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-06-09 09:51:34,980 - FS Type:
2018-06-09 09:51:34,980 - Directory['/etc/hadoop'] {'mode': 0755}
2018-06-09 09:51:34,980 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2018-06-09 09:51:34,980 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-06-09 09:51:34,981 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2018-06-09 09:51:34,981 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2018-06-09 09:51:34,981 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2018-06-09 09:51:34,981 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-66.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-66.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Command failed after 1 tries

Solution

Install the JDK on the failing host. The first traceback shows the before-ANY hook calling setup_java() and failing because /usr/java/jdk1.8.0_131/bin/java does not exist: either the JDK was never copied to this host, or it lives somewhere other than the java.home configured during ambari-server setup. Put the JDK at exactly that path and retry the install (see the sketch below). The second failure, "Unable to query for supported packages using /usr/bin/hdp-select", is a follow-on error: hdp-select is only laid down when the HDP packages install, and the install never got that far.
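
A minimal sketch of the fix, assuming a JDK 8u131 tarball has already been downloaded to /tmp on the failing host (the archive name below is illustrative; use whatever you downloaded, and adjust paths if your java.home differs):

mkdir -p /usr/java
# Oracle's 8u131 tarball unpacks to jdk1.8.0_131, matching the path in the traceback
tar -zxf /tmp/jdk-8u131-linux-x64.tar.gz -C /usr/java
# Verify the exact file the hook was looking for now exists and runs
ls -l /usr/java/jdk1.8.0_131/bin/java
/usr/java/jdk1.8.0_131/bin/java -version

Repeat this on every host in the cluster. Alternatively, if a JDK is already installed at a different path, point Ambari at it instead by rerunning ambari-server setup with a custom JDK home (the -j/--java-home option) on the Ambari server, then retry. Once the before-ANY hook passes and the HDP packages install, the hdp-select error clears up on its own, since /usr/bin/hdp-select is created by those packages.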
