Ambari Basics

Introduction

Installation

Deployment Environment

Hostname              IP          OS
node1.wanglibing.com  10.10.1.31  CentOS7
node2.wanglibing.com  10.10.1.32  CentOS7
node3.wanglibing.com  10.10.1.33  CentOS7
node4.wanglibing.com  10.10.1.21  CentOS7

Preparation

Item                                 Applies to
Disable the firewall                 all hosts
Disable SELinux                      all hosts
Set up NTP                           all hosts
Install the JDK                      all hosts
Configure IP addresses               all hosts
Configure hostnames                  all hosts
Raise the maximum open file limit    all hosts
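
A minimal sketch of applying this checklist on each CentOS 7 host (the hostnames and IPs are the ones from the environment table above; adjust to your own). NTP, the JDK, and the open file limit are covered in the sections that follow:

$ systemctl stop firewalld && systemctl disable firewalld       # disable the firewall
$ setenforce 0                                                  # put SELinux in permissive mode now
$ sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config  # ...and disable it permanently
$ hostnamectl set-hostname node1.wanglibing.com                 # use each host's own name
$ cat >> /etc/hosts <<EOF
10.10.1.31 node1.wanglibing.com
10.10.1.32 node2.wanglibing.com
10.10.1.33 node3.wanglibing.com
10.10.1.21 node4.wanglibing.com
EOF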

Configure the Maximum Open File Limit

The recommended maximum number of open file descriptors is 10000 or more. To check the current values of the soft and hard limits, run the following shell commands on each host:

$ ulimit -Sn
$ ulimit -Hn

If either value is not greater than 10000, run the following command to set it to a suitable default:

$ ulimit -n 10000
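
Note that ulimit -n only changes the limit for the current shell session. To make it persist across logins, a common approach on CentOS 7 is to raise the nofile limit in /etc/security/limits.conf (a sketch using the conventional location):

$ cat >> /etc/security/limits.conf <<EOF
*    soft    nofile    10000
*    hard    nofile    10000
EOF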

Configure NTP

$ yum install -y ntp
$ systemctl enable ntpd
$ systemctl start ntpd
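
Once ntpd is running, you can confirm that it is talking to upstream time servers (output depends on the servers configured in /etc/ntp.conf):

$ ntpq -p    # a '*' in the first column marks the currently selected peer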

Deploy a Local Repository

Install the HTTP server

$ yum install -y httpd
$ systemctl start httpd
$ systemctl enable httpd

Default web root: /var/www/html
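
A quick check that httpd is reachable (a sketch; 10.10.1.31 is node1, the repository host in this setup):

$ curl -sI http://10.10.1.31/ | head -1    # any HTTP status line means httpd is serving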

Configure the repositories

$ cd /usr/local/
$ wget http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.7.1.0/ambari-2.7.1.0-centos7.tar.gz
$ wget http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0/HDP-3.0.1.0-centos7-rpm.tar.gz
$ wget http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7/HDP-UTILS-1.1.0.22-centos7.tar.gz
$ wget http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0/HDP-GPL-3.0.1.0-centos7-gpl.tar.gz
$ tar zxvf ambari-2.7.1.0-centos7.tar.gz -C /var/www/html/
$ tar zxvf HDP-3.0.1.0-centos7-rpm.tar.gz -C /var/www/html/
$ tar zxvf HDP-UTILS-1.1.0.22-centos7.tar.gz -C /var/www/html/
$ tar zxvf HDP-GPL-3.0.1.0-centos7-gpl.tar.gz -C /var/www/html/
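
After extracting, it is worth confirming that the yum metadata is actually reachable over HTTP. This sketch assumes the Ambari tarball unpacks to ambari/centos7/2.7.1.0-169, which matches the baseurl used in ambari.repo below:

$ curl -sI http://10.10.1.31/ambari/centos7/2.7.1.0-169/repodata/repomd.xml | head -1   # expect HTTP/1.1 200 OK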

Configure Passwordless SSH Login

$ ssh-keygen
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ scp -p ~/.ssh/id_rsa.pub root@10.10.1.32:/root/.ssh/authorized_keys
$ scp -p ~/.ssh/id_rsa.pub root@10.10.1.33:/root/.ssh/authorized_keys
$ scp -p ~/.ssh/id_rsa.pub root@10.10.1.21:/root/.ssh/authorized_keys
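
Note that the scp commands above overwrite any existing authorized_keys on the target hosts (ssh-copy-id appends instead). Each of the following should print the remote hostname without prompting for a password; if one prompts, the key copy to that host failed:

$ ssh root@10.10.1.32 hostname
$ ssh root@10.10.1.33 hostname
$ ssh root@10.10.1.21 hostname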

Deploy Ambari Server

Install the JDK

CentOS7

$ rpm -ivh jdk-8u191-linux-x64.rpm
$ java -version

Configure Public Repositories

$ cd /etc/yum.repos.d/
# configure the Ambari and HDP repos
$ wget -nv http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.7.1.0/ambari.repo
$ wget -nv http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0/hdp.repo

Configure Local Repositories

Starting from the public repo files above, modify ambari.repo:

#VERSION_NUMBER=2.7.1.0-169
[ambari-2.7.1.0]
name=ambari Version - ambari-2.7.1.0
baseurl=http://10.10.1.31/ambari/centos7/2.7.1.0-169/
gpgcheck=1
gpgkey=http://10.10.1.31/ambari/centos7/2.7.1.0-169/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1

hdp.repo

#VERSION_NUMBER=3.0.1.0-187
[HDP-3.0.1.0]
name=HDP Version - HDP-3.0.1.0
baseurl=http://10.10.1.31/HDP/centos7/3.0.1.0-187
gpgcheck=1
gpgkey=http://10.10.1.31/HDP/centos7/3.0.1.0-187/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1


[HDP-UTILS-1.1.0.22]
name=HDP-UTILS Version - HDP-UTILS-1.1.0.22
baseurl=http://10.10.1.31/HDP-UTILS/centos7/1.1.0.22
gpgcheck=1
gpgkey=http://10.10.1.31/HDP-UTILS/centos7/1.1.0.22/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
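
After switching the baseurls to the local server, flush yum's cached metadata so the next commands read from the local repos (standard yum commands):

$ yum clean all
$ yum makecache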

Check the repository list

$ yum repolist

Install ambari-server

$ yum install -y ambari-server

Configure ambari-server

$ ambari-server setup
Using python /usr/bin/python
Setup ambari-server
Checking SELinux...
SELinux status is 'disabled'
Customize user account for ambari-server daemon [y/n] (n)? y
Enter user account for ambari-server daemon (root):
Adjusting ambari-server permissions and ownership...
Checking firewall status...
Checking JDK...
Do you want to change Oracle JDK [y/n] (n)?
Check JDK version for Ambari Server...
JDK version found: 8
Minimum JDK version is 8 for Ambari. Skipping to setup different JDK for Ambari Server.
Checking GPL software agreement...
Completing setup...
Configuring database...
Enter advanced database configuration [y/n] (n)?
Configuring database...
Default properties detected. Using built-in database.
Configuring ambari database...
Checking PostgreSQL...
Running initdb: This may take up to a minute.
Initializing database ... OK


About to start PostgreSQL
Configuring local database...
Configuring PostgreSQL...
Restarting PostgreSQL
Creating schema and user...
done.
Creating tables...
done.
Extracting system views...
ambari-admin-2.7.1.0.169.jar
....
Ambari repo file doesnot contain latest json url, skipping repoinfos modification
Adjusting ambari-server permissions and ownership...
Ambari Server 'setup' completed successfully.
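
For scripted or repeat installs, setup can also run non-interactively, accepting the same defaults shown in the transcript above (a sketch; silent mode assumes the embedded PostgreSQL database and an already-installed JDK):

$ ambari-server setup -s    # silent setup with default options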

Start ambari-server

$ ambari-server start
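
If the web UI does not come up, check the server status and its log:

$ ambari-server status
$ tail -n 50 /var/log/ambari-server/ambari-server.log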

Access the Ambari server

Parameter   Value
URL         http://10.10.1.31:8080/
Username    admin
Password    admin

Reinstall ambari-server

If an error occurred while running ambari-server setup, or you chose the wrong options and need to run it again, first run:

$ ambari-server reset

and then run:

$ ambari-server setup

Deploy the Cluster

Launch Install Wizard

(screenshot: 1.png)

Get Started

(screenshot: 2.png)

Select Version

OS        Name                 Base URL
redhat7   HDP-3.0              http://10.10.1.31/HDP/centos7/3.0.1.0-187
redhat7   HDP-3.0-GPL          http://10.10.1.31/HDP-GPL/centos7/3.0.1.0-187
redhat7   HDP-UTILS-1.1.0.22   http://10.10.1.31/HDP-UTILS/centos7/1.1.0.22

Install Options

View the private key:

$ cat ~/.ssh/id_rsa

(screenshots: 3.png, 4.png)

Confirm Hosts

(screenshot: 5.png)

Choose Services

(screenshot: 6.png)

Check the following:

  • HDFS
  • ZooKeeper
  • Ambari Metrics

Assign Masters

(screenshot: 7.png)

Assign Slaves and Clients

(screenshot: 8.png)

Common Errors

Error message

stderr: /var/lib/ambari-agent/data/errors-66.txt

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 32, in hook
setup_java()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 216, in setup_java
raise Fail(format("Unable to access {java_exec}. Confirm you have copied jdk to this host."))
resource_management.core.exceptions.Fail: Unable to access /usr/java/jdk1.8.0_131/bin/java. Confirm you have copied jdk to this host.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-66.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-66.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
BeforeInstallHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
self.save_component_version_to_structured_out(self.command_name)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
stack_select_package_name = stack_select.get_package_name()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
supported_packages = get_supported_packages()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout: /var/lib/ambari-agent/data/output-66.txt

2018-06-09 09:51:34,066 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-06-09 09:51:34,074 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-06-09 09:51:34,076 - Group['livy'] {}
2018-06-09 09:51:34,078 - Adding group Group['livy']
2018-06-09 09:51:34,101 - Group['spark'] {}
2018-06-09 09:51:34,101 - Adding group Group['spark']
2018-06-09 09:51:34,118 - Group['hdfs'] {}
2018-06-09 09:51:34,119 - Adding group Group['hdfs']
2018-06-09 09:51:34,136 - Group['zeppelin'] {}
2018-06-09 09:51:34,137 - Adding group Group['zeppelin']
2018-06-09 09:51:34,155 - Group['hadoop'] {}
2018-06-09 09:51:34,157 - Adding group Group['hadoop']
2018-06-09 09:51:34,174 - Group['users'] {}
2018-06-09 09:51:34,175 - Group['knox'] {}
2018-06-09 09:51:34,175 - Adding group Group['knox']
2018-06-09 09:51:34,193 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,194 - Adding user User['hive']
2018-06-09 09:51:34,234 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,235 - Adding user User['infra-solr']
2018-06-09 09:51:34,274 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,275 - Adding user User['atlas']
2018-06-09 09:51:34,307 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,308 - Adding user User['ams']
2018-06-09 09:51:34,337 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,338 - Adding user User['falcon']
2018-06-09 09:51:34,367 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,367 - Adding user User['accumulo']
2018-06-09 09:51:34,396 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,397 - Adding user User['spark']
2018-06-09 09:51:34,424 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,425 - Adding user User['flume']
2018-06-09 09:51:34,451 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,451 - Adding user User['hbase']
2018-06-09 09:51:34,480 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,480 - Adding user User['hcat']
2018-06-09 09:51:34,511 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,511 - Adding user User['storm']
2018-06-09 09:51:34,538 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,539 - Adding user User['zookeeper']
2018-06-09 09:51:34,566 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,566 - Adding user User['oozie']
2018-06-09 09:51:34,595 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,595 - Adding user User['tez']
2018-06-09 09:51:34,620 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-06-09 09:51:34,621 - Adding user User['zeppelin']
2018-06-09 09:51:34,648 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,649 - Adding user User['livy']
2018-06-09 09:51:34,678 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,679 - Adding user User['mahout']
2018-06-09 09:51:34,702 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,703 - Adding user User['druid']
2018-06-09 09:51:34,731 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-09 09:51:34,731 - Adding user User['ambari-qa']
2018-06-09 09:51:34,761 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,761 - Adding user User['kafka']
2018-06-09 09:51:34,790 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-06-09 09:51:34,791 - Adding user User['hdfs']
2018-06-09 09:51:34,818 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,818 - Adding user User['sqoop']
2018-06-09 09:51:34,849 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,849 - Adding user User['yarn']
2018-06-09 09:51:34,883 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,883 - Adding user User['mapred']
2018-06-09 09:51:34,913 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-09 09:51:34,913 - Adding user User['knox']
2018-06-09 09:51:34,942 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,947 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2018-06-09 09:51:34,947 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2018-06-09 09:51:34,948 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-06-09 09:51:34,953 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-06-09 09:51:34,955 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-06-09 09:51:34,958 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
2018-06-09 09:51:34,958 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2018-06-09 09:51:34,958 - Changing permission for /tmp/hbase-hbase from 755 to 775
2018-06-09 09:51:34,960 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,962 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-09 09:51:34,962 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-06-09 09:51:34,969 - call returned (0, '1009')
2018-06-09 09:51:34,970 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-06-09 09:51:34,979 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2018-06-09 09:51:34,979 - Group['hdfs'] {}
2018-06-09 09:51:34,979 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-06-09 09:51:34,980 - FS Type:
2018-06-09 09:51:34,980 - Directory['/etc/hadoop'] {'mode': 0755}
2018-06-09 09:51:34,980 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2018-06-09 09:51:34,980 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-06-09 09:51:34,981 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2018-06-09 09:51:34,981 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2018-06-09 09:51:34,981 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2018-06-09 09:51:34,981 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-66.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-66.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Command failed after 1 tries

Solution

Install the JDK on the failing host: the stderr traceback shows that /usr/java/jdk1.8.0_131/bin/java does not exist on that agent, so the before-ANY hook (and everything after it) fails.
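
A minimal sketch of the fix, reusing the jdk-8u191-linux-x64.rpm from the earlier step (a hypothetical loop over the agent hosts; make sure the resulting install path matches the java_home Ambari is configured with, which the traceback shows as /usr/java/jdk1.8.0_131):

$ for ip in 10.10.1.32 10.10.1.33 10.10.1.21; do \
    scp jdk-8u191-linux-x64.rpm root@$ip:/tmp/ ; \
    ssh root@$ip 'rpm -ivh /tmp/jdk-8u191-linux-x64.rpm && java -version' ; \
  done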

References

Public repositories
