Prerequisite: Hadoop must already be running.
My Hadoop build is hadoop-2.7.1.tar.gz.
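A quick way to confirm the Hadoop daemons are up before starting (a sketch; process names depend on your deployment mode, and the /usr/local/hadoop path is an assumption taken from the hive-site paths further down):
# expect to see at least NameNode and DataNode for a local HDFS
jps
# optional: confirm HDFS itself responds
/usr/local/hadoop/bin/hadoop fs -ls /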
First, download Hive from http://archive.apache.org/dist/hive/
I used apache-hive-2.1.1-bin.tar.gz.
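For example, a sketch of fetching the tarball from the archive (the exact URL path is an assumption based on the archive's directory layout):
wget http://archive.apache.org/dist/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz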
-----------------------------------------------------------------
Download complete.

1. Extract: tar -zxvf apache-hive-2.1.1-bin.tar.gz
2. mv apache-hive-2.1.1-bin hive
3. vim /etc/profile
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
Run source /etc/profile
Run hive --version to verify the installation.
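A minimal end-to-end sketch of steps 1-3, assuming you run it from the directory holding the tarball and want Hive under /usr/local/hive (adjust paths to your layout):
tar -zxvf apache-hive-2.1.1-bin.tar.gz
mv apache-hive-2.1.1-bin /usr/local/hive
echo 'export HIVE_HOME=/usr/local/hive' >> /etc/profile
echo 'export PATH=$PATH:$HIVE_HOME/bin' >> /etc/profile
source /etc/profile
hive --version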

---------------------------------------------------------------------
1. cd /usr/local/hive/conf
cp hive-default.xml.template hive-site.xml
2. Fill in the specific settings in hive-site.xml.
Note: the IP you put in the ConnectionURL matters when the schema is initialized.
For testing, it is easiest to have the MySQL database on the same machine.
For production: initializing against a remote MySQL requires the proper MySQL grants (see the GRANT sketch after the config block).
<!-- Insert the following properties -->
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://ip:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/usr/local/hadoop/tmp/${user.name}</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/usr/local/hadoop/tmp/${hive.session.id}_resources</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
<name>hive.server2.logging.operation.log.location</name>
<value>/usr/local/hadoop/tmp/operation_logs</value>
<description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>
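If the metastore MySQL runs on a different host, the account in the config above must be allowed to connect remotely. A sketch for MySQL 5.x (the user, host wildcard, and password are just the example values from the config; adjust to your setup):
# run on the MySQL host
mysql -uroot -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'root'@'%' IDENTIFIED BY '123456'; FLUSH PRIVILEGES;"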
3. Copy the MySQL JDBC driver jar into hive/lib.
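For example (the connector version below is only an assumption; use whatever driver jar matches your MySQL server):
cp mysql-connector-java-5.1.47.jar /usr/local/hive/lib/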
4. Initialize the Hive schema in MySQL (before this, the hive database must exist in MySQL).
[root@master conf]# schematool -initSchema -dbType mysql
This step is error-prone; troubleshoot each failure on its own merits.
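A sketch of creating the metastore database up front, before running schematool (latin1 is a common charset choice for older Hive metastore schemas; treat it as an assumption):
mysql -uroot -p -e "CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET latin1;"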

MySQL mapping: after initialization, the metastore tables appear in the hive database in MySQL.

Run hive; you should land in the Hive CLI.
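A quick smoke test once the CLI works (the database name hive_1 matches the warehouse directory that shows up in the HDFS listing below; the command itself is just a sketch):
hive -e "CREATE DATABASE IF NOT EXISTS hive_1; SHOW DATABASES;"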

-------------------------------------------------
Hive test:
What Hadoop HDFS now stores:
[root@master hadoop]# bin/hadoop fs -lsr /
lsr: DEPRECATED: Please use 'ls -R' instead.
drwxr-xr-x - root supergroup 0 2021-05-07 12:08 /input
drwxr-xr-x - root supergroup 0 2021-05-07 12:08 /input/out
-rw-r--r-- 1 root supergroup 0 2021-05-07 12:08 /input/out/_SUCCESS
-rw-r--r-- 1 root supergroup 131 2021-05-07 12:08 /input/out/part-r-00000
-rw-r--r-- 1 root supergroup 179 2021-05-07 11:29 /input/word.txt
drwxr-xr-x - root supergroup 0 2021-05-07 11:24 /test
-rw-r--r-- 1 root supergroup 179 2021-05-07 11:24 /test/word.txt
drwxr-xr-x - root supergroup 0 2021-06-02 17:17 /test2
drwx-wx-wx - root supergroup 0 2021-06-02 21:24 /tmp
drwx-wx-wx - root supergroup 0 2021-06-02 21:24 /tmp/hive
drwx------ - root supergroup 0 2021-06-02 21:30 /tmp/hive/root
drwx------ - root supergroup 0 2021-06-02 21:30 /tmp/hive/root/658a2a46-ea99-4967-a6a8-8954821f33c8
drwx------ - root supergroup 0 2021-06-02 21:30 /tmp/hive/root/658a2a46-ea99-4967-a6a8-8954821f33c8/_tmp_space.db
drwxr-xr-x - root supergroup 0 2021-06-02 21:30 /user
drwxr-xr-x - root supergroup 0 2021-06-02 21:30 /user/hive
drwxr-xr-x - root supergroup 0 2021-06-02 21:30 /user/hive/warehouse
drwxr-xr-x - root supergroup 0 2021-06-02 21:30 /user/hive/warehouse/hive_1.db
1. Go into MySQL and check.
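For example, a sketch of inspecting the metastore from the MySQL side (DBS and TBLS are standard metastore tables; exact column names can vary slightly across Hive versions):
mysql -uroot -p -e "USE hive; SHOW TABLES; SELECT DB_ID, NAME, DB_LOCATION_URI FROM DBS;"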
