This post records the various problems I have run into while using Sqoop. It is continuously updated; feel free to discuss in the comments, and please bear with any shortcomings in the writing.
Oracle_to_hive
1. main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
Date: 2024-10-31
Cause analysis: the Sqoop import depends on Hive, and Hive enables Log4j's JMX feature; when it tries to register MBeans without sufficient permission, the registration is blocked, which raises this error.
Solution:
Edit the JDK's policy file, found at: <JDK install dir>/jre/lib/security/java.policy
Add the following entry to that file:
permission javax.management.MBeanTrustPermission "register";
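In java.policy, permission entries live inside a grant block; appended at the end of the file, the addition looks like this (this grants the permission to all code, which is the simplest form; scope it more narrowly if your security policy requires it):

grant {
    permission javax.management.MBeanTrustPermission "register";
};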
2. org.apache.atlas.AtlasException: Failed to load application yaml
This can show up when the cluster has Atlas configured.
Date: 2024-10-31
Cause analysis: the application-atlas.yml file could not be found in any of the directories on the classpath.
Solution: copy the config file into one of the directories on the classpath:
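For example (a sketch only; both paths are assumptions, so point the copy at wherever Atlas actually keeps the file, and at a directory that really is on Sqoop's classpath, such as Sqoop's conf directory):

# Hypothetical source and destination paths
cp /etc/atlas/conf/application-atlas.yml $SQOOP_HOME/conf/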
3. cannot recognize input near ',' 'gcrq_month' ',' in column type
24/10/31 17:05:16 ERROR ql.Driver: FAILED: ParseException line 1:895 cannot recognize input near ',' 'gcrq_month' ',' in column type
org.apache.hadoop.hive.ql.parse.ParseException: line 1:895 cannot recognize input near ',' 'gcrq_month' ',' in column type
This is the "Sqoop only supports a single partition" problem.
Date: 2024-10-31
Cause analysis: multiple partition columns were used when importing data into the Hive table.
For a source-code-level analysis, see the CSDN post: Sqoop 数据导入多分区Hive解决方法_sqoop import 多个分区-CSDN博客
A simpler workaround is to go through HCatalog instead. First check whether HCatalog is installed, as sketched below.
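One quick check (assuming a standard Hive layout, where HCatalog ships under $HIVE_HOME/hcatalog):

# If either of these finds something, HCatalog is available
which hcat
ls $HIVE_HOME/hcatalog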
Then use the HCatalog parameters:
--split-by MINUTE \
--hive-import \
--hive-table ods_pre_dat_dcsj_time \
--target-dir /user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time \
--delete-target-dir \
-- --hive-drop-import-delims \
--hcatalog-database dw \
--hcatalog-table ods_pre_dat_dcsj_time \
--hcatalog-storage-stanza 'stored as orc' \
--hcatalog-partition-keys "gcrq_year,gcrq_month,gcrq_day" \
--hcatalog-partition-values "${gcrq_year},${gcrq_month},${gcrq_day}" \
--num-mappers 3
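The ${gcrq_year}, ${gcrq_month} and ${gcrq_day} references above are shell variables that the wrapper script must supply. Since the job loads the current day's data (see the query in issue 6), one plausible way to set them (my assumption; the original script does not show this part) is:

# Hypothetical wrapper-script variables for today's partition
gcrq_year=$(date +%Y)
gcrq_month=$(date +%m)
gcrq_day=$(date +%d)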
4. FAILED: HiveAuthzPluginException Error getting permissions for hdfs://udh/user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time : Unauthorized connection for super-user: hive from IP /xxx.xxx.xxx.xxx
Date: 2024-11-01
Cause analysis: my guess was a permissions problem. Here xxx.xxx.xxx.xxx is the server I submitted the Sqoop command from, and "hive" is the user I submitted it as. The Sqoop command sets the staging directory for the data with --target-dir /user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time, so first check that directory's permissions.
The staging directory did contain data files, so next look at its parent directory:
I tried hdfs dfs -chown to change the parent directory's owner to hive, as sketched below, then ran the job again: same error.
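The command in question (the hive:hdfs owner/group pair is my assumption; use whatever your cluster expects):

hdfs dfs -chown -R hive:hdfs /user/sqoop/hive/oracle_to_hive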
Time to change tack: could it be a Hive-side authorization restriction? "Unauthorized connection for super-user: hive from IP /xxx.xxx.xxx.xxx" translates plainly as: an unauthorized connection by the super-user. In Hive, the hive user is normally the default super-user, holding every permission needed to access and operate Hive.
Configure the following in HDFS's core-site.xml:
<property>
    <name>hadoop.proxyuser.hive.groups</name>
    <value>*</value> <!-- allow all user groups -->
</property>
<property>
    <name>hadoop.proxyuser.hive.hosts</name>
    <value>xxx.xxx.xxx.xxx</value> <!-- allow specific IP addresses; separate multiple IPs with commas -->
</property>
If the cluster is managed with Ambari, this normally goes into HDFS's Custom core-site. After changing it, restart HDFS first and then Hive.
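To confirm the values took effect after the restarts, the stock hdfs getconf tool can read them back:

hdfs getconf -confKey hadoop.proxyuser.hive.hosts
hdfs getconf -confKey hadoop.proxyuser.hive.groups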
5. 24/11/05 09:49:00 INFO sqlstd.SQLStdHiveAccessController: Current user : hive, Current Roles : [public[hive:USER]]
FAILED: HiveAccessControlException Permission denied: Principal [name=hive, type=USER] does not have following privileges for operation LOAD [[INSERT, DELETE] on Object [type=TABLE_OR_VIEW, name=default.ods_pre_dat_dcsj_time, action=INSERT]]
Cause analysis: this came up while using Sqoop to query Oracle and import into a partitioned Hive table. The current user hive has no INSERT privilege on the Hive table default.ods_pre_dat_dcsj_time. But the script clearly targets the dw database, so how did it end up in default? The relevant script parameters:
--split-by MINUTE \
--hive-import \
--hive-table ods_pre_dat_dcsj_time \
--target-dir /user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time \
--delete-target-dir \
-- --hive-drop-import-delims \
--hcatalog-database dw \
--hcatalog-table ods_pre_dat_dcsj_time \
--hcatalog-storage-stanza 'stored as orc' \
--hcatalog-partition-keys "gcrq_year,gcrq_month,gcrq_day" \
--hcatalog-partition-values "${gcrq_year},${gcrq_month},${gcrq_day}" \
--num-mappers 3
This happens because hcatalog and hive-import were used at the same time, and their parameters conflict.
Solution: remove the hive-import related parameters. The modified script:
--split-by MINUTE \
--target-dir /user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time \
--delete-target-dir \
-- --hive-drop-import-delims \
--hcatalog-database dw \
--hcatalog-table ods_pre_dat_dcsj_time \
--hcatalog-storage-stanza 'stored as orc' \
--hcatalog-partition-keys "gcrq_year,gcrq_month,gcrq_day" \
--hcatalog-partition-values "${gcrq_year},${gcrq_month},${gcrq_day}" \
--num-mappers 3
6. Sqoop import into a dynamically partitioned Hive table: the data reached HDFS but was never loaded into the Hive table. After modifying the Sqoop script following the fix in issue 5:
sqoop import \
--connect "jdbc:oracle:thin:@//localhost:61521/LZY2" \
--username root \
--password '12345678' \
--query "SELECT TO_NUMBER(TO_CHAR(GCRQ, 'YYYY')) AS gcrq_year,TO_NUMBER(TO_CHAR(GCRQ, 'MM')) AS gcrq_month,TO_NUMBER(TO_CHAR(GCRQ, 'DD')) AS gcrq_day,YEAR,GCRQ,GCZBS,......UPDATE_BY,UPDATE_TIME,INSERT_TIME
FROM LZJHGX.DAT_DCSJ_TIME
WHERE TO_CHAR(GCRQ , 'YYYY-MM-DD') = TO_CHAR(SYSDATE, 'YYYY-MM-DD') AND \$CONDITIONS" \
--split-by MINUTE \
--target-dir /user/sqoop/hive/oracle_to_hive/ods_pre_dat_dcsj_time \
--delete-target-dir \
-- --hive-drop-import-delims \
--hcatalog-database dw \
--hcatalog-table ods_pre_dat_dcsj_time \
--hcatalog-storage-stanza 'stored as orc' \
--hcatalog-partition-keys "gcrq_year,gcrq_month,gcrq_day" \
--hcatalog-partition-values "${gcrq_year},${gcrq_month},${gcrq_day}" \
--num-mappers 3
This time the job ran without errors, but the data was only queried out of Oracle and landed on HDFS; it was never moved from HDFS into Hive:
Solution: first, the partition columns must be of type STRING (note that the query above produced them with TO_NUMBER); otherwise the --hcatalog options fail with:
Only string fields are allowed in partition columns in HCatalog
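If the target table does not exist yet (Sqoop can also create it via --create-hcatalog-table), its DDL needs to look roughly like this. A hypothetical sketch: the data-column list is abridged and the non-partition column types are assumptions.

hive -e '
CREATE TABLE IF NOT EXISTS dw.ods_pre_dat_dcsj_time (
  `year` STRING,
  gcrq   STRING
  -- ... remaining data columns ...
)
PARTITIONED BY (gcrq_year STRING, gcrq_month STRING, gcrq_day STRING)
STORED AS ORC;'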
Then the key Sqoop parameters are:
--split-by MINUTE \
--hcatalog-database dw \
--hcatalog-table ods_pre_dat_dcsj_time \
--hcatalog-storage-stanza 'stored as orc' \
--num-mappers 3
Note: if the Oracle data contains date/time values and the corresponding Hive column is a timestamp, it is best to apply a conversion like the following to avoid format problems:
TO_CHAR(INSERT_TIME, 'YYYY-MM-DD HH24:MI:SS') AS INSERT_TIME
Finally, as the screenshot showed, the import succeeded:
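Besides eyeballing the output, a quick sanity check is to list the partitions the import created (routine HiveQL, not part of the original run):

hive -e 'SHOW PARTITIONS dw.ods_pre_dat_dcsj_time;'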
The complete Sqoop command:
sqoop import \
--connect "jdbc:oracle:thin:@//Oracle数据库IP:端口/模式名称" \
--username 用户名 \
--password '密码' \
--query "SELECT TO_CHAR(GCRQ, 'YYYY') AS gcrq_year,TO_CHAR(GCRQ, 'MM') AS gcrq_month,TO_CHAR(GCRQ, 'DD') AS gcrq_day,YEAR,TO_CHAR(GCRQ, 'YYYY-MM-DD HH24:MI:SS') AS GCRQ,GCZBS,......,TO_CHAR(DELETE_TIME, 'YYYY-MM-DD HH24:MI:SS') AS DELETE_TIME,CREATE_BY,TO_CHAR(CREATE_TIME, 'YYYY-MM-DD HH24:MI:SS') AS CREATE_TIME,UPDATE_BY,TO_CHAR(UPDATE_TIME, 'YYYY-MM-DD HH24:MI:SS') AS UPDATE_TIME,TO_CHAR(INSERT_TIME, 'YYYY-MM-DD HH24:MI:SS') AS INSERT_TIMEFROM LZJHGX.dat_dcsj_time
WHERE TO_CHAR(GCRQ , 'YYYY-MM-DD') = TO_CHAR(SYSDATE, 'YYYY-MM-DD') AND \$CONDITIONS" \
--split-by split字段 \
--hcatalog-database 数据库名称 \
--hcatalog-table 表名 \
--hcatalog-storage-stanza 'stored as orc' \
--num-mappers mapper个数N