Problems Encountered When Using Spark with Hive


Spark fails when accessing Hive with the following error:

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
    at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:778)
    at scala.hiveDB.getRecords(hiveDB.scala:20)
    at scala.TransData$.main(TransData.scala:12)
    at scala.TransData.main(TransData.scala)
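For context, enableHiveSupport() is where the check fires. Below is a minimal sketch of a driver that hits this code path; the object name, app name, and table name are illustrative, not taken from the original project:

import org.apache.spark.sql.SparkSession

object TransData {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TransData")
      .enableHiveSupport() // throws IllegalArgumentException when Hive classes are absent
      .getOrCreate()
    // Any Hive-backed query will do; "some_table" is a placeholder
    spark.sql("SELECT * FROM some_table").show()
    spark.stop()
  }
}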

Tracing into the SparkSession.scala source reveals the following code:

/**
 * Return true if Hive classes can be loaded, otherwise false.
 */
private[spark] def hiveClassesArePresent: Boolean = {
  try {
    Utils.classForName(HIVE_SESSION_STATE_CLASS_NAME)
    Utils.classForName(HIVE_SHARED_STATE_CLASS_NAME)
    Utils.classForName("org.apache.hadoop.hive.conf.HiveConf")
    true
  } catch {
    case _: ClassNotFoundException | _: NoClassDefFoundError => false
  }
}

So the root cause is a missing jar: classForName fails because the Hive classes are not on the classpath. Looking up the Maven dependencies that provide these three classes and adding them to the pom resolved the problem.
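In practice the usual fix is to add the spark-hive module to the pom; it provides the two org.apache.spark.sql.hive classes the check looks up and transitively pulls in the Hive classes (including org.apache.hadoop.hive.conf.HiveConf). A sketch of the dependency; the Scala suffix (_2.11) and version (2.1.0) are assumptions and must match the Spark build actually in use:

<dependency>
    <!-- Assumed coordinates: suffix/version must match your Spark build -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.1.0</version>
</dependency>

If the cluster already ships these jars at runtime, the dependency can additionally be marked <scope>provided</scope> so the application jar stays slim.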

---Still being updated; a summary will follow once these issues are ironed out.

