Spark pitfalls: errors when starting spark-shell

xiaoxiao  2021-03-01

1. Solution

The `SPARK_WORKER_INSTANCES` warning in the log below spells out the replacement. Please instead use one of:

- `./spark-submit` with `--num-executors` to specify the number of executors
- the `SPARK_EXECUTOR_INSTANCES` environment variable
- `spark.executor.instances` in the Spark config to set the number of executor instances
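In concrete terms, the deprecated environment variable maps onto a single Spark property. A minimal sketch of the config-file form (the value `1` mirrors the deprecated setting in the log; adjust to taste):

```properties
# spark-defaults.conf — replaces `export SPARK_WORKER_INSTANCES=1` in spark-env.sh
spark.executor.instances   1
```

The equivalent per-invocation forms are `spark-submit --num-executors 1 ...` or `--conf spark.executor.instances=1`. Remember to also remove `SPARK_WORKER_INSTANCES` from spark-env.sh, or the warning will keep appearing.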

2. Error when starting spark-shell

[hadoop@hadoop001 sbin]$ spark-shell --master spark://hadoop001:7077
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/07/24 23:56:28 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
18/07/24 23:56:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/07/24 23:56:29 WARN SparkConf: SPARK_WORKER_INSTANCES was detected (set to '1'). This is deprecated in Spark 1.0+.

Please instead use:  - ./spark-submit with --num-executors to specify the number of executors  - Or set SPARK_EXECUTOR_INSTANCES  - spark.executor.instances to configure the number of instances in the spark config.          18/07/24 23:56:34 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory javax.jdo.JDOFatalInternalException: Error creating transactional connection factory     at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)     at java.security.AccessController.doPrivileged(Native Method)     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)     at 
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)     at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)     at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)     at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)     at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)     at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)     at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)     at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)     at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)     at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)     at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)     at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)     at 
org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at scala.Option.getOrElse(Option.scala:121)     at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)     at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)     at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)     at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)     at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)     at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)     at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)     at $line3.$read$$iw$$iw.<init>(<console>:15)     at 
$line3.$read$$iw.<init>(<console>:42)     at $line3.$read.<init>(<console>:44)     at $line3.$read$.<init>(<console>:48)     at $line3.$read$.<clinit>(<console>)     at $line3.$eval$.$print$lzycompute(<console>:7)     at $line3.$eval$.$print(<console>:6)     at $line3.$eval.$print(<console>)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)     at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)     at 
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)     at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)     at org.apache.spark.repl.Main$.doMain(Main.scala:68)     at org.apache.spark.repl.Main$.main(Main.scala:51)     at org.apache.spark.repl.Main.main(Main.scala)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) NestedThrowablesStackTrace: java.lang.reflect.InvocationTargetException     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)     at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)     at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)     at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)     at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)     at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)     at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)     at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)     at java.security.AccessController.doPrivileged(Native Method)     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)     at 
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)     at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)     at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)     at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)     at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)     at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)     at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)     at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)     at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)     at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)     at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)     at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at scala.Option.getOrElse(Option.scala:121)     at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)     at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)     at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)     at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)     at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)     at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)     at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)     at $line3.$read$$iw$$iw.<init>(<console>:15)     at $line3.$read$$iw.<init>(<console>:42)     at $line3.$read.<init>(<console>:44)     at $line3.$read$.<init>(<console>:48)     at $line3.$read$.<clinit>(<console>)     at $line3.$eval$.$print$lzycompute(<console>:7)     at $line3.$eval$.$print(<console>:6)     at $line3.$eval.$print(<console>)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)     at 
scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)     at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)     at org.apache.spark.repl.Main$.doMain(Main.scala:68)     at org.apache.spark.repl.Main$.main(Main.scala:51)     at org.apache.spark.repl.Main.main(Main.scala)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: org.datanucleus.exceptions.NucleusException: Attempt to 
invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)     ... 146 more Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.     at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)     at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)     ... 148 more 18/07/24 23:56:34 WARN Hive: Failed to access metastore. This class should not accessed in runtime. 
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient     at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)     at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)     at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)     at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)     at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)     at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)     at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at 
org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)     at scala.Option.getOrElse(Option.scala:121)     at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)     at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)     at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)     at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)     at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)     at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)     at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)     at $line3.$read$$iw$$iw.<init>(<console>:15)     at $line3.$read$$iw.<init>(<console>:42)     at $line3.$read.<init>(<console>:44)     at 
$line3.$read$.<init>(<console>:48)     at $line3.$read$.<clinit>(<console>)     at $line3.$eval$.$print$lzycompute(<console>:7)     at $line3.$eval$.$print(<console>:6)     at $line3.$eval.$print(<console>)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)     at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)     at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)     at 
org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)     at org.apache.spark.repl.Main$.doMain(Main.scala:68)     at org.apache.spark.repl.Main$.main(Main.scala:51)     at org.apache.spark.repl.Main.main(Main.scala)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)     at 
org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)     at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)     ... 88 more Caused by: java.lang.reflect.InvocationTargetException     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)     ... 94 more Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory NestedThrowables: java.lang.reflect.InvocationTargetException     at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:606)     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)     at java.security.AccessController.doPrivileged(Native Method)     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)     at 
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)     at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)     at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)     at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)     at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)     ... 
99 more Caused by: java.lang.reflect.InvocationTargetException     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)     at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)     at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)     at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)     at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)     at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)     at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)     at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)     at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)     ... 
128 more Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)     ... 146 more Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.     at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)     at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)     at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)     ... 
148 more 18/07/24 23:56:34 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory
(The retry logs the identical javax.jdo.JDOFatalInternalException "Error creating transactional connection factory" stack trace a second time, together with its NestedThrowablesStackTrace, both ending in the same DatastoreDriverNotFoundException for "com.mysql.jdbc.Driver"; the repeated traces are omitted here.)
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':   at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)   at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)   at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)   at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)   at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)   at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)   at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)   at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)   at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)   at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)   at 
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)   at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)   ... 47 elided Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)   ... 58 more Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':   at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)   at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)   at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)   at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)   at scala.Option.getOrElse(Option.scala:121)   at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)   at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)   at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)   at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)   ... 
63 more Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)   ... 71 more Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)   at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)   ... 76 more Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)   ... 
84 more Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)   at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)   at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)   ... 85 more Caused by: java.lang.reflect.InvocationTargetException: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)   ... 
91 more Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory   at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)   at java.lang.reflect.Method.invoke(Method.java:606)   at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)   at java.security.AccessController.doPrivileged(Native Method)   at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)   at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)   at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)   at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)   at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)   at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)   at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)   at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)   at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)   at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)   at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)   at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)   at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)   at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)   ... 96 more Caused by: java.lang.reflect.InvocationTargetException: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.   
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)   at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)   at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)   at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)   at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)   at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)   at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)   at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)   at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)   at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)   ... 125 more Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.   
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)   ... 143 more Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.   at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)   at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)   at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)   ... 145 more <console>:14: error: not found: value spark        import spark.implicits._               ^ <console>:14: error: not found: value spark        import spark.sql               ^ Welcome to       ____              __      / __/__  ___ _____/ /__     _\ \/ _ \/ _ `/ __/  '_/    /___/ .__/\_,_/_/ /_/\_\   version 2.1.0       /_/           Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51) Type in expressions to have them evaluated. Type :help for more information.

scala> :quit  
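
The root cause is the last `Caused by` in the chain: DataNucleus cannot load `com.mysql.jdbc.Driver`, so the Hive metastore (backed by MySQL here) fails to start, the `SparkSession` is never created, and the shell then reports `not found: value spark`. The fix is to put the MySQL Connector/J jar on Spark's classpath. A sketch of the common options; the jar path and version below are examples and must be adjusted to where your connector jar actually lives:

```shell
# Option 1: pass the connector jar when launching the shell
# (path/version are placeholders for your local mysql-connector-java jar)
spark-shell --master spark://hadoop001:7077 \
  --jars /home/hadoop/lib/mysql-connector-java-5.1.46.jar \
  --driver-class-path /home/hadoop/lib/mysql-connector-java-5.1.46.jar

# Option 2: make it permanent in $SPARK_HOME/conf/spark-defaults.conf
#   spark.driver.extraClassPath   /home/hadoop/lib/mysql-connector-java-5.1.46.jar
#   spark.executor.extraClassPath /home/hadoop/lib/mysql-connector-java-5.1.46.jar

# Option 3: drop the jar into Spark's jars directory (Spark 2.x)
cp /home/hadoop/lib/mysql-connector-java-5.1.46.jar "$SPARK_HOME/jars/"
```

After restarting `spark-shell`, the metastore connection should come up and the `spark` session value should be defined.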

Reprinted content; please credit the original source: https://www.6miu.com/read-4150065.html