Fixing "Can not create a Path from an empty string"


The full error output is as follows:

java.lang.IllegalArgumentException: Can not create a Path from an empty string
  at org.apache.hadoop.fs.Path.checkPathArg(Path.java:126)
  at org.apache.hadoop.fs.Path.<init>(Path.java:134)
  at org.apache.hadoop.fs.Path.<init>(Path.java:93)
  at org.apache.spark.deploy.yarn.Client.copyFileToRemote(Client.scala:370)
  at org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:478)
  at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:517)
  at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:869)
  at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
  at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
  at $line3.$read$$iw$$iw.<init>(<console>:15)
  at $line3.$read$$iw.<init>(<console>:43)
  at $line3.$read.<init>(<console>:45)
  at $line3.$read$.<init>(<console>:49)
  at $line3.$read$.<clinit>(<console>)
  at $line3.$eval$.$print$lzycompute(<console>:7)
  at $line3.$eval$.$print(<console>:6)
  at $line3.$eval.$print(<console>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
  at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
  at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
  at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
  at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
  at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
  at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
  at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
  at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
  at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
  at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
  at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
  at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
  at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
  at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
  at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)
  at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)
  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
  at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
  at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
  at org.apache.spark.repl.Main$.doMain(Main.scala:76)
  at org.apache.spark.repl.Main$.main(Main.scala:56)
  at org.apache.spark.repl.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-05-01 12:05:39 WARN  YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2020-05-01 12:05:39 WARN  MetricsSystem:66 - Stopping a MetricsSystem that is not running
java.lang.IllegalArgumentException: Can not create a Path from an empty string
  at org.apache.hadoop.fs.Path.checkPathArg(Path.java:126)
  at org.apache.hadoop.fs.Path.<init>(Path.java:134)
  at org.apache.hadoop.fs.Path.<init>(Path.java:93)
  at org.apache.spark.deploy.yarn.Client.copyFileToRemote(Client.scala:370)
  at org.apache.spark.deploy.yarn.Client.org$apache$spark$deploy$yarn$Client$$distribute$1(Client.scala:478)
  at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:517)
  at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:869)
  at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
  at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:933)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:924)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:924)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
  ... 55 elided
<console>:14: error: not found: value spark
import spark.implicits._
       ^
<console>:14: error: not found: value spark
import spark.sql
       ^

Solution:

In spark-defaults.conf, change

spark.yarn.archive hdfs://Desktop:9000/spark/jars/

to

spark.yarn.jars hdfs://Desktop:9000/spark/jars/
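
Why the key matters: per the Spark on YARN documentation, spark.yarn.archive takes a single archive file containing the Spark jars, while spark.yarn.jars takes a list of jar paths (globs are allowed), so a directory of jars fits the latter. A minimal sketch of both documented forms, assuming this post's hdfs://Desktop:9000 NameNode and a hypothetical spark-libs.zip archive name:

# spark-defaults.conf, option 1: point spark.yarn.jars at the jars (glob form)
spark.yarn.jars      hdfs://Desktop:9000/spark/jars/*.jar

# spark-defaults.conf, option 2: point spark.yarn.archive at a single archive
# (hypothetical file name; e.g. zip the contents of $SPARK_HOME/jars yourself)
spark.yarn.archive   hdfs://Desktop:9000/spark/spark-libs.zip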

 

Note:

Do not expect to copy the fix above verbatim and have it solve your problem. You must work out which configuration you changed recently;

that setting may live in one of your code files or in some configuration file.

The root cause of this kind of error is:

some variable in your configuration has a value with the wrong format, or one that points to an invalid path. If you do not track down your recent change, even searching Stack Overflow for an answer will not help.
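
Two quick checks can help narrow such a change down (a sketch assuming an HDFS path like the one above and a standard Spark layout; adjust to your own setup):

# does the configured path actually exist and contain the jars?
hdfs dfs -ls hdfs://Desktop:9000/spark/jars/

# which spark.yarn.* keys are currently set in your config file?
grep -n "spark.yarn" "$SPARK_HOME"/conf/spark-defaults.conf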

 

