I'm trying to write a DataFrame to ORC, with no luck. I'm using Spark 1.6 with Java, running on my local machine; I tried adding some dependencies but had no success. I have a working Spark job and I want to write its DataFrame out as an ORC file, but I get this error:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: orc. Please find packages at http://spark-packages.org
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:219)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:139)
	at Confiaveis.main(Confiaveis.java:96)
Caused by: java.lang.ClassNotFoundException: orc.DefaultSource
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
	at scala.util.Try.orElse(Try.scala:84)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
	... 4 more

I write with this command: df.write().mode("append").format("orc").save("path");

Does anyone know how I can fix this? From what I understand of Spark, the error means it can't find a library, but I can't find anything that clarifies which library that is.
1 Answer
月关宝盒
Try adding this dependency:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_*your_version*</artifactId>
    <version>*your_version*</version>
    <scope>provided</scope>
</dependency>
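For context: in Spark 1.6 the built-in ORC data source ships with the spark-hive module, and it is registered only when you use a HiveContext rather than a plain SQLContext. For example, with Spark 1.6.0 built against Scala 2.10, the artifact would be spark-hive_2.10 at version 1.6.0. A minimal sketch of the write path under those assumptions (the input and output paths here are hypothetical placeholders):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class OrcWriteExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("OrcWriteExample")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // HiveContext (from spark-hive) is what registers the "orc" data source in 1.6;
        // with a plain SQLContext the same write fails with ClassNotFoundException.
        HiveContext sqlContext = new HiveContext(sc.sc());

        DataFrame df = sqlContext.read().json("/tmp/input.json"); // hypothetical input
        df.write().mode("append").format("orc").save("/tmp/output_orc"); // hypothetical output

        sc.stop();
    }
}
```

Note that `<scope>provided</scope>` assumes you submit the job to a Spark installation that already has the spark-hive jars on its classpath; when running purely from your IDE on a local machine, you may need to drop the provided scope so Maven puts the jar on the runtime classpath.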