Resolving the error: scala.util.matching (Java String,Java Long/Scala String,Scala Long)V
Exception in thread "main" java.lang.NoSuchMethodError: scala.util.matching (Java String,Java Long/Scala String,Scala Long)V
	at org.apache.spark.internal.config.ConfigBuilder.<init>(ConfigBuilder.scala:177)
	at org.apache.spark.sql.internal.SQLConf.buildConf(SQLConf.scala:65)
	at org.apache.spark.sql.internal.SQLConf.<init>(SQLConf.scala:161)
	at org.apache.spark.sql.internal.SQLConf.<clinit>(SQLConf.scala)
	at org.apache.spark.sql.internal.StaticSQLConf.<init>(StaticSQLConf.scala:31)
	at org.apache.spark.sql.internal.StaticSQLConf.<clinit>(StaticSQLConf.scala)
	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:868)
	at com.digitalchina.AirQualityHive$.main(AirQualityHive.scala:12)
	at com.digitalchina.AirQualityHive.main(AirQualityHive.scala)
Process finished with exit code 1

This is an error I ran into while writing Spark code.
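For context, the bottom frames of the trace point at AirQualityHive.scala:12, where a Hive-enabled SparkSession is being built. The failing code presumably looks roughly like the following sketch (a hypothetical reconstruction; only the object name and the enableHiveSupport frame come from the trace):

import org.apache.spark.sql.SparkSession

object AirQualityHive {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() triggers the static initialization of SQLConf/StaticSQLConf;
    // if the scala-library on the classpath does not match the Scala version Spark was
    // compiled against, that initialization fails with the NoSuchMethodError shown above.
    val spark = SparkSession.builder()
      .appName("AirQualityHive")
      .enableHiveSupport() // the SparkSession.scala:868 frame in the trace
      .getOrCreate()

    // ... actual job logic ...

    spark.stop()
  }
}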
This is the classic symptom of a Scala version mismatch, and it is fixed by reconfiguring the Scala version.
Note that switching the dependency in the pom file alone is not enough: go to File -> Project Structure -> Global Libraries in IntelliJ IDEA, where there is normally a Scala SDK entry (the Scala compilation environment).

My 2.10.5 here is meant to match CDH Spark 1.6.0; the combinations in common use now are Scala 2.11.8 with Spark 2.2.0, or Scala 2.12.x with Spark 2.4.x.
Change this version so that it matches your Spark version; to see exactly which Scala version your Spark distribution ships with, check spark-shell.
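For example, the spark-shell startup banner already prints the Scala version ("Using Scala version 2.x.x ..."), and you can also query it from inside the shell; the output below is only illustrative:

scala> scala.util.Properties.versionString
res0: String = version 2.10.5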

Once the versions match, update the pom dependencies accordingly and refresh Maven so it reloads them; that resolves the problem.
<properties>
    <!-- The Scala compile version; it must match the one in Global Libraries -->
    <scala.version>2.10.5</scala.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <!-- The suffix of spark-core_2.1x is the Scala version Spark was built against;
             spark-core_2.10 here is compatible with Scala 2.10.x -->
        <artifactId>spark-core_2.10</artifactId>
        <!-- Spark version -->
        <version>1.6.0</version>
    </dependency>
</dependencies>