Problem
When trying to run Apache Spark with MongoDB, I was getting a SecurityException (full stack trace in the Details section below).
Solution
It seems that javax.servlet classes get pulled in by multiple dependencies, each signed differently, so the JVM refuses to load them side by side. Hence the fix is to exclude javax.servlet from one of the dependencies (in this case mongo-hadoop-core).
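To confirm the duplication, you can ask the classloader which jars actually provide the conflicting class. Here is a minimal sketch (the object name FindServletJars is mine, not part of the project):

import scala.collection.JavaConverters._

// Hedged sketch: list every jar on the runtime classpath that contains
// javax.servlet.FilterRegistration. Two or more hits, signed by different
// parties, is exactly what triggers the SecurityException.
object FindServletJars {
  def main(args: Array[String]): Unit = {
    val urls = getClass.getClassLoader
      .getResources("javax/servlet/FilterRegistration.class")
      .asScala
    urls.foreach(println) // one URL per jar providing the class
  }
}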
Here is my build.sbt:
name := "mypackage" version := "1.0" scalaVersion := "2.11.7" libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1" % "provided" libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.4.1" libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "1.4.1" libraryDependencies += "org.mongodb" % "mongo-hadoop-core" % "1.3.0" % "provided" excludeAll ExclusionRule(organization = "javax.servlet")
Do note the % "provided" added to two of the lines (spark-core and mongo-hadoop-core). Thanks to this link for the tip.
P.S. Quite a few posts incorrectly suggested that the problem was caused by the Jetty package.
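If you want to see exactly which dependency is dragging javax.servlet in, the sbt-dependency-graph plugin makes it easy to inspect the tree. A hedged sketch for project/plugins.sbt (the version number is illustrative, check for the current one):

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then run sbt dependencyTree and search the output for javax.servlet to see which artifacts pull it in.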
Details
Exception
ERROR SparkContext: Error initializing SparkContext.
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:895)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:665)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:758)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:108)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:99)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:440)
    at Driver$.main(Driver.scala:11)
    at Driver.main(Driver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at sbt.Run.invokeMain(Run.scala:67)
    at sbt.Run.run0(Run.scala:61)
    at sbt.Run.sbt$Run$$execute$1(Run.scala:51)
    at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Logger$$anon$4.apply(Logger.scala:85)
    at sbt.TrapExit$App.run(TrapExit.scala:248)
    at java.lang.Thread.run(Thread.java:745)
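For reference, the trace shows the exception firing during SparkContext construction (Driver$.main at Driver.scala:11), i.e. before any MongoDB code runs. A minimal driver reproducing that call path might look like this (app name and master are placeholders, not from the original project):

import org.apache.spark.{SparkConf, SparkContext}

object Driver {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("mypackage").setMaster("local[*]")
    // With the conflicting javax.servlet jars on the classpath, this line
    // throws the SecurityException while Spark builds its web UI.
    val sc = new SparkContext(conf)
    sc.stop()
  }
}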
Keywords: sbt, exception