This release is based on git tag v3.0.0, which includes all commits up to June 10. The vote passed on the 10th of June, 2020. Apache Spark 3.0.0 is the first release of the 3.x line.

I've also just encountered the same problem. I know this is not your problem, but I hope maybe you can help me. I'm having some trouble with default-field deserialization and Apache Spark. I have a class like:

    case class Person(name: String, gender: String = "male")

This works fine in the Scala REPL and in IntelliJ, and in some other cases, but not through Apache Spark. If I package my app and call the same method with the same params, it fails with:

    org.json4s.MappingException: No usable value for persons
    Did not find value which can be converted into …
        at org.json4s.package$.fail(package.scala:96)
        at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:443)
        at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
        …
        at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:451)
        at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
        at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
        at org.json4s.Extraction$.extract(Extraction.scala:332)
        at org.json4s.Extraction$.extract(Extraction.scala:42)
        at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
        at ….util.PolicyUtils$.parseJson(PolicyUtils.scala:41)
        at ….service.SparktaJob$.runSparktaJob(SparktaJob.scala:127)
        …
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: org.json4s.MappingException: No usable value for multiplexer
        at org.json4s.Extraction$CollectionBuilder$$anonfun$6.apply(Extraction.scala:341)
        at org.json4s.Extraction$CollectionBuilder.mkCollection(Extraction.scala:341)
        at org.json4s.Extraction$CollectionBuilder.result(Extraction.scala:365)
        at org.json4s.Extraction$.extract(Extraction.scala:320)
        …
    Caused by: org.json4s.MappingException: Did not find value which can be converted into …
        at org.json4s.Extraction$.convert(Extraction.scala:554)
        at org.json4s.Extraction$.extract(Extraction.scala:331)
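One workaround that has helped me in similar situations is to stop relying on json4s resolving constructor default arguments at all, and instead extract optional fields and apply the defaults explicitly. This is only a sketch under the assumption that the version conflict is in default-argument resolution; `PersonRaw` and `fromJson` are names I made up for illustration:

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Intermediate shape: the field with a default becomes an Option,
// so extraction never needs to discover the constructor default.
case class PersonRaw(name: String, gender: Option[String])
case class Person(name: String, gender: String)

object PersonJson {
  implicit val formats: Formats = DefaultFormats

  // Extract the Option-based shape, then apply the default by hand.
  def fromJson(json: String): Person = {
    val raw = parse(json).extract[PersonRaw]
    Person(raw.name, raw.gender.getOrElse("male"))
  }
}
```

With this shape, `PersonJson.fromJson("""{"name":"Ann"}""")` should yield `Person("Ann", "male")` whether or not the json4s copy on the classpath can see default arguments, since the defaulting happens in plain Scala code.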
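Since the code works in the REPL and IntelliJ but fails once the app is packaged and run through spark-submit, my guess (an assumption, not something the trace proves) is a json4s version clash: Spark ships its own json4s jars, and they can shadow the version the app was compiled against. If that is the cause, pinning the application's json4s to whatever version sits in the Spark distribution may help; the version below is a placeholder, check the jars that your Spark actually bundles:

```scala
// build.sbt -- force json4s to match the copy bundled with Spark.
// "3.2.11" is an assumed placeholder; verify against $SPARK_HOME's jars.
dependencyOverrides += "org.json4s" %% "json4s-jackson" % "3.2.11"
```

Alternatively, shading json4s inside the application assembly jar avoids the clash without having to track Spark's bundled version.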