Spark 2.4.0 doesn't work with JDK 11: “Unsupported minor major version 55” on classes within the org.apache.xbean package
A few months ago, I checked the Spark pages; it was not very clear how well Spark handled JDK 9+. I believed it could.
And for months afterwards, I was lucky. Calling mainly lookup() on RDDs, everything ran fine. I was confident. Running Spark 2.3.2, I migrated to JDK 11 to enjoy the new features it offers for the other components I use, especially my business objects.
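The calls that kept working were roughly of this shape (a minimal sketch; the class name and sample data are made up for illustration):

    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    // Sketch of the kind of lookup() call that ran fine for me
    // on Spark 2.3.2 under JDK 11.
    public class LookupDemo {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("lookup-demo").setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaPairRDD<String, Integer> pairs = sc.parallelizePairs(
                        Arrays.asList(new Tuple2<>("a", 1), new Tuple2<>("b", 2)));
                List<Integer> values = pairs.lookup("a"); // returns [1]
                System.out.println(values);
            }
        }
    }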
But today, I had to call collect() on an RDD, and Spark 2.3.2 failed with an IllegalArgumentException somewhere inside its org.apache.xbean package, with no message.
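A minimal reproduction of what I am seeing looks like the sketch below (the class name and data are again made up; any collect() on JDK 11 triggers it for me):

    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // Minimal reproduction: on JDK 11, this collect() fails with an
    // IllegalArgumentException thrown from inside org.apache.xbean.
    public class CollectRepro {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("collect-repro").setMaster("local[*]");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3));
                List<Integer> all = rdd.collect(); // IllegalArgumentException on JDK 11
                System.out.println(all);
            }
        }
    }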
Still lucky, I discovered that a new version was out: Spark 2.4.0, released this November, and I installed it.
Spark still fails with an IllegalArgumentException, but this time it comes with a message: “Unsupported minor major version 55”. So it really does not support JDK 11.
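For reference, the number in that message refers to the class file format: every compiled .class file carries a major version in its header, and 55 is what JDK 11 compilers write (52 = Java 8, 53 = Java 9, 54 = Java 10). A quick standalone check (a sketch; pass it the path of any .class file):

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Reads the header of a compiled .class file: 4 bytes of magic number,
    // then the minor and major version of the class file format.
    // A class compiled with JDK 11 reports major = 55.
    public class ClassFileVersion {
        public static void main(String[] args) throws IOException {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                int magic = in.readInt();            // always 0xCAFEBABE
                int minor = in.readUnsignedShort();
                int major = in.readUnsignedShort();  // 52 = Java 8 ... 55 = Java 11
                System.out.printf("magic=%08X minor=%d major=%d%n", magic, minor, major);
            }
        }
    }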
I am trapped; I cannot go backward. All the software around it has reached Java 11 without trouble.
– What are those classes inside the org.apache.xbean package? Can I find a newer version of them that accepts JDK 11?
– What is the root of the problem? A series of Apache projects (Spark and a few others like that xbean) that have gone too deep into technical complexity and are now unable to move to JDK 11? Does it come from Spark's memory management, which I read uses some tricks? I do not have enough knowledge to figure out what they might have done and why they cannot return to classical memory management.
– Is there anything that can help me (short of reverting my code from JDK 11 to 8: that is not the way history goes)?
– Does anyone know whether it is planned for Spark to support JDK 11 in a later 2.x version?
Regards,