Java 11 support so connector can be used on Synapse? #689

@wyatt-troia-msft

Description

The README says Java 8 is required to use the connector, but Azure Synapse will discontinue support for its Java 8 runtime (Spark 3.3) on March 31, 2025. After that, Synapse will only support Java 11 (in its Spark 3.4 runtime).

Are there plans to support Java 11? What are my options for continuing to use azure-event-hubs-spark if not?

We get the error below when trying to use the connector on a Synapse Spark 3.4 pool. We're not sure whether the error is related to the Java version; there is a related issue open.

The class that can't be found extends the org.apache.spark.eventhubs.utils.AadAuthenticationCallback class, as described here.

```
java.lang.ClassNotFoundException: com.microsoft.commerce.priceproducer.common.auth.EventHubAadAuthenticationCallBack
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:527)
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(Class.java:315)
    at org.apache.spark.eventhubs.EventHubsConf.$anonfun$aadAuthCallback$4(EventHubsConf.scala:641)
    at scala.Option.map(Option.scala:230)
    at org.apache.spark.eventhubs.EventHubsConf.aadAuthCallback(EventHubsConf.scala:639)
    at org.apache.spark.eventhubs.client.ClientConnectionPool.$anonfun$borrowClient$6(ClientConnectionPool.scala:73)
    at org.apache.spark.eventhubs.utils.RetryUtils$.$anonfun$retryJava$1(RetryUtils.scala:91)
    at org.apache.spark.eventhubs.utils.RetryUtils$.org$apache$spark$eventhubs$utils$RetryUtils$$retryHelper$1(RetryUtils.scala:116)
    at org.apache.spark.eventhubs.utils.RetryUtils$.retryScala(RetryUtils.scala:149)
    at org.apache.spark.eventhubs.utils.RetryUtils$.retryJava(RetryUtils.scala:91)
    at org.apache.spark.eventhubs.client.ClientConnectionPool.org$apache$spark$eventhubs$client$ClientConnectionPool$$borrowClient(ClientConnectionPool.scala:69)
    at org.apache.spark.eventhubs.client.ClientConnectionPool$.borrowClient(ClientConnectionPool.scala:170)
    at org.apache.spark.eventhubs.client.EventHubsClient.client(EventHubsClient.scala:62)
    at org.apache.spark.eventhubs.client.EventHubsClient.liftedTree1$1(EventHubsClient.scala:187)
    at org.apache.spark.eventhubs.client.EventHubsClient.partitionCountLazyVal$lzycompute(EventHubsClient.scala:184)
    at org.apache.spark.eventhubs.client.EventHubsClient.partitionCountLazyVal(EventHubsClient.scala:183)
    at org.apache.spark.eventhubs.client.EventHubsClient.partitionCount(EventHubsClient.scala:176)
    at org.apache.spark.sql.eventhubs.EventHubsSource.partitionCount(EventHubsSource.scala:81)
    at org.apache.spark.sql.eventhubs.EventHubsSource.$anonfun$maxOffsetsPerTrigger$4(EventHubsSource.scala:96)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.eventhubs.EventHubsSource.$anonfun$maxOffsetsPerTrigger$2(EventHubsSource.scala:96)
    at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.eventhubs.EventHubsSource.<init>(EventHubsSource.scala:96)
    at org.apache.spark.sql.eventhubs.EventHubsSourceProvider.createSource(EventHubsSourceProvider.scala:84)
    at org.apache.spark.sql.execution.datasources.DataSource.createSource(DataSource.scala:289)
    at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.$anonfun$applyOrElse$1(MicroBatchExecution.scala:89)
    at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
    at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.applyOrElse(MicroBatchExecution.scala:86)
    at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$1.applyOrElse(MicroBatchExecution.scala:84)
    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
```
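One thing worth ruling out, independent of the Java version: a ClassNotFoundException at aadAuthCallback usually means the jar containing the custom callback class never reached the pool's classpath. A quick sanity check is to confirm the class file is actually packaged in the jar attached to the Synapse pool. The sketch below builds a stand-in jar purely for illustration (a jar is just a zip archive); against a real build you would open your own jar instead of creating one:

```python
import zipfile

# The entry the jar must contain for the class the connector fails to load.
entry = ("com/microsoft/commerce/priceproducer/common/auth/"
         "EventHubAadAuthenticationCallBack.class")

# Stand-in jar for illustration only; a real jar holds compiled bytecode here.
with zipfile.ZipFile("callback-standin.jar", "w") as jar:
    jar.writestr(entry, b"")

# The actual check: is the callback class present in the jar's listing?
with zipfile.ZipFile("callback-standin.jar") as jar:
    found = entry in jar.namelist()

print(found)  # True when the callback class is packaged
```

If the class is present in the jar but still can't be found at runtime, the jar is likely not attached to the pool (e.g. as a workspace package) rather than missing from the build.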
