Description
Issue Summary
The EventHubsConf.toConf method in the Azure Event Hubs Spark connector (see line 731) attempts to decrypt the connection string based on the detected Spark version. However, when running a PySpark notebook in Microsoft Fabric, this logic fails because:
Fabric does not expose the current Spark version to the user.
Without knowing the Spark version, there is no clear guidance on how to correctly encrypt or format the connection string so that decryption succeeds (see the sketch after this list).
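For context, the pattern documented for Spark 3 builds of the connector (outside Fabric) is to encrypt the connection string on the driver before passing it in, roughly as in the sketch below. This is a minimal sketch, not verified in Fabric: the namespace, hub, policy, and key are placeholders, and whether the EventHubsUtils.encrypt helper is even reachable depends on the connector build the runtime ships.

```python
# Minimal sketch of the connection-string flow documented for Spark 3 builds of
# azure-event-hubs-spark; placeholders (<namespace>, <eventhub>, <policy>, <key>)
# are not real values, and this has not been verified inside Fabric.
connection_string = (
    "Endpoint=sb://<namespace>.servicebus.windows.net/;"
    "SharedAccessKeyName=<policy>;SharedAccessKey=<key>;"
    "EntityPath=<eventhub>"
)

sc = spark.sparkContext  # `spark` is the session predefined in the notebook

# On Spark 3 builds, the connector expects the string to be encrypted on the
# driver so that EventHubsConf.toConf can decrypt it on the executors.
encrypted = sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)

eh_conf = {"eventhubs.connectionString": encrypted}

# Batch publish: the connector reads the payload from a "body" column.
df = spark.createDataFrame([("hello from Fabric",)], ["body"])
df.write.format("eventhubs").options(**eh_conf).save()
```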
Steps Tried
Used a raw Event Hubs connection string → decryption failed inside Fabric.
Tried Base64-encoding the string, based on the connector logic → failed with:
javax.crypto.IllegalBlockSizeException: Input length must be a multiple of 16 when decrypting with padded cipher
Switched to a SAS key–based configuration (avoiding the full connection string), but this also failed with:
java.util.NoSuchElementException: None.get
Configuration used:
```python
event_hub_conf = {
    "eventhubs.namespace": "namespace-<name>.servicebus.windows.net",
    "eventhubs.name": "eventhub-<name>",
    "eventhubs.sasKeyName": "policy<name>",
    "eventhubs.sasKey": "<KEY>",
    "eventhubs.partition.count": "4"
}
```
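For comparison, the structured streaming connector documents its configuration through a single eventhubs.connectionString entry rather than the separate namespace/sasKeyName/sasKey keys used above, which may be why toConf hits None.get. A roughly equivalent configuration (placeholders only, not verified in Fabric) would be:

```python
# Hypothetical equivalent of the configuration above, expressed as the single
# connection-string key that the structured streaming connector documents.
# Placeholders mirror the ones above and are not real values.
connection_string = (
    "Endpoint=sb://namespace-<name>.servicebus.windows.net/;"
    "SharedAccessKeyName=policy<name>;"
    "SharedAccessKey=<KEY>;"
    "EntityPath=eventhub-<name>"
)

event_hub_conf = {
    # On Spark 3 builds this value may additionally need to be passed through
    # EventHubsUtils.encrypt (see the sketch above), which is the crux of this issue.
    "eventhubs.connectionString": connection_string,
}
```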
Expected Behavior
Clear documentation, or connector behavior, that works in Microsoft Fabric for authenticating to Event Hubs (via either a full connection string or a SAS key).
No dependency on Spark version to determine the correct encryption/encoding format.
Actual Behavior
Connection string decryption fails because of Spark-version-dependent logic that the user cannot see or account for.
The SAS key–based approach throws a None.get error, leaving no viable way to publish to Event Hubs from Fabric notebooks.
Proposed Fix / Feature Request
Provide a version-agnostic way to authenticate to Event Hubs from Fabric (e.g., always accept raw SAS keys, or support Managed Identity); a possible client-side stopgap is sketched below.
Alternatively, expose spark.version in Fabric so that encryption can be applied consistently with the connector's expectations.
Update the documentation with examples that have been tested specifically in Fabric environments.
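As a client-side stopgap (not a substitute for a connector fix), a notebook could probe whether the JVM-side encrypt helper is callable and fall back to the raw string otherwise. This is an untested sketch; the helper name to_connector_connection_string is made up here, and it assumes that a missing helper surfaces as a Python exception from py4j.

```python
def to_connector_connection_string(spark, raw_connection_string):
    """Best-effort: encrypt the connection string when the Spark 3 helper is
    available, otherwise fall back to the raw string (hypothetical workaround,
    untested in Fabric)."""
    try:
        jvm = spark.sparkContext._jvm
        return jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(raw_connection_string)
    except Exception:
        # Typically a TypeError from py4j when the class is not on the classpath.
        return raw_connection_string
```

Usage would then look like `event_hub_conf = {"eventhubs.connectionString": to_connector_connection_string(spark, connection_string)}`, regardless of which Spark version the Fabric runtime happens to be running.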