Description
Hello everyone. I am using Jupyter Enterprise Gateway with PySpark sessions on Kubernetes. The elyra/kernel-spark-py:3.2.3 image works as expected.
I modified and rebuilt the image to upgrade Spark to 3.5.3. When I start this kernel through JEG, the Spark driver and executor pods are created and run as expected. However, inside the notebook, the `spark` variable stays stuck at the placeholder value `WaitingForSparkSessionToBeInitialized`. If I manually redefine it with `spark = SparkSession.builder.getOrCreate()`, it raises no error and works.
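For reference, here is the guard I use in a notebook cell as a workaround. This is only a sketch: `ensure_spark_session` is a hypothetical helper I wrote for illustration, and it assumes the JEG launcher binds `spark` to the string placeholder until deferred initialization completes.

```python
def ensure_spark_session(current):
    """Return a usable SparkSession, building one if `current` is still
    the launcher's placeholder string. Hypothetical helper, not JEG API."""
    if current == "WaitingForSparkSessionToBeInitialized":
        # Import deferred so this cell also runs where pyspark is absent
        from pyspark.sql import SparkSession
        return SparkSession.builder.getOrCreate()
    # Already a real session (or something else the launcher bound); keep it
    return current

# In the notebook: spark = ensure_spark_session(spark)
```

This avoids unconditionally calling `getOrCreate()`, so it is a no-op on kernels where deferred initialization did complete.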