[ZEPPELIN-6259] Add null safety checks in SparkInterpreterLauncher.detectSparkScalaVersionByReplClass #4999
What is this PR for?
This PR adds null safety checks to the detectSparkScalaVersionByReplClass method in SparkInterpreterLauncher.java to prevent a NullPointerException and to provide clear error messages when the Spark jars directory is inaccessible.
Current Issues Fixed:
- listFiles() returns null when the directory doesn't exist or is inaccessible
What type of PR is it?
Bug Fix / Improvement
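The root cause is a quirk of java.io.File: listFiles() returns null, not an empty array, when the path does not exist or cannot be read, so any stream built from its result throws a NullPointerException. A minimal demonstration:

```java
import java.io.File;

public class ListFilesNullDemo {
    public static void main(String[] args) {
        // File.listFiles() returns null (rather than an empty array) when the
        // path does not exist or is not readable -- the root cause of the NPE.
        File missing = new File("/invalid/path/jars");
        System.out.println(missing.listFiles()); // prints "null"
    }
}
```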
Todos
What is the Jira issue?
ZEPPELIN-6259
How should this be tested?
Unit tests added:
- testDetectSparkScalaVersionByReplClassWithNonExistentDirectory - Verifies error when directory doesn't exist
- testDetectSparkScalaVersionByReplClassWithFileInsteadOfDirectory - Verifies error when path is a file
- testDetectSparkScalaVersionByReplClassWithValidDirectory - Verifies normal operation works
- testDetectSparkScalaVersionByReplClassWithEmptyDirectory - Verifies error when no spark-repl jars found
Manual testing:
CI: All existing tests pass
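As a sketch of what the file-instead-of-a-directory test exercises (the helper name below is hypothetical and the JUnit harness is omitted; the real tests target the method in SparkInterpreterLauncher):

```java
import java.io.File;
import java.io.IOException;

public class FileInsteadOfDirectoryCheck {
    // Illustrative stand-in for the validation under test, mirroring the
    // error messages described in this PR.
    static void requireJarsDirectory(File dir) throws IOException {
        if (!dir.exists()) {
            throw new IOException("Spark jars directory does not exist: " + dir);
        }
        if (!dir.isDirectory()) {
            throw new IOException("Spark jars path is not a directory: " + dir);
        }
    }

    public static void main(String[] args) throws Exception {
        // A plain temp file stands in for a misconfigured SPARK_HOME/jars path.
        File plainFile = File.createTempFile("jars", null);
        plainFile.deleteOnExit();
        try {
            requireJarsDirectory(plainFile);
            System.out.println("no exception");
        } catch (IOException e) {
            System.out.println("IOException: not a directory");
        }
    }
}
```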
Screenshots (if appropriate)
N/A
Questions:
Implementation Details
- Check the listFiles() result with a permission hint
Error Message Examples
Before (NPE):
java.lang.NullPointerException
at java.util.stream.Stream.of(Stream.java:1012)
After (Clear messages):
java.io.IOException: Spark jars directory does not exist: /invalid/path/jars. Please check your SPARK_HOME setting.
java.io.IOException: Spark jars path is not a directory: /some/file/jars
java.io.IOException: Cannot access Spark jars directory: /restricted/jars. Please check permissions.
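The three messages above suggest a validation shape like the following. This is an illustrative sketch, not the actual patch; the method and class names are hypothetical:

```java
import java.io.File;
import java.io.IOException;

public class SparkJarsValidator {
    // Hypothetical sketch of the null-safety checks described in this PR;
    // the real implementation lives in SparkInterpreterLauncher.
    public static File[] listSparkReplJars(File sparkJarsDir) throws IOException {
        if (!sparkJarsDir.exists()) {
            throw new IOException("Spark jars directory does not exist: " + sparkJarsDir
                    + ". Please check your SPARK_HOME setting.");
        }
        if (!sparkJarsDir.isDirectory()) {
            throw new IOException("Spark jars path is not a directory: " + sparkJarsDir);
        }
        // File.listFiles() returns null on an I/O error (e.g. no read
        // permission), which previously surfaced as a NullPointerException.
        File[] jars = sparkJarsDir.listFiles((dir, name) -> name.startsWith("spark-repl"));
        if (jars == null) {
            throw new IOException("Cannot access Spark jars directory: " + sparkJarsDir
                    + ". Please check permissions.");
        }
        return jars;
    }

    public static void main(String[] args) {
        try {
            listSparkReplJars(new File("/invalid/path/jars"));
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Each failure mode maps to one distinct message, so the log alone tells the user whether to fix SPARK_HOME or file permissions.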
Benefits