Conversation

@stevenayers

Duplicate of @aagumin's #175 but with fixed tests:

Issue: #170

Description of changes:
The default value for the SPARK_VERSION variable is now taken from `pyspark.version`. If that fails, the user can still set the environment variable manually.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

FYI @chenliu0831

@stevenayers
Author

@aagumin @chenliu0831 I've just noticed this class in pyspark; should we use that instead? https://spark.apache.org/docs/3.4.1/api/python/reference/api/pyspark.util.VersionUtils.html
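For context, `pyspark.util.VersionUtils` exposes a `majorMinorVersion` helper that parses a Spark version string into a `(major, minor)` tuple. A rough, pyspark-free sketch of what that parsing does (the function below is illustrative, not the actual implementation):

```python
import re


def major_minor(spark_version: str) -> tuple:
    # Illustrative stand-in for pyspark.util.VersionUtils.majorMinorVersion:
    # extract (major, minor) from a version string like "3.4.1".
    m = re.match(r"^(\d+)\.(\d+)", spark_version)
    if m is None:
        raise ValueError("Unrecognized Spark version: %s" % spark_version)
    return (int(m.group(1)), int(m.group(2)))
```

Using the real class would avoid hand-rolling this parsing wherever the code only needs the major/minor components.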

@stevenayers stevenayers closed this Feb 6, 2025