
Commit 400caf4

[Spark] Pin the pip version to 24.0 to get around the version format requirement (#3484)
(cherry-pick of #3302)

Recent `delta-spark` [CI jobs](https://github.com/delta-io/delta/actions/runs/9628486756/job/26556785657) are failing with the following error, enforced by `pip` from version `24.1`:

```
ERROR: Invalid requirement: 'delta-spark==3.3.0-SNAPSHOT': Expected end or semicolon (after version specifier)
    delta-spark==3.3.0-SNAPSHOT
               ~~~~~~~^
```

Earlier [runs](https://github.com/delta-io/delta/actions/runs/9526169441/job/26261227425) had the following warning:

```
DEPRECATION: delta-spark 3.3.0-SNAPSHOT has a non-standard version number. pip 23.3 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of delta-spark or contact the author to suggest that they release a version with a conforming version number. Discussion can be found at pypa/pip#12063
```

This change pins the `pip` version to `24.0` to let the jobs pass. We still need a long-term solution for the version string of the generated PyPI package, but that is complicated: the `delta-spark` PyPI package also depends on the Delta jars carrying the same version as the package itself.
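The failure comes from `pip >= 24.1` strictly enforcing PEP 440 version syntax, which has no notion of a Maven-style `-SNAPSHOT` suffix. A minimal sketch of why the string is rejected, using a simplified subset of the PEP 440 grammar (an illustration, not pip's actual parser):

```python
import re

# Simplified subset of the PEP 440 version grammar (assumption: just enough
# to illustrate the failure mode, not pip's real parser).
PEP440_SUBSET = re.compile(
    r"^\d+(\.\d+)*"      # release segment, e.g. 3.3.0
    r"((a|b|rc)\d+)?"    # optional pre-release
    r"(\.post\d+)?"      # optional post-release
    r"(\.dev\d+)?$"      # optional dev-release
)

def is_pep440_like(version: str) -> bool:
    return PEP440_SUBSET.match(version) is not None

print(is_pep440_like("3.3.0"))           # True
print(is_pep440_like("3.3.0-SNAPSHOT"))  # False: '-SNAPSHOT' is not PEP 440
```

PEP 440 only allows pre/post/dev release markers (`a1`, `rc2`, `.post1`, `.dev0`), which is why `3.3.0-SNAPSHOT` fails where `3.3.0.dev0` would pass.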
1 parent 0183834 commit 400caf4

File tree

1 file changed: +6 -0

.github/workflows/spark_test.yaml (+6)
```diff
@@ -49,6 +49,12 @@ jobs:
       pyenv install 3.8.18
       pyenv global system 3.8.18
       pipenv --python 3.8 install
+      # Update the pip version to 24.0. By default `pyenv.run` installs the latest pip version
+      # available. From version 24.1, `pip` doesn't allow installing Python packages
+      # with a version string containing `-`. In the Delta-Spark case, the generated PyPI
+      # package has `-SNAPSHOT` in its version (e.g. `3.3.0-SNAPSHOT`), as the version is
+      # picked up from the `version.sbt` file.
+      pipenv run pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0
       pipenv run pip install pyspark==3.5.0
       pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
       pipenv run pip install importlib_metadata==3.10.0
```
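The pin above is a workaround. For the long-term fix the commit message calls for, one direction would be to map the sbt-style `-SNAPSHOT` suffix to a PEP 440 dev-release marker when building the PyPI package. A hypothetical helper sketching that mapping (this is an assumption, not something the repo does today):

```python
import re

def pep440_version(sbt_version: str) -> str:
    """Hypothetical mapping of a Maven/sbt-style version to PEP 440.

    '3.3.0-SNAPSHOT' -> '3.3.0.dev0'; release versions pass through unchanged.
    """
    return re.sub(r"-SNAPSHOT$", ".dev0", sbt_version)

print(pep440_version("3.3.0-SNAPSHOT"))  # 3.3.0.dev0
print(pep440_version("3.3.0"))           # 3.3.0
```

The catch noted in the commit message still applies: the Python package would then carry a different version string than the Delta jars it depends on, so the jar lookup would need to reverse this mapping.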

0 commit comments
