Updates 3.6 limits for latest versions of a few libraries (#15209)
This PR sets Python 3.6-specific limits for some of the packages
that recently stopped releasing Python 3.6 binary packages via
PyPI. Even if those packages did not drop Python 3.6 support
entirely, it gets more and more difficult to install them (both
locally and in the Docker image), because they have to be
compiled from source and often require a number of external
dependencies to do so.

This makes it difficult to upgrade dependencies automatically,
because such an upgrade fails for the Python 3.6 images.

This PR limits several of those dependencies (dask/pandas/numpy)
so that on Python 3.6 they do not use the latest major releases,
but stay at the latest versions that still ship 3.6 binaries.
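
For reference, these per-version pins use standard environment
markers (the `;python_version<"3.7"` suffix), which pip evaluates
against the running interpreter at install time. Below is a
minimal sketch of how such a marker resolves, using the
third-party `packaging` library (an illustration only, not part
of this change):

# Sketch only: shows how a Python-version environment marker is evaluated.
# Assumes the `packaging` library is installed (pip install packaging).
from packaging.markers import Marker

marker = Marker('python_version < "3.7"')

# On a Python 3.6 interpreter this evaluates to True, so pip keeps
# e.g. numpy below 1.20; on 3.7+ it is False and the pin is skipped.
print(marker.evaluate())

# The environment can also be supplied explicitly for illustration:
print(marker.evaluate({"python_version": "3.6"}))  # True
print(marker.evaluate({"python_version": "3.8"}))  # False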

A clarifying comment was also added to the `pandas-gbq` limit
introduced recently in #15114. That limit was added because of a
broken import in the bigquery provider, but the comment explaining
it was missing, so it is added now.
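
Purely as an illustration (not part of the change itself), the
effect of an upper bound like `pandas-gbq<0.15.0` can be checked
against candidate versions with the same `packaging` library:

# Sketch only: matches candidate versions against the pandas-gbq upper bound.
# Assumes the `packaging` library is installed.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet("<0.15.0")

for candidate in ("0.14.1", "0.15.0"):
    allowed = Version(candidate) in spec
    print(candidate, "allowed" if allowed else "excluded")
# 0.14.1 is allowed; 0.15.0 (the release that broke the import) is excluded.
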
potiuk committed Apr 5, 2021
1 parent 1087226 commit e497228
Showing 4 changed files with 18 additions and 6 deletions.
2 changes: 1 addition & 1 deletion airflow/models/baseoperator.py
@@ -1523,7 +1523,7 @@ def cross_downstream(
class BaseOperatorLink(metaclass=ABCMeta):
"""Abstract base class that defines how we get an operator link."""

- operators: ClassVar[List[Type[BaseOperator]]] = []
+ operators: ClassVar[List[Type[BaseOperator]]] = [] # pylint: disable=invalid-name
"""
This property will be used by Airflow Plugins to find the Operators to which you want
to assign this Operator Link
6 changes: 3 additions & 3 deletions airflow/providers/google/cloud/operators/dataflow.py
@@ -43,9 +43,9 @@ class CheckJobRunning(Enum):
WaitForRun - wait for job to finish and then continue with new job
"""

- IgnoreJob = 1
- FinishIfRunning = 2
- WaitForRun = 3
+ IgnoreJob = 1 # pylint: disable=invalid-name
+ FinishIfRunning = 2 # pylint: disable=invalid-name
+ WaitForRun = 3 # pylint: disable=invalid-name


class DataflowConfiguration:
7 changes: 6 additions & 1 deletion setup.cfg
@@ -110,7 +110,12 @@ install_requires =
markdown>=2.5.2, <4.0
markupsafe>=1.1.1, <2.0
marshmallow-oneofschema>=2.0.1
- pandas>=0.17.1, <2.0
+ # Numpy stopped releasing 3.6 binaries for 1.20.* series.
+ numpy<1.20;python_version<"3.7"
+ numpy;python_version>="3.7"
+ # Pandas stopped releasing 3.6 binaries for 1.2.* series.
+ pandas>=0.17.1, <1.2;python_version<"3.7"
+ pandas>=0.17.1, <2.0;python_version>="3.7"
pendulum~=2.0
pep562~=1.0;python_version<"3.7"
psutil>=4.2.0, <6.0.0
9 changes: 8 additions & 1 deletion setup.py
@@ -237,7 +237,12 @@ def get_sphinx_theme_version() -> str:
cloudant = [
'cloudant>=2.0',
]
- dask = ['cloudpickle>=1.4.1, <1.5.0', 'distributed>=2.11.1, <2.20']
+ dask = [
+     'cloudpickle>=1.4.1, <1.5.0',
+     'dask<2021.3.1;python_version<"3.7"', # dask stopped supporting python 3.6 in 2021.3.1 version
+     'dask>=2.9.0;python_version>="3.7"',
+     'distributed>=2.11.1, <2.20',
+ ]
databricks = [
'requests>=2.20.0, <3',
]
@@ -313,6 +318,8 @@ def get_sphinx_theme_version() -> str:
'google-cloud-workflows>=0.1.0,<2.0.0',
'grpcio-gcp>=0.2.2',
'json-merge-patch~=0.2',
+ # pandas-gbq 0.15.0 release broke google provider's bigquery import
+ # _check_google_client_version (airflow/providers/google/cloud/hooks/bigquery.py:49)
'pandas-gbq<0.15.0',
'plyvel',
]
