.. Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
.. http://www.apache.org/licenses/LICENSE-2.0
.. Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
.. NOTE TO CONTRIBUTORS:
   Please only add notes to the Changelog just below the "Changelog" header when there are breaking changes
   and you want to add an explanation to the users on how they are supposed to deal with them.
   The changelog is updated and maintained semi-automatically by the release manager.

Changelog
---------

5.1.1
.....

.. note::

   This release dropped support for Python 3.7.

Misc
~~~~
* ``Add note about dropping Python 3.7 for providers (#32015)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Add D400 pydocstyle check - Apache providers only (#31424)``

5.1.0
.....
.. note::

   This release of the provider is only available for Airflow 2.4+, as explained in the
   `Apache Airflow providers support policy <https://github.com/apache/airflow/blob/main/PROVIDERS.rst#minimum-supported-version-of-airflow-for-community-managed-providers>`_.

Misc
~~~~
* ``Bump minimum Airflow version in providers (#30917)``
* ``Update SDKs for google provider package (#30067)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Add full automation for min Airflow version for providers (#30994)``
   * ``Use '__version__' in providers not 'version' (#31393)``
   * ``Fixing circular import error in providers caused by airflow version check (#31379)``
   * ``Prepare docs for May 2023 wave of Providers (#31252)``

5.0.0
.....
Breaking changes
~~~~~~~~~~~~~~~~
.. warning::

   In this version of the provider, the deprecated ``delegate_to`` param of the GCS and Dataflow hooks
   has been removed from all Beam operators. Impersonation can be achieved instead by using the
   ``impersonation_chain`` param.

* ``remove delegate_to from GCP operators and hooks (#30748)``

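As an illustration of the migration, a Beam-on-Dataflow task that previously relied on ``delegate_to``
could pass an impersonation chain instead. This is only a sketch: the bucket, project, location and
service account below are placeholders, and the exact parameters should be verified against the
installed ``apache.beam`` and ``google`` provider versions.

.. code-block:: python

   from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
   from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

   run_pipeline = BeamRunPythonPipelineOperator(
       task_id="run_pipeline",
       runner="DataflowRunner",
       py_file="gs://example-bucket/pipeline.py",  # placeholder pipeline file
       dataflow_config=DataflowConfiguration(
           project_id="example-project",  # placeholder project
           location="us-central1",
           # Short-lived credentials are generated for this service account,
           # replacing the removed delegate_to domain-wide delegation.
           impersonation_chain="beam-runner@example-project.iam.gserviceaccount.com",
       ),
   )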
.. Review and move the new changes to one of the sections above:
   * ``Add mechanism to suspend providers (#30422)``

4.3.0
.....
Features
~~~~~~~~
* ``Get rid of state in Apache Beam provider hook (#29503)``
4.2.0
.....
Features
~~~~~~~~
* ``Add support for running a Beam Go pipeline with an executable binary (#28764)``
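A hypothetical sketch of what running a pre-compiled Go pipeline might look like, assuming the
``launcher_binary``/``worker_binary`` params added in #28764; the binary paths are placeholders,
and the exact signature should be checked against the installed provider version.

.. code-block:: python

   from airflow.providers.apache.beam.operators.beam import BeamRunGoPipelineOperator

   run_go_pipeline = BeamRunGoPipelineOperator(
       task_id="run_go_pipeline",
       # Pre-compiled launcher used on the machine submitting the job ...
       launcher_binary="/opt/beam/launcher-linux-amd64",
       # ... and a worker binary for the Beam workers themselves.
       worker_binary="/opt/beam/worker-linux-amd64",
   )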
Misc
~~~~
* ``Deprecate 'delegate_to' param in GCP operators and update docs (#29088)``
4.1.1
.....
Bug Fixes
~~~~~~~~~
* ``Ensure Beam Go file downloaded from GCS still exists when referenced (#28664)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):

4.1.0
.....
.. note::

   This release of the provider is only available for Airflow 2.3+, as explained in the
   `Apache Airflow providers support policy <https://github.com/apache/airflow/blob/main/PROVIDERS.rst#minimum-supported-version-of-airflow-for-community-managed-providers>`_.

Misc
~~~~
* ``Move min airflow version to 2.3.0 for all providers (#27196)``
Features
~~~~~~~~
* ``Add backward compatibility with old versions of Apache Beam (#27263)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Add documentation for July 2022 Provider's release (#25030)``
   * ``Update old style typing (#26872)``
   * ``Enable string normalization in python formatting - providers (#27205)``
   * ``Update docs for September Provider's release (#26731)``
   * ``Apply PEP-563 (Postponed Evaluation of Annotations) to non-core airflow (#26289)``
   * ``Prepare docs for new providers release (August 2022) (#25618)``
   * ``Move provider dependencies to inside provider folders (#24672)``

4.0.0
.....
Breaking changes
~~~~~~~~~~~~~~~~
.. note::

   This release of the provider is only available for Airflow 2.2+, as explained in the
   `Apache Airflow providers support policy <https://github.com/apache/airflow/blob/main/PROVIDERS.rst#minimum-supported-version-of-airflow-for-community-managed-providers>`_.

Features
~~~~~~~~
* ``Added missing project_id to the wait_for_job (#24020)``
* ``Support impersonation service account parameter for Dataflow runner (#23961)``
Misc
~~~~
* ``chore: Refactoring and Cleaning Apache Providers (#24219)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Add explanatory note for contributors about updating Changelog (#24229)``
   * ``AIP-47 - Migrate beam DAGs to new design #22439 (#24211)``
   * ``Prepare docs for May 2022 provider's release (#24231)``
   * ``Update package description to remove double min-airflow specification (#24292)``

3.4.0
.....
Features
~~~~~~~~
* ``Support serviceAccount attr for dataflow in the Apache beam``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):

3.3.0
.....
Features
~~~~~~~~
* ``Add recipe for BeamRunGoPipelineOperator (#22296)``
Bug Fixes
~~~~~~~~~
* ``Fix mistakenly added install_requires for all providers (#22382)``
3.2.1
.....
Misc
~~~~
* ``Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)``
3.2.0
.....
Features
~~~~~~~~
* ``Add support for BeamGoPipelineOperator (#20386)``
Misc
~~~~
* ``Support for Python 3.10``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Fixed changelog for January 2022 (delayed) provider's release (#21439)``
   * ``Fix mypy apache beam operators (#20610)``
   * ``Fix K8S changelog to be PyPI-compatible (#20614)``
   * ``Fix template_fields type to have MyPy friendly Sequence type (#20571)``
   * ``Fix MyPy Errors for Apache Beam (and Dataflow) provider. (#20301)``
   * ``Fix broken anchors markdown files (#19847)``
   * ``Add documentation for January 2021 providers release (#21257)``
   * ``Dataflow Assets (#21639)``
   * ``Remove ':type' lines now sphinx-autoapi supports typehints (#20951)``
   * ``Update documentation for provider December 2021 release (#20523)``
   * ``Use typed Context EVERYWHERE (#20565)``
   * ``Update documentation for November 2021 provider's release (#19882)``
   * ``Cleanup of start_date and default arg use for Apache example DAGs (#18657)``

3.1.0
.....
Features
~~~~~~~~
* ``Use google cloud credentials when executing beam command in subprocess (#18992)``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):

3.0.1
.....
Misc
~~~~
* ``Optimise connection importing for Airflow 2.2.0``
.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Fixed wrongly escaped characters in amazon's changelog (#17020)``
   * ``Prepares docs for Rc2 release of July providers (#17116)``
   * ``Prepare documentation for July release of providers. (#17015)``
   * ``Removes pylint from our toolchain (#16682)``

3.0.0
.....
Breaking changes
~~~~~~~~~~~~~~~~
* ``Auto-apply apply_default decorator (#15667)``
.. warning:: Due to the ``apply_default`` decorator removal, this version of the provider requires Airflow 2.1.0+.
   If your Airflow version is earlier than 2.1.0, and you want to install this provider version, first upgrade
   Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded
   automatically and you will have to manually run ``airflow db upgrade`` to complete the migration.

.. Below changes are excluded from the changelog. Move them to
   appropriate section above if needed. Do not delete the lines(!):
   * ``Rename the main branch of the Airflow repo to be main (#16149)``
   * ``Check synctatic correctness for code-snippets (#16005)``
   * ``Rename example bucket names to use INVALID BUCKET NAME by default (#15651)``
   * ``Updated documentation for June 2021 provider release (#16294)``
   * ``More documentation update for June providers release (#16405)``
   * ``Synchronizes updated changelog after buggfix release (#16464)``

2.0.0
.....
Breaking changes
~~~~~~~~~~~~~~~~
Integration with the ``google`` provider
````````````````````````````````````````
In version 2.0.0 of the provider we've changed the way it integrates with the ``google`` provider.
Previous versions of the two providers conflicted when you tried to install them together
using pip > 20.2.4. The conflict is not detected by pip 20.2.4 and below, but it was there:
the two sides required different versions of the ``Google BigQuery`` Python client. As a result, when
both the ``apache.beam`` and ``google`` providers were installed, some features of the ``BigQuery`` operators
might not have worked properly. This was caused by ``apache-beam`` not yet supporting the new Google
Python clients when the ``apache-beam[gcp]`` extra was used. The ``apache-beam[gcp]`` extra is used
by the ``Dataflow`` operators, and while they might work with a newer version of the ``Google BigQuery``
Python client, this is not guaranteed.

This version introduces an additional extra requirement for the ``apache.beam`` extra of the ``google`` provider
and, symmetrically, an additional requirement for the ``google`` extra of the ``apache.beam`` provider.
Neither provider uses these extras by default, but you can specify them when installing the providers.
The consequence is that some functionality of the ``Dataflow`` operators might not be available.

Unfortunately, the only complete solution to the problem is for ``apache-beam`` to migrate to the
new (>=2.0.0) Google Python clients.

This is the extra for the ``google`` provider:

.. code-block:: python

   extras_require = {
       # ...
       "apache.beam": ["apache-airflow-providers-apache-beam", "apache-beam[gcp]"],
       # ...
   }

And likewise this is the extra for the ``apache.beam`` provider:

.. code-block:: python

   extras_require = {"google": ["apache-airflow-providers-google", "apache-beam[gcp]"]}

You can still install the providers together with pip <= 20.2.4 and go back to the previous behaviour:

.. code-block:: shell

   pip install apache-airflow-providers-google[apache.beam]

or

.. code-block:: shell

   pip install apache-airflow-providers-apache-beam[google]

But be aware that some functionality of the ``BigQuery`` operators might not be available in this case.

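To see which versions of the packages involved in this conflict are actually present in your
environment, a quick check with the standard library (not part of either provider) can help;
the package list below is just the set of distributions discussed above.

.. code-block:: python

   from importlib.metadata import PackageNotFoundError, version

   # Print the installed version of each distribution, or note that it is missing.
   for dist in (
       "apache-airflow-providers-google",
       "apache-airflow-providers-apache-beam",
       "apache-beam",
       "google-cloud-bigquery",
   ):
       try:
           print(f"{dist}: {version(dist)}")
       except PackageNotFoundError:
           print(f"{dist}: not installed")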
1.0.1
.....
Bug Fixes
~~~~~~~~~
* ``Improve Apache Beam operators - refactor operator - common Dataflow logic (#14094)``
* ``Corrections in docs and tools after releasing provider RCs (#14082)``
* ``Remove WARNINGs from BeamHook (#14554)``
1.0.0
.....
Initial version of the provider.