
Airflow Providers

Providers are logical abstractions of submodules that can be used to interface with various tools and endpoints from your Airflow DAGs. Each provider is grouped by the top-level service that a user might need to interact with, and each provider directory contains submodules for specific forms of interaction, including hooks, operators, sensors, and transfers. A short sketch follows this paragraph.
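
For illustration, here is a minimal sketch of a DAG that uses an operator from the postgres provider. It assumes Airflow 2.0+ with the postgres provider package installed; the connection id my_postgres and the table name are placeholders.

```python
# A minimal sketch, assuming Airflow 2.0+ with the postgres provider installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="provider_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    # Provider operators are used in a DAG just like core operators.
    create_table = PostgresOperator(
        task_id="create_table",
        postgres_conn_id="my_postgres",  # placeholder connection id
        sql="CREATE TABLE IF NOT EXISTS example (id INT);",
    )
```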

Using Providers

As of Airflow 2.0, the provider packages contained in this subdirectory are versioned and released independently of the core Airflow codebase. This means that to use the submodules contained within these provider directories, a user must install the relevant provider Python package into their Airflow environment. The relevant pip commands to install these providers and their submodules are documented in READMEs within each provider subdirectory.
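
For example, provider packages follow the apache-airflow-providers-<name> naming convention, so the postgres provider can be installed as shown below; the exact command for each provider is given in its own README.

```
pip install apache-airflow-providers-postgres
```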

Note that this does not mean that all Airflow operators will be abstracted away into Python packages: core Airflow hooks and operators that live in airflow/operators and airflow/hooks will continue to be included in core Airflow releases and remain directly accessible within any Airflow environment.
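
To make the distinction concrete, the sketch below contrasts a core import with a provider import; it assumes Airflow 2.0+ and, for the second import, the ssh provider package.

```python
# Core operators ship with Airflow itself and need no extra package.
from airflow.operators.bash import BashOperator

# Provider operators require the matching provider package,
# here apache-airflow-providers-ssh (used as an assumed example).
from airflow.providers.ssh.operators.ssh import SSHOperator
```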