Update README for 2.9.0

These changes were missed during the initial release
diff --git a/README.md b/README.md
index dc8fd40..f0964d5 100644
--- a/README.md
+++ b/README.md
@@ -240,9 +240,10 @@
 
 This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
 
-- API version: 2.8.0
-- Package version: 2.8.0
+- API version: 2.9.0
+- Package version: 2.9.0
 - Build package: org.openapitools.codegen.languages.PythonClientCodegen
+
 For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)
 
 ## Requirements.
diff --git a/airflow_client/README.md b/airflow_client/README.md
index 3ff2491..f0964d5 100644
--- a/airflow_client/README.md
+++ b/airflow_client/README.md
@@ -18,6 +18,7 @@
  -->
 
 # Apache Airflow Python Client
+
 # Overview
 
 To facilitate management, Apache Airflow supports a range of REST API endpoints across its
@@ -26,6 +27,7 @@
 
 Most of the endpoints accept `JSON` as input and return `JSON` responses.
 This means that you must usually add the following headers to your request:
+
 ```
 Content-type: application/json
 Accept: application/json
@@ -41,7 +43,7 @@
 
 ## CRUD Operations
 
-The platform supports **C**reate, **R**ead, **U**pdate, and **D**elete operations on most resources.
+The platform supports **Create**, **Read**, **Update**, and **Delete** operations on most resources.
 You can review the standards for these operations and their standard parameters below.
 
 Some endpoints have special behavior as exceptions.
@@ -66,6 +68,7 @@
 of resources' metadata in the response body.
 
 When reading resources, some common query parameters are usually available. e.g.:
+
 ```
 v1/connections?limit=25&offset=25
 ```
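+
+As an illustration, here is a hedged Python sketch of the same paginated read using
+[requests](https://requests.readthedocs.io/) (the host and credentials are assumptions, adjust them
+to your deployment):
+
+```python
+import requests
+
+# Hypothetical host and credentials; adjust to your deployment.
+response = requests.get(
+    "http://localhost:8080/api/v1/connections",
+    params={"limit": 25, "offset": 25},
+    auth=("admin", "admin"),
+)
+# The body contains the requested page of resources plus paging metadata.
+page = response.json()
+```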
@@ -84,7 +87,7 @@
 
 ### Delete
 
-Deleting a resource requires the resource `id` and is typically executed via an HTTP `DELETE` request.
+Deleting a resource requires the resource `id` and is typically executed via an HTTP `DELETE` request.
 The response usually returns a `204 No Content` response code upon success.
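+
+For illustration, a hedged sketch of deleting a connection with
+[requests](https://requests.readthedocs.io/) (the host, credentials, and connection id are
+assumptions):
+
+```python
+import requests
+
+# Hypothetical host, credentials, and connection id; adjust to your deployment.
+response = requests.delete(
+    "http://localhost:8080/api/v1/connections/my_connection_id",
+    auth=("admin", "admin"),
+)
+# 204 No Content indicates the resource was deleted successfully.
+assert response.status_code == 204
+```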
 
 ## Conventions
@@ -93,16 +96,15 @@
 - Names are consistent between URL parameter name and field name.
 
 - Field names are in snake_case.
+
 ```json
 {
-    \"description\": \"string\",
     \"name\": \"string\",
-    \"occupied_slots\": 0,
-    \"open_slots\": 0
-    \"queued_slots\": 0,
-    \"running_slots\": 0,
-    \"scheduled_slots\": 0,
     \"slots\": 0,
+    \"occupied_slots\": 0,
+    \"used_slots\": 0,
+    \"queued_slots\": 0,
+    \"open_slots\": 0
 }
 ```
 
@@ -115,10 +117,13 @@
 their current values.
 
 Example:
-```
-  resource = request.get('/resource/my-id').json()
-  resource['my_field'] = 'new-value'
-  request.patch('/resource/my-id?update_mask=my_field', data=json.dumps(resource))
+
+```python
+import json
+import requests
+
+resource = requests.get("/resource/my-id").json()
+resource["my_field"] = "new-value"
+requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))
 ```
 
 ## Versioning and Endpoint Lifecycle
@@ -136,6 +141,7 @@
 Note that you will need to pass credentials data.
 
 For e.g., here is how to pause a DAG with [curl](https://curl.haxx.se/), when basic authorization is used:
+
 ```bash
 curl -X PATCH 'https://example.com/api/v1/dags/{dag_id}?update_mask=is_paused' \\
 -H 'Content-Type: application/json' \\
@@ -148,8 +154,9 @@
 Using a graphical tool such as [Postman](https://www.postman.com/) or [Insomnia](https://insomnia.rest/),
 it is possible to import the API specifications directly:
 
-1. Download the API specification by clicking the **Download** button at the top of this document
+1. Download the API specification by clicking the **Download** button at the top of this document.
 2. Import the JSON specification in the graphical tool of your choice.
+
   - In *Postman*, you can click the **import** button at the top
   - With *Insomnia*, you can just drag-and-drop the file on the UI
 
@@ -172,10 +179,12 @@
 
 If you want to check which auth backend is currently set, you can use
 `airflow config get-value api auth_backends` command as in the example below.
+
 ```bash
 $ airflow config get-value api auth_backends
 airflow.api.auth.backend.basic_auth
 ```
+
 The default is to deny all requests.
 
 For details on configuring the authentication, see
@@ -229,43 +238,40 @@
 This means that the server encountered an unexpected condition that prevented it from
 fulfilling the request.
 
-
 This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
 
-- API version: 2.8.0
-- Package version: 2.8.0
+- API version: 2.9.0
+- Package version: 2.9.0
 - Build package: org.openapitools.codegen.languages.PythonClientCodegen
+
 For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)
 
 ## Requirements.
 
-Python >=3.6
+Python >=3.8
 
 ## Installation & Usage
+
 ### pip install
 
+You can install the client using standard Python installation tools. It is hosted
+on PyPI under the `apache-airflow-client` package name, so the easiest way to get the
+latest version is to run:
+
+```bash
+pip install apache-airflow-client
+```
+
 If the python package is hosted on a repository, you can install directly using:
 
-```sh
+```bash
 pip install git+https://github.com/apache/airflow-client-python.git
 ```
-(you may need to run `pip` with root permission: `sudo pip install git+https://github.com/apache/airflow-client-python.git`)
+
+### Import check
 
 Then import the package:
-```python
-import airflow_client.client
-```
 
-### Setuptools
-
-Install via [Setuptools](http://pypi.python.org/pypi/setuptools).
-
-```sh
-python setup.py install --user
-```
-(or `sudo python setup.py install` to install the package for all users)
-
-Then import the package:
 ```python
 import airflow_client.client
 ```
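+
+To confirm the import picked up the expected release, you can also print the client version
+(a minimal sketch, assuming the generated package exposes ``__version__``, which OpenAPI-generated
+clients normally do):
+
+```python
+import airflow_client.client
+
+# Should print the installed client version, e.g. 2.9.0.
+print(airflow_client.client.__version__)
+```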
@@ -275,18 +281,16 @@
 Please follow the [installation procedure](#installation--usage) and then run the following:
 
 ```python
-
 import time
 import airflow_client.client
 from pprint import pprint
 from airflow_client.client.api import config_api
 from airflow_client.client.model.config import Config
 from airflow_client.client.model.error import Error
+
 # Defining the host is optional and defaults to /api/v1
 # See configuration.py for a list of all supported configuration parameters.
-configuration = client.Configuration(
-    host = "/api/v1"
-)
+configuration = client.Configuration(host="/api/v1")
 
 # The client must configure the authentication and authorization parameters
 # in accordance with the API server security policy.
@@ -294,21 +298,17 @@
 # satisfies your auth use case.
 
 # Configure HTTP basic authorization: Basic
-configuration = client.Configuration(
-    username = 'YOUR_USERNAME',
-    password = 'YOUR_PASSWORD'
-)
+configuration = client.Configuration(username="YOUR_USERNAME", password="YOUR_PASSWORD")
 
 
 # Enter a context with an instance of the API client
 with client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = config_api.ConfigApi(api_client)
-    section = "section_example" # str | If given, only return config of this section. (optional)
 
     try:
         # Get current configuration
-        api_response = api_instance.get_config(section=section)
+        api_response = api_instance.get_config()
         pprint(api_response)
     except client.ApiException as e:
         print("Exception when calling ConfigApi->get_config: %s\n" % e)
@@ -321,7 +321,6 @@
 Class | Method | HTTP request | Description
 ------------ | ------------- | ------------- | -------------
 *ConfigApi* | [**get_config**](docs/ConfigApi.md#get_config) | **GET** /config | Get current configuration
-*ConfigApi* | [**get_value**](docs/ConfigApi.md#get_value) | **GET** /config/section/{section}/option/{option} | Get a option from configuration
 *ConnectionApi* | [**delete_connection**](docs/ConnectionApi.md#delete_connection) | **DELETE** /connections/{connection_id} | Delete a connection
 *ConnectionApi* | [**get_connection**](docs/ConnectionApi.md#get_connection) | **GET** /connections/{connection_id} | Get a connection
 *ConnectionApi* | [**get_connections**](docs/ConnectionApi.md#get_connections) | **GET** /connections | List connections
@@ -345,7 +344,7 @@
 *DAGRunApi* | [**get_dag_runs**](docs/DAGRunApi.md#get_dag_runs) | **GET** /dags/{dag_id}/dagRuns | List DAG runs
 *DAGRunApi* | [**get_dag_runs_batch**](docs/DAGRunApi.md#get_dag_runs_batch) | **POST** /dags/~/dagRuns/list | List DAG runs (batch)
 *DAGRunApi* | [**get_upstream_dataset_events**](docs/DAGRunApi.md#get_upstream_dataset_events) | **GET** /dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | Get dataset events for a DAG run
-*DAGRunApi* | [**post_dag_run**](docs/DAGRunApi.md#post_dag_run) | **POST** /dags/{dag_id}/dagRuns | Trigger a new DAG run.
+*DAGRunApi* | [**post_dag_run**](docs/DAGRunApi.md#post_dag_run) | **POST** /dags/{dag_id}/dagRuns | Trigger a new DAG run
 *DAGRunApi* | [**set_dag_run_note**](docs/DAGRunApi.md#set_dag_run_note) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id}/setNote | Update the DagRun note.
 *DAGRunApi* | [**update_dag_run_state**](docs/DAGRunApi.md#update_dag_run_state) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id} | Modify a DAG run
 *DagWarningApi* | [**get_dag_warnings**](docs/DagWarningApi.md#get_dag_warnings) | **GET** /dagWarnings | List dag warnings
@@ -427,7 +426,6 @@
  - [DAGRun](docs/DAGRun.md)
  - [DAGRunCollection](docs/DAGRunCollection.md)
  - [DAGRunCollectionAllOf](docs/DAGRunCollectionAllOf.md)
- - [DagProcessorStatus](docs/DagProcessorStatus.md)
  - [DagScheduleDatasetReference](docs/DagScheduleDatasetReference.md)
  - [DagState](docs/DagState.md)
  - [DagWarning](docs/DagWarning.md)
@@ -488,11 +486,9 @@
  - [TimeDelta](docs/TimeDelta.md)
  - [Trigger](docs/Trigger.md)
  - [TriggerRule](docs/TriggerRule.md)
- - [TriggererStatus](docs/TriggererStatus.md)
  - [UpdateDagRunState](docs/UpdateDagRunState.md)
  - [UpdateTaskInstance](docs/UpdateTaskInstance.md)
  - [UpdateTaskInstancesState](docs/UpdateTaskInstancesState.md)
- - [UpdateTaskState](docs/UpdateTaskState.md)
  - [User](docs/User.md)
  - [UserAllOf](docs/UserAllOf.md)
  - [UserCollection](docs/UserCollection.md)
@@ -512,40 +508,104 @@
  - [XComCollectionAllOf](docs/XComCollectionAllOf.md)
  - [XComCollectionItem](docs/XComCollectionItem.md)
 
-
 ## Documentation For Authorization
 
+By default, the generated client supports three authentication schemes:
 
-## Basic
+* Basic
+* GoogleOpenID
+* Kerberos
 
-- **Type**: HTTP basic authentication
+However, you can generate the client and documentation with your own schemes by adding them to
+the security section of the OpenAPI specification. You can do this with the Breeze CLI by adding the
+``--security-schemes`` option to the ``breeze release-management prepare-python-client`` command.
 
+## Basic "smoke" tests
 
-## Kerberos
+You can run basic smoke tests to check that the client works properly; we provide a simple test script
+that exercises the API. To do that, you need to:
 
+* install the `apache-airflow-client` package as described above
+* install the ``rich`` Python package
+* download the [test_python_client.py](test_python_client.py) file
+* make sure you have a test Airflow installation running. Do not experiment with your production deployment.
+* configure your Airflow webserver to enable basic authentication.
+  In the `[api]` section of your `airflow.cfg` set:
 
+```ini
+[api]
+auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
+```
 
-## Author
+You can also set it with an environment variable:
+`export AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth`
 
-dev@airflow.apache.org
+* configure your Airflow webserver to load the example DAGs.
+  In the `[core]` section of your `airflow.cfg` set:
+
+```ini
+[core]
+load_examples = True
+```
+
+You can also set it with an environment variable: `export AIRFLOW__CORE__LOAD_EXAMPLES=True`
+
+* optionally expose the configuration (NOTE: this is a dangerous setting). The script will happily run with
+  the default setting, but if you want to see the configuration, you need to expose it.
+  In the `[webserver]` section of your `airflow.cfg` set:
+
+```ini
+[webserver]
+expose_config = True
+```
+
+You can also set it with an environment variable: `export AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True`
+
+* configure your host/IP/user/password in the `test_python_client.py` file
+
+```python
+import airflow_client.client
+
+# Configure HTTP basic authorization: Basic
+configuration = airflow_client.client.Configuration(
+    host="http://localhost:8080/api/v1", username="admin", password="admin"
+)
+```
+
+* run the scheduler (or the standalone DAG file processor, if you have set one up) for a few parsing
+  loops (you can pass the --num-runs parameter to it or keep it running in the background). The script
+  relies on the example DAGs being serialized to the DB, and this only happens when the scheduler runs
+  with ``core/load_examples`` set to True.
+
+* run the webserver, reachable at the host/port configured for the test script. Make sure it has had
+  enough time to initialize.
+
+Run `python test_python_client.py` and you should see colored output showing the connection attempts and their status.
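+
+If you want a quicker sanity check before running the full script, a minimal hedged sketch
+(reusing the host and credentials assumed above, and assuming the generated `DAGApi` class)
+that lists the DAGs known to the webserver could look like this:
+
+```python
+import airflow_client.client
+from airflow_client.client.api import dag_api
+
+# Hypothetical host and credentials, matching the test configuration above.
+configuration = airflow_client.client.Configuration(
+    host="http://localhost:8080/api/v1", username="admin", password="admin"
+)
+
+with airflow_client.client.ApiClient(configuration) as api_client:
+    dag_instance = dag_api.DAGApi(api_client)
+    # Example DAGs should appear here once the scheduler has serialized them.
+    print(dag_instance.get_dags())
+```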
 
 
 ## Notes for Large OpenAPI documents
+
 If the OpenAPI document is large, imports in client.apis and client.models may fail with a
 RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:
 
 Solution 1:
 Use specific imports for apis and models like:
+
 - `from airflow_client.client.api.default_api import DefaultApi`
 - `from airflow_client.client.model.pet import Pet`
 
 Solution 2:
 Before importing the package, adjust the maximum recursion limit as shown below:
-```
+
+```python
 import sys
+
 sys.setrecursionlimit(1500)
 import airflow_client.client
 from airflow_client.client.apis import *
 from airflow_client.client.models import *
 ```
 
+## Authors
+
+dev@airflow.apache.org