Update Python Client to 3.1.0rc1 (#138)

(from https://github.com/apache/airflow/tree/python-client/3.1.0rc1)

## New Features:

- Add `map_index` filter to TaskInstance API queries ([#55614](https://github.com/apache/airflow/pull/55614))
- Add `has_import_errors` filter to Core API GET /dags endpoint ([#54563](https://github.com/apache/airflow/pull/54563))
- Add `dag_version` filter to get_dag_runs endpoint ([#54882](https://github.com/apache/airflow/pull/54882))
- Implement pattern search for event log endpoint ([#55114](https://github.com/apache/airflow/pull/55114))
- Add asset-based filtering support to DAG API endpoint ([#54263](https://github.com/apache/airflow/pull/54263))
- Add Greater Than and Less Than range filters to DagRuns and Task Instance list ([#54302](https://github.com/apache/airflow/pull/54302))
- Add `try_number` as filter to task instances ([#54695](https://github.com/apache/airflow/pull/54695))
- Add filters to Browse XComs endpoint ([#54049](https://github.com/apache/airflow/pull/54049))
- Add Filtering by DAG Bundle Name and Version to API routes ([#54004](https://github.com/apache/airflow/pull/54004))
- Add search filter for DAG runs by triggering user name ([#53652](https://github.com/apache/airflow/pull/53652))
- Enable multi sorting (AIP-84) ([#53408](https://github.com/apache/airflow/pull/53408))
- Add `run_on_latest_version` support for backfill and clear operations ([#52177](https://github.com/apache/airflow/pull/52177))
- Add `run_id_pattern` search for Dag Run API ([#52437](https://github.com/apache/airflow/pull/52437))
- Add tracking of triggering user to Dag runs ([#51738](https://github.com/apache/airflow/pull/51738))
- Expose DAG parsing duration in the API ([#54752](https://github.com/apache/airflow/pull/54752))
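
Most of the filters above are plain query parameters on the existing list endpoints, so they can be exercised without regenerating anything. Below is a minimal stdlib-only sketch of building a filtered task-instance query; the host, dag id, and run id are placeholders, the endpoint path follows the `/api/v2` routes listed in the README, and the parameter names (`map_index`, `try_number`) are taken from the changelog entries above:

```python
from urllib.parse import urlencode

def task_instances_url(base: str, dag_id: str, dag_run_id: str, **filters) -> str:
    """Build a GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances URL,
    appending only the filters that were actually supplied."""
    query = urlencode({k: v for k, v in filters.items() if v is not None})
    url = f"{base}/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances"
    return f"{url}?{query}" if query else url

url = task_instances_url(
    "http://localhost:8080",   # placeholder host
    "example_dag",             # placeholder dag id
    "manual__2025-01-01",      # placeholder run id
    map_index=0,               # new filter in 3.1.0 (#55614)
    try_number=1,              # new filter in 3.1.0 (#54695)
)
```

The generated client exposes the same filters as keyword arguments on its API methods; the raw-URL form above is just the transport-level view of what those arguments become.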

## New API Endpoints:

- Add Human-in-the-Loop (HITL) endpoints for approval workflows ([#52868](https://github.com/apache/airflow/pull/52868), [#53373](https://github.com/apache/airflow/pull/53373), [#53376](https://github.com/apache/airflow/pull/53376), [#53885](https://github.com/apache/airflow/pull/53885), [#53923](https://github.com/apache/airflow/pull/53923), [#54308](https://github.com/apache/airflow/pull/54308), [#54310](https://github.com/apache/airflow/pull/54310), [#54723](https://github.com/apache/airflow/pull/54723), [#54773](https://github.com/apache/airflow/pull/54773), [#55019](https://github.com/apache/airflow/pull/55019), [#55463](https://github.com/apache/airflow/pull/55463), [#55525](https://github.com/apache/airflow/pull/55525), [#55535](https://github.com/apache/airflow/pull/55535), [#55603](https://github.com/apache/airflow/pull/55603), [#55776](https://github.com/apache/airflow/pull/55776))
- Add endpoint to watch dag run until finish ([#51920](https://github.com/apache/airflow/pull/51920))
- Add TI bulk actions endpoint ([#50443](https://github.com/apache/airflow/pull/50443))
- Add Keycloak Refresh Token Endpoint ([#51657](https://github.com/apache/airflow/pull/51657))

## Deprecations:

- Mark `DagDetailsResponse.concurrency` as deprecated ([#55150](https://github.com/apache/airflow/pull/55150))

## Bug Fixes:

- Fix dag import error modal pagination ([#55719](https://github.com/apache/airflow/pull/55719))
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 095e52c..7be084d 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -17,6 +17,42 @@
  under the License.
  -->
 
+# v3.1.0
+
+## New Features:
+
+- Add `map_index` filter to TaskInstance API queries ([#55614](https://github.com/apache/airflow/pull/55614))
+- Add `has_import_errors` filter to Core API GET /dags endpoint ([#54563](https://github.com/apache/airflow/pull/54563))
+- Add `dag_version` filter to get_dag_runs endpoint ([#54882](https://github.com/apache/airflow/pull/54882))
+- Implement pattern search for event log endpoint ([#55114](https://github.com/apache/airflow/pull/55114))
+- Add asset-based filtering support to DAG API endpoint ([#54263](https://github.com/apache/airflow/pull/54263))
+- Add Greater Than and Less Than range filters to DagRuns and Task Instance list ([#54302](https://github.com/apache/airflow/pull/54302))
+- Add `try_number` as filter to task instances ([#54695](https://github.com/apache/airflow/pull/54695))
+- Add filters to Browse XComs endpoint ([#54049](https://github.com/apache/airflow/pull/54049))
+- Add Filtering by DAG Bundle Name and Version to API routes ([#54004](https://github.com/apache/airflow/pull/54004))
+- Add search filter for DAG runs by triggering user name ([#53652](https://github.com/apache/airflow/pull/53652))
+- Enable multi sorting (AIP-84) ([#53408](https://github.com/apache/airflow/pull/53408))
+- Add `run_on_latest_version` support for backfill and clear operations ([#52177](https://github.com/apache/airflow/pull/52177))
+- Add `run_id_pattern` search for Dag Run API ([#52437](https://github.com/apache/airflow/pull/52437))
+- Add tracking of triggering user to Dag runs ([#51738](https://github.com/apache/airflow/pull/51738))
+- Expose DAG parsing duration in the API ([#54752](https://github.com/apache/airflow/pull/54752))
+
+## New API Endpoints:
+
+- Add Human-in-the-Loop (HITL) endpoints for approval workflows ([#52868](https://github.com/apache/airflow/pull/52868), [#53373](https://github.com/apache/airflow/pull/53373), [#53376](https://github.com/apache/airflow/pull/53376), [#53885](https://github.com/apache/airflow/pull/53885), [#53923](https://github.com/apache/airflow/pull/53923), [#54308](https://github.com/apache/airflow/pull/54308), [#54310](https://github.com/apache/airflow/pull/54310), [#54723](https://github.com/apache/airflow/pull/54723), [#54773](https://github.com/apache/airflow/pull/54773), [#55019](https://github.com/apache/airflow/pull/55019), [#55463](https://github.com/apache/airflow/pull/55463), [#55525](https://github.com/apache/airflow/pull/55525), [#55535](https://github.com/apache/airflow/pull/55535), [#55603](https://github.com/apache/airflow/pull/55603), [#55776](https://github.com/apache/airflow/pull/55776))
+- Add endpoint to watch dag run until finish ([#51920](https://github.com/apache/airflow/pull/51920))
+- Add TI bulk actions endpoint ([#50443](https://github.com/apache/airflow/pull/50443))
+- Add Keycloak Refresh Token Endpoint ([#51657](https://github.com/apache/airflow/pull/51657))
+
+## Deprecations:
+
+- Mark `DagDetailsResponse.concurrency` as deprecated ([#55150](https://github.com/apache/airflow/pull/55150))
+
+## Bug Fixes:
+
+- Fix dag import error modal pagination ([#55719](https://github.com/apache/airflow/pull/55719))
+
+
 # v3.0.2
 
 ## Major changes:
diff --git a/README.md b/README.md
index f071d07..e164f4b 100644
--- a/README.md
+++ b/README.md
@@ -394,6 +394,7 @@
 *DagRunApi* | [**get_dag_runs**](docs/DagRunApi.md#get_dag_runs) | **GET** /api/v2/dags/{dag_id}/dagRuns | Get Dag Runs
 *DagRunApi* | [**get_list_dag_runs_batch**](docs/DagRunApi.md#get_list_dag_runs_batch) | **POST** /api/v2/dags/{dag_id}/dagRuns/list | Get List Dag Runs Batch
 *DagRunApi* | [**get_upstream_asset_events**](docs/DagRunApi.md#get_upstream_asset_events) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamAssetEvents | Get Upstream Asset Events
+*DagRunApi* | [**wait_dag_run_until_finished**](docs/DagRunApi.md#wait_dag_run_until_finished) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait | Experimental: Wait for a dag run to complete, and return task results if requested.
 *DagRunApi* | [**patch_dag_run**](docs/DagRunApi.md#patch_dag_run) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} | Patch Dag Run
 *DagRunApi* | [**trigger_dag_run**](docs/DagRunApi.md#trigger_dag_run) | **POST** /api/v2/dags/{dag_id}/dagRuns | Trigger Dag Run
 *DagSourceApi* | [**get_dag_source**](docs/DagSourceApi.md#get_dag_source) | **GET** /api/v2/dagSources/{dag_id} | Get Dag Source
diff --git a/airflow_client/client/__init__.py b/airflow_client/client/__init__.py
index 07e883b..e95b35a 100644
--- a/airflow_client/client/__init__.py
+++ b/airflow_client/client/__init__.py
@@ -14,7 +14,7 @@
 """  # noqa: E501
 
 
-__version__ = "3.0.2"
+__version__ = "3.1.0"
 
 # import apis into sdk package
 from airflow_client.client.api.asset_api import AssetApi
@@ -43,6 +43,7 @@
 from airflow_client.client.api.variable_api import VariableApi
 from airflow_client.client.api.version_api import VersionApi
 from airflow_client.client.api.x_com_api import XComApi
+from airflow_client.client.api.experimental_api import ExperimentalApi
 
 # import ApiClient
 from airflow_client.client.api_response import ApiResponse
@@ -71,19 +72,26 @@
 from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
 from airflow_client.client.models.bulk_action_on_existence import BulkActionOnExistence
 from airflow_client.client.models.bulk_action_response import BulkActionResponse
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+from airflow_client.client.models.bulk_body_bulk_task_instance_body_actions_inner import BulkBodyBulkTaskInstanceBodyActionsInner
 from airflow_client.client.models.bulk_body_connection_body import BulkBodyConnectionBody
 from airflow_client.client.models.bulk_body_connection_body_actions_inner import BulkBodyConnectionBodyActionsInner
 from airflow_client.client.models.bulk_body_pool_body import BulkBodyPoolBody
 from airflow_client.client.models.bulk_body_pool_body_actions_inner import BulkBodyPoolBodyActionsInner
 from airflow_client.client.models.bulk_body_variable_body import BulkBodyVariableBody
 from airflow_client.client.models.bulk_body_variable_body_actions_inner import BulkBodyVariableBodyActionsInner
+from airflow_client.client.models.bulk_create_action_bulk_task_instance_body import BulkCreateActionBulkTaskInstanceBody
 from airflow_client.client.models.bulk_create_action_connection_body import BulkCreateActionConnectionBody
 from airflow_client.client.models.bulk_create_action_pool_body import BulkCreateActionPoolBody
 from airflow_client.client.models.bulk_create_action_variable_body import BulkCreateActionVariableBody
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body import BulkDeleteActionBulkTaskInstanceBody
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
 from airflow_client.client.models.bulk_delete_action_connection_body import BulkDeleteActionConnectionBody
 from airflow_client.client.models.bulk_delete_action_pool_body import BulkDeleteActionPoolBody
 from airflow_client.client.models.bulk_delete_action_variable_body import BulkDeleteActionVariableBody
 from airflow_client.client.models.bulk_response import BulkResponse
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+from airflow_client.client.models.bulk_update_action_bulk_task_instance_body import BulkUpdateActionBulkTaskInstanceBody
 from airflow_client.client.models.bulk_update_action_connection_body import BulkUpdateActionConnectionBody
 from airflow_client.client.models.bulk_update_action_pool_body import BulkUpdateActionPoolBody
 from airflow_client.client.models.bulk_update_action_variable_body import BulkUpdateActionVariableBody
@@ -130,9 +138,15 @@
 from airflow_client.client.models.dry_run_backfill_response import DryRunBackfillResponse
 from airflow_client.client.models.event_log_collection_response import EventLogCollectionResponse
 from airflow_client.client.models.event_log_response import EventLogResponse
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
+from airflow_client.client.models.external_view_response import ExternalViewResponse
 from airflow_client.client.models.extra_link_collection_response import ExtraLinkCollectionResponse
 from airflow_client.client.models.fast_api_app_response import FastAPIAppResponse
 from airflow_client.client.models.fast_api_root_middleware_response import FastAPIRootMiddlewareResponse
+from airflow_client.client.models.hitl_detail import HITLDetail
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
+from airflow_client.client.models.hitl_user import HITLUser
 from airflow_client.client.models.http_exception_response import HTTPExceptionResponse
 from airflow_client.client.models.http_validation_error import HTTPValidationError
 from airflow_client.client.models.health_info_response import HealthInfoResponse
@@ -140,8 +154,11 @@
 from airflow_client.client.models.import_error_response import ImportErrorResponse
 from airflow_client.client.models.job_collection_response import JobCollectionResponse
 from airflow_client.client.models.job_response import JobResponse
+from airflow_client.client.models.last_asset_event_response import LastAssetEventResponse
 from airflow_client.client.models.patch_task_instance_body import PatchTaskInstanceBody
 from airflow_client.client.models.plugin_collection_response import PluginCollectionResponse
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
+from airflow_client.client.models.plugin_import_error_response import PluginImportErrorResponse
 from airflow_client.client.models.plugin_response import PluginResponse
 from airflow_client.client.models.pool_body import PoolBody
 from airflow_client.client.models.pool_collection_response import PoolCollectionResponse
@@ -151,6 +168,7 @@
 from airflow_client.client.models.provider_response import ProviderResponse
 from airflow_client.client.models.queued_event_collection_response import QueuedEventCollectionResponse
 from airflow_client.client.models.queued_event_response import QueuedEventResponse
+from airflow_client.client.models.react_app_response import ReactAppResponse
 from airflow_client.client.models.reprocess_behavior import ReprocessBehavior
 from airflow_client.client.models.response_clear_dag_run import ResponseClearDagRun
 from airflow_client.client.models.response_get_xcom_entry import ResponseGetXcomEntry
@@ -159,6 +177,7 @@
 from airflow_client.client.models.task_collection_response import TaskCollectionResponse
 from airflow_client.client.models.task_dependency_collection_response import TaskDependencyCollectionResponse
 from airflow_client.client.models.task_dependency_response import TaskDependencyResponse
+from airflow_client.client.models.task_inlet_asset_reference import TaskInletAssetReference
 from airflow_client.client.models.task_instance_collection_response import TaskInstanceCollectionResponse
 from airflow_client.client.models.task_instance_history_collection_response import TaskInstanceHistoryCollectionResponse
 from airflow_client.client.models.task_instance_history_response import TaskInstanceHistoryResponse
@@ -172,6 +191,7 @@
 from airflow_client.client.models.trigger_dag_run_post_body import TriggerDAGRunPostBody
 from airflow_client.client.models.trigger_response import TriggerResponse
 from airflow_client.client.models.triggerer_info_response import TriggererInfoResponse
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
 from airflow_client.client.models.validation_error import ValidationError
 from airflow_client.client.models.validation_error_loc_inner import ValidationErrorLocInner
 from airflow_client.client.models.value import Value
diff --git a/airflow_client/client/api/__init__.py b/airflow_client/client/api/__init__.py
index b776754..bfbd4b0 100644
--- a/airflow_client/client/api/__init__.py
+++ b/airflow_client/client/api/__init__.py
@@ -27,4 +27,5 @@
 from airflow_client.client.api.variable_api import VariableApi
 from airflow_client.client.api.version_api import VersionApi
 from airflow_client.client.api.x_com_api import XComApi
+from airflow_client.client.api.experimental_api import ExperimentalApi
 
diff --git a/airflow_client/client/api/asset_api.py b/airflow_client/client/api/asset_api.py
index a367486..1a4ab9d 100644
--- a/airflow_client/client/api/asset_api.py
+++ b/airflow_client/client/api/asset_api.py
@@ -313,7 +313,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -603,7 +604,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -911,7 +913,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1201,7 +1204,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1474,7 +1478,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1747,7 +1752,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1773,8 +1779,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1796,10 +1802,10 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1856,8 +1862,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1879,10 +1885,10 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1939,8 +1945,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1962,10 +1968,10 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2028,6 +2034,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -2073,7 +2080,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2099,14 +2107,16 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         asset_id: Optional[StrictInt] = None,
         source_dag_id: Optional[StrictStr] = None,
         source_task_id: Optional[StrictStr] = None,
         source_run_id: Optional[StrictStr] = None,
         source_map_index: Optional[StrictInt] = None,
         timestamp_gte: Optional[datetime] = None,
+        timestamp_gt: Optional[datetime] = None,
         timestamp_lte: Optional[datetime] = None,
+        timestamp_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2129,7 +2139,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param asset_id:
         :type asset_id: int
         :param source_dag_id:
@@ -2142,8 +2152,12 @@
         :type source_map_index: int
         :param timestamp_gte:
         :type timestamp_gte: datetime
+        :param timestamp_gt:
+        :type timestamp_gt: datetime
         :param timestamp_lte:
         :type timestamp_lte: datetime
+        :param timestamp_lt:
+        :type timestamp_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2176,7 +2190,9 @@
             source_run_id=source_run_id,
             source_map_index=source_map_index,
             timestamp_gte=timestamp_gte,
+            timestamp_gt=timestamp_gt,
             timestamp_lte=timestamp_lte,
+            timestamp_lt=timestamp_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -2206,14 +2222,16 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         asset_id: Optional[StrictInt] = None,
         source_dag_id: Optional[StrictStr] = None,
         source_task_id: Optional[StrictStr] = None,
         source_run_id: Optional[StrictStr] = None,
         source_map_index: Optional[StrictInt] = None,
         timestamp_gte: Optional[datetime] = None,
+        timestamp_gt: Optional[datetime] = None,
         timestamp_lte: Optional[datetime] = None,
+        timestamp_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2236,7 +2254,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param asset_id:
         :type asset_id: int
         :param source_dag_id:
@@ -2249,8 +2267,12 @@
         :type source_map_index: int
         :param timestamp_gte:
         :type timestamp_gte: datetime
+        :param timestamp_gt:
+        :type timestamp_gt: datetime
         :param timestamp_lte:
         :type timestamp_lte: datetime
+        :param timestamp_lt:
+        :type timestamp_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2283,7 +2305,9 @@
             source_run_id=source_run_id,
             source_map_index=source_map_index,
             timestamp_gte=timestamp_gte,
+            timestamp_gt=timestamp_gt,
             timestamp_lte=timestamp_lte,
+            timestamp_lt=timestamp_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -2313,14 +2337,16 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         asset_id: Optional[StrictInt] = None,
         source_dag_id: Optional[StrictStr] = None,
         source_task_id: Optional[StrictStr] = None,
         source_run_id: Optional[StrictStr] = None,
         source_map_index: Optional[StrictInt] = None,
         timestamp_gte: Optional[datetime] = None,
+        timestamp_gt: Optional[datetime] = None,
         timestamp_lte: Optional[datetime] = None,
+        timestamp_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2343,7 +2369,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param asset_id:
         :type asset_id: int
         :param source_dag_id:
@@ -2356,8 +2382,12 @@
         :type source_map_index: int
         :param timestamp_gte:
         :type timestamp_gte: datetime
+        :param timestamp_gt:
+        :type timestamp_gt: datetime
         :param timestamp_lte:
         :type timestamp_lte: datetime
+        :param timestamp_lt:
+        :type timestamp_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2390,7 +2420,9 @@
             source_run_id=source_run_id,
             source_map_index=source_map_index,
             timestamp_gte=timestamp_gte,
+            timestamp_gt=timestamp_gt,
             timestamp_lte=timestamp_lte,
+            timestamp_lt=timestamp_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -2422,7 +2454,9 @@
         source_run_id,
         source_map_index,
         timestamp_gte,
+        timestamp_gt,
         timestamp_lte,
+        timestamp_lt,
         _request_auth,
         _content_type,
         _headers,
@@ -2432,6 +2466,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -2490,6 +2525,19 @@
             else:
                 _query_params.append(('timestamp_gte', timestamp_gte))
             
+        if timestamp_gt is not None:
+            if isinstance(timestamp_gt, datetime):
+                _query_params.append(
+                    (
+                        'timestamp_gt',
+                        timestamp_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('timestamp_gt', timestamp_gt))
+            
         if timestamp_lte is not None:
             if isinstance(timestamp_lte, datetime):
                 _query_params.append(
@@ -2503,6 +2551,19 @@
             else:
                 _query_params.append(('timestamp_lte', timestamp_lte))
             
+        if timestamp_lt is not None:
+            if isinstance(timestamp_lt, datetime):
+                _query_params.append(
+                    (
+                        'timestamp_lt',
+                        timestamp_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('timestamp_lt', timestamp_lt))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -2519,7 +2580,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2809,7 +2871,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2835,11 +2898,11 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        uri_pattern: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        uri_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         dag_ids: Optional[List[StrictStr]] = None,
         only_active: Optional[StrictBool] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2861,16 +2924,16 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
-        :param uri_pattern:
+        :param uri_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type uri_pattern: str
         :param dag_ids:
         :type dag_ids: List[str]
         :param only_active:
         :type only_active: bool
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2930,11 +2993,11 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        uri_pattern: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        uri_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         dag_ids: Optional[List[StrictStr]] = None,
         only_active: Optional[StrictBool] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2956,16 +3019,16 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
-        :param uri_pattern:
+        :param uri_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type uri_pattern: str
         :param dag_ids:
         :type dag_ids: List[str]
         :param only_active:
         :type only_active: bool
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -3025,11 +3088,11 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        name_pattern: Optional[StrictStr] = None,
-        uri_pattern: Optional[StrictStr] = None,
+        name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        uri_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         dag_ids: Optional[List[StrictStr]] = None,
         only_active: Optional[StrictBool] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -3051,16 +3114,16 @@
         :type limit: int
         :param offset:
         :type offset: int
-        :param name_pattern:
+        :param name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type name_pattern: str
-        :param uri_pattern:
+        :param uri_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type uri_pattern: str
         :param dag_ids:
         :type dag_ids: List[str]
         :param only_active:
         :type only_active: bool
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -3130,6 +3193,7 @@
 
         _collection_formats: Dict[str, str] = {
             'dag_ids': 'multi',
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -3187,7 +3251,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -3492,7 +3557,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -3782,7 +3848,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -4058,7 +4125,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
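The hunks above change `order_by` from a single string to `List[StrictStr]` and register it under the `'multi'` collection format. In OpenAPI-generated clients, `'multi'` means each list element is emitted as its own repeated query parameter, which is what enables multi-column sorting (AIP-84). A stdlib-only sketch of that serialization (the helper name is illustrative, not part of the client):

```python
from urllib.parse import urlencode

def serialize_multi(params: dict) -> str:
    """Serialize query params, repeating the key for list values
    (the behaviour of the 'multi' collection format)."""
    # doseq=True expands list values into repeated key=value pairs
    return urlencode(params, doseq=True)

query = serialize_multi({
    "limit": 50,
    "order_by": ["name", "-created_at"],  # multi-column sort; '-' prefix = descending
})
print(query)  # limit=50&order_by=name&order_by=-created_at
```

The server receives each `order_by` value separately and applies them in order, so `["name", "-created_at"]` sorts by name ascending, then creation time descending within ties.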
diff --git a/airflow_client/client/api/backfill_api.py b/airflow_client/client/api/backfill_api.py
index 3086458..1e453cd 100644
--- a/airflow_client/client/api/backfill_api.py
+++ b/airflow_client/client/api/backfill_api.py
@@ -17,7 +17,7 @@
 from typing_extensions import Annotated
 
 from pydantic import Field, StrictStr
-from typing import Optional
+from typing import List, Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.backfill_collection_response import BackfillCollectionResponse
 from airflow_client.client.models.backfill_post_body import BackfillPostBody
@@ -294,7 +294,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -580,7 +581,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -866,7 +868,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1136,7 +1139,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1163,7 +1167,7 @@
         dag_id: StrictStr,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1187,7 +1191,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1244,7 +1248,7 @@
         dag_id: StrictStr,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1268,7 +1272,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1325,7 +1329,7 @@
         dag_id: StrictStr,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1349,7 +1353,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1411,6 +1415,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1456,7 +1461,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1729,7 +1735,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2002,7 +2009,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
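Every `_auth_settings` list in these hunks gains `'HTTPBearer'` alongside `'OAuth2PasswordBearer'`. On the wire both reduce to the same `Authorization: Bearer <token>` header; the practical difference is auth resolution, where the client uses the first scheme in `_auth_settings` for which credentials are configured. A rough mimic of that resolution (the function is a sketch, not the client's actual API):

```python
def resolve_auth(auth_settings: list, configured: dict):
    """Mimic the generated client's auth resolution: pick the first
    scheme in _auth_settings that has configured credentials."""
    for scheme in auth_settings:
        if scheme in configured:
            return scheme, configured[scheme]
    return None  # no matching scheme -> request goes out unauthenticated

_auth_settings = ["OAuth2PasswordBearer", "HTTPBearer"]
# With only a plain bearer token configured, 'HTTPBearer' now matches;
# before this change the lookup would have found nothing.
print(resolve_auth(_auth_settings, {"HTTPBearer": "my-token"}))
```

This is why the change matters for users of external token providers: a plain bearer token no longer has to masquerade as an OAuth2 password flow credential.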
diff --git a/airflow_client/client/api/config_api.py b/airflow_client/client/api/config_api.py
index 728cbee..c0fe410 100644
--- a/airflow_client/client/api/config_api.py
+++ b/airflow_client/client/api/config_api.py
@@ -308,7 +308,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -612,7 +613,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/connection_api.py b/airflow_client/client/api/connection_api.py
index bdbb7a1..3bec862 100644
--- a/airflow_client/client/api/connection_api.py
+++ b/airflow_client/client/api/connection_api.py
@@ -306,7 +306,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -558,7 +559,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -831,7 +833,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1104,7 +1107,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1130,8 +1134,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        connection_id_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        connection_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1154,8 +1158,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param connection_id_pattern:
+        :type order_by: List[str]
+        :param connection_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type connection_id_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1213,8 +1217,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        connection_id_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        connection_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1237,8 +1241,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param connection_id_pattern:
+        :type order_by: List[str]
+        :param connection_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type connection_id_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1296,8 +1300,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        connection_id_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        connection_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1320,8 +1324,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param connection_id_pattern:
+        :type order_by: List[str]
+        :param connection_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type connection_id_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1385,6 +1389,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1430,7 +1435,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1752,7 +1758,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2038,7 +2045,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2321,7 +2329,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
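The new `*_pattern` docstrings stress SQL LIKE semantics (`%` / `_` wildcards), not regular expressions. For intuition about what the server matches, a LIKE pattern can be translated to an anchored regex, `%` becoming `.*` and `_` becoming `.`. This is a client-side sketch of the semantics only; case handling in practice depends on the database backend:

```python
import re

def like_to_regex(pattern: str) -> re.Pattern:
    """Translate a SQL LIKE pattern into an anchored regex:
    '%' matches any run of characters, '_' exactly one character."""
    out = []
    for ch in pattern:
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))  # everything else is literal
    return re.compile("^" + "".join(out) + "$")

pat = like_to_regex("%customer_%")
print(bool(pat.match("eu_customer_orders")))  # True: '_' consumes the char after 'customer'
print(bool(pat.match("customer")))            # False: '_' requires one more character
```

Note the subtlety in the documented example `%customer_%`: the `_` is a single-character wildcard, not a literal underscore, so `customer` alone does not match.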
diff --git a/airflow_client/client/api/dag_api.py b/airflow_client/client/api/dag_api.py
index 019ec28..6a7bceb 100644
--- a/airflow_client/client/api/dag_api.py
+++ b/airflow_client/client/api/dag_api.py
@@ -300,7 +300,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -322,6 +323,280 @@
 
 
     @validate_call
+    def favorite_dag(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> None:
+        """Favorite Dag
+
+        Mark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._favorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def favorite_dag_with_http_info(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[None]:
+        """Favorite Dag
+
+        Mark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._favorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def favorite_dag_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Favorite Dag
+
+        Mark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._favorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _favorite_dag_serialize(
+        self,
+        dag_id,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='POST',
+            resource_path='/api/v2/dags/{dag_id}/favorite',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
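The `_favorite_dag_serialize` helper above pins down the request shape for the new endpoint. Reduced to stdlib terms (the builder function is illustrative, and a real call would URL-quote `dag_id` as the generated client does), the call is:

```python
def build_favorite_dag_request(dag_id: str, token: str):
    """Sketch of the request _favorite_dag_serialize produces:
    POST /api/v2/dags/{dag_id}/favorite with a bearer token."""
    path = f"/api/v2/dags/{dag_id}/favorite"
    headers = {
        "Accept": "application/json",        # only media type the endpoint declares
        "Authorization": f"Bearer {token}",  # HTTPBearer / OAuth2PasswordBearer
    }
    return ("POST", path, headers)  # success is 204 No Content, no body

method, path, headers = build_favorite_dag_request("example_dag", "token")
print(method, path)  # POST /api/v2/dags/example_dag/favorite
```

The `204` in `_response_types_map` maps to `None`, which is why `favorite_dag()` returns nothing on success and raises only for the 401/403/404/422 error shapes.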
+    @validate_call
     def get_dag(
         self,
         dag_id: StrictStr,
@@ -576,7 +851,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -852,7 +1128,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -878,8 +1155,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        tag_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        tag_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -902,8 +1179,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param tag_name_pattern:
+        :type order_by: List[str]
+        :param tag_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type tag_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -960,8 +1237,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        tag_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        tag_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -984,8 +1261,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param tag_name_pattern:
+        :type order_by: List[str]
+        :param tag_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type tag_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1042,8 +1319,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        tag_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        tag_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1066,8 +1343,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param tag_name_pattern:
+        :type order_by: List[str]
+        :param tag_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type tag_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
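The `tag_name_pattern` parameter above is interpreted server-side as a SQL LIKE expression. A minimal stdlib sketch of those matching semantics (`%` matches any run of characters, `_` matches exactly one; the tag names below are made up for illustration):

```python
import re

def sql_like_match(pattern: str, value: str) -> bool:
    # Translate SQL LIKE wildcards into an anchored regex:
    # % -> .* (any run of characters), _ -> . (exactly one character).
    # re.escape leaves % and _ untouched (they are not regex-special),
    # so the replacements below see them verbatim.
    regex = re.escape(pattern).replace("%", ".*").replace("_", ".")
    return re.fullmatch(regex, value) is not None

tags = ["customer_daily", "customer_weekly", "ops_hourly"]
matches = [t for t in tags if sql_like_match("%customer_%", t)]
```

Note that `_` is itself a wildcard, so a pattern like `%customer_%` also matches a literal underscore by accident of the single-character rule, not by literal matching.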
@@ -1130,6 +1407,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
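The new `'multi'` collection format means each `order_by` entry is serialized as a repeated query parameter rather than a comma-joined string. A stdlib sketch of the resulting query string (parameter values are illustrative):

```python
from urllib.parse import urlencode

# Repeated keys, as produced by the 'multi' collection format for order_by
params = [
    ("order_by", "-last_run_start_date"),  # leading '-' requests descending order
    ("order_by", "dag_id"),                # secondary sort key
    ("limit", "25"),
]
query = urlencode(params)
```

The server applies the keys in order, so the list expresses a multi-column sort (AIP-84).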
@@ -1175,7 +1453,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
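`HTTPBearer` is now accepted alongside `OAuth2PasswordBearer`; both schemes ultimately send the same standard bearer `Authorization` header (with the generated client you would typically set `configuration.access_token`). A minimal sketch of the header, with a placeholder token:

```python
# Both auth schemes result in this header on the wire.
# The token value is a placeholder, not a real JWT.
token = "example-token"
headers = {"Authorization": f"Bearer {token}"}
```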
@@ -1204,17 +1483,27 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
-        dag_display_name_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
+        has_import_errors: Annotated[Optional[StrictBool], Field(description="Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.")] = None,
         last_dag_run_state: Optional[DagRunState] = None,
+        bundle_name: Optional[StrictStr] = None,
+        bundle_version: Optional[StrictStr] = None,
+        has_asset_schedule: Annotated[Optional[StrictBool], Field(description="Filter Dags with asset-based scheduling")] = None,
+        asset_dependency: Annotated[Optional[StrictStr], Field(description="Filter Dags by asset dependency (name or URI)")] = None,
         dag_run_start_date_gte: Optional[datetime] = None,
+        dag_run_start_date_gt: Optional[datetime] = None,
         dag_run_start_date_lte: Optional[datetime] = None,
+        dag_run_start_date_lt: Optional[datetime] = None,
         dag_run_end_date_gte: Optional[datetime] = None,
+        dag_run_end_date_gt: Optional[datetime] = None,
         dag_run_end_date_lte: Optional[datetime] = None,
+        dag_run_end_date_lt: Optional[datetime] = None,
         dag_run_state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        is_favorite: Optional[StrictBool] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1242,28 +1531,48 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
-        :param dag_display_name_pattern:
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_display_name_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
         :param paused:
         :type paused: bool
+        :param has_import_errors: Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.
+        :type has_import_errors: bool
         :param last_dag_run_state:
         :type last_dag_run_state: DagRunState
+        :param bundle_name:
+        :type bundle_name: str
+        :param bundle_version:
+        :type bundle_version: str
+        :param has_asset_schedule: Filter Dags with asset-based scheduling
+        :type has_asset_schedule: bool
+        :param asset_dependency: Filter Dags by asset dependency (name or URI)
+        :type asset_dependency: str
         :param dag_run_start_date_gte:
         :type dag_run_start_date_gte: datetime
+        :param dag_run_start_date_gt:
+        :type dag_run_start_date_gt: datetime
         :param dag_run_start_date_lte:
         :type dag_run_start_date_lte: datetime
+        :param dag_run_start_date_lt:
+        :type dag_run_start_date_lt: datetime
         :param dag_run_end_date_gte:
         :type dag_run_end_date_gte: datetime
+        :param dag_run_end_date_gt:
+        :type dag_run_end_date_gt: datetime
         :param dag_run_end_date_lte:
         :type dag_run_end_date_lte: datetime
+        :param dag_run_end_date_lt:
+        :type dag_run_end_date_lt: datetime
         :param dag_run_state:
         :type dag_run_state: List[str]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param is_favorite:
+        :type is_favorite: bool
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1296,13 +1605,23 @@
             dag_display_name_pattern=dag_display_name_pattern,
             exclude_stale=exclude_stale,
             paused=paused,
+            has_import_errors=has_import_errors,
             last_dag_run_state=last_dag_run_state,
+            bundle_name=bundle_name,
+            bundle_version=bundle_version,
+            has_asset_schedule=has_asset_schedule,
+            asset_dependency=asset_dependency,
             dag_run_start_date_gte=dag_run_start_date_gte,
+            dag_run_start_date_gt=dag_run_start_date_gt,
             dag_run_start_date_lte=dag_run_start_date_lte,
+            dag_run_start_date_lt=dag_run_start_date_lt,
             dag_run_end_date_gte=dag_run_end_date_gte,
+            dag_run_end_date_gt=dag_run_end_date_gt,
             dag_run_end_date_lte=dag_run_end_date_lte,
+            dag_run_end_date_lt=dag_run_end_date_lt,
             dag_run_state=dag_run_state,
             order_by=order_by,
+            is_favorite=is_favorite,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
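The new `get_dags` filters (`has_import_errors`, `bundle_name`, `is_favorite`, etc.) are applied server-side by GET /dags; the client merely forwards them as query parameters. A local sketch of how they narrow results, using made-up rows rather than real client objects:

```python
# Illustrative only: the server performs this filtering; the dicts below
# are hypothetical rows, not objects returned by the client.
dags = [
    {"dag_id": "etl_daily", "has_import_errors": False, "bundle_name": "dags-folder"},
    {"dag_id": "broken_dag", "has_import_errors": True, "bundle_name": "dags-folder"},
    {"dag_id": "legacy_report", "has_import_errors": False, "bundle_name": "legacy"},
]
selected = [
    d["dag_id"]
    for d in dags
    if not d["has_import_errors"] and d["bundle_name"] == "dags-folder"
]
```

Leaving a filter as `None` omits the query parameter entirely, so unset filters impose no constraint.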
@@ -1334,17 +1653,27 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
-        dag_display_name_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
+        has_import_errors: Annotated[Optional[StrictBool], Field(description="Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.")] = None,
         last_dag_run_state: Optional[DagRunState] = None,
+        bundle_name: Optional[StrictStr] = None,
+        bundle_version: Optional[StrictStr] = None,
+        has_asset_schedule: Annotated[Optional[StrictBool], Field(description="Filter Dags with asset-based scheduling")] = None,
+        asset_dependency: Annotated[Optional[StrictStr], Field(description="Filter Dags by asset dependency (name or URI)")] = None,
         dag_run_start_date_gte: Optional[datetime] = None,
+        dag_run_start_date_gt: Optional[datetime] = None,
         dag_run_start_date_lte: Optional[datetime] = None,
+        dag_run_start_date_lt: Optional[datetime] = None,
         dag_run_end_date_gte: Optional[datetime] = None,
+        dag_run_end_date_gt: Optional[datetime] = None,
         dag_run_end_date_lte: Optional[datetime] = None,
+        dag_run_end_date_lt: Optional[datetime] = None,
         dag_run_state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        is_favorite: Optional[StrictBool] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1372,28 +1701,48 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
-        :param dag_display_name_pattern:
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_display_name_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
         :param paused:
         :type paused: bool
+        :param has_import_errors: Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.
+        :type has_import_errors: bool
         :param last_dag_run_state:
         :type last_dag_run_state: DagRunState
+        :param bundle_name:
+        :type bundle_name: str
+        :param bundle_version:
+        :type bundle_version: str
+        :param has_asset_schedule: Filter Dags with asset-based scheduling
+        :type has_asset_schedule: bool
+        :param asset_dependency: Filter Dags by asset dependency (name or URI)
+        :type asset_dependency: str
         :param dag_run_start_date_gte:
         :type dag_run_start_date_gte: datetime
+        :param dag_run_start_date_gt:
+        :type dag_run_start_date_gt: datetime
         :param dag_run_start_date_lte:
         :type dag_run_start_date_lte: datetime
+        :param dag_run_start_date_lt:
+        :type dag_run_start_date_lt: datetime
         :param dag_run_end_date_gte:
         :type dag_run_end_date_gte: datetime
+        :param dag_run_end_date_gt:
+        :type dag_run_end_date_gt: datetime
         :param dag_run_end_date_lte:
         :type dag_run_end_date_lte: datetime
+        :param dag_run_end_date_lt:
+        :type dag_run_end_date_lt: datetime
         :param dag_run_state:
         :type dag_run_state: List[str]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param is_favorite:
+        :type is_favorite: bool
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1426,13 +1775,23 @@
             dag_display_name_pattern=dag_display_name_pattern,
             exclude_stale=exclude_stale,
             paused=paused,
+            has_import_errors=has_import_errors,
             last_dag_run_state=last_dag_run_state,
+            bundle_name=bundle_name,
+            bundle_version=bundle_version,
+            has_asset_schedule=has_asset_schedule,
+            asset_dependency=asset_dependency,
             dag_run_start_date_gte=dag_run_start_date_gte,
+            dag_run_start_date_gt=dag_run_start_date_gt,
             dag_run_start_date_lte=dag_run_start_date_lte,
+            dag_run_start_date_lt=dag_run_start_date_lt,
             dag_run_end_date_gte=dag_run_end_date_gte,
+            dag_run_end_date_gt=dag_run_end_date_gt,
             dag_run_end_date_lte=dag_run_end_date_lte,
+            dag_run_end_date_lt=dag_run_end_date_lt,
             dag_run_state=dag_run_state,
             order_by=order_by,
+            is_favorite=is_favorite,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -1464,17 +1823,27 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
-        dag_display_name_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
+        has_import_errors: Annotated[Optional[StrictBool], Field(description="Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.")] = None,
         last_dag_run_state: Optional[DagRunState] = None,
+        bundle_name: Optional[StrictStr] = None,
+        bundle_version: Optional[StrictStr] = None,
+        has_asset_schedule: Annotated[Optional[StrictBool], Field(description="Filter Dags with asset-based scheduling")] = None,
+        asset_dependency: Annotated[Optional[StrictStr], Field(description="Filter Dags by asset dependency (name or URI)")] = None,
         dag_run_start_date_gte: Optional[datetime] = None,
+        dag_run_start_date_gt: Optional[datetime] = None,
         dag_run_start_date_lte: Optional[datetime] = None,
+        dag_run_start_date_lt: Optional[datetime] = None,
         dag_run_end_date_gte: Optional[datetime] = None,
+        dag_run_end_date_gt: Optional[datetime] = None,
         dag_run_end_date_lte: Optional[datetime] = None,
+        dag_run_end_date_lt: Optional[datetime] = None,
         dag_run_state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        is_favorite: Optional[StrictBool] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1502,28 +1871,48 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
-        :param dag_display_name_pattern:
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_display_name_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
         :param paused:
         :type paused: bool
+        :param has_import_errors: Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned.
+        :type has_import_errors: bool
         :param last_dag_run_state:
         :type last_dag_run_state: DagRunState
+        :param bundle_name:
+        :type bundle_name: str
+        :param bundle_version:
+        :type bundle_version: str
+        :param has_asset_schedule: Filter Dags with asset-based scheduling
+        :type has_asset_schedule: bool
+        :param asset_dependency: Filter Dags by asset dependency (name or URI)
+        :type asset_dependency: str
         :param dag_run_start_date_gte:
         :type dag_run_start_date_gte: datetime
+        :param dag_run_start_date_gt:
+        :type dag_run_start_date_gt: datetime
         :param dag_run_start_date_lte:
         :type dag_run_start_date_lte: datetime
+        :param dag_run_start_date_lt:
+        :type dag_run_start_date_lt: datetime
         :param dag_run_end_date_gte:
         :type dag_run_end_date_gte: datetime
+        :param dag_run_end_date_gt:
+        :type dag_run_end_date_gt: datetime
         :param dag_run_end_date_lte:
         :type dag_run_end_date_lte: datetime
+        :param dag_run_end_date_lt:
+        :type dag_run_end_date_lt: datetime
         :param dag_run_state:
         :type dag_run_state: List[str]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param is_favorite:
+        :type is_favorite: bool
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1556,13 +1945,23 @@
             dag_display_name_pattern=dag_display_name_pattern,
             exclude_stale=exclude_stale,
             paused=paused,
+            has_import_errors=has_import_errors,
             last_dag_run_state=last_dag_run_state,
+            bundle_name=bundle_name,
+            bundle_version=bundle_version,
+            has_asset_schedule=has_asset_schedule,
+            asset_dependency=asset_dependency,
             dag_run_start_date_gte=dag_run_start_date_gte,
+            dag_run_start_date_gt=dag_run_start_date_gt,
             dag_run_start_date_lte=dag_run_start_date_lte,
+            dag_run_start_date_lt=dag_run_start_date_lt,
             dag_run_end_date_gte=dag_run_end_date_gte,
+            dag_run_end_date_gt=dag_run_end_date_gt,
             dag_run_end_date_lte=dag_run_end_date_lte,
+            dag_run_end_date_lt=dag_run_end_date_lt,
             dag_run_state=dag_run_state,
             order_by=order_by,
+            is_favorite=is_favorite,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -1593,13 +1992,23 @@
         dag_display_name_pattern,
         exclude_stale,
         paused,
+        has_import_errors,
         last_dag_run_state,
+        bundle_name,
+        bundle_version,
+        has_asset_schedule,
+        asset_dependency,
         dag_run_start_date_gte,
+        dag_run_start_date_gt,
         dag_run_start_date_lte,
+        dag_run_start_date_lt,
         dag_run_end_date_gte,
+        dag_run_end_date_gt,
         dag_run_end_date_lte,
+        dag_run_end_date_lt,
         dag_run_state,
         order_by,
+        is_favorite,
         _request_auth,
         _content_type,
         _headers,
@@ -1612,6 +2021,7 @@
             'tags': 'multi',
             'owners': 'multi',
             'dag_run_state': 'multi',
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1661,10 +2071,30 @@
             
             _query_params.append(('paused', paused))
             
+        if has_import_errors is not None:
+            
+            _query_params.append(('has_import_errors', has_import_errors))
+            
         if last_dag_run_state is not None:
             
             _query_params.append(('last_dag_run_state', last_dag_run_state.value))
             
+        if bundle_name is not None:
+            
+            _query_params.append(('bundle_name', bundle_name))
+            
+        if bundle_version is not None:
+            
+            _query_params.append(('bundle_version', bundle_version))
+            
+        if has_asset_schedule is not None:
+            
+            _query_params.append(('has_asset_schedule', has_asset_schedule))
+            
+        if asset_dependency is not None:
+            
+            _query_params.append(('asset_dependency', asset_dependency))
+            
         if dag_run_start_date_gte is not None:
             if isinstance(dag_run_start_date_gte, datetime):
                 _query_params.append(
@@ -1678,6 +2108,19 @@
             else:
                 _query_params.append(('dag_run_start_date_gte', dag_run_start_date_gte))
             
+        if dag_run_start_date_gt is not None:
+            if isinstance(dag_run_start_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'dag_run_start_date_gt',
+                        dag_run_start_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('dag_run_start_date_gt', dag_run_start_date_gt))
+            
         if dag_run_start_date_lte is not None:
             if isinstance(dag_run_start_date_lte, datetime):
                 _query_params.append(
@@ -1691,6 +2134,19 @@
             else:
                 _query_params.append(('dag_run_start_date_lte', dag_run_start_date_lte))
             
+        if dag_run_start_date_lt is not None:
+            if isinstance(dag_run_start_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'dag_run_start_date_lt',
+                        dag_run_start_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('dag_run_start_date_lt', dag_run_start_date_lt))
+            
         if dag_run_end_date_gte is not None:
             if isinstance(dag_run_end_date_gte, datetime):
                 _query_params.append(
@@ -1704,6 +2160,19 @@
             else:
                 _query_params.append(('dag_run_end_date_gte', dag_run_end_date_gte))
             
+        if dag_run_end_date_gt is not None:
+            if isinstance(dag_run_end_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'dag_run_end_date_gt',
+                        dag_run_end_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('dag_run_end_date_gt', dag_run_end_date_gt))
+            
         if dag_run_end_date_lte is not None:
             if isinstance(dag_run_end_date_lte, datetime):
                 _query_params.append(
@@ -1717,6 +2186,19 @@
             else:
                 _query_params.append(('dag_run_end_date_lte', dag_run_end_date_lte))
             
+        if dag_run_end_date_lt is not None:
+            if isinstance(dag_run_end_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'dag_run_end_date_lt',
+                        dag_run_end_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('dag_run_end_date_lt', dag_run_end_date_lt))
+            
         if dag_run_state is not None:
             
             _query_params.append(('dag_run_state', dag_run_state))
@@ -1725,6 +2207,10 @@
             
             _query_params.append(('order_by', order_by))
             
+        if is_favorite is not None:
+            
+            _query_params.append(('is_favorite', is_favorite))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -1741,7 +2227,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2063,7 +2550,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2094,7 +2582,7 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
         _request_timeout: Union[
@@ -2128,7 +2616,7 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
@@ -2202,7 +2690,7 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
         _request_timeout: Union[
@@ -2236,7 +2724,7 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
@@ -2310,7 +2798,7 @@
         tags: Optional[List[StrictStr]] = None,
         tags_match_mode: Optional[StrictStr] = None,
         owners: Optional[List[StrictStr]] = None,
-        dag_id_pattern: Optional[StrictStr] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         exclude_stale: Optional[StrictBool] = None,
         paused: Optional[StrictBool] = None,
         _request_timeout: Union[
@@ -2344,7 +2832,7 @@
         :type tags_match_mode: str
         :param owners:
         :type owners: List[str]
-        :param dag_id_pattern:
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type dag_id_pattern: str
         :param exclude_stale:
         :type exclude_stale: bool
@@ -2508,7 +2996,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2527,3 +3016,280 @@
         )
 
 
+
+
+    @validate_call
+    def unfavorite_dag(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> None:
+        """Unfavorite Dag
+
+        Unmark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._unfavorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
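For quick reference, the `_response_types_map` above can be read as follows. This is an interpretive sketch: the generated map only names the exception models, so the human-readable meanings for 404 and 409 are assumptions, not taken from the spec:

```python
# Hypothetical interpretation of unfavorite_dag response codes.
# The 404/409 wording is an assumption; the spec lists only the codes.
OUTCOMES = {
    204: "favorite removed",
    401: "not authenticated",
    403: "not authorized",
    404: "DAG not found",
    409: "DAG was not marked as a favorite",  # assumed meaning of the conflict
    422: "validation error",
}

def describe(status: int) -> str:
    return OUTCOMES.get(status, "unexpected status")
```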
+    @validate_call
+    def unfavorite_dag_with_http_info(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[None]:
+        """Unfavorite Dag
+
+        Unmark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._unfavorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def unfavorite_dag_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Unfavorite Dag
+
+        Unmark the DAG as favorite.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: float, tuple(float, float), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._unfavorite_dag_serialize(
+            dag_id=dag_id,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '204': None,
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _unfavorite_dag_serialize(
+        self,
+        dag_id,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='POST',
+            resource_path='/api/v2/dags/{dag_id}/unfavorite',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
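The new unfavorite endpoint issues a plain `POST /api/v2/dags/{dag_id}/unfavorite` with bearer auth and no body. As a minimal stdlib sketch of what the generated `_unfavorite_dag_serialize` assembles — the host and token here are hypothetical placeholders, not values from the client's real `Configuration`:

```python
from urllib.parse import quote
from urllib.request import Request

# Hypothetical base URL; in the generated client this comes from Configuration.
HOST = "http://localhost:8080"

def build_unfavorite_request(dag_id: str, token: str) -> Request:
    """Build (but do not send) the POST used by unfavorite_dag."""
    # dag_id is a path parameter, so URL-encode it defensively.
    url = f"{HOST}/api/v2/dags/{quote(dag_id, safe='')}/unfavorite"
    return Request(
        url,
        method="POST",
        headers={
            "Accept": "application/json",        # the endpoint's only media type
            "Authorization": f"Bearer {token}",  # HTTPBearer, newly accepted
        },
    )

req = build_unfavorite_request("example_dag", "my-token")
print(req.get_method(), req.full_url)
```

A successful call returns `204 No Content`; `404` and `409` map to `HTTPExceptionResponse` as shown in `_response_types_map`.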
diff --git a/airflow_client/client/api/dag_parsing_api.py b/airflow_client/client/api/dag_parsing_api.py
index c769b8d..422d80e 100644
--- a/airflow_client/client/api/dag_parsing_api.py
+++ b/airflow_client/client/api/dag_parsing_api.py
@@ -289,7 +289,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
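Every `_auth_settings` list in this diff gains `HTTPBearer` alongside `OAuth2PasswordBearer`, so either credential type can satisfy a request. A hypothetical resolver — a sketch of the selection logic, not the generated client's actual internals — illustrates the effect:

```python
# Sketch: pick an Authorization header from an _auth_settings list.
# Scheme names match the diff; the resolver itself is illustrative only.
def auth_header(auth_settings, oauth2_token=None, bearer_token=None):
    for scheme in auth_settings:
        if scheme == "OAuth2PasswordBearer" and oauth2_token:
            return {"Authorization": f"Bearer {oauth2_token}"}
        if scheme == "HTTPBearer" and bearer_token:
            return {"Authorization": f"Bearer {bearer_token}"}
    return {}  # no usable credential for any listed scheme

print(auth_header(["OAuth2PasswordBearer", "HTTPBearer"], bearer_token="jwt"))
```

In practice this means a plain JWT configured on the client (e.g. via the generated `Configuration`'s access token, if your client version supports it) now works against endpoints that previously advertised only the OAuth2 password flow.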
diff --git a/airflow_client/client/api/dag_report_api.py b/airflow_client/client/api/dag_report_api.py
index 3b6048d..43b7d7d 100644
--- a/airflow_client/client/api/dag_report_api.py
+++ b/airflow_client/client/api/dag_report_api.py
@@ -291,7 +291,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
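The `dag_run_api.py` changes that follow add `run_id_pattern` and `triggering_user_name_pattern`, both documented as SQL LIKE expressions (`%` for any run of characters, `_` for exactly one; no regex support). The matching happens server-side, but a small stdlib sketch of the equivalent semantics clarifies what a pattern like `%customer_%` selects:

```python
import re

def like_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a SQL LIKE pattern (%, _) into an anchored regex.

    Illustrative equivalent of the server-side matching for
    run_id_pattern / triggering_user_name_pattern.
    """
    parts = []
    for ch in pattern:
        if ch == "%":
            parts.append(".*")   # % matches any run of characters
        elif ch == "_":
            parts.append(".")    # _ matches exactly one character
        else:
            parts.append(re.escape(ch))
    return re.compile("^" + "".join(parts) + "$")

rx = like_to_regex("%customer_%")
print(bool(rx.match("scheduled__customer_42")))
```

Note that a literal underscore in a run ID (common in Airflow run IDs like `manual__2025-01-01`) is itself a single-character wildcard in LIKE syntax, so patterns often match more broadly than they look.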
diff --git a/airflow_client/client/api/dag_run_api.py b/airflow_client/client/api/dag_run_api.py
index 7bceae8..764643b 100644
--- a/airflow_client/client/api/dag_run_api.py
+++ b/airflow_client/client/api/dag_run_api.py
@@ -17,8 +17,8 @@
 from typing_extensions import Annotated
 
 from datetime import datetime
-from pydantic import Field, StrictStr, field_validator
-from typing import Any, List, Optional
+from pydantic import Field, StrictFloat, StrictInt, StrictStr, field_validator
+from typing import Any, List, Optional, Union
 from typing_extensions import Annotated
 from airflow_client.client.models.asset_event_collection_response import AssetEventCollectionResponse
 from airflow_client.client.models.dag_run_clear_body import DAGRunClearBody
@@ -339,7 +339,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -630,7 +631,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -915,7 +917,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -943,18 +946,31 @@
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         run_type: Optional[List[StrictStr]] = None,
         state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        dag_version: Optional[List[StrictInt]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        triggering_user_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -980,30 +996,56 @@
         :type offset: int
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param run_type:
         :type run_type: List[str]
         :param state:
         :type state: List[str]
+        :param dag_version:
+        :type dag_version: List[int]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param triggering_user_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type triggering_user_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                 number is provided, it will be the total request
                                  timeout. It can also be a pair (tuple) of
@@ -1031,18 +1073,31 @@
             limit=limit,
             offset=offset,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             run_type=run_type,
             state=state,
+            dag_version=dag_version,
             order_by=order_by,
+            run_id_pattern=run_id_pattern,
+            triggering_user_name_pattern=triggering_user_name_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -1074,18 +1129,31 @@
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         run_type: Optional[List[StrictStr]] = None,
         state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        dag_version: Optional[List[StrictInt]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        triggering_user_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1111,30 +1179,56 @@
         :type offset: int
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param run_type:
         :type run_type: List[str]
         :param state:
         :type state: List[str]
+        :param dag_version:
+        :type dag_version: List[int]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param triggering_user_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type triggering_user_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                 number is provided, it will be the total request
                                  timeout. It can also be a pair (tuple) of
@@ -1162,18 +1256,31 @@
             limit=limit,
             offset=offset,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             run_type=run_type,
             state=state,
+            dag_version=dag_version,
             order_by=order_by,
+            run_id_pattern=run_id_pattern,
+            triggering_user_name_pattern=triggering_user_name_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -1205,18 +1312,31 @@
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         run_type: Optional[List[StrictStr]] = None,
         state: Optional[List[StrictStr]] = None,
-        order_by: Optional[StrictStr] = None,
+        dag_version: Optional[List[StrictInt]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        triggering_user_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1242,30 +1362,56 @@
         :type offset: int
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param run_type:
         :type run_type: List[str]
         :param state:
         :type state: List[str]
+        :param dag_version:
+        :type dag_version: List[int]
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param triggering_user_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type triggering_user_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                 number is provided, it will be the total request
                                  timeout. It can also be a pair (tuple) of
@@ -1293,18 +1439,31 @@
             limit=limit,
             offset=offset,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             run_type=run_type,
             state=state,
+            dag_version=dag_version,
             order_by=order_by,
+            run_id_pattern=run_id_pattern,
+            triggering_user_name_pattern=triggering_user_name_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -1331,18 +1490,31 @@
         limit,
         offset,
         run_after_gte,
+        run_after_gt,
         run_after_lte,
+        run_after_lt,
         logical_date_gte,
+        logical_date_gt,
         logical_date_lte,
+        logical_date_lt,
         start_date_gte,
+        start_date_gt,
         start_date_lte,
+        start_date_lt,
         end_date_gte,
+        end_date_gt,
         end_date_lte,
+        end_date_lt,
         updated_at_gte,
+        updated_at_gt,
         updated_at_lte,
+        updated_at_lt,
         run_type,
         state,
+        dag_version,
         order_by,
+        run_id_pattern,
+        triggering_user_name_pattern,
         _request_auth,
         _content_type,
         _headers,
@@ -1354,6 +1526,8 @@
         _collection_formats: Dict[str, str] = {
             'run_type': 'multi',
             'state': 'multi',
+            'dag_version': 'multi',
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1390,6 +1564,19 @@
             else:
                 _query_params.append(('run_after_gte', run_after_gte))
             
+        if run_after_gt is not None:
+            if isinstance(run_after_gt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_gt',
+                        run_after_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_gt', run_after_gt))
+            
         if run_after_lte is not None:
             if isinstance(run_after_lte, datetime):
                 _query_params.append(
@@ -1403,6 +1590,19 @@
             else:
                 _query_params.append(('run_after_lte', run_after_lte))
             
+        if run_after_lt is not None:
+            if isinstance(run_after_lt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_lt',
+                        run_after_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_lt', run_after_lt))
+            
         if logical_date_gte is not None:
             if isinstance(logical_date_gte, datetime):
                 _query_params.append(
@@ -1416,6 +1616,19 @@
             else:
                 _query_params.append(('logical_date_gte', logical_date_gte))
             
+        if logical_date_gt is not None:
+            if isinstance(logical_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_gt',
+                        logical_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_gt', logical_date_gt))
+            
         if logical_date_lte is not None:
             if isinstance(logical_date_lte, datetime):
                 _query_params.append(
@@ -1429,6 +1642,19 @@
             else:
                 _query_params.append(('logical_date_lte', logical_date_lte))
             
+        if logical_date_lt is not None:
+            if isinstance(logical_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_lt',
+                        logical_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_lt', logical_date_lt))
+            
         if start_date_gte is not None:
             if isinstance(start_date_gte, datetime):
                 _query_params.append(
@@ -1442,6 +1668,19 @@
             else:
                 _query_params.append(('start_date_gte', start_date_gte))
             
+        if start_date_gt is not None:
+            if isinstance(start_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_gt',
+                        start_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_gt', start_date_gt))
+            
         if start_date_lte is not None:
             if isinstance(start_date_lte, datetime):
                 _query_params.append(
@@ -1455,6 +1694,19 @@
             else:
                 _query_params.append(('start_date_lte', start_date_lte))
             
+        if start_date_lt is not None:
+            if isinstance(start_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_lt',
+                        start_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_lt', start_date_lt))
+            
         if end_date_gte is not None:
             if isinstance(end_date_gte, datetime):
                 _query_params.append(
@@ -1468,6 +1720,19 @@
             else:
                 _query_params.append(('end_date_gte', end_date_gte))
             
+        if end_date_gt is not None:
+            if isinstance(end_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_gt',
+                        end_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_gt', end_date_gt))
+            
         if end_date_lte is not None:
             if isinstance(end_date_lte, datetime):
                 _query_params.append(
@@ -1481,6 +1746,19 @@
             else:
                 _query_params.append(('end_date_lte', end_date_lte))
             
+        if end_date_lt is not None:
+            if isinstance(end_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_lt',
+                        end_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_lt', end_date_lt))
+            
         if updated_at_gte is not None:
             if isinstance(updated_at_gte, datetime):
                 _query_params.append(
@@ -1494,6 +1772,19 @@
             else:
                 _query_params.append(('updated_at_gte', updated_at_gte))
             
+        if updated_at_gt is not None:
+            if isinstance(updated_at_gt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_gt',
+                        updated_at_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_gt', updated_at_gt))
+            
         if updated_at_lte is not None:
             if isinstance(updated_at_lte, datetime):
                 _query_params.append(
@@ -1507,6 +1798,19 @@
             else:
                 _query_params.append(('updated_at_lte', updated_at_lte))
             
+        if updated_at_lt is not None:
+            if isinstance(updated_at_lt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_lt',
+                        updated_at_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_lt', updated_at_lt))
+            
         if run_type is not None:
             
             _query_params.append(('run_type', run_type))
@@ -1515,10 +1819,22 @@
             
             _query_params.append(('state', state))
             
+        if dag_version is not None:
+            
+            _query_params.append(('dag_version', dag_version))
+            
         if order_by is not None:
             
             _query_params.append(('order_by', order_by))
             
+        if run_id_pattern is not None:
+            
+            _query_params.append(('run_id_pattern', run_id_pattern))
+            
+        if triggering_user_name_pattern is not None:
+            
+            _query_params.append(('triggering_user_name_pattern', triggering_user_name_pattern))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -1535,7 +1851,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1836,7 +2153,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2124,7 +2442,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2461,7 +2780,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2768,7 +3088,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -2787,3 +3108,328 @@
         )
 
 
+
+
+    @validate_call
+    def wait_dag_run_until_finished(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> object:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def wait_dag_run_until_finished_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[object]:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def wait_dag_run_until_finished_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _wait_dag_run_until_finished_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        interval,
+        result,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+            'result': 'multi',
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        # process the query parameters
+        if interval is not None:
+            
+            _query_params.append(('interval', interval))
+            
+        if result is not None:
+            
+            _query_params.append(('result', result))
+            
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json', 
+                    'application/x-ndjson'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
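The new `wait_dag_run_until_finished` endpoint above streams its successful response as NDJSON, one JSON object per line describing the DAG run state. A minimal sketch of consuming such a stream client-side (the payload below is illustrative, not a real server response):

```python
import json

def iter_ndjson(stream_text: str):
    """Yield one parsed JSON object per non-empty line of an NDJSON payload."""
    for line in stream_text.splitlines():
        if line.strip():
            yield json.loads(line)

# Illustrative payload shaped like the streamed DAG run state objects.
payload = '{"state": "running"}\n{"state": "running"}\n{"state": "success"}\n'

final_state = None
for obj in iter_ndjson(payload):
    final_state = obj["state"]  # the last line carries the terminal state

print(final_state)  # success
```

In practice the `_without_preload_content` variant is the one suited to this endpoint, since it hands back the raw response so lines can be read as they arrive instead of after the whole stream is buffered.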
diff --git a/airflow_client/client/api/dag_source_api.py b/airflow_client/client/api/dag_source_api.py
index 2abd7bd..575e683 100644
--- a/airflow_client/client/api/dag_source_api.py
+++ b/airflow_client/client/api/dag_source_api.py
@@ -329,7 +329,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/dag_stats_api.py b/airflow_client/client/api/dag_stats_api.py
index 30bdaa1..979dc1d 100644
--- a/airflow_client/client/api/dag_stats_api.py
+++ b/airflow_client/client/api/dag_stats_api.py
@@ -296,7 +296,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/dag_version_api.py b/airflow_client/client/api/dag_version_api.py
index 719169d..438b0b3 100644
--- a/airflow_client/client/api/dag_version_api.py
+++ b/airflow_client/client/api/dag_version_api.py
@@ -17,7 +17,7 @@
 from typing_extensions import Annotated
 
 from pydantic import Field, StrictInt, StrictStr
-from typing import Optional
+from typing import List, Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.dag_version_collection_response import DAGVersionCollectionResponse
 from airflow_client.client.models.dag_version_response import DagVersionResponse
@@ -307,7 +307,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -337,7 +338,7 @@
         version_number: Optional[StrictInt] = None,
         bundle_name: Optional[StrictStr] = None,
         bundle_version: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -368,7 +369,7 @@
         :param bundle_version:
         :type bundle_version: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -432,7 +433,7 @@
         version_number: Optional[StrictInt] = None,
         bundle_name: Optional[StrictStr] = None,
         bundle_version: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -463,7 +464,7 @@
         :param bundle_version:
         :type bundle_version: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -527,7 +528,7 @@
         version_number: Optional[StrictInt] = None,
         bundle_name: Optional[StrictStr] = None,
         bundle_version: Optional[StrictStr] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -558,7 +559,7 @@
         :param bundle_version:
         :type bundle_version: str
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -627,6 +628,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -682,7 +684,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/dag_warning_api.py b/airflow_client/client/api/dag_warning_api.py
index 2f3469d..eb99ab7 100644
--- a/airflow_client/client/api/dag_warning_api.py
+++ b/airflow_client/client/api/dag_warning_api.py
@@ -17,7 +17,7 @@
 from typing_extensions import Annotated
 
 from pydantic import Field, StrictStr
-from typing import Optional
+from typing import List, Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.dag_warning_collection_response import DAGWarningCollectionResponse
 from airflow_client.client.models.dag_warning_type import DagWarningType
@@ -47,7 +47,7 @@
         warning_type: Optional[DagWarningType] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -74,7 +74,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -133,7 +133,7 @@
         warning_type: Optional[DagWarningType] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -160,7 +160,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -219,7 +219,7 @@
         warning_type: Optional[DagWarningType] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -246,7 +246,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -310,6 +310,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -359,7 +360,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/event_log_api.py b/airflow_client/client/api/event_log_api.py
index 92696b9..161afc0 100644
--- a/airflow_client/client/api/event_log_api.py
+++ b/airflow_client/client/api/event_log_api.py
@@ -290,7 +290,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -316,7 +317,7 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         dag_id: Optional[StrictStr] = None,
         task_id: Optional[StrictStr] = None,
         run_id: Optional[StrictStr] = None,
@@ -328,6 +329,11 @@
         included_events: Optional[List[StrictStr]] = None,
         before: Optional[datetime] = None,
         after: Optional[datetime] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        owner_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        event_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -350,7 +356,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param dag_id:
         :type dag_id: str
         :param task_id:
@@ -373,6 +379,16 @@
         :type before: datetime
         :param after:
         :type after: datetime
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param owner_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type owner_pattern: str
+        :param event_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type event_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -410,6 +426,11 @@
             included_events=included_events,
             before=before,
             after=after,
+            dag_id_pattern=dag_id_pattern,
+            task_id_pattern=task_id_pattern,
+            run_id_pattern=run_id_pattern,
+            owner_pattern=owner_pattern,
+            event_pattern=event_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -438,7 +459,7 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         dag_id: Optional[StrictStr] = None,
         task_id: Optional[StrictStr] = None,
         run_id: Optional[StrictStr] = None,
@@ -450,6 +471,11 @@
         included_events: Optional[List[StrictStr]] = None,
         before: Optional[datetime] = None,
         after: Optional[datetime] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        owner_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        event_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -472,7 +498,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param dag_id:
         :type dag_id: str
         :param task_id:
@@ -495,6 +521,16 @@
         :type before: datetime
         :param after:
         :type after: datetime
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param owner_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type owner_pattern: str
+        :param event_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type event_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -532,6 +568,11 @@
             included_events=included_events,
             before=before,
             after=after,
+            dag_id_pattern=dag_id_pattern,
+            task_id_pattern=task_id_pattern,
+            run_id_pattern=run_id_pattern,
+            owner_pattern=owner_pattern,
+            event_pattern=event_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -560,7 +601,7 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         dag_id: Optional[StrictStr] = None,
         task_id: Optional[StrictStr] = None,
         run_id: Optional[StrictStr] = None,
@@ -572,6 +613,11 @@
         included_events: Optional[List[StrictStr]] = None,
         before: Optional[datetime] = None,
         after: Optional[datetime] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        owner_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        event_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -594,7 +640,7 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param dag_id:
         :type dag_id: str
         :param task_id:
@@ -617,6 +663,16 @@
         :type before: datetime
         :param after:
         :type after: datetime
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param owner_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type owner_pattern: str
+        :param event_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type event_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -654,6 +710,11 @@
             included_events=included_events,
             before=before,
             after=after,
+            dag_id_pattern=dag_id_pattern,
+            task_id_pattern=task_id_pattern,
+            run_id_pattern=run_id_pattern,
+            owner_pattern=owner_pattern,
+            event_pattern=event_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -689,6 +750,11 @@
         included_events,
         before,
         after,
+        dag_id_pattern,
+        task_id_pattern,
+        run_id_pattern,
+        owner_pattern,
+        event_pattern,
         _request_auth,
         _content_type,
         _headers,
@@ -698,6 +764,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
             'excluded_events': 'multi',
             'included_events': 'multi',
         }
@@ -787,6 +854,26 @@
             else:
                 _query_params.append(('after', after))
             
+        if dag_id_pattern is not None:
+            
+            _query_params.append(('dag_id_pattern', dag_id_pattern))
+            
+        if task_id_pattern is not None:
+            
+            _query_params.append(('task_id_pattern', task_id_pattern))
+            
+        if run_id_pattern is not None:
+            
+            _query_params.append(('run_id_pattern', run_id_pattern))
+            
+        if owner_pattern is not None:
+            
+            _query_params.append(('owner_pattern', owner_pattern))
+            
+        if event_pattern is not None:
+            
+            _query_params.append(('event_pattern', event_pattern))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -803,7 +890,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
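
The new `*_pattern` query parameters above take SQL LIKE expressions, not regular expressions. As a rough illustration of the matching semantics (the server's actual collation and case handling may differ), `%` matches any run of characters and `_` matches exactly one:

```python
import re

def sql_like_match(pattern: str, value: str) -> bool:
    """Emulate SQL LIKE matching for illustration: `%` matches any run of
    characters, `_` matches exactly one character."""
    regex = "".join(
        ".*" if ch == "%" else "." if ch == "_" else re.escape(ch)
        for ch in pattern
    )
    return re.fullmatch(regex, value) is not None

# `%customer_%` matches values containing "customer" plus at least one more char
assert sql_like_match("%customer_%", "eu_customer_orders")
assert not sql_like_match("%customer_%", "customer")  # `_` requires one more char
```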
diff --git a/airflow_client/client/api/experimental_api.py b/airflow_client/client/api/experimental_api.py
new file mode 100644
index 0000000..cd5410e
--- /dev/null
+++ b/airflow_client/client/api/experimental_api.py
@@ -0,0 +1,363 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+import warnings
+from pydantic import validate_call, Field, StrictFloat, StrictStr, StrictInt
+from typing import Any, Dict, List, Optional, Tuple, Union
+from typing_extensions import Annotated
+
+from pydantic import Field, StrictFloat, StrictInt, StrictStr
+from typing import Any, List, Optional, Union
+from typing_extensions import Annotated
+
+from airflow_client.client.api_client import ApiClient, RequestSerialized
+from airflow_client.client.api_response import ApiResponse
+from airflow_client.client.rest import RESTResponseType
+
+
+class ExperimentalApi:
+    """NOTE: This class is auto generated by OpenAPI Generator
+    Ref: https://openapi-generator.tech
+
+    Do not edit the class manually.
+    """
+
+    def __init__(self, api_client=None) -> None:
+        if api_client is None:
+            api_client = ApiClient.get_default()
+        self.api_client = api_client
+
+
+    @validate_call
+    def wait_dag_run_until_finished(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> object:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def wait_dag_run_until_finished_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[object]:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def wait_dag_run_until_finished_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        interval: Annotated[Union[StrictFloat, StrictInt], Field(description="Seconds to wait between dag run state checks")],
+        result: Annotated[Optional[List[StrictStr]], Field(description="Collect result XCom from task. Can be set multiple times.")] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Experimental: Wait for a dag run to complete, and return task results if requested.
+
+        🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param interval: Seconds to wait between dag run state checks (required)
+        :type interval: float
+        :param result: Collect result XCom from task. Can be set multiple times.
+        :type result: List[str]
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._wait_dag_run_until_finished_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            interval=interval,
+            result=result,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _wait_dag_run_until_finished_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        interval,
+        result,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+            'result': 'multi',
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        # process the query parameters
+        if interval is not None:
+            
+            _query_params.append(('interval', interval))
+            
+        if result is not None:
+            
+            _query_params.append(('result', result))
+            
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json', 
+                    'application/x-ndjson'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
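
Per the docstring above, successful responses from the new `wait` endpoint are streamed as NDJSON, one JSON object per line. A minimal sketch of consuming such a body (the sample payload keys here are illustrative, not taken from the API spec; with the real client, the `_without_preload_content` variant returns the raw response whose body can be fed to a parser like this):

```python
import json

# Sample NDJSON body standing in for a live server response; the
# "state" key is an assumption for illustration only.
ndjson_body = (
    b'{"state": "running"}\n'
    b'{"state": "running"}\n'
    b'{"state": "success"}\n'
)

def iter_ndjson(body: bytes):
    """Yield one decoded JSON object per non-empty NDJSON line."""
    for line in body.splitlines():
        if line.strip():
            yield json.loads(line)

states = [obj["state"] for obj in iter_ndjson(ndjson_body)]
assert states[-1] == "success"
```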
diff --git a/airflow_client/client/api/extra_links_api.py b/airflow_client/client/api/extra_links_api.py
index 3da284c..504f4db 100644
--- a/airflow_client/client/api/extra_links_api.py
+++ b/airflow_client/client/api/extra_links_api.py
@@ -337,7 +337,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/import_error_api.py b/airflow_client/client/api/import_error_api.py
index 8e9f668..b63dbdd 100644
--- a/airflow_client/client/api/import_error_api.py
+++ b/airflow_client/client/api/import_error_api.py
@@ -17,7 +17,7 @@
 from typing_extensions import Annotated
 
 from pydantic import Field, StrictInt, StrictStr
-from typing import Optional
+from typing import List, Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.import_error_collection_response import ImportErrorCollectionResponse
 from airflow_client.client.models.import_error_response import ImportErrorResponse
@@ -292,7 +292,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -318,7 +319,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        filename_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -341,7 +343,9 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param filename_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type filename_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -368,6 +372,7 @@
             limit=limit,
             offset=offset,
             order_by=order_by,
+            filename_pattern=filename_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -396,7 +401,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        filename_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -419,7 +425,9 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param filename_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type filename_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -446,6 +454,7 @@
             limit=limit,
             offset=offset,
             order_by=order_by,
+            filename_pattern=filename_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -474,7 +483,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        filename_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -497,7 +507,9 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
+        :param filename_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type filename_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -524,6 +536,7 @@
             limit=limit,
             offset=offset,
             order_by=order_by,
+            filename_pattern=filename_pattern,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -548,6 +561,7 @@
         limit,
         offset,
         order_by,
+        filename_pattern,
         _request_auth,
         _content_type,
         _headers,
@@ -557,6 +571,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -582,6 +597,10 @@
             
             _query_params.append(('order_by', order_by))
             
+        if filename_pattern is not None:
+            
+            _query_params.append(('filename_pattern', filename_pattern))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -598,7 +617,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
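
The `order_by` change above (from `StrictStr` to `List[StrictStr]`, registered with the `'multi'` collection format) means each list entry is serialized as a repeated query parameter. A small sketch of that serialization, assuming the usual repeated-parameter encoding:

```python
from urllib.parse import urlencode

def serialize_multi(name: str, values: list[str]) -> list[tuple[str, str]]:
    """The 'multi' collection format repeats the parameter once per value,
    e.g. order_by=["-timestamp", "dag_id"] becomes two order_by pairs."""
    return [(name, v) for v in values]

params = serialize_multi("order_by", ["-timestamp", "dag_id"])
query = urlencode(params)
assert query == "order_by=-timestamp&order_by=dag_id"
```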
diff --git a/airflow_client/client/api/job_api.py b/airflow_client/client/api/job_api.py
index db0531e..6c0c727 100644
--- a/airflow_client/client/api/job_api.py
+++ b/airflow_client/client/api/job_api.py
@@ -18,7 +18,7 @@
 
 from datetime import datetime
 from pydantic import Field, StrictBool, StrictStr
-from typing import Optional
+from typing import List, Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.job_collection_response import JobCollectionResponse
 
@@ -45,12 +45,16 @@
         self,
         is_alive: Optional[StrictBool] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         job_state: Optional[StrictStr] = None,
         job_type: Optional[StrictStr] = None,
         hostname: Optional[StrictStr] = None,
@@ -76,18 +80,26 @@
         :type is_alive: bool
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param job_state:
         :type job_state: str
         :param job_type:
@@ -121,9 +133,13 @@
         _param = self._get_jobs_serialize(
             is_alive=is_alive,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -160,12 +176,16 @@
         self,
         is_alive: Optional[StrictBool] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         job_state: Optional[StrictStr] = None,
         job_type: Optional[StrictStr] = None,
         hostname: Optional[StrictStr] = None,
@@ -191,18 +211,26 @@
         :type is_alive: bool
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param job_state:
         :type job_state: str
         :param job_type:
@@ -236,9 +264,13 @@
         _param = self._get_jobs_serialize(
             is_alive=is_alive,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -275,12 +307,16 @@
         self,
         is_alive: Optional[StrictBool] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         job_state: Optional[StrictStr] = None,
         job_type: Optional[StrictStr] = None,
         hostname: Optional[StrictStr] = None,
@@ -306,18 +342,26 @@
         :type is_alive: bool
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param job_state:
         :type job_state: str
         :param job_type:
@@ -351,9 +395,13 @@
         _param = self._get_jobs_serialize(
             is_alive=is_alive,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -385,9 +433,13 @@
         self,
         is_alive,
         start_date_gte,
+        start_date_gt,
         start_date_lte,
+        start_date_lt,
         end_date_gte,
+        end_date_gt,
         end_date_lte,
+        end_date_lt,
         limit,
         offset,
         order_by,
@@ -404,6 +456,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -434,6 +487,19 @@
             else:
                 _query_params.append(('start_date_gte', start_date_gte))
             
+        if start_date_gt is not None:
+            if isinstance(start_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_gt',
+                        start_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_gt', start_date_gt))
+            
         if start_date_lte is not None:
             if isinstance(start_date_lte, datetime):
                 _query_params.append(
@@ -447,6 +513,19 @@
             else:
                 _query_params.append(('start_date_lte', start_date_lte))
             
+        if start_date_lt is not None:
+            if isinstance(start_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_lt',
+                        start_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_lt', start_date_lt))
+            
         if end_date_gte is not None:
             if isinstance(end_date_gte, datetime):
                 _query_params.append(
@@ -460,6 +539,19 @@
             else:
                 _query_params.append(('end_date_gte', end_date_gte))
             
+        if end_date_gt is not None:
+            if isinstance(end_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_gt',
+                        end_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_gt', end_date_gt))
+            
         if end_date_lte is not None:
             if isinstance(end_date_lte, datetime):
                 _query_params.append(
@@ -473,6 +565,19 @@
             else:
                 _query_params.append(('end_date_lte', end_date_lte))
             
+        if end_date_lt is not None:
+            if isinstance(end_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_lt',
+                        end_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_lt', end_date_lt))
+            
         if limit is not None:
             
             _query_params.append(('limit', limit))
@@ -517,7 +622,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
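The hunks above replace the scalar `order_by` with a list (serialized with the `'multi'` collection format, i.e. one repeated query key per element) and add strict `_gt`/`_lt` counterparts to the existing `_gte`/`_lte` datetime filters. A minimal sketch of the resulting query-string shape; the helper and the datetime format default are illustrative stand-ins, not the client's actual serializer:

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Illustrative stand-in for the generated serializer: 'multi' expands a
# list into repeated query keys; datetime filters are rendered via strftime.
def build_jobs_query(order_by=None, **date_filters):
    datetime_format = "%Y-%m-%dT%H:%M:%S.%f%z"  # placeholder for configuration.datetime_format
    params = [
        (key, value.strftime(datetime_format) if isinstance(value, datetime) else value)
        for key, value in date_filters.items()
    ]
    for field in order_by or []:  # one ('order_by', ...) pair per element
        params.append(("order_by", field))
    return urlencode(params)

query = build_jobs_query(
    start_date_gt=datetime(2025, 1, 1, tzinfo=timezone.utc),
    order_by=["-start_date", "hostname"],
)
```

Servers typically treat the `_gt`/`_lt` variants as exclusive bounds, while `_gte`/`_lte` remain inclusive.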
diff --git a/airflow_client/client/api/login_api.py b/airflow_client/client/api/login_api.py
index a4b362a..e1103cb 100644
--- a/airflow_client/client/api/login_api.py
+++ b/airflow_client/client/api/login_api.py
@@ -571,3 +571,271 @@
         )
 
 
+
+
+    @validate_call
+    def refresh(
+        self,
+        next: Optional[StrictStr] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> object:
+        """Refresh
+
+        Refresh the authentication token.
+
+        :param next:
+        :type next: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._refresh_serialize(
+            next=next,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '307': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def refresh_with_http_info(
+        self,
+        next: Optional[StrictStr] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[object]:
+        """Refresh
+
+        Refresh the authentication token.
+
+        :param next:
+        :type next: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._refresh_serialize(
+            next=next,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '307': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def refresh_without_preload_content(
+        self,
+        next: Optional[StrictStr] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Refresh
+
+        Refresh the authentication token.
+
+        :param next:
+        :type next: str
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._refresh_serialize(
+            next=next,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '307': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _refresh_serialize(
+        self,
+        next,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        # process the query parameters
+        if next is not None:
+            
+            _query_params.append(('next', next))
+            
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/auth/refresh',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
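The new `refresh` methods issue a `GET /api/v2/auth/refresh` with `next` as the only query parameter. A sketch of the URL they produce; the base URL is a placeholder:

```python
from urllib.parse import urlencode, urljoin

# Illustrative only: mirrors the path and query parameter from _refresh_serialize.
def refresh_url(base_url, next_url=None):
    url = urljoin(base_url, "/api/v2/auth/refresh")
    if next_url is not None:
        url = f"{url}?{urlencode([('next', next_url)])}"
    return url

url = refresh_url("http://localhost:8080", next_url="/dags")
```

Note that `_auth_settings` is empty for this route, so the client attaches no bearer credentials of its own to the refresh request.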
diff --git a/airflow_client/client/api/plugin_api.py b/airflow_client/client/api/plugin_api.py
index b577a24..3f03801 100644
--- a/airflow_client/client/api/plugin_api.py
+++ b/airflow_client/client/api/plugin_api.py
@@ -20,6 +20,7 @@
 from typing import Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.plugin_collection_response import PluginCollectionResponse
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
 
 from airflow_client.client.api_client import ApiClient, RequestSerialized
 from airflow_client.client.api_response import ApiResponse
@@ -304,7 +305,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -323,3 +325,253 @@
         )
 
 
+
+
+    @validate_call
+    def import_errors(
+        self,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> PluginImportErrorCollectionResponse:
+        """Import Errors
+
+
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._import_errors_serialize(
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "PluginImportErrorCollectionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def import_errors_with_http_info(
+        self,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[PluginImportErrorCollectionResponse]:
+        """Import Errors
+
+
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._import_errors_serialize(
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "PluginImportErrorCollectionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def import_errors_without_preload_content(
+        self,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Import Errors
+
+
+        :param _request_timeout: timeout setting for this request. If one
+                                 number is provided, it is the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._import_errors_serialize(
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "PluginImportErrorCollectionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _import_errors_serialize(
+        self,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/plugins/importErrors',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
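All three `import_errors` variants share one status-code to response-model routing table, which `response_deserialize` consults to pick a model class. A toy version of that dispatch; the mapping is copied from the diff, the lookup itself is simplified:

```python
# Status-code -> response-model routing, as wired in the new endpoint.
RESPONSE_TYPES = {
    "200": "PluginImportErrorCollectionResponse",
    "401": "HTTPExceptionResponse",
    "403": "HTTPExceptionResponse",
}

def model_for(status_code):
    # response_deserialize consults a map like this to choose the model;
    # unmapped codes are handled by the client's generic error path.
    return RESPONSE_TYPES.get(str(status_code))
```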
diff --git a/airflow_client/client/api/pool_api.py b/airflow_client/client/api/pool_api.py
index be2909d..50ac7f7 100644
--- a/airflow_client/client/api/pool_api.py
+++ b/airflow_client/client/api/pool_api.py
@@ -306,7 +306,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -582,7 +583,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -855,7 +857,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -881,8 +884,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        pool_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        pool_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -905,8 +908,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param pool_name_pattern:
+        :type order_by: List[str]
+        :param pool_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type pool_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -964,8 +967,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        pool_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        pool_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -988,8 +991,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param pool_name_pattern:
+        :type order_by: List[str]
+        :param pool_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type pool_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1047,8 +1050,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        pool_name_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        pool_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1071,8 +1074,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param pool_name_pattern:
+        :type order_by: List[str]
+        :param pool_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type pool_name_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1136,6 +1139,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1181,7 +1185,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1503,7 +1508,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1789,7 +1795,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/provider_api.py b/airflow_client/client/api/provider_api.py
index 40ca1c3..50f9b1d 100644
--- a/airflow_client/client/api/provider_api.py
+++ b/airflow_client/client/api/provider_api.py
@@ -307,7 +307,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/task_api.py b/airflow_client/client/api/task_api.py
index b44e4f8..c0470ca 100644
--- a/airflow_client/client/api/task_api.py
+++ b/airflow_client/client/api/task_api.py
@@ -309,7 +309,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -602,7 +603,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
diff --git a/airflow_client/client/api/task_instance_api.py b/airflow_client/client/api/task_instance_api.py
index fcbbb7e..8d3211f 100644
--- a/airflow_client/client/api/task_instance_api.py
+++ b/airflow_client/client/api/task_instance_api.py
@@ -18,10 +18,16 @@
 
 from datetime import datetime
 from pydantic import Field, StrictBool, StrictFloat, StrictInt, StrictStr, field_validator
-from typing import List, Optional, Union
+from typing import Any, List, Optional, Union
 from typing_extensions import Annotated
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+from airflow_client.client.models.bulk_response import BulkResponse
 from airflow_client.client.models.clear_task_instances_body import ClearTaskInstancesBody
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
 from airflow_client.client.models.extra_link_collection_response import ExtraLinkCollectionResponse
+from airflow_client.client.models.hitl_detail import HITLDetail
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
 from airflow_client.client.models.patch_task_instance_body import PatchTaskInstanceBody
 from airflow_client.client.models.task_dependency_collection_response import TaskDependencyCollectionResponse
 from airflow_client.client.models.task_instance_collection_response import TaskInstanceCollectionResponse
@@ -30,6 +36,7 @@
 from airflow_client.client.models.task_instance_response import TaskInstanceResponse
 from airflow_client.client.models.task_instances_batch_body import TaskInstancesBatchBody
 from airflow_client.client.models.task_instances_log_response import TaskInstancesLogResponse
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
 
 from airflow_client.client.api_client import ApiClient, RequestSerialized
 from airflow_client.client.api_response import ApiResponse
@@ -50,6 +57,980 @@
 
 
     @validate_call
+    def bulk_task_instances(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> BulkResponse:
+        """Bulk Task Instances
+
+        Bulk update and delete task instances.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param bulk_body_bulk_task_instance_body: (required)
+        :type bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._bulk_task_instances_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            bulk_body_bulk_task_instance_body=bulk_body_bulk_task_instance_body,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "BulkResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def bulk_task_instances_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[BulkResponse]:
+        """Bulk Task Instances
+
+        Bulk update and delete task instances.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param bulk_body_bulk_task_instance_body: (required)
+        :type bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._bulk_task_instances_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            bulk_body_bulk_task_instance_body=bulk_body_bulk_task_instance_body,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "BulkResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def bulk_task_instances_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Bulk Task Instances
+
+        Bulk update and delete task instances.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param bulk_body_bulk_task_instance_body: (required)
+        :type bulk_body_bulk_task_instance_body: BulkBodyBulkTaskInstanceBody
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._bulk_task_instances_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            bulk_body_bulk_task_instance_body=bulk_body_bulk_task_instance_body,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "BulkResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _bulk_task_instances_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        bulk_body_bulk_task_instance_body,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+        if bulk_body_bulk_task_instance_body is not None:
+            _body_params = bulk_body_bulk_task_instance_body
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+        # set the HTTP header `Content-Type`
+        if _content_type:
+            _header_params['Content-Type'] = _content_type
+        else:
+            _default_content_type = (
+                self.api_client.select_header_content_type(
+                    [
+                        'application/json'
+                    ]
+                )
+            )
+            if _default_content_type is not None:
+                _header_params['Content-Type'] = _default_content_type
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='PATCH',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
+    @validate_call
+    def delete_task_instance(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> object:
+        """Delete Task Instance
+
+        Delete a task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._delete_task_instance_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def delete_task_instance_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[object]:
+        """Delete Task Instance
+
+        Delete a task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._delete_task_instance_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def delete_task_instance_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Delete Task Instance
+
+        Delete a task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._delete_task_instance_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "object",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _delete_task_instance_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        task_id,
+        map_index,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        if task_id is not None:
+            _path_params['task_id'] = task_id
+        # process the query parameters
+        if map_index is not None:
+
+            _query_params.append(('map_index', map_index))
+
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer',
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='DELETE',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
+    @validate_call
+    def get_external_log_url(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        try_number: StrictInt,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ExternalLogUrlResponse:
+        """Get External Log Url
+
+        Get external log URL for a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param try_number: (required)
+        :type try_number: int
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_external_log_url_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            try_number=try_number,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "ExternalLogUrlResponse",
+            '400': "HTTPExceptionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
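+
+    # Usage sketch (illustrative, not part of the generated client): fetch the
+    # external log URL for a specific try of a task instance. Assumes `client`
+    # is an already-configured ``ApiClient`` and that this method lives on the
+    # task-instance API class (name shown here is an assumption):
+    #
+    #     api = TaskInstanceApi(client)
+    #     resp = api.get_external_log_url(
+    #         dag_id="example_dag",
+    #         dag_run_id="manual__2025-01-01T00:00:00+00:00",
+    #         task_id="extract",
+    #         try_number=1,
+    #     )
+    #     print(resp)  # ExternalLogUrlResponse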
+
+
+    @validate_call
+    def get_external_log_url_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        try_number: StrictInt,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[ExternalLogUrlResponse]:
+        """Get External Log Url
+
+        Get external log URL for a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param try_number: (required)
+        :type try_number: int
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_external_log_url_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            try_number=try_number,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "ExternalLogUrlResponse",
+            '400': "HTTPExceptionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def get_external_log_url_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        try_number: StrictInt,
+        map_index: Optional[StrictInt] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Get External Log Url
+
+        Get external log URL for a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param try_number: (required)
+        :type try_number: int
+        :param map_index:
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_external_log_url_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            try_number=try_number,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "ExternalLogUrlResponse",
+            '400': "HTTPExceptionResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _get_external_log_url_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        task_id,
+        try_number,
+        map_index,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        if task_id is not None:
+            _path_params['task_id'] = task_id
+        if try_number is not None:
+            _path_params['try_number'] = try_number
+        # process the query parameters
+        if map_index is not None:
+            _query_params.append(('map_index', map_index))
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/externalLogUrl/{try_number}',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
+    @validate_call
     def get_extra_links(
         self,
         dag_id: StrictStr,
@@ -348,7 +1329,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -370,12 +1352,946 @@
 
 
     @validate_call
+    def get_hitl_detail(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> HITLDetail:
+        """Get Hitl Detail
+
+        Get a Human-in-the-loop detail of a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetail",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
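+
+    # Usage sketch (illustrative, not part of the generated client): fetch the
+    # HITL detail for a single task instance. The API takes ``map_index`` as a
+    # required path parameter; for unmapped tasks Airflow conventionally uses
+    # ``-1`` (assumption based on task-instance semantics, verify against the
+    # server's OpenAPI spec):
+    #
+    #     detail = api.get_hitl_detail(
+    #         dag_id="example_dag",
+    #         dag_run_id="manual__2025-01-01T00:00:00+00:00",
+    #         task_id="approve_step",
+    #         map_index=-1,
+    #     )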
+
+
+    @validate_call
+    def get_hitl_detail_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[HITLDetail]:
+        """Get Hitl Detail
+
+        Get a Human-in-the-loop detail of a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetail",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def get_hitl_detail_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Get Hitl Detail
+
+        Get a Human-in-the-loop detail of a specific task instance.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetail",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _get_hitl_detail_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        task_id,
+        map_index,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        if task_id is not None:
+            _path_params['task_id'] = task_id
+        if map_index is not None:
+            _path_params['map_index'] = map_index
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/hitlDetails',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
+    @validate_call
+    def get_hitl_details(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id: Optional[StrictStr] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index: Optional[StrictInt] = None,
+        state: Optional[List[StrictStr]] = None,
+        response_received: Optional[StrictBool] = None,
+        responded_by_user_id: Optional[List[StrictStr]] = None,
+        responded_by_user_name: Optional[List[StrictStr]] = None,
+        subject_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        body_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        created_at_gte: Optional[datetime] = None,
+        created_at_gt: Optional[datetime] = None,
+        created_at_lte: Optional[datetime] = None,
+        created_at_lt: Optional[datetime] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> HITLDetailCollection:
+        """Get Hitl Details
+
+        Get Human-in-the-loop details.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param limit:
+        :type limit: int
+        :param offset:
+        :type offset: int
+        :param order_by:
+        :type order_by: List[str]
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id:
+        :type task_id: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index:
+        :type map_index: int
+        :param state:
+        :type state: List[str]
+        :param response_received:
+        :type response_received: bool
+        :param responded_by_user_id:
+        :type responded_by_user_id: List[str]
+        :param responded_by_user_name:
+        :type responded_by_user_name: List[str]
+        :param subject_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type subject_search: str
+        :param body_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type body_search: str
+        :param created_at_gte:
+        :type created_at_gte: datetime
+        :param created_at_gt:
+        :type created_at_gt: datetime
+        :param created_at_lte:
+        :type created_at_lte: datetime
+        :param created_at_lt:
+        :type created_at_lt: datetime
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it will be the total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_details_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            limit=limit,
+            offset=offset,
+            order_by=order_by,
+            dag_id_pattern=dag_id_pattern,
+            task_id=task_id,
+            task_id_pattern=task_id_pattern,
+            map_index=map_index,
+            state=state,
+            response_received=response_received,
+            responded_by_user_id=responded_by_user_id,
+            responded_by_user_name=responded_by_user_name,
+            subject_search=subject_search,
+            body_search=body_search,
+            created_at_gte=created_at_gte,
+            created_at_gt=created_at_gt,
+            created_at_lte=created_at_lte,
+            created_at_lt=created_at_lt,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailCollection",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
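+
+    # Usage sketch (illustrative, not part of the generated client): list
+    # pending HITL details for a DAG run. Filter names come from this method's
+    # signature; the values shown are examples only. Pattern filters are SQL
+    # LIKE expressions (``%`` / ``_`` wildcards), not regular expressions:
+    #
+    #     pending = api.get_hitl_details(
+    #         dag_id="example_dag",
+    #         dag_run_id="manual__2025-01-01T00:00:00+00:00",
+    #         response_received=False,
+    #         task_id_pattern="approve_%",
+    #         limit=50,
+    #     )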
+
+
+    @validate_call
+    def get_hitl_details_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id: Optional[StrictStr] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index: Optional[StrictInt] = None,
+        state: Optional[List[StrictStr]] = None,
+        response_received: Optional[StrictBool] = None,
+        responded_by_user_id: Optional[List[StrictStr]] = None,
+        responded_by_user_name: Optional[List[StrictStr]] = None,
+        subject_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        body_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        created_at_gte: Optional[datetime] = None,
+        created_at_gt: Optional[datetime] = None,
+        created_at_lte: Optional[datetime] = None,
+        created_at_lt: Optional[datetime] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[HITLDetailCollection]:
+        """Get Hitl Details
+
+        Get Human-in-the-loop details.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param limit:
+        :type limit: int
+        :param offset:
+        :type offset: int
+        :param order_by:
+        :type order_by: List[str]
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id:
+        :type task_id: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index:
+        :type map_index: int
+        :param state:
+        :type state: List[str]
+        :param response_received:
+        :type response_received: bool
+        :param responded_by_user_id:
+        :type responded_by_user_id: List[str]
+        :param responded_by_user_name:
+        :type responded_by_user_name: List[str]
+        :param subject_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type subject_search: str
+        :param body_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type body_search: str
+        :param created_at_gte:
+        :type created_at_gte: datetime
+        :param created_at_gt:
+        :type created_at_gt: datetime
+        :param created_at_lte:
+        :type created_at_lte: datetime
+        :param created_at_lt:
+        :type created_at_lt: datetime
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it is used as the total
+                                 request timeout. It can also be a pair (tuple)
+                                 of (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_details_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            limit=limit,
+            offset=offset,
+            order_by=order_by,
+            dag_id_pattern=dag_id_pattern,
+            task_id=task_id,
+            task_id_pattern=task_id_pattern,
+            map_index=map_index,
+            state=state,
+            response_received=response_received,
+            responded_by_user_id=responded_by_user_id,
+            responded_by_user_name=responded_by_user_name,
+            subject_search=subject_search,
+            body_search=body_search,
+            created_at_gte=created_at_gte,
+            created_at_gt=created_at_gt,
+            created_at_lte=created_at_lte,
+            created_at_lt=created_at_lt,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailCollection",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def get_hitl_details_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        dag_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id: Optional[StrictStr] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index: Optional[StrictInt] = None,
+        state: Optional[List[StrictStr]] = None,
+        response_received: Optional[StrictBool] = None,
+        responded_by_user_id: Optional[List[StrictStr]] = None,
+        responded_by_user_name: Optional[List[StrictStr]] = None,
+        subject_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        body_search: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        created_at_gte: Optional[datetime] = None,
+        created_at_gt: Optional[datetime] = None,
+        created_at_lte: Optional[datetime] = None,
+        created_at_lt: Optional[datetime] = None,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Get Hitl Details
+
+        Get Human-in-the-loop details.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param limit:
+        :type limit: int
+        :param offset:
+        :type offset: int
+        :param order_by:
+        :type order_by: List[str]
+        :param dag_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_id_pattern: str
+        :param task_id:
+        :type task_id: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index:
+        :type map_index: int
+        :param state:
+        :type state: List[str]
+        :param response_received:
+        :type response_received: bool
+        :param responded_by_user_id:
+        :type responded_by_user_id: List[str]
+        :param responded_by_user_name:
+        :type responded_by_user_name: List[str]
+        :param subject_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type subject_search: str
+        :param body_search: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type body_search: str
+        :param created_at_gte:
+        :type created_at_gte: datetime
+        :param created_at_gt:
+        :type created_at_gt: datetime
+        :param created_at_lte:
+        :type created_at_lte: datetime
+        :param created_at_lt:
+        :type created_at_lt: datetime
+        :param _request_timeout: timeout setting for this request. If a single
+                                 number is provided, it is used as the total
+                                 request timeout. It can also be a pair (tuple)
+                                 of (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._get_hitl_details_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            limit=limit,
+            offset=offset,
+            order_by=order_by,
+            dag_id_pattern=dag_id_pattern,
+            task_id=task_id,
+            task_id_pattern=task_id_pattern,
+            map_index=map_index,
+            state=state,
+            response_received=response_received,
+            responded_by_user_id=responded_by_user_id,
+            responded_by_user_name=responded_by_user_name,
+            subject_search=subject_search,
+            body_search=body_search,
+            created_at_gte=created_at_gte,
+            created_at_gt=created_at_gt,
+            created_at_lte=created_at_lte,
+            created_at_lt=created_at_lt,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailCollection",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _get_hitl_details_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        limit,
+        offset,
+        order_by,
+        dag_id_pattern,
+        task_id,
+        task_id_pattern,
+        map_index,
+        state,
+        response_received,
+        responded_by_user_id,
+        responded_by_user_name,
+        subject_search,
+        body_search,
+        created_at_gte,
+        created_at_gt,
+        created_at_lte,
+        created_at_lt,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
+            'state': 'multi',
+            'responded_by_user_id': 'multi',
+            'responded_by_user_name': 'multi',
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        # process the query parameters
+        if limit is not None:
+            
+            _query_params.append(('limit', limit))
+            
+        if offset is not None:
+            
+            _query_params.append(('offset', offset))
+            
+        if order_by is not None:
+            
+            _query_params.append(('order_by', order_by))
+            
+        if dag_id_pattern is not None:
+            
+            _query_params.append(('dag_id_pattern', dag_id_pattern))
+            
+        if task_id is not None:
+            
+            _query_params.append(('task_id', task_id))
+            
+        if task_id_pattern is not None:
+            
+            _query_params.append(('task_id_pattern', task_id_pattern))
+            
+        if map_index is not None:
+            
+            _query_params.append(('map_index', map_index))
+            
+        if state is not None:
+            
+            _query_params.append(('state', state))
+            
+        if response_received is not None:
+            
+            _query_params.append(('response_received', response_received))
+            
+        if responded_by_user_id is not None:
+            
+            _query_params.append(('responded_by_user_id', responded_by_user_id))
+            
+        if responded_by_user_name is not None:
+            
+            _query_params.append(('responded_by_user_name', responded_by_user_name))
+            
+        if subject_search is not None:
+            
+            _query_params.append(('subject_search', subject_search))
+            
+        if body_search is not None:
+            
+            _query_params.append(('body_search', body_search))
+            
+        if created_at_gte is not None:
+            if isinstance(created_at_gte, datetime):
+                _query_params.append(
+                    (
+                        'created_at_gte',
+                        created_at_gte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('created_at_gte', created_at_gte))
+            
+        if created_at_gt is not None:
+            if isinstance(created_at_gt, datetime):
+                _query_params.append(
+                    (
+                        'created_at_gt',
+                        created_at_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('created_at_gt', created_at_gt))
+            
+        if created_at_lte is not None:
+            if isinstance(created_at_lte, datetime):
+                _query_params.append(
+                    (
+                        'created_at_lte',
+                        created_at_lte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('created_at_lte', created_at_lte))
+            
+        if created_at_lt is not None:
+            if isinstance(created_at_lt, datetime):
+                _query_params.append(
+                    (
+                        'created_at_lt',
+                        created_at_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('created_at_lt', created_at_lt))
+            
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='GET',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/hitlDetails',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
+
+
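The `_get_hitl_details_serialize` helper above maps list-valued filters with collection format `multi` (the key is repeated once per value) and passes LIKE patterns through verbatim. A hedged local sketch of how those parameters land on the query string, not a call into the generated client; the helper name `build_hitl_query` is hypothetical:

```python
from urllib.parse import urlencode

def build_hitl_query(params: list[tuple[str, object]]) -> str:
    """Expand list values the way collection format 'multi' does."""
    flat: list[tuple[str, str]] = []
    for key, value in params:
        if isinstance(value, (list, tuple)):
            # 'multi': repeat the key once per item, e.g. state=a&state=b
            flat.extend((key, str(v)) for v in value)
        else:
            flat.append((key, str(value)))
    return urlencode(flat)

query = build_hitl_query([
    ("subject_search", "%deploy%"),       # SQL LIKE wildcards, not regex
    ("state", ["pending", "responded"]),  # multi-valued filter
    ("limit", 50),
])
```

Note that `urlencode` percent-encodes the `%` wildcard itself (`%25`), so the raw LIKE pattern survives the round trip to the server.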
+    @validate_call
     def get_log(
         self,
         dag_id: StrictStr,
         dag_run_id: StrictStr,
         task_id: StrictStr,
-        try_number: StrictInt,
+        try_number: Annotated[int, Field(strict=True, ge=0)],
         full_content: Optional[StrictBool] = None,
         map_index: Optional[StrictInt] = None,
         token: Optional[StrictStr] = None,
@@ -474,7 +2390,7 @@
         dag_id: StrictStr,
         dag_run_id: StrictStr,
         task_id: StrictStr,
-        try_number: StrictInt,
+        try_number: Annotated[int, Field(strict=True, ge=0)],
         full_content: Optional[StrictBool] = None,
         map_index: Optional[StrictInt] = None,
         token: Optional[StrictStr] = None,
@@ -573,7 +2489,7 @@
         dag_id: StrictStr,
         dag_run_id: StrictStr,
         task_id: StrictStr,
-        try_number: StrictInt,
+        try_number: Annotated[int, Field(strict=True, ge=0)],
         full_content: Optional[StrictBool] = None,
         map_index: Optional[StrictInt] = None,
         token: Optional[StrictStr] = None,
@@ -733,7 +2649,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1051,7 +2968,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1366,7 +3284,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1696,7 +3615,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
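The hunks below add exclusive `*_gt` / `*_lt` bounds alongside the existing inclusive `*_gte` / `*_lte` filters. The serializer renders each `datetime` bound with `configuration.datetime_format` and passes pre-formatted strings through untouched. A minimal local sketch of that branch; the `DATETIME_FORMAT` value here is an assumption for illustration, the real one comes from the client configuration:

```python
from datetime import datetime, timezone

# Hypothetical stand-in for api_client.configuration.datetime_format.
DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%S%z"

def render_datetime_param(name: str, value) -> tuple[str, str]:
    # Mirrors the serializer's branch: datetimes are formatted,
    # pre-formatted strings pass through unchanged.
    if isinstance(value, datetime):
        return (name, value.strftime(DATETIME_FORMAT))
    return (name, value)

bound = datetime(2025, 1, 2, 3, 4, 5, tzinfo=timezone.utc)
param = render_datetime_param("start_date_gt", bound)
```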
@@ -1724,25 +3644,40 @@
         dag_run_id: StrictStr,
         task_id: StrictStr,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1768,28 +3703,52 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
+        :param duration_lt:
+        :type duration_lt: float
         :param state:
         :type state: List[str]
         :param pool:
@@ -1800,12 +3759,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1833,22 +3798,37 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
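The `order_by` parameter changes from `str` to `List[str]` in this release, enabling multi-column sorting (AIP-84). Assuming the usual Airflow REST API convention that a leading `-` means descending (an assumption here, not stated in this diff), the server-side ordering behaves roughly like this local sketch over plain dicts:

```python
def multi_sort(rows: list[dict], order_by: list[str]) -> list[dict]:
    """Apply sort keys in priority order; '-' prefix means descending."""
    # Stable sorts compose: apply the lowest-priority key first.
    for key in reversed(order_by):
        field = key.lstrip("-")
        rows = sorted(rows, key=lambda r: r[field], reverse=key.startswith("-"))
    return rows

rows = [
    {"state": "success", "try_number": 2},
    {"state": "failed", "try_number": 1},
    {"state": "failed", "try_number": 3},
]
# Sort by state ascending, then try_number descending within each state.
ordered = multi_sort(rows, ["state", "-try_number"])
```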
@@ -1883,25 +3863,40 @@
         dag_run_id: StrictStr,
         task_id: StrictStr,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1927,28 +3922,52 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
+        :param duration_lt:
+        :type duration_lt: float
         :param state:
         :type state: List[str]
         :param pool:
@@ -1959,12 +3978,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -1992,22 +4017,37 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -2042,25 +4082,40 @@
         dag_run_id: StrictStr,
         task_id: StrictStr,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -2086,28 +4141,52 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
+        :param duration_lt:
+        :type duration_lt: float
         :param state:
         :type state: List[str]
         :param pool:
@@ -2118,12 +4197,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -2151,22 +4236,37 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -2196,22 +4296,37 @@
         dag_run_id,
         task_id,
         run_after_gte,
+        run_after_gt,
         run_after_lte,
+        run_after_lt,
         logical_date_gte,
+        logical_date_gt,
         logical_date_lte,
+        logical_date_lt,
         start_date_gte,
+        start_date_gt,
         start_date_lte,
+        start_date_lt,
         end_date_gte,
+        end_date_gt,
         end_date_lte,
+        end_date_lt,
         updated_at_gte,
+        updated_at_gt,
         updated_at_lte,
+        updated_at_lt,
         duration_gte,
+        duration_gt,
         duration_lte,
+        duration_lt,
         state,
         pool,
         queue,
         executor,
         version_number,
+        try_number,
+        operator,
+        map_index,
         limit,
         offset,
         order_by,
@@ -2229,6 +4344,10 @@
             'queue': 'multi',
             'executor': 'multi',
             'version_number': 'multi',
+            'try_number': 'multi',
+            'operator': 'multi',
+            'map_index': 'multi',
+            'order_by': 'multi',
         }
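The `'multi'` entries in `_collection_formats` above (including the newly added `try_number`, `operator`, `map_index`, and `order_by`) tell the serializer to emit one query-string pair per list element rather than a single joined value; this is what makes the new multi-sort `order_by: List[str]` work. A minimal stdlib sketch of that encoding (parameter values are illustrative, not from the source):

```python
from urllib.parse import urlencode

# 'multi' collection format: each element of a list-valued filter becomes
# its own key=value pair in the query string.
params = {
    "state": ["running", "queued"],           # list filter -> repeated key
    "order_by": ["-start_date", "map_index"], # multi-sort: "-" prefix = descending
}
query = urlencode(params, doseq=True)
print(query)
# -> state=running&state=queued&order_by=-start_date&order_by=map_index
```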
 
         _path_params: Dict[str, str] = {}
@@ -2261,6 +4380,19 @@
             else:
                 _query_params.append(('run_after_gte', run_after_gte))
             
+        if run_after_gt is not None:
+            if isinstance(run_after_gt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_gt',
+                        run_after_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_gt', run_after_gt))
+            
         if run_after_lte is not None:
             if isinstance(run_after_lte, datetime):
                 _query_params.append(
@@ -2274,6 +4406,19 @@
             else:
                 _query_params.append(('run_after_lte', run_after_lte))
             
+        if run_after_lt is not None:
+            if isinstance(run_after_lt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_lt',
+                        run_after_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_lt', run_after_lt))
+            
         if logical_date_gte is not None:
             if isinstance(logical_date_gte, datetime):
                 _query_params.append(
@@ -2287,6 +4432,19 @@
             else:
                 _query_params.append(('logical_date_gte', logical_date_gte))
             
+        if logical_date_gt is not None:
+            if isinstance(logical_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_gt',
+                        logical_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_gt', logical_date_gt))
+            
         if logical_date_lte is not None:
             if isinstance(logical_date_lte, datetime):
                 _query_params.append(
@@ -2300,6 +4458,19 @@
             else:
                 _query_params.append(('logical_date_lte', logical_date_lte))
             
+        if logical_date_lt is not None:
+            if isinstance(logical_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_lt',
+                        logical_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_lt', logical_date_lt))
+            
         if start_date_gte is not None:
             if isinstance(start_date_gte, datetime):
                 _query_params.append(
@@ -2313,6 +4484,19 @@
             else:
                 _query_params.append(('start_date_gte', start_date_gte))
             
+        if start_date_gt is not None:
+            if isinstance(start_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_gt',
+                        start_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_gt', start_date_gt))
+            
         if start_date_lte is not None:
             if isinstance(start_date_lte, datetime):
                 _query_params.append(
@@ -2326,6 +4510,19 @@
             else:
                 _query_params.append(('start_date_lte', start_date_lte))
             
+        if start_date_lt is not None:
+            if isinstance(start_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_lt',
+                        start_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_lt', start_date_lt))
+            
         if end_date_gte is not None:
             if isinstance(end_date_gte, datetime):
                 _query_params.append(
@@ -2339,6 +4536,19 @@
             else:
                 _query_params.append(('end_date_gte', end_date_gte))
             
+        if end_date_gt is not None:
+            if isinstance(end_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_gt',
+                        end_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_gt', end_date_gt))
+            
         if end_date_lte is not None:
             if isinstance(end_date_lte, datetime):
                 _query_params.append(
@@ -2352,6 +4562,19 @@
             else:
                 _query_params.append(('end_date_lte', end_date_lte))
             
+        if end_date_lt is not None:
+            if isinstance(end_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_lt',
+                        end_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_lt', end_date_lt))
+            
         if updated_at_gte is not None:
             if isinstance(updated_at_gte, datetime):
                 _query_params.append(
@@ -2365,6 +4588,19 @@
             else:
                 _query_params.append(('updated_at_gte', updated_at_gte))
             
+        if updated_at_gt is not None:
+            if isinstance(updated_at_gt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_gt',
+                        updated_at_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_gt', updated_at_gt))
+            
         if updated_at_lte is not None:
             if isinstance(updated_at_lte, datetime):
                 _query_params.append(
@@ -2378,14 +4614,35 @@
             else:
                 _query_params.append(('updated_at_lte', updated_at_lte))
             
+        if updated_at_lt is not None:
+            if isinstance(updated_at_lt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_lt',
+                        updated_at_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_lt', updated_at_lt))
+            
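Every new `*_gt` / `*_lt` bound above is serialized exactly like its `*_gte` / `*_lte` counterpart: a `datetime` is rendered with the configured `datetime_format`, anything else passes through untouched. A hedged sketch of that step in isolation (the format string here is an assumption for illustration; the real pattern comes from `self.api_client.configuration.datetime_format`):

```python
from datetime import datetime, timezone

# Assumed ISO-8601-style pattern; the generated client reads the actual
# value from Configuration.datetime_format.
DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%S%z"

updated_at_lt = datetime(2025, 1, 2, 3, 4, 5, tzinfo=timezone.utc)
if isinstance(updated_at_lt, datetime):
    # Strict upper bound (<), unlike the inclusive updated_at_lte (<=).
    value = updated_at_lt.strftime(DATETIME_FORMAT)
else:
    value = updated_at_lt  # pre-formatted strings pass through unchanged
print(("updated_at_lt", value))
```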
         if duration_gte is not None:
             
             _query_params.append(('duration_gte', duration_gte))
             
+        if duration_gt is not None:
+            
+            _query_params.append(('duration_gt', duration_gt))
+            
         if duration_lte is not None:
             
             _query_params.append(('duration_lte', duration_lte))
             
+        if duration_lt is not None:
+            
+            _query_params.append(('duration_lt', duration_lt))
+            
         if state is not None:
             
             _query_params.append(('state', state))
@@ -2406,6 +4663,18 @@
             
             _query_params.append(('version_number', version_number))
             
+        if try_number is not None:
+            
+            _query_params.append(('try_number', try_number))
+            
+        if operator is not None:
+            
+            _query_params.append(('operator', operator))
+            
+        if map_index is not None:
+            
+            _query_params.append(('map_index', map_index))
+            
         if limit is not None:
             
             _query_params.append(('limit', limit))
@@ -2434,7 +4703,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
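The `_auth_settings` change repeated through the hunks below means these endpoints now accept `HTTPBearer` alongside `OAuth2PasswordBearer`, so a caller holding a pre-issued JWT can skip the password flow. On the wire both schemes reduce to the same header; a small stdlib sketch (the token value is a placeholder, and with the generated client you would typically set the token on its `Configuration` instead of building headers by hand):

```python
# Both OAuth2PasswordBearer and HTTPBearer ultimately authenticate the
# request with an RFC 6750 bearer token in the Authorization header.
token = "eyJ...example"  # placeholder JWT, not a real credential

headers = {"Authorization": f"Bearer {token}"}
print(headers["Authorization"])
```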
@@ -2737,7 +5007,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -3057,7 +5328,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -3375,7 +5647,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -3695,7 +5968,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -4030,7 +6304,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
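Taken together, the new parameters let a caller mix list-valued exact filters (`map_index`, `try_number`, `operator`) with open range bounds (`*_gt`/`*_lt`) and multi-column sorting. A hedged sketch of the keyword arguments such a `get_task_instances` call would take, matching the signature above (a real call needs a configured `ApiClient`; the `"~"` path value follows the Airflow REST API convention of acting as a wildcard, and all filter values here are illustrative):

```python
from datetime import datetime, timezone

# Hypothetical filter set for TaskInstanceApi.get_task_instances (sketch only).
filters = dict(
    dag_id="example_dag",
    dag_run_id="~",                         # "~" = all runs, per Airflow API convention
    start_date_gt=datetime(2025, 1, 1, tzinfo=timezone.utc),  # strict lower bound
    duration_lt=300.0,                      # strictly under 5 minutes
    map_index=[0, 1],                       # new list filter (mapped tasks)
    try_number=[2],                         # new list filter (retries)
    order_by=["-start_date", "map_index"],  # multi-sort; "-" prefix = descending
)
print(sorted(filters))
```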
@@ -4058,26 +6333,41 @@
         dag_run_id: StrictStr,
         task_id: Optional[StrictStr] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
-        task_display_name_pattern: Optional[StrictStr] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
+        task_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -4103,29 +6393,53 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
-        :param task_display_name_pattern:
+        :param duration_lt:
+        :type duration_lt: float
+        :param task_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type task_display_name_pattern: str
         :param state:
         :type state: List[str]
@@ -4137,12 +6451,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -4170,23 +6490,38 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             task_display_name_pattern=task_display_name_pattern,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -4221,26 +6556,41 @@
         dag_run_id: StrictStr,
         task_id: Optional[StrictStr] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
-        task_display_name_pattern: Optional[StrictStr] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
+        task_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -4266,29 +6616,53 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
-        :param task_display_name_pattern:
+        :param duration_lt:
+        :type duration_lt: float
+        :param task_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type task_display_name_pattern: str
         :param state:
         :type state: List[str]
@@ -4300,12 +6674,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -4333,23 +6713,38 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             task_display_name_pattern=task_display_name_pattern,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -4384,26 +6779,41 @@
         dag_run_id: StrictStr,
         task_id: Optional[StrictStr] = None,
         run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
         run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
         logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
         start_date_gte: Optional[datetime] = None,
+        start_date_gt: Optional[datetime] = None,
         start_date_lte: Optional[datetime] = None,
+        start_date_lt: Optional[datetime] = None,
         end_date_gte: Optional[datetime] = None,
+        end_date_gt: Optional[datetime] = None,
         end_date_lte: Optional[datetime] = None,
+        end_date_lt: Optional[datetime] = None,
         updated_at_gte: Optional[datetime] = None,
+        updated_at_gt: Optional[datetime] = None,
         updated_at_lte: Optional[datetime] = None,
+        updated_at_lt: Optional[datetime] = None,
         duration_gte: Optional[Union[StrictFloat, StrictInt]] = None,
+        duration_gt: Optional[Union[StrictFloat, StrictInt]] = None,
         duration_lte: Optional[Union[StrictFloat, StrictInt]] = None,
-        task_display_name_pattern: Optional[StrictStr] = None,
+        duration_lt: Optional[Union[StrictFloat, StrictInt]] = None,
+        task_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         state: Optional[List[StrictStr]] = None,
         pool: Optional[List[StrictStr]] = None,
         queue: Optional[List[StrictStr]] = None,
         executor: Optional[List[StrictStr]] = None,
         version_number: Optional[List[StrictInt]] = None,
+        try_number: Optional[List[StrictInt]] = None,
+        operator: Optional[List[StrictStr]] = None,
+        map_index: Optional[List[StrictInt]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -4429,29 +6839,53 @@
         :type task_id: str
         :param run_after_gte:
         :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
         :param run_after_lte:
         :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param logical_date_gte:
         :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
         :param logical_date_lte:
         :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
         :param start_date_gte:
         :type start_date_gte: datetime
+        :param start_date_gt:
+        :type start_date_gt: datetime
         :param start_date_lte:
         :type start_date_lte: datetime
+        :param start_date_lt:
+        :type start_date_lt: datetime
         :param end_date_gte:
         :type end_date_gte: datetime
+        :param end_date_gt:
+        :type end_date_gt: datetime
         :param end_date_lte:
         :type end_date_lte: datetime
+        :param end_date_lt:
+        :type end_date_lt: datetime
         :param updated_at_gte:
         :type updated_at_gte: datetime
+        :param updated_at_gt:
+        :type updated_at_gt: datetime
         :param updated_at_lte:
         :type updated_at_lte: datetime
+        :param updated_at_lt:
+        :type updated_at_lt: datetime
         :param duration_gte:
         :type duration_gte: float
+        :param duration_gt:
+        :type duration_gt: float
         :param duration_lte:
         :type duration_lte: float
-        :param task_display_name_pattern:
+        :param duration_lt:
+        :type duration_lt: float
+        :param task_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type task_display_name_pattern: str
         :param state:
         :type state: List[str]
@@ -4463,12 +6897,18 @@
         :type executor: List[str]
         :param version_number:
         :type version_number: List[int]
+        :param try_number:
+        :type try_number: List[int]
+        :param operator:
+        :type operator: List[str]
+        :param map_index:
+        :type map_index: List[int]
         :param limit:
         :type limit: int
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
+        :type order_by: List[str]
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -4496,23 +6936,38 @@
             dag_run_id=dag_run_id,
             task_id=task_id,
             run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
             run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
             logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
             start_date_gte=start_date_gte,
+            start_date_gt=start_date_gt,
             start_date_lte=start_date_lte,
+            start_date_lt=start_date_lt,
             end_date_gte=end_date_gte,
+            end_date_gt=end_date_gt,
             end_date_lte=end_date_lte,
+            end_date_lt=end_date_lt,
             updated_at_gte=updated_at_gte,
+            updated_at_gt=updated_at_gt,
             updated_at_lte=updated_at_lte,
+            updated_at_lt=updated_at_lt,
             duration_gte=duration_gte,
+            duration_gt=duration_gt,
             duration_lte=duration_lte,
+            duration_lt=duration_lt,
             task_display_name_pattern=task_display_name_pattern,
             state=state,
             pool=pool,
             queue=queue,
             executor=executor,
             version_number=version_number,
+            try_number=try_number,
+            operator=operator,
+            map_index=map_index,
             limit=limit,
             offset=offset,
             order_by=order_by,
@@ -4542,23 +6997,38 @@
         dag_run_id,
         task_id,
         run_after_gte,
+        run_after_gt,
         run_after_lte,
+        run_after_lt,
         logical_date_gte,
+        logical_date_gt,
         logical_date_lte,
+        logical_date_lt,
         start_date_gte,
+        start_date_gt,
         start_date_lte,
+        start_date_lt,
         end_date_gte,
+        end_date_gt,
         end_date_lte,
+        end_date_lt,
         updated_at_gte,
+        updated_at_gt,
         updated_at_lte,
+        updated_at_lt,
         duration_gte,
+        duration_gt,
         duration_lte,
+        duration_lt,
         task_display_name_pattern,
         state,
         pool,
         queue,
         executor,
         version_number,
+        try_number,
+        operator,
+        map_index,
         limit,
         offset,
         order_by,
@@ -4576,6 +7046,10 @@
             'queue': 'multi',
             'executor': 'multi',
             'version_number': 'multi',
+            'try_number': 'multi',
+            'operator': 'multi',
+            'map_index': 'multi',
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -4610,6 +7084,19 @@
             else:
                 _query_params.append(('run_after_gte', run_after_gte))
             
+        if run_after_gt is not None:
+            if isinstance(run_after_gt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_gt',
+                        run_after_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_gt', run_after_gt))
+            
         if run_after_lte is not None:
             if isinstance(run_after_lte, datetime):
                 _query_params.append(
@@ -4623,6 +7110,19 @@
             else:
                 _query_params.append(('run_after_lte', run_after_lte))
             
+        if run_after_lt is not None:
+            if isinstance(run_after_lt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_lt',
+                        run_after_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_lt', run_after_lt))
+            
         if logical_date_gte is not None:
             if isinstance(logical_date_gte, datetime):
                 _query_params.append(
@@ -4636,6 +7136,19 @@
             else:
                 _query_params.append(('logical_date_gte', logical_date_gte))
             
+        if logical_date_gt is not None:
+            if isinstance(logical_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_gt',
+                        logical_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_gt', logical_date_gt))
+            
         if logical_date_lte is not None:
             if isinstance(logical_date_lte, datetime):
                 _query_params.append(
@@ -4649,6 +7162,19 @@
             else:
                 _query_params.append(('logical_date_lte', logical_date_lte))
             
+        if logical_date_lt is not None:
+            if isinstance(logical_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_lt',
+                        logical_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_lt', logical_date_lt))
+            
         if start_date_gte is not None:
             if isinstance(start_date_gte, datetime):
                 _query_params.append(
@@ -4662,6 +7188,19 @@
             else:
                 _query_params.append(('start_date_gte', start_date_gte))
             
+        if start_date_gt is not None:
+            if isinstance(start_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_gt',
+                        start_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_gt', start_date_gt))
+            
         if start_date_lte is not None:
             if isinstance(start_date_lte, datetime):
                 _query_params.append(
@@ -4675,6 +7214,19 @@
             else:
                 _query_params.append(('start_date_lte', start_date_lte))
             
+        if start_date_lt is not None:
+            if isinstance(start_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'start_date_lt',
+                        start_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('start_date_lt', start_date_lt))
+            
         if end_date_gte is not None:
             if isinstance(end_date_gte, datetime):
                 _query_params.append(
@@ -4688,6 +7240,19 @@
             else:
                 _query_params.append(('end_date_gte', end_date_gte))
             
+        if end_date_gt is not None:
+            if isinstance(end_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_gt',
+                        end_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_gt', end_date_gt))
+            
         if end_date_lte is not None:
             if isinstance(end_date_lte, datetime):
                 _query_params.append(
@@ -4701,6 +7266,19 @@
             else:
                 _query_params.append(('end_date_lte', end_date_lte))
             
+        if end_date_lt is not None:
+            if isinstance(end_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'end_date_lt',
+                        end_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('end_date_lt', end_date_lt))
+            
         if updated_at_gte is not None:
             if isinstance(updated_at_gte, datetime):
                 _query_params.append(
@@ -4714,6 +7292,19 @@
             else:
                 _query_params.append(('updated_at_gte', updated_at_gte))
             
+        if updated_at_gt is not None:
+            if isinstance(updated_at_gt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_gt',
+                        updated_at_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_gt', updated_at_gt))
+            
         if updated_at_lte is not None:
             if isinstance(updated_at_lte, datetime):
                 _query_params.append(
@@ -4727,14 +7318,35 @@
             else:
                 _query_params.append(('updated_at_lte', updated_at_lte))
             
+        if updated_at_lt is not None:
+            if isinstance(updated_at_lt, datetime):
+                _query_params.append(
+                    (
+                        'updated_at_lt',
+                        updated_at_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('updated_at_lt', updated_at_lt))
+            
         if duration_gte is not None:
             
             _query_params.append(('duration_gte', duration_gte))
             
+        if duration_gt is not None:
+            
+            _query_params.append(('duration_gt', duration_gt))
+            
         if duration_lte is not None:
             
             _query_params.append(('duration_lte', duration_lte))
             
+        if duration_lt is not None:
+            
+            _query_params.append(('duration_lt', duration_lt))
+            
         if task_display_name_pattern is not None:
             
             _query_params.append(('task_display_name_pattern', task_display_name_pattern))
@@ -4759,6 +7371,18 @@
             
             _query_params.append(('version_number', version_number))
             
+        if try_number is not None:
+            
+            _query_params.append(('try_number', try_number))
+            
+        if operator is not None:
+            
+            _query_params.append(('operator', operator))
+            
+        if map_index is not None:
+            
+            _query_params.append(('map_index', map_index))
+            
         if limit is not None:
             
             _query_params.append(('limit', limit))
@@ -4787,7 +7411,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -5103,7 +7728,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -5475,7 +8101,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -5845,7 +8472,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -6214,7 +8842,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -6581,7 +9210,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -6882,7 +9512,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -6901,3 +9532,353 @@
         )
 
 
+
+
+    @validate_call
+    def update_hitl_detail(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        update_hitl_detail_payload: UpdateHITLDetailPayload,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> HITLDetailResponse:
+        """Update Hitl Detail
+
+        Update a Human-in-the-loop detail.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param update_hitl_detail_payload: (required)
+        :type update_hitl_detail_payload: UpdateHITLDetailPayload
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._update_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            update_hitl_detail_payload=update_hitl_detail_payload,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        ).data
+
+
+    @validate_call
+    def update_hitl_detail_with_http_info(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        update_hitl_detail_payload: UpdateHITLDetailPayload,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> ApiResponse[HITLDetailResponse]:
+        """Update Hitl Detail
+
+        Update a Human-in-the-loop detail.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param update_hitl_detail_payload: (required)
+        :type update_hitl_detail_payload: UpdateHITLDetailPayload
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._update_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            update_hitl_detail_payload=update_hitl_detail_payload,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        response_data.read()
+        return self.api_client.response_deserialize(
+            response_data=response_data,
+            response_types_map=_response_types_map,
+        )
+
+
+    @validate_call
+    def update_hitl_detail_without_preload_content(
+        self,
+        dag_id: StrictStr,
+        dag_run_id: StrictStr,
+        task_id: StrictStr,
+        map_index: StrictInt,
+        update_hitl_detail_payload: UpdateHITLDetailPayload,
+        _request_timeout: Union[
+            None,
+            Annotated[StrictFloat, Field(gt=0)],
+            Tuple[
+                Annotated[StrictFloat, Field(gt=0)],
+                Annotated[StrictFloat, Field(gt=0)]
+            ]
+        ] = None,
+        _request_auth: Optional[Dict[StrictStr, Any]] = None,
+        _content_type: Optional[StrictStr] = None,
+        _headers: Optional[Dict[StrictStr, Any]] = None,
+        _host_index: Annotated[StrictInt, Field(ge=0, le=0)] = 0,
+    ) -> RESTResponseType:
+        """Update Hitl Detail
+
+        Update a Human-in-the-loop detail.
+
+        :param dag_id: (required)
+        :type dag_id: str
+        :param dag_run_id: (required)
+        :type dag_run_id: str
+        :param task_id: (required)
+        :type task_id: str
+        :param map_index: (required)
+        :type map_index: int
+        :param update_hitl_detail_payload: (required)
+        :type update_hitl_detail_payload: UpdateHITLDetailPayload
+        :param _request_timeout: timeout setting for this request. If one
+                                 number provided, it will be total request
+                                 timeout. It can also be a pair (tuple) of
+                                 (connection, read) timeouts.
+        :type _request_timeout: int, tuple(int, int), optional
+        :param _request_auth: set to override the auth_settings for a single
+                              request; this effectively ignores the
+                              authentication in the spec for a single request.
+        :type _request_auth: dict, optional
+        :param _content_type: force content-type for the request.
+        :type _content_type: str, Optional
+        :param _headers: set to override the headers for a single
+                         request; this effectively ignores the headers
+                         in the spec for a single request.
+        :type _headers: dict, optional
+        :param _host_index: set to override the host_index for a single
+                            request; this effectively ignores the host_index
+                            in the spec for a single request.
+        :type _host_index: int, optional
+        :return: Returns the result object.
+        """ # noqa: E501
+
+        _param = self._update_hitl_detail_serialize(
+            dag_id=dag_id,
+            dag_run_id=dag_run_id,
+            task_id=task_id,
+            map_index=map_index,
+            update_hitl_detail_payload=update_hitl_detail_payload,
+            _request_auth=_request_auth,
+            _content_type=_content_type,
+            _headers=_headers,
+            _host_index=_host_index
+        )
+
+        _response_types_map: Dict[str, Optional[str]] = {
+            '200': "HITLDetailResponse",
+            '401': "HTTPExceptionResponse",
+            '403': "HTTPExceptionResponse",
+            '404': "HTTPExceptionResponse",
+            '409': "HTTPExceptionResponse",
+            '422': "HTTPValidationError",
+        }
+        response_data = self.api_client.call_api(
+            *_param,
+            _request_timeout=_request_timeout
+        )
+        return response_data.response
+
+
+    def _update_hitl_detail_serialize(
+        self,
+        dag_id,
+        dag_run_id,
+        task_id,
+        map_index,
+        update_hitl_detail_payload,
+        _request_auth,
+        _content_type,
+        _headers,
+        _host_index,
+    ) -> RequestSerialized:
+
+        _host = None
+
+        _collection_formats: Dict[str, str] = {
+        }
+
+        _path_params: Dict[str, str] = {}
+        _query_params: List[Tuple[str, str]] = []
+        _header_params: Dict[str, Optional[str]] = _headers or {}
+        _form_params: List[Tuple[str, str]] = []
+        _files: Dict[
+            str, Union[str, bytes, List[str], List[bytes], List[Tuple[str, bytes]]]
+        ] = {}
+        _body_params: Optional[bytes] = None
+
+        # process the path parameters
+        if dag_id is not None:
+            _path_params['dag_id'] = dag_id
+        if dag_run_id is not None:
+            _path_params['dag_run_id'] = dag_run_id
+        if task_id is not None:
+            _path_params['task_id'] = task_id
+        if map_index is not None:
+            _path_params['map_index'] = map_index
+        # process the query parameters
+        # process the header parameters
+        # process the form parameters
+        # process the body parameter
+        if update_hitl_detail_payload is not None:
+            _body_params = update_hitl_detail_payload
+
+
+        # set the HTTP header `Accept`
+        if 'Accept' not in _header_params:
+            _header_params['Accept'] = self.api_client.select_header_accept(
+                [
+                    'application/json'
+                ]
+            )
+
+        # set the HTTP header `Content-Type`
+        if _content_type:
+            _header_params['Content-Type'] = _content_type
+        else:
+            _default_content_type = (
+                self.api_client.select_header_content_type(
+                    [
+                        'application/json'
+                    ]
+                )
+            )
+            if _default_content_type is not None:
+                _header_params['Content-Type'] = _default_content_type
+
+        # authentication setting
+        _auth_settings: List[str] = [
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
+        ]
+
+        return self.api_client.param_serialize(
+            method='PATCH',
+            resource_path='/api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/hitlDetails',
+            path_params=_path_params,
+            query_params=_query_params,
+            header_params=_header_params,
+            body=_body_params,
+            post_params=_form_params,
+            files=_files,
+            auth_settings=_auth_settings,
+            collection_formats=_collection_formats,
+            _host=_host,
+            _request_auth=_request_auth
+        )
+
+
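The hunks above add strict (`_gt`/`_lt`) counterparts to the existing inclusive (`_gte`/`_lte`) range filters, new multi-value filters (`try_number`, `operator`, `map_index`), and a list-valued `order_by` for multi-sorting. A minimal sketch of how a caller might pass these to the updated list method — the method name `get_task_instances` and the surrounding client setup are assumptions, since this hunk shows only the parameter plumbing:

```python
from datetime import datetime, timezone

# Filter kwargs mirroring the generated keyword parameters one-to-one.
# gt/lt are strict bounds; gte/lte (not used here) are inclusive.
filters = {
    "start_date_gt": datetime(2025, 1, 1, tzinfo=timezone.utc),  # strictly after
    "duration_lt": 300.0,                    # ran strictly under 300 seconds
    "try_number": [1, 2],                    # serialized as a 'multi' collection
    "map_index": [0],                        # filter mapped-task expansions
    "order_by": ["-start_date", "task_id"],  # multi-sort: now List[str]
}

# Hypothetical call against the generated client (network call, not executed here):
# api = task_instance_api.TaskInstanceApi(api_client)
# page = api.get_task_instances(dag_id="example_dag", dag_run_id="run_1", **filters)

print(sorted(filters))
```

Because `try_number`, `operator`, `map_index`, and `order_by` are registered as `'multi'` in `_collection_formats`, each list element is emitted as a repeated query parameter (e.g. `?try_number=1&try_number=2`).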
diff --git a/airflow_client/client/api/variable_api.py b/airflow_client/client/api/variable_api.py
index 8900fa9..271735f 100644
--- a/airflow_client/client/api/variable_api.py
+++ b/airflow_client/client/api/variable_api.py
@@ -305,7 +305,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -578,7 +579,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -851,7 +853,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -877,8 +880,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        variable_key_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        variable_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -901,8 +904,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param variable_key_pattern:
+        :type order_by: List[str]
+        :param variable_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type variable_key_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -959,8 +962,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        variable_key_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        variable_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -983,8 +986,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param variable_key_pattern:
+        :type order_by: List[str]
+        :param variable_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type variable_key_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1041,8 +1044,8 @@
         self,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
-        order_by: Optional[StrictStr] = None,
-        variable_key_pattern: Optional[StrictStr] = None,
+        order_by: Optional[List[StrictStr]] = None,
+        variable_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -1065,8 +1068,8 @@
         :param offset:
         :type offset: int
         :param order_by:
-        :type order_by: str
-        :param variable_key_pattern:
+        :type order_by: List[str]
+        :param variable_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
         :type variable_key_pattern: str
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
@@ -1129,6 +1132,7 @@
         _host = None
 
         _collection_formats: Dict[str, str] = {
+            'order_by': 'multi',
         }
 
         _path_params: Dict[str, str] = {}
@@ -1174,7 +1178,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1496,7 +1501,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1782,7 +1788,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
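Two behavioural changes in `variable_api.py` above are worth illustrating: `variable_key_pattern` is documented as a SQL LIKE expression (`%` matches any run of characters, `_` exactly one), and `order_by` is now a list serialized with the `multi` collection format (repeated query parameters). A small sketch of both semantics, assuming standard LIKE and `urlencode` behaviour rather than anything client-specific:

```python
import re
from urllib.parse import urlencode

def like_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a SQL LIKE pattern into an anchored regex:
    '%' -> '.*', '_' -> '.', everything else escaped literally."""
    out = []
    for ch in pattern:
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))
    return re.compile("^" + "".join(out) + "$")

rx = like_to_regex("%customer_%")
print(bool(rx.match("eu_customer_id")))  # True: 'customer' plus at least one char after it
print(bool(rx.match("customer")))        # False: '_' demands one more character

# 'multi' collection format: each list element becomes its own query pair.
multi_qs = urlencode([("order_by", v) for v in ["key", "-id"]])
print(multi_qs)  # order_by=key&order_by=-id
```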
diff --git a/airflow_client/client/api/x_com_api.py b/airflow_client/client/api/x_com_api.py
index 21abbd8..ebf4c49 100644
--- a/airflow_client/client/api/x_com_api.py
+++ b/airflow_client/client/api/x_com_api.py
@@ -16,7 +16,8 @@
 from typing import Any, Dict, List, Optional, Tuple, Union
 from typing_extensions import Annotated
 
-from pydantic import Field, StrictBool, StrictStr
+from datetime import datetime
+from pydantic import Field, StrictBool, StrictInt, StrictStr
 from typing import Optional
 from typing_extensions import Annotated
 from airflow_client.client.models.response_get_xcom_entry import ResponseGetXcomEntry
@@ -356,7 +357,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -387,6 +389,19 @@
         map_index: Optional[Annotated[int, Field(strict=True, ge=-1)]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        xcom_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index_filter: Optional[StrictInt] = None,
+        logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
+        logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
+        run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
+        run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -418,6 +433,32 @@
         :type limit: int
         :param offset:
         :type offset: int
+        :param xcom_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type xcom_key_pattern: str
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_display_name_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index_filter:
+        :type map_index_filter: int
+        :param logical_date_gte:
+        :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
+        :param logical_date_lte:
+        :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
+        :param run_after_gte:
+        :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
+        :param run_after_lte:
+        :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -448,6 +489,19 @@
             map_index=map_index,
             limit=limit,
             offset=offset,
+            xcom_key_pattern=xcom_key_pattern,
+            dag_display_name_pattern=dag_display_name_pattern,
+            run_id_pattern=run_id_pattern,
+            task_id_pattern=task_id_pattern,
+            map_index_filter=map_index_filter,
+            logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
+            logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
+            run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
+            run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -483,6 +537,19 @@
         map_index: Optional[Annotated[int, Field(strict=True, ge=-1)]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        xcom_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index_filter: Optional[StrictInt] = None,
+        logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
+        logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
+        run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
+        run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -514,6 +581,32 @@
         :type limit: int
         :param offset:
         :type offset: int
+        :param xcom_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type xcom_key_pattern: str
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_display_name_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index_filter:
+        :type map_index_filter: int
+        :param logical_date_gte:
+        :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
+        :param logical_date_lte:
+        :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
+        :param run_after_gte:
+        :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
+        :param run_after_lte:
+        :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -544,6 +637,19 @@
             map_index=map_index,
             limit=limit,
             offset=offset,
+            xcom_key_pattern=xcom_key_pattern,
+            dag_display_name_pattern=dag_display_name_pattern,
+            run_id_pattern=run_id_pattern,
+            task_id_pattern=task_id_pattern,
+            map_index_filter=map_index_filter,
+            logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
+            logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
+            run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
+            run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -579,6 +685,19 @@
         map_index: Optional[Annotated[int, Field(strict=True, ge=-1)]] = None,
         limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
         offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = None,
+        xcom_key_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        dag_display_name_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        run_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        task_id_pattern: Annotated[Optional[StrictStr], Field(description="SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.")] = None,
+        map_index_filter: Optional[StrictInt] = None,
+        logical_date_gte: Optional[datetime] = None,
+        logical_date_gt: Optional[datetime] = None,
+        logical_date_lte: Optional[datetime] = None,
+        logical_date_lt: Optional[datetime] = None,
+        run_after_gte: Optional[datetime] = None,
+        run_after_gt: Optional[datetime] = None,
+        run_after_lte: Optional[datetime] = None,
+        run_after_lt: Optional[datetime] = None,
         _request_timeout: Union[
             None,
             Annotated[StrictFloat, Field(gt=0)],
@@ -610,6 +729,32 @@
         :type limit: int
         :param offset:
         :type offset: int
+        :param xcom_key_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type xcom_key_pattern: str
+        :param dag_display_name_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type dag_display_name_pattern: str
+        :param run_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type run_id_pattern: str
+        :param task_id_pattern: SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported.
+        :type task_id_pattern: str
+        :param map_index_filter:
+        :type map_index_filter: int
+        :param logical_date_gte:
+        :type logical_date_gte: datetime
+        :param logical_date_gt:
+        :type logical_date_gt: datetime
+        :param logical_date_lte:
+        :type logical_date_lte: datetime
+        :param logical_date_lt:
+        :type logical_date_lt: datetime
+        :param run_after_gte:
+        :type run_after_gte: datetime
+        :param run_after_gt:
+        :type run_after_gt: datetime
+        :param run_after_lte:
+        :type run_after_lte: datetime
+        :param run_after_lt:
+        :type run_after_lt: datetime
         :param _request_timeout: timeout setting for this request. If one
                                  number provided, it will be total request
                                  timeout. It can also be a pair (tuple) of
@@ -640,6 +785,19 @@
             map_index=map_index,
             limit=limit,
             offset=offset,
+            xcom_key_pattern=xcom_key_pattern,
+            dag_display_name_pattern=dag_display_name_pattern,
+            run_id_pattern=run_id_pattern,
+            task_id_pattern=task_id_pattern,
+            map_index_filter=map_index_filter,
+            logical_date_gte=logical_date_gte,
+            logical_date_gt=logical_date_gt,
+            logical_date_lte=logical_date_lte,
+            logical_date_lt=logical_date_lt,
+            run_after_gte=run_after_gte,
+            run_after_gt=run_after_gt,
+            run_after_lte=run_after_lte,
+            run_after_lt=run_after_lt,
             _request_auth=_request_auth,
             _content_type=_content_type,
             _headers=_headers,
@@ -670,6 +828,19 @@
         map_index,
         limit,
         offset,
+        xcom_key_pattern,
+        dag_display_name_pattern,
+        run_id_pattern,
+        task_id_pattern,
+        map_index_filter,
+        logical_date_gte,
+        logical_date_gt,
+        logical_date_lte,
+        logical_date_lt,
+        run_after_gte,
+        run_after_gt,
+        run_after_lte,
+        run_after_lt,
         _request_auth,
         _content_type,
         _headers,
@@ -714,6 +885,130 @@
             
             _query_params.append(('offset', offset))
             
+        if xcom_key_pattern is not None:
+            
+            _query_params.append(('xcom_key_pattern', xcom_key_pattern))
+            
+        if dag_display_name_pattern is not None:
+            
+            _query_params.append(('dag_display_name_pattern', dag_display_name_pattern))
+            
+        if run_id_pattern is not None:
+            
+            _query_params.append(('run_id_pattern', run_id_pattern))
+            
+        if task_id_pattern is not None:
+            
+            _query_params.append(('task_id_pattern', task_id_pattern))
+            
+        if map_index_filter is not None:
+            
+            _query_params.append(('map_index_filter', map_index_filter))
+            
+        if logical_date_gte is not None:
+            if isinstance(logical_date_gte, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_gte',
+                        logical_date_gte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_gte', logical_date_gte))
+            
+        if logical_date_gt is not None:
+            if isinstance(logical_date_gt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_gt',
+                        logical_date_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_gt', logical_date_gt))
+            
+        if logical_date_lte is not None:
+            if isinstance(logical_date_lte, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_lte',
+                        logical_date_lte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_lte', logical_date_lte))
+            
+        if logical_date_lt is not None:
+            if isinstance(logical_date_lt, datetime):
+                _query_params.append(
+                    (
+                        'logical_date_lt',
+                        logical_date_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('logical_date_lt', logical_date_lt))
+            
+        if run_after_gte is not None:
+            if isinstance(run_after_gte, datetime):
+                _query_params.append(
+                    (
+                        'run_after_gte',
+                        run_after_gte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_gte', run_after_gte))
+            
+        if run_after_gt is not None:
+            if isinstance(run_after_gt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_gt',
+                        run_after_gt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_gt', run_after_gt))
+            
+        if run_after_lte is not None:
+            if isinstance(run_after_lte, datetime):
+                _query_params.append(
+                    (
+                        'run_after_lte',
+                        run_after_lte.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_lte', run_after_lte))
+            
+        if run_after_lt is not None:
+            if isinstance(run_after_lt, datetime):
+                _query_params.append(
+                    (
+                        'run_after_lt',
+                        run_after_lt.strftime(
+                            self.api_client.configuration.datetime_format
+                        )
+                    )
+                )
+            else:
+                _query_params.append(('run_after_lt', run_after_lt))
+            
         # process the header parameters
         # process the form parameters
         # process the body parameter
@@ -730,7 +1025,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1102,7 +1398,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
@@ -1451,7 +1748,8 @@
 
         # authentication setting
         _auth_settings: List[str] = [
-            'OAuth2PasswordBearer'
+            'OAuth2PasswordBearer', 
+            'HTTPBearer'
         ]
 
         return self.api_client.param_serialize(
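The `x_com_api.py` changes above add LIKE-pattern filters plus `_gte`/`_gt`/`_lte`/`_lt` datetime range filters, each formatted via `configuration.datetime_format` before being appended as a query parameter, with unset (`None`) filters skipped. A simplified sketch of that serialization, assuming ISO 8601 as the datetime format (the exact format string of a real `Configuration` may differ):

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

def build_query(params: dict) -> str:
    """Mimic the generated _serialize step: drop None values,
    stringify datetimes, then URL-encode the remaining pairs."""
    pairs = []
    for key, value in params.items():
        if value is None:
            continue  # unset filters never reach the query string
        if isinstance(value, datetime):
            value = value.isoformat()
        pairs.append((key, value))
    return urlencode(pairs)

qs = build_query({
    "xcom_key_pattern": "%result%",  # hypothetical LIKE filter value
    "logical_date_gte": datetime(2025, 1, 1, tzinfo=timezone.utc),
    "logical_date_lt": None,         # skipped, as in the client
})
print(qs)
```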
diff --git a/airflow_client/client/api_client.py b/airflow_client/client/api_client.py
index 5e08829..fddd4ff 100644
--- a/airflow_client/client/api_client.py
+++ b/airflow_client/client/api_client.py
@@ -90,7 +90,7 @@
             self.default_headers[header_name] = header_value
         self.cookie = cookie
         # Set default User-Agent.
-        self.user_agent = 'OpenAPI-Generator/3.0.2/python'
+        self.user_agent = 'OpenAPI-Generator/3.1.0/python'
         self.client_side_validation = configuration.client_side_validation
 
     def __enter__(self):
diff --git a/airflow_client/client/configuration.py b/airflow_client/client/configuration.py
index 97061cd..dbc9328 100644
--- a/airflow_client/client/configuration.py
+++ b/airflow_client/client/configuration.py
@@ -113,6 +113,7 @@
 AuthSettings = TypedDict(
     "AuthSettings",
     {
+        "HTTPBearer": BearerAuthSetting,
         "OAuth2PasswordBearer": OAuth2AuthSetting,
     },
     total=False,
@@ -493,6 +494,13 @@
         """
         auth: AuthSettings = {}
         if self.access_token is not None:
+            auth['HTTPBearer'] = {
+                'type': 'bearer',
+                'in': 'header',
+                'key': 'Authorization',
+                'value': 'Bearer ' + self.access_token
+            }
+        if self.access_token is not None:
             auth['OAuth2PasswordBearer'] = {
                 'type': 'oauth2',
                 'in': 'header',
@@ -510,7 +518,7 @@
                "OS: {env}\n"\
                "Python Version: {pyversion}\n"\
                "Version of the API: 2\n"\
-               "SDK Package Version: 3.0.2".\
+               "SDK Package Version: 3.1.0".\
                format(env=sys.platform, pyversion=sys.version)
 
     def get_host_settings(self) -> List[HostSetting]:
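The `configuration.py` hunk above makes a single `access_token` populate both the new `HTTPBearer` setting and the existing `OAuth2PasswordBearer` one, which is why every `_auth_settings` list in this diff gains `'HTTPBearer'`. A standalone sketch of that `auth_settings()` logic, mirroring the two `if` blocks (the token value is illustrative):

```python
# Sketch of Configuration.auth_settings() after this change: both entries
# are keyed off the same access_token and yield identical headers.
def auth_settings(access_token):
    auth = {}
    if access_token is not None:
        auth["HTTPBearer"] = {
            "type": "bearer", "in": "header",
            "key": "Authorization", "value": "Bearer " + access_token,
        }
    if access_token is not None:
        auth["OAuth2PasswordBearer"] = {
            "type": "oauth2", "in": "header",
            "key": "Authorization", "value": "Bearer " + access_token,
        }
    return auth

settings = auth_settings("abc123")  # hypothetical token
print(sorted(settings))  # ['HTTPBearer', 'OAuth2PasswordBearer']
print(settings["HTTPBearer"]["value"] == settings["OAuth2PasswordBearer"]["value"])  # True
```

Endpoints tagged with either security scheme therefore authenticate with the same bearer token; the duplication exists only so both OpenAPI security schemes resolve.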
diff --git a/airflow_client/client/models/__init__.py b/airflow_client/client/models/__init__.py
index 69ba66d..c4ce5a7 100644
--- a/airflow_client/client/models/__init__.py
+++ b/airflow_client/client/models/__init__.py
@@ -29,19 +29,26 @@
 from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
 from airflow_client.client.models.bulk_action_on_existence import BulkActionOnExistence
 from airflow_client.client.models.bulk_action_response import BulkActionResponse
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+from airflow_client.client.models.bulk_body_bulk_task_instance_body_actions_inner import BulkBodyBulkTaskInstanceBodyActionsInner
 from airflow_client.client.models.bulk_body_connection_body import BulkBodyConnectionBody
 from airflow_client.client.models.bulk_body_connection_body_actions_inner import BulkBodyConnectionBodyActionsInner
 from airflow_client.client.models.bulk_body_pool_body import BulkBodyPoolBody
 from airflow_client.client.models.bulk_body_pool_body_actions_inner import BulkBodyPoolBodyActionsInner
 from airflow_client.client.models.bulk_body_variable_body import BulkBodyVariableBody
 from airflow_client.client.models.bulk_body_variable_body_actions_inner import BulkBodyVariableBodyActionsInner
+from airflow_client.client.models.bulk_create_action_bulk_task_instance_body import BulkCreateActionBulkTaskInstanceBody
 from airflow_client.client.models.bulk_create_action_connection_body import BulkCreateActionConnectionBody
 from airflow_client.client.models.bulk_create_action_pool_body import BulkCreateActionPoolBody
 from airflow_client.client.models.bulk_create_action_variable_body import BulkCreateActionVariableBody
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body import BulkDeleteActionBulkTaskInstanceBody
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
 from airflow_client.client.models.bulk_delete_action_connection_body import BulkDeleteActionConnectionBody
 from airflow_client.client.models.bulk_delete_action_pool_body import BulkDeleteActionPoolBody
 from airflow_client.client.models.bulk_delete_action_variable_body import BulkDeleteActionVariableBody
 from airflow_client.client.models.bulk_response import BulkResponse
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+from airflow_client.client.models.bulk_update_action_bulk_task_instance_body import BulkUpdateActionBulkTaskInstanceBody
 from airflow_client.client.models.bulk_update_action_connection_body import BulkUpdateActionConnectionBody
 from airflow_client.client.models.bulk_update_action_pool_body import BulkUpdateActionPoolBody
 from airflow_client.client.models.bulk_update_action_variable_body import BulkUpdateActionVariableBody
@@ -88,9 +95,15 @@
 from airflow_client.client.models.dry_run_backfill_response import DryRunBackfillResponse
 from airflow_client.client.models.event_log_collection_response import EventLogCollectionResponse
 from airflow_client.client.models.event_log_response import EventLogResponse
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
+from airflow_client.client.models.external_view_response import ExternalViewResponse
 from airflow_client.client.models.extra_link_collection_response import ExtraLinkCollectionResponse
 from airflow_client.client.models.fast_api_app_response import FastAPIAppResponse
 from airflow_client.client.models.fast_api_root_middleware_response import FastAPIRootMiddlewareResponse
+from airflow_client.client.models.hitl_detail import HITLDetail
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
+from airflow_client.client.models.hitl_user import HITLUser
 from airflow_client.client.models.http_exception_response import HTTPExceptionResponse
 from airflow_client.client.models.http_validation_error import HTTPValidationError
 from airflow_client.client.models.health_info_response import HealthInfoResponse
@@ -98,8 +111,11 @@
 from airflow_client.client.models.import_error_response import ImportErrorResponse
 from airflow_client.client.models.job_collection_response import JobCollectionResponse
 from airflow_client.client.models.job_response import JobResponse
+from airflow_client.client.models.last_asset_event_response import LastAssetEventResponse
 from airflow_client.client.models.patch_task_instance_body import PatchTaskInstanceBody
 from airflow_client.client.models.plugin_collection_response import PluginCollectionResponse
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
+from airflow_client.client.models.plugin_import_error_response import PluginImportErrorResponse
 from airflow_client.client.models.plugin_response import PluginResponse
 from airflow_client.client.models.pool_body import PoolBody
 from airflow_client.client.models.pool_collection_response import PoolCollectionResponse
@@ -109,6 +125,7 @@
 from airflow_client.client.models.provider_response import ProviderResponse
 from airflow_client.client.models.queued_event_collection_response import QueuedEventCollectionResponse
 from airflow_client.client.models.queued_event_response import QueuedEventResponse
+from airflow_client.client.models.react_app_response import ReactAppResponse
 from airflow_client.client.models.reprocess_behavior import ReprocessBehavior
 from airflow_client.client.models.response_clear_dag_run import ResponseClearDagRun
 from airflow_client.client.models.response_get_xcom_entry import ResponseGetXcomEntry
@@ -117,6 +134,7 @@
 from airflow_client.client.models.task_collection_response import TaskCollectionResponse
 from airflow_client.client.models.task_dependency_collection_response import TaskDependencyCollectionResponse
 from airflow_client.client.models.task_dependency_response import TaskDependencyResponse
+from airflow_client.client.models.task_inlet_asset_reference import TaskInletAssetReference
 from airflow_client.client.models.task_instance_collection_response import TaskInstanceCollectionResponse
 from airflow_client.client.models.task_instance_history_collection_response import TaskInstanceHistoryCollectionResponse
 from airflow_client.client.models.task_instance_history_response import TaskInstanceHistoryResponse
@@ -130,6 +148,7 @@
 from airflow_client.client.models.trigger_dag_run_post_body import TriggerDAGRunPostBody
 from airflow_client.client.models.trigger_response import TriggerResponse
 from airflow_client.client.models.triggerer_info_response import TriggererInfoResponse
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
 from airflow_client.client.models.validation_error import ValidationError
 from airflow_client.client.models.validation_error_loc_inner import ValidationErrorLocInner
 from airflow_client.client.models.value import Value
diff --git a/airflow_client/client/models/app_builder_menu_item_response.py b/airflow_client/client/models/app_builder_menu_item_response.py
index 42c15a4..131955d 100644
--- a/airflow_client/client/models/app_builder_menu_item_response.py
+++ b/airflow_client/client/models/app_builder_menu_item_response.py
@@ -27,7 +27,7 @@
     Serializer for AppBuilder Menu Item responses.
     """ # noqa: E501
     category: Optional[StrictStr] = None
-    href: Optional[StrictStr] = None
+    href: StrictStr
     name: StrictStr
     __properties: ClassVar[List[str]] = ["category", "href", "name"]
 
diff --git a/airflow_client/client/models/asset_response.py b/airflow_client/client/models/asset_response.py
index 49a3235..544ec66 100644
--- a/airflow_client/client/models/asset_response.py
+++ b/airflow_client/client/models/asset_response.py
@@ -22,6 +22,8 @@
 from typing import Any, ClassVar, Dict, List, Optional
 from airflow_client.client.models.asset_alias_response import AssetAliasResponse
 from airflow_client.client.models.dag_schedule_asset_reference import DagScheduleAssetReference
+from airflow_client.client.models.last_asset_event_response import LastAssetEventResponse
+from airflow_client.client.models.task_inlet_asset_reference import TaskInletAssetReference
 from airflow_client.client.models.task_outlet_asset_reference import TaskOutletAssetReference
 from typing import Optional, Set
 from typing_extensions import Self
@@ -31,16 +33,18 @@
     Asset serializer for responses.
     """ # noqa: E501
     aliases: List[AssetAliasResponse]
-    consuming_dags: List[DagScheduleAssetReference]
+    consuming_tasks: List[TaskInletAssetReference]
     created_at: datetime
     extra: Optional[Dict[str, Any]] = None
     group: StrictStr
     id: StrictInt
+    last_asset_event: Optional[LastAssetEventResponse] = None
     name: StrictStr
     producing_tasks: List[TaskOutletAssetReference]
+    scheduled_dags: List[DagScheduleAssetReference]
     updated_at: datetime
     uri: StrictStr
-    __properties: ClassVar[List[str]] = ["aliases", "consuming_dags", "created_at", "extra", "group", "id", "name", "producing_tasks", "updated_at", "uri"]
+    __properties: ClassVar[List[str]] = ["aliases", "consuming_tasks", "created_at", "extra", "group", "id", "last_asset_event", "name", "producing_tasks", "scheduled_dags", "updated_at", "uri"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -88,13 +92,16 @@
                 if _item_aliases:
                     _items.append(_item_aliases.to_dict())
             _dict['aliases'] = _items
-        # override the default output from pydantic by calling `to_dict()` of each item in consuming_dags (list)
+        # override the default output from pydantic by calling `to_dict()` of each item in consuming_tasks (list)
         _items = []
-        if self.consuming_dags:
-            for _item_consuming_dags in self.consuming_dags:
-                if _item_consuming_dags:
-                    _items.append(_item_consuming_dags.to_dict())
-            _dict['consuming_dags'] = _items
+        if self.consuming_tasks:
+            for _item_consuming_tasks in self.consuming_tasks:
+                if _item_consuming_tasks:
+                    _items.append(_item_consuming_tasks.to_dict())
+            _dict['consuming_tasks'] = _items
+        # override the default output from pydantic by calling `to_dict()` of last_asset_event
+        if self.last_asset_event:
+            _dict['last_asset_event'] = self.last_asset_event.to_dict()
         # override the default output from pydantic by calling `to_dict()` of each item in producing_tasks (list)
         _items = []
         if self.producing_tasks:
@@ -102,6 +109,13 @@
                 if _item_producing_tasks:
                     _items.append(_item_producing_tasks.to_dict())
             _dict['producing_tasks'] = _items
+        # override the default output from pydantic by calling `to_dict()` of each item in scheduled_dags (list)
+        _items = []
+        if self.scheduled_dags:
+            for _item_scheduled_dags in self.scheduled_dags:
+                if _item_scheduled_dags:
+                    _items.append(_item_scheduled_dags.to_dict())
+            _dict['scheduled_dags'] = _items
         return _dict
 
     @classmethod
@@ -115,13 +129,15 @@
 
         _obj = cls.model_validate({
             "aliases": [AssetAliasResponse.from_dict(_item) for _item in obj["aliases"]] if obj.get("aliases") is not None else None,
-            "consuming_dags": [DagScheduleAssetReference.from_dict(_item) for _item in obj["consuming_dags"]] if obj.get("consuming_dags") is not None else None,
+            "consuming_tasks": [TaskInletAssetReference.from_dict(_item) for _item in obj["consuming_tasks"]] if obj.get("consuming_tasks") is not None else None,
             "created_at": obj.get("created_at"),
             "extra": obj.get("extra"),
             "group": obj.get("group"),
             "id": obj.get("id"),
+            "last_asset_event": LastAssetEventResponse.from_dict(obj["last_asset_event"]) if obj.get("last_asset_event") is not None else None,
             "name": obj.get("name"),
             "producing_tasks": [TaskOutletAssetReference.from_dict(_item) for _item in obj["producing_tasks"]] if obj.get("producing_tasks") is not None else None,
+            "scheduled_dags": [DagScheduleAssetReference.from_dict(_item) for _item in obj["scheduled_dags"]] if obj.get("scheduled_dags") is not None else None,
             "updated_at": obj.get("updated_at"),
             "uri": obj.get("uri")
         })
diff --git a/airflow_client/client/models/backfill_response.py b/airflow_client/client/models/backfill_response.py
index 569b6e3..4f8664d 100644
--- a/airflow_client/client/models/backfill_response.py
+++ b/airflow_client/client/models/backfill_response.py
@@ -31,6 +31,7 @@
     """ # noqa: E501
     completed_at: Optional[datetime] = None
     created_at: datetime
+    dag_display_name: StrictStr
     dag_id: StrictStr
     dag_run_conf: Dict[str, Any]
     from_date: datetime
@@ -40,7 +41,7 @@
     reprocess_behavior: ReprocessBehavior
     to_date: datetime
     updated_at: datetime
-    __properties: ClassVar[List[str]] = ["completed_at", "created_at", "dag_id", "dag_run_conf", "from_date", "id", "is_paused", "max_active_runs", "reprocess_behavior", "to_date", "updated_at"]
+    __properties: ClassVar[List[str]] = ["completed_at", "created_at", "dag_display_name", "dag_id", "dag_run_conf", "from_date", "id", "is_paused", "max_active_runs", "reprocess_behavior", "to_date", "updated_at"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -95,6 +96,7 @@
         _obj = cls.model_validate({
             "completed_at": obj.get("completed_at"),
             "created_at": obj.get("created_at"),
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "dag_run_conf": obj.get("dag_run_conf"),
             "from_date": obj.get("from_date"),
diff --git a/airflow_client/client/models/bulk_body_bulk_task_instance_body.py b/airflow_client/client/models/bulk_body_bulk_task_instance_body.py
new file mode 100644
index 0000000..0119517
--- /dev/null
+++ b/airflow_client/client/models/bulk_body_bulk_task_instance_body.py
@@ -0,0 +1,95 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict
+from typing import Any, ClassVar, Dict, List
+from airflow_client.client.models.bulk_body_bulk_task_instance_body_actions_inner import BulkBodyBulkTaskInstanceBodyActionsInner
+from typing import Optional, Set
+from typing_extensions import Self
+
+class BulkBodyBulkTaskInstanceBody(BaseModel):
+    """
+    BulkBodyBulkTaskInstanceBody
+    """ # noqa: E501
+    actions: List[BulkBodyBulkTaskInstanceBodyActionsInner]
+    __properties: ClassVar[List[str]] = ["actions"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of BulkBodyBulkTaskInstanceBody from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in actions (list)
+        _items = []
+        if self.actions:
+            for _item_actions in self.actions:
+                if _item_actions:
+                    _items.append(_item_actions.to_dict())
+            _dict['actions'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of BulkBodyBulkTaskInstanceBody from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "actions": [BulkBodyBulkTaskInstanceBodyActionsInner.from_dict(_item) for _item in obj["actions"]] if obj.get("actions") is not None else None
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/bulk_body_bulk_task_instance_body_actions_inner.py b/airflow_client/client/models/bulk_body_bulk_task_instance_body_actions_inner.py
new file mode 100644
index 0000000..e33168a
--- /dev/null
+++ b/airflow_client/client/models/bulk_body_bulk_task_instance_body_actions_inner.py
@@ -0,0 +1,151 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import json
+import pprint
+from pydantic import BaseModel, ConfigDict, Field, StrictStr, ValidationError, field_validator
+from typing import Any, List, Optional
+from airflow_client.client.models.bulk_create_action_bulk_task_instance_body import BulkCreateActionBulkTaskInstanceBody
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body import BulkDeleteActionBulkTaskInstanceBody
+from airflow_client.client.models.bulk_update_action_bulk_task_instance_body import BulkUpdateActionBulkTaskInstanceBody
+from pydantic import StrictStr, Field
+from typing import Union, List, Set, Optional, Dict
+from typing_extensions import Literal, Self
+
+BULKBODYBULKTASKINSTANCEBODYACTIONSINNER_ONE_OF_SCHEMAS = ["BulkCreateActionBulkTaskInstanceBody", "BulkDeleteActionBulkTaskInstanceBody", "BulkUpdateActionBulkTaskInstanceBody"]
+
+class BulkBodyBulkTaskInstanceBodyActionsInner(BaseModel):
+    """
+    BulkBodyBulkTaskInstanceBodyActionsInner
+    """
+    # data type: BulkCreateActionBulkTaskInstanceBody
+    oneof_schema_1_validator: Optional[BulkCreateActionBulkTaskInstanceBody] = None
+    # data type: BulkUpdateActionBulkTaskInstanceBody
+    oneof_schema_2_validator: Optional[BulkUpdateActionBulkTaskInstanceBody] = None
+    # data type: BulkDeleteActionBulkTaskInstanceBody
+    oneof_schema_3_validator: Optional[BulkDeleteActionBulkTaskInstanceBody] = None
+    actual_instance: Optional[Union[BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody]] = None
+    one_of_schemas: Set[str] = { "BulkCreateActionBulkTaskInstanceBody", "BulkDeleteActionBulkTaskInstanceBody", "BulkUpdateActionBulkTaskInstanceBody" }
+
+    model_config = ConfigDict(
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def __init__(self, *args, **kwargs) -> None:
+        if args:
+            if len(args) > 1:
+                raise ValueError("If a position argument is used, only 1 is allowed to set `actual_instance`")
+            if kwargs:
+                raise ValueError("If a position argument is used, keyword arguments cannot be used.")
+            super().__init__(actual_instance=args[0])
+        else:
+            super().__init__(**kwargs)
+
+    @field_validator('actual_instance')
+    def actual_instance_must_validate_oneof(cls, v):
+        instance = BulkBodyBulkTaskInstanceBodyActionsInner.model_construct()
+        error_messages = []
+        match = 0
+        # validate data type: BulkCreateActionBulkTaskInstanceBody
+        if not isinstance(v, BulkCreateActionBulkTaskInstanceBody):
+            error_messages.append(f"Error! Input type `{type(v)}` is not `BulkCreateActionBulkTaskInstanceBody`")
+        else:
+            match += 1
+        # validate data type: BulkUpdateActionBulkTaskInstanceBody
+        if not isinstance(v, BulkUpdateActionBulkTaskInstanceBody):
+            error_messages.append(f"Error! Input type `{type(v)}` is not `BulkUpdateActionBulkTaskInstanceBody`")
+        else:
+            match += 1
+        # validate data type: BulkDeleteActionBulkTaskInstanceBody
+        if not isinstance(v, BulkDeleteActionBulkTaskInstanceBody):
+            error_messages.append(f"Error! Input type `{type(v)}` is not `BulkDeleteActionBulkTaskInstanceBody`")
+        else:
+            match += 1
+        if match > 1:
+            # more than 1 match
+            raise ValueError("Multiple matches found when setting `actual_instance` in BulkBodyBulkTaskInstanceBodyActionsInner with oneOf schemas: BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody. Details: " + ", ".join(error_messages))
+        elif match == 0:
+            # no match
+            raise ValueError("No match found when setting `actual_instance` in BulkBodyBulkTaskInstanceBodyActionsInner with oneOf schemas: BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody. Details: " + ", ".join(error_messages))
+        else:
+            return v
+
+    @classmethod
+    def from_dict(cls, obj: Union[str, Dict[str, Any]]) -> Self:
+        return cls.from_json(json.dumps(obj))
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Self:
+        """Returns the object represented by the json string"""
+        instance = cls.model_construct()
+        error_messages = []
+        match = 0
+
+        # deserialize data into BulkCreateActionBulkTaskInstanceBody
+        try:
+            instance.actual_instance = BulkCreateActionBulkTaskInstanceBody.from_json(json_str)
+            match += 1
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+        # deserialize data into BulkUpdateActionBulkTaskInstanceBody
+        try:
+            instance.actual_instance = BulkUpdateActionBulkTaskInstanceBody.from_json(json_str)
+            match += 1
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+        # deserialize data into BulkDeleteActionBulkTaskInstanceBody
+        try:
+            instance.actual_instance = BulkDeleteActionBulkTaskInstanceBody.from_json(json_str)
+            match += 1
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+
+        if match > 1:
+            # more than 1 match
+            raise ValueError("Multiple matches found when deserializing the JSON string into BulkBodyBulkTaskInstanceBodyActionsInner with oneOf schemas: BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody. Details: " + ", ".join(error_messages))
+        elif match == 0:
+            # no match
+            raise ValueError("No match found when deserializing the JSON string into BulkBodyBulkTaskInstanceBodyActionsInner with oneOf schemas: BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody. Details: " + ", ".join(error_messages))
+        else:
+            return instance
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the actual instance"""
+        if self.actual_instance is None:
+            return "null"
+
+        if hasattr(self.actual_instance, "to_json") and callable(self.actual_instance.to_json):
+            return self.actual_instance.to_json()
+        else:
+            return json.dumps(self.actual_instance)
+
+    def to_dict(self) -> Optional[Union[Dict[str, Any], BulkCreateActionBulkTaskInstanceBody, BulkDeleteActionBulkTaskInstanceBody, BulkUpdateActionBulkTaskInstanceBody]]:
+        """Returns the dict representation of the actual instance"""
+        if self.actual_instance is None:
+            return None
+
+        if hasattr(self.actual_instance, "to_dict") and callable(self.actual_instance.to_dict):
+            return self.actual_instance.to_dict()
+        else:
+            # primitive type
+            return self.actual_instance
+
+    def to_str(self) -> str:
+        """Returns the string representation of the actual instance"""
+        return pprint.pformat(self.model_dump())
+
+
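
The generated wrapper above resolves its oneOf by attempting to deserialize each schema in turn; in practice the discriminating field is the `action` literal (`create`, `update`, or `delete`, per the enum validators in the sibling action models). A rough stdlib-only sketch of that dispatch, with names of our own choosing:

```python
import json

def classify_bulk_action(json_str: str) -> str:
    """Return which oneOf branch a raw actions-inner payload would match,
    keyed on the `action` literal rather than full schema validation."""
    action = json.loads(json_str).get("action")
    if action not in ("create", "update", "delete"):
        raise ValueError(f"No oneOf match for action={action!r}")
    return action
```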
diff --git a/airflow_client/client/models/bulk_create_action_bulk_task_instance_body.py b/airflow_client/client/models/bulk_create_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..1e7f0f8
--- /dev/null
+++ b/airflow_client/client/models/bulk_create_action_bulk_task_instance_body.py
@@ -0,0 +1,107 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
+from typing import Any, ClassVar, Dict, List, Optional
+from airflow_client.client.models.bulk_action_on_existence import BulkActionOnExistence
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+from typing import Optional, Set
+from typing_extensions import Self
+
+class BulkCreateActionBulkTaskInstanceBody(BaseModel):
+    """
+    BulkCreateActionBulkTaskInstanceBody
+    """ # noqa: E501
+    action: StrictStr = Field(description="The action to be performed on the entities.")
+    action_on_existence: Optional[BulkActionOnExistence] = None
+    entities: List[BulkTaskInstanceBody] = Field(description="A list of entities to be created.")
+    __properties: ClassVar[List[str]] = ["action", "action_on_existence", "entities"]
+
+    @field_validator('action')
+    def action_validate_enum(cls, value):
+        """Validates the enum"""
+        if value not in set(['create']):
+            raise ValueError("must be one of enum values ('create')")
+        return value
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of BulkCreateActionBulkTaskInstanceBody from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of BulkCreateActionBulkTaskInstanceBody from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "action": obj.get("action"),
+            "action_on_existence": obj.get("action_on_existence"),
+            "entities": [BulkTaskInstanceBody.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body.py b/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..de80b8c
--- /dev/null
+++ b/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body.py
@@ -0,0 +1,107 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
+from typing import Any, ClassVar, Dict, List, Optional
+from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+from typing import Optional, Set
+from typing_extensions import Self
+
+class BulkDeleteActionBulkTaskInstanceBody(BaseModel):
+    """
+    BulkDeleteActionBulkTaskInstanceBody
+    """ # noqa: E501
+    action: StrictStr = Field(description="The action to be performed on the entities.")
+    action_on_non_existence: Optional[BulkActionNotOnExistence] = None
+    entities: List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner] = Field(description="A list of entity id/key or entity objects to be deleted.")
+    __properties: ClassVar[List[str]] = ["action", "action_on_non_existence", "entities"]
+
+    @field_validator('action')
+    def action_validate_enum(cls, value):
+        """Validates the enum"""
+        if value not in set(['delete']):
+            raise ValueError("must be one of enum values ('delete')")
+        return value
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of BulkDeleteActionBulkTaskInstanceBody from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of BulkDeleteActionBulkTaskInstanceBody from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "action": obj.get("action"),
+            "action_on_non_existence": obj.get("action_on_non_existence"),
+            "entities": [BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
+        })
+        return _obj
+
+
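
Taken together, the create/update/delete action models above define the request envelope for the new bulk task-instance operations. A hedged sketch of the JSON body shape they serialize to (the helper is illustrative only; per the anyOf schema, delete entities may be bare id strings or full `BulkTaskInstanceBody` objects, whose fields are defined elsewhere in this diff):

```python
import json

def bulk_delete_body(task_ids):
    """Build a bulk body containing a single delete action."""
    return {"actions": [{"action": "delete", "entities": list(task_ids)}]}

payload = json.dumps(bulk_delete_body(["t1", "t2"]))
```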
diff --git a/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body_entities_inner.py b/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body_entities_inner.py
new file mode 100644
index 0000000..f68f6db
--- /dev/null
+++ b/airflow_client/client/models/bulk_delete_action_bulk_task_instance_body_entities_inner.py
@@ -0,0 +1,136 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+from inspect import getfullargspec
+import json
+import pprint
+import re  # noqa: F401
+from pydantic import BaseModel, ConfigDict, Field, StrictStr, ValidationError, field_validator
+from typing import Optional
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+from typing import Union, Any, List, Set, TYPE_CHECKING, Optional, Dict
+from typing_extensions import Literal, Self
+from pydantic import Field
+
+BULKDELETEACTIONBULKTASKINSTANCEBODYENTITIESINNER_ANY_OF_SCHEMAS = ["BulkTaskInstanceBody", "str"]
+
+class BulkDeleteActionBulkTaskInstanceBodyEntitiesInner(BaseModel):
+    """
+    BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+    """
+
+    # data type: str
+    anyof_schema_1_validator: Optional[StrictStr] = None
+    # data type: BulkTaskInstanceBody
+    anyof_schema_2_validator: Optional[BulkTaskInstanceBody] = None
+    if TYPE_CHECKING:
+        actual_instance: Optional[Union[BulkTaskInstanceBody, str]] = None
+    else:
+        actual_instance: Any = None
+    any_of_schemas: Set[str] = { "BulkTaskInstanceBody", "str" }
+
+    model_config = {
+        "validate_assignment": True,
+        "protected_namespaces": (),
+    }
+
+    def __init__(self, *args, **kwargs) -> None:
+        if args:
+            if len(args) > 1:
+                raise ValueError("If a positional argument is used, only 1 is allowed to set `actual_instance`")
+            if kwargs:
+                raise ValueError("If a positional argument is used, keyword arguments cannot be used.")
+            super().__init__(actual_instance=args[0])
+        else:
+            super().__init__(**kwargs)
+
+    @field_validator('actual_instance')
+    def actual_instance_must_validate_anyof(cls, v):
+        instance = BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.model_construct()
+        error_messages = []
+        # validate data type: str
+        try:
+            instance.anyof_schema_1_validator = v
+            return v
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+        # validate data type: BulkTaskInstanceBody
+        if not isinstance(v, BulkTaskInstanceBody):
+            error_messages.append(f"Error! Input type `{type(v)}` is not `BulkTaskInstanceBody`")
+        else:
+            return v
+
+        if error_messages:
+            # no match
+            raise ValueError("No match found when setting the actual_instance in BulkDeleteActionBulkTaskInstanceBodyEntitiesInner with anyOf schemas: BulkTaskInstanceBody, str. Details: " + ", ".join(error_messages))
+        else:
+            return v
+
+    @classmethod
+    def from_dict(cls, obj: Dict[str, Any]) -> Self:
+        return cls.from_json(json.dumps(obj))
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Self:
+        """Returns the object represented by the json string"""
+        instance = cls.model_construct()
+        error_messages = []
+        # deserialize data into str
+        try:
+            # validation
+            instance.anyof_schema_1_validator = json.loads(json_str)
+            # assign value to actual_instance
+            instance.actual_instance = instance.anyof_schema_1_validator
+            return instance
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+        # anyof_schema_2_validator: Optional[BulkTaskInstanceBody] = None
+        try:
+            instance.actual_instance = BulkTaskInstanceBody.from_json(json_str)
+            return instance
+        except (ValidationError, ValueError) as e:
+            error_messages.append(str(e))
+
+        if error_messages:
+            # no match
+            raise ValueError("No match found when deserializing the JSON string into BulkDeleteActionBulkTaskInstanceBodyEntitiesInner with anyOf schemas: BulkTaskInstanceBody, str. Details: " + ", ".join(error_messages))
+        else:
+            return instance
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the actual instance"""
+        if self.actual_instance is None:
+            return "null"
+
+        if hasattr(self.actual_instance, "to_json") and callable(self.actual_instance.to_json):
+            return self.actual_instance.to_json()
+        else:
+            return json.dumps(self.actual_instance)
+
+    def to_dict(self) -> Optional[Union[Dict[str, Any], BulkTaskInstanceBody, str]]:
+        """Returns the dict representation of the actual instance"""
+        if self.actual_instance is None:
+            return None
+
+        if hasattr(self.actual_instance, "to_dict") and callable(self.actual_instance.to_dict):
+            return self.actual_instance.to_dict()
+        else:
+            return self.actual_instance
+
+    def to_str(self) -> str:
+        """Returns the string representation of the actual instance"""
+        return pprint.pformat(self.model_dump())
+
+
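The anyOf wrapper generated above dispatches a JSON value to either the plain-string schema or `BulkTaskInstanceBody`, trying the string branch first. A minimal stdlib-only sketch of that dispatch order (illustrative only: `parse_entity` and the `task_id` check are assumptions standing in for the generated pydantic validators):

```python
import json

def parse_entity(json_str: str):
    # Mirror the anyOf order used by from_json above: try the `str`
    # schema first, then fall back to the BulkTaskInstanceBody schema
    # (approximated here by a dict carrying a `task_id` key).
    value = json.loads(json_str)
    if isinstance(value, str):
        return value
    if isinstance(value, dict) and "task_id" in value:
        return value
    raise ValueError(
        "No match found when deserializing with anyOf schemas: "
        "BulkTaskInstanceBody, str"
    )

print(parse_entity('"extract"'))               # a bare entity key
print(parse_entity('{"task_id": "extract"}'))  # an entity object
```

Because the string branch is tried first, a JSON string never reaches the object branch, matching the generated `from_json` above.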
diff --git a/airflow_client/client/models/bulk_delete_action_connection_body.py b/airflow_client/client/models/bulk_delete_action_connection_body.py
index a8f8692..8011f00 100644
--- a/airflow_client/client/models/bulk_delete_action_connection_body.py
+++ b/airflow_client/client/models/bulk_delete_action_connection_body.py
@@ -20,6 +20,7 @@
 from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
 from typing import Any, ClassVar, Dict, List, Optional
 from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
 from typing import Optional, Set
 from typing_extensions import Self
 
@@ -29,7 +30,7 @@
     """ # noqa: E501
     action: StrictStr = Field(description="The action to be performed on the entities.")
     action_on_non_existence: Optional[BulkActionNotOnExistence] = None
-    entities: List[StrictStr] = Field(description="A list of entity id/key to be deleted.")
+    entities: List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner] = Field(description="A list of entity id/key or entity objects to be deleted.")
     __properties: ClassVar[List[str]] = ["action", "action_on_non_existence", "entities"]
 
     @field_validator('action')
@@ -78,6 +79,13 @@
             exclude=excluded_fields,
             exclude_none=True,
         )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
         return _dict
 
     @classmethod
@@ -92,7 +100,7 @@
         _obj = cls.model_validate({
             "action": obj.get("action"),
             "action_on_non_existence": obj.get("action_on_non_existence"),
-            "entities": obj.get("entities")
+            "entities": [BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
         })
         return _obj
 
diff --git a/airflow_client/client/models/bulk_delete_action_pool_body.py b/airflow_client/client/models/bulk_delete_action_pool_body.py
index b25d6a2..0ed5953 100644
--- a/airflow_client/client/models/bulk_delete_action_pool_body.py
+++ b/airflow_client/client/models/bulk_delete_action_pool_body.py
@@ -20,6 +20,7 @@
 from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
 from typing import Any, ClassVar, Dict, List, Optional
 from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
 from typing import Optional, Set
 from typing_extensions import Self
 
@@ -29,7 +30,7 @@
     """ # noqa: E501
     action: StrictStr = Field(description="The action to be performed on the entities.")
     action_on_non_existence: Optional[BulkActionNotOnExistence] = None
-    entities: List[StrictStr] = Field(description="A list of entity id/key to be deleted.")
+    entities: List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner] = Field(description="A list of entity id/key or entity objects to be deleted.")
     __properties: ClassVar[List[str]] = ["action", "action_on_non_existence", "entities"]
 
     @field_validator('action')
@@ -78,6 +79,13 @@
             exclude=excluded_fields,
             exclude_none=True,
         )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
         return _dict
 
     @classmethod
@@ -92,7 +100,7 @@
         _obj = cls.model_validate({
             "action": obj.get("action"),
             "action_on_non_existence": obj.get("action_on_non_existence"),
-            "entities": obj.get("entities")
+            "entities": [BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
         })
         return _obj
 
diff --git a/airflow_client/client/models/bulk_delete_action_variable_body.py b/airflow_client/client/models/bulk_delete_action_variable_body.py
index 358529d..e7c3503 100644
--- a/airflow_client/client/models/bulk_delete_action_variable_body.py
+++ b/airflow_client/client/models/bulk_delete_action_variable_body.py
@@ -20,6 +20,7 @@
 from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
 from typing import Any, ClassVar, Dict, List, Optional
 from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
 from typing import Optional, Set
 from typing_extensions import Self
 
@@ -29,7 +30,7 @@
     """ # noqa: E501
     action: StrictStr = Field(description="The action to be performed on the entities.")
     action_on_non_existence: Optional[BulkActionNotOnExistence] = None
-    entities: List[StrictStr] = Field(description="A list of entity id/key to be deleted.")
+    entities: List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner] = Field(description="A list of entity id/key or entity objects to be deleted.")
     __properties: ClassVar[List[str]] = ["action", "action_on_non_existence", "entities"]
 
     @field_validator('action')
@@ -78,6 +79,13 @@
             exclude=excluded_fields,
             exclude_none=True,
         )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
         return _dict
 
     @classmethod
@@ -92,7 +100,7 @@
         _obj = cls.model_validate({
             "action": obj.get("action"),
             "action_on_non_existence": obj.get("action_on_non_existence"),
-            "entities": obj.get("entities")
+            "entities": [BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
         })
         return _obj
 
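With the `entities` field widened in the three bulk delete bodies above, a delete request may now mix plain id/keys with entity objects in one list. An illustrative request payload (field names follow the models in this diff; the concrete values are made up):

```python
import json

# Hypothetical bulk delete payload mixing both anyOf branches of the
# widened `entities` field: plain keys and entity objects.
payload = {
    "action": "delete",
    "entities": [
        "my_pool",                            # str branch: an entity id/key
        {"task_id": "load", "map_index": 0},  # object branch
    ],
}
body = json.dumps(payload)
print(body)
```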
diff --git a/airflow_client/client/models/bulk_task_instance_body.py b/airflow_client/client/models/bulk_task_instance_body.py
new file mode 100644
index 0000000..c3126cf
--- /dev/null
+++ b/airflow_client/client/models/bulk_task_instance_body.py
@@ -0,0 +1,103 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictInt, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional
+from typing_extensions import Annotated
+from airflow_client.client.models.task_instance_state import TaskInstanceState
+from typing import Optional, Set
+from typing_extensions import Self
+
+class BulkTaskInstanceBody(BaseModel):
+    """
+    Request body for bulk update and delete of task instances.
+    """ # noqa: E501
+    include_downstream: Optional[StrictBool] = False
+    include_future: Optional[StrictBool] = False
+    include_past: Optional[StrictBool] = False
+    include_upstream: Optional[StrictBool] = False
+    map_index: Optional[StrictInt] = None
+    new_state: Optional[TaskInstanceState] = None
+    note: Optional[Annotated[str, Field(strict=True, max_length=1000)]] = None
+    task_id: StrictStr
+    __properties: ClassVar[List[str]] = ["include_downstream", "include_future", "include_past", "include_upstream", "map_index", "new_state", "note", "task_id"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of BulkTaskInstanceBody from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of BulkTaskInstanceBody from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "include_downstream": obj.get("include_downstream") if obj.get("include_downstream") is not None else False,
+            "include_future": obj.get("include_future") if obj.get("include_future") is not None else False,
+            "include_past": obj.get("include_past") if obj.get("include_past") is not None else False,
+            "include_upstream": obj.get("include_upstream") if obj.get("include_upstream") is not None else False,
+            "map_index": obj.get("map_index"),
+            "new_state": obj.get("new_state"),
+            "note": obj.get("note"),
+            "task_id": obj.get("task_id")
+        })
+        return _obj
+
+
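`BulkTaskInstanceBody.to_dict()` above serializes with `exclude_none=True`, so unset optional fields disappear from the payload entirely. A stdlib-only approximation of that behavior (illustrative; the real model additionally applies field aliases via pydantic):

```python
def to_dict_sketch(fields: dict) -> dict:
    # Approximate the generated to_dict(): drop keys whose value is None,
    # as pydantic's model_dump(exclude_none=True) does.
    return {k: v for k, v in fields.items() if v is not None}

body = to_dict_sketch({
    "task_id": "transform",
    "new_state": "success",
    "map_index": None,  # omitted from the serialized payload
    "note": None,       # omitted from the serialized payload
})
print(body)
```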
diff --git a/airflow_client/client/models/bulk_update_action_bulk_task_instance_body.py b/airflow_client/client/models/bulk_update_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..4d547e6
--- /dev/null
+++ b/airflow_client/client/models/bulk_update_action_bulk_task_instance_body.py
@@ -0,0 +1,107 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, Field, StrictStr, field_validator
+from typing import Any, ClassVar, Dict, List, Optional
+from airflow_client.client.models.bulk_action_not_on_existence import BulkActionNotOnExistence
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+from typing import Optional, Set
+from typing_extensions import Self
+
+class BulkUpdateActionBulkTaskInstanceBody(BaseModel):
+    """
+    BulkUpdateActionBulkTaskInstanceBody
+    """ # noqa: E501
+    action: StrictStr = Field(description="The action to be performed on the entities.")
+    action_on_non_existence: Optional[BulkActionNotOnExistence] = None
+    entities: List[BulkTaskInstanceBody] = Field(description="A list of entities to be updated.")
+    __properties: ClassVar[List[str]] = ["action", "action_on_non_existence", "entities"]
+
+    @field_validator('action')
+    def action_validate_enum(cls, value):
+        """Validates the enum"""
+        if value not in set(['update']):
+            raise ValueError("must be one of enum values ('update')")
+        return value
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of BulkUpdateActionBulkTaskInstanceBody from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in entities (list)
+        _items = []
+        if self.entities:
+            for _item_entities in self.entities:
+                if _item_entities:
+                    _items.append(_item_entities.to_dict())
+            _dict['entities'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of BulkUpdateActionBulkTaskInstanceBody from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "action": obj.get("action"),
+            "action_on_non_existence": obj.get("action_on_non_existence"),
+            "entities": [BulkTaskInstanceBody.from_dict(_item) for _item in obj["entities"]] if obj.get("entities") is not None else None
+        })
+        return _obj
+
+
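The `action` field_validator above pins the action to the single enum value `'update'`. Stripped of the pydantic wiring, the check reduces to (sketch of the generated validator, not the library's own API):

```python
def validate_action(value: str) -> str:
    # Same rule as the field_validator in BulkUpdateActionBulkTaskInstanceBody:
    # only the literal 'update' is accepted.
    if value not in {"update"}:
        raise ValueError("must be one of enum values ('update')")
    return value

print(validate_action("update"))
```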
diff --git a/airflow_client/client/models/clear_task_instances_body.py b/airflow_client/client/models/clear_task_instances_body.py
index 3eca008..d605c84 100644
--- a/airflow_client/client/models/clear_task_instances_body.py
+++ b/airflow_client/client/models/clear_task_instances_body.py
@@ -18,7 +18,7 @@
 import json
 
 from datetime import datetime
-from pydantic import BaseModel, ConfigDict, StrictBool, StrictStr
+from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictStr
 from typing import Any, ClassVar, Dict, List, Optional
 from airflow_client.client.models.clear_task_instances_body_task_ids_inner import ClearTaskInstancesBodyTaskIdsInner
 from typing import Optional, Set
@@ -38,9 +38,10 @@
     only_failed: Optional[StrictBool] = True
     only_running: Optional[StrictBool] = False
     reset_dag_runs: Optional[StrictBool] = True
+    run_on_latest_version: Optional[StrictBool] = Field(default=False, description="(Experimental) Run on the latest bundle version of the dag after clearing the task instances.")
     start_date: Optional[datetime] = None
     task_ids: Optional[List[ClearTaskInstancesBodyTaskIdsInner]] = None
-    __properties: ClassVar[List[str]] = ["dag_run_id", "dry_run", "end_date", "include_downstream", "include_future", "include_past", "include_upstream", "only_failed", "only_running", "reset_dag_runs", "start_date", "task_ids"]
+    __properties: ClassVar[List[str]] = ["dag_run_id", "dry_run", "end_date", "include_downstream", "include_future", "include_past", "include_upstream", "only_failed", "only_running", "reset_dag_runs", "run_on_latest_version", "start_date", "task_ids"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -110,6 +111,7 @@
             "only_failed": obj.get("only_failed") if obj.get("only_failed") is not None else True,
             "only_running": obj.get("only_running") if obj.get("only_running") is not None else False,
             "reset_dag_runs": obj.get("reset_dag_runs") if obj.get("reset_dag_runs") is not None else True,
+            "run_on_latest_version": obj.get("run_on_latest_version") if obj.get("run_on_latest_version") is not None else False,
             "start_date": obj.get("start_date"),
             "task_ids": [ClearTaskInstancesBodyTaskIdsInner.from_dict(_item) for _item in obj["task_ids"]] if obj.get("task_ids") is not None else None
         })
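The new `run_on_latest_version` flag above defaults to `False`, and `from_dict` fills that default whenever the key is missing or null. A request body opting into the flag might look like this (illustrative values; the flag is marked experimental in the schema):

```python
import json

# Hypothetical clear-task-instances request opting into the experimental
# run_on_latest_version behavior added in this diff.
body = json.dumps({
    "dag_run_id": "manual__2025-01-01T00:00:00",
    "only_failed": True,
    "run_on_latest_version": True,
})
print(json.loads(body)["run_on_latest_version"])
```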
diff --git a/airflow_client/client/models/dag_details_response.py b/airflow_client/client/models/dag_details_response.py
index 8c4bef7..ec885e0 100644
--- a/airflow_client/client/models/dag_details_response.py
+++ b/airflow_client/client/models/dag_details_response.py
@@ -18,8 +18,8 @@
 import json
 
 from datetime import datetime
-from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictInt, StrictStr
-from typing import Any, ClassVar, Dict, List, Optional
+from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictFloat, StrictInt, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional, Union
 from airflow_client.client.models.dag_tag_response import DagTagResponse
 from airflow_client.client.models.dag_version_response import DagVersionResponse
 from typing import Optional, Set
@@ -33,10 +33,11 @@
     bundle_name: Optional[StrictStr] = None
     bundle_version: Optional[StrictStr] = None
     catchup: StrictBool
-    concurrency: StrictInt = Field(description="Return max_active_tasks as concurrency.")
+    concurrency: StrictInt = Field(description="Return max_active_tasks as concurrency. Deprecated: Use max_active_tasks instead.")
     dag_display_name: StrictStr
     dag_id: StrictStr
     dag_run_timeout: Optional[StrictStr] = None
+    default_args: Optional[Dict[str, Any]] = None
     description: Optional[StrictStr] = None
     doc_md: Optional[StrictStr] = None
     end_date: Optional[datetime] = None
@@ -48,6 +49,7 @@
     is_paused_upon_creation: Optional[StrictBool] = None
     is_stale: StrictBool
     last_expired: Optional[datetime] = None
+    last_parse_duration: Optional[Union[StrictFloat, StrictInt]] = None
     last_parsed: Optional[datetime] = None
     last_parsed_time: Optional[datetime] = None
     latest_dag_version: Optional[DagVersionResponse] = None
@@ -69,7 +71,7 @@
     timetable_description: Optional[StrictStr] = None
     timetable_summary: Optional[StrictStr] = None
     timezone: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["asset_expression", "bundle_name", "bundle_version", "catchup", "concurrency", "dag_display_name", "dag_id", "dag_run_timeout", "description", "doc_md", "end_date", "file_token", "fileloc", "has_import_errors", "has_task_concurrency_limits", "is_paused", "is_paused_upon_creation", "is_stale", "last_expired", "last_parsed", "last_parsed_time", "latest_dag_version", "max_active_runs", "max_active_tasks", "max_consecutive_failed_dag_runs", "next_dagrun_data_interval_end", "next_dagrun_data_interval_start", "next_dagrun_logical_date", "next_dagrun_run_after", "owner_links", "owners", "params", "relative_fileloc", "render_template_as_native_obj", "start_date", "tags", "template_search_path", "timetable_description", "timetable_summary", "timezone"]
+    __properties: ClassVar[List[str]] = ["asset_expression", "bundle_name", "bundle_version", "catchup", "concurrency", "dag_display_name", "dag_id", "dag_run_timeout", "default_args", "description", "doc_md", "end_date", "file_token", "fileloc", "has_import_errors", "has_task_concurrency_limits", "is_paused", "is_paused_upon_creation", "is_stale", "last_expired", "last_parse_duration", "last_parsed", "last_parsed_time", "latest_dag_version", "max_active_runs", "max_active_tasks", "max_consecutive_failed_dag_runs", "next_dagrun_data_interval_end", "next_dagrun_data_interval_start", "next_dagrun_logical_date", "next_dagrun_run_after", "owner_links", "owners", "params", "relative_fileloc", "render_template_as_native_obj", "start_date", "tags", "template_search_path", "timetable_description", "timetable_summary", "timezone"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -144,6 +146,7 @@
             "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "dag_run_timeout": obj.get("dag_run_timeout"),
+            "default_args": obj.get("default_args"),
             "description": obj.get("description"),
             "doc_md": obj.get("doc_md"),
             "end_date": obj.get("end_date"),
@@ -155,6 +158,7 @@
             "is_paused_upon_creation": obj.get("is_paused_upon_creation"),
             "is_stale": obj.get("is_stale"),
             "last_expired": obj.get("last_expired"),
+            "last_parse_duration": obj.get("last_parse_duration"),
             "last_parsed": obj.get("last_parsed"),
             "last_parsed_time": obj.get("last_parsed_time"),
             "latest_dag_version": DagVersionResponse.from_dict(obj["latest_dag_version"]) if obj.get("latest_dag_version") is not None else None,
diff --git a/airflow_client/client/models/dag_response.py b/airflow_client/client/models/dag_response.py
index 04c51ed..f1e0bd2 100644
--- a/airflow_client/client/models/dag_response.py
+++ b/airflow_client/client/models/dag_response.py
@@ -18,8 +18,8 @@
 import json
 
 from datetime import datetime
-from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictInt, StrictStr
-from typing import Any, ClassVar, Dict, List, Optional
+from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictFloat, StrictInt, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional, Union
 from airflow_client.client.models.dag_tag_response import DagTagResponse
 from typing import Optional, Set
 from typing_extensions import Self
@@ -40,6 +40,7 @@
     is_paused: StrictBool
     is_stale: StrictBool
     last_expired: Optional[datetime] = None
+    last_parse_duration: Optional[Union[StrictFloat, StrictInt]] = None
     last_parsed_time: Optional[datetime] = None
     max_active_runs: Optional[StrictInt] = None
     max_active_tasks: StrictInt
@@ -53,7 +54,7 @@
     tags: List[DagTagResponse]
     timetable_description: Optional[StrictStr] = None
     timetable_summary: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["bundle_name", "bundle_version", "dag_display_name", "dag_id", "description", "file_token", "fileloc", "has_import_errors", "has_task_concurrency_limits", "is_paused", "is_stale", "last_expired", "last_parsed_time", "max_active_runs", "max_active_tasks", "max_consecutive_failed_dag_runs", "next_dagrun_data_interval_end", "next_dagrun_data_interval_start", "next_dagrun_logical_date", "next_dagrun_run_after", "owners", "relative_fileloc", "tags", "timetable_description", "timetable_summary"]
+    __properties: ClassVar[List[str]] = ["bundle_name", "bundle_version", "dag_display_name", "dag_id", "description", "file_token", "fileloc", "has_import_errors", "has_task_concurrency_limits", "is_paused", "is_stale", "last_expired", "last_parse_duration", "last_parsed_time", "max_active_runs", "max_active_tasks", "max_consecutive_failed_dag_runs", "next_dagrun_data_interval_end", "next_dagrun_data_interval_start", "next_dagrun_logical_date", "next_dagrun_run_after", "owners", "relative_fileloc", "tags", "timetable_description", "timetable_summary"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -127,6 +128,7 @@
             "is_paused": obj.get("is_paused"),
             "is_stale": obj.get("is_stale"),
             "last_expired": obj.get("last_expired"),
+            "last_parse_duration": obj.get("last_parse_duration"),
             "last_parsed_time": obj.get("last_parsed_time"),
             "max_active_runs": obj.get("max_active_runs"),
             "max_active_tasks": obj.get("max_active_tasks"),
diff --git a/airflow_client/client/models/dag_run_clear_body.py b/airflow_client/client/models/dag_run_clear_body.py
index 98b4b4f..d415920 100644
--- a/airflow_client/client/models/dag_run_clear_body.py
+++ b/airflow_client/client/models/dag_run_clear_body.py
@@ -17,7 +17,7 @@
 import re  # noqa: F401
 import json
 
-from pydantic import BaseModel, ConfigDict, StrictBool
+from pydantic import BaseModel, ConfigDict, Field, StrictBool
 from typing import Any, ClassVar, Dict, List, Optional
 from typing import Optional, Set
 from typing_extensions import Self
@@ -28,7 +28,8 @@
     """ # noqa: E501
     dry_run: Optional[StrictBool] = True
     only_failed: Optional[StrictBool] = False
-    __properties: ClassVar[List[str]] = ["dry_run", "only_failed"]
+    run_on_latest_version: Optional[StrictBool] = Field(default=False, description="(Experimental) Run on the latest bundle version of the Dag after clearing the Dag Run.")
+    __properties: ClassVar[List[str]] = ["dry_run", "only_failed", "run_on_latest_version"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -82,7 +83,8 @@
 
         _obj = cls.model_validate({
             "dry_run": obj.get("dry_run") if obj.get("dry_run") is not None else True,
-            "only_failed": obj.get("only_failed") if obj.get("only_failed") is not None else False
+            "only_failed": obj.get("only_failed") if obj.get("only_failed") is not None else False,
+            "run_on_latest_version": obj.get("run_on_latest_version") if obj.get("run_on_latest_version") is not None else False
         })
         return _obj
 
diff --git a/airflow_client/client/models/dag_run_response.py b/airflow_client/client/models/dag_run_response.py
index 3a6fbd4..ecf2b54 100644
--- a/airflow_client/client/models/dag_run_response.py
+++ b/airflow_client/client/models/dag_run_response.py
@@ -18,8 +18,8 @@
 import json
 
 from datetime import datetime
-from pydantic import BaseModel, ConfigDict, StrictStr
-from typing import Any, ClassVar, Dict, List, Optional
+from pydantic import BaseModel, ConfigDict, StrictFloat, StrictInt, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional, Union
 from airflow_client.client.models.dag_run_state import DagRunState
 from airflow_client.client.models.dag_run_triggered_by_type import DagRunTriggeredByType
 from airflow_client.client.models.dag_run_type import DagRunType
@@ -33,11 +33,13 @@
     """ # noqa: E501
     bundle_version: Optional[StrictStr] = None
     conf: Optional[Dict[str, Any]] = None
+    dag_display_name: StrictStr
     dag_id: StrictStr
     dag_run_id: StrictStr
     dag_versions: List[DagVersionResponse]
     data_interval_end: Optional[datetime] = None
     data_interval_start: Optional[datetime] = None
+    duration: Optional[Union[StrictFloat, StrictInt]] = None
     end_date: Optional[datetime] = None
     last_scheduling_decision: Optional[datetime] = None
     logical_date: Optional[datetime] = None
@@ -48,7 +50,8 @@
     start_date: Optional[datetime] = None
     state: DagRunState
     triggered_by: Optional[DagRunTriggeredByType] = None
-    __properties: ClassVar[List[str]] = ["bundle_version", "conf", "dag_id", "dag_run_id", "dag_versions", "data_interval_end", "data_interval_start", "end_date", "last_scheduling_decision", "logical_date", "note", "queued_at", "run_after", "run_type", "start_date", "state", "triggered_by"]
+    triggering_user_name: Optional[StrictStr] = None
+    __properties: ClassVar[List[str]] = ["bundle_version", "conf", "dag_display_name", "dag_id", "dag_run_id", "dag_versions", "data_interval_end", "data_interval_start", "duration", "end_date", "last_scheduling_decision", "logical_date", "note", "queued_at", "run_after", "run_type", "start_date", "state", "triggered_by", "triggering_user_name"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -110,11 +113,13 @@
         _obj = cls.model_validate({
             "bundle_version": obj.get("bundle_version"),
             "conf": obj.get("conf"),
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "dag_run_id": obj.get("dag_run_id"),
             "dag_versions": [DagVersionResponse.from_dict(_item) for _item in obj["dag_versions"]] if obj.get("dag_versions") is not None else None,
             "data_interval_end": obj.get("data_interval_end"),
             "data_interval_start": obj.get("data_interval_start"),
+            "duration": obj.get("duration"),
             "end_date": obj.get("end_date"),
             "last_scheduling_decision": obj.get("last_scheduling_decision"),
             "logical_date": obj.get("logical_date"),
@@ -124,7 +129,8 @@
             "run_type": obj.get("run_type"),
             "start_date": obj.get("start_date"),
             "state": obj.get("state"),
-            "triggered_by": obj.get("triggered_by")
+            "triggered_by": obj.get("triggered_by"),
+            "triggering_user_name": obj.get("triggering_user_name")
         })
         return _obj
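With the changes above, a DAG run payload from a 3.1.0rc1 server additionally carries `dag_display_name`, `duration`, and `triggering_user_name`. A minimal stdlib sketch of the wire shape (values are illustrative, not from a live server, and only the touched fields are shown):

```python
import json

# Illustrative subset of a DagRunResponse payload.
sample = json.loads("""
{
    "dag_display_name": "Example DAG",
    "dag_id": "example_dag",
    "dag_run_id": "manual__2025-01-01T00:00:00+00:00",
    "duration": 12.5,
    "state": "success",
    "triggering_user_name": "admin"
}
""")

# duration is float-or-int seconds; triggering_user_name is nullable.
assert isinstance(sample["duration"], (int, float))
assert sample["triggering_user_name"] == "admin"
```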
 
diff --git a/airflow_client/client/models/dag_runs_batch_body.py b/airflow_client/client/models/dag_runs_batch_body.py
index b0ff1eb..4d43680 100644
--- a/airflow_client/client/models/dag_runs_batch_body.py
+++ b/airflow_client/client/models/dag_runs_batch_body.py
@@ -30,19 +30,27 @@
     List DAG Runs body for batch endpoint.
     """ # noqa: E501
     dag_ids: Optional[List[StrictStr]] = None
+    end_date_gt: Optional[datetime] = None
     end_date_gte: Optional[datetime] = None
+    end_date_lt: Optional[datetime] = None
     end_date_lte: Optional[datetime] = None
+    logical_date_gt: Optional[datetime] = None
     logical_date_gte: Optional[datetime] = None
+    logical_date_lt: Optional[datetime] = None
     logical_date_lte: Optional[datetime] = None
     order_by: Optional[StrictStr] = None
     page_limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = 100
     page_offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = 0
+    run_after_gt: Optional[datetime] = None
     run_after_gte: Optional[datetime] = None
+    run_after_lt: Optional[datetime] = None
     run_after_lte: Optional[datetime] = None
+    start_date_gt: Optional[datetime] = None
     start_date_gte: Optional[datetime] = None
+    start_date_lt: Optional[datetime] = None
     start_date_lte: Optional[datetime] = None
     states: Optional[List[Optional[DagRunState]]] = None
-    __properties: ClassVar[List[str]] = ["dag_ids", "end_date_gte", "end_date_lte", "logical_date_gte", "logical_date_lte", "order_by", "page_limit", "page_offset", "run_after_gte", "run_after_lte", "start_date_gte", "start_date_lte", "states"]
+    __properties: ClassVar[List[str]] = ["dag_ids", "end_date_gt", "end_date_gte", "end_date_lt", "end_date_lte", "logical_date_gt", "logical_date_gte", "logical_date_lt", "logical_date_lte", "order_by", "page_limit", "page_offset", "run_after_gt", "run_after_gte", "run_after_lt", "run_after_lte", "start_date_gt", "start_date_gte", "start_date_lt", "start_date_lte", "states"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -96,16 +104,24 @@
 
         _obj = cls.model_validate({
             "dag_ids": obj.get("dag_ids"),
+            "end_date_gt": obj.get("end_date_gt"),
             "end_date_gte": obj.get("end_date_gte"),
+            "end_date_lt": obj.get("end_date_lt"),
             "end_date_lte": obj.get("end_date_lte"),
+            "logical_date_gt": obj.get("logical_date_gt"),
             "logical_date_gte": obj.get("logical_date_gte"),
+            "logical_date_lt": obj.get("logical_date_lt"),
             "logical_date_lte": obj.get("logical_date_lte"),
             "order_by": obj.get("order_by"),
             "page_limit": obj.get("page_limit") if obj.get("page_limit") is not None else 100,
             "page_offset": obj.get("page_offset") if obj.get("page_offset") is not None else 0,
+            "run_after_gt": obj.get("run_after_gt"),
             "run_after_gte": obj.get("run_after_gte"),
+            "run_after_lt": obj.get("run_after_lt"),
             "run_after_lte": obj.get("run_after_lte"),
+            "start_date_gt": obj.get("start_date_gt"),
             "start_date_gte": obj.get("start_date_gte"),
+            "start_date_lt": obj.get("start_date_lt"),
             "start_date_lte": obj.get("start_date_lte"),
             "states": obj.get("states")
         })
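The batch body now accepts strict bounds (`_gt`/`_lt`) alongside the existing inclusive ones (`_gte`/`_lte`), so half-open date windows can be expressed directly. A hedged stdlib sketch of such a request payload (values illustrative):

```python
import json
from datetime import datetime, timezone

# Half-open window: runs starting on or after Jan 1, strictly before Feb 1.
payload = {
    "dag_ids": ["example_dag"],
    "start_date_gte": datetime(2025, 1, 1, tzinfo=timezone.utc).isoformat(),
    "start_date_lt": datetime(2025, 2, 1, tzinfo=timezone.utc).isoformat(),
    "order_by": "start_date",
    "page_limit": 100,
}
body = json.dumps(payload)
assert "start_date_lt" in json.loads(body)
```

Mixing an inclusive lower bound with a strict upper bound as above avoids double-counting a run whose start date falls exactly on a window boundary when paging through consecutive windows.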
diff --git a/airflow_client/client/models/dag_source_response.py b/airflow_client/client/models/dag_source_response.py
index a0ee01f..a93f3cc 100644
--- a/airflow_client/client/models/dag_source_response.py
+++ b/airflow_client/client/models/dag_source_response.py
@@ -27,9 +27,10 @@
     DAG Source serializer for responses.
     """ # noqa: E501
     content: Optional[StrictStr] = None
+    dag_display_name: StrictStr
     dag_id: StrictStr
     version_number: Optional[StrictInt] = None
-    __properties: ClassVar[List[str]] = ["content", "dag_id", "version_number"]
+    __properties: ClassVar[List[str]] = ["content", "dag_display_name", "dag_id", "version_number"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -83,6 +84,7 @@
 
         _obj = cls.model_validate({
             "content": obj.get("content"),
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "version_number": obj.get("version_number")
         })
diff --git a/airflow_client/client/models/dag_stats_response.py b/airflow_client/client/models/dag_stats_response.py
index 8710c64..3fcc1b9 100644
--- a/airflow_client/client/models/dag_stats_response.py
+++ b/airflow_client/client/models/dag_stats_response.py
@@ -27,9 +27,10 @@
     """
     DAG Stats serializer for responses.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     stats: List[DagStatsStateResponse]
-    __properties: ClassVar[List[str]] = ["dag_id", "stats"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "stats"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -89,6 +90,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "stats": [DagStatsStateResponse.from_dict(_item) for _item in obj["stats"]] if obj.get("stats") is not None else None
         })
diff --git a/airflow_client/client/models/dag_tag_response.py b/airflow_client/client/models/dag_tag_response.py
index cf8eb2b..0079203 100644
--- a/airflow_client/client/models/dag_tag_response.py
+++ b/airflow_client/client/models/dag_tag_response.py
@@ -26,9 +26,10 @@
     """
     DAG Tag serializer for responses.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     name: StrictStr
-    __properties: ClassVar[List[str]] = ["dag_id", "name"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "name"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -81,6 +82,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "name": obj.get("name")
         })
diff --git a/airflow_client/client/models/dag_version_response.py b/airflow_client/client/models/dag_version_response.py
index d9ad70b..b9ed86d 100644
--- a/airflow_client/client/models/dag_version_response.py
+++ b/airflow_client/client/models/dag_version_response.py
@@ -31,10 +31,11 @@
     bundle_url: Optional[StrictStr] = None
     bundle_version: Optional[StrictStr] = None
     created_at: datetime
+    dag_display_name: StrictStr
     dag_id: StrictStr
     id: StrictStr
     version_number: StrictInt
-    __properties: ClassVar[List[str]] = ["bundle_name", "bundle_url", "bundle_version", "created_at", "dag_id", "id", "version_number"]
+    __properties: ClassVar[List[str]] = ["bundle_name", "bundle_url", "bundle_version", "created_at", "dag_display_name", "dag_id", "id", "version_number"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -91,6 +92,7 @@
             "bundle_url": obj.get("bundle_url"),
             "bundle_version": obj.get("bundle_version"),
             "created_at": obj.get("created_at"),
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "id": obj.get("id"),
             "version_number": obj.get("version_number")
diff --git a/airflow_client/client/models/dag_warning_response.py b/airflow_client/client/models/dag_warning_response.py
index 1954643..9d7bc57 100644
--- a/airflow_client/client/models/dag_warning_response.py
+++ b/airflow_client/client/models/dag_warning_response.py
@@ -28,11 +28,12 @@
     """
     DAG Warning serializer for responses.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     message: StrictStr
     timestamp: datetime
     warning_type: DagWarningType
-    __properties: ClassVar[List[str]] = ["dag_id", "message", "timestamp", "warning_type"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "message", "timestamp", "warning_type"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -85,6 +86,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "message": obj.get("message"),
             "timestamp": obj.get("timestamp"),
diff --git a/airflow_client/client/models/event_log_response.py b/airflow_client/client/models/event_log_response.py
index e7cb2bb..d864db2 100644
--- a/airflow_client/client/models/event_log_response.py
+++ b/airflow_client/client/models/event_log_response.py
@@ -27,6 +27,7 @@
     """
     Event Log Response.
     """ # noqa: E501
+    dag_display_name: Optional[StrictStr] = None
     dag_id: Optional[StrictStr] = None
     event: StrictStr
     event_log_id: StrictInt
@@ -38,7 +39,7 @@
     task_id: Optional[StrictStr] = None
     try_number: Optional[StrictInt] = None
     when: datetime
-    __properties: ClassVar[List[str]] = ["dag_id", "event", "event_log_id", "extra", "logical_date", "map_index", "owner", "run_id", "task_id", "try_number", "when"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "event", "event_log_id", "extra", "logical_date", "map_index", "owner", "run_id", "task_id", "try_number", "when"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -91,6 +92,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "event": obj.get("event"),
             "event_log_id": obj.get("event_log_id"),
diff --git a/airflow_client/client/models/external_log_url_response.py b/airflow_client/client/models/external_log_url_response.py
new file mode 100644
index 0000000..3b903a6
--- /dev/null
+++ b/airflow_client/client/models/external_log_url_response.py
@@ -0,0 +1,87 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictStr
+from typing import Any, ClassVar, Dict, List
+from typing import Optional, Set
+from typing_extensions import Self
+
+class ExternalLogUrlResponse(BaseModel):
+    """
+    Response for the external log URL endpoint.
+    """ # noqa: E501
+    url: StrictStr
+    __properties: ClassVar[List[str]] = ["url"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of ExternalLogUrlResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of ExternalLogUrlResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "url": obj.get("url")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/external_view_response.py b/airflow_client/client/models/external_view_response.py
new file mode 100644
index 0000000..7ae8e81
--- /dev/null
+++ b/airflow_client/client/models/external_view_response.py
@@ -0,0 +1,109 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictStr, field_validator
+from typing import Any, ClassVar, Dict, List, Optional
+from typing import Optional, Set
+from typing_extensions import Self
+
+class ExternalViewResponse(BaseModel):
+    """
+    Serializer for External View Plugin responses.
+    """ # noqa: E501
+    category: Optional[StrictStr] = None
+    destination: Optional[StrictStr] = 'nav'
+    href: StrictStr
+    icon: Optional[StrictStr] = None
+    icon_dark_mode: Optional[StrictStr] = None
+    name: StrictStr
+    url_route: Optional[StrictStr] = None
+    __properties: ClassVar[List[str]] = ["category", "destination", "href", "icon", "icon_dark_mode", "name", "url_route"]
+
+    @field_validator('destination')
+    def destination_validate_enum(cls, value):
+        """Validates the enum"""
+        if value is None:
+            return value
+
+        if value not in set(['nav', 'dag', 'dag_run', 'task', 'task_instance']):
+            raise ValueError("must be one of enum values ('nav', 'dag', 'dag_run', 'task', 'task_instance')")
+        return value
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of ExternalViewResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of ExternalViewResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "category": obj.get("category"),
+            "destination": obj.get("destination") if obj.get("destination") is not None else 'nav',
+            "href": obj.get("href"),
+            "icon": obj.get("icon"),
+            "icon_dark_mode": obj.get("icon_dark_mode"),
+            "name": obj.get("name"),
+            "url_route": obj.get("url_route")
+        })
+        return _obj
+
+
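`ExternalViewResponse` constrains `destination` with a pydantic field validator; the check reduces to a plain set-membership test with a `None` pass-through (since the field defaults to `'nav'`). A standalone sketch of that logic (helper name hypothetical):

```python
ALLOWED_DESTINATIONS = {"nav", "dag", "dag_run", "task", "task_instance"}

def validate_destination(value):
    """Mirror the generated destination_validate_enum check."""
    if value is None:
        return value  # optional field; the model default 'nav' applies
    if value not in ALLOWED_DESTINATIONS:
        raise ValueError(f"must be one of enum values {sorted(ALLOWED_DESTINATIONS)}")
    return value

assert validate_destination("dag_run") == "dag_run"
assert validate_destination(None) is None
try:
    validate_destination("sidebar")
except ValueError:
    pass
else:
    raise AssertionError("invalid destination should be rejected")
```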
diff --git a/airflow_client/client/models/hitl_detail.py b/airflow_client/client/models/hitl_detail.py
new file mode 100644
index 0000000..de3f73f
--- /dev/null
+++ b/airflow_client/client/models/hitl_detail.py
@@ -0,0 +1,130 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from datetime import datetime
+from pydantic import BaseModel, ConfigDict, Field, StrictBool, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional
+from typing_extensions import Annotated
+from airflow_client.client.models.hitl_user import HITLUser
+from airflow_client.client.models.task_instance_response import TaskInstanceResponse
+from typing import Optional, Set
+from typing_extensions import Self
+
+class HITLDetail(BaseModel):
+    """
+    Schema for Human-in-the-loop detail.
+    """ # noqa: E501
+    assigned_users: Optional[List[HITLUser]] = None
+    body: Optional[StrictStr] = None
+    chosen_options: Optional[List[StrictStr]] = None
+    created_at: datetime
+    defaults: Optional[List[StrictStr]] = None
+    multiple: Optional[StrictBool] = False
+    options: Annotated[List[StrictStr], Field(min_length=1)]
+    params: Optional[Dict[str, Any]] = None
+    params_input: Optional[Dict[str, Any]] = None
+    responded_at: Optional[datetime] = None
+    responded_by_user: Optional[HITLUser] = None
+    response_received: Optional[StrictBool] = False
+    subject: StrictStr
+    task_instance: TaskInstanceResponse
+    __properties: ClassVar[List[str]] = ["assigned_users", "body", "chosen_options", "created_at", "defaults", "multiple", "options", "params", "params_input", "responded_at", "responded_by_user", "response_received", "subject", "task_instance"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of HITLDetail from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in assigned_users (list)
+        _items = []
+        if self.assigned_users:
+            for _item_assigned_users in self.assigned_users:
+                if _item_assigned_users:
+                    _items.append(_item_assigned_users.to_dict())
+            _dict['assigned_users'] = _items
+        # override the default output from pydantic by calling `to_dict()` of responded_by_user
+        if self.responded_by_user:
+            _dict['responded_by_user'] = self.responded_by_user.to_dict()
+        # override the default output from pydantic by calling `to_dict()` of task_instance
+        if self.task_instance:
+            _dict['task_instance'] = self.task_instance.to_dict()
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of HITLDetail from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "assigned_users": [HITLUser.from_dict(_item) for _item in obj["assigned_users"]] if obj.get("assigned_users") is not None else None,
+            "body": obj.get("body"),
+            "chosen_options": obj.get("chosen_options"),
+            "created_at": obj.get("created_at"),
+            "defaults": obj.get("defaults"),
+            "multiple": obj.get("multiple") if obj.get("multiple") is not None else False,
+            "options": obj.get("options"),
+            "params": obj.get("params"),
+            "params_input": obj.get("params_input"),
+            "responded_at": obj.get("responded_at"),
+            "responded_by_user": HITLUser.from_dict(obj["responded_by_user"]) if obj.get("responded_by_user") is not None else None,
+            "response_received": obj.get("response_received") if obj.get("response_received") is not None else False,
+            "subject": obj.get("subject"),
+            "task_instance": TaskInstanceResponse.from_dict(obj["task_instance"]) if obj.get("task_instance") is not None else None
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/hitl_detail_collection.py b/airflow_client/client/models/hitl_detail_collection.py
new file mode 100644
index 0000000..79dfa4a
--- /dev/null
+++ b/airflow_client/client/models/hitl_detail_collection.py
@@ -0,0 +1,97 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictInt
+from typing import Any, ClassVar, Dict, List
+from airflow_client.client.models.hitl_detail import HITLDetail
+from typing import Optional, Set
+from typing_extensions import Self
+
+class HITLDetailCollection(BaseModel):
+    """
+    Schema for a collection of Human-in-the-loop details.
+    """ # noqa: E501
+    hitl_details: List[HITLDetail]
+    total_entries: StrictInt
+    __properties: ClassVar[List[str]] = ["hitl_details", "total_entries"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of HITLDetailCollection from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in hitl_details (list)
+        _items = []
+        if self.hitl_details:
+            for _item_hitl_details in self.hitl_details:
+                if _item_hitl_details:
+                    _items.append(_item_hitl_details.to_dict())
+            _dict['hitl_details'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of HITLDetailCollection from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "hitl_details": [HITLDetail.from_dict(_item) for _item in obj["hitl_details"]] if obj.get("hitl_details") is not None else None,
+            "total_entries": obj.get("total_entries")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/hitl_detail_response.py b/airflow_client/client/models/hitl_detail_response.py
new file mode 100644
index 0000000..03810fe
--- /dev/null
+++ b/airflow_client/client/models/hitl_detail_response.py
@@ -0,0 +1,99 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from datetime import datetime
+from pydantic import BaseModel, ConfigDict, Field, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional
+from typing_extensions import Annotated
+from airflow_client.client.models.hitl_user import HITLUser
+from typing import Optional, Set
+from typing_extensions import Self
+
+class HITLDetailResponse(BaseModel):
+    """
+    Response of updating a Human-in-the-loop detail.
+    """ # noqa: E501
+    chosen_options: Annotated[List[StrictStr], Field(min_length=1)]
+    params_input: Optional[Dict[str, Any]] = None
+    responded_at: datetime
+    responded_by: HITLUser
+    __properties: ClassVar[List[str]] = ["chosen_options", "params_input", "responded_at", "responded_by"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of HITLDetailResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of responded_by
+        if self.responded_by:
+            _dict['responded_by'] = self.responded_by.to_dict()
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of HITLDetailResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "chosen_options": obj.get("chosen_options"),
+            "params_input": obj.get("params_input"),
+            "responded_at": obj.get("responded_at"),
+            "responded_by": HITLUser.from_dict(obj["responded_by"]) if obj.get("responded_by") is not None else None
+        })
+        return _obj
+
+
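The new models above all follow the same defensive `from_dict`/`to_dict` contract: `None` passes through unchanged, and nested models are re-hydrated and serialized via their own `from_dict`/`to_dict`. A minimal stdlib-only sketch of that pattern (the `User`/`DetailResponse` names below are hypothetical stand-ins, not the actual pydantic-backed client classes):

```python
# Stdlib-only sketch of the generated from_dict/to_dict contract.
# `User` stands in for HITLUser; the real client classes are pydantic
# models with the same observable behavior.
from dataclasses import dataclass
from typing import Any, Dict, List, Optional


@dataclass
class User:
    id: str
    name: str

    def to_dict(self) -> Dict[str, Any]:
        return {"id": self.id, "name": self.name}

    @classmethod
    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional["User"]:
        # None passes through unchanged, mirroring the generated guard.
        if obj is None:
            return None
        return cls(id=obj["id"], name=obj["name"])


@dataclass
class DetailResponse:
    chosen_options: List[str]
    responded_by: Optional[User]

    def to_dict(self) -> Dict[str, Any]:
        d: Dict[str, Any] = {"chosen_options": self.chosen_options}
        # Nested models serialize via their own to_dict, as in the
        # generated `responded_by` override.
        if self.responded_by is not None:
            d["responded_by"] = self.responded_by.to_dict()
        return d

    @classmethod
    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional["DetailResponse"]:
        if obj is None:
            return None
        return cls(
            chosen_options=obj.get("chosen_options") or [],
            responded_by=User.from_dict(obj.get("responded_by")),
        )


resp = DetailResponse.from_dict(
    {"chosen_options": ["approve"], "responded_by": {"id": "u1", "name": "Ada"}}
)
```

Round-tripping through `to_dict()` then `from_dict()` yields an equal structure, which is what the generated `to_json`/`from_json` helpers rely on.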
diff --git a/airflow_client/client/models/hitl_user.py b/airflow_client/client/models/hitl_user.py
new file mode 100644
index 0000000..e2ee8e1
--- /dev/null
+++ b/airflow_client/client/models/hitl_user.py
@@ -0,0 +1,89 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictStr
+from typing import Any, ClassVar, Dict, List
+from typing import Optional, Set
+from typing_extensions import Self
+
+class HITLUser(BaseModel):
+    """
+    Schema for a Human-in-the-loop user.
+    """ # noqa: E501
+    id: StrictStr
+    name: StrictStr
+    __properties: ClassVar[List[str]] = ["id", "name"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of HITLUser from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of HITLUser from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "id": obj.get("id"),
+            "name": obj.get("name")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/job_response.py b/airflow_client/client/models/job_response.py
index 07287b5..a153cf8 100644
--- a/airflow_client/client/models/job_response.py
+++ b/airflow_client/client/models/job_response.py
@@ -27,6 +27,7 @@
     """
     Job serializer for responses.
     """ # noqa: E501
+    dag_display_name: Optional[StrictStr] = None
     dag_id: Optional[StrictStr] = None
     end_date: Optional[datetime] = None
     executor_class: Optional[StrictStr] = None
@@ -37,7 +38,7 @@
     start_date: Optional[datetime] = None
     state: Optional[StrictStr] = None
     unixname: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["dag_id", "end_date", "executor_class", "hostname", "id", "job_type", "latest_heartbeat", "start_date", "state", "unixname"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "end_date", "executor_class", "hostname", "id", "job_type", "latest_heartbeat", "start_date", "state", "unixname"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -90,6 +91,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "end_date": obj.get("end_date"),
             "executor_class": obj.get("executor_class"),
diff --git a/airflow_client/client/models/last_asset_event_response.py b/airflow_client/client/models/last_asset_event_response.py
new file mode 100644
index 0000000..288f545
--- /dev/null
+++ b/airflow_client/client/models/last_asset_event_response.py
@@ -0,0 +1,91 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from datetime import datetime
+from pydantic import BaseModel, ConfigDict, Field
+from typing import Any, ClassVar, Dict, List, Optional
+from typing_extensions import Annotated
+from typing import Optional, Set
+from typing_extensions import Self
+
+class LastAssetEventResponse(BaseModel):
+    """
+    Last asset event response serializer.
+    """ # noqa: E501
+    id: Optional[Annotated[int, Field(strict=True, ge=0)]] = None
+    timestamp: Optional[datetime] = None
+    __properties: ClassVar[List[str]] = ["id", "timestamp"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of LastAssetEventResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of LastAssetEventResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "id": obj.get("id"),
+            "timestamp": obj.get("timestamp")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/plugin_import_error_collection_response.py b/airflow_client/client/models/plugin_import_error_collection_response.py
new file mode 100644
index 0000000..ce6fdfe
--- /dev/null
+++ b/airflow_client/client/models/plugin_import_error_collection_response.py
@@ -0,0 +1,97 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictInt
+from typing import Any, ClassVar, Dict, List
+from airflow_client.client.models.plugin_import_error_response import PluginImportErrorResponse
+from typing import Optional, Set
+from typing_extensions import Self
+
+class PluginImportErrorCollectionResponse(BaseModel):
+    """
+    Plugin Import Error Collection serializer.
+    """ # noqa: E501
+    import_errors: List[PluginImportErrorResponse]
+    total_entries: StrictInt
+    __properties: ClassVar[List[str]] = ["import_errors", "total_entries"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of PluginImportErrorCollectionResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        # override the default output from pydantic by calling `to_dict()` of each item in import_errors (list)
+        _items = []
+        if self.import_errors:
+            for _item_import_errors in self.import_errors:
+                if _item_import_errors:
+                    _items.append(_item_import_errors.to_dict())
+            _dict['import_errors'] = _items
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of PluginImportErrorCollectionResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "import_errors": [PluginImportErrorResponse.from_dict(_item) for _item in obj["import_errors"]] if obj.get("import_errors") is not None else None,
+            "total_entries": obj.get("total_entries")
+        })
+        return _obj
+
+
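Collection responses such as `PluginImportErrorCollectionResponse` additionally override `to_dict` to dump each list item via the item's own `to_dict`. A stdlib sketch of that list-override pattern (hypothetical `ImportErrorItem`/`Collection` stand-ins; note that in the generated model `total_entries` is a separate server-provided field, derived here from the list length only for brevity):

```python
# Stdlib sketch of the list-override pattern used by collection responses:
# each item's to_dict replaces pydantic's default nested dump.
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class ImportErrorItem:
    error: str
    source: str

    def to_dict(self) -> Dict[str, Any]:
        return {"error": self.error, "source": self.source}


@dataclass
class Collection:
    import_errors: List[ImportErrorItem] = field(default_factory=list)

    def to_dict(self) -> Dict[str, Any]:
        # Dump each item via its own to_dict; in the real model,
        # total_entries comes from the API response, not len().
        return {
            "import_errors": [e.to_dict() for e in self.import_errors],
            "total_entries": len(self.import_errors),
        }


col = Collection([ImportErrorItem("No module named x", "plugins/a.py")])
```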
diff --git a/airflow_client/client/models/plugin_import_error_response.py b/airflow_client/client/models/plugin_import_error_response.py
new file mode 100644
index 0000000..9e6d312
--- /dev/null
+++ b/airflow_client/client/models/plugin_import_error_response.py
@@ -0,0 +1,89 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictStr
+from typing import Any, ClassVar, Dict, List
+from typing import Optional, Set
+from typing_extensions import Self
+
+class PluginImportErrorResponse(BaseModel):
+    """
+    Plugin Import Error serializer for responses.
+    """ # noqa: E501
+    error: StrictStr
+    source: StrictStr
+    __properties: ClassVar[List[str]] = ["error", "source"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of PluginImportErrorResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of PluginImportErrorResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "error": obj.get("error"),
+            "source": obj.get("source")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/plugin_response.py b/airflow_client/client/models/plugin_response.py
index dca448e..f24443c 100644
--- a/airflow_client/client/models/plugin_response.py
+++ b/airflow_client/client/models/plugin_response.py
@@ -17,12 +17,14 @@
 import re  # noqa: F401
 import json
 
-from pydantic import BaseModel, ConfigDict, StrictStr
+from pydantic import BaseModel, ConfigDict, Field, StrictStr
 from typing import Any, ClassVar, Dict, List
 from airflow_client.client.models.app_builder_menu_item_response import AppBuilderMenuItemResponse
 from airflow_client.client.models.app_builder_view_response import AppBuilderViewResponse
+from airflow_client.client.models.external_view_response import ExternalViewResponse
 from airflow_client.client.models.fast_api_app_response import FastAPIAppResponse
 from airflow_client.client.models.fast_api_root_middleware_response import FastAPIRootMiddlewareResponse
+from airflow_client.client.models.react_app_response import ReactAppResponse
 from typing import Optional, Set
 from typing_extensions import Self
 
@@ -32,6 +34,7 @@
     """ # noqa: E501
     appbuilder_menu_items: List[AppBuilderMenuItemResponse]
     appbuilder_views: List[AppBuilderViewResponse]
+    external_views: List[ExternalViewResponse] = Field(description="Aggregate all external views. Both 'external_views' and 'appbuilder_menu_items' are included here.")
     fastapi_apps: List[FastAPIAppResponse]
     fastapi_root_middlewares: List[FastAPIRootMiddlewareResponse]
     flask_blueprints: List[StrictStr]
@@ -40,9 +43,10 @@
     macros: List[StrictStr]
     name: StrictStr
     operator_extra_links: List[StrictStr]
+    react_apps: List[ReactAppResponse]
     source: StrictStr
     timetables: List[StrictStr]
-    __properties: ClassVar[List[str]] = ["appbuilder_menu_items", "appbuilder_views", "fastapi_apps", "fastapi_root_middlewares", "flask_blueprints", "global_operator_extra_links", "listeners", "macros", "name", "operator_extra_links", "source", "timetables"]
+    __properties: ClassVar[List[str]] = ["appbuilder_menu_items", "appbuilder_views", "external_views", "fastapi_apps", "fastapi_root_middlewares", "flask_blueprints", "global_operator_extra_links", "listeners", "macros", "name", "operator_extra_links", "react_apps", "source", "timetables"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -97,6 +101,13 @@
                 if _item_appbuilder_views:
                     _items.append(_item_appbuilder_views.to_dict())
             _dict['appbuilder_views'] = _items
+        # override the default output from pydantic by calling `to_dict()` of each item in external_views (list)
+        _items = []
+        if self.external_views:
+            for _item_external_views in self.external_views:
+                if _item_external_views:
+                    _items.append(_item_external_views.to_dict())
+            _dict['external_views'] = _items
         # override the default output from pydantic by calling `to_dict()` of each item in fastapi_apps (list)
         _items = []
         if self.fastapi_apps:
@@ -111,6 +122,13 @@
                 if _item_fastapi_root_middlewares:
                     _items.append(_item_fastapi_root_middlewares.to_dict())
             _dict['fastapi_root_middlewares'] = _items
+        # override the default output from pydantic by calling `to_dict()` of each item in react_apps (list)
+        _items = []
+        if self.react_apps:
+            for _item_react_apps in self.react_apps:
+                if _item_react_apps:
+                    _items.append(_item_react_apps.to_dict())
+            _dict['react_apps'] = _items
         return _dict
 
     @classmethod
@@ -125,6 +143,7 @@
         _obj = cls.model_validate({
             "appbuilder_menu_items": [AppBuilderMenuItemResponse.from_dict(_item) for _item in obj["appbuilder_menu_items"]] if obj.get("appbuilder_menu_items") is not None else None,
             "appbuilder_views": [AppBuilderViewResponse.from_dict(_item) for _item in obj["appbuilder_views"]] if obj.get("appbuilder_views") is not None else None,
+            "external_views": [ExternalViewResponse.from_dict(_item) for _item in obj["external_views"]] if obj.get("external_views") is not None else None,
             "fastapi_apps": [FastAPIAppResponse.from_dict(_item) for _item in obj["fastapi_apps"]] if obj.get("fastapi_apps") is not None else None,
             "fastapi_root_middlewares": [FastAPIRootMiddlewareResponse.from_dict(_item) for _item in obj["fastapi_root_middlewares"]] if obj.get("fastapi_root_middlewares") is not None else None,
             "flask_blueprints": obj.get("flask_blueprints"),
@@ -133,6 +152,7 @@
             "macros": obj.get("macros"),
             "name": obj.get("name"),
             "operator_extra_links": obj.get("operator_extra_links"),
+            "react_apps": [ReactAppResponse.from_dict(_item) for _item in obj["react_apps"]] if obj.get("react_apps") is not None else None,
             "source": obj.get("source"),
             "timetables": obj.get("timetables")
         })
diff --git a/airflow_client/client/models/queued_event_response.py b/airflow_client/client/models/queued_event_response.py
index 3725503..d9a8a2a 100644
--- a/airflow_client/client/models/queued_event_response.py
+++ b/airflow_client/client/models/queued_event_response.py
@@ -29,8 +29,9 @@
     """ # noqa: E501
     asset_id: StrictInt
     created_at: datetime
+    dag_display_name: StrictStr
     dag_id: StrictStr
-    __properties: ClassVar[List[str]] = ["asset_id", "created_at", "dag_id"]
+    __properties: ClassVar[List[str]] = ["asset_id", "created_at", "dag_display_name", "dag_id"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -85,6 +86,7 @@
         _obj = cls.model_validate({
             "asset_id": obj.get("asset_id"),
             "created_at": obj.get("created_at"),
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id")
         })
         return _obj
diff --git a/airflow_client/client/models/react_app_response.py b/airflow_client/client/models/react_app_response.py
new file mode 100644
index 0000000..50046ea
--- /dev/null
+++ b/airflow_client/client/models/react_app_response.py
@@ -0,0 +1,109 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, StrictStr, field_validator
+from typing import Any, ClassVar, Dict, List, Optional
+from typing import Optional, Set
+from typing_extensions import Self
+
+class ReactAppResponse(BaseModel):
+    """
+    Serializer for React App Plugin responses.
+    """ # noqa: E501
+    bundle_url: StrictStr
+    category: Optional[StrictStr] = None
+    destination: Optional[StrictStr] = 'nav'
+    icon: Optional[StrictStr] = None
+    icon_dark_mode: Optional[StrictStr] = None
+    name: StrictStr
+    url_route: Optional[StrictStr] = None
+    __properties: ClassVar[List[str]] = ["bundle_url", "category", "destination", "icon", "icon_dark_mode", "name", "url_route"]
+
+    @field_validator('destination')
+    def destination_validate_enum(cls, value):
+        """Validates the enum"""
+        if value is None:
+            return value
+
+        if value not in set(['nav', 'dag', 'dag_run', 'task', 'task_instance', 'dashboard']):
+            raise ValueError("must be one of enum values ('nav', 'dag', 'dag_run', 'task', 'task_instance', 'dashboard')")
+        return value
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of ReactAppResponse from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of ReactAppResponse from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "bundle_url": obj.get("bundle_url"),
+            "category": obj.get("category"),
+            "destination": obj.get("destination") if obj.get("destination") is not None else 'nav',
+            "icon": obj.get("icon"),
+            "icon_dark_mode": obj.get("icon_dark_mode"),
+            "name": obj.get("name"),
+            "url_route": obj.get("url_route")
+        })
+        return _obj
+
+
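`ReactAppResponse` is the one new model with a field validator: `destination` must be `None` (falling back to the default `'nav'`) or one of a fixed enum. A stdlib-only sketch of that check, outside pydantic (the function name is illustrative, not part of the client API):

```python
# Stdlib sketch of the `destination` enum check performed by
# ReactAppResponse's pydantic field validator.
from typing import Optional

ALLOWED_DESTINATIONS = {"nav", "dag", "dag_run", "task", "task_instance", "dashboard"}


def validate_destination(value: Optional[str]) -> Optional[str]:
    # None is allowed; the model then applies the default 'nav'.
    if value is None:
        return value
    if value not in ALLOWED_DESTINATIONS:
        raise ValueError(
            "must be one of enum values "
            "('nav', 'dag', 'dag_run', 'task', 'task_instance', 'dashboard')"
        )
    return value
```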
diff --git a/airflow_client/client/models/task_inlet_asset_reference.py b/airflow_client/client/models/task_inlet_asset_reference.py
new file mode 100644
index 0000000..fff4617
--- /dev/null
+++ b/airflow_client/client/models/task_inlet_asset_reference.py
@@ -0,0 +1,94 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from datetime import datetime
+from pydantic import BaseModel, ConfigDict, StrictStr
+from typing import Any, ClassVar, Dict, List
+from typing import Optional, Set
+from typing_extensions import Self
+
+class TaskInletAssetReference(BaseModel):
+    """
+    Task inlet reference serializer for assets.
+    """ # noqa: E501
+    created_at: datetime
+    dag_id: StrictStr
+    task_id: StrictStr
+    updated_at: datetime
+    __properties: ClassVar[List[str]] = ["created_at", "dag_id", "task_id", "updated_at"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of TaskInletAssetReference from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of TaskInletAssetReference from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "created_at": obj.get("created_at"),
+            "dag_id": obj.get("dag_id"),
+            "task_id": obj.get("task_id"),
+            "updated_at": obj.get("updated_at")
+        })
+        return _obj
+
+
diff --git a/airflow_client/client/models/task_instance_history_response.py b/airflow_client/client/models/task_instance_history_response.py
index 3fdbedb..c8b4f90 100644
--- a/airflow_client/client/models/task_instance_history_response.py
+++ b/airflow_client/client/models/task_instance_history_response.py
@@ -29,6 +29,7 @@
     """
     TaskInstanceHistory serializer for responses.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     dag_run_id: StrictStr
     dag_version: Optional[DagVersionResponse] = None
@@ -40,6 +41,7 @@
     map_index: StrictInt
     max_tries: StrictInt
     operator: Optional[StrictStr] = None
+    operator_name: Optional[StrictStr] = None
     pid: Optional[StrictInt] = None
     pool: StrictStr
     pool_slots: StrictInt
@@ -53,7 +55,7 @@
     task_id: StrictStr
     try_number: StrictInt
     unixname: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["dag_id", "dag_run_id", "dag_version", "duration", "end_date", "executor", "executor_config", "hostname", "map_index", "max_tries", "operator", "pid", "pool", "pool_slots", "priority_weight", "queue", "queued_when", "scheduled_when", "start_date", "state", "task_display_name", "task_id", "try_number", "unixname"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "dag_run_id", "dag_version", "duration", "end_date", "executor", "executor_config", "hostname", "map_index", "max_tries", "operator", "operator_name", "pid", "pool", "pool_slots", "priority_weight", "queue", "queued_when", "scheduled_when", "start_date", "state", "task_display_name", "task_id", "try_number", "unixname"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -109,6 +111,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "dag_run_id": obj.get("dag_run_id"),
             "dag_version": DagVersionResponse.from_dict(obj["dag_version"]) if obj.get("dag_version") is not None else None,
@@ -120,6 +123,7 @@
             "map_index": obj.get("map_index"),
             "max_tries": obj.get("max_tries"),
             "operator": obj.get("operator"),
+            "operator_name": obj.get("operator_name"),
             "pid": obj.get("pid"),
             "pool": obj.get("pool"),
             "pool_slots": obj.get("pool_slots"),
diff --git a/airflow_client/client/models/task_instance_response.py b/airflow_client/client/models/task_instance_response.py
index e507075..b428f66 100644
--- a/airflow_client/client/models/task_instance_response.py
+++ b/airflow_client/client/models/task_instance_response.py
@@ -31,6 +31,7 @@
     """
     TaskInstance serializer for responses.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     dag_run_id: StrictStr
     dag_version: Optional[DagVersionResponse] = None
@@ -45,6 +46,7 @@
     max_tries: StrictInt
     note: Optional[StrictStr] = None
     operator: Optional[StrictStr] = None
+    operator_name: Optional[StrictStr] = None
     pid: Optional[StrictInt] = None
     pool: StrictStr
     pool_slots: StrictInt
@@ -63,7 +65,7 @@
     triggerer_job: Optional[JobResponse] = None
     try_number: StrictInt
     unixname: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["dag_id", "dag_run_id", "dag_version", "duration", "end_date", "executor", "executor_config", "hostname", "id", "logical_date", "map_index", "max_tries", "note", "operator", "pid", "pool", "pool_slots", "priority_weight", "queue", "queued_when", "rendered_fields", "rendered_map_index", "run_after", "scheduled_when", "start_date", "state", "task_display_name", "task_id", "trigger", "triggerer_job", "try_number", "unixname"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "dag_run_id", "dag_version", "duration", "end_date", "executor", "executor_config", "hostname", "id", "logical_date", "map_index", "max_tries", "note", "operator", "operator_name", "pid", "pool", "pool_slots", "priority_weight", "queue", "queued_when", "rendered_fields", "rendered_map_index", "run_after", "scheduled_when", "start_date", "state", "task_display_name", "task_id", "trigger", "triggerer_job", "try_number", "unixname"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -125,6 +127,7 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "dag_run_id": obj.get("dag_run_id"),
             "dag_version": DagVersionResponse.from_dict(obj["dag_version"]) if obj.get("dag_version") is not None else None,
@@ -139,6 +142,7 @@
             "max_tries": obj.get("max_tries"),
             "note": obj.get("note"),
             "operator": obj.get("operator"),
+            "operator_name": obj.get("operator_name"),
             "pid": obj.get("pid"),
             "pool": obj.get("pool"),
             "pool_slots": obj.get("pool_slots"),
diff --git a/airflow_client/client/models/task_instances_batch_body.py b/airflow_client/client/models/task_instances_batch_body.py
index 0477907..f201e3d 100644
--- a/airflow_client/client/models/task_instances_batch_body.py
+++ b/airflow_client/client/models/task_instances_batch_body.py
@@ -31,25 +31,35 @@
     """ # noqa: E501
     dag_ids: Optional[List[StrictStr]] = None
     dag_run_ids: Optional[List[StrictStr]] = None
+    duration_gt: Optional[Union[StrictFloat, StrictInt]] = None
     duration_gte: Optional[Union[StrictFloat, StrictInt]] = None
+    duration_lt: Optional[Union[StrictFloat, StrictInt]] = None
     duration_lte: Optional[Union[StrictFloat, StrictInt]] = None
+    end_date_gt: Optional[datetime] = None
     end_date_gte: Optional[datetime] = None
+    end_date_lt: Optional[datetime] = None
     end_date_lte: Optional[datetime] = None
     executor: Optional[List[StrictStr]] = None
+    logical_date_gt: Optional[datetime] = None
     logical_date_gte: Optional[datetime] = None
+    logical_date_lt: Optional[datetime] = None
     logical_date_lte: Optional[datetime] = None
     order_by: Optional[StrictStr] = None
     page_limit: Optional[Annotated[int, Field(strict=True, ge=0)]] = 100
     page_offset: Optional[Annotated[int, Field(strict=True, ge=0)]] = 0
     pool: Optional[List[StrictStr]] = None
     queue: Optional[List[StrictStr]] = None
+    run_after_gt: Optional[datetime] = None
     run_after_gte: Optional[datetime] = None
+    run_after_lt: Optional[datetime] = None
     run_after_lte: Optional[datetime] = None
+    start_date_gt: Optional[datetime] = None
     start_date_gte: Optional[datetime] = None
+    start_date_lt: Optional[datetime] = None
     start_date_lte: Optional[datetime] = None
     state: Optional[List[Optional[TaskInstanceState]]] = None
     task_ids: Optional[List[StrictStr]] = None
-    __properties: ClassVar[List[str]] = ["dag_ids", "dag_run_ids", "duration_gte", "duration_lte", "end_date_gte", "end_date_lte", "executor", "logical_date_gte", "logical_date_lte", "order_by", "page_limit", "page_offset", "pool", "queue", "run_after_gte", "run_after_lte", "start_date_gte", "start_date_lte", "state", "task_ids"]
+    __properties: ClassVar[List[str]] = ["dag_ids", "dag_run_ids", "duration_gt", "duration_gte", "duration_lt", "duration_lte", "end_date_gt", "end_date_gte", "end_date_lt", "end_date_lte", "executor", "logical_date_gt", "logical_date_gte", "logical_date_lt", "logical_date_lte", "order_by", "page_limit", "page_offset", "pool", "queue", "run_after_gt", "run_after_gte", "run_after_lt", "run_after_lte", "start_date_gt", "start_date_gte", "start_date_lt", "start_date_lte", "state", "task_ids"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -104,21 +114,31 @@
         _obj = cls.model_validate({
             "dag_ids": obj.get("dag_ids"),
             "dag_run_ids": obj.get("dag_run_ids"),
+            "duration_gt": obj.get("duration_gt"),
             "duration_gte": obj.get("duration_gte"),
+            "duration_lt": obj.get("duration_lt"),
             "duration_lte": obj.get("duration_lte"),
+            "end_date_gt": obj.get("end_date_gt"),
             "end_date_gte": obj.get("end_date_gte"),
+            "end_date_lt": obj.get("end_date_lt"),
             "end_date_lte": obj.get("end_date_lte"),
             "executor": obj.get("executor"),
+            "logical_date_gt": obj.get("logical_date_gt"),
             "logical_date_gte": obj.get("logical_date_gte"),
+            "logical_date_lt": obj.get("logical_date_lt"),
             "logical_date_lte": obj.get("logical_date_lte"),
             "order_by": obj.get("order_by"),
             "page_limit": obj.get("page_limit") if obj.get("page_limit") is not None else 100,
             "page_offset": obj.get("page_offset") if obj.get("page_offset") is not None else 0,
             "pool": obj.get("pool"),
             "queue": obj.get("queue"),
+            "run_after_gt": obj.get("run_after_gt"),
             "run_after_gte": obj.get("run_after_gte"),
+            "run_after_lt": obj.get("run_after_lt"),
             "run_after_lte": obj.get("run_after_lte"),
+            "start_date_gt": obj.get("start_date_gt"),
             "start_date_gte": obj.get("start_date_gte"),
+            "start_date_lt": obj.get("start_date_lt"),
             "start_date_lte": obj.get("start_date_lte"),
             "state": obj.get("state"),
             "task_ids": obj.get("task_ids")
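The new `*_gt` / `*_lt` fields above add exclusive bounds alongside the existing inclusive `*_gte` / `*_lte` filters for batch task-instance queries. A minimal sketch of assembling such a body in plain Python (field names are taken from the model above; the `exclude_none` filtering mirrors the generated `to_dict()`, and the dict here stands in for a `TaskInstancesBatchBody` instance):

```python
from datetime import datetime, timezone

# Hypothetical filter: runs that started after (exclusive) Jan 1 and on or
# before (inclusive) Feb 1, and took strictly longer than 30 seconds.
body = {
    "dag_ids": ["example_dag"],
    "start_date_gt": datetime(2025, 1, 1, tzinfo=timezone.utc),
    "start_date_lte": datetime(2025, 2, 1, tzinfo=timezone.utc),
    "duration_gt": 30.0,
    "state": ["failed"],
    "end_date_lt": None,  # unset fields are dropped, as in to_dict()
}

# The generated to_dict() serializes with exclude_none=True; the plain-dict
# equivalent is a simple comprehension:
payload = {k: v for k, v in body.items() if v is not None}
print(sorted(payload))
```

Mixing an exclusive lower bound with an inclusive upper bound (as above) is the typical way to paginate over half-open time windows without double-counting boundary rows.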
diff --git a/airflow_client/client/models/trigger_dag_run_post_body.py b/airflow_client/client/models/trigger_dag_run_post_body.py
index 08c53a1..430112b 100644
--- a/airflow_client/client/models/trigger_dag_run_post_body.py
+++ b/airflow_client/client/models/trigger_dag_run_post_body.py
@@ -62,7 +62,7 @@
         if 'logical_date' not in _dict:
             _dict['logical_date'] = None
         if self.additional_properties is not None:
-            for _key, _value in self.additional_properties.items():
+            for (_key, _value) in self.additional_properties.items():
                 _dict[_key] = _value
         return _dict
 
diff --git a/airflow_client/client/models/update_hitl_detail_payload.py b/airflow_client/client/models/update_hitl_detail_payload.py
new file mode 100644
index 0000000..db159c7
--- /dev/null
+++ b/airflow_client/client/models/update_hitl_detail_payload.py
@@ -0,0 +1,90 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+from __future__ import annotations
+import pprint
+import re  # noqa: F401
+import json
+
+from pydantic import BaseModel, ConfigDict, Field, StrictStr
+from typing import Any, ClassVar, Dict, List, Optional
+from typing_extensions import Annotated
+from typing import Optional, Set
+from typing_extensions import Self
+
+class UpdateHITLDetailPayload(BaseModel):
+    """
+    Schema for updating the content of a Human-in-the-loop detail.
+    """ # noqa: E501
+    chosen_options: Annotated[List[StrictStr], Field(min_length=1)]
+    params_input: Optional[Dict[str, Any]] = None
+    __properties: ClassVar[List[str]] = ["chosen_options", "params_input"]
+
+    model_config = ConfigDict(
+        populate_by_name=True,
+        validate_assignment=True,
+        protected_namespaces=(),
+    )
+
+
+    def to_str(self) -> str:
+        """Returns the string representation of the model using alias"""
+        return pprint.pformat(self.model_dump(by_alias=True))
+
+    def to_json(self) -> str:
+        """Returns the JSON representation of the model using alias"""
+        # TODO: pydantic v2: use .model_dump_json(by_alias=True, exclude_unset=True) instead
+        return json.dumps(self.to_dict())
+
+    @classmethod
+    def from_json(cls, json_str: str) -> Optional[Self]:
+        """Create an instance of UpdateHITLDetailPayload from a JSON string"""
+        return cls.from_dict(json.loads(json_str))
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Return the dictionary representation of the model using alias.
+
+        This has the following differences from calling pydantic's
+        `self.model_dump(by_alias=True)`:
+
+        * `None` is only added to the output dict for nullable fields that
+          were set at model initialization. Other fields with value `None`
+          are ignored.
+        """
+        excluded_fields: Set[str] = set([
+        ])
+
+        _dict = self.model_dump(
+            by_alias=True,
+            exclude=excluded_fields,
+            exclude_none=True,
+        )
+        return _dict
+
+    @classmethod
+    def from_dict(cls, obj: Optional[Dict[str, Any]]) -> Optional[Self]:
+        """Create an instance of UpdateHITLDetailPayload from a dict"""
+        if obj is None:
+            return None
+
+        if not isinstance(obj, dict):
+            return cls.model_validate(obj)
+
+        _obj = cls.model_validate({
+            "chosen_options": obj.get("chosen_options"),
+            "params_input": obj.get("params_input")
+        })
+        return _obj
+
+
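`UpdateHITLDetailPayload` above is the request body for answering a Human-in-the-loop task. A plain-Python stand-in for its `from_dict()` / `to_dict()` behavior, just to illustrate the validation rule (`chosen_options` is required with `min_length=1`, `params_input` is optional and dropped when unset); the real class is a pydantic model, so this helper is illustrative only:

```python
from typing import Any, Dict

def validate_hitl_update(obj: Dict[str, Any]) -> Dict[str, Any]:
    """Mimic UpdateHITLDetailPayload: chosen_options must be a non-empty
    list of strings; params_input is optional (exclude_none semantics)."""
    options = obj.get("chosen_options")
    if (not isinstance(options, list) or len(options) < 1
            or not all(isinstance(o, str) for o in options)):
        raise ValueError("chosen_options must be a non-empty list of strings")
    payload: Dict[str, Any] = {"chosen_options": options}
    if obj.get("params_input") is not None:
        payload["params_input"] = obj["params_input"]
    return payload

print(validate_hitl_update({"chosen_options": ["approve"]}))
```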
diff --git a/airflow_client/client/models/x_com_response.py b/airflow_client/client/models/x_com_response.py
index 19876c7..7c846bf 100644
--- a/airflow_client/client/models/x_com_response.py
+++ b/airflow_client/client/models/x_com_response.py
@@ -27,14 +27,16 @@
     """
     Serializer for a xcom item.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     key: StrictStr
     logical_date: Optional[datetime] = None
     map_index: StrictInt
     run_id: StrictStr
+    task_display_name: StrictStr
     task_id: StrictStr
     timestamp: datetime
-    __properties: ClassVar[List[str]] = ["dag_id", "key", "logical_date", "map_index", "run_id", "task_id", "timestamp"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "key", "logical_date", "map_index", "run_id", "task_display_name", "task_id", "timestamp"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -87,11 +89,13 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "key": obj.get("key"),
             "logical_date": obj.get("logical_date"),
             "map_index": obj.get("map_index"),
             "run_id": obj.get("run_id"),
+            "task_display_name": obj.get("task_display_name"),
             "task_id": obj.get("task_id"),
             "timestamp": obj.get("timestamp")
         })
diff --git a/airflow_client/client/models/x_com_response_native.py b/airflow_client/client/models/x_com_response_native.py
index 19feb8d..948da66 100644
--- a/airflow_client/client/models/x_com_response_native.py
+++ b/airflow_client/client/models/x_com_response_native.py
@@ -27,15 +27,17 @@
     """
     XCom response serializer with native return type.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     key: StrictStr
     logical_date: Optional[datetime] = None
     map_index: StrictInt
     run_id: StrictStr
+    task_display_name: StrictStr
     task_id: StrictStr
     timestamp: datetime
     value: Optional[Any]
-    __properties: ClassVar[List[str]] = ["dag_id", "key", "logical_date", "map_index", "run_id", "task_id", "timestamp", "value"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "key", "logical_date", "map_index", "run_id", "task_display_name", "task_id", "timestamp", "value"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -93,11 +95,13 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "key": obj.get("key"),
             "logical_date": obj.get("logical_date"),
             "map_index": obj.get("map_index"),
             "run_id": obj.get("run_id"),
+            "task_display_name": obj.get("task_display_name"),
             "task_id": obj.get("task_id"),
             "timestamp": obj.get("timestamp"),
             "value": obj.get("value")
diff --git a/airflow_client/client/models/x_com_response_string.py b/airflow_client/client/models/x_com_response_string.py
index 2c30e01..aa426f1 100644
--- a/airflow_client/client/models/x_com_response_string.py
+++ b/airflow_client/client/models/x_com_response_string.py
@@ -27,15 +27,17 @@
     """
     XCom response serializer with string return type.
     """ # noqa: E501
+    dag_display_name: StrictStr
     dag_id: StrictStr
     key: StrictStr
     logical_date: Optional[datetime] = None
     map_index: StrictInt
     run_id: StrictStr
+    task_display_name: StrictStr
     task_id: StrictStr
     timestamp: datetime
     value: Optional[StrictStr] = None
-    __properties: ClassVar[List[str]] = ["dag_id", "key", "logical_date", "map_index", "run_id", "task_id", "timestamp", "value"]
+    __properties: ClassVar[List[str]] = ["dag_display_name", "dag_id", "key", "logical_date", "map_index", "run_id", "task_display_name", "task_id", "timestamp", "value"]
 
     model_config = ConfigDict(
         populate_by_name=True,
@@ -88,11 +90,13 @@
             return cls.model_validate(obj)
 
         _obj = cls.model_validate({
+            "dag_display_name": obj.get("dag_display_name"),
             "dag_id": obj.get("dag_id"),
             "key": obj.get("key"),
             "logical_date": obj.get("logical_date"),
             "map_index": obj.get("map_index"),
             "run_id": obj.get("run_id"),
+            "task_display_name": obj.get("task_display_name"),
             "task_id": obj.get("task_id"),
             "timestamp": obj.get("timestamp"),
             "value": obj.get("value")
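The XCom response models above now carry `dag_display_name` and `task_display_name` alongside the raw ids. A small sketch of consuming them, e.g. to render a friendlier label (the fallback-to-id behavior is this example's own assumption, not something the client does):

```python
def xcom_label(item: dict) -> str:
    """Prefer the new display-name fields, falling back to the raw ids."""
    dag = item.get("dag_display_name") or item["dag_id"]
    task = item.get("task_display_name") or item["task_id"]
    return f"{dag}.{task}[{item['key']}]"

# A dict shaped like XComResponse.to_dict() output (hypothetical values):
entry = {"dag_id": "etl", "task_id": "load", "key": "rows",
         "task_display_name": "Load to warehouse"}
print(xcom_label(entry))  # etl.Load to warehouse[rows]
```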
diff --git a/docs/AppBuilderMenuItemResponse.md b/docs/AppBuilderMenuItemResponse.md
index 2660554..82a8f0e 100644
--- a/docs/AppBuilderMenuItemResponse.md
+++ b/docs/AppBuilderMenuItemResponse.md
@@ -7,7 +7,7 @@
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
 **category** | **str** |  | [optional] 
-**href** | **str** |  | [optional] 
+**href** | **str** |  | 
 **name** | **str** |  | 
 
 ## Example
diff --git a/docs/AssetApi.md b/docs/AssetApi.md
index 78d960e..0e60be4 100644
--- a/docs/AssetApi.md
+++ b/docs/AssetApi.md
@@ -29,6 +29,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -50,6 +51,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -80,7 +86,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -109,6 +115,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -128,6 +135,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -158,7 +170,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -187,6 +199,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -206,6 +219,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -238,7 +256,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -266,6 +284,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -285,6 +304,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -315,7 +339,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -345,6 +369,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -365,6 +390,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -395,7 +425,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -424,6 +454,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -443,6 +474,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -473,7 +509,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -502,6 +538,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -522,14 +559,19 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.AssetApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    name_pattern = 'name_pattern_example' # str |  (optional)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    name_pattern = 'name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
 
     try:
         # Get Asset Aliases
@@ -549,8 +591,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **name_pattern** | **str**|  | [optional] 
- **order_by** | **str**|  | [optional] [default to 'id']
+ **name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
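The `name_pattern` parameter above is a SQL LIKE expression, not a regex. To illustrate the `%` / `_` wildcard semantics, here is a small helper that translates a LIKE pattern into an equivalent Python regex (this helper is not part of the client; the matching itself happens server-side):

```python
import re

def like_to_regex(pattern: str) -> str:
    """Translate a SQL LIKE pattern into an anchored Python regex:
    % matches any run of characters, _ matches exactly one."""
    parts = []
    for ch in pattern:
        if ch == "%":
            parts.append(".*")
        elif ch == "_":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return "^" + "".join(parts) + "$"

print(bool(re.fullmatch(like_to_regex("%customer_%"), "eu_customer_orders")))
```

Note that a literal `_` in an asset name is itself a single-character wildcard in LIKE, so `%customer_%` also matches names like `customers`.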
 
 ### Return type
 
@@ -558,7 +600,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -578,7 +620,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_asset_events**
-> AssetEventCollectionResponse get_asset_events(limit=limit, offset=offset, order_by=order_by, asset_id=asset_id, source_dag_id=source_dag_id, source_task_id=source_task_id, source_run_id=source_run_id, source_map_index=source_map_index, timestamp_gte=timestamp_gte, timestamp_lte=timestamp_lte)
+> AssetEventCollectionResponse get_asset_events(limit=limit, offset=offset, order_by=order_by, asset_id=asset_id, source_dag_id=source_dag_id, source_task_id=source_task_id, source_run_id=source_run_id, source_map_index=source_map_index, timestamp_gte=timestamp_gte, timestamp_gt=timestamp_gt, timestamp_lte=timestamp_lte, timestamp_lt=timestamp_lt)
 
 Get Asset Events
 
@@ -587,6 +629,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -607,24 +650,31 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.AssetApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'timestamp' # str |  (optional) (default to 'timestamp')
+    order_by = ["timestamp"] # List[str] |  (optional) (default to ["timestamp"])
     asset_id = 56 # int |  (optional)
     source_dag_id = 'source_dag_id_example' # str |  (optional)
     source_task_id = 'source_task_id_example' # str |  (optional)
     source_run_id = 'source_run_id_example' # str |  (optional)
     source_map_index = 56 # int |  (optional)
     timestamp_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    timestamp_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     timestamp_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    timestamp_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
 
     try:
         # Get Asset Events
-        api_response = api_instance.get_asset_events(limit=limit, offset=offset, order_by=order_by, asset_id=asset_id, source_dag_id=source_dag_id, source_task_id=source_task_id, source_run_id=source_run_id, source_map_index=source_map_index, timestamp_gte=timestamp_gte, timestamp_lte=timestamp_lte)
+        api_response = api_instance.get_asset_events(limit=limit, offset=offset, order_by=order_by, asset_id=asset_id, source_dag_id=source_dag_id, source_task_id=source_task_id, source_run_id=source_run_id, source_map_index=source_map_index, timestamp_gte=timestamp_gte, timestamp_gt=timestamp_gt, timestamp_lte=timestamp_lte, timestamp_lt=timestamp_lt)
         print("The response of AssetApi->get_asset_events:\n")
         pprint(api_response)
     except Exception as e:
@@ -640,14 +690,16 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'timestamp']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["timestamp"]]
  **asset_id** | **int**|  | [optional] 
  **source_dag_id** | **str**|  | [optional] 
  **source_task_id** | **str**|  | [optional] 
  **source_run_id** | **str**|  | [optional] 
  **source_map_index** | **int**|  | [optional] 
  **timestamp_gte** | **datetime**|  | [optional] 
+ **timestamp_gt** | **datetime**|  | [optional] 
  **timestamp_lte** | **datetime**|  | [optional] 
+ **timestamp_lt** | **datetime**|  | [optional] 
 
 ### Return type
 
@@ -655,7 +707,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -684,6 +736,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -704,6 +757,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -736,7 +794,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -765,6 +823,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -785,17 +844,22 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.AssetApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    name_pattern = 'name_pattern_example' # str |  (optional)
-    uri_pattern = 'uri_pattern_example' # str |  (optional)
+    name_pattern = 'name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    uri_pattern = 'uri_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
     dag_ids = ['dag_ids_example'] # List[str] |  (optional)
     only_active = True # bool |  (optional) (default to True)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
 
     try:
         # Get Assets
@@ -815,11 +879,11 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **name_pattern** | **str**|  | [optional] 
- **uri_pattern** | **str**|  | [optional] 
+ **name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **uri_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
  **dag_ids** | [**List[str]**](str.md)|  | [optional] 
  **only_active** | **bool**|  | [optional] [default to True]
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
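With `order_by` now typed as `List[str]` (multi-sorting, AIP-84), each sort key is normally serialized as a repeated query parameter, and a leading `-` conventionally requests descending order on that field. A sketch of the resulting query string, assuming the common repeated-parameter (`collectionFormat=multi`) serialization used by the generated client:

```python
from urllib.parse import urlencode

# Sort newest-first, then break ties by id (field names are illustrative).
params = [("limit", 50), ("offset", 0)]
for field in ["-created_at", "id"]:
    params.append(("order_by", field))

query = urlencode(params)
print(query)  # limit=50&offset=0&order_by=-created_at&order_by=id
```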
 
 ### Return type
 
@@ -827,7 +891,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -856,6 +920,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -876,6 +941,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -910,7 +980,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -939,6 +1009,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -959,6 +1030,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -991,7 +1067,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1020,6 +1096,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1040,6 +1117,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1070,7 +1152,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/AssetResponse.md b/docs/AssetResponse.md
index e99bb00..b34c514 100644
--- a/docs/AssetResponse.md
+++ b/docs/AssetResponse.md
@@ -7,13 +7,15 @@
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
 **aliases** | [**List[AssetAliasResponse]**](AssetAliasResponse.md) |  | 
-**consuming_dags** | [**List[DagScheduleAssetReference]**](DagScheduleAssetReference.md) |  | 
+**consuming_tasks** | [**List[TaskInletAssetReference]**](TaskInletAssetReference.md) |  | 
 **created_at** | **datetime** |  | 
 **extra** | **object** |  | [optional] 
 **group** | **str** |  | 
 **id** | **int** |  | 
+**last_asset_event** | [**LastAssetEventResponse**](LastAssetEventResponse.md) |  | [optional] 
 **name** | **str** |  | 
 **producing_tasks** | [**List[TaskOutletAssetReference]**](TaskOutletAssetReference.md) |  | 
+**scheduled_dags** | [**List[DagScheduleAssetReference]**](DagScheduleAssetReference.md) |  | 
 **updated_at** | **datetime** |  | 
 **uri** | **str** |  | 
 
diff --git a/docs/BackfillApi.md b/docs/BackfillApi.md
index e806f0a..43bcacc 100644
--- a/docs/BackfillApi.md
+++ b/docs/BackfillApi.md
@@ -21,6 +21,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -41,6 +42,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -71,7 +77,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -99,6 +105,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -120,6 +127,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -150,7 +162,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -178,6 +190,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -199,6 +212,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -229,7 +247,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -257,6 +275,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -277,6 +296,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -307,7 +331,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -334,6 +358,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -354,6 +379,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -361,7 +391,7 @@
     dag_id = 'dag_id_example' # str | 
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ['id'] # List[str] |  (optional) (default to ['id'])
 
     try:
         # List Backfills
@@ -382,7 +412,7 @@
  **dag_id** | **str**|  | 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ['id']]
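Since `order_by` now accepts a list rather than a single string, a multi-column sort can be expressed client-side as repeated query parameters. A minimal sketch (the leading `-` for descending order is an assumption based on common REST conventions; field names are illustrative):

```python
# order_by is now a list of field names; a leading "-" is assumed to request
# descending order, as is conventional in REST APIs.
order_by = ["-completed_at", "id"]

# A client would typically serialize a list parameter as repeated query keys:
query = "&".join(f"order_by={field}" for field in order_by)
# query == "order_by=-completed_at&order_by=id"
```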
 
 ### Return type
 
@@ -390,7 +420,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -416,6 +446,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -436,6 +467,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -466,7 +502,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -494,6 +530,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -514,6 +551,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -544,7 +586,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/BackfillResponse.md b/docs/BackfillResponse.md
index f3479bd..bf89ab7 100644
--- a/docs/BackfillResponse.md
+++ b/docs/BackfillResponse.md
@@ -8,6 +8,7 @@
 ------------ | ------------- | ------------- | -------------
 **completed_at** | **datetime** |  | [optional] 
 **created_at** | **datetime** |  | 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_conf** | **object** |  | 
 **from_date** | **datetime** |  | 
diff --git a/docs/BulkBodyBulkTaskInstanceBody.md b/docs/BulkBodyBulkTaskInstanceBody.md
new file mode 100644
index 0000000..bcc390d
--- /dev/null
+++ b/docs/BulkBodyBulkTaskInstanceBody.md
@@ -0,0 +1,29 @@
+# BulkBodyBulkTaskInstanceBody
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**actions** | [**List[BulkBodyBulkTaskInstanceBodyActionsInner]**](BulkBodyBulkTaskInstanceBodyActionsInner.md) |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkBodyBulkTaskInstanceBody from a JSON string
+bulk_body_bulk_task_instance_body_instance = BulkBodyBulkTaskInstanceBody.from_json(json)
+# print the JSON string representation of the object
+print(bulk_body_bulk_task_instance_body_instance.to_json())
+
+# convert the object into a dict
+bulk_body_bulk_task_instance_body_dict = bulk_body_bulk_task_instance_body_instance.to_dict()
+# create an instance of BulkBodyBulkTaskInstanceBody from a dict
+bulk_body_bulk_task_instance_body_from_dict = BulkBodyBulkTaskInstanceBody.from_dict(bulk_body_bulk_task_instance_body_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/BulkBodyBulkTaskInstanceBodyActionsInner.md b/docs/BulkBodyBulkTaskInstanceBodyActionsInner.md
new file mode 100644
index 0000000..522d990
--- /dev/null
+++ b/docs/BulkBodyBulkTaskInstanceBodyActionsInner.md
@@ -0,0 +1,32 @@
+# BulkBodyBulkTaskInstanceBodyActionsInner
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**action** | **str** | The action to be performed on the entities. | 
+**action_on_existence** | [**BulkActionOnExistence**](BulkActionOnExistence.md) |  | [optional] 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
+**action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_body_bulk_task_instance_body_actions_inner import BulkBodyBulkTaskInstanceBodyActionsInner
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkBodyBulkTaskInstanceBodyActionsInner from a JSON string
+bulk_body_bulk_task_instance_body_actions_inner_instance = BulkBodyBulkTaskInstanceBodyActionsInner.from_json(json)
+# print the JSON string representation of the object
+print(bulk_body_bulk_task_instance_body_actions_inner_instance.to_json())
+
+# convert the object into a dict
+bulk_body_bulk_task_instance_body_actions_inner_dict = bulk_body_bulk_task_instance_body_actions_inner_instance.to_dict()
+# create an instance of BulkBodyBulkTaskInstanceBodyActionsInner from a dict
+bulk_body_bulk_task_instance_body_actions_inner_from_dict = BulkBodyBulkTaskInstanceBodyActionsInner.from_dict(bulk_body_bulk_task_instance_body_actions_inner_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/BulkBodyConnectionBodyActionsInner.md b/docs/BulkBodyConnectionBodyActionsInner.md
index 0598108..c0b2cba 100644
--- a/docs/BulkBodyConnectionBodyActionsInner.md
+++ b/docs/BulkBodyConnectionBodyActionsInner.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_existence** | [**BulkActionOnExistence**](BulkActionOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
 
 ## Example
diff --git a/docs/BulkBodyPoolBodyActionsInner.md b/docs/BulkBodyPoolBodyActionsInner.md
index 23e1da6..82597b7 100644
--- a/docs/BulkBodyPoolBodyActionsInner.md
+++ b/docs/BulkBodyPoolBodyActionsInner.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_existence** | [**BulkActionOnExistence**](BulkActionOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
 
 ## Example
diff --git a/docs/BulkBodyVariableBodyActionsInner.md b/docs/BulkBodyVariableBodyActionsInner.md
index 379083b..10cf393 100644
--- a/docs/BulkBodyVariableBodyActionsInner.md
+++ b/docs/BulkBodyVariableBodyActionsInner.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_existence** | [**BulkActionOnExistence**](BulkActionOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
 
 ## Example
diff --git a/docs/BulkCreateActionBulkTaskInstanceBody.md b/docs/BulkCreateActionBulkTaskInstanceBody.md
new file mode 100644
index 0000000..ad1f190
--- /dev/null
+++ b/docs/BulkCreateActionBulkTaskInstanceBody.md
@@ -0,0 +1,31 @@
+# BulkCreateActionBulkTaskInstanceBody
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**action** | **str** | The action to be performed on the entities. | 
+**action_on_existence** | [**BulkActionOnExistence**](BulkActionOnExistence.md) |  | [optional] 
+**entities** | [**List[BulkTaskInstanceBody]**](BulkTaskInstanceBody.md) | A list of entities to be created. | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_create_action_bulk_task_instance_body import BulkCreateActionBulkTaskInstanceBody
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkCreateActionBulkTaskInstanceBody from a JSON string
+bulk_create_action_bulk_task_instance_body_instance = BulkCreateActionBulkTaskInstanceBody.from_json(json)
+# print the JSON string representation of the object
+print(bulk_create_action_bulk_task_instance_body_instance.to_json())
+
+# convert the object into a dict
+bulk_create_action_bulk_task_instance_body_dict = bulk_create_action_bulk_task_instance_body_instance.to_dict()
+# create an instance of BulkCreateActionBulkTaskInstanceBody from a dict
+bulk_create_action_bulk_task_instance_body_from_dict = BulkCreateActionBulkTaskInstanceBody.from_dict(bulk_create_action_bulk_task_instance_body_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/BulkDeleteActionBulkTaskInstanceBody.md b/docs/BulkDeleteActionBulkTaskInstanceBody.md
new file mode 100644
index 0000000..5a757fb
--- /dev/null
+++ b/docs/BulkDeleteActionBulkTaskInstanceBody.md
@@ -0,0 +1,31 @@
+# BulkDeleteActionBulkTaskInstanceBody
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**action** | **str** | The action to be performed on the entities. | 
+**action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body import BulkDeleteActionBulkTaskInstanceBody
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkDeleteActionBulkTaskInstanceBody from a JSON string
+bulk_delete_action_bulk_task_instance_body_instance = BulkDeleteActionBulkTaskInstanceBody.from_json(json)
+# print the JSON string representation of the object
+print(bulk_delete_action_bulk_task_instance_body_instance.to_json())
+
+# convert the object into a dict
+bulk_delete_action_bulk_task_instance_body_dict = bulk_delete_action_bulk_task_instance_body_instance.to_dict()
+# create an instance of BulkDeleteActionBulkTaskInstanceBody from a dict
+bulk_delete_action_bulk_task_instance_body_from_dict = BulkDeleteActionBulkTaskInstanceBody.from_dict(bulk_delete_action_bulk_task_instance_body_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md b/docs/BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md
new file mode 100644
index 0000000..c98c117
--- /dev/null
+++ b/docs/BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md
@@ -0,0 +1,36 @@
+# BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**include_downstream** | **bool** |  | [optional] [default to False]
+**include_future** | **bool** |  | [optional] [default to False]
+**include_past** | **bool** |  | [optional] [default to False]
+**include_upstream** | **bool** |  | [optional] [default to False]
+**map_index** | **int** |  | [optional] 
+**new_state** | [**TaskInstanceState**](TaskInstanceState.md) |  | [optional] 
+**note** | **str** |  | [optional] 
+**task_id** | **str** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkDeleteActionBulkTaskInstanceBodyEntitiesInner from a JSON string
+bulk_delete_action_bulk_task_instance_body_entities_inner_instance = BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_json(json)
+# print the JSON string representation of the object
+print(bulk_delete_action_bulk_task_instance_body_entities_inner_instance.to_json())
+
+# convert the object into a dict
+bulk_delete_action_bulk_task_instance_body_entities_inner_dict = bulk_delete_action_bulk_task_instance_body_entities_inner_instance.to_dict()
+# create an instance of BulkDeleteActionBulkTaskInstanceBodyEntitiesInner from a dict
+bulk_delete_action_bulk_task_instance_body_entities_inner_from_dict = BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.from_dict(bulk_delete_action_bulk_task_instance_body_entities_inner_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/BulkDeleteActionConnectionBody.md b/docs/BulkDeleteActionConnectionBody.md
index 026de47..852bf61 100644
--- a/docs/BulkDeleteActionConnectionBody.md
+++ b/docs/BulkDeleteActionConnectionBody.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 
 ## Example
 
diff --git a/docs/BulkDeleteActionPoolBody.md b/docs/BulkDeleteActionPoolBody.md
index 0974ea0..1837f3c 100644
--- a/docs/BulkDeleteActionPoolBody.md
+++ b/docs/BulkDeleteActionPoolBody.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 
 ## Example
 
diff --git a/docs/BulkDeleteActionVariableBody.md b/docs/BulkDeleteActionVariableBody.md
index e2802df..83d901d 100644
--- a/docs/BulkDeleteActionVariableBody.md
+++ b/docs/BulkDeleteActionVariableBody.md
@@ -7,7 +7,7 @@
 ------------ | ------------- | ------------- | -------------
 **action** | **str** | The action to be performed on the entities. | 
 **action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
-**entities** | **List[str]** | A list of entity id/key to be deleted. | 
+**entities** | [**List[BulkDeleteActionBulkTaskInstanceBodyEntitiesInner]**](BulkDeleteActionBulkTaskInstanceBodyEntitiesInner.md) | A list of entity id/key or entity objects to be deleted. | 
 
 ## Example
 
diff --git a/docs/BulkTaskInstanceBody.md b/docs/BulkTaskInstanceBody.md
new file mode 100644
index 0000000..ae1ce2b
--- /dev/null
+++ b/docs/BulkTaskInstanceBody.md
@@ -0,0 +1,37 @@
+# BulkTaskInstanceBody
+
+Request body for bulk update and delete of task instances.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**include_downstream** | **bool** |  | [optional] [default to False]
+**include_future** | **bool** |  | [optional] [default to False]
+**include_past** | **bool** |  | [optional] [default to False]
+**include_upstream** | **bool** |  | [optional] [default to False]
+**map_index** | **int** |  | [optional] 
+**new_state** | [**TaskInstanceState**](TaskInstanceState.md) |  | [optional] 
+**note** | **str** |  | [optional] 
+**task_id** | **str** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkTaskInstanceBody from a JSON string
+bulk_task_instance_body_instance = BulkTaskInstanceBody.from_json(json)
+# print the JSON string representation of the object
+print(bulk_task_instance_body_instance.to_json())
+
+# convert the object into a dict
+bulk_task_instance_body_dict = bulk_task_instance_body_instance.to_dict()
+# create an instance of BulkTaskInstanceBody from a dict
+bulk_task_instance_body_from_dict = BulkTaskInstanceBody.from_dict(bulk_task_instance_body_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
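The new bulk task-instance models above compose into a single request body. A hedged sketch of what such a payload might look like, built with plain dicts so it runs without the client installed (field names come from the tables above; the specific task ids and the state value are made up for illustration):

```python
import json

# Illustrative bulk body combining an "update" and a "delete" action.
# task_id is required; new_state and note are optional per the
# BulkTaskInstanceBody property table.
bulk_body = {
    "actions": [
        {
            "action": "update",
            "entities": [
                {"task_id": "extract", "new_state": "success", "note": "fixed by hand"},
            ],
        },
        {
            "action": "delete",
            "entities": [{"task_id": "obsolete_task"}],
        },
    ]
}

payload = json.dumps(bulk_body)
```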
+
+
diff --git a/docs/BulkUpdateActionBulkTaskInstanceBody.md b/docs/BulkUpdateActionBulkTaskInstanceBody.md
new file mode 100644
index 0000000..d6691d3
--- /dev/null
+++ b/docs/BulkUpdateActionBulkTaskInstanceBody.md
@@ -0,0 +1,31 @@
+# BulkUpdateActionBulkTaskInstanceBody
+
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**action** | **str** | The action to be performed on the entities. | 
+**action_on_non_existence** | [**BulkActionNotOnExistence**](BulkActionNotOnExistence.md) |  | [optional] 
+**entities** | [**List[BulkTaskInstanceBody]**](BulkTaskInstanceBody.md) | A list of entities to be updated. | 
+
+## Example
+
+```python
+from airflow_client.client.models.bulk_update_action_bulk_task_instance_body import BulkUpdateActionBulkTaskInstanceBody
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of BulkUpdateActionBulkTaskInstanceBody from a JSON string
+bulk_update_action_bulk_task_instance_body_instance = BulkUpdateActionBulkTaskInstanceBody.from_json(json)
+# print the JSON string representation of the object
+print(bulk_update_action_bulk_task_instance_body_instance.to_json())
+
+# convert the object into a dict
+bulk_update_action_bulk_task_instance_body_dict = bulk_update_action_bulk_task_instance_body_instance.to_dict()
+# create an instance of BulkUpdateActionBulkTaskInstanceBody from a dict
+bulk_update_action_bulk_task_instance_body_from_dict = BulkUpdateActionBulkTaskInstanceBody.from_dict(bulk_update_action_bulk_task_instance_body_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/ClearTaskInstancesBody.md b/docs/ClearTaskInstancesBody.md
index 7418783..3ff6154 100644
--- a/docs/ClearTaskInstancesBody.md
+++ b/docs/ClearTaskInstancesBody.md
@@ -16,6 +16,7 @@
 **only_failed** | **bool** |  | [optional] [default to True]
 **only_running** | **bool** |  | [optional] [default to False]
 **reset_dag_runs** | **bool** |  | [optional] [default to True]
+**run_on_latest_version** | **bool** | (Experimental) Run on the latest bundle version of the dag after clearing the task instances. | [optional] [default to False]
 **start_date** | **datetime** |  | [optional] 
 **task_ids** | [**List[ClearTaskInstancesBodyTaskIdsInner]**](ClearTaskInstancesBodyTaskIdsInner.md) |  | [optional] 
 
diff --git a/docs/ConfigApi.md b/docs/ConfigApi.md
index 1954fef..38ea955 100644
--- a/docs/ConfigApi.md
+++ b/docs/ConfigApi.md
@@ -16,6 +16,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -36,6 +37,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -68,7 +74,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -96,6 +102,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -116,6 +123,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -150,7 +162,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/ConnectionApi.md b/docs/ConnectionApi.md
index af99887..07e933e 100644
--- a/docs/ConnectionApi.md
+++ b/docs/ConnectionApi.md
@@ -24,6 +24,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -45,6 +46,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -75,7 +81,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -103,6 +109,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -122,6 +129,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -146,7 +158,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -173,6 +185,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -192,6 +205,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -220,7 +238,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -249,6 +267,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -269,6 +288,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -299,7 +323,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -328,6 +352,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -348,14 +373,19 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.ConnectionApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
-    connection_id_pattern = 'connection_id_pattern_example' # str |  (optional)
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
+    connection_id_pattern = 'connection_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Connections
@@ -375,8 +405,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
- **connection_id_pattern** | **str**|  | [optional] 
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
+ **connection_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 
 ### Return type
 
@@ -384,7 +414,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -413,6 +443,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -434,6 +465,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -468,7 +504,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -498,6 +534,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -519,6 +556,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -549,7 +591,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -582,6 +624,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -603,6 +646,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -633,7 +681,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DAGApi.md b/docs/DAGApi.md
index 32f0f7a..2f3274c 100644
--- a/docs/DAGApi.md
+++ b/docs/DAGApi.md
@@ -5,12 +5,14 @@
 Method | HTTP request | Description
 ------------- | ------------- | -------------
 [**delete_dag**](DAGApi.md#delete_dag) | **DELETE** /api/v2/dags/{dag_id} | Delete Dag
+[**favorite_dag**](DAGApi.md#favorite_dag) | **POST** /api/v2/dags/{dag_id}/favorite | Favorite Dag
 [**get_dag**](DAGApi.md#get_dag) | **GET** /api/v2/dags/{dag_id} | Get Dag
 [**get_dag_details**](DAGApi.md#get_dag_details) | **GET** /api/v2/dags/{dag_id}/details | Get Dag Details
 [**get_dag_tags**](DAGApi.md#get_dag_tags) | **GET** /api/v2/dagTags | Get Dag Tags
 [**get_dags**](DAGApi.md#get_dags) | **GET** /api/v2/dags | Get Dags
 [**patch_dag**](DAGApi.md#patch_dag) | **PATCH** /api/v2/dags/{dag_id} | Patch Dag
 [**patch_dags**](DAGApi.md#patch_dags) | **PATCH** /api/v2/dags | Patch Dags
+[**unfavorite_dag**](DAGApi.md#unfavorite_dag) | **POST** /api/v2/dags/{dag_id}/unfavorite | Unfavorite Dag
 
 
 # **delete_dag**
@@ -23,6 +25,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -42,6 +45,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -72,7 +80,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -92,6 +100,89 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **favorite_dag**
+> favorite_dag(dag_id)
+
+Favorite Dag
+
+Mark the DAG as favorite.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.DAGApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+
+    try:
+        # Favorite Dag
+        api_instance.favorite_dag(dag_id)
+    except Exception as e:
+        print("Exception when calling DAGApi->favorite_dag: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+
+### Return type
+
+void (empty response body)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**204** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
 # **get_dag**
 > DAGResponse get_dag(dag_id)
 
@@ -102,6 +192,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -122,6 +213,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -152,7 +248,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -182,6 +278,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -202,6 +299,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -232,7 +334,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -262,6 +364,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -282,14 +385,19 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.DAGApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'name' # str |  (optional) (default to 'name')
-    tag_name_pattern = 'tag_name_pattern_example' # str |  (optional)
+    order_by = ["name"] # List[str] |  (optional) (default to ["name"])
+    tag_name_pattern = 'tag_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Dag Tags
@@ -309,8 +417,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'name']
- **tag_name_pattern** | **str**|  | [optional] 
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["name"]]
+ **tag_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 
 ### Return type
 
@@ -318,7 +426,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -337,7 +445,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_dags**
-> DAGCollectionResponse get_dags(limit=limit, offset=offset, tags=tags, tags_match_mode=tags_match_mode, owners=owners, dag_id_pattern=dag_id_pattern, dag_display_name_pattern=dag_display_name_pattern, exclude_stale=exclude_stale, paused=paused, last_dag_run_state=last_dag_run_state, dag_run_start_date_gte=dag_run_start_date_gte, dag_run_start_date_lte=dag_run_start_date_lte, dag_run_end_date_gte=dag_run_end_date_gte, dag_run_end_date_lte=dag_run_end_date_lte, dag_run_state=dag_run_state, order_by=order_by)
+> DAGCollectionResponse get_dags(limit=limit, offset=offset, tags=tags, tags_match_mode=tags_match_mode, owners=owners, dag_id_pattern=dag_id_pattern, dag_display_name_pattern=dag_display_name_pattern, exclude_stale=exclude_stale, paused=paused, has_import_errors=has_import_errors, last_dag_run_state=last_dag_run_state, bundle_name=bundle_name, bundle_version=bundle_version, has_asset_schedule=has_asset_schedule, asset_dependency=asset_dependency, dag_run_start_date_gte=dag_run_start_date_gte, dag_run_start_date_gt=dag_run_start_date_gt, dag_run_start_date_lte=dag_run_start_date_lte, dag_run_start_date_lt=dag_run_start_date_lt, dag_run_end_date_gte=dag_run_end_date_gte, dag_run_end_date_gt=dag_run_end_date_gt, dag_run_end_date_lte=dag_run_end_date_lte, dag_run_end_date_lt=dag_run_end_date_lt, dag_run_state=dag_run_state, order_by=order_by, is_favorite=is_favorite)
 
 Get Dags
 
@@ -346,6 +454,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -367,6 +476,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -376,21 +490,31 @@
     tags = ['tags_example'] # List[str] |  (optional)
     tags_match_mode = 'tags_match_mode_example' # str |  (optional)
     owners = ['owners_example'] # List[str] |  (optional)
-    dag_id_pattern = 'dag_id_pattern_example' # str |  (optional)
-    dag_display_name_pattern = 'dag_display_name_pattern_example' # str |  (optional)
+    dag_id_pattern = 'dag_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    dag_display_name_pattern = 'dag_display_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
     exclude_stale = True # bool |  (optional) (default to True)
     paused = True # bool |  (optional)
+    has_import_errors = True # bool | Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned. (optional)
     last_dag_run_state = airflow_client.client.DagRunState() # DagRunState |  (optional)
+    bundle_name = 'bundle_name_example' # str |  (optional)
+    bundle_version = 'bundle_version_example' # str |  (optional)
+    has_asset_schedule = True # bool | Filter Dags with asset-based scheduling (optional)
+    asset_dependency = 'asset_dependency_example' # str | Filter Dags by asset dependency (name or URI) (optional)
     dag_run_start_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    dag_run_start_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     dag_run_start_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    dag_run_start_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     dag_run_end_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    dag_run_end_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     dag_run_end_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    dag_run_end_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     dag_run_state = ['dag_run_state_example'] # List[str] |  (optional)
-    order_by = 'dag_id' # str |  (optional) (default to 'dag_id')
+    order_by = ["dag_id"] # List[str] |  (optional) (default to ["dag_id"])
+    is_favorite = True # bool |  (optional)
 
     try:
         # Get Dags
-        api_response = api_instance.get_dags(limit=limit, offset=offset, tags=tags, tags_match_mode=tags_match_mode, owners=owners, dag_id_pattern=dag_id_pattern, dag_display_name_pattern=dag_display_name_pattern, exclude_stale=exclude_stale, paused=paused, last_dag_run_state=last_dag_run_state, dag_run_start_date_gte=dag_run_start_date_gte, dag_run_start_date_lte=dag_run_start_date_lte, dag_run_end_date_gte=dag_run_end_date_gte, dag_run_end_date_lte=dag_run_end_date_lte, dag_run_state=dag_run_state, order_by=order_by)
+        api_response = api_instance.get_dags(limit=limit, offset=offset, tags=tags, tags_match_mode=tags_match_mode, owners=owners, dag_id_pattern=dag_id_pattern, dag_display_name_pattern=dag_display_name_pattern, exclude_stale=exclude_stale, paused=paused, has_import_errors=has_import_errors, last_dag_run_state=last_dag_run_state, bundle_name=bundle_name, bundle_version=bundle_version, has_asset_schedule=has_asset_schedule, asset_dependency=asset_dependency, dag_run_start_date_gte=dag_run_start_date_gte, dag_run_start_date_gt=dag_run_start_date_gt, dag_run_start_date_lte=dag_run_start_date_lte, dag_run_start_date_lt=dag_run_start_date_lt, dag_run_end_date_gte=dag_run_end_date_gte, dag_run_end_date_gt=dag_run_end_date_gt, dag_run_end_date_lte=dag_run_end_date_lte, dag_run_end_date_lt=dag_run_end_date_lt, dag_run_state=dag_run_state, order_by=order_by, is_favorite=is_favorite)
         print("The response of DAGApi->get_dags:\n")
         pprint(api_response)
     except Exception as e:
@@ -409,17 +533,27 @@
  **tags** | [**List[str]**](str.md)|  | [optional] 
  **tags_match_mode** | **str**|  | [optional] 
  **owners** | [**List[str]**](str.md)|  | [optional] 
- **dag_id_pattern** | **str**|  | [optional] 
- **dag_display_name_pattern** | **str**|  | [optional] 
+ **dag_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **dag_display_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
  **exclude_stale** | **bool**|  | [optional] [default to True]
  **paused** | **bool**|  | [optional] 
+ **has_import_errors** | **bool**| Filter Dags by having import errors. Only Dags that have been successfully loaded before will be returned. | [optional] 
  **last_dag_run_state** | [**DagRunState**](.md)|  | [optional] 
+ **bundle_name** | **str**|  | [optional] 
+ **bundle_version** | **str**|  | [optional] 
+ **has_asset_schedule** | **bool**| Filter Dags with asset-based scheduling | [optional] 
+ **asset_dependency** | **str**| Filter Dags by asset dependency (name or URI) | [optional] 
  **dag_run_start_date_gte** | **datetime**|  | [optional] 
+ **dag_run_start_date_gt** | **datetime**|  | [optional] 
  **dag_run_start_date_lte** | **datetime**|  | [optional] 
+ **dag_run_start_date_lt** | **datetime**|  | [optional] 
  **dag_run_end_date_gte** | **datetime**|  | [optional] 
+ **dag_run_end_date_gt** | **datetime**|  | [optional] 
  **dag_run_end_date_lte** | **datetime**|  | [optional] 
+ **dag_run_end_date_lt** | **datetime**|  | [optional] 
  **dag_run_state** | [**List[str]**](str.md)|  | [optional] 
- **order_by** | **str**|  | [optional] [default to 'dag_id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["dag_id"]]
+ **is_favorite** | **bool**|  | [optional] 
 
 ### Return type
 
@@ -427,7 +561,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -455,6 +589,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -476,6 +611,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -510,7 +650,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -540,6 +680,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -561,6 +702,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -572,7 +718,7 @@
     tags = ['tags_example'] # List[str] |  (optional)
     tags_match_mode = 'tags_match_mode_example' # str |  (optional)
     owners = ['owners_example'] # List[str] |  (optional)
-    dag_id_pattern = 'dag_id_pattern_example' # str |  (optional)
+    dag_id_pattern = 'dag_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
     exclude_stale = True # bool |  (optional) (default to True)
     paused = True # bool |  (optional)
 
@@ -599,7 +745,7 @@
  **tags** | [**List[str]**](str.md)|  | [optional] 
  **tags_match_mode** | **str**|  | [optional] 
  **owners** | [**List[str]**](str.md)|  | [optional] 
- **dag_id_pattern** | **str**|  | [optional] 
+ **dag_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
  **exclude_stale** | **bool**|  | [optional] [default to True]
  **paused** | **bool**|  | [optional] 
 
@@ -609,7 +755,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -629,3 +775,87 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **unfavorite_dag**
+> unfavorite_dag(dag_id)
+
+Unfavorite Dag
+
+Unmark the DAG as favorite.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.DAGApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+
+    try:
+        # Unfavorite Dag
+        api_instance.unfavorite_dag(dag_id)
+    except Exception as e:
+        print("Exception when calling DAGApi->unfavorite_dag: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+
+### Return type
+
+void (empty response body)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**204** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**409** | Conflict |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
diff --git a/docs/DAGDetailsResponse.md b/docs/DAGDetailsResponse.md
index dc71dc3..5f60764 100644
--- a/docs/DAGDetailsResponse.md
+++ b/docs/DAGDetailsResponse.md
@@ -10,10 +10,11 @@
 **bundle_name** | **str** |  | [optional] 
 **bundle_version** | **str** |  | [optional] 
 **catchup** | **bool** |  | 
-**concurrency** | **int** | Return max_active_tasks as concurrency. | [readonly] 
+**concurrency** | **int** | Return max_active_tasks as concurrency. Deprecated: Use max_active_tasks instead. | [readonly] 
 **dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_timeout** | **str** |  | [optional] 
+**default_args** | **object** |  | [optional] 
 **description** | **str** |  | [optional] 
 **doc_md** | **str** |  | [optional] 
 **end_date** | **datetime** |  | [optional] 
@@ -25,6 +26,7 @@
 **is_paused_upon_creation** | **bool** |  | [optional] 
 **is_stale** | **bool** |  | 
 **last_expired** | **datetime** |  | [optional] 
+**last_parse_duration** | **float** |  | [optional] 
 **last_parsed** | **datetime** |  | [optional] 
 **last_parsed_time** | **datetime** |  | [optional] 
 **latest_dag_version** | [**DagVersionResponse**](DagVersionResponse.md) |  | [optional] 
diff --git a/docs/DAGParsingApi.md b/docs/DAGParsingApi.md
index 85cff92..063e7b1 100644
--- a/docs/DAGParsingApi.md
+++ b/docs/DAGParsingApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -36,6 +37,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -66,7 +72,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DAGResponse.md b/docs/DAGResponse.md
index 37f8f2d..4a4619a 100644
--- a/docs/DAGResponse.md
+++ b/docs/DAGResponse.md
@@ -18,6 +18,7 @@
 **is_paused** | **bool** |  | 
 **is_stale** | **bool** |  | 
 **last_expired** | **datetime** |  | [optional] 
+**last_parse_duration** | **float** |  | [optional] 
 **last_parsed_time** | **datetime** |  | [optional] 
 **max_active_runs** | **int** |  | [optional] 
 **max_active_tasks** | **int** |  | 
diff --git a/docs/DAGRunApi.md b/docs/DAGRunApi.md
index 9695e00..fee40f2 100644
--- a/docs/DAGRunApi.md
+++ b/docs/DAGRunApi.md
@@ -12,6 +12,7 @@
 [**get_upstream_asset_events**](DagRunApi.md#get_upstream_asset_events) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamAssetEvents | Get Upstream Asset Events
 [**patch_dag_run**](DagRunApi.md#patch_dag_run) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} | Patch Dag Run
 [**trigger_dag_run**](DagRunApi.md#trigger_dag_run) | **POST** /api/v2/dags/{dag_id}/dagRuns | Trigger Dag Run
+[**wait_dag_run_until_finished**](DagRunApi.md#wait_dag_run_until_finished) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait | Experimental: Wait for a dag run to complete, and return task results if requested.
 
 
 # **clear_dag_run**
@@ -22,6 +23,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -43,6 +45,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -77,7 +84,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -106,6 +113,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -125,6 +133,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -155,7 +168,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -183,6 +196,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -203,6 +217,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -235,7 +254,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -255,7 +274,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_dag_runs**
-> DAGRunCollectionResponse get_dag_runs(dag_id, limit=limit, offset=offset, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, run_type=run_type, state=state, order_by=order_by)
+> DAGRunCollectionResponse get_dag_runs(dag_id, limit=limit, offset=offset, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, run_type=run_type, state=state, dag_version=dag_version, order_by=order_by, run_id_pattern=run_id_pattern, triggering_user_name_pattern=triggering_user_name_pattern)
 
 Get Dag Runs
 
@@ -266,6 +285,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -286,6 +306,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -294,22 +319,35 @@
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
     run_after_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     run_after_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     run_type = ['run_type_example'] # List[str] |  (optional)
     state = ['state_example'] # List[str] |  (optional)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    dag_version = [56] # List[int] |  (optional)
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
+    run_id_pattern = 'run_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    triggering_user_name_pattern = 'triggering_user_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Dag Runs
-        api_response = api_instance.get_dag_runs(dag_id, limit=limit, offset=offset, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, run_type=run_type, state=state, order_by=order_by)
+        api_response = api_instance.get_dag_runs(dag_id, limit=limit, offset=offset, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, run_type=run_type, state=state, dag_version=dag_version, order_by=order_by, run_id_pattern=run_id_pattern, triggering_user_name_pattern=triggering_user_name_pattern)
         print("The response of DagRunApi->get_dag_runs:\n")
         pprint(api_response)
     except Exception as e:
@@ -327,18 +365,31 @@
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
  **run_after_gte** | **datetime**|  | [optional] 
+ **run_after_gt** | **datetime**|  | [optional] 
  **run_after_lte** | **datetime**|  | [optional] 
+ **run_after_lt** | **datetime**|  | [optional] 
  **logical_date_gte** | **datetime**|  | [optional] 
+ **logical_date_gt** | **datetime**|  | [optional] 
  **logical_date_lte** | **datetime**|  | [optional] 
+ **logical_date_lt** | **datetime**|  | [optional] 
  **start_date_gte** | **datetime**|  | [optional] 
+ **start_date_gt** | **datetime**|  | [optional] 
  **start_date_lte** | **datetime**|  | [optional] 
+ **start_date_lt** | **datetime**|  | [optional] 
  **end_date_gte** | **datetime**|  | [optional] 
+ **end_date_gt** | **datetime**|  | [optional] 
  **end_date_lte** | **datetime**|  | [optional] 
+ **end_date_lt** | **datetime**|  | [optional] 
  **updated_at_gte** | **datetime**|  | [optional] 
+ **updated_at_gt** | **datetime**|  | [optional] 
  **updated_at_lte** | **datetime**|  | [optional] 
+ **updated_at_lt** | **datetime**|  | [optional] 
  **run_type** | [**List[str]**](str.md)|  | [optional] 
  **state** | [**List[str]**](str.md)|  | [optional] 
- **order_by** | **str**|  | [optional] [default to 'id']
+ **dag_version** | [**List[int]**](int.md)|  | [optional] 
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
+ **run_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **triggering_user_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
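The `*_pattern` filters above use SQL `LIKE` matching, not regular expressions: `%` matches any run of characters and `_` matches exactly one. A minimal local sketch of the same wildcard semantics using Python's bundled sqlite3 (the sample run IDs are illustrative; the API server applies an equivalent `LIKE` clause to the stored column):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (run_id TEXT)")
conn.executemany(
    "INSERT INTO dag_run VALUES (?)",
    [("manual__2024-01-01",), ("scheduled__2024-01-01",), ("manual__2024-02-01",)],
)

# `%manual%` matches any run_id containing the substring "manual"
rows = conn.execute(
    "SELECT run_id FROM dag_run WHERE run_id LIKE ? ORDER BY run_id", ("%manual%",)
).fetchall()
print([r[0] for r in rows])  # ['manual__2024-01-01', 'manual__2024-02-01']

# `_` stands in for exactly one character, so this matches both January and February runs
rows = conn.execute(
    "SELECT run_id FROM dag_run WHERE run_id LIKE ? ORDER BY run_id",
    ("manual__2024-0_-01",),
).fetchall()
print([r[0] for r in rows])  # ['manual__2024-01-01', 'manual__2024-02-01']
```

The same pattern strings can be passed as `run_id_pattern` or `triggering_user_name_pattern` in the call above.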
 
 ### Return type
 
@@ -346,7 +397,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -375,6 +426,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -396,6 +448,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -428,7 +485,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -457,6 +514,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -477,6 +535,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -509,7 +572,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -538,6 +601,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -559,6 +623,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -595,7 +664,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -625,6 +694,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -646,6 +716,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -678,7 +753,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -699,3 +774,93 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **wait_dag_run_until_finished**
+> object wait_dag_run_until_finished(dag_id, dag_run_id, interval, result=result)
+
+Experimental: Wait for a dag run to complete, and return task results if requested.
+
+🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.DagRunApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    interval = 3.4 # float | Seconds to wait between dag run state checks
+    result = ['result_example'] # List[str] | Collect result XCom from task. Can be set multiple times. (optional)
+
+    try:
+        # Experimental: Wait for a dag run to complete, and return task results if requested.
+        api_response = api_instance.wait_dag_run_until_finished(dag_id, dag_run_id, interval, result=result)
+        print("The response of DagRunApi->wait_dag_run_until_finished:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling DagRunApi->wait_dag_run_until_finished: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **interval** | **float**| Seconds to wait between dag run state checks | 
+ **result** | [**List[str]**](str.md)| Collect result XCom from task. Can be set multiple times. | [optional] 
+
+### Return type
+
+**object**
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json, application/x-ndjson
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
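The wait endpoint streams its response as newline-delimited JSON (NDJSON), one JSON object per line describing the DAG run state. A minimal sketch of consuming such a stream, using an in-memory sample in place of the live HTTP response (the `state` field name here is illustrative, not the exact server schema):

```python
import json

# Stand-in for the bytes/lines streamed back by the wait endpoint.
ndjson_stream = (
    '{"state": "running"}\n'
    '{"state": "running"}\n'
    '{"state": "success"}\n'
)

final_state = None
for line in ndjson_stream.splitlines():
    if not line.strip():
        continue
    event = json.loads(line)  # each NDJSON line is a complete JSON document
    final_state = event["state"]

print(final_state)  # "success"
```

The same loop applies to a real streamed response read line by line; the last object reflects the terminal state of the DAG run.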
diff --git a/docs/DAGRunClearBody.md b/docs/DAGRunClearBody.md
index 0ce2492..485ec58 100644
--- a/docs/DAGRunClearBody.md
+++ b/docs/DAGRunClearBody.md
@@ -8,6 +8,7 @@
 ------------ | ------------- | ------------- | -------------
 **dry_run** | **bool** |  | [optional] [default to True]
 **only_failed** | **bool** |  | [optional] [default to False]
+**run_on_latest_version** | **bool** | (Experimental) Run on the latest bundle version of the Dag after clearing the Dag Run. | [optional] [default to False]
 
 ## Example
 
diff --git a/docs/DAGRunResponse.md b/docs/DAGRunResponse.md
index 3545263..9df1d8e 100644
--- a/docs/DAGRunResponse.md
+++ b/docs/DAGRunResponse.md
@@ -8,11 +8,13 @@
 ------------ | ------------- | ------------- | -------------
 **bundle_version** | **str** |  | [optional] 
 **conf** | **object** |  | [optional] 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_id** | **str** |  | 
 **dag_versions** | [**List[DagVersionResponse]**](DagVersionResponse.md) |  | 
 **data_interval_end** | **datetime** |  | [optional] 
 **data_interval_start** | **datetime** |  | [optional] 
+**duration** | **float** |  | [optional] 
 **end_date** | **datetime** |  | [optional] 
 **last_scheduling_decision** | **datetime** |  | [optional] 
 **logical_date** | **datetime** |  | [optional] 
@@ -23,6 +25,7 @@
 **start_date** | **datetime** |  | [optional] 
 **state** | [**DagRunState**](DagRunState.md) |  | 
 **triggered_by** | [**DagRunTriggeredByType**](DagRunTriggeredByType.md) |  | [optional] 
+**triggering_user_name** | **str** |  | [optional] 
 
 ## Example
 
diff --git a/docs/DAGRunsBatchBody.md b/docs/DAGRunsBatchBody.md
index bd86e10..a710504 100644
--- a/docs/DAGRunsBatchBody.md
+++ b/docs/DAGRunsBatchBody.md
@@ -7,16 +7,24 @@
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
 **dag_ids** | **List[str]** |  | [optional] 
+**end_date_gt** | **datetime** |  | [optional] 
 **end_date_gte** | **datetime** |  | [optional] 
+**end_date_lt** | **datetime** |  | [optional] 
 **end_date_lte** | **datetime** |  | [optional] 
+**logical_date_gt** | **datetime** |  | [optional] 
 **logical_date_gte** | **datetime** |  | [optional] 
+**logical_date_lt** | **datetime** |  | [optional] 
 **logical_date_lte** | **datetime** |  | [optional] 
 **order_by** | **str** |  | [optional] 
 **page_limit** | **int** |  | [optional] [default to 100]
 **page_offset** | **int** |  | [optional] [default to 0]
+**run_after_gt** | **datetime** |  | [optional] 
 **run_after_gte** | **datetime** |  | [optional] 
+**run_after_lt** | **datetime** |  | [optional] 
 **run_after_lte** | **datetime** |  | [optional] 
+**start_date_gt** | **datetime** |  | [optional] 
 **start_date_gte** | **datetime** |  | [optional] 
+**start_date_lt** | **datetime** |  | [optional] 
 **start_date_lte** | **datetime** |  | [optional] 
 **states** | [**List[Optional[DagRunState]]**](DagRunState.md) |  | [optional] 
 
diff --git a/docs/DAGSourceResponse.md b/docs/DAGSourceResponse.md
index bb25c10..c806e90 100644
--- a/docs/DAGSourceResponse.md
+++ b/docs/DAGSourceResponse.md
@@ -7,6 +7,7 @@
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
 **content** | **str** |  | [optional] 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **version_number** | **int** |  | [optional] 
 
diff --git a/docs/DAGWarningResponse.md b/docs/DAGWarningResponse.md
index 446df5a..c16e979 100644
--- a/docs/DAGWarningResponse.md
+++ b/docs/DAGWarningResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **message** | **str** |  | 
 **timestamp** | **datetime** |  | 
diff --git a/docs/DagReportApi.md b/docs/DagReportApi.md
index d8ae434..0085a1a 100644
--- a/docs/DagReportApi.md
+++ b/docs/DagReportApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -36,6 +37,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -66,7 +72,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DagSourceApi.md b/docs/DagSourceApi.md
index 07bf41d..b407989 100644
--- a/docs/DagSourceApi.md
+++ b/docs/DagSourceApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -37,6 +38,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -71,7 +77,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DagStatsApi.md b/docs/DagStatsApi.md
index 8dc35da..9ce4546 100644
--- a/docs/DagStatsApi.md
+++ b/docs/DagStatsApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -37,6 +38,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -67,7 +73,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DagStatsResponse.md b/docs/DagStatsResponse.md
index 2366d8e..a061c1e 100644
--- a/docs/DagStatsResponse.md
+++ b/docs/DagStatsResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **stats** | [**List[DagStatsStateResponse]**](DagStatsStateResponse.md) |  | 
 
diff --git a/docs/DagTagResponse.md b/docs/DagTagResponse.md
index 5c79a92..95b94b2 100644
--- a/docs/DagTagResponse.md
+++ b/docs/DagTagResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **name** | **str** |  | 
 
diff --git a/docs/DagVersionApi.md b/docs/DagVersionApi.md
index d8a5c3f..9368793 100644
--- a/docs/DagVersionApi.md
+++ b/docs/DagVersionApi.md
@@ -18,6 +18,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -38,6 +39,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -70,7 +76,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -101,6 +107,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -121,6 +128,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -131,7 +143,7 @@
     version_number = 56 # int |  (optional)
     bundle_name = 'bundle_name_example' # str |  (optional)
     bundle_version = 'bundle_version_example' # str |  (optional)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
 
     try:
         # Get Dag Versions
@@ -155,7 +167,7 @@
  **version_number** | **int**|  | [optional] 
  **bundle_name** | **str**|  | [optional] 
  **bundle_version** | **str**|  | [optional] 
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
 
 ### Return type
 
@@ -163,7 +175,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/DagVersionResponse.md b/docs/DagVersionResponse.md
index a18910f..a89bf6a 100644
--- a/docs/DagVersionResponse.md
+++ b/docs/DagVersionResponse.md
@@ -10,6 +10,7 @@
 **bundle_url** | **str** |  | [optional] 
 **bundle_version** | **str** |  | [optional] 
 **created_at** | **datetime** |  | 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **id** | **str** |  | 
 **version_number** | **int** |  | 
diff --git a/docs/DagWarningApi.md b/docs/DagWarningApi.md
index 979c00e..70d3111 100644
--- a/docs/DagWarningApi.md
+++ b/docs/DagWarningApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -38,6 +39,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -46,7 +52,7 @@
     warning_type = airflow_client.client.DagWarningType() # DagWarningType |  (optional)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'dag_id' # str |  (optional) (default to 'dag_id')
+    order_by = ["dag_id"] # List[str] |  (optional) (default to ["dag_id"])
 
     try:
         # List Dag Warnings
@@ -68,7 +74,7 @@
  **warning_type** | [**DagWarningType**](.md)|  | [optional] 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'dag_id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["dag_id"]]
 
 ### Return type
 
@@ -76,7 +82,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
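With multi-sorting (AIP-84), `order_by` accepts a list of keys, and by the API's convention a leading `-` requests descending order on that key. A local sketch of the same semantics over sample warning rows (the server translates the list into SQL `ORDER BY` clauses; the helper below is illustrative, not part of the client):

```python
def multi_sort(rows, order_by):
    # Apply keys right-to-left so the first key has the highest precedence;
    # Python's sort is stable, so earlier orderings survive as tie-breakers.
    for key in reversed(order_by):
        reverse = key.startswith("-")
        field = key.lstrip("-")
        rows = sorted(rows, key=lambda r: r[field], reverse=reverse)
    return rows

rows = [
    {"dag_id": "b", "timestamp": 2},
    {"dag_id": "a", "timestamp": 1},
    {"dag_id": "a", "timestamp": 3},
]

out = multi_sort(rows, ["dag_id", "-timestamp"])
print([(r["dag_id"], r["timestamp"]) for r in out])
# [('a', 3), ('a', 1), ('b', 2)] -- dag_id ascending, timestamp descending within each dag_id
```

Passing the same list, e.g. `order_by=["dag_id", "-timestamp"]`, to a list endpoint yields the corresponding server-side ordering.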
diff --git a/docs/EventLogApi.md b/docs/EventLogApi.md
index d93573d..558a1d1 100644
--- a/docs/EventLogApi.md
+++ b/docs/EventLogApi.md
@@ -16,6 +16,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -36,6 +37,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -66,7 +72,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -86,7 +92,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_event_logs**
-> EventLogCollectionResponse get_event_logs(limit=limit, offset=offset, order_by=order_by, dag_id=dag_id, task_id=task_id, run_id=run_id, map_index=map_index, try_number=try_number, owner=owner, event=event, excluded_events=excluded_events, included_events=included_events, before=before, after=after)
+> EventLogCollectionResponse get_event_logs(limit=limit, offset=offset, order_by=order_by, dag_id=dag_id, task_id=task_id, run_id=run_id, map_index=map_index, try_number=try_number, owner=owner, event=event, excluded_events=excluded_events, included_events=included_events, before=before, after=after, dag_id_pattern=dag_id_pattern, task_id_pattern=task_id_pattern, run_id_pattern=run_id_pattern, owner_pattern=owner_pattern, event_pattern=event_pattern)
 
 Get Event Logs
 
@@ -95,6 +101,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -115,13 +122,18 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.EventLogApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
     dag_id = 'dag_id_example' # str |  (optional)
     task_id = 'task_id_example' # str |  (optional)
     run_id = 'run_id_example' # str |  (optional)
@@ -133,10 +145,15 @@
     included_events = ['included_events_example'] # List[str] |  (optional)
     before = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     after = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    dag_id_pattern = 'dag_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    task_id_pattern = 'task_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    run_id_pattern = 'run_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    owner_pattern = 'owner_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    event_pattern = 'event_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Event Logs
-        api_response = api_instance.get_event_logs(limit=limit, offset=offset, order_by=order_by, dag_id=dag_id, task_id=task_id, run_id=run_id, map_index=map_index, try_number=try_number, owner=owner, event=event, excluded_events=excluded_events, included_events=included_events, before=before, after=after)
+        api_response = api_instance.get_event_logs(limit=limit, offset=offset, order_by=order_by, dag_id=dag_id, task_id=task_id, run_id=run_id, map_index=map_index, try_number=try_number, owner=owner, event=event, excluded_events=excluded_events, included_events=included_events, before=before, after=after, dag_id_pattern=dag_id_pattern, task_id_pattern=task_id_pattern, run_id_pattern=run_id_pattern, owner_pattern=owner_pattern, event_pattern=event_pattern)
         print("The response of EventLogApi->get_event_logs:\n")
         pprint(api_response)
     except Exception as e:
@@ -152,7 +169,7 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
  **dag_id** | **str**|  | [optional] 
  **task_id** | **str**|  | [optional] 
  **run_id** | **str**|  | [optional] 
@@ -164,6 +181,11 @@
  **included_events** | [**List[str]**](str.md)|  | [optional] 
  **before** | **datetime**|  | [optional] 
  **after** | **datetime**|  | [optional] 
+ **dag_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **task_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **run_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **owner_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **event_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 
 ### Return type
 
@@ -171,7 +193,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/EventLogResponse.md b/docs/EventLogResponse.md
index b054803..23df114 100644
--- a/docs/EventLogResponse.md
+++ b/docs/EventLogResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | [optional] 
 **dag_id** | **str** |  | [optional] 
 **event** | **str** |  | 
 **event_log_id** | **int** |  | 
diff --git a/docs/ExperimentalApi.md b/docs/ExperimentalApi.md
new file mode 100644
index 0000000..940b81e
--- /dev/null
+++ b/docs/ExperimentalApi.md
@@ -0,0 +1,99 @@
+# airflow_client.client.ExperimentalApi
+
+All URIs are relative to *http://localhost*
+
+Method | HTTP request | Description
+------------- | ------------- | -------------
+[**wait_dag_run_until_finished**](ExperimentalApi.md#wait_dag_run_until_finished) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait | Experimental: Wait for a dag run to complete, and return task results if requested.
+
+
+# **wait_dag_run_until_finished**
+> object wait_dag_run_until_finished(dag_id, dag_run_id, interval, result=result)
+
+Experimental: Wait for a dag run to complete, and return task results if requested.
+
+🚧 This is an experimental endpoint and may change or be removed without notice. Successful responses are streamed as newline-delimited JSON (NDJSON). Each line is a JSON object representing the DAG run state.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.ExperimentalApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    interval = 3.4 # float | Seconds to wait between dag run state checks
+    result = ['result_example'] # List[str] | Collect result XCom from task. Can be set multiple times. (optional)
+
+    try:
+        # Experimental: Wait for a dag run to complete, and return task results if requested.
+        api_response = api_instance.wait_dag_run_until_finished(dag_id, dag_run_id, interval, result=result)
+        print("The response of ExperimentalApi->wait_dag_run_until_finished:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling ExperimentalApi->wait_dag_run_until_finished: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **interval** | **float**| Seconds to wait between dag run state checks | 
+ **result** | [**List[str]**](str.md)| Collect result XCom from task. Can be set multiple times. | [optional] 
+
+### Return type
+
+**object**
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json, application/x-ndjson
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
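Editor's note: the wait endpoint above streams NDJSON (`application/x-ndjson`), one JSON object per line. A minimal stdlib parser for such a body (the `state` key in the sample is an assumption for illustration, not the documented schema):

```python
import json

def parse_ndjson(body: str) -> list:
    # One JSON object per non-empty line of an NDJSON body.
    return [json.loads(line) for line in body.splitlines() if line.strip()]

# Hypothetical stream body; the real event shape is defined by the server.
body = '{"state": "running"}\n{"state": "success"}\n'
print([event["state"] for event in parse_ndjson(body)])  # ['running', 'success']
```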
diff --git a/docs/ExternalLogUrlResponse.md b/docs/ExternalLogUrlResponse.md
new file mode 100644
index 0000000..bce0a0d
--- /dev/null
+++ b/docs/ExternalLogUrlResponse.md
@@ -0,0 +1,30 @@
+# ExternalLogUrlResponse
+
+Response for the external log URL endpoint.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**url** | **str** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of ExternalLogUrlResponse from a JSON string
+external_log_url_response_instance = ExternalLogUrlResponse.from_json(json)
+# print the JSON string representation of the object
+print(external_log_url_response_instance.to_json())
+
+# convert the object into a dict
+external_log_url_response_dict = external_log_url_response_instance.to_dict()
+# create an instance of ExternalLogUrlResponse from a dict
+external_log_url_response_from_dict = ExternalLogUrlResponse.from_dict(external_log_url_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/ExternalViewResponse.md b/docs/ExternalViewResponse.md
new file mode 100644
index 0000000..beb3ad3
--- /dev/null
+++ b/docs/ExternalViewResponse.md
@@ -0,0 +1,36 @@
+# ExternalViewResponse
+
+Serializer for External View Plugin responses.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**category** | **str** |  | [optional] 
+**destination** | **str** |  | [optional] [default to 'nav']
+**href** | **str** |  | 
+**icon** | **str** |  | [optional] 
+**icon_dark_mode** | **str** |  | [optional] 
+**name** | **str** |  | 
+**url_route** | **str** |  | [optional] 
+
+## Example
+
+```python
+from airflow_client.client.models.external_view_response import ExternalViewResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of ExternalViewResponse from a JSON string
+external_view_response_instance = ExternalViewResponse.from_json(json)
+# print the JSON string representation of the object
+print(external_view_response_instance.to_json())
+
+# convert the object into a dict
+external_view_response_dict = external_view_response_instance.to_dict()
+# create an instance of ExternalViewResponse from a dict
+external_view_response_from_dict = ExternalViewResponse.from_dict(external_view_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/ExtraLinksApi.md b/docs/ExtraLinksApi.md
index 4d3a2db..94acdca 100644
--- a/docs/ExtraLinksApi.md
+++ b/docs/ExtraLinksApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -37,6 +38,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -73,7 +79,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/HITLDetail.md b/docs/HITLDetail.md
new file mode 100644
index 0000000..951d878
--- /dev/null
+++ b/docs/HITLDetail.md
@@ -0,0 +1,43 @@
+# HITLDetail
+
+Schema for Human-in-the-loop detail.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**assigned_users** | [**List[HITLUser]**](HITLUser.md) |  | [optional] 
+**body** | **str** |  | [optional] 
+**chosen_options** | **List[str]** |  | [optional] 
+**created_at** | **datetime** |  | 
+**defaults** | **List[str]** |  | [optional] 
+**multiple** | **bool** |  | [optional] [default to False]
+**options** | **List[str]** |  | 
+**params** | **object** |  | [optional] 
+**params_input** | **object** |  | [optional] 
+**responded_at** | **datetime** |  | [optional] 
+**responded_by_user** | [**HITLUser**](HITLUser.md) |  | [optional] 
+**response_received** | **bool** |  | [optional] [default to False]
+**subject** | **str** |  | 
+**task_instance** | [**TaskInstanceResponse**](TaskInstanceResponse.md) |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.hitl_detail import HITLDetail
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of HITLDetail from a JSON string
+hitl_detail_instance = HITLDetail.from_json(json)
+# print the JSON string representation of the object
+print(hitl_detail_instance.to_json())
+
+# convert the object into a dict
+hitl_detail_dict = hitl_detail_instance.to_dict()
+# create an instance of HITLDetail from a dict
+hitl_detail_from_dict = HITLDetail.from_dict(hitl_detail_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
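Editor's note: a HITL detail carries `options`, `chosen_options`, and a `multiple` flag. The stdlib sketch below illustrates the presumed selection rule (every choice must be among the offered options, and only one choice is allowed unless `multiple` is true); it is an illustration of the model's semantics, not client code:

```python
def validate_choice(options, chosen, multiple=False):
    # Presumed rule: every chosen option must be offered, and exactly
    # one may be chosen unless multiple=True.
    if not set(chosen) <= set(options):
        raise ValueError("chosen option not among the offered options")
    if not multiple and len(chosen) != 1:
        raise ValueError("exactly one option must be chosen")
    return list(chosen)

print(validate_choice(["approve", "reject"], ["approve"]))  # ['approve']
```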
diff --git a/docs/HITLDetailCollection.md b/docs/HITLDetailCollection.md
new file mode 100644
index 0000000..7740cd2
--- /dev/null
+++ b/docs/HITLDetailCollection.md
@@ -0,0 +1,31 @@
+# HITLDetailCollection
+
+Schema for a collection of Human-in-the-loop details.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**hitl_details** | [**List[HITLDetail]**](HITLDetail.md) |  | 
+**total_entries** | **int** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of HITLDetailCollection from a JSON string
+hitl_detail_collection_instance = HITLDetailCollection.from_json(json)
+# print the JSON string representation of the object
+print(hitl_detail_collection_instance.to_json())
+
+# convert the object into a dict
+hitl_detail_collection_dict = hitl_detail_collection_instance.to_dict()
+# create an instance of HITLDetailCollection from a dict
+hitl_detail_collection_from_dict = HITLDetailCollection.from_dict(hitl_detail_collection_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/HITLDetailResponse.md b/docs/HITLDetailResponse.md
new file mode 100644
index 0000000..2bde7a2
--- /dev/null
+++ b/docs/HITLDetailResponse.md
@@ -0,0 +1,33 @@
+# HITLDetailResponse
+
+Response returned when updating a Human-in-the-loop detail.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**chosen_options** | **List[str]** |  | 
+**params_input** | **object** |  | [optional] 
+**responded_at** | **datetime** |  | 
+**responded_by** | [**HITLUser**](HITLUser.md) |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of HITLDetailResponse from a JSON string
+hitl_detail_response_instance = HITLDetailResponse.from_json(json)
+# print the JSON string representation of the object
+print(hitl_detail_response_instance.to_json())
+
+# convert the object into a dict
+hitl_detail_response_dict = hitl_detail_response_instance.to_dict()
+# create an instance of HITLDetailResponse from a dict
+hitl_detail_response_from_dict = HITLDetailResponse.from_dict(hitl_detail_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/HITLUser.md b/docs/HITLUser.md
new file mode 100644
index 0000000..06e3818
--- /dev/null
+++ b/docs/HITLUser.md
@@ -0,0 +1,31 @@
+# HITLUser
+
+Schema for a Human-in-the-loop user.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**id** | **str** |  | 
+**name** | **str** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.hitl_user import HITLUser
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of HITLUser from a JSON string
+hitl_user_instance = HITLUser.from_json(json)
+# print the JSON string representation of the object
+print(hitl_user_instance.to_json())
+
+# convert the object into a dict
+hitl_user_dict = hitl_user_instance.to_dict()
+# create an instance of HITLUser from a dict
+hitl_user_from_dict = HITLUser.from_dict(hitl_user_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/ImportErrorApi.md b/docs/ImportErrorApi.md
index 5e16c18..943973a 100644
--- a/docs/ImportErrorApi.md
+++ b/docs/ImportErrorApi.md
@@ -18,6 +18,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -38,6 +39,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -68,7 +74,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -88,7 +94,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_import_errors**
-> ImportErrorCollectionResponse get_import_errors(limit=limit, offset=offset, order_by=order_by)
+> ImportErrorCollectionResponse get_import_errors(limit=limit, offset=offset, order_by=order_by, filename_pattern=filename_pattern)
 
 Get Import Errors
 
@@ -97,6 +103,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -117,17 +124,23 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.ImportErrorApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ['id'] # List[str] |  (optional) (default to ['id'])
+    filename_pattern = 'filename_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Import Errors
-        api_response = api_instance.get_import_errors(limit=limit, offset=offset, order_by=order_by)
+        api_response = api_instance.get_import_errors(limit=limit, offset=offset, order_by=order_by, filename_pattern=filename_pattern)
         print("The response of ImportErrorApi->get_import_errors:\n")
         pprint(api_response)
     except Exception as e:
@@ -143,7 +156,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ['id']]
+ **filename_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 
 ### Return type
 
@@ -151,7 +165,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
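Editor's note: `order_by` changed from a single string to a list of keys (multi-sort, AIP-84). As an illustration of multi-key ordering with a `-` prefix for descending (the prefix convention is carried over from the single-key form; treat it as an assumption here):

```python
def multi_sort(rows, order_by):
    # Stable-sort by each key from least to most significant, so the
    # first key in order_by ends up dominating. A leading '-' means
    # descending (assumed convention).
    for key in reversed(order_by):
        desc = key.startswith("-")
        rows = sorted(rows, key=lambda r: r[key.lstrip("-")], reverse=desc)
    return rows

rows = [
    {"dag_id": "b", "id": 2},
    {"dag_id": "a", "id": 3},
    {"dag_id": "a", "id": 1},
]
print([r["id"] for r in multi_sort(rows, ["dag_id", "-id"])])  # [3, 1, 2]
```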
diff --git a/docs/JobApi.md b/docs/JobApi.md
index 5f3999d..f7be0f2 100644
--- a/docs/JobApi.md
+++ b/docs/JobApi.md
@@ -8,7 +8,7 @@
 
 
 # **get_jobs**
-> JobCollectionResponse get_jobs(is_alive=is_alive, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, limit=limit, offset=offset, order_by=order_by, job_state=job_state, job_type=job_type, hostname=hostname, executor_class=executor_class)
+> JobCollectionResponse get_jobs(is_alive=is_alive, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, limit=limit, offset=offset, order_by=order_by, job_state=job_state, job_type=job_type, hostname=hostname, executor_class=executor_class)
 
 Get Jobs
 
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -37,18 +38,27 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.JobApi(api_client)
     is_alive = True # bool |  (optional)
     start_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
+    order_by = ['id'] # List[str] |  (optional) (default to ['id'])
     job_state = 'job_state_example' # str |  (optional)
     job_type = 'job_type_example' # str |  (optional)
     hostname = 'hostname_example' # str |  (optional)
@@ -56,7 +66,7 @@
 
     try:
         # Get Jobs
-        api_response = api_instance.get_jobs(is_alive=is_alive, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, limit=limit, offset=offset, order_by=order_by, job_state=job_state, job_type=job_type, hostname=hostname, executor_class=executor_class)
+        api_response = api_instance.get_jobs(is_alive=is_alive, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, limit=limit, offset=offset, order_by=order_by, job_state=job_state, job_type=job_type, hostname=hostname, executor_class=executor_class)
         print("The response of JobApi->get_jobs:\n")
         pprint(api_response)
     except Exception as e:
@@ -72,12 +82,16 @@
 ------------- | ------------- | ------------- | -------------
  **is_alive** | **bool**|  | [optional] 
  **start_date_gte** | **datetime**|  | [optional] 
+ **start_date_gt** | **datetime**|  | [optional] 
  **start_date_lte** | **datetime**|  | [optional] 
+ **start_date_lt** | **datetime**|  | [optional] 
  **end_date_gte** | **datetime**|  | [optional] 
+ **end_date_gt** | **datetime**|  | [optional] 
  **end_date_lte** | **datetime**|  | [optional] 
+ **end_date_lt** | **datetime**|  | [optional] 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ['id']]
  **job_state** | **str**|  | [optional] 
  **job_type** | **str**|  | [optional] 
  **hostname** | **str**|  | [optional] 
@@ -89,7 +103,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
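Editor's note: the new `_gt`/`_lt` parameters above sit alongside the existing `_gte`/`_lte` ones; the suffixes presumably mean exclusive versus inclusive bounds. A toy local comparison of the two:

```python
from datetime import datetime

starts = [datetime(2025, 1, 1), datetime(2025, 1, 2), datetime(2025, 1, 3)]
bound = datetime(2025, 1, 2)

# start_date_gte: inclusive lower bound; start_date_gt: exclusive (assumed).
gte = [d for d in starts if d >= bound]
gt = [d for d in starts if d > bound]
print(len(gte), len(gt))  # 2 1
```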
diff --git a/docs/JobResponse.md b/docs/JobResponse.md
index f20cbe6..74cb3d8 100644
--- a/docs/JobResponse.md
+++ b/docs/JobResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | [optional] 
 **dag_id** | **str** |  | [optional] 
 **end_date** | **datetime** |  | [optional] 
 **executor_class** | **str** |  | [optional] 
diff --git a/docs/LastAssetEventResponse.md b/docs/LastAssetEventResponse.md
new file mode 100644
index 0000000..485c02e
--- /dev/null
+++ b/docs/LastAssetEventResponse.md
@@ -0,0 +1,31 @@
+# LastAssetEventResponse
+
+Last asset event response serializer.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**id** | **int** |  | [optional] 
+**timestamp** | **datetime** |  | [optional] 
+
+## Example
+
+```python
+from airflow_client.client.models.last_asset_event_response import LastAssetEventResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of LastAssetEventResponse from a JSON string
+last_asset_event_response_instance = LastAssetEventResponse.from_json(json)
+# print the JSON string representation of the object
+print(last_asset_event_response_instance.to_json())
+
+# convert the object into a dict
+last_asset_event_response_dict = last_asset_event_response_instance.to_dict()
+# create an instance of LastAssetEventResponse from a dict
+last_asset_event_response_from_dict = LastAssetEventResponse.from_dict(last_asset_event_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/LoginApi.md b/docs/LoginApi.md
index 1205826..12bc074 100644
--- a/docs/LoginApi.md
+++ b/docs/LoginApi.md
@@ -6,6 +6,7 @@
 ------------- | ------------- | -------------
 [**login**](LoginApi.md#login) | **GET** /api/v2/auth/login | Login
 [**logout**](LoginApi.md#logout) | **GET** /api/v2/auth/logout | Logout
+[**refresh**](LoginApi.md#refresh) | **GET** /api/v2/auth/refresh | Refresh
 
 
 # **login**
@@ -146,3 +147,72 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **refresh**
+> object refresh(next=next)
+
+Refresh
+
+Refresh the authentication token.
+
+### Example
+
+
+```python
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.LoginApi(api_client)
+    next = 'next_example' # str |  (optional)
+
+    try:
+        # Refresh
+        api_response = api_instance.refresh(next=next)
+        print("The response of LoginApi->refresh:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling LoginApi->refresh: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **next** | **str**|  | [optional] 
+
+### Return type
+
+**object**
+
+### Authorization
+
+No authorization required
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**307** | Temporary Redirect |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
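Editor's note: the refresh endpoint is a plain GET with an optional `next` query parameter and no authorization. Building the request URL by hand (a sketch of what the generated client does internally):

```python
from urllib.parse import urlencode, urljoin

def refresh_url(host, next_url=None):
    # GET /api/v2/auth/refresh with an optional `next` query parameter.
    url = urljoin(host, "/api/v2/auth/refresh")
    if next_url is not None:
        url += "?" + urlencode({"next": next_url})
    return url

print(refresh_url("http://localhost", "/home"))
# http://localhost/api/v2/auth/refresh?next=%2Fhome
```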
diff --git a/docs/PluginApi.md b/docs/PluginApi.md
index 76108c8..e46d493 100644
--- a/docs/PluginApi.md
+++ b/docs/PluginApi.md
@@ -5,6 +5,7 @@
 Method | HTTP request | Description
 ------------- | ------------- | -------------
 [**get_plugins**](PluginApi.md#get_plugins) | **GET** /api/v2/plugins | Get Plugins
+[**import_errors**](PluginApi.md#import_errors) | **GET** /api/v2/plugins/importErrors | Import Errors
 
 
 # **get_plugins**
@@ -15,6 +16,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -35,6 +37,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -67,7 +74,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -85,3 +92,80 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **import_errors**
+> PluginImportErrorCollectionResponse import_errors()
+
+Import Errors
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+import airflow_client.client
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.PluginApi(api_client)
+
+    try:
+        # Import Errors
+        api_response = api_instance.import_errors()
+        print("The response of PluginApi->import_errors:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling PluginApi->import_errors: %s\n" % e)
+```
+
+
+
+### Parameters
+
+This endpoint does not need any parameter.
+
+### Return type
+
+[**PluginImportErrorCollectionResponse**](PluginImportErrorCollectionResponse.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
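Editor's note: per the model docs below, a `PluginImportErrorCollectionResponse` holds a list of `{source, error}` entries plus `total_entries`. A small stdlib example grouping errors by source file (the payload literal is fabricated for illustration):

```python
from collections import Counter

# Fabricated payload mirroring the documented fields
# (import_errors: list of {source, error}; total_entries: int).
payload = {
    "import_errors": [
        {"source": "plugins/a.py", "error": "ModuleNotFoundError: no module named 'x'"},
        {"source": "plugins/a.py", "error": "ImportError: cannot import name 'y'"},
        {"source": "plugins/b.py", "error": "SyntaxError: invalid syntax"},
    ],
    "total_entries": 3,
}

by_source = Counter(e["source"] for e in payload["import_errors"])
print(by_source.most_common(1))  # [('plugins/a.py', 2)]
```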
diff --git a/docs/PluginImportErrorCollectionResponse.md b/docs/PluginImportErrorCollectionResponse.md
new file mode 100644
index 0000000..a404f4b
--- /dev/null
+++ b/docs/PluginImportErrorCollectionResponse.md
@@ -0,0 +1,31 @@
+# PluginImportErrorCollectionResponse
+
+Plugin Import Error Collection serializer.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**import_errors** | [**List[PluginImportErrorResponse]**](PluginImportErrorResponse.md) |  | 
+**total_entries** | **int** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of PluginImportErrorCollectionResponse from a JSON string
+plugin_import_error_collection_response_instance = PluginImportErrorCollectionResponse.from_json(json)
+# print the JSON string representation of the object
+print(plugin_import_error_collection_response_instance.to_json())
+
+# convert the object into a dict
+plugin_import_error_collection_response_dict = plugin_import_error_collection_response_instance.to_dict()
+# create an instance of PluginImportErrorCollectionResponse from a dict
+plugin_import_error_collection_response_from_dict = PluginImportErrorCollectionResponse.from_dict(plugin_import_error_collection_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/PluginImportErrorResponse.md b/docs/PluginImportErrorResponse.md
new file mode 100644
index 0000000..751fbaa
--- /dev/null
+++ b/docs/PluginImportErrorResponse.md
@@ -0,0 +1,31 @@
+# PluginImportErrorResponse
+
+Plugin Import Error serializer for responses.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**error** | **str** |  | 
+**source** | **str** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.plugin_import_error_response import PluginImportErrorResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of PluginImportErrorResponse from a JSON string
+plugin_import_error_response_instance = PluginImportErrorResponse.from_json(json)
+# print the JSON string representation of the object
+print(plugin_import_error_response_instance.to_json())
+
+# convert the object into a dict
+plugin_import_error_response_dict = plugin_import_error_response_instance.to_dict()
+# create an instance of PluginImportErrorResponse from a dict
+plugin_import_error_response_from_dict = PluginImportErrorResponse.from_dict(plugin_import_error_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/PluginResponse.md b/docs/PluginResponse.md
index 04d1529..ad9c6a9 100644
--- a/docs/PluginResponse.md
+++ b/docs/PluginResponse.md
@@ -8,6 +8,7 @@
 ------------ | ------------- | ------------- | -------------
 **appbuilder_menu_items** | [**List[AppBuilderMenuItemResponse]**](AppBuilderMenuItemResponse.md) |  | 
 **appbuilder_views** | [**List[AppBuilderViewResponse]**](AppBuilderViewResponse.md) |  | 
+**external_views** | [**List[ExternalViewResponse]**](ExternalViewResponse.md) | Aggregate all external views. Both 'external_views' and 'appbuilder_menu_items' are included here. | 
 **fastapi_apps** | [**List[FastAPIAppResponse]**](FastAPIAppResponse.md) |  | 
 **fastapi_root_middlewares** | [**List[FastAPIRootMiddlewareResponse]**](FastAPIRootMiddlewareResponse.md) |  | 
 **flask_blueprints** | **List[str]** |  | 
@@ -16,6 +17,7 @@
 **macros** | **List[str]** |  | 
 **name** | **str** |  | 
 **operator_extra_links** | **List[str]** |  | 
+**react_apps** | [**List[ReactAppResponse]**](ReactAppResponse.md) |  | 
 **source** | **str** |  | 
 **timetables** | **List[str]** |  | 
 
diff --git a/docs/PoolApi.md b/docs/PoolApi.md
index e20b2c4..9d8aba9 100644
--- a/docs/PoolApi.md
+++ b/docs/PoolApi.md
@@ -22,6 +22,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -43,6 +44,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -73,7 +79,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -101,6 +107,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -120,6 +127,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -148,7 +160,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -178,6 +190,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -198,6 +211,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -228,7 +246,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -257,6 +275,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -277,14 +296,19 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.PoolApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
-    pool_name_pattern = 'pool_name_pattern_example' # str |  (optional)
+    order_by = ['id'] # List[str] |  (optional) (default to ['id'])
+    pool_name_pattern = 'pool_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Pools
@@ -304,8 +328,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
- **pool_name_pattern** | **str**|  | [optional] 
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ['id']]
+ **pool_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
 
 ### Return type
 
@@ -313,7 +337,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -342,6 +366,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -363,6 +388,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -397,7 +427,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -427,6 +457,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -448,6 +479,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -478,7 +514,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
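> **Note on pattern filters:** `pool_name_pattern` (and the other `*_pattern` / `*_search` filters added in this release) is a SQL `LIKE` expression, not a regular expression. To preview client-side which names a pattern will match before issuing a request, a `LIKE` pattern can be translated to a regex with the standard library alone. This helper is illustrative and not part of the generated client:

```python
import re

def like_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a SQL LIKE pattern (% = any run of chars, _ = one char) to a regex."""
    out = []
    for ch in pattern:
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))
    return re.compile("^" + "".join(out) + "$", re.IGNORECASE)

pools = ["default_pool", "customer_a", "customer_b", "batch"]
rx = like_to_regex("%customer_%")
print([p for p in pools if rx.match(p)])
```

`%` matches any run of characters and `_` exactly one, so `%customer_%` matches `customer_a` and `customer_b` but not `default_pool` or `batch`.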
diff --git a/docs/ProviderApi.md b/docs/ProviderApi.md
index 0916f1d..ddacc85 100644
--- a/docs/ProviderApi.md
+++ b/docs/ProviderApi.md
@@ -17,6 +17,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -37,6 +38,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -69,7 +75,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/QueuedEventResponse.md b/docs/QueuedEventResponse.md
index d96ae94..9a645a2 100644
--- a/docs/QueuedEventResponse.md
+++ b/docs/QueuedEventResponse.md
@@ -8,6 +8,7 @@
 ------------ | ------------- | ------------- | -------------
 **asset_id** | **int** |  | 
 **created_at** | **datetime** |  | 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 
 ## Example
diff --git a/docs/ReactAppResponse.md b/docs/ReactAppResponse.md
new file mode 100644
index 0000000..acbde08
--- /dev/null
+++ b/docs/ReactAppResponse.md
@@ -0,0 +1,36 @@
+# ReactAppResponse
+
+Serializer for React App Plugin responses.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**bundle_url** | **str** |  | 
+**category** | **str** |  | [optional] 
+**destination** | **str** |  | [optional] [default to 'nav']
+**icon** | **str** |  | [optional] 
+**icon_dark_mode** | **str** |  | [optional] 
+**name** | **str** |  | 
+**url_route** | **str** |  | [optional] 
+
+## Example
+
+```python
+from airflow_client.client.models.react_app_response import ReactAppResponse
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of ReactAppResponse from a JSON string
+react_app_response_instance = ReactAppResponse.from_json(json)
+# print the JSON string representation of the object
+print(ReactAppResponse.to_json())
+
+# convert the object into a dict
+react_app_response_dict = react_app_response_instance.to_dict()
+# create an instance of ReactAppResponse from a dict
+react_app_response_from_dict = ReactAppResponse.from_dict(react_app_response_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/ResponseClearDagRun.md b/docs/ResponseClearDagRun.md
index 650ac08..231a042 100644
--- a/docs/ResponseClearDagRun.md
+++ b/docs/ResponseClearDagRun.md
@@ -9,11 +9,13 @@
 **total_entries** | **int** |  | 
 **bundle_version** | **str** |  | [optional] 
 **conf** | **object** |  | [optional] 
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_id** | **str** |  | 
 **dag_versions** | [**List[DagVersionResponse]**](DagVersionResponse.md) |  | 
 **data_interval_end** | **datetime** |  | [optional] 
 **data_interval_start** | **datetime** |  | [optional] 
+**duration** | **float** |  | [optional] 
 **end_date** | **datetime** |  | [optional] 
 **last_scheduling_decision** | **datetime** |  | [optional] 
 **logical_date** | **datetime** |  | [optional] 
@@ -24,6 +26,7 @@
 **start_date** | **datetime** |  | [optional] 
 **state** | [**DagRunState**](DagRunState.md) |  | 
 **triggered_by** | [**DagRunTriggeredByType**](DagRunTriggeredByType.md) |  | [optional] 
+**triggering_user_name** | **str** |  | [optional] 
 
 ## Example
 
diff --git a/docs/ResponseGetXcomEntry.md b/docs/ResponseGetXcomEntry.md
index b105d59..9db7390 100644
--- a/docs/ResponseGetXcomEntry.md
+++ b/docs/ResponseGetXcomEntry.md
@@ -5,11 +5,13 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **key** | **str** |  | 
 **logical_date** | **datetime** |  | [optional] 
 **map_index** | **int** |  | 
 **run_id** | **str** |  | 
+**task_display_name** | **str** |  | 
 **task_id** | **str** |  | 
 **timestamp** | **datetime** |  | 
 **value** | **str** |  | 
diff --git a/docs/TaskApi.md b/docs/TaskApi.md
index 09fb32a..c1247b7 100644
--- a/docs/TaskApi.md
+++ b/docs/TaskApi.md
@@ -18,6 +18,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -38,6 +39,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -70,7 +76,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -100,6 +106,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -120,6 +127,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -152,7 +164,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/TaskInletAssetReference.md b/docs/TaskInletAssetReference.md
new file mode 100644
index 0000000..0e45131
--- /dev/null
+++ b/docs/TaskInletAssetReference.md
@@ -0,0 +1,33 @@
+# TaskInletAssetReference
+
+Task inlet reference serializer for assets.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**created_at** | **datetime** |  | 
+**dag_id** | **str** |  | 
+**task_id** | **str** |  | 
+**updated_at** | **datetime** |  | 
+
+## Example
+
+```python
+from airflow_client.client.models.task_inlet_asset_reference import TaskInletAssetReference
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of TaskInletAssetReference from a JSON string
+task_inlet_asset_reference_instance = TaskInletAssetReference.from_json(json)
+# print the JSON string representation of the object
+print(TaskInletAssetReference.to_json())
+
+# convert the object into a dict
+task_inlet_asset_reference_dict = task_inlet_asset_reference_instance.to_dict()
+# create an instance of TaskInletAssetReference from a dict
+task_inlet_asset_reference_from_dict = TaskInletAssetReference.from_dict(task_inlet_asset_reference_dict)
+```
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/TaskInstanceApi.md b/docs/TaskInstanceApi.md
index eae906b..991c449 100644
--- a/docs/TaskInstanceApi.md
+++ b/docs/TaskInstanceApi.md
@@ -4,7 +4,12 @@
 
 Method | HTTP request | Description
 ------------- | ------------- | -------------
+[**bulk_task_instances**](TaskInstanceApi.md#bulk_task_instances) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances | Bulk Task Instances
+[**delete_task_instance**](TaskInstanceApi.md#delete_task_instance) | **DELETE** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | Delete Task Instance
+[**get_external_log_url**](TaskInstanceApi.md#get_external_log_url) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/externalLogUrl/{try_number} | Get External Log Url
 [**get_extra_links**](TaskInstanceApi.md#get_extra_links) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links | Get Extra Links
+[**get_hitl_detail**](TaskInstanceApi.md#get_hitl_detail) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/hitlDetails | Get Hitl Detail
+[**get_hitl_details**](TaskInstanceApi.md#get_hitl_details) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/hitlDetails | Get Hitl Details
 [**get_log**](TaskInstanceApi.md#get_log) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{try_number} | Get Log
 [**get_mapped_task_instance**](TaskInstanceApi.md#get_mapped_task_instance) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index} | Get Mapped Task Instance
 [**get_mapped_task_instance_tries**](TaskInstanceApi.md#get_mapped_task_instance_tries) | **GET** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries | Get Mapped Task Instance Tries
@@ -22,8 +27,282 @@
 [**patch_task_instance_dry_run**](TaskInstanceApi.md#patch_task_instance_dry_run) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/dry_run | Patch Task Instance Dry Run
 [**patch_task_instance_dry_run_by_map_index**](TaskInstanceApi.md#patch_task_instance_dry_run_by_map_index) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/dry_run | Patch Task Instance Dry Run
 [**post_clear_task_instances**](TaskInstanceApi.md#post_clear_task_instances) | **POST** /api/v2/dags/{dag_id}/clearTaskInstances | Post Clear Task Instances
+[**update_hitl_detail**](TaskInstanceApi.md#update_hitl_detail) | **PATCH** /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/hitlDetails | Update Hitl Detail
 
 
+# **bulk_task_instances**
+> BulkResponse bulk_task_instances(dag_id, dag_run_id, bulk_body_bulk_task_instance_body)
+
+Bulk Task Instances
+
+Bulk update and delete task instances.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+from airflow_client.client.models.bulk_response import BulkResponse
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    bulk_body_bulk_task_instance_body = airflow_client.client.BulkBodyBulkTaskInstanceBody() # BulkBodyBulkTaskInstanceBody | 
+
+    try:
+        # Bulk Task Instances
+        api_response = api_instance.bulk_task_instances(dag_id, dag_run_id, bulk_body_bulk_task_instance_body)
+        print("The response of TaskInstanceApi->bulk_task_instances:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->bulk_task_instances: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **bulk_body_bulk_task_instance_body** | [**BulkBodyBulkTaskInstanceBody**](BulkBodyBulkTaskInstanceBody.md)|  | 
+
+### Return type
+
+[**BulkResponse**](BulkResponse.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
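> **Note:** `BulkBodyBulkTaskInstanceBody` follows the same bulk-action envelope used by the other bulk endpoints in this API: a list of actions, each naming an `action` and the `entities` it applies to. The payload below is a sketch inferred from that convention, not taken from this page; check `BulkBodyBulkTaskInstanceBody.md` for the authoritative schema:

```python
# Hypothetical bulk payload: the field names assume the generic Airflow
# bulk-action envelope and should be verified against the model docs.
payload = {
    "actions": [
        {
            "action": "update",
            "entities": [
                {"task_id": "extract", "state": "success"},
                {"task_id": "load", "state": "failed"},
            ],
        }
    ]
}
print(sorted(payload["actions"][0].keys()))
```

With the generated client, such a dict would go through `BulkBodyBulkTaskInstanceBody.from_dict(payload)` before being passed to `bulk_task_instances`.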
+
+# **delete_task_instance**
+> object delete_task_instance(dag_id, dag_run_id, task_id, map_index=map_index)
+
+Delete Task Instance
+
+Delete a task instance.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    task_id = 'task_id_example' # str | 
+    map_index = -1 # int |  (optional) (default to -1)
+
+    try:
+        # Delete Task Instance
+        api_response = api_instance.delete_task_instance(dag_id, dag_run_id, task_id, map_index=map_index)
+        print("The response of TaskInstanceApi->delete_task_instance:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->delete_task_instance: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **task_id** | **str**|  | 
+ **map_index** | **int**|  | [optional] [default to -1]
+
+### Return type
+
+**object**
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
+# **get_external_log_url**
+> ExternalLogUrlResponse get_external_log_url(dag_id, dag_run_id, task_id, try_number, map_index=map_index)
+
+Get External Log Url
+
+Get external log URL for a specific task instance.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    task_id = 'task_id_example' # str | 
+    try_number = 56 # int | 
+    map_index = -1 # int |  (optional) (default to -1)
+
+    try:
+        # Get External Log Url
+        api_response = api_instance.get_external_log_url(dag_id, dag_run_id, task_id, try_number, map_index=map_index)
+        print("The response of TaskInstanceApi->get_external_log_url:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->get_external_log_url: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **task_id** | **str**|  | 
+ **try_number** | **int**|  | 
+ **map_index** | **int**|  | [optional] [default to -1]
+
+### Return type
+
+[**ExternalLogUrlResponse**](ExternalLogUrlResponse.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**400** | Bad Request |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
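> **Note:** for callers that talk to this endpoint without the generated client (for example from a thin HTTP wrapper), the request path can be assembled with stdlib percent-encoding. The route shape comes from the table above; the helper itself is illustrative:

```python
from urllib.parse import quote, urlencode

def external_log_url_path(dag_id, dag_run_id, task_id, try_number, map_index=-1):
    # Percent-encode each path segment so ids containing '/' or spaces stay safe.
    path = "/api/v2/dags/{}/dagRuns/{}/taskInstances/{}/externalLogUrl/{}".format(
        quote(dag_id, safe=""), quote(dag_run_id, safe=""),
        quote(task_id, safe=""), try_number,
    )
    return path + "?" + urlencode({"map_index": map_index})

print(external_log_url_path("my_dag", "manual__2024-01-01", "extract", 1))
```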
+
 # **get_extra_links**
 > ExtraLinkCollectionResponse get_extra_links(dag_id, dag_run_id, task_id, map_index=map_index)
 
@@ -34,6 +313,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -54,6 +334,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -90,7 +375,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -109,6 +394,217 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **get_hitl_detail**
+> HITLDetail get_hitl_detail(dag_id, dag_run_id, task_id, map_index)
+
+Get Hitl Detail
+
+Get a Human-in-the-loop detail of a specific task instance.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.models.hitl_detail import HITLDetail
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    task_id = 'task_id_example' # str | 
+    map_index = 56 # int | 
+
+    try:
+        # Get Hitl Detail
+        api_response = api_instance.get_hitl_detail(dag_id, dag_run_id, task_id, map_index)
+        print("The response of TaskInstanceApi->get_hitl_detail:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->get_hitl_detail: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **task_id** | **str**|  | 
+ **map_index** | **int**|  | 
+
+### Return type
+
+[**HITLDetail**](HITLDetail.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
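> **Note:** because a Human-in-the-loop task waits for a human response, clients typically poll this endpoint until the detail reports one. A small generic polling helper (stdlib only; `fetch` stands in for a call to `get_hitl_detail`, and the `response_received` field name mirrors the filter of the same name on `get_hitl_details`):

```python
import time

def poll_until(fetch, done, attempts=10, delay=0.0):
    """Call fetch() up to `attempts` times until done(result) is truthy."""
    for _ in range(attempts):
        result = fetch()
        if done(result):
            return result
        time.sleep(delay)
    raise TimeoutError("no response within the polling budget")

# Simulated responses standing in for successive get_hitl_detail calls.
responses = iter([{"response_received": False}, {"response_received": True}])
detail = poll_until(lambda: next(responses), lambda d: d["response_received"])
print(detail["response_received"])
```

In real use, `fetch` would be `lambda: api_instance.get_hitl_detail(dag_id, dag_run_id, task_id, map_index)` with a non-zero `delay`.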
+
+# **get_hitl_details**
+> HITLDetailCollection get_hitl_details(dag_id, dag_run_id, limit=limit, offset=offset, order_by=order_by, dag_id_pattern=dag_id_pattern, task_id=task_id, task_id_pattern=task_id_pattern, map_index=map_index, state=state, response_received=response_received, responded_by_user_id=responded_by_user_id, responded_by_user_name=responded_by_user_name, subject_search=subject_search, body_search=body_search, created_at_gte=created_at_gte, created_at_gt=created_at_gt, created_at_lte=created_at_lte, created_at_lt=created_at_lt)
+
+Get Hitl Details
+
+Get Human-in-the-loop details.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    limit = 50 # int |  (optional) (default to 50)
+    offset = 0 # int |  (optional) (default to 0)
+    order_by = ["ti_id"] # List[str] |  (optional) (default to ["ti_id"])
+    dag_id_pattern = 'dag_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    task_id = 'task_id_example' # str |  (optional)
+    task_id_pattern = 'task_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    map_index = 56 # int |  (optional)
+    state = ['state_example'] # List[str] |  (optional)
+    response_received = True # bool |  (optional)
+    responded_by_user_id = ['responded_by_user_id_example'] # List[str] |  (optional)
+    responded_by_user_name = ['responded_by_user_name_example'] # List[str] |  (optional)
+    subject_search = 'subject_search_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    body_search = 'body_search_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    created_at_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    created_at_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    created_at_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    created_at_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+
+    try:
+        # Get Hitl Details
+        api_response = api_instance.get_hitl_details(dag_id, dag_run_id, limit=limit, offset=offset, order_by=order_by, dag_id_pattern=dag_id_pattern, task_id=task_id, task_id_pattern=task_id_pattern, map_index=map_index, state=state, response_received=response_received, responded_by_user_id=responded_by_user_id, responded_by_user_name=responded_by_user_name, subject_search=subject_search, body_search=body_search, created_at_gte=created_at_gte, created_at_gt=created_at_gt, created_at_lte=created_at_lte, created_at_lt=created_at_lt)
+        print("The response of TaskInstanceApi->get_hitl_details:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->get_hitl_details: %s\n" % e)
+```
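The `*_pattern` and `*_search` parameters above take SQL LIKE expressions, where `%` matches any run of characters and `_` matches exactly one. As an illustrative sketch (not part of the client, and ignoring backend-specific collation rules), the matching semantics can be reproduced in plain Python by translating the pattern to a regular expression:

```python
import re

def like_to_regex(pattern: str) -> re.Pattern:
    """Translate a SQL LIKE pattern (% = any run, _ = one char) to a regex."""
    out = []
    for ch in pattern:
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))  # treat everything else literally
    return re.compile("^" + "".join(out) + "$")

# '%customer_%' requires 'customer' followed by at least one more character
rx = like_to_regex("%customer_%")
print(bool(rx.match("new customer1 signup")))  # True
print(bool(rx.match("customer")))              # False: '_' needs one char after
```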
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **limit** | **int**|  | [optional] [default to 50]
+ **offset** | **int**|  | [optional] [default to 0]
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["ti_id"]]
+ **dag_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **task_id** | **str**|  | [optional] 
+ **task_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **map_index** | **int**|  | [optional] 
+ **state** | [**List[str]**](str.md)|  | [optional] 
+ **response_received** | **bool**|  | [optional] 
+ **responded_by_user_id** | [**List[str]**](str.md)|  | [optional] 
+ **responded_by_user_name** | [**List[str]**](str.md)|  | [optional] 
+ **subject_search** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **body_search** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **created_at_gte** | **datetime**|  | [optional] 
+ **created_at_gt** | **datetime**|  | [optional] 
+ **created_at_lte** | **datetime**|  | [optional] 
+ **created_at_lt** | **datetime**|  | [optional] 
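
The four `created_at_*` filters pair inclusive (`_gte`/`_lte`) and exclusive (`_gt`/`_lt`) bounds. For contiguous, non-overlapping time windows, a common convention is an inclusive lower bound with an exclusive upper bound. A minimal sketch of building such a one-day window (the window variables are illustrative, not client API):

```python
from datetime import datetime, timedelta, timezone

# Half-open window [start, start + 1 day): pass as created_at_gte / created_at_lt
window_start = datetime(2013, 10, 20, tzinfo=timezone.utc)
window_end = window_start + timedelta(days=1)

created_at_gte = window_start.isoformat()
created_at_lt = window_end.isoformat()

print(created_at_gte)  # 2013-10-20T00:00:00+00:00
print(created_at_lt)   # 2013-10-21T00:00:00+00:00
```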
+
+### Return type
+
+[**HITLDetailCollection**](HITLDetailCollection.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: Not defined
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
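List endpoints such as `get_hitl_details` page results with `limit` and `offset`. A generic paging helper, sketched here with a stubbed fetch callable rather than a live API call, collects every page by advancing `offset` until a short page is returned:

```python
from typing import Callable, List

def fetch_all(fetch_page: Callable[[int, int], List[dict]], limit: int = 50) -> List[dict]:
    """Collect all items from a limit/offset endpoint by paging until a short page."""
    items: List[dict] = []
    offset = 0
    while True:
        page = fetch_page(limit, offset)
        items.extend(page)
        if len(page) < limit:
            return items
        offset += limit

# Stand-in for a callable wrapping an API call with limit=/offset= keywords,
# e.g. one built around api_instance.get_hitl_details(...)
fake_store = [{"ti_id": i} for i in range(120)]
fake_fetch = lambda limit, offset: fake_store[offset:offset + limit]
print(len(fetch_all(fake_fetch)))  # 120
```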
 # **get_log**
 > TaskInstancesLogResponse get_log(dag_id, dag_run_id, task_id, try_number, full_content=full_content, map_index=map_index, token=token, accept=accept)
 
@@ -119,6 +615,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -139,6 +636,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -183,7 +685,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -212,6 +714,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -232,6 +735,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -268,7 +776,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -295,6 +803,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -315,6 +824,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -351,7 +865,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -378,6 +892,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -398,6 +913,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -436,7 +956,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -456,7 +976,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_mapped_task_instances**
-> TaskInstanceCollectionResponse get_mapped_task_instances(dag_id, dag_run_id, task_id, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, duration_gte=duration_gte, duration_lte=duration_lte, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, limit=limit, offset=offset, order_by=order_by)
+> TaskInstanceCollectionResponse get_mapped_task_instances(dag_id, dag_run_id, task_id, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, duration_gte=duration_gte, duration_gt=duration_gt, duration_lte=duration_lte, duration_lt=duration_lt, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, try_number=try_number, operator=operator, map_index=map_index, limit=limit, offset=offset, order_by=order_by)
 
 Get Mapped Task Instances
 
@@ -465,6 +985,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -485,6 +1006,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -493,29 +1019,44 @@
     dag_run_id = 'dag_run_id_example' # str | 
     task_id = 'task_id_example' # str | 
     run_after_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     run_after_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     duration_gte = 3.4 # float |  (optional)
+    duration_gt = 3.4 # float |  (optional)
     duration_lte = 3.4 # float |  (optional)
+    duration_lt = 3.4 # float |  (optional)
     state = ['state_example'] # List[str] |  (optional)
     pool = ['pool_example'] # List[str] |  (optional)
     queue = ['queue_example'] # List[str] |  (optional)
     executor = ['executor_example'] # List[str] |  (optional)
     version_number = [56] # List[int] |  (optional)
+    try_number = [56] # List[int] |  (optional)
+    operator = ['operator_example'] # List[str] |  (optional)
+    map_index = [56] # List[int] |  (optional)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'map_index' # str |  (optional) (default to 'map_index')
+    order_by = ["map_index"] # List[str] |  (optional) (default to ["map_index"])
 
     try:
         # Get Mapped Task Instances
-        api_response = api_instance.get_mapped_task_instances(dag_id, dag_run_id, task_id, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, duration_gte=duration_gte, duration_lte=duration_lte, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, limit=limit, offset=offset, order_by=order_by)
+        api_response = api_instance.get_mapped_task_instances(dag_id, dag_run_id, task_id, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, duration_gte=duration_gte, duration_gt=duration_gt, duration_lte=duration_lte, duration_lt=duration_lt, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, try_number=try_number, operator=operator, map_index=map_index, limit=limit, offset=offset, order_by=order_by)
         print("The response of TaskInstanceApi->get_mapped_task_instances:\n")
         pprint(api_response)
     except Exception as e:
@@ -533,25 +1074,40 @@
  **dag_run_id** | **str**|  | 
  **task_id** | **str**|  | 
  **run_after_gte** | **datetime**|  | [optional] 
+ **run_after_gt** | **datetime**|  | [optional] 
  **run_after_lte** | **datetime**|  | [optional] 
+ **run_after_lt** | **datetime**|  | [optional] 
  **logical_date_gte** | **datetime**|  | [optional] 
+ **logical_date_gt** | **datetime**|  | [optional] 
  **logical_date_lte** | **datetime**|  | [optional] 
+ **logical_date_lt** | **datetime**|  | [optional] 
  **start_date_gte** | **datetime**|  | [optional] 
+ **start_date_gt** | **datetime**|  | [optional] 
  **start_date_lte** | **datetime**|  | [optional] 
+ **start_date_lt** | **datetime**|  | [optional] 
  **end_date_gte** | **datetime**|  | [optional] 
+ **end_date_gt** | **datetime**|  | [optional] 
  **end_date_lte** | **datetime**|  | [optional] 
+ **end_date_lt** | **datetime**|  | [optional] 
  **updated_at_gte** | **datetime**|  | [optional] 
+ **updated_at_gt** | **datetime**|  | [optional] 
  **updated_at_lte** | **datetime**|  | [optional] 
+ **updated_at_lt** | **datetime**|  | [optional] 
  **duration_gte** | **float**|  | [optional] 
+ **duration_gt** | **float**|  | [optional] 
  **duration_lte** | **float**|  | [optional] 
+ **duration_lt** | **float**|  | [optional] 
  **state** | [**List[str]**](str.md)|  | [optional] 
  **pool** | [**List[str]**](str.md)|  | [optional] 
  **queue** | [**List[str]**](str.md)|  | [optional] 
  **executor** | [**List[str]**](str.md)|  | [optional] 
  **version_number** | [**List[int]**](int.md)|  | [optional] 
+ **try_number** | [**List[int]**](int.md)|  | [optional] 
+ **operator** | [**List[str]**](str.md)|  | [optional] 
+ **map_index** | [**List[int]**](int.md)|  | [optional] 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'map_index']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["map_index"]]
 
 ### Return type
 
@@ -559,7 +1115,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -588,6 +1144,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -608,6 +1165,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -642,7 +1204,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -671,6 +1233,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -691,6 +1254,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -727,7 +1295,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -756,6 +1324,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -776,6 +1345,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -812,7 +1386,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -841,6 +1415,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -861,6 +1436,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -897,7 +1477,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -926,6 +1506,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -946,6 +1527,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -984,7 +1570,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1004,7 +1590,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_task_instances**
-> TaskInstanceCollectionResponse get_task_instances(dag_id, dag_run_id, task_id=task_id, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, duration_gte=duration_gte, duration_lte=duration_lte, task_display_name_pattern=task_display_name_pattern, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, limit=limit, offset=offset, order_by=order_by)
+> TaskInstanceCollectionResponse get_task_instances(dag_id, dag_run_id, task_id=task_id, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, duration_gte=duration_gte, duration_gt=duration_gt, duration_lte=duration_lte, duration_lt=duration_lt, task_display_name_pattern=task_display_name_pattern, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, try_number=try_number, operator=operator, map_index=map_index, limit=limit, offset=offset, order_by=order_by)
 
 Get Task Instances
 
@@ -1016,6 +1602,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1036,6 +1623,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1044,30 +1636,45 @@
     dag_run_id = 'dag_run_id_example' # str | 
     task_id = 'task_id_example' # str |  (optional)
     run_after_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     run_after_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     logical_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     start_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    start_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     end_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    end_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     updated_at_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    updated_at_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
     duration_gte = 3.4 # float |  (optional)
+    duration_gt = 3.4 # float |  (optional)
     duration_lte = 3.4 # float |  (optional)
-    task_display_name_pattern = 'task_display_name_pattern_example' # str |  (optional)
+    duration_lt = 3.4 # float |  (optional)
+    task_display_name_pattern = 'task_display_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
     state = ['state_example'] # List[str] |  (optional)
     pool = ['pool_example'] # List[str] |  (optional)
     queue = ['queue_example'] # List[str] |  (optional)
     executor = ['executor_example'] # List[str] |  (optional)
     version_number = [56] # List[int] |  (optional)
+    try_number = [56] # List[int] |  (optional)
+    operator = ['operator_example'] # List[str] |  (optional)
+    map_index = [56] # List[int] |  (optional)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'map_index' # str |  (optional) (default to 'map_index')
+    order_by = ["map_index"] # List[str] |  (optional) (default to ["map_index"])
 
     try:
         # Get Task Instances
-        api_response = api_instance.get_task_instances(dag_id, dag_run_id, task_id=task_id, run_after_gte=run_after_gte, run_after_lte=run_after_lte, logical_date_gte=logical_date_gte, logical_date_lte=logical_date_lte, start_date_gte=start_date_gte, start_date_lte=start_date_lte, end_date_gte=end_date_gte, end_date_lte=end_date_lte, updated_at_gte=updated_at_gte, updated_at_lte=updated_at_lte, duration_gte=duration_gte, duration_lte=duration_lte, task_display_name_pattern=task_display_name_pattern, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, limit=limit, offset=offset, order_by=order_by)
+        api_response = api_instance.get_task_instances(dag_id, dag_run_id, task_id=task_id, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, start_date_gte=start_date_gte, start_date_gt=start_date_gt, start_date_lte=start_date_lte, start_date_lt=start_date_lt, end_date_gte=end_date_gte, end_date_gt=end_date_gt, end_date_lte=end_date_lte, end_date_lt=end_date_lt, updated_at_gte=updated_at_gte, updated_at_gt=updated_at_gt, updated_at_lte=updated_at_lte, updated_at_lt=updated_at_lt, duration_gte=duration_gte, duration_gt=duration_gt, duration_lte=duration_lte, duration_lt=duration_lt, task_display_name_pattern=task_display_name_pattern, state=state, pool=pool, queue=queue, executor=executor, version_number=version_number, try_number=try_number, operator=operator, map_index=map_index, limit=limit, offset=offset, order_by=order_by)
         print("The response of TaskInstanceApi->get_task_instances:\n")
         pprint(api_response)
     except Exception as e:
@@ -1085,26 +1692,41 @@
  **dag_run_id** | **str**|  | 
  **task_id** | **str**|  | [optional] 
  **run_after_gte** | **datetime**|  | [optional] 
+ **run_after_gt** | **datetime**|  | [optional] 
  **run_after_lte** | **datetime**|  | [optional] 
+ **run_after_lt** | **datetime**|  | [optional] 
  **logical_date_gte** | **datetime**|  | [optional] 
+ **logical_date_gt** | **datetime**|  | [optional] 
  **logical_date_lte** | **datetime**|  | [optional] 
+ **logical_date_lt** | **datetime**|  | [optional] 
  **start_date_gte** | **datetime**|  | [optional] 
+ **start_date_gt** | **datetime**|  | [optional] 
  **start_date_lte** | **datetime**|  | [optional] 
+ **start_date_lt** | **datetime**|  | [optional] 
  **end_date_gte** | **datetime**|  | [optional] 
+ **end_date_gt** | **datetime**|  | [optional] 
  **end_date_lte** | **datetime**|  | [optional] 
+ **end_date_lt** | **datetime**|  | [optional] 
  **updated_at_gte** | **datetime**|  | [optional] 
+ **updated_at_gt** | **datetime**|  | [optional] 
  **updated_at_lte** | **datetime**|  | [optional] 
+ **updated_at_lt** | **datetime**|  | [optional] 
  **duration_gte** | **float**|  | [optional] 
+ **duration_gt** | **float**|  | [optional] 
  **duration_lte** | **float**|  | [optional] 
- **task_display_name_pattern** | **str**|  | [optional] 
+ **duration_lt** | **float**|  | [optional] 
+ **task_display_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
  **state** | [**List[str]**](str.md)|  | [optional] 
  **pool** | [**List[str]**](str.md)|  | [optional] 
  **queue** | [**List[str]**](str.md)|  | [optional] 
  **executor** | [**List[str]**](str.md)|  | [optional] 
  **version_number** | [**List[int]**](int.md)|  | [optional] 
+ **try_number** | [**List[int]**](int.md)|  | [optional] 
+ **operator** | [**List[str]**](str.md)|  | [optional] 
+ **map_index** | [**List[int]**](int.md)|  | [optional] 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'map_index']
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["map_index"]]
 
 ### Return type
 
@@ -1112,7 +1734,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1141,6 +1763,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1162,6 +1785,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1196,7 +1824,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1225,6 +1853,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1246,6 +1875,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1286,7 +1920,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1317,6 +1951,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1338,6 +1973,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1378,7 +2018,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1409,6 +2049,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1430,6 +2071,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1470,7 +2116,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1500,6 +2146,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1521,6 +2168,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1561,7 +2213,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1591,6 +2243,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -1612,6 +2265,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -1644,7 +2302,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -1663,3 +2321,98 @@
 
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
+# **update_hitl_detail**
+> HITLDetailResponse update_hitl_detail(dag_id, dag_run_id, task_id, map_index, update_hitl_detail_payload)
+
+Update Hitl Detail
+
+Update a Human-in-the-loop detail.
+
+### Example
+
+* OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
+
+```python
+import os
+
+import airflow_client.client
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
+from airflow_client.client.rest import ApiException
+from pprint import pprint
+
+# Defining the host is optional and defaults to http://localhost
+# See configuration.py for a list of all supported configuration parameters.
+configuration = airflow_client.client.Configuration(
+    host = "http://localhost"
+)
+
+# The client must configure the authentication and authorization parameters
+# in accordance with the API server security policy.
+# Examples for each auth method are provided below, use the example that
+# satisfies your auth use case.
+
+configuration.access_token = os.environ["ACCESS_TOKEN"]
+
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
+# Enter a context with an instance of the API client
+with airflow_client.client.ApiClient(configuration) as api_client:
+    # Create an instance of the API class
+    api_instance = airflow_client.client.TaskInstanceApi(api_client)
+    dag_id = 'dag_id_example' # str | 
+    dag_run_id = 'dag_run_id_example' # str | 
+    task_id = 'task_id_example' # str | 
+    map_index = 56 # int | 
+    update_hitl_detail_payload = airflow_client.client.UpdateHITLDetailPayload() # UpdateHITLDetailPayload | 
+
+    try:
+        # Update Hitl Detail
+        api_response = api_instance.update_hitl_detail(dag_id, dag_run_id, task_id, map_index, update_hitl_detail_payload)
+        print("The response of TaskInstanceApi->update_hitl_detail:\n")
+        pprint(api_response)
+    except Exception as e:
+        print("Exception when calling TaskInstanceApi->update_hitl_detail: %s\n" % e)
+```
+
+
+
+### Parameters
+
+
+Name | Type | Description  | Notes
+------------- | ------------- | ------------- | -------------
+ **dag_id** | **str**|  | 
+ **dag_run_id** | **str**|  | 
+ **task_id** | **str**|  | 
+ **map_index** | **int**|  | 
+ **update_hitl_detail_payload** | [**UpdateHITLDetailPayload**](UpdateHITLDetailPayload.md)|  | 
+
+### Return type
+
+[**HITLDetailResponse**](HITLDetailResponse.md)
+
+### Authorization
+
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
+
+### HTTP request headers
+
+ - **Content-Type**: application/json
+ - **Accept**: application/json
+
+### HTTP response details
+
+| Status code | Description | Response headers |
+|-------------|-------------|------------------|
+**200** | Successful Response |  -  |
+**401** | Unauthorized |  -  |
+**403** | Forbidden |  -  |
+**404** | Not Found |  -  |
+**409** | Conflict |  -  |
+**422** | Validation Error |  -  |
+
+[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
+
diff --git a/docs/TaskInstanceHistoryResponse.md b/docs/TaskInstanceHistoryResponse.md
index ee09369..cfe2ce2 100644
--- a/docs/TaskInstanceHistoryResponse.md
+++ b/docs/TaskInstanceHistoryResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_id** | **str** |  | 
 **dag_version** | [**DagVersionResponse**](DagVersionResponse.md) |  | [optional] 
@@ -17,6 +18,7 @@
 **map_index** | **int** |  | 
 **max_tries** | **int** |  | 
 **operator** | **str** |  | [optional] 
+**operator_name** | **str** |  | [optional] 
 **pid** | **int** |  | [optional] 
 **pool** | **str** |  | 
 **pool_slots** | **int** |  | 
diff --git a/docs/TaskInstanceResponse.md b/docs/TaskInstanceResponse.md
index c4d711b..8fd5904 100644
--- a/docs/TaskInstanceResponse.md
+++ b/docs/TaskInstanceResponse.md
@@ -6,6 +6,7 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **dag_run_id** | **str** |  | 
 **dag_version** | [**DagVersionResponse**](DagVersionResponse.md) |  | [optional] 
@@ -20,6 +21,7 @@
 **max_tries** | **int** |  | 
 **note** | **str** |  | [optional] 
 **operator** | **str** |  | [optional] 
+**operator_name** | **str** |  | [optional] 
 **pid** | **int** |  | [optional] 
 **pool** | **str** |  | 
 **pool_slots** | **int** |  | 
diff --git a/docs/TaskInstancesBatchBody.md b/docs/TaskInstancesBatchBody.md
index d020e0b..34246dd 100644
--- a/docs/TaskInstancesBatchBody.md
+++ b/docs/TaskInstancesBatchBody.md
@@ -8,21 +8,31 @@
 ------------ | ------------- | ------------- | -------------
 **dag_ids** | **List[str]** |  | [optional] 
 **dag_run_ids** | **List[str]** |  | [optional] 
+**duration_gt** | **float** |  | [optional] 
 **duration_gte** | **float** |  | [optional] 
+**duration_lt** | **float** |  | [optional] 
 **duration_lte** | **float** |  | [optional] 
+**end_date_gt** | **datetime** |  | [optional] 
 **end_date_gte** | **datetime** |  | [optional] 
+**end_date_lt** | **datetime** |  | [optional] 
 **end_date_lte** | **datetime** |  | [optional] 
 **executor** | **List[str]** |  | [optional] 
+**logical_date_gt** | **datetime** |  | [optional] 
 **logical_date_gte** | **datetime** |  | [optional] 
+**logical_date_lt** | **datetime** |  | [optional] 
 **logical_date_lte** | **datetime** |  | [optional] 
 **order_by** | **str** |  | [optional] 
 **page_limit** | **int** |  | [optional] [default to 100]
 **page_offset** | **int** |  | [optional] [default to 0]
 **pool** | **List[str]** |  | [optional] 
 **queue** | **List[str]** |  | [optional] 
+**run_after_gt** | **datetime** |  | [optional] 
 **run_after_gte** | **datetime** |  | [optional] 
+**run_after_lt** | **datetime** |  | [optional] 
 **run_after_lte** | **datetime** |  | [optional] 
+**start_date_gt** | **datetime** |  | [optional] 
 **start_date_gte** | **datetime** |  | [optional] 
+**start_date_lt** | **datetime** |  | [optional] 
 **start_date_lte** | **datetime** |  | [optional] 
 **state** | [**List[Optional[TaskInstanceState]]**](TaskInstanceState.md) |  | [optional] 
 **task_ids** | **List[str]** |  | [optional] 
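The new `*_gt` / `*_lt` fields are strict (exclusive) bounds, while the existing `*_gte` / `*_lte` fields remain inclusive. A local sketch of the difference, using hypothetical run start dates:

```python
from datetime import datetime, timezone

# Three hypothetical DAG run start dates
runs = [datetime(2025, 1, d, tzinfo=timezone.utc) for d in (1, 2, 3)]
cutoff = datetime(2025, 1, 2, tzinfo=timezone.utc)

# start_date_gte keeps the boundary value; start_date_gt excludes it
gte = [d for d in runs if d >= cutoff]
gt = [d for d in runs if d > cutoff]
print(len(gte), len(gt))
```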
diff --git a/docs/UpdateHITLDetailPayload.md b/docs/UpdateHITLDetailPayload.md
new file mode 100644
index 0000000..4605c63
--- /dev/null
+++ b/docs/UpdateHITLDetailPayload.md
@@ -0,0 +1,31 @@
+# UpdateHITLDetailPayload
+
+Schema for updating the content of a Human-in-the-loop detail.
+
+## Properties
+
+Name | Type | Description | Notes
+------------ | ------------- | ------------- | -------------
+**chosen_options** | **List[str]** |  | 
+**params_input** | **object** |  | [optional] 
+
+## Example
+
+```python
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
+
+# TODO update the JSON string below
+json = "{}"
+# create an instance of UpdateHITLDetailPayload from a JSON string
+update_hitl_detail_payload_instance = UpdateHITLDetailPayload.from_json(json)
+# print the JSON string representation of the object
+print(update_hitl_detail_payload_instance.to_json())
+
+# convert the object into a dict
+update_hitl_detail_payload_dict = update_hitl_detail_payload_instance.to_dict()
+# create an instance of UpdateHITLDetailPayload from a dict
+update_hitl_detail_payload_from_dict = UpdateHITLDetailPayload.from_dict(update_hitl_detail_payload_dict)
+```
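Since `chosen_options` is required, the empty JSON string in the TODO above will not validate as-is. A minimal valid request body, assuming `"approve"` is one of the options the HITL task actually offered (both values below are hypothetical):

```python
import json

# Hypothetical approval payload; "approve" must match an option defined by the HITL task
payload = {
    "chosen_options": ["approve"],
    "params_input": {"comment": "looks good"},  # optional free-form params
}
body = json.dumps(payload)
print(body)
```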
+[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
+
+
diff --git a/docs/VariableApi.md b/docs/VariableApi.md
index d7ad02f..d7a7776 100644
--- a/docs/VariableApi.md
+++ b/docs/VariableApi.md
@@ -22,6 +22,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -43,6 +44,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -73,7 +79,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -101,6 +107,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -120,6 +127,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -148,7 +160,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -177,6 +189,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -197,6 +210,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -227,7 +245,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -256,6 +274,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -276,14 +295,19 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
     api_instance = airflow_client.client.VariableApi(api_client)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
-    order_by = 'id' # str |  (optional) (default to 'id')
-    variable_key_pattern = 'variable_key_pattern_example' # str |  (optional)
+    order_by = ["id"] # List[str] |  (optional) (default to ["id"])
+    variable_key_pattern = 'variable_key_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
 
     try:
         # Get Variables
@@ -303,8 +327,8 @@
 ------------- | ------------- | ------------- | -------------
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
- **order_by** | **str**|  | [optional] [default to 'id']
- **variable_key_pattern** | **str**|  | [optional] 
+ **order_by** | [**List[str]**](str.md)|  | [optional] [default to ["id"]]
+ **variable_key_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
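The `*_pattern` parameters are SQL LIKE expressions, not regexes: `%` matches any run of characters and `_` matches exactly one. A sketch of the matching semantics, via translation to a Python regex:

```python
import re

def like_match(pattern, value):
    """Translate a SQL LIKE pattern ('%' -> '.*', '_' -> '.') and match the whole value."""
    regex = "".join(
        ".*" if c == "%" else "." if c == "_" else re.escape(c) for c in pattern
    )
    return re.fullmatch(regex, value) is not None

print(like_match("%customer_%", "eu_customer_1"))  # True: '_' consumes one character
print(like_match("%customer_%", "customer"))       # False: nothing left for '_'
```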
 
 ### Return type
 
@@ -312,7 +336,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -340,6 +364,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -361,6 +386,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -395,7 +425,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -425,6 +455,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -446,6 +477,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -476,7 +512,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/XComApi.md b/docs/XComApi.md
index 914026c..0a59e77 100644
--- a/docs/XComApi.md
+++ b/docs/XComApi.md
@@ -20,6 +20,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -41,6 +42,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -77,7 +83,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -98,7 +104,7 @@
 [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
 
 # **get_xcom_entries**
-> XComCollectionResponse get_xcom_entries(dag_id, dag_run_id, task_id, xcom_key=xcom_key, map_index=map_index, limit=limit, offset=offset)
+> XComCollectionResponse get_xcom_entries(dag_id, dag_run_id, task_id, xcom_key=xcom_key, map_index=map_index, limit=limit, offset=offset, xcom_key_pattern=xcom_key_pattern, dag_display_name_pattern=dag_display_name_pattern, run_id_pattern=run_id_pattern, task_id_pattern=task_id_pattern, map_index_filter=map_index_filter, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt)
 
 Get Xcom Entries
 
@@ -109,6 +115,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -129,6 +136,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -140,10 +152,23 @@
     map_index = 56 # int |  (optional)
     limit = 50 # int |  (optional) (default to 50)
     offset = 0 # int |  (optional) (default to 0)
+    xcom_key_pattern = 'xcom_key_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    dag_display_name_pattern = 'dag_display_name_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    run_id_pattern = 'run_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    task_id_pattern = 'task_id_pattern_example' # str | SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. (optional)
+    map_index_filter = 56 # int |  (optional)
+    logical_date_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    logical_date_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_gte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_gt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_lte = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
+    run_after_lt = '2013-10-20T19:20:30+01:00' # datetime |  (optional)
 
     try:
         # Get Xcom Entries
-        api_response = api_instance.get_xcom_entries(dag_id, dag_run_id, task_id, xcom_key=xcom_key, map_index=map_index, limit=limit, offset=offset)
+        api_response = api_instance.get_xcom_entries(dag_id, dag_run_id, task_id, xcom_key=xcom_key, map_index=map_index, limit=limit, offset=offset, xcom_key_pattern=xcom_key_pattern, dag_display_name_pattern=dag_display_name_pattern, run_id_pattern=run_id_pattern, task_id_pattern=task_id_pattern, map_index_filter=map_index_filter, logical_date_gte=logical_date_gte, logical_date_gt=logical_date_gt, logical_date_lte=logical_date_lte, logical_date_lt=logical_date_lt, run_after_gte=run_after_gte, run_after_gt=run_after_gt, run_after_lte=run_after_lte, run_after_lt=run_after_lt)
         print("The response of XComApi->get_xcom_entries:\n")
         pprint(api_response)
     except Exception as e:
@@ -164,6 +189,19 @@
  **map_index** | **int**|  | [optional] 
  **limit** | **int**|  | [optional] [default to 50]
  **offset** | **int**|  | [optional] [default to 0]
+ **xcom_key_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **dag_display_name_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **run_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **task_id_pattern** | **str**| SQL LIKE expression — use `%` / `_` wildcards (e.g. `%customer_%`). Regular expressions are **not** supported. | [optional] 
+ **map_index_filter** | **int**|  | [optional] 
+ **logical_date_gte** | **datetime**|  | [optional] 
+ **logical_date_gt** | **datetime**|  | [optional] 
+ **logical_date_lte** | **datetime**|  | [optional] 
+ **logical_date_lt** | **datetime**|  | [optional] 
+ **run_after_gte** | **datetime**|  | [optional] 
+ **run_after_gt** | **datetime**|  | [optional] 
+ **run_after_lte** | **datetime**|  | [optional] 
+ **run_after_lt** | **datetime**|  | [optional] 
 
 ### Return type
 
@@ -171,7 +209,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -201,6 +239,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -221,6 +260,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -263,7 +307,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
@@ -293,6 +337,7 @@
 ### Example
 
 * OAuth Authentication (OAuth2PasswordBearer):
+* Bearer Authentication (HTTPBearer):
 
 ```python
 import airflow_client.client
@@ -314,6 +359,11 @@
 
 configuration.access_token = os.environ["ACCESS_TOKEN"]
 
+# Configure Bearer authorization: HTTPBearer
+configuration = airflow_client.client.Configuration(
+    access_token = os.environ["BEARER_TOKEN"]
+)
+
 # Enter a context with an instance of the API client
 with airflow_client.client.ApiClient(configuration) as api_client:
     # Create an instance of the API class
@@ -352,7 +402,7 @@
 
 ### Authorization
 
-[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer)
+[OAuth2PasswordBearer](../README.md#OAuth2PasswordBearer), [HTTPBearer](../README.md#HTTPBearer)
 
 ### HTTP request headers
 
diff --git a/docs/XComResponse.md b/docs/XComResponse.md
index b346b71..fb495d9 100644
--- a/docs/XComResponse.md
+++ b/docs/XComResponse.md
@@ -6,11 +6,13 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **key** | **str** |  | 
 **logical_date** | **datetime** |  | [optional] 
 **map_index** | **int** |  | 
 **run_id** | **str** |  | 
+**task_display_name** | **str** |  | 
 **task_id** | **str** |  | 
 **timestamp** | **datetime** |  | 
 
diff --git a/docs/XComResponseNative.md b/docs/XComResponseNative.md
index 7567dd3..ca4d4dd 100644
--- a/docs/XComResponseNative.md
+++ b/docs/XComResponseNative.md
@@ -6,11 +6,13 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **key** | **str** |  | 
 **logical_date** | **datetime** |  | [optional] 
 **map_index** | **int** |  | 
 **run_id** | **str** |  | 
+**task_display_name** | **str** |  | 
 **task_id** | **str** |  | 
 **timestamp** | **datetime** |  | 
 **value** | **object** |  | 
diff --git a/docs/XComResponseString.md b/docs/XComResponseString.md
index dd87c31..d83dff4 100644
--- a/docs/XComResponseString.md
+++ b/docs/XComResponseString.md
@@ -6,11 +6,13 @@
 
 Name | Type | Description | Notes
 ------------ | ------------- | ------------- | -------------
+**dag_display_name** | **str** |  | 
 **dag_id** | **str** |  | 
 **key** | **str** |  | 
 **logical_date** | **datetime** |  | [optional] 
 **map_index** | **int** |  | 
 **run_id** | **str** |  | 
+**task_display_name** | **str** |  | 
 **task_id** | **str** |  | 
 **timestamp** | **datetime** |  | 
 **value** | **str** |  | [optional] 
diff --git a/pyproject.toml b/pyproject.toml
index ec4baeb..1692d6f 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -25,7 +25,7 @@
 description = "Apache Airflow API (Stable)"
 readme = "README.md"
 license-files.globs = ["LICENSE", "NOTICE"]
-requires-python = "~=3.9"
+requires-python = ">=3.10"
 authors = [
     { name = "Apache Software Foundation", email = "dev@airflow.apache.org" },
 ]
@@ -42,7 +42,6 @@
     "Intended Audience :: Developers",
     "Intended Audience :: System Administrators",
     "License :: OSI Approved :: Apache Software License",
-    "Programming Language :: Python :: 3.9",
     "Programming Language :: Python :: 3.10",
     "Programming Language :: Python :: 3.11",
     "Programming Language :: Python :: 3.12",
@@ -73,7 +72,7 @@
 run = "run-coverage --no-cov"
 
 [[tool.hatch.envs.test.matrix]]
-python = ["3.9", "3.10", "3.11"]
+python = ["3.10", "3.11"]
 
 [tool.hatch.version]
 path = "./version.txt"
@@ -84,7 +83,7 @@
     "/airflow_client",
     "/docs",
     "/test",
-    "v1.yaml",
+    "v2.yaml",
 ]
 include = [
     "version.txt",
@@ -97,7 +96,7 @@
     "/airflow_client",
     "/docs",
     "/test",
-    "v1.yaml",
+    "v2.yaml",
 ]
 include = [
     "/airflow_client",
diff --git a/spec/v2.yaml b/spec/v2.yaml
index 6edaab2..6c2522e 100644
--- a/spec/v2.yaml
+++ b/spec/v2.yaml
@@ -8,13 +8,14 @@
           nullable: true
           type: string
         href:
-          nullable: true
+          title: Href
           type: string
         name:
           title: Name
           type: string
       required:
       - name
+      - href
       title: AppBuilderMenuItemResponse
       type: object
     AppBuilderViewResponse:
@@ -160,10 +161,10 @@
             $ref: '#/components/schemas/AssetAliasResponse'
           title: Aliases
           type: array
-        consuming_dags:
+        consuming_tasks:
           items:
-            $ref: '#/components/schemas/DagScheduleAssetReference'
-          title: Consuming Dags
+            $ref: '#/components/schemas/TaskInletAssetReference'
+          title: Consuming Tasks
           type: array
         created_at:
           format: date-time
@@ -179,6 +180,9 @@
         id:
           title: Id
           type: integer
+        last_asset_event:
+          $ref: '#/components/schemas/LastAssetEventResponse'
+          nullable: true
         name:
           title: Name
           type: string
@@ -187,6 +191,11 @@
             $ref: '#/components/schemas/TaskOutletAssetReference'
           title: Producing Tasks
           type: array
+        scheduled_dags:
+          items:
+            $ref: '#/components/schemas/DagScheduleAssetReference'
+          title: Scheduled Dags
+          type: array
         updated_at:
           format: date-time
           title: Updated At
@@ -201,8 +210,9 @@
       - group
       - created_at
       - updated_at
-      - consuming_dags
+      - scheduled_dags
       - producing_tasks
+      - consuming_tasks
       - aliases
       title: AssetResponse
       type: object
@@ -270,6 +280,9 @@
           format: date-time
           title: Created At
           type: string
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -312,6 +325,7 @@
       - max_active_runs
       - created_at
       - updated_at
+      - dag_display_name
       title: BackfillResponse
       type: object
     BaseInfoResponse:
@@ -368,6 +382,21 @@
           type: array
       title: BulkActionResponse
       type: object
+    BulkBody_BulkTaskInstanceBody_:
+      additionalProperties: false
+      properties:
+        actions:
+          items:
+            oneOf:
+            - $ref: '#/components/schemas/BulkCreateAction_BulkTaskInstanceBody_'
+            - $ref: '#/components/schemas/BulkUpdateAction_BulkTaskInstanceBody_'
+            - $ref: '#/components/schemas/BulkDeleteAction_BulkTaskInstanceBody_'
+          title: Actions
+          type: array
+      required:
+      - actions
+      title: BulkBody[BulkTaskInstanceBody]
+      type: object
     BulkBody_ConnectionBody_:
       additionalProperties: false
       properties:
@@ -413,6 +442,28 @@
       - actions
       title: BulkBody[VariableBody]
       type: object
+    BulkCreateAction_BulkTaskInstanceBody_:
+      additionalProperties: false
+      properties:
+        action:
+          const: create
+          description: The action to be performed on the entities.
+          title: Action
+          type: string
+        action_on_existence:
+          $ref: '#/components/schemas/BulkActionOnExistence'
+          default: fail
+        entities:
+          description: A list of entities to be created.
+          items:
+            $ref: '#/components/schemas/BulkTaskInstanceBody'
+          title: Entities
+          type: array
+      required:
+      - action
+      - entities
+      title: BulkCreateAction[BulkTaskInstanceBody]
+      type: object
     BulkCreateAction_ConnectionBody_:
       additionalProperties: false
       properties:
@@ -479,6 +530,30 @@
       - entities
       title: BulkCreateAction[VariableBody]
       type: object
+    BulkDeleteAction_BulkTaskInstanceBody_:
+      additionalProperties: false
+      properties:
+        action:
+          const: delete
+          description: The action to be performed on the entities.
+          title: Action
+          type: string
+        action_on_non_existence:
+          $ref: '#/components/schemas/BulkActionNotOnExistence'
+          default: fail
+        entities:
+          description: A list of entity id/key or entity objects to be deleted.
+          items:
+            anyOf:
+            - type: string
+            - $ref: '#/components/schemas/BulkTaskInstanceBody'
+          title: Entities
+          type: array
+      required:
+      - action
+      - entities
+      title: BulkDeleteAction[BulkTaskInstanceBody]
+      type: object
     BulkDeleteAction_ConnectionBody_:
       additionalProperties: false
       properties:
@@ -491,9 +566,11 @@
           $ref: '#/components/schemas/BulkActionNotOnExistence'
           default: fail
         entities:
-          description: A list of entity id/key to be deleted.
+          description: A list of entity id/key or entity objects to be deleted.
           items:
-            type: string
+            anyOf:
+            - type: string
+            - $ref: '#/components/schemas/BulkTaskInstanceBody'
           title: Entities
           type: array
       required:
@@ -513,9 +590,11 @@
           $ref: '#/components/schemas/BulkActionNotOnExistence'
           default: fail
         entities:
-          description: A list of entity id/key to be deleted.
+          description: A list of entity id/key or entity objects to be deleted.
           items:
-            type: string
+            anyOf:
+            - type: string
+            - $ref: '#/components/schemas/BulkTaskInstanceBody'
           title: Entities
           type: array
       required:
@@ -535,9 +614,11 @@
           $ref: '#/components/schemas/BulkActionNotOnExistence'
           default: fail
         entities:
-          description: A list of entity id/key to be deleted.
+          description: A list of entity id/key or entity objects to be deleted.
           items:
-            type: string
+            anyOf:
+            - type: string
+            - $ref: '#/components/schemas/BulkTaskInstanceBody'
           title: Entities
           type: array
       required:
@@ -569,6 +650,65 @@
           nullable: true
       title: BulkResponse
       type: object
+    BulkTaskInstanceBody:
+      additionalProperties: false
+      description: Request body for bulk create, update, and delete of task instances.
+      properties:
+        include_downstream:
+          default: false
+          title: Include Downstream
+          type: boolean
+        include_future:
+          default: false
+          title: Include Future
+          type: boolean
+        include_past:
+          default: false
+          title: Include Past
+          type: boolean
+        include_upstream:
+          default: false
+          title: Include Upstream
+          type: boolean
+        map_index:
+          nullable: true
+          type: integer
+        new_state:
+          $ref: '#/components/schemas/TaskInstanceState'
+          nullable: true
+        note:
+          maxLength: 1000
+          nullable: true
+          type: string
+        task_id:
+          title: Task Id
+          type: string
+      required:
+      - task_id
+      title: BulkTaskInstanceBody
+      type: object
+    BulkUpdateAction_BulkTaskInstanceBody_:
+      additionalProperties: false
+      properties:
+        action:
+          const: update
+          description: The action to be performed on the entities.
+          title: Action
+          type: string
+        action_on_non_existence:
+          $ref: '#/components/schemas/BulkActionNotOnExistence'
+          default: fail
+        entities:
+          description: A list of entities to be updated.
+          items:
+            $ref: '#/components/schemas/BulkTaskInstanceBody'
+          title: Entities
+          type: array
+      required:
+      - action
+      - entities
+      title: BulkUpdateAction[BulkTaskInstanceBody]
+      type: object
     BulkUpdateAction_ConnectionBody_:
       additionalProperties: false
       properties:
@@ -678,6 +818,12 @@
           default: true
           title: Reset Dag Runs
           type: boolean
+        run_on_latest_version:
+          default: false
+          description: (Experimental) Run on the latest bundle version of the Dag
+            after clearing the task instances.
+          title: Run On Latest Version
+          type: boolean
         start_date:
           format: date-time
           nullable: true
@@ -899,7 +1045,11 @@
           title: Catchup
           type: boolean
         concurrency:
-          description: Return max_active_tasks as concurrency.
+          deprecated: true
+          description: 'Return max_active_tasks as concurrency.
+
+
+            Deprecated: Use max_active_tasks instead.'
           readOnly: true
           title: Concurrency
           type: integer
@@ -913,6 +1063,10 @@
           format: duration
           nullable: true
           type: string
+        default_args:
+          additionalProperties: true
+          nullable: true
+          type: object
         description:
           nullable: true
           type: string
@@ -950,6 +1104,9 @@
           format: date-time
           nullable: true
           type: string
+        last_parse_duration:
+          nullable: true
+          type: number
         last_parsed:
           format: date-time
           nullable: true
@@ -1100,6 +1257,9 @@
           format: date-time
           nullable: true
           type: string
+        last_parse_duration:
+          nullable: true
+          type: number
         last_parsed_time:
           format: date-time
           nullable: true
@@ -1175,6 +1335,12 @@
           default: false
           title: Only Failed
           type: boolean
+        run_on_latest_version:
+          default: false
+          description: (Experimental) Run on the latest bundle version of the Dag
+            after clearing the Dag Run.
+          title: Run On Latest Version
+          type: boolean
       title: DAGRunClearBody
       type: object
     DAGRunCollectionResponse:
@@ -1224,6 +1390,9 @@
           additionalProperties: true
           nullable: true
           type: object
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1243,6 +1412,9 @@
           format: date-time
           nullable: true
           type: string
+        duration:
+          nullable: true
+          type: number
         end_date:
           format: date-time
           nullable: true
@@ -1277,6 +1449,9 @@
         triggered_by:
           $ref: '#/components/schemas/DagRunTriggeredByType'
           nullable: true
+        triggering_user_name:
+          nullable: true
+          type: string
       required:
       - dag_run_id
       - dag_id
@@ -1284,6 +1459,7 @@
       - run_type
       - state
       - dag_versions
+      - dag_display_name
       title: DAGRunResponse
       type: object
     DAGRunsBatchBody:
@@ -1295,18 +1471,34 @@
             type: string
           nullable: true
           type: array
+        end_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         end_date_gte:
           format: date-time
           nullable: true
           type: string
+        end_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         end_date_lte:
           format: date-time
           nullable: true
           type: string
+        logical_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         logical_date_gte:
           format: date-time
           nullable: true
           type: string
+        logical_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         logical_date_lte:
           format: date-time
           nullable: true
@@ -1324,18 +1516,34 @@
           minimum: 0.0
           title: Page Offset
           type: integer
+        run_after_gt:
+          format: date-time
+          nullable: true
+          type: string
         run_after_gte:
           format: date-time
           nullable: true
           type: string
+        run_after_lt:
+          format: date-time
+          nullable: true
+          type: string
         run_after_lte:
           format: date-time
           nullable: true
           type: string
+        start_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         start_date_gte:
           format: date-time
           nullable: true
           type: string
+        start_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         start_date_lte:
           format: date-time
           nullable: true
@@ -1355,6 +1563,9 @@
         content:
           nullable: true
           type: string
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1363,6 +1574,7 @@
           type: integer
       required:
       - dag_id
+      - dag_display_name
       title: DAGSourceResponse
       type: object
     DAGTagCollectionResponse:
@@ -1416,6 +1628,9 @@
     DAGWarningResponse:
       description: DAG Warning serializer for responses.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1433,6 +1648,7 @@
       - warning_type
       - message
       - timestamp
+      - dag_display_name
       title: DAGWarningResponse
       type: object
     DagProcessorInfoResponse:
@@ -1565,6 +1781,9 @@
     DagStatsResponse:
       description: DAG Stats serializer for responses.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1575,6 +1794,7 @@
           type: array
       required:
       - dag_id
+      - dag_display_name
       - stats
       title: DagStatsResponse
       type: object
@@ -1594,6 +1814,9 @@
     DagTagResponse:
       description: DAG Tag serializer for responses.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1603,6 +1826,7 @@
       required:
       - name
       - dag_id
+      - dag_display_name
       title: DagTagResponse
       type: object
     DagVersionResponse:
@@ -1621,6 +1845,9 @@
           format: date-time
           title: Created At
           type: string
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -1636,6 +1863,7 @@
       - version_number
       - dag_id
       - created_at
+      - dag_display_name
       title: DagVersionResponse
       type: object
     DagWarningType:
@@ -1696,6 +1924,9 @@
     EventLogResponse:
       description: Event Log Response.
       properties:
+        dag_display_name:
+          nullable: true
+          type: string
         dag_id:
           nullable: true
           type: string
@@ -1737,6 +1968,53 @@
       - event
       title: EventLogResponse
       type: object
+    ExternalLogUrlResponse:
+      description: Response for the external log URL endpoint.
+      properties:
+        url:
+          title: Url
+          type: string
+      required:
+      - url
+      title: ExternalLogUrlResponse
+      type: object
+    ExternalViewResponse:
+      additionalProperties: true
+      description: Serializer for External View Plugin responses.
+      properties:
+        category:
+          nullable: true
+          type: string
+        destination:
+          default: nav
+          enum:
+          - nav
+          - dag
+          - dag_run
+          - task
+          - task_instance
+          title: Destination
+          type: string
+        href:
+          title: Href
+          type: string
+        icon:
+          nullable: true
+          type: string
+        icon_dark_mode:
+          nullable: true
+          type: string
+        name:
+          title: Name
+          type: string
+        url_route:
+          nullable: true
+          type: string
+      required:
+      - name
+      - href
+      title: ExternalViewResponse
+      type: object
     ExtraLinkCollectionResponse:
       description: Extra Links Response.
       properties:
@@ -1788,6 +2066,127 @@
       - name
       title: FastAPIRootMiddlewareResponse
       type: object
+    HITLDetail:
+      description: Schema for a Human-in-the-loop detail.
+      properties:
+        assigned_users:
+          items:
+            $ref: '#/components/schemas/HITLUser'
+          title: Assigned Users
+          type: array
+        body:
+          nullable: true
+          type: string
+        chosen_options:
+          items:
+            type: string
+          nullable: true
+          type: array
+        created_at:
+          format: date-time
+          title: Created At
+          type: string
+        defaults:
+          items:
+            type: string
+          nullable: true
+          type: array
+        multiple:
+          default: false
+          title: Multiple
+          type: boolean
+        options:
+          items:
+            type: string
+          minItems: 1
+          title: Options
+          type: array
+        params:
+          additionalProperties: true
+          title: Params
+          type: object
+        params_input:
+          additionalProperties: true
+          title: Params Input
+          type: object
+        responded_at:
+          format: date-time
+          nullable: true
+          type: string
+        responded_by_user:
+          $ref: '#/components/schemas/HITLUser'
+          nullable: true
+        response_received:
+          default: false
+          title: Response Received
+          type: boolean
+        subject:
+          title: Subject
+          type: string
+        task_instance:
+          $ref: '#/components/schemas/TaskInstanceResponse'
+      required:
+      - task_instance
+      - options
+      - subject
+      - created_at
+      title: HITLDetail
+      type: object
+    HITLDetailCollection:
+      description: Schema for a collection of Human-in-the-loop details.
+      properties:
+        hitl_details:
+          items:
+            $ref: '#/components/schemas/HITLDetail'
+          title: Hitl Details
+          type: array
+        total_entries:
+          title: Total Entries
+          type: integer
+      required:
+      - hitl_details
+      - total_entries
+      title: HITLDetailCollection
+      type: object
+    HITLDetailResponse:
+      description: Response of updating a Human-in-the-loop detail.
+      properties:
+        chosen_options:
+          items:
+            type: string
+          minItems: 1
+          title: Chosen Options
+          type: array
+        params_input:
+          additionalProperties: true
+          title: Params Input
+          type: object
+        responded_at:
+          format: date-time
+          title: Responded At
+          type: string
+        responded_by:
+          $ref: '#/components/schemas/HITLUser'
+      required:
+      - responded_by
+      - responded_at
+      - chosen_options
+      title: HITLDetailResponse
+      type: object
+    HITLUser:
+      description: Schema for a Human-in-the-loop user.
+      properties:
+        id:
+          title: Id
+          type: string
+        name:
+          title: Name
+          type: string
+      required:
+      - id
+      - name
+      title: HITLUser
+      type: object
     HTTPExceptionResponse:
       description: HTTPException Model used for error response.
       properties:
@@ -1889,6 +2288,9 @@
     JobResponse:
       description: Job serializer for responses.
       properties:
+        dag_display_name:
+          nullable: true
+          type: string
         dag_id:
           nullable: true
           type: string
@@ -1927,6 +2329,19 @@
       title: JobResponse
       type: object
     JsonValue: {}
+    LastAssetEventResponse:
+      description: Last asset event response serializer.
+      properties:
+        id:
+          minimum: 0.0
+          nullable: true
+          type: integer
+        timestamp:
+          format: date-time
+          nullable: true
+          type: string
+      title: LastAssetEventResponse
+      type: object
     PatchTaskInstanceBody:
       additionalProperties: false
       description: Request body for Clear Task Instances endpoint.
@@ -1972,10 +2387,41 @@
       - total_entries
       title: PluginCollectionResponse
       type: object
+    PluginImportErrorCollectionResponse:
+      description: Plugin Import Error Collection serializer.
+      properties:
+        import_errors:
+          items:
+            $ref: '#/components/schemas/PluginImportErrorResponse'
+          title: Import Errors
+          type: array
+        total_entries:
+          title: Total Entries
+          type: integer
+      required:
+      - import_errors
+      - total_entries
+      title: PluginImportErrorCollectionResponse
+      type: object
+    PluginImportErrorResponse:
+      description: Plugin Import Error serializer for responses.
+      properties:
+        error:
+          title: Error
+          type: string
+        source:
+          title: Source
+          type: string
+      required:
+      - source
+      - error
+      title: PluginImportErrorResponse
+      type: object
     PluginResponse:
       description: Plugin serializer.
       properties:
         appbuilder_menu_items:
+          deprecated: true
           items:
             $ref: '#/components/schemas/AppBuilderMenuItemResponse'
           title: Appbuilder Menu Items
@@ -1985,6 +2431,13 @@
             $ref: '#/components/schemas/AppBuilderViewResponse'
           title: Appbuilder Views
           type: array
+        external_views:
+          description: Aggregate all external views. Both 'external_views' and 'appbuilder_menu_items'
+            are included here.
+          items:
+            $ref: '#/components/schemas/ExternalViewResponse'
+          title: External Views
+          type: array
         fastapi_apps:
           items:
             $ref: '#/components/schemas/FastAPIAppResponse'
@@ -2023,6 +2476,11 @@
             type: string
           title: Operator Extra Links
           type: array
+        react_apps:
+          items:
+            $ref: '#/components/schemas/ReactAppResponse'
+          title: React Apps
+          type: array
         source:
           title: Source
           type: string
@@ -2037,6 +2495,8 @@
       - flask_blueprints
       - fastapi_apps
       - fastapi_root_middlewares
+      - external_views
+      - react_apps
       - appbuilder_views
       - appbuilder_menu_items
       - global_operator_extra_links
@@ -2208,6 +2668,9 @@
           format: date-time
           title: Created At
           type: string
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -2215,8 +2678,47 @@
       - dag_id
       - asset_id
       - created_at
+      - dag_display_name
       title: QueuedEventResponse
       type: object
+    ReactAppResponse:
+      additionalProperties: true
+      description: Serializer for React App Plugin responses.
+      properties:
+        bundle_url:
+          title: Bundle Url
+          type: string
+        category:
+          nullable: true
+          type: string
+        destination:
+          default: nav
+          enum:
+          - nav
+          - dag
+          - dag_run
+          - task
+          - task_instance
+          - dashboard
+          title: Destination
+          type: string
+        icon:
+          nullable: true
+          type: string
+        icon_dark_mode:
+          nullable: true
+          type: string
+        name:
+          title: Name
+          type: string
+        url_route:
+          nullable: true
+          type: string
+      required:
+      - name
+      - bundle_url
+      title: ReactAppResponse
+      type: object
     ReprocessBehavior:
       description: 'Internal enum for setting reprocess behavior in a backfill.
 
@@ -2297,6 +2799,31 @@
       - reason
       title: TaskDependencyResponse
       type: object
+    TaskInletAssetReference:
+      additionalProperties: false
+      description: Task inlet reference serializer for assets.
+      properties:
+        created_at:
+          format: date-time
+          title: Created At
+          type: string
+        dag_id:
+          title: Dag Id
+          type: string
+        task_id:
+          title: Task Id
+          type: string
+        updated_at:
+          format: date-time
+          title: Updated At
+          type: string
+      required:
+      - dag_id
+      - task_id
+      - created_at
+      - updated_at
+      title: TaskInletAssetReference
+      type: object
     TaskInstanceCollectionResponse:
       description: Task Instance Collection serializer for responses.
       properties:
@@ -2332,6 +2859,9 @@
     TaskInstanceHistoryResponse:
       description: TaskInstanceHistory serializer for responses.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -2366,6 +2896,9 @@
         operator:
           nullable: true
           type: string
+        operator_name:
+          nullable: true
+          type: string
         pid:
           nullable: true
           type: integer
@@ -2416,6 +2949,7 @@
       - try_number
       - max_tries
       - task_display_name
+      - dag_display_name
       - pool
       - pool_slots
       - executor_config
@@ -2424,6 +2958,9 @@
     TaskInstanceResponse:
       description: TaskInstance serializer for responses.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -2468,6 +3005,9 @@
         operator:
           nullable: true
           type: string
+        operator_name:
+          nullable: true
+          type: string
         pid:
           nullable: true
           type: integer
@@ -2537,6 +3077,7 @@
       - try_number
       - max_tries
       - task_display_name
+      - dag_display_name
       - pool
       - pool_slots
       - executor_config
@@ -2576,16 +3117,30 @@
             type: string
           nullable: true
           type: array
+        duration_gt:
+          nullable: true
+          type: number
         duration_gte:
           nullable: true
           type: number
+        duration_lt:
+          nullable: true
+          type: number
         duration_lte:
           nullable: true
           type: number
+        end_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         end_date_gte:
           format: date-time
           nullable: true
           type: string
+        end_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         end_date_lte:
           format: date-time
           nullable: true
@@ -2595,10 +3150,18 @@
             type: string
           nullable: true
           type: array
+        logical_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         logical_date_gte:
           format: date-time
           nullable: true
           type: string
+        logical_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         logical_date_lte:
           format: date-time
           nullable: true
@@ -2626,18 +3189,34 @@
             type: string
           nullable: true
           type: array
+        run_after_gt:
+          format: date-time
+          nullable: true
+          type: string
         run_after_gte:
           format: date-time
           nullable: true
           type: string
+        run_after_lt:
+          format: date-time
+          nullable: true
+          type: string
         run_after_lte:
           format: date-time
           nullable: true
           type: string
+        start_date_gt:
+          format: date-time
+          nullable: true
+          type: string
         start_date_gte:
           format: date-time
           nullable: true
           type: string
+        start_date_lt:
+          format: date-time
+          nullable: true
+          type: string
         start_date_lte:
           format: date-time
           nullable: true
@@ -2831,7 +3410,7 @@
       properties:
         conf:
           additionalProperties: true
-          title: Conf
+          nullable: true
           type: object
         dag_run_id:
           nullable: true
@@ -2896,6 +3475,23 @@
       required: []
       title: TriggererInfoResponse
       type: object
+    UpdateHITLDetailPayload:
+      description: Schema for updating the content of a Human-in-the-loop detail.
+      properties:
+        chosen_options:
+          items:
+            type: string
+          minItems: 1
+          title: Chosen Options
+          type: array
+        params_input:
+          additionalProperties: true
+          title: Params Input
+          type: object
+      required:
+      - chosen_options
+      title: UpdateHITLDetailPayload
+      type: object
     ValidationError:
       properties:
         loc:
@@ -3022,6 +3618,9 @@
     XComResponse:
       description: Serializer for a xcom item.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -3038,6 +3637,9 @@
         run_id:
           title: Run Id
           type: string
+        task_display_name:
+          title: Task Display Name
+          type: string
         task_id:
           title: Task Id
           type: string
@@ -3052,11 +3654,16 @@
       - task_id
       - dag_id
       - run_id
+      - dag_display_name
+      - task_display_name
       title: XComResponse
       type: object
     XComResponseNative:
       description: XCom response serializer with native return type.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -3073,6 +3680,9 @@
         run_id:
           title: Run Id
           type: string
+        task_display_name:
+          title: Task Display Name
+          type: string
         task_id:
           title: Task Id
           type: string
@@ -3089,12 +3699,17 @@
       - task_id
       - dag_id
       - run_id
+      - dag_display_name
+      - task_display_name
       - value
       title: XComResponseNative
       type: object
     XComResponseString:
       description: XCom response serializer with string return type.
       properties:
+        dag_display_name:
+          title: Dag Display Name
+          type: string
         dag_id:
           title: Dag Id
           type: string
@@ -3111,6 +3726,9 @@
         run_id:
           title: Run Id
           type: string
+        task_display_name:
+          title: Task Display Name
+          type: string
         task_id:
           title: Task Id
           type: string
@@ -3128,6 +3746,8 @@
       - task_id
       - dag_id
       - run_id
+      - dag_display_name
+      - task_display_name
       title: XComResponseString
       type: object
     XComUpdateBody:
@@ -3145,6 +3765,9 @@
       title: XComUpdateBody
       type: object
   securitySchemes:
+    HTTPBearer:
+      scheme: bearer
+      type: http
     OAuth2PasswordBearer:
       description: To authenticate Airflow API requests, clients must include a JWT
         (JSON Web Token) in the Authorization header of each request. This token is
@@ -3189,13 +3812,17 @@
           minimum: 0
           title: Offset
           type: integer
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: name_pattern
         required: false
         schema:
           nullable: true
           type: string
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: uri_pattern
         required: false
         schema:
@@ -3220,9 +3847,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -3256,6 +3886,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Assets
       tags:
       - Asset
@@ -3280,7 +3911,9 @@
           minimum: 0
           title: Offset
           type: integer
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: name_pattern
         required: false
         schema:
@@ -3290,9 +3923,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -3326,6 +3962,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Asset Aliases
       tags:
       - Asset
@@ -3372,6 +4009,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Asset Alias
       tags:
       - Asset
@@ -3400,9 +4038,12 @@
         name: order_by
         required: false
         schema:
-          default: timestamp
+          default:
+          - timestamp
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       - in: query
         name: asset_id
         required: false
@@ -3441,12 +4082,26 @@
           nullable: true
           type: string
       - in: query
+        name: timestamp_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: timestamp_lte
         required: false
         schema:
           format: date-time
           nullable: true
           type: string
+      - in: query
+        name: timestamp_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
       responses:
         '200':
           content:
@@ -3480,6 +4135,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Asset Events
       tags:
       - Asset
@@ -3525,6 +4181,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Create Asset Event
       tags:
       - Asset
@@ -3572,6 +4229,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Asset
       tags:
       - Asset
@@ -3625,6 +4283,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Materialize Asset
       tags:
       - Asset
@@ -3674,6 +4333,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Asset Queued Events
       tags:
       - Asset
@@ -3726,6 +4386,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Asset Queued Events
       tags:
       - Asset
@@ -3793,6 +4454,38 @@
       summary: Logout
       tags:
       - Login
+  /api/v2/auth/refresh:
+    get:
+      description: Refresh the authentication token.
+      operationId: refresh
+      parameters:
+      - in: query
+        name: next
+        required: false
+        schema:
+          nullable: true
+          type: string
+      responses:
+        '200':
+          content:
+            application/json:
+              schema: {}
+          description: Successful Response
+        '307':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Temporary Redirect
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      summary: Refresh
+      tags:
+      - Login
   /api/v2/backfills:
     get:
       operationId: list_backfills
@@ -3823,9 +4516,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -3853,6 +4549,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: List Backfills
       tags:
       - Backfill
@@ -3903,6 +4600,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Create Backfill
       tags:
       - Backfill
@@ -3954,6 +4652,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Create Backfill Dry Run
       tags:
       - Backfill
@@ -4001,6 +4700,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Backfill
       tags:
       - Backfill
@@ -4054,6 +4754,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Cancel Backfill
       tags:
       - Backfill
@@ -4107,6 +4808,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Pause Backfill
       tags:
       - Backfill
@@ -4160,6 +4862,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Unpause Backfill
       tags:
       - Backfill
@@ -4240,6 +4943,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Config
       tags:
       - Config
@@ -4319,6 +5023,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Config Value
       tags:
       - Config
@@ -4347,10 +5052,15 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
-      - in: query
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: connection_id_pattern
         required: false
         schema:
@@ -4389,6 +5099,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Connections
       tags:
       - Connection
@@ -4428,6 +5139,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Bulk Connections
       tags:
       - Connection
@@ -4473,6 +5185,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Post Connection
       tags:
       - Connection
@@ -4497,6 +5210,7 @@
           description: Forbidden
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Create Default Connections
       tags:
       - Connection
@@ -4546,6 +5260,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Test Connection
       tags:
       - Connection
@@ -4589,6 +5304,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Connection
       tags:
       - Connection
@@ -4635,6 +5351,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Connection
       tags:
       - Connection
@@ -4701,6 +5418,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Connection
       tags:
       - Connection
@@ -4747,6 +5465,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Reports
       tags:
       - DagReport
@@ -4827,6 +5546,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Source
       tags:
       - DagSource
@@ -4882,6 +5602,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Stats
       tags:
       - DagStats
@@ -4910,10 +5631,15 @@
         name: order_by
         required: false
         schema:
-          default: name
+          default:
+          - name
+          items:
+            type: string
           title: Order By
-          type: string
-      - in: query
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: tag_name_pattern
         required: false
         schema:
@@ -4946,6 +5672,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Tags
       tags:
       - DAG
@@ -4986,9 +5713,12 @@
         name: order_by
         required: false
         schema:
-          default: dag_id
+          default:
+          - dag_id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -5016,6 +5746,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: List Dag Warnings
       tags:
       - DagWarning
@@ -5065,13 +5796,17 @@
             type: string
           title: Owners
           type: array
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: dag_id_pattern
         required: false
         schema:
           nullable: true
           type: string
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: dag_display_name_pattern
         required: false
         schema:
@@ -5090,6 +5825,14 @@
         schema:
           nullable: true
           type: boolean
+      - description: Filter Dags that have import errors. Only Dags that have been
+          successfully loaded before will be returned.
+        in: query
+        name: has_import_errors
+        required: false
+        schema:
+          nullable: true
+          type: boolean
       - in: query
         name: last_dag_run_state
         required: false
@@ -5097,6 +5840,32 @@
           $ref: '#/components/schemas/DagRunState'
           nullable: true
       - in: query
+        name: bundle_name
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
+        name: bundle_version
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: Filter Dags with asset-based scheduling
+        in: query
+        name: has_asset_schedule
+        required: false
+        schema:
+          nullable: true
+          type: boolean
+      - description: Filter Dags by asset dependency (name or URI)
+        in: query
+        name: asset_dependency
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
         name: dag_run_start_date_gte
         required: false
         schema:
@@ -5104,6 +5873,13 @@
           nullable: true
           type: string
       - in: query
+        name: dag_run_start_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: dag_run_start_date_lte
         required: false
         schema:
@@ -5111,6 +5887,13 @@
           nullable: true
           type: string
       - in: query
+        name: dag_run_start_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: dag_run_end_date_gte
         required: false
         schema:
@@ -5118,6 +5901,13 @@
           nullable: true
           type: string
       - in: query
+        name: dag_run_end_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: dag_run_end_date_lte
         required: false
         schema:
@@ -5125,6 +5915,13 @@
           nullable: true
           type: string
       - in: query
+        name: dag_run_end_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: dag_run_state
         required: false
         schema:
@@ -5136,9 +5933,18 @@
         name: order_by
         required: false
         schema:
-          default: dag_id
+          default:
+          - dag_id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
+      - in: query
+        name: is_favorite
+        required: false
+        schema:
+          nullable: true
+          type: boolean
       responses:
         '200':
           content:
@@ -5166,6 +5972,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dags
       tags:
       - DAG
@@ -5222,7 +6029,9 @@
             type: string
           title: Owners
           type: array
-      - in: query
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: dag_id_pattern
         required: false
         schema:
@@ -5286,6 +6095,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Dags
       tags:
       - DAG
@@ -5338,6 +6148,7 @@
           description: Unprocessable Entity
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Dag
       tags:
       - DAG
@@ -5390,6 +6201,7 @@
           description: Unprocessable Entity
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag
       tags:
       - DAG
@@ -5456,6 +6268,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Dag
       tags:
       - DAG
@@ -5510,6 +6323,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Dag Asset Queued Events
       tags:
       - Asset
@@ -5562,6 +6376,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Asset Queued Events
       tags:
       - Asset
@@ -5623,6 +6438,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Dag Asset Queued Event
       tags:
       - Asset
@@ -5681,6 +6497,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Asset Queued Event
       tags:
       - Asset
@@ -5734,6 +6551,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Post Clear Task Instances
       tags:
       - Task Instance
@@ -5776,6 +6594,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: run_after_lte
         required: false
         schema:
@@ -5783,6 +6608,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_gte
         required: false
         schema:
@@ -5790,6 +6622,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_lte
         required: false
         schema:
@@ -5797,6 +6636,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_gte
         required: false
         schema:
@@ -5804,6 +6650,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_lte
         required: false
         schema:
@@ -5811,6 +6664,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_gte
         required: false
         schema:
@@ -5818,6 +6678,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_lte
         required: false
         schema:
@@ -5825,6 +6692,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_gte
         required: false
         schema:
@@ -5832,6 +6706,13 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_lte
         required: false
         schema:
@@ -5839,6 +6720,13 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: run_type
         required: false
         schema:
@@ -5855,11 +6743,38 @@
           title: State
           type: array
       - in: query
+        name: dag_version
+        required: false
+        schema:
+          items:
+            type: integer
+          title: Dag Version
+          type: array
+      - in: query
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: run_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: triggering_user_name_pattern
+        required: false
+        schema:
+          nullable: true
           type: string
       responses:
         '200':
@@ -5894,6 +6809,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Runs
       tags:
       - DagRun
@@ -5957,6 +6873,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Trigger Dag Run
       tags:
       - DagRun
@@ -6011,6 +6928,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get List Dag Runs Batch
       tags:
       - DagRun
@@ -6066,6 +6984,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Dag Run
       tags:
       - DagRun
@@ -6117,6 +7036,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Run
       tags:
       - DagRun
@@ -6189,6 +7109,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Dag Run
       tags:
       - DagRun
@@ -6250,9 +7171,186 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Clear Dag Run
       tags:
       - DagRun
+  /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/hitlDetails:
+    get:
+      description: Get Human-in-the-loop details.
+      operationId: get_hitl_details
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - in: query
+        name: limit
+        required: false
+        schema:
+          default: 50
+          minimum: 0
+          title: Limit
+          type: integer
+      - in: query
+        name: offset
+        required: false
+        schema:
+          default: 0
+          minimum: 0
+          title: Offset
+          type: integer
+      - in: query
+        name: order_by
+        required: false
+        schema:
+          default:
+          - ti_id
+          items:
+            type: string
+          title: Order By
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: dag_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
+        name: task_id
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: task_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
+        name: map_index
+        required: false
+        schema:
+          nullable: true
+          type: integer
+      - in: query
+        name: state
+        required: false
+        schema:
+          items:
+            type: string
+          title: State
+          type: array
+      - in: query
+        name: response_received
+        required: false
+        schema:
+          nullable: true
+          type: boolean
+      - in: query
+        name: responded_by_user_id
+        required: false
+        schema:
+          items:
+            type: string
+          title: Responded By User Id
+          type: array
+      - in: query
+        name: responded_by_user_name
+        required: false
+        schema:
+          items:
+            type: string
+          title: Responded By User Name
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: subject_search
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: body_search
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
+        name: created_at_gte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: created_at_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: created_at_lte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: created_at_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HITLDetailCollection'
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Get Hitl Details
+      tags:
+      - Task Instance
   /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances:
     get:
       description: 'Get list of task instances.
@@ -6290,6 +7388,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: run_after_lte
         required: false
         schema:
@@ -6297,6 +7402,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_gte
         required: false
         schema:
@@ -6304,6 +7416,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_lte
         required: false
         schema:
@@ -6311,6 +7430,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_gte
         required: false
         schema:
@@ -6318,6 +7444,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_lte
         required: false
         schema:
@@ -6325,6 +7458,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_gte
         required: false
         schema:
@@ -6332,6 +7472,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_lte
         required: false
         schema:
@@ -6339,6 +7486,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_gte
         required: false
         schema:
@@ -6346,6 +7500,13 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_lte
         required: false
         schema:
@@ -6353,18 +7514,39 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: duration_gte
         required: false
         schema:
           nullable: true
           type: number
       - in: query
+        name: duration_gt
+        required: false
+        schema:
+          nullable: true
+          type: number
+      - in: query
         name: duration_lte
         required: false
         schema:
           nullable: true
           type: number
       - in: query
+        name: duration_lt
+        required: false
+        schema:
+          nullable: true
+          type: number
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: task_display_name_pattern
         required: false
         schema:
@@ -6411,6 +7593,30 @@
           title: Version Number
           type: array
       - in: query
+        name: try_number
+        required: false
+        schema:
+          items:
+            type: integer
+          title: Try Number
+          type: array
+      - in: query
+        name: operator
+        required: false
+        schema:
+          items:
+            type: string
+          title: Operator
+          type: array
+      - in: query
+        name: map_index
+        required: false
+        schema:
+          items:
+            type: integer
+          title: Map Index
+          type: array
+      - in: query
         name: limit
         required: false
         schema:
@@ -6430,9 +7636,12 @@
         name: order_by
         required: false
         schema:
-          default: map_index
+          default:
+          - map_index
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -6466,9 +7675,63 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instances
       tags:
       - Task Instance
+    patch:
+      description: Bulk update and delete task instances.
+      operationId: bulk_task_instances
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      requestBody:
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/BulkBody_BulkTaskInstanceBody_'
+        required: true
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/BulkResponse'
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Bulk Task Instances
+      tags:
+      - Task Instance
   /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/list:
     post:
       description: Get list of task instances.
@@ -6527,10 +7790,76 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instances Batch
       tags:
       - Task Instance
   /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}:
+    delete:
+      description: Delete a task instance.
+      operationId: delete_task_instance
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - in: path
+        name: task_id
+        required: true
+        schema:
+          title: Task Id
+          type: string
+      - in: query
+        name: map_index
+        required: false
+        schema:
+          default: -1
+          title: Map Index
+          type: integer
+      responses:
+        '200':
+          content:
+            application/json:
+              schema: {}
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Delete Task Instance
+      tags:
+      - Task Instance
     get:
       description: Get task instance.
       operationId: get_task_instance
@@ -6586,6 +7915,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instance
       tags:
       - Task Instance
@@ -6676,6 +8006,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Task Instance
       tags:
       - Task Instance
@@ -6742,6 +8073,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instance Dependencies
       tags:
       - Task Instance
@@ -6827,9 +8159,90 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Task Instance Dry Run
       tags:
       - Task Instance
+  /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/externalLogUrl/{try_number}:
+    get:
+      description: Get external log URL for a specific task instance.
+      operationId: get_external_log_url
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - in: path
+        name: task_id
+        required: true
+        schema:
+          title: Task Id
+          type: string
+      - in: path
+        name: try_number
+        required: true
+        schema:
+          exclusiveMinimum: 0
+          title: Try Number
+          type: integer
+      - in: query
+        name: map_index
+        required: false
+        schema:
+          default: -1
+          title: Map Index
+          type: integer
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/ExternalLogUrlResponse'
+          description: Successful Response
+        '400':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Bad Request
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Get External Log Url
+      tags:
+      - Task Instance
   /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links:
     get:
       description: Get extra links for task instance.
@@ -6893,6 +8306,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Extra Links
       tags:
       - Extra Links
@@ -6928,6 +8342,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: run_after_lte
         required: false
         schema:
@@ -6935,6 +8356,13 @@
           nullable: true
           type: string
       - in: query
+        name: run_after_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_gte
         required: false
         schema:
@@ -6942,6 +8370,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: logical_date_lte
         required: false
         schema:
@@ -6949,6 +8384,13 @@
           nullable: true
           type: string
       - in: query
+        name: logical_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_gte
         required: false
         schema:
@@ -6956,6 +8398,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_lte
         required: false
         schema:
@@ -6963,6 +8412,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_gte
         required: false
         schema:
@@ -6970,6 +8426,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_lte
         required: false
         schema:
@@ -6977,6 +8440,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_gte
         required: false
         schema:
@@ -6984,6 +8454,13 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: updated_at_lte
         required: false
         schema:
@@ -6991,18 +8468,37 @@
           nullable: true
           type: string
       - in: query
+        name: updated_at_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: duration_gte
         required: false
         schema:
           nullable: true
           type: number
       - in: query
+        name: duration_gt
+        required: false
+        schema:
+          nullable: true
+          type: number
+      - in: query
         name: duration_lte
         required: false
         schema:
           nullable: true
           type: number
       - in: query
+        name: duration_lt
+        required: false
+        schema:
+          nullable: true
+          type: number
+      - in: query
         name: state
         required: false
         schema:
@@ -7043,6 +8539,30 @@
           title: Version Number
           type: array
       - in: query
+        name: try_number
+        required: false
+        schema:
+          items:
+            type: integer
+          title: Try Number
+          type: array
+      - in: query
+        name: operator
+        required: false
+        schema:
+          items:
+            type: string
+          title: Operator
+          type: array
+      - in: query
+        name: map_index
+        required: false
+        schema:
+          items:
+            type: integer
+          title: Map Index
+          type: array
+      - in: query
         name: limit
         required: false
         schema:
@@ -7062,9 +8582,12 @@
         name: order_by
         required: false
         schema:
-          default: map_index
+          default:
+          - map_index
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -7098,6 +8621,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Mapped Task Instances
       tags:
       - Task Instance
@@ -7128,7 +8652,7 @@
         name: try_number
         required: true
         schema:
-          exclusiveMinimum: 0
+          minimum: 0
           title: Try Number
           type: integer
       - in: query
@@ -7203,6 +8727,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Log
       tags:
       - Task Instance
@@ -7269,6 +8794,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instance Tries
       tags:
       - Task Instance
@@ -7341,6 +8867,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instance Try Details
       tags:
       - Task Instance
@@ -7400,6 +8927,100 @@
           minimum: 0
           title: Offset
           type: integer
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: xcom_key_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: dag_display_name_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: run_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: task_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - in: query
+        name: map_index_filter
+        required: false
+        schema:
+          nullable: true
+          type: integer
+      - in: query
+        name: logical_date_gte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: logical_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: logical_date_lte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: logical_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: run_after_gte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: run_after_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: run_after_lte
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
+        name: run_after_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
       responses:
         '200':
           content:
@@ -7439,6 +9060,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Xcom Entries
       tags:
       - XCom
@@ -7509,6 +9131,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Create Xcom Entry
       tags:
       - XCom
@@ -7605,6 +9228,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Xcom Entry
       tags:
       - XCom
@@ -7681,6 +9305,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Update Xcom Entry
       tags:
       - XCom
@@ -7746,6 +9371,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Mapped Task Instance
       tags:
       - Task Instance
@@ -7836,6 +9462,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Task Instance
       tags:
       - Task Instance
@@ -7901,6 +9528,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task Instance Dependencies
       tags:
       - Task Instance
@@ -7986,9 +9614,153 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Task Instance Dry Run
       tags:
       - Task Instance
+  /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/hitlDetails:
+    get:
+      description: Get a Human-in-the-loop detail of a specific task instance.
+      operationId: get_hitl_detail
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - in: path
+        name: task_id
+        required: true
+        schema:
+          title: Task Id
+          type: string
+      - in: path
+        name: map_index
+        required: true
+        schema:
+          title: Map Index
+          type: integer
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HITLDetail'
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Get Hitl Detail
+      tags:
+      - Task Instance
+    patch:
+      description: Update a Human-in-the-loop detail.
+      operationId: update_hitl_detail
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - in: path
+        name: task_id
+        required: true
+        schema:
+          title: Task Id
+          type: string
+      - in: path
+        name: map_index
+        required: true
+        schema:
+          title: Map Index
+          type: integer
+      requestBody:
+        content:
+          application/json:
+            schema:
+              $ref: '#/components/schemas/UpdateHITLDetailPayload'
+        required: true
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HITLDetailResponse'
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '409':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Conflict
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Update Hitl Detail
+      tags:
+      - Task Instance
   /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries:
     get:
       operationId: get_mapped_task_instance_tries
@@ -8050,6 +9822,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Mapped Task Instance Tries
       tags:
       - Task Instance
@@ -8120,6 +9893,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Mapped Task Instance Try Details
       tags:
       - Task Instance
@@ -8174,9 +9948,93 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Upstream Asset Events
       tags:
       - DagRun
+  /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/wait:
+    get:
+      description: "\U0001F6A7 This is an experimental endpoint and may change or\
+        \ be removed without notice. Successful responses are streamed as newline-delimited\
+        \ JSON (NDJSON). Each line is a JSON object representing the DAG run state."
+      operationId: wait_dag_run_until_finished
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      - in: path
+        name: dag_run_id
+        required: true
+        schema:
+          title: Dag Run Id
+          type: string
+      - description: Seconds to wait between DAG run state checks
+        in: query
+        name: interval
+        required: true
+        schema:
+          description: Seconds to wait between DAG run state checks
+          exclusiveMinimum: 0.0
+          title: Interval
+          type: number
+      - description: Collect result XCom from task. Can be set multiple times.
+        in: query
+        name: result
+        required: false
+        schema:
+          items:
+            type: string
+          nullable: true
+          type: array
+      responses:
+        '200':
+          content:
+            application/json:
+              schema: {}
+            application/x-ndjson:
+              schema:
+                example: '{"state": "running"}
+
+                  {"state": "success", "results": {"op": 42}}
+
+                  '
+                type: string
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: 'Experimental: Wait for a dag run to complete, and return task results
+        if requested.'
+      tags:
+      - DagRun
+      - experimental
   /api/v2/dags/{dag_id}/dagVersions:
     get:
       description: 'Get all DAG Versions.
@@ -8230,9 +10088,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       responses:
         '200':
           content:
@@ -8266,6 +10127,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Versions
       tags:
       - DagVersion
@@ -8319,6 +10181,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Version
       tags:
       - DagVersion
@@ -8372,9 +10235,54 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Dag Details
       tags:
       - DAG
+  /api/v2/dags/{dag_id}/favorite:
+    post:
+      description: Mark the DAG as a favorite.
+      operationId: favorite_dag
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      responses:
+        '204':
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Favorite Dag
+      tags:
+      - DAG
   /api/v2/dags/{dag_id}/tasks:
     get:
       description: Get tasks for DAG.
@@ -8432,6 +10340,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Tasks
       tags:
       - Task
@@ -8490,9 +10399,60 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Task
       tags:
       - Task
+  /api/v2/dags/{dag_id}/unfavorite:
+    post:
+      description: Unmark the DAG as a favorite.
+      operationId: unfavorite_dag
+      parameters:
+      - in: path
+        name: dag_id
+        required: true
+        schema:
+          title: Dag Id
+          type: string
+      responses:
+        '204':
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+        '404':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Not Found
+        '409':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Conflict
+        '422':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPValidationError'
+          description: Validation Error
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Unfavorite Dag
+      tags:
+      - DAG
   /api/v2/eventLogs:
     get:
       description: Get all Event Logs.
@@ -8518,9 +10478,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       - in: query
         name: dag_id
         required: false
@@ -8593,6 +10556,46 @@
           format: date-time
           nullable: true
           type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: dag_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: task_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: run_id_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: owner_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: event_pattern
+        required: false
+        schema:
+          nullable: true
+          type: string
       responses:
         '200':
           content:
@@ -8620,6 +10623,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Event Logs
       tags:
       - Event Log
@@ -8666,6 +10670,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Event Log
       tags:
       - Event Log
@@ -8694,8 +10699,19 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
+        name: filename_pattern
+        required: false
+        schema:
+          nullable: true
           type: string
       responses:
         '200':
@@ -8724,6 +10740,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Import Errors
       tags:
       - Import Error
@@ -8771,6 +10788,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Import Error
       tags:
       - Import Error
@@ -8793,6 +10811,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: start_date_lte
         required: false
         schema:
@@ -8800,6 +10825,13 @@
           nullable: true
           type: string
       - in: query
+        name: start_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_gte
         required: false
         schema:
@@ -8807,6 +10839,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_gt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: end_date_lte
         required: false
         schema:
@@ -8814,6 +10853,13 @@
           nullable: true
           type: string
       - in: query
+        name: end_date_lt
+        required: false
+        schema:
+          format: date-time
+          nullable: true
+          type: string
+      - in: query
         name: limit
         required: false
         schema:
@@ -8833,9 +10879,12 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
+          type: array
       - in: query
         name: job_state
         required: false
@@ -8893,6 +10942,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Jobs
       tags:
       - Job
@@ -8924,9 +10974,7 @@
         '201':
           content:
             application/json:
-              schema:
-                title: Response Reparse Dag File
-                type: 'null'
+              schema: {}
           description: Successful Response
         '401':
           content:
@@ -8954,6 +11002,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Reparse Dag File
       tags:
       - DAG Parsing
@@ -9004,9 +11053,38 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Plugins
       tags:
       - Plugin
+  /api/v2/plugins/importErrors:
+    get:
+      operationId: import_errors
+      responses:
+        '200':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/PluginImportErrorCollectionResponse'
+          description: Successful Response
+        '401':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Unauthorized
+        '403':
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/HTTPExceptionResponse'
+          description: Forbidden
+      security:
+      - OAuth2PasswordBearer: []
+      - HTTPBearer: []
+      summary: Import Errors
+      tags:
+      - Plugin
   /api/v2/pools:
     get:
       description: Get all pools entries.
@@ -9032,10 +11110,15 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
-      - in: query
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: pool_name_pattern
         required: false
         schema:
@@ -9074,6 +11157,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Pools
       tags:
       - Pool
@@ -9113,6 +11197,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Bulk Pools
       tags:
       - Pool
@@ -9158,6 +11243,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Post Pool
       tags:
       - Pool
@@ -9207,6 +11293,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Pool
       tags:
       - Pool
@@ -9253,6 +11340,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Pool
       tags:
       - Pool
@@ -9319,6 +11407,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Pool
       tags:
       - Pool
@@ -9370,6 +11459,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Providers
       tags:
       - Provider
@@ -9398,10 +11488,15 @@
         name: order_by
         required: false
         schema:
-          default: id
+          default:
+          - id
+          items:
+            type: string
           title: Order By
-          type: string
-      - in: query
+          type: array
+      - description: "SQL LIKE expression \u2014 use `%` / `_` wildcards (e.g. `%customer_%`).\
+          \ Regular expressions are **not** supported."
+        in: query
         name: variable_key_pattern
         required: false
         schema:
@@ -9434,6 +11529,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Variables
       tags:
       - Variable
@@ -9473,6 +11569,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Bulk Variables
       tags:
       - Variable
@@ -9518,6 +11615,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Post Variable
       tags:
       - Variable
@@ -9561,6 +11659,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Delete Variable
       tags:
       - Variable
@@ -9607,6 +11706,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Get Variable
       tags:
       - Variable
@@ -9673,6 +11773,7 @@
           description: Validation Error
       security:
       - OAuth2PasswordBearer: []
+      - HTTPBearer: []
       summary: Patch Variable
       tags:
       - Variable
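The new `*_pattern` parameters above (`task_id_pattern`, `run_id_pattern`, `owner_pattern`, `event_pattern`, `filename_pattern`, `pool_name_pattern`, `variable_key_pattern`) all accept SQL LIKE expressions: `%` matches any run of characters, `_` matches exactly one, and regular expressions are not supported. A minimal pure-Python sketch of that matching rule, handy for sanity-checking a filter value before sending it (illustration only, not part of the generated client; case-insensitivity is an assumption, as the spec does not state collation):

```python
import re

def like_to_regex(pattern: str) -> re.Pattern:
    """Translate a SQL LIKE pattern into an anchored regex.

    `%` matches any run of characters, `_` matches exactly one;
    everything else is treated literally (no regex support).
    """
    out = []
    for ch in pattern:
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))
    return re.compile("^" + "".join(out) + "$", re.IGNORECASE)

# The example pattern used throughout the spec descriptions:
matcher = like_to_regex("%customer_%")
```

Note that a bare `_` in a value like `%customer_%` is itself a single-character wildcard, so it happens to match a literal underscore as well as any other character.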
diff --git a/test/test_app_builder_menu_item_response.py b/test/test_app_builder_menu_item_response.py
index c836539..bcc7c1e 100644
--- a/test/test_app_builder_menu_item_response.py
+++ b/test/test_app_builder_menu_item_response.py
@@ -41,6 +41,7 @@
             )
         else:
             return AppBuilderMenuItemResponse(
+                href = '',
                 name = '',
         )
         """
diff --git a/test/test_asset_collection_response.py b/test/test_asset_collection_response.py
index 267b1eb..8f5416e 100644
--- a/test/test_asset_collection_response.py
+++ b/test/test_asset_collection_response.py
@@ -43,16 +43,20 @@
                                 id = 56, 
                                 name = '', )
                             ], 
-                        consuming_dags = [
-                            airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                        consuming_tasks = [
+                            airflow_client.client.models.task_inlet_asset_reference.TaskInletAssetReference(
                                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                                 dag_id = '', 
+                                task_id = '', 
                                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                             ], 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         extra = airflow_client.client.models.extra.extra(), 
                         group = '', 
                         id = 56, 
+                        last_asset_event = airflow_client.client.models.last_asset_event_response.LastAssetEventResponse(
+                            id = 0.0, 
+                            timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), ), 
                         name = '', 
                         producing_tasks = [
                             airflow_client.client.models.task_outlet_asset_reference.TaskOutletAssetReference(
@@ -61,6 +65,12 @@
                                 task_id = '', 
                                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                             ], 
+                        scheduled_dags = [
+                            airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_id = '', 
+                                updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
+                            ], 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         uri = '', )
                     ],
@@ -76,16 +86,20 @@
                                 id = 56, 
                                 name = '', )
                             ], 
-                        consuming_dags = [
-                            airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                        consuming_tasks = [
+                            airflow_client.client.models.task_inlet_asset_reference.TaskInletAssetReference(
                                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                                 dag_id = '', 
+                                task_id = '', 
                                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                             ], 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         extra = airflow_client.client.models.extra.extra(), 
                         group = '', 
                         id = 56, 
+                        last_asset_event = airflow_client.client.models.last_asset_event_response.LastAssetEventResponse(
+                            id = 0.0, 
+                            timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), ), 
                         name = '', 
                         producing_tasks = [
                             airflow_client.client.models.task_outlet_asset_reference.TaskOutletAssetReference(
@@ -94,6 +108,12 @@
                                 task_id = '', 
                                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                             ], 
+                        scheduled_dags = [
+                            airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_id = '', 
+                                updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
+                            ], 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         uri = '', )
                     ],
diff --git a/test/test_asset_response.py b/test/test_asset_response.py
index 507a0db..a9788cf 100644
--- a/test/test_asset_response.py
+++ b/test/test_asset_response.py
@@ -41,16 +41,20 @@
                         id = 56, 
                         name = '', )
                     ],
-                consuming_dags = [
-                    airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                consuming_tasks = [
+                    airflow_client.client.models.task_inlet_asset_reference.TaskInletAssetReference(
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         dag_id = '', 
+                        task_id = '', 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ],
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 extra = airflow_client.client.models.extra.extra(),
                 group = '',
                 id = 56,
+                last_asset_event = airflow_client.client.models.last_asset_event_response.LastAssetEventResponse(
+                    id = 0.0, 
+                    timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), ),
                 name = '',
                 producing_tasks = [
                     airflow_client.client.models.task_outlet_asset_reference.TaskOutletAssetReference(
@@ -59,6 +63,12 @@
                         task_id = '', 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ],
+                scheduled_dags = [
+                    airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_id = '', 
+                        updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
+                    ],
                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 uri = ''
             )
@@ -70,10 +80,11 @@
                         id = 56, 
                         name = '', )
                     ],
-                consuming_dags = [
-                    airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                consuming_tasks = [
+                    airflow_client.client.models.task_inlet_asset_reference.TaskInletAssetReference(
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         dag_id = '', 
+                        task_id = '', 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ],
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -87,6 +98,12 @@
                         task_id = '', 
                         updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ],
+                scheduled_dags = [
+                    airflow_client.client.models.dag_schedule_asset_reference.DagScheduleAssetReference(
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_id = '', 
+                        updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
+                    ],
                 updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 uri = '',
         )
diff --git a/test/test_backfill_collection_response.py b/test/test_backfill_collection_response.py
index e66c7f1..5fac5b1 100644
--- a/test/test_backfill_collection_response.py
+++ b/test/test_backfill_collection_response.py
@@ -39,6 +39,7 @@
                     airflow_client.client.models.backfill_response.BackfillResponse(
                         completed_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                        dag_run_conf = airflow_client.client.models.dag_run_conf.DagRunConf(), 
                         from_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
@@ -57,6 +58,7 @@
                     airflow_client.client.models.backfill_response.BackfillResponse(
                         completed_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                        dag_run_conf = airflow_client.client.models.dag_run_conf.DagRunConf(), 
                         from_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
diff --git a/test/test_backfill_response.py b/test/test_backfill_response.py
index 864988e..64b398a 100644
--- a/test/test_backfill_response.py
+++ b/test/test_backfill_response.py
@@ -37,6 +37,7 @@
             return BackfillResponse(
                 completed_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = '',
                dag_run_conf = airflow_client.client.models.dag_run_conf.DagRunConf(),
                 from_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -50,6 +51,7 @@
         else:
             return BackfillResponse(
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = '',
                dag_run_conf = airflow_client.client.models.dag_run_conf.DagRunConf(),
                 from_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
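Several endpoints above change `order_by` from a single string to an array of strings, enabling multi-column sorting (AIP-84, #53408), e.g. `order_by=["state", "-id"]`. A small pure-Python sketch of the semantics this implies, where a leading `-` means descending and earlier keys take precedence (illustration of the assumed server-side ordering, not client code):

```python
from operator import itemgetter

def multi_sort(rows: list[dict], order_by: list[str]) -> list[dict]:
    """Sort dicts by several keys, '-' prefix meaning descending,
    mirroring the API's `order_by: ["state", "-id"]` style."""
    # Apply keys in reverse so the first key becomes the primary
    # sort (Python's sort is stable, so earlier passes survive as
    # tie-breakers).
    for key in reversed(order_by):
        rows = sorted(rows,
                      key=itemgetter(key.lstrip("-")),
                      reverse=key.startswith("-"))
    return rows
```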
diff --git a/test/test_bulk_body_bulk_task_instance_body.py b/test/test_bulk_body_bulk_task_instance_body.py
new file mode 100644
index 0000000..f109612
--- /dev/null
+++ b/test/test_bulk_body_bulk_task_instance_body.py
@@ -0,0 +1,56 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_body_bulk_task_instance_body import BulkBodyBulkTaskInstanceBody
+
+class TestBulkBodyBulkTaskInstanceBody(unittest.TestCase):
+    """BulkBodyBulkTaskInstanceBody unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkBodyBulkTaskInstanceBody:
+        """Test BulkBodyBulkTaskInstanceBody
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkBodyBulkTaskInstanceBody`
+        """
+        model = BulkBodyBulkTaskInstanceBody()
+        if include_optional:
+            return BulkBodyBulkTaskInstanceBody(
+                actions = [
+                    null
+                    ]
+            )
+        else:
+            return BulkBodyBulkTaskInstanceBody(
+                actions = [
+                    null
+                    ],
+        )
+        """
+
+    def testBulkBodyBulkTaskInstanceBody(self):
+        """Test BulkBodyBulkTaskInstanceBody"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_body_bulk_task_instance_body_actions_inner.py b/test/test_bulk_body_bulk_task_instance_body_actions_inner.py
new file mode 100644
index 0000000..ed1b902
--- /dev/null
+++ b/test/test_bulk_body_bulk_task_instance_body_actions_inner.py
@@ -0,0 +1,60 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_body_bulk_task_instance_body_actions_inner import BulkBodyBulkTaskInstanceBodyActionsInner
+
+class TestBulkBodyBulkTaskInstanceBodyActionsInner(unittest.TestCase):
+    """BulkBodyBulkTaskInstanceBodyActionsInner unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkBodyBulkTaskInstanceBodyActionsInner:
+        """Test BulkBodyBulkTaskInstanceBodyActionsInner
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkBodyBulkTaskInstanceBodyActionsInner`
+        """
+        model = BulkBodyBulkTaskInstanceBodyActionsInner()
+        if include_optional:
+            return BulkBodyBulkTaskInstanceBodyActionsInner(
+                action = 'delete',
+                action_on_existence = 'fail',
+                entities = [
+                    null
+                    ],
+                action_on_non_existence = 'fail'
+            )
+        else:
+            return BulkBodyBulkTaskInstanceBodyActionsInner(
+                action = 'delete',
+                entities = [
+                    null
+                    ],
+        )
+        """
+
+    def testBulkBodyBulkTaskInstanceBodyActionsInner(self):
+        """Test BulkBodyBulkTaskInstanceBodyActionsInner"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_body_connection_body_actions_inner.py b/test/test_bulk_body_connection_body_actions_inner.py
index 780238d..9cb0d2c 100644
--- a/test/test_bulk_body_connection_body_actions_inner.py
+++ b/test/test_bulk_body_connection_body_actions_inner.py
@@ -38,7 +38,7 @@
                 action = 'delete',
                 action_on_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ],
                 action_on_non_existence = 'fail'
             )
@@ -46,7 +46,7 @@
             return BulkBodyConnectionBodyActionsInner(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_body_pool_body_actions_inner.py b/test/test_bulk_body_pool_body_actions_inner.py
index c17dee2..fdbde40 100644
--- a/test/test_bulk_body_pool_body_actions_inner.py
+++ b/test/test_bulk_body_pool_body_actions_inner.py
@@ -38,7 +38,7 @@
                 action = 'delete',
                 action_on_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ],
                 action_on_non_existence = 'fail'
             )
@@ -46,7 +46,7 @@
             return BulkBodyPoolBodyActionsInner(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_body_variable_body_actions_inner.py b/test/test_bulk_body_variable_body_actions_inner.py
index 122d5d4..b061074 100644
--- a/test/test_bulk_body_variable_body_actions_inner.py
+++ b/test/test_bulk_body_variable_body_actions_inner.py
@@ -38,7 +38,7 @@
                 action = 'delete',
                 action_on_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ],
                 action_on_non_existence = 'fail'
             )
@@ -46,7 +46,7 @@
             return BulkBodyVariableBodyActionsInner(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_create_action_bulk_task_instance_body.py b/test/test_bulk_create_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..b6e4fcd
--- /dev/null
+++ b/test/test_bulk_create_action_bulk_task_instance_body.py
@@ -0,0 +1,75 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_create_action_bulk_task_instance_body import BulkCreateActionBulkTaskInstanceBody
+
+class TestBulkCreateActionBulkTaskInstanceBody(unittest.TestCase):
+    """BulkCreateActionBulkTaskInstanceBody unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkCreateActionBulkTaskInstanceBody:
+        """Test BulkCreateActionBulkTaskInstanceBody
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkCreateActionBulkTaskInstanceBody`
+        """
+        model = BulkCreateActionBulkTaskInstanceBody()
+        if include_optional:
+            return BulkCreateActionBulkTaskInstanceBody(
+                action = 'create',
+                action_on_existence = 'fail',
+                entities = [
+                    airflow_client.client.models.bulk_task_instance_body.BulkTaskInstanceBody(
+                        include_downstream = True, 
+                        include_future = True, 
+                        include_past = True, 
+                        include_upstream = True, 
+                        map_index = 56, 
+                        new_state = 'removed', 
+                        note = '', 
+                        task_id = '', )
+                    ]
+            )
+        else:
+            return BulkCreateActionBulkTaskInstanceBody(
+                action = 'create',
+                entities = [
+                    airflow_client.client.models.bulk_task_instance_body.BulkTaskInstanceBody(
+                        include_downstream = True, 
+                        include_future = True, 
+                        include_past = True, 
+                        include_upstream = True, 
+                        map_index = 56, 
+                        new_state = 'removed', 
+                        note = '', 
+                        task_id = '', )
+                    ],
+        )
+        """
+
+    def testBulkCreateActionBulkTaskInstanceBody(self):
+        """Test BulkCreateActionBulkTaskInstanceBody"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_delete_action_bulk_task_instance_body.py b/test/test_bulk_delete_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..13fd1b7
--- /dev/null
+++ b/test/test_bulk_delete_action_bulk_task_instance_body.py
@@ -0,0 +1,59 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body import BulkDeleteActionBulkTaskInstanceBody
+
+class TestBulkDeleteActionBulkTaskInstanceBody(unittest.TestCase):
+    """BulkDeleteActionBulkTaskInstanceBody unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkDeleteActionBulkTaskInstanceBody:
+        """Test BulkDeleteActionBulkTaskInstanceBody
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkDeleteActionBulkTaskInstanceBody`
+        """
+        model = BulkDeleteActionBulkTaskInstanceBody()
+        if include_optional:
+            return BulkDeleteActionBulkTaskInstanceBody(
+                action = 'delete',
+                action_on_non_existence = 'fail',
+                entities = [
+                    null
+                    ]
+            )
+        else:
+            return BulkDeleteActionBulkTaskInstanceBody(
+                action = 'delete',
+                entities = [
+                    null
+                    ],
+        )
+        """
+
+    def testBulkDeleteActionBulkTaskInstanceBody(self):
+        """Test BulkDeleteActionBulkTaskInstanceBody"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_delete_action_bulk_task_instance_body_entities_inner.py b/test/test_bulk_delete_action_bulk_task_instance_body_entities_inner.py
new file mode 100644
index 0000000..d0eae69
--- /dev/null
+++ b/test/test_bulk_delete_action_bulk_task_instance_body_entities_inner.py
@@ -0,0 +1,59 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_delete_action_bulk_task_instance_body_entities_inner import BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+
+class TestBulkDeleteActionBulkTaskInstanceBodyEntitiesInner(unittest.TestCase):
+    """BulkDeleteActionBulkTaskInstanceBodyEntitiesInner unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkDeleteActionBulkTaskInstanceBodyEntitiesInner:
+        """Test BulkDeleteActionBulkTaskInstanceBodyEntitiesInner
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkDeleteActionBulkTaskInstanceBodyEntitiesInner`
+        """
+        model = BulkDeleteActionBulkTaskInstanceBodyEntitiesInner()
+        if include_optional:
+            return BulkDeleteActionBulkTaskInstanceBodyEntitiesInner(
+                include_downstream = True,
+                include_future = True,
+                include_past = True,
+                include_upstream = True,
+                map_index = 56,
+                new_state = 'removed',
+                note = '',
+                task_id = ''
+            )
+        else:
+            return BulkDeleteActionBulkTaskInstanceBodyEntitiesInner(
+                task_id = '',
+        )
+        """
+
+    def testBulkDeleteActionBulkTaskInstanceBodyEntitiesInner(self):
+        """Test BulkDeleteActionBulkTaskInstanceBodyEntitiesInner"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_delete_action_connection_body.py b/test/test_bulk_delete_action_connection_body.py
index b9b7823..5404203 100644
--- a/test/test_bulk_delete_action_connection_body.py
+++ b/test/test_bulk_delete_action_connection_body.py
@@ -38,14 +38,14 @@
                 action = 'delete',
                 action_on_non_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ]
             )
         else:
             return BulkDeleteActionConnectionBody(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_delete_action_pool_body.py b/test/test_bulk_delete_action_pool_body.py
index 8097175..fc9def4 100644
--- a/test/test_bulk_delete_action_pool_body.py
+++ b/test/test_bulk_delete_action_pool_body.py
@@ -38,14 +38,14 @@
                 action = 'delete',
                 action_on_non_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ]
             )
         else:
             return BulkDeleteActionPoolBody(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_delete_action_variable_body.py b/test/test_bulk_delete_action_variable_body.py
index 152618f..d9a827b 100644
--- a/test/test_bulk_delete_action_variable_body.py
+++ b/test/test_bulk_delete_action_variable_body.py
@@ -38,14 +38,14 @@
                 action = 'delete',
                 action_on_non_existence = 'fail',
                 entities = [
-                    ''
+                    null
                     ]
             )
         else:
             return BulkDeleteActionVariableBody(
                 action = 'delete',
                 entities = [
-                    ''
+                    null
                     ],
         )
         """
diff --git a/test/test_bulk_task_instance_body.py b/test/test_bulk_task_instance_body.py
new file mode 100644
index 0000000..e0f7a2c
--- /dev/null
+++ b/test/test_bulk_task_instance_body.py
@@ -0,0 +1,59 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_task_instance_body import BulkTaskInstanceBody
+
+class TestBulkTaskInstanceBody(unittest.TestCase):
+    """BulkTaskInstanceBody unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkTaskInstanceBody:
+        """Test BulkTaskInstanceBody
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkTaskInstanceBody`
+        """
+        model = BulkTaskInstanceBody()
+        if include_optional:
+            return BulkTaskInstanceBody(
+                include_downstream = True,
+                include_future = True,
+                include_past = True,
+                include_upstream = True,
+                map_index = 56,
+                new_state = 'removed',
+                note = '',
+                task_id = ''
+            )
+        else:
+            return BulkTaskInstanceBody(
+                task_id = '',
+        )
+        """
+
+    def testBulkTaskInstanceBody(self):
+        """Test BulkTaskInstanceBody"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_bulk_update_action_bulk_task_instance_body.py b/test/test_bulk_update_action_bulk_task_instance_body.py
new file mode 100644
index 0000000..56ab7b3
--- /dev/null
+++ b/test/test_bulk_update_action_bulk_task_instance_body.py
@@ -0,0 +1,75 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.bulk_update_action_bulk_task_instance_body import BulkUpdateActionBulkTaskInstanceBody
+
+class TestBulkUpdateActionBulkTaskInstanceBody(unittest.TestCase):
+    """BulkUpdateActionBulkTaskInstanceBody unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> BulkUpdateActionBulkTaskInstanceBody:
+        """Test BulkUpdateActionBulkTaskInstanceBody
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `BulkUpdateActionBulkTaskInstanceBody`
+        """
+        model = BulkUpdateActionBulkTaskInstanceBody()
+        if include_optional:
+            return BulkUpdateActionBulkTaskInstanceBody(
+                action = 'update',
+                action_on_non_existence = 'fail',
+                entities = [
+                    airflow_client.client.models.bulk_task_instance_body.BulkTaskInstanceBody(
+                        include_downstream = True, 
+                        include_future = True, 
+                        include_past = True, 
+                        include_upstream = True, 
+                        map_index = 56, 
+                        new_state = 'removed', 
+                        note = '', 
+                        task_id = '', )
+                    ]
+            )
+        else:
+            return BulkUpdateActionBulkTaskInstanceBody(
+                action = 'update',
+                entities = [
+                    airflow_client.client.models.bulk_task_instance_body.BulkTaskInstanceBody(
+                        include_downstream = True, 
+                        include_future = True, 
+                        include_past = True, 
+                        include_upstream = True, 
+                        map_index = 56, 
+                        new_state = 'removed', 
+                        note = '', 
+                        task_id = '', )
+                    ],
+        )
+        """
+
+    def testBulkUpdateActionBulkTaskInstanceBody(self):
+        """Test BulkUpdateActionBulkTaskInstanceBody"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_clear_task_instances_body.py b/test/test_clear_task_instances_body.py
index 8364c87..ea5ba36 100644
--- a/test/test_clear_task_instances_body.py
+++ b/test/test_clear_task_instances_body.py
@@ -45,6 +45,7 @@
                 only_failed = True,
                 only_running = True,
                 reset_dag_runs = True,
+                run_on_latest_version = True,
                 start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 task_ids = [
                     null
diff --git a/test/test_dag_api.py b/test/test_dag_api.py
index d874d87..6f531db 100644
--- a/test/test_dag_api.py
+++ b/test/test_dag_api.py
@@ -33,6 +33,13 @@
         """
         pass
 
+    def test_favorite_dag(self) -> None:
+        """Test case for favorite_dag
+
+        Favorite Dag
+        """
+        pass
+
     def test_get_dag(self) -> None:
         """Test case for get_dag
 
@@ -75,6 +82,13 @@
         """
         pass
 
+    def test_unfavorite_dag(self) -> None:
+        """Test case for unfavorite_dag
+
+        Unfavorite Dag
+        """
+        pass
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/test/test_dag_collection_response.py b/test/test_dag_collection_response.py
index 21d5db7..f4a24b6 100644
--- a/test/test_dag_collection_response.py
+++ b/test/test_dag_collection_response.py
@@ -49,6 +49,7 @@
                         is_paused = True, 
                         is_stale = True, 
                         last_expired = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        last_parse_duration = 1.337, 
                         last_parsed_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         max_active_runs = 56, 
                         max_active_tasks = 56, 
@@ -63,6 +64,7 @@
                         relative_fileloc = '', 
                         tags = [
                             airflow_client.client.models.dag_tag_response.DagTagResponse(
+                                dag_display_name = '', 
                                 dag_id = '', 
                                 name = '', )
                             ], 
@@ -87,6 +89,7 @@
                         is_paused = True, 
                         is_stale = True, 
                         last_expired = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        last_parse_duration = 1.337, 
                         last_parsed_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         max_active_runs = 56, 
                         max_active_tasks = 56, 
@@ -101,6 +104,7 @@
                         relative_fileloc = '', 
                         tags = [
                             airflow_client.client.models.dag_tag_response.DagTagResponse(
+                                dag_display_name = '', 
                                 dag_id = '', 
                                 name = '', )
                             ], 
diff --git a/test/test_dag_details_response.py b/test/test_dag_details_response.py
index d43e503..87509a8 100644
--- a/test/test_dag_details_response.py
+++ b/test/test_dag_details_response.py
@@ -43,6 +43,7 @@
                 dag_display_name = '',
                 dag_id = '',
                 dag_run_timeout = '',
+                default_args = airflow_client.client.models.extra.extra(),
                 description = '',
                 doc_md = '',
                 end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -54,6 +55,7 @@
                 is_paused_upon_creation = True,
                 is_stale = True,
                 last_expired = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                last_parse_duration = 1.337,
                 last_parsed = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 last_parsed_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 latest_dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -61,6 +63,7 @@
                     bundle_url = '', 
                     bundle_version = '', 
                     created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    dag_display_name = '', 
                     dag_id = '', 
                     id = '', 
                     version_number = 56, ),
@@ -83,6 +86,7 @@
                 start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 tags = [
                     airflow_client.client.models.dag_tag_response.DagTagResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         name = '', )
                     ],
@@ -113,6 +117,7 @@
                 render_template_as_native_obj = True,
                 tags = [
                     airflow_client.client.models.dag_tag_response.DagTagResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         name = '', )
                     ],
diff --git a/test/test_dag_response.py b/test/test_dag_response.py
index 3d67d2e..7896db6 100644
--- a/test/test_dag_response.py
+++ b/test/test_dag_response.py
@@ -47,6 +47,7 @@
                 is_paused = True,
                 is_stale = True,
                 last_expired = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                last_parse_duration = 1.337,
                 last_parsed_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 max_active_runs = 56,
                 max_active_tasks = 56,
@@ -61,6 +62,7 @@
                 relative_fileloc = '',
                 tags = [
                     airflow_client.client.models.dag_tag_response.DagTagResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         name = '', )
                     ],
@@ -84,6 +86,7 @@
                     ],
                 tags = [
                     airflow_client.client.models.dag_tag_response.DagTagResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         name = '', )
                     ],
diff --git a/test/test_dag_run_api.py b/test/test_dag_run_api.py
index 1d2e164..2541cae 100644
--- a/test/test_dag_run_api.py
+++ b/test/test_dag_run_api.py
@@ -82,6 +82,13 @@
         """
         pass
 
+    def test_wait_dag_run_until_finished(self) -> None:
+        """Test case for wait_dag_run_until_finished
+
+        Experimental: Wait for a dag run to complete, and return task results if requested.
+        """
+        pass
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/test/test_dag_run_clear_body.py b/test/test_dag_run_clear_body.py
index 5e6dfb0..669df41 100644
--- a/test/test_dag_run_clear_body.py
+++ b/test/test_dag_run_clear_body.py
@@ -36,7 +36,8 @@
         if include_optional:
             return DAGRunClearBody(
                 dry_run = True,
-                only_failed = True
+                only_failed = True,
+                run_on_latest_version = True
             )
         else:
             return DAGRunClearBody(
diff --git a/test/test_dag_run_collection_response.py b/test/test_dag_run_collection_response.py
index 1355672..04877f4 100644
--- a/test/test_dag_run_collection_response.py
+++ b/test/test_dag_run_collection_response.py
@@ -39,6 +39,7 @@
                     airflow_client.client.models.dag_run_response.DAGRunResponse(
                         bundle_version = '', 
                         conf = airflow_client.client.models.extra.extra(), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_versions = [
@@ -47,12 +48,14 @@
                                 bundle_url = '', 
                                 bundle_version = '', 
                                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_display_name = '', 
                                 dag_id = '', 
                                 id = '', 
                                 version_number = 56, )
                             ], 
                         data_interval_end = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         data_interval_start = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        duration = 1.337, 
                         end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         last_scheduling_decision = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
@@ -62,7 +65,8 @@
                         run_type = 'backfill', 
                         start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         state = 'queued', 
-                        triggered_by = 'cli', )
+                        triggered_by = 'cli', 
+                        triggering_user_name = '', )
                     ],
                 total_entries = 56
             )
@@ -72,6 +76,7 @@
                     airflow_client.client.models.dag_run_response.DAGRunResponse(
                         bundle_version = '', 
                         conf = airflow_client.client.models.extra.extra(), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_versions = [
@@ -80,12 +85,14 @@
                                 bundle_url = '', 
                                 bundle_version = '', 
                                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_display_name = '', 
                                 dag_id = '', 
                                 id = '', 
                                 version_number = 56, )
                             ], 
                         data_interval_end = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         data_interval_start = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        duration = 1.337, 
                         end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         last_scheduling_decision = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
@@ -95,7 +102,8 @@
                         run_type = 'backfill', 
                         start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         state = 'queued', 
-                        triggered_by = 'cli', )
+                        triggered_by = 'cli', 
+                        triggering_user_name = '', )
                     ],
                 total_entries = 56,
         )
diff --git a/test/test_dag_run_response.py b/test/test_dag_run_response.py
index 9064ed5..7e37897 100644
--- a/test/test_dag_run_response.py
+++ b/test/test_dag_run_response.py
@@ -37,6 +37,7 @@
             return DAGRunResponse(
                 bundle_version = '',
                 conf = airflow_client.client.models.extra.extra(),
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_versions = [
@@ -45,12 +46,14 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
                     ],
                 data_interval_end = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 data_interval_start = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                duration = 1.337,
                 end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 last_scheduling_decision = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -60,10 +63,12 @@
                 run_type = 'backfill',
                 start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 state = 'queued',
-                triggered_by = 'cli'
+                triggered_by = 'cli',
+                triggering_user_name = ''
             )
         else:
             return DAGRunResponse(
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_versions = [
@@ -72,6 +77,7 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
diff --git a/test/test_dag_runs_batch_body.py b/test/test_dag_runs_batch_body.py
index 3888342..62555a2 100644
--- a/test/test_dag_runs_batch_body.py
+++ b/test/test_dag_runs_batch_body.py
@@ -38,16 +38,24 @@
                 dag_ids = [
                     ''
                     ],
+                end_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 end_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                end_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 end_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                logical_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                logical_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 order_by = '',
                 page_limit = 0.0,
                 page_offset = 0.0,
+                run_after_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 run_after_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                run_after_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 run_after_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                start_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 start_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                start_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 start_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 states = [
                     'queued'
diff --git a/test/test_dag_source_response.py b/test/test_dag_source_response.py
index a95b9dc..ca636a6 100644
--- a/test/test_dag_source_response.py
+++ b/test/test_dag_source_response.py
@@ -36,11 +36,13 @@
         if include_optional:
             return DAGSourceResponse(
                 content = '',
+                dag_display_name = '',
                 dag_id = '',
                 version_number = 56
             )
         else:
             return DAGSourceResponse(
+                dag_display_name = '',
                 dag_id = '',
         )
         """
diff --git a/test/test_dag_stats_collection_response.py b/test/test_dag_stats_collection_response.py
index a3fac41..36ee568 100644
--- a/test/test_dag_stats_collection_response.py
+++ b/test/test_dag_stats_collection_response.py
@@ -37,6 +37,7 @@
             return DagStatsCollectionResponse(
                 dags = [
                     airflow_client.client.models.dag_stats_response.DagStatsResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         stats = [
                             airflow_client.client.models.dag_stats_state_response.DagStatsStateResponse(
@@ -50,6 +51,7 @@
             return DagStatsCollectionResponse(
                 dags = [
                     airflow_client.client.models.dag_stats_response.DagStatsResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         stats = [
                             airflow_client.client.models.dag_stats_state_response.DagStatsStateResponse(
diff --git a/test/test_dag_stats_response.py b/test/test_dag_stats_response.py
index 9a53097..4c48277 100644
--- a/test/test_dag_stats_response.py
+++ b/test/test_dag_stats_response.py
@@ -35,6 +35,7 @@
         model = DagStatsResponse()
         if include_optional:
             return DagStatsResponse(
+                dag_display_name = '',
                 dag_id = '',
                 stats = [
                     airflow_client.client.models.dag_stats_state_response.DagStatsStateResponse(
@@ -44,6 +45,7 @@
             )
         else:
             return DagStatsResponse(
+                dag_display_name = '',
                 dag_id = '',
                 stats = [
                     airflow_client.client.models.dag_stats_state_response.DagStatsStateResponse(
diff --git a/test/test_dag_tag_response.py b/test/test_dag_tag_response.py
index 2d341b8..c7cffec 100644
--- a/test/test_dag_tag_response.py
+++ b/test/test_dag_tag_response.py
@@ -35,11 +35,13 @@
         model = DagTagResponse()
         if include_optional:
             return DagTagResponse(
+                dag_display_name = '',
                 dag_id = '',
                 name = ''
             )
         else:
             return DagTagResponse(
+                dag_display_name = '',
                 dag_id = '',
                 name = '',
         )
diff --git a/test/test_dag_version_collection_response.py b/test/test_dag_version_collection_response.py
index cf35afb..2d311df 100644
--- a/test/test_dag_version_collection_response.py
+++ b/test/test_dag_version_collection_response.py
@@ -41,6 +41,7 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
@@ -55,6 +56,7 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
diff --git a/test/test_dag_version_response.py b/test/test_dag_version_response.py
index ef4adba..04060db 100644
--- a/test/test_dag_version_response.py
+++ b/test/test_dag_version_response.py
@@ -39,6 +39,7 @@
                 bundle_url = '',
                 bundle_version = '',
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = '',
                 id = '',
                 version_number = 56
@@ -46,6 +47,7 @@
         else:
             return DagVersionResponse(
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = '',
                 id = '',
                 version_number = 56,
diff --git a/test/test_dag_warning_collection_response.py b/test/test_dag_warning_collection_response.py
index e351aba..c96a52f 100644
--- a/test/test_dag_warning_collection_response.py
+++ b/test/test_dag_warning_collection_response.py
@@ -37,6 +37,7 @@
             return DAGWarningCollectionResponse(
                 dag_warnings = [
                     airflow_client.client.models.dag_warning_response.DAGWarningResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         message = '', 
                         timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
@@ -48,6 +49,7 @@
             return DAGWarningCollectionResponse(
                 dag_warnings = [
                     airflow_client.client.models.dag_warning_response.DAGWarningResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         message = '', 
                         timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
diff --git a/test/test_dag_warning_response.py b/test/test_dag_warning_response.py
index 595a705..8df24b0 100644
--- a/test/test_dag_warning_response.py
+++ b/test/test_dag_warning_response.py
@@ -35,6 +35,7 @@
         model = DAGWarningResponse()
         if include_optional:
             return DAGWarningResponse(
+                dag_display_name = '',
                 dag_id = '',
                 message = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -42,6 +43,7 @@
             )
         else:
             return DAGWarningResponse(
+                dag_display_name = '',
                 dag_id = '',
                 message = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
diff --git a/test/test_event_log_collection_response.py b/test/test_event_log_collection_response.py
index 2fe92de..2c8cc35 100644
--- a/test/test_event_log_collection_response.py
+++ b/test/test_event_log_collection_response.py
@@ -37,6 +37,7 @@
             return EventLogCollectionResponse(
                 event_logs = [
                     airflow_client.client.models.event_log_response.EventLogResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         event = '', 
                         event_log_id = 56, 
@@ -55,6 +56,7 @@
             return EventLogCollectionResponse(
                 event_logs = [
                     airflow_client.client.models.event_log_response.EventLogResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         event = '', 
                         event_log_id = 56, 
diff --git a/test/test_event_log_response.py b/test/test_event_log_response.py
index 8aab2ec..47e0e6f 100644
--- a/test/test_event_log_response.py
+++ b/test/test_event_log_response.py
@@ -35,6 +35,7 @@
         model = EventLogResponse()
         if include_optional:
             return EventLogResponse(
+                dag_display_name = '',
                 dag_id = '',
                 event = '',
                 event_log_id = 56,
diff --git a/test/test_experimental_api.py b/test/test_experimental_api.py
new file mode 100644
index 0000000..1f5cbb9
--- /dev/null
+++ b/test/test_experimental_api.py
@@ -0,0 +1,38 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.api.experimental_api import ExperimentalApi
+
+
+class TestExperimentalApi(unittest.TestCase):
+    """ExperimentalApi unit test stubs"""
+
+    def setUp(self) -> None:
+        self.api = ExperimentalApi()
+
+    def tearDown(self) -> None:
+        pass
+
+    def test_wait_dag_run_until_finished(self) -> None:
+        """Test case for wait_dag_run_until_finished
+
+        Experimental: Wait for a dag run to complete, and return task results if requested.
+        """
+        pass
+
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_external_log_url_response.py b/test/test_external_log_url_response.py
new file mode 100644
index 0000000..76fd7cb
--- /dev/null
+++ b/test/test_external_log_url_response.py
@@ -0,0 +1,52 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.external_log_url_response import ExternalLogUrlResponse
+
+class TestExternalLogUrlResponse(unittest.TestCase):
+    """ExternalLogUrlResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> ExternalLogUrlResponse:
+        """Test ExternalLogUrlResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `ExternalLogUrlResponse`
+        """
+        model = ExternalLogUrlResponse()
+        if include_optional:
+            return ExternalLogUrlResponse(
+                url = ''
+            )
+        else:
+            return ExternalLogUrlResponse(
+                url = '',
+        )
+        """
+
+    def testExternalLogUrlResponse(self):
+        """Test ExternalLogUrlResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_external_view_response.py b/test/test_external_view_response.py
new file mode 100644
index 0000000..c82e07d
--- /dev/null
+++ b/test/test_external_view_response.py
@@ -0,0 +1,59 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.external_view_response import ExternalViewResponse
+
+class TestExternalViewResponse(unittest.TestCase):
+    """ExternalViewResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> ExternalViewResponse:
+        """Test ExternalViewResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `ExternalViewResponse`
+        """
+        model = ExternalViewResponse()
+        if include_optional:
+            return ExternalViewResponse(
+                category = '',
+                destination = 'nav',
+                href = '',
+                icon = '',
+                icon_dark_mode = '',
+                name = '',
+                url_route = ''
+            )
+        else:
+            return ExternalViewResponse(
+                href = '',
+                name = '',
+        )
+        """
+
+    def testExternalViewResponse(self):
+        """Test ExternalViewResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_hitl_detail.py b/test/test_hitl_detail.py
new file mode 100644
index 0000000..498270f
--- /dev/null
+++ b/test/test_hitl_detail.py
@@ -0,0 +1,196 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.hitl_detail import HITLDetail
+
+class TestHITLDetail(unittest.TestCase):
+    """HITLDetail unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> HITLDetail:
+        """Test HITLDetail
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `HITLDetail`
+        """
+        model = HITLDetail()
+        if include_optional:
+            return HITLDetail(
+                assigned_users = [
+                    airflow_client.client.models.hitl_user.HITLUser(
+                        id = '', 
+                        name = '', )
+                    ],
+                body = '',
+                chosen_options = [
+                    ''
+                    ],
+                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                defaults = [
+                    ''
+                    ],
+                multiple = True,
+                options = [
+                    ''
+                    ],
+                params = airflow_client.client.models.params.Params(),
+                params_input = airflow_client.client.models.params_input.ParamsInput(),
+                responded_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                responded_by_user = airflow_client.client.models.hitl_user.HITLUser(
+                    id = '', 
+                    name = '', ),
+                response_received = True,
+                subject = '',
+                task_instance = airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                    dag_display_name = '', 
+                    dag_id = '', 
+                    dag_run_id = '', 
+                    dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
+                        bundle_name = '', 
+                        bundle_url = '', 
+                        bundle_version = '', 
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
+                        dag_id = '', 
+                        id = '', 
+                        version_number = 56, ), 
+                    duration = 1.337, 
+                    end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    executor = '', 
+                    executor_config = '', 
+                    hostname = '', 
+                    id = '', 
+                    logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    map_index = 56, 
+                    max_tries = 56, 
+                    note = '', 
+                    operator = '', 
+                    operator_name = '', 
+                    pid = 56, 
+                    pool = '', 
+                    pool_slots = 56, 
+                    priority_weight = 56, 
+                    queue = '', 
+                    queued_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    rendered_fields = airflow_client.client.models.rendered_fields.RenderedFields(), 
+                    rendered_map_index = '', 
+                    run_after = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    scheduled_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    state = 'removed', 
+                    task_display_name = '', 
+                    task_id = '', 
+                    trigger = airflow_client.client.models.trigger_response.TriggerResponse(
+                        classpath = '', 
+                        created_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        id = 56, 
+                        kwargs = '', 
+                        triggerer_id = 56, ), 
+                    triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                        dag_display_name = '', 
+                        dag_id = '', 
+                        end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        executor_class = '', 
+                        hostname = '', 
+                        id = 56, 
+                        job_type = '', 
+                        latest_heartbeat = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        unixname = '', ), 
+                    try_number = 56, 
+                    unixname = '', )
+            )
+        else:
+            return HITLDetail(
+                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                options = [
+                    ''
+                    ],
+                subject = '',
+                task_instance = airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                    dag_display_name = '', 
+                    dag_id = '', 
+                    dag_run_id = '', 
+                    dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
+                        bundle_name = '', 
+                        bundle_url = '', 
+                        bundle_version = '', 
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
+                        dag_id = '', 
+                        id = '', 
+                        version_number = 56, ), 
+                    duration = 1.337, 
+                    end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    executor = '', 
+                    executor_config = '', 
+                    hostname = '', 
+                    id = '', 
+                    logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    map_index = 56, 
+                    max_tries = 56, 
+                    note = '', 
+                    operator = '', 
+                    operator_name = '', 
+                    pid = 56, 
+                    pool = '', 
+                    pool_slots = 56, 
+                    priority_weight = 56, 
+                    queue = '', 
+                    queued_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    rendered_fields = airflow_client.client.models.rendered_fields.RenderedFields(), 
+                    rendered_map_index = '', 
+                    run_after = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    scheduled_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    state = 'removed', 
+                    task_display_name = '', 
+                    task_id = '', 
+                    trigger = airflow_client.client.models.trigger_response.TriggerResponse(
+                        classpath = '', 
+                        created_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        id = 56, 
+                        kwargs = '', 
+                        triggerer_id = 56, ), 
+                    triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                        dag_display_name = '', 
+                        dag_id = '', 
+                        end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        executor_class = '', 
+                        hostname = '', 
+                        id = 56, 
+                        job_type = '', 
+                        latest_heartbeat = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        unixname = '', ), 
+                    try_number = 56, 
+                    unixname = '', ),
+        )
+        """
+
+    def testHITLDetail(self):
+        """Test HITLDetail"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_hitl_detail_collection.py b/test/test_hitl_detail_collection.py
new file mode 100644
index 0000000..8873d51
--- /dev/null
+++ b/test/test_hitl_detail_collection.py
@@ -0,0 +1,224 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.hitl_detail_collection import HITLDetailCollection
+
+class TestHITLDetailCollection(unittest.TestCase):
+    """HITLDetailCollection unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> HITLDetailCollection:
+        """Test HITLDetailCollection
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `HITLDetailCollection`
+        """
+        model = HITLDetailCollection()
+        if include_optional:
+            return HITLDetailCollection(
+                hitl_details = [
+                    airflow_client.client.models.hitl_detail.HITLDetail(
+                        assigned_users = [
+                            airflow_client.client.models.hitl_user.HITLUser(
+                                id = '', 
+                                name = '', )
+                            ], 
+                        body = '', 
+                        chosen_options = [
+                            ''
+                            ], 
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        defaults = [
+                            ''
+                            ], 
+                        multiple = True, 
+                        options = [
+                            ''
+                            ], 
+                        params = airflow_client.client.models.params.Params(), 
+                        params_input = airflow_client.client.models.params_input.ParamsInput(), 
+                        responded_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        responded_by_user = airflow_client.client.models.hitl_user.HITLUser(
+                            id = '', 
+                            name = '', ), 
+                        response_received = True, 
+                        subject = '', 
+                        task_instance = airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                            dag_display_name = '', 
+                            dag_id = '', 
+                            dag_run_id = '', 
+                            dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
+                                bundle_name = '', 
+                                bundle_url = '', 
+                                bundle_version = '', 
+                                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_display_name = '', 
+                                dag_id = '', 
+                                id = '', 
+                                version_number = 56, ), 
+                            duration = 1.337, 
+                            end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            executor = '', 
+                            executor_config = '', 
+                            hostname = '', 
+                            id = '', 
+                            logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            map_index = 56, 
+                            max_tries = 56, 
+                            note = '', 
+                            operator = '', 
+                            operator_name = '', 
+                            pid = 56, 
+                            pool = '', 
+                            pool_slots = 56, 
+                            priority_weight = 56, 
+                            queue = '', 
+                            queued_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            rendered_fields = airflow_client.client.models.rendered_fields.RenderedFields(), 
+                            rendered_map_index = '', 
+                            run_after = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            scheduled_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            state = 'removed', 
+                            task_display_name = '', 
+                            task_id = '', 
+                            trigger = airflow_client.client.models.trigger_response.TriggerResponse(
+                                classpath = '', 
+                                created_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                id = 56, 
+                                kwargs = '', 
+                                triggerer_id = 56, ), 
+                            triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                                dag_display_name = '', 
+                                dag_id = '', 
+                                end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                executor_class = '', 
+                                hostname = '', 
+                                id = 56, 
+                                job_type = '', 
+                                latest_heartbeat = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                unixname = '', ), 
+                            try_number = 56, 
+                            unixname = '', ), )
+                    ],
+                total_entries = 56
+            )
+        else:
+            return HITLDetailCollection(
+                hitl_details = [
+                    airflow_client.client.models.hitl_detail.HITLDetail(
+                        assigned_users = [
+                            airflow_client.client.models.hitl_user.HITLUser(
+                                id = '', 
+                                name = '', )
+                            ], 
+                        body = '', 
+                        chosen_options = [
+                            ''
+                            ], 
+                        created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        defaults = [
+                            ''
+                            ], 
+                        multiple = True, 
+                        options = [
+                            ''
+                            ], 
+                        params = airflow_client.client.models.params.Params(), 
+                        params_input = airflow_client.client.models.params_input.ParamsInput(), 
+                        responded_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        responded_by_user = airflow_client.client.models.hitl_user.HITLUser(
+                            id = '', 
+                            name = '', ), 
+                        response_received = True, 
+                        subject = '', 
+                        task_instance = airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                            dag_display_name = '', 
+                            dag_id = '', 
+                            dag_run_id = '', 
+                            dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
+                                bundle_name = '', 
+                                bundle_url = '', 
+                                bundle_version = '', 
+                                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                dag_display_name = '', 
+                                dag_id = '', 
+                                id = '', 
+                                version_number = 56, ), 
+                            duration = 1.337, 
+                            end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            executor = '', 
+                            executor_config = '', 
+                            hostname = '', 
+                            id = '', 
+                            logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            map_index = 56, 
+                            max_tries = 56, 
+                            note = '', 
+                            operator = '', 
+                            operator_name = '', 
+                            pid = 56, 
+                            pool = '', 
+                            pool_slots = 56, 
+                            priority_weight = 56, 
+                            queue = '', 
+                            queued_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            rendered_fields = airflow_client.client.models.rendered_fields.RenderedFields(), 
+                            rendered_map_index = '', 
+                            run_after = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            scheduled_when = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            state = 'removed', 
+                            task_display_name = '', 
+                            task_id = '', 
+                            trigger = airflow_client.client.models.trigger_response.TriggerResponse(
+                                classpath = '', 
+                                created_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                id = 56, 
+                                kwargs = '', 
+                                triggerer_id = 56, ), 
+                            triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                                dag_display_name = '', 
+                                dag_id = '', 
+                                end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                executor_class = '', 
+                                hostname = '', 
+                                id = 56, 
+                                job_type = '', 
+                                latest_heartbeat = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                                unixname = '', ), 
+                            try_number = 56, 
+                            unixname = '', ), )
+                    ],
+                total_entries = 56,
+        )
+        """
+
+    def testHITLDetailCollection(self):
+        """Test HITLDetailCollection"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_hitl_detail_response.py b/test/test_hitl_detail_response.py
new file mode 100644
index 0000000..b2ded33
--- /dev/null
+++ b/test/test_hitl_detail_response.py
@@ -0,0 +1,65 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.hitl_detail_response import HITLDetailResponse
+
+class TestHITLDetailResponse(unittest.TestCase):
+    """HITLDetailResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> HITLDetailResponse:
+        """Test HITLDetailResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `HITLDetailResponse`
+        """
+        model = HITLDetailResponse()
+        if include_optional:
+            return HITLDetailResponse(
+                chosen_options = [
+                    ''
+                    ],
+                params_input = airflow_client.client.models.params_input.ParamsInput(),
+                responded_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                responded_by = airflow_client.client.models.hitl_user.HITLUser(
+                    id = '', 
+                    name = '', )
+            )
+        else:
+            return HITLDetailResponse(
+                chosen_options = [
+                    ''
+                    ],
+                responded_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                responded_by = airflow_client.client.models.hitl_user.HITLUser(
+                    id = '', 
+                    name = '', ),
+        )
+        """
+
+    def testHITLDetailResponse(self):
+        """Test HITLDetailResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
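The stub above only sketches construction of the new HITL models. A standalone sketch of the same field shape, using plain dataclasses since the real `HITLDetailResponse` and `HITLUser` are generated Pydantic models (field names and required/optional split are taken from the stub; the sample values are illustrative):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class HITLUser:
    """Mirror of the generated HITLUser model: both fields are required."""
    id: str
    name: str


@dataclass
class HITLDetailResponse:
    """Mirror of the generated HITLDetailResponse model.

    chosen_options, responded_at and responded_by are required;
    params_input is optional, defaulting here to an empty mapping.
    """
    chosen_options: List[str]
    responded_at: datetime
    responded_by: HITLUser
    params_input: dict = field(default_factory=dict)


# Construct an instance the way the stub's include_optional=False branch would.
detail = HITLDetailResponse(
    chosen_options=["approve"],
    responded_at=datetime(2013, 10, 20, 19, 20, 30),
    responded_by=HITLUser(id="u1", name="Alice"),
)
```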
diff --git a/test/test_hitl_user.py b/test/test_hitl_user.py
new file mode 100644
index 0000000..fd247dc
--- /dev/null
+++ b/test/test_hitl_user.py
@@ -0,0 +1,54 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.hitl_user import HITLUser
+
+class TestHITLUser(unittest.TestCase):
+    """HITLUser unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> HITLUser:
+        """Test HITLUser
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `HITLUser`
+        """
+        model = HITLUser()
+        if include_optional:
+            return HITLUser(
+                id = '',
+                name = ''
+            )
+        else:
+            return HITLUser(
+                id = '',
+                name = '',
+        )
+        """
+
+    def testHITLUser(self):
+        """Test HITLUser"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_job_collection_response.py b/test/test_job_collection_response.py
index 062ff99..24c2038 100644
--- a/test/test_job_collection_response.py
+++ b/test/test_job_collection_response.py
@@ -37,6 +37,7 @@
             return JobCollectionResponse(
                 jobs = [
                     airflow_client.client.models.job_response.JobResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         executor_class = '', 
@@ -54,6 +55,7 @@
             return JobCollectionResponse(
                 jobs = [
                     airflow_client.client.models.job_response.JobResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         executor_class = '', 
diff --git a/test/test_job_response.py b/test/test_job_response.py
index e18d024..32112bd 100644
--- a/test/test_job_response.py
+++ b/test/test_job_response.py
@@ -35,6 +35,7 @@
         model = JobResponse()
         if include_optional:
             return JobResponse(
+                dag_display_name = '',
                 dag_id = '',
                 end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 executor_class = '',
diff --git a/test/test_last_asset_event_response.py b/test/test_last_asset_event_response.py
new file mode 100644
index 0000000..eb21138
--- /dev/null
+++ b/test/test_last_asset_event_response.py
@@ -0,0 +1,52 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.last_asset_event_response import LastAssetEventResponse
+
+class TestLastAssetEventResponse(unittest.TestCase):
+    """LastAssetEventResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> LastAssetEventResponse:
+        """Test LastAssetEventResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `LastAssetEventResponse`
+        """
+        model = LastAssetEventResponse()
+        if include_optional:
+            return LastAssetEventResponse(
+                id = 0.0,
+                timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f')
+            )
+        else:
+            return LastAssetEventResponse(
+        )
+        """
+
+    def testLastAssetEventResponse(self):
+        """Test LastAssetEventResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_login_api.py b/test/test_login_api.py
index d52ea08..007e38c 100644
--- a/test/test_login_api.py
+++ b/test/test_login_api.py
@@ -40,6 +40,13 @@
         """
         pass
 
+    def test_refresh(self) -> None:
+        """Test case for refresh
+
+        Refresh
+        """
+        pass
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/test/test_plugin_api.py b/test/test_plugin_api.py
index 16e26ff..dbb929c 100644
--- a/test/test_plugin_api.py
+++ b/test/test_plugin_api.py
@@ -33,6 +33,13 @@
         """
         pass
 
+    def test_import_errors(self) -> None:
+        """Test case for import_errors
+
+        Import Errors
+        """
+        pass
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/test/test_plugin_collection_response.py b/test/test_plugin_collection_response.py
index 2b44470..56e67cc 100644
--- a/test/test_plugin_collection_response.py
+++ b/test/test_plugin_collection_response.py
@@ -50,6 +50,16 @@
                                 name = '', 
                                 view = '', )
                             ], 
+                        external_views = [
+                            airflow_client.client.models.external_view_response.ExternalViewResponse(
+                                category = '', 
+                                destination = 'nav', 
+                                href = '', 
+                                icon = '', 
+                                icon_dark_mode = '', 
+                                name = '', 
+                                url_route = '', )
+                            ], 
                         fastapi_apps = [
                             airflow_client.client.models.fast_api_app_response.FastAPIAppResponse(
                                 app = '', 
@@ -77,6 +87,16 @@
                         operator_extra_links = [
                             ''
                             ], 
+                        react_apps = [
+                            airflow_client.client.models.react_app_response.ReactAppResponse(
+                                bundle_url = '', 
+                                category = '', 
+                                destination = 'nav', 
+                                icon = '', 
+                                icon_dark_mode = '', 
+                                name = '', 
+                                url_route = '', )
+                            ], 
                         source = '', 
                         timetables = [
                             ''
@@ -101,6 +121,16 @@
                                 name = '', 
                                 view = '', )
                             ], 
+                        external_views = [
+                            airflow_client.client.models.external_view_response.ExternalViewResponse(
+                                category = '', 
+                                destination = 'nav', 
+                                href = '', 
+                                icon = '', 
+                                icon_dark_mode = '', 
+                                name = '', 
+                                url_route = '', )
+                            ], 
                         fastapi_apps = [
                             airflow_client.client.models.fast_api_app_response.FastAPIAppResponse(
                                 app = '', 
@@ -128,6 +158,16 @@
                         operator_extra_links = [
                             ''
                             ], 
+                        react_apps = [
+                            airflow_client.client.models.react_app_response.ReactAppResponse(
+                                bundle_url = '', 
+                                category = '', 
+                                destination = 'nav', 
+                                icon = '', 
+                                icon_dark_mode = '', 
+                                name = '', 
+                                url_route = '', )
+                            ], 
                         source = '', 
                         timetables = [
                             ''
diff --git a/test/test_plugin_import_error_collection_response.py b/test/test_plugin_import_error_collection_response.py
new file mode 100644
index 0000000..1022103
--- /dev/null
+++ b/test/test_plugin_import_error_collection_response.py
@@ -0,0 +1,62 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.plugin_import_error_collection_response import PluginImportErrorCollectionResponse
+
+class TestPluginImportErrorCollectionResponse(unittest.TestCase):
+    """PluginImportErrorCollectionResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> PluginImportErrorCollectionResponse:
+        """Test PluginImportErrorCollectionResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `PluginImportErrorCollectionResponse`
+        """
+        model = PluginImportErrorCollectionResponse()
+        if include_optional:
+            return PluginImportErrorCollectionResponse(
+                import_errors = [
+                    airflow_client.client.models.plugin_import_error_response.PluginImportErrorResponse(
+                        error = '', 
+                        source = '', )
+                    ],
+                total_entries = 56
+            )
+        else:
+            return PluginImportErrorCollectionResponse(
+                import_errors = [
+                    airflow_client.client.models.plugin_import_error_response.PluginImportErrorResponse(
+                        error = '', 
+                        source = '', )
+                    ],
+                total_entries = 56,
+        )
+        """
+
+    def testPluginImportErrorCollectionResponse(self):
+        """Test PluginImportErrorCollectionResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_plugin_import_error_response.py b/test/test_plugin_import_error_response.py
new file mode 100644
index 0000000..eab0f75
--- /dev/null
+++ b/test/test_plugin_import_error_response.py
@@ -0,0 +1,54 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.plugin_import_error_response import PluginImportErrorResponse
+
+class TestPluginImportErrorResponse(unittest.TestCase):
+    """PluginImportErrorResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> PluginImportErrorResponse:
+        """Test PluginImportErrorResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `PluginImportErrorResponse`
+        """
+        model = PluginImportErrorResponse()
+        if include_optional:
+            return PluginImportErrorResponse(
+                error = '',
+                source = ''
+            )
+        else:
+            return PluginImportErrorResponse(
+                error = '',
+                source = '',
+        )
+        """
+
+    def testPluginImportErrorResponse(self):
+        """Test PluginImportErrorResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
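The two plugin import-error models above follow the client's usual item/collection pairing. A dataclass sketch of that shape (the error message and source path below are hypothetical sample values, not from the source):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class PluginImportErrorResponse:
    """Mirror of the generated model: both fields are required."""
    error: str
    source: str


@dataclass
class PluginImportErrorCollectionResponse:
    """Collection wrapper for the new plugin import-error listing."""
    import_errors: List[PluginImportErrorResponse]
    total_entries: int


collection = PluginImportErrorCollectionResponse(
    import_errors=[
        PluginImportErrorResponse(
            error="ModuleNotFoundError: No module named 'missing_dep'",
            source="plugins/my_plugin.py",
        ),
    ],
    total_entries=1,
)
```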
diff --git a/test/test_plugin_response.py b/test/test_plugin_response.py
index 0953bef..2a69944 100644
--- a/test/test_plugin_response.py
+++ b/test/test_plugin_response.py
@@ -48,6 +48,16 @@
                         name = '', 
                         view = '', )
                     ],
+                external_views = [
+                    airflow_client.client.models.external_view_response.ExternalViewResponse(
+                        category = '', 
+                        destination = 'nav', 
+                        href = '', 
+                        icon = '', 
+                        icon_dark_mode = '', 
+                        name = '', 
+                        url_route = '', )
+                    ],
                 fastapi_apps = [
                     airflow_client.client.models.fast_api_app_response.FastAPIAppResponse(
                         app = '', 
@@ -75,6 +85,16 @@
                 operator_extra_links = [
                     ''
                     ],
+                react_apps = [
+                    airflow_client.client.models.react_app_response.ReactAppResponse(
+                        bundle_url = '', 
+                        category = '', 
+                        destination = 'nav', 
+                        icon = '', 
+                        icon_dark_mode = '', 
+                        name = '', 
+                        url_route = '', )
+                    ],
                 source = '',
                 timetables = [
                     ''
@@ -95,6 +115,16 @@
                         name = '', 
                         view = '', )
                     ],
+                external_views = [
+                    airflow_client.client.models.external_view_response.ExternalViewResponse(
+                        category = '', 
+                        destination = 'nav', 
+                        href = '', 
+                        icon = '', 
+                        icon_dark_mode = '', 
+                        name = '', 
+                        url_route = '', )
+                    ],
                 fastapi_apps = [
                     airflow_client.client.models.fast_api_app_response.FastAPIAppResponse(
                         app = '', 
@@ -122,6 +152,16 @@
                 operator_extra_links = [
                     ''
                     ],
+                react_apps = [
+                    airflow_client.client.models.react_app_response.ReactAppResponse(
+                        bundle_url = '', 
+                        category = '', 
+                        destination = 'nav', 
+                        icon = '', 
+                        icon_dark_mode = '', 
+                        name = '', 
+                        url_route = '', )
+                    ],
                 source = '',
                 timetables = [
                     ''
diff --git a/test/test_queued_event_collection_response.py b/test/test_queued_event_collection_response.py
index 0eedc40..5628635 100644
--- a/test/test_queued_event_collection_response.py
+++ b/test/test_queued_event_collection_response.py
@@ -39,6 +39,7 @@
                     airflow_client.client.models.queued_event_response.QueuedEventResponse(
                         asset_id = 56, 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', )
                     ],
                 total_entries = 56
@@ -49,6 +50,7 @@
                     airflow_client.client.models.queued_event_response.QueuedEventResponse(
                         asset_id = 56, 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', )
                     ],
                 total_entries = 56,
diff --git a/test/test_queued_event_response.py b/test/test_queued_event_response.py
index 9646d64..a2bf766 100644
--- a/test/test_queued_event_response.py
+++ b/test/test_queued_event_response.py
@@ -37,12 +37,14 @@
             return QueuedEventResponse(
                 asset_id = 56,
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = ''
             )
         else:
             return QueuedEventResponse(
                 asset_id = 56,
                 created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_display_name = '',
                 dag_id = '',
         )
         """
diff --git a/test/test_react_app_response.py b/test/test_react_app_response.py
new file mode 100644
index 0000000..424db03
--- /dev/null
+++ b/test/test_react_app_response.py
@@ -0,0 +1,59 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.react_app_response import ReactAppResponse
+
+class TestReactAppResponse(unittest.TestCase):
+    """ReactAppResponse unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> ReactAppResponse:
+        """Test ReactAppResponse
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `ReactAppResponse`
+        """
+        model = ReactAppResponse()
+        if include_optional:
+            return ReactAppResponse(
+                bundle_url = '',
+                category = '',
+                destination = 'nav',
+                icon = '',
+                icon_dark_mode = '',
+                name = '',
+                url_route = ''
+            )
+        else:
+            return ReactAppResponse(
+                bundle_url = '',
+                name = '',
+        )
+        """
+
+    def testReactAppResponse(self):
+        """Test ReactAppResponse"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
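Per the stub's two branches, `ReactAppResponse` requires only `bundle_url` and `name`. A dataclass sketch of that split; note that `'nav'` is the stub's sample value for `destination` (treating it as a default here is an assumption, and the instance values are illustrative):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReactAppResponse:
    """Mirror of the generated model's required/optional field split."""
    bundle_url: str          # required
    name: str                # required
    category: Optional[str] = None
    destination: str = "nav"  # assumed default, taken from the stub's sample value
    icon: Optional[str] = None
    icon_dark_mode: Optional[str] = None
    url_route: Optional[str] = None


# Required-only construction, as in the include_optional=False branch.
app = ReactAppResponse(bundle_url="/static/my_app.js", name="my_react_app")
```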
diff --git a/test/test_response_clear_dag_run.py b/test/test_response_clear_dag_run.py
index bb1db01..c7f50f8 100644
--- a/test/test_response_clear_dag_run.py
+++ b/test/test_response_clear_dag_run.py
@@ -37,6 +37,7 @@
             return ResponseClearDagRun(
                 task_instances = [
                     airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -44,6 +45,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -58,6 +60,7 @@
                         max_tries = 56, 
                         note = '', 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
@@ -79,6 +82,7 @@
                             kwargs = '', 
                             triggerer_id = 56, ), 
                         triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                            dag_display_name = '', 
                             dag_id = '', 
                             end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                             executor_class = '', 
@@ -94,6 +98,7 @@
                 total_entries = 56,
                 bundle_version = '',
                 conf = airflow_client.client.models.extra.extra(),
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_versions = [
@@ -102,12 +107,14 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
                     ],
                 data_interval_end = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 data_interval_start = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                duration = 1.337,
                 end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 last_scheduling_decision = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
@@ -117,12 +124,14 @@
                 run_type = 'backfill',
                 start_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 state = 'queued',
-                triggered_by = 'cli'
+                triggered_by = 'cli',
+                triggering_user_name = ''
             )
         else:
             return ResponseClearDagRun(
                 task_instances = [
                     airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -130,6 +139,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -144,6 +154,7 @@
                         max_tries = 56, 
                         note = '', 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
@@ -165,6 +176,7 @@
                             kwargs = '', 
                             triggerer_id = 56, ), 
                         triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                            dag_display_name = '', 
                             dag_id = '', 
                             end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                             executor_class = '', 
@@ -178,6 +190,7 @@
                         unixname = '', )
                     ],
                 total_entries = 56,
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_versions = [
@@ -186,6 +199,7 @@
                         bundle_url = '', 
                         bundle_version = '', 
                         created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                        dag_display_name = '', 
                         dag_id = '', 
                         id = '', 
                         version_number = 56, )
diff --git a/test/test_response_get_xcom_entry.py b/test/test_response_get_xcom_entry.py
index 2683c87..817cb10 100644
--- a/test/test_response_get_xcom_entry.py
+++ b/test/test_response_get_xcom_entry.py
@@ -35,21 +35,25 @@
         model = ResponseGetXcomEntry()
         if include_optional:
             return ResponseGetXcomEntry(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 value = ''
             )
         else:
             return ResponseGetXcomEntry(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 value = '',
diff --git a/test/test_task_inlet_asset_reference.py b/test/test_task_inlet_asset_reference.py
new file mode 100644
index 0000000..fbbc01f
--- /dev/null
+++ b/test/test_task_inlet_asset_reference.py
@@ -0,0 +1,58 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.task_inlet_asset_reference import TaskInletAssetReference
+
+class TestTaskInletAssetReference(unittest.TestCase):
+    """TaskInletAssetReference unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> TaskInletAssetReference:
+        """Test TaskInletAssetReference
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `TaskInletAssetReference`
+        """
+        model = TaskInletAssetReference()
+        if include_optional:
+            return TaskInletAssetReference(
+                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_id = '',
+                task_id = '',
+                updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f')
+            )
+        else:
+            return TaskInletAssetReference(
+                created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                dag_id = '',
+                task_id = '',
+                updated_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+        )
+        """
+
+    def testTaskInletAssetReference(self):
+        """Test TaskInletAssetReference"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
diff --git a/test/test_task_instance_api.py b/test/test_task_instance_api.py
index 7b8a9b8..dd701d9 100644
--- a/test/test_task_instance_api.py
+++ b/test/test_task_instance_api.py
@@ -26,6 +26,27 @@
     def tearDown(self) -> None:
         pass
 
+    def test_bulk_task_instances(self) -> None:
+        """Test case for bulk_task_instances
+
+        Bulk Task Instances
+        """
+        pass
+
+    def test_delete_task_instance(self) -> None:
+        """Test case for delete_task_instance
+
+        Delete Task Instance
+        """
+        pass
+
+    def test_get_external_log_url(self) -> None:
+        """Test case for get_external_log_url
+
+        Get External Log Url
+        """
+        pass
+
     def test_get_extra_links(self) -> None:
         """Test case for get_extra_links
 
@@ -33,6 +54,20 @@
         """
         pass
 
+    def test_get_hitl_detail(self) -> None:
+        """Test case for get_hitl_detail
+
+        Get Hitl Detail
+        """
+        pass
+
+    def test_get_hitl_details(self) -> None:
+        """Test case for get_hitl_details
+
+        Get Hitl Details
+        """
+        pass
+
     def test_get_log(self) -> None:
         """Test case for get_log
 
@@ -152,6 +187,13 @@
         """
         pass
 
+    def test_update_hitl_detail(self) -> None:
+        """Test case for update_hitl_detail
+
+        Update Hitl Detail
+        """
+        pass
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/test/test_task_instance_collection_response.py b/test/test_task_instance_collection_response.py
index a3b5e59..572a4ac 100644
--- a/test/test_task_instance_collection_response.py
+++ b/test/test_task_instance_collection_response.py
@@ -37,6 +37,7 @@
             return TaskInstanceCollectionResponse(
                 task_instances = [
                     airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -44,6 +45,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -58,6 +60,7 @@
                         max_tries = 56, 
                         note = '', 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
@@ -79,6 +82,7 @@
                             kwargs = '', 
                             triggerer_id = 56, ), 
                         triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                            dag_display_name = '', 
                             dag_id = '', 
                             end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                             executor_class = '', 
@@ -97,6 +101,7 @@
             return TaskInstanceCollectionResponse(
                 task_instances = [
                     airflow_client.client.models.task_instance_response.TaskInstanceResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -104,6 +109,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -118,6 +124,7 @@
                         max_tries = 56, 
                         note = '', 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
@@ -139,6 +146,7 @@
                             kwargs = '', 
                             triggerer_id = 56, ), 
                         triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                            dag_display_name = '', 
                             dag_id = '', 
                             end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                             executor_class = '', 
diff --git a/test/test_task_instance_history_collection_response.py b/test/test_task_instance_history_collection_response.py
index bb90858..f67fcfa 100644
--- a/test/test_task_instance_history_collection_response.py
+++ b/test/test_task_instance_history_collection_response.py
@@ -37,6 +37,7 @@
             return TaskInstanceHistoryCollectionResponse(
                 task_instances = [
                     airflow_client.client.models.task_instance_history_response.TaskInstanceHistoryResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -44,6 +45,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -55,6 +57,7 @@
                         map_index = 56, 
                         max_tries = 56, 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
@@ -75,6 +78,7 @@
             return TaskInstanceHistoryCollectionResponse(
                 task_instances = [
                     airflow_client.client.models.task_instance_history_response.TaskInstanceHistoryResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         dag_run_id = '', 
                         dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -82,6 +86,7 @@
                             bundle_url = '', 
                             bundle_version = '', 
                             created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                            dag_display_name = '', 
                             dag_id = '', 
                             id = '', 
                             version_number = 56, ), 
@@ -93,6 +98,7 @@
                         map_index = 56, 
                         max_tries = 56, 
                         operator = '', 
+                        operator_name = '', 
                         pid = 56, 
                         pool = '', 
                         pool_slots = 56, 
diff --git a/test/test_task_instance_history_response.py b/test/test_task_instance_history_response.py
index 2d42523..6c1bfd0 100644
--- a/test/test_task_instance_history_response.py
+++ b/test/test_task_instance_history_response.py
@@ -35,6 +35,7 @@
         model = TaskInstanceHistoryResponse()
         if include_optional:
             return TaskInstanceHistoryResponse(
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -42,6 +43,7 @@
                     bundle_url = '', 
                     bundle_version = '', 
                     created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    dag_display_name = '', 
                     dag_id = '', 
                     id = '', 
                     version_number = 56, ),
@@ -53,6 +55,7 @@
                 map_index = 56,
                 max_tries = 56,
                 operator = '',
+                operator_name = '',
                 pid = 56,
                 pool = '',
                 pool_slots = 56,
@@ -69,6 +72,7 @@
             )
         else:
             return TaskInstanceHistoryResponse(
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 executor_config = '',
diff --git a/test/test_task_instance_response.py b/test/test_task_instance_response.py
index dab1085..f69d77f 100644
--- a/test/test_task_instance_response.py
+++ b/test/test_task_instance_response.py
@@ -35,6 +35,7 @@
         model = TaskInstanceResponse()
         if include_optional:
             return TaskInstanceResponse(
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 dag_version = airflow_client.client.models.dag_version_response.DagVersionResponse(
@@ -42,6 +43,7 @@
                     bundle_url = '', 
                     bundle_version = '', 
                     created_at = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
+                    dag_display_name = '', 
                     dag_id = '', 
                     id = '', 
                     version_number = 56, ),
@@ -56,6 +58,7 @@
                 max_tries = 56,
                 note = '',
                 operator = '',
+                operator_name = '',
                 pid = 56,
                 pool = '',
                 pool_slots = 56,
@@ -77,6 +80,7 @@
                     kwargs = '', 
                     triggerer_id = 56, ),
                 triggerer_job = airflow_client.client.models.job_response.JobResponse(
+                    dag_display_name = '', 
                     dag_id = '', 
                     end_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                     executor_class = '', 
@@ -92,6 +96,7 @@
             )
         else:
             return TaskInstanceResponse(
+                dag_display_name = '',
                 dag_id = '',
                 dag_run_id = '',
                 executor_config = '',
diff --git a/test/test_task_instances_batch_body.py b/test/test_task_instances_batch_body.py
index 29992d3..754a18b 100644
--- a/test/test_task_instances_batch_body.py
+++ b/test/test_task_instances_batch_body.py
@@ -41,14 +41,20 @@
                 dag_run_ids = [
                     ''
                     ],
+                duration_gt = 1.337,
                 duration_gte = 1.337,
+                duration_lt = 1.337,
                 duration_lte = 1.337,
+                end_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 end_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                end_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 end_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 executor = [
                     ''
                     ],
+                logical_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                logical_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 logical_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 order_by = '',
                 page_limit = 0.0,
@@ -59,9 +65,13 @@
                 queue = [
                     ''
                     ],
+                run_after_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 run_after_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                run_after_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 run_after_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                start_date_gt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 start_date_gte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
+                start_date_lt = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 start_date_lte = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 state = [
                     'removed'
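The batch-body diff above adds strict `_gt`/`_lt` bounds alongside the existing inclusive `_gte`/`_lte` ones. A minimal stdlib sketch of the interval semantics these filter pairs imply (field names mirror the generated `TaskInstancesBatchBody` stub; the `matches` helper is purely illustrative, not client code):

```python
from datetime import datetime, timedelta

cutoff = datetime(2013, 10, 20, 19, 20, 30)

# Mix an inclusive lower bound with the new strict upper bound.
batch_filter = {
    "start_date_gte": cutoff,                       # start == cutoff matches
    "end_date_lt": cutoff + timedelta(hours=1),     # end == cutoff+1h does NOT
}

def matches(ti_start, ti_end, f):
    """Interval check a server would apply for these filter fields."""
    return ti_start >= f["start_date_gte"] and ti_end < f["end_date_lt"]

print(matches(cutoff, cutoff + timedelta(minutes=30), batch_filter))  # True
print(matches(cutoff, cutoff + timedelta(hours=1), batch_filter))     # False
```

Combining both variants this way lets callers express half-open windows, which the older `_gte`/`_lte`-only body could not.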
diff --git a/test/test_trigger_dag_run_post_body.py b/test/test_trigger_dag_run_post_body.py
index 01de078..bfedbd5 100644
--- a/test/test_trigger_dag_run_post_body.py
+++ b/test/test_trigger_dag_run_post_body.py
@@ -35,7 +35,7 @@
         model = TriggerDAGRunPostBody()
         if include_optional:
             return TriggerDAGRunPostBody(
-                conf = airflow_client.client.models.conf.Conf(),
+                conf = airflow_client.client.models.extra.extra(),
                 dag_run_id = '',
                 data_interval_end = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 data_interval_start = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
diff --git a/test/test_update_hitl_detail_payload.py b/test/test_update_hitl_detail_payload.py
new file mode 100644
index 0000000..4314f4b
--- /dev/null
+++ b/test/test_update_hitl_detail_payload.py
@@ -0,0 +1,57 @@
+# coding: utf-8
+
+"""
+    Airflow API
+
+    Airflow API. All endpoints located under ``/api/v2`` can be used safely, are stable and backward compatible. Endpoints located under ``/ui`` are dedicated to the UI and are subject to breaking change depending on the need of the frontend. Users should not rely on those but use the public ones instead.
+
+    The version of the OpenAPI document: 2
+    Generated by OpenAPI Generator (https://openapi-generator.tech)
+
+    Do not edit the class manually.
+"""  # noqa: E501
+
+
+import unittest
+
+from airflow_client.client.models.update_hitl_detail_payload import UpdateHITLDetailPayload
+
+class TestUpdateHITLDetailPayload(unittest.TestCase):
+    """UpdateHITLDetailPayload unit test stubs"""
+
+    def setUp(self):
+        pass
+
+    def tearDown(self):
+        pass
+
+    def make_instance(self, include_optional) -> UpdateHITLDetailPayload:
+        """Test UpdateHITLDetailPayload
+            include_optional is a boolean, when False only required
+            params are included, when True both required and
+            optional params are included """
+        # uncomment below to create an instance of `UpdateHITLDetailPayload`
+        """
+        model = UpdateHITLDetailPayload()
+        if include_optional:
+            return UpdateHITLDetailPayload(
+                chosen_options = [
+                    ''
+                    ],
+                params_input = airflow_client.client.models.params_input.ParamsInput()
+            )
+        else:
+            return UpdateHITLDetailPayload(
+                chosen_options = [
+                    ''
+                    ],
+        )
+        """
+
+    def testUpdateHITLDetailPayload(self):
+        """Test UpdateHITLDetailPayload"""
+        # inst_req_only = self.make_instance(include_optional=False)
+        # inst_req_and_optional = self.make_instance(include_optional=True)
+
+if __name__ == '__main__':
+    unittest.main()
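The stub above shows the two fields of `UpdateHITLDetailPayload`: a required `chosen_options` list and an optional `params_input`. A minimal sketch of the JSON body such a payload serializes to, assuming `params_input` is a free-form mapping (the concrete values are illustrative only):

```python
import json

# Hypothetical HITL approval body matching the fields in the stub above.
payload = {
    "chosen_options": ["approve"],          # required
    "params_input": {"reason": "looks good"},  # assumed free-form mapping
}

body = json.dumps(payload)
print(body)
```

The generated model performs the same serialization via its `to_json()` method when used through the client.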
diff --git a/test/test_x_com_collection_response.py b/test/test_x_com_collection_response.py
index c61d47a..dec1460 100644
--- a/test/test_x_com_collection_response.py
+++ b/test/test_x_com_collection_response.py
@@ -38,11 +38,13 @@
                 total_entries = 56,
                 xcom_entries = [
                     airflow_client.client.models.x_com_response.XComResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         key = '', 
                         logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         map_index = 56, 
                         run_id = '', 
+                        task_display_name = '', 
                         task_id = '', 
                         timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ]
@@ -52,11 +54,13 @@
                 total_entries = 56,
                 xcom_entries = [
                     airflow_client.client.models.x_com_response.XComResponse(
+                        dag_display_name = '', 
                         dag_id = '', 
                         key = '', 
                         logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), 
                         map_index = 56, 
                         run_id = '', 
+                        task_display_name = '', 
                         task_id = '', 
                         timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'), )
                     ],
diff --git a/test/test_x_com_response.py b/test/test_x_com_response.py
index af5b24a..01c967d 100644
--- a/test/test_x_com_response.py
+++ b/test/test_x_com_response.py
@@ -35,20 +35,24 @@
         model = XComResponse()
         if include_optional:
             return XComResponse(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f')
             )
         else:
             return XComResponse(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
         )
diff --git a/test/test_x_com_response_native.py b/test/test_x_com_response_native.py
index 3170519..8c202e8 100644
--- a/test/test_x_com_response_native.py
+++ b/test/test_x_com_response_native.py
@@ -35,21 +35,25 @@
         model = XComResponseNative()
         if include_optional:
             return XComResponseNative(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 value = None
             )
         else:
             return XComResponseNative(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 value = None,
diff --git a/test/test_x_com_response_string.py b/test/test_x_com_response_string.py
index bcb2361..850a746 100644
--- a/test/test_x_com_response_string.py
+++ b/test/test_x_com_response_string.py
@@ -35,21 +35,25 @@
         model = XComResponseString()
         if include_optional:
             return XComResponseString(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 logical_date = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
                 value = ''
             )
         else:
             return XComResponseString(
+                dag_display_name = '',
                 dag_id = '',
                 key = '',
                 map_index = 56,
                 run_id = '',
+                task_display_name = '',
                 task_id = '',
                 timestamp = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
         )
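The XCom response models above now carry `dag_display_name` and `task_display_name` alongside the raw IDs. A short sketch of how a consumer might use the new fields for UI-style labeling (the sample entries are invented, not real API output):

```python
# Response-shaped dicts with the new display-name fields (illustrative).
entries = [
    {"dag_id": "etl_dag", "dag_display_name": "ETL Pipeline",
     "task_id": "load", "task_display_name": "Load Warehouse", "key": "rows"},
    {"dag_id": "etl_dag", "dag_display_name": "ETL Pipeline",
     "task_id": "extract", "task_display_name": "Extract Source", "key": "rows"},
]

# Map stable task IDs to human-readable labels for rendering.
labels = {e["task_id"]: e["task_display_name"] for e in entries}
print(labels["load"])  # Load Warehouse
```

Keeping the raw `task_id` as the key while displaying `task_display_name` preserves stable identifiers for follow-up API calls.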
diff --git a/test_python_client.py b/test_python_client.py
index bf04d68..9d77142 100644
--- a/test_python_client.py
+++ b/test_python_client.py
@@ -17,10 +17,10 @@
 #
 # PEP 723 compliant inline script metadata (not yet widely supported)
 # /// script
-# requires-python = ">=3.9"
+# requires-python = ">=3.10"
 # dependencies = [
 #   "apache-airflow-client",
-#   "rich",
+#   "rich>=13.6.0",
 # ]
 # ///
 
diff --git a/version.txt b/version.txt
index b502146..a0cd9f0 100644
--- a/version.txt
+++ b/version.txt
@@ -1 +1 @@
-3.0.2
+3.1.0
\ No newline at end of file
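The `test_python_client.py` hunk above bumps the script's PEP 723 inline metadata (`requires-python`, pinned `rich`). A minimal stdlib sketch of locating such a `# /// script` block and recovering its TOML body; this is a simplified regex, not the full grammar from the PEP:

```python
import re

# Script text with an inline metadata block like the one updated above.
script = """\
# /// script
# requires-python = ">=3.10"
# dependencies = [
#   "apache-airflow-client",
#   "rich>=13.6.0",
# ]
# ///
print("hello")
"""

# Lazily capture comment lines between the opening and closing fences.
match = re.search(r"^# /// script\n(?P<body>(?:^#.*\n)+?)^# ///$",
                  script, re.MULTILINE)

# Strip the leading comment markers to recover the embedded TOML.
toml_body = "\n".join(line[2:] if line.startswith("# ") else line[1:]
                      for line in match.group("body").splitlines())
print(toml_body)
```

Tools such as `uv` and `pipx` read this block to resolve dependencies before running the script, which is why the pins in the diff matter even though the file is otherwise plain Python.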