This crate includes several examples that show how to use various DataFusion APIs and help you on your way.
Prerequisites:
Run `git submodule update --init` to initialize the test files.
Examples:

- `avro_sql.rs`: Build and run a query plan from a SQL statement against a local AVRO file
- `csv_sql.rs`: Build and run a query plan from a SQL statement against a local CSV file (see the SQL sketch after this list)
- `custom_datasource.rs`: Run queries against a custom datasource (TableProvider)
- `dataframe.rs`: Run a query using a DataFrame against a local Parquet file (see the DataFrame sketch after this list)
- `dataframe_in_memory.rs`: Run a query using a DataFrame against data in memory
- `deserialize_to_struct.rs`: Convert query results into Rust structs using serde
- `expr_api.rs`: Use the Expr construction and simplification API
- `flight_sql_server.rs`: Run DataFusion as a standalone process and execute SQL queries from JDBC clients
- `memtable.rs`: Create and query data in memory using SQL and RecordBatches
- `parquet_sql.rs`: Build and run a query plan from a SQL statement against a local Parquet file
- `parquet_sql_multiple_files.rs`: Build and run a query plan from a SQL statement against multiple local Parquet files
- `query-aws-s3.rs`: Configure object_store and run a query against files stored in AWS S3
- `rewrite_expr.rs`: Define and invoke a custom Query Optimizer pass
- `simple_udaf.rs`: Define and invoke a User Defined Aggregate Function (UDAF)
- `simple_udf.rs`: Define and invoke a User Defined (scalar) Function (UDF)
- `flight_client.rs` and `flight_server.rs`: Run DataFusion as a standalone process and execute SQL queries from a client using the Flight protocol
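To give a feel for the SQL-oriented examples such as `csv_sql.rs`, here is a minimal sketch of registering a local CSV file and running a SQL query against it. The file path, table name, and column names are illustrative assumptions, not taken from the example itself:

```rust
use datafusion::prelude::*;

#[tokio::main]
async fn main() -> datafusion::error::Result<()> {
    // Create an execution context.
    let ctx = SessionContext::new();

    // Register a local CSV file as a table (path is a placeholder).
    ctx.register_csv("example", "data/example.csv", CsvReadOptions::new())
        .await?;

    // Build and run a query plan from a SQL statement.
    let df = ctx
        .sql("SELECT a, MIN(b) FROM example WHERE a <= b GROUP BY a LIMIT 100")
        .await?;

    // Print the results to stdout.
    df.show().await?;
    Ok(())
}
```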
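Similarly, a minimal sketch of the DataFrame API used in examples like `dataframe.rs`, assuming a hypothetical local Parquet file; the path and column names are placeholders:

```rust
use datafusion::prelude::*;

#[tokio::main]
async fn main() -> datafusion::error::Result<()> {
    // Create an execution context and read a local Parquet file (path is a placeholder).
    let ctx = SessionContext::new();
    let df = ctx
        .read_parquet("data/example.parquet", ParquetReadOptions::default())
        .await?;

    // Express the query with DataFrame methods instead of SQL
    // (the "id" and "value" columns are assumptions).
    let df = df
        .filter(col("id").gt(lit(1)))?
        .select_columns(&["id", "value"])?;

    // Print the results to stdout.
    df.show().await?;
    Ok(())
}
```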