A native Rust library for Apache Hudi, with bindings into Python


The hudi-rs project aims to broaden the use of Apache Hudi for a diverse range of users and projects.

| Source    | Installation Command |
|-----------|----------------------|
| PyPI      | pip install hudi     |
| Crates.io | cargo add hudi       |

Example usage

[!NOTE] These examples expect a Hudi table to exist at /tmp/trips_table, created using the quick start guide.

Python

Read a Hudi table into a PyArrow table.

import pyarrow as pa
import pyarrow.compute as pc

from hudi import HudiTable

# Load the table at its base path and read the latest snapshot
# as a list of Arrow record batches.
hudi_table = HudiTable("/tmp/trips_table")
records = hudi_table.read_snapshot()

# Assemble the batches into a single PyArrow table, then project and filter.
arrow_table = pa.Table.from_batches(records)
result = arrow_table.select(
    ["rider", "ts", "fare"]).filter(
    pc.field("fare") > 20.0)
print(result)
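
To persist the filtered result, you can hand it to PyArrow's Parquet writer. This is a minimal sketch; the output path is illustrative.

import pyarrow.parquet as pq

# Write the filtered PyArrow table out as a Parquet file (path is an example).
pq.write_table(result, "/tmp/trips_fare_over_20.parquet")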

Rust

cargo new my_project --bin && cd my_project
cargo add tokio@1 datafusion@39
cargo add hudi --features datafusion

Update src/main.rs with the code snippet below, then run cargo run.

use std::sync::Arc;

use datafusion::error::Result;
use datafusion::prelude::{DataFrame, SessionContext};
use hudi::HudiDataSource;

#[tokio::main]
async fn main() -> Result<()> {
    let ctx = SessionContext::new();
    // Point the data source at the table's base path, then register it
    // with the session so it can be queried by name.
    let hudi = HudiDataSource::new("/tmp/trips_table").await?;
    ctx.register_table("trips_table", Arc::new(hudi))?;
    let df: DataFrame = ctx.sql("SELECT * FROM trips_table WHERE fare > 20.0").await?;
    df.show().await?;
    Ok(())
}

Work with cloud storage

Ensure cloud storage credentials are set as environment variables, e.g., AWS_*, AZURE_*, or GOOGLE_*. The relevant variables are picked up automatically, and a table base URI with a scheme such as s3://, az://, or gs:// is resolved against the matching storage backend.
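
A minimal Python sketch, assuming AWS credentials are already exported in the environment; the bucket name and table path below are hypothetical.

import os
from hudi import HudiTable

# Standard AWS variables are read from the environment, e.g.:
#   AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION
assert "AWS_ACCESS_KEY_ID" in os.environ

# s3://my-bucket/trips_table is a placeholder; point this at your own table.
hudi_table = HudiTable("s3://my-bucket/trips_table")
records = hudi_table.read_snapshot()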

Contributing

Check out the contributing guide for all the details about making contributions to the project.