For example, if you're using the Hudi quick-start guide for Spark, you can simply add `--jars hudi-extensions-0.1.0-SNAPSHOT-bundled.jar` to the end of the command.
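As a minimal illustration of wiring these extensions into a Spark job, the sketch below collects the writer options described in this section into a plain dictionary. The extension class names are taken from this guide; the table name and save path are hypothetical, and the commented-out write call assumes a standard PySpark `DataFrameWriter`:

```python
# Hudi writer options that enable the OneTable field-ID extensions.
# The class names come from the hudi-extensions bundle described above;
# the table name and base path are hypothetical placeholders.
hudi_options = {
    "hoodie.table.name": "my_table",  # hypothetical
    "hoodie.avro.write.support.class":
        "io.onetable.hudi.extensions.HoodieAvroWriteSupportWithFieldIds",
    "hoodie.client.init.callback.classes":
        "io.onetable.hudi.extensions.AddFieldIdsClientInitCallback",
    # The row-writer path must stay disabled until RowWriter support lands.
    "hoodie.datasource.write.row.writer.enable": "false",
}

# In a real PySpark job these options would be passed to the writer, e.g.:
# df.write.format("hudi").options(**hudi_options).mode("append").save("/tmp/my_table")
for key, value in sorted(hudi_options.items()):
    print(f"{key}={value}")
```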
Then set the following configurations in your writer options:

```
hoodie.avro.write.support.class: io.onetable.hudi.extensions.HoodieAvroWriteSupportWithFieldIds
hoodie.client.init.callback.classes: io.onetable.hudi.extensions.AddFieldIdsClientInitCallback
hoodie.datasource.write.row.writer.enable: false
```

Note that `hoodie.datasource.write.row.writer.enable` must remain `false` for now (RowWriter support is coming soon).

If you want to use OneTable with Hudi streaming ingestion to sync each commit into other table formats:
1. Add the extensions jar (`hudi-extensions-0.1.0-SNAPSHOT-bundled.jar`) to your class path.
2. Add `io.onetable.hudi.sync.OneTableSyncTool` to your list of sync classes.
3. Set the following configurations based on your requirements:

```
hoodie.onetable.formats: "ICEBERG,DELTA"
hoodie.onetable.target.metadata.retention.hr: 168
```

`hoodie.onetable.formats` takes a comma-separated list of target formats (or simply a single format), and `hoodie.onetable.target.metadata.retention.hr` controls how long target-format metadata is retained (the default is 168 hours).
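Putting the streaming-ingestion settings together, the sketch below assembles the sync configuration as a dictionary. Only the config keys and the 168-hour default come from this guide; the helper function itself is a hypothetical convenience, not part of OneTable's API:

```python
# Build the OneTable sync configuration for Hudi streaming ingestion.
# The keys and the 168-hour retention default come from the docs above;
# the helper function is a hypothetical convenience wrapper.
def onetable_sync_options(formats, retention_hours=168):
    if not formats:
        raise ValueError("at least one target format is required")
    return {
        # io.onetable.hudi.sync.OneTableSyncTool must also be registered
        # in your list of sync classes for these options to take effect.
        "hoodie.onetable.formats": ",".join(formats),
        "hoodie.onetable.target.metadata.retention.hr": str(retention_hours),
    }

options = onetable_sync_options(["ICEBERG", "DELTA"])
print(options["hoodie.onetable.formats"])  # ICEBERG,DELTA
```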