Spark-Pinot connector for reading data from and writing data to Apache Pinot.
For detailed read model documentation, see the Spark-Pinot Connector Read Model document.
```scala
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession
  .builder()
  .appName("spark-pinot-connector-test")
  .master("local")
  .getOrCreate()

import spark.implicits._

val data = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "offline")
  .load()
  .filter($"DestStateName" === "Florida")

data.show(100)
```
For more examples, see `src/test/scala/example/ExampleSparkPinotConnectorTest.scala`.
The Spark-Pinot connector is built on the Spark DataSourceV2 API. For background on DataSourceV2, see the Databricks presentation on the DataSourceV2 API.
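One practical benefit of DataSourceV2 is operator pushdown: when you `select` a subset of columns or `filter` rows, the connector can push that work down into Pinot instead of transferring the whole table to Spark. The sketch below illustrates this with the same `airlineStats` table and options used above; the column names (`Origin`, `Dest`, `AirTime`) are assumptions for illustration, and it requires a running Spark session and a reachable Pinot cluster.

```scala
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession
  .builder()
  .appName("spark-pinot-pushdown-example")
  .master("local")
  .getOrCreate()

import spark.implicits._

// select() prunes columns and filter() constrains rows; with DataSourceV2
// the connector can translate both into the Pinot-side scan, so only the
// needed columns and matching rows travel over the network.
val longFlights = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "offline")
  .load()
  .select($"Origin", $"Dest", $"AirTime") // hypothetical columns, for illustration
  .filter($"AirTime" > 300)

// Inspecting the physical plan shows what was pushed into the scan.
longFlights.explain()
```

Whether a given filter is actually pushed down depends on the connector's supported filter types; anything it cannot translate is evaluated by Spark after the scan.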