<h1 id="__docusaurus" class="postHeaderTitle">Ingestion spec reference</h1><article><div><span><!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing,
~ software distributed under the License is distributed on an
~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
~ KIND, either express or implied. See the License for the
~ specific language governing permissions and limitations
~ under the License.
-->
<p>All ingestion methods use ingestion tasks to load data into Druid. Streaming ingestion uses ongoing supervisors that run and supervise a set of tasks over time. Native batch and Hadoop-based ingestion use a one-time <a href="/docs/26.0.0/ingestion/tasks.html">task</a>. For all ingestion methods except SQL-based ingestion, you configure ingestion with an <em>ingestion spec</em>.</p>
<p>An ingestion spec consists of three main components:</p>
<ul>
<li><a href="#dataschema"><code>dataSchema</code></a>, which configures the <a href="#datasource">datasource name</a>,
<a href="#timestampspec">primary timestamp</a>, <a href="#dimensionsspec">dimensions</a>, <a href="#metricsspec">metrics</a>, and <a href="#transformspec">transforms and filters</a> (if needed).</li>
<li><a href="#ioconfig"><code>ioConfig</code></a>, which tells Druid how to connect to the source system and how to parse data. For more information, see the
documentation for each <a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a>.</li>
<li><a href="#tuningconfig"><code>tuningConfig</code></a>, which controls various tuning parameters specific to each
<a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a>.</li>
</ul>
<p>Example ingestion spec for task type <code>index_parallel</code> (native batch):</p>
<pre><code class="hljs">{
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"index_parallel"</span>,
<span class="hljs-attr">"spec"</span>: {
<span class="hljs-attr">"dataSchema"</span>: {
<span class="hljs-attr">"dataSource"</span>: <span class="hljs-string">"wikipedia"</span>,
<span class="hljs-attr">"timestampSpec"</span>: {
<span class="hljs-attr">"column"</span>: <span class="hljs-string">"timestamp"</span>,
<span class="hljs-attr">"format"</span>: <span class="hljs-string">"auto"</span>
},
<span class="hljs-attr">"dimensionsSpec"</span>: {
<span class="hljs-attr">"dimensions"</span>: [
<span class="hljs-string">"page"</span>,
<span class="hljs-string">"language"</span>,
{ <span class="hljs-attr">"type"</span>: <span class="hljs-string">"long"</span>, <span class="hljs-attr">"name"</span>: <span class="hljs-string">"userId"</span> }
]
},
<span class="hljs-attr">"metricsSpec"</span>: [
{ <span class="hljs-attr">"type"</span>: <span class="hljs-string">"count"</span>, <span class="hljs-attr">"name"</span>: <span class="hljs-string">"count"</span> },
{ <span class="hljs-attr">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-attr">"name"</span>: <span class="hljs-string">"bytes_added_sum"</span>, <span class="hljs-attr">"fieldName"</span>: <span class="hljs-string">"bytes_added"</span> },
{ <span class="hljs-attr">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-attr">"name"</span>: <span class="hljs-string">"bytes_deleted_sum"</span>, <span class="hljs-attr">"fieldName"</span>: <span class="hljs-string">"bytes_deleted"</span> }
],
<span class="hljs-attr">"granularitySpec"</span>: {
<span class="hljs-attr">"segmentGranularity"</span>: <span class="hljs-string">"day"</span>,
<span class="hljs-attr">"queryGranularity"</span>: <span class="hljs-string">"none"</span>,
<span class="hljs-attr">"intervals"</span>: [
<span class="hljs-string">"2013-08-31/2013-09-01"</span>
]
}
},
<span class="hljs-attr">"ioConfig"</span>: {
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"index_parallel"</span>,
<span class="hljs-attr">"inputSource"</span>: {
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"local"</span>,
<span class="hljs-attr">"baseDir"</span>: <span class="hljs-string">"examples/indexing/"</span>,
<span class="hljs-attr">"filter"</span>: <span class="hljs-string">"wikipedia_data.json"</span>
},
<span class="hljs-attr">"inputFormat"</span>: {
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"json"</span>,
<span class="hljs-attr">"flattenSpec"</span>: {
<span class="hljs-attr">"useFieldDiscovery"</span>: <span class="hljs-literal">true</span>,
<span class="hljs-attr">"fields"</span>: [
{ <span class="hljs-attr">"type"</span>: <span class="hljs-string">"path"</span>, <span class="hljs-attr">"name"</span>: <span class="hljs-string">"userId"</span>, <span class="hljs-attr">"expr"</span>: <span class="hljs-string">"$.user.id"</span> }
]
}
}
},
<span class="hljs-attr">"tuningConfig"</span>: {
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"index_parallel"</span>
}
}
}
</code></pre>
<p>The specific options supported by these sections will depend on the <a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a> you have chosen.
For more examples, refer to the documentation for each ingestion method.</p>
<p>You can also load data visually, without the need to write an ingestion spec, using the &quot;Load data&quot; functionality
available in Druid's <a href="/docs/26.0.0/operations/web-console.html">web console</a>. Druid's visual data loader supports
<a href="/docs/26.0.0/development/extensions-core/kafka-ingestion.html">Kafka</a>,
<a href="/docs/26.0.0/development/extensions-core/kinesis-ingestion.html">Kinesis</a>, and
<a href="/docs/26.0.0/ingestion/native-batch.html">native batch</a> mode.</p>
<h2><a class="anchor" aria-hidden="true" id="dataschema"></a><a href="#dataschema" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>dataSchema</code></h2>
<blockquote>
<p>The <code>dataSchema</code> spec has been changed in 0.17.0. The new spec is supported by all ingestion methods
except for <em>Hadoop</em> ingestion. See the <a href="#legacy-dataschema-spec">Legacy <code>dataSchema</code> spec</a> for the old spec.</p>
</blockquote>
<p>The <code>dataSchema</code> is a holder for the following components:</p>
<ul>
<li><a href="#datasource">datasource name</a></li>
<li><a href="#timestampspec">primary timestamp</a></li>
<li><a href="#dimensionsspec">dimensions</a></li>
<li><a href="#metricsspec">metrics</a></li>
<li><a href="#transformspec">transforms and filters</a> (if needed).</li>
</ul>
<p>An example <code>dataSchema</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"dataSchema"</span>: {
<span class="hljs-string">"dataSource"</span>: <span class="hljs-string">"wikipedia"</span>,
<span class="hljs-string">"timestampSpec"</span>: {
<span class="hljs-string">"column"</span>: <span class="hljs-string">"timestamp"</span>,
<span class="hljs-string">"format"</span>: <span class="hljs-string">"auto"</span>
},
<span class="hljs-string">"dimensionsSpec"</span>: {
<span class="hljs-string">"dimensions"</span>: [
<span class="hljs-string">"page"</span>,
<span class="hljs-string">"language"</span>,
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"long"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"userId"</span> }
]
},
<span class="hljs-string">"metricsSpec"</span>: [
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"count"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"count"</span> },
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"bytes_added_sum"</span>, <span class="hljs-string">"fieldName"</span>: <span class="hljs-string">"bytes_added"</span> },
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"bytes_deleted_sum"</span>, <span class="hljs-string">"fieldName"</span>: <span class="hljs-string">"bytes_deleted"</span> }
],
<span class="hljs-string">"granularitySpec"</span>: {
<span class="hljs-string">"segmentGranularity"</span>: <span class="hljs-string">"day"</span>,
<span class="hljs-string">"queryGranularity"</span>: <span class="hljs-string">"none"</span>,
<span class="hljs-string">"intervals"</span>: [
<span class="hljs-string">"2013-08-31/2013-09-01"</span>
]
}
}
</code></pre>
<h3><a class="anchor" aria-hidden="true" id="datasource"></a><a href="#datasource" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>dataSource</code></h3>
<p>The <code>dataSource</code> is located in <code>dataSchema</code> → <code>dataSource</code> and is simply the name of the
<a href="/docs/26.0.0/design/architecture.html#datasources-and-segments">datasource</a> that data will be written to. An example
<code>dataSource</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"dataSource"</span>: <span class="hljs-string">"my-first-datasource"</span>
</code></pre>
<h3><a class="anchor" aria-hidden="true" id="timestampspec"></a><a href="#timestampspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>timestampSpec</code></h3>
<p>The <code>timestampSpec</code> is located in <code>dataSchema</code> → <code>timestampSpec</code> and is responsible for
configuring the <a href="/docs/26.0.0/ingestion/data-model.html#primary-timestamp">primary timestamp</a>. An example <code>timestampSpec</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"timestampSpec"</span>: {
<span class="hljs-string">"column"</span>: <span class="hljs-string">"timestamp"</span>,
<span class="hljs-string">"format"</span>: <span class="hljs-string">"auto"</span>
}
</code></pre>
<blockquote>
<p>Conceptually, after input data records are read, Druid applies ingestion spec components in a particular order:
first <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a> (if any), then <a href="#timestampspec"><code>timestampSpec</code></a>, then <a href="#transformspec"><code>transformSpec</code></a>,
and finally <a href="#dimensionsspec"><code>dimensionsSpec</code></a> and <a href="#metricsspec"><code>metricsSpec</code></a>. Keep this in mind when writing
your ingestion spec.</p>
</blockquote>
<p>A <code>timestampSpec</code> can have the following components:</p>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td>column</td><td>Input row field to read the primary timestamp from.<br /><br />Regardless of the name of this input field, the primary timestamp will always be stored as a column named <code>__time</code> in your Druid datasource.</td><td>timestamp</td></tr>
<tr><td>format</td><td>Timestamp format. Options are: <ul><li><code>iso</code>: ISO8601 with 'T' separator, like &quot;2000-01-01T01:02:03.456&quot;</li><li><code>posix</code>: seconds since epoch</li><li><code>millis</code>: milliseconds since epoch</li><li><code>micro</code>: microseconds since epoch</li><li><code>nano</code>: nanoseconds since epoch</li><li><code>auto</code>: automatically detects ISO (either 'T' or space separator) or millis format</li><li>any <a href="http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html">Joda DateTimeFormat string</a></li></ul></td><td>auto</td></tr>
<tr><td>missingValue</td><td>Timestamp to use for input records that have a null or missing timestamp <code>column</code>. Should be in ISO8601 format, like <code>&quot;2000-01-01T01:02:03.456&quot;</code>, even if you have specified something else for <code>format</code>. Since Druid requires a primary timestamp, this setting can be useful for ingesting datasets that do not have any per-record timestamps at all.</td><td>none</td></tr>
</tbody>
</table>
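<p>For illustration, here is a sketch of a <code>timestampSpec</code> that combines the fields above. It assumes a hypothetical input column named <code>visit_time</code> whose values look like <code>2013-08-31 01:02:33</code>; the Joda format string and the fallback <code>missingValue</code> are examples, not requirements:</p>
<pre><code class="hljs css language-json">"timestampSpec": {
  "column": "visit_time",
  "format": "yyyy-MM-dd HH:mm:ss",
  "missingValue": "2013-08-31T00:00:00.000Z"
}
</code></pre>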
<p>You can use the timestamp in an expression as <code>__time</code> because Druid parses the <code>timestampSpec</code> before applying <a href="#transforms">transforms</a>. You can also set the expression <code>name</code> to <code>__time</code> to replace the value of the timestamp.</p>
<p>Treat <code>__time</code> as a millisecond timestamp: the number of milliseconds since Jan 1, 1970 at midnight UTC.</p>
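<p>As a concrete sketch of the behavior described above, the following hypothetical <code>transformSpec</code> excerpt reads the parsed timestamp as <code>__time</code> and, because the transform is named <code>__time</code>, replaces the primary timestamp with an hour-truncated value. The use of <code>timestamp_floor</code> here is just one possible expression:</p>
<pre><code class="hljs css language-json">"transformSpec": {
  "transforms": [
    { "type": "expression", "name": "__time", "expression": "timestamp_floor(__time, 'PT1H')" }
  ]
}
</code></pre>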
<h3><a class="anchor" aria-hidden="true" id="dimensionsspec"></a><a href="#dimensionsspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>dimensionsSpec</code></h3>
<p>The <code>dimensionsSpec</code> is located in <code>dataSchema</code> → <code>dimensionsSpec</code> and is responsible for
configuring <a href="/docs/26.0.0/ingestion/data-model.html#dimensions">dimensions</a>.</p>
<p>You can either manually specify the dimensions or take advantage of schema auto-discovery where you allow Druid to infer all or some of the schema for your data. This means that you don't have to explicitly specify your dimensions and their type.</p>
<p>To use schema auto-discovery, set <code>useSchemaDiscovery</code> to <code>true</code>.</p>
<p>Alternatively, you can use string-based schemaless ingestion, where any discovered dimensions are treated as strings. To do so, leave <code>useSchemaDiscovery</code> set to <code>false</code> (the default), and then either leave the dimensions list empty or set the <code>includeAllDimensions</code> property to <code>true</code>.</p>
<p>The following <code>dimensionsSpec</code> example uses schema auto-discovery (<code>&quot;useSchemaDiscovery&quot;: true</code>) in conjunction with explicitly defined dimensions to have Druid infer some of the schema for the data:</p>
<pre><code class="hljs css language-json">"dimensionsSpec" : {
"dimensions": [
"page",
"language",
{ "type": "long", "name": "userId" }
],
"dimensionExclusions" : [],
"spatialDimensions" : [],
"useSchemaDiscovery": true
}
</code></pre>
<blockquote>
<p>Conceptually, after input data records are read, Druid applies ingestion spec components in a particular order:
first <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a> (if any), then <a href="#timestampspec"><code>timestampSpec</code></a>, then <a href="#transformspec"><code>transformSpec</code></a>,
and finally <a href="#dimensionsspec"><code>dimensionsSpec</code></a> and <a href="#metricsspec"><code>metricsSpec</code></a>. Keep this in mind when writing
your ingestion spec.</p>
</blockquote>
<p>A <code>dimensionsSpec</code> can have the following components:</p>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td><code>dimensions</code></td><td>A list of <a href="#dimension-objects">dimension names or objects</a>. You cannot include the same column in both <code>dimensions</code> and <code>dimensionExclusions</code>.<br /><br />If <code>dimensions</code> and <code>spatialDimensions</code> are both null or empty arrays, Druid treats all columns other than timestamp or metrics that do not appear in <code>dimensionExclusions</code> as String-typed dimension columns. See <a href="#inclusions-and-exclusions">inclusions and exclusions</a> for details.<br /><br />As a best practice, put the most frequently filtered dimensions at the beginning of the dimensions list. In this case, it would also be good to consider <a href="/docs/26.0.0/ingestion/partitioning.html"><code>partitioning</code></a> by those same dimensions.</td><td><code>[]</code></td></tr>
<tr><td><code>dimensionExclusions</code></td><td>The names of dimensions to exclude from ingestion. Only names are supported here, not objects.<br /><br />This list is only used if the <code>dimensions</code> and <code>spatialDimensions</code> lists are both null or empty arrays; otherwise it is ignored. See <a href="#inclusions-and-exclusions">inclusions and exclusions</a> below for details.</td><td><code>[]</code></td></tr>
<tr><td><code>spatialDimensions</code></td><td>An array of <a href="/docs/26.0.0/development/geo.html">spatial dimensions</a>.</td><td><code>[]</code></td></tr>
<tr><td><code>includeAllDimensions</code></td><td>Note that this field only applies to string-based schema discovery where Druid ingests dimensions it discovers as strings. This is different from schema auto-discovery where Druid infers the type for data. You can set <code>includeAllDimensions</code> to true to ingest both explicit dimensions in the <code>dimensions</code> field and other dimensions that the ingestion task discovers from input data. In this case, the explicit dimensions will appear first in the order that you specify them, and the dimensions dynamically discovered will come after. This flag can be especially useful with schema discovery using <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a>. If this is not set and the <code>dimensions</code> field is not empty, Druid will ingest only explicit dimensions. If this is not set and the <code>dimensions</code> field is empty, all discovered dimensions will be ingested.</td><td>false</td></tr>
<tr><td><code>useSchemaDiscovery</code></td><td>Configure Druid to use schema auto-discovery to discover some or all of the dimensions and types for your data. For any dimensions that aren't a uniform type, Druid ingests them as JSON. You can use this for native batch or streaming ingestion.</td><td>false</td></tr>
</tbody>
</table>
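<p>To make the <code>includeAllDimensions</code> behavior concrete, here is a sketch of a string-based schemaless <code>dimensionsSpec</code>: <code>useSchemaDiscovery</code> is left at its default of <code>false</code>, the explicitly listed dimensions appear first, and any other fields discovered in the input are appended as string dimensions. The dimension names are reused from the earlier example:</p>
<pre><code class="hljs css language-json">"dimensionsSpec": {
  "dimensions": [
    "page",
    "language"
  ],
  "includeAllDimensions": true
}
</code></pre>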
<h4><a class="anchor" aria-hidden="true" id="dimension-objects"></a><a href="#dimension-objects" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Dimension objects</h4>
<p>Each dimension in the <code>dimensions</code> list can either be a name or an object. Providing a name is equivalent to providing
a <code>string</code> type dimension object with the given name, e.g. <code>&quot;page&quot;</code> is equivalent to <code>{&quot;name&quot;: &quot;page&quot;, &quot;type&quot;: &quot;string&quot;}</code>.</p>
<p>Dimension objects can have the following components:</p>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td>type</td><td>Either <code>auto</code>, <code>string</code>, <code>long</code>, <code>float</code>, <code>double</code>, or <code>json</code>. For the <code>auto</code> type, Druid determines the most appropriate type for the dimension and assigns one of the following column types: STRING, ARRAY&lt;STRING&gt;, LONG, ARRAY&lt;LONG&gt;, DOUBLE, ARRAY&lt;DOUBLE&gt;, or COMPLEX&lt;json&gt;, all sharing a common 'nested' format. When Druid infers the schema with schema auto-discovery, the type is <code>auto</code>.</td><td><code>string</code></td></tr>
<tr><td>name</td><td>The name of the dimension. This will be used as the field name to read from input records, as well as the column name stored in generated segments.<br /><br />Note that you can use a <a href="#transformspec"><code>transformSpec</code></a> if you want to rename columns during ingestion time.</td><td>none (required)</td></tr>
<tr><td>createBitmapIndex</td><td>For <code>string</code> typed dimensions, whether or not bitmap indexes should be created for the column in generated segments. Creating a bitmap index requires more storage, but speeds up certain kinds of filtering (especially equality and prefix filtering). Only supported for <code>string</code> typed dimensions.</td><td><code>true</code></td></tr>
<tr><td>multiValueHandling</td><td>Specifies how to handle <a href="/docs/26.0.0/querying/multi-value-dimensions.html">multi-value fields</a>. Possible values are <code>sorted_array</code>, <code>sorted_set</code>, and <code>array</code>. <code>sorted_array</code> and <code>sorted_set</code> order the array upon ingestion. <code>sorted_set</code> removes duplicates. <code>array</code> ingests data as-is.</td><td><code>sorted_array</code></td></tr>
</tbody>
</table>
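<p>For example, a <code>dimensions</code> list can freely mix plain names with dimension objects. The following sketch uses hypothetical column names:</p>
<pre><code class="hljs css language-json">"dimensions": [
  "page",
  "language",
  { "type": "long", "name": "userId" },
  { "type": "string", "name": "referrer", "createBitmapIndex": false },
  { "type": "string", "name": "tags", "multiValueHandling": "sorted_set" }
]
</code></pre>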
<h4><a class="anchor" aria-hidden="true" id="inclusions-and-exclusions"></a><a href="#inclusions-and-exclusions" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Inclusions and exclusions</h4>
<p>Druid will interpret a <code>dimensionsSpec</code> in two possible ways: <em>normal</em> or <em>schemaless</em>.</p>
<p>Normal interpretation occurs when either <code>dimensions</code> or <code>spatialDimensions</code> is non-empty. In this case, the combination of the two lists will be taken as the set of dimensions to be ingested, and the list of <code>dimensionExclusions</code> will be ignored.</p>
<blockquote>
<p>The following description of schemaless refers to string-based schemaless where Druid treats dimensions it discovers as strings. We recommend you use schema auto-discovery instead where Druid infers the type for the dimension. For more information, see <a href="#dimensionsspec"><code>dimensionsSpec</code></a>.</p>
</blockquote>
<p>Schemaless interpretation occurs when both <code>dimensions</code> and <code>spatialDimensions</code> are empty or null. In this case, the set of dimensions is determined in the following way:</p>
<ol>
<li>First, start from the set of all root-level fields from the input record, as determined by the <a href="/docs/26.0.0/ingestion/data-formats.html"><code>inputFormat</code></a>. &quot;Root-level&quot; includes all fields at the top level of a data structure, but does not include fields nested within maps or lists. To extract these, you must use a <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a>. All fields of non-nested data formats, such as CSV and delimited text, are considered root-level.</li>
<li>If a <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a> is being used, the set of root-level fields includes any fields generated by the <code>flattenSpec</code>. The <code>useFieldDiscovery</code> parameter determines whether the original root-level fields will be retained or discarded.</li>
<li>Any field listed in <code>dimensionExclusions</code> is excluded.</li>
<li>The field listed as <code>column</code> in the <a href="#timestampspec"><code>timestampSpec</code></a> is excluded.</li>
<li>Any field used as an input to an aggregator from the <a href="#metricsspec">metricsSpec</a> is excluded.</li>
<li>Any field with the same name as an aggregator from the <a href="#metricsspec">metricsSpec</a> is excluded.</li>
<li>All other fields are ingested as <code>string</code> typed dimensions with the <a href="#dimension-objects">default settings</a>.</li>
</ol>
<p>Additionally, if you have empty columns that you want to include in the string-based schemaless ingestion, you'll need to include the context parameter <code>storeEmptyColumns</code> and set it to <code>true</code>.</p>
<blockquote>
<p>Note: Fields generated by a <a href="#transformspec"><code>transformSpec</code></a> are not currently considered candidates for
schemaless dimension interpretation.</p>
</blockquote>
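<p>As a sketch of string-based schemaless interpretation, the following <code>dimensionsSpec</code> leaves <code>dimensions</code> empty and only excludes a couple of hypothetical columns, so every other root-level field is ingested as a <code>string</code> dimension:</p>
<pre><code class="hljs css language-json">"dimensionsSpec": {
  "dimensions": [],
  "dimensionExclusions": ["user_email", "session_token"]
}
</code></pre>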
<h3><a class="anchor" aria-hidden="true" id="metricsspec"></a><a href="#metricsspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>metricsSpec</code></h3>
<p>The <code>metricsSpec</code> is located in <code>dataSchema</code> → <code>metricsSpec</code> and is a list of <a href="/docs/26.0.0/querying/aggregations.html">aggregators</a>
to apply at ingestion time. This is most useful when <a href="/docs/26.0.0/ingestion/rollup.html">rollup</a> is enabled, since it's how you configure
ingestion-time aggregation.</p>
<p>An example <code>metricsSpec</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"metricsSpec"</span>: [
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"count"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"count"</span> },
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"bytes_added_sum"</span>, <span class="hljs-string">"fieldName"</span>: <span class="hljs-string">"bytes_added"</span> },
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"doubleSum"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"bytes_deleted_sum"</span>, <span class="hljs-string">"fieldName"</span>: <span class="hljs-string">"bytes_deleted"</span> }
]
</code></pre>
<blockquote>
<p>Generally, when <a href="/docs/26.0.0/ingestion/rollup.html">rollup</a> is disabled, you should have an empty <code>metricsSpec</code> (because without rollup,
Druid does not do any ingestion-time aggregation, so there is little reason to include an ingestion-time aggregator). However,
in some cases, it can still make sense to define metrics: for example, if you want to create a complex column as a way of
pre-computing part of an <a href="/docs/26.0.0/querying/aggregations.html#approximate-aggregations">approximate aggregation</a>, this can only
be done by defining a metric in a <code>metricsSpec</code>.</p>
</blockquote>
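<p>For example, assuming the <code>druid-datasketches</code> extension is loaded, a <code>metricsSpec</code> like the following sketch builds a Theta sketch column at ingestion time so that approximate distinct counts can be computed cheaply at query time, even when rollup is disabled. The field names are hypothetical:</p>
<pre><code class="hljs css language-json">"metricsSpec": [
  { "type": "thetaSketch", "name": "user_id_sketch", "fieldName": "user_id" }
]
</code></pre>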
<h3><a class="anchor" aria-hidden="true" id="granularityspec"></a><a href="#granularityspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>granularitySpec</code></h3>
<p>The <code>granularitySpec</code> is located in <code>dataSchema</code> → <code>granularitySpec</code> and is responsible for configuring
the following operations:</p>
<ol>
<li>Partitioning a datasource into <a href="/docs/26.0.0/design/architecture.html#datasources-and-segments">time chunks</a> (via <code>segmentGranularity</code>).</li>
<li>Truncating the timestamp, if desired (via <code>queryGranularity</code>).</li>
<li>Specifying which time chunks of segments should be created, for batch ingestion (via <code>intervals</code>).</li>
<li>Specifying whether ingestion-time <a href="/docs/26.0.0/ingestion/rollup.html">rollup</a> should be used or not (via <code>rollup</code>).</li>
</ol>
<p>Other than <code>rollup</code>, these operations are all based on the <a href="/docs/26.0.0/ingestion/data-model.html#primary-timestamp">primary timestamp</a>.</p>
<p>An example <code>granularitySpec</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"granularitySpec"</span>: {
<span class="hljs-string">"segmentGranularity"</span>: <span class="hljs-string">"day"</span>,
<span class="hljs-string">"queryGranularity"</span>: <span class="hljs-string">"none"</span>,
<span class="hljs-string">"intervals"</span>: [
<span class="hljs-string">"2013-08-31/2013-09-01"</span>
],
<span class="hljs-string">"rollup"</span>: <span class="hljs-literal">true</span>
}
</code></pre>
<p>A <code>granularitySpec</code> can have the following components:</p>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td>type</td><td><code>uniform</code></td><td><code>uniform</code></td></tr>
<tr><td>segmentGranularity</td><td><a href="/docs/26.0.0/design/architecture.html#datasources-and-segments">Time chunking</a> granularity for this datasource. Multiple segments can be created per time chunk. For example, when set to <code>day</code>, the events of the same day fall into the same time chunk which can be optionally further partitioned into multiple segments based on other configurations and input size. Any <a href="/docs/26.0.0/querying/granularities.html">granularity</a> can be provided here. Note that all segments in the same time chunk should have the same segment granularity.</td><td><code>day</code></td></tr>
<tr><td>queryGranularity</td><td>The resolution of timestamp storage within each segment. This must be equal to or finer than <code>segmentGranularity</code>. This is the finest granularity that you can query at and still receive sensible results, but you can still query at anything coarser than this granularity. For example, a value of <code>minute</code> means that records are stored at minute granularity and can be sensibly queried at any multiple of minutes (including minutely, 5-minutely, hourly, etc.).<br /><br />Any <a href="/docs/26.0.0/querying/granularities.html">granularity</a> can be provided here. Use <code>none</code> to store timestamps as-is, without any truncation. Note that <code>rollup</code> is still applied if it is set, even when <code>queryGranularity</code> is set to <code>none</code>.</td><td><code>none</code></td></tr>
<tr><td>rollup</td><td>Whether to use ingestion-time <a href="/docs/26.0.0/ingestion/rollup.html">rollup</a> or not. Note that rollup is still effective even when <code>queryGranularity</code> is set to <code>none</code>. Rows are rolled up if they have exactly the same timestamp.</td><td><code>true</code></td></tr>
<tr><td>intervals</td><td>A list of intervals defining time chunks for segments. Specify interval values using ISO8601 format. For example, <code>[&quot;2021-12-06T21:27:10+00:00/2021-12-07T00:00:00+00:00&quot;]</code>. If you omit the time, the time defaults to &quot;00:00:00&quot;.<br /><br />Druid breaks the list up and rounds off the list values based on the <code>segmentGranularity</code>.<br /><br />If <code>null</code> or not provided, batch ingestion tasks generally determine which time chunks to output based on the timestamps found in the input data.<br /><br />If specified, batch ingestion tasks may be able to skip a determining-partitions phase, which can result in faster ingestion. Batch ingestion tasks may also be able to request all their locks up-front instead of one by one. Batch ingestion tasks throw away any records with timestamps outside of the specified intervals.<br /><br />Ignored for any form of streaming ingestion.</td><td><code>null</code></td></tr>
</tbody>
</table>
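<p>As another sketch, a <code>granularitySpec</code> for a streaming ingestion job that keeps individual events (no rollup) and truncates timestamps to the minute might look like the following. The <code>intervals</code> field is omitted because it is ignored for streaming ingestion:</p>
<pre><code class="hljs css language-json">"granularitySpec": {
  "type": "uniform",
  "segmentGranularity": "hour",
  "queryGranularity": "minute",
  "rollup": false
}
</code></pre>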
<h3><a class="anchor" aria-hidden="true" id="transformspec"></a><a href="#transformspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>transformSpec</code></h3>
<p>The <code>transformSpec</code> is located in <code>dataSchema</code> → <code>transformSpec</code> and is responsible for transforming and filtering
records during ingestion time. It is optional. An example <code>transformSpec</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"transformSpec"</span>: {
<span class="hljs-string">"transforms"</span>: [
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"expression"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"countryUpper"</span>, <span class="hljs-string">"expression"</span>: <span class="hljs-string">"upper(country)"</span> }
],
<span class="hljs-string">"filter"</span>: {
<span class="hljs-string">"type"</span>: <span class="hljs-string">"selector"</span>,
<span class="hljs-string">"dimension"</span>: <span class="hljs-string">"country"</span>,
<span class="hljs-string">"value"</span>: <span class="hljs-string">"San Serriffe"</span>
}
}
</code></pre>
<blockquote>
<p>Conceptually, after input data records are read, Druid applies ingestion spec components in a particular order:
first <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec"><code>flattenSpec</code></a> (if any), then <a href="#timestampspec"><code>timestampSpec</code></a>, then <a href="#transformspec"><code>transformSpec</code></a>,
and finally <a href="#dimensionsspec"><code>dimensionsSpec</code></a> and <a href="#metricsspec"><code>metricsSpec</code></a>. Keep this in mind when writing
your ingestion spec.</p>
</blockquote>
<h4><a class="anchor" aria-hidden="true" id="transforms"></a><a href="#transforms" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Transforms</h4>
<p>The <code>transforms</code> list allows you to specify a set of expressions to evaluate on top of input data. Each transform has a
&quot;name&quot; which can be referred to by your <code>dimensionsSpec</code>, <code>metricsSpec</code>, etc.</p>
<p>If a transform has the same name as a field in an input row, then it will shadow the original field. Transforms that
shadow fields may still refer to the fields they shadow. This can be used to transform a field &quot;in-place&quot;.</p>
<p>Transforms do have some limitations. They can only refer to fields present in the actual input rows; in particular,
they cannot refer to other transforms. And they cannot remove fields, only add them. However, they can shadow a field
with another field containing all nulls, which will act similarly to removing the field.</p>
<p>Druid currently includes one kind of built-in transform, the expression transform. It has the following syntax:</p>
<pre><code class="hljs">{
<span class="hljs-attr">"type"</span>: <span class="hljs-string">"expression"</span>,
<span class="hljs-attr">"name"</span>: <span class="hljs-string">"&lt;output name&gt;"</span>,
<span class="hljs-attr">"expression"</span>: <span class="hljs-string">"&lt;expr&gt;"</span>
}
</code></pre>
<p>The <code>expression</code> is a <a href="/docs/26.0.0/misc/math-expr.html">Druid query expression</a>.</p>
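<p>For example, the shadowing behavior described above lets you transform a field &quot;in-place&quot;. The following sketch upper-cases a hypothetical <code>country</code> column while keeping its original name:</p>
<pre><code class="hljs css language-json">"transforms": [
  { "type": "expression", "name": "country", "expression": "upper(country)" }
]
</code></pre>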
<h4><a class="anchor" aria-hidden="true" id="filter"></a><a href="#filter" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Filter</h4>
<p>The <code>filter</code> conditionally filters input rows during ingestion. Only rows that pass the filter will be
ingested. Any of Druid's standard <a href="/docs/26.0.0/querying/filters.html">query filters</a> can be used. Note that within a
<code>transformSpec</code>, the <code>transforms</code> are applied before the <code>filter</code>, so the filter can refer to a transform.</p>
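<p>Because the transforms are applied first, a filter can reference a transform by name. The following sketch reuses the <code>countryUpper</code> transform from the example above and keeps only the rows it matches:</p>
<pre><code class="hljs css language-json">"transformSpec": {
  "transforms": [
    { "type": "expression", "name": "countryUpper", "expression": "upper(country)" }
  ],
  "filter": {
    "type": "selector",
    "dimension": "countryUpper",
    "value": "SAN SERRIFFE"
  }
}
</code></pre>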
<h3><a class="anchor" aria-hidden="true" id="legacy-dataschema-spec"></a><a href="#legacy-dataschema-spec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Legacy <code>dataSchema</code> spec</h3>
<blockquote>
<p>The <code>dataSchema</code> spec has been changed in 0.17.0. The new spec is supported by all ingestion methods
except for <em>Hadoop</em> ingestion. See <a href="#dataschema"><code>dataSchema</code></a> for the new spec.</p>
</blockquote>
<p>The legacy <code>dataSchema</code> spec includes the following two components in addition to the ones listed in the <a href="#dataschema"><code>dataSchema</code></a> section above:</p>
<ul>
<li><a href="#parser-deprecated">input row parser</a></li>
<li><a href="#flattenspec">flattening of nested data</a> (if needed)</li>
</ul>
<h4><a class="anchor" aria-hidden="true" id="parser-deprecated"></a><a href="#parser-deprecated" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>parser</code> (Deprecated)</h4>
<p>In the legacy <code>dataSchema</code>, the <code>parser</code> is located in <code>dataSchema</code> → <code>parser</code> and is responsible for configuring a wide variety of
items related to parsing input records. The <code>parser</code> is deprecated; we highly recommend using <code>inputFormat</code> instead.
For details about <code>inputFormat</code> and supported <code>parser</code> types, see the <a href="/docs/26.0.0/ingestion/data-formats.html">&quot;Data formats&quot; page</a>.</p>
<p>For details about major components of the <code>parseSpec</code>, refer to their subsections:</p>
<ul>
<li><a href="#timestampspec"><code>timestampSpec</code></a>, responsible for configuring the <a href="/docs/26.0.0/ingestion/data-model.html#primary-timestamp">primary timestamp</a>.</li>
<li><a href="#dimensionsspec"><code>dimensionsSpec</code></a>, responsible for configuring <a href="/docs/26.0.0/ingestion/data-model.html#dimensions">dimensions</a>.</li>
<li><a href="#flattenspec"><code>flattenSpec</code></a>, responsible for flattening nested data formats.</li>
</ul>
<p>An example <code>parser</code> is:</p>
<pre><code class="hljs"><span class="hljs-string">"parser"</span>: {
<span class="hljs-string">"type"</span>: <span class="hljs-string">"string"</span>,
<span class="hljs-string">"parseSpec"</span>: {
<span class="hljs-string">"format"</span>: <span class="hljs-string">"json"</span>,
<span class="hljs-string">"flattenSpec"</span>: {
<span class="hljs-string">"useFieldDiscovery"</span>: true,
<span class="hljs-string">"fields"</span>: [
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"path"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"userId"</span>, <span class="hljs-string">"expr"</span>: <span class="hljs-string">"$.user.id"</span> }
]
},
<span class="hljs-string">"timestampSpec"</span>: {
<span class="hljs-string">"column"</span>: <span class="hljs-string">"timestamp"</span>,
<span class="hljs-string">"format"</span>: <span class="hljs-string">"auto"</span>
},
<span class="hljs-string">"dimensionsSpec"</span>: {
<span class="hljs-string">"dimensions"</span>: [
<span class="hljs-string">"page"</span>,
<span class="hljs-string">"language"</span>,
{ <span class="hljs-string">"type"</span>: <span class="hljs-string">"long"</span>, <span class="hljs-string">"name"</span>: <span class="hljs-string">"userId"</span> }
]
}
}
}
</code></pre>
<h4><a class="anchor" aria-hidden="true" id="flattenspec"></a><a href="#flattenspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>flattenSpec</code></h4>
<p>In the legacy <code>dataSchema</code>, the <code>flattenSpec</code> is located in <code>dataSchema</code> → <code>parser</code> → <code>parseSpec</code> → <code>flattenSpec</code> and is responsible for
bridging the gap between potentially nested input data (such as JSON, Avro, etc) and Druid's flat data model.
See <a href="/docs/26.0.0/ingestion/data-formats.html#flattenspec">Flatten spec</a> for more details.</p>
<h2><a class="anchor" aria-hidden="true" id="ioconfig"></a><a href="#ioconfig" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>ioConfig</code></h2>
<p>The <code>ioConfig</code> influences how data is read from a source system, such as Apache Kafka, Amazon S3, a mounted
filesystem, or any other supported source system. The <code>inputFormat</code> property applies to all
<a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a> except for Hadoop ingestion. The Hadoop ingestion still
uses the <a href="#parser-deprecated"><code>parser</code></a> in the legacy <code>dataSchema</code>.
The rest of <code>ioConfig</code> is specific to each individual ingestion method.
An example <code>ioConfig</code> to read JSON data is:</p>
<pre><code class="hljs css language-json">"ioConfig": {
"type": "&lt;ingestion-method-specific type code&gt;",
"inputFormat": {
"type": "json"
},
...
}
</code></pre>
<p>For more details, see the documentation provided by each <a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a>.</p>
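<p>As a further sketch, an <code>ioConfig</code> that reads CSV data and takes the column names from the file header might look like the following. The type code still depends on the ingestion method:</p>
<pre><code class="hljs css language-json">"ioConfig": {
  "type": "&lt;ingestion-method-specific type code&gt;",
  "inputFormat": {
    "type": "csv",
    "findColumnsFromHeader": true
  },
  ...
}
</code></pre>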
<h2><a class="anchor" aria-hidden="true" id="tuningconfig"></a><a href="#tuningconfig" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>tuningConfig</code></h2>
<p>Tuning properties are specified in a <code>tuningConfig</code>, which goes at the top level of an ingestion spec. Some
properties apply to all <a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion methods</a>, but most are specific to each individual
ingestion method. An example <code>tuningConfig</code> that sets all of the shared, common properties to their defaults
is:</p>
<pre><code class="hljs css language-plaintext">"tuningConfig": {
"type": "&lt;ingestion-method-specific type code&gt;",
"maxRowsInMemory": 1000000,
"maxBytesInMemory": &lt;one-sixth of JVM memory&gt;,
"indexSpec": {
"bitmap": { "type": "roaring" },
"dimensionCompression": "lz4",
"metricCompression": "lz4",
"longEncoding": "longs"
},
&lt;other ingestion-method-specific properties&gt;
}
</code></pre>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td>type</td><td>Each ingestion method has its own tuning type code. You must specify the type code that matches your ingestion method. Common options are <code>index</code>, <code>hadoop</code>, <code>kafka</code>, and <code>kinesis</code>.</td><td></td></tr>
<tr><td>maxRowsInMemory</td><td>The maximum number of records to store in memory before persisting to disk. Note that this is the number of rows post-rollup, and so it may not be equal to the number of input records. Ingested records will be persisted to disk when either <code>maxRowsInMemory</code> or <code>maxBytesInMemory</code> are reached (whichever happens first).</td><td><code>1000000</code></td></tr>
<tr><td>maxBytesInMemory</td><td>The maximum aggregate size of records, in bytes, to store in the JVM heap before persisting. This is based on a rough estimate of memory usage. Ingested records will be persisted to disk when either <code>maxRowsInMemory</code> or <code>maxBytesInMemory</code> are reached (whichever happens first). <code>maxBytesInMemory</code> also includes heap usage of artifacts created from intermediary persists. This means that after every persist, the amount of <code>maxBytesInMemory</code> until the next persist will decrease. If the sum of bytes of all intermediary persisted artifacts exceeds <code>maxBytesInMemory</code> the task fails.<br /><br />Setting <code>maxBytesInMemory</code> to -1 disables this check, meaning Druid will rely entirely on <code>maxRowsInMemory</code> to control memory usage. Setting it to zero means the default value will be used (one-sixth of JVM heap size).<br /><br />Note that the estimate of memory usage is designed to be an overestimate, and can be especially high when using complex ingest-time aggregators, including sketches. If this causes your indexing workloads to persist to disk too often, you can set <code>maxBytesInMemory</code> to -1 and rely on <code>maxRowsInMemory</code> instead.</td><td>One-sixth of max JVM heap size</td></tr>
<tr><td>skipBytesInMemoryOverheadCheck</td><td>The calculation of <code>maxBytesInMemory</code> takes into account overhead objects created during ingestion and each intermediate persist. Set this to true to exclude the bytes of these overhead objects from the <code>maxBytesInMemory</code> check.</td><td>false</td></tr>
<tr><td>indexSpec</td><td>Defines segment storage format options to use at indexing time.</td><td>See <a href="#indexspec"><code>indexSpec</code></a> for more information.</td></tr>
<tr><td>indexSpecForIntermediatePersists</td><td>Defines segment storage format options to use at indexing time for intermediate persisted temporary segments.</td><td>See <a href="#indexspec"><code>indexSpec</code></a> for more information.</td></tr>
<tr><td>Other properties</td><td>Each ingestion method has its own list of additional tuning properties. See the documentation for each method for a full list: <a href="/docs/26.0.0/development/extensions-core/kafka-supervisor-reference.html#tuningconfig">Kafka indexing service</a>, <a href="/docs/26.0.0/development/extensions-core/kinesis-ingestion.html#tuningconfig">Kinesis indexing service</a>, <a href="/docs/26.0.0/ingestion/native-batch.html#tuningconfig">Native batch</a>, and <a href="/docs/26.0.0/ingestion/hadoop.html#tuningconfig">Hadoop-based</a>.</td><td></td></tr>
</tbody>
</table>
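<p>For instance, if sketch-heavy ingestion persists to disk too often because of the conservative memory estimate, one option described above is to disable the byte-based check and rely on the row count alone. A sketch (the row limit shown is arbitrary):</p>
<pre><code class="hljs css language-json">"tuningConfig": {
  "type": "&lt;ingestion-method-specific type code&gt;",
  "maxRowsInMemory": 500000,
  "maxBytesInMemory": -1
}
</code></pre>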
<h3><a class="anchor" aria-hidden="true" id="indexspec"></a><a href="#indexspec" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a><code>indexSpec</code></h3>
<p>The <code>indexSpec</code> object can include the following properties:</p>
<table>
<thead>
<tr><th>Field</th><th>Description</th><th>Default</th></tr>
</thead>
<tbody>
<tr><td>bitmap</td><td>Compression format for bitmap indexes. Should be a JSON object with <code>type</code> set to <code>roaring</code> or <code>concise</code>.</td><td><code>{&quot;type&quot;: &quot;roaring&quot;}</code></td></tr>
<tr><td>dimensionCompression</td><td>Compression format for dimension columns. Options are <code>lz4</code>, <code>lzf</code>, <code>zstd</code>, or <code>uncompressed</code>.</td><td><code>lz4</code></td></tr>
<tr><td>stringDictionaryEncoding</td><td>Encoding format for STRING value dictionaries used by STRING and COMPLEX&lt;json&gt; columns. <br /><br />Example to enable front coding: <code>{&quot;type&quot;:&quot;frontCoded&quot;, &quot;bucketSize&quot;: 4}</code><br /><code>bucketSize</code> is the number of values to place in a bucket to perform delta encoding. Must be a power of 2, maximum is 128. Defaults to 4.<br /> <code>formatVersion</code> can specify older versions for backwards compatibility during rolling upgrades, valid options are <code>0</code> and <code>1</code>. Defaults to <code>0</code> for backwards compatibility.<br /><br />See <a href="#front-coding">Front coding</a> for more information.</td><td><code>{&quot;type&quot;:&quot;utf8&quot;}</code></td></tr>
<tr><td>metricCompression</td><td>Compression format for primitive type metric columns. Options are <code>lz4</code>, <code>lzf</code>, <code>zstd</code>, <code>uncompressed</code>, or <code>none</code> (which is more efficient than <code>uncompressed</code>, but not supported by older versions of Druid).</td><td><code>lz4</code></td></tr>
<tr><td>longEncoding</td><td>Encoding format for long-typed columns. Applies regardless of whether they are dimensions or metrics. Options are <code>auto</code> or <code>longs</code>. <code>auto</code> encodes the values using an offset or lookup table depending on column cardinality, and stores them with variable sizes. <code>longs</code> stores the values as-is, using 8 bytes each.</td><td><code>longs</code></td></tr>
<tr><td>jsonCompression</td><td>Compression format to use for nested column raw data. Options are <code>lz4</code>, <code>lzf</code>, <code>zstd</code>, or <code>uncompressed</code>.</td><td><code>lz4</code></td></tr>
</tbody>
</table>
<h5><a class="anchor" aria-hidden="true" id="front-coding"></a><a href="#front-coding" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>Front coding</h5>
<p>Front coding is an experimental feature available starting in Druid 25.0. Front coding is an incremental encoding strategy that Druid can use to store STRING and <a href="/docs/26.0.0/querying/nested-columns.html">COMPLEX&lt;json&gt;</a> columns. It allows Druid to create smaller UTF-8 encoded segments with very little performance cost.</p>
<p>You can enable front coding with all types of ingestion. For information on defining an <code>indexSpec</code> in a query context, see <a href="/docs/26.0.0/multi-stage-query/reference.html#context-parameters">SQL-based ingestion reference</a>.</p>
<blockquote>
<p>Front coding was originally introduced in Druid 25.0, and an improved 'version 1', with typically faster reads and smaller storage size, was introduced in Druid 26.0. The current recommendation is to enable it in a staging environment and fully test your use case before using it in production. By default, segments created with front coding enabled in Druid 26.0 are backwards compatible with Druid 25.0, but segments created with front coding in Druid 25.0 or 26.0 are not compatible with Druid versions older than 25.0. If you used front coding in Druid 25.0 and upgrade to Druid 26.0, <code>formatVersion</code> continues to default to <code>0</code>, so Druid keeps writing the older format and you can seamlessly downgrade to Druid 25.0. Once you determine that a rollback is not necessary, we recommend changing <code>formatVersion</code> to <code>1</code>.</p>
</blockquote>
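<p>Putting the pieces above together, a hedged sketch of a <code>tuningConfig</code> whose <code>indexSpec</code> enables front coding with the newer format version while keeping the other storage options at their defaults:</p>
<pre><code class="hljs css language-json">"tuningConfig": {
  "type": "&lt;ingestion-method-specific type code&gt;",
  "indexSpec": {
    "bitmap": { "type": "roaring" },
    "dimensionCompression": "lz4",
    "stringDictionaryEncoding": { "type": "frontCoded", "bucketSize": 4, "formatVersion": 1 },
    "metricCompression": "lz4",
    "longEncoding": "longs"
  }
}
</code></pre>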
<p>Beyond these properties, each ingestion method has its own specific tuning properties. See the documentation for each
<a href="/docs/26.0.0/ingestion/index.html#ingestion-methods">ingestion method</a> for details.</p>