| <!DOCTYPE html> |
| <html lang="en"> |
| <head> |
| <meta charset="UTF-8" /> |
| <meta name="viewport" content="width=device-width, initial-scale=1.0"> |
| <meta name="description" content="Apache Druid"> |
| <meta name="keywords" content="druid,kafka,database,analytics,streaming,real-time,real time,apache,open source"> |
| <meta name="author" content="Apache Software Foundation"> |
| |
| <title>Druid | ORC Extension</title> |
| |
| <link rel="alternate" type="application/atom+xml" href="/feed"> |
| <link rel="shortcut icon" href="/img/favicon.png"> |
| |
| <link rel="stylesheet" href="/assets/css/font-awesome-5.css"> |
| |
| <link href='//fonts.googleapis.com/css?family=Open+Sans+Condensed:300,700,300italic|Open+Sans:300italic,400italic,600italic,400,300,600,700' rel='stylesheet' type='text/css'> |
| |
| <link rel="stylesheet" href="/css/bootstrap-pure.css?v=1.1"> |
| <link rel="stylesheet" href="/css/base.css?v=1.1"> |
| <link rel="stylesheet" href="/css/header.css?v=1.1"> |
| <link rel="stylesheet" href="/css/footer.css?v=1.1"> |
| <link rel="stylesheet" href="/css/syntax.css?v=1.1"> |
| <link rel="stylesheet" href="/css/docs.css?v=1.1"> |
| |
| <script> |
| (function() { |
| var cx = '000162378814775985090:molvbm0vggm'; |
| var gcse = document.createElement('script'); |
| gcse.type = 'text/javascript'; |
| gcse.async = true; |
| gcse.src = (document.location.protocol == 'https:' ? 'https:' : 'http:') + |
| '//cse.google.com/cse.js?cx=' + cx; |
| var s = document.getElementsByTagName('script')[0]; |
| s.parentNode.insertBefore(gcse, s); |
| })(); |
| </script> |
| |
| |
| </head> |
| |
| <body> |
| <!-- Start page_header include --> |
| <script src="//ajax.googleapis.com/ajax/libs/jquery/2.2.4/jquery.min.js"></script> |
| |
| <div class="top-navigator"> |
| <div class="container"> |
| <div class="left-cont"> |
| <a class="logo" href="/"><span class="druid-logo"></span></a> |
| </div> |
| <div class="right-cont"> |
| <ul class="links"> |
| <li class=""><a href="/technology">Technology</a></li> |
| <li class=""><a href="/use-cases">Use Cases</a></li> |
| <li class=""><a href="/druid-powered">Powered By</a></li> |
| <li class=""><a href="/docs/latest/design/">Docs</a></li> |
| <li class=""><a href="/community/">Community</a></li> |
| <li class="header-dropdown"> |
| <a>Apache</a> |
| <div class="header-dropdown-menu"> |
| <a href="https://www.apache.org/" target="_blank">Foundation</a> |
| <a href="https://www.apache.org/events/current-event" target="_blank">Events</a> |
| <a href="https://www.apache.org/licenses/" target="_blank">License</a> |
| <a href="https://www.apache.org/foundation/thanks.html" target="_blank">Thanks</a> |
| <a href="https://www.apache.org/security/" target="_blank">Security</a> |
| <a href="https://www.apache.org/foundation/sponsorship.html" target="_blank">Sponsorship</a> |
| </div> |
| </li> |
| <li class=" button-link"><a href="/downloads.html">Download</a></li> |
| </ul> |
| </div> |
| </div> |
| <div class="action-button menu-icon"> |
| <span class="fa fa-bars"></span> MENU |
| </div> |
| <div class="action-button menu-icon-close"> |
| <span class="fa fa-times"></span> MENU |
| </div> |
| </div> |
| |
| <script type="text/javascript"> |
| var $menu = $('.right-cont'); |
| var $menuIcon = $('.menu-icon'); |
| var $menuIconClose = $('.menu-icon-close'); |
| |
| function showMenu() { |
| $menu.fadeIn(100); |
| $menuIcon.fadeOut(100); |
| $menuIconClose.fadeIn(100); |
| } |
| |
| $menuIcon.click(showMenu); |
| |
| function hideMenu() { |
| $menu.fadeOut(100); |
| $menuIconClose.fadeOut(100); |
| $menuIcon.fadeIn(100); |
| } |
| |
| $menuIconClose.click(hideMenu); |
| |
| $(window).resize(function() { |
| if ($(window).width() >= 840) { |
| $menu.fadeIn(100); |
| $menuIcon.fadeOut(100); |
| $menuIconClose.fadeOut(100); |
| } |
| else { |
| $menu.fadeOut(100); |
| $menuIcon.fadeIn(100); |
| $menuIconClose.fadeOut(100); |
| } |
| }); |
| </script> |
| |
| <!-- Stop page_header include --> |
| |
| |
| <div class="container doc-container"> |
| |
| |
| |
| |
| <p> Looking for the <a href="/docs/24.0.2/">latest stable documentation</a>?</p> |
| |
| |
| <div class="row"> |
| <div class="col-md-9 doc-content"> |
| <p> |
| <a class="btn btn-default btn-xs visible-xs-inline-block visible-sm-inline-block" href="#toc">Table of Contents</a> |
| </p> |
| <!-- |
| ~ Licensed to the Apache Software Foundation (ASF) under one |
| ~ or more contributor license agreements. See the NOTICE file |
| ~ distributed with this work for additional information |
| ~ regarding copyright ownership. The ASF licenses this file |
| ~ to you under the Apache License, Version 2.0 (the |
| ~ "License"); you may not use this file except in compliance |
| ~ with the License. You may obtain a copy of the License at |
| ~ |
| ~ http://www.apache.org/licenses/LICENSE-2.0 |
| ~ |
| ~ Unless required by applicable law or agreed to in writing, |
| ~ software distributed under the License is distributed on an |
| ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY |
| ~ KIND, either express or implied. See the License for the |
| ~ specific language governing permissions and limitations |
| ~ under the License. |
| --> |
| |
| <h1 id="orc-extension">ORC Extension</h1> |
| |
| <p>This Apache Druid (incubating) module extends <a href="../../ingestion/hadoop.html">Druid Hadoop-based indexing</a> to ingest data directly from offline |
| Apache ORC files.</p> |
| |
| <p>To use this extension, make sure to <a href="../../operations/including-extensions.html">include</a> <code>druid-orc-extensions</code>.</p> |
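| <p>For example, the extension can be loaded by adding it to the <code>druid.extensions.loadList</code> property in <code>common.runtime.properties</code>. |
| A minimal sketch; your load list will likely contain other extensions as well:</p> |
| <div class="highlight"><pre><code class="language-properties" data-lang="properties">druid.extensions.loadList=["druid-orc-extensions"] |
| </code></pre></div> |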
| |
| <h2 id="orc-hadoop-parser">ORC Hadoop Parser</h2> |
| |
| <p>The <code>inputFormat</code> of <code>inputSpec</code> in <code>ioConfig</code> must be set to <code>"org.apache.orc.mapreduce.OrcInputFormat"</code>.</p> |
| |
| <table><thead> |
| <tr> |
| <th>Field</th> |
| <th>Type</th> |
| <th>Description</th> |
| <th>Required</th> |
| </tr> |
| </thead><tbody> |
| <tr> |
| <td>type</td> |
| <td>String</td> |
| <td>This should say <code>orc</code></td> |
| <td>yes</td> |
| </tr> |
| <tr> |
| <td>parseSpec</td> |
| <td>JSON Object</td> |
| <td>Specifies the timestamp and dimensions of the data (<code>timeAndDims</code> and <code>orc</code> format) and a <code>flattenSpec</code> (<code>orc</code> format)</td> |
| <td>yes</td> |
| </tr> |
| </tbody></table> |
| |
| <p>The parser supports two <code>parseSpec</code> formats: <code>orc</code> and <code>timeAndDims</code>. </p> |
| |
| <p><code>orc</code> supports auto field discovery and flattening, if specified with a <a href="../../ingestion/flatten-json.html">flattenSpec</a>. |
| If no <code>flattenSpec</code> is specified, <code>useFieldDiscovery</code> will be enabled by default. Specifying a <code>dimensionSpec</code> is |
| optional if <code>useFieldDiscovery</code> is enabled: if a <code>dimensionSpec</code> is supplied, the list of <code>dimensions</code> it defines will be |
| the set of ingested dimensions; if it is missing, the discovered fields will make up the list.</p> |
| |
| <p>The <code>timeAndDims</code> parseSpec must specify which fields will be extracted as dimensions through the <code>dimensionSpec</code>.</p> |
| |
| <p><a href="https://orc.apache.org/docs/types.html">All column types</a> are supported, with the exception of <code>union</code> types. Columns of |
| <code>list</code> type, if filled with primitives, may be used as a multi-value dimension, or specific elements can be extracted with |
| <code>flattenSpec</code> expressions. Likewise, primitive fields may be extracted from <code>map</code> and <code>struct</code> types in the same manner. |
| Auto field discovery will create a string dimension for every (non-timestamp) primitive or <code>list</code> of |
| primitives, as well as for any flatten expressions defined in the <code>flattenSpec</code>.</p> |
| |
| <h3 id="hadoop-job-properties">Hadoop Job Properties</h3> |
| |
| <p>As with most Hadoop jobs, the best results come from adding <code>"mapreduce.job.user.classpath.first": "true"</code> or |
| <code>"mapreduce.job.classloader": "true"</code> to the <code>jobProperties</code> section of <code>tuningConfig</code>. Note that when using |
| <code>"mapreduce.job.classloader": "true"</code>, you will likely need to set <code>mapreduce.job.classloader.system.classes</code> to include |
| <code>-org.apache.hadoop.hive.</code>, which instructs Hadoop to load <code>org.apache.hadoop.hive</code> classes from the application jars instead |
| of the system jars, e.g.:</p> |
| <div class="highlight"><pre><code class="language-json" data-lang="json"><span></span><span class="err">...</span> |
| <span class="s2">"mapreduce.job.classloader"</span><span class="err">:</span> <span class="s2">"true"</span><span class="err">,</span> |
| <span class="s2">"mapreduce.job.classloader.system.classes"</span> <span class="err">:</span> <span class="s2">"java., javax.accessibility., javax.activation., javax.activity., javax.annotation., javax.annotation.processing., javax.crypto., javax.imageio., javax.jws., javax.lang.model., -javax.management.j2ee., javax.management., javax.naming., javax.net., javax.print., javax.rmi., javax.script., -javax.security.auth.message., javax.security.auth., javax.security.cert., javax.security.sasl., javax.sound., javax.sql., javax.swing., javax.tools., javax.transaction., -javax.xml.registry., -javax.xml.rpc., javax.xml., org.w3c.dom., org.xml.sax., org.apache.commons.logging., org.apache.log4j., -org.apache.hadoop.hbase., -org.apache.hadoop.hive., org.apache.hadoop., core-default.xml, hdfs-default.xml, mapred-default.xml, yarn-default.xml"</span><span class="err">,</span> |
| <span class="err">...</span> |
| </code></pre></div> |
| <p>This is due to the <code>hive-storage-api</code> dependency of the |
| <code>orc-mapreduce</code> library, which provides some classes under the <code>org.apache.hadoop.hive</code> package. If you instead use the |
| <code>"mapreduce.job.user.classpath.first": "true"</code> setting, this will not be an issue.</p> |
| |
| <h3 id="examples">Examples</h3> |
| |
| <h4 id="orc-parser-orc-parsespec-auto-field-discovery-flatten-expressions"><code>orc</code> parser, <code>orc</code> parseSpec, auto field discovery, flatten expressions</h4> |
| <div class="highlight"><pre><code class="language-json" data-lang="json"><span></span><span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"index_hadoop"</span><span class="p">,</span> |
| <span class="nt">"spec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"ioConfig"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"hadoop"</span><span class="p">,</span> |
| <span class="nt">"inputSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"static"</span><span class="p">,</span> |
| <span class="nt">"inputFormat"</span><span class="p">:</span> <span class="s2">"org.apache.orc.mapreduce.OrcInputFormat"</span><span class="p">,</span> |
| <span class="nt">"paths"</span><span class="p">:</span> <span class="s2">"path/to/file.orc"</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"dataSchema"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dataSource"</span><span class="p">:</span> <span class="s2">"example"</span><span class="p">,</span> |
| <span class="nt">"parser"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"parseSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"flattenSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"useFieldDiscovery"</span><span class="p">:</span> <span class="kc">true</span><span class="p">,</span> |
| <span class="nt">"fields"</span><span class="p">:</span> <span class="p">[</span> |
| <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"path"</span><span class="p">,</span> |
| <span class="nt">"name"</span><span class="p">:</span> <span class="s2">"nestedDim"</span><span class="p">,</span> |
| <span class="nt">"expr"</span><span class="p">:</span> <span class="s2">"$.nestedData.dim1"</span> |
| <span class="p">},</span> |
| <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"path"</span><span class="p">,</span> |
| <span class="nt">"name"</span><span class="p">:</span> <span class="s2">"listDimFirstItem"</span><span class="p">,</span> |
| <span class="nt">"expr"</span><span class="p">:</span> <span class="s2">"$.listDim[0]"</span> |
| <span class="p">}</span> |
| <span class="p">]</span> |
| <span class="p">},</span> |
| <span class="nt">"timestampSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"column"</span><span class="p">:</span> <span class="s2">"timestamp"</span><span class="p">,</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"millis"</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"tuningConfig"</span><span class="p">:</span> <span class="err">&lt;hadoop-tuning-config&gt;</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| </code></pre></div> |
| <h4 id="orc-parser-orc-parsespec-field-discovery-with-no-flattenspec-or-dimensionspec"><code>orc</code> parser, <code>orc</code> parseSpec, field discovery with no flattenSpec or dimensionSpec</h4> |
| <div class="highlight"><pre><code class="language-json" data-lang="json"><span></span><span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"index_hadoop"</span><span class="p">,</span> |
| <span class="nt">"spec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"ioConfig"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"hadoop"</span><span class="p">,</span> |
| <span class="nt">"inputSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"static"</span><span class="p">,</span> |
| <span class="nt">"inputFormat"</span><span class="p">:</span> <span class="s2">"org.apache.orc.mapreduce.OrcInputFormat"</span><span class="p">,</span> |
| <span class="nt">"paths"</span><span class="p">:</span> <span class="s2">"path/to/file.orc"</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"dataSchema"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dataSource"</span><span class="p">:</span> <span class="s2">"example"</span><span class="p">,</span> |
| <span class="nt">"parser"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"parseSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"timestampSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"column"</span><span class="p">:</span> <span class="s2">"timestamp"</span><span class="p">,</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"millis"</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"tuningConfig"</span><span class="p">:</span> <span class="err">&lt;hadoop-tuning-config&gt;</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| </code></pre></div> |
| <h4 id="orc-parser-orc-parsespec-no-autodiscovery"><code>orc</code> parser, <code>orc</code> parseSpec, no autodiscovery</h4> |
| <div class="highlight"><pre><code class="language-json" data-lang="json"><span></span><span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"index_hadoop"</span><span class="p">,</span> |
| <span class="nt">"spec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"ioConfig"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"hadoop"</span><span class="p">,</span> |
| <span class="nt">"inputSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"static"</span><span class="p">,</span> |
| <span class="nt">"inputFormat"</span><span class="p">:</span> <span class="s2">"org.apache.orc.mapreduce.OrcInputFormat"</span><span class="p">,</span> |
| <span class="nt">"paths"</span><span class="p">:</span> <span class="s2">"path/to/file.orc"</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"dataSchema"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dataSource"</span><span class="p">:</span> <span class="s2">"example"</span><span class="p">,</span> |
| <span class="nt">"parser"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"parseSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"flattenSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"useFieldDiscovery"</span><span class="p">:</span> <span class="kc">false</span><span class="p">,</span> |
| <span class="nt">"fields"</span><span class="p">:</span> <span class="p">[</span> |
| <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"path"</span><span class="p">,</span> |
| <span class="nt">"name"</span><span class="p">:</span> <span class="s2">"nestedDim"</span><span class="p">,</span> |
| <span class="nt">"expr"</span><span class="p">:</span> <span class="s2">"$.nestedData.dim1"</span> |
| <span class="p">},</span> |
| <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"path"</span><span class="p">,</span> |
| <span class="nt">"name"</span><span class="p">:</span> <span class="s2">"listDimFirstItem"</span><span class="p">,</span> |
| <span class="nt">"expr"</span><span class="p">:</span> <span class="s2">"$.listDim[0]"</span> |
| <span class="p">}</span> |
| <span class="p">]</span> |
| <span class="p">},</span> |
| <span class="nt">"timestampSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"column"</span><span class="p">:</span> <span class="s2">"timestamp"</span><span class="p">,</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"millis"</span> |
| <span class="p">},</span> |
| <span class="nt">"dimensionsSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dimensions"</span><span class="p">:</span> <span class="p">[</span> |
| <span class="s2">"dim1"</span><span class="p">,</span> |
| <span class="s2">"dim3"</span><span class="p">,</span> |
| <span class="s2">"nestedDim"</span><span class="p">,</span> |
| <span class="s2">"listDimFirstItem"</span> |
| <span class="p">],</span> |
| <span class="nt">"dimensionExclusions"</span><span class="p">:</span> <span class="p">[],</span> |
| <span class="nt">"spatialDimensions"</span><span class="p">:</span> <span class="p">[]</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"tuningConfig"</span><span class="p">:</span> <span class="err">&lt;hadoop-tuning-config&gt;</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| </code></pre></div> |
| <h4 id="orc-parser-timeanddims-parsespec"><code>orc</code> parser, <code>timeAndDims</code> parseSpec</h4> |
| <div class="highlight"><pre><code class="language-json" data-lang="json"><span></span><span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"index_hadoop"</span><span class="p">,</span> |
| <span class="nt">"spec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"ioConfig"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"hadoop"</span><span class="p">,</span> |
| <span class="nt">"inputSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"static"</span><span class="p">,</span> |
| <span class="nt">"inputFormat"</span><span class="p">:</span> <span class="s2">"org.apache.orc.mapreduce.OrcInputFormat"</span><span class="p">,</span> |
| <span class="nt">"paths"</span><span class="p">:</span> <span class="s2">"path/to/file.orc"</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"dataSchema"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dataSource"</span><span class="p">:</span> <span class="s2">"example"</span><span class="p">,</span> |
| <span class="nt">"parser"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"type"</span><span class="p">:</span> <span class="s2">"orc"</span><span class="p">,</span> |
| <span class="nt">"parseSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"timeAndDims"</span><span class="p">,</span> |
| <span class="nt">"timestampSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"column"</span><span class="p">:</span> <span class="s2">"timestamp"</span><span class="p">,</span> |
| <span class="nt">"format"</span><span class="p">:</span> <span class="s2">"auto"</span> |
| <span class="p">},</span> |
| <span class="nt">"dimensionsSpec"</span><span class="p">:</span> <span class="p">{</span> |
| <span class="nt">"dimensions"</span><span class="p">:</span> <span class="p">[</span> |
| <span class="s2">"dim1"</span><span class="p">,</span> |
| <span class="s2">"dim2"</span><span class="p">,</span> |
| <span class="s2">"dim3"</span><span class="p">,</span> |
| <span class="s2">"listDim"</span> |
| <span class="p">],</span> |
| <span class="nt">"dimensionExclusions"</span><span class="p">:</span> <span class="p">[],</span> |
| <span class="nt">"spatialDimensions"</span><span class="p">:</span> <span class="p">[]</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| <span class="p">},</span> |
| <span class="err">...</span> |
| <span class="p">},</span> |
| <span class="nt">"tuningConfig"</span><span class="p">:</span> <span class="err">&lt;hadoop-tuning-config&gt;</span> |
| <span class="p">}</span> |
| <span class="p">}</span> |
| </code></pre></div> |
| <h3 id="migration-from-contrib-extension">Migration from 'contrib' extension</h3> |
| |
| <p>This extension, first available in version 0.15.0, replaces the previous 'contrib' extension, which was available until |
| 0.14.0-incubating. While this extension can index any data the 'contrib' extension could, the JSON spec for the |
| ingestion task is <em>incompatible</em> and will need to be modified to work with the newer 'core' extension.</p> |
| |
| <p>To migrate to 0.15.0+:</p> |
| |
| <ul> |
| <li><p>In the <code>inputSpec</code> of <code>ioConfig</code>, <code>inputFormat</code> must be changed from |
| <code>"org.apache.hadoop.hive.ql.io.orc.OrcNewInputFormat"</code> to <code>"org.apache.orc.mapreduce.OrcInputFormat"</code>.</p></li> |
| <li><p>The 'contrib' extension supported a <code>typeString</code> property, which provided the schema of the |
| ORC file. The types in this schema essentially had to be correct, but notably <em>not</em> the column names, which |
| facilitated column renaming. In the 'core' extension, column renaming can be achieved with |
| <a href="../../ingestion/flatten-json.html"><code>flattenSpec</code> expressions</a>. For example, <code>"typeString":"struct&lt;time:string,name:string&gt;"</code> |
| with the actual schema <code>struct&lt;_col0:string,_col1:string&gt;</code> would, to preserve the Druid schema, need to be replaced with:</p> |
| <div class="highlight"><pre><code class="language-json" data-lang="json">"flattenSpec": { |
|   "fields": [ |
|     { |
|       "type": "path", |
|       "name": "time", |
|       "expr": "$._col0" |
|     }, |
|     { |
|       "type": "path", |
|       "name": "name", |
|       "expr": "$._col1" |
|     } |
|   ] |
|   ... |
| } |
| </code></pre></div></li> |
| <li><p>The 'contrib' extension supported a <code>mapFieldNameFormat</code> property, which provided a way to specify a dimension to |
| flatten <code>OrcMap</code> columns with primitive types. This functionality has also been replaced by |
| <a href="../../ingestion/flatten-json.html"><code>flattenSpec</code> expressions</a>. For example, <code>"mapFieldNameFormat": "&lt;PARENT&gt;_&lt;CHILD&gt;"</code> |
| for a dimension <code>nestedData_dim1</code> could, to preserve the Druid schema, be replaced with:</p> |
| <div class="highlight"><pre><code class="language-json" data-lang="json">"flattenSpec": { |
|   "fields": [ |
|     { |
|       "type": "path", |
|       "name": "nestedData_dim1", |
|       "expr": "$.nestedData.dim1" |
|     } |
|   ] |
|   ... |
| } |
| </code></pre></div></li> |
| </ul> |
| |
| </div> |
| <div class="col-md-3"> |
| <div class="searchbox"> |
| <gcse:searchbox-only></gcse:searchbox-only> |
| </div> |
| <div id="toc" class="nav toc hidden-print"> |
| </div> |
| </div> |
| </div> |
| </div> |
| |
| <!-- Start page_footer include --> |
| <footer class="druid-footer"> |
| <div class="container"> |
| <div class="text-center"> |
| <p> |
| <a href="/technology">Technology</a> ·  |
| <a href="/use-cases">Use Cases</a> ·  |
| <a href="/druid-powered">Powered by Druid</a> ·  |
| <a href="/docs/latest/">Docs</a> ·  |
| <a href="/community/">Community</a> ·  |
| <a href="/downloads.html">Download</a> ·  |
| <a href="/faq">FAQ</a> |
| </p> |
| </div> |
| <div class="text-center"> |
| <a title="Join the user group" href="https://groups.google.com/forum/#!forum/druid-user" target="_blank"><span class="fa fa-comments"></span></a> ·  |
| <a title="Follow Druid" href="https://twitter.com/druidio" target="_blank"><span class="fab fa-twitter"></span></a> ·  |
| <a title="GitHub" href="https://github.com/apache/druid" target="_blank"><span class="fab fa-github"></span></a> |
| </div> |
| <div class="text-center license"> |
| Copyright © 2020 <a href="https://www.apache.org/" target="_blank">Apache Software Foundation</a>.<br> |
| Except where otherwise noted, licensed under <a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA 4.0</a>.<br> |
| Apache Druid, Druid, and the Druid logo are either registered trademarks or trademarks of The Apache Software Foundation in the United States and other countries. |
| </div> |
| </div> |
| </footer> |
| |
| <script async src="https://www.googletagmanager.com/gtag/js?id=UA-131010415-1"></script> |
| <script> |
| window.dataLayer = window.dataLayer || []; |
| function gtag(){dataLayer.push(arguments);} |
| gtag('js', new Date()); |
| gtag('config', 'UA-131010415-1'); |
| </script> |
| <script> |
| function trackDownload(type, url) { |
| ga('send', 'event', 'download', type, url); |
| } |
| </script> |
| <script src="//code.jquery.com/jquery.min.js"></script> |
| <script src="//maxcdn.bootstrapcdn.com/bootstrap/3.2.0/js/bootstrap.min.js"></script> |
| <script src="/assets/js/druid.js"></script> |
| <!-- stop page_footer include --> |
| |
| |
| <script> |
| $(function() { |
| $(".toc").load("/docs/0.15.0-incubating/toc.html"); |
| |
| // There is no way to tell when .gsc-input will be async loaded into the page so just try to set a placeholder until it works |
| var tries = 0; |
| var timer = setInterval(function() { |
| tries++; |
| if (tries > 300) clearInterval(timer); |
| var searchInput = $('input.gsc-input'); |
| if (searchInput.length) { |
| searchInput.attr('placeholder', 'Search'); |
| clearInterval(timer); |
| } |
| }, 100); |
| }); |
| </script> |
| </body> |
| </html> |