PREHOOK: query: DROP TABLE insert_into4a
PREHOOK: type: DROPTABLE
POSTHOOK: query: DROP TABLE insert_into4a
POSTHOOK: type: DROPTABLE
PREHOOK: query: DROP TABLE insert_into4b
PREHOOK: type: DROPTABLE
POSTHOOK: query: DROP TABLE insert_into4b
POSTHOOK: type: DROPTABLE
PREHOOK: query: CREATE TABLE insert_into4a (key int, value string)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: CREATE TABLE insert_into4a (key int, value string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@insert_into4a
PREHOOK: query: CREATE TABLE insert_into4b (key int, value string)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@insert_into4b
POSTHOOK: query: CREATE TABLE insert_into4b (key int, value string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@insert_into4b
PREHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@insert_into4a
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-2 depends on stages: Stage-1
  Stage-0 depends on stages: Stage-2
  Stage-3 depends on stages: Stage-0

STAGE PLANS:
  Stage: Stage-1
    Tez
#### A masked pattern was here ####
      Edges:
        Reducer 2 <- Map 1 (CUSTOM_SIMPLE_EDGE)
        Reducer 3 <- Reducer 2 (CUSTOM_SIMPLE_EDGE)
#### A masked pattern was here ####
      Vertices:
        Map 1
            Map Operator Tree:
                TableScan
                  alias: src
                  Statistics: Num rows: 500 Data size: 89000 Basic stats: COMPLETE Column stats: COMPLETE
                  Limit
                    Number of rows: 10
                    Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                    Select Operator
                      expressions: key (type: string), value (type: string)
                      outputColumnNames: _col0, _col1
                      Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                      Reduce Output Operator
                        null sort order:
                        sort order:
                        Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                        TopN Hash Memory Usage: 0.1
                        value expressions: _col0 (type: string), _col1 (type: string)
            Execution mode: vectorized, llap
            LLAP IO: all inputs
        Reducer 2
            Execution mode: vectorized, llap
            Reduce Operator Tree:
              Limit
                Number of rows: 10
                Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                Select Operator
                  expressions: UDFToInteger(VALUE._col0) (type: int), VALUE._col1 (type: string)
                  outputColumnNames: _col0, _col1
                  Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 10 Data size: 950 Basic stats: COMPLETE Column stats: COMPLETE
                    table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                        name: default.insert_into4a
                  Select Operator
                    expressions: _col0 (type: int), _col1 (type: string)
                    outputColumnNames: key, value
                    Statistics: Num rows: 10 Data size: 950 Basic stats: COMPLETE Column stats: COMPLETE
                    Group By Operator
                      aggregations: min(key), max(key), count(1), count(key), compute_bit_vector_hll(key), max(length(value)), avg(COALESCE(length(value),0)), count(value), compute_bit_vector_hll(value)
                      minReductionHashAggr: 0.9
                      mode: hash
                      outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                      Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                      Reduce Output Operator
                        null sort order:
                        sort order:
                        Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                        value expressions: _col0 (type: int), _col1 (type: int), _col2 (type: bigint), _col3 (type: bigint), _col4 (type: binary), _col5 (type: int), _col6 (type: struct<count:bigint,sum:double,input:int>), _col7 (type: bigint), _col8 (type: binary)
        Reducer 3
            Execution mode: vectorized, llap
            Reduce Operator Tree:
              Group By Operator
                aggregations: min(VALUE._col0), max(VALUE._col1), count(VALUE._col2), count(VALUE._col3), compute_bit_vector_hll(VALUE._col4), max(VALUE._col5), avg(VALUE._col6), count(VALUE._col7), compute_bit_vector_hll(VALUE._col8)
                mode: mergepartial
                outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                Statistics: Num rows: 1 Data size: 332 Basic stats: COMPLETE Column stats: COMPLETE
                Select Operator
                  expressions: 'LONG' (type: string), UDFToLong(_col0) (type: bigint), UDFToLong(_col1) (type: bigint), (_col2 - _col3) (type: bigint), COALESCE(ndv_compute_bit_vector(_col4),0) (type: bigint), _col4 (type: binary), 'STRING' (type: string), UDFToLong(COALESCE(_col5,0)) (type: bigint), COALESCE(_col6,0) (type: double), (_col2 - _col7) (type: bigint), COALESCE(ndv_compute_bit_vector(_col8),0) (type: bigint), _col8 (type: binary)
                  outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11
                  Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                    table:
                        input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-2
    Dependency Collection

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert_into4a

  Stage: Stage-3
    Stats Work
      Basic Stats Work:
      Column Stats Desc:
          Columns: key, value
          Column Types: int, string
          Table: default.insert_into4a

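In this plan, Map 1 applies the LIMIT locally, Reducer 2 enforces the global ten-row limit and writes default.insert_into4a, and the Reducer 3 branch together with Stage-3 exists only to gather column statistics for the target table. As a sketch (not part of this test), that statistics branch comes from automatic column-stats gathering and could be switched off with the standard setting:

SET hive.stats.column.autogather=false;
EXPLAIN INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10;
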
PREHOOK: query: INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@insert_into4a
POSTHOOK: Lineage: insert_into4a.key EXPRESSION [(src)src.FieldSchema(name:key, type:string, comment:default), ]
POSTHOOK: Lineage: insert_into4a.value SIMPLE [(src)src.FieldSchema(name:value, type:string, comment:default), ]
PREHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4a
) t
PREHOOK: type: QUERY
PREHOOK: Input: default@insert_into4a
PREHOOK: Output: hdfs://### HDFS PATH ###
POSTHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4a
) t
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert_into4a
POSTHOOK: Output: hdfs://### HDFS PATH ###
-826625916
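The TRANSFORM step pipes each row through 'tr \t _', replacing the tab between key and value with an underscore, so SUM(HASH(c)) acts as an order-independent checksum of the table contents. A rough equivalent without a streaming script, assuming no NULL values (a sketch, not what the test runs):

SELECT SUM(HASH(CONCAT_WS('_', CAST(key AS STRING), value)))
FROM insert_into4a;
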
PREHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@insert_into4a
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-2 depends on stages: Stage-1
  Stage-0 depends on stages: Stage-2
  Stage-3 depends on stages: Stage-0

STAGE PLANS:
  Stage: Stage-1
    Tez
#### A masked pattern was here ####
      Edges:
        Reducer 2 <- Map 1 (CUSTOM_SIMPLE_EDGE)
        Reducer 3 <- Reducer 2 (CUSTOM_SIMPLE_EDGE)
#### A masked pattern was here ####
      Vertices:
        Map 1
            Map Operator Tree:
                TableScan
                  alias: src
                  Statistics: Num rows: 500 Data size: 89000 Basic stats: COMPLETE Column stats: COMPLETE
                  Limit
                    Number of rows: 10
                    Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                    Select Operator
                      expressions: key (type: string), value (type: string)
                      outputColumnNames: _col0, _col1
                      Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                      Reduce Output Operator
                        null sort order:
                        sort order:
                        Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                        TopN Hash Memory Usage: 0.1
                        value expressions: _col0 (type: string), _col1 (type: string)
            Execution mode: vectorized, llap
            LLAP IO: all inputs
        Reducer 2
            Execution mode: vectorized, llap
            Reduce Operator Tree:
              Limit
                Number of rows: 10
                Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                Select Operator
                  expressions: UDFToInteger(VALUE._col0) (type: int), VALUE._col1 (type: string)
                  outputColumnNames: _col0, _col1
                  Statistics: Num rows: 10 Data size: 1780 Basic stats: COMPLETE Column stats: COMPLETE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 10 Data size: 950 Basic stats: COMPLETE Column stats: COMPLETE
                    table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                        name: default.insert_into4a
                  Select Operator
                    expressions: _col0 (type: int), _col1 (type: string)
                    outputColumnNames: key, value
                    Statistics: Num rows: 10 Data size: 950 Basic stats: COMPLETE Column stats: COMPLETE
                    Group By Operator
                      aggregations: min(key), max(key), count(1), count(key), compute_bit_vector_hll(key), max(length(value)), avg(COALESCE(length(value),0)), count(value), compute_bit_vector_hll(value)
                      minReductionHashAggr: 0.9
                      mode: hash
                      outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                      Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                      Reduce Output Operator
                        null sort order:
                        sort order:
                        Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                        value expressions: _col0 (type: int), _col1 (type: int), _col2 (type: bigint), _col3 (type: bigint), _col4 (type: binary), _col5 (type: int), _col6 (type: struct<count:bigint,sum:double,input:int>), _col7 (type: bigint), _col8 (type: binary)
        Reducer 3
            Execution mode: vectorized, llap
            Reduce Operator Tree:
              Group By Operator
                aggregations: min(VALUE._col0), max(VALUE._col1), count(VALUE._col2), count(VALUE._col3), compute_bit_vector_hll(VALUE._col4), max(VALUE._col5), avg(VALUE._col6), count(VALUE._col7), compute_bit_vector_hll(VALUE._col8)
                mode: mergepartial
                outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                Statistics: Num rows: 1 Data size: 332 Basic stats: COMPLETE Column stats: COMPLETE
                Select Operator
                  expressions: 'LONG' (type: string), UDFToLong(_col0) (type: bigint), UDFToLong(_col1) (type: bigint), (_col2 - _col3) (type: bigint), COALESCE(ndv_compute_bit_vector(_col4),0) (type: bigint), _col4 (type: binary), 'STRING' (type: string), UDFToLong(COALESCE(_col5,0)) (type: bigint), COALESCE(_col6,0) (type: double), (_col2 - _col7) (type: bigint), COALESCE(ndv_compute_bit_vector(_col8),0) (type: bigint), _col8 (type: binary)
                  outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11
                  Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                    table:
                        input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-2
    Dependency Collection

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert_into4a

  Stage: Stage-3
    Stats Work
      Basic Stats Work:
      Column Stats Desc:
          Columns: key, value
          Column Types: int, string
          Table: default.insert_into4a

PREHOOK: query: INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: INSERT INTO TABLE insert_into4a SELECT * FROM src LIMIT 10
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@insert_into4a
POSTHOOK: Lineage: insert_into4a.key EXPRESSION [(src)src.FieldSchema(name:key, type:string, comment:default), ]
POSTHOOK: Lineage: insert_into4a.value SIMPLE [(src)src.FieldSchema(name:value, type:string, comment:default), ]
PREHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4a
) t
PREHOOK: type: QUERY
PREHOOK: Input: default@insert_into4a
PREHOOK: Output: hdfs://### HDFS PATH ###
POSTHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4a
) t
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert_into4a
POSTHOOK: Output: hdfs://### HDFS PATH ###
-1653251832
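The checksum is exactly double the previous one (-826625916 * 2 = -1653251832), confirming that INSERT INTO appended a second copy of the ten rows rather than replacing them; INSERT OVERWRITE would have left the checksum unchanged. A quick sanity check (sketch):

SELECT COUNT(*) FROM insert_into4a;   -- expected: 20 after two ten-row appends
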
PREHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4b SELECT * FROM insert_into4a
PREHOOK: type: QUERY
PREHOOK: Input: default@insert_into4a
PREHOOK: Output: default@insert_into4b
POSTHOOK: query: EXPLAIN INSERT INTO TABLE insert_into4b SELECT * FROM insert_into4a
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert_into4a
POSTHOOK: Output: default@insert_into4b
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-2 depends on stages: Stage-1
  Stage-0 depends on stages: Stage-2
  Stage-3 depends on stages: Stage-0

STAGE PLANS:
  Stage: Stage-1
    Tez
#### A masked pattern was here ####
      Edges:
        Reducer 2 <- Map 1 (CUSTOM_SIMPLE_EDGE)
#### A masked pattern was here ####
      Vertices:
        Map 1
            Map Operator Tree:
                TableScan
                  alias: insert_into4a
                  Statistics: Num rows: 20 Data size: 1900 Basic stats: COMPLETE Column stats: COMPLETE
                  Select Operator
                    expressions: key (type: int), value (type: string)
                    outputColumnNames: _col0, _col1
                    Statistics: Num rows: 20 Data size: 1900 Basic stats: COMPLETE Column stats: COMPLETE
                    File Output Operator
                      compressed: false
                      Statistics: Num rows: 20 Data size: 1900 Basic stats: COMPLETE Column stats: COMPLETE
                      table:
                          input format: org.apache.hadoop.mapred.TextInputFormat
                          output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                          serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                          name: default.insert_into4b
                    Select Operator
                      expressions: _col0 (type: int), _col1 (type: string)
                      outputColumnNames: key, value
                      Statistics: Num rows: 20 Data size: 1900 Basic stats: COMPLETE Column stats: COMPLETE
                      Group By Operator
                        aggregations: min(key), max(key), count(1), count(key), compute_bit_vector_hll(key), max(length(value)), avg(COALESCE(length(value),0)), count(value), compute_bit_vector_hll(value)
                        minReductionHashAggr: 0.95
                        mode: hash
                        outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                        Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                        Reduce Output Operator
                          null sort order:
                          sort order:
                          Statistics: Num rows: 1 Data size: 400 Basic stats: COMPLETE Column stats: COMPLETE
                          value expressions: _col0 (type: int), _col1 (type: int), _col2 (type: bigint), _col3 (type: bigint), _col4 (type: binary), _col5 (type: int), _col6 (type: struct<count:bigint,sum:double,input:int>), _col7 (type: bigint), _col8 (type: binary)
            Execution mode: vectorized, llap
            LLAP IO: all inputs
        Reducer 2
            Execution mode: vectorized, llap
            Reduce Operator Tree:
              Group By Operator
                aggregations: min(VALUE._col0), max(VALUE._col1), count(VALUE._col2), count(VALUE._col3), compute_bit_vector_hll(VALUE._col4), max(VALUE._col5), avg(VALUE._col6), count(VALUE._col7), compute_bit_vector_hll(VALUE._col8)
                mode: mergepartial
                outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8
                Statistics: Num rows: 1 Data size: 332 Basic stats: COMPLETE Column stats: COMPLETE
                Select Operator
                  expressions: 'LONG' (type: string), UDFToLong(_col0) (type: bigint), UDFToLong(_col1) (type: bigint), (_col2 - _col3) (type: bigint), COALESCE(ndv_compute_bit_vector(_col4),0) (type: bigint), _col4 (type: binary), 'STRING' (type: string), UDFToLong(COALESCE(_col5,0)) (type: bigint), COALESCE(_col6,0) (type: double), (_col2 - _col7) (type: bigint), COALESCE(ndv_compute_bit_vector(_col8),0) (type: bigint), _col8 (type: binary)
                  outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11
                  Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 1 Data size: 530 Basic stats: COMPLETE Column stats: COMPLETE
                    table:
                        input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-2
    Dependency Collection

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert_into4b

  Stage: Stage-3
    Stats Work
      Basic Stats Work:
      Column Stats Desc:
          Columns: key, value
          Column Types: int, string
          Table: default.insert_into4b

PREHOOK: query: INSERT INTO TABLE insert_into4b SELECT * FROM insert_into4a
PREHOOK: type: QUERY
PREHOOK: Input: default@insert_into4a
PREHOOK: Output: default@insert_into4b
POSTHOOK: query: INSERT INTO TABLE insert_into4b SELECT * FROM insert_into4a
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert_into4a
POSTHOOK: Output: default@insert_into4b
POSTHOOK: Lineage: insert_into4b.key SIMPLE [(insert_into4a)insert_into4a.FieldSchema(name:key, type:int, comment:null), ]
POSTHOOK: Lineage: insert_into4b.value SIMPLE [(insert_into4a)insert_into4a.FieldSchema(name:value, type:string, comment:null), ]
PREHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4b
) t
PREHOOK: type: QUERY
PREHOOK: Input: default@insert_into4b
PREHOOK: Output: hdfs://### HDFS PATH ###
POSTHOOK: query: SELECT SUM(HASH(c)) FROM (
  SELECT TRANSFORM(*) USING 'tr \t _' AS (c) FROM insert_into4b
) t
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert_into4b
POSTHOOK: Output: hdfs://### HDFS PATH ###
-1653251832
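insert_into4b, loaded from the 20 rows of insert_into4a, reproduces the same checksum, so the table-to-table copy is complete. One hedged way to double-check distinct-row equality, assuming a Hive version with set operations (2.3+):

SELECT * FROM insert_into4a
EXCEPT
SELECT * FROM insert_into4b;   -- expected: empty result (up to duplicate counts)
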
PREHOOK: query: DROP TABLE insert_into4a
PREHOOK: type: DROPTABLE
PREHOOK: Input: default@insert_into4a
PREHOOK: Output: default@insert_into4a
POSTHOOK: query: DROP TABLE insert_into4a
POSTHOOK: type: DROPTABLE
POSTHOOK: Input: default@insert_into4a
POSTHOOK: Output: default@insert_into4a
PREHOOK: query: DROP TABLE insert_into4b
PREHOOK: type: DROPTABLE
PREHOOK: Input: default@insert_into4b
PREHOOK: Output: default@insert_into4b
POSTHOOK: query: DROP TABLE insert_into4b
POSTHOOK: type: DROPTABLE
POSTHOOK: Input: default@insert_into4b
POSTHOOK: Output: default@insert_into4b