<table class="table">
<thead>
<tr>
<th style="width:25%">Function</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>element_at(array, index)</td>
<td>Returns the element of the array at the given (1-based) index. If index &lt; 0,
  elements are accessed from the last to the first. The function returns NULL
  if the index exceeds the length of the array and `spark.sql.ansi.enabled` is set to false.
  If `spark.sql.ansi.enabled` is set to true, it throws ArrayIndexOutOfBoundsException
  for invalid indices.</td>
</tr>
<tr>
<td>element_at(map, key)</td>
<td>Returns the value for the given key. The function returns NULL
  if the key is not contained in the map and `spark.sql.ansi.enabled` is set to false.
  If `spark.sql.ansi.enabled` is set to true, it throws NoSuchElementException instead.</td>
</tr>
<tr>
<td>map(key0, value0, key1, value1, ...)</td>
<td>Creates a map with the given key/value pairs.</td>
</tr>
<tr>
<td>map_concat(map, ...)</td>
<td>Returns the union of all the given maps.</td>
</tr>
<tr>
<td>map_entries(map)</td>
<td>Returns an unordered array of all entries in the given map.</td>
</tr>
<tr>
<td>map_from_arrays(keys, values)</td>
<td>Creates a map from the given pair of key/value arrays. No element
  in keys may be null.</td>
</tr>
<tr>
<td>map_from_entries(arrayOfEntries)</td>
<td>Returns a map created from the given array of entries.</td>
</tr>
<tr>
<td>map_keys(map)</td>
<td>Returns an unordered array containing the keys of the map.</td>
</tr>
<tr>
<td>map_values(map)</td>
<td>Returns an unordered array containing the values of the map.</td>
</tr>
<tr>
<td>str_to_map(text[, pairDelim[, keyValueDelim]])</td>
<td>Creates a map after splitting the text into key/value pairs using delimiters. Default delimiters are ',' for `pairDelim` and ':' for `keyValueDelim`. Both `pairDelim` and `keyValueDelim` are treated as regular expressions.</td>
</tr>
</tbody>
</table>
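<p>The 1-based indexing and NULL-vs-exception behavior of <code>element_at</code>, and the regex delimiters of <code>str_to_map</code>, can be sketched in plain Python. This is a hypothetical re-implementation of the documented semantics for illustration, not Spark's actual code; the <code>ansi_enabled</code> flag stands in for the <code>spark.sql.ansi.enabled</code> setting:</p>

```python
import re


def element_at(arr, index, ansi_enabled=False):
    """Sketch of element_at(array, index): 1-based, negative counts from the end.

    With ansi_enabled=False an out-of-range index yields None (SQL NULL);
    with ansi_enabled=True it raises, mirroring ArrayIndexOutOfBoundsException.
    """
    if index == 0:
        raise ValueError("SQL array indices are 1-based; 0 is invalid")
    pos = index - 1 if index > 0 else len(arr) + index
    if 0 <= pos < len(arr):
        return arr[pos]
    if ansi_enabled:
        raise IndexError(f"index {index} is out of bounds for length {len(arr)}")
    return None


def str_to_map(text, pair_delim=",", kv_delim=":"):
    """Sketch of str_to_map: both delimiters are treated as regular expressions."""
    result = {}
    for pair in re.split(pair_delim, text):
        # Split each pair at most once; a pair with no delimiter maps to None.
        parts = re.split(kv_delim, pair, maxsplit=1)
        result[parts[0]] = parts[1] if len(parts) == 2 else None
    return result
```

<p>For example, <code>element_at([10, 20, 30], -1)</code> returns <code>30</code>, and <code>str_to_map("a:1,b:2")</code> returns <code>{"a": "1", "b": "2"}</code>.</p>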