CREATE RESOURCE
This statement is used to create a resource. Only the root or admin user can create resources. Currently, Spark, ODBC, and S3 external resources are supported. In the future, other external resources may be added to Doris, such as Spark/GPU for queries, HDFS/S3 for external storage, and MapReduce for ETL.
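Since only root or admin can create resources, other users normally need to be granted access before they can reference one. As an illustrative sketch, assuming Doris's GRANT statement with the USAGE_PRIV privilege (the user and resource names are placeholders):

```sql
-- Hypothetical example: allow user0 to reference the resource spark0
GRANT USAGE_PRIV ON RESOURCE "spark0" TO "user0"@"%";
```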
Syntax:
CREATE [EXTERNAL] RESOURCE "resource_name" PROPERTIES ("key"="value", ...);
Examples:
Create a Spark resource named spark0 in yarn cluster mode.
```sql
CREATE EXTERNAL RESOURCE "spark0"
PROPERTIES
(
  "type" = "spark",
  "spark.master" = "yarn",
  "spark.submit.deployMode" = "cluster",
  "spark.jars" = "xxx.jar,yyy.jar",
  "spark.files" = "/tmp/aaa,/tmp/bbb",
  "spark.executor.memory" = "1g",
  "spark.yarn.queue" = "queue0",
  "spark.hadoop.yarn.resourcemanager.address" = "127.0.0.1:9999",
  "spark.hadoop.fs.defaultFS" = "hdfs://127.0.0.1:10000",
  "working_dir" = "hdfs://127.0.0.1:10000/tmp/doris",
  "broker" = "broker0",
  "broker.username" = "user0",
  "broker.password" = "password0"
);
```
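Once created, a Spark resource is referenced by name in other statements. As an illustrative sketch, assuming Doris's Spark Load syntax, a load job could reference `spark0` like this (the database, table, label, and file path are placeholders):

```sql
LOAD LABEL example_db.label0
(
    DATA INFILE("hdfs://127.0.0.1:10000/input/file.txt")
    INTO TABLE example_tbl
)
WITH RESOURCE "spark0"
(
    -- Resource parameters can be overridden per job
    "spark.executor.memory" = "2g"
)
PROPERTIES
(
    "timeout" = "3600"
);
```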
Spark-related parameters are as follows:
- `type`: required; must be `spark`.
- `spark.master`: required; currently supports `yarn` and `spark://host:port`.
- `spark.submit.deployMode`: required; the deploy mode of the Spark program, either `cluster` or `client`.
- `spark.hadoop.yarn.resourcemanager.address`: required when master is yarn.
- `spark.hadoop.fs.defaultFS`: required when master is yarn.
- Other parameters are optional; see the Spark configuration documentation for details.
`working_dir` and `broker` must be specified when Spark is used for ETL. They are described as follows:
- `working_dir`: the working directory used by the ETL job, for example `hdfs://host:port/tmp/doris`.
- `broker`: the name of the broker used to read and write intermediate ETL files.
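The broker referenced here must already be registered in the cluster. A minimal sketch, assuming the standard ALTER SYSTEM ADD BROKER statement (the broker name, host, and port are placeholders):

```sql
-- Hypothetical host/port; registers a broker process named broker0
ALTER SYSTEM ADD BROKER broker0 "192.168.0.10:8000";
```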
The broker must be added to the cluster in advance with the ALTER SYSTEM ADD BROKER command.

Create an ODBC resource:
```sql
CREATE EXTERNAL RESOURCE `oracle_odbc`
PROPERTIES
(
  "type" = "odbc_catalog",
  "host" = "192.168.0.1",
  "port" = "8086",
  "user" = "test",
  "password" = "test",
  "database" = "test",
  "odbc_type" = "oracle",
  "driver" = "Oracle 19 ODBC driver"
);
```
The ODBC-related parameters are as follows:
- `type`: required; must be `odbc_catalog`.
- `host`: the host of the external database.
- `port`: the port of the external database.
- `user` / `password`: the credentials used to connect.
- `database`: the database to connect to.
- `odbc_type`: the type of the external database, such as `oracle`, `mysql`, or `postgresql`.
- `driver`: the name of the ODBC driver, which must match a driver configured on the backend nodes.
Create an S3 resource:
```sql
CREATE RESOURCE "remote_s3"
PROPERTIES
(
  "type" = "s3",
  "s3_endpoint" = "http://bj.s3.com",
  "s3_region" = "bj",
  "s3_root_path" = "/path/to/root",
  "s3_access_key" = "bbb",
  "s3_secret_key" = "aaaa",
  "s3_max_connections" = "50",
  "s3_request_timeout_ms" = "3000",
  "s3_connection_timeout_ms" = "1000"
);
```
S3-related parameters are as follows:
- `s3_endpoint`: required; the S3 endpoint.
- `s3_region`: required; the S3 region.
- `s3_root_path`: required; the S3 root path.
- `s3_access_key`: required; the S3 access key.
- `s3_secret_key`: required; the S3 secret key.
- `s3_max_connections`: optional; the maximum number of S3 connections (default 50).
- `s3_request_timeout_ms`: optional; the S3 request timeout in milliseconds (default 3000).
- `s3_connection_timeout_ms`: optional; the S3 connection timeout in milliseconds (default 1000).
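After creation, resources can be listed and removed. A short sketch, assuming the standard SHOW RESOURCES and DROP RESOURCE statements:

```sql
-- List resources visible to the current user
SHOW RESOURCES;

-- Remove the S3 resource created above
DROP RESOURCE "remote_s3";
```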
Keywords: CREATE, RESOURCE