During the import process, entity transforms are required to make changes to an entity before it is committed to the database. These modifications are necessary to make the entity conform to the environment in which it will reside. The Import Process provides a mechanism to do this.
A transformation framework provides a mechanism to selectively transform an entity or specific attributes of that entity.
To achieve this, the framework provides conditions and actions. The existing transformation framework allowed this to happen.
While the existing framework provided the basic benefits of a transformation framework, it did not support some of the commonly used Atlas types. This meant that users of the framework had to meticulously define transformations for every type they worked with, which is tedious and potentially error prone. The new framework addresses this problem by providing built-in transformations for some of the commonly used types. It can also be extended to accommodate new types.
The new transformation framework creates a transformation by specifying conditions that select entities and actions that are applied when those conditions match.

The following conditions are built in:
Condition Types | Description |
---|---|
ENTITY_ALL | Any/every entity |
ENTITY_TOP_LEVEL | The top-level entity; this is the entity specified in AtlasExportRequest. |
EQUALS | Entity attribute equals to the one specified in the condition. |
EQUALS_IGNORE_CASE | Entity attribute equals to the one specified in the condition ignoring case. |
STARTS_WITH | Entity attribute starts with the specified value. |
STARTS_WITH_IGNORE_CASE | Entity attribute starts with the specified value, ignoring case. |
HAS_VALUE | Entity attribute has value. |
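As an illustrative sketch of a condition not covered by the examples below, a STARTS_WITH condition could be paired with any action; the attribute and prefix here are hypothetical:

```json
{
  "conditions": {
    "hdfs_path.path": "STARTS_WITH: /user/"
  },
  "action": {
    "hdfs_path.path": "TO_LOWER:"
  }
}
```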
The following actions are built in:

Action Type | Description |
---|---|
ADD_CLASSIFICATION | Add a classification. |
REPLACE_PREFIX | Replace a matching prefix of the value with another value. |
TO_LOWER | Convert the value of an attribute to lower case. |
SET | Set the value of an attribute. |
CLEAR | Clear the value of an attribute. |
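Most of these actions appear in the examples below; SET does not, so here is a hedged sketch of what a SET transform could look like (the attribute and the value CL2 are illustrative, not from the original examples):

```json
{
  "conditions": {
    "hdfs_path.clusterName": "HAS_VALUE:"
  },
  "action": {
    "hdfs_path.clusterName": "SET: CL2"
  }
}
```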
During import, a hive_db entity whose qualifiedName is stocks@cl1 will get the classification clSrcImported:
```json
{
  "conditions": {
    "hive_db.qualifiedName": "stocks@cl1"
  },
  "action": {
    "__entity": "ADD_CLASSIFICATION: clSrcImported"
  }
}
```
By simply changing the condition, every imported entity will get the classification. The __entity key is a special condition that matches any entity:
```json
{
  "conditions": {
    "__entity": ""
  },
  "action": {
    "__entity": "ADD_CLASSIFICATION: clSrcImported"
  }
}
```
To add classification to only the top-level entity (entity that is used as starting point for an export), use:
```json
{
  "conditions": {
    "__entity": "topLevel:"
  },
  "action": {
    "__entity": "ADD_CLASSIFICATION: clSrcImported"
  }
}
```
This action works on string values. The first parameter is the prefix that is searched for a match; once matched, it is replaced with the provided replacement string.
The sample below searches for /aa/bb/ and, once found, replaces it with /xx/yy/:
```json
{
  "conditions": {
    "hdfs_path.clusterName": "EQUALS: CL1"
  },
  "action": {
    "hdfs_path.path": "REPLACE_PREFIX: = :/aa/bb/=/xx/yy/"
  }
}
```
An entity whose hdfs_path.clusterName is CL1 will have its path attribute converted to lower case:
```json
{
  "conditions": {
    "hdfs_path.clusterName": "EQUALS: CL1"
  },
  "action": {
    "hdfs_path.path": "TO_LOWER:"
  }
}
```
An entity whose hdfs_path.clusterName has a value set will have its replicatedTo attribute cleared:
```json
{
  "conditions": {
    "hdfs_path.clusterName": "HAS_VALUE:"
  },
  "action": {
    "hdfs_path.replicatedTo": "CLEAR:"
  }
}
```
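To make the condition/action semantics above concrete, here is a minimal Python sketch of how such transforms could be evaluated. It models an entity as a plain dict and is an illustration of the described behavior only, not the Atlas implementation (which is in Java); the parsing of the REPLACE_PREFIX separator is inferred from the example above.

```python
# Illustrative sketch of the transform semantics, NOT the Atlas code.
# An entity is modeled as a plain dict of attribute name -> value.

def matches(entity, attr, condition):
    """Return True if the attribute satisfies a condition string
    such as 'EQUALS: CL1' or 'HAS_VALUE:'."""
    op, _, arg = condition.partition(":")
    arg = arg.strip()
    value = entity.get(attr)
    if op == "EQUALS":
        return value == arg
    if op == "EQUALS_IGNORE_CASE":
        return isinstance(value, str) and value.lower() == arg.lower()
    if op == "STARTS_WITH":
        return isinstance(value, str) and value.startswith(arg)
    if op == "STARTS_WITH_IGNORE_CASE":
        return isinstance(value, str) and value.lower().startswith(arg.lower())
    if op == "HAS_VALUE":
        return value not in (None, "")
    raise ValueError(f"unknown condition: {op}")

def apply_action(entity, attr, action):
    """Apply an action string such as 'TO_LOWER:' or 'CLEAR:' to an attribute."""
    op, _, arg = action.partition(":")
    arg = arg.strip()
    if op == "TO_LOWER":
        entity[attr] = entity[attr].lower()
    elif op == "SET":
        entity[attr] = arg
    elif op == "CLEAR":
        entity.pop(attr, None)
    elif op == "REPLACE_PREFIX":
        # Assumed format '= :<old>=<new>': the token before the first
        # colon declares the separator between old and new values.
        sep, _, rest = arg.partition(":")
        old, new = rest.split(sep.strip(), 1)
        if entity[attr].startswith(old):
            entity[attr] = new + entity[attr][len(old):]
    else:
        raise ValueError(f"unknown action: {op}")
```

A transform engine built on these helpers would iterate over each entity, evaluate all conditions, and apply the associated actions only when every condition matches.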
Please look at these tests for examples using Java classes.