Assume the Apache Eagle (hereafter "Eagle") package has been copied and extracted under /usr/hdp/current/eagle.
WARNING: the following instructions currently work only in the sandbox environment.
Create a Kafka[^KAFKA] topic if you have not already. For example:
$ /usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic sandbox_hdfs_audit_log
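After creating the topic, it can be useful to confirm it exists. The snippet below is a minimal sketch that lists topics only when the Kafka tools are installed at the sandbox path used above; otherwise it reports where it looked.

```shell
# Hypothetical verification step: list Kafka topics if the broker tools are
# present at the sandbox location, otherwise report that they were not found.
KAFKA_BIN=/usr/hdp/current/kafka-broker/bin
if [ -x "$KAFKA_BIN/kafka-topics.sh" ]; then
  "$KAFKA_BIN/kafka-topics.sh" --list --zookeeper localhost:2181
else
  echo "kafka-topics.sh not found under $KAFKA_BIN"
fi
```

The created topic (`sandbox_hdfs_audit_log`) should appear in the listed output.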
Stream HDFS audit log data into the Kafka topic; see the Eagle documentation for how to set this up.
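As a quick smoke test of the pipeline, you can push a single sample line into the topic by hand. The audit-log line below is illustrative only (real entries come from the NameNode's hdfs-audit.log), and the broker port 6667 is the HDP default; adjust for your deployment.

```shell
# Hypothetical smoke test: send one sample HDFS audit log line to the topic.
TOPIC=sandbox_hdfs_audit_log
SAMPLE='2015-11-20 11:32:01,000 INFO FSNamesystem.audit: allowed=true ugi=hdfs (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/tmp/private dst=null perm=null'
echo "$SAMPLE"
# Pipe the line into Kafka only when the producer tool is actually present:
if [ -x /usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh ]; then
  echo "$SAMPLE" | /usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh \
    --broker-list localhost:6667 --topic "$TOPIC"
fi
```

Consuming the topic afterwards (for example with kafka-console-consumer.sh) should show the sample line.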
Start the dependent services Storm[^STORM], Spark[^SPARK], HBase[^HBASE], and Kafka via Ambari[^AMBARI].
Install the Eagle Ambari plugin.
$ /usr/hdp/current/eagle/bin/eagle-ambari.sh install
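Ambari registers custom services through service definitions under its stacks directory, so one hypothetical way to confirm the plugin installed is to look for an Eagle service definition there. The exact path varies by HDP version; this sketch simply searches for it.

```shell
# Hypothetical post-install check: look for an EAGLE service definition under
# the Ambari server's stacks directory (path varies by HDP/Ambari version).
STACKS=/var/lib/ambari-server/resources/stacks
if find "$STACKS" -maxdepth 4 -type d -name 'EAGLE*' 2>/dev/null | grep -q EAGLE; then
  echo "Eagle service definition found under $STACKS"
else
  echo "No Eagle service definition found under $STACKS"
fi
```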
Restart Ambari: click to disable Ambari, then enable it again.
Add the Eagle service to Ambari: click "Add Service" on the Ambari main page.
Add the policies and metadata Eagle requires by running the scripts below.
$ cd <eagle-home>
$ examples/sample-sensitivity-resource-create.sh
$ examples/sample-policy-create.sh
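Once the sample policies are in place, a quick reachability check against the Eagle web service can confirm the deployment is up. The port 9099 and the `/eagle-service` context path below are assumptions based on the sandbox defaults; adjust them for your environment.

```shell
# Hypothetical sanity check, assuming the Eagle web service listens on
# port 9099 with context path /eagle-service (the sandbox defaults).
EAGLE_URL=http://localhost:9099/eagle-service
if curl -sf "$EAGLE_URL" >/dev/null 2>&1; then
  echo "Eagle service is up at $EAGLE_URL"
else
  echo "Eagle service not reachable at $EAGLE_URL"
fi
```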
[^KAFKA]: All mentions of "kafka" on this page represent Apache Kafka.
[^STORM]: Apache Storm.
[^SPARK]: Apache Spark.
[^HBASE]: Apache HBase.
[^AMBARI]: All mentions of "ambari" on this page represent Apache Ambari.