
Eagle User Guide


Eagle depends on the following services being available:

  • Hadoop
  • HBase
  • Storm
  • Spark
  • Kafka

Eagle requires access to the Hadoop CLI, where you have full permissions to HDFS, Storm, HBase, and Kafka. To make things easier, we strongly recommend starting Eagle on a Hadoop sandbox such as


  • Download the latest version of the Eagle source code.

      git clone
  • Build the source code; a tar.gz package will be generated under eagle-assembly/target.

      mvn clean compile install -DskipTests


  • Copy this package onto the sandbox.

      scp -P 2222 eagle/eagle-assembly/target/eagle-0.1.0-bin.tar.gz root@
  • Run the Eagle patch installation the first time, then restart the HDFS NameNode.
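
    The step above might look roughly like the sketch below. The unpack directory and the patch-script name are assumptions for illustration; the actual script ships inside the Eagle package.

    ```shell
    # Unpack the package copied onto the sandbox in the previous step.
    tar -xzf eagle-0.1.0-bin.tar.gz
    cd eagle-0.1.0

    # Hypothetical script name -- check the bin/ directory of the unpacked
    # package for the real patch-installation script.
    bin/eagle-patch-install.sh

    # Restart the HDFS NameNode afterwards (e.g. from the Ambari Web UI).
    ```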

  • Start Storm, HBase, and Kafka via the Ambari Web UI. Make sure the user has the privilege to run Storm, HBase, and Kafka commands in the shell, and has full permissions on HBase, such as creating tables. Check the installation and running status of the required services.

  • Create necessary HBase tables for Eagle.
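
    Table creation is done from the HBase shell; a sketch is below. The table and column-family names here are illustrative placeholders, not the actual Eagle schema — the real names come with the Eagle installation scripts.

    ```shell
    # Placeholder table names -- consult the Eagle package for the real schema.
    hbase shell <<'EOF'
    create 'eagle_metric', 'f'
    create 'alertdef', 'f'
    EOF
    ```

    If table creation fails with a permission error, revisit the previous step: the current user needs full HBase access.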

  • Start Eagle service.

      bin/ start
  • Create Kafka topics and topology metadata for Eagle.
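
    Topics are created with the standard Kafka CLI; the topic name below is an assumption for illustration, and ZooKeeper is assumed to run locally on the sandbox.

    ```shell
    # Assumed topic name and local ZooKeeper address; single replica/partition
    # is sufficient on a sandbox.
    kafka-topics.sh --create \
      --zookeeper localhost:2181 \
      --replication-factor 1 --partitions 1 \
      --topic sandbox_hdfs_audit_log
    ```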

  • Start the Eagle topology, which submits the topology to Storm via the Storm CLI tools. You can check it in the Storm UI.

      bin/ [--jar <jarName>] [--main <mainClass>] [--topology <topologyName>] start
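
    A concrete invocation might look like the sketch below. The launcher script name is elided in this guide, so it is written as `bin/<launcher>` here, and the jar, main class, and topology names are placeholders, not the actual Eagle artifacts.

    ```shell
    # All values below are placeholders -- substitute the script, jar,
    # main class, and topology name that ship with your Eagle package.
    bin/<launcher> \
      --jar lib/eagle-topology-assembly.jar \
      --main eagle.security.auditlog.HdfsAuditLogMonitorMain \
      --topology sandbox-hdfsAuditLog-topology \
      start
    ```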

Now you can have Eagle start monitoring by creating your own policy!

Sandbox Starter

  • Start up the Eagle service & topology

  • Check the Eagle UI

  • Take the following actions, which will violate or obey the sample policy.

    • Violation Action: hdfs dfs -ls unknown
    • Violation Action: hdfs dfs -touchz /tmp/private
    • Obey Action: hdfs dfs -cat /tmp/private