## Development in IDE
### Start Eagle server

In the IDE, create a run configuration with the following main class and program arguments:

```
org.apache.eagle.server.ServerMain server src/main/resources/configuration.yml
```
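The same thing can be launched from the command line instead of the IDE. The sketch below assumes the server module lives in the `eagle-server` directory and that the project has already been built, so adjust it to your checkout:

```bash
# Run the server with the configuration file, equivalent to the IDE run configuration above
cd eagle-server
mvn compile exec:java \
  -Dexec.mainClass="org.apache.eagle.server.ServerMain" \
  -Dexec.args="server src/main/resources/configuration.yml"
```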
### Create a site

`http://localhost:9090/rest/sites` POST

```json
{
  "siteId": "testsite",
  "siteName": "testsite",
  "description": "test description",
  "context": {}
}
```
### Create logic alert engine topology
`http://localhost:9090/rest/metadata/topologies` POST

```json
{
  "name": "alertUnitTopology_1",
  "numOfSpout": 1,
  "numOfAlertBolt": 10,
  "numOfGroupBolt": 4,
  "spoutId": "alertEngineSpout",
  "groupNodeIds": ["streamRouterBolt0", "streamRouterBolt1", "streamRouterBolt2", "streamRouterBolt3"],
  "alertBoltIds": ["alertBolt0", "alertBolt1", "alertBolt2", "alertBolt3", "alertBolt4", "alertBolt5", "alertBolt6", "alertBolt7", "alertBolt8", "alertBolt9"],
  "pubBoltId": "alertPublishBolt",
  "spoutParallelism": 1,
  "groupParallelism": 1,
  "alertParallelism": 1
}
```
Refer to `eagle-core/eagle-alert-parent/eagle-alert-app/src/main/resources/META-INF/providers/org.apache.eagle.alert.app.AlertUnitTopologyAppProvider.xml` for the complete configuration.
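Since the topology definition is fairly long, it can be convenient to keep the JSON body above in a file and post the file instead, for example:

```bash
# topology.json holds the topology definition shown above (the file name is arbitrary)
curl -X POST -H "Content-Type: application/json" \
  -d @topology.json \
  http://localhost:9090/rest/metadata/topologies
```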
### Install the alert engine topology

`http://localhost:9090/rest/apps/install` POST

```json
{
  "siteId": "testsite",
  "appType": "AlertUnitTopologyApp",
  "mode": "LOCAL",
  "configuration": {}
}
```
### Start the alert engine topology

`http://localhost:9090/rest/apps/start` POST

Use the uuid of the application you just installed; the value below is only an example.

```json
{
  "uuid": "dc61c4b8-f60d-4d95-bfd7-f6b07382a3f3",
  "appId": "AlertUnitTopologyApp-testsite"
}
```
### Install the HDFS audit log application

`http://localhost:9090/rest/apps/install` POST

The `dataSourceConfig.topic` property points the application at the Kafka topic it consumes from.

```json
{
  "siteId": "testsite",
  "appType": "HdfsAuditLogApplication",
  "mode": "LOCAL",
  "configuration": {
    "dataSourceConfig.topic": "hdfs_audit_log"
  }
}
```
### Start the HDFS audit log application

`http://localhost:9090/rest/apps/start` POST

Again, use the uuid of the application you just installed, not the example value below.

```json
{
  "uuid": "dc61c4b8-f60d-4d95-bfd7-f6b07382a3f3",
  "appId": "HdfsAuditLogApplication-testsite"
}
```
### Check data sources and streams

Once both applications are installed, their data source and stream metadata should be registered. Verify with:

`http://localhost:9090/rest/metadata/datasources` GET

`http://localhost:9090/rest/metadata/streams` GET
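For example:

```bash
# Both calls should return the metadata registered by the installed applications
curl http://localhost:9090/rest/metadata/datasources
curl http://localhost:9090/rest/metadata/streams
```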
### Create a policy

`http://localhost:9090/rest/metadata/policies` POST

The example policy below uses a Siddhi query to raise an alert whenever the `user` field of `hdfs_audit_log_enriched_stream` equals `hadoop`, partitioning the stream by `user`:

```json
{
  "name": "hdfsPolicy",
  "description": "hdfsPolicy",
  "inputStreams": ["hdfs_audit_log_enriched_stream"],
  "outputStreams": ["hdfs_audit_log_enriched_stream_out"],
  "definition": {
    "type": "siddhi",
    "value": "from hdfs_audit_log_enriched_stream[user=='hadoop'] select * insert into hdfs_audit_log_enriched_stream_out"
  },
  "partitionSpec": [
    {
      "streamId": "hdfs_audit_log_enriched_stream",
      "type": "GROUPBY",
      "columns": ["user"]
    }
  ],
  "parallelismHint": 2
}
```
{ "name":"hdfs_audit_log_enriched_stream_out", "type":"org.apache.eagle.alert.engine.publisher.impl.AlertEmailPublisher", "policyIds": [ "hdfsPolicy" ], "properties": { "subject":"alert when user is hadoop", "template":"", "sender": "eagle@apache.org", "recipients": "eagle@apache.org", "mail.smtp.host":"", "connection": "plaintext", "mail.smtp.port": "25" }, "dedupIntervalMin" : "PT1M", "serializer" : "org.apache.eagle.alert.engine.publisher.impl.StringEventSerializer" }
### Send sample messages to the Kafka topic

Produce test events into the `hdfs_audit_log` topic that the HDFS audit log application consumes:

```bash
./kafka-console-producer.sh --topic hdfs_audit_log --broker-list sandbox.hortonworks.com:6667
```

Then paste an HDFS audit log line such as:

```
2015-04-24 12:51:31,798 INFO FSNamesystem.audit: allowed=true ugi=hdfs (auth:SIMPLE) ip=/10.0.2.15 cmd=getfileinfo src=/apps/hbase/data dst=null perm=null proto=rpc
```
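If the `hdfs_audit_log` topic does not exist yet, or if you want to replay a whole file of events instead of pasting lines by hand, something along these lines works. The ZooKeeper address and the sample file name are assumptions for illustration; also note that `hdfsPolicy` only matches events whose user resolves to `hadoop`, so the sample line above (ugi=hdfs) may not trigger an alert by itself:

```bash
# Create the topic if it does not already exist
# (ZooKeeper address is environment-specific; 2181 is the usual sandbox port)
./kafka-topics.sh --create --zookeeper sandbox.hortonworks.com:2181 \
  --replication-factor 1 --partitions 1 --topic hdfs_audit_log

# Replay a file of sample audit log lines into the topic
# (sample_audit.log is a hypothetical file containing lines like the one above)
./kafka-console-producer.sh --topic hdfs_audit_log \
  --broker-list sandbox.hortonworks.com:6667 < sample_audit.log
```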