
Hadoop Authentication Service (HAS)

A dedicated Hadoop authentication server that supports authentication mechanisms beyond Kerberos alone. At its core it leverages a Kerby KDC developed by Apache Kerby, a subproject of Apache Directory.

High level considerations

  • Hadoop services are still strongly authenticated by Kerberos, as Kerberos is so far the only means to enable Hadoop security.
  • Hadoop users can continue to use their familiar login methods.
  • Security admins won't have to migrate their user accounts to Kerberos and keep the two stores in sync.
  • New authentication mechanisms can be customized and plugged in.

Architecture

Design

Assuming existing users are stored in a SQL database (like MySQL), the detailed design and workflow may go like the following:

New mechanism plugin API

HAS client plugin HasClientPlugin:

// Get the login module type ID, used to distinguish this module from others. 
// Should correspond to the server side module.
String getLoginType()

// Perform all the client side login logics, the results wrapped in an AuthToken, 
// will be validated by HAS server.
AuthToken login(Conf loginConf) throws HasLoginException

HAS server plugin HasServerPlugin:

// Get the login module type ID, used to distinguish this module from others. 
// Should correspond to the client side module.
String getLoginType()

// Perform all the server side authentication logics, the results wrapped in an AuthToken, 
// will be used to exchange a Kerberos ticket.
AuthToken authenticate(AuthToken userToken) throws HasAuthenException
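Taken together, a matching client/server plugin pair might look like the following minimal sketch. The `AuthToken`, configuration, and exception classes here are simplified stand-ins for the real types shipped in has-common, and the password-based mechanism (with its attribute names and in-memory user store) is purely illustrative, not part of the actual HAS API:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for the HAS AuthToken: a bag of named attributes.
class AuthToken {
    private final Map<String, Object> attributes = new HashMap<>();
    void setAttribute(String name, Object value) { attributes.put(name, value); }
    Object getAttribute(String name) { return attributes.get(name); }
}

class HasLoginException extends Exception {
    HasLoginException(String msg) { super(msg); }
}

class HasAuthenException extends Exception {
    HasAuthenException(String msg) { super(msg); }
}

// Hypothetical client-side plugin: logs in with a username/password pair
// read from the login configuration (a plain Map here instead of Conf).
class PasswordClientPlugin {
    public String getLoginType() {
        return "PASSWORD"; // must match the server-side plugin's type ID
    }

    public AuthToken login(Map<String, String> loginConf) throws HasLoginException {
        String user = loginConf.get("user");
        String password = loginConf.get("password");
        if (user == null || password == null) {
            throw new HasLoginException("user and password are required");
        }
        AuthToken token = new AuthToken();
        token.setAttribute("principal", user);
        token.setAttribute("secret", password);
        return token;
    }
}

// Matching server-side plugin: validates the token against a user store
// (an in-memory Map stands in for the SQL database mentioned in the design).
class PasswordServerPlugin {
    private final Map<String, String> userDb;

    PasswordServerPlugin(Map<String, String> userDb) { this.userDb = userDb; }

    public String getLoginType() { return "PASSWORD"; }

    public AuthToken authenticate(AuthToken userToken) throws HasAuthenException {
        String principal = (String) userToken.getAttribute("principal");
        String secret = (String) userToken.getAttribute("secret");
        if (principal == null || secret == null
                || !secret.equals(userDb.get(principal))) {
            throw new HasAuthenException("invalid credentials for " + principal);
        }
        AuthToken verified = new AuthToken();
        verified.setAttribute("principal", principal);
        return verified; // HAS would exchange this for a Kerberos ticket
    }
}
```

The key contract is that `getLoginType()` returns the same ID on both sides, so the server can dispatch an incoming `AuthToken` to the plugin that produced it.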

REST API

Please look at REST API for details.

How to start

Please look at How to start for details.

High Availability

Please look at High Availability for details.

Cross Realm

Please look at How to setup cross-realm for details.

Enable Hadoop ecosystem components

List of supported Hadoop ecosystem components

| Big Data Components | Supported | Rebuild Required | Configuring Required |
| --- | --- | --- | --- |
| Hadoop | Yes | Yes | Yes |
| Zookeeper | Yes | Yes | Yes |
| HBase | Yes | Yes | Yes |
| Hive | Yes | No | Yes |
| Phoenix | Yes | No | Yes |
| Thrift | Yes | No | Yes |
| Spark | Yes | No | Yes |
| Oozie | Yes | No | Yes |
| Presto | Yes (0.148 and later) | No | Yes |
| Pig | Yes | No | No |
| Sqoop | Yes | No | No |

Performance test report

Please look at Performance test report for details.