E-MapReduce clusters also support LDAP-based authentication, in which the account system is managed through an LDAP service. The Kerberos client uses the LDAP account information as the identity for authentication.

LDAP identity authentication

LDAP accounts can be shared with other services, such as Hue. You can use the LDAP service (ApacheDS) configured in the E-MapReduce cluster or an existing external LDAP service; in either case, you only need to configure it on the Kerberos server.

The following example uses the LDAP service (ApacheDS) that is started by default in the cluster:

  • Configure the basic environment on the Gateway. (This is the same as the second part of the RAM section; skip this step if it has already been configured.)

    The only difference is that auth_type in /etc/has/has-client.conf needs to be changed to LDAP.

    Alternatively, you can leave /etc/has/has-client.conf unmodified: the test user can copy the file to their own directory, change auth_type in the copy, and point the client to it through an environment variable. For example:

    export HAS_CONF_DIR=/home/test/has-conf
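The copy-and-point flow above can be sketched as follows. This is a minimal sketch: the source file and the exact `auth_type = ...` line format are assumptions, and stand-in paths under /tmp are used so the steps can be shown end to end (on a real gateway you would copy /etc/has/has-client.conf into a directory such as /home/test/has-conf).

```shell
# Stand-in for /etc/has/has-client.conf (assumption: key = value lines).
src=$(mktemp)
printf 'auth_type = RAM\n' > "$src"

conf_dir=/tmp/has-conf                 # the document uses /home/test/has-conf
mkdir -p "$conf_dir"
cp "$src" "$conf_dir/has-client.conf"
# Switch the authentication type in the private copy to LDAP.
sed -i 's/^auth_type.*/auth_type = LDAP/' "$conf_dir/has-client.conf"
# Point the Kerberos client at the private copy.
export HAS_CONF_DIR="$conf_dir"
grep auth_type "$conf_dir/has-client.conf"    # auth_type = LDAP
```

The system-wide file stays untouched, so other users on the same Gateway are unaffected.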

  • Configure the LDAP administrator user name and password on the Kerberos server (HAS) in the E-MapReduce console.

    On the E-MapReduce console, go to Configuration Management > HAS Software, enter the LDAP administrator user name and password in the bind_dn and bind_password fields, and restart the HAS service.

    In this example, the LDAP service is the ApacheDS service in the E-MapReduce cluster, so the values for these fields can be obtained from the ApacheDS configuration.
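Conceptually, the two HAS fields would hold values like the sketch below. The values shown are the example administrator DN and password used by the ldapmodify command later in this document; the exact on-disk format of the HAS configuration is an assumption.

```
bind_dn = uid=admin,ou=system     # ApacheDS administrator DN (manager_dn)
bind_password = Ns1aSe            # ApacheDS administrator password (manager_password)
```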

  • The E-MapReduce cluster administrator adds user information to LDAP.
    • Obtain the administrator user name and password for the ApacheDS LDAP service: manager_dn and manager_password are shown on the Configuration Management > ApacheDS Configuration page of the E-MapReduce console.
    • Add the test user and password to ApacheDS.
      Log on to the emr-header-1 node of the cluster as the root account.
       Create a file test.ldif with the following content:
       dn: cn=test,ou=people,o=emr
       objectclass: inetOrgPerson
       objectclass: organizationalPerson
       objectclass: person
       objectclass: top
       cn: test
       sn: test
       mail: test@example.com
       userpassword: test1234
       # Add the entry to LDAP; the -w option specifies the manager_password (Ns1aSe in this example)
       ldapmodify -x -h localhost -p 10389 -D "uid=admin,ou=system" -w "Ns1aSe" -a -f test.ldif
       #Delete test.ldif
       rm test.ldif
      Provide the added user name and password to the test user.
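If the administrator needs to add several users, the LDIF shown above can be generated instead of written by hand. The helper below is hypothetical (not part of E-MapReduce); it only mirrors the test.ldif layout, and the user name "alice" is an illustrative example.

```shell
# Hypothetical helper: emit an LDIF entry for a new user under ou=people,o=emr,
# with the same object classes and attributes as test.ldif above.
make_user_ldif() {
  user=$1; password=$2
  cat <<EOF
dn: cn=$user,ou=people,o=emr
objectclass: inetOrgPerson
objectclass: organizationalPerson
objectclass: person
objectclass: top
cn: $user
sn: $user
mail: $user@example.com
userpassword: $password
EOF
}

# Write an entry for a hypothetical user "alice"; import it with the same
# ldapmodify invocation shown above.
make_user_ldif alice alice1234 > alice.ldif
head -1 alice.ldif    # dn: cn=alice,ou=people,o=emr
```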
  • The test user configures the LDAP information.
    Log on to the Gateway as the test account.
     # Run the script
     sh add_ldap.sh test
    The attached add_ldap.sh script records the LDAP account information in the user's .bashrc:
     user=$1
     if [[ `grep 'export LDAP_' /home/$user/.bashrc` == "" ]]; then
       echo "
     # Modify YOUR_LDAP_USER/YOUR_LDAP_PWD to the user's actual LDAP user name and password
     export LDAP_USER=YOUR_LDAP_USER
     export LDAP_PWD=YOUR_LDAP_PWD
     " >> /home/$user/.bashrc
     else
       echo "$user LDAP user info has been added to .bashrc"
     fi
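The key behavior of add_ldap.sh is that it is idempotent: the export lines are appended only once, however often the script runs. The self-contained sketch below demonstrates that check; /tmp/demo_bashrc stands in for the user's /home/&lt;user&gt;/.bashrc, and the function name is illustrative.

```shell
# Demo of the guard used by add_ldap.sh: append only when no
# "export LDAP_" line is present yet.
rc=/tmp/demo_bashrc
: > "$rc"                                  # start from an empty stand-in file
add_ldap_info() {
  if ! grep -q 'export LDAP_' "$rc"; then
    printf 'export LDAP_USER=%s\nexport LDAP_PWD=%s\n' "$1" "$2" >> "$rc"
  else
    echo "LDAP user info has been added"
  fi
}
add_ldap_info test test1234                # first call appends the exports
add_ldap_info test test1234                # second call is a no-op
grep -c 'export LDAP_USER' "$rc"           # 1
```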
  • The test user accesses the cluster services.

    Run an HDFS command:

    [test@iZbp1cyio18s5ymggr7yhrZ ~]$ hadoop fs -ls /
      17/11/19 13:33:33 INFO client.HasClient: The plugin type is: LDAP
      Found 4 items
      drwxr-x---   - has    hadoop          0 2017-11-18 21:12 /apps
      drwxrwxrwt   - hadoop hadoop          0 2017-11-19 13:33 /spark-history
      drwxrwxrwt   - hadoop hadoop          0 2017-11-19 12:41 /tmp
      drwxrwxrwt   - hadoop hadoop          0 2017-11-19 12:41 /user
    Run Hadoop or Spark jobs as usual.