
Node Configuration
==================

1. copy the client factory prototype 
   cp hadoop-connector/src/main/resources/prototype/HadoopClient.prototype.cfo .tfcache/pkg/HadoopClient.localhost.cfo

2. update the client factory 
   defaultUserId should be set to the username that will access the HDFS file system. 
                 This user must be known to Hadoop. 
                 If Kerberos authentication is disabled both in the factory and on the Hadoop server, 
                 no authentication is performed; only authorization for the specified username 
                 is done on the Hadoop server. 
                  

   <?xml version="1.0"?>
   <ClientFactory>
     <defaultUserId>user1</defaultUserId>
     <defaultPassword></defaultPassword>
     <factoryType>HadoopClient</factoryType>
     <factoryName>localhost</factoryName>
     <factoryDesc>Hadoop Connection Factory</factoryDesc>
     <className>com.streamscape.lib.fs.client.hadoop.HadoopFileSystemClientConnection</className>
     <isReliable>true</isReliable>
     <isManaged>false</isManaged>
     <checkInterval>2000</checkInterval>
     <url>hdfs://localhost:9000</url>
     <minorVersion>1</minorVersion>
     <majorVersion>0</majorVersion>
     <vendorString>Hadoop Factory</vendorString>
     <factoryProperties>
       <property name="resource.manager.address" value="localhost:8032"/>
   
       <property name="hadoop.dfs.client.use.datanode.hostname" value="true"/>
       
       <property name="kerberos.enabled" value="false"/>
       <property name="kerberos.keytab" value=""/>
       <property name="kerberos.ticket.cache" value=""/>
       <property name="java.security.krb5.conf" value=""/>
       <property name="java.security.krb5.realm" value=""/>
       <property name="java.security.krb5.kdc" value=""/>
     </factoryProperties>
   </ClientFactory>
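
   The factory above leaves Kerberos disabled. If Kerberos authentication is needed,
   the same properties would be filled in along these lines; the keytab path, realm,
   and KDC host below are placeholders for illustration, not values shipped with the
   connector:

```xml
<!-- Hypothetical Kerberos-enabled variant of the factoryProperties block.
     Replace the keytab path, realm, and KDC host with your site's values. -->
<property name="kerberos.enabled" value="true"/>
<property name="kerberos.keytab" value="/etc/security/keytabs/user1.keytab"/>
<property name="kerberos.ticket.cache" value=""/>
<property name="java.security.krb5.conf" value="/etc/krb5.conf"/>
<property name="java.security.krb5.realm" value="EXAMPLE.COM"/>
<property name="java.security.krb5.kdc" value="kdc.example.com"/>
```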

3. import hadoop libraries 

3.1 using a single sthadoop-with-deps.jar

3.1.1 start node

3.1.2 import jar file
    
    import archive sthadoop-with-deps from '<stroot dir>/platform/lib'
    create package client.HadoopClient archives (sthadoop-with-deps.jar)
    register package client.HadoopClient autoload true 
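
Before running the import, it can help to confirm the archive is actually on disk.
A minimal shell sketch; check_jar is a helper name introduced here, and the demo
file below stands in for <stroot dir>/platform/lib/sthadoop-with-deps.jar:

```shell
# Hypothetical helper: report whether a jar exists at the given path.
check_jar() {
    if [ -f "$1" ]; then
        echo "found: $1"
    else
        echo "missing: $1" >&2
        return 1
    fi
}

# Demo against a throwaway file standing in for the real archive path.
DEMO_JAR="$(mktemp -d)/sthadoop-with-deps.jar"
touch "$DEMO_JAR"
RESULT="$(check_jar "$DEMO_JAR")"
echo "$RESULT"
```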

or 

3.2 using separate jars

3.2.1 copy archives and package

   cp hadoop-connector/libs/* .tfcache/lib/
   cp hadoop-connector/libsmapred/* .tfcache/lib/
   cp hadoop-connector/target/sthadoop.jar .tfcache/lib/
   cp hadoop-connector/src/main/resources/prototype/client.HadoopClient.pkg .tfcache/pkg/client.HadoopClient.pkg
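
The copy step above can be sketched as a small script. CONNECTOR_DIR and NODE_DIR
are assumptions standing in for the hadoop-connector checkout and the node's working
directory; the sketch stages dummy files in a temporary sandbox so it runs anywhere:

```shell
# Sketch of step 3.2.1: stage connector jars and the package file
# into the node's .tfcache directories.
SANDBOX="$(mktemp -d)"
CONNECTOR_DIR="$SANDBOX/hadoop-connector"   # assumed checkout location
NODE_DIR="$SANDBOX/node"                    # assumed node working directory

# Create a dummy source tree so the sketch is self-contained.
mkdir -p "$CONNECTOR_DIR/libs" "$CONNECTOR_DIR/libsmapred" \
         "$CONNECTOR_DIR/target" "$CONNECTOR_DIR/src/main/resources/prototype"
touch "$CONNECTOR_DIR/libs/hadoop-common.jar" \
      "$CONNECTOR_DIR/libsmapred/hadoop-mapreduce.jar" \
      "$CONNECTOR_DIR/target/sthadoop.jar" \
      "$CONNECTOR_DIR/src/main/resources/prototype/client.HadoopClient.pkg"

# The actual copies from step 3.2.1.
mkdir -p "$NODE_DIR/.tfcache/lib" "$NODE_DIR/.tfcache/pkg"
cp "$CONNECTOR_DIR"/libs/* "$NODE_DIR/.tfcache/lib/"
cp "$CONNECTOR_DIR"/libsmapred/* "$NODE_DIR/.tfcache/lib/"
cp "$CONNECTOR_DIR/target/sthadoop.jar" "$NODE_DIR/.tfcache/lib/"
cp "$CONNECTOR_DIR/src/main/resources/prototype/client.HadoopClient.pkg" \
   "$NODE_DIR/.tfcache/pkg/client.HadoopClient.pkg"

ls "$NODE_DIR/.tfcache/lib"
```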

3.2.2 start node

3.2.3 register and load package
   
      load package client.HadoopClient autoload true
      
4. If hadoop is located on another server and the hadoop host name is not known to the 
   node server, add the host name to /etc/hosts and set 
   hadoop.dfs.client.use.datanode.hostname to true, e.g.: 
   
   23.100.27.151 SST-LX-HADOOP-SERVER
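
The hosts-file edit can be scripted. Since editing /etc/hosts requires root, this
sketch works against a temporary file standing in for it; the IP and host name are
the example values from step 4:

```shell
# Append the Hadoop host entry if it is not already present.
# HOSTS_FILE stands in for /etc/hosts so the sketch can run unprivileged.
HOSTS_FILE="$(mktemp)"
HADOOP_IP="23.100.27.151"
HADOOP_HOST="SST-LX-HADOOP-SERVER"

grep -q "$HADOOP_HOST" "$HOSTS_FILE" || \
    echo "$HADOOP_IP $HADOOP_HOST" >> "$HOSTS_FILE"

# Verify the entry is in the file.
ENTRY="$(grep "$HADOOP_HOST" "$HOSTS_FILE")"
echo "$ENTRY"
```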
         
