Configuring the connection to Hive - 7.0


About this task

Configuring tHiveConnection

Procedure

  1. Double-click tHiveConnection to open its Component view.
  2. From the Property type list, select Built-in. If you have created the connection to be used in the Repository, select Repository instead, click the [...] button to open the [Repository content] dialog box, and select that connection. The Studio then reuses that set of connection information for this Job.
    For further information about how to create a Hadoop connection in the Repository, see the chapter describing the Hadoop cluster node in the Talend Open Studio for Big Data Getting Started Guide.
  3. In the Version area, select the Hadoop distribution to be used and its version. If the distribution corresponding to yours is not in the list, select Custom to connect to a Hadoop distribution not officially supported in the Studio.
    For a step-by-step example about how to use this Custom option, see Connecting to a custom Hadoop distribution.
  4. In the Connection area, enter the connection parameters for the Hive database to be used. For an illustration of what these parameters typically translate to, see the JDBC sketch after this procedure.
  5. In the Name node field, enter the location of the master node, the NameNode, of the distribution to be used. For example, hdfs://talend-hdp-all:8020. If you are using WebHDFS, the location should be webhdfs://masternode:portnumber; if this WebHDFS is secured with SSL, the scheme should be swebhdfs and you need to use a tLibraryLoad in the Job to load the library required by the secured WebHDFS.
  6. In the Job tracker field, enter the location of the JobTracker of your distribution. For example, talend-hdp-all:50300.
    Note that the word Job in the term JobTracker refers to the MapReduce (MR) jobs described in Apache's documentation at http://hadoop.apache.org/. The configuration sketch after this procedure shows how these Name node and Job tracker values map to standard Hadoop client properties.
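
The connection parameters from step 4 typically amount to a Hive connection that could also be opened directly with the Hive JDBC driver. The following Java sketch is an illustration only, not code generated by the Studio; the host name talend-hdp-all, the default HiveServer2 port 10000, the database name, and the credentials are placeholder assumptions to be replaced with your own values.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveConnectionSketch {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC URL format: jdbc:hive2://<host>:<port>/<database>
            String url = "jdbc:hive2://talend-hdp-all:10000/default";

            // Load the Hive JDBC driver (hive-jdbc must be on the classpath).
            Class.forName("org.apache.hive.jdbc.HiveDriver");

            try (Connection conn = DriverManager.getConnection(url, "hiveuser", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }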
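
Similarly, the Name node and Job tracker locations from steps 5 and 6 correspond to the values a plain Hadoop client receives through its configuration. The sketch below illustrates that mapping under the assumption of the standard Hadoop property names fs.defaultFS and mapred.job.tracker; the host names and ports are the example values used above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class HadoopLocationSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Name node field: the NameNode URI. Use webhdfs://masternode:portnumber
            // for WebHDFS, or swebhdfs://... when WebHDFS is secured with SSL.
            conf.set("fs.defaultFS", "hdfs://talend-hdp-all:8020");

            // Job tracker field: the JobTracker location (MapReduce v1 style clusters).
            conf.set("mapred.job.tracker", "talend-hdp-all:50300");

            // Opening the file system verifies that the NameNode URI is usable.
            FileSystem fs = FileSystem.get(conf);
            System.out.println("Connected to: " + fs.getUri());
            fs.close();
        }
    }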