Linking components - 7.1

Java custom code for Map Reduce

author
Talend Documentation Team
EnrichVersion
7.1
EnrichProdName
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Real-Time Big Data Platform
task
Data Governance > Third-party systems > Custom code components (Integration) > Java custom code component for Map Reduce
Data Quality and Preparation > Third-party systems > Custom code components (Integration) > Java custom code component for Map Reduce
Design and Development > Third-party systems > Custom code components (Integration) > Java custom code component for Map Reduce
EnrichPlatform
Talend Studio

Procedure

  1. In the Integration perspective of the Studio, create an empty Map/Reduce Job from the Job Designs node in the Repository tree view.
    For further information about how to create a Map/Reduce Job, see the Talend Open Studio for Big Data Getting Started Guide.
  2. Drop a tHDFSInput component, a tJavaMR component, and a tHDFSOutput component onto the design workspace.
    The tHDFSInput component reads data from the Hadoop distribution to be used and the tHDFSOutput component writes the processed data back into that distribution. The tJavaMR component holds the custom Java map code (a sketch of the kind of logic it can contain follows this procedure).
  3. Connect these components using the Row > Main link.
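
To illustrate the kind of per-record map logic a tJavaMR step typically applies, the following is a minimal sketch written as a standalone Hadoop Mapper. It is not code generated by the Studio, and the class name WordLengthMapper and the token-length transformation are assumptions made for this example only.

  // A minimal sketch, assuming a word/length transformation; the class name
  // and output schema are hypothetical and not produced by Talend Studio.
  import java.io.IOException;

  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;

  public class WordLengthMapper
          extends Mapper<LongWritable, Text, Text, IntWritable> {

      @Override
      protected void map(LongWritable key, Text value, Context context)
              throws IOException, InterruptedException {
          // Emit each whitespace-separated token together with its character
          // length, the kind of per-record transformation custom map code
          // can perform between tHDFSInput and tHDFSOutput.
          for (String token : value.toString().split("\\s+")) {
              if (!token.isEmpty()) {
                  context.write(new Text(token), new IntWritable(token.length()));
              }
          }
      }
  }

In the Job itself, the equivalent logic would be entered in the tJavaMR component's code area rather than as a separate class, with the input and output schemas defined on the component.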