Computing data with the Hadoop Distributed File System (HDFS)

This scenario applies only to Talend products with Big Data.

For more technologies supported by Talend, see Talend components.

The following scenario describes a simple Job that creates a file in a defined local directory, puts it into HDFS, gets it back out of HDFS into another local directory, and reads it at the end of the Job.
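In a Talend Job these steps are performed with graphical components (typically tHDFSPut and tHDFSGet for the transfer steps), not hand-written code. For readers who want to see what the underlying HDFS round trip looks like, the sketch below shows the equivalent put and get operations with the Hadoop Java FileSystem API. The NameNode URI, file paths, and class name are assumptions for illustration only, not values from this scenario.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRoundTrip {
    public static void main(String[] args) throws IOException {
        // Assumed NameNode URI; replace with your cluster's address.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Assumed paths for illustration.
            Path localSource = new Path("/tmp/in/example.txt");
            Path hdfsTarget  = new Path("/user/talend/example.txt");
            Path localTarget = new Path("/tmp/out/");

            // Put the local file into HDFS (what tHDFSPut does in the Job).
            fs.copyFromLocalFile(localSource, hdfsTarget);

            // Get the file back out of HDFS into another local directory
            // (what tHDFSGet does in the Job).
            fs.copyToLocalFile(hdfsTarget, localTarget);
        }
    }
}

The retrieved file can then be read back with a standard file input component at the end of the Job, exactly as the scenario describes.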
