
Loading data from the file on S3 to Redshift


  1. Double-click tRedshiftBulkExec to open its Basic settings view on the Component tab.
  2. In the Host field, press Ctrl + Space and select context.redshift_host from the list to fill in this field.
    Fill in the following fields the same way:
    • the Port field with context.redshift_port,

    • the Database field with context.redshift_database,

    • the Schema field with context.redshift_schema,

    • the Username field with context.redshift_username,

    • the Password field with context.redshift_password,

    • the Access Key field with context.s3_accesskey,

    • the Secret Key field with context.s3_secretkey, and

    • the Bucket field with context.s3_bucket.

  3. In the Table Name field, enter the name of the table to be written. In this example, it is person.
  4. From the Action on table list, select Drop table if exists and create.
  5. In the Key field, enter the name of the file on Amazon S3 to be loaded. In this example, it is person_load.
  6. Click the [...] button next to Edit schema and in the pop-up window define the schema by adding two columns: ID of Integer type and Name of String type.
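Under the hood, tRedshiftBulkExec loads the S3 object into the target table with a Redshift COPY statement. The sketch below assembles such a statement from the same context values configured above, as a rough illustration; the exact SQL that Talend generates, the sample values, and the delimiter are assumptions, not taken from this page.

```python
# Hypothetical context values mirroring the context variables used in the
# steps above (the real values come from the job's context configuration).
context = {
    "redshift_schema": "public",       # assumed schema name
    "s3_bucket": "my-bucket",          # assumed bucket name
    "s3_accesskey": "AKIAEXAMPLE",     # placeholder credentials
    "s3_secretkey": "secretExample",
}

table_name = "person"       # the Table Name set in step 3
s3_key = "person_load"      # the Key (S3 object name) set in step 5

# Build a Redshift COPY statement loading the S3 file into the table.
# The DELIMITER clause is an assumption about the file's field separator.
copy_sql = (
    f"COPY {context['redshift_schema']}.{table_name} "
    f"FROM 's3://{context['s3_bucket']}/{s3_key}' "
    f"CREDENTIALS 'aws_access_key_id={context['s3_accesskey']};"
    f"aws_secret_access_key={context['s3_secretkey']}' "
    "DELIMITER ';'"
)
print(copy_sql)
```

Running such a statement directly (for example through a JDBC or psycopg2 connection) loads the file the same way the component does, which can be useful when debugging a failing bulk load.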
