Procedure
- Double-click tBigQueryOutput to open its Component view.
- Click Sync columns to retrieve the schema from its preceding component.
- In the Local filename field, enter the path to the local file to be created and then transferred to BigQuery.
- Navigate to the Google APIs Console in your web browser to access the Google project hosting the BigQuery and Cloud Storage services you need to use.
- Click Google Cloud Storage > Interoperable Access to open its view.
- In the Google storage configuration area of the Component view, paste the Access key and the Access secret from the Interoperable Access tab view into the corresponding fields.
- In the Bucket field, enter the path to the bucket in which you want to store the transferred data. In this example, it is talend/documentation. This bucket must already exist in Cloud Storage.
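As an illustrative sketch (not Talend code), the value in the Bucket field is the path that, prefixed with gs:// and followed by the file name, forms the object URI used in the File field. The helper below is hypothetical and only shows how the two values relate in this example:

```python
# Hypothetical helper (not part of Talend): builds the gs:// URI for the
# File field from the Bucket field value and the file name.
def build_gs_uri(bucket_path, file_name):
    # Strip any surrounding slashes from the bucket path before joining.
    return "gs://{}/{}".format(bucket_path.strip("/"), file_name)

# Values taken from this example.
uri = build_gs_uri("talend/documentation", "biquery_UScustomer.csv")
print(uri)  # gs://talend/documentation/biquery_UScustomer.csv
```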
- In the File field, enter the directory in Google Cloud Storage where you receive and create the file to be transferred to BigQuery. In this example, it is gs://talend/documentation/biquery_UScustomer.csv. The file name must be the same as the one you defined in the Local filename field.
  Troubleshooting: if you encounter issues such as Unable to read source URI for the file stored in Google Cloud Storage, check whether you have entered the same file name in these two fields.
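The troubleshooting check above can be sketched as a small comparison of the two file names. This is an illustrative helper, not Talend code, and the local path used below is a hypothetical example:

```python
import posixpath

# Hypothetical check (not part of Talend): the file name at the end of the
# File field URI must match the file name from the Local filename field.
def names_match(local_filename, gs_uri):
    return posixpath.basename(local_filename) == posixpath.basename(gs_uri)

# Hypothetical local path paired with the URI from this example.
print(names_match("/tmp/biquery_UScustomer.csv",
                  "gs://talend/documentation/biquery_UScustomer.csv"))  # True
```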
- Enter 0 in the Header field so that no rows of the transferred data are ignored.
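As a rough sketch of what the Header value controls (this is illustrative Python, not Talend internals): a value of N discards the first N rows of the file, so 0 keeps every row. The sample data below is made up:

```python
import csv
import io

# Illustrative sketch (not Talend internals): skip the first `header` rows
# of a CSV payload, the way a Header value of N would.
def rows_after_header(text, header):
    reader = csv.reader(io.StringIO(text))
    return list(reader)[header:]

data = "1,Andrew\n2,Brenda\n"          # hypothetical sample rows
print(rows_after_header(data, 0))      # all rows kept when Header is 0
print(rows_after_header(data, 1))      # first row dropped when Header is 1
```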