Building access to BigQuery - 7.0

Google BigQuery

Procedure

  1. Double-click tBigQueryOutput to open its Component view.
  2. Click Sync columns to retrieve the schema from its preceding component.
  3. In the Local filename field, enter the path to the local file to be created and then transferred to BigQuery.
  4. In your web browser, navigate to the Google APIs Console and open the Google project hosting the BigQuery and Cloud Storage services you need to use.
  5. Click the API Access tab to open its view.
  6. In the Component view of the Studio, paste the Client ID, Client secret, and Project ID from the API Access tab into the corresponding fields.
  7. In the Dataset field, enter the dataset to which you need to transfer data. In this scenario, it is documentation.
    This dataset must exist in BigQuery.
  8. In the Table field, enter the name of the table you need to write data to, for example, UScustomer.
    If this table does not exist in BigQuery, select Create the table if it doesn't exist.
  9. In the Action on data field, select the action to be performed on the table. In this example, select Truncate to empty the target table of any existing contents and repopulate it with the transferred data (see the sketch after this procedure).
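
For reference only, the following is a minimal sketch of the same truncate-and-load outcome performed directly with the google-cloud-bigquery Java client library, outside of Talend Studio. It is not the code generated by the Job: tBigQueryOutput stages the local file through Google Cloud Storage and authenticates with the Client ID and Client secret, whereas this sketch streams the file straight into a BigQuery load job and assumes Application Default Credentials. The projectId and localFile values are placeholders standing in for the Project ID and Local filename settings described above.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.TableDataWriteChannel;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.WriteChannelConfiguration;

import java.io.OutputStream;
import java.nio.channels.Channels;
import java.nio.file.Files;
import java.nio.file.Paths;

public class TruncateAndLoad {
    public static void main(String[] args) throws Exception {
        // Placeholder values mirroring the component settings described above.
        String projectId = "my-gcp-project";                          // Project ID field
        String localFile = "/tmp/UScustomer.csv";                     // Local filename field
        TableId tableId = TableId.of("documentation", "UScustomer");  // Dataset and Table fields

        // Uses Application Default Credentials, unlike the OAuth Client ID/secret
        // flow configured in the component.
        BigQuery bigquery = BigQueryOptions.newBuilder()
                .setProjectId(projectId)
                .build()
                .getService();

        // WRITE_TRUNCATE mirrors the Truncate choice in Action on data:
        // existing rows are removed and replaced by the loaded data.
        WriteChannelConfiguration config = WriteChannelConfiguration.newBuilder(tableId)
                .setFormatOptions(FormatOptions.csv())
                .setWriteDisposition(JobInfo.WriteDisposition.WRITE_TRUNCATE)
                .build();

        // Stream the local file into a BigQuery load job.
        TableDataWriteChannel writer = bigquery.writer(config);
        try (OutputStream stream = Channels.newOutputStream(writer)) {
            Files.copy(Paths.get(localFile), stream);
        }

        // Wait for the load job to finish and surface any error.
        Job job = writer.getJob().waitFor();
        if (job.getStatus().getError() != null) {
            throw new RuntimeException("Load failed: " + job.getStatus().getError());
        }
        System.out.println("Loaded data into " + tableId);
    }
}

Replacing WRITE_TRUNCATE with WRITE_APPEND in this sketch would keep the existing rows and add the transferred data to them instead of emptying the table first.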