Big Data
Issue | Description | Available in |
---|---|---|
Last column is blank when using tFileInputDelimited | When the last item of a row in a file or database is null, the null item is no longer treated as an empty space. This avoids the row or column rejection that produced blank data when using tFileInputDelimited. However, an empty space caused by a missing item is still rejected. | ⓘ Available in: Big Data, Big Data Platform, Cloud Big Data, Cloud Big Data Platform, Cloud Data Fabric, Data Fabric, Real-Time Big Data Platform (all subscription-based Talend products with Big Data) |
Decimal precision needs to be provided in tHiveOutput schema | When you migrate a Job from Talend 7.2.1 to 7.3.1 and write data to a Hive table, you can now load data without specifying length and precision in the schema. The default values are 38 for length and 18 for precision. | ⓘ Available in: Big Data, Big Data Platform, Cloud Big Data, Cloud Big Data Platform, Cloud Data Fabric, Data Fabric, Real-Time Big Data Platform (all subscription-based Talend products with Big Data) |
Schema mismatch detected when writing to the Delta table | The issue that prevented writing data to the Delta table has been fixed. When a component such as tAggregateRow is used before tDeltaOutput to load data into the output table, the database column name is now used instead of the schema column name. | ⓘ Available in: Big Data, Big Data Platform, Cloud Big Data, Cloud Big Data Platform, Cloud Data Fabric, Data Fabric, Real-Time Big Data Platform (all subscription-based Talend products with Big Data) |
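The default values in the tHiveOutput fix above (length 38, precision 18) describe a decimal with up to 38 total digits, 18 of them after the decimal point. The following sketch is illustration only, not Talend code; `fits_decimal` is a hypothetical helper showing what those two bounds mean for a value:

```python
from decimal import Decimal

# Defaults mentioned in the release note above: 38 for length (total
# digits) and 18 for precision (digits after the decimal point).
LENGTH, PRECISION = 38, 18

def fits_decimal(value: str, length: int = LENGTH, precision: int = PRECISION) -> bool:
    """Check whether a numeric string fits in a decimal(length, precision) column."""
    sign, digits, exponent = Decimal(value).as_tuple()
    frac_digits = max(0, -exponent)               # digits after the decimal point
    int_digits = max(0, len(digits) + exponent)   # digits before the decimal point
    return frac_digits <= precision and int_digits <= length - precision

# 20 integer digits + 18 fractional digits = 38 total -> fits the defaults
print(fits_decimal("12345678901234567890.123456789012345678"))  # True
# 19 fractional digits exceed the default precision of 18 -> does not fit
print(fits_decimal("1.1234567890123456789"))                    # False
```

With these defaults applied automatically, a migrated Job only needs explicit length and precision when its data falls outside these bounds.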