Datasources
Datasources created with the source database as Oracle
Oracle as source
1. Target datatype is created as Numeric/Double when source is Number:
Issue description: When the source datatype is Number without any precision or scale, the datatype created in the target Redshift table is Numeric/Double.
Explanation: As per the AWS documentation, when the precision and scale are 0 at the source, a Real-equivalent datatype must be used in the target. Refer to this link for more details in the data types section.
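The mapping described above can be sketched as a small helper. This is an illustrative sketch of the fallback rule, not Amorphic's actual implementation; the function name and the exact target type strings are assumptions.

```python
def map_oracle_number(precision=0, scale=0):
    """Illustrative mapping of an Oracle NUMBER column to a Redshift type.

    Assumption: precision/scale of 0 means they were not declared at the
    source, which is the case where a Real-equivalent (floating-point)
    target type is used instead of NUMERIC.
    """
    if precision == 0 and scale == 0:
        # NUMBER with no precision/scale -> floating-point equivalent
        return "DOUBLE PRECISION"
    # Declared precision/scale map to an exact NUMERIC type
    return f"NUMERIC({precision},{scale})"
```

With no declared precision or scale the column lands as a floating-point type, which is why the target shows Numeric/Double rather than an exact NUMERIC.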
S3 Datasource
1. Ingestion failure for duplicate file names in source folder
Issue Description: If the source folder contains multiple files with the same name (in subfolders), only one file will be ingested into the target.
Explanation: This occurs when the file type is XLSX and the target is a Data Warehouse (Aurora MySQL / Redshift).
- When an XLSX file is uploaded to a dataset in Amorphic with DWH as the target, the file is converted to CSV before loading.
- Since XLSX files cannot be loaded directly into DWH, this conversion changes the file type and size.
- As a result, the system cannot compare source and target files correctly.
Resolution: Keep file names unique in the source folder and this issue will not occur.
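Before triggering an ingestion, the source prefix can be scanned for colliding file names. The sketch below groups object keys by their base file name; the function name is an assumption and the key list would normally come from an S3 listing.

```python
from collections import defaultdict
from pathlib import PurePosixPath

def find_duplicate_names(keys):
    """Return file names that appear under more than one (sub)folder.

    keys: iterable of S3 object keys, e.g. from a bucket listing.
    Files sharing a base name would collide during ingestion, so only
    one of them would reach the target.
    """
    by_name = defaultdict(list)
    for key in keys:
        # PurePosixPath handles '/'-delimited S3 keys portably
        by_name[PurePosixPath(key).name].append(key)
    return {name: paths for name, paths in by_name.items() if len(paths) > 1}
```

Running this over the source folder before ingestion flags names that must be made unique.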
ArcGIS Datasource
1. Unable to update the feature service layer dataset configuration for existing dataflows
Issue Description: When editing dataflows for feature service layers, the dataset configuration does not get updated properly. While the dataflow updating process works, the dataset name fails to update.
Explanation: This issue occurs because the UI passes the dataset ID in the payload, but the dataset ID itself is not being updated. Since the ID is present, the backend does not recognize this as either an updated dataset or a new dataset. As a result, the datasource payload gets updated and the dataset name is updated in the datasource, but not in the actual dataset.
This is a known issue and will be fixed in the upcoming release.
Resolution: To resolve this issue, users need to:
- Delete the feature service item while editing the dataflow in the "Configure Selected Items" step
- Navigate back to the "Select Items" step
- Select the feature service item again
- Reconfigure the dataset configuration
This process will create a new dataset and ensure the dataset name is properly updated.
JDBC Bulkload Dataflow
1. Latest Statistics are not available.
Issue Description: Latest Statistics are not available for JDBC Bulkload serverless full-load dataflows.
Explanation: For serverless full-load dataflows, the Latest Statistics section may stay empty even when the dataflow has finished successfully. This occurs when the full load completes very quickly. In those cases, the underlying AWS service shuts down the replication before the statistics can be captured, so the numbers are no longer available to display. The dataflow itself completes correctly; only the statistics display is affected.
2. "56 years ago" displayed for FullLoadStartTime when tables never start loading
Issue Description: When JDBC Bulkload tables remain in the "Before load" state and never start loading, the UI displays "56 years ago" for FullLoadStartTime.
Explanation: This is expected AWS DMS behavior. AWS DMS returns epoch 0 (1970-01-01) for tables whose full load did not commence, and the UI calculates the time difference from 1970 to the current date.
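A minimal sketch of why this happens: epoch 0 is 1970-01-01, so a relative-time formatter computing the distance from "now" lands at roughly the current year minus 1970.

```python
from datetime import datetime, timezone

# AWS DMS returns epoch 0 for tables whose full load never commenced.
epoch_zero = datetime.fromtimestamp(0, tz=timezone.utc)

# A relative-time formatter then measures from 1970-01-01 to today,
# producing a "<current year - 1970> years ago" label in the UI.
years_ago = datetime.now(timezone.utc).year - epoch_zero.year
```

The label is therefore cosmetic: it reflects the epoch-0 placeholder, not an actual load start time.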
Generic Issues
- When a datasource flow is created with Target Location as S3 and includes a transformation rule to remove a column, data distortion occurs in the CSV files stored in S3.
- This issue is due to a bug on AWS's end and is not related to Amorphic.
- AWS has not provided an ETA for fixing this issue. Once AWS resolves it, the fix will automatically reflect in Amorphic.