Connect Portal
Symptom | Area Affected | Problem | Solution |
---|---|---|---|
Unexpected exception occurred in the expression handler | Specifying filter expressions and clicking the Validate Expression button in the Filters tab of the Add/Edit Data Flow dialog in the Syncsort Connect Portal. | Syncsort Connect Portal problem | Specify filter expressions without clicking the Validate Expression button. Filter expressions work as expected at run-time. |
Copy project hangs if the table is empty or committed data is incorrect | After starting a Project for the first time, a copy DataFlow can be shown as Ok instead of Stopped. | Precisely Connect Portal problem | Restart the project. |
The monitor replication page data flow grid displays the error "No data flows are configured for this project" or "Unable to load model controller for project <project name>" after deploying a project, after starting or stopping a project or data flow, or when refreshing the monitor page. | Monitoring replication projects in the Syncsort Connect Portal. | This is a Syncsort Connect Portal problem. | Refresh the page or move between projects until the interface displays data flows. |
Unable to insert row for version '1.6' in Schema History table "PUBLIC"."schema_version" | Connect Portal installation when customers jump from 9.9.6 to a version later than 9.9.10 without installing 9.9.10. | This is a Syncsort Connect Portal problem. | Install 9.9.10 before installing any later version to ensure a seamless migration. |
Unable to select tables from multiple schemas. Selecting a table from a different schema clears previously selected tables. | Existing DMX customers who set up an agent with multiple schemas. This affects replication from DB2/Z using the Connect Portal. | This is a Syncsort Connect Portal problem. | Ensure that each publisher/engine contains tables from only one schema, or define the data flow using the Connect Portal. |
Users can see a misleading error message "Could not connect to server 'xxxx' at 'xxxx'. Ensure the server is available and the DMX DataFunnel Runtime Service is running." | Adding and editing servers in the Syncsort Connect Portal. | This is a Syncsort Connect Portal problem. | Ensure the server is running; the portion of the message about the DMX DataFunnel Runtime Service can be ignored. |
Saving a project with Run on: 'Cluster' option has no effect and reverts to local mode. | Saving/Running a Copy project in the Syncsort Connect Portal. | This is a Syncsort Connect Portal problem. | Use local run mode. |
Selecting a new source schema on an existing dataflow clears existing saved table selections in Syncsort Connect Portal. | Changing the selected schema on the source tab for an existing dataflow and then selecting the existing saved schema loads the tables for that schema but deselects previously selected tables. | This is a Syncsort Connect Portal problem. | Click Cancel on the Data flow dialog and then click Edit Data Source to reopen the saved source data connection settings. |
Database issues
Symptom | Area Affected | Problem | Solution |
---|---|---|---|
(DBIOCTIV) data type of column in database table is not supported | Connect ETL/Connect for Big Data tasks that access bytea or text data type in Greenplum through DataDirect ODBC on Linux. | DataDirect ODBC Greenplum Driver issue | If possible, use data type varchar instead of bytea and text for the specified column. Alternatively, you can use SQL text to cast text as varchar. |
(TBLORCOL) unable to output records to database table. | DMExpress tasks loading data to a Greenplum target through ODBC on Linux, where the target is a user-defined SQL statement of the form "update <table> set ... where ..." and a varchar column is present in the where clause of the update SQL statement. | Greenplum problem | If possible, use the char data type instead of varchar for that column. |
Oracle error when a value is larger than the specified precision allowed in a column. | DMExpress tasks with an Oracle target, which load into NUMBER columns with scale greater than 0 using Oracle 11 client on Windows. | This is an Oracle problem. | Upgrade your Oracle client to version 12 or higher. |
Running a DMX-h application with HCat source or target in cluster deploy mode in a Kerberos-enabled cluster causes the application to abort | Running Connect for Big Data jobs with an HCatalog source or target on Cloudera Spark 2.1 in cluster deploy mode on a Kerberos-enabled CDH Spark cluster. | Cloudera Spark 2.1 problem | Run the application in client deploy mode or upgrade to Cloudera Spark 2.2 or higher. For more information, see CDS Powered by Apache Spark Overview. |
Analysis of Hive target column statistics fails with a Hive syntax error | When running Hive versions prior to 1.2, DMExpress tasks analyzing a Hive target specified with a fully-qualified table name. | This is a Hive bug | Contact your Hadoop distribution vendor for a resolution or omit the database prefix from the table name when the target table is in the default database. |
Analysis of Hive target column statistics fails with a Hive error | When running Hive versions prior to 1.2, DMExpress tasks analyzing a Hive target table when a Text or Date/time constant is mapped to a target partition column. | This is a Hive bug | Contact your Hadoop distribution vendor for a resolution or disable analysis of column statistics for the table. |
Error issued when DMExpress is unable to insert data from staging table <TABLE_NAME> for database "<CONNECTION_URL>" target table | Running a DMX-h job on Spark in a MapR cluster when the job contains an Amazon S3-backed external Hive target table connected through a JDBC connection. | This is a MapR problem. | Set the Hive configuration property hive.optimize.insert.dest.volume to false (see the example after this table). |
Unable to output records to database table "jdoe.df_chars". | A Connect ETL task reading from or writing to a Postgres CHAR or VARCHAR column with a length greater than 255, when this column is mapped to a CHAR or VARCHAR column in another database. | This is a PostgreSQL ODBC driver problem. | In the PostgreSQL section of the odbc.ini file, or in the equivalent Windows ODBC DSN entry, set MaxVarcharSize to a value greater than the length of the longest CHAR or VARCHAR column, for example MaxVarcharSize=65530 (a sample odbc.ini entry appears after this table). |
Length limitations when loading Unicode values to Teradata character columns. | Tasks that write to Teradata CHAR or VARCHAR columns longer than 10666 when any of the target data is Unicode encoded. | This is a Teradata limitation. | If all the Unicode data is Locale compatible, use the Encode() function to convert it to Locale. |
FastLoad cannot load LONG VARCHAR columns in a UTF8 session, when the associated FastLoad data definition is LONG VARCHAR. | Tasks that write to Teradata LONG VARCHAR columns when any of the target data is Unicode encoded. | This is a Teradata problem. | If all the Unicode data is Locale compatible, use the Encode() function to convert it to Locale. |
Task with user defined SQL target crashes on 7.1.5 Datadirect driver | Tasks defined with a Greenplum target that include user-defined SQL in which the query has a mix of literals and parameters. | This is a Greenplum problem. | Set the parameter EnableDescribeParam=0 for the data source in the odbc.ini file. |
Amazon S3 sources and targets cannot be accessed in MapReduce when the AWS secret key contains a slash | Accessing Amazon S3 buckets in MapReduce jobs when the AWS secret key entered either on the Amazon S3 Connection dialog or in the DTL or SDK command option /SERVERCONNECTION contains a slash (/). | This is a Hadoop problem. | If possible, regenerate the AWS secret key until it contains no slash (/) characters. |
DMX task stops and records are rejected | Loading data that includes at least one linefeed, carriage return, or "~~" (0x7E0x7E) sequence as part of the data into Azure Synapse Analytics (Azure SQL Data Warehouse) targets. | This is an Azure Synapse Analytics (PolyBase) problem. | Preprocess the data to replace these characters before using DMX to load into Azure Synapse Analytics targets. |
DMExpress task completed with zero records read | DMExpress tasks extracting data from very large Greenplum tables through ODBC on Windows. | This is a Greenplum ODBC driver issue. | Enable the "Use Declare/Fetch" option in the Greenplum ODBC DSN settings. |
Task continues to run when Teradata utility error limit setting is exceeded. | DMExpress tasks with Teradata targets customized to use the TPT Stream operator. | This is a Teradata problem. | Export the environment variable DMXTeradataTPTStreamPackFactor=1. This workaround causes the TPT Stream operator to use a pack factor of 1, which resolves the problem at the expense of degraded performance. |
Unexpected behavior when defining or running tasks with Oracle database sources in DMExpress. | Tasks which connect to an Oracle database that has cursor sharing set to SIMILAR. | This is a documented Oracle problem. Please see Oracle bug #5553553 | Set cursor sharing to EXACT on the Oracle server DMExpress is connecting to. |
TPT Stream operator may write valid records to the error table when neighboring records contain errors. | DMExpress tasks with Teradata targets customized to use the TPT Stream operator. | This is a Teradata problem. | Export the environment variable: DMXTeradataTPTStreamPackFactor=1. |
TPT Stream operator encounters an error when writing data to LONG VARCHAR columns, unless the LONG VARCHAR column is the only column in the table | DMExpress tasks with Teradata targets customized to use the TPT Stream operator running against Teradata server version 13.00.00.00. | This is a Teradata problem. | Upgrade your Teradata server to patch level 13.00.00.02 or higher |
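For the MapR staging-table row above, one way to set the property is per session before running the affected insert; it can also be placed in hive-site.xml. A minimal sketch, assuming Beeline or the Hive CLI:

```sql
-- Disable the MapR destination-volume insert optimization for this
-- session; the property name comes from the workaround above.
SET hive.optimize.insert.dest.volume=false;
```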
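For the PostgreSQL ODBC row above, a minimal odbc.ini sketch; the DSN name, driver path, host, and database are illustrative placeholders, and only the MaxVarcharSize line comes from the workaround itself:

```ini
[PostgreSQL]
Driver=/usr/lib64/psqlodbcw.so
Servername=dbhost
Database=mydb
Port=5432
# Must be greater than the length of the longest CHAR or VARCHAR column
MaxVarcharSize=65530
```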
DMExpress or Connect ETL
Symptom | Area Affected | Problem | Solution |
---|---|---|---|
The XML document does not contain a layout | DMExpress jobs containing tasks that use an XML schema with an xsd:import statement specified as external metadata, with lineage generation turned on. | This is a DMExpress problem. | Import the schema specified by xsd:import into the parent schema file and rerun the DMExpress job. |
Return code of DMExpress is 0 (success) instead of 100 (exception), despite warning messages regarding target statistics. | DMExpress tasks running in local mode that load to Hive tables via JDBC and request analysis of table/column statistics, when the Hive analyze queries fail and warning messages are issued. | This is a DMExpress problem. | Identify the problem via the GCMSG warning message that is printed to the task log. |
Double-quoted column names via DMX DTL or SDK /DBINPUT JDBC/DB2ZOS source do not preserve lowercase characters. | DB2ZOS source tables with lowercase characters in the table name accessed by DMExpress through JDBC. | This is a DMExpress problem. | When accessing DB2ZOS via JDBC, avoid table names with lowercase characters, or use ODBC instead. |
32-bit DMX can use only up to 2G of memory. If lookup data is large enough that DMX requires more than 2G to process it, then DMX aborts with INMEM (6.3.1 to 6.9.11) or INMEM4LU (6.9.12 to current) message. | Using 32-bit DMExpress to perform lookup from large look-in data which requires more than 2G of memory to store | This is a DMExpress problem. | Where feasible, change your task to use Join instead of lookup, or upgrade to 64-bit DMExpress. |
DMExpress issues an error message if Commit Interval or Abort If Error is enabled in bulk load mode for SQL Server on AIX 64-bit | Loading to SQL Server tables on AIX 64-bit in bulk load mode with Commit Interval or Abort If Error enabled, when the source files contain invalid records. | This is a DMExpress problem. | Do not select Table Lock in the BulkLoadOptions attribute for the data source in the odbc.ini file. |
DMExpress: (SUMRECTOOLONG) a summarized record is greater than the maximum record length allowed | Aggregate tasks where record format for some sources is defined via delimited record layout | This is a DMExpress problem. | Increase longest expected record length in the Performance Tuning dialog. |
An error occurred during query execution: server closed the connection unexpectedly | DMExpress tasks extracting data from a Vertica LOB column and loading into an Oracle varchar column, when the ToText() function is used to convert the LOB data into text. | This is a DMExpress problem. | Unload from the Vertica table to a flat file in the first task. Then, load from the flat file to the Oracle table in the second task |
DMExpress crashes when importing a DTL task from the command line or the Task/Job Editor. | Tasks that contain both summarization and sources with embedded layouts (header layout, XML). | This is a DMExpress problem | None. |
DMExpress does not properly convert the format of floating point fields from the IBM format (Excess64) to IEEE. | DMExpress tasks with a mainframe source file containing one or more IBM floating point number fields, and a non-mainframe target file including at least one of these fields in either a delimited target record layout with a bit field, or a fixed position target record layout that meets certain additional conditions. | This is a DMExpress problem. | Add a subsequent task to your DMX job that writes the target with the desired condition(s) separately from the task that does the data conversion. |
Unable to open the file <filename> for reading( ). A file or directory in the path name does not exist. | Executing Jobs with DMExpress server 3.3.8 and above that were created using Job Editor versions 2.6.2 through 3.3.7. | This is a DMExpress problem. | Resave jobs with the latest release before executing them. |
The status of the scheduled job is not updated automatically in the DMExpress Server Dialog when there is a change of status while the Server Dialog is open | Scheduled DMX jobs | This is a DMExpress problem | Click Refresh in the server dialog to update the status of the scheduled jobs |
Hive source and target column permission issue with Sentry | DMExpress tasks accessing a Sentry-enabled Hive source/target where the user has column-level select privilege but does not have table-level privilege. | This is a DMExpress problem. | Currently, there is no solution available. Table-level select privilege must be granted to the user. |
User-defined C/C++ function called instead of internal DMExpress function. | Invoking DMExpress from a C/C++ program on HP, where the code contains a function with the same name as a DMExpress function. | This is a DMExpress problem. | Ensure that the function names defined in the C/C++ program do not collide with DMExpress function names. |
Segmentation Fault | Tasks with a COBOL-IT Line Sequential source and standard output as the target on SunOS SPARC 64-bit. | This is a DMExpress problem. | Standard output target is not supported with a COBOL-IT Line Sequential source. Use a file target. |
Unable to write timestamp/date column in HDP 3.1 | Running a DMX-h job on the cluster when the job contains a task writing to a non-transactional and unpartitioned Hive 3.0 or above timestamp or date column. | This is a DMExpress problem. | Set the environment variable DMX_HIVE_TARGET_FORCE_STAGING=1 |
DMExpress GUI crashes during installation if "Netezza" or "Vertica" is selected on the "Verify Database Connections" dialog | Installing and uninstalling Connect ETL. | This is a Connect ETL problem. | If an install crashes, contact Precisely Technical Support. |
The lookin source has more records than conditional lookup can process. | Tasks with a conditional lookup where the look-in source file has more than 134 million records. | This is a Connect ETL/Connect for Big Data problem. | Divide the look-in source file into multiple smaller files having fewer than 134 million records and create a conditional value that does the lookup from the multiple smaller files. |
DMExpress converts the table/column to uppercase and encloses them with quotes, if they are not quoted previously. | Connect ETL/Connect for Big Data Software Development Kit (SDK) tasks that access Greenplum source through ODBC. | This is a Connect ETL/Connect for Big Data problem | Enclosing the database table and/or column name in double-quotes (") may resolve the problem. |
Database connection to ODBC data source could not be established. | Running tasks that connect to Vertica databases after upgrading to DMExpress 7.14 or later. | The SQLGetPrivateProfileString function needed by the Vertica driver has been moved to a different library starting in DMExpress 7.14. | In the vertica.ini file, modify the ODBCInstLib setting to point to <dmx_install_dir>/lib/libodbcinstSS.so instead of libodbcSS.so (a sample vertica.ini fragment appears after this table). |
DMExpress converts the table/column to uppercase and encloses them with quotes | Connect ETL/Connect for Big Data Software Development Kit (SDK) tasks that access Greenplum target through ODBC. | This is a Connect ETL/Connect for Big Data problem. | Enclose the database table and/or column name in double-quotes ("). |
Unable to create "<target file>" | The Typical Applications examples shipped with DMExpress that create target files. | Access is denied when DMExpress tries to create target files under "C:/Program Files/DMExpress/Examples/TypicalApplications". | Reduce the "User Account Control" settings to "Never notify", or copy the 'Examples' folder to a user-writable location. |
DMExpress hangs | Tasks with a Vertica target table under certain conditions. | This is a problem with the Connect ETL/Connect for Big Data/Vertica interface. | |
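For the Vertica connection row above, a sketch of the vertica.ini change; the installation path shown is a placeholder for your actual <dmx_install_dir>:

```ini
[Driver]
# DMExpress 7.14 and later provide SQLGetPrivateProfileString in
# libodbcinstSS.so rather than libodbcSS.so.
ODBCInstLib=/opt/dmexpress/lib/libodbcinstSS.so
```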
DMX-h
Symptom | Area Affected | Problem | Solution |
---|---|---|---|
(DFNF) data file <filename> does not exist DMExpress has aborted | DMX-h jobs written in DTL where a task's single output is connected to more than one other task with direct dataflow specified. | This is a DMX-h ETL problem. | Use only one direct dataflow per file. |
DMExpress : (GCAPIERR) data connector "JDBC" returned failed status for Data Connector API "dmx_connector_openObjectUsingDriver" | Running a DMX-h Intelligent Execution job on a single cluster node when the job has a subjob containing either a Hive or an Impala table connected through a JDBC connection. | This is a DMX-h problem. | Run the job from the edge node, or run the subjob on the cluster via job orchestration. |
DMExpress : (HDFSNS) HDFS server connections are only supported on Intel-based 64-bit Linux systems | Sampling an XML file with a Hadoop Distributed File System (HDFS) connection. | This is a DMX-h ETL problem. | Move the file to a local or a non-HDFS source. |
DMExpress : (HDFSNS) HDFS server connections are only supported on Intel-based 64-bit Linux systems | Clicking “Map layout” in the Header Properties dialog of a file with a Hadoop Distributed File System (HDFS) connection. | This is a DMX-h ETL problem. | Move the file to a local or a non-HDFS source. |
Other issues
Symptom | Area Affected | Problem | Solution |
---|---|---|---|
Internal error has occurred (1 in SSTRPHDL) or (2226 in sscetij.c) | Tasks that access MySQL database tables using MySQL ODBC driver version lower than 5.1. | This is a MySQL driver 3.51 problem. | Upgrade MySQL ODBC driver to 5.1 |
Internal error when non-ASCII characters are used | Tasks that use non-ASCII characters in the advanced comparison functions IfContainsAny and IfEqualsAny, with the environment variable LC_ALL=C, fail with a core dump. | | Set the environment variable LC_ALL=en_US.UTF8 or LC_ALL=en_US.UTF-8, depending on your operating system. |
(TDTERROR) MTDP: EM_DBC_CRASH_A(220): Network connection to the DBC was lost | Connect ETL/Connect for Big Data tasks running on UNIX platforms having both Teradata and Oracle target tables where the Teradata client version is 14. | Affects all Teradata utilities except FastLoad. | Create separate tasks for the Oracle and Teradata targets. |
A core dump or segmentation fault executing a user-written program linked to the Connect library. | User-written programs linked to the DMExpress library, on HP-UX platforms, that call tmpnam() or ctermid() with NULL as a function argument. | In a multi-threaded environment, the behavior of tmpnam() and ctermid() with a NULL argument is not defined in the POSIX standard. | To ensure the correct behavior of tmpnam() and ctermid(), supply a pointer to a buffer as the argument to each function (see the sketch after this table). |
Amazon S3 source file paths in Connect for Big Data statistics shown inconsistently. | Connect for Big Data statistics in mapper logs of MapReduce Join jobs. | Connect for Big Data problem | A fix will be available in an upcoming Connect for Big Data release. |
(INWAR) (1 in SSALLOCATEFILTERMEM); planned memory not available. | DMExpress tasks that access Amazon Redshift source or target via ODBC | This is an Amazon Redshift ODBC driver issue. | Set the "Single Row Mode" option in the Amazon Redshift ODBC DSN settings. |
System function iconv_open failed during set up for function Unicode Converter. | DMExpress applications which convert locale-encoded data and are submitted to a 64-bit AIX server through the GUI. | This is an AIX problem. | Install IBM AIX APAR IY83580, which can be downloaded from IBM. |
DMExpress tasks with MS Access targets may exceed the allowable size of the MDB file (2 GB for Access 2003). In such a situation, DMX will load data until the limit is reached. | DMExpress tasks with a Microsoft Access target. | Loads into a Microsoft Access database create a significant amount of excess data, and Access has a file size limitation on the MDB file, typically 1 - 2 gigabytes. | Partition the data into smaller sizes, load each partition separately and run Microsoft Access's Compact and Repair Database utility in between loading each partition. The "Compact and Repair Database" utility can be accessed either from the Access Tools menu or from the command line. |
DMX CDC Director help fails to launch | DMX CDC Director when you try to open the help. | This is a DMX CDC Director problem. | Open the help from the installation folder (${DMEXPRESS_INSTALLATION}/DMXCDC/Windows/director/omnidir.chm). |
Record is truncated when an empty LOB is inserted into DB2 using DataDirect drivers | DMExpress tasks loading data into a DB2 database using the provided DataDirect driver, where the target table contains at least one column of type CLOB, BLOB, DBCLOB, LONG VARCHAR, or LONG VARGRAPHIC in any position but the last. | This is a Progress DataDirect problem. | Refer to "Connect and Connect64 for ODBC hot fix download and install instructions". |
DMExpress stops with an internal error in SSTRPHDL. | Running multiple DMExpress tasks at the same time on an HP Itanium machine. | This is an HP-UX software problem. | Upgrade to DMExpress 4.1.2 or later and install patches PHSS_35979 and PHSS_34853, or their superseding patches. |
DMExpress stops with an error when the specified match condition compares <field name> field and <column name> column but the target table mapping is between <field name> field and <column name> column | DMExpress tasks that update target table columns using a condition as match criteria, when a field or value used in the match condition is mapped to a different column in the Target Database Table dialog. | | Provide the mappings for all the field-column pairs in the match condition. Note that the additional mappings will affect the output to the target table when 'Update and insert' disposition is selected and the match criteria is not satisfied. |
Memory dump while reading from an Informix table that has NULL values in a decimal or date column | Tasks that read decimal or date columns containing NULL values from an Informix source table on 64-bit Linux. | This is an ODBC configuration issue. | Add DMXSQLLEN=4 to the Informix section in odbcinst.ini (see the sample entry after this table). |
DMExpress does not obey resource limits in place for the user submitting a task or job. | Jobs or Tasks which are submitted to a DMExpress Server through the GUI. | Per-process resource limits are inherited by jobs or tasks when they are submitted to the DMExpress Server through the GUI. | Set resource limits in the user's profile or in the global profile. See the online help topic "System resource utilization" for more information. |
DMExpress stops | Tasks where a file is provided as the search set to the IfContainsAny, IfEqualsAny, or FindContainedValue functions, when processing of the search set file causes a warning message. | | Avoid the warning by addressing the cause of the message. |
DMExpress hangs waiting for user input on Windows platforms under Cygwin | SDK task with incomplete options or option arguments. | This is a Cygwin issue due to MinTTY being based on Cygwin pty. | Fix the SDK syntax error and rerun the task |
DMExpress hangs while running a distributed job | Servers running HP-UX 11 | This is an HP-UX software problem. | Install patches: PHKL_22840 and PHNE_22397 |
DMExpress is unable to connect to an FTP server | A DMExpress task or job that contains a large number of remote files accessed using FTP. | This is a system problem. | Check whether the association of the FTP connection with files within a DMExpress task can be avoided. An FTP connection is needed only to access files that are not directly accessible from the system where the job is run. |
Running setup.exe in silent mode prompts the user for elevated permissions. | Systems running Windows Vista and Windows 7. | | Refer to the "Silent Installation" section of the Installation Guide for instructions on how to avoid the prompts. |
Script or program which invokes a DMExpress job is killed when an error occurs in the DMExpress job. | DMExpress jobs customized using Korn shell and run from another shell script or external program. | This is caused by a Korn shell limitation. | The parent script should start dmxjob in a separate process group; a Korn shell sketch appears after this table. |
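For the HP-UX tmpnam()/ctermid() row, a minimal C sketch of the safe calling convention: supply caller-owned buffers instead of NULL so the calls are well-defined in multi-threaded programs.

```c
#include <stdio.h>  /* tmpnam, ctermid, L_tmpnam, L_ctermid */

int main(void)
{
    /* Buffers owned by the caller; passing NULL instead leaves the
       behavior undefined for multi-threaded programs under POSIX. */
    char tmpbuf[L_tmpnam];
    char termbuf[L_ctermid];

    if (tmpnam(tmpbuf) != NULL)
        printf("temporary name: %s\n", tmpbuf);

    if (ctermid(termbuf)[0] != '\0')
        printf("controlling terminal: %s\n", termbuf);

    return 0;
}
```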
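For the Informix row, a sketch of the odbcinst.ini change; the driver path is a placeholder for your existing Informix entry, and only the DMXSQLLEN line comes from the workaround:

```ini
[Informix]
# Keep your existing driver entry; only the DMXSQLLEN line is new.
Driver=/opt/informix/lib/cli/iclit09b.so
DMXSQLLEN=4
```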
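The Korn shell fragment for the last row did not survive in this document. The following is a hedged sketch of one way to start dmxjob in a separate process group, relying on the fact that with job monitor mode enabled (set -m) ksh runs background jobs in their own process group; the dmxjob path and job file name are placeholders.

```ksh
#!/bin/ksh
# With monitor mode on, the backgrounded dmxjob runs in its own
# process group, so an error in the DMExpress job cannot kill this
# parent script through a process-group signal.
set -m

/usr/dmexpress/bin/dmxjob /run /jobs/nightly_load.dxj &
wait $!
rc=$?
print "dmxjob exited with status $rc"
exit $rc
```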