Known issues and limitations - Data360 Analyze - Latest

Data360 Analyze Release Notes

Product type: Software
Portfolio: Verify
Product family: Data360
Product: Data360 Analyze
Version: Latest
Language: English
Product name: Data360 Analyze
Title: Data360 Analyze Release Notes
Copyright: 2024
First publish date: 2016
Last updated: 2024-12-12
Published on: 2024-12-12

We would like to make you aware of the following known issues and limitations.

If you encounter any other technical issues, contact support at support.precisely.com.

Third parties

The following third-party known issues and limitations are listed by feature:

Avro

The Avro 1.7.7 specification places the following restrictions on the names of fields:

  • Field names must start with [A-Za-z_]
  • Field names must contain only [A-Za-z0-9_]

Avro 1.7.7 does not support date, time, or datetime data types. As a result, if you want to upload data and use the Data360 Analyze nodes, these fields must be converted to string data types, as in the sketch below.
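For illustration, the following is a minimal sketch, assuming the fastavro Python package is available; the record name and field names are hypothetical. It declares field names that satisfy the constraints above and stores a datetime value as a string:

    # A minimal sketch, assuming the fastavro package is installed.
    from fastavro import parse_schema

    schema = {
        "type": "record",
        "name": "Transaction",  # hypothetical record name
        "fields": [
            {"name": "transaction_id", "type": "string"},  # starts with [A-Za-z_]
            {"name": "amount_usd", "type": "double"},      # contains only [A-Za-z0-9_]
            {"name": "created_at", "type": "string"},      # datetime stored as a string
        ],
    }

    # parse_schema raises SchemaParseException if a field name is invalid.
    parsed = parse_schema(schema)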

Hadoop Hive Cluster

When downloading files from the Hadoop Hive Cluster, the WebHDFS API automatically encodes files to base64 format. As a result, the downloaded content is not always viewable in the output fields.

For example, if the DataOutputMode property is set to Field, due to the automatic base64 encoding, the encoded result will be visible instead of the contents.

To view the contents, set the DataOutputFieldEncoding property to None. However, this is not always possible due to invalid characters in the original file; in this case, the workaround is to set the DataOutputMode to File and then import the data using one of the input connector nodes.
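For illustration, the following minimal sketch, using only the Python standard library, decodes a hypothetical base64-encoded field value. It recovers readable text only when the original file contained valid text, which is why the File workaround above exists for binary content:

    import base64

    encoded_field = "SGVsbG8sIEhhZG9vcCE="  # hypothetical value from the output field
    raw = base64.b64decode(encoded_field)

    try:
        print(raw.decode("utf-8"))          # readable when the source was text
    except UnicodeDecodeError:
        print(f"{len(raw)} bytes of binary content")  # fall back to the File workaround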

Web application

The following Data360 Analyze known issues and limitations are listed by feature:

Linux multi-user install

Uploading larger files through the user interface may fail due to permission errors on the workDir used by Tomcat. To allow larger file uploads to work:

  1. Stop Data360 Analyze.
  2. Create a directory on the server file system that the web application should be able to use for processing temporary files.

    In the next step, this directory is referred to as [webappTempDir]. The Data360 Analyze web application group is referred to as [webGroup].

  3. Run these commands (a verification sketch follows this procedure):

    sudo chown :[webGroup] [webappTempDir]

    sudo chmod 2775 [webappTempDir]

  4. Edit tomcat/conf/server.xml within the Data360 Analyze installation directory, changing the line:

    <Host name="localhost" appBase="webapps" unpackWARs="false" autoDeploy="false">

    to

    <Host name="localhost" appBase="webapps" unpackWARs="false" autoDeploy="false" workDir="[webappTempDir]">

  5. Restart Data360 Analyze.
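The following minimal sketch, assuming a Linux host and Python 3, verifies the group ownership and permissions applied in steps 2 and 3; the directory path and group name are placeholders for [webappTempDir] and [webGroup]:

    import grp
    import os
    import stat

    webapp_temp_dir = "/opt/data360/webapp-temp"  # placeholder for [webappTempDir]
    expected_group = "d360web"                    # placeholder for [webGroup]

    st = os.stat(webapp_temp_dir)
    actual_group = grp.getgrgid(st.st_gid).gr_name

    assert actual_group == expected_group, f"group is {actual_group}"
    assert stat.S_IMODE(st.st_mode) == 0o2775, oct(stat.S_IMODE(st.st_mode))
    print("workDir group and permissions match steps 2 and 3")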

Composite library nodes created in previous versions

When importing or running a data flow that was created in an older version of the product, you may see error messages if the data flow contains composite library nodes that have been upgraded since the data flow was first created. If the data flow did not previously show these errors, you can resolve the issues as follows:

  1. Open the data flow and select all nodes.
  2. Choose Apply Auto-Fixes.
  3. Save the data flow, then return to the Directory before reopening the data flow.

Links from tooltips to help

Although it is not currently possible to open the integrated help from the links in node property tooltips, you can reach the help manually by pressing F1 and then searching for the relevant topic.

Logistic Regression node

The Logistic Regression node does not support Unicode for categorical data.
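As one possible workaround (an assumption, not documented product behavior), non-ASCII categorical values could be transliterated to ASCII-safe labels before they reach the node. A minimal sketch using the Python standard library:

    import unicodedata

    def to_ascii_label(value: str) -> str:
        """Strip accents, then drop any remaining non-ASCII characters."""
        normalized = unicodedata.normalize("NFKD", value)
        return normalized.encode("ascii", "ignore").decode("ascii")

    categories = ["Café", "Zürich", "naïve"]        # hypothetical categorical values
    print([to_ascii_label(c) for c in categories])  # ['Cafe', 'Zurich', 'naive']

Note that such a mapping must remain one-to-one; if two distinct values collapse to the same ASCII label, append a disambiguating suffix.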

Legacy C++ based nodes

On a Windows installation of Data360 Analyze, legacy nodes built on C++ technology cannot handle files with long file paths. For example, the deprecated "CSV File" node will error when trying to acquire data from a file located in a deeply nested path on the Windows file system.

In this scenario, move your file up a number of folder levels to ensure that the node can read it successfully; a path-length check is sketched below.
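The exact limit is not documented; assuming the legacy nodes are bound by the classic Windows MAX_PATH limit of 260 characters, a minimal pre-flight check might look like this (the path is hypothetical):

    import os

    MAX_PATH = 260  # assumed classic Windows limit for the legacy C++ nodes

    path = r"C:\projects\analytics\2024\q4\regional\north\west\daily\raw\input.csv"
    full_path = os.path.abspath(path)

    if len(full_path) >= MAX_PATH:
        print("Path may be too long for legacy nodes; move the file nearer the drive root.")
    else:
        print(f"Path length {len(full_path)} is within the assumed {MAX_PATH}-character limit.")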