Studio Best Practices - Automate_Studio - 20.3

Automate Studio with Connect User Guide

Product type
Software
Portfolio
Integrate
Product family
Automate
Product
Automate > Automate Studio
Version
20.3
Language
English
Product name
Automate Studio
Title
Automate Studio with Connect User Guide
First publish date
2018
Last updated
2024-05-06
Published on
2024-05-06T08:28:10.436135

Use these best practices for Transaction, Query, and Direct to streamline your script and query production processes and get solutions up and running more quickly.

Transaction

  • ALWAYS walk through the t-code in SAP before recording. This will ensure you know the process to be recorded and have data for the recording.
  • Practice, record, and test your scripts and queries in a non-production SAP system to ensure that no erroneous data is added to a production system.
  • Always provide a unique file name for every script and Excel template file on the server. If the file is related to a specific transaction code, consider including the transaction code and a brief description in the name. When multiple files are based on the same object (for example, the same transaction code or BAPI), give each version a unique description.

  • Consider adding a _GUI suffix to the name of a script recorded in GUI Scripting mode. Later, you will be able to identify those scripts quickly.

  • Use GUI scripting only if all other recording options fail. GUI scripting works as a screen reader, so it is the most sensitive to any variation in the transaction screens. It is not supported in workflows and is also the most difficult recording mode to maintain.

  • For transactions with selectable views, always clear the default views by clicking the Deselect all button in the recording.
  • Do not use a loop to update items. If one item update fails, the entire transaction fails.
  • Do not use validation on an HR transaction because data is committed to the back end after each screen.
    Exception: PA30 and PA40 are supported if the Automate Function Module is installed.
  • When creating data update scripts, consider using the Backup Data feature (on the Studio Run tab) to automatically write existing data to a separate tab before changes are made.

  • When your Excel data template has multiple tabs, consider locking your script to a specific tab (on the Studio Run tab).

  • In ME21N, disable all BDC Cursor positions up to the point where the filter is selected. Otherwise, Transaction will have trouble locating those fields when it navigates the screens a second time.
  • When modifying existing complex transaction scripts with missing screens, consider recording the transaction with required basic information and the missing screen. Then copy the screen to the existing transaction script.

  • Make the expert view work for you by understanding how to manually add/update screen and field names and descriptions, creating conditions and loops, learning where OK codes come from, and deleting unnecessary “Indicates cursor position on the screen” rows.

  • Check that Save (OK code) sits on its own screen at the very end of the script, after any mapped fields. This ensures that validation works.
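The warning above about looping over item updates (one failed item fails the whole transaction) can be sketched in a few lines: submit each data row as its own transaction and collect failures, so one bad row is reported rather than aborting the run. This is an illustration only; `post_row` is a hypothetical stand-in for the actual Studio/SAP posting step.

```python
def post_row(row):
    """Hypothetical poster: fails when the material number is missing."""
    if not row.get("material"):
        raise ValueError("material number is required")
    return f"posted {row['material']}"

def run_rows(rows):
    """Post each row as its own transaction and collect failures."""
    results, failures = [], []
    for i, row in enumerate(rows, start=1):
        try:
            results.append(post_row(row))
        except Exception as exc:  # one bad row does not stop the batch
            failures.append((i, str(exc)))
    return results, failures

rows = [{"material": "MAT-001"}, {"material": ""}, {"material": "MAT-003"}]
results, failures = run_rows(rows)
# rows 1 and 3 post; row 2 is logged as a failure instead of failing the run
```

Contrast this with a single looped transaction, where the failure on row 2 would roll back rows 1 and 3 as well.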

Query

  • Run multiple-table Query scripts as an SAP background process to avoid affecting the SAP server performance.
  • Always use Selection criteria in a query so that you extract just what you need.
  • Joins between Primary Key fields always have the best performance. Joins between Index fields have the next-best performance.
  • Creating a join between non-indexed fields will degrade the performance of data extraction.
  • Inner join queries are always recommended, because a left outer join will degrade the performance of data extraction.
  • It is always better to do a preview run before you publish a query to be run in the production environment. The preview run will confirm that the required data exists and that the query is correctly formed.
  • Use Adaptive Query Throttling (AQT) mode for execution, so that the Query run will not affect the SAP server performance.
  • Use Direct Execution mode (non-AQT) to run queries that will extract fewer than a thousand items.
  • Applying criteria on non-Indexed fields will degrade the performance of data extraction.
  • Use Data Chunking in Query when running large data downloads to avoid network bottlenecks.
  • Creating Queries using View and Transparent tables gives better performance than Cluster and Pool tables, which are normally heavier in size.
  • Remove any unnecessary tables, fields, and joins in a Query script to get better performance in production.
  • Use the Text result destination (data type) to write data faster than the Excel, Access, XML, SQL Server, and Reference Data data types.
    Note: The account that you use to write to SQL Server must have db_owner or db_ddladmin membership on the database that contains the target table.

  • Always check the Number of Entries on the Workspace tab to validate the number of records that Query will extract.
  • Use selection criteria on the join fields when you have very large queries. The join fields are usually the Primary key fields.
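The Data Chunking recommendation above boils down to paging: fetch a large result set in fixed-size chunks rather than in one network round trip (the same offset/limit idea that SAP table reads expose as ROWSKIPS/ROWCOUNT). A minimal sketch, assuming a `fetch_page` stand-in for the query source; in Studio the chunk size is a setting, not code.

```python
# Stand-in for a large SAP table; in practice this is the remote result set.
TABLE = [f"record-{n}" for n in range(10)]

def fetch_page(offset, limit):
    """Hypothetical source that honors an offset/limit, one round trip per call."""
    return TABLE[offset:offset + limit]

def fetch_chunked(chunk_size):
    """Pull the full result set in pages of at most chunk_size rows."""
    offset = 0
    while True:
        page = fetch_page(offset, chunk_size)
        if not page:  # an empty page means the table is exhausted
            break
        yield from page
        offset += chunk_size

rows = list(fetch_chunked(chunk_size=3))  # four small round trips of <= 3 rows
```

Smaller chunks mean more round trips but a lower peak load on the network and the SAP server, which is why chunking helps with very large downloads.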

Direct

  • Always understand BAPI/RFM requirements before attempting to create a Direct script by doing one or more of the following:
      • Review the documentation.
      • Research online to find use-case examples.
      • Understand the purpose of common features and know what to look for (INPUT STRUCTURES, OUTPUT STRUCTURES, TABLES, ImportOthers, ExportOthers, “X” tables, Commit Required, Enqueue Required, etc.).
      • Whenever possible, test data in SAP using transaction SE37.
  • Test the data template with at least two or three rows or documents in a non-production environment.
  • Verify the BAPI/RFM is remote enabled.
  • Before mapping, identify required fields and include them in the Direct script selected fields.
  • Select the Commit Required check box on the Workspace tab to commit (save) in SAP; otherwise the BAPI runs in simulation mode.
  • Select the Enqueue Required check box on the Workspace tab to lock the record in SAP during the upload.
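The Commit Required behavior above reflects how BAPIs work: changes accumulate in an SAP logical unit of work and are discarded unless BAPI_TRANSACTION_COMMIT follows the call. A minimal sketch of that commit/rollback semantic, using `FakeConnection` as a local stand-in for a real RFC connection (for example, one opened with pyrfc); it models only the behavior described above, not a real SAP call.

```python
class FakeConnection:
    """Stand-in RFC connection that models LUW commit/rollback semantics."""

    def __init__(self):
        self.pending = []    # changes waiting in the logical unit of work
        self.committed = []  # changes actually saved to the database

    def call(self, name, **params):
        if name == "BAPI_TRANSACTION_COMMIT":
            self.committed.extend(self.pending)  # Commit Required checked
            self.pending.clear()
        elif name == "BAPI_TRANSACTION_ROLLBACK":
            self.pending.clear()                 # simulation mode: discard
        else:
            self.pending.append((name, params))  # change waits in the LUW

conn = FakeConnection()
conn.call("BAPI_MATERIAL_SAVEDATA", HEADDATA={"MATERIAL": "MAT-001"})
conn.call("BAPI_TRANSACTION_COMMIT")  # without this call, nothing is saved
```

Leaving Commit Required cleared is the same as skipping the commit call: the BAPI executes and returns messages, but its changes never reach the database, which is useful for a dry run.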