Sunday, September 24, 2023

SAP PowerConnect

SAP PowerConnect for Splunk is a tool that captures data about events in SAP systems and uploads it to Splunk software in real time. There the data is analyzed and visualized as SAP telemetry intelligence. The SAP telemetry provides analytical information on the efficiency of SAP systems in both local and cloud environments.

The PowerConnect tool was originally developed by BNW Consulting, a company that was acquired by SoftwareOne in 2019. Splunk itself was acquired by Cisco in 2023.

Technically, the tool is developed in the /BNWVS/* namespace as software component BNWVS.

The following are examples of jobs running in SAP systems that are part of SAP PowerConnect.






More information:

PowerConnect documentation

PowerConnect on splunkbase

Sunday, August 27, 2023

Reserved key words for SAP table fields

When a table is created in the SAP Data Dictionary (DDIC), a check is performed to ensure that none of its field names (names of table columns) is reserved. If a field name is reserved, the dictionary object cannot be activated. This applies regardless of whether the table is created manually or generated by the system. Automatically generated dictionary objects can be involved in BW objects like aDSOs, Open Hubs, etc. In other applications, tables can be generated automatically for CDS views.

The check whether a field name is reserved is performed in the ABAP Data Dictionary. The dictionary maintains a list of reserved words that may not be used for database objects. The list depends on the database system and is present in the system in the form of a database table called TRESE. The table has two columns:

NAME – represents the reserved key word itself

SOURCEHINT – the reason for the reservation, i.e. which DB type the keyword is reserved for


In summary, the table TRESE in the ABAP Dictionary stores reserved or protected names that cannot be used in the names of dictionary objects.
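As an illustration only, the kind of check the dictionary performs can be sketched in a few lines. The sample reserved words below are hypothetical; the real TRESE content depends on the database system:

```python
# Illustrative sketch of the DDIC field-name check against TRESE-like
# entries. The sample words and the check itself are simplifications;
# the actual list is database-dependent.
RESERVED = {
    # NAME      SOURCEHINT (reason / DB type)
    "SELECT": "reserved SQL keyword",
    "GROUP":  "reserved SQL keyword",
    "ORDER":  "reserved SQL keyword",
}

def field_name_allowed(name: str) -> bool:
    """Return True if the field name is not reserved, i.e. the
    dictionary object could be activated."""
    return name.upper() not in RESERVED

print(field_name_allowed("MATNR"))   # a regular field name passes
print(field_name_allowed("select"))  # reserved, activation would fail
```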


Thursday, August 24, 2023

Unavailable SAP Notes

There are a few cases in which SAP Notes or Knowledge Base Articles (KBAs) cannot be viewed on the SAP for Me (formerly ONE Support Launchpad (OSS)) site.


1. SAP Note/KBA XXXXXXX is being updated. This happens when a new version of the particular Note is being prepared. As a Note initially exists only in its master language, there can be a delay of a few days while it is translated (e.g. from German to English). If the Note is still unavailable after a few days, you can approach SAP, e.g. via social media, and ask for the Note. That usually helps to speed up the updating process.

What makes this a bit unfortunate, and frustrating too, is that during the update the former version of the Note is not accessible either. This issue has been raised with SAP many times, but it is not solved yet. SAP argues that the old version is sometimes incorrect (it can even be harmful), so they decided to take the whole Note offline for a while.


2. Sorry! You are not entitled to access SAP Note/KBA. This message represents a so-called maintenance terminated status. It appears when the customer number your support site user is linked to has lost access to the software component the Note relates to, e.g. a license for that software is not or was not provisioned for your customer number. In this case you need to raise the issue with your SAP representative (account manager).

More information:


Friday, August 4, 2023

ABAP Platform Trial 1909 SP07

The last time SAP released an on-premise version of the SAP ABAP Platform/NetWeaver developer edition was on Feb 15th 2021. It was a version called SAP ABAP Platform 1909, Developer Edition, and it was provided as a Docker image, allowing developers to run it on their own machines as a container. That edition was available until Dec 2021, when the Log4j vulnerability was revealed. The ABAP Platform 1909 became a victim of Log4j like many other software products, as vendors decided to pull their downloads until the vulnerability was properly addressed.

SAP took its time deciding on the fate of the ABAP Platform 1909 container image. Initially the decision was scheduled to be announced on January 10th 2022. That was delayed until Oct 23rd 2022, when SAP announced that “SAP’s product and delivery standards have evolved…”. Basically, the statement said that SAP prefers cloud versions over something that independent developers can install by themselves. And, in fact, CAL versions were provided in the meantime. Anyhow, this move sparked quite a discussion and criticism (e.g. here (363 comments), or here) within the SAP developer community. Following all that, an idea was created on Dec 12th 2022 on the SAP Influence page. It got the attention of 278 voters. After approximately 7 months, the idea status suddenly moved to Delivered. Via a blog post, SAP announced that the Docker-based version of the ABAP server is back. It is again ABAP Platform Trial 1909, just with a slightly higher Service Pack version – 07.

How to make it run? Just follow the instructions on the Docker Hub page. All you need is a machine with a lot of memory (32+ GB), a DockerHub account, and, if you are on Windows, Docker Desktop. Linux and macOS are supported as well. Once you have Docker Desktop up and running, run the command below to pull the image from the hub to your machine:

docker pull sapse/abap-platform-trial:1909

After 30 minutes or so (depending on the speed of your internet connection), the image extraction part starts, which creates a new container in your Docker installation:

Once it is finished start the container with below command:

docker run --stop-timeout 3600 -i --name a4h -h vhcala4hci -p 3200:3200 -p 3300:3300 -p 8443:8443 -p 30213:30213 -p 50000:50000 -p 50001:50001 sapse/abap-platform-trial:1909 -skip-limits-check

Once you get a message in the command prompt saying “*** Have fun! ***” the ABAP trial is up and running.

To stop it, just hit CTRL+C in the terminal window that started it. Logs like the ones below will pop up:

Interrupted by user

My termination has been requested

Stopping services

Terminating -> Worker Processes (2919)


Finally passing away ...

Good Bye!


To start it again, run the container (a regular start) with the command below:

docker start -ai a4h

In my case, as I run Docker on WSL2 (there are different engines available to power it), I faced an issue during the first attempt to run the container. It hung at “HDB: starting” and did not move on for a couple of hours:

WARNING: the following system limits are below recommended values:

  (sysctl kernel.shmmni = 4096) < 32768

  (sysctl vm.max_map_count = 65530) < 2147483647

  (sysctl fs.file-max = 2668219) < 20000000

  (sysctl fs.aio-max-nr = 65536) < 18446744073709551615

Hint: consider adding these parameters to your docker run command:

  --sysctl kernel.shmmni=32768

Hint: if you are on Linux, consider running the following system commands:

  sudo sysctl vm.max_map_count=2147483647

  sudo sysctl fs.file-max=20000000

  sudo sysctl fs.aio-max-nr=18446744073709551615

sapinit: starting

start hostcontrol using profile /usr/sap/hostctrl/exe/host_profile

Impromptu CCC initialization by 'rscpCInit'.

  See SAP note 1266393.

Impromptu CCC initialization by 'rscpCInit'.

  See SAP note 1266393.

sapinit: started, pid=14


HDB: starting


This is due to the fact that in the WSL2 case there are no sizing options by default. WSL2 has a built-in dynamic memory and CPU allocation feature, which means that Docker utilizes only the memory and CPU it requires. But as other processes in the Windows OS need resources too, it became a problem because WSL2 consumed all available memory. This can be solved by editing the .wslconfig file located in Windows' USERPROFILE folder. I altered the file as below:


[wsl2]
memory=26GB   # Limits memory available to WSL2 (and thus Docker)
processors=5  # Limits the number of processors


An option other than modifying the .wslconfig file is to start the container with the following additional parameters (in bold: -m 26g --cpus 5) of the “docker run” command.

docker run --stop-timeout 3600 -i --name a4h -h vhcala4hci -p 3200:3200 -p 3300:3300 -p 8443:8443 -p 30213:30213 -p 50000:50000 -p 50001:50001 -m 26g --cpus 5 sapse/abap-platform-trial:1909 -skip-limits-check


If you can’t get your container up and running, there are a couple of things to check with respect to the HANA DB:



You may need to review the log files in those folders. In most cases it will be an issue of insufficient operating memory or hard drive space.

For extending the licenses of either ABAP Platform (AS ABAP) or HANA DB see my post here.

One more thing to clarify is the naming convention – the difference between SAP NetWeaver AS ABAP Developer Edition and SAP ABAP Platform Trial. For one thing, SAP has shifted away from NetWeaver to the ABAP Platform. Technology-wise, the ABAP trial is delivered as a Docker container, whereas the NetWeaver Developer Editions are based on virtual machine software like Oracle VirtualBox.

In closing, I must say that I really appreciate the effort of all the people who made this great distribution of the ABAP Platform available again. I especially appreciate that SAP made a commitment to also deliver feature releases (versions 2023, 2025 and 2027, according to the release strategy for SAP S/4HANA). Even more, they plan to release the ABAP Platform Trial subsequently whenever there is a new SP update. Thanks again for that!

More information:

Power up your SAP NetWeaver Application Server ABAP Developer edition

SAP ABAP Platform 1909, Developer Edition – installation on WINDOWS OS

Saturday, July 29, 2023

Repairable DTP data load request

When using Data Transfer Processes (DTPs) to load data, there is an important feature called "Repairability". If a data load request fails, it appears in red status, indicating that the data was not loaded correctly into the target destination. Such failures can occur for various reasons, such as extraction issues, network errors, database locks, data transformation errors, or data integrity issues such as missing master data.

The good news is that, depending on the cause of the data load error, the data processed by the DTP can still be loaded and repaired. This means that if the DTP data load request has a "Repairable = Yes" status on the DTP's monitor screen, the data can be fixed and reloaded.

However, this repair option is only available in specific cases, typically when the data is still present in the DTP's temporary storage. There are two particular scenarios that need to be met for successful data repair:

1. The source of the extracted data must hold the data in a table with a technical key (request, data package). Additionally, each data package in the source must be uniquely assigned to a data package extracted by the DTP. In case of incorrectly processed data packages, they can be reconstructed uniquely in another attempt using selections from the source data.

2. If the source does not have this property, there is still an option to reconstruct the incorrect data from the temporary storage created at runtime by the DTP. However, this is only possible when all the data from the source was extracted in the first attempt and when at least one temporary storage was created for all data packages.
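Purely as an illustration of the logic described in the two scenarios above (not SAP's actual implementation), the decision could be sketched as:

```python
def is_repairable(source_has_technical_key: bool,
                  packages_uniquely_assigned: bool,
                  full_extraction_in_first_attempt: bool,
                  temp_storage_for_all_packages: bool) -> bool:
    """Sketch of the two repair scenarios described above."""
    # Scenario 1: the source holds the data in a table with a
    # technical key (request, data package) and each source package
    # maps uniquely to a package extracted by the DTP.
    if source_has_technical_key and packages_uniquely_assigned:
        return True
    # Scenario 2: reconstruct the incorrect data from the DTP's
    # temporary storage, which requires a complete first extraction
    # and temporary storage for all data packages.
    return (full_extraction_in_first_attempt
            and temp_storage_for_all_packages)
```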

The technical evaluation of the data load request for the "Repairable" flag is carried out using the ABAP method IF_RSBK_REQUEST~GET_REPAIRABLE in the class CL_RSBK_REQUEST_RED. The flag itself refers to the data element RSBKREPAIRABLE (Indicator: Request Is Repairable) in the data dictionary.

In conclusion, the repairability feature of data load requests in DTPs provides a valuable way to address and correct data load failures, ensuring smoother data integration into the target destination.

Thursday, July 27, 2023

SAP BW Standard Transport System

In SAP BW, a transport system is used to move BW objects from one system to another. It facilitates the movement of data models, DataSources, transformations, InfoObjects, DataStore objects, query objects and other BW-specific artifacts between different environments, such as development, testing, and production systems. The standard transport system in SAP BW is part of the overall SAP Transport Management System (TMS); the BW-specific flavor is called the BW standard transport system, and it can be active or inactive in the BW system. How does the BW system behave in each case?

Standard transport inactive (deactivated) – All new objects are created as local objects, meaning they belong to the $TMP development package. No dialog window appears to assign a development class or a transport request upon object creation. Once these objects need to be transported, they have to be collected into a different development package, e.g. via the Transport Connection tool.

Standard transport active – New objects are no longer created as local objects. Instead, the developer is asked to assign the objects to a development package and a transport request upon creation of the new BW object.

A developer can switch the BW standard transport system on/off in SAP GUI as follows.

1. t-code RSA1 -> Transport Connection, menu Edit -> Transport -> Switch On Standard

2. t-code RSOR -> same menu path as above

Information on which transport type a BW system is set to is stored in the table RSADMINS, in its field TADIRPOPUP. If an X value is saved there, the BW standard transport system is switched on.

Technically, there is an ABAP SQL UPDATE statement that manipulates this column. The code is implemented in ABAP class CL_RSO_GUI_REPOSITORY, method TRANSPORT_STD_MODIFY.

One more specialty exists with respect to BEx/BW query objects. These artifacts are written to a fixed BEx request. In addition, you can optionally determine a BEx request per package. BEx objects are written directly to a request and not to a developer's task. The popup to enter the BEx request is also available in the Transport Connection tool.

Tuesday, July 11, 2023

Program to find personal data in BW/BEx report objects

Within BW reports (formerly BEx reports), personal information in the form of an SAP user name or email can be stored. There are multiple BW report object types that can store such information: query elements (ELEM), query views (QVIW), workbooks (XLBW), web templates (BTMP), enterprise reports (ERPT), broadcast settings (BRSE), bookmarks (ABAP, JAVA runtime or BICS), web items (BITM), query variants (RSRT) and so on.

Imagine a situation where one particular user referenced in objects like the above needs to be replaced with a new one. This can be the case when the original user is leaving an organization, or similarly when a user's email has changed (e.g. in case of marriage). How do you find all the objects in which the user ID/email is stored?

SAP provides an ABAP program to do exactly that. It is called RS_FIND_USER_INFO and serves the purpose of finding user information across different BW/BEx objects.

On its selection screen, the user whose information is searched for needs to be specified. On top of that, the program can run in three modes:

1. Display the particular BW/BEx tables where the information was found and how many entries there are in those tables. See the picture below.

2. Replacement of the user in the OWNER field – OWNER is the user who created the particular BW/BEx object. In this mode, the OWNER value can be replaced.

3. Replacement of the user in other fields – The program replaces the user name in other fields such as AUTHOR, TSTPNM, LASTUSER, CHANGED_BY, etc.

One more option on the selection screen is a checkbox for deleting the records in the tables that carry the user record.

Example of the program’s output:

The program first appeared in BW 7.4 SP20 and is also present in BW/4HANA systems.


More info:

2603432 - Report RS_FIND_USER_INFO

2642676 - NW 7.50 - BEx 7.x Java runtime – deletion of data

Sunday, July 9, 2023

SAP Notes in different languages

Translations of SAP Note content have been available on the SAP support site for quite some time. Links to the translated Notes are usually available at the very end of a Note, in a section called Available Languages that lists all the languages.

There is also a URL parameter that can be used to access an SAP Note in a specific language, like in the examples below: English, Italian, German, Spanish, Portuguese, French, Russian, Japanese, Chinese, Korean.


Normally, except for German and English, the translation to all other languages is just a machine translation. As with any machine-translated text, there is no guarantee of accuracy or completeness.

In addition, there can be another problem: code snippets, parameter names and similar things that are not supposed to be translated sometimes get translated too.

Anyhow, if someone is interested in learning new languages, this can also be a chance :-)

Wednesday, June 28, 2023

SAP Data Warehousing solutions as of 2023

Data warehousing is a category of enterprise software that integrates data from different sources, transforms it into a consistent and structured format, and provides users with easy access to the information needed for decision-making purposes.

As of Q2 2023, SAP has three data warehouse solutions, depending on the deployment scenario:


A) on-prem & private cloud:

1. SAP HANA SQL DWH (Data Warehousing) – brings the SAP HANA platform with loosely coupled tools and its platform services (HANA application, integration, processing and database services); a best-of-breed approach to building your own data models

2. SAP BW/4HANA – a packaged data warehousing solution, successor of SAP BW, with all DWH services in one integrated repository (modeling, monitoring and managing the DWH)

B) public cloud:

3. SAP Datasphere (formerly SAP Data Warehouse Cloud) – a cloud-based data warehousing service


In section A) there would also be a place for the classic SAP Business Warehouse (SAP BW) up to its latest version, 7.5. However, SAP BW is based on the SAP NetWeaver stack, which is reaching its end-of-maintenance status. Due to this, the latest version of SAP BW – 7.5 – has a planned end of mainstream maintenance on Dec 31st, 2027.

Another perspective is whether the data warehouse is application-driven or native (SQL-type). SAP Datasphere and SAP BW/4HANA are application-driven solutions, whereas SAP HANA SQL DWH is a native-driven type of DWH solution.

Each solution caters to different needs and deployment scenarios, enabling customers to effectively manage and leverage their data assets for improved decision-making and business outcomes.

More information:



SQL Data Warehouse

Thursday, June 22, 2023

Upport Downport Overview (UDO) tool

SAP, like any other software vendor, needs to support its software throughout its whole lifecycle. This also involves providing code changes introduced in higher-level releases to lower ones. A process like that is called downporting. There can be e.g. legal reasons for it.

Downporting is the process of porting an application or software component from a higher version of a platform to a lower version of the same platform. There is also an opposite process called upporting, which involves porting software from a lower to a higher version.

In the world of SAP software, downporting and upporting are done using the UDO tool, which simplifies the process by generating a report that is saved in the system in the normal deliverable package and shipped in an SAP Note.

The UDO tool is a continuation of other tools like the Correction Workbench (t-code SCWB), ABAP Split Screen Editor (t-code SE39), Version Management (dev package SVRS), Note Assistant (t-code SNOTE), etc. It compares the active version of an ABAP object in a remote system (the system that has the feature implemented) with the version in the local SAP system (the system where the feature is to be implemented, aka downported). It then collects DDIC changes and documentation changes and prepares a new ABAP program. The ABAP program is usually called UDO_NOTE_X, or in the BW area RS_UDO_NOTE_X, where X stands for the SAP Note number, e.g. RS_UDO_NOTE_2367512. This report is called the UDO report. The UDO report is published in a specific SAP Note on the SAP ONE Support portal and can be downloaded from there. When the UDO report is executed (e.g. in a customer system), it takes care of the DDIC objects and documentation, creating and activating them in the target system.

If the UDO tool were not in place, all of these activities would need to be done manually. Of course, no one wants to do this job. On one hand, the responsible developer may forget to write down all the steps needed; on the other hand, the person implementing those steps manually may forget to perform a few of them. Therefore, the need to automate such activities is obvious.

Technically, the UDO tool is implemented by the ABAP program SAP_LOCAL_DOWNPORT_ASSISTANT, associated with t-code UDO, and it is part of the SAP Software Logistics Toolset.

Wednesday, June 21, 2023

Enhanced Master Data Update

One of the new BW features introduced in the BW/4HANA version is the Enhanced Master Data Update. It can be enabled for characteristics (attributes and/or texts) type InfoObjects. Only characteristics that have master data are supported. This feature brings the possibility of parallel loading of data into the IO. If the feature is enabled, new tables are created for the IO. The tables are similar to the inbound and active tables of an aDSO object. As in the aDSO case, the data is first loaded into the inbound table; upon data activation, the attribute and/or text tables of the IO are updated.

The advantage of this feature shows while huge portions of MD are being loaded and/or loaded from multiple sources, as the data can be loaded in parallel. The delta mechanism is also supported.

The feature can be enabled in the BW Modeling Tools of SAP HANA Studio, on the IO maintenance screen, on the General tab.

Technically, the flag is stored in table RSDCHABAS, column ENHANCED_MD_UPDATE. Its values refer to the domain RSDMDUPDATEENH, which has the following two values:

0        Does not use enhanced master data update

1        Uses enhanced master data update


Once the enhanced MD update is enabled for a characteristic, the related settings can be maintained in t-code RSDMD_SETTINGS. The t-code refers to class CL_RSDMD_SETTINGS_CTRL, method MAINTAIN_SETTINGS.

In case of issues with IO activation where the error message complains that the Enhanced Master Data Update cannot be used for characteristic X due to ... (e.g. message no. R7B420, R7B428, etc.), there is an ABAP program RSD_CHA_ADAPT_DEFAULT_ENHMDUPD that fixes those.

Sunday, June 18, 2023

Consuming data from BW or BW/4 in BW based on ODP

In classic BW systems, the scenario where one BW serves data to other BW systems involves the creation of a so-called export DataSource (DS). That DS is created in the source system and follows the naming convention 8*. In the BW system where the DS is to be consumed, the export DS is replicated, after which it is visible in the target system. From there on, a data flow can be built on top of that replicated DS.

In case of a BW/4 system as the target, export DSs are obsolete. There is no need to create an export DataSource anymore. By leveraging the ODP technology, objects from the source BW are available in the target system. There is a large variety of supported objects, like aDSOs, CompositeProviders, former BW classic objects (InfoCubes, MultiProviders, semantically partitioned objects, InfoSets, Queries as InfoProviders) and finally InfoObjects as well. In the case of InfoObjects, attributes, texts and hierarchies are supported within the ODP-BW context.

Now, how to get the data from the source BW to the target one? In the target BW, just replicate the node available under the ODP-BW context. That activity is done in SAP HANA Studio. While doing it, a list of available objects from the source BW appears in a popup window. From there you just choose the object(s) whose data needs to be loaded to the target. Afterwards, the newly created DS(s) will appear in the target BW. Their names are the same as the objects in the source, just with a postfix describing the nature of the data for the object:

*$F    - Transaction Data/Facts

*$P    - Master Data/Attributes

*$Q    - Time-Dependent Master Data/Attributes

*$T    - Texts

*$H    - Hierarchies
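The naming convention above can be captured in a small helper – an illustrative sketch only, with a hypothetical DataSource name:

```python
# Maps the ODP-BW DataSource name postfix to the kind of data it
# carries, per the naming convention listed above.
ODP_SUFFIXES = {
    "$F": "Transaction Data/Facts",
    "$P": "Master Data/Attributes",
    "$Q": "Time-Dependent Master Data/Attributes",
    "$T": "Texts",
    "$H": "Hierarchies",
}

def odp_data_kind(datasource_name: str) -> str:
    """Return the kind of data a replicated ODP-BW DataSource carries,
    based on its name postfix."""
    for suffix, kind in ODP_SUFFIXES.items():
        if datasource_name.endswith(suffix):
            return kind
    return "unknown"

print(odp_data_kind("MYADSO$F"))  # Transaction Data/Facts
```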


Just to add a few more remarks on working with ODP. Technically, when the replication is triggered from the target to the source BW system, the following Function Modules are called:





There is a so-called ODP extractor checker that can be used to test the extraction in the source system. It is kind of a new RSA3 – the t-code that checked the extractors in case of classic S-API plug-in based extraction. For ODP-based extraction, it is the ABAP program RODPS_REPL_TEST (ODP Replication Test) that serves the same purpose.

In addition, notice that ODP has gone through some development and it is recommended to use the latest version, which is ODP Data Replication API 2.0. Version 2.0 is available as of:

SAP_BW 750 SP 0 (incl. former PI_BASIS packages)

DW4CORE 200 SP 0 (incl. former PI_BASIS packages)


More information:

Operational Data Provisioning (ODP)

Online docu - Transferring Data from SAP BW or SAP BW/4HANA Systems via ODP (InfoProviders, Export DataSources)

2483299 - BW4SL & BWbridgeSL - Export DataSources (SAP BW/4HANA or SAP Datasphere, BW bridge as Source)

Sunday, June 11, 2023

What is a difference SAP RISE vs SAP GROW (GROW with SAP)?

SAP has been pitching its S/4 cloud solution to its customers very hard for years. However, as per surveys conducted by SAP user groups like ASUG or DSAG, the adoption is not at the level SAP would wish it to be.

Therefore, SAP continues to provide offerings to help customers with their transformation to S/4. It started back in 2021 with SAP RISE, followed this year at Sapphire (March 21st, 2023) by another offering – SAP GROW (or GROW with SAP).

Let’s look at both offerings in a nutshell.

SAP RISE is an offering that combines SAP S/4HANA Cloud public and private edition, accelerated adoption services (e.g. SAP Readiness Check, Custom Code Migration, etc.), SAP Signavio to manage business processes, and pre-configured best practices. See more information here.

SAP GROW is also built on SAP S/4HANA Cloud public edition, with a focus on midsize companies where business processes are more standardized.

The main difference between them is that RISE with SAP is more flexible and customizable, while GROW with SAP is more standardized and out-of-the-box. RISE with SAP targets larger and more complex businesses and offers both public and private cloud options for SAP S/4HANA. GROW with SAP, on the other hand, focuses on midsize companies and only provides SAP S/4HANA Cloud, public edition.


More information:

SAP RISE (RISE with SAP) blog

RISE page

RISE online docu

GROW page

Thursday, June 1, 2023

SAP Smart Data Integration (SDI) and difference vs Smart Data Access (SDA)

HANA as a data management platform needs to connect to other data sources to get data. There are many integration patterns that can enable such a connection: ETL processing, real-time replication, data integration, etc. Each of them has its own advantages as well as disadvantages. With the introduction of SAP HANA Smart Data Integration (SDI), SAP is pursuing the best of several of them.

SAP Smart Data Integration (SDI) is an extension of SAP HANA Smart Data Access (SDA). SDA leverages the concept of virtual tables (a data federation kind of thing). That means data replication is eliminated by providing a virtual layer that abstracts the underlying data sources. Metadata of the source system's tables is imported as virtual tables into HANA. On top of SDA, SDI brings a broader set of preinstalled adapters from the Data Provisioning Agent (DP Agent) component. In the SDI case, the adapters are no longer part of the HANA DB; they run as separate processes in the DP Agent component.

SDI is part of SAP's Enterprise Information Management (EIM) solution. Its purpose is to connect data sources to HANA, provision data from them and load that data into HANA.

As SDI brings more functions into the EIM process, some customers are opting for it. Therefore, e.g. in the SAP BW area, they are migrating their SDA-based DataSources to SDI ones.


SDA is available since SAP HANA 1.0 SPS06

SDI is available since SAP HANA 1.0 SPS09

More information:


SDI online docu

2400022 - FAQ: SAP HANA Smart Data Integration (SDI)

2180119 - FAQ: SAP HANA Smart Data Access

2600176 – SAP HANA Smart Data Access Supported Remote Sources

Monday, May 29, 2023

Stack-ability of Composite Providers

Composite Providers have a setting that enables them to be reused in other Composite Providers. In other words, the setting describes the reusability, or stack-ability, of Composite Providers. The setting is available in SAP HANA Studio – BW Modeling Tools, on the Composite Provider maintenance screen under the General tab: “Composite View is re-usable as PartProvider of other Composite Views”.

Technically, the setting is implemented in table RSOHCPR, field STACKABLE. If the flag is set (STACKABLE = X), the Composite Provider is of the stacked Composite Provider type. The setting is handled in methods like IS_STACKABLE, CHECK_STACKABLE and SET_STACKABLE of ABAP class CL_HCPR_COMPOSITE_PROVIDER.

More information:

Friday, May 12, 2023

SAP background jobs and processes behind transports (CTS)

The transport system for SAP objects is a component that enables the controlled movement of software configurations, developments, and customizations across the different systems in a landscape. It ensures the deployment of changes from development to testing, quality assurance, and ultimately to production systems.

There are several steps involved while a particular object is being moved. The proper sequence and list of executed steps depend on what type of objects are present in the transport request (TR).

Some of the steps are ABAP steps performed in the SAP system. These are:

- ABAP Dictionary activation (A)

- Distribution of ABAP Dictionary objects (S)

- Table conversion (N)

- Matchcode activation (M)

- Import of application objects (D)

- Update of version management (U)

- Execution of XPRAs (R); XPRA = EXecution of PRograms After import, or Executable Program for Repair and Adjustment


When a transport request is being imported into a target system, the following steps are executed:

Step                                      Program (ABAP prg/OS-level cmd) executing the step

SHADOW_IMPORT()                           OS cmd: R3trans

DD IMPORT (H)                             OS cmd: R3trans

DD ACTIVATION (A)                         ABAP prg: RDDMASGL

DISTRIBUTION OF DD OBJECTS (S)            ABAP prg: RDDDIS0L

TBATG CONVERSION OF DD OBJECTS            ABAP prg: RDDGEN0L

tpmvntabs                                 OS cmd: tp

MAIN IMPORT (I)                           OS cmd: R3trans

tpmvkernel (C)                            OS cmd: tp

TBATG CONVERSION OF MC OBJECTS            ABAP prg: RDDGEN0L

VERSION UPDATE (V)                        ABAP prg: RDDVERSL



Transport-related jobs (RDD* jobs):

Job/ABAP program RDDIMPDP (or JOB_RDDNEWPP) – a dispatcher (transport daemon) program for transport activities running within NetWeaver/ABAP Platform based systems. It receives information from the external transport programs (OS-level programs) tp and R3trans via table TRBAT. The job is normally scheduled by the ABAP program RDDNEWPP.

Shadow import – relevant for SAP system upgrades only. It imports the upgrade data into customizing tables, as well as the transport requests of add-ons, support packages, and languages.

DD import – the phase in which Data Dictionary (DDIC) objects are imported into the target system. DDIC objects include data elements, tables, views, domains, and other dictionary-related components.
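At OS level, the DD import (like the other R3trans phases) is driven by R3trans with a small control file. A minimal, illustrative import control file might look like the following; the file path and request number are assumptions, and exact control-file syntax varies by release:

```
import
file = /usr/sap/trans/data/R900123.DEV
```

Invoked roughly as R3trans <controlfile>; the return code (0 = success, 4 = warnings, 8 or higher = errors) indicates the outcome.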

DD activation - Job/ABAP program RDDMASGL - runs Data Dictionary activation if the TR contains at least one DDIC object that was imported earlier, during the DD import step.

DISTRIBUTION OF DD OBJECTS - Job/ABAP program RDDDIS0L – the phase in which distribution tasks for DDIC objects are performed. It handles the distribution and activation of DDIC objects across the SAP system landscape. RDDDIS0L runs the report RDDGENBB if the transport request's object list contains at least one DDIC object whose DDIC import and DDIC activation have already happened and which has to be adopted on database level. Table structures are changed if necessary.

TBATG CONVERSION OF DD OBJECTS - Job/ABAP program RDDGEN0L - the conversion of DDIC objects during the transport process, i.e. converting them from the source system's format to the target system's format.

tpmvntabs - the phase in which tp triggers the move of new name tabs (nametabs, the runtime descriptions of DDIC objects) into the active runtime environment, so that newly imported or changed table structures become effective. On the ABAP side this is performed by program RDDMNTAB.

MAIN IMPORT - the phase where the actual objects contained in the TR are imported into the target system. Tasks like object integration, object activation, dependency resolution, conflict resolution, and post-import activities are performed here.
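The dependency resolution mentioned here is internal to the transport tools, but the underlying idea can be illustrated with a generic topological sort: an object is only processed after everything it depends on. The object names below are invented for illustration:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical dependency graph: each object maps to the objects
# that must be processed before it.
deps = {
    "VIEW_ZORDERS_V": {"TABLE_ZORDERS"},
    "TABLE_ZORDERS": {"DOMAIN_ZSTATUS", "DTEL_ZORDERID"},
    "DTEL_ZORDERID": {"DOMAIN_ZORDERID"},
    "DOMAIN_ZSTATUS": set(),
    "DOMAIN_ZORDERID": set(),
}

# static_order() yields every object after all of its prerequisites,
# e.g. domains before data elements, data elements before the table.
order = list(TopologicalSorter(deps).static_order())
```

This is only a conceptual analogy; the transport tools use their own internal mechanisms, not this algorithm as written.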

TBATG CONVERSION OF MC OBJECTS - Job/ABAP program RDDGEN0L - the conversion of matchcode (MC) objects during the transport process; this corresponds to the matchcode activation step (M) from the list above. Imported matchcode objects are converted and activated so that they fit the target system's database.

Tpmvkernel -

IMPORT OF SELFDEFINED OBJECTS - Job/ABAP program RDDDIC1L – imports ADOs (Application Defined Objects); the job is triggered by tp via table TRBAT.

VERSION UPDATE - Job/ABAP program RDDVERSL - updates the version information of the transported objects in the target system, ensuring that the system recognizes and tracks the correct versions of the imported objects.

EXECUTION OF REPORTS AFTER PUT - Job/ABAP program RDDEXECL - runs XPRAs (method execution) after the objects contained in the transport request have been imported into the target system: if an application has its own object in the transport object list that was imported during the main import, its application-specific method/function/program is executed here.


Other ABAP programs related to transports:

RDDPROTT - displays the log for a specific transport request. Used by t-code SCTS_LOG (Transport Log Overview).

RDDFDBCK - import feedback process (Message After Exporting or Importing Requests)

RDDMNTAB - runs a move nametab: for new Data Dictionary objects that were imported, their runtime objects have to be put into the active runtime environment.

RDDDIC0L – Exports ADO (Application Defined Objects) objects

RDDDIC1L – Imports ADO objects

RDDDIC3L - Generates ABAP programs and screens

RDDVERSE – Version Management: version at export


Important tables:

TRBAT - forms the interface between the transport control program tp and the SAP system. To trigger an ABAP step, tp writes control information to this table; the event-triggered background job RDDIMPDP (scheduled in the JOB_RDDNEWPP phase by program RDDNEWPP) then picks the entries up.

TRBATS - Communication Table for Deployment Control

TRJOB - the ID of the background job (e.g. 07444600) currently running for a transport step in the SAP system is entered in this table.


More information:

Change and Transport system

2147060 - Import steps not specific to transport request