Friday, May 12, 2023

SAP background jobs and processes behind transports (CTS)

The SAP Change and Transport System (CTS) is the component that enables the controlled movement of software configurations, developments, and customizations across the different systems in a landscape. It ensures that changes are deployed from development to testing, quality assurance, and ultimately to production systems.

There are several steps involved while particular objects are being moved. The exact sequence and list of the steps executed depends on what types of objects are present in the transport request (TR).

Some of the steps are ABAP steps performed within the SAP system. These are:

- ABAP Dictionary activation (A)

- Distribution of ABAP Dictionary objects (S)

- Table conversion (N)

- Matchcode activation (M)

- Import of application objects (D)

- Update of version management (U)

- Execution of XPRAs (R); XPRA = eXecution of PRograms After import, also read as Executable Program for Repair and Adjustment


When a transport request is being imported into a target system, the following steps are executed:

Step – Program (ABAP program or OS-level command) executing the step

SHADOW_IMPORT – OS cmd: R3trans

DD IMPORT (H) – OS cmd: R3trans

DD ACTIVATION (A) – ABAP prg: RDDMASGL

DISTRIBUTION OF DD OBJECTS (S) – ABAP prg: RDDDIS0L

tpmvntabs – OS cmd: tp

MAIN IMPORT (I) – OS cmd: R3trans

tpmvkernel (C) – OS cmd: tp

VERSION UPDATE (V) – ABAP prg: RDDVERSL
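The fixed ordering of these steps can be sketched in a few lines of Python (purely illustrative; the real work is done by tp, R3trans and the RDD* jobs described below):

```python
# Minimal sketch of the transport import step sequence. The step names and
# executors come from the table above; nothing here calls the real tools.
IMPORT_STEPS = [
    ("SHADOW_IMPORT",                  "OS cmd: R3trans"),
    ("DD IMPORT (H)",                  "OS cmd: R3trans"),
    ("DD ACTIVATION (A)",              "ABAP prg: RDDMASGL"),
    ("DISTRIBUTION OF DD OBJECTS (S)", "ABAP prg: RDDDIS0L"),
    ("tpmvntabs",                      "OS cmd: tp"),
    ("MAIN IMPORT (I)",                "OS cmd: R3trans"),
    ("tpmvkernel (C)",                 "OS cmd: tp"),
    ("VERSION UPDATE (V)",             "ABAP prg: RDDVERSL"),
]

def run_import(transport_request: str) -> list[str]:
    """Return the executed-step log for one TR, in the fixed CTS order."""
    return [f"{transport_request}: {step} -> {executor}"
            for step, executor in IMPORT_STEPS]
```

The point of the sketch is simply that the order is fixed by the CTS, regardless of which TR is being imported.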



Transport-related (RDD*) jobs:

Job/ABAP program RDDIMPDP - a dispatcher (transport daemon) program for transport activities running within NetWeaver/ABAP Platform based systems. It receives information from the external transport programs (OS-level programs) tp and R3trans via table TRBAT. The job is normally scheduled by the ABAP program RDDNEWPP (see also the JOB_RDDNEWPP phase below).
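The handshake between tp and RDDIMPDP can be pictured with a toy model (the field names and status values below are illustrative, not the real TRBAT structure):

```python
# Toy model of the tp <-> RDDIMPDP handshake over table TRBAT.
# tp writes a control record; the dispatcher polls and starts the RDD* job.
trbat = []  # stands in for table TRBAT

def tp_trigger_step(request: str, step: str) -> None:
    """tp writes a control record to TRBAT to request an ABAP step."""
    trbat.append({"trkorr": request, "function": step, "status": "scheduled"})

def rddimpdp_dispatch() -> list[str]:
    """RDDIMPDP polls TRBAT and starts the matching RDD* job per step."""
    step_to_job = {"A": "RDDMASGL", "S": "RDDDIS0L", "V": "RDDVERSL"}
    started = []
    for rec in trbat:
        if rec["status"] == "scheduled" and rec["function"] in step_to_job:
            rec["status"] = "running"
            started.append(step_to_job[rec["function"]])
    return started
```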

Shadow import – relevant for SAP system upgrades only. It imports the upgrade data into customizing tables, as well as the transport requests of add-ons, support packages, and languages.

DD import – is a phase where the Data Dictionary (DDIC) objects are imported into the target system during the transport process. The DDIC objects include data elements, tables, views, domains, and other dictionary-related components.

DD activation - Job/ABAP program RDDMASGL - runs the Data Dictionary activation for any DDIC object in the TR that was imported earlier, during the DD import phase.

DISTRIBUTION OF DD OBJECTS - Job/ABAP program RDDDIS0L – phase where the distribution tasks related to DDIC objects are performed during the transport process. This phase handles the distribution and activation of DDIC objects across the SAP system landscape. RDDDIS0L runs the report RDDGENBB if the transport request object list has at least one DDIC object, its DDIC import and DDIC activation have already happened, and the object has to be adopted at database level. Table structures are changed if necessary.

TBATG CONVERSION OF DD OBJECTS - Job/ABAP program RDDGEN0L - this phase refers to the conversion of DDIC objects during the transport process. It is the conversion from the original system format to a format of the target system.

Tpmvntabs – step executed by tp that moves the new name tables (nametabs) of the imported DDIC objects into the active runtime environment, so that the new runtime objects take effect in the target system (compare the program RDDMNTAB below, which performs the move nametab on the ABAP side).

MAIN IMPORT - phase where the actual objects contained in the TR are imported into the target system. Tasks like object integration, object activation, dependency resolution, conflict resolution and post-import activities are executed.

TBATG CONVERSION OF MC OBJECTS - Job/ABAP program RDDGEN0L - this phase refers to the conversion of matchcode (MC) objects during the transport process (compare the matchcode activation step (M) listed above). The imported matchcode objects are regenerated and converted so that they are consistent with the target system.

Tpmvkernel -

IMPORT OF SELFDEFINED OBJECTS - Job/ABAP program RDDDIC1L – it imports ADOs (Application Defined Objects); the control information for the step comes from table TRBAT.

VERSION UPDATE - Job/ABAP program RDDVERSL - refers to the process of updating the version information of the transported objects in the target system. This phase ensures that the system recognizes and tracks the correct versions of the imported objects.

EXECUTION OF REPORTS AFTER PUT - refers to the execution of specific reports or programs after the objects contained in the transport request have been successfully imported and applied to the target system. Job/ABAP program RDDEXECL runs the method execution (XPRA) if an application has its own object in the transport object list, that object was imported during the Main Import, and an application-specific method/function/program has to run.


Other ABAP programs related to transports:

RDDPROTT - displays the log for a specific transport request. Used by t-code SCTS_LOG (Transport Log Overview).

RDDFDBCK - import feedback process, Message After Exporting or Importing Requests

RDDDIC3L - deleting obsolete or no longer required entries from the transport request buffer in the SAP system.

RDDMNTAB - runs a move nametab: for new ABAP objects that were imported, their runtime objects have to be put into the active runtime environment.

RDDDIC0L – Exports ADO (Application Defined Objects) objects

RDDDIC1L – Imports ADO objects

RDDDIC3L - Generates ABAP programs and screens

RDDVERSE – Version Management: Version at Export


Important tables:

TRBAT - forms the interface between the transport control program tp and the SAP system. To trigger an ABAP step, tp writes control information to this table; the event-driven background job RDDIMPDP (scheduled in the JOB_RDDNEWPP phase) then picks it up.

TRBATS - Communication Table for Deployment Control

TRJOB - contains the job ID (e.g. 07444600) of the transport job currently running in the SAP system.


More information:

Change and Transport system

2147060 - Import steps not specific to transport request

Tuesday, May 9, 2023

Semantic Grouping set in a DTP or in TRFN?

Semantic Grouping enables grouping of data, based on its semantics, into the same data package processed by the BW load. In classic BW based systems this function is normally called Semantic Grouping, whereas in BW/4 based systems it is called Grouped Extraction.

If the function is used, data records are extracted from a source according to a logical key and processed together. The function can be switched on by selecting particular fields in the DTP or in the TRFN (Transformation). The DTP part has been there since BW 7.0; the availability in the TRFN came with BW 7.2.

Why is there an option to use Semantic Grouping / Grouped Extraction at the TRFN level when it is already available at the DTP level? Is there any difference between the DTP and the TRFN?

Instead of looking for a difference, it is better to look at what they can do when they are combined. The following cases are possible:

1. If error handling on the DTP is activated (e.g. ‘Cancel Request, Track First Incorrect Record, No Update’), only the source fields of the key fields can be selected in the DTP. The grouping uses the intersection of these selected key fields and the fields selected in the transformation for semantic grouping.

2. If error handling is deactivated (‘Cancel Request, Do Not Track Records, No Update’), all the source fields that were also selected in the transformation can be selected in the DTP. The grouping is formed from this intersection.

3. For compatibility reasons, in a case similar to no. 2 but where no source fields have been selected in the transformation for semantic grouping, all the source fields can be selected in the DTP.
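The intersection logic of the cases above can be sketched as follows (a minimal illustration; the field and record names are made up, and the real grouping is done by the BW runtime):

```python
from itertools import groupby

def effective_grouping(dtp_fields: set[str], trfn_fields: set[str]) -> set[str]:
    """Effective semantic key = intersection of DTP and TRFN selections."""
    return dtp_fields & trfn_fields

def build_packages(records: list[dict], key_fields: set[str]) -> list[list[dict]]:
    """Put records with the same semantic key into the same data package.

    groupby() needs pre-sorted input, so we sort by the key first.
    """
    key = lambda r: tuple(r[f] for f in sorted(key_fields))
    return [list(grp) for _, grp in groupby(sorted(records, key=key), key=key)]
```

For example, if the DTP selects CUSTOMER and DOCNR but the transformation only selects CUSTOMER, all records of one customer end up in one package.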

SAP recommends (see the online documentation links below) selecting the fields for grouping at the transformation level.

More information:

Online docu BW 7.3

Online docu BW4/HANA

SAP BW DTIS - Data Transfer Intermediate Storage

Saturday, April 8, 2023

SAP HANA license types

There are several license types available for the SAP HANA database. Below is a brief overview of the most common ones:


SAP HANA Runtime License: This license allows users to run the SAP HANA database in a production environment. It is available for both on-premise and cloud deployments and is typically based on the amount of memory utilized by the database. The runtime license is likely the most commonly purchased database license type. It is also the more restrictive license, since customers can only leverage it for SAP applications, and the underlying SAP and non-SAP data can only be loaded, exported, and managed via SAP technologies.


SAP HANA Full Use License: This license provides full access to all of the SAP HANA features, including data integration, advanced analytics, and development tools. It is typically purchased as an add-on to the SAP HANA Runtime License. The full use license is the less restrictive one, because the restrictions of the SAP HANA Runtime license do not apply. Organizations can use HANA full use for any combination of SAP, non-SAP, custom, or third-party applications. In addition, no limitations exist on loading and exporting SAP and non-SAP data directly in and out of SAP HANA.


SAP HANA Platform Edition: This license includes both the SAP HANA Runtime and Full Use licenses, as well as additional tools for data modeling and administration. It is designed for organizations that require a comprehensive data management solution.


SAP HANA Enterprise Cloud License: This license provides access to the SAP HANA database via a cloud-based subscription model. It includes both the SAP HANA Runtime and Full Use licenses, as well as cloud-specific features such as automated backups and disaster recovery.


SAP HANA Edge Edition: This license is designed for small to medium-sized businesses and includes a subset of the SAP HANA Platform Edition features. It is typically based on the number of users or CPU cores utilized by the database.


SAP HANA Express Edition: This license is a free, streamlined version of the SAP HANA database that is intended for non-production use. It includes a limited set of features and is typically used for development, testing, and training purposes. It is free up to 32 GB of memory, with the ability to purchase more, making it suitable for individuals.


The licensing terms and conditions for SAP HANA may vary depending on the specific product and deployment model. SAP clients should always check with their SAP account manager about the appropriate license type for their needs.



More information:

What type of SAP HANA (License) do you need ?

3247198 - SAP HANA License

Wednesday, April 5, 2023

Vanilla vs customized SAP systems

Vanilla is the term used when computer software (or other computing-related things like hardware) is not customized from its original form, i.e., it is used without any customizations or updates applied (see Wikipedia).

A vanilla SAP system refers to a standard installation of the SAP software without any additional customizations or enhancements. It is also known as a standard SAP system or an SAP system without any modifications.

When an organization decides to implement an SAP system, it can either choose to use the vanilla SAP system or customize it to meet its specific needs. The vanilla SAP system is flexible and configurable, so that to some extent it meets the requirements of the organization without requiring significant customization. However, it is not uncommon for organizations to require additional functionality that is not provided by the standard system.

In such cases, the organizations may choose to customize the vanilla SAP system by developing custom programs, reports, and interfaces, or by implementing add-on modules or third-party software solutions. However, customizing the SAP system can be a complex and expensive process that requires significant resources and expertise.

In summary, a vanilla SAP system is a standard installation of the SAP software without any customizations or enhancements. It provides all the standard modules and features that are offered by SAP and can be configured to meet the requirements of different businesses to some extent. However, companies may choose to customize the system to meet their specific needs, although this can be a complex and costly process.

In an era of cloud computing, when organizations are moving to the cloud, it becomes important to analyze to what extent their SAP installations are customized. Heavy SAP customization can be a challenge for the cloud, mostly because the customizations may not be compatible with the cloud environment or the cloud platform on which the SAP instance is being hosted. In addition, there might be security and compliance issues, an impact on scalability and performance, and maintenance and support challenges. It is important to carefully evaluate the impact of customization when moving SAP to the cloud and plan accordingly to minimize any potential issues.

Thursday, March 30, 2023

No Routine (ROUT) objects in BW4 transport of transformations

You may notice that in BW/4 based systems the transporting of Transformations is simplified. In classic versions of BW, objects like Routines were usually included in the transport requests in case the particular Transformation contained them. These objects (OBJECT = ROUT) represent either a formula in transformation rules or a Start/End/Expert Routine. Such transport requests look like the one below:

Routine objects of the Transformations are stored in table RSTRANSTEPROUT (Rule Type: Routine). The entries are still there in that table in BW/4 systems.

However, in BW/4 based systems there are no transport entries like that anymore. The reason is that routines and formulas are embedded in the transformation metadata. The table RSTRANSTEPROUT has a column CODE that contains all the code of a particular routine or formula. Because of that, the ROUT objects are not necessary anymore. This change has been in place since SAP BW/4HANA 1.0 SP04.


More information:

2548884 - Formula handling and transport related problems in BW/4HANA

Wednesday, March 29, 2023


SAP GUI for Windows – version 8.00

SAP released a new version of SAP GUI for Windows – version 8 – on 20 Jan 2023. It is the first release that ships as a 64-bit version alongside the 32-bit version. While installing version 8 on a 64-bit Windows OS, the user needs to decide which version of the GUI will be installed; a parallel installation of both the 32-bit and 64-bit versions is not possible. What are the other new features of the new version of SAP GUI? Here I’m mentioning just a subset of new features that I find interesting.

UI point of view:


1. Quartz theme got a new rendering engine

2. Changes to the combo box control and the "normal" combo box

3. Tree control now has a scrolling indicator. This will be useful while navigating in BW’s RSA t-code, e.g. in InfoObject or Process Chain trees. But who is using the SAP GUI for BW now? We all moved to the Eclipse-based SAP HANA Studio with BW modeling tools, right? :-)

4. Table Control and Dialog Box Container support additional hot keys to enable faster navigation, e.g. CTRL + UP/DOWN/HOME/END.

5. ALV grid supports a "paste with gaps" feature.

6. Branding support: a branding image can now be defined per SAP system and client.

Technical point of view:

The HTML Control based on WebView2 (a package that enables rendering of web content via the MS Edge browser rendering engine) supports browser tabs, which in SAP GUI 7.70 were only available via a browser extension from the Microsoft Edge Add-ons.


More information:

Online docu

3075781 - New and changed features in SAP GUI for Windows 8.00

3218166 - SAP GUI for Windows: Functional differences of the 64-bit version compared to the 32-bit version

2035293 - Known and open issues of SAP GUI for Windows

Monday, March 27, 2023

Package based aDSO activation

In SAP BW, the aDSO (advanced DataStore Object) or DSO (DataStore Object) is used as a persistence layer for storing and processing data. In the case of e.g. a standard DSO, data activation is the process of moving the data from the inbound table of the DSO to the active table. Data is aggregated accordingly within this process. See Flavors of aDSO object for more details.

In case there is a huge number (10k+) of data load requests in a specific aDSO object and they are all being activated together, there can be problems, mostly with respect to memory.

This is solved in the newer BW releases (7.50, BW/4HANA 1.0, BW/4HANA 2.0, BW/4HANA 2021) by introducing so-called package based activation. This type of data activation is available when the activation runs via a process chain, through the process variant "Clean Up Old Requests in DataStore Objects (Advanced)". In this process a package size can be defined for the specific aDSO object by the user. If it is not defined, a default package size of 10k is used.

The value itself is stored in table RSPCVARIANT (Generic Variant Storage):

The new package based data activation only works when it is triggered from process chains via the above-mentioned process variant. In case the activation is performed manually, either via t-code RSMNG or via the BW/4 Cockpit, the activation works in the old way, with no package size used.



More information:

3108217 - ADSO: packaged activation of requests

3038366 - Activation of requests in an aDSO with a huge number of request fails

Thursday, March 23, 2023

Watermarks of a BW data target object

Watermarks in BW can be considered a set of internal counters (the so-called watermarks) holding technical information about a BW data target. The following BW data target objects may have watermarks: InfoCube, DataStore Object, Master Data Table and Text Table.

There is an ABAP program (RSPM_ADSO_WATERMARKS) in SAP BW systems that displays the aDSO watermarks like:

- Number of all requests in the aDSO
- Number of nonactive requests in the aDSO
- Number of deleted requests in the aDSO
- TSN of lowest active request (AQ) in the aDSO
- TSN of highest active request (AQ) in the aDSO
- AQ Status
- TSN of lowest active request (AT) in the aDSO
- TSN of highest active request (AT) in the aDSO
- AT Status
- Count
- Count AT
- Data in aDSO must be activated
- Proc. Type
- Process ID
- Datamarted source TSN
- Requests not datamarted
- Delta information


The program gathers information from many tables like RSOADSO, RSPMREQUEST, RSMDATASTATE_DMO (DMO - DataMart Out), RSMDATASTATE_DMI (DMI - DataMart In) and others. For non-BW/4-optimized objects like the classic InfoCube, there is a function module RSSM_SHOW_WATERMARKS that gathers the watermarks for those objects.
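A few of the counters listed above can be pictured with a toy computation (the request status values here are made up for illustration and are not SAP's real status codes):

```python
# Toy computation of a few aDSO "watermark" counters from a request list.
def watermarks(requests: list[dict]) -> dict:
    """Derive some of the counters shown by RSPM_ADSO_WATERMARKS (simplified)."""
    active = [r["tsn"] for r in requests if r["status"] == "active"]
    return {
        "total_requests":     len(requests),
        "nonactive_requests": sum(1 for r in requests if r["status"] != "active"),
        "lowest_active_tsn":  min(active, default=None),
        "highest_active_tsn": max(active, default=None),
        # "loaded but not yet activated" drives the "must be activated" flag
        "must_be_activated":  any(r["status"] == "loaded" for r in requests),
    }
```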

Selection screen of the report RSPM_ADSO_WATERMARKS:

Example output screen of the same report:

Monday, March 20, 2023

Reassigning of Process Chain’s InfoArea

When I needed to reassign a process chain (PC) between InfoAreas (IA), I always struggled. The only way I knew how to do it was a drag-and-drop operation in backend t-codes like RSA1, RSPC1, etc. In this way, I moved my PC within the IA tree from the current IA node to the new one. In case the distance between the old and new node was quite big, I ended up moving the mouse until I got to the new IA, and sometimes it took a couple of minutes to get there.

However, with the introduction of the BW Cockpit, the reassignment of a PC's IA gets a lot easier. There is functionality built into the BW Cockpit to do it very easily. While in the Process Chain Editor part of the BW Cockpit, there is a Properties button in the top right part of the page.

The button brings up a dialog window where the InfoArea, among other attributes of the PC, can easily be changed to a new one. The information about the assigned IA is saved in table RSPCCHAINATTR, column APPLNM.

More information:

How to find out InfoArea for particular BW object type – in BW/4 systems?

BW’s InfoArea vs Application components

BW/4HANA Cockpit

Other posts on Process Chains topic

Monday, March 13, 2023

SAP Datasphere is the new SAP DataWarehouse Cloud

This week, on March 8th 2023, during the SAP Data Unleashed event, SAP announced a new solution called SAP Datasphere. What it means is that Data Warehouse Cloud (DWC) becomes SAP Datasphere.


What led SAP to announce this solution? Basically, they are trying to address today’s challenges of data architecture: from data warehousing (structured data) to data lakes (unstructured or any kind of data) and beyond to data fabric (integrated layer (fabric) of data and connecting processes) reaching to particular challenges like data federation, cataloging, lineage, metadata, integration and semantic modeling of data.

How does SAP address these kinds of challenges? By mixing a portfolio of their existing products like:

DWC – Data warehouse solution in cloud that evolved from SAP BW (BW/4), customers can move their BW models to Datasphere/DWC via SAP BW Bridge (BWB), thus investments made into BW are safe.

SAP Analytics Cloud (SAC) – Datasphere is integrated into SAC by supporting its analytics and planning use cases.

SAP Data Intelligence Cloud (SAP Data Intelligence formerly SAP Data Hub) – Datasphere leverages its Data Catalog functionality and engines for data moving.


And in addition, solutions from their partners like below to support the Business Data Fabric:

Databricks – provides a data lakehouse platform (data warehouse + data lake), initially built around Apache Spark.

Confluent – capturing data in motion capabilities based on Apache Kafka.

Collibra – data governance and metadata management capabilities.

DataRobot – capabilities of AI lifecycle management, a platform for augmented intelligence – AutoML.


All these capabilities together form the Datasphere, although technically speaking it is a combination of DWC and SAP Data Intelligence Cloud. The Data Warehouse Cloud is rebranded to Datasphere, with SAP claiming the Datasphere to be the next generation of the DWC. Simply speaking, features for data integration, data cataloging, and semantic modeling were added to the DWC to enhance its data discovery, modeling, and distribution capabilities, making it the Datasphere.

Data can be either replicated into the Datasphere or federated. SAP emphasizes an approach where data can sit anywhere and just its analytics runs in the Datasphere. This is a crucial point, as running a data warehouse in the cloud may not scale easily. Thus, keeping data in its source and not replicating it may sound like the better option.

A user of the Datasphere digs into the so-called Datasphere Catalog to find data of his/her interest. Leveraging its lineage capability, the relationships between the different data can be explored. The Catalog supports data objects from other SAP Datasphere instances and SAP Analytics Cloud. This should be expanded soon to other SAP apps (like BW, ECC, S/4) plus non-SAP apps via its partners. While accessing data like this, the data from one source can be enhanced/mixed with data from other sources by the user in Datasphere Spaces. I assume here that the Spaces are the next generation of BW workspaces.


My take

Nowadays organizations are processing data outside their enterprise systems more and more. Therefore, a solution enabling users to work in the analytics area using data from "anywhere" is very plausible. This is not new for SAP; a somewhat similar picture was painted when Data Intelligence/Data Hub came on board. Seeing the Datasphere as a successor to those initiatives (DI, BW, DWC), plus having its AI-powered capabilities, the idea of the business data fabric may perhaps come true in the future, with a kind of "chatGPT" style of analytics.


More information:

SAP DataSphere microsite


SAP Data Unleashed event

SAP Data Warehouse Cloud (DWC), SAP BW Bridge (BWB)

Wednesday, March 8, 2023

What are types of ABAP development packages?

A development package serves to structure ABAP development objects into logical units. In addition to organizing the ABAP objects (development packages can themselves be organized within other packages), it enables SAP software logistics to function.

Organizing packages within other packages is enabled by the type of the development package. There are the following three types (as per domain MAINPACK):

'S' - Structure packages

'X' - Main packages

'' - Development packages


The purpose of structure packages is to act as top-level containers in the package hierarchy, defining the architecture of the objects within the subordinate packages. This type of package does not contain any development objects itself; instead, it can contain package interfaces and subpackages.

Main packages can be seen as groups of semantically similar objects. Like the structure packages, main packages do not contain any development objects themselves either.

A development package groups the development objects themselves, as per their types (e.g. domains, data elements, programs, includes, classes, etc.).

Main packages cannot contain other repository objects, with the exception of their own package interfaces and subpackages. The subpackages, in turn, can be other main packages or standard development packages.
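The containment rules above can be encoded in a small sketch (a simplified model, not the real SE21 checks; the assumption that structure packages nest main packages follows from the hierarchy described above):

```python
# Containment rules for the three package types (domain MAINPACK),
# as described above. Simplified: real packages have more constraints.
PACKAGE_TYPES = {"S": "structure", "X": "main", "": "development"}

def may_contain_dev_objects(pkg_type: str) -> bool:
    """Only ordinary development packages hold repository objects."""
    return pkg_type == ""

def allowed_subpackage_types(pkg_type: str) -> set[str]:
    """Structure packages nest main packages; main packages nest main and
    development packages; development packages are treated as leaves here."""
    return {"S": {"X"}, "X": {"X", ""}, "": set()}[pkg_type]
```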

Packages are created in t-code SE21 (Package Builder). All packages are stored in table TDEVC.


What development packages can be seen in the SAP NetWeaver/ABAP Platform based systems?

$TMP – Local or temporary objects, most visible one. All objects that are not needed to be transported into other system go in here.

Z* or Y* – custom development packages. All objects created by customer are stored in those packages.

$HV – Generated Help View Program

$MC – Generated Matchcode Programs/Functions

$ENQ – Generated ENQUEUE Function Modules

$GEN – Other generated objects

$SWF_RUN_CNT – Workflow Container: Generated Data Types

$EQ_GEN – Stores BW Easy Query generated objects. The package is created automatically upon first easy query generation. It includes all generated BW Easy Query objects such as the function modules.


More information:

ABAP Packages

SAP glossary

Friday, March 3, 2023

BW Background Job - Cleanup of orphaned DTIS entries

A DTP may run in DTIS (Data Transfer Intermediate Storage) mode. That mode works in a way that the extracted data is stored in an intermediate storage before it is updated to the target.

The intermediate storage can contain many records even though the corresponding DTP requests are already deleted or are in green status without any error. Thus the intermediate storage can grow fast, which can be a problem when many DTPs in the BW system run in that mode.

There is an ABAP program RSDTIS_CLEANUP that can run as a job to clean up the Intermediate Storage of DTIS DTPs that belong to deleted requests.

The report can run for a specific DTP or for multiple DTPs that are identified either by source object (aDSO, DataSource, Query Element, InfoSource, CompositeProvider or InfoObject) or by the Unique DTIS Number. The Unique DTIS Number (RSDTIS_NUMBER) is a sequential number that belongs to a specific execution of a DTP in DTIS mode. All the Unique DTIS Numbers are stored in table RSDTIS.

Moreover, there is a checkbox that enables the deletion of unused DTIS tables.
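The core of such a cleanup is just matching DTIS entries against the still-existing requests, which can be sketched as follows (illustrative; the real report works on table RSDTIS and the request metadata):

```python
def orphaned_dtis_numbers(dtis_entries: dict[int, str],
                          existing_requests: set[str]) -> list[int]:
    """Return the Unique DTIS Numbers whose originating DTP request
    no longer exists, i.e. the entries that are safe to clean up."""
    return [num for num, request in dtis_entries.items()
            if request not in existing_requests]
```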

More information:

2708942 - Cleanup of orphaned DTIS entries

SAP BW DTIS - Data Transfer Intermediate Storage

Wednesday, March 1, 2023

Collections of BW Background Jobs

SAP BW (Business Warehouse) background jobs are automated tasks that are scheduled to run at regular intervals without requiring manual intervention. These jobs are used for various purposes in the BW system, including data loading, data transformation, and system maintenance. Regular SAP BW background jobs are essential for ensuring the efficient operation of the BW system. They help to automate repetitive tasks, reduce the risk of human error, and improve system performance. They can also help to ensure that critical data is available to end-users when needed.

Here are some of the most common types of regular SAP BW background jobs:

Data loading jobs: Used to load data from source systems into the BW system. They may include jobs that extract data from external systems, jobs that transform data into the correct format for BW, and jobs that load data into BW data targets.

Data transformation jobs: Used to transform data from one format to another. They may include jobs that perform data cleansing, data validation, or data enrichment.

System maintenance jobs: Used to maintain the health and performance of the BW system. They may include jobs that perform database maintenance, system backups, or system health checks.

Archiving jobs: Used to archive data that is no longer needed in the active BW system. Archiving helps to reduce the size of the active BW system, which can improve system performance.

Broadcasting jobs: Used to generate reports based on data stored in the BW system.

Here’s a list of my blog posts related to regular background jobs in BW:

1. Cleanup of orphaned DTIS entries

2. Deletion of Orphaned Entries in Errorstack/Log Job RSBKCLEANUPBUFFER



5. Deletion of BW Messages/Parameters

SAP Profitability and Performance Management

It is an SAP solution designed to analyze the financial performance and profitability of an organization. It provides advanced analytical tools and reporting capabilities to help businesses identify the factors that drive their profits and optimize their performance. It is also known as SAP Performance Management for Financial Services, SAP Profitability and Performance Management Cloud, or SAP PaPM, the abbreviation of Profitability and Performance Management. Sometimes it is also referred to as NXI (its software component name), as it is developed by the partner company NEXONTIS (part of msg global solutions ag).


More information:

PaPM microsite

Help site

BW Background Job - Deletion of BW Messages/Parameters

BW background management provides basic functions for managing background and parallel processes in the BW system. The collection of background management plus its logs and tools is what BW data processing is all about. Simply speaking, it comprises all the tasks that run as background jobs performing BW work like data loading, parallelization, indexing, filling aggregates, archiving, attribute change runs, broadcasting, compression, data activation, rollups, remodeling, collecting statistics, logs, messages, etc. All the BW background management jobs are visible in t-code RSBATCH.

All these tasks running all the time in the BW system produce a great amount of logs and messages. The table used to store those logs and messages is RSBATCHDATA. In order to prevent the table from growing too much, there is a job that does the housekeeping. The job is usually called BI_DELETE_OLD_MSG_PARM_DTPTEMP and it runs the ABAP program RSBATCH_DEL_MSG_PARM_DTPTEMP. It deletes old messages, parameters, temporary DTP data etc. On its selection screen, it is possible to specify the number of days as a retention period (delete all records older than X days, where X is from zero to 999) for messages, logs and DTP temporary data. It usually runs on a daily basis.

The job can be scheduled from t-code RSBATCH -> under Settings and Parallel Processing part -> Deletion selection.
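The retention logic behind the selection screen can be sketched as follows (illustrative; the record and message names are made up, and only the 0 to 999 days range is taken from the description above):

```python
from datetime import date, timedelta

def select_for_deletion(log_records: list[tuple[date, str]],
                        retention_days: int, today: date) -> list[str]:
    """Return the messages older than the retention period.

    Records newer than (today - retention_days) are kept; the rest
    would be deleted by the housekeeping job.
    """
    if not 0 <= retention_days <= 999:
        raise ValueError("retention period must be between 0 and 999 days")
    cutoff = today - timedelta(days=retention_days)
    return [msg for ts, msg in log_records if ts < cutoff]
```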

More information: