Thursday, November 24, 2022

BW Transformation tech name different across systems

Once a BW transformation is created, a technical ID is associated with it. The tech ID is generated by the system as a hash value of several transformation attributes: the source object with its type and subtype, and the target object with its type and subtype. The tech ID is stored in table RSTRAN, field TRANID.

Now, the transformation is moved across the systems in the BW landscape via a BW transport. Normally one would expect the same tech ID of the transformation to be replicated into the target system of the transport. However, there are checks executed in the target system that decide the ID of the transformation. The following situation can occur in the target BW system: the tech ID of the transformation can be different, meaning that the table entry RSTRAN-TRANID differs between the source and the target BW system of the transport.

Why would the BW system generate a new tech ID? In this case, the TRFN is based on a DataSource. The DataSource name is different because it depends on the source system abbreviation. Obviously, the DS associated with the DEV (D) BW system has a different name than the DS associated with the QAS (Q) BW system. Since the DS name is among the TRFN attributes used to generate the TRFN tech ID, the generated tech ID must differ as well. That is the reason why the same TRFN has different tech IDs in the D and Q systems.

The original TRFN name from the source system (D in this case) is stored in column ORGTRANID of the record associated with the new TRFN tech ID in table RSTRAN.
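The principle can be sketched in Python (the hash function and the attribute list are assumptions for illustration only; SAP's actual algorithm is internal and the object names below are made up):

```python
import hashlib

def trfn_tech_id(source_obj: str, source_type: str,
                 target_obj: str, target_type: str) -> str:
    # Illustrative only: hash the transformation attributes into a
    # fixed-length ID, roughly how RSTRAN-TRANID is derived.
    key = "|".join([source_obj, source_type, target_obj, target_type])
    return hashlib.md5(key.encode()).hexdigest().upper()[:25]

# The DataSource name embeds the source-system abbreviation, so the same
# logical transformation hashes differently in DEV vs QAS:
id_dev = trfn_tech_id("2LIS_11_VAHDR_D100", "RSDS", "ZSD_ADSO", "ADSO")
id_qas = trfn_tech_id("2LIS_11_VAHDR_Q100", "RSDS", "ZSD_ADSO", "ADSO")
assert id_dev != id_qas  # different DS name -> different tech ID
```

Any attribute that varies per system (here, the DS name) makes the generated ID system-specific, which is exactly the behavior described above.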

More information:

2774555 - Technical names of Transformations

2117773 - Transformation is not deleted in complex landscape

Failed transport contains objects related to new source system

Tuesday, November 22, 2022

t-code I18N - Internationalization

There is a launchpad-style t-code that enables access to the different customizing parts of the internationalization settings that reside in an SAP NW system.

The following items are accessible in the t-code menu:


1. Internationalization Customizing:

1.1 I18N Language configuration - runs program RSCPINST - NLS installation tool

1.2 I18N System Parameters - runs program RSCPSET_PARAM - Maintain system parameters used for Internationalization (I18N) functions

1.3 Code Page in System Landscape (RFC Dest) - runs t-code SM59 to maintain RFC destinations

1.4 Correspondence Languages - runs program RSCPCOLA - Correspondence language installation


2. SAP Code Pages

2.1 Display / Maintain - runs t-code SCP (program RSCPSEGMENT_SHOW), which manages SAP code pages, segments, etc. (Display and Maintain Code Pages)

2.2 Code Page Migration - runs program - RSCP0126 to do a Code Page Conversion: SAP Character-Based -> Unicode-Based

2.3 Upload / Download - runs program - RSCP0025 to do an Upload/Download Code Page Definitions and Code Page Segments

2.4 Compare

2.4.1 Character & Mapping Set: Detail - runs program - RSCP0129 to make a comparison of Character and Mapping Sets

2.4.2 Code Page / Segment: Remote - runs program - RSCP0133 same as above just remote comparison of Code Pages or Code Page Segments

2.4.3 Code Page Statistics - runs program - RSCP0003 to display number of characters per character set

2.5 Check

2.5.1 Code Page Consistency - runs t-code SCP - see 2.1

2.5.2 Segment Consistency - runs program - RSCP0132 to run a consistency check of segment use

2.5.3 Round Trip Measurement - runs program - RSCP0125 to measure a round trip of Code Page > Intermediate Code Page > Code Page

2.5.4 Character Conversion Test - runs program - RSCP0032 to test properties of function SCP_TRANSLATE_CHARS that translates a short text from one codepage into another.


3. International Standards References

3.1 Country Code - ISO 3166 - launches web page

3.2 Language Code - ISO 639 - same as above

3.3 Script Code - ISO 15924 - same as above

3.4 Code Pages

3.4.1 ISO - see 3.1

3.4.2 Unicode - same as above

3.4.3 Microsoft - same as above


4. Language Translation and Transport

4.1 Translation Environment - runs t-code SE63 - Translation Editor

4.2 Language Transport Utility - runs t-code SMLT - Language Management


5. Printing

5.1 Output Controller: Spool Request - runs t-code SP01 - Output Controller, spool request

5.2 Spool Administration - runs t-code SPAD - Spool Administration

5.3 Cascading Font Customization - runs program RSCPSETCASCADINGFONTS - Cascading Fonts Configurator (CFC)


6. I18N Services

6.1 Database Scan: Find and Replace Tool - runs program RSI18N_SEARCH - Database Scan Tool

6.2 File Encoding Converter

6.2.1 ABAP Tool - runs program - RSCP_CONVERT_FILE - to convert plain text files from one SAP code page to another SAP code page

6.2.2 Operating System Tool 'sapiconv' - Displays SAP Note 752859, a tool for converting the encoding of files


7. Troubleshooting

7.1 Current I18N System Configuration - runs program - RSCP0018 that checks profiles for language, character sets and so on

7.2 SAP Language Code - runs program - RSCP0147 that shows languages and their relationships

7.3 CCC Cache - runs program - RSCP0148 that displays CCC Cache information (Code page Converter Cache). It is a shared memory that is used for code page conversion tables.

7.4 Unicode Normalization - runs program RSCP_NORMALIZE to demonstrate Unicode Normalization (class CL_ICU_NORMALIZATION)

7.5 IDNA Conversion - runs program RSCP_IDNA_ICU to demonstrate IDNA Conversion (class CL_ICU_IDNA). An IDN (Internationalized Domain Name) is an Internet domain name that contains non-ASCII characters

7.6 Printing Test

7.6.1 ABAP List

Characters by Unicode Block - runs program I18N_ABAPLIST_UC_BLOCK

Multiple Scripts - runs program I18N_ABAPLIST_MULTI_SCRIPTS to display multiple scripts for display and print

7.6.2 SAPscript - runs program I18N_PRINT_TEST_SC_SF_DIRECT to directly send a SAPscript/Smart Forms document to print, with the following SAPscript forms:

Multiple Scripts - form I18N_PRINT_TEST_SC_UC

7-bit ASCII: English - form I18N_PRINT_TEST_SC_EN

Latin-1 Supplement: French / German - form I18N_PRINT_TEST_SC_L1

Bidi + Shaping: Arabic - form I18N_PRINT_TEST_SC_AR

Bidi: Hebrew - form I18N_PRINT_TEST_SC_HE

Double-byte: Japanese - form I18N_PRINT_TEST_SC_JA

Combining Characters: Thai - form I18N_PRINT_TEST_SC_TH

7.6.3 Smart Forms - runs program I18N_PRINT_TEST_SC_SF_DIRECT to directly send a Smart Forms document to print with the following Smart Forms:

7.7 Locale

7.7.1 Maintain (TCP0C) - runs t-code SM30 for table TCP0C - Locale names for setting up C-libraries

7.7.2 Test Locale Switch - runs program - RSCP0016 to test switching system character set

7.7.3 Test TO UPPER - runs program - RSI18N_TEST_CASE_CONVERSION to test upper/lower case conversion for the ABAP statement "translate"

7.7.4 Test Sort Order - runs program - RSCP0102 to test language-dependent sorting


More information:

848036 - Transaction 'I18N' (Internationalization)

42305 - RSCPINST (I18N configuration tool)

1969062 - Meaning of fields in I18N -> Troubleshooting -> CCC Cache  

Stacked objects in SAP BW

There is one set of interesting terms in BW's terminology: stacked scenario, stacked data load, stacked transformation, etc. All of these terms have the word stacked in common. Stacked means an arrangement of BW objects of the same object type placed one behind the other to form a stack. What are all of these in particular?

Stacked data flow – a data flow that uses at least one InfoSource object between the source and target BW objects (source -> InfoSource -> target). This means there are two data transformations following each other: one going from the source object to the InfoSource, the other from the InfoSource to the target object. Such a BW transformation is also sometimes called a stacked transformation.

Another case of a stacked data flow occurs when a Calculation Scenario (an SAP HANA runtime object) from the DTP uses another Calculation Scenario as its data source.

Stacked scenario – a data load scenario in which stacked data flow(s) are used.

Simple data flow / simple scenario – connects two persistent BW objects with no InfoSource in between; just one BW transformation is involved. Sometimes called a non-stacked data flow.

Normally it is easy to spot a stacked data flow – there is the InfoSource. However, when it comes to DTP execution, the DTP request ID naming convention starts with DTPR__* in case of a non-stacked transformation. In case of the stacked one, the naming convention is replaced by TR_*.
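That naming rule can be captured in a trivial helper (a sketch only; the prefixes are as described above, and the sample request IDs are made up):

```python
def is_stacked_request(request_id: str) -> bool:
    # Classify a DTP request by its naming convention:
    # DTPR_* prefixes come from simple (non-stacked) data flows,
    # TR_* prefixes from stacked ones.
    if request_id.startswith("DTPR_"):
        return False
    if request_id.startswith("TR_"):
        return True
    raise ValueError(f"unrecognized request ID prefix: {request_id}")

assert is_stacked_request("DTPR__4ABC123") is False
assert is_stacked_request("TR_4XYZ789") is True
```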


More information:

Closed loops scenarios in BW

DTP processing mode

Another property of a DTP, alongside Version, the Saved/Not Saved indicator and Object Status, is the Processing Mode. It describes the order in which processing steps such as extraction, transformation and transfer to the target are executed at runtime of the DTP request. The processing mode also determines when parallel processes are split off.

The processing mode of a request is based on whether the request is processed asynchronously, synchronously or in real-time data acquisition (RDA) mode, and on the type of the source object.

It also takes into consideration the number of parallel processes defined in the Background Manager. If only one process is defined there, no additional processes are split off while the DTP request is being processed.

- Asynchronous Processing - A request is processed asynchronously in a background process when a DTP is started in a process chain or a RDA request is updated. The processing mode is based on the source type.

- Synchronous Processing - A request is processed synchronously in a dialog process when it is started in debug mode from DTP maintenance. RDA requests cannot be started in debug mode.

- RDA Processing - A request is processed in RDA mode when it is started using a real-time daemon.

Technically, the DTP processing mode is stored in table RSBKDTP, column PROCESSMODE. The following processing types are listed by their tech ID, name and description.

A) Standard processing:


- " 0 (serial) " Serially in the Background Process

Requests are processed one by one in the background.


- " 3 (par_all) " Parallel Extraction and Processing

The data packages are extracted and processed in parallel processes, meaning that a parallel process is derived from the main process for each data package. This parallel process extracts and processes the data. The maximum number of background processes used for each DTP can be defined. Execution is performed in parallel background processing.


- " 4 (sync_debug) " Serially in the Dialog Process (for Debugging)

Processing of the DTP is executed in debugging mode. The request is processed synchronously in a dialog process and the update of the data is simulated.


- " 8 (sync) " Serially in Dialog Process

Requests are processed one by one in dialog (foreground) mode.


- " 9 (no_data) " No Data Transfer; Delta Status in Source: Fetched

With this processing mode, a delta request is created without transferring data. This is analogous to simulating the delta initialization with an InfoPackage. The DTP is executed directly in dialog. No data is transferred into BW, but the delta pointer is set with the initializing status. When the next delta load is triggered, it fetches only the new records created after the delta pointer was set.


- " S (par_source) " Parallel Extraction and Processing (Flexible Preparation)

With this processing type, data transfer processes are executed that transfer data directly, without a PSA, from a DataSource (in an operational data provisioning (ODP) source system) to an InfoProvider. The processing mode is used if error handling is activated and the transformation does not require any semantic grouping.

The difference between this processing mode and V1 is that this processing mode has no explicit preparation phase in the program flow (see the Execute tab). During the preparation phase with ODP source systems, the system cannot find out how many data records and packages are delivered by the source. The main process does not ask whether a data package exists until after a parallel process has been split off.


- " I (no_data_no_i) " No Data Transfer; Delta Status in Source: Not Fetched



B) Processing the Data Transfer Process in SAP HANA

- " D (script) " Serial SAP HANA Execution

If certain prerequisites are met, you can use this processing mode for a data transfer process that can be transformed in SAP HANA.


- " E (script_par) " Parallel SAP HANA Execution

Execution is performed in parallel background processing.


- " F (mix_ser_all) " Serial Processing with partial SAP HANA Execution

Used when the transformation runs in HANA mode but contains an ABAP end routine. A mixed scenario.


- " H (mix_par) "      Parallel Processing with partial SAP HANA Execution

Used when the transformation runs in HANA mode but contains an ABAP end routine, and execution is performed in parallel background processing. A mixed scenario.


- " J (script_sync)" Serial SAP HANA Execution in Dialog Process (new in BW/4HANA)


- " K (mix_sync) "  Serial Partial SAP HANA Execution in Dialog Process (new in BW/4HANA)



C) Obsolete processes:

- " 1 (par_immediate) " Serial Extraction, Immediate Parallel Processing (Obsolete)

The data packages are extracted sequentially in a single process, while the packages are processed in parallel: the main process extracts the data packages sequentially and spawns a separate process that processes the data for each data package.

Note: You can define the maximum number of background processes that can be used for each DTP.


- " 2 (par_later) " Serial Extraction, Then Parallel Processing (Obsolete)


- " 5 (realtime) " Processing Mode for Real-Time Data Package (Obsolete)

With this processing mode, you execute data transfer processes for real-time data acquisition.


- " 6 (remote) " Processing Mode for Direct Access (Obsolete)

With this processing mode, you execute data transfer processes for direct access.


- " 7 (ser_package) " Serial Extraction and Processing of Source Packages (Obsolete)

The data packages are extracted and processed sequentially in a process, the main process.


- " A (par_non_init)" Parallel Extraction and Processing Except Delta Init (Obsolete)

Using this processing mode, you can achieve extensive parallelization for delta DTPs that extract data from a standard DataStore object whose delta initialization is not read from the change log.

When the request is created for this kind of DTP, the processing mode is set dynamically as follows: During delta initialization, the data is extracted in serial form (from the active table with or without archive), and the data packages are processed in parallel processes. The delta requests are then extracted and processed in parallel.


- " G (mix_ser_x) " Serial SAP HANA Transformation, Immediate Paral. ABAP Update (Obsolete)

Mixed scenario.
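The mode codes enumerated above can be collected into a small lookup table; a Python sketch (codes and short names taken straight from the listing, groupings are my own summary):

```python
# Lookup of RSBKDTP-PROCESSMODE values as described in the listing above.
PROCESSMODE = {
    # standard processing
    "0": "serial", "3": "par_all", "4": "sync_debug", "8": "sync",
    "9": "no_data", "S": "par_source", "I": "no_data_no_i",
    # SAP HANA execution
    "D": "script", "E": "script_par", "F": "mix_ser_all",
    "H": "mix_par", "J": "script_sync", "K": "mix_sync",
    # obsolete
    "1": "par_immediate", "2": "par_later", "5": "realtime",
    "6": "remote", "7": "ser_package", "A": "par_non_init",
    "G": "mix_ser_x",
}

OBSOLETE = {"1", "2", "5", "6", "7", "A", "G"}
# Modes with full or partial SAP HANA execution (G is obsolete but HANA-related):
HANA_MODES = {"D", "E", "F", "G", "H", "J", "K"}
```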

Friday, November 11, 2022

BW Object Status - OBJSTAT

Many BW objects use the so-called Object Status concept. It indicates whether the object is active and usable (e.g. executable) in the system. Let's take the example of a DTP object. In addition to the Object Version information and the Saved/Not Saved indicator, there is an Object Status for each DTP in the system.

The Object Status information can take the following values:

ACT = active, executable. An object with this status can be executed or used.

PRO = productive. This status is one step further than ACT, but it is currently not yet implemented in BW systems.

INA = inactive. The object is not executable/usable. This is the case, for example, if

·        the object has been created and saved for the first time, but has not yet been activated,

·        the object first has to be revised after one of the objects it depends on was changed.

OFF = switched off. The object is not usable. It was deliberately switched off so that it does not appear, for example, in the list of usable objects.

DEL = logically deleted

<blank> = the DTP doesn't exist


Statuses ACT and PRO are the ones indicating that the object is executable/usable.


Technically, the BW tables contain a column OBJSTAT that refers to ABAP DDIC domain RSOBJSTAT, of type CHAR 3. In the case of the DTP object, the table storing the Object Status information is RSBKDTPSTAT (Status Information on Data Transfer Process).

Some other BW objects that use the Object Status concept are BW query components, InfoSources, BW formulas, application components, ABAP routines, Open Hub Destinations, BPC models, BPC environments, Remodeling Rules, HANA Analysis Processes, the directory of InfoAreas, InfoObject catalogs, the directory of all InfoObjects, Open ODS Views, Process Chains, Planning Sequences, Transformations, Workspaces, InfoCubes, etc.
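The status values above can be summarized in a small Python mapping (a sketch based on the list above, not an SAP artifact):

```python
# OBJSTAT values (DDIC domain RSOBJSTAT, CHAR 3) as described above.
OBJSTAT = {
    "ACT": "active, executable",
    "PRO": "productive (not yet implemented)",
    "INA": "inactive",
    "OFF": "switched off",
    "DEL": "logically deleted",
    "":    "object does not exist",
}

def is_executable(objstat: str) -> bool:
    # Only ACT and PRO indicate an executable/usable object.
    return objstat in ("ACT", "PRO")
```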


More information:

Object Version

Thursday, November 10, 2022

Delete a DTP without deleting request in data target?

Normally, a DTP that is no longer needed can be deleted from SAP HANA Studio (BW Modeling Tools), from t-code RSDTP, from t-code RSA1, etc. However, in some cases the deletion of the DTP is not possible. One such case is a delta DTP that was used to load data: the data the DTP loaded remains in the data targets, and the DTP normally can't be deleted. This means that even though such a DTP is obsolete, it is not possible to retain the data in the SAP BW system and get rid of the DTP at the same time. And because the DTP is still present in the system, it is also not possible to get rid of no-longer-used objects like transformations and DataSources, since those objects are still referenced by the DTP.

Luckily, there is a solution for this. In case the DTP can't be deleted via the normal procedure (SAP HANA Studio, t-code RSDTP, t-code RSA1), ABAP program RSBKDTPDELETE can be used to delete a DTP like the one described above. The program's selection screen only needs the technical name of the DTP to be deleted. One needs to be careful in the case of a delta DTP: if it is deleted, the delta mechanism between source and target is invalidated and the delta would need to be reinitialized.

The program is delivered via SAP Note 2834915. It is available for BW 7.x versions as well as BW/4 versions of SAP BW.

What DTP types is the program capable of deleting?

- Detached DTPs – so-called DTPs without reference

- Error stack DTPs

- Delta DTPs – it is possible to delete such DTPs, however a strict authorization must be assigned to the user.

Program RSBKDTPDELETE vs RSBKDTPDELETE2 – there is also a second version of the program available. The difference between the two is that the second version can generate a transport request for the deleted DTP. Different BW frameworks are also used to delete the DTPs: cl_rsdtis_factory=>delete_all_requests_of_dtp in version 1 and if_rso_tlogo_maintain~delete in the latter.

More information:

934492 - 70SP08: Data transfer process (DTP) cannot be deleted

2925404 - Program RSBKDTPDELETE dump

Tuesday, November 1, 2022


It is becoming a tradition that in autumn the SAP developer relations team prepares an initiative called Devtoberfest. In a nutshell, it is a contest for the SAP community – for developers. It lasts for the four weeks of October. Each week has a number of activities assigned; by fulfilling the activities one can collect points. The activities are mostly tutorials. There is a nice gameboard that shows an individual's progress during the contest. Each day of the week is dedicated to a topic, e.g. ABAP, UI technologies (UI5, Fiori), Analytics (Machine Learning, Data Warehousing), Low-Code/No-Code, Cloud (Containerization, Kubernetes).

It is really fun to take part in the Devtoberfest, plus you get to learn a ton of new things!

Below is how I did during the event over the years:



Activating Business Function in SAP BW

SAP NetWeaver ABAP stack based systems use the so-called Switch Framework concept to control what functionality is enabled in the system. Functionalities of the SAP system, delivered as software add-ons either by SAP itself or by its partners, can be switched on and off using the Switch Framework. More detailed information about the Switch Framework can be found here.

In this post, I briefly discuss the procedure to check whether a particular business function is active in the system, as well as how to activate the BI Content.


To check the status of particular business function:

1. In t-code SFW5, find the particular business function:

Double-click the entry and its status is displayed on the next screen:

To see all other details, like the development packages that deliver the function, drill into the entry located on the Switch tab:

Below is the procedure to install BI Business Content:

1. Turn on the Business Function in t-code SFW5. In case of SAP HANA-optimized BW Content, it is the function /IMO/BWCONTENT. Notice that /IMO/BW_CONTENT is obsolete now.

2. Install BI content itself with t-code RSORBCT


More information:

Switch Framework for BW

Technical Business Content

Wednesday, October 12, 2022

Types of Data Transfer Processes

There are a few types of DTP. The purpose of the DTP, or Data Transfer Process, is to specify how data is transferred between two objects in a BW system. It governs from which object data is extracted (source object) and into which object the data is stored (target object). The data transfer process forms a template for requests: when a request of this type is processed, data is loaded into a target object from a source object.

The technical name of the DTP can have different prefixes, as I wrote here. However, in this blog post I discuss the types of the DTP. The following DTP types are recognized by their purpose, or the context in which the particular DTP can be used:


<blank>      Standard (Can Be Scheduled)

Mostly runs in process chains on a periodic basis (thus the term scheduled) as a loading process. Can also be run manually by the user.


RDA             DTP for Real-Time Data Acquisition

A daemon provides DTPs of this type with new data from a source at regular, frequent intervals.


REMT           DTP for Direct Access

Used to read data for a query directly from a source system using RFC. DTPs for direct access typically access highly current data that has not yet been made available in the BW system by the scheduled load processes.


EDTP            Error DTP

The DTP that serves for error handling of another DTP, called the standard DTP. The error DTP has the standard DTP specified in table RSBKDTP, column DTP_STANDARD. In addition, the standard DTP is specified in column SRC of the same table.


The type of the DTP can be seen in t-codes like RSDTP:

Technically the type of the DTP is stored in table RSBKDTP and in column DTPTYPE (Type of Data Transfer Process). The column definition refers to data dictionary domain RSBKDTPTYPE.

Apart from column DTP_STANDARD, there is one more interesting column in table RSBKDTP: DTP_ORIGINAL. In most cases the DTP technical ID (column DTP) is equal to DTP_ORIGINAL. However, when the DTP was copied from a DTP transferred from Business Content, DTP_ORIGINAL carries the ID of the content DTP. In case the DTP_ORIGINAL column is empty, there is an ABAP program RSBK_FILL_ORIGINAL_DTP that populates it. This is needed because a DTP without the original column maintained may fail to transport across the BW systems in a landscape.
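The repair described above can be sketched as a tiny function (an assumption about the effect of RSBK_FILL_ORIGINAL_DTP, not its actual ABAP source; the row is modeled as a dict with the column names from the table):

```python
def fill_original_dtp(row: dict) -> dict:
    # Assumed behavior: if DTP_ORIGINAL is empty, fill it with the DTP's
    # own technical ID; an existing value (e.g. a content DTP) is kept.
    if not row.get("DTP_ORIGINAL"):
        row["DTP_ORIGINAL"] = row["DTP"]
    return row

# A DTP created directly gets its own ID; a copy of a content DTP keeps
# the content DTP's ID (IDs are made up):
own = fill_original_dtp({"DTP": "DTP_ABC123", "DTP_ORIGINAL": ""})
copy = fill_original_dtp({"DTP": "DTP_XYZ789", "DTP_ORIGINAL": "DTP_CONTENT1"})
```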

More information:

A little about InfoPackage / DTP prefixes and BW data request's prefixes

2016048 - P34: DTP: Fill Original_DTP if field is empty

Saturday, September 24, 2022

DELETE_FACTS t-code to delete from BW InfoProvider

Normally, in case a deletion of data is to be performed, it can be done via t-code RSA1 or, in case of BW/4, in its web environment (SAP BW∕4HANA Cockpit). If this task is supposed to be automated, it is possible to leverage a deletion process in process chains. However, there is also a dedicated t-code for deleting data from the fact table of BW InfoProviders: DELETE_FACTS. It is associated with ABAP program RSDRD_DELETE_FACTS. The program/t-code is also available in BW/4-based BW systems.

On its input screen there is one mandatory parameter – the Data Target, i.e. the BW InfoProvider on top of which the deletion is performed. Below that input field there are three radio buttons; each of them controls the mode of the program.

If it is set to “Direct Deletion”, the program asks for deletion criteria and the deletion itself is performed next. The deletion criteria are just a filter set up by the user and used for the deletion: only data returned by the filter criteria is deleted.

The next radio button is called “Generate Selection Program”. It generates a GP program for the InfoProvider deletion. The generated program contains a selection screen that serves as the deletion criteria; the user can input the filter here. Once the deletion criteria are provided, the deletion can be performed.

Finally, there is a third radio button called “Generate Deletion Program”. The report is generated on the fly and the deletion criteria are static – only available when the GP program is generated. Thus no ABAP program variants can be used in this case.

Instead of the generated GP program name, a real program name can be given in another field of the selection screen, called “Name of report”.

The deletion itself is performed in a very interesting way. The BW system copies the data that is NOT supposed to be deleted (the selection criteria reversed) to a temporary DB table. The temp table is a copy of the real InfoProvider DB table. Next, the system drops the original (InfoProvider) table completely – not just the data in the table, the whole table is dropped. Finally, as a last step, the system renames the temp table to the original table name. The reason it works this way is performance. You can read more details about this approach in the SAP Note listed below.
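The copy/drop/rename approach can be illustrated with a small SQLite sketch (table and column names are made up; the real implementation works on the InfoProvider's fact table inside the database):

```python
import sqlite3

def selective_delete_by_rename(conn, table, keep_predicate_sql):
    # Sketch of the approach described above: rows to KEEP (selection
    # criteria reversed) are copied to a temp table, the original table
    # is dropped, and the temp table is renamed back.
    cur = conn.cursor()
    cur.execute(f"CREATE TABLE {table}_tmp AS "
                f"SELECT * FROM {table} WHERE {keep_predicate_sql}")
    cur.execute(f"DROP TABLE {table}")
    cur.execute(f"ALTER TABLE {table}_tmp RENAME TO {table}")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (year INT, amount INT)")
conn.executemany("INSERT INTO facts VALUES (?, ?)",
                 [(2020, 10), (2021, 20), (2022, 30)])
# "Delete" everything before 2022 by keeping only year >= 2022:
selective_delete_by_rename(conn, "facts", "year >= 2022")
```

Deleting a large fraction of a huge fact table this way avoids row-by-row DELETEs, which is the performance rationale mentioned in the SAP Note.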

More information:

2918552 - [BW Central KBA] Selective Deletion

Friday, September 23, 2022

Switching output from plain list to ALV and vice versa

Many t-codes and programs in SAP NetWeaver based systems give their output in the form of a list. The list can be either plain text formatted into columns and rows, or an ALV list. ALV is an abbreviation of ABAP List Viewer and it is a popular form of outputting results. Among the many ALV advantages are easy column manipulation, data filtering, sub/total calculation, drilldowns, etc.

However, it may happen that a report you are used to does not come in the ALV layout and the plain list is shown instead. Most likely this is caused by a user parameter called SALV_SWITCH_TO_LIST (Switch Grid -> List) being set to value X via t-code SU3 (Maintain User Profile).

In the case of t-code SM66, if the parameter is set, the output looks like:

Otherwise, if the parameter is not set, the result of the same t-code looks like:


Thursday, September 22, 2022

Setup of t-code RSMNG

T-code RSMNG is a useful replacement for the former administration part of RSA1. It also comes in handy in case you do not prefer the web-environment-based (SAP BW∕4HANA Cockpit) administration of BW/4-based systems. However, after some time of using it I realized it has its drawbacks. The most annoying thing for me is its performance. This comes into the picture when I administer an aDSO object that has a large number (tens of thousands) of data load requests.

In such a case, it takes time (several minutes) to get into the administration screen. The RSMNG t-code has the following two filter options that influence how many requests are displayed on the admin screen.

Filter by Time – options like Today, Yesterday and Today, In the Past Week, This Month and Last Month, This Year and Last Year, Free Date and No Time Restriction are available to choose from. This screen is more or less the same as the Process Chain log selection screen.

Filter by Status – OK, Error, Running and Deleted are the data load request statuses to choose from.

In case you use the time filter set to No Time Restriction on an object that has very few data load requests, all is fine. However, if you then jump into another aDSO that has a lot of data load requests, you will be waiting until all of them are read and the screen is finally displayed. That is the bummer.

There is a database table that stores the setup of the RSMNG t-code: RSDSO_MNG_DYNSET (ADSO manage: dynamic user settings). The table is managed by function module RSDSO_MNG_SET_PARAM. The FM itself is called from the RSMNG framework built in ABAP class CL_RSAWBN_AWB. I find myself changing the table entries when I realize my settings are set e.g. to No Time Restriction and I run into a very large aDSO.

The settings in the table are stored per user: each user (column UNAME) running the RSMNG t-code has their own settings. This is actually what causes the issue – if I set the time filter to No Time Restriction, it is valid for all the aDSOs I use in the t-code. To avoid situations like this, it would be better if the table could store the RSMNG settings per particular aDSO (or InfoObject, as those can be managed here too). However, as of BW/4 2.0 SP06 this is not supported.

There are different tables that store the RSMNG setup data per InfoProvider type:


RSDMD_MNG_DYNSET – for InfoObjects

RSBOH_MNG_DYNSET – for Open Hubs

In the pictures below you can see how the table columns map to the particular filter settings of RSMNG.

Friday, September 16, 2022

Listcube does not show any data

While using t-code LISTCUBE, the following odd situation can sometimes occur: the t-code does not show any data at all, despite the data being present in the BW object that LISTCUBE runs on top of. That can be a very strange and confusing situation. Instead of any data, only the message below is present:

No messages exist Message no. R7896

The explanation is very simple. As ALV technology is used to present the data, a list of the columns to be displayed in the output screen needs to be passed to the function module call (e.g. REUSE_ALV_GRID_DISPLAY) that does the display operation itself. This list of columns is called a field catalog (e.g. parameter IT_FIELDCAT in the FM call); the field catalog contains all the fields to be displayed with their descriptions. Similarly, there is another structure that needs to be passed to the FM during its call – IS_LAYOUT – which carries the list layout specifications. If some of this information (field catalog or layout) is missing, the ALV does not know which columns (fields) are supposed to be displayed, and thus no information is displayed at all.

What caused it? As the ALV supports saving the output layout (field catalog) for later use, it could happen that someone stored his/her layout while working with BW InfoProvider ABC and marked it as global and default while working in LISTCUBE. Once another user runs LISTCUBE against InfoProvider XYZ, which has a different structure (different InfoObjects), the global default layout is sent to the ALV. As none of the fields retrieved from InfoProvider XYZ match the field catalog prepared based on InfoProvider ABC, nothing can be displayed.
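The effect can be sketched in a few lines of Python (illustrative logic only, not the real ALV code; the InfoObject names are made up):

```python
def visible_columns(result_columns, saved_layout_columns):
    # A saved default ALV layout acts as a filter: only columns present
    # in BOTH the query result and the saved field catalog are shown.
    return [c for c in result_columns if c in saved_layout_columns]

# Layout saved on InfoProvider ABC, now applied to XYZ with different fields:
abc_layout = ["0MATERIAL", "0AMOUNT"]
xyz_fields = ["0CUSTOMER", "0REVENUE"]
assert visible_columns(xyz_fields, abc_layout) == []  # nothing to display
```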

How can this be fixed? In the LISTCUBE output screen, the layout management screen is available in the menu: Settings -> Layout -> Layout Management:

Here, the default layout just needs to be deleted:

More information:

Usage of t-code LISTCUBE

DTP reads different data than LISTCUBE