Tuesday, April 30, 2024

aDSO: Validity and Reference-Point tables

As mentioned in my post Storage data target type in BW Request Management, there are several tables that store data for aDSO objects. Besides the well-known ones like the active, inbound, and change log tables, there are others.

In case the aDSO is of type Inventory:



The following two tables are also available (with their namespace prefix):

Validity Table: /BIC/A<technical name>4

Reference Point Table: /BIC/A<technical name>5

 

The purpose of an inventory-enabled aDSO is to manage non-cumulative key figures. A non-cumulative measure, in the context of data analysis or statistics, is a metric that does not accumulate or aggregate over time or across categories. In other words, it represents a single point-in-time or snapshot value rather than a total or sum.

For example, when analyzing sales data for a particular day, the number of units sold on that day is a non-cumulative measure. It doesn't consider sales from previous days; it just reflects the sales for that specific day.

Non-cumulative measures are often used when you need to examine data at a specific point in time or within a specific category without considering historical or cumulative values. They are particularly useful for analyzing trends, patterns, or comparisons within discrete units of analysis.
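To illustrate how these tables can be inspected, below is a minimal ABAP sketch that reads both tables of a hypothetical inventory-enabled aDSO named ZSTOCK (the aDSO name and the existence of data are assumptions; the suffixes 4 and 5 follow the naming convention above):

* Hypothetical inventory-enabled aDSO ZSTOCK; table names follow
* the /BIC/A<technical name>4 and 5 convention described above.
DATA lv_rows TYPE i.

* Validity table - validity intervals for the non-cumulatives
SELECT COUNT(*) FROM /bic/azstock4 INTO lv_rows.
WRITE: / 'Validity table rows:', lv_rows.

* Reference-point table - reference points of the stock key figures
SELECT COUNT(*) FROM /bic/azstock5 INTO lv_rows.
WRITE: / 'Reference-point table rows:', lv_rows.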

The tables can also be displayed via t-code RSMNG, in the Utilities menu: Display Validity tab and Display Reference-Point tab:



Friday, March 29, 2024

Activation of SNP Glue objects after transport

In my last blog I wrote about SNP Glue transports. Once the Glue objects are moved to the target SAP system, one more step must be executed before the objects are ready to be used: the so-called object activation step. It can be performed in a dedicated t-code (/DVD/GLTR - DataVard Glue Transport Workbench) via the Activate Request button in the toolbar.


Once you hit that button, a pop-up window asks for the TR whose objects are to be activated.


The next pop-up lists all the objects in that TR. One can choose whether a particular object is to be activated or not. It is also possible to delete Glue objects by selecting the Delete checkbox. One more checkbox is to be checked if the objects are new and are to be created/changed.


Once this pop-up is confirmed, the objects are physically activated. Afterwards they are visible in the Glue Object Navigator (t-code /DVD/GL80) and ready to be used.

Transports of SNP Glue objects

In my previous post related to the SNP Glue integration tool I described how the tool can be leveraged to transfer data between Snowflake and SAP BW/4HANA systems. However, I left out how to transport the SNP Glue objects, so I cover that in this post.

There is a dedicated t-code (/DVD/GLTR - DataVard Glue Transport Workbench) to manage transports of the objects developed in Glue.


The Glue transport t-code allows collecting the objects for the transport by package, user name, or object name. Once the objects are entered in the selection screen, hitting the Execute button writes them to a regular SAP STMS transport request. Here's what such a request looks like:



Entries in the TR are basically transparent table entries. These table entries are transported to the target SAP system, because the Glue transport contains metadata objects (object definitions) as well as ABAP objects like classes and programs. Based on this metadata, the physical Glue objects are recreated in the target system upon object activation.

Below are some of the tables that carry the metadata for particular Glue objects; a small check sketch follows the list:

/DVD/GL_T_DD_TH – Glue table header objects

/DVD/GL_T_E2_MAP - Persistent Transformation/mapping

/DVD/GL_T_E2_OBJ - Extractor 2.0 object directory

/DVD/GL_T_E2_OBP - Extraction object's parameters

/DVD/GL_T_FLD_OB - Glue objects mapped to logical folder

/DVD/GL_T_FOLDER - Table for mapping folders inside packages
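As a quick sanity check before or after the import, these metadata tables can be queried directly. A minimal sketch, assuming the tables exist as listed above (only row counts are read, since I don't list their field structures here):

DATA lv_cnt TYPE i.

* Number of Glue table header entries
SELECT COUNT(*) FROM /dvd/gl_t_dd_th INTO lv_cnt.
WRITE: / 'Glue table headers:', lv_cnt.

* Number of Extractor 2.0 objects in the directory
SELECT COUNT(*) FROM /dvd/gl_t_e2_obj INTO lv_cnt.
WRITE: / 'Extractor 2.0 objects:', lv_cnt.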

 

Once the objects are collected into the TR via t-code /DVD/GLTR, the TR can be released like any other SAP transport, e.g. in t-code SE10. At this point the TR is ready to be imported into the target SAP system.



Sunday, March 24, 2024

Simple Snowflake to SAP BW data flow via SNP Glue

I was recently involved in a data transformation project leveraging the SNP Glue tool. SNP Glue (formerly Datavard Glue) is designed to integrate and connect various SAP and non-SAP systems, facilitating data exchange, synchronization, and consolidation.

In my scenario we used Glue to transfer data located in a cloud-based Snowflake system to an on-premise SAP BW/4HANA system. There are a couple of Glue objects that need to be developed to enable the information exchange between Snowflake and SAP BW/4HANA:

1. Glue Storage

2. Glue table

3. SAP table

4. Glue Fetcher

5. Glue Consumer

6. Glue Extraction Process

7. SAP BW Datasource

8. SAP BW target infoprovider – e.g. aDSO

 

All these objects are developed in the SAP BW system where the Glue tool is installed. The main component of Glue is the Cockpit (SNP Glue Cockpit), which can be accessed via t-code /DVD/GLUE (ABAP program /DVD/GL_MAIN); all other parts of Glue can be accessed from here. Glue can also be installed as an add-on, in which case there is a component called Glue in the particular SAP BW installation.



1. Glue Storage – a central object that encapsulates all information needed to connect to the remote object from/to which the data will be read/written. The Storage is an SNP object that is shared between SNP tools (e.g. SNP Outboard). Below are the settings that need to be provided to the Storage object in order to connect to Snowflake. There is a dedicated t-code (/DVD/SM_SETUP) to maintain storages. Two storages need to be defined for the connection to Snowflake:

1.1 Internal Glue Storage

Storage ID – logical name, defined by the Glue developer, usually must follow a naming convention given by particular developer guideline

Storage type – predefined as SNOW_STAGE, type binary

Description – free text; it should describe e.g. the meaning of the data being transferred

Java connector RFC – ID of the connection to the Java Connector (JCo). The JCo is mandatory and must be installed in the SAP BW system in order to use Glue. Like the Storage itself, the JCo connections are shared across different SNP tools. There is a special t-code (/DVD/JCO_MNG) to set up and control the JCo connections.

Account name - name of the Snowflake account, in the format: <account name>.<region>.<platform>

User role – I haven't used this

JAVA Call Repeat – 0 by default

Repeat delay (seconds) - 0 by default

Driver path – path to the JDBC Snowflake driver, located on the SAP server

Connection pool size – 0 by default

Username – user at Snowflake side

Password – of the user above

 

1.2 Glue Storage

Storage ID - logical name, see the internal storage above. Note that this ID differs from the ID used for the internal storage.

Storage type - predefined as SNOWFLAKE, type TAB, for transparent storage

Description – see in internal storage

Referenced storage – ID of the internal storage; the main storage uses the internal one

Java connector RFC - see in internal storage

JDBC Call Repeat – 0 by default

JDBC Repeat delay (seconds) – 0 by default

Account name - see in internal storage

Warehouse – existing warehouse in the Snowflake account; the warehouse that will be used to perform compute operations such as SQL in Snowflake

Database name – name of the database in the Snowflake

Database schema – name of the database schema in the Snowflake

User Role – user role in the Snowflake

Driver path – Snowflake driver path on the SAP server

Hints - string that is appended to the connection string when the JDBC driver establishes the connection

Connection pool size – 0 by default, number of connections that can be kept open in the pool

File Type – CSV or Parquet; I used CSV

Table name prefix - prefix of all Glue tables created within this storage

Use Snowflake App for data merge – I haven't used it; if enabled, the SNP native app is used

Wrap values in staged CSV files - I haven't used it

Data delivery guarantee – data transfer behavior, set to EO (exactly-once)

 

All other Glue objects listed below are maintained in the Object Navigator part of the SNP Glue Cockpit. The Object Navigator can also be accessed via a dedicated t-code: /DVD/GL80.


2. Glue table – can also be maintained in a dedicated t-code, /DVD/GL11. The Glue table is a metadata object that represents the remote object (a Snowflake view in my case). It allows working with the data from that remote object in the SAP landscape. It contains the list of columns of the table or view in the remote DB – in my case, the columns of my Snowflake table. The Glue table must be activated before it can be used in other Glue objects. The table doesn't store data persistently on the SAP side; the data is only read from the remote DB when the extraction process runs.

 

3. SAP table – an SAP DDIC table that stores the data physically once it is fetched from the remote DB. The data is stored persistently here.

 

The objects below (Fetcher, Consumer and Extraction Process) are part of Glue Extractor 2.0.

4. Glue Fetcher - enables the data transfer from source to target objects. It refers to the Glue table object. It defines whether particular columns of the remote table/view are used for selection, and which delta mechanism type is used (FULL, DATA, TIMESTAMP, VALUE, VALUE_DIST). In my case I used FULL extraction with a cursor.


5. Glue Consumer – comes into the picture when the data is written into the target. In my case it specifies the SAP table I created in step 3.


6. Glue Extraction Process - the object responsible for running the whole process, from reading the data from the source (by the Fetcher) to writing the data (by the Consumer). The Extraction Process can also be used to manage the execution of extractions and to monitor launched extractions. It has capabilities similar to running a DTP or a process chain in SAP BW.

Here the Fetcher and Consumer need to be specified at design time. The whole process is driven by a generated Z* report; you can find the report name in the Generated report name field. It is also possible to define a data transformation by specifying rules and start/end routines. In my case I didn't use any routines or transformations, just a pure 1:1 mapping. A hedged sketch of triggering the generated report is shown below.
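For illustration only: since the Extraction Process is driven by the generated report, it can in principle also be triggered from custom ABAP, e.g. as an ABAP program step in a process chain. The report name ZGL_EXTR_SNOW below is purely hypothetical; the real one would be taken from the Generated report name field:

* Hypothetical name of the generated extraction report; take the
* real one from the 'Generated report name' field of the process.
DATA lv_report TYPE progname VALUE 'ZGL_EXTR_SNOW'.

* Run the Glue extraction synchronously
SUBMIT (lv_report) AND RETURN.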


7. SAP BW DataSource – in my scenario I created a custom DataSource to be able to store the data in the BW aDSO object. The DataSource was based on the SAP table object I created in step 3.


8. SAP BW target InfoProvider – e.g. the aDSO that is the final storage of the Snowflake data in SAP BW. This is the object I used further for reporting, via a CompositeProvider.

 

To run the data replication from Snowflake to SAP BW/4HANA I used a process chain. The first step in the chain runs the Glue Extraction Process, which brings the Snowflake data into the SAP table. From there a DTP loads the data into my aDSO.

 

More information:

SNP Glue product page

SNP Glue documentation



Thursday, February 22, 2024

How to switch BPC environment state (on/off line)?

In one of my older posts, I explained a few ways to check the BPC environment state – whether it is online or offline. This time I focus on how to change the state of the BPC environment.

1. In the BPC web client: In the main screen of the web client, choose the Settings button in the top right corner. From there, click the Environment button in the Settings drop-down menu.

A popup is shown next, listing all available environments in the system. In the bottom part of the window there is a "Manage All Environments" button. Click on it.

A new screen called Manage All Environments is shown next. From here, choose the environment whose status you want to change and click the "Change Status" button on the screen toolbar.


As a last step, a popup window appears where the environment status can be changed via a radio button.


2. In the SAP BPC backend: Use t-code SE16 to browse table UJ0_PARAM_APP. Enter the following selection into the selection screen:

APPSET_ID = <your BPC environment tech name>

FIELD = AVAILABLEFLAG


In the VALUE column of the table there is either 0 or 1. Zero means the environment is set to offline; one means it is set to online.
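The same check can also be scripted. A minimal ABAP sketch using the columns described above; the environment name ZENV is hypothetical:

* Read the availability flag of a BPC environment
DATA lv_value TYPE uj0_param_app-value.

SELECT SINGLE value FROM uj0_param_app
  INTO lv_value
  WHERE appset_id = 'ZENV'
    AND field     = 'AVAILABLEFLAG'.

IF lv_value = '1'.
  WRITE: / 'Environment ZENV is online'.
ELSE.
  WRITE: / 'Environment ZENV is offline'.
ENDIF.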


3. In the SAP BPC backend – custom program: A small ABAP program can be created that changes the content of the VALUE column in table UJ0_PARAM_APP. That can be done via a call of method SET_APPSET_STATUS of class CL_UJA_APPSET.
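A minimal sketch of such a program follows. Note that the parameter names I_APPSET_ID and I_ONLINE are my assumptions for illustration – verify the actual signature of the method in SE24 before using it:

* Sketch only: the parameter names below are assumed; check the real
* signature of CL_UJA_APPSET=>SET_APPSET_STATUS in SE24 first.
TRY.
    cl_uja_appset=>set_appset_status(
      i_appset_id = 'ZENV'         " hypothetical environment name
      i_online    = abap_true ).   " switch the environment online
  CATCH cx_root INTO DATA(lx_error).
    WRITE: / lx_error->get_text( ).
ENDTRY.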

 

More information:

How to check BPC environment state (on/off line)?

Saturday, January 13, 2024

System does not let you log on using a password

If a user gets the below message on an attempt to log on to an SAP NetWeaver/ABAP Platform based system:

This system does not let you log on using a password

Message no 00139

 

It means that password-based logon to the particular system was disabled. In such a case the user has to use a different authentication method to log on. The following methods exist:

• Using a password (conventional logon) – the method that is not possible in this case

• Using an external security product (SNC)

• Using an X.509 browser certificate (intranet/Internet)

• Using a Workplace Single Sign-On (SSO) ticket

 

If the majority of the users in the respective SAP system use an authentication type other than password, password logon should be disabled. The disabling can be done via the system profile parameter login/disable_password_logon; a sketch of reading its current value from ABAP follows the value list below.

The parameter's default value is 0, which means password-based logon is possible. Other allowed values are:

1 - Password logon only for users of the group specified by login/password_logon_usergroup

2 - Password logon is no longer possible in general
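To read the currently effective value from ABAP, something like the following can be used. A minimal sketch based on FM PROFILE_GET_PARAMETER; verify its availability and signature in your release:

* Read the current value of the profile parameter; the FM name and
* its parameters should be verified in SE37 for your release.
DATA: lv_name(60)  TYPE c VALUE 'login/disable_password_logon',
      lv_value(60) TYPE c.

CALL FUNCTION 'PROFILE_GET_PARAMETER'
  EXPORTING
    parameter_name  = lv_name
  IMPORTING
    parameter_value = lv_value.

WRITE: / lv_name, '=', lv_value.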


More information:

Profile Parameters for Logon and Password (Login Parameters)

320991 - Error codes during logon (list)

Tuesday, December 26, 2023

Request Status Process Management (RSPM)

The Request Status Process Management (RSPM) concept for BW request management has been in place since the higher support packages of SAP BW 7.5, and also in SAP BW/4HANA. It replaces the RSSM (Request Status Management) request management based on 0REQUID. RSPM is based on Request Transaction Sequence Numbers (TSN, 0REQTSN). Details can be found here.

However, in this post I want to mention a few interesting TSNs that are used in BW systems. The TSN is defined by the RSPM_TSN domain in DDIC. The following special TSNs are defined in the interface IF_RSPM_CONSTANTS; a small sketch of reading them follows the list.

 

transfer_min = '18991231000000000000000' = {1899-12-31 01:00:00 000000 CET}

transfer_min_old = '19000101000000000000000' = {1900-01-01 01:00:00 000000 CET}

transfer_max = '19680120031408000000000' = {1968-01-20 04:14:08 000000 CET}

min = '19700101000000000000000' = {1970-01-01 01:00:00 000000 CET}

reorg_low_min = '19700101000000000001000' = {1970-01-01 01:00:00 000001 CET}

reorg_low_max = '19700101000000999998000' = {1970-01-01 01:00:00 999998 CET}

housekeeping_min = '19720101000000000001000' = {1972-01-01 01:00:00 000001 CET}

housekeeping_max = '19723112235959999998000' = {????-??-?? ??:??:?? ?????? UTC}

dummy = '20140206171506000000000' = {2014-02-06 18:15:06 000000 CET} - timestamp of data element RSPM_TSN

real_min = '20140206171506000001000' = {2014-02-06 18:15:06 000001 CET} - as above plus 1 µs

real_max = '28991231235959999998000' = {2900-01-01 00:59:59 999998 CET}

not_activated = '29000101000000000001000' = {not activated} - dummy TSN for not yet activated load requests

simulation_min = '29000102000000000001000' = {2900-01-02 01:00:00 000001 CET}

simulation_max = '29000102000000999999000' = {2900-01-02 01:00:00 999999 CET}

reorg_high_min = '29991231235959000001000' = {3000-01-01 00:59:59 000001 CET}

reorg_high_max = '29991231235959999998000' = {3000-01-01 00:59:59 999998 CET}

max = '99991231235959999999999' = {9999-12-31 23:59:59 999999 UTC}

nc_reporting = '99999999999999000000001' = {????-??-?? ??:??:?? ?????? UTC}

nc_user = '99999999999999000000002' = {????-??-?? ??:??:?? ?????? UTC}

nc_temp = '99999999999999000000003' = {????-??-?? ??:??:?? ?????? UTC}
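A minimal ABAP sketch that prints a few of these constants – the names come directly from IF_RSPM_CONSTANTS as listed above:

* Print selected special TSN constants from IF_RSPM_CONSTANTS
WRITE: / 'dummy:        ', if_rspm_constants=>dummy,
       / 'real_min:     ', if_rspm_constants=>real_min,
       / 'not_activated:', if_rspm_constants=>not_activated.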

 

More information:

BW request types: RSSM vs RSPM

PC does not start due to an unmaintained start variant

I recently faced a strange error with my newly created process chain. On an attempt to start it, I got the below error message:

Job could not be released. Probably an authorization issue

Message no. RSAR051


I was sure that it had nothing to do with authorization. I checked which user is assigned to the chain in its attributes as the Execution User; that user did not lack any authorizations either.

It turned out to be just the chain's start variant, in which I had not specified the scheduling condition: no Immediate Start/Date Time/After Job/After Event/Operation Mode data was entered.

Once I added it, the chain worked just fine.



The error message is issued within FM RSSM_PLAN_START_BATCH_JOB, somewhere around line 661. There, the value returned in variable l_job_released by a call of another FM (JOB_CLOSE) is evaluated:

MESSAGE s051(rsar) WITH 'Job could not be released.'(210) 'Probably an authorization issue'(211) '' ''.

However, the real reason why the job scheduling failed is not properly propagated. This is fixed in SAP Note 3164846, which introduces a CASE/WHEN statement to do that. For context, a hedged sketch of the JOB_CLOSE pattern involved is shown below.
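The standard background-job pattern around JOB_CLOSE looks roughly like this (the job name is illustrative); the chain start fails exactly when JOB_WAS_RELEASED comes back empty:

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZDEMO_JOB',
      lv_jobcount TYPE tbtcjob-jobcount,
      lv_released TYPE btch0000-char1.

* Create the job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* ... job steps would be added here via SUBMIT ... VIA JOB ...

* Close the job and try to release it immediately
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount         = lv_jobcount
    jobname          = lv_jobname
    strtimmed        = 'X'
  IMPORTING
    job_was_released = lv_released.

IF lv_released IS INITIAL.
  " the situation in which message RSAR051 is raised
  MESSAGE s051(rsar) WITH 'Job could not be released.'
                          'Probably an authorization issue' '' ''.
ENDIF.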

 

More information:

3164846 - Error messages from job scheduling are not returned to caller

Saturday, December 16, 2023

"Results table" feature in SAP Code Inspector

SAP Code Inspector (t-code SCI) is a tool for analyzing and inspecting ABAP code. The tool helps ABAP developers ensure that ABAP code complies with certain coding standards, performance guidelines, and best practices. It performs static code checks, which means it analyzes the source code without actually executing it. The tool identifies potential issues, vulnerabilities, and areas for improvement in the code, facilitating the development of more robust and efficient applications.

Here I want to mention one feature of SCI that is most likely not widely known: the "Results table". It is a button on the SCI toolbar:

View of "Results table" provides comprehensive view where checks from the SCI and ABAP Extended Program Check (t-code SLIN) are combined.

In the picture below, issues marked in red are reported by SCI; issues marked in green are reported by SLIN.


The "Results table" button is a new function that is added into SAP NetWeaver system as of SAP_BASIS 702.


BW/4HANA 2.0: External SAP HANA SQL View

There is a new feature in BW/4HANA 2.0, starting with the Initial Shipment Stack: the external SAP HANA SQL view. The new view follows one of the principles of SAP BW/4HANA, which is openness.

In older releases of SAP BW/4HANA there was no dedicated SQL view to support extraction from aDSO objects, and SAP doesn't recommend using the active table of the aDSO for extraction (see SAP Note 1682131) because those tables are not publicly released. For SAP BW/4HANA versions lower than 2.0, SAP recommends either using the HANA Modeler to import BW InfoProvider metadata and generate a HANA model, or generating the HANA model for the BW InfoProvider from the BW backend.

Now with BW/4HANA 2.0 the new feature brings an SQL view that is supposed to be used for extraction of SAP BW objects like aDSOs or InfoObjects (see SAP Note 2723506). The naming convention for the new SQL view ends with *8. That means that on top of the other DDIC objects related to the aDSO there is a new view, /BIC/A<aDSO_tech_name>8, e.g. /BIC/AZMMA_ADSO8 - View for external Access for DataStore ZMMA_ADSO.

A full list of DDIC tables/views for my example aDSO ZMMA_ADSO:

 

TABLES:

/BIC/AZMMA_ADSO                Inbound Table for DataStore ZMMA_ADSO

/BIC/AZMMA_ADSO1               Inbound Table for DataStore ZMMA_ADSO

/BIC/AZMMA_ADSO2               Active Data Table for DataStore ZMMA_ADSO

/BIC/AZMMA_ADSO3               Change Log for DataStore ZMMA_ADSO

VIEWS:

/BIC/AZMMA_ADSO6               View for Extraction from DataStore ZMMA_ADSO

/BIC/AZMMA_ADSO7               View for Reporting for DataStore ZMMA_ADSO

/BIC/AZMMA_ADSO8               View for external Access for DataStore ZMMA_ADSO

 

The views ending with 6 and 7 were generated by BW/4HANA 1.0; the view ending with 8 is generated in BW/4HANA 2.0. From my experience (based on BW/4HANA 2.0 SP05), the *8 view doesn't get generated by the aDSO activation right away. In my case it was generated when I loaded data into the aDSO for the first time.
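Once the *8 view exists, it can be consumed like any other SQL view. A minimal ABAP sketch using the example aDSO above (assuming the view has been generated and data has been loaded):

* Read through the external SQL view of aDSO ZMMA_ADSO
DATA lv_cnt TYPE i.

SELECT COUNT(*) FROM /bic/azmma_adso8 INTO lv_cnt.
WRITE: / 'Rows visible via the external view:', lv_cnt.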

 

More information:

How to read/write/delete from/to aDSO objects

1682131 - SAP BW tables in SAP HANA Information Views and ABAP CDS Views not supported

2723506 - External SAP HANA SQL View with SAP BW/4HANA 2.0

Thursday, November 2, 2023

SAP_INFOCUBE_DESIGNS tool

If the dimension tables of a star schema data model become too large compared to the fact table, there can be performance problems; thus, this needs to be monitored. SAP provides a tool for that: the ABAP program SAP_INFOCUBE_DESIGNS. The tool provides a list of cubes with the number of rows and the density % rate for each of the cubes.

The density ratio is the number of rows in each dimension table divided by the number of rows in the cube's fact table. The ratio is calculated using the DB-specific FM RSDU_INFOCUBE_TABLE_SIZES, e.g. RSDU_INFOCUBE_TABLE_SIZES_HDB in the case of HANA DB.
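To make the ratio concrete (numbers are illustrative): with a fact table of 1,000,000 rows and a dimension table of 200,000 rows, the density of that dimension is 200,000 / 1,000,000 = 20%. A commonly cited rule of thumb is to keep a dimension well below roughly 10-20% of the fact table row count; a dimension whose size approaches the fact table itself indicates a design problem.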

If the ratio is very high, something should be done about the cube design. Some characteristics can be moved to a separate dimension. Similarly, if a characteristic is very large, it should be the only one in its dimension and the dimension can be marked as a line item dimension.

In the case of BW/4HANA the report is not relevant, as the source of the information is table RSDCUBE and there are no cubes in BW/4HANA. The basic InfoProvider object type is the aDSO, which has a flat layout, so there is no star schema anymore. To get more information about an aDSO, one can use the watermarks tool.

 

More information:

Note 1461926 - FAQ: BW report SAP_INFOCUBE_DESIGNS

Tuesday, October 31, 2023

BW/4HANA obsolete tools

In SAP BW/4HANA, many tools that were present in earlier releases of BW became obsolete. This has a few reasons; mostly it is because BW/4 uses fewer object types compared to classic BW. BW/4 is simplified in many ways – see details about it here.

Below I have tried to compile a list of the obsolete tools in BW/4HANA. This blog post complements my earlier post about t-codes that are obsolete in BW/4.

 

1. DB table consistency tool

Program RSDU_TABLE_CONSISTENCY is deprecated with SAP BW/4HANA, source code removed. See more information about the tool here.

 

2. aDSO tool for Activating/Repairing DataStore Objects

These are ABAP programs that check and regenerate aDSO objects in the case of technically incomplete or inconsistent metadata, without creating a transport.

RSDG_ODSO_ACTIVATE - obsolete

RSDG_ADSO_ACTIVATE – to be used in BW/4

RSDG_ADSO_ACTIVATE only works in dialog; RSDG_ADSO_ACTIVATE_ALL works in the background too.

 

3. InfoSet tool

Program RSQ_ISET_MASS_OPERATIONS doesn't exist in BW/4

 

4. Tool to activate inactive communication structures

Program RS_COMSTRU_ACTIVATE_ALL does not exist in BW/4

 

5. Tool to activate transfer structures for a source system

Program RS_TRANSTRU_ACTIVATE_ALL does not exist in BW/4

 

6. Tool to activate inactive update rules

Program RSAU_UPDR_REACTIVATE_ALL does not exist in BW/4

 

7. Print a list of InfoProviders available in the system with their layout

Program SAP_INFOCUBE_DESIGNS – obsolete, as FM RSD_CUBE_MULTI_GET_ONLY_DB reads from the RSDCUBE table, which is empty in BW/4

 

8. Undo of DSO conversion to SAP HANA-Optimized DataStore Object

Program RSDRI_RECONVERT_DATASTORE - obsolete on BW4 (ASSERT 1 = 0.)

 

9. Program for Mass activation of Non-Badi SPOs

Program RSDG_LPOA_ACTIVATE - obsolete on BW4 (ASSERT 1 = 0.)

 

10. Optimize Conversion of Standard Objects to SAP HANA

T-code RSMIGRHANADB or program RSDRI_CONVERT_CUBE_TO_INMEMORY - obsolete on BW4 (ASSERT 1 = 0.)

 

11. SAP HANA Partitioning of DataStore Objects

Program RSDU_WODSO_REPART_HDB - obsolete on BW4 (ASSERT 1 = 0.)


More information:

Action canceled. Not supported in BW4HANA edition.

Converting BW system to BW/4HANA

BW on HANA - tables consistency check program

SAP BW/4HANA (B4H) – what is it?

SAP BW/4HANA (B4H) versions

SAP BW4/HANA related t-codes


BWonHANA: SAP HANA-Optimized DataStore Object

In BWonHANA systems a new object was introduced: the SAP HANA-Optimized DataStore Object. I mentioned it in my blog post related to BWonHANA - Benefits of #BWonHANA. It was a type of DSO optimized for the HANA DB. This object does not store any data persistently in its change log table; the change log is calculated on the fly via a HANA calculation view. Data is read from the history table of the temporal table of active data in the SAP HANA database. The tables around the optimized DSO comprise a history table, a main table and a delta table. The object contains an additional field, IMO__INT_KEY, in the active data table. The field is purely technical and is not visible in reporting.

During the migration of a BW system to the HANA DB, the DSOs needed to be migrated to the optimized type. That can be done via t-code RSMIGRHANADB.

As of BW 7.3 SPS09 this type of DSO became obsolete. The conversion to an SAP HANA-optimized DSO is performed automatically for all standard DataStore objects, so an explicit conversion of DSOs to SAP HANA-optimized objects is obsolete. It is still possible to use SAP HANA-optimized DataStore objects, but SAP recommends reconverting them back to standard DSOs. That can be done via report RSDRI_RECONVERT_DATASTORE.


BW objects - Differences between DDIC and DB

A BW system generates a lot of temporary database objects while it is running, e.g. during query execution or other processes that read data from BW InfoProviders. These objects can be database views or tables. Mostly they are placed in the '/BI0/0' namespace.

I wrote some information about the BW temporary objects here. In that blog post I mentioned that program SAP_DROP_TMPTABLES can be used to remove this type of temporary object. However, if some of those temporary objects are reported as inconsistent via t-code DB02, another program should be executed: SAP_UPDATE_DBDIFF. The program copies information about the differences between definitions in the ABAP DDIC and in the database catalog to table DBDIFF. The DB02 t-code then includes the DBDIFF table when checking for inconsistencies. A small sketch of this sequence is shown below.
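A minimal sketch of this housekeeping sequence, using the program and table named above (whether SAP_UPDATE_DBDIFF needs selection-screen input may depend on the release):

* Refresh the DDIC-vs-database difference snapshot used by DB02
SUBMIT sap_update_dbdiff AND RETURN.

* Inspect the recorded differences
DATA lv_cnt TYPE i.
SELECT COUNT(*) FROM dbdiff INTO lv_cnt.
WRITE: / 'Entries in DBDIFF:', lv_cnt.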

 

More information:

Deletion of temporary database BW objects

Friday, October 27, 2023

Slice statistics in BW

To be able to use slice statistics in BW, a few things need to be customized. Via t-code RSDDSTAT_SEL_CUST (program RSDDSTAT_SEL_CUSTOMIZE, which calls function module RSDDSTAT_SEL_MAINTAIN), the selection statistics for a particular InfoProvider (mostly aDSOs in BW/4-based systems) need to be activated.

All the InfoProviders for which this was customized are stored in table RSDDSTATSLICECUS (Customizing of data slice statistics). The particular statistics selection criteria are stored in table RSDDSTATSLICE (Selection criteria).

Afterwards, the slice statistics selection data is stored in table RSDDSTATSLICEEX (Extracted values). A small check sketch is shown below.
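A minimal sketch to verify the customizing and the collected data, assuming the tables above (field structures omitted, only row counts are read):

DATA lv_cnt TYPE i.

* InfoProviders with slice statistics switched on
SELECT COUNT(*) FROM rsddstatslicecus INTO lv_cnt.
WRITE: / 'Customized InfoProviders:', lv_cnt.

* Collected slice statistics values
SELECT COUNT(*) FROM rsddstatsliceex INTO lv_cnt.
WRITE: / 'Extracted values:', lv_cnt.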

Sunday, September 24, 2023

SAP PowerConnect

SAP PowerConnect for Splunk is a tool to capture data about events in SAP systems and upload it to Splunk software in real time. The data is analyzed and visualized as SAP telemetry intelligence in Splunk. The SAP telemetry provides analytical information on the efficiency of SAP systems in local and cloud environments.

The PowerConnect tool was originally developed by BNW Consulting, a company that was acquired by SoftwareOne in 2019. Splunk itself was acquired by Cisco in 2023.

Technically, the tool is developed in the /BNWVS/* namespace as software component BNWVS.

The following are examples of jobs running in SAP systems that are part of SAP PowerConnect:

/BNWVS/BC_CHECK_JOB

/BNWVS/BC_DATA_ARCHIVE

/BNWVS/BC_DATA_EXTRACT

/BNWVS/BC_DATA_TRANSFER

/BNWVS/DISTRIBUTE_PARALLEL

More information:

powerconnect.io

PowerConnect documentation

PowerConnect on splunkbase

Sunday, August 27, 2023

Reserved key words for SAP table fields

When it comes to table creation in the SAP Data Dictionary (DDIC), a check is performed on whether a particular field (table column name) is reserved. If it is, the dictionary object cannot be activated. This applies regardless of whether the table is created manually or generated by the system. Automatically generated dictionary objects occur in BW objects like aDSOs, Open Hubs, etc.; in other applications, tables can be generated automatically for CDS views.

The check of whether the field is reserved is performed in the ABAP Data Dictionary. The dictionary has a list of reserved words that may not be used for database objects. The list depends on the database system and is present in the system in the form of a database table called TRESE. The table has two columns:

NAME – represents the reserved key word itself

SOURCEHINT - reason for the reservation, i.e. which DB type the keyword is reserved for

 

In summary, the table TRESE in the ABAP Dictionary stores reserved or protected names that cannot be used in dictionary object names. A small check sketch is shown below.
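A minimal ABAP sketch of such a check, mirroring what the DDIC does during activation (the candidate name is illustrative):

DATA lv_hint TYPE trese-sourcehint.

* Check whether a candidate column name is a reserved word
SELECT SINGLE sourcehint FROM trese
  INTO lv_hint
  WHERE name = 'SELECT'.

IF sy-subrc = 0.
  WRITE: / 'Name is reserved, reason:', lv_hint.
ELSE.
  WRITE: / 'Name is not reserved'.
ENDIF.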