Friday, July 1, 2022

Working with navigational attributes in Composite Providers

While working with Composite Providers (HCPR), navigation attributes of the InfoObjects present in the objects contained in the HCPR are not shown by default. The picture below illustrates what it looks like when the navigation attributes are not shown.

In the BW Modeling Tools of SAP HANA Studio there is an option to enable them. In the HCPR maintenance screen, the right-click context menu offers an entry called “Show Unassigned Navigation Attributes”.


Once the above option is activated, the navigation attributes are present and can be freely assigned. Here is how it looks when the navigation attributes are available.



 



Thursday, June 9, 2022

Error: Numeric overflow for parameter/column

I recently ran into an issue while reading an aDSO object from custom code. I used the method READ of ABAP class CL_RSDRI_INFOPROV. The return code of the method call was 8 (= inherited_error). When I debugged it, I got to a place in FM TREX_DBS_AGGREGATE where the following call is performed:

cl_hdb_sql_for_aggr_req_facade=>get_instance_for_1st_chunk

In the exception that was triggered, I found the following error:

AttributeEngine: overflow in numeric calculation;AttributeEngine: overflow in numeric calculation;exception 70006944: AttributeE

overflow in numeric calculation; $message$=aggregation failed $BIC$<KF_name>$sum$ 1 fixed14.3(17) exception 70006944: AttributeE

overflow in numeric calculation; $message$=aggregation failed $BIC$ KF_name $sum$ 1 fixed14.3(17) ,Exception in executor ...

plan408469425@ndhcdb01-int:30085 while executing pop 6: calcEngine search on olapIndex failed.,QueryId: ...

00O2SPBEZ1RC04NPREWOW2QMR:_C10/[Request Info: Object Name = "<db_schema>"."0BW:BIA: <cube/aDSO_name>", FM Name = TREX_DBS_AGGREGATE]

Error 6.944 has occurred in the BWA/SAP HANA server

Error reading the data of InfoProvider <cube/aDSO_name>$X



This seems to be the generic DB error no. 10811:

Numeric overflow for parameter/column (<id>) source type <source_type>, target type <target_type>, value '<value>'

Apparently, the value of the key figure (KF) mentioned in the error exceeded the limit of its data type. The solution was to delete the data that caused the overflow.
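
For context, below is roughly how such a read call looks – a minimal sketch modeled after the standard demo program RSDRI_INFOPROV_READ_DEMO. The aDSO name ZSALES01 and the key figure KF_AMOUNT are made up; adjust them to your own objects.

* Minimal sketch after RSDRI_INFOPROV_READ_DEMO; ZSALES01 and
* KF_AMOUNT (incl. its data element) are hypothetical names.
TYPES: BEGIN OF ty_row,
         calday TYPE /bi0/oicalday,
         amount TYPE /bic/oikf_amount, " hypothetical KF data element
       END OF ty_row.
DATA: lr_prov   TYPE REF TO cl_rsdri_infoprov,
      lth_sfc   TYPE rsdri_th_sfc,    " requested characteristics
      lth_sfk   TYPE rsdri_th_sfk,    " requested key figures
      lt_result TYPE STANDARD TABLE OF ty_row,
      lv_end    TYPE rs_bool,
      lv_first  TYPE rs_bool VALUE rs_c_true.

INSERT VALUE #( chanm = '0CALDAY' chaalias = 'CALDAY' ) INTO TABLE lth_sfc.
INSERT VALUE #( kyfnm = '/BIC/KF_AMOUNT' kyfalias = 'AMOUNT'
                aggr = 'SUM' ) INTO TABLE lth_sfk.

CREATE OBJECT lr_prov
  EXPORTING
    i_infoprov = 'ZSALES01'.

WHILE lv_end = rs_c_false.
  CALL METHOD lr_prov->read
    EXPORTING
      i_th_sfc        = lth_sfc
      i_th_sfk        = lth_sfk
      i_packagesize   = 100000
    IMPORTING
      e_t_data        = lt_result
      e_end_of_data   = lv_end
    CHANGING
      c_first_call    = lv_first
    EXCEPTIONS
      inherited_error = 8
      OTHERS          = 4.
  IF sy-subrc = 8. " the very return code described above
    EXIT.
  ENDIF.
  " ... process lt_result package by package ...
ENDWHILE.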

 

More information:

2399990 - How-To: Analyzing ABAP Short Dumps in SAP HANA Environments

2393013 - FAQ: SAP HANA Clients

2352450 - ADBC: Numeric overflow for data type BIGINT


Critical path of a process chain

Within BW Tools (t-code ST13), and in the Process Chain Analysis in particular, there is a function called “Critical Path”. What it does is basically highlight those processes in the process chain (PC) that form its flow, i.e. which predecessor process needs to be completed in order for the next process to start. Once the function is switched on, the BW system starts by determining the process that finished last. Then the system goes back through the list of the PC’s processes up to the trigger of the PC. This is how the critical path is analyzed.
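
Conceptually, the back-walk can be sketched as below. This is purely illustrative pseudocode with made-up structures (the real logic lives in the method TIME_CRITICAL_PATH mentioned below); filling lt_run from the chain’s log is omitted:

* Illustrative back-walk sketch only - all types here are made up.
TYPES: BEGIN OF ty_proc,
         process     TYPE string,    " process identification
         predecessor TYPE string,    " process it waited for
         endtime     TYPE timestamp, " when the process finished
       END OF ty_proc.
DATA: lt_run  TYPE STANDARD TABLE OF ty_proc WITH EMPTY KEY,
      lt_path TYPE STANDARD TABLE OF ty_proc WITH EMPTY KEY.

" 1) start from the process that finished last
SORT lt_run BY endtime DESCENDING.
DATA(ls_cur) = VALUE ty_proc( lt_run[ 1 ] OPTIONAL ).

" 2) walk back over the predecessors up to the chain's trigger
WHILE ls_cur IS NOT INITIAL.
  INSERT ls_cur INTO lt_path INDEX 1.
  ls_cur = VALUE #( lt_run[ process = ls_cur-predecessor ] OPTIONAL ).
ENDWHILE.
" lt_path now holds the critical path, trigger first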

A review of the critical path does not help much with respect to optimizing the PC runtime, as it does not take into account the time each particular process consumed. The function is implemented in the ABAP method TIME_CRITICAL_PATH of ABAP class CL_RSPC_HIER.

The function is available via a button called “Critical Path on/off” in ST13’s Process Chain Hierarchy screen:


More information:

Suite of helpful BW programs – BW Tools (ST13->BW-TOOLS)

Run time comparison of Process Chains

Tuesday, June 7, 2022

0INFOPROV vs 0TCAIPROV

I recently faced an authorization issue after a new BW/BEx report was created: it wasn’t accessible by end users. When I traced it in t-code RSECADMIN, I got the error message below:

Authorization missing for aggregation (":"): Char: 0INFOPROV – empty



From the trace itself, the cause looked obvious: the analysis authorization object used in the end user’s role lacked the 0INFOPROV characteristic, in particular its colon (“:”) aggregation authorization. I followed up my analysis by checking the corresponding analysis authorization object.

By definition (see e.g. here), every analysis authorization object needs to have the three characteristics below:

0TCAACTVT (activity), 0TCAIPROV (InfoProvider) and 0TCAVALID (validity)

 

Therefore, there is no mention of the 0INFOPROV that popped up in the RSECADMIN trace. From this, it seemed that my analysis authorization object was set up in the correct way. Nevertheless, why was the RSECADMIN trace complaining about 0INFOPROV? What is the relation between 0INFOPROV and 0TCAIPROV? The latter references the former; however, that shouldn’t matter...

It turned out that 0INFOPROV was set as Authorization Relevant (in table RSDCHA, field AUTHRELFL is set to X). As you can see in the picture below, 0INFOPROV comes from the Business Content as not Authorization Relevant (tables from both systems), however in the SAP BW system in question (the table at the picture’s bottom) the active version of the characteristic was enabled as Authorization Relevant. That particular setting was driving the security trace to mark 0INFOPROV as not present in the analysis authorization object. Once I included 0INFOPROV (and its colon aggregation) in the affected analysis authorization object, business users were able to access the report.
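
To quickly check that flag on any system, a lookup like the one below can be used. This is a sketch: the table and the AUTHRELFL field are taken from the analysis above, while the key columns CHANM and OBJVERS are my assumption and worth verifying in SE11:

" Sketch: is 0INFOPROV flagged as authorization relevant?
" Key columns CHANM/OBJVERS are assumed - verify the table in SE11.
SELECT SINGLE authrelfl
  FROM rsdcha
  WHERE chanm   = '0INFOPROV'
    AND objvers = 'A'            " active version
  INTO @DATA(lv_authrel).
IF lv_authrel = 'X'.
  " 0INFOPROV must then be maintained in the analysis authorizations
ENDIF.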


More information:

Defining Analysis Authorizations

820183 - New authorization concept in BI

1956404 - Characteristics 0TCAIPROV, 0TCAACTVT, 0TCAVALID are no longer Authorization Relevant after Upgrade to BW7.3 or higher



Sunday, June 5, 2022

CDS views related to Analytics (BI)

The CDS (Core Data Services) concept has been around for some time; it enables data models to be defined and consumed on the database server rather than on the application server. The concept is important from many perspectives. It provides a semantic layer (for use cases like analysis, operations, search, etc.) and a uniform data model for all transactional and analytical application areas, and it brings simplifications to the SQL database language, reducing technical complexity for the user. One of the use cases for CDS is Analytics/Business Intelligence.

Before I dig into the Analytics-related CDS views, let’s define a few more terms related to this area. CDS annotations describe semantics related to the business data. An annotation enriches the definition of a model element in CDS with metadata. It can be specified for specific scopes of a CDS object, i.e. at specific places in a piece of CDS source code.

A CDS annotation can be ABAP-specific (meaning it is consumed by the ABAP runtime) or framework-specific. Particular frameworks can be the Service Adaptation Definition Language (SADL), the Business Object Processing Framework (BOPF), Analytics, or Enterprise Search.

 

Let’s focus on the Analytics CDS annotations now. Here we recognize two groups of them:

 

1. Analytics annotations - enable the Analytic Manager for multidimensional data consumption, performing data aggregation, and slicing and dicing the data. BI front ends like Design Studio and Analysis Office can consume the data via the Analytic Manager.

2. AnalyticsDetails annotations - enable application developers to specify the default multidimensional layout of the query, the sequence of variables in UI consumption, and the specific aggregation and planning behavior of the data. All these annotations can only be used in views annotated with @Analytics.query: true. CDS views identified this way are called CDS queries.

 

 

Based on the annotation Analytics.dataCategory, we can distinguish the following data categories of CDS views:

 

1. Fact View = CDS view annotated with @Analytics.dataCategory: #FACT

The CDS entity represents the fact table (the center of a star schema) of a transactional data object. The fact table contains the measures (key figures). The use case here is replication, thus this type of CDS view should not be joined with master data views.

 

2. Cube View = CDS view annotated with @Analytics.dataCategory: #CUBE

What is a cube view? It is used for reporting, similarly to BW InfoProviders like cubes or aDSOs. It is similar to #FACT, but as reporting on key figures alone doesn’t bring any value, such data needs to be joined with master data objects (see the sketch after this list). Example.

 

3. CDS Dimension = CDS view annotated with @Analytics.dataCategory: #DIMENSION

Used for reporting on master data. No key figure fields can be defined as key fields; only characteristic fields can be key fields. Example.

 

4. Aggregation Level View = CDS view annotated with @Analytics.dataCategory: #AGGREGATIONLEVEL

Used in planning scenarios to provide write-back functionality.
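
To make the categories more tangible, below is a minimal sketch of a cube-category CDS view with one association to a dimension view. All names (Z_Sales_Cube, ZSALESCUBE, zsales_tab, Z_Customer_Dim) are made up for illustration:

@AbapCatalog.sqlViewName: 'ZSALESCUBE'
@Analytics.dataCategory: #CUBE
@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Illustrative sales cube view'
define view Z_Sales_Cube
  as select from zsales_tab                -- made-up transactional table
  association [0..1] to Z_Customer_Dim as _Customer
    on $projection.customer_id = _Customer.customer_id
{
  key zsales_tab.doc_id,

      @ObjectModel.foreignKey.association: '_Customer'
      zsales_tab.customer_id,
      _Customer,                           -- exposed join to a #DIMENSION view

      @DefaultAggregation: #SUM
      zsales_tab.amount                    -- the measure (key figure)
}

A characteristic-like view such as Z_Customer_Dim would itself carry @Analytics.dataCategory: #DIMENSION.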


A few more words about the naming conventions related to CDS views. There are three different technical names for a CDS view, stored in table RSODPABAPCDSVIEW:

SQLVIEWNAME - name of the SQL view (ABAP object); the SQL view name is used in the ABAP Dictionary and can be seen in t-code SE11.

DDLNAME - name of the CDS view (the DDL source)

STRUCTOBJNAME - name of the view entity defined in the CDS view (entity name)



More information:

Core Data Services

CDS Annotations

Analytics Annotations

AnalyticsDetails Annotations

Cube View in CDS

InfoObject in CDS

CDS views: HowTo use in BW contexts

Monday, May 16, 2022

Change default download directory from SAP GUI

Here’s a quick tip on how to change the default download directory of SAP GUI. The default directory on a Windows OS host, where files downloaded via SAP GUI are stored, points to:

x:\Users\<username>\Documents\SAP\SAP GUI

 

However, there is a possibility to change it. One of the options here is via the Windows Registry. Below is the Registry path:


[HKEY_LOCAL_MACHINE\Software\SAP\SAP Shared] on 32bit operating systems

[HKEY_LOCAL_MACHINE\Software\Wow6432Node\SAP\SAP Shared] on 64bit operating systems

Key: SapWorkDir

Type: Expandable String Value

Value: The path you want to change

 

For someone who has no access to the Windows Registry, this is not an option. Another possibility is to do it via the user profile settings in SAP GUI. Simply call t-code SU3 and switch to the tab called Parameters.

Here, enter the parameter name GR8.



As the parameter value, enter the desired target directory.


Once you use the file download functionality in any SAP GUI t-code, the default folder will point to the folder given in the SU3 parameter. Similarly, one can leverage another parameter, CR9, which is used while uploading files from the user’s workstation to the SAP system.
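
Under the hood, GR8 and CR9 are ordinary SPA/GPA parameter IDs, so custom code can read the user’s setting as well; a tiny sketch:

" GR8/CR9 are SPA/GPA parameter IDs maintained in SU3 (table USR05)
DATA lv_download_dir TYPE string.
GET PARAMETER ID 'GR8' FIELD lv_download_dir. " default download folder
IF sy-subrc <> 0.
  " parameter not maintained in SU3 - the SapWorkDir default applies
ENDIF.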

This seems to be very old functionality, as it has been available in SAP NetWeaver-based systems since component SAP_BASIS release 620, which dates back to circa 2002.



More information:

622128 - Customizing the file Upload/Download paths.

Thursday, May 12, 2022

SAP for Me and SAP Support Portal vs ONE Support Launchpad

In my posts, I mention the support sites of SAP a lot. They are quite useful for searching for information about SAP products, issues, and how-tos, and also for requesting license keys and support, or for downloading SAP software. As there are many names for those sites, I reckon it can be useful to mention the latest ones.

In the past, we normally referred to these sites either as OSS (Online Service System) or later as SMP (Service Marketplace). Nowadays we can distinguish between two main support landing pages:

 

SAP Support Portal is the one-stop shop for all support- and service-related needs of SAP customers. It can be used to access and download the software, request license keys, get technical support, get information about the software solutions, and find the documentation. Link: support.sap.com

 

ONE Support Launchpad is the new version of the Service Marketplace (SMP) for all SAP customers. Link: launchpad.support.sap.com. The site will soon transition to SAP for Me, which will be the single entry point to all SAP support sites. Right now, SAP for Me serves as a place to improve the SAP customer experience throughout all touch points with SAP. A list of the whole product portfolio, as well as finance and legal matters related to the customer’s relationship with SAP, is available there; other areas, such as services and support or systems and provisioning, are available via the SAP for Me site as well.


More information:

OSS

SMP

SAP Support Portal

SAP ONE Support Launchpad

Tuesday, April 19, 2022

Run DTP as a job?

Running a DTP scheduled as a background job is probably quite a silly idea. However, I just started to wonder whether it is possible at all. I am aware that ever since the concept of a process chain was introduced in BW, this has not really been necessary. With a process chain, it is pretty much possible to run the DTP within that chain anytime we want, with any frequency we want. Nevertheless, I still wondered: is it possible to run the DTP as a job without the chain? Let us have a look.

There is an ABAP program called RSPROCESS that can run any BW process, be it Execute Data Transfer Process (DTP_LOAD), Delete Complete Data Target Contents (DROPCUBE), Activate Requests in advanced DataStore Objects (ADSOACT), Execute ABAP Program (ABAP), Execute Operating System Command (COMMAND), Activate Master Data (MDACTIVAT), any BPC process, and so on. See an overview of the different BW tasks/objects within SAP BW here.

Thus, the very first idea is to leverage this program and schedule it as a background job. On the program’s selection screen, these are the input parameters that need to be provided:

Process Type                                                 DTP_LOAD

Process Variant                                              <DTP_tech_name>


Now the program needs to be scheduled as a background job. To do that, there is an item called Execute in Background in the Program menu:


Next, just accept the Background Print Parameters pop-up:


Finally, enter the start date/time or the event that should serve as the background job’s trigger. Once done, the job is scheduled by accepting the pop-up with the Save button.


An information message “Background job was scheduled for program RSPROCESS” confirms the job’s scheduling.


Now it is possible to observe the job’s execution in t-code SM37. Depending on the DTP and the BW system setup, several jobs can be found:


RSPROCESS - the job that ran the program RSPROCESS itself. This one started the next job(s) that represent the DTP load execution.

BIDTPR_20220419211828000002_1 – the generated timestamp represents the data load request ID that can be seen in the DTP monitor (e.g. t-code RSDTP). There can be multiple such jobs, depending on the size of the data set being processed, the DTP’s package size, etc.
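
For completeness, the same scheduling can be scripted with the standard JOB_OPEN / SUBMIT ... VIA JOB / JOB_CLOSE pattern. This is a sketch; note that the selection screen parameter names p_type and p_variant are my assumption - check the actual technical names via F1 on the RSPROCESS selection screen:

* Sketch: schedule RSPROCESS as a background job programmatically.
DATA: lv_jobname  TYPE btcjob VALUE 'Z_RUN_DTP',
      lv_jobcount TYPE btcjobcnt.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT rsprocess
  WITH p_type    = 'DTP_LOAD'      " parameter names assumed - verify via F1
  WITH p_variant = '<DTP_tech_name>'
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = 'X'.               " start immediately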




Debugging End Routine in HANA runtime of BW Transformation

In BW transformations that are set to be executed in HANA runtime, it is still possible to have an ABAP end routine. Here’s a short guide on how to debug the ABAP code in such end routines.


Technically, the place where the ABAP code of the end routine is embedded into the overall code of the HAAP process is a generated ABAP class. As an example, if the HAAP name (variable I_HAAPNM) is TR_ZQPTTP4YE36NYCQEVA6P_A, then the ABAP class name is /BIC/ZQPTTP4YE36NYCQEVA6P_A (variable R_CLASS_NAME). In that ABAP class there is a method called GLOBAL_END where the code is located.
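
For orientation, the generated method follows the usual end routine pattern; a schematic skeleton (the actual generated signature and surrounding code may differ in detail):

* Schematic skeleton of what the generated GLOBAL_END method wraps.
METHOD global_end.
  " the custom end routine logic operates on the RESULT_PACKAGE rows
  LOOP AT result_package ASSIGNING FIELD-SYMBOL(<ls_result>).
    " ... custom ABAP end routine code modifying <ls_result> ...
  ENDLOOP.
ENDMETHOD.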

There are two environments where such an End Routine can be debugged.

 

1. In SAP HANA Studio – first open the transformation and display the end routine code. Right-click on an executable ABAP statement and set the breakpoint by choosing the “Toggle Breakpoint” context menu item.


While in the BW Modeling perspective, open the respective DTP and, via the button on the DTP’s toolbar called “Start execution of data transfer process”, select the “Simulate in Dialog” item. This action opens the RSDTP t-code embedded in SAP HANA Studio. Here, hit the Simulate button and continue as you would debug in SAP GUI, as described below.



2. In SAP GUI – debugging is still possible in SAP GUI if one prefers to do so. While in the RSDTP t-code, set the “Processing Mode” to “Serially in the Dialog Process (for Debugging)”, tick the “Expert Mode” checkbox, and hit the Simulate button.


Next, a pop-up called “Debug Request” is shown. Here, enable the “Before Transformation” checkbox. After that, run it.



After a while, the ABAP Debugger screen of SAP GUI is presented. Go to the Break./Watchpoints tab of the ABAP Debugger. Create a new breakpoint here, pointing to the following ABAP objects:



Class name       CL_RSDHAR_TGT

Method Name      IF_RSDHAR_TGT~EXTRACT

Once the new breakpoint is created, it is shown as below:



Run the debugger until the new breakpoint is reached. There may be a few more stops caused by hardcoded BREAK-POINT ABAP statements before it is reached. Once the ABAP Debugger finally stops at the IF_RSDHAR_TGT~EXTRACT (CL_RSDHAR_TGT) breakpoint, just scroll a little bit down to the place where the call of method o_execute_abap_endrout is located and place another breakpoint there.



Once that new breakpoint at the call of method o_execute_abap_endrout is reached, you have come to the point where the ABAP code of the custom end routine is executed.

Thursday, March 31, 2022

BW on HANA - tables consistency check program

In SAP BW systems based on HANA DB, there is a program to check the consistency of table properties with respect to the BW application. The consistency check is often performed in cases of system migrations (e.g. OS, hardware), system copies, or restores. The program name is RSDU_TABLE_CONSISTENCY.


The program reads all tables residing in the HANA DB and identifies those that are BW-relevant with respect to the program’s consistency check scenarios. The so-called consistency check scenarios are present on the program’s selection screen. An inconsistency is found in case an expected table property does not match reality. Besides detecting issues, the program also supports a repair mode.

Program RSDU_TABLE_CONSISTENCY is obsolete in BW/4-based systems as a part of the simplification effort. The code of the program is no longer present. It can be run, but the message “Report not supported anymore. Please see SAP Note 2668225 (msg no: RSHDB 111)” is shown instead of the program being executed. As the main object that carries the BW data persistency in BW/4 is the aDSO object, the checks of the aDSO objects are now available in t-code RSOADSODTO.

 

More information:

2569097 - How to use of report RSDU_TABLE_CONSISTENCY [VIDEO]

2668225 - Report RSDU_TABLE_CONSISTENCY is deprecated with SAP BW/4HANA

1937062 - Usage of RSDU_TABLE_CONSISTENCY - PDF doc


Wednesday, March 30, 2022

BW/4 t-code RSOADSODTO - Data Tiering Maintenance

Data Tiering is a strategy for placing data into different areas (tiers). The decision on where the data is stored is highly driven by the frequency of data access, data update requirements, performance requirements, and the criticality of the data for the business.

In case the data is very frequently accessed as well as updated, and the data is critical for the business, we put it into the so-called hot tier.

If the data is less often accessed and has lower performance requirements, we put it into the so-called warm tier.

Finally, in case the data is infrequently accessed, does not require updates, can be accessed with a longer response time, and is not critical, we put it into the so-called cold tier.

The concept of hot/warm/cold storage tiers is called a temperature schema.

Placing the data into those areas is in general referred to as Data Tiering Maintenance or Data Tiering Optimization (DTO). The DTO is a feature of BW/4 systems. There is still a possibility to use the Data Archiving Process (DAP), the traditional data archiving approach used in NetWeaver-based BW systems.

In BW/4-based systems, we can leverage an InfoProvider of type aDSO as the object for Data Tiering. From the underlying HANA DB technology, HANA indexserver nodes are used for the hot tier. HANA extension nodes, based on the scale-out technology with relaxed core/memory and memory/disk ratios, are used for the warm tier. In addition, some kind of external storage, like an SAP IQ DB accessible via Smart Data Access (SDA) for query access (DQL) and data manipulation (DML), can be used for the cold tier.

In BW/4, the aDSO needs to have some kind of time characteristic (or any other InfoObject included in the aDSO’s key) as part of its key field definition. Once this is fulfilled, a Partitions part is enabled on the Settings tab of the aDSO definition. Here, either static partitioning (by given hardcoded values) or dynamic partitioning (by the loaded data of the particular characteristic) can be defined. Each partition can be maintained in a different temperature/tier via the button Maintain Temperatures.

Depending on the version of BW/4, a click on this button either opens t-code RSOADSODTO (ABAP program RSO_RES_ADSO_TEMP_MAIN) - Temperature Maintenance, or launches the BW/4 cockpit and its Fiori app (under DataStore Object -> Manage Data Tiering) for the DTO. In either of the two environments, we can choose the tier for particular partitions. Once the tiers are chosen, it is possible to execute the change of the tier for a specific partition/object, depending on how the DTO was defined at the level of the aDSO’s settings. The tier change can also be done in t-code RSOADSODTOEXE (ABAP program RSO_RES_ADSO_TEMP_MAIN_EXEC) – Data Tier Adjustment. Another t-code, RSOADSODTOS (ABAP program RSO_RES_ADSO_TEMP_MAIN_START), is used for the Fiori app calls.

The execution of the data tier change stores logs about it. Those can be reviewed in t-code SLG1 under Object = RSOADSODTO, Subobject = RSOADSODTO_EXEC and Ext. Identifier = ADSO_TEMP_MAINTENANCE*.


More information:

Online docu

BW transport error: Error 2.048 has occurred in the BWA/SAP HANA server

While importing a new BW object like an IO or aDSO across the landscape, the following transport error may pop up in BWonHANA or BW/4-based systems:

 

RSD_TREX100:        column store error: <?xml version='1.0' encoding=' 2048

DBMAN099:   column store error: <?xml version='1.0' encoding='utf-8'?><createCubeResult version='1.0'><status><message>Error during ...

DBMAN099:   executing SQL statement</message><errorCode>2116</errorCode></st

DBMAN901:   Error 2.048 has occurred in the BWA/SAP HANA server

RS_EXCEPTION000: Could not create logical index

 

I recently came across this error while transporting an InfoProvider (an aDSO object) in a BW/4-based system. Every InfoProvider object that runs on HANA DB must have a column view created to access its data. The column view is needed for reporting on the object and is sometimes called a logical index. It is a view on the DB containing all the joins, and it is used when querying the InfoProvider. The view is created in the origin system upon the object’s creation; in any other system the object is moved to, the view is created upon its transport.

The solution is to create the column view manually in the target system of the transport, as its creation failed during the transport. There is an ABAP program RSDDB_LOGINDEX_CREATE that can be used to generate the column view for an aDSO, InfoCube, Open ODS view, CompositeProvider, InfoObject, or MultiProvider. If there are more objects for which the views have to be generated, there is another tool – ABAP program RSDDB_INDEX_CREATE_MASS. Once the view is fixed, the transport’s re-import will finish successfully.


More information:

2286336 - Column view cannot be created on HANA

BWonHANA: InfoProvider column views

2607883 - Checking Column View and Calculation Scenario Errors in BW Queries



BW generation tool

The BW generation tool generates classes used in different data warehouse technical operations, like activating data in a DSO, extracting data from an InfoProvider, etc. Basically, every data operation in BW is performed via some generated piece of code.

Objects that are generated by the BW generation tool usually fall under the 'GP*' namespace. Those generated programs are sometimes called writers, e.g. the aDSO writer, the InfoCube write program, etc.

The 'GP*' objects are generated by the BW generation tool according to templates. The generation is done by the program (generation) classes. A template is a special ABAP include (e.g. RSDSO_ACTIVATE_TMPL) that contains lines starting with a string like '*@'. Upon generation, the template is filled with the actual data of the BW object for which it is being generated.

Technically, there is a t-code RSSGPCLA (Maintain Program Class) where we can maintain which template is used for the particular data operation represented by a program class.

The assignment of program classes (field PROGCLASS, data element RSSG_PCLAS, domain RSSG_PCLAS) to templates (field TEMPLATE, data element RSSG_TEMPL, domain PROGNAME) is stored in table RSSGTPCLA ('BW Generation Tool: Program Class').
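
Using the field names above, the assignment can be listed directly, e.g. in a quick test report:

" List the program class -> template assignment from RSSGTPCLA
SELECT progclass, template
  FROM rssgtpcla
  ORDER BY progclass
  INTO TABLE @DATA(lt_pcla).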

Here are a few examples of different program classes:

RSAPTD1                        Generation of the Program for Transaction Data Transfer

RSDMD_ACTIVATE           Activation of requests for enhanced master data update

RSDRO_ACTIVATE            Activate an ODS Object

RSDRO_EXTRACT             Extraction of ODS Objects

RSDRO_UPDATE 

RSDSO_ACTIVATE            Activation of DataStore Objects (advanced)

RSDSO_OLRPROC             Select Requests for Deletion in Process Type ADSOOLR

RSDSO_ROLLBACK           Deletion of requests from DataStore objects (advanced)

RSDSO_UPDATE              Update Procedures

RSDTPERR                      Error Handling DTP

RSDWTMPLWIDTP            InfoCube Update Program for DTP

RSODSO_ACTIVATE          Activation of DataStore Object Data

RSODSO_RDA                 Template for Realtime Data Acquisition

RSODSO_ROLLBACK         Deletion of request from DataStore objects

 

Furthermore, there are tables per BW object type (e.g. in case of an IO it is table RSDCHABASLOC, in case of an aDSO it is table RSOADSOLOC, etc.) that store the assignment of the generated writer programs to the particular BW object.

There are a few situations that may require maintaining the table RSSGTPCLA via t-code RSSGPCLA, mostly in case the writer program needs to be regenerated, e.g. due to an upgrade of the BW system. This activity is sometimes called 'resetting the generation flag'. It can be performed by hitting the 'Set Status' button located in the toolbar of t-code RSSGPCLA. The status is set to "Generation required". This means that the next time the operation runs, the generated/writer program will be regenerated first. Technically, the setting of the status is done by a call of FM RSS_PROGRAM_GENSTATUS_RESET.

When doing this, it must be ensured that the respective operation is not in progress while the generation flag is being reset for the particular BW activity. Otherwise, the running operation is cancelled and some inconsistencies may occur.




Tuesday, March 22, 2022

Cube-like aDSO object

In my other post, I introduced the different flavors of the aDSO object. In this post, I want to go deeper into one of them – the cube-like aDSO object.

The aim of the cube-like aDSO object is to replicate the behavior of the InfoCube object that we know from the classic (NetWeaver-based) releases of SAP BW. It is used to model the data warehouse layer – the data mart according to the LSA++ architecture. The object does not have a change log (CL) table. There are only two tables underneath:

1. Inbound table (AQ, or AX for cold storage) - stores newly loaded data identified by data load requests, similar to the F table of a classic InfoCube.

2. Active data table (AT) – stores data compressed by activation requests, similar to the E table of a classic InfoCube.

Data is moved from the inbound table to the active one upon its activation (also known as aggregation). Once all the data is activated, there is no data left in the inbound table.

The cube-like aDSO object setup is the following: the “Active Data” and “All Characteristics are Key” checkboxes are checked.

Reporting on cube-like aDSOs is performed as a union of the inbound and active tables. This behavior is referred to as stable navigation. There is no need to activate the data, as both tables are used for querying the data.
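
Conceptually, the union looks like the ABAP SQL below. This is an illustration only - the actual reporting access goes through the generated column view, ZSALES is a made-up aDSO name with made-up fields, and the active data table suffix 2 follows the usual naming convention:

" Inbound table /BIC/AZSALES1 united with active data table /BIC/AZSALES2
SELECT doc_id, amount FROM /bic/azsales1
UNION ALL
SELECT doc_id, amount FROM /bic/azsales2
INTO TABLE @DATA(lt_union).
" queries then aggregate over this union (stable navigation)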

There are a few remarks with regard to the deletion of data in the cube-like aDSO object. Request-based deletion works only if the data is still in the inbound/new table (AQ, i.e. the /BIC/A*1 inbound table). Once the data is activated, it is not possible to delete the request from the aDSO; only a complete content deletion or a selective deletion is possible. If the possibility to delete requests is needed, the data must not be activated.

 

More information:

Online docu - InfoCube

Online docu – data warehouse layer – delta calculation

Flavors of aDSO object

3102582 - Request Based Deletion does not work as expected

3116509 - Deletion of overlapping request for ADSO is not working in case if the request is activated in target of ADSO type cube