Saturday, January 4, 2025

Program RSPROCESS vs RSBATCH_EXECUTE_PROZESS

In one of my earlier posts, I described an ABAP program called RSPROCESS. However, there is also a program called RSBATCH_EXECUTE_PROZESS. Both serve a similar purpose in SAP BW systems: they are used to manage certain BW processes. As described in that other blog post, RSPROCESS executes a single process variant of a process chain (a sketch of scheduling it in the background follows). For the various process variant types of a PC, see this post or refer to table RSPROCESSTYPES.
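
As a side note, RSPROCESS can be scheduled as a background job from ABAP. Below is a minimal, hedged sketch; the selection-screen parameter names (p_type, p_varian), the process type value and the variant name are assumptions for illustration only, so check the actual names in SE38 before relying on this.

DATA lv_jobcount TYPE btcjobcnt.
CONSTANTS lc_jobname TYPE btcjob VALUE 'Z_RSPROCESS_DEMO'.

* Open a background job, submit RSPROCESS into it, then release it.
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lc_jobname
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT rsprocess
  WITH p_type   = 'ODSACTIVAT'       "assumed name: process type (see RSPROCESSTYPES)
  WITH p_varian = 'ZADSO_ACT_SALES'  "assumed name: made-up process variant
  VIA JOB lc_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lc_jobname
    jobcount  = lv_jobcount
    strtimmed = 'X'.                 "start the job immediately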

The RSBATCH_EXECUTE_PROZESS program, on the other hand, manages BW processes at the level of individual batch processes. It is often involved in specific data warehousing activities such as data loads, DSO (DataStore Object) activations, and parallel processing. RSBATCH_EXECUTE_PROZESS is typically scheduled as a background job to handle these tasks efficiently.

Selection screen of the RSBATCH_EXECUTE_PROZESS.



Automation for SAP BW configuration tasks

ABAP Task Manager (aka Task Manager for Technical Configuration) is a part of the ABAP stack that executes automation task lists for various configuration tasks in an automated way. It is available in the SAP ABAP Platform/ABAP NetWeaver backend via t-codes STC01 (ABAP task manager for lifecycle management automation) and STC02 (Task list run monitor). Examples of task lists that exist in the Task Manager include automation of the following activities: system copy, post system copy (PCA, Post-Copy Automation), initial system setup, system check, Fiori setup, embedded search, etc. Of course, in SAP BW systems there are also task lists available for SAP BW task automation.



ABAP Task Manager is only the runtime for executing the automation task lists. Most automation tasks also need corresponding automation content, which is offered by SAP as well; the content contains the tasks themselves. Below are listed a few examples of the SAP BW related task lists:

 

SAP_BW_HOUSEKEEPING                         Activities associated with regular BW system maintenance           

SAP_BASIS_BW_OIA_CONFIG

SAP_BW_AFTER_MIGRATION                     Activities following the successful migration of a BW system 

SAP_BW_AFTER_UPGRADE                       Activities following the successful upgrade of a BW system           

SAP_BW_BASIS_COPY_INITIAL_CONFIG      Initial Copy for BW and BW Source Sys – Cleanup and Configuration

SAP_BW_BASIS_COPY_REFRESH_CONFIG    Sys Refresh of BW/BW Source Systems Export/Cleanup/Import/Conf

SAP_BW_BEFORE_MIGRATION                    Activities prior to the migration of a BW system       

SAP_BW_BEFORE_UPGRADE                       Activities prior to the upgrade of a BW system         

SAP_BW_COPY_INITIAL_PREPARE               Preparation for Initial Copy of BW System   

SAP_BW_SETUP_INITIAL_CONFIG               BW Initial Setup Task List 

SAP_BW_SETUP_INITIAL_S4HANA              BW Initial Setup Task List for S4/HANA

SAP_BW4_TRANSFER_CHECK_CLOUD_RMT  Collect and Check BW objs whether they are compatible to BW Bridge

SAP_BW4_TRANSFER_CHECK_CLOUD_SHL   Collect and Check BW objs whether they are compatible to BW Bridge

SAP_BW4_TRANSFER_CHECK_INPLACE        Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CHECK_REMOTE        Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CHECK_SHELL           Collect and Check BW objects whether they are compatible to BW/4

SAP_BW4_TRANSFER_CLOUD_REMOTE        Activities to be performed in original sys of remote BW4Cloud-Transfer

SAP_BW4_TRANSFER_CLOUD_SHELL           Activities to be performed in original sys of shell BW4Cloud Transfer

SAP_BW4_TRANSFER_INPLACE                   Transfer BW objects to be compatible to BW/4

SAP_BW4_TRANSFER_READY4CONV            Transfer IOs & Open Hub Destinations to be compatible to BW/4

SAP_BW4_TRANSFER_REMOTE_PREPARE     Activities to be performed in original system of remote BW4-Transfer

SAP_BW4_TRANSFER_REMOTE_SHELL         Activities to be performed in original system of shell BW4-Transfer

SAP_BW4_TRANSFER_SEM                         Tasks to Transfer SEM-BW objects to be compatible to BW/4

SAP_BW4_TRANSFER_SEM_SHELL              Transfer SEM-BCS objects & BW objects w/o data into remote sys

 

While a task list is being executed, the BW system triggers a job following the naming convention BW_TASK*, e.g. job BW_TASK_20241105082934000005000.
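
A quick way to review such runs is to select the jobs following that naming convention from the standard job overview table TBTCO; a small sketch (the 20-row limit is arbitrary):

SELECT jobname, strtdate, strttime, status
  FROM tbtco
  WHERE jobname LIKE 'BW_TASK%'
  ORDER BY strtdate DESCENDING, strttime DESCENDING
  INTO TABLE @DATA(lt_jobs)
  UP TO 20 ROWS.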

 

More information:

Automated Initial Setup of ABAP Systems Based on SAP NetWeaver

ABAP Post-Copy Automation for SAP BW Configuration Guide

1829728 - BW housekeeping task list

3349077 - [BW Central KBA] Systemcopy / Refresh


Tuesday, December 31, 2024

SAP Change Data Capture (CDC)

Change Data Capture (CDC) allows identifying and capturing changes made to data within SAP application tables in the SAP system. Those data changes can then be replicated in real time to other systems or processes. It's a way to keep data synchronized across different environments, enabling real-time analytics, data warehousing, and other data-driven initiatives.

There are different techniques for implementing CDC in SAP environments:

Log-Based CDC: This approach reads the transaction logs of the SAP database to identify changes. It's generally considered the most efficient method with minimal impact on the source system.

Trigger-Based CDC: This method uses database triggers to capture changes as they occur. Triggers are database objects that automatically execute a predefined action in response to a specific event (e.g., an insert, update, or delete operation).

Table-Based CDC: This technique involves comparing snapshots of tables at different points in time to identify changes. It's less efficient than log-based or trigger-based methods but can be used in situations where those options are not available (see the sketch below).
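
To make the snapshot comparison idea concrete, here is an illustrative ABAP sketch; the row structure and the snapshots are made up, and loading them is omitted:

TYPES: BEGIN OF ty_row,
         matnr TYPE char18,
         price TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_row.
DATA: lt_old TYPE SORTED TABLE OF ty_row WITH UNIQUE KEY matnr, "yesterday's snapshot
      lt_new TYPE STANDARD TABLE OF ty_row.                     "today's snapshot

LOOP AT lt_new INTO DATA(ls_new).
  READ TABLE lt_old INTO DATA(ls_old) WITH TABLE KEY matnr = ls_new-matnr.
  IF sy-subrc <> 0.
    "record exists only in the new snapshot -> insert
  ELSEIF ls_old-price <> ls_new-price.
    "record exists in both snapshots but differs -> update
  ENDIF.
ENDLOOP.
"deletions: records in lt_old without a counterpart in lt_new (analogous loop)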

SAP has several tools and technologies that leverage CDC, including:

SAP Data Services: A data integration and data quality platform that includes CDC capabilities.

SAP Landscape Transformation (SLT) Replication Server: A tool for real-time data replication from SAP systems to SAP HANA.

There are many 3rd party tools that use SAP CDC as well.


Now, when it comes to extracting data via CDS (Core Data Services), aka ABAP CDS views, trigger-based CDC is used as well. Specific annotations need to be added to the CDS view in the ABAP Development Tools (ADT). The annotations allow the CDS view to use a trigger-based CDC recording mechanism to load data and record changes to the tables that belong to the view, as the example below sketches.
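
For illustration, a minimal extraction-enabled ABAP CDS view entity could look as below. This is only a sketch: the view name is made up, and it assumes a simple projection on table VBAK whose key maps 1:1 to the table key (a prerequisite for the automatic CDC delta).

@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Sales orders - CDC extraction (example)'
@Analytics.dataExtraction.enabled: true
@Analytics.dataExtraction.delta.changeDataCapture.automatic: true
define view entity ZI_SalesOrderCdc
  as select from vbak
{
  key vbeln as SalesOrder,   // key must map to the table key for automatic CDC
      erdat as CreatedOn,
      @Semantics.amount.currencyCode: 'TransactionCurrency'
      netwr as NetValue,
      waerk as TransactionCurrency
}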

PS: Do not confuse SAP CDC (Change Data Capture) with other SAP products like SAP Customer Data Cloud.

More information:

Online documentation


Sunday, December 29, 2024

Exposing SAP data with a Note 3255746 in mind

In a nutshell: with an update of Note 3255746, SAP stops customers and 3rd party applications from using the RFC modules of the ODP Data Replication API to access and extract SAP data from ABAP sources, as these APIs are intended for SAP's internal use only. This is, in effect, SAP's ban on the RFC usage.

This means it is no longer permitted to use the ODP RFC modules to access ABAP CDS views directly. Instead, the OData API for ODP provided by SAP for data extraction should be used; this API is stable and recommended for all customer and third-party applications (an illustrative request pattern follows). Furthermore, SAP advises customers “to use SAP Datasphere for realizing data replication scenarios to move data from various SAP sources (such as SAP S/4HANA, SAP BW, SAP ECC sources etc.) into third-party applications & tools.”
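
For illustration only, extraction via a generated ODP OData service follows a request pattern like the one below; the service and entity set names here are hypothetical and depend on the service generated in the system.

Initial full load (the response carries a delta link for later calls):

GET /sap/opu/odata/sap/ZODP_SALES_SRV/FactsOfZSALES?$format=json

Subsequent delta request using the returned delta token:

GET /sap/opu/odata/sap/ZODP_SALES_SRV/FactsOfZSALES?!deltatoken='D20240201120000'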

The restriction was introduced on February 2nd, 2024, when SAP updated the mentioned Note. The Note originated sometime around 2022, when it stated that the ODP RFC methods were “unsupported”; the February 2024 update turned “unsupported” into “unpermitted”.

Impact on 3rd party tools / Target platforms

SNP/DataVard Glue - no impact, as the tool uses a technology that is independent of the restricted SAP APIs. More info.

Theobald Xtract – impacted; Theobald claims it will enhance its tool to use the OData API (alongside the RFC modules it currently utilizes) sometime around Q4 2024. More info.

Init Software – the Init ODP Source Connector is impacted. Use the Init OData v2 Source / Sink Connector instead. More info.

Databricks – impacted, as their lakehouse ingests data from many SAP sources. More info.

Azure Data Factory (ADF) / Azure Synapse Analytics - these MS Azure data ingestion platforms are impacted, as the Azure Data Factory SAP CDC connector uses the ODP framework. Use the OData API based connector available in Azure instead. More info.

Qlik Replicate – impacted. More info.

Google Cloud Data Fusion – as there are many GCP connectors for SAP (e.g. BigQuery, SAP BW Open Hub Batch Source, SAP OData, SAP ODP, SAP SLT Replication, SAP Table Batch Source), some of them are impacted. Each specific case needs to be reviewed, and a new integration built based on the OData API where impacted. More info.

AWS AppFlow – there are several connectors available in AWS AppFlow. In case the “Amazon AppFlow SAP OData connector” is used, there is no impact, as it leverages the OData API.

HVR/Fivetran - Fivetran's SAP NetWeaver connector uses RFC calls, thus it is impacted. Most likely a new Fivetran ODP connector based on the ODP OData API will be released soon. More info.

Snowflake – depends on which tool is used to replicate the data to Snowflake.

Notice that there are many other tools used to expose SAP data that are not listed in here.

Also, there is the big question of whether this action by SAP is legally binding on its customers. It is a big topic, actually. What drives it is the signed SAP Software Use Rights document. Its standard version (note that the document is subject to evolution and depends on the SAP version, etc.) clearly says that asynchronous indirect access to SAP data is not licensed without SAP Digital Access, BW or OpenHub. Thus, one can believe that even if SAP puts auditing/tracking mechanisms in place and proves that a particular customer is violating Note 3255746, it cannot do much about it. But again, a disclaimer: this is not legal advice at all! Always consult your SAP account manager, and also reach out for professional help, e.g. from companies providing SAP license consulting.

 

More information:

3255746 - Unpermitted usage of ODP Data Replication APIs

ODP-Based Data Extraction via OData

Guidance for Partners on certifying their data integration offerings with SAP Solutions

Thursday, October 31, 2024

How to clear cache in SAP Analysis for Microsoft Office

When SAP Analysis for Microsoft Office (AfO) is reinstalled or upgraded, several errors can pop up that make it impossible to reuse/refresh the existing reports. Basically, AfO crashes or freezes with errors like the following.

 

"An exception occurred in one of the data sources. SAP BI Add-in has disconnected all data sources. (ID-111007)"

"Nested exception. See inner exception below for more details."

 

The root cause of errors like these is that the upgrade/uninstall process does not clear the AfO cache. To correct it, the cache needs to be cleared manually.

First folder of the cache to look for to be cleared is:

"c:\Users\<USER_ID>\AppData\Roaming\SAP AG\SAP BusinessObjects Advanced Analysis\cache\"

In this folder, the cache files are located; they follow a naming convention like this:

<SID>.cache

 

The other folder to look for is the COF (Common Office Framework) directory under %APPDATA%, accessible via:

"%APPDATA%\SAP\Cof"

which resolves to the folder:

"c:\Users\<USER_ID>\AppData\Roaming\SAP\Cof"

 

Once the cache is cleared, start AfO from the Windows Start menu (All Apps -> SAP Business Intelligence -> Analysis for Microsoft Office) and it should be possible to refresh the AfO reports.

 

More information:

2979452 - An exception occurred in one of the data sources. SAP BI Add-in has disconnected all data sources [1e04-3ff5-15]

AfO wiki

Friday, October 25, 2024

Scheduling a process chain in an alternate time zone

BW systems offer an option to run a process chain in a different time zone. It is available in the start variant of the PC, via a checkbox called “Use Alternative Time Zone”. Once it is checked, a new field for the time zone shows up.

The feature can be useful when the BW admin wants to schedule the chain in a time zone other than the one the BW system runs in; the alternative time zone can be used instead.

Once an alternative time zone is specified for a specific PC's start variant, it is saved in field TMZONE of table RSPCTRIGGER (a quick way to read it is sketched below).
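
A small hedged sketch of reading it; the variant name is made up, and the key field name VARIANTE is an assumption following the usual RSPC table naming, so verify it in SE11:

SELECT SINGLE tmzone
  FROM rspctrigger
  WHERE variante = 'ZPC_START_DAILY'
  INTO @DATA(lv_tmzone).
IF sy-subrc = 0 AND lv_tmzone IS NOT INITIAL.
  WRITE: / |Start variant uses alternative time zone { lv_tmzone }|.
ENDIF.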

The same functionality is leveraged in standard SAP background jobs.



Monday, September 30, 2024

Different product lines of SAP BW

In some cases there is confusion about the versions of SAP BW introduced over the years (the 1st version appeared around 1998). Let me briefly sort this out. This blog post is not comprehensive, but it tries to set the naming convention of the major BW releases straight.

 

1. SAP Business Warehouse Classic (classic BW) aka SAP NetWeaver based Business Warehouse (component SAP_BW), runs on any DB, see details:

SAP Business Warehouse 3.5 part of SAP NetWeaver 04

SAP Business Warehouse 7.0 part of SAP NetWeaver 2004s (NW’04s) aka NetWeaver 7.0

SAP Business Warehouse 7.3

SAP Business Warehouse 7.4

SAP Business Warehouse 7.5

These versions of BW are sometimes referred to as “SAP NetWeaver, all versions”, aka BW 7.x.

 

2. SAP Business Warehouse powered by SAP HANA aka BW on HANA (component SAP_BW), runs on SAP HANA DB only, see details here

 

3. SAP BW/4HANA (component DW4CORE), see details here or here. BW/4HANA was based on BW 7.5 but redeveloped; many components were removed, and it is not based on the NetWeaver stack anymore.

 

If the term classic BW is used, what is meant by that is BW based on the SAP NetWeaver stack, i.e. all the versions starting with 3.5 up to and including BW on HANA. The difference between 7.x and BW on HANA is that 7.x supports any database, while BW on HANA runs on the HANA DB only. (A quick component check is sketched below.)
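
If in doubt about which product line a given system belongs to, a simple sketch like this reads the installed software components from table CVERS:

SELECT component, release
  FROM cvers
  WHERE component IN ( 'SAP_BW', 'DW4CORE' )
  INTO TABLE @DATA(lt_comp).
LOOP AT lt_comp INTO DATA(ls_comp).
  WRITE: / |{ ls_comp-component } release { ls_comp-release }|.
ENDLOOP.
"SAP_BW on a non-HANA DB -> classic BW; SAP_BW on HANA -> BW on HANA; DW4CORE -> BW/4HANA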

 

More information:

Short history of SAP BW

SAP BW/4HANA (B4H) versions


Monday, August 12, 2024

SAP S/4HANA Cloud Public vs Private Edition?

SAP S/4HANA Cloud is an enterprise resource planning (ERP) suite offered by SAP, and it comes in two primary deployment options: Public Edition and Private Edition. Each offers different features, levels of customization, and deployment flexibility to cater to various business needs. In general, below is a breakdown of the differences between the two:


SAP S/4HANA Cloud Public Edition is a better deal for organizations that want a standardized, efficient, and quickly deployable ERP solution with minimal customization needs.

On the other hand, the Private Edition is better suited for organizations that require a highly customized ERP environment, need control over their system updates, and are willing to invest in a more flexible and powerful deployment model.

3 Tier Model to get to ABAP Cloud

Customers who want to migrate their SAP ERP systems to the cloud need to embrace the cloud from the ABAP perspective too. This shift is needed because on-premise SAP ERP systems offer classic ABAP extensibility options (user/customer exits, BAdIs, enhancement points, modifications, appends, structures, menu exits, etc.), all of which were used to tailor SAP systems to specific business requirements. Since the introduction of the cloud, however, the classic ABAP extensibility options are not supported anymore in cloud-based SAP ERP systems.

Apparently, the majority of SAP customers won't start their move to the cloud with a new greenfield implementation of an ERP system such as S/4HANA Cloud. Therefore, SAP had to come up with something that enables the cloud transition for existing customers running their ERP systems on premise: the 3-tier Extensibility Model. Its purpose is to enable the transition from Classic ABAP to ABAP Cloud, and it is also intended to manage the coexistence of these different extensibility models.

Remember the much-used term "clean core"? It means an up-to-date, transparent, unmodified SAP system; all these adjectives describe a system that is cloud compliant. The reason it is important is that in the cloud all customers use the same base code line and changes are applied to all customers simultaneously. Therefore, there is no way to allow each individual customer to implement enhancements the same way they could in their on-premise systems.

 

Tier 1 – Cloud development: the default choice for all new extensions and new custom applications, following the SAP S/4HANA Cloud extensibility model. The goal is to get to this tier from the lower tiers.

 

Tier 2 – Cloud API enablement / API layer: if there are any objects (BAPIs, classes, function modules, Core Data Services) that are not yet released by SAP but are required in tier 1, a custom wrapper is created for them; by this, missing local public APIs or extension points are mitigated. The custom wrappers are built and released for cloud development. Once an SAP-released public local API exists, the custom one can be set to obsolete and removed. The ABAP Test Cockpit (ATC) can be leveraged here to enforce the ABAP Cloud guidelines; violations of the ABAP Cloud rules can also be managed via ATC exemptions. A minimal wrapper sketch follows.
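
A minimal sketch of such a tier 2 wrapper is below. The class name is made up, and the wrapped object here is simply a read on the classic table MAKT standing in for any unreleased API; tier 1 code would then depend only on the wrapper:

CLASS zcl_api_material DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS get_description
      IMPORTING iv_matnr       TYPE matnr
      RETURNING VALUE(rv_desc) TYPE maktx.
ENDCLASS.

CLASS zcl_api_material IMPLEMENTATION.
  METHOD get_description.
    "Classic access is encapsulated here; once SAP releases a public
    "local API, only this wrapper needs to change.
    SELECT SINGLE maktx FROM makt
      WHERE matnr = @iv_matnr
        AND spras = @sy-langu
      INTO @rv_desc.
  ENDMETHOD.
ENDCLASS.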

 

Tier 3 – Legacy development / classic ABAP development: classic extensibility based on classic ABAP custom code that is not supported in the ABAP Cloud development model, e.g. BAPIs, user exits, modifications, SAP GUI, file access, reports writing to the GUI, etc. The goal is to avoid developments in this tier and follow the ABAP Cloud development model. For a customer at this stage, the classic objects are to be modernized and moved to tier 1; they need to be reworked one by one, as there is no tool for that.

 

Now, when it comes to the actual (re)development of objects in the particular tiers, a concept of software components is used. By creating its own component, an object is separated from the others (e.g. from non-clean-core components; remember clean core). This is because the component applies the stricter ABAP Cloud rules to its objects, so the separation is needed.

For all the details on how to work with objects within a specific tier, follow the official SAP guidelines below.

 

More information:

Clean Core

ABAP Cloud API Enablement Guidelines for SAP S/4HANA Cloud, private edition, and SAP S/4HANA - overview

ABAP Extensibility Guide - overview

ABAP Cloud - How to mitigate missing released SAP APIs in SAP S/4HANA Cloud, private edition and SAP S/4HANA – The new ABAP Cloud API enablement guide

SAP S/4HANA Extensibility: All You Need to Know

Wednesday, August 7, 2024

MERGE process in SAP BW

In SAP BW systems running on the SAP HANA database there is a process called merge. Data written into (BW) objects/tables is first saved in an uncompressed delta index. The merge process transfers the uncompressed delta index data to performance-optimized storage. To fully leverage that performance-optimized storage, the merge process must be started at regular intervals. If the data just stayed in the delta index storage, memory consumption would increase (delta storage tables may become very large), which could lead to system instability.

The merge itself can be executed for several reasons. There is a term, merge motivation, for the mechanism by which a delta merge operation is triggered. Depending on how the merge is triggered, the SAP HANA DB distinguishes the following types of merge (see the sketch after this list for the SQL-triggered variants):

Delta merge - operation in the column store of the SAP HANA database that moves changes collected in the delta storage to the read-optimized main storage.

Smart merge – delta merge operation triggered by a request from an application. In SAP BW this is referred to as DataStore smart merge.

Hard merge – delta merge operation triggered manually by SQL statement.

Forced merge – delta merge operation triggered manually by SQL statement that passes an optional parameter to execute the delta merge regardless of system resource availability.

Memory-only merge - delta merge operation triggered manually by SQL statement that passes an optional parameter to omit the final step of the delta merge, that is, persisting main storage to disk.

Auto merge – merge executed automatically by the database system.
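
The SQL-triggered variants can be sketched from ABAP via ADBC as below; the table name is a made-up example of an aDSO active table, and the HANA SQL statement used is MERGE DELTA OF:

DATA(lo_sql) = NEW cl_sql_statement( ).
TRY.
    "hard merge: triggered manually, independent of the auto merge decision
    lo_sql->execute_ddl( `MERGE DELTA OF "/BIC/AZSALES2"` ).
    "forced merge: runs regardless of system resource availability
    lo_sql->execute_ddl( `MERGE DELTA OF "/BIC/AZSALES2" WITH PARAMETERS ('FORCED_MERGE' = 'ON')` ).
  CATCH cx_sql_exception INTO DATA(lx_sql).
    MESSAGE lx_sql->get_text( ) TYPE 'I'.
ENDTRY.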

 

The smart delta merge trigger in SAP BW can be set at DTP level: in the DTP maintenance t-code RSDTP, on the Update tab, there is a checkbox called 'Trigger Database Merge'. There is also a process type in Process Chains called 'Trigger the delta merge' that can be used to trigger the merge process.

A hard delta merge after request deletion in an aDSO can be driven by the RSADMIN table parameter 'RSDSO_HARD_MERGE_AFTER_REQDEL'. It can be set to X to enable the hard delta merge after request deletion (see the sketch below).
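
The parameter can be maintained with the standard report SAP_RSADMIN_MAINTAIN. A hedged sketch follows; the selection-screen names (object, value, insert) are assumptions based on the report's usual layout, so verify them in SE38:

SUBMIT sap_rsadmin_maintain
  WITH object = 'RSDSO_HARD_MERGE_AFTER_REQDEL'
  WITH value  = 'X'
  WITH insert = 'X'   "assumed radio button: create the parameter
  AND RETURN.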

 

More information:

SAP HANA Delta Merge Related To SAP BW/4HANA

3481267 - Hard delta merge after request deletion

Tuesday, August 6, 2024

'accounts.sap.com' vs 'account.sap.com'

With the introduction of SAP Universal ID (UID), there are two different sites for managing SAP user accounts. One is https://accounts.sap.com and the other is https://account.sap.com. Kindly note the difference: the singular 'account' vs the plural 'accounts'.

The difference between the two is in which account types they are supposed to manage: S-user/P-user accounts and Universal IDs are managed through different platforms.

S-user/P-user type of the accounts are managed via singular 'accounts' - https://accounts.sap.com (Profile Management).


Whereas the SAP Universal ID are managed via plural 'account' - https://account.sap.com (Universal ID Account Manager).


Users can use the respective sites to manage passwords and account information too. For example, a user who wants to reset a UID password needs to make sure 'account.sap.com' is used. On the other hand, to reset an S-user or P-user ID password (if that ID is not linked to a UID), the 'accounts.sap.com' site would be used.

 

More information:

SAP Universal ID (UID)

Something about SAP Service Marketplace (S-user or S*user) ID

Thursday, August 1, 2024

What are 1972 requests in BW?

In my earlier post about RSPM (Request Status Process Management) I mentioned special data load request numbers (so-called TSNs - Transaction Sequence Numbers). One of them is called the housekeeping request. The ID of those requests starts with 1972, thus they are sometimes called 1972 requests, TSNs from year 1972, or 1972-replacement requests.

Request format:

{1972-XX-XX XX:XX:XX XXXXXX XXX}

Example of such a request can be:

{1972-01-01 01:00:00 001359 CET}      

I am not sure why the particular year 1972 was chosen. Perhaps it has something to do with the fact that SAP was founded in 1972? Probably not :-)

Example of the 1972 request as it was seen in RSMNG t-code:

More information:

Request Status Process Management 

BW request types: RSSM vs RSPM


Tuesday, July 30, 2024

DTP: Track records after failed request

There is an option available for DTPs on the Update tab called “Track Records after Failed Request”. When it is checked, the BW system builds a cross-reference table when an upload request fails; the table traces the erroneous records of the data load.


This option can only be selected when error handling is set to “Terminate request; no record tracing; no updating”, i.e. when error handling is deactivated.

This setup helps data load performance because there is no tracking of error records during the data load process.

Normally, when the option is checked for a particular DTP, a warning message is shown as below:

'Attribute 'Automatically Switch Record Tracking On: Value 'X' is obsolete'. RSBK453


This message is shown because certain checks are executed by the method _CHECK_GENERAL (of class CL_RSBK_DTP_V). This method calls another one, CL_RSBK_DTP_API=>ADMISSIBLE_GET, which populates the value. If the checkbox is enabled, the value equals the attribute CL_RSBK_DTP_API=>C_S_ATTRIBUTE-TRACK_RECORDS.

Just to add: SAP recommends (e.g. here for BW/4HANA or for Datasphere) activating error handling only if errors occur during execution of the DTP, so not by default. If error handling is activated, the data records with errors are written to the data transfer intermediate storage, where they can be corrected before being written to the data target using an error DTP.


Saturday, July 27, 2024

DTP: No filter defined for mandatory selection field

An error message like the ones below can be displayed on an attempt to run or activate a DTP:

Filter for field xxx (InfoObject xxx) is mandatory; Input a selection

No filter defined for mandatory selection field xxx


Does it mean that there can be a mandatory field in the DTP filter? Well, it depends...


1. If the BW is a classic BW system or a BW/4HANA 1.0 based system, and the source object of the DTP is a CompositeProvider (HCPR), DTP filter routines are not checked for mandatory fields. This can be solved by implementing SAP Note 2438744 (see also Note 3375464).

 

2. If the source object is a calculation view based on an HCPR (and the BW system is based on NetWeaver or any BW/4HANA version), there are a couple of Notes to be implemented. Start with Note 2813510.

 

More information:

2438744 - DTP Filter Routine are not checked for mandatory fields in combination with CompositeProvider as source

3375464 - Error "Filter for field /BIC/FXXXXXXX (InfoObject XXXXXXX) is mandatory" occurs while activating the DTP.

2813510 - No filter defined for mandatory selection field

2760751 - DTP: "No filter defined for mandatory selection field XYZ" displays during the Check / Activation of the DTP which has the target as SPO.

Sunday, June 30, 2024

Possibilities of BPC package prompt's UI

There are a few options how to setup an input given by a user by a popup windows of DataMart script (DS).

The first point of view is whether the input fields in the prompts should support browsing of BPC dimension hierarchies.

If hierarchy browsing is not required, the prompt can be set up as a simple text field; this means the PROMPT command is of type TEXT.

DS:

PROMPT(TEXT,%SRC_TIM%,"Enter Source TIME","%TIME%")

TASK(/CPMB/DEFAULT_FORMULAS_LOGIC,REPLACEPARAM,SRC_TIM%EQU%%SRC_TIM%)

 

In case hierarchy browsing is needed, the PROMPT command is of type COPYMOVEINPUT:


DS:

PROMPT(COPYMOVEINPUT,%SRC_TIM%,%TGT_TIM%,"Select the TIME members from to","TIME")

TASK(/CPMB/DEFAULT_FORMULAS_LOGIC,MEMBERSELECTION,SRC_TIM%EQU%%SRC_TIM%%TAB%TGT_TIM%EQU%%TGT_TIM%)

 

A disadvantage of the above option supporting hierarchy browsing is that, in case several dimension values are to be entered, each is placed on a new pop-up window.

In case the user prefers to have one pop-up window with all the values, the following option can be used. It is again the PROMPT command, of type SELECT:



DS:

PROMPT(SELECT,%SELECTION_IN%,,"Select data to copy","TIME,VERSION")

TASK(/CPMB/FX_RESTATMENT_LOGIC,REPLACEPARAM,SELECTION_IN%EQU%%SELECTION_IN%)

 

More information:

TEXT Prompt() Command

COPYMOVEINPUT Prompt() Command

SELECT Prompt() Command


Saturday, June 1, 2024

SAP Datasphere (and BW bridge) limitations

When comparing SAP Datasphere to the traditional SAP BW system, several technical limitations become evident. Here are some of them:

 

1 Mature Feature Set / Functionality Gap

SAP BW has a more mature and extensive feature set due to its long history, including advanced data modeling, ETL capabilities, and built-in analytics, which SAP Datasphere may lack. Some examples:

·        OLAP Engine/BW Analytic Manager functionality not supported, e.g. no analysis authorizations, no query as InfoProvider, no query execution, no calculation of non-cumulative key figures (inventory management)

·        No add-ons (e.g. SEM-BCS) supported

 

2 Data Modeling related to SAP HANA

·        Generation of External SAP HANA Calculation Views not possible

·        Not possible to use SAP HANA Calculation Views as part of Composite Provider

·        No planning scenarios supported

·        Temporal joins in CompositeProvider not supported

·        Many process types used in process chains are not supported (e.g. ABAP program execution, archiving, job event, BODS-related processes, etc.)

·        Ambiguous Joins not supported

·        BADI Provider as PartProvider not supported

·        Open ODS View without calculation scenario not supported

 

3 Data Integration

·        Connections to source systems are supported only via ODP technology and push scenarios.

 

4 Performance and Scalability

·        Current Datasphere instance sizes are limited to 4 TB, which may not be enough for organizations that run bigger BW systems than that.

 

5 Reporting and Analytics

·        No BW (BEx) query support

·        No user exit BW (BEx) variables support

·        No unit conversion support

·        No constant selection support in BW (BEx) reports

·        No BW (BEx) Query variables support in DTP’s filters

 

6 Application development

·        Application development is not supported; applications are to be built using SAP BTP instead.

 

7 UI/UX

·        No SAP GUI access

 

 

These technical limitations highlight the areas where SAP Datasphere is still catching up to the more established and mature SAP BW system. As SAP continues to develop Datasphere, many of these limitations may be addressed over time.

Tuesday, April 30, 2024

aDSO: Validity and Reference-Point tables

As mentioned in my post Storage data target type in BW Request Management, there are several tables that store data for aDSO objects. Besides the very well known ones, like the active data, inbound, and change log tables, there are also others.

In case the aDSO is of type Inventory:



There are also the below two tables available (followed by their naming convention):

Validity Table: /BIC/A<technical name>4

Reference Point Table: /BIC/A<technical name>5

 

The purpose of an inventory-enabled aDSO is to manage non-cumulative key figures. A non-cumulative measure, in the context of data analysis or statistics, is a metric that does not accumulate or aggregate over time or across categories; in other words, it represents a single point or snapshot value rather than a total or sum.

E.g. if there is a need to analyze sales data for a particular day, the number of units sold on that day would be a non-cumulative measure: it doesn't consider sales from previous days, it just reflects the sales for that specific day.

Non-cumulative measures are often used when you need to examine data at a specific point in time or within a specific category without considering historical or cumulative values. They are particularly useful for analyzing trends, patterns, or comparisons within discrete units of analysis.
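
The two tables can also be inspected directly. A small sketch with a made-up aDSO technical name ZSTOCK, using the suffix convention from above:

SELECT COUNT(*) FROM ('/BIC/AZSTOCK4') INTO @DATA(lv_validity).   "validity table
SELECT COUNT(*) FROM ('/BIC/AZSTOCK5') INTO @DATA(lv_refpoint).   "reference point table
WRITE: / |Validity rows: { lv_validity }, reference-point rows: { lv_refpoint }|.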

The tables are also available via t-code RSMNG in the Utilities menu: Display Validity table and Reference-Point table: