Wednesday, June 28, 2023

SAP Data Warehousing solutions as of 2023

Data warehousing is a category of enterprise software that integrates data from different sources, transforms it into a consistent and structured format, and provides users with easy access to the information needed for decision-making.
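The three steps in this definition (integrate, transform, provide access) can be illustrated with a toy extract-transform-load sketch. All source names, field names and the currency rate below are made up for illustration only; this is not SAP code.

```python
# Toy illustration of the data warehousing steps described above:
# integrate data from different sources, transform it into a consistent
# format, and make it available for queries.

def extract(sources):
    """Collect raw records from heterogeneous sources."""
    return [row for source in sources for row in source]

def transform(rows):
    """Normalize differing field names and units into one consistent schema."""
    out = []
    for row in rows:
        out.append({
            "customer": row.get("customer") or row.get("cust_name"),
            # hypothetical USD->EUR conversion to unify the revenue unit
            "revenue_eur": row.get("revenue_eur", row.get("revenue_usd", 0) * 0.92),
        })
    return out

def load(rows, warehouse):
    """Append the harmonized rows to the warehouse store."""
    warehouse.extend(rows)

warehouse = []
erp = [{"cust_name": "ACME", "revenue_usd": 100.0}]   # source 1: ERP-style schema
crm = [{"customer": "ACME", "revenue_eur": 50.0}]     # source 2: CRM-style schema
load(transform(extract([erp, crm])), warehouse)
print(warehouse)
```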

As of Q2 2023, SAP offers three data warehouse solutions, depending on the deployment scenario:

 

A) on-prem & private cloud:

1. SAP HANA SQL DWH (Data Warehousing) – the SAP HANA platform with loosely coupled tools and its platform services (HANA application, integration, processing and database services); a best-of-breed approach for building your own data models

2. SAP BW/4HANA – a packaged data warehousing solution and the successor of SAP BW, with all DWH services in one integrated repository (modeling, monitoring and managing the DWH)

B) public cloud:

3. SAP Datasphere (formerly SAP Data Warehouse Cloud) – a cloud-based data warehousing service

 

Classic SAP Business Warehouse (SAP BW), up to its latest version 7.5, would also belong in section A). However, SAP BW is based on the SAP NetWeaver stack, which is reaching its end of maintenance. As a result, mainstream maintenance for the latest version, SAP BW 7.5, is planned to end on December 31, 2027.

Another perspective distinguishes application-driven from native (SQL-type) data warehouses. SAP Datasphere and SAP BW/4HANA are application-driven solutions, whereas SAP HANA SQL DWH is a native (SQL-driven) type of DWH solution.

Each solution caters to different needs and deployment scenarios, enabling customers to effectively manage and leverage their data assets for improved decision-making and business outcomes.

More information:

BW/4HANA

Datasphere

SQL Data Warehouse

Thursday, June 22, 2023

Upport Downport Overview (UDO) tool

SAP, like any other software vendor, needs to support its software throughout its whole lifecycle. This also involves providing code changes introduced in higher-level releases to lower ones. This process is called downporting. There can be, for example, legal reasons for it.

Downporting is the process of porting an application or software component from a higher version of a platform to a lower version of the same platform. The opposite process, porting software from a lower to a higher version, is called upporting.

In the world of SAP software, downporting and upporting are done using the UDO tool, which simplifies the process by generating a report that is saved in the system in the normal deliverable package and shipped in an SAP Note.

The UDO tool is a continuation of other tools like the Correction Workbench (t-code SCWB), ABAP Split Screen Editor (t-code SE39), Version Management (dev package SVRS), Note Assistant (t-code SNOTE) etc. It compares the active version of an ABAP object in the remote system (the system that has the feature implemented) with the version in the local SAP system (the system where the feature is to be implemented, aka downported). It then collects DDIC changes and documentation changes and prepares a new ABAP program. The ABAP program is usually called UDO_NOTE_X, or in the BW area RS_UDO_NOTE_X, where X stands for the SAP Note number, e.g. RS_UDO_NOTE_2367512. This report is called the UDO report. The UDO report is published in a specific SAP Note on the SAP ONE Support portal and can be downloaded from there. When the UDO report is executed (e.g. in a customer system), it takes care of the DDIC objects and documentation, creating and activating them in the target system.
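The naming convention described above (UDO_NOTE_X, or RS_UDO_NOTE_X in the BW area, where X is the SAP Note number) can be captured in a small helper. This is only a sketch of the convention as stated in the text, not part of the UDO tool itself.

```python
# Illustrative helper for the UDO report naming convention:
# UDO_NOTE_<note number>, or RS_UDO_NOTE_<note number> in the BW area.

def udo_report_name(note_number: int, bw_area: bool = False) -> str:
    """Build the UDO report name for a given SAP Note number."""
    prefix = "RS_UDO_NOTE_" if bw_area else "UDO_NOTE_"
    return f"{prefix}{note_number}"

print(udo_report_name(2367512, bw_area=True))  # RS_UDO_NOTE_2367512
```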

Without the UDO tool, all of these activities would need to be done manually. Of course, no one wants to do this job. On one hand, the responsible developer may forget to write down all the steps needed. On the other hand, the person implementing those steps manually may forget to perform a few of them. The need to automate such activities is therefore obvious.

Technically, the UDO tool is implemented by the ABAP program SAP_LOCAL_DOWNPORT_ASSISTANT, associated with t-code UDO, and it is part of the SAP Software Logistics Toolset.

Wednesday, June 21, 2023

Enhanced Master Data Update

One of the new BW features introduced in BW/4HANA is the Enhanced Master Data Update. It can be enabled for characteristics (attributes and/or texts) type InfoObjects; only characteristics that have master data are supported. This feature enables parallel loading of data into the InfoObject (IO). When the feature is enabled, new tables are created for the IO, similar to the inbound and active tables of an aDSO object. As in the aDSO case, data is first loaded into the inbound table. Upon data activation, the attribute and/or text tables of the IO are updated.

This feature is advantageous when large portions of master data are being loaded and/or loaded from multiple sources, as the data can be loaded in parallel. The delta mechanism is also supported.
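The inbound/active mechanics described above can be sketched conceptually: parallel loads only ever append to the inbound table, and activation merges the accumulated requests into the active master data table keyed by the characteristic value. This is a simplified conceptual model, not the actual BW/4HANA implementation.

```python
# Simplified model of the inbound/active table mechanics of the
# Enhanced Master Data Update (conceptual sketch only).

inbound = []   # requests land here first, possibly from parallel loads
active = {}    # active master data, keyed by characteristic value

def load_request(records):
    """Parallel loaders only append to the inbound table (no contention on active data)."""
    inbound.extend(records)

def activate():
    """Activation: merge inbound requests into the active table; the latest record wins."""
    while inbound:
        rec = inbound.pop(0)
        active[rec["char_value"]] = rec["attributes"]

# Two loads (e.g. from different sources) followed by one activation:
load_request([{"char_value": "M001", "attributes": {"color": "red"}}])
load_request([{"char_value": "M001", "attributes": {"color": "blue"}},
              {"char_value": "M002", "attributes": {"color": "green"}}])
activate()
print(active)
```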

The feature can be enabled in the BW Modeling Tools of SAP HANA Studio, on the General tab of the IO maintenance screen.


Technically, the flag is stored in table RSDCHABAS, column ENHANCED_MD_UPDATE. Its values refer to the domain RSDMDUPDATEENH, which has the following two values:

0        Does not use enhanced master data update

1        Uses enhanced master data update

 

Once the enhanced MD update is enabled for a characteristic, the related settings can be maintained in t-code RSDMD_SETTINGS. The t-code refers to class CL_RSDMD_SETTINGS_CTRL, method MAINTAIN_SETTINGS.


In case of issues with IO activation where the error message complains that the Enhanced Master Data Update cannot be used for characteristic X due to ... (e.g. message no. R7B420, R7B428 etc.), there is an ABAP program RSD_CHA_ADAPT_DEFAULT_ENHMDUPD that fixes them.



Sunday, June 18, 2023

Consuming data from BW or BW/4 in BW based on ODP

In classic BW systems, scenarios where one BW serves data to other BW systems involve the creation of a so-called export DataSource (DS). That DS is created in the source system and follows the naming convention 8*. In the other BW system, where the DS is to be consumed, the export DS is replicated, which makes it visible in the target system. From there, a data flow can be built on top of that replicated DS.

With a BW/4 system as the target, the export DS is obsolete; there is no need to create an export DataSource anymore. By leveraging the ODP technology, objects from the source BW are available in the target directly. A large variety of objects is supported: aDSOs, CompositeProviders, former BW Classic objects (InfoCubes, MultiProviders, Semantically Partitioned Objects, InfoSets, Queries as InfoProviders) and finally InfoObjects as well. For InfoObjects, attributes, texts and hierarchies are supported within the ODP-BW context.

Now, how to get the data from the source BW to the target one? In the target BW, simply replicate the available source BW's ODP-BW context node. This activity is done in SAP HANA Studio. While doing it, a list of available objects from the source BW appears in a popup window. From there, just choose the object(s) whose data needs to be loaded into the target. Afterwards, the newly created DS(s) will appear in the target BW. Their names are the same as the objects in the source, plus a postfix character describing the nature of the object's data:

*$F    - Transaction Data/Facts

*$P    - Master Data/Attributes

*$Q    - Time-Dependent Master Data/Attributes

*$T    - Texts

*$H    - Hierarchies
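The postfix convention above can be captured in a small lookup table. The mapping is taken directly from the list in the text; the helper function itself and the example DataSource name are illustrative only.

```python
# Lookup for the ODP-BW DataSource name postfixes described above.
ODP_POSTFIXES = {
    "$F": "Transaction Data/Facts",
    "$P": "Master Data/Attributes",
    "$Q": "Time-Dependent Master Data/Attributes",
    "$T": "Texts",
    "$H": "Hierarchies",
}

def describe_odp_datasource(name: str) -> str:
    """Return the data category indicated by the DataSource name's postfix."""
    for postfix, meaning in ODP_POSTFIXES.items():
        if name.endswith(postfix):
            return meaning
    return "Unknown ODP DataSource type"

print(describe_odp_datasource("0MATERIAL$T"))  # Texts
```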

 

A few more remarks on working with ODP. Technically, when the replication is triggered from the target to the source BW system, the following Function Modules are called:

RODPS_REPL_ODP_GET_DETAIL

RODPS_REPL_MODEL_GET_LIST

RODPS_REPL_CONTEXT_GET_LIST

 

There is a so-called ODP extractor checker that can be used to test the extraction in the source system. It is a kind of new RSA3 (the t-code used to check extractors in the case of the classic S-API plug-in based extraction). For ODP-based extraction, the ABAP program RODPS_REPL_TEST (ODP Replication Test) serves the same purpose.

In addition, note that ODP has undergone some development and it is recommended to use the latest version, which is ODP Data Replication API 2.0. Version 2.0 is available as of:

SAP_BW 750 SP 0 (incl. former PI_BASIS packages)

DW4CORE 200 SP 0 (incl. former PI_BASIS packages)

 

More information:

Operational Data Provisioning (ODP)

Online docu - Transferring Data from SAP BW or SAP BW/4HANA Systems via ODP (InfoProviders, Export DataSources)

2483299 - BW4SL & BWbridgeSL - Export DataSources (SAP BW/4HANA or SAP Datasphere, BW bridge as Source)

Sunday, June 11, 2023

What is the difference between SAP RISE and SAP GROW (GROW with SAP)?

SAP has been pitching its S/4 cloud solution to its customers very hard for years. However, as per surveys conducted by SAP user groups like ASUG or DSAG, the adoption is not at the level SAP would wish.

Therefore, SAP continues to provide offerings to help customers with their transformation to S/4. It started back in 2021 with SAP RISE, followed at this year's Sapphire (March 21st, 2023) by another offering – SAP GROW (or GROW with SAP).

Let’s look at both offerings in a nutshell.

SAP RISE is an offering that combines SAP S/4HANA Cloud public and private editions, accelerated adoption services (e.g. SAP Readiness Check, Custom Code Migration, etc.), SAP Signavio to manage business processes, and pre-configured best practices. See more information here.

SAP GROW is also built on SAP S/4HANA Cloud public edition, with a focus on midsize companies where business processes are more standardized.

The main difference between them is that RISE with SAP is more flexible and customizable, while GROW with SAP is more standardized and out-of-the-box. RISE with SAP targets larger and more complex businesses and offers both public and private cloud options for SAP S/4HANA. GROW with SAP, on the other hand, focuses on midsize companies and provides only SAP S/4HANA Cloud, public edition.

 

More information:

SAP RISE (RISE with SAP) blog

RISE page

RISE online docu

GROW page

Thursday, June 1, 2023

SAP Smart Data Integration (SDI) and difference vs Smart Data Access (SDA)

HANA, as a data management platform, needs to connect to other data sources to get data. There are many integration patterns to enable such connections: ETL processing, real-time replication, data integration etc. Each of them has its own advantages as well as disadvantages. With the introduction of SAP HANA Smart Data Integration (SDI), SAP is pursuing the best of several of them.

SAP Smart Data Integration (SDI) is an extension of SAP HANA Smart Data Access (SDA). SDA leverages the concept of virtual tables (a data federation approach). This means data replication is eliminated by providing a virtual layer that abstracts the underlying data sources: the metadata of the source system's tables is imported into HANA as virtual tables. On top of SDA, SDI brings a broader set of preinstalled adapters via the Data Provisioning Agent (DP Agent) component. With SDI, the adapters are no longer part of the HANA DB; they run as separate processes in the DP Agent.
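The virtual-table idea behind SDA can be sketched conceptually: the local system stores only metadata, and every read is delegated to the remote source at query time, so no data is copied. The classes, table name and rows below are illustrative only and have nothing to do with the actual HANA implementation.

```python
# Conceptual sketch of data federation via virtual tables:
# metadata lives locally, data stays in and is read from the remote source.

class RemoteSource:
    """Stand-in for a remote system that actually holds the data."""
    def __init__(self, tables):
        self.tables = tables

    def read(self, table_name):
        return self.tables[table_name]

class VirtualTable:
    """Local object holding only metadata; rows are fetched on access."""
    def __init__(self, source, table_name, columns):
        self.source = source
        self.name = table_name
        self.columns = columns        # imported metadata, no rows stored here

    def select(self):
        # Federated read: delegate to the remote source, no replication.
        return self.source.read(self.name)

remote = RemoteSource({"SALES": [{"id": 1, "amount": 10}]})
vt = VirtualTable(remote, "SALES", ["id", "amount"])
print(vt.select())
```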

SDI is part of SAP's Enterprise Information Management (EIM) solution. Its purpose is to connect data sources to HANA, provision data from them and load that data into HANA.

As SDI brings more functions into the EIM process, some customers are opting for it. Therefore, e.g. in the SAP BW area, they are migrating SDA-based DataSources to SDI ones.

 

SDA is available since SAP HANA 1.0 SPS06

SDI is available since SAP HANA 1.0 SPS09


More information:

SDA

SDI online docu

2400022 - FAQ: SAP HANA Smart Data Integration (SDI)

2180119 - FAQ: SAP HANA Smart Data Access

2600176 – SAP HANA Smart Data Access Supported Remote Sources