Tuesday, July 30, 2019

BPC in Standard Mode or Embedded Mode - difference

There is a lot of information on the web about the different flavors of SAP BPC - the Business Planning and Consolidation solution. I wanted to summarize the main differences between the two main flavors, plus a bit of information about the newest flavors. By no means is this blog post intended to provide a comprehensive insight - just a basic overview and the differences between the two. For much more detailed information refer to the blogs listed under the More information part. This blog is mostly an abstract of the two blogs mentioned there.

1. BPC Standard Model (or Classic): a planning solution which has its own functionality that creates and manages cubes in BW. The BPC standard is a planning (which also means reporting and consolidation) solution based on BW technology, mainly designed to be used by the LoB. The technical BW objects needed (like InfoObjects, InfoCubes, etc.) are generated and controlled by BPC and are not directly exposed in BPC. BPC introduces BPC-specific concepts different from the BW concepts. Thus, in the BPC standard model one has to copy over all master data, hierarchies and transaction data from BW to BPC and align the copied data with the BPC concepts. In this sense the BPC standard is a data mart solution. To support this, the BPC standard re-implemented a lot of functionality that already exists in BW, although in the BPC way. The BPC Standard is sometimes referred to as Business Planning and Consolidation, version for SAP NetWeaver (BPC NW).

2. BPC Embedded Model: a planning solution based around BW's IP (Integrated Planning) functionality. The guiding principle of this approach is: do not copy the data, but use the BW objects and features instead. Therefore, instead of bringing data over to BPC (as in the BPC standard model), it leverages the existing data in BW. By nature, BW is more IT-driven, whereas the BPC standard model is driven by the LoB.
The BPC Embedded is sometimes referred to as BW-IP/PAK (Planning Applications Kit).
Both of these flavors of BPC run on BW – they just use different features of the platform and have different design approaches.
With an S/4HANA system there are even more options for how to run a Business Planning and Consolidation solution. In S/4HANA “Simple Finance” there is Integrated Business Planning for Finance (IBPF) available. The purpose of IBPF is to leverage ERP objects for consolidation. Later, IBPF was renamed to “BPC optimized for HANA” or “SAP BPC optimized for S/4HANA”.

3. BPC Optimized: this is the BPC Embedded version 10.1 installed on the BW engine present in the S/4HANA Finance system. BPC Optimized is only available with S/4HANA. Its first purpose is to replace the planning functions from FI/CO, which are no longer present in S/4HANA Finance.
This flavor of BPC is also called Real-Time Consolidation (RTC): a consolidation solution based on the deep integration between SAP S/4HANA and SAP Business Planning and Consolidation (SAP BPC). RTC takes both data quality and flexibility into account. It has privileged direct access to universal journal entries, while leveraging the consolidation capabilities of SAP BPC. Unified staging and storage of finance data achieves high data quality and eliminates redundancy.

4. SAP BCS FOR SAP BW/4HANA (BCS/4HANA or BCS4HANA): although this is not directly related to BPC, the BCS (Business Consolidation) is a solution that replaces SEM-BCS. It supports the automation of the financial group close (consolidation). BCS4HANA is a subset of the functionality of the software components SEM-BW and FINBASIS related to consolidation.

5. SAP Business Planning and Consolidation, version for SAP BW/4HANA (SAP BPC 11.1 version for SAP BW/4HANA, BPC4HANA): both models (Standard and Embedded) can run in one system. If an organization wants the planning tool to be managed in a centralized way -> Embedded; if it shall be run by the LoB (users) -> Standard. That means planning and consolidation functionalities are now together (BI-IP (or BW-IP) and PAK are included in the Embedded model now), so SAP calls it “Simplicity” – Simplified BPC.

More information:
Concepts compared: BPC standard and BPC embedded
Practical differences – BPC standard vs Embedded BPC (BW-IP/PAK)

Monday, July 29, 2019

Not possible to change DTP package size

I recently faced a situation where it was not possible to change the package size of a DTP. I opened the DTP in edit mode, but the package size field was disabled for changes. I tried to reactivate the DTP, but it did not help either. I also tried to delete the data requests that were produced by that DTP from the InfoProvider, and again it did not help. Finally, I removed the semantic groups from the TRFN and/or DTP, and after that it was possible to change the package size in the DTP UI; the field became editable.

I researched this topic a bit and came across a few SAP Notes (see below) that discuss this situation. There is a report RSBKDTPREPAIR_MAXSIZE available that determines such “erroneous” DTPs and corrects them by re-activating them. It is possible to run the report in simulation mode to get a list of the affected DTPs. In addition, the report can prepare a BW transport (or run without one), and it can also be run for selected DTPs only.
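If the repair needs to be triggered from another program, a minimal sketch is a plain SUBMIT. The report's selection screen is displayed so that the simulation flag and the DTP selection can be set manually; their technical parameter names are not hardcoded here, since they may differ by release:

```abap
* Trigger the repair report for DTPs with an erroneous package size.
* VIA SELECTION-SCREEN displays the selection screen so simulation
* mode and the DTP selection can be chosen before execution.
SUBMIT rsbkdtprepair_maxsize VIA SELECTION-SCREEN AND RETURN.
```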

More information:
1521135 - DTP package size is too large (more than two billion)
1595541 - Extension of the report RSBKDTPREPAIR_MAXSIZE

What is SAP Analysis for Microsoft Office, edition for SAP Analytics Cloud

As per SAP's "BI Convergence Strategy 2018", there is an edition of SAP Analysis for Microsoft Office (AfO) that works with SAP Analytics Cloud (SAC). It is called "SAP Analysis for Microsoft Office, edition for SAP Analytics Cloud" - "AfO-SAC". This means that AfO-SAC is integrated into SAC, so in the future there will be just one solution that integrates with both on-premise and cloud-based data sources. This is what SAP calls bridging the gap between cloud and on-premise systems.

In particular, within AfO-SAC one can: consume models (both analytic and planning ones) from SAC (an SAC connection can be created in AfO-SAC as a data source), store AfO workbooks in the SAC environment, store AfO workbooks having SAC models as data sources locally, work with SAC hierarchies, enter planning data of an SAC planning model in a crosstab in an AfO analysis (front-end cell locking is used), etc.

Although there are currently many restrictions within the products (see the online documentation and roadmap), AfO-SAC is being heavily developed, together with strong support on the SAP BW backend side. The newest SP 16 (soon to be released) for the BW 7.5 version is a kind of “go-to release” for all organizations that want to use SAC.

As of AfO version 2.7, the AfO-SAC version follows the original AfO versions. It is just marked as a different component - ANALYSISOFFICE_FOR_SAC.

More information:
roadmap

Sunday, July 28, 2019

Planning (e.g. APO) requests in BW InfoProviders

Planning applications like APO, BPC, IP, etc. use a special data load request type. Normally, planning data is stored under one Request ID that starts with the prefix 'APO_*' in real-time InfoProviders. One can observe this in the Manage screen of a BW InfoProvider. I already mentioned those APO_* requests in my earlier post here.

Information about the APO_* requests is visible in tables RSREQDONE and RSSELDONE; however, not much information about them is available (no InfoSource, DataSource and so on). Most of the information available in these tables is the same that can be found in the Manage screen of a BW InfoProvider. However, in the administration of an InfoCube, the "Type of Data Update" field is not filled in the request list if there is an APO request.

Since almost nothing exists for APO requests (no Source/InfoSource, DataSource, Source System and so on), the type of data update for an APO request is also unclear. The "Type of Data Update" field is just populated with the "Full Update" value.



In addition, note that data which was loaded into a real-time InfoProvider via an InfoPackage/DTP (a so-called BW load or BW request) cannot be changed with Demand Planning. Thus, if a key figure is not zero and it originates from a BW request, the key figure becomes read-only. Such a KF needs to be copied to another one if it needs to be changed in the planning application.

Wednesday, July 10, 2019

How to find out backend job name for BPC Package execution

All BPC Packages, during their run time, are executed by an SAP job in the SAP BW backend system. Sometimes it is useful to have a look at these jobs in case some issue occurred and one needs to analyze it.

In the case of a BPC Package, the investigation starts in the EPM add-in in menu Data Manager -> View Status -> View Package Status. Here, in the column called SEQ, there is a generated string that uniquely identifies the run of the BPC Package.


With that SEQ ID, we go to the SAP BW backend into the table called UJD_STATUS (BPC: Package Status). We put the value of the SEQ ID column into the LOG_ID (Log-ID of a Process Chain Run) field, and as a result, in the JOB_NAME field we get the real SAP job name, which can be further investigated in e.g. t-code SM37.
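The same lookup can be done with a small ABAP snippet; the table and field names are as described above, while the SEQ ID literal is just a placeholder to be replaced with the value from the EPM add-in:

```abap
* Find the backend job name for a BPC Package run via its SEQ ID.
DATA lv_job TYPE btcjob.

SELECT SINGLE job_name
  FROM ujd_status
  INTO lv_job
  WHERE log_id = 'EXAMPLE_SEQ_ID'.   " placeholder: SEQ ID from View Package Status

IF sy-subrc = 0.
  WRITE: / 'Job name:', lv_job.      " investigate further in SM37
ENDIF.
```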




Checking status of cube behavior

A real-time cube in SAP BW is a planning cube on which planning functions can be implemented. In short, this means users can enter/modify data. It is then possible to copy, change and do many different calculations with the data (e.g. distributions based on reference data or forecast functions).
Real-time cubes can therefore be set into two modes (also called behaviors): load behavior, in which the cube can be loaded via regular BW transformations, and plan (or real-time) behavior, in which the cube can save planning data entered by users.

The change or the switch of the two modes can be done either manually:




The same activity that is done manually via RSA1 -> Modeling -> right-click on the cube -> Planning-Specific Properties -> Change Real-Time Load Behavior can also be done programmatically.

The following FMs can be used to check the real-time/load behavior and to set it:
RSM_LOADALWD_GET
RSM_LOADALWD_SET       

In addition, there is an ABAP report SAP_CONVERT_NORMAL_TRANS that can be used for the same purpose. Similarly, there is a process type that can be put into a process chain for that.

Table RSMDATASTATE (Status of the data in the InfoCubes) and its field LOADALWD (Allow loading for real-time data targets) store the information about which behavior the cube is currently set to. If the field LOADALWD is set to X, the cube is in loading mode; if it is blank, it is in planning mode.
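A quick programmatic check of the current behavior can be based on the table/field just mentioned; the cube name below is a placeholder:

```abap
* Check whether a real-time InfoCube is currently in load or plan mode.
DATA lv_loadalwd TYPE rsmdatastate-loadalwd.

SELECT SINGLE loadalwd
  FROM rsmdatastate
  INTO lv_loadalwd
  WHERE infocube = 'ZPLANCUBE'.      " placeholder cube name

IF lv_loadalwd = 'X'.
  WRITE: / 'Cube is in loading mode'.
ELSE.
  WRITE: / 'Cube is in planning mode'.
ENDIF.
```

Setting the behavior should rather go through the FMs RSM_LOADALWD_GET/SET mentioned above (check their interfaces in SE37) than through direct table updates.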

For information on this topic relevant for BW/4HANA, see this post: Checking status of cube behavior – BW4HANA


Friday, July 5, 2019

Tips on troubleshooting UD Connect type based source systems connection to SAP BW

There can be multiple issues popping up while connecting SAP BW to UD Connect based source systems like MSSQL. Within this blog post, I list a few of them that I recently faced.

1. Message No. RSSDK300:
S:RSSDK:300 Cannot convert a value of 'Local' from type java.lang.String to INT at field CURRENCYTYPE
or
S:RSSDK:300 Cannot convert a value of 'LOCAL' from type java.lang.String to FLOAT at field CURRENCY
Normally, error no. RSSDK300 can be solved by reactivating the corresponding DataSource; then the following errors may pop up:
S:RSSDK:300 Field ENTITY is not a member of /BIC/CAZRXD_V_0001 70000001
Message No. RSSDK300
or
S:RSSDK:300 Column: Entity not found    Message No. RSSDK300
To solve this, look at SAP Note 1009063 - UDConnect displays erratic behaviour in loading data - and the blog. Or, as per Note 1040572 – “UDC: Error "Column XXX not found" happens randomly”, use the parameters "fixedSchema" and "fixedCatalog", because the MSSQL system may be using different DB schemas and you may be using a table/view that has the same name in different schemas.

2. Message No. RSM340
Errors in source system       Message No. RSM340

3. Message No. RSDS_ACCESS036
Error while extracting field list of UD Connect object: UDC ADAPT ERROR::RSSDK|100|while trying to invoke th

4. Message No. RSSDK100
UDC adapter problem: connection area     Message No. RSSDK100

Things that may help to solve above mentioned errors:

JDBC test page
There is a JDBC test page available on the NetWeaver JAVA stack at the following URL:
http://server:port/TestJDBC_Web/TestJDBCPage.jsp
Here one can test the connection to a particular UD Connect based system. Moreover, you can even extract the columns of the table or view the extraction is retrieving data from. This is an extremely useful check; it can reveal many issues, for example when the BW DataSource expects a column called ABC but the source system has a completely different column name like XYZ.

Meta data cache
As UD Connect based systems use a JAVA NetWeaver server in the middle, there can be issues on the JAVA server. Mostly these are related to the metadata cache, because the metadata description may already be present in the metadata cache of the JAVA server. After changes are done in the source system (column name change, column data type/length change, etc.), these changes must be replicated to the JAVA server, as they will not automatically be removed from the metadata cache on the JAVA side. Normally this can be easily solved by restarting the JAVA server, whereby the cache is cleared. However, in production environments it may not be easy to get a time slot to perform the restart. Luckily, as of NW version 7.2, the cache can be cleared without a restart. The procedure is described in SAP Note 1479970 - Changes in RFC modules are not reflected in Java Server: in the NetWeaver Admin tool choose the tab "Availability and Performance", then select "Resource Monitoring", then "JCo Monitoring", and there choose the tab "Meta Data Cache". It is also possible to clear the cache programmatically.

Investigate JAVA logs
Look for following logs to dig out more details about the error:
·        \usr\sap\\\j2ee\cluster\server?\BI_SDK_Trace.log. (there's one log per server node)
·        \usr\sap\\\j2ee\cluster\server?\log\defaultTrace.?.trc (check the latest defaultTrace file on all the server nodes)
·        \usr\sap\\\j2ee\cluster\server0\log\applications\BI\UDI.?.log

Debug on ABAP Stack
Set a breakpoint in the FM RSSDK_DATA_UPLOAD_ALL on the line where the function RSSDK_DATA_UPLOAD_ALL is called. Also see t-code RSSDK - DB Connect and the ABAP program RSSDK_START.

More information:
1722695 - JRA: Object not found in lookup of CallbackConnectionFactory
2000681 - JRA: NullPointerException in CallbackConnectionFactory
1004194 - UDC: How to create log files for problem analysis
512739 - BW external DB Connect for MS SQLServer
1396552 - Remote connection delivers wrong character data