Wednesday, September 18, 2019

Find out users having a role

Just a quick tip on how to find out which users have a particular role assigned in an SAP system. Normally this can be done via t-code PFCG: in the role maintenance screen there is a tab called Users, which shows this information.

However, if one has no access to the PFCG t-code, one needs to rely on tables. The table that holds the information on the assignment of roles to users is called AGR_USERS. The same info as in PFCG can be seen there.
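For quick checks from ABAP, the lookup can be sketched as follows (the role name is a made-up example; AGR_USERS also carries the validity dates of the assignment):

```abap
" A sketch: list users that have a given role assigned, via table
" AGR_USERS. The role name is a made-up example.
DATA lt_users TYPE STANDARD TABLE OF agr_users.

SELECT * FROM agr_users INTO TABLE lt_users
  WHERE agr_name = 'Z_EXAMPLE_ROLE'      "role name
    AND to_dat  >= sy-datum.             "assignment still valid

LOOP AT lt_users ASSIGNING FIELD-SYMBOL(<ls_user>).
  WRITE: / <ls_user>-uname.
ENDLOOP.
```

Transaction SUIM (User Information System) offers the same lookup without any coding.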

Thursday, September 12, 2019

Dummy Source System in BW

The concept of a dummy source system is sometimes required in large SAP BW installations. The case is that not just one actual source system is connected, but several source systems that are not all completely homogeneous.

In case there are several target systems for one source system, it is possible to leverage the option “7.0” in the screen “Conversion of source system names after the transport” (view V_RSLOGSYSMAP). That option is only valid for BW 7.x objects like RSDS, TRFN, DTPA, etc. It means that 7.x objects are imported several times (to several targets) even though they were exported only once. However, in case the targets are not the same (not homogeneous), it may become tricky because those systems may have different settings applied, like different filters on DTPs, different schedules, etc.

These kinds of issues are addressed by introducing a source system of type dummy. This type of source system can only be created in development systems. It is not a real source system, just an alias to a real one. Metadata of all dependent objects references this alias. During the transport, remote function calls which, for example, replicate DataSources or actually activate and run the metadata objects are redirected to the real source system.

More information:

Friday, August 2, 2019

SAP BW InA Provider

SAP BW InA stands for Information Access Protocol (also Info Access Service or Info Access Interface). It is an SAP internal proprietary protocol used by SAP products to retrieve data from (embedded) BW or HANA databases. As the protocol is proprietary, there is no public documentation for it. Web services of this protocol are accessible via URL: https://hostname:port/sap/bw/ina/ Functionality is coded in ABAP classes/methods starting with CL_BICS_INA*. It has been delivered since NW 7.40 SP04.

Originally, the InA protocol was only used for the BPC embedded product and its front-end tool called EPM client, which is an add-in to either MS Excel or PowerPoint. Afterwards, InA usage increased within other clients and products; these include Design Studio, SAP Analytics Cloud (SAC), and Lumira v2, among others. S/4HANA (cloud or on-premise) supports consumption of CDS views through the technical InA interface.

In the case of, e.g., Design Studio and SAC, InA is used for pure analytical use cases. Moreover, in the case of SAC, InA enables BW's Direct Live connections into SAC.

Furthermore, in the case of BPC's EPM client, the interface that communicates with the NW BPC backend was developed under the project code name FireFly. At least this seems to be the case based on SAP Notes that contain the phrase Firefly. Leveraging the InA Provider connection, it is possible to do the following in the EPM client:

    Work with BW queries (with or without variables)
    Retrieve data, using reports
    Enter and save data, using input forms
    Execute planning functions for BW Integrated Planning

More information:
Software component: BW-BEX-OT-BICS-INA InA functionality

Tuesday, July 30, 2019

BPC in Standard Mode or Embedded Mode - difference

There is a lot of information on the web about the different flavors of SAP BPC - Business Planning and Consolidation solution. I wanted to summarize the main differences between the two main flavors, plus a bit of information about the newest ones. By no means is this blog post intended to provide comprehensive insight, just a basic overview and the difference between the two. For much more detailed information refer to the blogs listed under the More information part. This post is mostly an abstract of the two blogs mentioned there.

1. BPC Standard Model (or Classic): is a planning solution with its own functionality that creates and manages cubes in BW. The BPC standard is a planning (which also means reporting and consolidation) solution based on BW technology, mainly designed to be used by LoB. The technical BW objects needed (like InfoObjects, InfoCubes, etc.) are generated and controlled by BPC and not directly exposed. BPC introduces BPC-specific concepts different from BW concepts. Thus, in the BPC standard model one has to copy all master data, hierarchies and transaction data from BW to BPC and align the copied data with the BPC concepts. In this sense, the BPC standard is a data mart solution. To support this, the BPC standard implemented a lot of functionality that already existed in BW again, although in the BPC way. The BPC Standard is sometimes called Business Planning and Consolidation, version for SAP NetWeaver (BPC NW).

2. BPC Embedded Model: is a planning solution based around BW's IP (Integrated Planning) functionality. The guiding principle of this approach is: do not copy the data, but use the BW objects and features instead. Therefore, instead of bringing data over to BPC (as in the BPC standard model), it leverages existing data in BW. By nature, BW is more IT-driven, whereas LoB drives the BPC standard model.
The BPC Embedded is sometimes called BW-IP/PAK (Planning Applications Kit).
Both of these flavors of BPC run on BW – they just use different features of the platform and have different design approaches.
With an S/4HANA system there are even more options on how to run a Business Planning and Consolidation solution. In S/4HANA “Simple Finance” there is Integrated Business Planning for Finance (IBPF) available. The purpose of IBPF is to leverage ERP objects for consolidation. Later, IBPF was renamed to “BPC optimized for HANA” or “SAP BPC optimized for S/4HANA”.

3. BPC Optimized: It is the BPC Embedded version 10.1 installed on the BW engine present in an S/4HANA Finance system. BPC Optimized is only available with S/4HANA. Its first purpose is to replace the planning functions from FI/CO, which are not there anymore in S/4HANA Finance.
This flavor of BPC is also called Real-Time Consolidation (RTC): a consolidation solution based on the deep integration between SAP S/4HANA and SAP Business Planning and Consolidation (SAP BPC). RTC takes both data quality and flexibility into account. It has privileged direct access to universal journal entries, while leveraging the consolidation capabilities of SAP BPC. Unified staging and storage of finance data achieves high data quality and eliminates redundancy.

4. SAP BCS FOR SAP BW/4HANA (BCS/4HANA or BCS4HANA): although this is not directly related to BPC, the BCS (Business Consolidation) is a solution to replace SEM-BCS. It supports the automation of the financial group close (consolidation). The BCS4HANA is a subset of the functionality of the software components SEM-BW and FINBASIS related to consolidation.

5. SAP Business Planning and Consolidation, version for SAP BW/4HANA (SAP BPC 11.1 version for SAP BW/4HANA, BPC4HANA): both models (Standard and Embedded) can run in one system. If an organization wants the planning tool to be managed in a centralized way -> Embedded; if it shall be run by LoB (users) -> Standard. That means planning and consolidation functionalities are now together (BI-IP (or BW-IP) and PAK are included in the Embedded model now), so SAP calls it “Simplicity” – Simplified BPC.

More information:
Concepts compared: BPC standard and BPC embedded
Practical differences – BPC standard vs Embedded BPC (BW-IP/PAK)

Monday, July 29, 2019

Not possible to change DTP package size

I recently faced a situation where it was not possible to change the package size of a DTP. I opened the DTP in edit mode, but the package size field was disabled for changes. I tried to reactivate the DTP, but it did not help either. I also tried to delete the data requests that were produced by that DTP from the InfoProvider, and again it did not help. Finally, I removed the semantic groups from the TRFN and/or DTP, and it was possible to change the package size in the DTP UI; the field became editable.

I researched this topic a bit and came across a few SAP Notes (see below) that discuss this situation. There is a report RSBKDTPREPAIR_MAXSIZE available that determines such “erroneous” DTPs and corrects them by re-activating them. It is possible to run the report in simulation mode to get a list of affected DTPs. In addition, the report can prepare a BW transport (or run without one), and it can also run for selected DTPs only.

More information:
1521135 - DTP package size is too large (more than two billion)
1595541 - Extension of the report RSBKDTPREPAIR_MAXSIZE

What is SAP Analysis for Microsoft Office, edition for SAP Analytics Cloud

As per SAP's "BI Convergence Strategy 2018" there is an edition of SAP Analysis for Microsoft Office (AfO) that works with SAP Analytics Cloud (SAC). It is called "SAP Analysis for Microsoft Office, edition for SAP Analytics Cloud" - "AfO-SAC". This means that AfO-SAC is integrated with SAC, so in the future there will be just one solution that integrates with both on-premise and cloud-based data sources. This is what SAP calls bridging the gap between cloud and on-premise systems.

In particular, within AfO-SAC one can: consume models (both analytics and planning ones) from SAC (an SAC connection can be created in AfO-SAC as a data source), store AfO workbooks in the SAC environment, store AfO workbooks having SAC models as data sources locally, work with SAC hierarchies, enter planning data of an SAC planning model in a crosstab in an AfO analysis (front-end cell locking is used), etc.

Although there are currently many restrictions within the product (see the online docu and roadmap), AfO-SAC is being heavily developed, together with strong support on the SAP BW backend side. The newest SP16 (soon to be released) for the BW 7.5 version makes it a kind of “go-to release” for all organizations that want to use SAC.

As of AfO version 2.7, the AfO-SAC version follows the original AfO versions. It is just marked as a different component - ANALYSISOFFICE_FOR_SAC.

More information:

Sunday, July 28, 2019

Planning (e.g. APO) requests in BW InfoProviders

Planning applications like APO, BPC, IP, etc. use a special data load request type. Normally planning data is stored under one Request ID that starts with the prefix 'APO_*' in real-time InfoProviders. One can observe it in the Manage screen of a BW InfoProvider. I mentioned the APO_* requests already in my earlier post here.

Information about the APO_* requests is visible in tables RSREQDONE and RSSELDONE; however, not much information about them is available (no InfoSource, DataSource and so on). Most of the information available in these tables is the same that can be found in the Manage screen of a BW InfoProvider. However, in the administration of an InfoCube, the "Type Of Data Update" field is not filled in the request list if there is an APO request.

Since almost nothing exists for APO requests (no Source/InfoSource, DataSource, Source System and so on), the type of data update for an APO request is also unclear; the "Type of Data Update" field is simply populated with the "Full Update" value.

In addition, notice that data which was loaded into a real-time InfoProvider via an InfoPackage/DTP (a so-called BW load or BW request) cannot be changed with Demand Planning. Thus, if a key figure is not zero and originates from a BW request, the key figure becomes read-only. Such a KF needs to be copied to another one if it needs to be changed in the planning app.
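As a quick illustration, the planning requests can be listed straight from the request administration table (a sketch; only the request ID field RNR is assumed here):

```abap
" A sketch: list planning requests (prefix APO_*) recorded in the
" request administration table RSREQDONE.
DATA lt_req TYPE STANDARD TABLE OF rsreqdone.

SELECT * FROM rsreqdone INTO TABLE lt_req
  WHERE rnr LIKE 'APO_%'.                "request IDs of planning requests

LOOP AT lt_req ASSIGNING FIELD-SYMBOL(<ls_req>).
  WRITE: / <ls_req>-rnr.                 "request ID
ENDLOOP.
```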

Wednesday, July 10, 2019

How to find out backend job name for BPC Package execution

All BPC Packages are executed at run time by an SAP job in the SAP BW backend system. Sometimes it is useful to have a look at these jobs in case some issue occurred and one needs to analyze it.

In the case of a BPC Package, the investigation starts in the EPM add-in in menu Data Manager -> View Status -> View Package Status. Here, in the column called SEQ, is a generated string that uniquely identifies the run of the BPC Package.

With that SEQ ID, we go to the SAP BW backend into the table called UJD_STATUS (BPC: Package Status). We put the value of the SEQ ID column into the LOG_ID (Log-ID of a Process Chain Run) field and, as a result, in the JOB_NAME field we get the real SAP job name that can be further investigated in, e.g., t-code SM37.
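The lookup can be sketched in ABAP as follows (the SEQ ID value is a made-up example):

```abap
" A sketch: find the backend job name for a BPC Package run. The SEQ ID
" value below is a made-up example taken from View Package Status.
DATA lv_job TYPE btcjob.

SELECT SINGLE job_name FROM ujd_status INTO lv_job
  WHERE log_id = 'DM20190710123456'.     "SEQ ID from the EPM add-in

WRITE: / 'Job name for SM37:', lv_job.
```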

Checking status of cube behavior

A real-time cube in SAP BW is a planning cube on which planning functions can be implemented. In short, this means users can enter/modify data. It is then possible to copy, change and do many different calculations with the data (e.g. distributions based on reference data or forecast functions).
Real-time cubes can therefore be set into two modes (also called behaviors): load behavior, where the cube can be loaded via a regular BW transformation, or plan (also called real-time) behavior, in which the cube can save planning data entered by users.

The change or switch between the two modes can be done either manually:

Or the same activity as done manually via RSA1 -> Modeling -> right-click on cube -> Planning-Specific Properties -> Change Real-Time Load Behavior can be done programmatically.

The following FMs can be used to check the real-time/load behavior and to set it:

In addition, there is an ABAP report SAP_CONVERT_NORMAL_TRANS that can be used for the same purpose. Similarly, there is a process type that can be put into a process chain for that.

Table RSMDATASTATE (Status of the data in the InfoCubes) and its field LOADALWD (Allow loading for real-time data targets) stores the information about which behavior the cube is currently set to. If the field LOADALWD is set to X, the cube is in loading mode; if it is blank, it is in planning mode.
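A minimal sketch of such a check (the cube name is a made-up example; the switch itself is done via the FMs mentioned above or report SAP_CONVERT_NORMAL_TRANS):

```abap
" A sketch: check the current behavior of a real-time cube via
" RSMDATASTATE-LOADALWD. The cube name is a made-up example.
DATA lv_loadalwd TYPE char1.

SELECT SINGLE loadalwd FROM rsmdatastate INTO lv_loadalwd
  WHERE infocube = 'ZRTCUBE1'.

IF lv_loadalwd = 'X'.
  WRITE: / 'Cube is in load behavior'.
ELSE.
  WRITE: / 'Cube is in planning (real-time) behavior'.
ENDIF.
```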

Friday, July 5, 2019

Tips on troubleshooting UD Connect type based source systems connection to SAP BW

There can be multiple issues popping up while connecting SAP BW to UD Connect based source systems like MSSQL. Within this blog post, I list a few of them that I recently faced.

1. Message No. RSSDK300:
S:RSSDK:300 Cannot convert a value of 'Local' from type java.lang.String to INT at field CURRENCYTYPE
S:RSSDK:300 Cannot convert a value of 'LOCAL' from type java.lang.String to FLOAT at field CURRENCY
Normally error no. RSSDK300 can be solved by reactivating the corresponding DataSource; then the following errors may pop up:
S:RSSDK:300 Field ENTITY is not a member of /BIC/CAZRXD_V_0001 70000001
Message No. RSSDK300
S:RSSDK:300 Column: Entity not found    Message No. RSSDK300
To solve this, look for SAP Note 1009063 - UDConnect displays erratic behaviour in loading data - and the blog. Or, as per Note 1040572 – “UDC: Error "Column XXX not found" happens randomly”, use parameters "fixedSchema" and "fixedCatalog", because the MSSQL system may be using different DB schemas and you may be using a table/view that has the same name in different schemas.

2. Message No. RSM340
Errors in source system       Message No. RSM340

3. Message No. RSDS_ACCESS036
Error while extracting field list of UD Connect object: UDC ADAPT ERROR::RSSDK|100|while trying to invoke th

4. Message No. RSSDK100
UDC adapter problem: connection area     Message No. RSSDK100

Things that may help to solve above mentioned errors:

JDBC test page
There is a JDBC test page available on the NetWeaver JAVA stack under the following URL:
Here one can test the connection to a particular UD Connect based system. Moreover, you can even extract the columns of the table or view the extraction is retrieving data from. This is an extremely useful check; it can reveal many issues. For example, BW's DataSource may be expecting a column called ABC, but in the source system a completely different column name, like XYZ, is available.

Meta data cache
As UD Connect based systems use a JAVA NetWeaver server in the middle, there can be issues on the JAVA server. Mostly these are related to the metadata cache, because a metadata description may already be present in the metadata cache of the JAVA server. After changes are done in the source system (column name change, column data type/length change, etc.), these changes must be replicated to the JAVA server, as they will not automatically be removed from the metadata cache on the JAVA side. Normally this can be easily solved by restarting the JAVA server, whereby the cache is cleared. However, in production environments it may not be easy to get a time slot to perform the restart. Luckily, as of NW version 7.2, the cache can be cleared without a restart. The procedure is described in SAP Note 1479970 - Changes in RFC modules are not reflected in Java Server. In the NetWeaver Admin tool choose tab "Availability and Performance", then select "Resource Monitoring", then "JCo Monitoring", and there choose tab "Meta Data Cache". It is also possible to clear the cache programmatically.

Investigate JAVA logs
Look for the following logs to dig out more details about the error:
·        \usr\sap\\\j2ee\cluster\server?\BI_SDK_Trace.log. (there's one log per server node)
·        \usr\sap\\\j2ee\cluster\server?\log\defaultTrace.?.trc (check the latest defaultTrace file from all the server nodes)
·        \usr\sap\\\j2ee\cluster\server0\log\applications\BI\UDI.?.log

Debug on ABAP Stack
Set a breakpoint in FM RSSDK_DATA_UPLOAD_ALL. Also see t-code RSSDK - DB Connect and ABAP program RSSDK_START.

More information:
1722695 - JRA: Object not found in lookup of CallbackConnectionFactory
2000681 - JRA: NullPointerException in CallbackConnectionFactory
1004194 - UDC: How to create log files for problem analysis
512739 - BW external DB Connect for MS SQLServer
1396552 - Remote connection delivers wrong character data

Thursday, May 30, 2019

How to find out code of SAP icon

Codes of SAP icons are used in ABAP reports in some cases. For example, in a dynpro having some icons assigned to particular data rows in a table grid, there can be logic which evaluates the value of a particular icon. In case one needs to debug it and perhaps change the icon (i.e., change the value of the icon's code), it is handy to know these codes, because the icon can only be entered into the debugger by providing its value.

Recently I debugged some BW UD Connect DataSource screens, and I came across a statement that evaluated the value of an icon:

The variable icon_bw_datasource is declared in a type group called ICON. In this particular type group, there are more than 1,200 icons available.

Type groups are available from Data Dictionary related t-codes like SE11.
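A minimal sketch of reading such an icon code in a throwaway report:

```abap
" A sketch: the icon constants live in type group ICON; printing one
" reveals the internal code that has to be typed into the debugger.
TYPE-POOLS: icon.   "loaded implicitly in newer ABAP releases

WRITE: / 'Code of ICON_BW_DATASOURCE:', icon_bw_datasource.
```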

How to find out when and who ran what APD

Sometimes it is needed to find out when some APDs were running, who ran them, and so on. In case the APD name is known, it is easy. One just needs to go to the APD maintenance part of t-code RSA1. That screen can also be accessed directly via t-code RSANWB. Here the particular APD needs to be displayed, and there is a Monitor icon available on its toolbar.

The monitor of the APD also has a separate t-code called RSANWB_MONITOR. The report behind this t-code is called RSAN_PRR_MONITOR_NO_SEL_SCREEN. The report just submits another report, RSAN_PRR_MONITOR. Finally, the latter report calls FM RSAN_PRR_MONITOR_START, where all the logic of the APD monitor is implemented.

However, what to do in case the name of the APD that ran is not known? In such a case, t-code SLG1 can be leveraged, by supplying the string RSANPR into the Object field of SLG1.

SLG1 -> Object = RSANPR
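Programmatically, the same application log entries can be read via the standard FM BAL_DB_SEARCH; a sketch, assuming the APD name is stored in the log's external number (as the SLG1 display suggests):

```abap
" A sketch: read the application log entries for APD runs (log
" object RSANPR) via the standard FM BAL_DB_SEARCH.
DATA: ls_filter TYPE bal_s_lfil,
      ls_obj    TYPE bal_s_obj,
      lt_header TYPE balhdr_t.

ls_obj-sign   = 'I'.
ls_obj-option = 'EQ'.
ls_obj-low    = 'RSANPR'.
APPEND ls_obj TO ls_filter-object.

CALL FUNCTION 'BAL_DB_SEARCH'
  EXPORTING
    i_s_log_filter = ls_filter
  IMPORTING
    e_t_log_header = lt_header
  EXCEPTIONS
    log_not_found  = 1
    OTHERS         = 2.

LOOP AT lt_header ASSIGNING FIELD-SYMBOL(<ls_hdr>).
  " ALUSER = who ran it, ALDATE = when, EXTNUMBER = APD name (assumed)
  WRITE: / <ls_hdr>-aluser, <ls_hdr>-aldate, <ls_hdr>-extnumber.
ENDLOOP.
```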

Tuesday, May 21, 2019

How to find out APD that have Performance Settings set to Process Data in Memory

APD processes have an option of processing all data in memory. This means that the data of the APD is stored entirely in main memory while processing. It can be set in the maintenance screen of the APD: in the menu called Goto there is an option Process Data in Memory. Normally this shall only be set on for small amounts of data.

In case a large volume of data (a couple of million records) is being processed like that, the execution of such an APD may terminate once the main memory no longer has sufficient space for the data.

Now, how to find out which APDs have this flag set on? The data element corresponding to this flag is RSAN_PROCESS_IN_MEMORY_FLAG. However, it is not saved directly in some table. The main table that stores data about the APDs is RSANT_PROCESS. That one contains a column called XML, in which all settings like filters, mapping, etc. are stored. Within the XML column there is a section called PROCESS_DATA_IN_MEMORY. If it is equal to X, then the APD is processed in memory.

I created a small ABAP program that lists out all APDs which have Performance Settings set to Process Data in Memory. The program is available on my github.
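The core of such a check can be sketched as follows (a crude substring scan; a proper implementation would parse the XML and test the actual flag value):

```abap
" A sketch: list APDs whose settings XML mentions the
" PROCESS_DATA_IN_MEMORY section. Field names PROCESS and XML are as
" seen in table RSANT_PROCESS; if XML is stored as a raw XSTRING in
" your release, convert it to a character string first.
DATA lt_apd TYPE STANDARD TABLE OF rsant_process.

SELECT * FROM rsant_process INTO TABLE lt_apd
  WHERE objvers = 'A'.                          "active versions only

LOOP AT lt_apd ASSIGNING FIELD-SYMBOL(<ls_apd>).
  IF <ls_apd>-xml CS 'PROCESS_DATA_IN_MEMORY'.  "crude substring check
    WRITE: / <ls_apd>-process.
  ENDIF.
ENDLOOP.
```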

Sunday, May 19, 2019

Switching to BEx Query Designer from SAP BI front end tools

Running BEx Query Designer (QD) directly from a BI front-end tool is a very common case, especially when the QD directly opens the query which is displayed in the front-end tool. Having this function in place means no need to open the QD separately and no need to find the particular query in the QD after it is opened.

BEx Analyzer has had this function for a very long time. It is available via menu Business Explorer -> Analyzer -> Choose Add-Ins, and from the BEx Analysis Toolbox, choose Tools -> New Query.

In Analysis for Office (AfO), which is supposed to be the replacement for BEx Analyzer, this function was missing in earlier versions. However, at least in 2.6 the function is there. It needs to be enabled first by customizing the AfO UI. This is available via menu Analysis -> Customize Analysis -> Customize User Interface.

On this pop-up window, under the Ribbon part, there is a Tools section, and under it “Launch Query Designer” needs to be ticked.

Finally, the BEx QD is available on AfO's ribbon:

Tuesday, April 30, 2019

How to read/write/delete from/to DSO objects

Sometimes it is needed to store data into DSO objects from ABAP, either from a BW transformation or, for some reason, from regular ABAP programs. Therefore, some kind of API needed to be provided by SAP to allow this. In classic (non-BW/4HANA) BW, several DSO types are available: standard, direct update, and write-optimized. The DSO type needs to be considered, and the scenario of the DSO type shall be respected. Normally it makes no sense to write data to a write-optimized or standard DSO; regular BW flows shall be used instead. The only DSO type designed to be updated programmatically from custom code is the Direct Update DSO. That is why a full API is only available for this type of DSO.

DSO type                   Methods of accessing

Write-Optimized            Open SQL SELECT statement / BAPI_ODSO_READ_DATA_UC

Note that in earlier versions of BW the FM BAPI_ODSO_READ_DATA was available, but it is obsolete (as of version NW2004s) and BAPI_ODSO_READ_DATA_UC should be used instead.

RSDRI* FMs are also available, as non-RFC-enabled ones.
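As an illustration, a read via RSDRI_INFOPROV_READ may look like this (InfoProvider, characteristic and key figure names are made-up examples; verify the FM's exact signature in your release):

```abap
" A sketch: read a DSO/InfoProvider via RSDRI_INFOPROV_READ.
" All object names below are made-up examples.
TYPES: BEGIN OF ty_data,
         material TYPE char18,
         amount   TYPE p LENGTH 9 DECIMALS 2,
       END OF ty_data.

DATA: lt_sfc   TYPE rsdri_th_sfc,    "characteristics to return
      lt_sfk   TYPE rsdri_th_sfk,    "key figures to return
      lt_range TYPE rsdri_t_range,   "selection (empty = all data)
      lt_data  TYPE STANDARD TABLE OF ty_data,
      lv_end   TYPE rs_bool,
      lv_first TYPE rs_bool VALUE rs_c_true.

" Request characteristic 0MATERIAL and key figure 0AMOUNT
INSERT VALUE rsdri_s_sfc( chanm = '0MATERIAL' chaalias = 'MATERIAL' )
  INTO TABLE lt_sfc.
INSERT VALUE rsdri_s_sfk( kyfnm = '0AMOUNT' kyfalias = 'AMOUNT'
                          aggr  = 'SUM' )
  INTO TABLE lt_sfk.

CALL FUNCTION 'RSDRI_INFOPROV_READ'
  EXPORTING
    i_infoprov    = 'ZODSO1'
    i_th_sfc      = lt_sfc
    i_th_sfk      = lt_sfk
    i_t_range     = lt_range
  IMPORTING
    e_t_data      = lt_data
    e_end_of_data = lv_end
  CHANGING
    c_first_call  = lv_first.
```

For writing to a Direct Update DSO, the RSDRI_ODSO_* family of FMs serves the same role on the write side.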

More information:

Time of SAP application server and host server not in sync

In case the time of the SAP server and the server where SAP runs are not coordinated, there is an ABAP ZDATE_ILLEGAL_LOCTIME dump like the one below:

Category               Internal Kernel Error
Runtime Errors         ZDATE_ILLEGAL_LOCTIME
Application Component  Not assigned
Date and Time          08.04.2019 14:15:00
Short Text
     The local time on the application server is not correct.

Many SAP applications and functions work on the premise that time always increases. When the times mentioned are not in sync, it may lead to inconsistencies in SAP data, e.g. generated ABAP sources or inconsistent data created by applications based on timestamps. The issue is even more relevant in today's modern approach where SAP runs on virtual machines hosted by a host OS.

Normally, when this situation is observed, the particular SAP application server on which the issue occurred must be shut down immediately. Afterwards the OS must be checked, but no reset of the OS time can be done while the SAP application server is still running. In some cases the issue is with the Network Time Protocol (NTP), which is responsible for synchronizing times in a distributed SAP system among the SAP servers and the database server.

More information:
2535959 - Dump ZDATE_ILLEGAL_LOCTIME occurs, what should be analyzed

Tuesday, March 26, 2019

BPC add-in error: "wrong CSV Format"

The error below appears quite often while working with the EPM add-in in MS Office. It is actually also a tricky error, as there are many things to consider that may cause it.
Below I introduce a few things that need to be checked, as they may be the root cause of this error.

1. Enhancing BPC master data InfoObjects with new dimensions, new data in hierarchies, etc. This is especially the case when errors like the following can be found in the BPC log:

2019-03-26 11:42:33,813|ERROR|Metadata|?.?||||||VSTA_Main| Member [ACCOUNT].[PARENTH13].[ALL ACCOUNTS] is attached to member [ACCOUNT].[PARENTH13].[1,,,,,,,,,,,] on hierarchy PARENTH13, but [ACCOUNT].[PARENTH13].[1,,,,,,,,,,,] doesn't exist# 2019-03-26

Solution: as per SAP Note 1709380, the hierarchy first needs to be deleted, the dimension processed, the data added back to the hierarchy, and finally the dimension processed again.
If this does not help, proceed with running ABAP reports UJXO_CLEAN_DIM_CACHE and UJXO_CLEAN_TDH_DIM_CACHE according to Notes 2229878 and 2201768.
Also, report UJA_REFRESH_DIM_CACHE needs to be run for the involved dimension as per Note 2269291.
If this is specific to a TDH dimension, see Notes 2767117 and 2303454.

2. Transport related issues. In this case, there is an entry in the BPC log corresponding to:

FPMXLClient.Connection.RESTConnection+HierarchyNode doesn't exist

It is caused by transport and, according to SAP Note 2085650, the BPC dimension needs to be re-transported.

3. Loading of data from flat files. See SAP Note 2411607 - "Wrong CSV Format" error or missing members in EPM client.

4. Inconsistency caused by /CPMB/A9* objects under 'unassigned nodes'. Proceed according to Note 1927742.

5. Restore environment related issues. See Note 2162971 on how to use report UJXO_CLEAN_DIM_CACHE to fix it.

Friday, March 15, 2019

How to find out who uploaded a file to the BPC server

The EPM add-in of MS Excel offers functionality for uploading files to the BPC server. These are the so-called Data Files of the Data Manager functions. It enables uploading files from the local user machine to the BPC server. Uploaded files can later be used within BPC packages, transformations and so on.

I was wondering whether there is any information about who uploaded a particular file. Within, e.g., the Data Preview function, there is only information about when the file was last modified, but no information on who modified/uploaded it.

Luckily, there is a table available in the BW backend which holds this information: table UJF_DOC. While browsing the table in a t-code like SE11, you need to provide the environment ID first into the APPSET field at the selection screen; afterwards, put the full path of the file in Data Manager into the field DOCNAME.

This will return the user name in field LSTMODUSER, which is not available in BPC's popups in the EPM add-in.
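A sketch of the lookup (environment ID and file path are made-up examples; verify the exact name of the last-modified-user field in the table definition):

```abap
" A sketch: look up who last modified an uploaded Data Manager file.
" Environment ID and file path are made-up examples.
DATA ls_doc TYPE ujf_doc.

SELECT SINGLE * FROM ujf_doc INTO ls_doc
  WHERE appset  = 'ENVIRONMENT1'                  "environment ID
    AND docname = '/ROOT/WEBFOLDERS/EXAMPLE.CSV'. "Data Manager path

IF sy-subrc = 0.
  WRITE: / ls_doc-lstmoduser.  "assumed field name for the last modifier
ENDIF.
```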

Wednesday, March 13, 2019

Managing settings of AfO

Settings or configuration of Analysis for Office (AfO) was done by adjusting registry settings on the client machine in AfO version 1.x. However, this changed, and as of version 2.x the configuration of the AfO tool is done by maintaining XML configuration files. There are config files for the administrator and for the regular user, two files for each:

Files for administrator are located in folder: %PROGRAMDATA%\Sap\Cof

And for user in folder: %APPDATA%\Sap\Cof

Another way to access the settings is via Excel menu File -> Analysis -> Customize Analysis -> Technical Configuration.

More info:
2083067 - How to maintain settings for Analysis Office 2.x

Sunday, March 10, 2019

EPM-BPC error: Not possible to open report

Normally, the EPM add-in for MS Excel, used as the interface to the BPC server, needs a temporary folder with read/write access rights to work properly with BPC reports. In the first place, this can be set in the settings of Excel under Save, where there is a property called “Default personal templates location”. However, in some organizations users may not be allowed to change this setting.

In case the setting “Default personal templates location” points to a folder to which the user has no write access, then any attempt to open a report/template/layout from the BPC server does not succeed, and the BPC log (under EPM add-in menu More -> Log) raises the following message:

ERROR|FilesManagement|FPMXL.Client.OpenSaveServerManager.DownloadedFileFromServer|||||VSTA_Main|Access to path “...” is denied.#

Luckily, there is another option. The location of the default folder for save/open of local data can be changed via User Options in the EPM add-in: under Server Configuration there is a setting called “Default Folder for Local Open/Save”. Once this local setting points to any folder where write access is granted to the user, it is possible to open a report/template/layout from the BPC server without any issues.