Thursday, December 31, 2015

SAP BW 7.4: a few new features

SAP BW 7.4 has been available for some time now and I have already posted a few blogs about it. In this post I introduce a few more functions that came with this release.

1) New with SAP BW 7.4 SP8: XXL Attributes
This feature provides the possibility to load and store different file types (so-called Multipurpose Internet Mail Extensions (MIME) types) or long strings (data type STRING or XSTRING) as XXL attributes in BW. The MIME file format can be e.g. PDF, XML, office files, images, video or audio files. Regular attributes are limited to characteristic values of up to 255 characters and long texts of up to 1333 characters; anything beyond these limits calls for an XXL attribute.
When an InfoObject is supposed to store such an XXL attribute, a new flag called “Supports XXL Attributes” must be set; it is available on the Master Data/Texts tab of the InfoObject maintenance screen. Afterwards, on the tab called “XXL Attributes”, it is possible to specify which XXL attributes the InfoObject should have.
More information can be found in the online documentation.


2) History function – In t-code RSA1, under the Modeling section, there is a new view. It displays the history of actions performed on BW objects. All workbench navigation actions on objects (e.g. display, change, manage, etc.) are shown there. Notice that the actions are captured per user session. The captured actions can be re-performed, which makes this a useful function for navigating across the different objects in RSA1.


3) New source/target object types for DTPs – a few new object types have been introduced from which a DTP can extract data. These are the Advanced DSO (aDSO) and InfoObject: XXL Attributes.

Similarly, with regard to the target object type of the DTP, the same two new types are available into which the DTP can store data.


4) New source system types – below are the newly introduced types of source system. They are mostly related to Operational Data Provisioning (ODP) source systems. See the online documentation here.
ODP - BW
ODP - SAP (Extractors)
ODP - SAP HANA Information Views
ODP - SLT Queue
ODP - SAP Business ByDesign
ODP - Other Contexts


More posts about SAP BW 7.4:

BW Modeling Tools (for SAP HANA BW)

Changes to query properties in BW 73 vs 74

BW 74: Monitoring of process chain for mobile devices

BW 74: Master data maintenance: web based environment

"Field-based" modeling in BW

Wednesday, December 30, 2015

Possibilities of Process Chain monitoring

Monitoring of Process Chains (PC) is a very important part of BW system administration. Basically, only a successful run of a PC guarantees up-to-date data for BW reports. There are many different tools that can be used to monitor PCs. In this blog post I focus only on the tools that can be employed for automatic monitoring without manual interaction by an administrator. In all cases described below, an email with the status of the PC run is sent out to a dedicated user.

1. CCMS: t-code RSMON -> Monitors -> BI CCMS. The CCMS gives a list of the PCs available in the system. If a distribution list, together with a message for the PC's process, is created, then the CCMS will send an email whenever the PC's alerts happen.

More information about whole setup can be found here.

2. t-code ALRTCATDEF – here a recipient list is created; together with the alert settings for a selected process chain's failure (in t-code RSPC -> menu Process Chain -> Attributes -> Alerting -> check the checkbox “Send Alerts if Errors Occur”), an email notification is then sent out. In this case the PC's properties need to be changed for every PC to be monitored, as the alerting flag needs to be checked.

More information about whole setup can be found here.

3. t-code RSPCM – the particular PC simply needs to be added and the notification maintained. This option comes with the least implementation effort; no chains need to be transported.


More information about whole setup can be found here.

Maintenance of user’s favorites in SAP NetWeaver Portal

SAP NetWeaver Portal is a very common environment for accessing BW queries; users have just one single point of access to the queries. It is very common for a BW or Portal administrator to manage users' favorites in the Portal. Users ask to have favorites copied from one user to another or to have some reports deleted from a user's favorites. Here is how to perform such tasks.

Copying the favorites:

Log into the SAP NetWeaver Portal and then navigate to the following path:

portal -> Content Administration -> KM Content -> root folder -> userhome


Now select the user whose favorites are to be copied; go inside their folder and choose COPY via the context menu:

Go to userhome -> select the user to whom you want to copy the favorites -> click on Favorites -> and then “OK”.

Deleting reports from a user's favorites:
Similarly, go into the folder of the user under a path similar to the one above and, via the context menu, delete the particular item that corresponds to the report that is supposed to be removed from the user's favorites.

"Field-based" modeling in BW

When it comes to the effort of building BW models, flows, objects or reports, the common perception is that developing in BW takes a long time due to the many objects that need to be created in order to assemble them into something working, like a report or a flow. On top of this comes transport management overhead, as the objects developed in one system need to be transported to other systems. SAP wants to address this, and there are possibilities to “shorten” the development time of BW objects.

One of these attempts is the so-called "field-based" modeling. This type of processing is part of the Layered Scalable Architecture ++ (LSA++). What does field-based modeling mean? By leveraging it, it is possible to integrate data into BW with considerably lower effort compared to classic BW development. The data for BW models can reside outside of BW. Also, a BW model can operate with data on field level without the need to define InfoObjects; moreover, there is no need to map the fields to InfoObjects. All this is possible by using a new type of BW InfoProvider called the Open ODS View. Starting with SAP BW 7.40 SP5, when creating a new DSO there is a possibility to choose whether the new DSO is based on InfoObjects or on fields. The Open ODS View can also be defined with source type “Database Table or View”, and once it is based on a HANA view it is fed by a HANA table. Similarly, Advanced DSO objects in BW 7.40 can be used for field-based modeling.

In closing I would like to say that it is nice to see that there are possibilities for enabling rapid modeling in BW. Field-based modeling, together with BW Workspaces, supports exactly this.

More information:

Doing SAP upgrade? What SAP Notes to check…

Depending on how complex a particular SAP system is, the upgrade experience can be very challenging. Therefore it is always better to check all the information that may relate to the components that are going to be upgraded. The most important source of upgrade information is SAP Notes. In this blog post I want to show some strategies that I use when searching for SAP Notes while collecting information for upgrade projects.

Below are examples of the areas that interest me when I assess an upgrade. Notice that everything must be based on the particular version of the component that is going to be upgraded. As an example, in the following I assume that the SAP Basis component (software component SAP_BASIS) will be upgraded to version 7.5.

1. Component Documentation Note:
2212573 - SAP NetWeaver 7.5 Documentation

2. Collective Note
2198174 - Collective Note: SAP NetWeaver 7.5 SP01 - Application Server Java (AS Java)
2230033 - Collective Note: SAP NetWeaver 7.5 SP01 - BI JAVA
2202224 - Central Note: SAP NetWeaver 7.5 SP01 - EP Core (Application Platform)
2198193 - Collective Note: SAP NetWeaver 7.5 SP01 - Composition Platform
2202178 - Collective Note: SAP NetWeaver 7.5 SP01 - Development Infrastructure
2198099 - Collective note: SAP NETWEAVER 7.5 SP01 - Process Orchestration (PI)
... the list may continue depending on the software solution with which the NetWeaver 7.5 upgrade is associated.

3. Central Note:
2202227 - Central Note: SAP NetWeaver 7.5 SP00 - Guided Procedures
2202228 - Central Note: SAP NetWeaver 7.5 SP01 - Guided Procedures

4. DB platform requirements Note:
2158828 - Minimal DB system platform requirements for SAP NetWeaver 7.5

5. Release upgrade Note:
Specific Notes for upgrades between different EhP, e.g.
1810104 - Upgrade from Release 7.0 EHP3 or Release 7.3 EHP1 to Release 7.40

6. Known problems Notes:
2158103 - Known problems with Support Packages in SAP NW 7.50 AS ABAP
822379 - Known problems with Support Packages in SAP NW 7.0x AS ABAP

7. Dual Stack split Notes:
2230617 - Upgrade and Split of NetWeaver PI Dual Stack in SAP NetWeaver 7.5

8. Add-on compatibility Notes:
2156130 - Add-on compatibility of SAP NetWeaver 7.5 – ABAP
2156543 - Add-on compatibility of SAP NetWeaver 7.5 – Java

9. Influence on customer programs:
1404527 - Release upgrade from 7.1 to 7.2 for customer programs
367676 - Release upgrade from 4.6 to 6.10 for customer programs

Note: I honestly do not think that my list is comprehensive; however, it may serve as a starting point for anyone who is preparing an upgrade project.


Hangman.vbs

Hangman is one of the SAP tools for analyzing hang situations on an SAP system. In order to analyze such a situation, several types of information need to be collected. The "Hangman" tool helps with that. It stores the following information into a log file:

- SAP work process list of all application servers
- SQL Server process list
- Windows process list of all application servers
- Windows process list of the database server
- SQL Server database locks
- SQL Server configuration and log files
- extract from the SAP dev_* traces of all application servers

The log file generated by Hangman can help in understanding the root cause of problems with an SAP instance. The tool supports both NetWeaver stacks (ABAP and Java). However, the main database type supported by Hangman is MS SQL Server.
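Conceptually, what a collector like Hangman does can be sketched as follows; the section collectors below are stand-ins (the real tool is a VBScript that queries Windows and SQL Server), so all names and sample outputs here are purely illustrative:

```python
# Sketch of a hang-analysis collector: gather several diagnostic sections
# (process lists, DB locks, trace extracts) and write them into a single
# timestamped log for later root-cause analysis.
import io
import datetime

def collect_sections(collectors):
    """Run each (title, collect_fn) pair and assemble one log document."""
    buf = io.StringIO()
    buf.write(f"Hang analysis log - {datetime.datetime.now():%Y-%m-%d %H:%M}\n")
    for title, collect in collectors:
        buf.write(f"\n=== {title} ===\n")
        buf.write(collect() + "\n")
    return buf.getvalue()

# Stand-in collectors; a real tool would shell out to OS and DB utilities.
log = collect_sections([
    ("SAP work process list", lambda: "DIA 0 running, DIA 1 on hold..."),
    ("SQL Server locks", lambda: "spid 51 blocking spid 63"),
])
print(log)
```

The point is only the structure: one log file with clearly separated sections, so that all the evidence about a hang is captured at the same moment in time.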


More information:
948633 - Hangman.vbs

541256 - Hangman 4.1

Versions of SAP Connector for Microsoft .NET (NCo)

While for the development of integration scenarios in the Java world we can use the SAP Java Connector (JCo), which I introduced in my previous post, there is also the SAP .NET Connector (NCo), which is used to develop integrations between SAP and the .NET world.

As there are multiple versions of the NCo, the table below briefly introduces them as evidence of how the NCo has evolved over the years. Notice that in the case of some versions the release dates are only approximate.


Release   Release date   Note   Maintenance ends
1.x       pre 2002       n/a    31.12.2004
2.0.1     21.11.2006            Design time part: 31.12.2009
                                Run time part: 31.03.2013
3.0.1     24.1.2011             31.07.2018
3.0.2     21.4.2011
3.0.3     25.8.2011
3.0.4     3.11.2011
3.0.5     9.11.2011
3.0.6     11.11.2011
3.0.7     22.5.2012
3.0.8     7.9.2012
3.0.9     31.10.2012
3.0.10    9.11.2012
3.0.11    28.11.2012
3.0.12    8.5.2013
3.0.13    25.2.2014
3.0.14    23.9.2014
3.0.15    14.11.2014
3.0.16    10.4.2015
3.0.17    10.4.2016
3.0.18    10.7.2016


To download the newest version of the NCo, see the directory "SAP Connector for Microsoft .NET" at the SMP quick link service.sap.com/connectors, or see Note 856863 - SAP NCo Release and Support Strategy.

Last update on: 14.1.2017

SAP ABAP Connector (ACO)

With SAP NetWeaver 7.4 some new ABAP features were introduced. One of them is the so-called SAP ABAP Connector (ACO). It is an ABAP component designed to consume RFC services on remote ABAP systems. The problems it tries to solve are related to the different dictionary types that a remote function module (RFM) may have: either the dictionary types are not available at all, or they are available in different versions on different systems.

Basically, using the ACO a consumer proxy can be generated and used for RFC scenarios. If for any reason the interface of an RFM changes, a call from ABAP with the old interface does not produce a dump; instead, the proxy is regenerated whenever the RFM interface changes.
The ACO leverages SAP connectors like SAP Java Connector 3 (JCo) or SAP .NET Connector 3 (NCo). It uses them first to retrieve metadata via their APIs; then it performs a dynamic RFC call of any RFM while dynamically setting/getting the parameter values.

When the ACO accesses the remote RFM, the following two paradigms are supported:

1. Generation of a static proxy – the proxy class is generated once for a specific remote system and RFM. The client program regenerates the class when the RFM's interface changes.

2. Dynamic access to the values and metadata of all parameters – the values of the RFM's parameters can be read/set using get/set methods. The parameters (plus their types) are determined at runtime by the ACO.

One drawback here is that the ACO doesn't support bgRFC or t/qRFC.
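The second paradigm can be illustrated with a small conceptual sketch. The real ACO is ABAP-only, so the Python class, parameter names and types below are purely hypothetical; the sketch only shows the idea of discovering an interface at runtime and accessing parameter values through generic get/set methods instead of a statically typed proxy:

```python
# Conceptual analogy of ACO's "dynamic access" paradigm: the interface
# metadata (parameter names and types) is obtained at runtime -- here from a
# hard-coded dict standing in for the remote system's interface description.
class DynamicRfcCall:
    def __init__(self, metadata):
        # metadata maps parameter name -> expected Python type
        self._metadata = dict(metadata)
        self._values = {}

    def set_value(self, name, value):
        expected = self._metadata.get(name)
        if expected is None:
            raise KeyError(f"Unknown parameter: {name}")
        if not isinstance(value, expected):
            raise TypeError(f"{name} expects {expected.__name__}")
        self._values[name] = value

    def get_value(self, name):
        return self._values[name]

    def parameter_names(self):
        return sorted(self._metadata)

# "Retrieve" the interface metadata at runtime, then set values generically.
meta = {"IV_CARRIER": str, "IV_CONNID": int}
call = DynamicRfcCall(meta)
call.set_value("IV_CARRIER", "LH")
call.set_value("IV_CONNID", 400)
print(call.parameter_names())  # ['IV_CARRIER', 'IV_CONNID']
```

Because the caller never hard-codes the parameter list, a changed RFM interface shows up as changed metadata rather than as a short dump, which mirrors the robustness the ACO aims for.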

More information:

SAP Enterprise Threat Detection (ETD)

As more and more companies are facing security issues and threats, the topic of monitoring and evaluating security-related events becomes very important for every big software vendor. SAP has jumped on this bandwagon as well with its offering called SAP Enterprise Threat Detection.

The aim of this solution is to:
  • offer a real-time data platform for performing forensic investigations in order to discover suspicious patterns
  • automatically evaluate attack detection patterns
  • analyze and correlate logs
  • integrate custom log providers
  • find threats targeting SAP software


So, all in all, it helps to identify real attacks as they are happening and to analyze the threats quickly enough to neutralize them before serious damage occurs.

Technically it is based on processing the data collected by the Event Stream Processor (ESP). The ESP gets the data from SAP NetWeaver application servers (Java and ABAP), from the SAP HANA database and from non-SAP sources. The ESP then provides the collected data to the SAP HANA engine, which evaluates and analyzes it and generates alerts based on the analysis results. The analyses done in HANA are pattern-based. The patterns are developed and enhanced by SAP; customers can change them according to their needs and can also create completely new ones.
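The idea of pattern-based evaluation over a stream of log events can be sketched conceptually. The event format and the detection pattern below are illustrative inventions, not actual ETD content:

```python
# Minimal sketch of pattern-based alerting: events from various logs are
# normalized into dicts, each detection pattern is a function over a batch of
# events, and matches produce alert messages.
from collections import Counter

def failed_logon_pattern(events, threshold=3):
    """Alert on users with repeated failed logons (a classic detection pattern)."""
    fails = Counter(e["user"] for e in events
                    if e["type"] == "LOGON" and not e["success"])
    return [f"Repeated failed logons for user {u}"
            for u, n in fails.items() if n >= threshold]

events = [
    {"type": "LOGON", "user": "JDOE", "success": False},
    {"type": "LOGON", "user": "JDOE", "success": False},
    {"type": "LOGON", "user": "JDOE", "success": False},
    {"type": "LOGON", "user": "ADMIN", "success": True},
]
alerts = failed_logon_pattern(events)
print(alerts)  # ['Repeated failed logons for user JDOE']
```

A real system would run many such patterns continuously over correlated, high-volume streams; the sketch only captures the shape of "pattern in, alerts out".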

Currently there is an SP02 available for SAP Enterprise Threat Detection 1.0.

More information:

SAP Anywhere

Earlier this year SAP quietly introduced a new offering for the SME segment. As SAP already has a presence in this segment with SAP Business One (SBO) and SAP Business All-in-One, which are on-premise solutions, and with SAP Business ByDesign, which is a cloud solution, a push to the cloud for the SBO was logically expected.

SAP Anywhere is the first front-office suite for managing sales, marketing, e-commerce and inventory activities in one complete system. It is provided as software-as-a-service (SaaS), run by SAP in a public cloud, and it can be accessed via mobile devices and/or desktops. Right now the solution is available for the Chinese market, with the rest of the world to follow in 2016.

The question is how the new offering fits into the existing ones – especially the SBO. According to SAP, for companies running the SBO there are plans to evolve their SBO systems to SAP Anywhere. With regard to SAP Business ByDesign, there is an application programming interface (API) that can be provided to partners for integrating SAP Anywhere with the SAP Business ByDesign solution.

More information:
B1-ANYWHERE - Component name on SMP

Tuesday, December 29, 2015

How to put query into SAP BW cache?

There are a few different cache memories used within the SAP BW server. One of them is the OLAP cache. Its purpose is to store query result sets. Once a query result set is stored in the OLAP cache, it can later be reused by all users, and the system doesn't need to retrieve the data sets from the underlying BW InfoProvider objects (e.g. cubes) again.

Now, how does a particular BW query get stored in the BW server's OLAP cache? There are a few ways. The most common is that once a user executes a query, the result set for that query's request can be stored in the OLAP cache. Later, if the same query is executed again (possibly by another user), the query request can be filled by accessing the result set already stored in the OLAP cache.
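This reuse mechanism is, at its core, a result-set cache keyed by the query and its variable values. The sketch below illustrates the principle only; the function and names are made up and do not reflect the actual OLAP cache implementation:

```python
# Conceptual sketch of an OLAP-style result-set cache: the first execution
# computes the result set and stores it under a key derived from the query
# name and its variable values; later executions with the same key -- by any
# user -- are served from the cache.
cache = {}

def run_query(query_name, variables, compute):
    key = (query_name, tuple(sorted(variables.items())))
    if key in cache:
        return cache[key], True   # cache hit: no InfoProvider access needed
    result = compute(variables)   # expensive read from the InfoProvider
    cache[key] = result
    return result, False

# First user executes the query: the result is computed and cached.
res1, hit1 = run_query("Q_SALES", {"0FISCYEAR": "2015"}, lambda v: [("EMEA", 100)])
# Second user, same query and variable values: served from the cache.
res2, hit2 = run_query("Q_SALES", {"0FISCYEAR": "2015"}, lambda v: [("EMEA", 100)])
print(hit1, hit2)  # False True
```

Note that the key includes the variable values: the same query executed with different selections produces different cache entries, which is why warming the cache upfront (as described below in this post) must cover the variable combinations users will actually run.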

How can we get the query into the cache upfront, so that even the very first user's execution of the query hits the OLAP cache?

1. Pre-calculation server:
By using a simple web template which is published for the BW query and is scheduled to run in the background via the reporting agent. As the query has a variable, there can be multiple query views within the web template. In BW 7.x, Information Broadcasting can be used for this; in SAP BW 3.x, the reporting agent functionality is available instead of Information Broadcasting to achieve the same.

2. BEx Broadcaster:
The BW query can be published to the BEx Broadcaster from the BEx Query Designer tool. The Broadcaster runs on the Java stack, so once you trigger this from the BEx Query Designer you end up in a web environment. Various settings can be maintained here, such as the Distribution Type (Broadcast to: email, Portal, printer, etc.); one of the types is also filling the OLAP cache. The BEx Broadcaster settings can run regularly as a job, or this can be automated using process chains (process type: Trigger Event Data Change for Broadcaster).

In closing I want to show how a BW query loaded into the OLAP cache looks. Such queries can be monitored in t-code RSRCACHE (you can also jump there from t-code RSRT -> Cache Monitor). The picture below shows a BW query loaded into the OLAP cache. To housekeep the OLAP cache you can use the ABAP report RSR_CACHE_RSRV_CHECK_ENTRIES, e.g. to delete entries in the cache.



More information:

SAP HANA VORA

In August this year SAP announced that it is entering the analysis of big unstructured data. When it comes to big data we usually talk about Hadoop. Apache Hadoop is an open-source software framework written in Java; its purpose is to process and store very large data sets on computer clusters built from commodity hardware. In this regard SAP introduced a new product called SAP HANA Vora, an in-memory engine that allows applying business analytics to unstructured data, e.g. data provided by Hadoop-based systems. Using Vora, SAP enables SAP HANA to analyze the data sitting in Hadoop.

In general there are the following three use cases for Vora:

Precise decisions – combine business data with data from external sources.
Democratize data access – make data mashups based on enterprise and Hadoop data and analyze them in OLAP tools.
Simplify big data ownership – access and process corporate and Hadoop data in a single solution.
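The data mashup use case can be illustrated with a tiny sketch. In Vora this would be a SQL join across HANA and Hadoop sources; here plain Python with made-up sample data stands in for the idea:

```python
# Illustration of a "data mashup": joining enterprise (business) records with
# Hadoop-resident event data on a shared key, so both can be analyzed together.
customers = [  # enterprise/business data (e.g. from SAP HANA)
    {"id": 1, "name": "ACME"},
    {"id": 2, "name": "Globex"},
]
clicks = [     # event data (e.g. sitting in Hadoop)
    {"customer_id": 1, "page": "/pricing"},
    {"customer_id": 1, "page": "/docs"},
    {"customer_id": 2, "page": "/pricing"},
]

# Join on customer id: enrich each Hadoop event with the business entity name.
by_id = {c["id"]: c["name"] for c in customers}
mashup = [(by_id[e["customer_id"]], e["page"]) for e in clicks]
print(mashup)  # [('ACME', '/pricing'), ('ACME', '/docs'), ('Globex', '/pricing')]
```

The value lies in the join itself: once both worlds share a key, standard OLAP-style analysis can run over the combined data set in one place.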

As a curiosity, one may wonder what the name Vora means. It seems to be a combination of the words “velocity” and “Velociraptor”, shortened to just Vora.

More information: