Monday, December 31, 2018

A year in review – 2018

This year treated me very well, I must say. It was a year in which I took some months off from my professional career, working only on a part-time basis. Thanks to this I had time for my side projects. For example, I kept up with installing developer editions of SAP software: SAP NetWeaver 7.51 SPS2, SAP NetWeaver 7.52 SPS1 and SAP HANA 2.0 express edition (HXE). For a client I also installed and maintained SAP ERP6 IDES EhP8 on the Windows/MSSQL platform.

The most precious thing that happened this year was the time I spent with my family. I also completed a few things around our household that had stayed unfinished for some time.

From my professional point of view as an SAP consultant, this year was about learning and visiting a few IT/SAP events. I completed 10 online courses via open.sap.com and open.hpi.de and read 3 books from SAP Press.

Regarding the events, it all started in January with an SAP CodeJam in Prague. In April I attended a local Slovak IT conference called codecon and a workshop organized by SAP Slovakia - Innovation driven by data. In May it was the SAP Forum Slovensko 2018 event. Then in October I was lucky enough to attend SAP TechEd; this happened because I participated in the TechEd Tutorial Mission contest held in August. TechEd was the highlight of the year for me: I met many people there, learned a bunch of new things and had a great time. Finally, in autumn there was another workshop organized by SAP Slovakia - Modern data warehouses and analytics. There was also another SAP CodeJam in Vienna in November, followed the next day by sitVIE. The sitVIE was also a great event that I enjoyed very much.

I would recommend that anyone who has a chance to join any SAP community event take part in it. These events are not just about information and gaining knowledge; you also get a chance to meet many people from the SAP community around the world. And that is what is special about them.

As this is my very last post this year, I wish all my readers (whether regular ones or just those who came here by accident) a successful and wonderful New Year 2019!

Native DataStore object (nDSO)

The nDSO object resides in the HANA DB and was introduced by the SAP HANA Data Warehousing Foundation (DWF) to model SAP HANA SQL Data Warehouse systems. It is derived from a BW concept that is now accessible in SQL-based warehousing (native DWH). The nDSO object is built upon the SAP HANA XS advanced application server framework. The similarity to BW means that three tables are generated for the object: the active data table, the change log (also called the protocol or delta table) and the new data (inbound queue) table. The tables are exposed as CDS (Core Data Services) entities.

The nDSO provides persistence of data with additional semantics like:

- delta handling (before/after image) to move, aggregate & load delta data containing deleted records
- request management to allow roll-back of data loads
- (selective) deletion of data, of single requests or of the complete content of an nDSO


Integration of the nDSO (which can be seen as part of a native DWH) with SAP BW (BW/4HANA) is possible: nDSOs can be used as source objects of BW DataSources. Within HANA, the nDSO can be used as the source for calculation views.
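
Since the generated tables are exposed as CDS entities, they can be read with plain SQL. A minimal sketch using SAP's hdbcli Python driver - the schema and entity names below are hypothetical (the real names are generated from the nDSO definition):

from hdbcli import dbapi  # SAP HANA client for Python

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DWH_USER", password="secret")
cur = conn.cursor()

# Active data table: the current, activated state of the nDSO.
cur.execute('SELECT * FROM "DWH"."com.acme.dwh::SalesNDSO.active_data"')
print(cur.fetchall())

# Change log: before/after images per request, usable for delta propagation.
cur.execute('SELECT * FROM "DWH"."com.acme.dwh::SalesNDSO.change_log"')
print(cur.fetchall())

cur.close()
conn.close()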

Modeling of the nDSO objects is embedded into SAP Web IDE for SAP HANA, using HANA CDS as the metadata description language.

More information:
HAN-APP-DWS-DSO (DataStore Object) - SAP support site component

Sunday, December 30, 2018

SAP HANA Data Warehousing Foundation (DWF)

The SAP HANA Data Warehousing Foundation (DWF) is a suite of packaged tools that aims to support large-scale SAP HANA use cases like the ones below while leveraging HANA's data management tools:

- HANA Table, Partition and Application Distribution Management in HANA Scale-Out configurations
- Data Temperature Management
- Modeling
- Execute and Monitor

The tools were developed to reduce TCO when running HANA and to support HANA administrators and data warehouse designers. The SAP HANA DW Foundation complements SAP's data warehouse offerings like BW/4HANA, SAP BW powered by SAP HANA (BoH), SAP HANA SQL Data Warehouse and the HANA Agile Data Mart, as well as mixed HANA scenarios.

The DWF product family is shipped within individual Delivery Units. Those delivery units are HCO_HDM (HANA DWF Core), HCO_HDM_DDO (Data Distribution Optimizer), HCO_HDM_DLM (Data Lifecycle Manager) and HDC_HDM (HANA DWF Documentation).

The components of the DWF product suite are briefly described below:


Component - Description

Native DataStore Object (NDSO) - A semantically rich persistency object within HANA that can be used to manage full and delta data loads. The NDSO has the same capabilities as the advanced DataStore Object (aDSO) known from BW systems.
Data Warehouse Scheduler (DWS) - Allows task chains to be scheduled in HANA developments to define dependency graphs for HANA artifacts, e.g. dependencies between single processes, with the focus on provisioning DWH models (see the sketch below this list).
Data Lifecycle Manager (DLM) - Models aging rules on tables to displace (archive) aged data to HANA extended tables, remote data stores, MultiStore tables, a HANA extension node, Dynamic Tiering, SAP IQ, Hadoop or SAP Vora, in order to optimize the memory footprint of data in HANA.
Data Warehouse Monitor (DWM) - Provides a comprehensive overview of the activities going on in the DWH, such as scheduled, completed and failed task chains within a selected HDI container and data lifecycle manager profiles, as well as an overview of all NDSOs.
Data Distribution Optimizer (DDO) - Used to plan, adjust and analyze landscape reorganizations for HANA scale-out systems.
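
The DWS itself is modeled in SAP Web IDE rather than coded, but the task chain idea - a dependency graph of single processes - can be sketched generically. A minimal Python sketch; the task names and dependencies are purely illustrative, not the DWS API:

from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical task chain: two loads must finish before activation,
# which in turn must finish before dependent views are refreshed.
chain = {
    "activate_ndso": {"load_sales", "load_customers"},
    "refresh_views": {"activate_ndso"},
}

# static_order() yields the tasks so that every dependency runs first.
for task in TopologicalSorter(chain).static_order():
    print("running", task)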


More information:
2435452 - Release Note SAP HANA Data Warehousing Foundation 2.0
SAP support site components:
HAN-APP-DWS-DSO (DataStore Object)
HAN-APP-DWS-DWS (Data Warehousing Scheduler)
HAN-APP-DWS-DLM (SAP HANA Data Lifecycle Manager)
HAN-APP-DWS-DDO (SAP HANA Data Distribution Optimizer)

SAP Data Hub (SAP DH)

In 2017 SAP introduced a new product for data integration called the Data Hub. The main purpose of the product is to help solve and overcome issues like the constantly growing amount of data available today (big data), reduced accessibility of data due to the proliferation of cloud-based software, data governance risks (e.g. GDPR in the EU), data disconnected because it sits in silos, missing links between the data, lack of data readiness, etc.

The traditional approach to solving some of the issues above would be to load the data into some central place (e.g. a data warehouse) using ETL tools. The new way taken by the Data Hub is to orchestrate the data via pipeline-driven integration, operations and governance.

Data Pipelines in the Data Hub are flow-based applications consisting of reusable and configurable operations (e.g. ETL, data preparation, code execution, connectors, etc.).
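
To illustrate what such a configurable operation can look like, here is a minimal sketch of a custom Python operator. The global api object (with set_port_callback and send) is injected by the Data Hub pipeline engine at runtime, so it is not imported; the port names "input" and "output" are hypothetical:

# Runs inside a Data Hub pipeline; `api` is provided by the engine.
def on_input(msg):
    # A trivial reusable operation: normalize the incoming string payload.
    api.send("output", str(msg).strip().upper())

api.set_port_callback("input", on_input)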

There are so-called workflows available in the Data Hub solution to orchestrate processes across the data landscape (e.g. executing data pipelines, triggering SAP BW process chains, SAP Data Services jobs, etc.).

From a data governance point of view, the product has a metadata repository of the information stored in the connected landscape. This supports discovery, profiling and search capabilities for the data.




Basically, what it does is organize the data in a system landscape. It enables accessing and harmonizing information from a variety of sources by unifying the metadata in a catalog.

It can connect all types of data sources (e.g. enterprise systems, data lakes, etc.). It can then organize and manage all data assets coming from these sources. It can also orchestrate and monitor data processes within the different systems. And finally, it can integrate existing assets (e.g. Python scripts on a data lake, process chains in SAP BW/4HANA, etc.).

The SAP Data Hub is part of the SAP HANA Data Management Suite (HDMS) suite of products. The product is integrated with other SAP and non-SAP solutions like:

- BW/4HANA – a BW/4HANA process chain can start a workflow task in the DH, as BW/4HANA has a specific process type "Data Hub Workflow".

- Hadoop – the Data Hub can write to HDFS files via an OpenHub destination; connectivity is via HTTP.

A picture - courtesy of SAP SE from its marketing material.

More information:
EIM-DH (SAP Data Hub) - SAP support site component
2466184 - SAP Data Hub: Central Release Note

SAP PowerDesigner (SAP PD)

Apart from SAP Enterprise Architecture Designer (EAD), there is one more product in SAP's portfolio in the area of enterprise architecture (EA). It is SAP PowerDesigner (PD), a product with a long history; SAP gained it through the acquisition of Sybase in 2010.

It enables data modeling and metadata management for the following enterprise IT architecture areas: data, information and enterprise architecture. The PD brings impact analysis, design-time change management and metadata management techniques to enterprise IT projects.

The PD tool can help visualize, understand and manage the impact of change to enterprise systems before it happens. It supports end-to-end modeling through so-called model-driven architecture (MDA) design with industry-standard modeling techniques, a powerful metadata repository and unique Link and Sync technology. The PD's metadata repository supports collaboration and communication between all the enterprise IT project stakeholders. It also facilitates responses across the teams involved, which contributes to business agility.

Difference between PowerDesigner and Enterprise Architecture Designer 
This may sound familiar if you read my other blog regarding SAP Enterprise Architecture Designer (EAD). The two products overlap in some areas. The difference between the two mostly concerns customization of the tools: whereas PowerDesigner is highly customizable, customization of the EAD is limited. The source for this information can be found here. The other, more obvious difference is that SAP EAD is a natural progression of PowerDesigner, targeting customers who want to move to a cloud-first SaaS solution for enterprise architecture. The EAD is SAP's strategic solution for enterprise architecture now.


More information:
BC-SYB-PD (PowerDesigner) - SAP support site component
2494295 - Overview and information about SAP PowerDesigner

Friday, December 28, 2018

SAP Enterprise Architecture Designer (EA Designer or EAD)

Within the SAP HANA Data Management Suite (HDMS) there is a central tool which serves as a single point of design, documentation and reference from an enterprise architecture (EA) point of view.
It can be used for capturing, analyzing and presenting many artifacts important in architecting IT objects. These objects can be landscapes, systems, strategies, data, business processes, requirements, application models and others used in IT environments. Depicting the mentioned objects by industry standards, using metadata and diagrams, enables proper understanding of the EAD's outcomes in different stages of IT projects.

A short list of the different diagram types that are supported by the tool:

- Business Process diagrams (e.g. BPMN 2.0: descriptive and executable)
- Conceptual Data, Physical data modeling & data architecture: Tables & Views, Virtual table definitions, Data Movement Models (Flowgraphs), Native DataStore Objects, HANA CDS associations
- Data Movement Diagram
- Enterprise Architecture Diagram
- NoSQL Document Schema (JSON)
- Process Map
- Requirements List

Here are a few examples of capabilities that are supported by the tool:

Integrated design - Translate business strategy to technical implementation requirements
Development automation - Generate architecture and technical artifacts automatically
Corporate knowledge - Drive collaboration from a single point of truth for all stakeholders
Flexible deployment - Deploy artifacts to SAP HANA or SAP Cloud Platform

The tool is web based, so the user works in a web browser the whole time. It aims to support a comprehensive end-to-end modeling experience and has reverse-engineering capabilities too. It supports model comparison, lineage and impact analysis. It saves generated SAP HANA HDI-compatible files either to local ZIP files (for later import into SAP Web IDE) or to a Git repository (for Git integration with SAP Web IDE).

The tool has its place in the data warehousing area, as it is used in the SAP SQL Data Warehousing solution. There it serves during the design phase, when the new SQL-based DWH is modeled in the tool. The models needed for such a DWH include: Conceptual Data Model (CDM), Physical Data Model (PDM), LifeCycle Metamodel, CalcView and Data Movement Model.

SAP SQL Data Warehousing

SAP SQL Data Warehousing reflects modern trends in data warehousing solutions. It leverages the SAP HANA Data Management Suite (HDMS) as the foundation of modern data warehousing. It basically brings SQL capabilities to SAP's traditional style of data warehousing (e.g. BW based).
Moreover, there is a need to address things like the majority of non-SAP data that many customers have in their landscapes, integration of 3rd-party DWH solutions, development agility, and the existing SQL knowledge and database skills that customers' IT departments acquired using non-SAP solutions. All this pushed SAP to deliver SQL-based DWH solutions.

What are the possibilities for running SAP SQL Data Warehousing?

1. Using SAP BW/4HANA – this means combining SAP SQL Data Warehousing with SAP BW, where it is best to leverage BW/4HANA. Use cases here can be the consumption of SQL-based data marts and data warehouses by SAP BW. The best case for SAP customers is running SAP SQL Data Warehousing and BW/4HANA on the same platform/system, which helps simplify the system landscape.

2. Using SAP HANA Data Management Suite (HDMS) – see my blog about the HDMS here. In this use case mostly the SAP HANA XS Advanced engine is leveraged, plus other tools. In the design phase, the future SQL DWH is modeled in SAP Enterprise Architecture Designer (EAD), which is part of HDMS. The next phase is the development of the DWH: here it is possible to clone, edit and build the DWH artifacts using SAP Web IDE for SAP HANA (again part of HDMS). In this case the development artifacts are SAP HANA content and models. The HDMS toolset also covers the tools for deploying (HANA Deployment Infrastructure (HDI); Application Lifecycle Management (ALM); External Repository Services) and running (SAP Data Warehousing Foundation tools like the Data Warehouse Scheduler (DWS); Data Lifecycle Manager (DLM); Data Warehouse Monitor (DWM); Data Distribution Optimizer (DDO); etc.) the SQL-based DWH.

More information:
Getting Started with SAP SQL Data Warehousing

SAP HANA Data Management Suite (HDMS)

At this year's SAPPHIRE, SAP announced a platform of tools for governing and integrating data across multiple, federated data environments. It is built on top of existing SAP products, which form the components of HDMS, as described below:



1. DB platform: SAP HANA. The HDMS leverages the HANA platform, which means all of its components run on HANA with all of HANA's benefits, like in-memory computing, advanced DB capabilities and different engines (spatial/graph processing, text analysis, search and mining, predictive/machine learning, streaming, series data storage and processing), etc.

2. Orchestration and governance: SAP Data Hub is used to centralize metadata management across different systems. The main functions of this component are:
- Data Pipelines – flow-based applications consisting of reusable and configurable operations, e.g. ETL, preparation, code execution, connectors
- Workflows – orchestrate processes across the data landscape, e.g. executing data pipelines, triggering SAP BW process chains, SAP Data Services jobs and many more
- Governance – a metadata repository of the information stored in the connected landscape, offering discovery, profiling and search capabilities

3. Multi-modal modeling: SAP Enterprise Architecture Designer (EA Designer or EAD) is used as the HDMS component to model the systems in the landscape and the processes around them. It is used for conceptual, logical and physical modeling of SAP and non-SAP data.

4. Cloud services: the SAP Cloud Platform Big Data Services component is used to deliver a full-service big data cloud based on Hadoop and Spark.

Tools of SAP HDMS:
SAP HANA Extended Application Services (XSA/Cloud Foundry)
SAP HANA Web IDE
SAP HANA Application Lifecycle Management (ALM)
SAP HANA Deployment Infrastructure (HDI)
SAP HANA EIM Services (SDI/SDQ); SDI = Smart Data Integration; SDQ = Smart Data Quality
SAP Enterprise Architecture Designer
SAP HANA Processing Engines & Services
SAP HANA Data Tiering
SAP HANA Data Warehousing Foundation (DWF)
SAP Agile Data Preparation (ADP)
SAP HANA Based Services on SCP
Open Source Tools like Git, Jenkins, etc.


The HDMS is part of SAP's initiative called Intelligent Data, and it shall support SAP's customers in their transition to an Intelligent Enterprise. The term Intelligent Data can be described as data (data ingestion from various sources, smart data streaming for event capture, high data quality) in a single and unified view, together with smart data integration that enables innovative/advanced applications and data management.

More information:
Intelligent Data

All pictures - courtesy of SAP SE from its marketing materials.

Wednesday, December 19, 2018

abapGIT

Ten years back we had a tool called SAPlink that enabled sharing of ABAP code. A few years ago a new initiative started that aimed to implement a version-control system for tracking changes to ABAP code. abapGIT is an open-source initiative that basically does for ABAP what the git tool does for other programming languages: source-code management in software development, supporting multiple programmers working on the same code. SAPlink is obsolete now; abapGIT fully replaces it and offers much more.

Installation is fairly simple: you just need to manually create one ABAP report, and that's it. There are no includes, as everything is contained in the single ABAP source ZABAPGIT. You can use abapGIT in two so-called modes:

offline mode – This is used e.g. to pack your programs into a ZIP file (export) and deploy them on another SAP system (import). You can also import any ABAP source code from git servers (like github.com) just by exporting the code into a ZIP file and importing it into the SAP system.

online mode – Enables you to clone any Git repository from git servers (like github.com) by providing its clone URL.

For the online mode, certificates need to be installed on the SAP system in order to enable SSL. This is described in the manual: it involves downloading the certificates from github.com and installing them via t-code STRUST. There you first import the certificate via the Import button, and afterwards you need to add it to the certificate list with the respective button. If the certificate is not added, the whole SSL setup won't be complete.
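
As an illustrative shortcut for obtaining a PEM file to import, Python's standard library can fetch the server certificate (note that for a complete STRUST setup you typically need the issuing CA certificates, not just the server certificate):

import ssl

# Fetch github.com's server certificate in PEM format for import via STRUST.
pem = ssl.get_server_certificate(("github.com", 443))
with open("github.pem", "w") as f:
    f.write(pem)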



Lastly, you need to add the following system profile parameters in order to complete the SSL setup. This can be done via t-code RZ10 or directly by editing the profile file (/sapmnt/<SID>/profile/<SID>_<instance>_<hostname>).

ssl/client_ciphersuites = 150:PFS:HIGH::EC_P256:EC_HIGH
ssl/ciphersuites = 135:PFS:HIGH::EC_P256:EC_HIGH

More information:
Latest build: zabapgit.abap


Saturday, December 8, 2018

Software Provisioning Manager (SWPM) – list of SAPinst properties

SAPinst is part of the Software Provisioning Manager (SWPM) and is a tool used for the logistics of SAP software products: it basically executes installations/upgrades of SAP software. In this post I'd like to list all the parameters (called properties) that can be used when running SAPinst.

To get information about the version of SAPinst, use the parameter -version. It will give output similar to the following:

SAPinst build information:
Version:         749.0.49
Build:           1838559
Compile time:    Apr 24 2018 - 09:56:11
Make type:       optU
Codeline:        749_REL
Platform:        ntamd64
Kernel build:    749, patch 500, changelist 1837688
SAP JRE build:   SAP Java Server VM (build 8.1.038 9.0.4+011, Mar 19 2018 16:55:53 - 81_REL - optU - windows amd64 - 6 - bas2:303791 (mixed mode))
SAP JCo build:   3.0.18
SL-UI version:   2.6.22
SAP UI5 version: 1.50.4

To get a list of all the properties, just run it with the parameter -p. Here are all the properties:

SAPINST_BROWSER Path to the browser executable that is started with the SLP access URL.
Windows only: If SAPINST_BROWSER is not set, the default browser of the system is started. For more information, see SAP Note 2336746.
Value type: string
Default value:

SAPINST_CONTROL_URL Path and name of a control file 'control.xml' or a catalog file 'product.catalog'.
If this parameter is missing, the first unknown parameter that is not a valid parameter is used as the control file.
Value type: string
Default value:

SAPINST_CRL_PATH        The path to the downloaded CRL. Download the CRL from https://tcs.mysap.com/crl/crlbag.p7s
Value type: string
Default value: ~/.sapinst/crlbag.p7s

SAPINST_CWD       Directory that is used as working directory. Requires write permissions to that directory.  This parameter has the same effect as previously changing to this directory.
Value type: string
Default value:

SAPINST_EXECUTE_PRODUCT_ID       With this value you can directly process the required option using the attribute ID at the 'component' tag in the 'product.catalog' file, for example id="NW_ABAP_OneHost:S4HANACLOUD.CORE.HDB.ABAP". If no option exists with this ID, the 'Welcome' screen displays all available options.
Value type: string
Default value:

SAPINST_HTTPS_PORT   HTTPS port in SL Common-GUI mode
Value type: number
Default value: 4237

SAPINST_INIFILE_URL   File to be used for reading initial values of component parameters. For more information, see SAP Note 950619. Aliases: SAPINST_PARAMETER_CONTAINER_URL
Value type: string
Default value: inifile.xml

SAPINST_INPUT_PARAMETERS_URL   Path and name of a text file containing 'key=value' pairs to be used preferentially when evaluating component parameters. A component parameter attribute 'defval-for-inifile-generation' defines the key. Its value is used as the initial value when processing the component for the first time.
Value type: string
Default value:

SAPINST_IPv6_ACTIVE   If set to 'true', the usage of IPv6 is enabled by default
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_MESSAGE_CONSOLE_THRESHOLD Trace level console
Value type: string
Allowed values: flow_trace, trace, info, phase, warning, error, external(for ACC)
Default value: error

SAPINST_MESSAGE_DEVLOG_THRESHOLD   Trace level developer logfile
Value type: string
Allowed values: flow_trace, trace, info, phase, warning, error
Default value: flow_trace

SAPINST_MESSAGE_GUILOG_THRESHOLD    Trace level standard logfile 'sapinst.log'
Value type: string
Allowed values: flow_trace, trace, info, phase, warning, error
Default value: info

SAPINST_REMOTE_ACCESS_USER       Name of an OS user who is authorized to log on remotely with name and password via the GUI. By default, this is the user of the SAPinst process. For more information, see SAP Note 1745524.
Value type: string
Allowed values: the login name of the user who can logon to the remote UI
Default value:

SAPINST_REMOTE_ACCESS_USER_IS_TRUSTED     If set to 'true', warnings concerning the remote access user are disabled. This user must be a trusted user.  For more information, see SAP Note 1745524.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_SKIP_DIALOGS          If set to 'true', all dialogs are skipped.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_SKIP_ERRORSTEP      If set to 'true', the first step with status ERROR is skipped; step status is set to 'OK'.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_STACK_XML     Path and name of the stack configuration file 'stack.xml' as described in SAP Note 1680045.
Value type: string
Default value:

SAPINST_STACK_XML_ACTIVE_SID    For unattended mode only: Contains the SAPSID of the active target system if working with a stack configuration file 'stack.xml'.
Value type: string
Default value:

SAPINST_START_GUI      If set to 'true', on Windows the default browser starts automatically with SL Common UI.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: true

SAPINST_START_GUISERVER   If set to 'true', the SL Common UI server starts automatically. Alternatively, option '-noguiserver' can be used to turn off the SL Common UI server.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: true

SAPINST_STOP_AFTER_DIALOG_PHASE       If set to 'true', the execution stops after processing all steps of the 'Define Parameters' phase. Used for generating parameter files only. If set to 'false', the 'Execute Service' phase is also performed.
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_TRACE_JSLIB   Switch to activate the JS trace functionality for specific JS classes. The values 'flow_trace' and 'Retry' activate the flow trace. Aliases: SAPINST_JSLIB_TRACE
Value type: string
Allowed values: comma separated list of JS classes or all
Default value:

SAPINST_TRACE_KERNEL         Determines the trace level of the kernel trace library   Aliases: SAPINST_MSLIB_TRACE
Value type: number
Allowed values: 0, 1, 2 or 3
Default value: 1

SAPINST_TRACE_RFC      If set to 'true', traces for RFC calls are written. CAUTION: This can cause passwords to appear in logfiles. Aliases: SAPINST_RFCLIB_TRACE
Value type: bool
Allowed values: true = {true, yes, 1}, false = {false, no, 0}. Other values are not allowed.
Default value: false

SAPINST_USE_HOSTNAME        The framework returns the value of that property if the getHostName() function is called. The value is not checked. For more information, see SAP Note 1564275.
Value type: string
Default value:


Selfextractor properties:
-sfxver:
  display selfextractor version information within a message box and ends.

-sfxverNB (Windows only):
  prints selfextractor version information without a message box and ends.

-noMessageBox (Windows only):
  Selfextractor errors are usually displayed within a message box.
  Using this parameter suppresses message boxes; errors are written to stdout only.

-exitOnError (Windows only):
  After a successful installation the selfextractor window is closed automatically.
  In case of an error the window stays open to show logging information.
  Setting this property will close selfextractor window automatically also in error case.

-extract [<directory>]:
  unpacks SAPinst files to <directory> or to the current directory if <directory> is not specified.
  If <directory> does not exist, it is created.

* Example of property usage:
SAPINST_SKIP_DIALOGS=false
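
A fuller, illustrative combination for an unattended run - the path is hypothetical, the product ID is the example one from the SAPINST_EXECUTE_PRODUCT_ID description above, and the parameter file is assumed to have been generated by a previous run with SAPINST_STOP_AFTER_DIALOG_PHASE=true:

./sapinst SAPINST_INPUT_PARAMETERS_URL=/tmp/inifile.params SAPINST_EXECUTE_PRODUCT_ID=NW_ABAP_OneHost:S4HANACLOUD.CORE.HDB.ABAP SAPINST_SKIP_DIALOGS=true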

Return codes:
0   success
1   unknown exception
2   user decided to stop
3   iSeries specific initialization problem
4   framework error; contact SAP support (BC-INS-FWK)
5   incomplete service; contact SAP support (fitting subcomponent below BC-INS)
6   stop after dialog phase requested
7   unknown exception during step execution
8   problems creating support archive
9   problems in GUI subsystem; contact SAP support (BC-INS-FWK)
10  SLC could not be started in server mode because of missing encryption jars within the used JVM
11  executed step requested to "abort the installation"
12  terminated by a signal
13  service needs to display a dialog but no GUI is available; restart SAPinst with GUI.
14  executed step stopped service execution
15  problems concerning the SDT server's user (default 'sapsdt')
16  problems with loading of DLL modules (only for Windows)
17  framework error; see console output
18  reserved for framework issues; not used at the moment
19  reserved for framework issues; not used at the moment
20  reserved for framework issues; not used at the moment
21 - 110  error code was defined by an exception thrown by an installation step // see message.xml
111       step execution ended with an error without specifying a return code via a message identifier
112 - 126 error code was defined by an exception thrown by an installation step // see message.xml
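
To show how these return codes might be consumed in automation, here is a hedged Python sketch - the sapinst path and the properties passed are illustrative only:

import subprocess

# Run SAPinst unattended and branch on its documented return codes.
rc = subprocess.run(
    ["./sapinst",
     "SAPINST_INPUT_PARAMETERS_URL=/tmp/inifile.params",
     "SAPINST_SKIP_DIALOGS=true"],
).returncode

if rc == 0:
    print("installation finished successfully")
elif rc == 6:
    print("stopped after dialog phase as requested")
elif rc in (4, 9, 17):
    print("framework error - check console output / contact SAP support")
else:
    print("SAPinst ended with return code", rc, "- check sapinst.log")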