Friday, March 29, 2024

Activation of SNP Glue objects after transport

In my last blog I wrote about SNP Glue transports. Once the Glue objects are moved to the target SAP system, there is one more step to be executed before the objects are ready to be used: the so-called object activation step. It can be performed in a dedicated t-code (/DVD/GLTR - DataVard Glue Transport Workbench), which has a button called Activate Request in the toolbar.


Once you hit that button, a pop-up window appears asking for the TR whose objects are to be activated.


The next pop-up lists all the objects contained in that TR. One can choose whether a particular object is to be activated or not. It is also possible to delete Glue objects by selecting the Delete checkbox. One more checkbox is to be ticked if the objects are new and are to be created/changed.


Once this pop-up is confirmed, the objects are physically activated. Afterwards they are visible in the Glue Object Navigator (t-code /DVD/GL80) and are ready to be used.

Transports of SNP Glue objects

In my previous post related to the SNP Glue integration tool I described how the tool can be leveraged to transfer data between Snowflake and SAP BW/4HANA systems. However, I left out the part about how to transport the SNP Glue objects, so I cover it in this post.

There is a dedicated t-code (/DVD/GLTR - DataVard Glue Transport Workbench) to manage transports of the objects developed for Glue.


The Glue transport t-code allows collecting the objects for the transport by package, user name or object name. Once the objects are entered in the selection screen, the Execute button writes them to a regular SAP STMS transport request. Here’s what such a request looks like:



The entries in the TR are basically transparent table entries, and it is these table values that get transported to the target SAP system. This is because a Glue transport contains metadata objects (object definitions) as well as ABAP objects like classes and programs. Based on this metadata, the physical Glue objects are recreated in the target system upon object activation.

Below are some of the tables that carry the metadata of particular Glue objects:

/DVD/GL_T_DD_TH – Glue table header objects

/DVD/GL_T_E2_MAP - Persistent Transformation/mapping

/DVD/GL_T_E2_OBJ - Extractor 2.0 object directory

/DVD/GL_T_E2_OBP - Extraction object's parameters

/DVD/GL_T_FLD_OB - Glue objects mapped to logical folder

/DVD/GL_T_FOLDER - Table for mapping folders inside packages

 

Once the objects are collected into the TR via t-code /DVD/GLTR, the TR can be released like any other SAP transport, e.g. in the SE10 t-code. At this point the TR is ready to be imported into the target SAP system.
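To double-check what such a Glue TR actually contains before importing it, the request’s object list can be displayed in SE01/SE10, or read directly from the standard transport tables. Below is a minimal ABAP sketch; the request number is a placeholder, E071 holds the request’s object list and E071K the keys of the transported table entries.

REPORT zshow_glue_tr_content.

* Placeholder transport request number - replace with the real Glue TR
CONSTANTS gc_trkorr TYPE trkorr VALUE 'DEVK900123'.

* Object list of the request; Glue metadata is typically transported as
* table content entries (PGMID R3TR, OBJECT TABU) of the /DVD/GL_T_* tables
SELECT pgmid, object, obj_name
  FROM e071
  WHERE trkorr = @gc_trkorr
  INTO TABLE @DATA(lt_objects).

LOOP AT lt_objects INTO DATA(ls_object).
  WRITE: / ls_object-pgmid, ls_object-object, ls_object-obj_name.
ENDLOOP.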



Sunday, March 24, 2024

Simple Snowflake to SAP BW data flow via SNP Glue

I was recently involved in a data transformation project that leveraged the SNP Glue tool. SNP Glue (formerly Datavard Glue) is designed to integrate and connect various SAP and non-SAP systems, facilitating data exchange, synchronization, and consolidation.

In my scenario we used Glue to transfer data located in a Snowflake system in the cloud to an on-premise SAP BW/4HANA system. There are a couple of Glue objects that need to be developed to enable the information exchange between Snowflake and SAP BW/4HANA:

1. Glue Storage

2. Glue table

3. SAP table

4. Glue Fetcher

5. Glue Consumer

6. Glue Extraction Process

7. SAP BW Datasource

8. SAP BW target infoprovider – e.g. aDSO

 

All these objects are developed in the SAP BW system where the Glue tool is installed. The main component of Glue is the SNP Glue Cockpit, which can be accessed via t-code /DVD/GLUE (ABAP program /DVD/GL_MAIN); all other parts of Glue can be reached from here. Glue can also be installed as an add-on, in which case a component called Glue shows up in the particular SAP BW installation.



1. Glue Storage – a central object that encapsulates all information needed to connect to the remote object the data will be read from or written to. The Storage is an SNP object that is shared between their tools (e.g. SNP Outboard). Below are the settings that need to be provided to the Storage object in order to connect to Snowflake; there is a dedicated t-code (/DVD/SM_SETUP) to maintain the storage. Two storages need to be defined for the connection to Snowflake (an example with placeholder values is sketched after the two lists):

1.1 Internal Glue Storage

Storage ID – logical name defined by the Glue developer; it usually has to follow a naming convention given by the particular development guidelines

Storage type – predefined as SNOW_STAGE, type binary

Description – free text; should describe e.g. the meaning of the data being transferred

Java connector RFC – ID of the connection to the Java connector (JCo). The JCo is mandatory and must be installed in the SAP BW system in order to use Glue. Like the Storage itself, the JCo connections are shared across different SNP tools. There is a special t-code (/DVD/JCO_MNG) to set up and control the JCo connections.

Account name - name of the Snowflake account, in the format: <account name>.<region>.<platform>

User role – I haven’t used this

JAVA Call Repeat – 0 by default

Repeat delay (seconds) - 0 by default

Driver path – path to the Snowflake JDBC driver, located on the SAP server

Connection pool size – 0 by default

Username – user on the Snowflake side

Password – password of the user above

 

1.2 Glue Storage

Storage ID - logical name, see above in internal storage. Notice that the ID is different from the ID used in internal storage.

Storage type - predefined as SNOWFLAKE, type TAB, for transparent storage

Description – see in internal storage

Referenced storage – ID of the internal storage; the main storage references the internal one

Java connector RFC - see in internal storage

JDBC Call Repeat – 0 by default

JDBC Repeat delay (seconds) – 0 by default

Account name - see in internal storage

Warehouse – an existing warehouse in the Snowflake account; this WH will be used to perform compute operations such as SQL statements in Snowflake

Database name – name of the database in the Snowflake

Database schema – name of the database schema in the Snowflake

User Role – user role in the Snowflake

Driver path – Snowflake driver path on the SAP server

Hints – string that is added to the connection string when the JDBC driver establishes the connection

Connection pool size – 0 by default, number of connections that can be kept open in the pool

File Type – CSV or Parquet, I used CSV type

Table name prefix - prefix of all Glue tables created within this storage

Use Snowflake App for data merge – I haven’t used it; if enabled, the SNP Native app is used

Wrap values in staged CSV files – I haven’t used it

Data delivery guarantee – data transfer behavior, EO (Exactly-once) in my case
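To make the settings above a bit more concrete, here is a sketch of how the main (transparent) storage could be filled in. All values are made-up placeholders; only the storage type and file type correspond to what is described above.

Storage ID           ZSNOW_DATA                  (placeholder name)
Storage type         SNOWFLAKE
Referenced storage   ZSNOW_STAGE                 (the internal storage from 1.1)
Java connector RFC   <your JCo destination>
Account name         xy12345.eu-central-1.aws    (<account name>.<region>.<platform>)
Warehouse            COMPUTE_WH
Database name        SALES_DB
Database schema      PUBLIC
Driver path          /usr/sap/<SID>/snowflake-jdbc-3.13.30.jar
File Type            CSV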

 

All other Glue objects listed below are maintained in the Object Navigator part of the SNP Glue Cockpit. The Object Navigator can also be accessed via a dedicated t-code – /DVD/GL80.


2. Glue table – can also be maintained in the dedicated t-code /DVD/GL11. The Glue table is a metadata object that represents the remote object (a Snowflake view in my case). It allows working with the data from that remote object in the SAP landscape. It contains the list of columns of the table or view in the remote DB – in my case, the list of columns of my Snowflake DB’s table. The Glue table must be activated before it can be used in other Glue objects. The table doesn’t store data persistently on the SAP side; the data is only read from the remote DB when the extraction process runs.

 

3. SAP table – an SAP DDIC table that stores the data physically once it is fetched from the remote DB. The data is stored persistently here.

 

The objects below (Fetcher, Consumer and Extraction Process) are part of the Glue Extractor 2.0.

4. Glue Fetcher – allows data transfer from the source to the target objects. It refers to the Glue Table object. It defines whether a particular column of the remote table/view is used for selection. Further, it defines which delta mechanism type is used (FULL, DATA, TIMESTAMP, VALUE, VALUE_DIST). In my case I used FULL extraction and a cursor as well.


5. Glue Consumer – comes into the picture when the data is written to the target. In my case it specifies the SAP table I created in step no. 3.


6. Glue Extraction Process – the object responsible for running the whole process, from reading the data from the source (by the Fetcher) to writing the data (by the Consumer). The Extraction Process can also be used to manage the execution of extractions and to monitor launched extractions. It has capabilities similar to running a DTP or a process chain in SAP BW.

Here the Fetcher and the Consumer need to be specified at design time. The whole process is driven by a generated Z* report; you can find its name in the Generated report name field. It is also possible to define a data transformation by specifying rules and start/end routines. In my case I used neither routines nor any transformation, just a pure 1:1 mapping.
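Because the extraction is driven by that generated Z* report, one way (a sketch only, not necessarily the recommended way) to trigger it outside the Glue cockpit is a small wrapper program that submits the generated report, e.g. as an ABAP step of a job or process chain. The report name below is a made-up placeholder and has to be replaced with the value from the Generated report name field.

REPORT zrun_glue_extraction.

* Placeholder - use the name shown in the "Generated report name" field
DATA gv_generated_report TYPE sy-repid VALUE 'ZGL_EP_SNOWFLAKE_SALES'.

* Trigger one extraction run; any selection parameters depend on the generated report
SUBMIT (gv_generated_report) AND RETURN.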


7. SAP BW DataSource – in my scenario I created a custom DS to be able to load the data into the BW aDSO object. The DS was based on the SAP table object I created in step no. 3.


8. SAP BW target InfoProvider – e.g. the aDSO that is the final storage of the Snowflake data in SAP BW. This is the object I used further for reporting – via a CompositeProvider.

 

To run the data replication from Snowflake to SAP BW/4HANA I used a process chain. The first step in the chain was to run the Glue Extraction Process, which got the Snowflake data into the SAP table. From there I used a DTP to load the data into my aDSO.
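For completeness, such a chain can also be started programmatically via the standard BW process chain API. A minimal sketch, assuming the chain’s technical name is ZPC_SNOWFLAKE_LOAD (a placeholder); verify the exact parameters of RSPC_API_CHAIN_START in SE37.

DATA lv_logid TYPE rspc_logid.

* Start the process chain that runs the Glue Extraction Process and the DTP to the aDSO
CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain = 'ZPC_SNOWFLAKE_LOAD'   " placeholder technical name of the chain
  IMPORTING
    e_logid = lv_logid.

WRITE: / 'Process chain started, log ID:', lv_logid.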

 

More information:

SNP Glue product page

SNP Glue documentation



Thursday, February 22, 2024

How to switch BPC environment state (on/off line)?

In one of my older posts, I explained a few ways to check the BPC environment state, i.e. whether it is online or offline. This time I focus on how to change the state of the BPC environment.

1. In the BPC web client: In the main screen of the web client, choose the Settings button in the top right corner. From there, click on the Environment button in the Settings drop-down menu.

Next, a new pop-up is shown listing all available environments in the system. In the bottom part of the window there is a “Manage All Environments” button. Click on that one.

A new screen called Manage All Environments is shown next. From here, choose the environment whose status you want to change and click on the “Change Status” button in the screen toolbar.


As a last step, a new pop-up window appears where the environment status can be changed via a radio button.


2. In the SAP BPC backend: Use t-code SE16 to browse table UJ0_PARAM_APP. Enter the following selection in the selection screen:

APPSET_ID = <your BPC environment tech name>

FIELD = AVAILABLEFLAG


In the VALUE column of the table there is either 0 or 1. Zero means the environment is set to offline and one means it is set to online.
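The same check can be scripted in a few lines of ABAP; the environment name below is a placeholder.

DATA lv_available TYPE uj0_param_app-value.

* Read the online/offline flag of one BPC environment
SELECT SINGLE value FROM uj0_param_app
  INTO lv_available
  WHERE appset_id = 'MY_ENVIRONMENT'
    AND field     = 'AVAILABLEFLAG'.

IF lv_available = '1'.
  WRITE: / 'Environment is online'.
ELSE.
  WRITE: / 'Environment is offline'.
ENDIF.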


3. In the SAP BPC backend – custom program: A small ABAP program can be created that changes the content of the VALUE column in table UJ0_PARAM_APP. That can be done via a call of method SET_APPSET_STATUS of class CL_UJA_APPSET.
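A minimal sketch of such a program is below. The class and method come from the note above, but the parameter names (i_appset_id, i_status) and the static call style are my assumptions – check the real signature of CL_UJA_APPSET=>SET_APPSET_STATUS in SE24 before using it.

REPORT zbpc_switch_env_status.

PARAMETERS: p_env  TYPE uj0_param_app-appset_id,
            p_stat TYPE c LENGTH 1 DEFAULT '1'.    " '1' = online, '0' = offline

* Parameter names are assumptions - verify the method signature in SE24
cl_uja_appset=>set_appset_status(
  i_appset_id = p_env
  i_status    = p_stat ).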

 

More information:

How to check BPC environment state (on/off line)?

Saturday, January 13, 2024

System does not let you log on using a password

In case a user gets the below message when attempting to log on to an SAP NetWeaver/ABAP Platform based system:

This system does not let you log on using a password

Message no 00139

 

It means that password-based logon to the particular system has been disabled. In such a case the user has to use a different authentication method to log on. The following methods exist:

• Using password (conventional logon) – method that is not possible in this case

• Using an external security product (SNC)

• Using an X.509 browser certificate (intranet/Internet)

• Using a Workplace Single Sign-On (SSO) ticket

 

In case the majority of the users in the respective SAP system use an authentication type other than password, password logon should be disabled. The disabling can be done via the system profile parameter called login/disable_password_logon.

The parameter’s default value is 0, which means that password-based logon is possible. The other allowed values are listed below (a short profile example follows the list):

1 - Password logon only for users of the group specified by login/password_logon_usergroup

2 - Password logon is no longer possible in general
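For illustration, this is how the restriction could look in the system’s default or instance profile – a sketch only, the user group name is a made-up example:

# allow password logon only for members of one user group
login/disable_password_logon   = 1
login/password_logon_usergroup = PWD_USERS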


More information:

Profile Parameters for Logon and Password (Login Parameters)

320991 - Error codes during logon (list)