While working with the function module RFC_READ_TABLE
I came across the following issue. When I called it with input parameters
specifying a very wide table (e.g. MARA), the exception DATA_BUFFER_EXCEEDED
was thrown.
I checked the ABAP code, and what it
does is raise the exception whenever the data extracted for a single table row
is longer than 512 bytes. In other words, the function module limits the
extracted data to 512 bytes per row, and there is nothing you can do about it.
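The length check behind that exception can be sketched in a few lines of Python. The 512-byte figure corresponds to the line width of the DATA table that RFC_READ_TABLE returns rows in; the field lengths used below are purely illustrative, not the real MARA column definitions.

```python
# Sketch of the check RFC_READ_TABLE effectively performs: the combined
# output length of all requested fields (plus the delimiters between them)
# must fit into one 512-byte DATA line, or DATA_BUFFER_EXCEEDED is raised.

DATA_LINE_WIDTH = 512  # width of one row in RFC_READ_TABLE's DATA table


def fits_in_buffer(field_lengths, delimiter_width=1):
    """Return True if a row built from these field lengths fits in one line."""
    if not field_lengths:
        return True
    row_width = sum(field_lengths) + delimiter_width * (len(field_lengths) - 1)
    return row_width <= DATA_LINE_WIDTH


# A narrow selection of fields fits comfortably ...
print(fits_in_buffer([18, 4, 40]))   # -> True
# ... but requesting every column of a wide table like MARA does not.
print(fits_in_buffer([50] * 20))     # -> False
```

This is also why restricting the FIELDS parameter to a handful of columns often avoids the exception even on wide tables: the check is on row width, not on the number of rows.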
However, there is a workaround of sorts.
There is another function module, Z_AW_RFC_READ_TABLE, which does not have that
limitation. I found it to work perfectly: in all the cases I tested, I
never ran into the same issue as with RFC_READ_TABLE.
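To show how the workaround would be used from outside SAP, here is a minimal sketch of calling the alternative function module over RFC. It assumes Z_AW_RFC_READ_TABLE shares RFC_READ_TABLE's interface (QUERY_TABLE, DELIMITER, FIELDS, ROWCOUNT, and a DATA table whose rows carry the component WA), and that you have an RFC connection object such as a `pyrfc.Connection`; the table and field names are only examples.

```python
def read_table(conn, table, fields, delimiter="|", max_rows=100):
    """Read rows from an SAP table via Z_AW_RFC_READ_TABLE.

    `conn` is any RFC connection exposing .call(fm_name, **params),
    e.g. a pyrfc.Connection. The FM is assumed to take the same
    parameters as RFC_READ_TABLE.
    """
    result = conn.call(
        "Z_AW_RFC_READ_TABLE",
        QUERY_TABLE=table,
        DELIMITER=delimiter,
        FIELDS=[{"FIELDNAME": name} for name in fields],
        ROWCOUNT=max_rows,
    )
    # Each DATA row comes back as one delimited string in component WA.
    return [row["WA"].split(delimiter) for row in result["DATA"]]
```

With pyrfc this would be invoked as, for example, `read_table(Connection(ashost=..., sysnr=..., client=..., user=..., passwd=...), "MARA", ["MATNR", "MTART"])`.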
Note, however, that although the second
function module is delivered by SAP, it sits in the Z* (customer) namespace. It is
included in development package ZDAV, which belongs to the BusinessObjects Data
Integrator (BODS) development class. It is not clear to me why it is still in the Z*
namespace; several years have passed since SAP
acquired BusinessObjects, so SAP has had enough time to migrate it.
On the plus side, I must admit that at least
the issue is correctly described and documented in SAP Note 1752954 -
DATA_BUFFER_EXCEEDED error - Data Services.
4 comments:
Hi Martin,
It seems like this FM is not available on every system ;) Here, it is missing.
Maybe it depends on the SP level, or on which function modules are activated?
Christophe.
Hi Christophe,
yes, it is not in every NetWeaver ABAP based system. What I observed is that it is available in systems like ECC but not in BW systems. I do not think it is an SP matter. Simply, as the FM is part of BODS, SAP puts it in every system where extraction of business data makes sense. In any case, if it is not there, just copy it from a system where it is available - as Note 1752954 suggests.
cheers
m./
The most current table read FM for SAP Data Services is SAPDS/RFC_READ_TABLE.
These improvements came with DS 4.1:
SAP table reader in regular data flows: The Data Services SAP table reader in regular data flows can now fetch data in batch from an SAP system.
This new implementation allows the reader to process large volumes of data and mitigates out-of-memory errors. Also included are an Array fetch size option, which allows the data to be sent in chunks, avoiding large caches on the source side, and an Execute in background (batch) option, which lets us run the SAP table reader in batch mode (using a background work process) for time-consuming transactions.
Thanks Anonymous, just to correct the FM name; it is: /SAPDS/RFC_READ_TABLE