Debugger in SAP Data Services

This post describes how to use the debugger in Data Services.

Using the interactive debugger

The Designer includes an interactive debugger that allows you to examine and modify data row by row by placing filters and breakpoints on lines in a data flow diagram.
A debug filter functions as a simple query transform with a WHERE clause. Use a filter to reduce a data set in a debug job execution. A breakpoint is the location where a debug job execution pauses and returns control to you.
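As a rough sketch of the concept (outside Data Services, with hypothetical rows and a hypothetical region column), a debug filter reduces the data set exactly like a WHERE clause, and a breakpoint is simply a pause point where you can inspect the rows that pass:

```python
# Hypothetical rows flowing through a data flow; names are illustrative.
rows = [
    {"customer_id": 1, "region": "APAC"},
    {"customer_id": 2, "region": "EMEA"},
    {"customer_id": 3, "region": "APAC"},
]

# A debug filter acts like a simple WHERE clause: only matching rows continue.
filtered = [r for r in rows if r["region"] == "APAC"]

# A breakpoint pauses execution so you can examine each row before continuing.
for row in filtered:
    pass  # in the Designer, you would inspect `row` here, then resume
```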
This exercise demonstrates how to set a breakpoint and view data in debug mode.

SAP Data Services Designer tool

This post gives you a short overview of the Data Services product and terminology. Refer to the post SAP BO DATA Integrator / Data Services for more details.

Data Services Components

The following diagram illustrates the Data Services product components and their relationships:


SQL Transform in SAP BODS


The SQL transform lets you import a schema into your dataflow that can act as a source.

Create a new batch job, then add a workflow and a dataflow. In your dataflow, drag in the SQL transform from the local object library. You can find it under the 'Platform' set of transforms.

Double-click the transform.

(Refer post Validation Transform to create the required tables in the database)

Select your datastore, write a simple select statement accessing a table already imported in your datastore. Click on ‘Update Schema’.

Observe that the schema now appears in the upper pane.

[Screenshot: imported schema shown in the upper pane]

Add a query transform and add the columns required to the output schema. Join this with a target template table. Save and execute your job.


This transform is particularly useful when you want to push only a particular subset of a database table to the target.
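Conceptually, the SQL transform runs a SELECT you write and exposes the result as a source schema. The following Python sketch uses an in-memory sqlite3 database standing in for the datastore; the table, columns, and query are made up for illustration:

```python
import sqlite3

# In-memory database standing in for the datastore (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT, regionId INTEGER)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?, ?)",
    [(1, "Asha", 1), (2, "Ben", 2), (3, "Carl", 1)],
)

# The transform executes your SELECT; 'Update Schema' derives the output
# columns from the query, much like reading the cursor description here.
cursor = conn.execute("SELECT id, name FROM employee WHERE regionId = 1")
schema = [col[0] for col in cursor.description]  # column names of the result
target_rows = cursor.fetchall()                  # the subset pushed downstream
```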

Refer to the ebook for more details:

New Ebook – SAP BODS Step by Step

Case Transformation in SAP BODS

The Case transform is part of the Platform set of transforms in Data Services. It implements branching logic, i.e. it separates source data into multiple output data sets based on conditions.

For example, the source data from different countries is diverted to separate country tables based on certain conditions.

The condition based on which the data is branched has two parts: Label and Expression.

The label is the path name to the target table, and the expression contains the SQL logic that separates the data.

For example, define a label Region_INDIA whose expression is Employee.regionId = 1.

Here, Employee is the source table having regionId as a column.
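The label/expression branching can be sketched outside Data Services as follows; the labels, expressions, and sample rows in this Python snippet are illustrative, and each row is routed to the first label whose expression is true (the transform's default single-match behavior):

```python
# Case transform sketch: each label pairs with an expression; rows are
# routed to the output of the first expression that evaluates to true.
cases = {
    "Region_INDIA": lambda row: row["regionId"] == 1,
    "Region_US":    lambda row: row["regionId"] == 2,
}

employees = [
    {"name": "Asha", "regionId": 1},
    {"name": "Ben",  "regionId": 2},
    {"name": "Carl", "regionId": 1},
]

outputs = {label: [] for label in cases}
for row in employees:
    for label, expression in cases.items():
        if expression(row):
            outputs[label].append(row)
            break  # route each row to at most one output
```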

To understand this more clearly, log in to Data Services and create some sample data in Microsoft SQL Server as shown below:

How to transport an Export Datasource

An export datasource is created when you want to use an InfoProvider to load data into other targets, which may reside in the same system or in a different one. An export datasource is created by default for a DSO, but that is not the case for cubes. Below I discuss one scenario where I created and transported an export datasource.


You want to load data from a cube in one system into a cube in another system. Let's say the source system is an SAP BW system with system ID SRD, and the target is another BW system with system ID TGD.

SRD – SRT – SRP (Source system landscape)
TGD – TGT – TGP (Target system landscape)

Now there is a cube in the SRD system which has data. This data needs to be loaded into a cube in the TGD system based on certain filters and conditions, and the change must finally move to the production systems (SRP and TGP).

The first step is to request a TR (transport request) in both the SRD and TGD systems.



Transport failed with RC=12 error


Sometimes while transporting changes from the Dev to the Quality system, or from Quality to Production, the TR fails with an RC=12 error. This is a critical error that does not depend on the contents of the TR. Many scenarios can lead to it; I described one in an earlier post, and this post covers another case where I encountered this error.

Root Cause Analysis

In this case, there was a tablespace issue in the quality system. While transporting the changes, incomplete objects were transported, resulting in a dump in the quality system and the TR failing with the RC=12 error.

Below is a screenshot of the error we got in the TR:

[Screenshot: RC=12 error shown in the TR log]

Below are screenshots of the dump we were getting in the quality system:

[Screenshots: short dump in the quality system]


We asked the Basis team to check the tablespace in the quality system. They extended the tablespace of the table mentioned in the error log/short dump, and the TR was then moved successfully.

Hope this helps.

Validation Transform in SAP BODS


The Validation transform is very similar to the Case transform. It also comes under the 'Platform' set of transforms in Data Services.

It is used to validate data and route it to Pass and Fail tables. The validation rules, which can be simple or complex, are defined in this transform.

Rules can be written for each individual column.

One important point to mention here is that a FAIL rule is stronger than a PASS rule: a row passes only when it satisfies all the conditions, but it fails if any one condition is not satisfied.

A Validation transform has one input schema and two output schemas.

In the Fail output schema, there are two extra columns: one is Error Action and the other is Error Column.

The Error Action column tells whether the row was sent to the Pass schema, the Fail schema, or both.

The Error Column holds the information about which column failed.

An extra table, 'Validation_RuleViolation', is also generated containing the error details.

There is also an 'Action on Failure' option. Here you can direct the system to send a failed record to the Pass table, the Fail table, or both, and optionally substitute a text value in place of the failed column value.
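The pass/fail routing described above can be sketched in Python; the rules, sample rows, and the ERROR_ACTION/ERROR_COLUMN field names below are illustrative stand-ins for the transform's Error Action and Error Column columns:

```python
# Validation transform sketch: per-column rules; a row passes only if
# every rule holds, and any single failure routes it to the Fail output
# with extra error fields. All names here are illustrative.
rules = {
    "name":     lambda v: v is not None and v != "",
    "regionId": lambda v: v in (1, 2),
}

rows = [
    {"name": "Asha", "regionId": 1},
    {"name": "",     "regionId": 9},
]

passed, failed = [], []
for row in rows:
    bad_columns = [col for col, rule in rules.items() if not rule(row[col])]
    if bad_columns:
        failed.append({**row,
                       "ERROR_ACTION": "F",                     # sent to Fail
                       "ERROR_COLUMN": ",".join(bad_columns)})  # which rules broke
    else:
        passed.append(row)
```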


Here, we will use the same database created in the Case transform example.

Create a new project 'validation_transform'.

Extracting ECC data to SQL Server using BODS

This post is about how to extract data from an SAP ECC system and store it in a SQL Server database using Data Services as the ETL tool.

  • Make sure that you are able to connect to the SAP ECC system.
  • Make sure you have already created a SQL Server database, a local repository, and a job server for this purpose. Refer to this post on how to create them.
Steps involved

Below are the main steps involved in this process.