Central Repository in BODS

This post describes how to set up a central repository using the Repository Manager.

Below are the main steps involved:

  • Create a database in SQL Server
  • Create a central repository using the Repository Manager
  • Log in to the Designer and add the central repository
  • Upload a job from the local repository to the central repository

1. Log in to SQL Server and create a database “DB_CENTRAL” (this will hold the central repository)

See the screenshots below.



Click OK and the database will be created. You can see it in the left-hand pane of the screen.
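If you prefer scripting this step instead of clicking through SSMS, the DDL is a one-liner. A minimal Python sketch that builds the statement (the database name DB_CENTRAL comes from the post; the helper itself is illustrative — you would run its output in SSMS or via sqlcmd):

```python
def create_repository_db_sql(db_name: str) -> str:
    """Build the T-SQL that creates an empty database for a
    Data Services repository; run the output in SSMS or sqlcmd."""
    # Reject names that would need escaping beyond simple brackets
    if not db_name.replace("_", "").isalnum():
        raise ValueError(f"unexpected characters in database name: {db_name!r}")
    return f"CREATE DATABASE [{db_name}];"

print(create_repository_db_sql("DB_CENTRAL"))
# CREATE DATABASE [DB_CENTRAL];
```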


2. Create a central repository

Transformation not getting deleted in RSA1

Recently I came across a scenario where I was not able to delete an inactive transformation during development.

I had created a DSO and loaded data into it. I then realized that the DSO needed different InfoObjects. To correct the DSO, I deleted its data first, then added the new InfoObjects and activated the DSO.

The DSO was activated successfully; however, the transformation and DTP became inactive. I deleted the DTP successfully.

I went to edit the transformation and it showed warnings like “rule is invalid”. I pressed continue, and after a couple of warnings I got a dump as shown below.

Setting up the system in BODS

Below are the basic steps to follow when setting up a Data Services system:

1) Create a database in SQL Server (or any supported back-end DB). This will act as the local repository for the Designer.
2) Create a repository (Local, Central or Profiler) using the Repository Manager.
3) Assign the repository to the Data Services Designer (Management Console).
4) Define a Job Server (Server Manager).
5) Start with the Designer (Data Services Designer).


I) The first step is to create a database in SQL Server. It will contain all the information of the intermediate layers, such as temp tables (simple temp tables for one-to-one mappings, or temp tables that hold the transformed data).
You can create a separate user ID and password for this database, which you can then use when logging in to the Data Services Designer.
The following screenshots illustrate how to create the database:

Log in to SQL Server Management Studio.
Navigation :- Start → Program Files → Microsoft SQL Server 2005 → SQL Server Management Studio Express
Enter the credentials:
Authentication :- SQL Server Authentication
Login :- your user name
Enter the password and click Connect.
You are now in SQL Server Management Studio.


Now select the Databases folder on the left-hand side and right-click → New Database.

Give the database name “how2BODS” and click OK.
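The separate user ID mentioned earlier can also be created with plain T-SQL rather than through the UI. A hedged Python sketch that builds the statements (the login name and password are placeholders; `sp_addrolemember` is used since the post targets SQL Server 2005):

```python
def repository_user_sql(db_name: str, login: str, password: str) -> list:
    """T-SQL statements that create a dedicated login/user owning the
    repository database; execute them in order in SSMS."""
    return [
        f"CREATE LOGIN [{login}] WITH PASSWORD = '{password}';",
        f"USE [{db_name}];",
        f"CREATE USER [{login}] FOR LOGIN [{login}];",
        f"EXEC sp_addrolemember 'db_owner', '{login}';",
    ]

for stmt in repository_user_sql("how2BODS", "bods_user", "ChangeMe!1"):
    print(stmt)
```

You would then use this login when registering the repository in the Repository Manager and when logging in to the Designer.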

SAP BO DATA Integrator / Data Services

Data Services is part of SAP BusinessObjects and is also known as SAP BODS (SAP BusinessObjects Data Services). It is integrated with SAP applications and also supports non-SAP databases.

Data Services performs ETL processing of data: it loads data from the source to the target, modifying and cleansing it in the process. For every data flow, a back-end SQL statement is generated; it can be viewed via the main menu: Validation → Display Optimized SQL.

[Screenshot: Display Optimized SQL]
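To get a feel for what Display Optimized SQL shows: when a simple Query transform that maps and filters columns can be pushed down, the whole data flow collapses into a single SELECT against the source. A rough Python sketch of that shape (the table and column names are invented for illustration):

```python
def pushed_down_select(source_table, mappings, where=None):
    """Approximate the SELECT that would be pushed down for a
    one-to-one Query transform: output column <- source expression."""
    cols = ", ".join(f"{expr} AS {out}" for out, expr in mappings.items())
    sql = f"SELECT {cols} FROM {source_table}"
    if where:
        sql += f" WHERE {where}"
    return sql

print(pushed_down_select(
    "CUSTOMER_SRC",
    {"CUST_ID": "ID", "CUST_NAME": "UPPER(NAME)"},
    where="COUNTRY = 'US'",
))
# SELECT ID AS CUST_ID, UPPER(NAME) AS CUST_NAME FROM CUSTOMER_SRC WHERE COUNTRY = 'US'
```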

The ETL job runs in memory, which makes SAP BODS one of the fastest ETL tools on the market.



Advantages of Data Services over SAP BI/BW ETL process

  • It is simple to use and has a user-friendly framework
  • It has in-built configuration for many types of data sources, like flat files, XML, Hadoop, etc.
  • It has many in-built transforms, like Key Generation, Case, Merge, etc.
  • It has separate jobs for batch execution and real-time loads. It can also perform delta loads.
  • There is no concept of process chains, DTPs, or InfoPackages when you use Data Services to load the data.

Data integrator / Services Architecture


[Figure: Data Integrator / Data Services architecture]


Data Integrator Components

Designer

  • It is used to create the ETL data flows
  • All Designer objects are reusable

Management Console (URL-based / web-based tool)


  • It is used to activate the repositories
  • You can create users and user groups and assign roles and privileges here
  • It allows you to schedule batch jobs automatically, monitor them, and view their execution history

Access Server

  • It receives XML input (real-time data)
  • XML inputs can be loaded to the warehouse through the Access Server
  • It is responsible for executing online / real-time jobs
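For illustration, a real-time message is just an XML document that the Access Server routes to a real-time job. A small sketch using Python's standard library (the element names are made up, not a Data Services schema):

```python
import xml.etree.ElementTree as ET

def build_realtime_request(order_id: str, amount: str) -> bytes:
    """Build a minimal XML message of the kind an Access Server would
    receive and hand to a real-time job (element names are invented)."""
    root = ET.Element("OrderRequest")
    ET.SubElement(root, "OrderID").text = order_id
    ET.SubElement(root, "Amount").text = amount
    return ET.tostring(root)

msg = build_realtime_request("4711", "99.90")
print(msg.decode())
# <OrderRequest><OrderID>4711</OrderID><Amount>99.90</Amount></OrderRequest>
```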

Repository Manager


  • It allows you to create the repositories (Local, Central, and Profiler)
  • Repositories are created in a standard database (Oracle, Microsoft SQL Server, HANA, etc.)
  • The Data Services system tables live in the repository, which is simply a database, e.g. in Microsoft SQL Server. Any new tables imported in the Designer are stored in the local repository as external tables; template tables that have not been imported are stored as internal tables.
  • A novice learner can skip the central repository; it is only needed when multiple users access the Designer.

Meta Data Integrator

  • It generates auto documentation
  • It generates sample reports and semantic layers
  • It generates job-based statistics dashboards

Job Server

This is the server responsible for executing jobs. Without assigning a local/central repository to it, you cannot execute a job.

Designer Objects

Projects :-

A project is a folder where you store all related jobs in one place. Only one project can be opened at a time in the Data Services Designer.


Jobs :-

Jobs are the executable part of Data Services. A job sits under a project. There are two types of jobs:

  1. Batch jobs
  2. Real-time jobs

Work Flows:-

A work flow acts as a folder containing related data flows. Work flows are re-usable. They are optional, i.e. you can execute a job that contains a data flow and no work flow.


Conditionals :-

A conditional contains work flows or data flows and is controlled by scripts: a script decides whether the conditional is triggered or not.


Scripts :-

Scripts are sets of code used to define or initialize global variables, control the flow of conditionals or of execution, print statements at runtime, and assign specific default values to variables.
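BODS has its own scripting language, but the script/conditional interplay can be mimicked in any language. A Python analogue (the load-type variable and the work flow names are illustrative, not from the tool):

```python
def choose_flow(g_load_type: str) -> str:
    """'Conditional': the value set by the 'script' (a global
    variable) decides which work flow is triggered."""
    return "WF_DELTA_LOAD" if g_load_type == "DELTA" else "WF_FULL_LOAD"

# "Script": initialize the global variable, then let the conditional route
g_load_type = "DELTA"
print(choose_flow(g_load_type))  # WF_DELTA_LOAD
```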

Data Flow:-

The actual data processing happens here. 

Source Data Store:-

This datastore connects your Data Services Designer to your source system.

Target Data Store:-

This datastore connects your Data Services Designer to your target system database.


Transforms :-

These are the transforms used to carry out the ETL process. They are broadly categorized into 3 types (Platform, Quality and Integrator).

File Format :-

It defines various legacy system file formats like XLS, XLSX, CSV, TXT, etc.


Variables :-

You can create local and global variables and use them in the project. Variable names start with the “$” symbol.
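The “$” rule can be captured in a small check (the $G_ prefix for global variables is a common naming convention, not enforced syntax):

```python
import re

def is_valid_variable(name: str) -> bool:
    """A Data Services variable name must start with '$'."""
    return re.fullmatch(r"\$\w+", name) is not None

print(is_valid_variable("$G_START_DATE"))  # True
print(is_valid_variable("START_DATE"))     # False
```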


Functions :-

There are numerous in-built functions (string, math, lookup, enrich, and so on) provided in the Designer.

Template Table:-

These are temporary tables used to hold intermediate or final data. They can be converted into permanent tables (i.e. tables stored in the database) by importing them.

Data Store:-

A datastore acts as a port through which you define the connections to the source or target systems.


Refer to the ebook for more details:

New Ebook – SAP BODS Step by Step