Extracting ECC data to SQL Server using BODS

This post explains how to extract data from an SAP ECC system and store it in a SQL Server database using SAP BusinessObjects Data Services (BODS) as the ETL tool.

Prerequisites
  • Make sure that you are able to connect to the SAP ECC system.
  • You have already created a database in SQL Server, a local repository and a job server for this purpose. Refer to this post on how to create them.
Steps involved

Below are the main steps involved in this process:

  • Create a new data store with the credentials to connect to the ECC system
  • Browse the tables under the data store and import the required SAP tables into Data Services
  • Copy the SAP table data to staging tables in the SQL database and use the staged tables for the subsequent steps
Step 1: Creating a data store to connect to the SAP ECC system
  • Open the SAP GUI and check the system details of the ECC system. Note the application server name and system ID.
  • Log in to the Data Services Designer and create a data store in the local object library under the Datastores tab.

[Screenshot: creating the data store]

  • Give the data store a name, provide the SAP ECC server connection details, and set the advanced parameters after clicking on the ‘Advanced’ tab:

[Screenshot: data store name]

[Screenshot: advanced parameters]

Data store name: DS_SAP_IDES
Data store type: SAP Applications
Application server: (your ECC application server)
Username: (your SAP user name)
Password: (your password)
Client number: (your client number)
System number: (your system number)
Work directory: \\<application server name>\
Application path: \\<application server name>\
Generate directory: C:\Program Files\Business Objects\BusinessObjects Data Services
Security profile: SAP_ALL
Number of connection retries: 1

Please note: make sure that you change ‘Number of connection retries’ to 1. The default is 3, so if you enter wrong credentials, Data Services will try to connect to the SAP system three times with the wrong password and the SAP user ID will be locked after the three failed logins.

You can find the newly created Data store under the local object library.

Step 2: Import the ECC tables
  • Select the data store, right-click and choose Search to browse the tables in the SAP ECC system:

[Screenshot: searching the data store]

  • Select ‘External data’ using the drop-down button, enter the table name MARA and click Search.

[Screenshot: searching for tables]

  • Right-click the table and select ‘Import’. Import the MARA and MARC tables into the data store. (MARA holds the general material master data; MARC holds the plant-level material data.)

[Screenshot: imported tables]

Now your SAP tables are ready for use in the job.

As per SAP AIO practice, first save the SAP table data to the local database; this avoids connecting to the SAP system every time the data needs to be fetched.

Step 3: Creating staging tables and executing the job
  • In the Designer, create a job and a data flow; then, inside the data flow, drag the tables MARA and MARC onto the canvas:

[Screenshot: tables dragged onto the canvas]

  • Now we need to save these tables’ data into the local database. Drag in a Query transform:

[Screenshot: Query transform for the tables]

Click the query to open it and move the desired fields from Schema In to Schema Out.

Here you can see tabs such as Mapping, SELECT, FROM, Outer Join, WHERE and so on. These are used to filter the data or to join tables; they are explained in more detail later.
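For illustration, a filter on the WHERE tab is written in Data Services expression syntax. A minimal sketch, assuming you wanted to keep only finished goods (MTART is the material type field of MARA; ‘FERT’ is just an example value):

    MARA.MTART = 'FERT'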

  • Now click Back and create a template table.
  • Click the Template Table icon on the right-side pane and then click on the canvas. Give the table a name (for example STG_MARA), select the data store in which you want to save it, and click OK. Connect the query to the template table.

[Screenshot: creating the template table]

[Screenshot: template table connected to the query]

  • Do the same for the MARC table and click Save. You are now saving the SAP table data to SQL Server.
  • Select the job under the project area, right-click and execute the job.
  • The execution properties dialog appears; click OK.

[Screenshot: job execution properties]

  • Under ‘Trace’ you can see the job log.
  • In the Monitor tab next to Trace, you can see the status, the number of records fetched and the time taken:

[Screenshot: job monitor]

With this, the data load to SQL Server is complete.

You can see the tables and records by logging in to SQL Server Management Studio Express and selecting your database.

Conditional flows in Data Services

Global variables

Global variables are variables that are available throughout the job. They are typically assigned a value once, for example in an initialization script, and can then be referenced anywhere in the job.

Defining a global variable: go to Tools -> Variables in the Designer window.

[Screenshot: defining a global variable]

You will get the variables and parameters screen.

  • Select ‘Global Variables’, right-click and click Insert.

[Screenshot: inserting a global variable]

  • Double-click the global variable and define it:

[Screenshot: global variable definition]

  • Enter the variable name and data type as shown above and click OK. Similarly create a second global variable, $GV_load.
Script

Scripts are used to initialize the global variables and to drive the job. You can stop part of the job from executing by setting the corresponding global variable to ‘N’.

  • Click the Script icon on the right-side pane, click on the canvas and give the script a name.

[Screenshot: script on the canvas]

  • Click the script to open it and define the global variables as below:

[Screenshot: script contents]
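A minimal sketch of such an initialization script, written in the Data Services script language (the name $GV_Extract is assumed here for the extract flag; use whichever names you created above):

    # Control flags for the conditional flows that follow.
    # 'Y' lets a branch execute; 'N' skips it.
    $GV_Extract = 'Y';
    $GV_load = 'Y';

Setting either flag to ‘N’ before execution skips the corresponding branch without changing the job design.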

Go back.

Conditional Flows

Global variables are used in the conditional flows to control the job flow.

  • In the Designer, select the Conditional icon on the right side of the canvas

[Screenshot: conditional flow icon]

and click it on the canvas. Double-click the condition and enter the global variable expression inside:

[Screenshot: the condition]

[Screenshot: global variable in the condition]
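The ‘If’ field of a conditional takes a Boolean expression in Data Services syntax. A sketch, again assuming the extract flag is named $GV_Extract:

    $GV_Extract = 'Y'

The objects placed in the Then pane execute only when this expression evaluates to true; the Else pane can be left empty.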

  • Do the same for the load condition, using $GV_load.
  • Now drag the data flow used for the extraction into the first condition as above. Click Back and open the second condition. Drag a data flow from the right-side pane, place it under the load condition as below, and name it DF_Load.

[Screenshot: the load condition]

  • Open the data flow by clicking on it. Now drag in the tables STG_MARA and STG_MARC from the STAGING data store as below:

[Screenshot: tables in the data flow]

  • Next, drag a Query transform from the right-side pane and click it on the canvas. Connect the source tables STG_MARA and STG_MARC to the Query transform, then click the query to open it.

[Screenshot: Query transform connected to the source tables]

  • You can see the two tables under Schema In. Expand them and drop the required fields into Schema Out. Here you need to join the two tables in the WHERE condition: click the WHERE tab, drag MATNR from STG_MARA into the condition, type “=”, and then drag MATNR from STG_MARC (a sketch of the resulting expression follows this list).

It is mandatory to join the tables when you use two tables as a source; without the join the query would produce a Cartesian product.

[Screenshot: join and filter in the WHERE tab]

  • You can also filter the data for a particular plant in the WHERE condition, as shown above.
  • Click Back, create a template table and connect the query to it.
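For reference, a sketch of the complete WHERE expression, combining the mandatory join with the optional plant filter (WERKS is the plant field of MARC; the plant value ‘1000’ is purely illustrative):

    STG_MARA.MATNR = STG_MARC.MATNR
    AND STG_MARC.WERKS = '1000'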

With this, the procedure for joining two tables using the WHERE condition is complete.

  • Link the script to Condition1 and Condition1 to Condition2 as shown below:

[Screenshot: linked script and conditions]

  • Now click Validate / Validate All at the top of the screen to check that the job is correct.

[Screenshot: validating the job]

  • Run the job: select the job under the project area, right-click and click Execute.

[Screenshot: executing the job]

[Screenshot: job log]

[Screenshot: job monitor]

This concludes the use of global variables, scripts and conditional flows: extracting the data from SAP, loading it into SQL Server, and then joining the staged SQL tables based on a condition.