ODIExperts.com

The blog for Oracle Data Integrator ( ODI )

April 4, 2010
by kdevendr
17 Comments

Categories: How to , ODI , Tips and Tricks



Flow Control and Static Control

Flow Control

– If enabled, this option uses the selected CKM and applies it before loading into the target, thus preventing bad data from being loaded.

clip_image001

What actually happens in the above flow is that, after loading the data into the I$ table:

  • A check table (SNP_CHECK_TAB) is created, and the previous error table and previous errors are deleted, as ODI generally does.
  • A new error table (E$) is created, and checks are run for primary key and unique constraints, other constraints and conditions defined at the database level or as ODI model-level conditions, and a Not Null check for each column marked Not Null.
  • Records that violate these constraints and conditions are added to the E$ table, with an entry in SNP_CHECK_TAB recording the schema, error message, count, etc.
  • Finally, the remaining records are inserted and updated as per the KM logic.
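The per-constraint checks above can be pictured as plain SQL. This is only a hand-written sketch — the table, column, and constraint names (I$_EMP, E$_EMP, EMPLOYEE_ID, PK_EMP) are invented for illustration, and the real CKM generates more columns and steps:

```sql
-- 1. Not Null check on a column marked NOT NULL:
INSERT INTO E$_EMP (ROW_ID, ERR_TYPE, ERR_MESS, CHECK_DATE, EMPLOYEE_ID, FIRST_NAME)
SELECT ROWID, 'F', 'EMPLOYEE_ID is mandatory', SYSDATE, EMPLOYEE_ID, FIRST_NAME
FROM   I$_EMP
WHERE  EMPLOYEE_ID IS NULL;

-- 2. Primary/unique key check: any key value appearing more than once is rejected.
INSERT INTO E$_EMP (ROW_ID, ERR_TYPE, ERR_MESS, CHECK_DATE, EMPLOYEE_ID, FIRST_NAME)
SELECT ROWID, 'F', 'PK_EMP violated', SYSDATE, EMPLOYEE_ID, FIRST_NAME
FROM   I$_EMP I
WHERE  EXISTS (SELECT 1 FROM I$_EMP D
               WHERE  D.EMPLOYEE_ID = I.EMPLOYEE_ID
               GROUP BY D.EMPLOYEE_ID
               HAVING COUNT(1) > 1);

-- 3. Remove the rejected rows from the flow so only clean data reaches the target.
DELETE FROM I$_EMP WHERE ROWID IN (SELECT ROW_ID FROM E$_EMP);
```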

E$ Table

clip_image002

In the E$ table, the complete row columns plus the error message, error count, and constraint name are recorded along with the associated ROW_ID.

SNP_CHECK_TAB

image

Static Control

– If enabled, this option uses the selected CKM and applies it after loading into the target.

clip_image004

Here, because of the PK constraint on the target table, the insert has failed.

Let's see an example where the target table has no constraint at the database level but we apply one at the ODI level.

image

The key is applied on EMPLOYEE_ID at the ODI level.

clip_image006

Here, after inserting into the target table, ODI uses the selected CKM, checks all the constraints and conditions, and finally loads the rejected records into the E$ table and SNP_CHECK_TAB.
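In SQL terms, a static check against the already-loaded target looks roughly like this sketch (TRG_EMP, E$_TRG_EMP, and the key name are assumed names, not the exact generated code):

```sql
-- Duplicate check on the ODI-level key, run against the target after the load:
INSERT INTO E$_TRG_EMP (ROW_ID, ERR_TYPE, ERR_MESS, CHECK_DATE, EMPLOYEE_ID)
SELECT ROWID, 'F', 'PK_TRG_EMP violated', SYSDATE, EMPLOYEE_ID
FROM   TRG_EMP T
WHERE  EXISTS (SELECT 1 FROM TRG_EMP D
               WHERE  D.EMPLOYEE_ID = T.EMPLOYEE_ID
               GROUP BY D.EMPLOYEE_ID
               HAVING COUNT(1) > 1);
```

Unlike flow control, the bad rows are already in the target; static control only reports them into E$ and SNP_CHECK_TAB.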

clip_image007

Check at the Datastore Level.

To check for constraint violations or bad records on the source side, ODI has an option called Check at the Datastore; when run, it executes the CKM on that particular datastore and stores the rejected records in the respective E$ table.

clip_image008

clip_image009

After fetching the errors, if any are found, ODI creates an E$ table and makes an entry in SNP_CHECK_TAB.

Constraints, conditions, and duplicates can be checked at the datastore level too.

clip_image010

clip_image011

Checking duplicates –

clip_image012

Foreign Key Constraints

clip_image013

Or run a check at the datastore level for a complete check of all the constraints using the CKM defined in the model.

clip_image014

Recycle Errors

ODI KMs have an option called Recycle Errors. When enabled, ODI reads from the E$ table and loads into the I$ table all the records that are not already present in I$, matching on the primary key or unique keys.
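In SQL terms, the recycling step looks roughly like the sketch below (table and column names are invented; the real step is generated from the KM template and carries more housekeeping columns):

```sql
-- Bring previously rejected rows back into the flow table, but only those
-- whose key is not already present in the current I$ load.
INSERT INTO I$_EMP (EMPLOYEE_ID, FIRST_NAME, LAST_NAME)
SELECT E.EMPLOYEE_ID, E.FIRST_NAME, E.LAST_NAME
FROM   E$_EMP E
WHERE  NOT EXISTS (SELECT 1 FROM I$_EMP I
                   WHERE  I.EMPLOYEE_ID = E.EMPLOYEE_ID);

-- Recycled rows then go through the CKM checks again; rows that now pass
-- are removed from the error table.
```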

Let's look at an example. Here I have two duplicate rows stored in my E$ table; now I am going to remove one duplicate record.

image

I am making the change in the source file (for my example), saving it, and re-running the interface.

image

image

The new record is inserted, and the recycled records are also deleted from the error table.

image

image

March 28, 2010
by kdevendr
3 Comments

Categories: Administration , Architecture , How to , ODI , Technology , Tips and Tricks



Export DWR to EWR [ Common Master Rep ]

This post explains the process of exporting scenarios from a DWR to an EWR. As Cezar suggested in a previous post, it is important to have one single DWR (Development Work Repository) and multiple EWRs (Execution Work Repositories), as multiple DWRs can lead to poor code maintenance and improper synchronization between them. Read more in that post here.

In this example I am moving from my DWR to an EWR [testing environment].

Generally, before UAT, it is important to have a testing environment where scenario functionality, code, and data validity are tested. I strongly feel it is important to have such an environment before releasing to UAT, so that less code fails at UAT and many minor issues can be fixed at this stage, avoiding more failures at later stages.

In this example I am creating the EWR in the same master repository and using a different context and agent, but it is not necessary to do so if both are on the same box. The only drawback I see is that using the same agent for development and testing can lead to performance degradation; but again, it is the architecture and the requirements that matter in deciding how to allocate the resources.

Step 1

  –   Creating the Execution Work Repository schema, tablespace, and other database permissions for users.
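On Oracle, Step 1 typically boils down to something like the following. All names, sizes, and grants here are just an assumed example — adapt them to your own standards:

```sql
-- Tablespace and schema for the Execution Work Repository (sample names).
CREATE TABLESPACE EWR_DATA
  DATAFILE 'ewr_data01.dbf' SIZE 200M AUTOEXTEND ON;

CREATE USER ODI_EWR IDENTIFIED BY odi_ewr
  DEFAULT TABLESPACE EWR_DATA
  QUOTA UNLIMITED ON EWR_DATA;

-- Minimal privileges for the repository owner to create its tables.
GRANT CONNECT, RESOURCE TO ODI_EWR;
```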

Step 2

  –  Creating the EWR in ODI Topology Manager.

Right click -> Insert Work Repository.

Provide a meaningful name and the EWR schema user/password created in Step 1.

image

Provide the  JDBC Driver, URL.

image

Click Test, and once successful, click Apply and OK.

Provide the new ID, making sure it differs from the ID of the other DWR already created. Select the 'Execution' type and provide a meaningful name, as that will be your Execution Work Repository name. Click OK once done.

image 

We have our  EWR created

image

As you can see, in the EWR mostly tables related to scenarios, sessions, sequences, and variables are created, and the EWR has fewer work tables than the DWR, since in an EWR mostly scenario and session information matters.

image

Also, from an EWR, access to Operator, Topology, and Security Manager is allowed. As the name Execution Work Repository suggests, only scenarios can be imported into this environment, so Designer is not available for an EWR.

Step 3

  –  Logging into the EWR through Operator.

image

image

Now we will move our scenarios from the DWR to the EWR.

Step 4

–  Transfer of Scenarios from DWR to EWR

Log into the Operator of the DWR.

Right-click on the scenario to be exported and select the destination folder.

image

image

image

Always use INSERT_UPDATE mode so that in the future, when you re-import, ODI can automatically identify the scenario and perform the necessary inserts and updates, with less risk of code corruption.

image

image

Using Solutions

Solutions are great when the client or architecture demands tracking each piece of code as it flows from one environment to another, as a solution creates a version on insertion and on every transfer, so at the end there can be multiple versions of the same objects. As I have said before, it is the requirement that matters; if solutions are to be used, use the Description to provide as much detail as possible.

For example:

Description:

1. Fixed the Invalid_number issue.

2. Logic for data validation changed, and so on.

Doing so provides more help and information for the administrator or user when moving older and newer code into the other environment as demand requires.

Go to the DWR Operator -> Insert Solution.

image

Provide the solution name and drag the scenario into Principal Elements. A solution creates a version for every insert you do. Solutions are great when you would like to track each scenario and roll back to previous versions.

image

image

Click Yes on the above warning and keep adding as many scenarios as needed to the solution.

image

image

Since solutions are stored in the master repository [SNP_VERSION], you can find them from the other work repository.

image

We are going to extract the scenarios from the solution created above. To do so, right-click -> Restore All Elements and click Yes for the warnings below.

image

image

Once restored , all the scenarios are imported into the other environment.

image

Step 5

– Create a new context 'XMT_TEST'.

[ NOTE: Make sure you create a new work schema for testing, as it will make testing and debugging the $ tables easier. ]

image

step 6

– Creating data servers and physical schemas. [ Being a testing environment, I am using the same development source box, so only the target structure is replicated for the new environment; thus I create the new data server and physical schema and link them to the respective logical schema. ]

image

Create the physical schema and link it to the already-created logical schema using the new context.

image

image

Keep repeating the same steps for other physical schemas.

Step 7

– Create Agents for the new Testing environment.

image

Step 8

– Let's test the scenario using the newly created context.

image

image

image

March 28, 2010
by kdevendr
0 comments

Categories: ODI , Reports , Tips and Tricks



Scenario Report

The query below provides details about each scenario and its associated session runs in the Operator. It shows, for each run, whether it was successful, the records processed, inserted, and updated, the duration in hours, and other statistics about each scenario.

SELECT
  SS.SCEN_NAME AS SCENARIO_NAME,
  SS.SCEN_VERSION AS SCEN_VERSION,
  SSR.SCEN_RUN_NO AS SESS_NO,
  SSR.SESS_DUR AS DUR,
  CASE
    WHEN SSR.SESS_DUR < 61 THEN SSR.SESS_DUR||' SEC'
    WHEN SSR.SESS_DUR BETWEEN 61 AND 3600 THEN (SSR.SESS_DUR/60)||' MIN'
    WHEN SSR.SESS_DUR > 3600 THEN (TRUNC(SSR.SESS_DUR/3600)||' HR '||TRUNC(MOD(SSR.SESS_DUR,3600)/60)||' MIN')
  END AS DURATION,
  -- The above CASE statement shows the duration in hours, minutes, and seconds
  -- rather than calculating manually when the session runs for multiple hours
  DECODE(SSR.SESS_STATUS, 'D', 'SUCCESSFUL',
                          'E', 'FAILED') AS STATUS,
  SSR.NB_ROW AS ROW_PROCESSED,
  SSR.NB_INS AS ROWS_INSERTED,
  SSR.NB_UPD AS ROWS_UPDATED,
  SSR.NB_DEL AS ROWS_DELETED,
  SSR.NB_ERR AS ERROR_RECORD,
  CASE
    WHEN TO_CHAR(SSR.SESS_END,'MM-DD-YYYY') = TO_CHAR(SSR.SESS_BEG,'MM-DD-YYYY')
    THEN TO_CHAR(SSR.SESS_END,'MM-DD-YYYY')
    ELSE TO_CHAR(SSR.SESS_BEG,'MM-DD-YYYY')||'-TO-'||TO_CHAR(SSR.SESS_END,'MM-DD-YYYY')
  END AS SESSION_RAN_DATE
FROM SNP_SCEN SS
LEFT OUTER JOIN SNP_SCEN_REPORT SSR ON SS.SCEN_NO = SSR.SCEN_NO
WHERE SSR.SCEN_RUN_NO IS NOT NULL
ORDER BY SCENARIO_NAME, SSR.SESS_END

image

March 27, 2010
by kdevendr
4 Comments

Categories: ODI , Tips and Tricks



Asynchronous -Parallel Execution

In ODI, interfaces can be run in parallel using OdiStartScen.

Step 1 – Create scenarios for the interfaces.

Step 2 – In the package, drag in OdiStartScen and provide the details shown below.

[ Note: Version = -1 calls the latest scenario, so keep it at -1 ]

Under Synchronous / Asynchronous, select Asynchronous.

Session Name – will be visible in the Operator.

image

Keep doing so for all the interfaces to be executed in parallel, providing the details as defined above.

image

Step 3 – Finally, call OdiWaitForChildSession.

image

OdiWaitForChildSession waits until all the parallel child sessions have finished, whether successfully or in failure. It has to be called after such a parallel execution, before any step that depends on the asynchronous scenarios having completed.

March 23, 2010
by kdevendr
2 Comments

Categories: How to , ODI , Tips and Tricks



Dropping $ Tables

There are many development environments where the $ tables created in the ODI work schema are not removed at proper intervals. As development grows, the number of $ tables grows too, especially in environments where dynamic $ tables are created for every interface run.

This particular procedure drops all the $ tables. Please try it first in your development environment and modify the code as needed for your environment.

image

In the above step, we are dropping all the $ tables from the work schema. My work schema is ODI_TEMP; select any schema that has the necessary permissions to do the drop. Provide the work schema in the TEMP_SCHEMA option.
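The heart of such a procedure is a loop like the one below. This is a simplified hand-written sketch, not the downloadable procedure itself; test it in development first:

```sql
-- Drop every C$_, I$_ and E$_ work table in the work schema (ODI_TEMP here).
BEGIN
  FOR t IN (SELECT table_name
            FROM   all_tables
            WHERE  owner = 'ODI_TEMP'
            AND   (table_name LIKE 'C$_%' OR
                   table_name LIKE 'I$_%' OR
                   table_name LIKE 'E$_%'))
  LOOP
    EXECUTE IMMEDIATE 'DROP TABLE ODI_TEMP.' || t.table_name || ' PURGE';
  END LOOP;
END;
/
```

Be careful to run it only against the work schema: anywhere else, the LIKE patterns could match tables you want to keep.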

image

Download the procedure

http://cid-d7195c8badc40d0b.skydrive.live.com/embedicon.aspx/odiexperts.com/TRT_ODI_DROP_TEMP_TABLES.xml

March 23, 2010
by kdevendr
1 Comment

Categories: How to , Knowledge Modules , Logic , ODI , Tips and Tricks



KM FOR UNION BASED ON IKM SQL CNTRL APPEND

This KM performs a UNION in ODI; it was created based on IKM SQL Control Append, supports flow control too, and is based on the full outer join concept.

clip_image002

Map the columns in this format:

NVL(SOURCE1.COLUMN1, SOURCE2.COLUMN1)
NVL(SOURCE1.COLUMN2, SOURCE2.COLUMN2)
and so on..

Then do a simple join between the two source tables, select this KM, and execute the interface.
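To see why the full outer join plus NVL behaves like a UNION, consider this hand-written equivalent (SOURCE1, SOURCE2, and the column names are illustrative only):

```sql
-- Rows present only in SOURCE1 get NULLs on the SOURCE2 side and vice versa,
-- so NVL picks whichever side exists -- the effect of a UNION ALL.
-- Adding DISTINCT (the KM's DISTINCT_ROWS option) turns it into a UNION.
SELECT NVL(S1.EMPLOYEE_ID, S2.EMPLOYEE_ID) AS EMPLOYEE_ID,
       NVL(S1.FIRST_NAME,  S2.FIRST_NAME)  AS FIRST_NAME
FROM   SOURCE1 S1
FULL OUTER JOIN SOURCE2 S2
  ON   S1.EMPLOYEE_ID = S2.EMPLOYEE_ID;
```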

clip_image003

The codes are shown below.

INSERT FLOW INTO I$ TABLE

insert into <%=snpRef.getTable("L","INT_NAME","A")%>
(
<%=snpRef.getColList("", "[COL_NAME]", ",\n\t", "", "((INS and !TRG) and REW)")%>
)
select <%=snpRef.getPop("DISTINCT_ROWS")%>
<%=snpRef.getColList("", "[EXPRESSION]", ",\n\t", "", "((INS and !TRG) and REW)")%>
from <%=odiRef.getSrcTablesList("", "[SCHEMA].[TABLE_NAME]", " FULL OUTER JOIN ", "")%>
on <%=odiRef.getJoinList("","([EXPRESSION])"," and ","")%>
where <% if (snpRef.getPop("HAS_JRN").equals("0")) { %>
(1=1)
<%} else {%>
JRN_FLAG <> 'D'
<% } %>
<%=snpRef.getFilter()%>
<%=snpRef.getJrnFilter()%>
<%=snpRef.getGrpBy()%>
<%=snpRef.getHaving()%>

INSERT INTO NEW ROWS

<%if ( snpRef.getUserExit("FLOW_CONTROL").equals("1") ) { %>
insert into <%=snpRef.getTable("L","TARG_NAME","A")%>
(
<%=snpRef.getColList("", "[COL_NAME]", ",\n\t", "", "((INS and !TRG) and REW)")%>
<%=snpRef.getColList(",", "[COL_NAME]", ",\n\t", "", "((INS and TRG) and REW)")%>
)
select <%=snpRef.getColList("", "[COL_NAME]", ",\n\t", "", "((INS and !TRG) and REW)")%>
<%=snpRef.getColList(",", "[EXPRESSION]", ",\n\t", "", "((INS and TRG) and REW)")%>
from <%=snpRef.getTable("L","INT_NAME","A")%>
<% }
else { %>
insert into <%=snpRef.getTable("L","TARG_NAME","A")%>
(
<%=snpRef.getColList("", "[COL_NAME]", ",\n\t", "", "(INS and REW)")%>
)
select <%=snpRef.getPop("DISTINCT_ROWS")%>
<%=snpRef.getColList("", "[EXPRESSION]", ",\n\t", "", "(INS and REW)")%>
from <%=odiRef.getSrcTablesList("", "[SCHEMA].[TABLE_NAME]", " FULL OUTER JOIN ", "")%> on <%=odiRef.getJoinList("","([EXPRESSION])"," and ","")%>
where <% if (snpRef.getPop("HAS_JRN").equals("0")) { %>
(1=1)
<%} else {%>
JRN_FLAG <> 'D'
<% } %>
<%=snpRef.getFilter()%>
<%=snpRef.getJrnFilter()%>
<%=snpRef.getGrpBy()%>
<%=snpRef.getHaving()%>
<% } %>

Hopefully it does the job until a newer ODI release ships with built-in UNION and MINUS logic.

Download the KM below.

Download XML

March 20, 2010
by kdevendr
2 Comments

Categories: Common Errors , How to , Logic , ODI , ODI Bug , Tips and Tricks



Installing ODI in Windows 7

There seem to be some forum threads about installing ODI on Windows 7. I would like to throw some light on the tweaks needed to make it work.

ODI doesn't support Windows 7, since the installer checks the Windows version before installing. To suppress the check, run the setup from a command prompt: setup -ignoresysprereqs. This lets you pass the initial Windows version check and continue to the end.

On Windows 7 Home Premium, my ODI installation was not successful, and even when it installed, ODI didn't start; so I set it up manually by copying the oracledi folder and creating the environment variable ODI_JAVA_HOME pointing to the latest JDK path.

Windows 7 Professional, Enterprise, and Server 2008 R2 installations have been successful with the -ignoresysprereqs option.

After installing ODI, install JDK 1.6 or later, since 1.6 supports transparency (otherwise ODI will run in the Basic theme), and create or change the environment variable ODI_JAVA_HOME to point to the latest JDK path.

A small, simple idea to access ODI faster from the taskbar

The easy way of calling Designer or Topology is to create a shortcut or batch file. Batch files are to Windows what shell scripts are to Unix.

For example, create a batch file – designer.bat – and copy-paste:

CD C:\OraHome_1\oracledi\bin    (your ODI bin path)
DESIGNER.BAT

Similarly, do so for Operator, Topology, and the agent too:

CD C:\OraHome_1\oracledi\bin
agentscheduler.bat -name=XMT_AGENT

Or create shortcuts from the oracledi/bin folder into a separate folder on the desktop, and call it Batch_files or whatever meaningful name you wish.

image

Right-click on the taskbar -> Toolbars -> New toolbar, and select the batch file folder.

As shown in the above image, with one click from the taskbar I can access my ODI tools easily. This is just my way of accessing ODI on Windows 7; if you have something more creative, please share it with odiexperts followers.

March 20, 2010
by kdevendr
2 Comments

Categories: Common Errors , How to , ODI , ODI Bug , Tips and Tricks



Querying in ODI

There is an easy way to write queries inside ODI.

Right-click on any datastore of the technology on which you wish to run the query, and select Data.

image

image

Click on the SQL button on the top left, type your query in the query box, and get results in the result box, as you can see in the above screen.

image

Have fun, and check odiexperts for more tips and tricks.

March 14, 2010
by Cezar Santos
54 Comments

Categories: How to , Logic , ODI , Tips and Tricks



Changed Data Capture ( CDC )

Changed Data Capture, as the term implies, is used to capture the data that is inserted, updated, or deleted on the source side and replicate the same changes to the target. ODI has Journalizing KMs to do the required implementation, in basically two types: simple and consistent.

Simple is used for CDC implementation on a single table; Consistent is used across multiple tables or a whole model.

In this post I will cover the Simple CDC concept. Look for a future post on Consistent CDC implementation.

Simple CDC – Oracle Table

clip_image001

I am going to implement the Simple CDC concept on this particular table – SRC_AGE_GROUP.

clip_image001[1]

Step 1 – Adding the Table to CDC

clip_image002

clip_image003

Step 2 – Starting the journalizing

clip_image004

Select the user as Subscriber

clip_image005

Execute it

clip_image006

The journalizing process is established.

clip_image007

In the above execution, ODI creates a subscriber table in the work schema, a J$ table, a view, and a trigger to capture changes whenever an insert, update, or deletion takes place.
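Conceptually, the generated infrastructure behaves like the sketch below. This is heavily simplified — the real objects are generated by the JKM with more columns and per-subscriber logic, and the key column name (AGE_RANGE) is an assumption about this demo table:

```sql
-- Journal table: one row per change on the source table (simplified shape).
CREATE TABLE J$SRC_AGE_GROUP (
  JRN_SUBSCRIBER VARCHAR2(400),
  JRN_CONSUMED   VARCHAR2(1),
  JRN_FLAG       VARCHAR2(1),   -- 'I' for insert/update, 'D' for delete
  JRN_DATE       DATE,
  AGE_RANGE      VARCHAR2(50)   -- assumed key column of SRC_AGE_GROUP
);

-- Trigger capturing every change on the source table (simplified).
CREATE OR REPLACE TRIGGER T$SRC_AGE_GROUP
AFTER INSERT OR UPDATE OR DELETE ON SRC_AGE_GROUP
FOR EACH ROW
BEGIN
  INSERT INTO J$SRC_AGE_GROUP
    (JRN_SUBSCRIBER, JRN_CONSUMED, JRN_FLAG, JRN_DATE, AGE_RANGE)
  VALUES
    ('SUNOPSIS', '0',
     CASE WHEN DELETING THEN 'D' ELSE 'I' END,
     SYSDATE,
     CASE WHEN DELETING THEN :OLD.AGE_RANGE ELSE :NEW.AGE_RANGE END);
END;
/
```

The JV$ view the interface reads simply joins this journal table back to the source rows, filtered by subscriber.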

After successful execution you will be able to see a small green clock icon.

clip_image008

Let’s do a simple mapping and test for CDC.

clip_image010

Drag the journalized table as source and the required target. On the source datastore, check the option "JOURNALIZED DATA ONLY" and ODI will automatically add a filter with the required date condition and subscriber information. Use the proper LKM and IKM for your technology; for this example I have used IKM Oracle Incremental Update.

I ran it once to check whether it works fine. Now let me show you an example.

Original Source

clip_image012

Modified source

clip_image014

I have removed the last row and changed the third row's AGE_MAX from 39 to 35.

There are two ways of seeing Journalized Data.

clip_image015

On the journalized interface source, right-click on the filter and click Data.

clip_image016

Here is the SQL from which the journalized data is retrieved:

select * from ODI_TEMP.JV$DSRC_AGE_GROUP SRC_AGE_GROUP where (JRN_SUBSCRIBER = 'SUNOPSIS' /* AND JRN_DATE < sysdate */)

The other way is to right-click on the journalized datastore in the model and choose Changed Data Capture -> Journal Data; you will see the same data there too.

clip_image017

Let’s capture the Journalized data to my Target table.

Target Table Before running the Interface

clip_image018

As you can see, one update and one delete.

clip_image020

Target Table after running the Interface

clip_image021

MSSQL – SIMPLE

Repeat the above steps for journalizing a SQL Server table, just as we did for Oracle, namely:

Add to CDC

Start Journal

If everything is fine, journalizing will start.

clip_image022

clip_image024

Source Data before Running the Journalized Interface

clip_image026

delete from dimtime_backup where TimeKey=6;
update dimtime_backup set DayNumberOfWeek=4 where TimeKey=4;

clip_image028

Journalized Data

clip_image030

Although there were an update and a delete on the source, there is just a delete in ODI. The reason is that for the update I set DayNumberOfWeek=4, the same value it already had, so ODI correctly performed only the delete, as the target already had DayNumberOfWeek=4.

clip_image032

How do you tell which JKM you have? Right-click on your model, Edit -> Journalizing tab, and you will see which JKM is being used and whether it is Simple or Consistent.

clip_image033

clip_image034

March 14, 2010
by kdevendr
40 Comments

Categories: Architecture , How to , ODI



Agent

This post describes how to create single and multiple agents, how to start a scheduler agent, and other related information.

Step 1 – Start the Agent.bat or Agent.sh under oracledi/bin

image

Step 2 – Topology Manager – >   Physical Architecture – > Agents  -> Insert Agent

image

Provide the required information and click Test; you should get a popup saying Agent Test Successful. The default port is 20910.

image

If you get the above error message, it means you have not started agent.bat or agent.sh; please do so and test again.

[ Note: In case you wish to start an agent on another port, say 20911, go to oracledi/bin, type agent -port=20911, and click Test for a successful connection. By default, agent.bat or agent.sh communicates on port 20910 only. ]

Step 3 – Topology Manager – >   Logical  Architecture – > Agents  -> Insert Agent

image

Link the Physical and Logical Agent with the required Context.

Step 4 – Editing odiparams.bat or odiparams.sh

rem
rem Repository Connection Information
rem
set ODI_SECU_DRIVER=oracle.jdbc.driver.OracleDriver
set ODI_SECU_URL=jdbc:oracle:thin:@WIN-Q5COCAGCSMF:1521:ORCL
set ODI_SECU_USER=ODI_MASTER
set ODI_SECU_ENCODED_PASS=bPyXDShZRGkg822hGYJ8Dy
set ODI_SECU_WORK_REP=ORCL_WORK_REP
set ODI_USER=SUPERVISOR
set ODI_ENCODED_PASS=LELKIELGLJMDLKMGHEHJDBGBGFDGGH

  • ODI_SECU_DRIVER – the driver name for the master repository technology (for my example, Oracle)
  • ODI_SECU_URL – the URL of the connection string
  • ODI_SECU_USER – master repository user
  • ODI_SECU_ENCODED_PASS – master repository user password. To encode any password, under oracledi/bin in a command prompt or terminal type agent encode <password> for Windows, or sh agent.sh encode <password> for Unix
  • ODI_SECU_WORK_REP – work repository name

Don't change ODI_USER & ODI_ENCODED_PASS from the defaults unless you want to run under another account with SUPERVISOR privileges or you have changed the SUPERVISOR password.

Step 5 – Running an Agent

Running ODI Agent as scheduler agent

Run the agent in agentscheduler mode  from oracledi/bin

agentscheduler -name=<agent_created> -port=<port_no> (for Windows)

sh agentscheduler.sh -name=<agent_created> -port=<port_no> (for Unix)

To run it in the background on Unix, run:

sh agentscheduler.sh -name=<agent_name> -port=<port_no> &

Running ODI Agent as Windows Service

agentservice -i -s <agent_name> as Scheduler Agent

agentservice -i -a <agent_name> as Listener Agent

For help on agentservice, type -> agentservice -help

clip_image001[4]

To start the service, go to Services (services.msc), right-click on it, and choose Start.

clip_image002[4]

(Or) NET START SnpsAgent<agent_name> (from the Windows command prompt)

For my example – > SnpsAgentXMT_AGENT

clip_image003[4]

Running Multiple Agents.

Let's say I am creating another agent, XMT_AGENT_2, on port 20911.

image

Run the agent.bat or agent.sh with the port number you are planning to run on.

image

image

Click Test for successful connection.

To run agent 1 and agent 2 simultaneously, we need to make a duplicate of agentscheduler. Rename the copy to a meaningful name; for my example I have renamed it agentscheduler_2.bat.

Let's run both agents simultaneously.

image

For creating agents on two different servers, make a copy of odiparams and agentscheduler; let's say I rename them odiparams_2.bat and agentscheduler_2.bat. Then, in agentscheduler_2.bat, change

call "%ODI_HOME%\bin\odiparams.bat" to

call "%ODI_HOME%\bin\odiparams_2.bat" .

Load Balancing

If you have two or three agents and you want to do load balancing,

under Physical Agents -> Load Balancing, check the required agents.

image

For scheduling using the ODI scheduler, look at this earlier post: Scheduling using ODI Scheduler.