The blog for Oracle Data Integrator (ODI)

Flow Control and Static Control

Flow Control

– If enabled, this option uses the selected CKM and applies it before loading into the target, preventing bad data from being loaded.


What actually happens in the above flow is that, after loading the data into the I$ table,

  • a check table (SNP_CHECK_TAB) is created, and the previous error table and previous errors are deleted, as ODI generally does.
  • A new error table is then created, and ODI checks the primary key and unique constraints, any other constraints and conditions defined at the database or model level (ODI conditions), and performs a Not Null check on each column marked as Not Null.
  • Records that violate these constraints and conditions are added to the E$ table, and an entry is made in SNP_CHECK_TAB with information about the schema, error message, count, etc.
  • Finally, the remaining records are inserted or updated as per the KM and its logic.
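The steps above can be sketched outside ODI with a small simulation. This is a minimal, hypothetical illustration (table and column names are invented, and SQLite stands in for the staging database), not the SQL the CKM actually generates:

```python
import sqlite3

# Hypothetical simulation of flow control: I$ (staging), E$ (errors)
# and the target table. All names and data are invented for illustration.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE i_emp (employee_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE e_emp (employee_id INTEGER, name TEXT, err_mess TEXT)")
cur.execute("CREATE TABLE emp (employee_id INTEGER PRIMARY KEY, name TEXT)")

# Staged rows: a duplicate key (1) and a NULL key, plus one good row.
cur.executemany("INSERT INTO i_emp VALUES (?, ?)",
                [(1, "Ann"), (1, "Bob"), (None, "Eve"), (2, "Joe")])

# Not Null check: rows with a NULL key go to E$.
cur.execute("""INSERT INTO e_emp
               SELECT employee_id, name, 'EMPLOYEE_ID is null'
               FROM i_emp WHERE employee_id IS NULL""")

# Primary key check: rows whose key occurs more than once go to E$.
cur.execute("""INSERT INTO e_emp
               SELECT employee_id, name, 'PK_EMP violation'
               FROM i_emp
               WHERE employee_id IN (SELECT employee_id FROM i_emp
                                     WHERE employee_id IS NOT NULL
                                     GROUP BY employee_id
                                     HAVING COUNT(*) > 1)""")

# Only the rows that were not rejected reach the target.
cur.execute("""INSERT INTO emp
               SELECT employee_id, name FROM i_emp
               WHERE employee_id IS NOT NULL
                 AND employee_id NOT IN (SELECT employee_id FROM e_emp
                                         WHERE employee_id IS NOT NULL)""")
```

Only the one clean row reaches the target; the other three staged rows end up in E$ with an error message each.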

E$ Table


In the E$ table, the complete set of columns is recorded along with the error message, error count, constraint name, and the associated ROW_ID.
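As a rough sketch, an E$ row pairs the rejected record's columns with a set of control columns. The control-column names below approximate what the CKM generates; treat the exact layout and values as illustrative only:

```python
import sqlite3

# Hypothetical layout of an E$ table: a few control columns followed by
# the full set of target columns. Names and data are illustrative.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE e_emp (
                   err_type    TEXT,     -- 'F' = flow control, 'S' = static control
                   err_mess    TEXT,     -- message of the violated constraint
                   cons_name   TEXT,     -- constraint name
                   cons_type   TEXT,     -- PK, AK, FK, CK, NN ...
                   row_id      TEXT,     -- ROWID of the offending record
                   employee_id INTEGER,  -- target columns follow
                   name        TEXT)""")
cur.execute("""INSERT INTO e_emp
               VALUES ('F', 'PK_EMP violation', 'PK_EMP', 'PK',
                       'AAAx01', 1, 'Ann')""")

row = cur.execute("SELECT cons_name, err_mess, employee_id FROM e_emp").fetchone()
```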



Static Control

– If enabled, this option uses the selected CKM and applies it after loading into the target.


Here, because the PK constraint is on the target table, the insert has failed.

Let's see an example where the target table has no constraint at the database level, but we apply one at the ODI level.


The key is applied on Employee_ID at the ODI level.


Here, after inserting into the target table, ODI uses the selected CKM to check all the constraints and conditions, and finally loads the rejected records into the E$ table and SNP_CHECK_TAB.
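The after-load check can be sketched the same way. In this hypothetical example the target has no database-level key, the Employee_ID key exists only at the ODI level, so duplicates land in the target and are only flagged afterwards (names and data are invented):

```python
import sqlite3

# Static Control sketch: the target table has no database-level key,
# so the load succeeds even with duplicates; the check runs afterwards.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE emp (employee_id INTEGER, name TEXT)")  # no PK
cur.execute("CREATE TABLE e_emp (employee_id INTEGER, name TEXT, err_mess TEXT)")

# The load happens first, duplicates included.
cur.executemany("INSERT INTO emp VALUES (?, ?)",
                [(1, "Ann"), (1, "Bob"), (2, "Joe")])

# The CKM-style check then scans the target itself and copies the rows
# violating the ODI-level key into E$; the target rows stay in place.
cur.execute("""INSERT INTO e_emp
               SELECT employee_id, name, 'PK_EMP (ODI-level) violation'
               FROM emp
               WHERE employee_id IN (SELECT employee_id FROM emp
                                     GROUP BY employee_id
                                     HAVING COUNT(*) > 1)""")
```

Note that, unlike flow control, the offending rows remain in the target; they are only reported.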


Check at the Datastore Level.

To check for constraint violations or bad records on the source side, ODI has an option called Check at the Datastore level. When run, it executes the CKM on that particular datastore and stores the rejected records in the respective E$ table.



If errors are found, an E$ table is created and an entry is made in SNP_CHECK_TAB.

Constraints, conditions, and duplicates can be checked at the datastore level too.



Checking duplicates –


Foreign Key Constraints


Or run the check at the datastore level for a complete check of all the constraints using the CKM defined in the model.
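A datastore-level foreign key check boils down to an anti-join between child and parent. A minimal sketch with invented tables, assuming SQLite in place of the source database:

```python
import sqlite3

# Datastore-level foreign key check sketch: find child rows whose dept_id
# has no matching parent. Table and column names are invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE dept (dept_id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE emp (employee_id INTEGER, dept_id INTEGER)")
cur.execute("CREATE TABLE e_emp (employee_id INTEGER, dept_id INTEGER, err_mess TEXT)")

cur.executemany("INSERT INTO dept VALUES (?)", [(10,), (20,)])
cur.executemany("INSERT INTO emp VALUES (?, ?)", [(1, 10), (2, 30), (3, 20)])

# The CKM-style anti-join: any emp row whose dept_id is not in dept is rejected.
cur.execute("""INSERT INTO e_emp
               SELECT e.employee_id, e.dept_id, 'FK_EMP_DEPT violation'
               FROM emp e LEFT JOIN dept d ON e.dept_id = d.dept_id
               WHERE d.dept_id IS NULL""")
```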


Recycle Errors

ODI KMs have an option called Recycle Errors. When enabled, ODI reads from the E$ tables and loads into the I$ table all the records that are not already present in I$, matching on the primary key or unique keys.
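A rough simulation of that matching step, with invented tables and SQLite standing in for the work schema:

```python
import sqlite3

# Recycle Errors sketch: E$ rows whose key is not already present in I$
# are pulled back into I$ for another pass. Names and data are invented.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE i_emp (employee_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE e_emp (employee_id INTEGER, name TEXT, err_mess TEXT)")

cur.execute("INSERT INTO i_emp VALUES (2, 'Joe')")
cur.executemany("INSERT INTO e_emp VALUES (?, ?, ?)",
                [(1, "Ann", "PK_EMP violation"), (2, "Joe", "PK_EMP violation")])

# Key 1 is missing from I$, so it is recycled; key 2 is already there.
missing = cur.execute("""SELECT e.employee_id, e.name FROM e_emp e
                         WHERE e.employee_id NOT IN
                               (SELECT employee_id FROM i_emp)""").fetchall()
cur.executemany("INSERT INTO i_emp VALUES (?, ?)", missing)
```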

Let's look at an example. Here I have two duplicate rows stored in my E$ table; now I am going to remove one of the duplicate records.


I make the change in the source file (for my example), save it, and re-run the interface.



The new record is inserted, and the corrected records are also deleted from the error table.




  1. Hi,
    In my case I am working on an interface where the source is a file and the target is also a file, and I am using a complex file as the target.
    My problem is that when I try to pass a null value in a mandatory Not Null column, I would expect an error on that particular column, but when I deploy the package it gives me an error on another native column. My questions are:

    For a complex file, why does it not give me the proper error?
    Why does the error show on a native column instead of that particular column?

    • Hi Gourab,

      I have never seen such an error situation; I mean, the error should be raised on the right column. What is the ODI version? Is there any RDBMS technology as the staging area?

  2. Hi,

    I came across one situation when I was checking a failure case.

    I created one interface; the source is a file (CSV) and the target is Oracle. I'm inserting the data from the file into a table.

    When testing the failure case, I added 10 records to the file. Of those 10 records, I intentionally added one error record (a data type mismatch, i.e. the actual data type of the attribute is Date but I gave it a string value). So when I execute my mapping, ODI inserts 9 records into the table and discards the one error record into a BAD file.

    But my requirement is that either all records from the file should be inserted into the table, or no records should be inserted if there is any error record in the file.

    Please help me for solving this problem.

    Thanks & Regards,
    Shiva Kumar

  3. Hi,
    Could you please help with the below requirement:
    the requirement is to load all valid/good data into the target table and capture all data type/size mismatch errors, or any other errors, into a log file.


  4. Hi,

    I am working on an interface to load data from one Oracle database (web commerce portal) to another Oracle database (billing system),
    and the requirement is to load all valid/good data into the target table and capture all data type/size mismatch errors into a separate error table.

    We are using ODI. One way I figured out is to use the database feature “dml error logging”, but for this I would have to customize the KM, and I want to avoid future maintenance of KMs.

    I wanted to know if there is any way to achieve this by using Flow Control, without customizing the KM or writing ODI constraints on the target table.

    • Hi Raj,

      that is exactly what Flow Control was built for. If you wish, send me the type of control to be performed and I can help you with how to do it.


      Thank you for visiting us!

      • Hi Santos,

        Thanks for your response. I want to capture all data type/length mismatch errors, e.g. for a column “product_Code”, the length in the source table is VARCHAR(40) and in the target table it’s VARCHAR(30); in this case I want to capture all records that fail to load into the target table because product_Code has a larger value in the source table.

        Can I trap these errors using Flow Control without writing any ODI constraints?

        I am using “LKM SQL to Oracle”, and the C$ table (work table) is created based on the target table, so product_Code in the C$ table is VARCHAR(30). Can I alter the behaviour to create the C$ table based on the source structure? I want to load data from different source tables into the C$ tables, then load valid/good data into the target table and report all records that failed due to data type/length mismatch.

        Please let me know how can I achieve this.

        • Hi Raj,

          I have a similar requirement to load all format-error records (incorrect length, incorrect datatypes, etc.) into either E$ tables or a custom table. Right now, even if a single error record is in the data file, ODI 11g rejects the entire data file. And after correcting that record, if we re-load the file, it rejects the file again if any other error records are there. It is becoming very difficult to identify and correct error records and re-process the files for every single error record. So we need to send the list of all format-error records as a file to our client.

          Please let me know if you get solution for your requirement.


        • Could you please share the steps to solve this type of scenario.


  5. Hi KI,

    SELECT ErrorCode, ErrorFieldName, DataParcel
    FROM mediation_test.ET_C$_0Cdrs_CDMA_Temp

  6. Hello,
    I’ve got a question regarding the CKM Oracle. It does 4 steps:
    create check table
    delete previous check sum
    create error table
    delete previous errors

    Then, depending on whether a specific error type exists in the table, it does an ‘insert x errors’ step. If the error doesn’t exist, it doesn’t seem to even check for any errors.
    My question is: when does it check whether errors exist at all? Is it done implicitly?

  7. Hi

    I am using ODI to load data from one DB to another and am getting this error:
    ODI-1228: Task INT-DB-TO-DB (Integration) fails on the target ORACLE connection ODI_STAGE.
    Caused By: java.sql.SQLDataException: ORA-01843: not a valid month

    because the target table’s datatype is Date while the source has VARCHAR2.

    Now I want to insert the error message into the error table (E$_TRG_SALES_PERSON) for that record, and the rest of the records should be loaded into the target table. Currently the load stops because of the error on one record, and the error message is not inserted into the error table.
    I am using LKM SQL to SQL and IKM Oracle Incremental Update.

    Please reply if anybody has an idea on this.

    • you need to use TO_DATE to convert the date format, e.g. TO_DATE(COL_NM, ’YYYYMM’)

      • Hello devendr,

        You are right, I should use a conversion function here.
        But my requirement is that the source has incorrect data which cannot be converted; that’s why I asked how to route the error record into the ERROR table.
        I hope I am clear to you now.

