
The reason for building my "IBM i" Change Data Capture solution for Informatica® PowerCenter is to capture the complete stream of "IBM i" database changes and process it with the full power of Informatica® PowerCenter.

Let me present what can be done with the "IBM i" Change Data Capture for Informatica® PowerCenter solution.


  1. IBM i contains your crucial database with a great number of large tables. You would like to have a copy of those tables in a database built by a different vendor, let's say Microsoft®, and you want that copy to be always up to date. But copying those tables takes several hours. We can set up a change data capture solution for hundreds of tables in a few hours.
  2. You would like to gather data for your AI solution, but a Slowly Changing Dimension is not appropriate: you would like to analyze changes not across days, months, or years, but across seconds, or even between subsequent changes applied to your database. Building a conventional warehouse, one must always lose some information, and one must decide what to drop before the warehouse is set up. Our solution lets you keep all information for as long as you need it. Query it as you would in the source database. Compare any states of the source database. After analysis, you may decide that some states are not important; then define a query that identifies the important states, and the tool removes the rest.
  3. You would like to start some processes when an important business event occurs. Just define a logical condition that specifies when your event has occurred. Since you can keep current and past database states, your event condition can be built using every database state that has ever existed.
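To illustrate the third point, here is a minimal sketch (not part of the product, and not how the tool is actually configured) of an event condition evaluated over a captured history of row states rather than just the current state. The table, column, and status values are hypothetical:

```python
def event_occurred(states):
    """Return True once an order's status has moved from 'HOLD' to 'SHIPPED'.

    `states` is the captured change history for one row: a list of dicts,
    oldest first, as a CDC tool might deliver them.
    """
    seen_hold = False
    for state in states:
        if state["STATUS"] == "HOLD":
            seen_hold = True
        elif seen_hold and state["STATUS"] == "SHIPPED":
            return True
    return False


history = [
    {"ORDER_ID": 1, "STATUS": "NEW"},
    {"ORDER_ID": 1, "STATUS": "HOLD"},
    {"ORDER_ID": 1, "STATUS": "SHIPPED"},
]
print(event_occurred(history))  # True
```

The point is that the condition references a past state ("was ever on HOLD") that a plain snapshot of the current row could never answer; only a retained change history makes it possible.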


CDC Features


As you see, CDC alone is not enough. We need additional functionality to keep the source and targets in consistent states.

  1. MONITCDC, to be informed that something is wrong (e.g. the file system is full).
  2. wf_CDC_DATETIME_CHK, to compare source and target tables at column level and make sure all data has been replicated.
  3. wf_CDC_FULL_REFRESH, to perform a FULL REFRESH for new tables.
  4. wf_CDC_Synchr_Diff, for cases where we need to load only the differences between source and target tables.
  5. wf_i5OS_CDC, the always-running change data capture.

All five workflows are oriented toward multi-table operations. Thus, adding tables or changing table structures does not require any changes to the Workflows or Mappings.
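The idea behind the column-level comparison (point 2 above) can be sketched as follows. This is only an illustration of the concept, assuming in-memory rows keyed by primary key; the real wf_CDC_DATETIME_CHK workflow runs inside PowerCenter and works differently:

```python
import hashlib


def row_digest(row, columns):
    """Stable digest of the selected columns of one row (a dict)."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def diff_tables(source, target, columns):
    """Return keys whose column values differ between source and target.

    `source` and `target` map a primary key to a row dict. A key missing
    from the target also counts as a mismatch.
    """
    mismatched = []
    for key, src_row in source.items():
        tgt_row = target.get(key)
        if tgt_row is None or row_digest(src_row, columns) != row_digest(tgt_row, columns):
            mismatched.append(key)
    return sorted(mismatched)


src = {1: {"A": "x", "B": 1}, 2: {"A": "y", "B": 2}}
tgt = {1: {"A": "x", "B": 1}, 2: {"A": "y", "B": 99}}
print(diff_tables(src, tgt, columns=["A", "B"]))  # [2]
```

Comparing digests rather than full rows keeps the check cheap for wide tables; the same list of mismatched keys is what a difference-loading step (point 4) would then consume.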

My LinkedIn profile is open, so feel free to send me an email if you have any questions. Cezary Opacki LinkedIn profile


The Change Data Capture workflow is described here

Published in i5/OS Integration

At a well-known financial institution, a workflow with 47 concurrently running sessions was developed. Each session populated one table by copying it from source to target. The purpose of the workflow was to perform a full target refresh, which is usually required before incremental replication. As a full refresh is a very rare operation, the workflow had not been run for months. One day, the workflow was about to be used. As the workflow had not been modified, no one expected any errors.

The workflow ended with the following error:


[Informatica][SCLI PWX Driver] PWX-33312 xxxxxxxxx:42915 : Partially completed network transmission timed out after 1322 seconds (bytes completed: 72484 mode: R)

Database driver error...

Function Name : Fetch

SQL Stmt : …..

Published in PowerExchange rules

In case of error 67 on Windows while connecting to IFS on i5/OS, check whether the following Windows local policy is set properly:

Network security: Minimum session security for NTLM SSP based (including secure RPC) clients

Suggested value: "No minimum".


Changes to the value take effect after a system restart.



Published in NetServer

Authentication to i5/OS NetServer can be set to "Encrypted Password" or "Encrypted Password/Network Security". With "Encrypted Password", logging on to NetServer is simple to configure but does not use LDAP to authenticate the user. "Network Security" allows Kerberos to be used for logon. When configured for "Encrypted Password/Network Security", NetServer first tries to find LDAP; if LDAP is not found, the "Encrypted Password" method is used. When LDAP is not available, extremely long logon times can be experienced. This should be kept in mind, especially when moving a machine to a new environment through backup/recovery of the operating system.

Published in NetServer