Wednesday, March 26, 2008

ANY WAY TO REFERENCE A SPECIFIC TARGET IN A WORKFLOW LINK ?

QUESTION:

I have a requirement where, if a specific target has rows loaded to
it, we need to send an email notification.
I have used link task properties in the past to trigger workflow
failures, email notifications, etc., but never for a specific target.
Informatica has a predefined variable that can be used in a
comparison to identify whether ANY rows were written. It looks like
this:
$s_session_name.TgtSuccessRows > 1
What I need is something similar to this statement but I need a way
to check a specific table. I immediately thought about splitting
this mapping apart so that I could isolate the rows written to this
table but that's not possible in this particular case without some
major changes.
If you have any experience identifying whether records were written
to a specific table in a workflow, I would like to hear how you did
this.
I needed something similar some time back, but I couldn't find a
perfect solution.
I noticed that the OPB_SWIDGINST_LOG table does store the number of
rows affected/applied/rejected for each target; however, this
information gets committed only after the session is over.
I thought of creating a session after the session in question which
would read this information and be able to send an email after that.
I ended up creating a dummy text file (apart from loading the data
into the Oracle table in question), and in a Command task I checked
whether the text file had any rows and took action accordingly.
Also, I didn't write all the rows, since I just wanted to know whether
anything got written to the table or not, so I put a filter before the
text file target to let only one row through.
Not a neat or good idea, but it kind of worked.
I would like to avoid writing data via a separate mapping, as this
requires additional code that must be maintained (I'm really a
stickler on this issue).
I simply need a notification if any records to this particular
target are created (the target is an error table). This will prompt
a notification to be sent to the data steward that errors were
identified and require attention.
I would be very interested in knowing how you reference the
rep_sess_tbl_log from the workflow as you mentioned.
If I cannot find a way to identify when records are written to this
target, then I will likely create a separate mapping that searches
for any errors since the last time the workflow ran (this can get
tricky once we start thinking about fault tolerance). Another
option is to create a separate workflow that does the same thing
after all scheduled interfaces have run, but the caveat to this
approach is that the data steward is not notified at the time the
rows are created.

ANSWERS:

You do not have to create a separate mapping; you can introduce a text
file target in the existing one (yes, it is still additional code).
There are multiple ways you can approach this....
As far as I know, there currently isn't an easy way (at the workflow level) to ping a table and ascertain its record count.
Unfortunately, you'll need to create a new mapping here. The silver lining is that you can make the mapping reusable and point it at whichever error table you want. Before you start, you'll need read access to the Informatica repository views.
If your underlying error table is going to be in Oracle, you can
achieve the email notification with an INSERT trigger on the
error table:
1. Write a small stored procedure that sends an email to whomever
you want, with parameters if you need to customize the message.
2. Call this stored procedure from the INSERT trigger.
Whenever a new record is inserted into this table, an email will
be sent. This way you do not need to change anything in your
Informatica workflow or mapping.
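The two steps above can be sketched roughly as follows. This is only a sketch, not a tested implementation: the table name ERR_TABLE, the procedure name, and the email addresses are placeholders, and UTL_MAIL must be installed and the database's SMTP_OUT_SERVER parameter configured by your DBA before it will send anything.

```sql
-- Step 1: a small procedure that sends the notification email.
-- Addresses and subject are placeholders; adjust to taste.
CREATE OR REPLACE PROCEDURE notify_data_steward (p_detail IN VARCHAR2) AS
BEGIN
  UTL_MAIL.SEND(
    sender     => 'etl@example.com',
    recipients => 'steward@example.com',
    subject    => 'Rows written to error table',
    message    => p_detail);
END;
/

-- Step 2: a statement-level AFTER INSERT trigger, so one load
-- produces one email rather than one email per inserted row.
CREATE OR REPLACE TRIGGER err_table_notify
  AFTER INSERT ON err_table
BEGIN
  notify_data_steward('New error rows were loaded into ERR_TABLE.');
END;
/
```

One caveat with this approach: the trigger fires before the loading transaction commits, and UTL_MAIL sends immediately, so a load that is later rolled back would still have generated an email.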
Alternatively,
you could take a copy of the existing mapping and put an "abort clause" before the target. Link this session to a separate Email task and you've got what you want.
OR
Reference rep_sess_tbl_log. However, please keep in mind that this log is populated only after your session has completed successfully.
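As a sketch of what that lookup might look like: a downstream session (or a Command task) could query the repository view for the row count written to the error table by the most recent run, and branch to the Email task if it is nonzero. Column names here (SESSION_INSTANCE_NAME, TABLE_NAME, SUCCESSFUL_ROWS, END_TIME) are an assumption and can vary by PowerCenter version, so verify them against your repository's REP_SESS_TBL_LOG definition first.

```sql
-- Rows written to ERR_TABLE by the latest run of s_session_name.
-- 's_session_name' and 'ERR_TABLE' are placeholders for your
-- session instance and target table names.
SELECT successful_rows
FROM   rep_sess_tbl_log
WHERE  session_instance_name = 's_session_name'
AND    table_name            = 'ERR_TABLE'
AND    end_time = (SELECT MAX(end_time)
                   FROM   rep_sess_tbl_log
                   WHERE  session_instance_name = 's_session_name'
                   AND    table_name            = 'ERR_TABLE');
```

Because the view is populated only after the session finishes, this check has to run in a task that executes after the session in question, never in parallel with it.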
