Saturday, February 28, 2015

Create, Deploy, Run a Workflow with Mapping Task and Monitor Workflow in Informatica




A workflow is a graphical representation of a set of events, tasks, and decisions that define a business process. The Data Integration Service uses the instructions configured in the workflow to run the objects.

Workflow Objects
A workflow object is an event, task, or gateway. You add objects as you develop a workflow in the editor. Workflow objects are non-reusable.

Event  Starts or ends the workflow. An event represents something that happens when the workflow runs. The editor displays events as circles.
Start  Represents the beginning of the workflow. A workflow must contain one Start event. 
End  Represents the end of the workflow. A workflow must contain one End event. 

Task  Runs a single unit of work in the workflow, such as running a mapping, sending an email, or running a shell command. A task represents something that is performed during the workflow. The editor displays tasks as squares.
Assignment  Assigns a value to a user-defined workflow variable. 
Command  Runs a single shell command or starts an external executable program. 
Mapping  Runs a mapping. 
Notification  Sends an email notification to specified recipients.

 

An Exclusive gateway splits and merges paths in the workflow based on how the Data Integration Service evaluates expressions in conditional sequence flows. An Exclusive gateway represents a decision made in the workflow. The editor displays Exclusive gateways as diamonds.

Workflow Deployment

When you develop a workflow in the Developer tool, you create a workflow definition. To run an instance of the workflow, you add the workflow definition to an application. When you deploy the workflow, the Data Integration Service creates a separate set of run-time metadata in the Model repository for the workflow. If you make changes to a workflow definition in the Developer tool after you deploy it, you must redeploy the application that contains the workflow definition for the changes to take effect.
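To verify a deployment from the command line, you can ask the Data Integration Service for its list of deployed applications. The command below is a sketch that reuses the example domain, service, and credential names from the startWorkflow command later in this post; confirm the exact options in the Command Reference for your version.

infacmd dis listApplications -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword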

Steps to create a workflow

Create a workflow by clicking File > New > Workflow.

Add objects to the workflow and configure the object properties. Workflow object is an event, task, or gateway.

Connect objects with sequence flows to specify the order that the Data Integration Service runs the objects.

Define variables for the workflow to capture run-time information.

Validate the workflow to identify errors.

Add the workflow to an application and deploy the application to the Data Integration Service.

Run an instance of the workflow from the deployed application using the infacmd wfs command line program.
infacmd wfs startWorkflow -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword -a MyApplication -wf MyWorkflow -pf MyParameterFile.xml
You can also run an instance of the workflow from the deployed application in the Administrator tool.

Monitor the workflow instance run in the Monitoring tool in Informatica Administrator.
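Besides the Monitoring tool, the infacmd wfs plugin can list and control workflow instances from the command line. The commands below are a sketch: listActiveWorkflowInstances, cancelWorkflow, and abortWorkflow are wfs commands, but treat the -iid (workflow instance ID) option name as an assumption and confirm it against the Command Reference for your version.

# Find the instance ID of a running workflow
infacmd wfs listActiveWorkflowInstances -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword

# Cancel the instance gracefully, or abort it immediately
infacmd wfs cancelWorkflow -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword -iid <instance_ID>
infacmd wfs abortWorkflow -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword -iid <instance_ID>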

Mapping Task

A Mapping task runs a mapping during a workflow. When you add a Mapping task to a workflow, you select a single mapping for the task to run. When you change the mapping, the Model Repository Service tracks the effects of the changes on all Mapping tasks that include the mapping.
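As an aside, a mapping deployed in an application can also be run on its own, without a workflow, using the infacmd ms plugin. This is a sketch with placeholder names (MyMapping is hypothetical); check the Command Reference for the exact options in your version.

infacmd ms runMapping -dn MyDomain -sn MyDataIntSvs -un MyUser -pd MyPassword -a MyApplication -m MyMapping -pf MyParameterFile.xml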

Mapping Task Input
Mapping task input is the data that passes into a Mapping task from workflow parameters and variables. On the Mapping task Input tab, you can assign the following information to workflow parameters or variables:
User-defined mapping parameters  Assign a user-defined mapping parameter to task input to set its value from a workflow parameter, a workflow variable, or a literal value (see the parameter file sketch after this list). The Input tab lists all parameters created for the mapping and for objects included in the mapping.
Mapping task configuration properties  Assign a Mapping task configuration property to task input to define the value of the property in a workflow parameter or variable. The Advanced tab lists the Mapping task configuration properties.
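User-defined mapping parameter values usually come from the parameter file passed to startWorkflow with -pf. The sketch below shows one way to create a minimal parameter file; the XML structure follows the documented parameter file format, but the application, workflow, and parameter names are placeholders, so validate the schema against the Developer Workflow Guide for your version.

# Write a minimal workflow parameter file (placeholder names throughout)
cat > MyParameterFile.xml <<'EOF'
<root xmlns="http://www.informatica.com/Parameterization/1.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <application name="MyApplication">
    <workflow name="MyWorkflow">
      <!-- Value picked up by a workflow parameter, which the Mapping task
           Input tab can assign to a user-defined mapping parameter -->
      <parameter name="Input_File_Dir">/data/input</parameter>
    </workflow>
  </application>
</root>
EOF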

Mapping Task Output
Mapping task output is the data that passes from a Mapping task into workflow variables. When you configure a Mapping task, you specify the task output values that you want to assign to workflow variables on the Output tab. The Data Integration Service copies the Mapping task output values to workflow variables when the Mapping task completes.
General Outputs include output data produced by all tasks, such as the task start time, end time, and whether the task ran successfully.
Mapping Task Outputs include the number of target rows processed, the number of source rows processed, and the number of error rows.

Mapping Task Advanced Properties
The Advanced tab for a Mapping task includes properties that the task uses to run the mapping. You can also assign a configuration property to task input. Then in the Mapping task Input tab, you can assign the property to a workflow parameter or variable.
Default Date Time Format  Date/time format the Data Integration Service uses when the mapping converts strings to dates. Select one of the predefined formats, or type a valid date format string.
Optimizer Level Controls the optimization methods that the Data Integration Service applies to a mapping as follows:
0 (None). The Data Integration Service does not optimize the mapping.
1 (Minimal). The Data Integration Service applies the early projection optimization method to the mapping.
2 (Normal). The Data Integration Service applies the early projection, early selection, pushdown, and predicate optimization methods to the mapping.
3 (Full). The Data Integration Service applies the early projection, early selection, pushdown, predicate, cost-based, and semi-join optimization methods to the mapping.
The property has an integer datatype. Default is 2 (Normal).
High Precision Runs the mapping with high precision. The property has a boolean datatype. Default is true.
Sort Order Order in which the Data Integration Service sorts character data in the mapping. The property has a string datatype. Default is Binary.
Override Tracing Level Overrides the tracing level for each transformation in the mapping. The tracing level determines the amount of information the Data Integration Service sends to the mapping log files. The property has a string datatype. Default is normal.
 
