Informatica – Big Data Management Specialist – JHB
Location – Johannesburg
Long term contract (3 – 5 years)
Roles and Responsibilities:
Create customized BDM mappings/workflows for incremental loads using Informatica Developer and deploy them as part of an application on the Data Integration Service for native execution or pushdown using the Blaze or Spark execution engines.
Assist administrators in debugging: on BDM the logs are scattered between the native servers and YARN, and Spark and Blaze each write their own set of logs to separately configured directories, a prerequisite for Informatica Big Data products.
As part of change control, help administrators update service properties and migrate code between environments.
Be familiar with the Sqoop arguments required to source data from traditional database systems into Hadoop.
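For illustration, a Sqoop incremental import of the kind described above might look as follows. This is a minimal sketch: the JDBC URL, table, check column and target directory are all hypothetical placeholders, and the command is only printed (not executed) so it can be reviewed before running on a real cluster.

```shell
# Build (but do not run) a Sqoop incremental-import command.
# All connection details below are placeholder assumptions.
build_sqoop_cmd() {
  jdbc_url="$1"; table="$2"; check_col="$3"; last_val="$4"
  echo "sqoop import --connect ${jdbc_url} --table ${table}" \
       "--incremental append --check-column ${check_col} --last-value ${last_val}" \
       "--split-by ${check_col} --target-dir /data/landing/${table} -m 4"
}

# Print the command for review before pasting it onto a cluster edge node.
build_sqoop_cmd "jdbc:oracle:thin:@dbhost:1521/ORCL" "ORDERS" "ORDER_ID" "100000"
```

The `--check-column`/`--last-value` pair is what makes the import incremental: only rows with a key greater than the last loaded value are pulled.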
Compare load performances across BDM jobs. Understand the concept of control tables, which can help in scheduling BDM jobs/applications.
Work with business analysts to create proof-of-concept test cases based on the customer's needs around unit testing, QA testing and, most importantly, production reconciliation.
Develop shell scripts that take the domain, repository, Integration Service, user credentials, folder name and workflow as parameters to run jobs via the Informatica command line (pmcmd).
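A parameterized wrapper of that kind might be sketched as below. The domain, service, user, folder and workflow names are hypothetical, and the script echoes the pmcmd call instead of executing it, since pmcmd is only available on an Informatica host; the password is taken from an environment variable (via `-pv`) rather than passed on the command line.

```shell
# Sketch of a parameterized pmcmd wrapper (names are hypothetical).
# Echoes the startworkflow command for review instead of running it.
run_workflow() {
  domain="$1"; is_name="$2"; user="$3"; folder="$4"; workflow="$5"
  # -pv reads the password from the INFA_PASSWD environment variable,
  # keeping credentials out of the process list.
  echo "pmcmd startworkflow -sv ${is_name} -d ${domain} -u ${user}" \
       "-pv INFA_PASSWD -f ${folder} -wait ${workflow}"
}

run_workflow Domain_Prod IS_Prod etl_user DWH_Loads wf_daily_load
```

With `-wait`, the script blocks until the workflow finishes, so its exit status can drive downstream scheduling.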
Handle upgrade requests from version 10.2.2 through 10.2.4.
Write shell scripts to automate the weekly Informatica domain and repository backup process, for restoration in case of crashes.
Understand the different command-line tools available (pmcmd, pmrep, infacmd) that help automate the daily data loads. Use all the different ways of migrating code between environments, including deployment groups.
Use Informatica BDM to create and schedule workflows that reduce manual intervention.
Automate the weekly repository backups, archive logs and remove unnecessary caches to free up space on the file system.
Work on audit, balance and control of jobs to ensure no transactions are dropped during ETL, using automated shell scripts that control sending notifications to the support team.
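A minimal sketch of such a reconciliation check is below. The counts are hard-coded stand-ins for values that would really come from the control table or post-load SQL, and the notification step is shown only as a comment (a real job might use `mailx` or a ticketing hook).

```shell
# Compare source and target row counts after a load; flag any mismatch.
# Counts are illustrative stand-ins for control-table values.
check_counts() {
  src="$1"; tgt="$2"
  if [ "$src" -ne "$tgt" ]; then
    echo "MISMATCH: source=${src} target=${tgt} - notify support"
    # A real job would notify here, e.g. via mailx to the support list.
    return 1
  fi
  echo "OK: ${src} rows reconciled"
}

check_counts 120345 120345
```

Returning a non-zero status on mismatch lets the scheduler halt dependent jobs until the discrepancy is investigated.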
Experience in handling large deployments.
Performance-tune and maintain ETL applications and SQL databases, significantly reducing process run times and standardizing processes in production.
Create UNIX shell scripts and support implementation of automated ETL solutions, including job scheduling with Informatica pmcmd commands.
Consistently provide metrics (effort estimation and tracking on ETL tasks) and status reports to senior management.
Maintain security administration natively or via Active Directory; support access privileges for different levels of users/groups/roles, whether built-in or customized.
Qualification and requirements:
5+ years of data warehousing experience in the development, installation, upgrade, administration, analysis, design and implementation of data warehousing ETL solutions using Informatica Big Data Management.
Experience configuring Informatica BDM versions and integrating them with Kerberos-enabled Cloudera and Hortonworks Hadoop frameworks.
Experience applying incremental emergency bug fixes for product issues and following up with Informatica support on any issues.
Experience processing SAP source system data using the Informatica Data Transformation libraries and loading it into Hadoop.
Please send your CV ASAP to email@example.com