
Informatica ETL

Location: Thiruvananthapuram, India
Posted: 21 August 2021
Description
Design and build ETL jobs to support SVB's enterprise data warehouse. Write Extract-Transform-Load (ETL) jobs using standard ETL tools, as well as Spark/Hadoop jobs, to calculate business metrics.
Partner with the business team to understand business requirements and the impact on existing systems, and design and implement new data provisioning pipeline processes for the Finance / External Reporting domains.
You will also have the opportunity to apply your skills in the following areas: Big Data technologies; designing, implementing, and building our enterprise data platform (EDP).
Design data schemas and operate internal data warehouses and SQL/NoSQL database systems.
Monitor and troubleshoot operational or data issues in the data pipelines.
Drive architectural plans and implementation for future data storage, reporting, and analytics solutions.
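
The Spark/Hadoop ETL work described above can be illustrated with a minimal PySpark sketch. The source path, column names, and the metric computed here are hypothetical placeholders, not part of any actual SVB pipeline.

# Minimal PySpark ETL sketch: extract raw CSV data, transform it into a
# daily business metric, and load the result as partitioned Parquet.
# All paths, column names, and the metric itself are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-transaction-metrics").getOrCreate()

# Extract: read raw transaction records (hypothetical source location).
transactions = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/transactions/2021-08/")
)

# Transform: compute total, average, and count of transactions per day and branch.
daily_metrics = (
    transactions
    .withColumn("txn_date", F.to_date("txn_timestamp"))
    .groupBy("txn_date", "branch_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.avg("amount").alias("avg_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: write the metrics to the warehouse landing zone, partitioned by date.
(
    daily_metrics.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-warehouse-bucket/metrics/daily_transactions/")
)

spark.stop()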


Basic Qualifications

Bachelor's degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.
5 years of relevant work experience in analytics, data engineering, business intelligence, or a related field, and 5 years of professional experience overall.
2 years of experience implementing big data processing technology: Hadoop, Apache Spark, etc.
Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
Detailed knowledge of data warehouse technical architecture, infrastructure components, ETL, and reporting/analytics tools and environments.
Hands-on experience with major ETL tools such as Ab Initio, Informatica / IICS, BODS, and/or cloud-based ETL tools.
Hands-on experience with scheduling tools such as Control-M, Redwood, or Tidal.
Good understanding of and experience with reporting tools such as Tableau, BOXI, etc.
Good understanding of database and data warehouse concepts; hands-on experience with major databases such as Oracle, SQL Server, and PostgreSQL.
Awareness of NoSQL databases.
Hands-on experience with cloud technologies (AWS / Google Cloud / Azure) covering data ingestion tools (both real-time and batch), CI/CD processes, cloud architecture, and big data implementation.
AWS certification is a plus, and working knowledge of Glue, S3, Athena, and Redshift is a plus (a brief example is sketched after this list).
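
For the AWS services named above (Glue, S3, Athena, Redshift), the following is a minimal boto3 sketch of running an Athena query over S3 data catalogued in Glue; the database, table, bucket names, and region are hypothetical placeholders.

# Minimal boto3 sketch: run an Athena query over data catalogued by Glue and
# stored in S3, then poll for completion. Database, table, bucket, and region
# are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-west-2")

# Start the query; results land in the (hypothetical) S3 output location.
execution = athena.start_query_execution(
    QueryString="SELECT branch_id, SUM(amount) AS total_amount "
                "FROM daily_transactions GROUP BY branch_id",
    QueryExecutionContext={"Database": "example_finance_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (simplified; production code would add timeouts).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

# Print each result row if the query succeeded.
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
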
Preferred Qualifications

Graduate degree in Computer Science, Mathematics, Statistics, Finance, or a related technical field.
Strong ability to communicate effectively with both business and technical teams.
Demonstrated experience delivering actionable insights for a consumer business.
Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.).
Basic experience with cloud technologies.
Experience in the banking domain is a plus.


Preferred skills


Experience: Minimum 5 to 8 years.

 