
Snowflake Developer

Location: Plano, United States
Posted: 01-October-2020
Description

Assist in developing, executing, and communicating project plans, timelines, and deliverables lists associated with the DataMart transformation
Assist in enhancing BI workflows
Analyze and implement changes to existing ETL processes and target tables
Design, Develop, Implement (DDI), and Maintain data flows (source, target, and data definitions/metadata)
o Optimize (for speed and maintainability) and maintain existing ETL code
o Automate data flows using Apache NiFi, Kafka, or DataStage, as appropriate (a Kafka sketch follows this list)
o Design, Implement, and Maintain reusable components for Extract, Transform, and Load (ETL) jobs
o DDI of necessary validation, reconciliation, and quality controls
o DDI of Control-M jobs for daily, weekly, and monthly batch data processing
Design, Develop, Implement, and Maintain the data models (logical and physical) in partnership with business partners
Knowledge transfer and documentation of data models and data flows to USAA full-time employees
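As a rough illustration of the data-flow automation item above, here is a minimal Python sketch assuming the kafka-python client; the broker address, topic name, and sample record are hypothetical and not part of the posting:

    # Minimal sketch: publish extracted rows to a Kafka topic for downstream ETL.
    # Assumes the kafka-python package; broker and topic names are hypothetical.
    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="broker.example.com:9092",   # hypothetical broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    def publish_rows(topic, rows):
        """Send each extracted row to the given topic as a JSON message."""
        for row in rows:
            producer.send(topic, value=row)
        producer.flush()  # block until all buffered records are delivered

    publish_rows("datamart.staging", [{"account_id": 1, "balance": 42.0}])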
Provide DDI Services for BDW (Bank Data Warehouse) migration and retirement
o Analyze changes required in the target platform to accommodate BDW data
o Design and implement required changes to target models
o Design, develop, and implement ETL to load and integrate target data structures
o Create test infrastructure to validate the migration
General Purpose Datamart (GPM)
o Optimize the strategic GPM data architecture
o Design, Implement, and Maintain data flows between non-strategic platforms and Netezza or other strategic platforms (source, target, and data definitions/metadata)
Enhance the Bank Foundation Layer on Hadoop
o Design, develop, and implement controls for the Bank Data Foundation Layer
o Modify/customize the enterprise framework and validate it for the Bank
o Design and develop reusable components to extract/load data to downstream marts and databases
o Design, develop, and implement ETL to load and integrate target data structures (see the Spark sketch below)
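One plausible shape for the Hadoop-side ETL described above is a PySpark job writing into Hive on the foundation layer; this is a minimal sketch, and the HDFS path, table name, and key column are assumptions:

    # Minimal sketch: load a raw HDFS extract into a Hive foundation-layer table.
    # Paths, table names, and the dedup key are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("bank-foundation-load")   # hypothetical job name
        .enableHiveSupport()               # read/write Hive metastore tables
        .getOrCreate()
    )

    # Read the raw extract (hypothetical HDFS path) and apply a simple control.
    raw = spark.read.parquet("hdfs:///data/raw/bank/accounts")
    cleaned = raw.dropDuplicates(["account_id"])

    # Overwrite the target foundation-layer table (hypothetical name).
    cleaned.write.mode("overwrite").saveAsTable("foundation.accounts")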
Provide DDI Services for Bank compliance requirements
o Analyze changes required in the target platform to accommodate compliance requirements
o Design and implement required changes to target models
o Design, develop, and implement ETL to load and integrate target data structures
o DDI of necessary audit, validation, reconciliation, and quality controls
o Create test infrastructure to validate the migration (a reconciliation sketch follows)
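The reconciliation and quality controls mentioned above often reduce to comparing source against target after a load. A minimal sketch, assuming two generic DB-API connections; table names and the count-based check are illustrative only:

    # Minimal sketch: reconcile row counts between a source and target table.
    # Connections and table names are hypothetical; any DB-API driver works.
    def row_count(conn, table):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name assumed trusted
        return cur.fetchone()[0]

    def reconcile(src_conn, tgt_conn, src_table, tgt_table):
        """Raise if source and target row counts diverge after a load."""
        src = row_count(src_conn, src_table)
        tgt = row_count(tgt_conn, tgt_table)
        if src != tgt:
            raise AssertionError(
                f"Count mismatch: {src_table}={src}, {tgt_table}={tgt}"
            )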

Preferred Requirements:
Hadoop platforms such as Hortonworks Data Platform, including but not limited to:
Hadoop Distributed File System (HDFS)
Apache Hive
Apache Spark
Apache Tez
Apache YARN
RDBMS: DB2, SQL Server, Oracle
Netezza, Sailfish
Snowflake
Python and assorted libraries and IDEs
Perl
Korn & Bash shell scripts
Apache NiFi and Kafka
IBM InfoSphere DataStage
BMC Control-M
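Given the Snowflake and Python items above, connectivity typically goes through the snowflake-connector-python package; this is a minimal sketch, and the account identifier, credentials, warehouse, and database names are placeholders:

    # Minimal sketch: run a query against Snowflake from Python.
    # Assumes snowflake-connector-python; all connection values are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",       # placeholder account identifier
        user="ETL_USER",         # placeholder credentials
        password="...",
        warehouse="ETL_WH",
        database="DATAMART",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT CURRENT_VERSION()")
        print(cur.fetchone()[0])
    finally:
        conn.close()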

 