Data Engineer
Wizcom Consulting Corporation
Data Engineer/ETL/DataStage/Informatica/Python/Banking
Hybrid/Troy, MI
1 year+
W2 only (no C2C)
Client Need
A senior (10+ years) Data Engineer or Data Developer with ETL development experience in DataStage, Informatica, and Snowflake, plus Python proficiency and strong SQL skills. Candidates must be senior, local to Troy, MI for a 3-days-a-week hybrid schedule, and have banking experience.
The manager wants to see ETL (DataStage & Informatica), strong SQL & Snowflake, and proficiency in Python scripting. Candidates need recent experience in banking or financial services, long projects/good tenure, excellent communication skills, and a state-issued ID (not utility bills) showing they are local.
Sr. Data Developer Experience
Responsibilities:
- The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes across the enterprise.
- They will stay on top of technology trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team.
- Provide technical leadership at every stage of the data engineering lifecycle, from designing data platforms, data pipelines, and data stores to gathering, importing, wrangling, querying, and analyzing data.
- The Sr. Data Engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members.
- Work daily within a project team environment, taking direction from project management and technical leaders.
- Responsible for design, development, administration, support, and maintenance of the Snowflake Platform and Oracle Platform.
- Participates in the full systems life cycle and in cloud data lake/data warehouse design and build, including recommendations on code development, integration with the data marketplace, reuse, and buy-versus-build solutions.
Job Responsibilities:
- Technical Leadership – Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses.
- Lead the development and documentation of technical best practices for ELT/ETL activities.
- Oversee program inception to build a new product if needed.
- Solution Design – Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures for integrating data lake and data warehouse solutions into an operational IT environment.
- Code Development – Ensures data engineering activities are aligned with scope, schedule, priority and business objectives.
- Oversees code development, unit and performance testing activities.
- Testing – Leads validation efforts by verifying data at intermediate stages between source and destination, and assists others in validating that the solution performs as expected.
- Meets or exceeds all operational readiness requirements (e.g., operations engineering, performance, and risk management).
- Ensure compliance with applicable federal, state, and local laws and regulations.
- Complete all required compliance training.
- Maintain knowledge of and adhere to Flagstar's internal compliance policies and procedures.
- Take responsibility to keep up to date with changing regulations and policies.