Data Engineer
Wizcom Corporation
7 months 2 weeks ago
Job Details
Data Engineer/ETL/DataStage/Informatica/Python/Banking
Hybrid/Troy, MI
1 year+
W2 only (no C2C)
Client need: A senior (10+ years) Data Engineer or Data Developer with ETL development experience on DataStage, Informatica, and Snowflake, plus proficiency in Python and strong SQL. Candidates must be senior, local to Troy, MI for a 3-day-a-week hybrid schedule, and have recent banking or financial-services experience. The manager wants to see ETL (DataStage and Informatica), strong SQL and Snowflake, and proficiency in Python scripting. Candidates must have long projects/good tenure, excellent communication skills, and a state-issued ID (not utility bills) showing they are local.
Job Responsibilities:
- Technical Leadership: Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-premises and cloud data lakes and data warehouses. Lead the development and documentation of technical best practices for ELT/ETL activities. If needed, oversee a program inception to build a new product.
- Solution Design: Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures, for integrating data lake and data warehouse solutions into an operational IT environment.
- Code Development: Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversee code development and unit and performance testing activities. Responsible for writing code and leading the team in implementing the solution.
Job Requirements:
- High School Diploma, GED, or foreign equivalent required.
- Bachelor's in Computer Science, Mathematics, or related field + 10 years of development experience preferred, or 10 years comparable work experience required.
- 10 years of experience designing, developing, testing, and implementing Extract, Transform and Load (ETL) or Extract, Load and Transform (ELT) solutions using enterprise ELT/ETL tools.
- 15 years of comparable work experience (in lieu of a degree).
- 10 years of experience developing and implementing data integration, data lake, and data warehouse solutions in on-premises and cloud environments.
- 5 years of experience working with Business Intelligence tools such as IBM Cognos (preferred), Power BI, and Alteryx.
- 7 years of experience working with APIs, data as a service, data marketplaces, and data mesh.
- 10 years of experience with various Software Development Life Cycle methods such as Agile, SCRUM, Waterfall, etc.
- 3 years of experience in a 100+ TB data environment.