
Data Engineer

Wizcom Consulting

Data Engineer/ETL/DataStage/Informatica/Python/Banking

Hybrid/Troy, MI

1 year+

W2 only (no C2C)

Sr. Data Developer Experience

The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes across the enterprise. He/she will stay on top of technology trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team. The role provides technical leadership at every stage of the data engineering lifecycle, from designing data platforms, data pipelines, and data stores to gathering, importing, wrangling, querying, and analyzing data.

The Sr. Data Engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members, working daily within a project team environment and taking direction from project management and technical leaders. The role is responsible for the design, development, administration, support, and maintenance of the Snowflake and Oracle platforms, and participates in the full systems life cycle and in cloud data lake/data warehouse design and build, including recommendations on code development, integration with a data marketplace, reuse, and buy-versus-build solutions.

Job Responsibilities

  • Technical Leadership: Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-premises and cloud data lakes and data warehouses.
  • Solution Design: Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures for integrating data lake and data warehouse solutions into an operational IT environment.
  • Code Development: Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversee code development and unit and performance testing activities. Responsible for coding and leading the team to implement the solution.
  • Testing: Lead validation efforts by verifying the data at the intermediate stages between source and destination, and assist others in validating that the solution performs as expected. Meet or exceed all operational readiness requirements (e.g., operations engineering, performance, and risk management).
  • Compliance: Ensure compliance with applicable federal, state, and local laws and regulations. Complete all required compliance training. Maintain knowledge of and adhere to Flagstar's internal compliance policies and procedures. Take responsibility for keeping up to date with changing regulations and policies.

Job Requirements

  • High School Diploma, GED, or foreign equivalent required.
  • Bachelor's in Computer Science, Mathematics or related field + 10 years of development experience preferred, or 10 years comparable work experience required.
  • 10 years of experience designing, developing, testing, and implementing Extract, Transform and Load (ELT/ETL) solutions using enterprise ELT/ETL tools, or 15 years of comparable work experience.
  • 10 years of experience developing and implementing data integration, data lake and data warehouse solutions in an on-premise and cloud environment.
  • 5 years of experience working with Business Intelligence tools (IBM Cognos preferred), Power BI, and Alteryx.
  • 7 years of experience working with APIs, data as a service, data marketplaces, and data mesh.
  • 10 years of experience with various Software Development Life Cycle methods such as Agile, SCRUM, Waterfall, etc.
  • 3 years of experience in a 100+ TB data environment.
  • Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.
  • Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
  • Experience in data modeling for self-service business intelligence, advanced analytics, and user applications.
  • Experience with Data Science, including AI/ML engineering, ML framework/pipeline builds, and predictive/prescriptive analytics on AWS SageMaker.
