Data Engineer - ETL/Python - Banking

VSG Business Solutions
6 months 4 weeks ago

Job Requirements

  • High School Diploma, GED, or foreign equivalent required.
  • Bachelor's degree in Computer Science, Mathematics, or a related field plus 10 years of development experience preferred, or 10 years of comparable work experience required.
  • 10 years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ETL/ELT) solutions using enterprise ETL/ELT tools, or 15 years of comparable work experience.
  • 10 years of experience developing and implementing data integration, data lake, and data warehouse solutions in on-premises and cloud environments.
  • 5 years of experience working with Business Intelligence tools (IBM Cognos preferred), Power BI, and Alteryx.
  • 7 years of experience working with APIs, data as a service, data marketplaces, and data mesh.

Skills & Experience

  • 10 years of experience with various Software Development Life Cycle methods such as Agile, Scrum, Waterfall, etc.
  • 3 years of experience in a 100+ TB data environment.
  • Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.
  • Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
  • Experience in data modeling for self-service business intelligence, advanced analytics, and user applications.
  • Experience with data science, including AI/ML engineering, ML framework/pipeline builds, and predictive/prescriptive analytics on AWS SageMaker.

Responsibilities

The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ETL/ELT), data integration, and data analytics processes across the enterprise. They will stay on top of technology trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team.

Job Responsibilities

  • Technical Leadership
    • Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-premises and cloud data lakes and data warehouses.
    • Lead the development and documentation of technical best practices for ELT/ETL activities.
  • Solution Design
    • Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures, for the integration of data lake and data warehouse solutions in an operational IT environment.
  • Code Development
    • Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives.
    • Oversee code development and unit and performance testing activities.
  • Testing
    • Lead validation efforts by verifying data at intermediate stages between source and destination, and assist others in validating that the solution performs as expected.
    • Meet or exceed all operational readiness requirements (e.g., operations engineering, performance, and risk management).
