Python/Spark Big Data Software Engineer III

JPMorgan Chase
7 months ago

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Python / Spark Big Data Software Engineer III at JPMorgan Chase within the Capital Reporting product, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities:

  • Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies
  • Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:

  • Formal training or certification in cloud or microservice architecture concepts and proficient applied experience
  • Demonstrated knowledge of software applications and technical processes within a cloud or microservice architecture
  • Hands-on practical experience in system design, application development, testing, and operational stability
  • Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
  • Overall knowledge of the Software Development Life Cycle
  • Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security

Preferred qualifications, capabilities, and skills:

  • Skilled with Python or PySpark
  • Exposure to cloud technologies such as Databricks, Airflow, Astronomer, Kubernetes, AWS, Spark, and Kafka
  • Experience with Big Data solutions or relational databases
  • Experience in the financial services industry is a bonus

Key skills

AWS
