Job Requirements
- Skill Combination: Java + AWS (70%) & Python (30%)
Job Responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Develops secure, high-quality production code, and reviews and debugs code written by others.
- Identifies opportunities to eliminate or automate the remediation of recurring issues to improve the overall operational stability of software applications and systems.
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies.
- Adds to the team culture of diversity, equity, inclusion, and respect.
- Provides system administration support for the Salesforce environment, especially related to customized applications, user permissions, security settings, custom objects, and workflows.
Required Qualifications, Capabilities, and Skills
- Formal training or certification in data engineering concepts and 8+ years of applied experience.
- Advanced proficiency in one or more programming languages, such as Java or Python.
- Hands-on practical experience delivering data pipelines.
- Proficient in all aspects of the Software Development Life Cycle.
- Advanced understanding of agile methodologies and practices such as CI/CD, Application Resiliency, and Security.
- Demonstrated proficiency and experience with cloud-native distributed systems.
- Ability to develop reports, dashboards, and processes to continuously monitor data quality and integrity.
- Working knowledge of Bitbucket and JIRA.
Preferred Qualifications, Capabilities, and Skills
- Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, and EMR.
- Strong experience with distributed computing frameworks such as Apache Spark, specifically PySpark.
- Strong hands-on experience building event-driven architectures using Kafka.
- Experience writing Splunk or CloudWatch queries and working with Datadog metrics.