Data Engineer
Tanisha Systems, Inc.
8 months ago
Job Details
My name is Sumit, and I am a Delivery Manager at Tanisha Systems Inc., a global contingency staffing firm. We have an excellent job opportunity with one of our clients.
Role: Data Engineer (Python and AWS)
Location: Jersey City - 3 days onsite per week
Experience: 9+ years
Job responsibilities
- " Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- " Develops secure high-quality production code, and reviews and debugs code written by others
- " Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- " Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- " Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- " Adds to team culture of diversity, equity, inclusion, and respect
- " Provides system administration support of Salesforce environment, especially related to customized applications, user permissions, security settings, custom objects and workflow
Required qualifications, capabilities, and skills
- " Formal training or certification on data engineering concepts and 8+ years of applied experience
- " Advanced in one or more programming language(s), such as Java, Python
- " Hands-on practical experience delivering data pipelines
- " Proficient in all aspects of the Software Development Life Cycle
- " Advanced understanding of agile methodologies such as CI/CD, Applicant Resiliency, and Security
- " Demonstrated proficiency and experience with cloud-native distributed systems
- " Ability to develop reports, dashboards, and processes to continuously monitor data quality and integrity
- " Working knowledge of bitbucket and JIRA
Preferred qualifications, capabilities, and skills
- " Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, EMR
- " Strong experience with distributed computing frameworks such as Apache Spark, specifically PySpark
- " Strong hands-on experience building event driven architecture using Kafka
- " Experience writing Splunk or Cloudwatch queries, DataDog metrics