Job Description
Lead Data Engineer
Location: Central London (Hybrid: 2 days per week onsite). £600-£700 Outside
Company Description
My client is a leading financial organisation committed to leveraging cutting-edge technology and data-driven insights to empower stakeholders and drive innovation within the industry. As they continue to expand their data capabilities, they are seeking a skilled Lead Data Engineer to spearhead the development and implementation of their data platform, while also advising and consulting on legacy data migrations from on-premises to the cloud in support of their carbon-neutrality goals.
Role Overview
As a Lead Data Engineer, you will play a pivotal role in designing, building, and maintaining their data infrastructure while collaborating closely with senior stakeholders across the organisation. Your expertise in Azure, Databricks, Spark, Python, and data modelling will be critical in driving the success of their data initiatives.
Key Responsibilities
- Lead the complete development cycle of data engineering projects, from conceptualisation through deployment, ensuring scalability and top-notch quality.
- Leverage internal data platforms to translate data into actionable insights, driving informed business decisions through adept application of analytical methods.
- Employ data engineering techniques to automate manual processes, tackling complex business challenges and enhancing efficiency.
- Enforce adherence to digital principles, ensuring the integrity, security, and compliance of solutions while meeting both functional and non-functional requirements.
- Embed observability into solutions, monitoring production performance, resolving incidents, and addressing underlying risks and issues.
- Advocate for client requirements throughout the development process, maintaining appropriate confidentiality.
- Standardise best practices and methodologies, sharing knowledge with engineers across the team.
- Influence the Data and Distribution architecture and technology stack within cloud-based infrastructure, serving as a hands-on lead in the realm of big data and data lakes.
- Encourage a culture of continual learning and innovation, fostering a mindset focused on improvement, automation, and accelerated time to market.
Required Skills and Qualifications
- Demonstrated expertise in architecting systems for real-time transaction processing alongside ETL applications.
- A comprehensive understanding of data modelling, data warehousing principles, and the Lakehouse architecture.
- Exceptional proficiency in ETL methodologies, preferably utilising Azure Databricks or equivalent technologies (Spark, Spark SQL, Python, SQL), including deep insight into ETL/ELT design patterns.
- Proficient in Databricks, SQL, and Python, with a robust understanding of software development life cycles.
- Familiarity with columnar and/or time series data design patterns, as well as performance optimisation techniques.
- Sound understanding of Infrastructure as Code and scripting languages.
- Thorough understanding of Information Security principles to ensure the secure, compliant handling and governance of data.
- Ability to communicate complex solutions clearly to technical and non-technical audiences.
- Strong analytical and problem-solving skills.
- Hands-on experience with Agile methodologies (SCRUM, Kanban).
- Proven track record as a collaborative team player with exceptional leadership qualities, capable of working across business units, teams, and geographical regions.