
Data Engineer - Databricks, Python

N Consulting Ltd
Posted 3 weeks 1 day ago

Location

Wroclaw, Poland (travel 3 times a week)

Experience

5 to 12 years

Responsibility

  • Create and implement highly scalable and reliable data distribution solutions using VQL, Python, Spark and open-source technologies to deliver data to business components.

  • Work with Denodo, ADLS, Databricks, Kafka, data modelling, data replication, clustering, and SQL query patterns and indexing for handling large data sets.

  • Demonstrate experience with Python and data-access libraries (NumPy, SciPy, pandas, etc.), machine-learning frameworks (TensorFlow, etc.) and AI tools (ChatGPT, etc.).

  • 4-5 years of hands-on experience developing large-scale applications using data virtualization and/or data streaming technologies.

  • Software engineer/developer focused on cloud-based data virtualization and data delivery technologies; Denodo platform familiarity and SQL experience highly desirable.

  • Know how to apply the standards, methods, techniques and templates defined by our SDLC, including code control, code inspection and code deployment.

  • Design, plan and deliver solutions in a large-scale enterprise environment. Work with solution architects and business analysts to define the implementation design, and code the assigned modules/responsibilities to the highest quality (bug-free).

  • Determine the technical approaches to be used and define the appropriate methodologies.

  • Work effectively in a collaborative, multi-site environment to support rapid development and delivery of results and capabilities (i.e. an Agile SDLC).

  • Effectively communicate technical analyses, recommendations, status and results to the project management team. Produce secure, clean code that is stable, operational, consistent and well-performing.
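
The role above calls for SQL query patterns and indexing on large data sets. As a minimal, self-contained sketch of why indexing matters (using Python's built-in sqlite3 as a stand-in engine; the actual stack is Denodo/Databricks, and the table and column names here are invented):

```python
import sqlite3

# In-memory database as a stand-in for a real warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM events WHERE user_id = 42"

# Without an index, the predicate forces a full table scan.
plan_scan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filter column lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_indexed = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_scan)     # plan detail mentions a scan of events
print(plan_indexed)  # plan detail mentions idx_events_user
```

The same principle carries over to Databricks/Denodo, where partitioning, clustering and statistics play the role the B-tree index plays here.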

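As an illustration of the pandas/NumPy data-access skills listed above (a minimal sketch with invented column names, not project code):

```python
import numpy as np
import pandas as pd

# Hypothetical transactions table; columns are illustrative only.
df = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3],
    "amount": [10.0, 20.0, 5.0, np.nan, 7.5],
})

# Typical cleanup + aggregation pattern: impute missing values,
# then compute per-user totals.
df["amount"] = df["amount"].fillna(0.0)
totals = df.groupby("user_id")["amount"].sum()
print(totals.to_dict())  # {1: 30.0, 2: 5.0, 3: 7.5}
```

In a Databricks setting the same pattern would typically be written against a Spark DataFrame, but the groupby/aggregate shape is identical.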
Contract

12 months + extension

