Senior Data Engineer
Odesa, UA
1 day ago
Source: Just Join IT

IoT (nice to have)

Machine Learning (nice to have)

Golang (nice to have)

Terraform (regular)

Infrastructure as code (regular)

Python (advanced)

We develop tailor-made IT systems that boost the businesses of our clients.

In our rich and constantly growing portfolio you can find:

  • systems supporting very complex business processes
  • our own solutions, which have revolutionized the businesses and increased the profits of our partners
  • our own pioneering products in terms of concept and technology, created by our Research & Development Department
  • last but not least: a huge backlog of innovative ideas and concepts which we will soon turn into working prototypes, and ultimately into profitable start-ups

    Currently we are looking for a Senior Data Engineer, as we are supporting our partner in developing a Global Analytics unit: a global, centralized team whose ambition is to strengthen data-driven decision-making and to develop smart data products for day-to-day operations.

    The team is meant to be a nucleus radiating a data-driven, entrepreneurial culture, acting as an incubator that realizes ideas by simply doing it: creating smart data products in all areas of the business, be it sales, marketing, purchasing, logistics or any other part of the company.

    The team has a lot of freedom to shape this, especially in the use of tools and technology, but also by introducing new concepts, solutions and ways of working.

    The first project you would participate in focuses on implementing IoT solutions supporting almost all operations, processes and functions run at production sites.

    If you want to:

  • take part in the development and implementation of a complex system of smart data solutions used as a core part of an IoT venture
  • have the opportunity to work on bleeding-edge projects
  • have a chance to see how your visions come true
  • work with the world’s top IT professionals
  • carry out projects which address real business challenges
  • work in a global and diverse team with global reach
  • have a real impact on the projects you work on and the environment you work in
  • have a chance to propose innovative solutions and initiatives,

    then it’s probably a good match.

    Moreover, if you like:

  • flexible working hours
  • a casual working environment with no corporate bureaucracy
  • having access to benefits such as Multisport and Luxmed
  • working in a modern office in the centre of Warsaw with good transport links, or working remotely as much as you want
  • a relaxed atmosphere at work where your passions and commitment are appreciated
  • vast opportunities for self-development (e.g. online courses and a library, experience exchange with colleagues around the world, partial funding of certification),

    then it’s certainly a good match!

    If you join us, your responsibilities will include:

  • structuring whole processes of data extraction, transformation and storage, using serverless AWS services to deploy models and analytical solutions
  • writing and maintaining ETL processes in Python
  • implementing, maintaining and further developing the Python packages for ETL processes, data lineage and operator inputs, including the underlying logic
  • designing and implementing the company’s data standard as database models
  • designing and implementing data flows
  • writing unit and integration tests for Python modules
  • participating in mission-critical processes of the data pipeline
  • providing technical support in understanding business problems and designing smart data products

    We expect:

  • significant commercial experience in a similar position
  • an MS or PhD in Computer Science or a related field
  • fluency in extracting information from databases
  • excellent SQL skills
  • strong software engineering skills in Python (including unit testing, OOP)
  • ability to write clean, efficient and scalable code
  • very good working knowledge of Amazon Web Services (including SageMaker)
  • experience building and releasing Infrastructure as Code, with working knowledge of tools such as Terraform
  • experience working with large datasets through Spark and RDBMSs
  • experience with version control systems (e.g. Git), a DevOps way of working and DevOps tools
  • at least basic knowledge of how machine learning models work and how they are deployed and monitored
  • a strong interest in machine learning and a willingness to develop your skills and pursue your career in this direction
  • experience working in organizations with an agile culture
  • being a team player
  • fluent English, as you will communicate in English almost all the time

    If interested, please let us get to know you by sending your CV using the "Apply" button.

    Please add the following clause to your CV:

    I hereby agree to the processing of my personal data included in my job offer by IIIT spółka z ograniczoną odpowiedzialnością located in Warsaw for the purpose of the current recruitment process.

    If you want to be considered in future recruitment processes, please also add the following statement:

    I also agree to the processing of my personal data for the purpose of future recruitment processes.
