Cloud Data Engineer
Odesa, UA
10 hours ago
source: Just Join IT

Data Warehouse (nice to have)

DevOps tools (nice to have)

Amazon AWS (regular)

We are looking for a Cloud Data Engineer to join our growing team of data experts. This person will be responsible for expanding and optimizing our data and data warehouse architecture for cost and performance.

The ideal candidate is an experienced data wrangler with extensive data platform experience who enjoys building platforms from the ground up and optimizing them.

The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure the data delivery architecture remains consistent and optimal across ongoing projects.

They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of designing & optimizing our company’s data architecture to support our next generation of products and data initiatives.


  • Create and support data pipelines that ingest data into the data lake using Spark and other technologies.
  • Design and support the data lake for new and existing data sources.
  • Design and develop systems to maintain the business’s data warehouse, ETL processes, and machine learning workloads.
  • Provide day-to-day support for the data platform and troubleshoot existing data pipelines and processes.
  • Define and promote best practices and design principles for data lake techniques and architecture.
  • Improve data quality by using and improving tools that detect issues automatically.
  • Monitor and troubleshoot performance issues on the data lake and assist in developing business intelligence, business data standards, and processes.
  • Identify new data needs and delivery mechanisms for acquiring and reporting that information.
  • Drive improvements in data organization and accuracy.
  • Drive technological decision-making for the business’s future data, analysis, and machine learning needs.

Qualifications

  • Extensive experience with AWS cloud services such as Redshift, Glue, and Lambda.
  • Experience with Big Data pipeline tools: Spark, EMR / Hadoop, Hudi, Kinesis, etc.
  • Experience with data pipeline workflow management tools: Step Functions, EventBridge, SQS, SNS, etc.
  • Bachelor’s degree in Computer Science, Data Science, Information Technology, Information Systems, Statistics, or any other related field.
  • Ability to write, analyze, and debug SQL queries, and proficiency in scripting languages such as Python.
Good to have

  • Experience with infrastructure and DevOps / DataOps tools and languages: CloudFormation + CDK, GitLab, YAML, JSON, etc.
What’s in it for me?

  • Competitive offer package, including a Multisport card, health insurance, and vouchers.
  • The best medical cover on the market with free dental care.
  • Annual bonus plan.
  • Remote work from home.
  • One extra day of holiday annually.
  • Opportunity to contribute to developing and maintaining a live platform product that makes an impact on people’s lives.