DATA ENGINEER

Star Services LLC

Abu Dhabi

On-site

AED 120,000 - 180,000

Full-time

30+ days ago

Job summary

An innovative firm is looking for a skilled Data Warehouse and Machine Learning Engineer to join their team. In this role, you will design, develop, and maintain data warehousing solutions while implementing and optimizing machine learning algorithms. Your expertise in data pipeline management and cloud services will be crucial in extracting insights from large datasets. This dynamic position offers the chance to work with cutting-edge technologies, including Docker and Kubernetes, in a collaborative environment. If you're passionate about data and eager to contribute to groundbreaking solutions, this opportunity is perfect for you.

Qualifications

  • 5+ years of experience in data warehousing and machine learning.
  • Proficiency in SQL and programming languages like Python or Java.

Responsibilities

  • Design and maintain data warehousing solutions using Redshift, BigQuery, or Snowflake.
  • Implement and optimize machine learning algorithms for large datasets.

Skills

Data Warehousing
Machine Learning
SQL
Python
Java
Problem-solving
Communication

Education

Bachelor's degree in Computer Science
Bachelor's degree in Engineering

Tools

Redshift
BigQuery
Snowflake
Airflow
Luigi
AWS
Google Cloud Platform
Microsoft Azure
Oracle
Docker
Kubernetes

Job description

Data Warehouse and Machine Learning Engineer

We are seeking a highly skilled and experienced Data Warehouse and Machine Learning Engineer to join our team. In this role, you will be responsible for designing, developing, and maintaining data warehousing solutions such as Redshift, BigQuery, or Snowflake, as well as implementing and optimizing machine learning algorithms. You should also have a deep understanding of data pipeline and workflow management tools like Airflow or Luigi, along with experience working with cloud services such as AWS, Google Cloud Platform, Microsoft Azure, and Oracle. Additionally, knowledge of Docker and Kubernetes for containerization and orchestration is highly desirable.

Responsibilities:
  1. Design, develop, and maintain data warehousing solutions using Redshift, BigQuery, or Snowflake.
  2. Implement and optimize machine learning algorithms to extract insights from large datasets.
  3. Create and manage data pipelines using tools like Airflow or Luigi.
  4. Utilize cloud services such as AWS, Google Cloud Platform, Microsoft Azure, and Oracle to store and process data.
  5. Work with Docker and Kubernetes to containerize and orchestrate data workflows.
  6. Troubleshoot and resolve anomalies in system functionality and performance.
  7. Collaborate with cross-functional teams to understand business requirements and develop data solutions accordingly.
  8. Stay updated with the latest trends and advancements in data warehousing, machine learning, and cloud technologies.
Requirements:
  1. Bachelor's degree in Computer Science, Engineering, or a related field.
  2. Minimum of 5 years of experience in data warehousing and machine learning.
  3. Proficiency in SQL and programming languages such as Python or Java.
  4. Strong understanding of data pipeline and workflow management tools like Airflow or Luigi.
  5. Experience working with cloud services like AWS, Google Cloud Platform, Microsoft Azure, and Oracle.
  6. Knowledge of Docker and Kubernetes for containerization and orchestration.
  7. Excellent problem-solving and troubleshooting skills.
  8. Ability to work independently as well as in a team environment.
  9. Strong communication and collaboration skills.

If you are passionate about data and have a strong background in data warehousing and machine learning, we would love to hear from you! Join our dynamic team and be a part of building innovative solutions using cutting-edge technologies.
