Data Engineer

ATOM Insurance

Dubai

On-site

AED 50,000 - 100,000

Full time

7 days ago

Job summary

An established industry player is seeking a skilled Data Engineer to join their dynamic team. In this pivotal role, you will collaborate with cross-functional teams to design and implement data solutions that drive business decisions. Your expertise in data engineering, SQL, and AWS will be essential as you build scalable data infrastructure and manage large datasets. This position offers the opportunity to work on cutting-edge projects within a supportive environment that values your contributions and encourages professional growth.

Qualifications

  • 5+ years in data engineering with customer-facing roles.
  • Experience delivering data solutions on AWS Cloud Platform.
  • Proficient in SQL, Python, and data modeling.

Responsibilities

  • Build data warehousing layers based on use cases.
  • Lead design and delivery of complex data solutions.
  • Ensure data accuracy and availability for analytics.

Skills

Data Engineering
SQL
Python
ETL
AWS Cloud
Stakeholder Management
Problem-Solving
Data Modeling
CI/CD
Agile Methodologies

Education

Bachelor's in Computer Science
AWS Certification

Tools

GitLab
Jenkins
Talend
Informatica
Matillion
Spark
Kafka
Kinesis

Job description

Bachelor's in Computer Applications (Computers)

Nationality: Any Nationality

Vacancy: 1 Vacancy

You will be working on projects with teams from across our ATOM Platform and Business Intelligence/Data Science practice. Reporting to the Chief Product Officer, you will deliver digestible, contemporary, and immediate data content to support and drive business decisions. The key focus of the role is to deliver reports, dashboards, and custom solutions for various business-critical requirements. You will be involved in all aspects of data engineering, from delivery planning, estimating, and analysis, to data architecture, pipeline design, delivery, and production implementation. From day one, you will participate in designing and implementing complex data solutions, including batch, streaming, and event-driven architectures across cloud, on-premise, and hybrid environments.

Responsibilities
  1. Build various data warehousing layers based on specific use cases.
  2. Lead the design, implementation, and successful delivery of large-scale or complex data solutions.
  3. Develop scalable data infrastructure, understanding distributed systems from storage and compute perspectives.
  4. Apply expertise in SQL, ETL, and data modeling.
  5. Ensure data accuracy and availability, understanding the impact of technical decisions on business analytics and reporting.
  6. Proficiently handle large data volumes using scripting/programming languages.
Experience & Skills

Required:

  • 5+ years of experience in data engineering, with a focus on customer or business-facing roles.
  • Ability to communicate requirements effectively to both technical and non-technical audiences.
  • Stakeholder management, problem-solving, and interpersonal skills.
  • Experience with SDLC methodologies, including Agile, Waterfall, and hybrid models.
  • Proven experience delivering data solutions on AWS Cloud Platform.
  • Strong understanding of data modeling, data structures, databases, and ETL processes.
  • Experience with large-scale structured and unstructured data.
  • Strong skills in SQL and Python.
  • Experience with CI/CD and DevOps practices in data environments.
  • 3-5 years of consulting or client service delivery experience on AWS.
  • Hands-on experience with big data and AWS technologies such as EC2, S3, Lambda, Spark, Hive, Pig, Oozie, Kafka, Kinesis, NiFi, etc.
  • Proficiency in programming languages such as Java, C#, Node.js, Python, SQL, and Unix shell/Perl scripting.
  • Experience with DevOps tools like GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy.
  • Bachelor's or higher degree in Computer Science or related discipline.
  • At least one AWS certification, such as AWS Certified DevOps Engineer - Professional.

Preferred:

  • Experience with multi-cloud environments and developing ETL solutions using tools like Talend, Informatica, Matillion.
  • Knowledge of IoT, event-driven architectures, microservices, containers, and Kubernetes.

Disclaimer: Naukrigulf.com is a platform connecting jobseekers and employers. Applicants should verify employer credentials independently. We do not endorse requests for money or sharing sensitive information. For security concerns, contact abuse@naukrigulf.com.
