Bachelor's in Computer Application (Computers)
Nationality: Any Nationality
Vacancies: 1
Job Description
You will work on projects with teams from across our ATOM Platform and Business Intelligence/Data Science practice. Reporting to the Chief Product Officer, you will deliver clear, current, and timely data content to support and drive business decisions. The key focus of the role is delivering reports, dashboards, and custom solutions for business-critical requirements. You will be involved in all aspects of data engineering, from delivery planning, estimating, and analysis through data architecture, pipeline design, delivery, and production implementation. From day one, you will participate in designing and implementing complex data solutions, including batch, streaming, and event-driven architectures across cloud, on-premises, and hybrid environments.
Responsibilities
- Build various data warehousing layers based on specific use cases.
- Lead the design, implementation, and successful delivery of large-scale or complex data solutions.
- Develop scalable data infrastructure, understanding distributed systems from storage and compute perspectives.
- Apply expertise in SQL, ETL, and data modeling.
- Ensure data accuracy and availability, understanding the impact of technical decisions on business analytics and reporting.
- Handle large data volumes efficiently using scripting and programming languages.
Experience & Skills
Required:
- 5+ years of experience in data engineering, with a focus on customer- or business-facing roles.
- Ability to communicate requirements effectively to both technical and non-technical audiences.
- Stakeholder management, problem-solving, and interpersonal skills.
- Experience with SDLC methodologies, including Agile, Waterfall, and hybrid models.
- Proven experience delivering data solutions on AWS Cloud Platform.
- Strong understanding of data modeling, data structures, databases, and ETL processes.
- Experience with large-scale structured and unstructured data.
- Strong skills in SQL and Python.
- Experience with CI/CD and DevOps practices in data environments.
- 3-5 years of consulting or client service delivery experience on AWS.
- Hands-on experience with AWS and big data technologies such as EC2, S3, Lambda, Spark, Hive, Pig, Oozie, Kafka, Kinesis, and NiFi.
- Proficiency in programming and scripting languages such as Java, C#, Node.js, Python, SQL, and Unix shell/Perl.
- Experience with DevOps tools such as GitLab, Jenkins, and AWS CodeBuild, CodePipeline, and CodeDeploy.
- Bachelor's or higher degree in Computer Science or related discipline.
- At least one AWS certification, such as AWS Certified DevOps Engineer - Professional.
Preferred:
- Experience with multi-cloud environments and with developing ETL solutions using tools such as Talend, Informatica, or Matillion.
- Knowledge of IoT, event-driven architectures, microservices, containers, and Kubernetes.