Data Engineer

Sidgs Digisol

Gurugram District, Dadri

On-site

INR 15,00,000 - 20,00,000

Full-time

Job summary

A technology solutions company based in Gurugram is seeking a Data Engineer with 6–8 years of experience in designing and maintaining data pipelines. The successful candidate will have strong expertise in cloud data platforms and ETL tools. This role focuses on optimizing data flows to enable data-driven decision-making within the organization.

Qualifications

  • 6–8 years of experience in data engineering or similar roles.
  • Strong experience with ETL tools like Apache Airflow, Talend, or AWS Glue.
  • Hands-on experience with cloud data platforms such as AWS, Azure, or GCP.

Responsibilities

  • Design, develop, and maintain robust ETL/ELT pipelines for structured and unstructured data.
  • Build and optimize data warehouses and data lakes.
  • Collaborate with teams to ensure data accuracy and availability.

Skills

SQL
ETL tools
Python
Data modeling
Cloud data platforms
Big data frameworks
CI/CD pipelines

Education

Bachelor's or Master's degree in Computer Science, IT, or Engineering

Tools

Apache Airflow
Talend
AWS Glue
PostgreSQL
MongoDB

Job description

Job Title: Data Engineer

Experience Required: 6–8 Years

Location: Gurugram/Noida

Employment Type: Full-time

About SID Global Solutions

SID Global Solutions is a premier Google implementation partner and a global technology services firm helping Fortune 500 enterprises across BFSI, Healthcare, Retail, Manufacturing, and the Public Sector accelerate digital transformation.

We specialize in AI, Cloud, Automation, API Management, and Modern Data Platforms, driving innovation and business growth at scale.

About the Role

We are seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have strong expertise in modern data technologies, cloud platforms, and big data ecosystems, with a passion for optimizing data flow and enabling data-driven decision-making across the organization.

Key Responsibilities
  • Design, develop, and maintain robust ETL/ELT pipelines for structured and unstructured data.
  • Build and optimize data warehouses, data lakes, and data models to support analytics and reporting needs.
  • Work closely with data analysts, data scientists, and business teams to ensure data accuracy and availability.
  • Implement data governance, quality, and security standards across the data ecosystem.
  • Manage and optimize data storage and retrieval for high performance and scalability.
  • Collaborate with cross-functional teams to migrate, integrate, and transform data across systems.
  • Monitor and troubleshoot data pipeline performance, ensuring minimal downtime.
  • Evaluate and implement new data tools, frameworks, and best practices to enhance data operations.

Required Skills & Qualifications
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
  • 6–8 years of experience in data engineering or similar roles.
  • Proficiency in SQL and experience with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB).
  • Strong experience with ETL tools (e.g., Apache Airflow, Talend, Informatica, AWS Glue, dbt).
  • Expertise in at least one programming language (Python, Scala, or Java).
  • Hands-on experience with cloud data platforms — AWS (Redshift, S3, Glue), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow).
  • Familiarity with big data frameworks (Spark, Hadoop, Kafka).
  • Experience with data modeling, schema design, and data warehousing concepts.
  • Knowledge of CI/CD pipelines, containerization (Docker/Kubernetes), and version control (Git).

Good to Have
  • Experience with real-time streaming data solutions.
  • Exposure to machine learning data pipelines or analytics platforms.
  • Familiarity with data governance and cataloging tools (Collibra, Alation, Apache Atlas).
  • Understanding of DevOps or MLOps principles in data environments.