
Data Engineer - Specialist

Elfonze Technologies

India

Remote

INR 8,00,000 - 12,00,000

Full time

Today

Job summary

A technology solutions company in India is seeking a skilled Data Engineer to develop and maintain robust data pipelines and infrastructure. The role requires expertise in building scalable data solutions, strong programming skills, and experience with cloud platforms. Ideal candidates will have a track record of ensuring data quality and supporting data-driven decision-making. This position involves collaboration with data scientists and software engineers in an Agile environment.

Responsibilities

  • Build and optimize scalable data pipelines and workflows for batch and real-time processing.
  • Work closely with data scientists, analysts, and software engineers to support data requirements.
  • Develop and maintain data ingestion and transformation processes.
  • Automate data workflows and contribute to CI/CD for data pipelines.
  • Participate in Agile ceremonies such as sprint planning, daily standups, and sprint reviews to ensure project timelines and deliverables are met.

Skills

  • Building and managing data pipelines
  • Programming in Python, Java, or Scala
  • Strong SQL skills
  • Experience with cloud platforms (AWS, Azure, GCP)
  • Knowledge of data lakes and cloud storage solutions
  • Experience with ETL/ELT tools
  • Strong analytical skills
  • Effective communication

Job description

As a Data Engineer, you will develop and maintain robust data pipelines and infrastructure to support real-time and batch data processing. Your work will enable data-driven decision-making across the organization by ensuring data is clean, reliable, and accessible.

Requirements
  • Experience in building and managing data pipelines using tools like Apache Spark.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong SQL skills and experience with relational databases.
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Knowledge of data lakes and cloud storage solutions like S3, ADLS, or GCS.
  • Experience with ETL/ELT tools and frameworks such as Airflow, dbt, or Glue.
  • Strong analytical and debugging skills.
  • Effective communication and collaboration in a team-oriented environment.