Data Engineering Consultant

TNP Consultants

Kochi

On-site

INR 6,00,000 - 12,00,000

Full time

20 days ago

Job summary

An innovative consulting firm is seeking a talented Data Engineer who thrives on solving complex problems with creative solutions. In this role, you will be responsible for designing and developing scalable data pipelines and architectures for both batch and real-time processing. You'll work with a variety of technologies, including relational and NoSQL databases, and collaborate with cross-functional teams to meet business needs. This is a fantastic opportunity to make a significant impact in a dynamic environment that values creativity and technical expertise. If you're ready to take your career to the next level, this position is perfect for you.

Qualifications

  • Expertise in SQL and relational databases is essential.
  • Proficiency in Python or Java is required.

Responsibilities

  • Design and maintain scalable data pipelines for processing.
  • Collaborate with teams to identify data requirements and deliver solutions.

Skills

SQL
Python
Java
Scala
ETL tools
Apache Spark
Hadoop
Kafka
AWS
Azure
GCP
Git
CI/CD pipelines
Data visualization
Machine learning
Docker
Kubernetes
Terraform

Job description

TNP is looking for an extraordinary Data Engineer who loves to push boundaries to solve complex business problems using creative solutions. As a Data Engineer, you will work in the Technology team that helps deliver our Data Engineering offerings on a large scale to clients worldwide.

Role Responsibilities:

  • Design, develop, and maintain scalable data pipelines and architectures for batch and real-time processing.
  • Build and optimize data integration workflows, ETL/ELT processes, and data transformation pipelines.
  • Implement data modeling, schema design, and data governance strategies to ensure data quality and consistency.
  • Work with relational and NoSQL databases, data lakes, and distributed systems to manage and store structured and unstructured data.
  • Develop, test, and deploy custom data solutions using programming languages such as Python and SQL.
  • Collaborate with cross-functional teams to identify data requirements and deliver solutions that meet business needs.
  • Monitor data pipelines for performance, reliability, and scalability, and troubleshoot issues as they arise.
  • Ensure data security and compliance with company policies and industry standards.
  • Document processes, tools, and systems for knowledge sharing and scalability.

Must-Have Skills:

  • Expertise in SQL and relational database systems (e.g., PostgreSQL, MySQL, Oracle).
  • Proficiency in programming languages like Python, Java, or Scala.
  • Hands-on experience with ETL tools.
  • Experience with Big Data frameworks such as Apache Spark, Hadoop, or Kafka.
  • Knowledge of cloud platforms (AWS, Azure, GCP) and tools like Redshift, Snowflake, or BigQuery.
  • Proficiency in working with data lakes, data warehouses, and real-time streaming architectures.
  • Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
  • Strong problem-solving, analytical, and communication skills.

Good to Have:

  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Knowledge of machine learning pipelines and experience collaborating with Data Scientists.
  • Exposure to containerization technologies like Docker and orchestration tools like Kubernetes.
  • Understanding of DevOps practices and Infrastructure as Code (IaC) tools such as Terraform.
  • Certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools.

