TNP is looking for an extraordinary Data Engineer who loves to push boundaries and solve complex business problems with creative solutions. As a Data Engineer, you will work in the Technology team that helps deliver our Data Engineering offerings at scale to clients worldwide.
Role Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures for batch and real-time processing.
- Build and optimize data integration workflows, ETL/ELT processes, and data transformation pipelines (a brief illustrative sketch follows this list).
- Implement data modeling, schema design, and data governance strategies to ensure data quality and consistency.
- Work with relational and NoSQL databases, data lakes, and distributed systems to manage and store structured and unstructured data.
- Develop, test, and deploy custom data solutions using programming languages such as Python and SQL.
- Collaborate with cross-functional teams to identify data requirements and deliver solutions that meet business needs.
- Monitor data pipelines for performance, reliability, and scalability, and troubleshoot issues as they arise.
- Ensure data security and compliance with company policies and industry standards.
- Document processes, tools, and systems for knowledge sharing and scalability.
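To give a flavor of the pipeline work described above, here is a minimal, illustrative ETL sketch in Python. The file names, table name, and columns are hypothetical, and it uses only the standard library (SQLite standing in for a warehouse) so it runs anywhere; real pipelines at this role would typically target an orchestrated workflow and a production data store.

```python
# Minimal illustrative ETL sketch (hypothetical source/target names).
# Uses only the Python standard library; SQLite stands in for a warehouse.
import csv
import sqlite3
from pathlib import Path

SOURCE_CSV = Path("orders.csv")    # hypothetical raw extract
TARGET_DB = Path("warehouse.db")   # hypothetical target store


def extract(path: Path) -> list[dict]:
    """Read raw rows from the source file."""
    with path.open(newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Apply simple cleaning and typing rules before loading."""
    cleaned = []
    for row in rows:
        # Skip records missing a primary key; cast amount to float.
        if not row.get("order_id"):
            continue
        cleaned.append((row["order_id"], row["customer_id"], float(row["amount"])))
    return cleaned


def load(records: list[tuple], db: Path) -> None:
    """Idempotently upsert into the target table."""
    with sqlite3.connect(db) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records
        )


if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```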
Must-Have Skills:
- Expertise in SQL and relational database systems (e.g., PostgreSQL, MySQL, Oracle).
- Proficiency in programming languages like Python, Java, or Scala.
- Hands-on experience with ETL tools.
- Experience with Big Data frameworks such as Apache Spark, Hadoop, or Kafka (see the short Spark example after this list).
- Knowledge of cloud platforms (AWS, Azure, GCP) and cloud data warehouses such as Redshift, Snowflake, or BigQuery.
- Proficiency in working with data lakes, data warehouses, and real-time streaming architectures.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving, analytical, and communication skills.
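As a rough sketch of the Spark plus SQL/Python skill set listed above, the snippet below reads a batch file, aggregates it, and writes the result. The paths and column names are hypothetical, and it assumes a local PySpark installation; it is an illustration of the kind of work involved, not a prescribed stack.

```python
# Minimal PySpark sketch (hypothetical paths and columns): read a batch file,
# aggregate daily revenue, and write the result as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

# Hypothetical input location with a header row.
orders = spark.read.option("header", True).csv("data/orders/")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Hypothetical output location; overwrite keeps the job rerunnable.
daily_revenue.write.mode("overwrite").parquet("output/daily_revenue/")
spark.stop()
```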
Good to Have:
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of machine learning pipelines and experience collaborating with Data Scientists.
- Exposure to containerization technologies like Docker and orchestration tools like Kubernetes.
- Understanding of DevOps practices and Infrastructure as Code (IaC) tools such as Terraform.
- Certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools.