We're looking for a talented and passionate individual to fill the role of Data Engineer Specialist. Details are as follows:
Responsibilities
- Design and build data pipelines and ETL processes for data warehouses and data lakes.
- Design data models and schemas based on business and analytics requirements.
- Ensure data quality, consistency, and timely delivery across systems.
- Monitor, troubleshoot, and optimize data pipelines.
- Implement data governance, security, and compliance best practices in data pipelines.
- Stay current with emerging data technologies and recommend improvements.
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum 3 years of experience in data engineering.
- Proficiency in SQL and one or more programming languages such as Python or Java.
- Strong experience with ETL/ELT tools (e.g., Pentaho, Apache Airflow, Talend, Informatica, or NiFi).
- Deep understanding of data warehouse and data lake architectures; experience with Snowflake or BigQuery preferred.
- Experience with streaming and real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink).
- Experience with RDBMS (SQL Server, PostgreSQL, Oracle, MySQL, DB2).
- Strong communication skills with both technical and non-technical teams.
- Experience operating and administering data tools (SysOps) preferred.