We are looking for a skilled Data Engineer with at least 5 years of industry experience to design, develop, and maintain scalable data pipelines and infrastructure. The ideal candidate will have a strong background in data warehousing, ETL processes, and cloud-based data solutions.
Design, build, and manage scalable and reliable data pipelines to ingest and process data from multiple sources.
Develop and optimize ETL/ELT workflows to transform raw data into structured formats.
Work with data analysts and data scientists to support their data needs and ensure data quality.
Implement data validation, quality checks, and monitoring solutions.
Build and maintain data warehouses or data lakes using cloud platforms.
Collaborate with cross-functional teams including engineering, product, and BI teams.
Optimize database performance, ensure data governance, and implement best practices for data management.
Bachelor's degree in Computer Science, Engineering, or a related field.
5+ years of hands-on experience in data engineering or related roles.
Proficiency in SQL and one or more programming languages like Python, Scala, or Java.
Experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica).
Strong knowledge of data warehousing concepts and technologies (e.g., Snowflake, Redshift, BigQuery).
Experience with cloud platforms (e.g., AWS, Google Cloud Platform, or Azure) and cloud-native data services.
Familiarity with data modeling, schema design, and performance tuning.
Experience with distributed data processing frameworks (e.g., Spark, Hadoop) is a plus.
Experience with real-time data streaming tools like Kafka or Kinesis.
Knowledge of data security, governance, and compliance practices.
Experience working in Agile environments.
We are an equal opportunities employer and welcome applications from all qualified candidates.