
Senior Data Engineer

Sugar

Germany

On-site

EUR 70,000 - 90,000

Full-time

Posted 13 days ago

Summary

A dynamic technology company in Germany is seeking a Senior Data Engineer to design and maintain scalable data management systems. The successful candidate will integrate data from various sources and develop ETL processes to ensure data quality and availability. Key qualifications include a bachelor's degree in Computer Science, 5+ years of experience, and strong programming skills. A competitive salary ranging from €70k to €90k is offered along with stock options.

Benefits

Stock options

Qualifications

  • 5+ years of experience in data engineering or related role.
  • Strong analytical and problem-solving skills with attention to detail.
  • Relevant certifications in data engineering or cloud computing.

Responsibilities

  • Design, construct, and maintain scalable data management systems.
  • Integrate raw data from various sources into a centralized data lake.
  • Develop and manage ETL processes for data ingestion.

Skills

Programming skills in Python, Java, or Scala
Proficiency in SQL
Analytical and problem-solving skills
Excellent communication skills

Education

Bachelor's degree in Computer Science or related field
Master's degree in a related field

Tools

ETL tools like Spark and Kafka
Databases like MySQL or PostgreSQL
NoSQL databases like MongoDB
Cloud platforms (AWS, Google Cloud, Azure)
Infrastructure as code tools like Terraform
Workflow orchestration tools like Airflow

Job Description

ABOUT THE JOB

As Senior Data Engineer, you will design, construct, install, test, and maintain highly scalable data management systems. You will integrate raw data from various microservices and external APIs into a centralized data lake or data warehouse, ensuring data quality and consistency. Your responsibilities will include developing and managing Extract, Transform, Load (ETL) processes to automate data ingestion and transformation, as well as building and maintaining data pipelines to ensure timely data availability for analytics.
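
As a rough illustration of the ETL automation described above, the following minimal Apache Airflow DAG (Airflow 2.4+ assumed) chains an extract, transform, and load step; all task names, endpoints, and data are hypothetical placeholders, not part of the actual stack.

```python
# Minimal sketch of an ETL DAG in Apache Airflow (2.4+ assumed).
# Source/target names and the sample data are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull raw order records from an upstream API or microservice (placeholder)."""
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": None}]


def transform_orders(ti, **context):
    """Apply a simple quality rule: drop rows with a missing amount."""
    raw = ti.xcom_pull(task_ids="extract_orders")
    return [r for r in raw if r.get("amount") is not None]


def load_orders(ti, **context):
    """Write the cleaned records to the warehouse or data lake (placeholder)."""
    rows = ti.xcom_pull(task_ids="transform_orders")
    print(f"would load {len(rows)} rows")


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> transform >> load
```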

You will work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Additionally, you will support backend teams in optimizing and maintaining both SQL and NoSQL databases for high performance and reliability. Your role will involve implementing data governance and security protocols to protect sensitive information and monitoring and fine-tuning data infrastructure for performance, reliability, and scalability.

This role demands a blend of strategic insight and hands-on technical expertise and carries responsibility for key technological decisions that will shape our data infrastructure. You will also maintain clear documentation of data architecture, processes, and data flows, ensuring seamless collaboration and effective communication with team members and stakeholders.

KEY RESPONSIBILITIES

  • Design and Development: Design, construct, install, test, and maintain highly scalable data management systems.
  • Data Integration: Integrate raw data from various microservices and external APIs into a centralized data lake or data warehouse, ensuring data quality and consistency (a brief sketch follows this list).
  • ETL Processes: Develop, implement, and manage Extract, Transform, Load (ETL) processes to automate data ingestion and transformation.
  • Data Pipeline Management: Build and maintain data pipelines, ensuring data is processed and available for analytics in a timely manner.
  • Database Management: Support the backend teams to optimize and maintain databases for high performance and reliability, including both SQL and NoSQL databases.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
  • Data Security: Work hand in hand with the platform team to implement data governance and security protocols to protect sensitive information.
  • Performance Optimization: Monitor and fine-tune data infrastructure for performance, reliability, and scalability.
  • Documentation: Maintain clear documentation of data architecture, processes, and data flows.
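
As a rough sketch of the data integration and pipeline work outlined above, the following PySpark job reads raw JSON dropped by upstream services, applies basic quality rules, and writes a curated, partitioned table to a Parquet data lake; all paths, column names, and the S3 layout are illustrative assumptions.

```python
# Minimal sketch: integrating raw JSON exports from several services into a
# Parquet-based data lake with PySpark. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw_to_lake").getOrCreate()

# Raw landing zone written by upstream microservices (placeholder locations).
orders = spark.read.json("s3a://raw-zone/orders/")
customers = spark.read.json("s3a://raw-zone/customers/")

# Basic quality rules: drop malformed rows and deduplicate on the business key.
clean_orders = (
    orders
    .dropna(subset=["order_id", "customer_id"])
    .dropDuplicates(["order_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Join to a curated, analytics-friendly table and write it partitioned by day.
curated = clean_orders.join(customers, on="customer_id", how="left")
(
    curated
    .withColumn("ingest_date", F.to_date("ingested_at"))
    .write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3a://data-lake/curated/orders/")
)
```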

REQUIRED SKILLS AND QUALIFICATIONS

  • Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience: 5+ years of experience in data engineering or a related role.
  • Technical Skills:
      • Strong programming skills in Python, Java, or Scala for data processing.
      • Proficiency in SQL and experience with database management systems such as MySQL or PostgreSQL.
      • Experience with ETL and data pipeline technologies like Spark and Kafka.
      • Experience with one NoSQL database such as MongoDB or Cassandra.
      • Experience with cloud platforms (AWS, Google Cloud, Azure) and their data services, as well as one infrastructure as code tool (e.g. Terraform).
      • Familiarity with a workflow (ETL) orchestration tool (e.g. Airflow).
  • Analytical Skills: Strong analytical and problem-solving skills with attention to detail.
  • Communication: Excellent verbal and written communication skills to effectively collaborate with team members and stakeholders.

STRONG ADDITIONAL COMPETENCIES

  • Certifications: Relevant certifications in data engineering, cloud computing, or big data technologies.
  • Advanced Degree: Master's degree in a related field.
  • Experience: Data modeling, data warehousing solutions, and data visualization tools like QuickSight, Tableau, or Power BI.
  • Stream Data Processing: Previous experience with stream data processing and near-real-time analytics frameworks such as Apache Flink or Kafka Streams is a plus (a brief sketch follows this list).
  • Agile Methodologies: Familiarity with Agile development practices and methodologies.
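
As a lightweight stand-in for the near-real-time analytics mentioned above, the following sketch consumes a Kafka topic with the kafka-python client (assumed) and keeps a rolling per-status count; a production job would more likely run in Apache Flink or Kafka Streams with proper windowing and state management, and the topic name, broker address, and event schema here are hypothetical.

```python
# Minimal sketch of near-real-time processing: a rolling per-key count over a
# Kafka topic using the kafka-python client (assumed; names are placeholders).
import json
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",                        # hypothetical topic name
    bootstrap_servers="localhost:9092",    # hypothetical broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
)

counts = Counter()
for message in consumer:
    event = message.value
    counts[event.get("status", "unknown")] += 1
    # Emit a lightweight metric every 100 events; a real pipeline would window
    # by event time and sink results to a store or dashboard instead.
    if sum(counts.values()) % 100 == 0:
        print(dict(counts))
```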

SALARY RANGE

  • €70k to €90k + stock options