Data Engineer with Snowflake and DBT – pharmaceutical industry

Sii Ukraїna TOV

Warszawa

On-site

PLN 120,000 - 180,000

Full time

Posted today

Job summary

A leading pharmaceutical company is seeking an experienced Data Engineer to design and maintain scalable data pipelines. The ideal candidate will have strong skills in SQL, expertise in Snowflake, and a solid understanding of data modeling and ETL processes. Excellent communication in Polish and English is required for collaboration with stakeholders. This role is located in Warsaw, Poland.

Qualifications

  • At least 5 years of experience with a programming language used for building data pipelines.
  • Proficiency in SQL and experience with Snowflake's SQL syntax.
  • Technical expertise with data modeling, data warehousing, and ETL processes (Data Vault – see the sketch after this list).
  • Familiarity with data integration tools and cloud platforms.
  • Strong problem-solving skills to analyze complex data requirements.
  • Ability to optimize performance and troubleshoot issues in cloud-based environments.
  • Excellent communication skills in Polish and English.
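
For context on the Data Vault note above, here is a minimal, purely illustrative sketch of a hub and satellite pair in Snowflake SQL. The table and column names (HUB_PRODUCT, SAT_PRODUCT_DETAILS, PRODUCT_BK, and so on) are assumptions made for the example, not details taken from this posting.

  -- Hub: holds only the business key plus load metadata.
  CREATE TABLE IF NOT EXISTS HUB_PRODUCT (
      HUB_PRODUCT_HK  VARCHAR(32)   NOT NULL,  -- hash key derived from the business key
      PRODUCT_BK      VARCHAR(100)  NOT NULL,  -- business key from the source system
      LOAD_DTS        TIMESTAMP_NTZ NOT NULL,  -- load timestamp
      RECORD_SOURCE   VARCHAR(50)   NOT NULL,  -- originating system
      CONSTRAINT PK_HUB_PRODUCT PRIMARY KEY (HUB_PRODUCT_HK)
  );

  -- Satellite: descriptive attributes, historized by load timestamp.
  CREATE TABLE IF NOT EXISTS SAT_PRODUCT_DETAILS (
      HUB_PRODUCT_HK  VARCHAR(32)   NOT NULL,  -- links back to the hub
      LOAD_DTS        TIMESTAMP_NTZ NOT NULL,
      HASH_DIFF       VARCHAR(32)   NOT NULL,  -- change-detection hash
      PRODUCT_NAME    VARCHAR(200),
      DOSAGE_FORM     VARCHAR(100),
      RECORD_SOURCE   VARCHAR(50)   NOT NULL,
      CONSTRAINT PK_SAT_PRODUCT_DETAILS PRIMARY KEY (HUB_PRODUCT_HK, LOAD_DTS)
  );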

Responsibilities

  • Designing, developing, and maintaining scalable and efficient data pipelines.
  • Building efficient data pipelines that handle large amounts of data in real time.
  • Collaborating with AI scientists and data analysts to analyze business requirements.
  • Developing and maintaining data warehouse and ETL processes.
  • Implementing data security, privacy, and compliance measures.
  • Performing data analysis and providing technical support.
  • Continuously monitoring and optimizing data pipelines for performance.

Skills

Programming languages focused on data pipelines
SQL
Data modeling
Data warehousing
ETL processes
Data integration tools
Cloud platforms
Problem-solving
Communication skills in Polish
Communication skills in English

Tools

Snowflake
Talend
Informatica
AWS
Azure
GCP

Job description

Data Engineer with Snowflake and DBT – pharmaceutical industry

We are looking for a Data Engineer to join our project for one of the largest pharmaceutical companies.

Responsibilities
  • Designing, developing, and maintaining scalable and efficient data pipelines and systems
  • Building efficient data pipelines that handle large amounts of data in real time
  • Working collaboratively with AI scientists and data analysts to gather and analyze business requirements and ensure data solutions align with business needs
  • Developing and maintaining data warehouse and ETL processes, ensuring data quality and integrity
  • Implementing data security, data privacy, and compliance measures
  • Performing data analysis, troubleshooting data issues, and providing technical support to end-users
  • Continuously monitoring and optimizing data pipelines and systems to ensure optimal performance and scalability
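
As a purely illustrative example of the monitoring work in the last bullet, a query along these lines against Snowflake's SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view can surface the slowest pipeline queries from the previous day; the warehouse name ETL_WH and the thresholds are assumptions, not details from the project.

  -- Sketch: slowest successful queries on an assumed ETL warehouse over the last day.
  SELECT
      QUERY_ID,
      WAREHOUSE_NAME,
      TOTAL_ELAPSED_TIME / 1000 AS ELAPSED_SECONDS,  -- TOTAL_ELAPSED_TIME is in milliseconds
      BYTES_SCANNED,
      START_TIME
  FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
  WHERE WAREHOUSE_NAME = 'ETL_WH'                    -- assumed warehouse name
    AND START_TIME >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    AND EXECUTION_STATUS = 'SUCCESS'
  ORDER BY TOTAL_ELAPSED_TIME DESC
  LIMIT 20;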
Requirements
  • At least 5 years of experience with a programming language used for building data pipelines
  • Proficiency in SQL and experience with Snowflake's SQL syntax and capabilities (see the dbt model sketch after this list)
  • Technical expertise with data modeling, data warehousing, and ETL processes (Data Vault)
  • Familiarity with data integration tools (e.g., Talend, Informatica) and cloud platforms (e.g., AWS, Azure, GCP)
  • Strong problem-solving skills and the ability to analyze complex data requirements
  • Ability to optimize performance and troubleshoot issues in a cloud-based environment
  • Excellent communication skills in Polish and English to collaborate with stakeholders and team members
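
Since the role pairs Snowflake with dbt, here is a minimal sketch of an incremental dbt model as it might look in such a project, compiled by dbt into Snowflake SQL. The model, source, and column names (stg_orders, the erp source, order_id, ordered_at) are hypothetical and only illustrate the pattern, not the client's actual project.

  -- models/staging/stg_orders.sql (hypothetical dbt model)
  -- Incremental materialization: on each run after the first, only rows newer
  -- than the latest already-loaded timestamp are selected.
  {{ config(materialized='incremental', unique_key='order_id') }}

  SELECT
      order_id,
      customer_id,
      order_status,
      ordered_at
  FROM {{ source('erp', 'orders') }}      -- assumed dbt source definition
  {% if is_incremental() %}
  WHERE ordered_at > (SELECT MAX(ordered_at) FROM {{ this }})
  {% endif %}

In a real dbt project this file would sit alongside a schema.yml declaring the erp source and any tests; that configuration is omitted from the sketch.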
Quick apply

Please fill in the form in English.
