Thales is seeking a Senior Data Engineer to strengthen its Engineering Project Dashboard team in Bucharest, Romania. The role involves designing data pipelines and ensuring data quality using GCP tools, with a strong focus on collaboration and communication. Joining Thales, a leader in technological solutions, also offers flexibility and mobility.
Location: Bucharest, Romania
The people we all rely on to make the world go round – they rely on Thales. Thales relies on its employees to invent the future: right here, right now.
Present in Romania for over 40 years, Thales is expanding its presence in the country by growing its Digital capabilities and by developing a Group Engineering Competence Centre (ECC). Operating from Bucharest, Thales delivers solutions in a number of core businesses, from ground transportation, space and defence, to security and aeronautics.
Several professional opportunities have arisen. If you are looking for the solidity of a Global Group at the forefront of innovation, combined with the agility of a human-scale organisation that supports the personal development of its employees and offers opportunities for growth in an international environment, then this is the place for you!
Background:
We are seeking a passionate Senior Data Engineer to join our Engineering Project Dashboard team, which aims to provide KPIs and metrics for monitoring the engineering activities of projects' engineering work packages. Customers of the Engineering Dashboard digital services are spread all around the world, lead teams of varying sizes, and are looking for contextual information related to their projects.
Mission:
Our Data Engineer colleague will define and implement data transformations from a Data Lake dedicated to engineering, to be exploited through Looker Studio on GCP (Google Cloud Platform). The goal is to provide KPIs and metrics to monitor the engineering activities of projects' engineering work packages.
Main responsibilities:
Design, build, and maintain scalable and reliable data pipelines that integrate various data sources
Develop ETL/ELT processes using GCP tools such as Cloud Data Fusion and Dataflow (Apache Beam); a minimal illustrative sketch follows this list
Collect and process data into formats suited to the organisation's needs.
Perform and integrate data quality checks to identify and correct errors or discrepancies.
Create and maintain documentation of data flows and models, the transformations applied, and validation procedures.
Optimize performance and cost-efficiency of GCP data services
Ensure security and compliance best practices in data handling
Maintain clear and close collaboration with both the development team and the project stakeholders/key users.
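For illustration only, the kind of Dataflow (Apache Beam) pipeline this role builds might resemble the Python sketch below; the project, bucket, dataset, and field names are hypothetical placeholders, not details of the actual Engineering Dashboard data lake.

```python
# Minimal, hypothetical sketch of a Beam/Dataflow ETL step:
# read CSV files from a data lake bucket, apply a simple data quality
# filter, and load the result into BigQuery for use in Looker Studio.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    """Parse one CSV line into a dict matching the BigQuery schema below."""
    wp_id, hours, status = next(csv.reader([line]))
    return {"work_package_id": wp_id, "hours_spent": float(hours), "status": status}


def run():
    options = PipelineOptions(
        runner="DirectRunner",            # switch to "DataflowRunner" on GCP
        project="example-gcp-project",    # hypothetical project id
        region="europe-west1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromDataLake" >> beam.io.ReadFromText(
                "gs://example-bucket/engineering/work_packages-*.csv",
                skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_row)
            # Basic data quality check: drop rows with negative effort values.
            | "DropInvalidRows" >> beam.Filter(lambda row: row["hours_spent"] >= 0)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-gcp-project:engineering_dashboard.work_package_metrics",
                schema="work_package_id:STRING,hours_spent:FLOAT,status:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )


if __name__ == "__main__":
    run()
```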
Qualifications:
Bachelor’s degree in Computer Science, Computer Engineering, or a relevant technical field.
5+ years of experience with cloud data platforms (e.g., AWS, Azure, GCP); GCP experience is highly desirable, or at least some minimal exposure to it.
Strong experience with Dataflow (GCP) and Apache Beam.
Proficiency in Python (or similar languages) with solid software engineering fundamentals (testing, modularity, version control).
Hands-on experience with SQL and NoSQL data stores, such as PostgreSQL, Redshift, DynamoDB, or MongoDB
Good understanding of data warehousing and modern architectures (e.g., data lakehouse, data mesh)
Familiarity with DevOps/CI-CD practices, infrastructure-as-code (Terraform, CloudFormation), and containerization (Docker/Kubernetes)
Understanding of data quality, observability, lineage, and metadata management practices
Good communication and working relationships with stakeholders and team members
Able to give and receive constructive feedback; able to listen and share
Fluent English; French would be a plus
Agile mindset & practices