Data Engineer

EstiaTech - Passion for Technology

Rome

Hybrid

EUR 55,000 - 75,000

Full-time

Today

Job description

A leading global technology consulting firm based in Italy is seeking a talented Data Engineer to join their Big Data division. The role involves designing and implementing data pipelines, integrating multiple data sources, and ensuring data quality. Candidates should have substantial experience in data engineering, with strong skills in Python, SQL, and cloud-based data services. The position offers a hybrid work environment along with competitive salary and benefits.

Benefits

Competitive salary
Hybrid work environment
Benefits package

Skills

  • Experience with data pipeline design and integration.
  • Proficiency in Python, SQL, and Unix scripting.
  • Strong knowledge of data platforms like Databricks and Azure.

Responsibilities

  • Design and implement efficient data pipelines.
  • Integrate data from various sources.
  • Ensure data quality and security.

Knowledge

Python
SQL
Unix scripting
Data pipeline design
Data governance
Analytical skills
Communication
Problem-solving
Technical leadership
Collaboration

Experience

10+ years of experience in Information Technology
5+ years focused on Data Engineering or Governance

Tools

Databricks
Teradata Vantage
Azure Data Services
Graph Databases (Neo4j, Cosmos DB)

Job description

🏢 Company: Estiatech S.r.l. – Consulting and System Integration Company

🌐 Global team | Telecommunications & Technology

Estiatech Srl is partnering with a leading global group in the Telecommunications and Technology sectors to find a talented Data Engineer. This is an exciting opportunity to join a Big Data division where you'll help drive the future of data architecture and solutions at scale.

Key Responsibilities
  • Design and implement reliable, efficient data pipelines on Databricks, Teradata Vantage, and Azure Data Services.
  • Integrate data from various sources: relational databases, cloud storage, event processing platforms, web services, and more.
  • Develop scalable data architectures that empower business intelligence, machine learning, and Agentic AI.
  • Model complex datasets using both relational and multidimensional techniques (e.g., star/snowflake schemas).
  • Leverage geospatial and graph data to support advanced analytics and solve complex business problems.
  • Collaborate with cross-functional teams (Data Scientists, Analysts, Business Stakeholders) to implement innovative data solutions.
  • Ensure data quality, security, and governance across all engineering processes.
  • Monitor and optimize the performance of distributed systems and data workflows.
  • Contribute to the ongoing evolution of the company’s data strategy and platform architecture.
Required Skills & Experience
  • 10+ years of experience in the Information Technology sector, with at least 5 years focused on Data Engineering or Data Governance (preferably in Telecommunications or Finance).
  • Extensive experience in data pipeline design, real-time data integration, and platform management (batch, streaming).
  • Advanced proficiency in Python (including PySpark), SQL, and Unix scripting.
  • Strong knowledge of Data Platforms such as Databricks (Apache Spark), Teradata Vantage, and Azure Cloud Data Services (Fabric, Synapse, Functions).
  • Experience with Graph Databases like Neo4j, Cosmos DB with Gremlin.
  • Expertise in data modeling (Relational - 3NF, Multidimensional - star/snowflake schema).
  • Strong analytical and problem‑solving skills, with the ability to turn complex challenges into actionable solutions.
  • Exceptional communication skills and the ability to engage with both technical and non‑technical stakeholders.
  • A collaborative mindset and the ability to work effectively in cross‑functional teams.
  • High level of initiative, autonomy, and capacity to handle multiple priorities in a fast‑paced environment.
  • Ability to provide technical leadership and mentor junior engineers, fostering growth within the team.
  • Strategic vision and a business‑oriented approach to data solutions.
Preferred Qualifications
  • Certifications in Azure, Databricks, and Teradata.
  • Experience with CI/CD pipelines, Git, and DevOps practices.
  • Familiarity with data governance frameworks and GDPR compliance.
What We Offer
  • Work with a leading global technology solutions provider.
  • Be part of an innovative, high‑impact team driving cutting‑edge data projects.
  • Hybrid work environment that promotes work‑life balance.
  • Competitive salary and benefits package.
  • Opportunity to shape the future of big data and advanced analytics across industries.
Seniority level
  • Mid‑Senior level
Employment type
  • Full‑time
Job function
  • Information Technology
  • IT Services and IT Consulting