Big Data Developer for an International Project!
Are you a Big Data expert looking for a global challenge? Join an international financial sector project, working with cutting-edge technologies in a dynamic and collaborative environment.
What will you do?
Design, develop, and optimize large-scale data pipelines.
Create and maintain ETL workflows for processing structured and unstructured data.
Implement solutions using Big Data frameworks (Hadoop, Spark, Hive, etc.).
Develop scalable and high-performance code in Python, Java, or Scala.
Collaborate with data scientists and analysts to optimize workflows.
Ensure data quality, security, and integrity across all systems.
Implement CI/CD pipelines and deploy infrastructure using GitHub Actions workflows.
Mandatory Requirements
5+ years of experience as a Big Data Developer.
Strong proficiency in Big Data frameworks such as Hadoop, Spark, and Hive.
Expertise in Python or Scala and advanced SQL.
Solid understanding of distributed computing and cloud architectures (AWS, Azure, GCP).
Strong analytical and problem-solving skills with a focus on efficiency.