
Java Developer (Data Platform)

RingCentral

Spain

On-site

EUR 50,000 - 70,000

Full-time

Posted 30+ days ago


Vacancy description

An established industry player is seeking a talented data engineer to join their dynamic Data Platform Team. In this role, you will design and maintain robust data platforms that handle large-scale data processing. Collaborating with cross-functional teams, you'll tackle complex data challenges while ensuring the architecture remains scalable and efficient. This opportunity not only allows you to work with cutting-edge technologies but also fosters a culture of knowledge sharing and professional growth. If you're passionate about data and eager to make a significant impact, this role is perfect for you!

Benefits

Collaborative work environment
Opportunity for personal growth
Professional development opportunities
Career advancement prospects

Requirements

  • 3+ years' experience in Java or Scala programming, with a strong grasp of Java concepts.
  • Proficiency in ANSI SQL and understanding of distributed systems architecture.

Responsibilities

  • Design, implement, and support data platform applications using modern technologies.
  • Develop and optimize large-scale data pipelines for high performance and reliability.

Skills

Java
Scala
ANSI SQL
Data Architecture Patterns
Multi-threading
Performance Tuning
Data Streaming Technologies
Python
Data Visualization

Tools

Apache Spark
Apache Airflow
Apache Hadoop
MongoDB
AWS
Snowflake
Tableau

Job description

Company Overview:

Join a leading international company based in the U.S., renowned for its enterprise VoIP communication, messaging, and video conferencing solutions. As part of our Data Platform Team, you will work on building and optimizing the backbone of our data infrastructure, ensuring efficient, scalable, and reliable data processing pipelines that support mission-critical business applications.

What You’ll Do:

As a member of the Data Platform Team, you will design, implement, and maintain robust data platforms that handle large-scale data processing and management. You'll collaborate with cross-functional teams to model data pipelines, develop solutions for complex data challenges, and ensure our platform is built to scale efficiently across a wide variety of use cases and systems.

Responsibilities:

  • Design, implement, and support data platform applications using modern technologies in a dynamic, fast-evolving environment.
  • Develop and optimize large-scale data pipelines, ensuring high performance, reliability, and scalability.
  • Collaborate with various teams to model complex data relationships and provide insights to support data-driven decisions.
  • Ensure data platform architecture and infrastructure remain resilient, efficient, and scalable to meet the company's growing data needs.
  • Promote a knowledge-sharing environment, mentoring peers and contributing to the team's success.

Technology Stack:

  • Core Technologies: Java, Scala, ANSI SQL, Apache Spark, Apache Airflow, Apache Hadoop (HDFS, YARN), Apache Hive, Apache Impala, Apache Flume, MongoDB, AWS (Amazon Web Services), Snowflake.

Skills & Requirements:

  • 3+ years of hands-on experience with Java or Scala programming.
  • Strong grasp of Java concepts (collections, serialization, multi-threading, lambda expressions, JVM architecture, etc.).
  • Proficiency in ANSI SQL, including query syntax, performance tuning, and knowledge of OLAP vs. OLTP.
  • Ability to quickly learn new technologies and integrate them into existing infrastructure.
  • A deep understanding of architecture patterns in distributed systems, especially in data platform environments.
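To give a concrete feel for the Java concepts the requirements name (collections, lambda expressions, multi-threading), here is a minimal, self-contained sketch; the class name `ConceptsSketch` and the specific numbers are illustrative only, not part of the role:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

public class ConceptsSketch {

    // Collections + lambdas: group words by their length with the streams API.
    static Map<Integer, List<String>> byLength(List<String> words) {
        return words.stream().collect(Collectors.groupingBy(String::length));
    }

    // Multi-threading: sum 1..n by splitting the range across a fixed thread pool.
    static long parallelSum(int n, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Long>> parts = new ArrayList<>();
            int chunk = n / threads;
            for (int t = 0; t < threads; t++) {
                int lo = t * chunk + 1;
                int hi = (t == threads - 1) ? n : (t + 1) * chunk;
                parts.add(pool.submit(() -> {
                    long s = 0;
                    for (int i = lo; i <= hi; i++) s += i;
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> f : parts) total += f.get();
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(byLength(List.of("data", "java", "platform")));
        System.out.println(parallelSum(1_000_000, 4)); // 500000500000
    }
}
```

The same range-splitting idea scales up to the partitioned, distributed computation that frameworks like Apache Spark perform across a cluster.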

Preferred Qualifications:

  • Experience working with Hadoop ecosystem components and big data frameworks.
  • Hands-on experience with Hadoop, Spark, Kafka, or other data streaming technologies.
  • Familiarity with designing and implementing ETL (Extract, Transform, Load) processes.
  • Basic knowledge of Python and its use in data engineering tasks.
  • Experience with data visualization/analysis tools (e.g., Tableau) to drive insights.
  • Proficiency with Linux and cloud-based infrastructure (especially AWS).
  • Intermediate or higher proficiency in written and spoken English.

What We Offer:

  • A collaborative and high-performing professional team.
  • The opportunity to work with cutting-edge data technologies and solve challenging, large-scale data problems.
  • A dynamic project environment with ample opportunities for personal growth, professional development, and career advancement.