
A leading tech company is searching for a Senior Big Data Engineer to join the Customer Identity team. The role is fully remote and dedicated to ensuring customer data is accurate and accessible. Candidates must have over 5 years of experience in data engineering, strong proficiency in Python, and familiarity with tools like Apache Spark and Kafka. This is a full-time role with a structured work-life balance, offering two weeks of paid vacation and 10 local public holidays per year.
Native / Bilingual English is required for this role (read / written / spoken)
Please upload your CV/Resume in English.
Monthly salary: $4,000 - $5,000 USD
Along with our partner, we are seeking a Senior Big Data Engineer to join the Customer Identity (CI) team that is central to the enterprise, focused on ensuring customer data is accurate, consistent, and readily accessible. This critical data is the foundation for essential business processes, including personalized marketing, analytical reporting, and seamless online and in-store checkout and customer service experiences.
The team manages a wide array of high-performance, scalable batch and real-time services. These services are used across numerous customer interactions, such as in-store purchases, online orders, rewards program enrollments, and activations from various marketing channels (email, direct mail, SMS, push notifications, etc.). By leveraging modern development practices, tools, and technology platforms, the CI team builds capabilities that empower customer-facing and checkout teams to deliver personalized, relevant, and frictionless experiences, ensuring customers realize value with every interaction.
The team operates in every major US time zone, from Pacific to Eastern. Core hours are 10 am - 4 pm CST, with up to 2 hours of flexibility in either direction based on location.
Tech stack: Python, Apache Airflow, Apache Spark, Spark SQL, Spark Streaming, Kafka, relational databases, NoSQL databases, GCP BigQuery, Bigtable, Dataproc, RESTful APIs.
Contract duration: 3 months, ending March 31, 2026, with the possibility of extension based on team needs.