
A leading technology partner is seeking a Senior Big Data Engineer to join their Customer Identity team. The role involves managing high-performance data services for customer interactions, ensuring accurate and accessible customer data. Ideal candidates will have over 5 years of experience in data engineering, strong Python and Airflow skills, and familiarity with GCP tools. This fully remote position offers flexible work hours and generous vacation benefits, promoting work-life balance.
Native or bilingual English proficiency is required for this role (reading, writing, and speaking).
Please upload your CV/resume in English.
Monthly salary: $4,000 - $5,000 USD
Along with our partner, we are seeking a Senior Big Data Engineer to join the Customer Identity (CI) team that is central to the enterprise, focused on ensuring customer data is accurate, consistent, and readily accessible. This critical data is the foundation for essential business processes, including personalized marketing, analytical reporting, and seamless online and in-store checkout and customer service experiences.
They manage a wide array of high-performance, scalable batch and real-time services. These services are utilized at numerous customer interactions, such as in-store purchases, online orders, rewards program enrollments, and activations from various marketing channels (Email, Direct Mail, SMS, Push notifications, etc.). By leveraging modern development practices, tools, and technology platforms, the CI team builds capabilities that empower customer-facing and checkout teams to deliver personalized, relevant, and frictionless experiences, ensuring customers realize value with every interaction.
This team operates across every major US time zone, from Pacific to Eastern. Core hours are 10 am - 4 pm CST, with up to 2 hours of flexibility in either direction based on location.
Tech stack: Python, Apache Airflow, Apache Spark, Spark SQL, Spark Streaming, Kafka, relational databases, NoSQL databases, GCP BigQuery, Bigtable, Dataproc, RESTful APIs.
Contract duration: 3 months, with an end date of March 31, 2026, and the possibility of extension based on team needs.