Behind the user-friendly iOS and Android apps and the website used around the world is the engineering team. We are in charge of creating, developing, improving, and maintaining all Fever services so that more people can have an amazing experience.
About the role
- You will be part of the Data organization, building and operating the core technologies that enable data scientists, analysts, and the different business units to leverage rich data in efficient and innovative ways, generating impact and connecting people to the most relevant real-world experiences.
- You will own critical data pipelines of our data warehouse and the resulting data products that are used daily across the company to inform decisions and models.
- You will ideate and implement tools and processes that increase our ability to exploit diverse data sources to solve business problems and understand user behavior.
- You will work closely with other business units to understand their challenges and apply an engineering vision to create structured and scalable solutions.
- You will contribute to the development of a complex data and software ecosystem using the latest technologies in the data and software engineering stack.
During your first month at Fever:
- You will be fully integrated into the team, participating in onboarding, pair programming, one-on-ones, and Scrum sessions, and meeting the different departments.
- You will familiarize yourself with Fever's tech stack and the frameworks we use for our data strategy.
- You will attend some Fever Originals experiences like Candlelight.
After 3 months at Fever:
- You will be able to solve difficult new problems, generate impact, and create new business opportunities.
- You will have responsibilities and ownership over parts of our Data Warehouse or other critical tools.
- You will participate in hackdays or hackathons organized with other teams and get to know the data and engineering communities.
By your 6th month at Fever:
- You will contribute to the overall health of our data ecosystem, improving performance, scalability, and robustness.
- You will identify gaps and champion continuous improvement.
- You will mentor new team members.
- You will participate in team-building activities.
Key responsibilities
- Adopt a data-oriented mindset to understand complex data assets and business challenges, and use engineering skills to address them.
- Build trusted data assets for decision-making.
- Create automations to unlock business opportunities.
- Design, build, and support scalable data infrastructure, including robust ETL workflows and data quality monitoring.
- Extend data APIs and develop tools to foster a data-driven culture.
- Understand technical trade-offs, implement scalable solutions, and collaborate with stakeholders to meet data requirements.
About you
- You have a strong background in at least two of the following: data engineering, business intelligence, or software engineering.
- You are an expert in Python 3 and its data ecosystem.
- You have proven experience with SQL.
- You have worked with complex data pipelines.
- You are proactive, energetic, and thrive in fast-paced environments.
- You are a collaborative team player with strong communication skills, adaptable to a multidisciplinary, international setting.
- You possess strong analytical and problem-solving skills backed by solid software engineering expertise.
- You are proficient in business English and can communicate clearly.
It would be a plus if you...
- Have collaborated with multidisciplinary teams including data analysts, data scientists, marketing, and product managers.
- Have experience with scheduling and orchestration tools like Airflow.
- Have worked with databases such as Snowflake and PostgreSQL.
- Have used BI tools like Metabase or Superset for visualization and reporting.
- Have interacted with APIs from marketing platforms such as Facebook, Google, or Instagram.
- Have developed data-powered tools or applications.
- Are familiar with tools supporting reproducible, production-ready ML workflows.
- Have knowledge of backend frameworks like Django.
We offer an attractive compensation package, stock options, impactful work, a flexible work environment, health benefits, and opportunities for professional growth.
Required Experience: Senior IC