
Junior Software Engineer / Data

WEX

São Paulo

On-site

BRL 80,000 - 120,000

Full-time

20 days ago

Job summary

A leading company in the technology sector is seeking an entry-level Engineer to join its Data team in São Paulo. The role involves designing and implementing data products and systems, leveraging big data technologies to deliver solutions with business impact. Ideal candidates are passionate about data, have strong problem-solving skills, and are eager to learn and grow within a supportive team environment.

Qualifications

  • Bachelor's degree in Computer Science, Software Engineering, or equivalent.
  • 1 year of software engineering experience is a plus.
  • Strong Python skills and knowledge of data pipeline concepts.

Responsibilities

  • Collaborate with stakeholders to understand customer challenges.
  • Design, test, code, and implement new data products and pipelines.
  • Analyze data and develop solutions for business problems.

Skills

Problem-solving
Communication
Collaboration
Python
Cloud computing
Data processing

Education

Bachelor's degree in Computer Science
Bachelor's degree in Software Engineering

Tools

GitHub Actions
Terraform
ETL tools
Airflow

Job description

Position Summary

We are looking for a highly motivated and talented entry-level Engineer to join our Data team, where you can make a significant business impact and grow your career.

This is an exciting time to be part of the Data team at WEX. WEX has sophisticated business operations and products empowering a wide range of customer businesses. The data generated from these systems, applications, and platforms is rich and complex, and as one of WEX's most valuable assets it offers immense potential value for our customers and the business. The Data team is responsible for building the big data technologies, platforms, systems, and tools that clean, process, enrich, and optimize core data, making it accessible and useful for generating customer and business value. We develop value-added data products for our customers, leveraging advanced industry technologies, including modern big data and AI, and agile development methodologies.

We offer challenging problems with high business impact potential and a talented team of engineers and leaders to support your growth.

If you aspire to be a strong engineer capable of solving tough problems, generating significant impacts, and growing rapidly, this is an excellent opportunity!

Responsibilities:

  1. Collaborate with partners/stakeholders to understand customer business and key challenges.
  2. Design, test, code, and implement new data products, systems, platforms, and pipelines of small to medium complexity.
  3. Learn to measure, inspect, and drive decisions using data.
  4. Develop and maintain CI/CD automation using tools like GitHub Actions.
  5. Implement Infrastructure as Code (IaC) with tools like Terraform for cloud-based data infrastructure provisioning and management.
  6. Practice software development with TDD and BDD, and with microservice and event-driven architectures.
  7. Utilize data and AI technologies/tools to enhance solutions for better business outcomes and customer experience.
  8. Analyze data to understand customer and business problems and develop effective solutions.
  9. Explore new technologies and innovative approaches to improve efficiency and productivity.
  10. Support live data products/systems, promoting proactive monitoring, high data quality, rapid incident response, and continuous improvement through automation.
  11. Identify bottlenecks and opportunities for improvement within systems and processes.
  12. Learn from peers and foster continuous learning within the team.
  13. Understand team processes and best practices, applying them to tasks to solve customer/business problems effectively and sustainably.
  14. Partner with team members for development and problem-solving.
  15. Take ownership and be proactive in your work.
  16. Seek reviews from senior engineers to ensure quality.
  17. Build reliable, secure, high-quality, efficient, and user-friendly big data platforms and tools at scale.
  18. Develop systems, platforms, data pipelines, and tools for the entire data lifecycle, including ingestion, cleaning, processing, enrichment, and delivery, leveraging the Data platform (a minimal pipeline sketch follows this list).
  19. Learn and master big data technologies, tools, and software, ensuring proper integration and execution.
  20. Design and implement efficient and user-friendly data models and structures using data modeling techniques.
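
As a concrete illustration of item 18, the sketch below shows a small ingest → clean → deliver pipeline written as an Airflow DAG in Python. It assumes Airflow 2.4+ with the TaskFlow API and Python 3.9+; the task bodies, sample records, and schedule are illustrative placeholders, not details taken from this posting.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_data_pipeline():
        @task
        def ingest() -> list[dict]:
            # Placeholder: pull raw records from a source system or API.
            return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": None}]

        @task
        def clean(records: list[dict]) -> list[dict]:
            # Drop incomplete rows and normalize types.
            return [
                {"id": r["id"], "amount": float(r["amount"])}
                for r in records
                if r["amount"] is not None
            ]

        @task
        def deliver(records: list[dict]) -> None:
            # Placeholder: load cleaned records into a warehouse table.
            print(f"Loading {len(records)} cleaned records")

        deliver(clean(ingest()))


    example_data_pipeline()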

Required Qualifications:

  • Bachelor's degree in Computer Science, Software Engineering, or a related field, or demonstrable equivalent experience and understanding.
  • 1 year of software engineering experience is a plus.
  • Strong problem-solving, communication, and collaboration skills.
  • Highly self-motivated and eager to learn, with a proactive approach to exploring new technologies like GenAI for productivity and quality improvements.
  • Ability to design solutions for small to medium problems or components.
  • Strong Python skills, including coding, testing, and monitoring automation.
  • Passionate about data, big data technology, and cloud computing.
  • Interest in understanding and solving customer and business problems.
  • Some knowledge of data ingestion, cleaning, processing, enrichment, serving, and quality assurance techniques, including data pipelines, SQL, relational databases, and ELT processes (see the sketch after this list).
  • Knowledge of data warehousing concepts and dimensional modeling is a plus.
  • Experience with ETL/orchestration tools such as Airflow, including authoring DAGs, is beneficial.
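
As a minimal illustration of the data pipeline, SQL, and ELT knowledge mentioned above, the self-contained Python sketch below uses the standard-library sqlite3 module as a stand-in for a relational database: raw rows are landed unchanged in a staging table and then transformed into a clean table in SQL. Table names, columns, and values are invented for the example only.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Load: land the raw data unchanged in a staging table.
    conn.execute("CREATE TABLE raw_payments (id INTEGER, amount TEXT)")
    conn.executemany(
        "INSERT INTO raw_payments VALUES (?, ?)",
        [(1, "10.50"), (2, None), (3, "7.25")],
    )

    # Transform: derive a clean table in SQL, filtering out incomplete rows.
    conn.execute(
        """
        CREATE TABLE payments AS
        SELECT id, CAST(amount AS REAL) AS amount
        FROM raw_payments
        WHERE amount IS NOT NULL
        """
    )

    for row in conn.execute("SELECT id, amount FROM payments ORDER BY id"):
        print(row)  # prints (1, 10.5) and (3, 7.25)

    conn.close()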