Python Engineer

Next Ventures

São Paulo

On-site

BRL 200,000 - 300,000

Full-time

Today

Job summary

A technology solutions company in São Paulo is seeking a Senior Python Developer to enhance backend systems and automate workflows. You'll design backend services, develop web-scraping solutions, and manage data pipelines supporting internal operations. Ideal candidates will have 5+ years of Python backend experience, strong SQL skills, and proficiency in monitoring tools. This role offers opportunities to work with modern frameworks and data engineering practices in a collaborative environment.

Benefits

Competitive salary
Health benefits
Professional development opportunities

Qualifications

  • 5+ years of professional Python backend development experience.
  • 3+ years of experience with web scraping and multi-source data extraction.
  • Proven experience building scalable backend systems.

Responsibilities

  • Design and implement automated systems that improve internal processes.
  • Develop and maintain scrapers for external sources.
  • Clean, transform, and process large datasets.

Skills

Advanced Python proficiency
Web Scraping
SQL skills
Linux proficiency
Data Manipulation

Education

Bachelor’s or Master’s degree in Computer Science or related field

Tools

BeautifulSoup
Scrapy
Selenium
PostgreSQL
Docker
Kubernetes

Job description

Senior Python Developer

Join a fast-paced engineering team focused on building scalable backend systems, automating complex workflows, and powering data-driven decision making. In this role, you will design backend services, develop robust web-scraping solutions, and build data pipelines that support internal operations and data science initiatives.

What You’ll Do

Process Automation & Backend Development

  • Design and implement automated systems that improve internal processes and operational efficiency.
  • Build scalable backend services that integrate seamlessly with existing infrastructure.
  • Ensure services meet standards for reliability, performance, and maintainability.

Web Scraping & Data Extraction

  • Develop and maintain scrapers for a variety of external sources.
  • Handle dynamic content, authentication, rate limits, and anti-bot challenges.
  • Implement strong error handling, logging, and retry logic.
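
To illustrate the kind of scraping work described above, here is a minimal sketch of a fetch-and-parse helper with logging, error handling, and exponential-backoff retries. The target URL and CSS selector are hypothetical placeholders, and a production scraper would also have to handle sessions, rate limits, and anti-bot measures.

    import logging
    import time

    import requests
    from bs4 import BeautifulSoup

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("scraper")

    def fetch_html(url: str, retries: int = 3, backoff: float = 2.0) -> str:
        """Fetch a page, retrying on transient failures with exponential backoff."""
        for attempt in range(1, retries + 1):
            try:
                response = requests.get(url, timeout=10)
                response.raise_for_status()
                return response.text
            except requests.RequestException as exc:
                logger.warning("attempt %d/%d failed for %s: %s", attempt, retries, url, exc)
                if attempt == retries:
                    raise
                time.sleep(backoff ** attempt)

    def parse_titles(html: str) -> list[str]:
        """Extract item titles; the CSS selector is a hypothetical placeholder."""
        soup = BeautifulSoup(html, "html.parser")
        return [node.get_text(strip=True) for node in soup.select("h2.item-title")]

    if __name__ == "__main__":
        html = fetch_html("https://example.com/listings")  # hypothetical source
        for title in parse_titles(html):
            print(title)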

Data Manipulation & Processing

  • Clean, transform, and process large structured and unstructured datasets.
  • Build and maintain ETL/ELT pipelines that deliver high-quality data to downstream systems.
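
As a small illustration of the transform step in such a pipeline, here is a sketch using pandas; the column names, input file, and output target are hypothetical.

    import pandas as pd

    def transform(raw: pd.DataFrame) -> pd.DataFrame:
        """Clean and normalise a raw extract before loading it downstream."""
        df = raw.copy()
        # Normalise column names and drop exact duplicate rows.
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.drop_duplicates()
        # Coerce types; invalid values become NaN/NaT instead of raising.
        df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
        df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
        # Drop rows that are unusable downstream.
        return df.dropna(subset=["order_date", "amount"])

    if __name__ == "__main__":
        raw = pd.read_csv("extract.csv")            # hypothetical extract
        transform(raw).to_parquet("clean.parquet")  # hypothetical load target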

Monitoring & Observability

  • Implement monitoring for system performance, data quality, and operational metrics.
  • Build dashboards and alerts to ensure reliability and data integrity.
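
For illustration, here is a sketch of a simple data-quality check of the kind that feeds such alerts. The metric name and threshold are hypothetical, and in practice the values would be shipped to a platform such as Datadog rather than only logged.

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("pipeline.monitoring")

    NULL_RATE_THRESHOLD = 0.05  # hypothetical alert threshold

    def check_null_rate(records: list[dict], field: str) -> float:
        """Report the share of records missing a field; warn if it exceeds the threshold."""
        if not records:
            logger.error("data_quality.empty_batch field=%s", field)
            return 1.0
        null_rate = sum(1 for r in records if r.get(field) is None) / len(records)
        logger.info("data_quality.null_rate field=%s value=%.3f", field, null_rate)
        if null_rate > NULL_RATE_THRESHOLD:
            logger.warning("ALERT data_quality.null_rate above threshold field=%s value=%.3f",
                           field, null_rate)
        return null_rate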

Data Science Collaboration

  • Provide the infrastructure, pipelines, and tools needed for data science experiments and model deployment.
  • Partner with data scientists to deliver datasets and backend services that accelerate analytics work.

Code Quality & Engineering Best Practices

  • Use test-driven development and maintain strong unit/integration test coverage (a small example follows this list).
  • Perform code reviews and promote engineering standards and best practices.
  • Follow modern Python packaging and dependency‑management practices.
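
As a small example of the test-first style referenced above, here is a pytest unit test for a hypothetical slugify helper; both the module and the function are illustrative assumptions.

    # test_slugify.py -- written before the implementation in a TDD workflow.
    import pytest

    from myproject.text import slugify  # hypothetical module under test

    @pytest.mark.parametrize(
        ("raw", "expected"),
        [
            ("Hello World", "hello-world"),
            ("  Extra   spaces  ", "extra-spaces"),
            ("Already-slugged", "already-slugged"),
        ],
    )
    def test_slugify_normalises_input(raw: str, expected: str) -> None:
        assert slugify(raw) == expected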

CI/CD & Infrastructure

  • Build and maintain CI/CD pipelines for automated testing and deployment.
  • Collaborate with DevOps on containerization and orchestration efforts.

Lifecycle Ownership & Continuous Improvement

  • Own the full lifecycle of backend services from development to deployment and ongoing improvement.
  • Identify opportunities to reduce technical debt and enhance system resilience.

What You’ll Bring

Technical Expertise

  • Advanced Python proficiency, including experience with modern backend frameworks (e.g., FastAPI); a minimal service sketch follows this list.
  • Strong understanding of HTTP, RESTful APIs, and core web technologies.
  • Experience working in Linux environments.
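
Here is a minimal sketch of the kind of FastAPI service mentioned above; the endpoint, model, and in-memory store are hypothetical stand-ins for a real PostgreSQL-backed implementation.

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="internal-ops-api")  # hypothetical service name

    class Job(BaseModel):
        id: int
        status: str

    # In-memory stand-in for a real database-backed repository.
    _JOBS = {1: Job(id=1, status="queued")}

    @app.get("/jobs/{job_id}", response_model=Job)
    def get_job(job_id: int) -> Job:
        """Return a single scraping job, or 404 if it does not exist."""
        job = _JOBS.get(job_id)
        if job is None:
            raise HTTPException(status_code=404, detail="job not found")
        return job

During development such a service can be run with an ASGI server, for example: uvicorn module_name:app --reload.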

Web Scraping & Automation

  • Practical experience with scraping libraries and tools (e.g., BeautifulSoup, Scrapy, Selenium, Playwright).
  • Ability to manage JavaScript‑rendered content, sessions, and complex authentication flows.

Data Manipulation

  • Strong experience with Python data libraries (pandas, polars, NumPy).
  • Solid SQL skills and familiarity with common data formats (JSON, CSV, XML, HTML).

Monitoring & Observability

  • Experience with tools such as Datadog or similar platforms.
  • Ability to define, track, and monitor key metrics, logs, and alerts.

Databases & Storage

  • Hands‑on experience with relational databases (e.g., PostgreSQL).
  • Familiarity with ETL/ELT concepts and data‑warehouse fundamentals.

CI/CD & DevOps Collaboration

  • Experience with automated build/test/deploy pipelines.
  • Familiarity with Docker and Kubernetes.

Data Science Support

  • Understanding of data science workflows and the infrastructure needed for experimentation and deployment.
  • Experience building tools and services for ML and analytics use cases.

Minimum Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field, or equivalent practical experience.
  • 5+ years of professional Python backend development experience.
  • 3+ years of experience with web scraping and multi‑source data extraction.
  • Proven experience building scalable backend systems and automated processes.
  • Experience with monitoring/observability tools.
  • Strong SQL experience and familiarity with relational databases.
  • Proficiency with Linux, Git, and command‑line tools.

Preferred Qualifications

  • Experience with asynchronous/concurrent processing (e.g., asyncio, Celery); a brief sketch follows this list.
  • Exposure to logistics, transportation, or supply‑chain concepts.
  • Experience with microservices or distributed systems.
  • Familiarity with data engineering tools (dbt, Airflow, Prefect).
  • Experience building APIs that integrate with machine‑learning pipelines.
  • Open‑source contributions related to scraping, data engineering, or automation.
  • Experience using LLM‑powered development tools in everyday workflows.
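
To illustrate the asynchronous processing listed in the preferred qualifications, here is a brief sketch of concurrent page fetching with asyncio and aiohttp; the URLs are hypothetical placeholders.

    import asyncio

    import aiohttp

    async def fetch(session: aiohttp.ClientSession, url: str) -> str:
        """Fetch one page; errors propagate to the caller for centralised handling."""
        async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as response:
            response.raise_for_status()
            return await response.text()

    async def fetch_all(urls: list[str]) -> list[str]:
        """Fetch many pages concurrently instead of one request at a time."""
        async with aiohttp.ClientSession() as session:
            return await asyncio.gather(*(fetch(session, url) for url in urls))

    if __name__ == "__main__":
        pages = asyncio.run(fetch_all(["https://example.com/a", "https://example.com/b"]))
        print([len(page) for page in pages])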