A technology consulting firm is looking for a Senior Data Engineer to design and optimize data pipelines in AWS. Responsibilities include ETL pipeline development, data governance, and collaboration with cross-functional teams. The ideal candidate has over 5 years of experience in data engineering, strong programming skills in Python and SQL, and advanced English proficiency. This remote role offers robust career growth opportunities.
We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.
Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by global data protection laws such as the GDPR, LGPD, CCPA, and CPRA. Candidate data shared with Avenue Code will be kept confidential and will not be transmitted to disinterested third parties, nor used for purposes other than applications for open positions. As a consultancy, Avenue Code may share your information with its clients, and with other companies in the CompassUol Group to which Avenue Code's consultants are allocated to perform services.
Responsibilities:
- Design performant data pipelines for the ingestion and transformation of complex datasets into usable data products.
- Build enterprise-grade batch and real-time data processing pipelines on AWS, with a focus on serverless architectures.
- Design and implement automated ELT processes to integrate disparate datasets.
- Collaborate across teams to ingest, extract, and process data using Python, SQL, and REST and GraphQL APIs.
- Transform clickstream and CRM data into meaningful metrics and segments for visualization.
- Create automated QA and reliability checks to ensure data integrity.
- Define and maintain CI/CD and deployment pipelines for data infrastructure.
- Containerize and deploy solutions using Docker and AWS ECS.
We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting, and decision-making across two distinct brands. The project involves ingesting data with tools like Fivetran, processing it in BigQuery, building LookML models, and delivering Looker dashboards. You will work in a modern cloud environment (GCP preferred) with a focus on data quality, performance, and cost efficiency, collaborating with cross-functional teams and supporting business users with timely insights.
Why join: remote-first, global community, strong emphasis on growth, learning, and social good.