A tech consulting firm is looking for a Senior Data Engineer to design and maintain AWS-based data pipelines. The ideal candidate will have 5+ years of experience and strong programming skills in Python and SQL. Responsibilities include building scalable data infrastructure, optimizing performance, and ensuring data governance. This role offers 100% remote work with flexible benefits for training and career growth.
We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.
Key Responsibilities:
Requirements:
Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by the strictest global data protection laws, such as the GDPR, LGPD, CCPA, and CPRA. Candidate data shared with Avenue Code will be kept confidential, will not be transmitted to unrelated third parties, and will not be used for purposes other than applications for open positions. As a consultancy, Avenue Code may share your information with its clients and with other companies in the CompassUol Group to which Avenue Code's consultants are allocated to perform services.
The company and our mission:
Zartis is a digital solutions provider working across technology strategy, software engineering and product development.
We partner with firms across financial services, MedTech, media, logistics technology, renewable energy, EdTech, e-commerce, and more. Our engineering hubs in EMEA and LATAM are full of talented professionals delivering business success and digital improvement across application development, software architecture, CI/CD, business intelligence, QA automation, and new technology integrations.
We are looking for a Data Engineer to work on a project in the Technology industry.
The project:
You will be part of a distributed team developing new technologies to solve real business problems. Our client empowers organizations to make smarter, faster decisions through the seamless integration of strategy, technology, and analytics. They have helped leading brands harness their marketing, advertising, and customer experience data to unlock insights, enhance performance, and drive digital transformation.
Our teammates are talented people who come from a variety of backgrounds. We're committed to building an inclusive culture based on trust and innovation.
We are looking for someone with strong communication skills and good attention to detail, who is proactive, comfortable making decisions, and used to building software from scratch.
What you will do:
What you will bring:
Nice to have:
What we offer:
About The Role
We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting and decision-making across two distinct brands.
What You’ll Do
- Build highly consumable and cost-efficient data products by synthesizing data from diverse source systems.
- Ingest raw data using Fivetran and Python, staging and enriching it in BigQuery to provide consistent, trusted dimensions and metrics for downstream workflows (see the sketch after this list).
- Design, maintain, and improve workflows that ensure reliable and consistent data creation, proactively addressing data quality issues and optimizing for performance and cost.
- Develop LookML Views and Models to democratize access to data products and enable self-service analytics in Looker.
- Deliver ad hoc SQL reports and support business users with timely insights.
- (Secondary) Implement simple machine learning features into data products using tools like BQML.
- Build and maintain Looker dashboards and reports to surface key metrics and trends.
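As a rough illustration of the ingestion step above, here is a minimal sketch of loading a staged newline-delimited JSON extract into a BigQuery staging table with the official Python client. The project, dataset, table, and file names are hypothetical, and the sketch assumes `google-cloud-bigquery` is installed and application-default credentials are configured; in practice Fivetran would cover most raw ingestion, with Python reserved for custom sources.

```python
# Minimal sketch (assumptions: google-cloud-bigquery installed, ADC credentials
# configured, and hypothetical project/dataset/table and file names).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    autodetect=True,  # let BigQuery infer the staging schema
)

# Load a staged extract into a staging table for downstream enrichment.
with open("staged_events.jsonl", "rb") as fh:  # hypothetical staging file
    load_job = client.load_table_from_file(
        fh, "my-project.staging.events", job_config=job_config
    )

load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows into staging.events")
```

Downstream modeling and enrichment would then typically happen in SQL (dbt) on top of such staging tables, keeping the Python layer thin.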
What We’re Looking For
- Proven experience building and managing data products in modern cloud environments (GCP preferred).
- Strong proficiency in Python for data ingestion and workflow development.
- Hands-on expertise with BigQuery, dbt, Airflow, and Looker (see the orchestration sketch after this list).
- Solid understanding of data modeling, pipeline design and data quality best practices.
- Excellent communication skills and a track record of effective collaboration across technical and non-technical teams.
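Since the stack above pairs Airflow with dbt, here is one minimal, hypothetical sketch of how a daily dbt build might be scheduled. The DAG id, schedule, project path, and target are all illustrative, and it assumes Airflow 2.4+ (for the `schedule` argument) with a dbt project available on the worker.

```python
# Hypothetical sketch: a daily Airflow DAG that runs a dbt build.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",        # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_build = BashOperator(
        task_id="dbt_build",
        # Illustrative project path and target; `dbt build` runs models and tests.
        bash_command="cd /opt/dbt/analytics && dbt build --target prod",
    )
```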
Why Join Kake?
Kake is a remote-first company with a global community — fully believing that it’s not where your table is, but what you bring to the table. We provide top-tier engineering teams to support some of the world’s most innovative companies, and we’ve built a culture where great people stay, grow, and thrive. We’re proud to be more than just a stop along the way in your career — we’re the destination.
The icing on the Kake:
- Competitive Pay in USD – Work globally, get paid globally.
- Fully Remote – Simply put, we trust you.
- Better Me Fund – We invest in your personal growth and passions.
- Compassion is Badass – Join a community that invests in social good.
Job Title: Solution Architect – Databricks
Location: Remote
Duration: 12 months
About the Role
Celebal Technologies is seeking an experienced Data Architect with deep expertise in Azure Databricks, Unity Catalog, PySpark, and Delta Live Tables. In this role, you will lead the design, architecture, and delivery of advanced data solutions, collaborating closely with cross-functional teams to ensure scalable, secure, and high-performance implementations.
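For context on what Delta Live Tables work typically looks like, here is a minimal, hypothetical sketch of a two-table DLT pipeline in PySpark. The table names, landing path, and expectation are illustrative, and the code assumes it runs inside a Databricks DLT pipeline, where `dlt` and `spark` are provided by the runtime.

```python
# Minimal Delta Live Tables sketch (illustrative names and paths).
# Runs only inside a Databricks DLT pipeline, where `dlt` and `spark` are provided.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events auto-loaded from cloud storage (hypothetical path).")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")
    )

@dlt.table(comment="Cleaned events with a basic data-quality expectation.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # drop failing rows
def clean_events():
    return dlt.read_stream("raw_events").withColumn(
        "ingested_at", F.current_timestamp()
    )
```

Unity Catalog governance (schemas, grants, lineage) would sit around a pipeline like this rather than inside it.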
Key Responsibilities
Required Skills & Qualifications
Preferred Qualifications