A leading healthcare technology firm is seeking a Data Engineer for a short-term contract in Brazil. The role involves building ETL pipelines in Microsoft Azure, supporting data warehousing, and ensuring high data quality. Candidates should have 4+ years of experience, strong skills in SQL, and proficiency in Azure. The position offers remote work and comprehensive benefits.
4+ years of data engineering experience - building data warehouses and ETL pipelines
3+ years of experience with the Snowflake data warehouse
Well versed in data marts
Strong understanding of SQL and the SQL Server stack, including SSIS, SSAS, and SSRS
Ability to work MST hours
Nice to Have Skills & Experience
Experience working within the healthcare industry, provider side highly preferred
Experience working in an AWS environment
Experience with big data tools: Hadoop, Spark, Kafka
Job Description
A healthcare client of ours is looking for multiple Data Engineers to join their growing team. This team is currently working on a project to modernize on-prem SQL warehouses to Snowflake. The data warehouse is built, so you will be supporting the build-out of data marts on top of their Snowflake enterprise warehouse. This includes lifting and shifting existing data marts, building net-new data marts, and further integrations. Additionally, the data engineering team designs, tests, deploys, and maintains data pipelines and establishes a strong data reporting environment. This company operates within an AWS environment, which is supported by their DevOps team; however, any cloud integration experience could be helpful.
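As a rough illustration of the data-mart work described above, here is a minimal Python sketch using the snowflake-connector-python package. The account, credentials, and table names are hypothetical placeholders, not the client's actual objects, and a real build would run through the team's deployment tooling rather than an ad hoc script.

```python
# Hypothetical sketch: rebuilding one mart table on top of an enterprise
# Snowflake warehouse with a simple CTAS. All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # hypothetical account identifier
    user="etl_user",
    password="...",          # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ENTERPRISE_DW",
)
try:
    cur = conn.cursor()
    cur.execute("""
        CREATE OR REPLACE TABLE MARTS.CLAIMS_SUMMARY AS
        SELECT d.service_month,
               f.provider_id,
               SUM(f.claim_amount) AS total_claims
        FROM DW.FACT_CLAIMS f
        JOIN DW.DIM_DATE d ON f.date_key = d.date_key
        GROUP BY d.service_month, f.provider_id
    """)
finally:
    conn.close()
```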
This position is currently slated as a 6-month contract, with a high likelihood of extending for an additional 6+ months. Furthermore, this org has a large pipeline of project work for the next three years, so continuous extensions are likely. Insight Global offers full benefits while on contract. If you're interested, apply today!
Overview
We are seeking a Data Engineer (short-term contractor) to design and implement data acquisition and ETL processes in Microsoft Azure to support enterprise reporting needs. You will be responsible for integrating multiple technical datasets (ServiceNow, PeopleSoft, and vendor systems) into a central Azure SQL Database, ensuring cleansed, reliable, and query-ready data is available for downstream dashboards.
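To make the acquisition pattern concrete, here is a minimal Python sketch that pulls records from the ServiceNow Table API (a real ServiceNow endpoint) and lands them in an Azure SQL staging table via pyodbc. The instance URL, credentials, and column names are hypothetical placeholders.

```python
# Hypothetical sketch: ServiceNow -> Azure SQL staging load.
import requests
import pyodbc

# ServiceNow Table API; instance name and query parameters are placeholders.
resp = requests.get(
    "https://example.service-now.com/api/now/table/incident",
    auth=("api_user", "..."),
    params={"sysparm_limit": 1000},
)
resp.raise_for_status()
records = resp.json()["result"]

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=example.database.windows.net;Database=reporting;"
    "Uid=etl_user;Pwd=...;Encrypt=yes;"
)
cur = conn.cursor()
# Land raw rows in staging; cleansing happens downstream.
cur.executemany(
    "INSERT INTO stg.servicenow_incident (sys_id, number, state) VALUES (?, ?, ?)",
    [(r["sys_id"], r["number"], r["state"]) for r in records],
)
conn.commit()
conn.close()
```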
Core Responsibilities
Required Skills
Nice-to-Have
Why Join Us?
Interested? Apply today and help deliver trusted reporting data for enterprise decision-making.
Come to one of the biggest IT Services companies in the world! Here you can transform your career!
Why join TCS? Here at TCS we believe that people make the difference, which is why we foster a culture of continuous learning, full of opportunities for improvement and mutual development: the ideal setting to expand your ideas with the right tools, contributing to our success in a collaborative environment.
We are looking for a Data Engineer (remote) who wants to learn and transform their career.
In this role you will: (responsibilities)
What can you expect from us?
• Professional development and constant evolution of your skills, always in line with your interests.
• Opportunities to work outside Brazil
• A collaborative, diverse and innovative environment that encourages teamwork.
What do we offer?
Health insurance
Life insurance
Gympass
TCS Cares – a free 0800 hotline providing psychological (24 hrs/day), legal, social, and financial assistance to associates
Partnership with SESC
Reimbursement of Certifications
Free TCS Learning Portal – Online courses and live training
International experience opportunity
Discount Partnership with Universities and Language Schools
Bring Your Buddy – By referring people you become eligible to receive a bonus for each hire
TCS Gems – Recognition for performance
Xcelerate – Free Mentoring Career Platform
At TATA Consultancy Services we promote an inclusive culture and always work for equity. This applies to gender, people with disabilities, LGBTQIA+ people, religion, race, and ethnicity. All our opportunities are based on these principles. We pursue a range of inclusion and social responsibility initiatives in order to build a TCS that respects individuality. Come be a TCSer!
ID:
Project Description:
Responsibilities:
Mandatory Skills Description:
• Strong, recent hands-on expertise with Azure Data Factory and Synapse is a must (3+ years).
• Experience in leading a distributed team.
• Strong expertise in designing and implementing data models, including conceptual, logical, and physical data models, to support efficient data storage and retrieval.
• Strong knowledge of Microsoft Azure, including Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and PySpark, for building scalable and reliable data solutions.
• Extensive experience building robust and scalable ETL/ELT pipelines to extract, transform, and load data from various sources into data lakes or data warehouses (a minimal sketch of this pattern follows this list).
• Ability to integrate data from disparate sources, including databases, APIs, and external data providers, using appropriate techniques such as API integration or message queuing.
• Proficiency in designing and implementing data warehousing solutions (dimensional modeling, star schemas, Data Mesh, Data/Delta Lakehouse, Data Vault)
• Proficiency in SQL to perform complex queries, data transformations, and performance tuning on cloud-based data storages.
• Experience integrating metadata and governance processes into cloud-based data platforms
• Certification in Azure, Databricks, or other relevant technologies is an added advantage
• Experience with cloud-based analytical databases.
• Experience with Azure MI, Azure Database for Postgres, Azure Cosmos DB, Azure Analysis Services, and Informix.
• Experience with Python and Python-based ETL tools.
• Experience with shell scripting in Bash, Unix, or Windows shells is preferred.
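As a rough sketch of the ETL/ELT pattern referenced in the list above, the following PySpark snippet shows the kind of job that might run on Azure Databricks: read raw JSON from ADLS, cleanse it, and append to a Delta table. Paths, columns, and table names are hypothetical placeholders.

```python
# Hypothetical sketch: raw ADLS files -> cleansed Delta table on Databricks.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the cluster on Databricks

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Append into a lakehouse table partitioned for downstream queries.
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("order_date")
        .saveAsTable("curated.orders"))
```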
Nice-to-Have Skills Description:
• Experience with Elasticsearch
• Familiarity with containerization and orchestration technologies (Docker, Kubernetes).
• Troubleshooting and Performance Tuning: Ability to identify and resolve performance bottlenecks in data processing workflows and optimize data pipelines for efficient data ingestion and analysis.
• Collaboration and Communication: Strong interpersonal skills to collaborate effectively with stakeholders, data engineers, data scientists, and other cross-functional teams.
Languages:
Our client is a U.S.-based company that provides technical expertise, testing, and certification services to the global food and agricultural industry. Their mission is to ensure food safety, quality, and sustainability across international supply chains.
This role is critical to building, maintaining, and modernizing data pipelines that process large-scale regulatory data from around the world and transform it into usable datasets for downstream applications and APIs.
The engineer will work hands-on with Python, SQL, and related tools to untangle legacy “spaghetti code” pipelines, migrate processes to more maintainable platforms such as Airflow, and ensure that our data is accurate, reliable, and ready for client-facing products. This role requires both strong technical ability and a consulting mindset: the ability to learn undocumented systems, troubleshoot gaps, and design forward-looking solutions that will scale as our data environment evolves.
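For a sense of what such a migration target can look like, here is a minimal Airflow DAG sketch (Airflow 2.4+ syntax). The task bodies and names are hypothetical placeholders; the point is that a legacy script becomes explicit, dependency-ordered, schedulable tasks.

```python
# Hypothetical sketch: a legacy pipeline restructured as an Airflow DAG.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_regulatory_data(**context):
    ...  # pull source files / API payloads (placeholder)

def transform_to_datasets(**context):
    ...  # cleanse and reshape into client-facing tables (placeholder)

with DAG(
    dag_id="regulatory_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_regulatory_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_datasets)
    extract >> transform
```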
Required Qualifications:
Preferred Qualifications:
We are seeking a skilled and experienced Data Engineer to join our Threat Research team. The primary responsibility of this role will be to design, develop, and maintain data pipelines for threat intelligence ingestion, validation, and export automation flows.
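As one small illustration of the validation stage such pipelines need, here is a hedged Python sketch that checks ingested indicator records before export. The record shape and rules are hypothetical, not the team's actual schema.

```python
# Hypothetical sketch: validate threat-intel records before export.
import ipaddress

REQUIRED_FIELDS = {"indicator", "type", "source"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if record.get("type") == "ipv4":
        try:
            ipaddress.IPv4Address(record.get("indicator", ""))
        except ValueError:
            errors.append("indicator is not a valid IPv4 address")
    return errors

# Drop invalid rows before the export flow.
batch = [{"indicator": "203.0.113.7", "type": "ipv4", "source": "feed_a"}]
clean = [r for r in batch if not validate_record(r)]
```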
Responsibilities:
Requirements:
About The Role
We are seeking experienced Data Engineers to develop and deliver robust, cost-efficient data products that power analytics, reporting and decision-making across two distinct brands.
What You’ll Do
- Build highly consumable and cost-efficient data products by synthesizing data from diverse source systems.
- Ingest raw data using Fivetran and Python, staging and enriching it in BigQuery to provide consistent, trusted dimensions and metrics for downstream workflows (see the sketch after this list).
- Design, maintain, and improve workflows that ensure reliable and consistent data creation, proactively addressing data quality issues and optimizing for performance and cost.
- Develop LookML Views and Models to democratize access to data products and enable self-service analytics in Looker.
- Deliver ad hoc SQL reports and support business users with timely insights.
- (Secondary) Implement simple machine learning features into data products using tools like BQML.
- Build and maintain Looker dashboards and reports to surface key metrics and trends.
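To ground the BigQuery staging step referenced above, here is a minimal sketch using the google-cloud-bigquery client. The dataset and table names are hypothetical stand-ins for the two brands' sources.

```python
# Hypothetical sketch: synthesize one trusted dimension from two brand sources.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

sql = """
CREATE OR REPLACE TABLE marts.dim_customer AS
SELECT customer_id, brand, email, created_at FROM staging.brand_a_customers
UNION ALL
SELECT customer_id, brand, email, created_at FROM staging.brand_b_customers
"""
client.query(sql).result()  # .result() blocks until the job completes
```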
What We’re Looking For
- Proven experience building and managing data products in modern cloud environments (GCP preferred).
- Strong proficiency in Python for data ingestion and workflow development.
- Hands-on expertise with BigQuery, dbt, Airflow and Looker.
- Solid understanding of data modeling, pipeline design and data quality best practices.
- Excellent communication skills and a track record of effective collaboration across technical and non-technical teams.
Why Join Kake?
Kake is a remote-first company with a global community — fully believing that it’s not where your table is, but what you bring to the table that matters. We provide top-tier engineering teams to support some of the world’s most innovative companies, and we’ve built a culture where great people stay, grow, and thrive. We’re proud to be more than just a stop along the way in your career — we’re the destination.
The icing on the Kake:
Competitive Pay in USD – Work globally, get paid globally.
Fully Remote – Simply put, we trust you.
Better Me Fund – We invest in your personal growth and passions.
Compassion is Badass – Join a community that invests in social good.
The company and our mission:
Zartis is a digital solutions provider working across technology strategy, software engineering and product development.
We partner with firms across financial services, MedTech, media, logistics technology, renewable energy, EdTech, e-commerce, and more. Our engineering hubs in EMEA and LATAM are full of talented professionals delivering business success and digital improvement across application development, software architecture, CI/CD, business intelligence, QA automation, and new technology integrations.
We are looking for a Data Engineer to work on a project in the Technology industry.
The project:
Our teammates are talented people that come from a variety of backgrounds. We’re committed to building an inclusive culture based on trust and innovation.
You will be part of a distributed team developing new technologies to solve real business problems. Our client empowers organizations to make smarter, faster decisions through the seamless integration of strategy, technology, and analytics. They have helped leading brands harness their marketing, advertising, and customer experience data to unlock insights, enhance performance, and drive digital transformation.
We are looking for someone with strong communication skills, ideally someone who is proactive, comfortable making decisions, used to building software from scratch, and attentive to detail.
What you will do:
What you will bring:
Nice to have:
What we offer:
We are looking for a Senior Data Engineer to design and maintain scalable data pipelines on AWS, ensuring performance, quality, and security. You will collaborate with data scientists and analysts to integrate data from multiple sources and support AI/ML initiatives.
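As one plausible shape of this work, here is a hedged boto3 sketch that triggers an AWS Glue ETL job and checks its run state. The job name is a hypothetical placeholder, and a production pipeline would add retries and alerting.

```python
# Hypothetical sketch: kick off a Glue ETL job and check its run state.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="nightly_sales_etl")
status = glue.get_job_run(JobName="nightly_sales_etl", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED
```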
Key Responsibilities:
Requirements:
Avenue Code reinforces its commitment to privacy and to all the principles guaranteed by the most stringent global data protection laws, such as GDPR, LGPD, CCPA, and CPRA. Candidate data shared with Avenue Code will be kept confidential and will not be transmitted to uninvolved third parties, nor will it be used for purposes other than the application for open positions. As a consultancy, Avenue Code may share your information with its clients and with other companies from the CompassUol Group to which Avenue Code's consultants are allocated to perform its services.