Job Search and Career Advice Platform


ETL Informatica Developer

Cognizant

Toronto

Hybrid

CAD 90,000 - 120,000

Full time

2 days ago

Job summary

A leading technology company in Toronto is seeking a highly skilled Sr. Developer with over 8 years of experience in data integration and ETL processes. The ideal candidate will excel in tools such as Informatica PowerCenter, PL/SQL, and AWS. This role emphasizes enhancing data warehousing solutions and building scalable data pipelines in Databricks in support of crucial business objectives. Strong data analysis skills and excellent communication are essential. The position operates on a hybrid model.

Qualifications

  • 8+ years of experience in data integration and ETL processes.
  • Proficient in developing and supporting Informatica mappings in production.
  • Strong skills in AWS resource provisioning.
  • Experience building scalable data pipelines in Databricks.

Responsibilities

  • Develop and support Informatica mappings and workflows.
  • Apply Teradata FastLoad and MultiLoad techniques for efficient bulk data ingestion and transformation.
  • Build scalable data pipelines in Databricks.
  • Implement Kafka streaming data consumption using Confluent tools.

Skills

Data integration
ETL processes
Informatica PowerCenter
PL/SQL
AWS resources
Data pipelines in Databricks
Kafka streaming
Python scripting
Data warehousing concepts
Data Quality tools
Communication skills

Tools

Informatica
AWS
Teradata
Databricks
GitHub
Kubernetes

Job description

We are Cognizant Artificial Intelligence:

Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. To seize it, however, clients need new business models built on analyzing customers and business operations from every angle in order to truly understand them. By applying artificial intelligence and data science to business decisions through enterprise data management solutions, we help leading companies prototype, refine, validate, and scale their most promising products and delivery models to enterprise scale within weeks.

Job Summary

We are seeking a highly skilled Sr. Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have expertise in data integration and ETL processes utilizing tools such as Informatica PowerCenter and PL/SQL. This role involves working in a hybrid model with a focus on enhancing data warehousing solutions. The candidate will play a crucial role in optimizing data processes to support our business objectives.

Responsibilities
  • Must have hands-on experience in developing and supporting Informatica mappings and workflows in production environments.
  • Should be proficient in Teradata FastLoad and MultiLoad techniques for efficient bulk data ingestion and transformation.
  • Requires strong skills in provisioning AWS resources including S3 buckets, EC2 instances, IAM roles, Lambda functions and PostgreSQL databases using AWS CodePipeline.
  • Must be experienced in building scalable and optimized data pipelines in Databricks for both batch and streaming workloads.
  • Should have prior experience in providing production support for Databricks jobs including troubleshooting and performance tuning.
  • Must be capable of implementing Kafka streaming data consumption using Confluent tools or AWS Lambda for real-time data processing.
  • Requires knowledge of Databricks Medallion architecture (Bronze, Silver, Gold layers) for structured and scalable data lakehouse design.
  • Should be able to develop CI/CD pipelines using GitHub, GitHub Actions and Kubernetes for automated deployment and monitoring.
  • Must be familiar with applying vulnerability fixes to container images to maintain secure cloud environments.
  • Should be able to develop Python scripts to parse JSON data, convert it into flat files and load the structured output into Teradata systems.
  • Requires familiarity with cloud-based scheduling tools for orchestrating ETL workflows and managing job dependencies.
  • Must have a solid understanding of data warehousing concepts including Slowly Changing Dimensions (SCD Type 1 and Type 2).
  • Should have exposure to Data Quality tools for cleansing and matching of name and address data to ensure accuracy and consistency.
  • Requires excellent communication skills for effective collaboration with cross‑functional teams and stakeholders.
  • Must possess strong data analysis skills to interpret complex datasets and support business decision‑making.
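To illustrate the Python scripting responsibility above (parsing JSON, converting it to flat files, and preparing structured output for loading into Teradata), here is a minimal sketch. The function name, field list, and pipe delimiter are illustrative assumptions; the actual Teradata load step (e.g. via FastLoad) is outside the scope of this snippet.

```python
import json
import csv
import io

def json_to_flat_file(json_text, fields, delimiter="|"):
    """Parse a JSON array of records and render it as a delimited
    flat file suitable for bulk loading (e.g. via Teradata FastLoad).
    Fields missing from a record are emitted as empty strings."""
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter, lineterminator="\n")
    for rec in records:
        writer.writerow([rec.get(f, "") for f in fields])
    return buf.getvalue()

# Hypothetical input: two customer records, one missing the "city" field.
sample = '[{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace", "city": "Toronto"}]'
print(json_to_flat_file(sample, ["id", "name", "city"]))
# prints:
# 1|Ada|
# 2|Grace|Toronto
```

The resulting pipe-delimited output can then be staged and ingested with the bulk-loading utilities named in the requirements.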

At Cognizant, we’re eager to meet people who believe in our mission and can make an impact in various ways. We encourage you to apply if you have most of the skills above and feel you are strongly suited for this role. Consider what transferable experience and skills make you a unique applicant and help us see how you’d be beneficial to this role.

Cognizant will only consider applicants for this position who are legally authorized to work in Canada without requiring employer sponsorship, now or at any time in the future.

Working arrangements:

Note: The working arrangements for this role are accurate as of the date of posting. They may change based on the project you’re engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.
