
Senior Integration Engineer - Kafka

HRB

Vaughan

On-site

CAD 100,000 - 130,000

Full time

30+ days ago

Job summary

A leading company is seeking a Senior Integration Engineer to design and maintain integration solutions, focusing on Apache Kafka and AWS Glue. The role involves collaborating with various teams to ensure efficient data flow and implementing best practices in integration architecture.

Qualifications

  • 5+ years of hands-on experience in enterprise-level integration solutions.
  • Proficiency in Python, Java, or Scala for integration development.

Responsibilities

  • Design and maintain scalable integration solutions using Apache Kafka.
  • Develop ETL/ELT pipelines using AWS Glue for data processing.
  • Collaborate with teams to ensure seamless data flow across systems.

Skills

Apache Kafka
AWS Glue
Data Modeling
Event-Driven Architecture
Microservices Principles
Problem Solving
Communication

Education

Bachelor's degree in Computer Science

Tools

AWS
SQL
NoSQL
Git
Docker
Terraform

Job description

We are seeking a highly skilled and experienced Senior Integration Engineer to join our dynamic team. This role will be pivotal in designing, developing, and maintaining robust integration solutions, with a strong focus on leveraging Apache Kafka for real-time data streaming and AWS integration services, particularly AWS Glue for ETL and data cataloging. The ideal candidate will have a deep understanding of enterprise integration patterns, event-driven architectures, and cloud-native solutions. You will collaborate closely with various engineering, product, and business teams to ensure seamless and efficient data flow across our diverse system landscape.

Key Responsibilities:

  • Design, develop, implement, and maintain scalable and resilient integration solutions using Apache Kafka and its ecosystem (Kafka Connect, Kafka Streams, Schema Registry).
  • Monitor, troubleshoot, and optimize Kafka cluster performance, including throughput, latency, and resource utilization, to ensure stability and efficiency.
  • Architect and build ETL/ELT pipelines using AWS Glue for data extraction, transformation, and loading from various sources into data lakes, data warehouses, and other target systems.
  • Develop and manage AWS Glue Data Catalog and ensure data quality, consistency, and governance across integrated systems.
  • Integrate a variety of cloud-based applications, databases, and third-party services using appropriate AWS services (e.g., Lambda, S3, API Gateway, SNS, SQS, EventBridge) and other integration technologies.
  • Champion and implement best practices for event-driven architecture, microservices integration, and API development.
  • Collaborate with architects, data engineers, software developers, and business stakeholders to understand integration requirements and translate them into technical designs and solutions.
  • Optimize and troubleshoot existing integration solutions to improve performance, reliability, and scalability.
  • Develop and maintain comprehensive documentation for integration designs, processes, and data flows.
  • Implement robust monitoring, alerting, and logging for integration solutions to ensure operational excellence.
  • Mentor junior engineers and provide technical leadership within the integration domain.
  • Stay current with emerging technologies and industry trends in data integration, Kafka, and AWS services.
  • Ensure security best practices are implemented and maintained for all integration solutions.

Required Qualifications:

  • Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent practical experience).
  • 5+ years of hands-on experience in designing and developing enterprise-level integration solutions.
  • Proven, in-depth experience with Apache Kafka, including designing and implementing Kafka producers, consumers, topics, and connectors, as well as performance tuning.
  • Strong experience with AWS cloud services, particularly AWS Glue for ETL, data cataloging, and job orchestration.
  • Proficiency with other AWS integration and data services such as S3, Lambda, API Gateway, Kinesis, Redshift, RDS, and DynamoDB.
  • Solid understanding of data modeling, data warehousing concepts, and ETL/ELT processes.
  • Experience with various data formats (e.g., JSON, Avro, Parquet, XML) and API protocols (e.g., REST, SOAP, gRPC).
  • Proficiency in one or more programming languages commonly used in integration development (e.g., Python, Java, Scala).
  • Experience with SQL and NoSQL databases.
  • Strong understanding of integration patterns, event-driven architecture, and microservices principles.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
  • Excellent analytical, problem-solving, and troubleshooting skills.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical teams.

Preferred Qualifications:

  • Confluent Certified Developer for Apache Kafka (CCDAK) or similar Kafka certification.
  • AWS Certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Developer - Associate/Professional, AWS Certified Solutions Architect - Associate/Professional).
  • Experience with other ETL tools (e.g., Apache Airflow, Talend, Informatica).
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes).
  • Experience with Infrastructure as Code (IaC) tools (e.g., Terraform, AWS CloudFormation).
  • Knowledge of data governance and data security best practices.