Data Engineer

Horizon Technologies

Karachi Division

On-site

PKR 1,400,000 - 2,000,000

Full time

30+ days ago

Job summary

A leading tech firm in Karachi is seeking an experienced Data Engineer to extract, transform, and load data into data lakes and warehouses using AWS. Ideal candidates should have strong AWS knowledge, programming skills in Python and SQL, and experience in building ETL pipelines. This position offers the opportunity to work with cutting-edge technologies in a dynamic environment.

Qualifications

  • 2+ years of progressive experience in working on AWS services.
  • Previous experience as a data engineer or in a similar role.
  • Technical expertise with data models, data scraping, data cleansing, and segmentation techniques.

Responsibilities

  • Extract data from multiple sources and ingest it into a data lake (AWS S3).
  • Cleanse, transform, and maintain data quality.
  • Build and maintain data lakes, data warehouses, and data marts on AWS.

Skills

AWS Glue
Python
SQL
Data Quality Analysis
ETL Pipelines
Data Engineering Certification
AWS S3
AWS Lambda
MS Excel
PySpark

Education

Bachelor's degree in Computer Science or equivalent

Tools

AWS Services
Terraform

Job description

We are seeking an experienced Data Engineer to join our team. The Data Engineer will be responsible for extracting, transforming, and loading data from various sources into data lakes and data warehouses using AWS while ensuring the efficiency and alignment of data systems with business goals. The ideal candidate should have strong analytical skills, familiarity with programming languages such as Python and SQL, and experience in building ETL/ELT pipelines.

Must Haves:

  • Bachelor's degree in Computer Science or equivalent required.
  • 2+ years of progressive experience in working on AWS services.
  • Previous experience as a data engineer or in a similar role.
  • Technical expertise with data models, data scraping, data cleansing, and segmentation techniques.
  • Knowledge and understanding of Amazon Web Services such as AWS Glue (Crawler, Job, Database, Workflow), AWS S3, AWS AppFlow, AWS Athena, AWS Lambda, etc.
  • Knowledge and experience in connecting to multiple data sources using AWS services, APIs, or connection protocols such as ODBC and JDBC.
  • Knowledge and experience of Python and PySpark.
  • Knowledge and experience in SQL and SparkSQL queries (see the brief sketch after this list).
  • Knowledge of MS Excel and ability to build various views using pivot tables.
  • Great numerical, statistical, and analytical skills.
  • Data engineering certification will be a plus.
  • Interest in and experience with leveraging AI for data engineering is a plus.
  • Knowledge and experience with Terraform will be a plus.
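
For context only, here is a minimal PySpark/SparkSQL sketch of the kind of work listed above. The S3 path, table name, and column names are hypothetical illustrations, not details from this posting:

    from pyspark.sql import SparkSession

    # Start a Spark session (in AWS Glue the job runtime provides this).
    spark = SparkSession.builder.appName("sparksql-sketch").getOrCreate()

    # Read a raw CSV extract; the bucket and schema are hypothetical.
    orders = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

    # Expose the DataFrame to SparkSQL and run an aggregate query.
    orders.createOrReplaceTempView("orders")
    totals = spark.sql("""
        SELECT customer_id, SUM(CAST(amount AS DOUBLE)) AS total_amount
        FROM orders
        GROUP BY customer_id
    """)
    totals.show()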

Responsibilities:

  • Extract data from multiple sources (cloud/on-prem) and ingest it into a data lake (AWS S3) through AWS services, APIs, or connection protocols such as ODBC and JDBC.
  • Cleanse, transform, and maintain data quality in data lakes and data warehouses.
  • Build and maintain data lakes, data warehouses, and data marts on AWS as per the business requirements.
  • Develop and manage data catalogs.
  • Build data pipelines and workflows with AWS Glue to ingest raw data into the data lake and load transformed, cleaned data into the data warehouse (a brief sketch follows this list).
  • Conduct complex data analysis and report on results.
  • Explore and implement methods to enhance data quality, reliability, and governance.
  • Evaluate business needs and objectives.
  • Interpret trends and patterns.
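
As an illustration only, a minimal AWS Glue (PySpark) job along the lines of the pipeline responsibilities above; the catalog database, table name, key column, and S3 path are hypothetical, not taken from this posting:

    import sys
    from awsglue.context import GlueContext
    from awsglue.dynamicframe import DynamicFrame
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    # Standard Glue job setup: resolve arguments and create contexts.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw data registered in the Glue Data Catalog (names are hypothetical).
    raw = glue_context.create_dynamic_frame.from_catalog(
        database="raw_db", table_name="orders"
    )

    # Basic cleansing: drop duplicates and rows missing the key column.
    cleaned_df = raw.toDF().dropDuplicates().dropna(subset=["order_id"])

    # Write the cleaned data back to the data lake as Parquet (bucket is hypothetical).
    cleaned = DynamicFrame.fromDF(cleaned_df, glue_context, "cleaned")
    glue_context.write_dynamic_frame.from_options(
        frame=cleaned,
        connection_type="s3",
        connection_options={"path": "s3://example-curated-bucket/orders/"},
        format="parquet",
    )

    job.commit()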

Other Details

  • Work Days: Monday-Friday
  • Office location: Off Shahrah-e-Faisal, PECHS, Karachi

Required Skills:

ETL, Data Engineering, Pivot Tables, Data Quality Analysis, Pipelines, Web Services, AWS, Business Requirements, Programming Languages, Excel, Analytical Skills, Reliability, MS Excel, Programming, Computer Science, Data Analysis, Engineering, SQL, Python, Business, Science
