A modern technology company in Dubai is seeking a Data Engineer to architect data ingestion pipelines and develop storage solutions. The ideal candidate has over 6 years of experience building data systems, strong Python skills, and proficiency with tools such as Kafka and SQL. This role offers flexibility for remote work and ample opportunities for professional growth.
Key Responsibilities
Ingestion & Pipelines: Architect batch and streaming pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers.
Storage & Modeling: Implement and tune S3, column-oriented, and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimisation.
Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services used by internal data consumers for research, backtesting, and real-time trading.
Reliability & Observability: Embed monitoring, alerting, SLAs/SLOs, and CI/CD; champion automated testing, data quality dashboards, and incident runbooks.
Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps teams to translate requirements into platform capabilities and evangelise best practices.
Qualifications:
Additional Information:
What we offer:
Remote Work: Yes
Employment Type: Full-time
Key Skills: Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Department / Functional Area: Data Engineering
Experience: 6+ years
Vacancy: 1