AI & Data Engineer

Seattle Aviation Solutions

Dubai

Hybrid

USD 60,000 - 100,000

Full time

2 days ago

Job summary

An innovative company is seeking a technically brilliant AI & Data Engineer who is passionate about working with large datasets and shaping data strategies. In this role, you will build and maintain scalable data pipelines, collaborate on AI opportunities, and write efficient Python code for data transformation and automation. You will also manage tools for data analysis and model development while staying current on AI trends. If you're self-directed, curious, and ready to tackle technical challenges, this is the perfect opportunity for you to make a significant impact in a forward-thinking environment.

Skills

Python Programming
Data Processing Frameworks
Cloud Environments
SQL Databases
NoSQL Databases
ML Model Deployment
DevOps Tools
REST APIs
Large Datasets Management

Tools

Jupyter
Airflow
Docker
Git

Job description

Are you a technically brilliant developer with a passion for working with large datasets and exploring the frontier of AI? Do you enjoy building elegant solutions with Python and want to play a foundational role in shaping a company's data and AI strategy? If so, we invite you to join Seattle Aviation Solutions as an AI & Data Engineer.

Key Responsibilities:
  1. Data Engineering & Pipeline Development: Build and maintain scalable, reliable data pipelines and ETL processes to support analytics, reporting, and AI applications.
  2. AI/ML Enablement: Collaborate with internal stakeholders to identify AI opportunities and develop prototypes that support business functions such as pricing, purchasing optimization, forecasting, and anomaly detection.
  3. Python Development: Write clean, efficient, modular Python code to support data transformation, automation, web scraping, API integrations, and ML model deployment.
  4. Data Architecture: Help define and maintain data infrastructure, schemas, and storage systems to ensure high-performance analytics across structured and unstructured datasets.
  5. Independent Problem Solving: Take ownership of technical challenges, working independently to design and implement end-to-end solutions with minimal oversight.
  6. Tooling and Infrastructure: Set up and manage tools for data analysis, model development, and experimentation (e.g., Jupyter, Airflow, Docker, Git).
  7. Collaboration: Work closely with data and IT leadership and other technical team members to translate high-level ideas into deployable code and functional systems.
  8. Research & Development: Stay current on AI/ML trends, LLMs, and emerging technologies to help SAS map its AI future and continuously push innovation forward.
Personal Attributes:
  • Highly intelligent, curious, and motivated by technical challenges.
  • Strong problem-solver with a hacker mindset and bias for action.
  • Self-directed and capable of managing projects independently from concept to deployment.
  • Excellent communication skills and ability to work across technical and non-technical teams.
Qualifications:
  • Experience: 3-6 years of experience in Python development, data engineering, or AI/ML roles.
Skills:
  • Strong Python programming expertise.
  • Experience with data processing frameworks (e.g., Pandas, NumPy, SQLAlchemy, PySpark).
  • Familiarity with cloud environments (e.g., AWS, GCP, or Azure).
  • Proficiency with SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Redis).
  • Experience building and deploying ML models using libraries like scikit-learn, TensorFlow, or PyTorch.
  • Comfortable with DevOps tools (Git, Docker, CI/CD pipelines) and working in Linux environments.
  • Understanding of REST APIs and third-party data integrations.
  • Experience working with large datasets and solving performance bottlenecks.
  • Comfortable working within the current tech stack (Snowflake, AWS, Fivetran, Tableau).

Bonus: Exposure to LLMs, LangChain, vector databases (e.g., Pinecone, Weaviate), or NLP applications is highly desirable.

Location:

Dubai or Remote (with willingness to work EST or PST hours)

Disclaimer:

Naukrigulf.com is only a platform that brings jobseekers and employers together. Applicants are advised to independently research the bona fides of prospective employers. We do NOT endorse any requests for money payments and strictly advise against sharing personal or bank-related information. We also recommend you visit Security Advice for more information. If you suspect any fraud or malpractice, email us at abuse@naukrigulf.com
