AI Data Engineer

Intuo

Tulsa (OK)

On-site

USD 70,000 - 110,000

Full time

30+ days ago

Job summary

Join a forward-thinking company as a Database and AI Solutions Engineer! You will manage and optimize databases, ensuring data integrity and security, and design scalable ETL pipelines. Your expertise in Python and SQL will be crucial as you develop and integrate AI solutions using frameworks such as LangChain. You will collaborate with Data Scientists to maintain data quality and explore new technologies in the cloud. If you're passionate about data and AI, this is the perfect chance to make a significant impact in a dynamic environment.

Qualifications

  • Experience in managing and optimizing relational databases.
  • Proficiency in Python and SQL; familiarity with R is a plus.

Responsibilities

  • Manage and optimize databases for data integrity and security.
  • Design scalable ETL pipelines and develop AI solutions.

Skills

Python
SQL
Git
Google Cloud Platform (GCP)
API frameworks
LangChain
R

Tools

Docker
Kubernetes
AWS
Azure
Apache Kafka
Apache NiFi
Apache Spark
Apache Superset

Job description

  • Manage and optimize databases to ensure data integrity, security, and accessibility.
  • Design and optimize scalable ETL pipelines.
  • Develop and integrate AI solutions using frameworks such as LangChain.
  • Ensure data quality in coordination with Data Scientists.

Requirements:

  • Experience in managing and optimizing relational databases.
  • Strong proficiency in Python, deep knowledge of SQL, and familiarity with R (nice to have).
  • Knowledge of API frameworks.
  • Experience using versioning tools like Git.
  • Competence with cloud technologies, particularly Google Cloud Platform (GCP).
  • Familiarity with AI frameworks such as LangChain and related tools and technologies.
  • Interest in the open-source world and familiarity with related technologies and tools.

Nice to Have:

  • Experience in building ML/AI models, particularly LLMs.
  • Knowledge of principles for managing and analyzing spatial (geolocated) data.
  • Experience with other cloud technologies like AWS and Azure.
  • Knowledge of Kubernetes and containerization with Docker.
  • Experience or interest in working with vector data for AI projects.
  • Experience with Apache Superset or other open-source and commercial Business Intelligence tools.
  • Knowledge of Apache Kafka, Apache NiFi, and Apache Spark for Big Data applications.