Data Engineer II

SimplePractice

Xico

On-site

MXN 1,093,000 - 1,458,000

Full-time

Today

Vacancy Description

A leading software company is seeking a Data Engineer II in Xico, Veracruz, Mexico to enhance its data engineering efforts. The role involves building scalable systems for data from various sources, managing the complete data stack, and designing data solutions using cloud technologies, primarily AWS. Ideal candidates will have 4+ years of experience, strong SQL skills, and proficiency in tools such as MySQL or Snowflake. Opportunities for personal growth abound within this fast-paced, innovative environment.

Qualifications

  • 4+ years of progressive professional experience preferred.
  • Top-notch SQL skills and expertise in relational technology.
  • Expert in at least two database engines, preferably MySQL, Snowflake, or Postgres.

Responsibilities

  • Partner with analysts to build scalable systems for data from various sources.
  • Manage the complete data stack from ingestion to consumption.
  • Design scalable data solutions leveraging cloud technologies.

Skills

SQL
Data modeling
Data ingestion tools
Python
Unix/Linux scripting
Data analytics tools

Education

BS/MS in Engineering, Mathematics, Physics, Computer Science or equivalent

Tools

MySQL
Postgres
Snowflake
Airflow / Prefect
Tableau
Looker

Job Description

About Us

SimplePractice is headquartered in Los Angeles, California, but we have team members who work and live across the United States, Dominican Republic, Mexico City, and Ukraine.

We are the world's leading health practice management software.

We build products that help clinicians (e.g., therapists, psychiatrists) run their private practices with ease.

At the end of the day, our mission is to empower private practices to thrive.

Our Culture

At SimplePractice, culture is our foundation.

It influences the way we work, how we serve our customers, and how we approach accomplishing our mission.

We have five core values that we strive to embody every day:

  • We think big
  • We take simplicity seriously
  • We come as we are
  • We act with humility
  • We are built on trust

Culture is everyone's responsibility at SimplePractice.

Our culture is what drives us to do better for our teammates and customers.

Connection and collaboration are also key to our success.

You will work with our talented multi‑national teams and have opportunities to participate in onsites in both the US and Mexico.

The Role

SimplePractice is seeking a Data Engineer II to take the company's data engineering to the next level of our data-driven journey.

Our Data and Analytics Team is responsible for data across the enterprise and is organized into three functions.

Information delivery is provided by our Data & Analytics group, while the analytics itself is enabled by the Data Engineering and Infrastructure groups.

The information from our Analytics Team is paramount to our company's successful operation and growth.

Data Engineering and Architecture covers data ingestion and transformations across our analytics platform, as well as the overall enterprise data architecture and our product's database backend architecture and data access layer.

This role will be leading our data ingestion efforts within the Data Engineering & Architecture team.

All of this enables us to maintain data sanctity, making our data usable for decision making.

Data Governance includes data stewardship, lineage, quality, and security.

We employ best practices in delivering each of these functions to the utmost benefit of our organization.

As the data vision gets implemented and the company grows, there are many opportunities for personal growth.

Responsibilities
  • Partner with analysts to build scalable systems that help unlock the value of data from a wide range of sources such as backend databases, event streams, and marketing platforms
  • Consult with our Product and Engineering Teams in the creation of new data in the production environment
  • Create company-wide alignment through standardized metrics
  • Promote the importance of dimensional data models in communicating across the organization
  • Manage the complete data stack from ingestion through data consumption (a minimal ingestion sketch follows this list)
  • Connect our teams and their workflows to centralized and secure databases
  • Build tools to increase transparency in reporting company-wide business outcomes
  • Define and promote data engineering best practices
  • Design scalable data solutions leveraging cloud data technologies, preferably in AWS
  • Help define a data quality and data security framework to measure and monitor data quality across the enterprise
  • Excellent problem-solving & critical-thinking skills to meet complex data challenges and requirements in a fast-paced, rapidly changing environment
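
For illustration only, here is a minimal sketch of one ingestion-to-consumption step of the kind described above: reading a Parquet extract from S3 and loading it into a Postgres staging table. The bucket, columns, table, and connection string are hypothetical, not details from the posting, and other stacks (for example Snowflake) would work equally well.

    # Minimal ingestion sketch; every name below is hypothetical.
    # Requires: pandas, pyarrow, s3fs, sqlalchemy, psycopg2-binary.
    import pandas as pd
    from sqlalchemy import create_engine

    # Read a Parquet extract directly from S3 (s3fs resolves the s3:// URL).
    df = pd.read_parquet("s3://example-bucket/exports/events/2024-01-01.parquet")

    # Light transformation: keep only the columns analysts consume.
    df = df[["event_id", "user_id", "event_type", "occurred_at"]]

    # Load into a Postgres staging table for downstream modeling.
    engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")
    df.to_sql("stg_events", engine, if_exists="append", index=False)
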
Desired Skills & Experience

Along with the responsibilities and competencies specified above, we are looking for an individual who possesses a positive, action‑oriented attitude and understands the importance of taking initiative within a team environment.

  • 4+ years of progressive professional experience preferred
  • Top-notch SQL, statistical / window functions, complex data types
  • Expert in relational technology, data modeling, and dimensional modeling
  • Expert in at least two database engines, preferably MySQL, Snowflake, or Postgres
  • Metadata‑driven and database‑centric concepts
  • Database performance
  • Data transformations
  • Expert at ETL and ETL tools, including Airflow / Prefect, DBT, Airbyte, Fivetran (see the DAG sketch after this list)
  • ELT and schema‑on‑read concepts
  • Data ingestion tools, such as Kafka, DMS, Singer
  • At least one programming language, preferably Python
  • Unix / Linux scripting, such as bash
  • Experience with APIs, such as via curl
  • Experience with achieving performance through parallelism
  • DAGs
  • Experience with cloud-based infrastructure, particularly AWS
  • Cloud storage, S3
  • Data storage formats, such as Parquet, ORC
  • Experience with external tables
  • Unstructured and semi‑structured data types, JSON
  • Data analytics
  • Experience with at least one visualization tool, preferably Looker, Tableau, Sisense
  • Excellent communication skills
  • BS / MS degree in Engineering, Mathematics, Physics, Computer Science or equivalent experience
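
For illustration only, below is a minimal sketch of the DAG-based orchestration referenced above, written against Airflow (one of the tools named in the list). The DAG id, schedule, and task bodies are placeholders rather than anything specified by the posting.

    # Minimal Airflow 2.x DAG sketch; dag_id, schedule, and task logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull yesterday's records from a source system.
        print("extracting...")

    def load():
        # Placeholder: load the extracted records into the warehouse.
        print("loading...")

    with DAG(
        dag_id="example_daily_ingestion",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Dependencies: extract runs before load.
        extract_task >> load_task
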
Bonus Points
  • Real‑time ETL - Kafka streaming, AWS Kinesis
  • AWS DevOps - Terraform, Kubernetes