
Databricks Developer

Cognizant

Guadalajara

Hybrid

MXN 1,103,000 - 1,472,000

Full-time

Posted 2 days ago

Vacancy Description

A leading global IT services provider is looking for a Developer to join their team in Guadalajara, Jalisco. The ideal candidate will have 6 to 10 years of experience with proficiency in Spark, Databricks, and data orchestration. Responsibilities include developing data workflows, managing governance, and collaborating with teams to meet business needs. This role offers a hybrid working model and a competitive benefits package.

Benefits

Competitive benefits and salary package
Ongoing training and development opportunities

Requirements

  • 6 to 10 years of experience as a Developer.
  • Strong expertise in big data technology, specifically Spark and Databricks.
  • Proficiency with cloud storage solutions and orchestration tools.

Responsibilities

  • Develop and optimize data workflows using Spark in Scala.
  • Administer Databricks Unity Catalog to maintain data governance.
  • Utilize Apache Airflow for automating workflows.
  • Manage data storage using Amazon S3 and Amazon Redshift.
  • Write and maintain Python scripts for data processing.

Skills

Spark in Scala
Databricks Unity Catalog Admin
Apache Airflow
Amazon S3
Amazon Redshift
Python
Databricks SQL and Delta Lake
PySpark

Job Description

We’re hiring! At Cognizant, one of the largest companies in the digital sector worldwide, we have an ideal opportunity for you. As a Great Place To Work, we look for people who contribute new ideas and thrive in a dynamic, growing environment. At Cognizant, we promote an inclusive culture where we value different perspectives and provide career growth and development opportunities.

We have an exciting opportunity for an exceptional individual to work supporting one of our clients.

Job Summary

We are seeking a Developer with 6 to 10 years of experience to join our dynamic team in a hybrid work model. The ideal candidate will have expertise in Spark in Scala, Databricks Unity Catalog Admin, Apache Airflow, Amazon S3, Amazon Redshift, Python, and Databricks tools. This role involves developing and optimizing data workflows, ensuring data integrity, and contributing to innovative solutions that drive our company's success.

Responsibilities
  • Develop and optimize data workflows using Spark in Scala to ensure efficient data processing and analysis.
  • Administer Databricks Unity Catalog to maintain data governance and security.
  • Utilize Apache Airflow to orchestrate complex data pipelines and automate workflows.
  • Integrate and manage data storage solutions using Amazon S3 and Amazon Redshift for scalable data management.
  • Collaborate with cross-functional teams to design and implement data solutions that meet business requirements.
  • Write and maintain Python scripts to automate data processing tasks and enhance system functionality.
  • Leverage Databricks SQL to perform complex queries and data transformations for insightful analytics.
  • Implement Databricks Delta Lake to ensure data reliability and consistency across various platforms.
  • Design and execute Databricks Workflows to streamline data operations and improve efficiency.
  • Utilize PySpark to process large datasets and extract meaningful insights for decision-making.
  • Ensure data quality and integrity through rigorous testing and validation processes.
  • Provide technical support and guidance to team members to foster a collaborative work environment.
  • Contribute to the continuous improvement of data processes and methodologies to enhance overall performance.
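Two of the duties above, writing Python scripts for data processing and enforcing data quality through validation, might look like this in miniature. This is an illustrative sketch only: the `id`/`amount` schema and function name are hypothetical, and a real pipeline at this scale would read from Amazon S3 or Amazon Redshift and run under an Airflow DAG rather than in-memory.

```python
import csv
import io

def validate_and_clean(rows):
    """Drop records with a missing key and enforce numeric amounts.

    Hypothetical schema (id, amount) used for illustration only;
    it is not taken from the job listing.
    """
    clean = []
    for row in rows:
        if not row.get("id"):  # reject records missing the primary key
            continue
        try:
            row["amount"] = float(row["amount"])  # enforce a numeric type
        except (TypeError, ValueError):
            continue  # reject non-numeric amounts
        clean.append(row)
    return clean

# A tiny in-memory "extract" step standing in for real cloud storage input.
raw = io.StringIO("id,amount\n1,10.5\n,3.2\n2,abc\n3,7\n")
records = validate_and_clean(list(csv.DictReader(raw)))
print(len(records))  # the empty-id row and the non-numeric row are rejected
```

The same reject-or-normalize pattern scales up naturally to PySpark (e.g. `filter` and `withColumn` over a DataFrame) when the data no longer fits on one machine.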

Qualifications
  • Possess strong expertise in Spark in Scala and Databricks Unity Catalog Admin.
  • Demonstrate proficiency in Apache Airflow and Amazon S3 for data orchestration and storage.
  • Have experience with Amazon Redshift and Python for data management and automation.
  • Show capability in using Databricks SQL and Delta Lake for data analysis and reliability.
  • Exhibit skills in designing Databricks Workflows and utilizing PySpark for data processing.

Why Cognizant?

Advance your career at one of the largest and fastest-growing IT services providers worldwide.

Receive ongoing support, with funded training and development plans.

Have a highly competitive benefits and salary package.

Get the opportunity to work for leading global companies.

We are committed to respecting human rights and to building a better future by helping young minds and the environment.

We invest in people and their wellbeing.

We create conditions for everyone to thrive. We do not discriminate based on race, religion, color, sex, age, disability, nationality, sexual orientation, gender identity or expression, or any other legally protected characteristic.

At Cognizant, we believe that our culture makes us stronger!

Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without distinction of sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.
