Senior PySpark/Python Developer

Zorba Consulting India Pvt. Ltd.

United States

Remote

USD 120,000 - 160,000

Full time

2 days ago

Job summary

A leading consulting firm is seeking a Senior PySpark/Python Developer to build robust systems for managing customer notifications regarding planned power outages. The role involves developing data processing pipelines, collaborating with teams, and addressing complex data challenges. Ideal candidates will have extensive experience in data engineering, particularly with PySpark and Python, and a proactive mindset to navigate ambiguity.

Qualifications

  • Minimum of 8 years in software development with a focus on data engineering.
  • Extensive experience with PySpark for large-scale data processing.

Responsibilities

  • Design and maintain scalable data processing pipelines using PySpark and Python.
  • Collaborate with cross-functional teams and drive technical challenges to resolution.

Skills

Data Engineering
Problem Solving
Communication
Collaboration
Adaptability

Tools

PySpark
Python
Palantir Foundry
AWS
Azure
GCP

Job description

About The Role:

We are seeking a highly skilled and experienced Senior PySpark/Python Developer to play a critical role in building a robust and reliable system for managing and disseminating customer notifications regarding PG&E's Planned Power Outages (PPOs). This is an exciting opportunity to tackle complex data challenges within a dynamic environment and contribute directly to improving customer communication and experience.

As a key member of the team, you will be responsible for developing and implementing data processing pipelines that can ingest, transform, and synthesize information from various backend systems related to PPOs. Your primary goal will be to "eat ambiguity and excrete certainty" by taking complex, ever-changing data and producing clear, consistent, and timely notifications for PG&E customers.

This role requires a strong individual contributor who can execute tasks within a defined scope while also demonstrating leadership qualities such as adaptability, continuous learning, and ownership. You should be comfortable navigating ambiguous situations, proactively identifying solutions, and providing direction even when clear, pre-defined solutions are not immediately apparent.
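
For illustration only, the sketch below shows the general shape of such a consolidation pipeline in PySpark: ingest outage records from two backend systems, keep the most recent record per outage, and join to affected customers to produce one notification row each. All dataset paths and column names (outage_id, window_start, window_end, updated_at, customer_id) are hypothetical placeholders; the posting does not specify the actual source systems, schemas, or notification rules.

```python
# Illustrative sketch only. Paths and columns are hypothetical placeholders,
# not PG&E's actual schemas.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("ppo-notification-sketch").getOrCreate()

# Ingest planned-outage records from two (hypothetical) backend systems.
work_requests = spark.read.parquet("/data/raw/work_requests")
switching_plans = spark.read.parquet("/data/raw/switching_plans")

# Normalize to a common shape, then union the disparate sources.
outages = (
    work_requests.select("outage_id", "window_start", "window_end", "updated_at")
    .unionByName(
        switching_plans.select("outage_id", "window_start", "window_end", "updated_at")
    )
)

# Keep only the most recent record per outage so customers receive one
# consistent, up-to-date notification instead of conflicting messages.
latest = Window.partitionBy("outage_id").orderBy(F.col("updated_at").desc())
current_outages = (
    outages.withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Join to affected customers and emit one notification row per customer.
customers = spark.read.parquet("/data/raw/outage_customers")  # outage_id, customer_id
notifications = current_outages.join(customers, "outage_id").select(
    "customer_id", "outage_id", "window_start", "window_end"
)

notifications.write.mode("overwrite").parquet("/data/curated/ppo_notifications")
```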

Responsibilities:

- Design, develop, and maintain scalable and efficient data processing pipelines using PySpark and Python.
- Leverage your expertise in data engineering principles to build robust and reliable solutions.
- Work with complex and dynamic datasets related to work requests and planned power outages.
- Apply your knowledge of Palantir Foundry to integrate with existing data infrastructure and potentially build new modules or workflows.
- Develop solutions to consolidate and finalize information about planned power outages from disparate systems.
- Implement logic to ensure accurate and timely notifications are generated for customers, minimizing confusion and inconsistencies.
- Identify and address edge cases and special circumstances within the PPO process.
- Collaborate effectively with cross-functional teams, including business analysts, product owners, and other engineers.
- Take ownership of technical challenges and drive them to resolution.
- Proactively learn and adapt to new technologies and evolving requirements.
- Contribute to the development of technical documentation and best practices.

Required Skills And Experience:

- Minimum of 8 years of overall experience in software development with a strong focus on data engineering.
- Extensive and demonstrable experience with PySpark for large-scale data processing.
- Strong proficiency in Python and its relevant data engineering libraries.
- Hands-on experience with Palantir Foundry and its core functionalities (e.g., Ontology, Pipelines, Actions, Contour); a minimal illustrative sketch follows this list.
- Solid understanding of data modeling, data warehousing concepts, and ETL/ELT processes.
- Experience working with complex and high-volume datasets.
- Ability to write clean, efficient, and well-documented code.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage priorities effectively in a remote setting.
- Demonstrated ability to take ownership and drive tasks to completion.
- Comfort navigating ambiguous situations and proposing solutions.
- A proactive and continuous learning mindset.
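
As an illustration of the Foundry experience listed above, the sketch below registers a PySpark transform with Foundry's Python transforms API (transforms.api). The dataset paths and columns (window_start, last_notified_window_start) are hypothetical placeholders, and the logic is only an example of the kind of re-notification check the role describes, not the team's actual implementation.

```python
# Illustrative Foundry transform sketch; dataset paths and columns are
# hypothetical placeholders, not actual project resources.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Project/curated/ppo_outages_to_renotify"),   # hypothetical output dataset
    raw_outages=Input("/Project/raw/ppo_outage_events"),  # hypothetical input dataset
)
def compute(raw_outages):
    # Flag outages whose planned start time changed since the last notification
    # so a corrected notification can be sent downstream.
    return raw_outages.withColumn(
        "needs_renotification",
        F.col("window_start") != F.col("last_notified_window_start"),
    )
```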

Nice To Have:

- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Familiarity with data visualization tools.
- Understanding of notification systems and best practices.
- Prior experience in the utilities or energy sector.
