An innovative firm is seeking a skilled data engineer to design and maintain scalable data pipelines using cutting-edge technologies. This role emphasizes building efficient data integration solutions and driving digital innovation across the firm's platforms. You will leverage your expertise in Google Cloud Platform (GCP) technologies, including Vertex AI, to enhance core data assets and implement robust automated testing frameworks. If you are passionate about data and eager to make a significant impact in a dynamic environment, this opportunity is for you.
LOCATION: Open to remote (Cincinnati, Chicago, Charlotte, Boca Raton, or San Jose preferred)
RATE: $70-80/hr
DURATION: 6-month contract-to-hire (CTH)
WORK AUTH: USC/GCH – must be able to convert
TOP SKILLS: GCP, Vertex AI – must have a Google tech stack background
Build and Maintain Data Pipelines: Design, build, and maintain scalable, efficient, and reliable data pipelines to support data ingestion, transformation, and integration across diverse sources and destinations, using tools such as Kafka and Databricks.
Drive Digital Innovation: Leverage innovative technologies and approaches to modernize and extend core data assets, including SQL-based, NoSQL-based, cloud-based, and real-time streaming data platforms.
Implement Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows, utilizing tools like Vertex AI, BigQuery ML, and custom Python libraries.
Implement Automated Testing: Design and implement automated unit, integration, and performance testing frameworks.
SENIORITY LEVEL: Mid-Senior level
EMPLOYMENT TYPE: Contract
INDUSTRIES: Staffing and Recruiting; Retail