Responsibilities
- Design and build optimized data pipelines in on-prem or cloud environments to drive analytic insights.
- Create conceptual, logical, and physical data models.
- Architect and implement end-to-end data pipelines and data integration solutions, with optimized ETL, on cloud platforms such as AWS and GCP.
- Build the infrastructure required for the extraction, transformation, and loading of data using technologies such as Hadoop, Spark, or AWS Lambda.
- Lead solution design and development of web-based data applications, providing technical direction on front-end frameworks such as React.js, Vue.js, and SmartGWT to build interactive interfaces that support analytics and data exploration.
- Design, develop, test, deploy, maintain, and improve data integration pipelines.
- Implement and manage CI/CD pipelines using Jenkins or GitHub Actions to automate development, testing, and deployment of data engineering solutions.
- Guide the adoption of development best practices including version control (Git), CI/CD pipelines, automated testing, and performance tuning.
- Develop pipeline components using Apache Spark (PySpark or Scala) or Python.
- Develop and maintain RESTful APIs and backend services to support data processing workflows and integration between data pipelines and user-facing systems.
- Lead and/or mentor a small team of Data Engineers.
- Communicate effectively with client leadership and business stakeholders.
- Participate in proposal and/or SOW development.
Qualifications
- Bachelor's Degree in Computer Science, Software Engineering, Data Analytics, Information Technology, or a related field, plus 4 years of experience as a Data Engineer, Applications Developer, Programmer, or Technical Lead, or in a related occupation.
- 4 years of experience in web-based and mobile application development using React, Vue, Node.js, REST-based web services, Python, and Java; programming experience using Scala, Python, or Java.
- 3 years of experience with Snowflake, AWS Redshift, Oracle, or SQL; ability to write complex, highly optimized SQL queries; experience implementing ETL pipelines using AWS Lambda, S3, API Gateway, SNS, or SQS.
Telecommuting is permitted. Applicants must be authorized to work permanently in the U.S. Those interested in this position may apply at www.jobpostingtoday.com (Ref #95698) for consideration.
Benefits
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short-Term & Long-Term Disability
- Employee Assistance Program
- Training & Development
- Work From Home
- Bonus Program
$130,000 - $150,000 a year