Data Engineer - Remote / Telecommute
Cynet Systems Inc
Edina (MN)
Remote
USD 100,000 - 130,000
Full time
Job summary
A leading company in Edina is seeking a skilled professional to drive solution adoption and implementation for complex projects. The ideal candidate will have extensive experience in data management and cloud solutions, strong communication skills, and a record of working collaboratively in diverse teams. This role requires expertise in Python and a range of data technologies to deliver successful project outcomes and improved business intelligence solutions.
Qualifications
- 8 years of relevant experience in technology solutions.
- 5+ years of DataOps and DevOps experience.
- 3-5 years in designing cloud solutions.
Responsibilities
- Drives solution adoption for complex portfolios.
- Designs solutions for large-scale initiatives.
- Responsible for improving performance and compliance.
Skills
Python
Communication
Data Management
Process Improvement
Education
Bachelor's degree
Master's degree
Tools
Hadoop
Jenkins
GitHub
Oracle
SQL
Snowflake
Job Description:
- Drives successful solution adoption and implementation for medium to complex portfolios.
- Works on several large and enterprise-wide projects.
- Participates in strategy design and leads initiatives.
- Designs solutions for large-scale initiatives.
- Has intermediate to advanced skills in Python and in deep learning frameworks such as PyTorch and TensorFlow.
- Is an expert in working with large databases, BI applications, data quality, and performance tuning.
- Has expert knowledge of developing end-to-end business intelligence solutions: data modeling, ETL, and reporting.
- Has a deep understanding of data gathering, inspecting, cleansing, transforming, and modeling techniques.
- Has a deep understanding of and experience with microservices architecture.
- May act as an escalation point for others.
- Has outstanding written and verbal communication skills.
- Identifies and drives process improvement.
- Responsible for improving availability, security, compliance, interoperability, performance, and reengineering activities.
- Grows into the role of a recognized subject matter expert in one or more functions.
- Has excellent communication skills, with the ability to work both independently and in broad, geographically dispersed teams.
Qualifications:
- Bachelor's degree required; Master's degree preferred.
- 8 years of relevant experience, including work with technology solutions such as Java, Big Data technologies, and data management tools.
- 5+ years of DataOps and DevOps experience, building solutions using Hadoop technologies (Pig, Spark, Kafka), Python, and version-controlled CI/CD pipelines using tools like Jenkins and GitHub.
- 3-5 years of experience designing, developing, and implementing Google Cloud and AWS solutions.
- 5+ years of experience with relational database concepts, including star schema, Oracle, SQL, PL/SQL, SQL tuning, OLAP, Big Data technologies, Snowflake, and Apache NiFi.
- 5+ years of experience in building data pipelines in data lake setups.
- 3 years of experience architecting, designing, and implementing enterprise-scale projects/products and data management solutions.
- 3-4 years of experience with Cerner, Epic, and Lawson systems.
- 5 years of experience in secure data engineering and scripting (Shell, Python).
- 5 years of experience in healthcare data, building clinical and non-clinical solutions that drive patient outcomes.