Lead Data Engineer at Ottomatik.io
Hi there! We are South and our client is looking for a Lead Data Engineer.
Note To Applicants
- Eligibility: This position is open to candidates residing in Latin America.
- Application Language: Please submit your CV in English. Applications submitted in other languages will not be considered.
- Professional Presentation: We encourage you to showcase your professional experience by including a Loom video in the application form. While optional, candidates who provide a video presentation will be prioritized.
Duties & Responsibilities
- Developing and maintaining data pipelines that support data programs, services, process optimization, and business intelligence.
- Designing and developing scalable data warehousing solutions and ETL pipelines in Big Data environments (cloud, on-prem, hybrid).
- Leading data discovery sessions with business teams to understand data requirements for analytics projects.
- Ensuring proper testing and monitoring for trusted and reliable data.
- Partnering with domain experts and development teams to align data design with business strategy.
- Documenting methodologies, standards, and architecture guidelines.
- Assisting Business Intelligence Engineers with technical hurdles.
- Participating in a shared on-call rotation to monitor system health.
Essential Knowledge, Skills, and Abilities
- 3+ years in data architecture, analysis, modeling, and integration.
- 5+ years in custom ETL design, implementation, and maintenance.
- Advanced knowledge of programming languages such as Python, including object-oriented programming (OOP).
- Hands-on experience with SQL database design.
- Experience with cloud platforms (AWS or Google Cloud).
- Experience with CI/CD and source control tools such as GitHub and GitLab.
- Experience with cloud data warehouses such as Snowflake and Redshift.
- Knowledge of orchestration tools (preferably Airflow), storage systems (AWS S3, Google Cloud Storage), and real-time data processing.
- Familiarity with Agile/Scrum methodologies.
- Excellent communication and interpersonal skills.
- Bachelor’s or Master’s Degree in Computer Science, Information Systems, or related field.
Bonus Points
- Experience with reporting tools like Tableau.
- Experience with event streaming tools like Kafka or Kinesis.
- Experience with subscription service products.
- Knowledge of accounting, FP&A, and marketing functions.
Schedule: Monday to Friday, 7:30 or 8:00 AM to 4:00 or 4:30 PM PST.
Compensation: USD 4,500 – 5,000/month.
Location: 100% remote.
If interested, send us your resume!