
Data Infrastructure Engineer ID35383

JobFinder Spain

Rio de Janeiro

Hybrid

USD 60,000 - 100,000

Full-time

6 days ago

Job summary

Join a forward-thinking company that excels in creating custom software solutions across various industries. This role offers the opportunity to architect and maintain robust data analytics pipelines while collaborating with diverse teams to drive impactful projects. With a focus on professional growth and competitive compensation, you will be encouraged to innovate and experiment in a supportive environment. Enjoy the flexibility of a tailored work schedule that promotes work-life balance, whether working from home or in the office. If you thrive in a challenging environment and are passionate about data analytics, this position is perfect for you.

Benefits

Professional Growth Opportunities
Competitive Compensation
Flexible Working Hours
Education Budget
Fitness Budget
Team Activities Budget

Qualifications

  • 5+ years of engineering and data analytics experience.
  • Strong SQL and Python/Scala skills for complex data analysis.
  • Experience with data pipeline and warehouse tools.

Responsibilities

  • Architect and maintain data analytics pipelines.
  • Implement data ingestion integrations for various sources.
  • Work with cross-functional teams to define requirements.

Skills

SQL
Python
Scala
Data Analytics
Automation Tooling
Data Governance
Cloud Data Lakes
Real-time Data Streaming

Tools

Snowflake
Databricks
Spark
AWS Glue
DBT
Airflow
Terraform
Kubernetes

Job description

AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.

If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place - guaranteed! :)

WHAT YOU WILL DO

- Architect, build, and maintain modern and robust real-time and batch data analytics pipelines (a minimal batch-orchestration sketch follows this list);

- Develop and maintain declarative data models and transformations;

- Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;

- Deploy and configure BI tooling for data analysis;

- Work closely with product, finance, legal, and compliance teams to build dashboards and reports to support business operations, regulatory obligations, and customer needs;

- Establish, communicate, and enforce data governance policies;

- Document and share best practices with regard to schema management, data integrity, availability, and security;

- Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;

- Identify and communicate data platform needs, including additional tooling and staffing;

- Work with cross-functional teams to define requirements, plan projects, and execute on the plan.
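
To make the batch side of this concrete, here is a minimal sketch of an orchestrated ingestion-and-transformation pipeline, assuming Airflow 2.x (one of the tools listed for the role); the DAG id, task names, and the orders table are hypothetical placeholders rather than anything specified in the posting.

```python
# Hypothetical sketch only: a daily batch pipeline that extracts from a source
# (e.g., Postgres) and then runs a transformation step. Assumes Airflow 2.4+;
# all DAG, task, and table names are made up for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder for incremental extraction from a source such as Postgres.
    print("extracting orders updated on", context["ds"])


def transform_orders(**context):
    # Placeholder for a declarative transformation step (e.g., a dbt run).
    print("rebuilding the orders_daily model for", context["ds"])


with DAG(
    dag_id="orders_batch_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)

    extract >> transform  # run the transformation only after extraction succeeds
```

In practice a real-time path (e.g., Kafka consumers feeding the same models) would sit alongside this, but the shape of the orchestration stays the same: small, idempotent tasks with explicit dependencies.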

MUST HAVES

- 5+ years of engineering and data analytics experience;

- Strong SQL and Python/Scala skills for complex data analysis;

- Hands-on experience building automation tooling and pipelines using Python, Scala, Go, or TypeScript (see the freshness-check sketch after this list);

- Experience with modern data pipeline and warehouse tools (e.g., Snowflake, Databricks, Spark, AWS Glue);

- Proficiency with declarative data modeling and transformation tools (e.g., DBT, SqlMesh);

- Familiarity with real-time data streaming (e.g., Kafka, Spark);

- Experience configuring and maintaining data orchestration platforms (e.g., Airflow);

- Background working with cloud-based data lakes and secure data practices;

- Ability to work autonomously and drive projects end-to-end;

- Strong bias for simplicity, speed, and avoiding overengineering.
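
As an illustration of the lightweight automation tooling called out above, here is a small, self-contained Python sketch of a data-freshness check; the load_audit table, its columns, and the sqlite3 in-memory connection are assumptions standing in for a real warehouse client (Snowflake, Databricks, etc.).

```python
# Hypothetical sketch only: a freshness check for warehouse tables, driven by a
# load_audit table that records when each table was last loaded. sqlite3 is a
# stand-in here; a real version would use the warehouse's own connector.
import sqlite3
from datetime import datetime, timedelta, timezone


def latest_load_time(conn, table: str) -> datetime:
    """Return the most recent load timestamp recorded for a table (ISO 8601, UTC)."""
    row = conn.execute(
        "SELECT MAX(loaded_at) FROM load_audit WHERE table_name = ?", (table,)
    ).fetchone()
    return datetime.fromisoformat(row[0])


def check_freshness(conn, table: str, max_lag: timedelta) -> bool:
    """Report whether the table's last load is within the allowed lag."""
    lag = datetime.now(timezone.utc) - latest_load_time(conn, table)
    if lag > max_lag:
        print(f"STALE: {table} last loaded {lag} ago (allowed {max_lag})")
        return False
    print(f"OK: {table} loaded {lag} ago")
    return True


if __name__ == "__main__":
    # In-memory demo data; real tooling would point at the warehouse's audit metadata.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE load_audit (table_name TEXT, loaded_at TEXT)")
    conn.execute(
        "INSERT INTO load_audit VALUES ('orders_daily', ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    check_freshness(conn, "orders_daily", max_lag=timedelta(hours=24))
```

A check like this would typically be scheduled in the same orchestrator (e.g., Airflow) rather than run by hand.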

NICE TO HAVES

- Experience with infrastructure-as-code tools (e.g., Terraform);

- Familiarity with container orchestration (e.g., Kubernetes);

- Prior experience managing external data vendors;

- Background working cross-functionally with compliance, legal, and finance teams;

- Experience driving company-wide data governance or permissioning frameworks.

THE BENEFITS OF JOINING US

- Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.

- Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.

- A selection of exciting projects: Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.

- Flextime: Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office, whatever makes you the happiest and most productive.

Your application doesn't end here! To unlock the next steps, check your email and complete your registration on our Applicant Site. Incomplete registration will result in the termination of your application process.
