About Us
At Nextlane, we don’t just develop software solutions – we create the future of the automotive industry.
We are a company that combines advanced technology with a clear vision: simplifying and digitizing every step of the automotive customer journey, empowering manufacturers and dealerships to thrive in a constantly evolving market.
We believe in the value of every team member, offering opportunities for you to develop and contribute to meaningful solutions.
So… What does it mean to be a #Nextlaner?
- Be part of a growth-oriented culture.
- Collaborate with colleagues from all over the world.
- Believe in the power of ideas and the diversity of thought.
- Work in an environment where you can learn, grow, and collaborate on projects that make a global impact.
Our success is measured not just by results, but also by the growth and satisfaction of those who are part of our company.
At Nextlane, you’ll have the opportunity to innovate, push boundaries, and work on solutions that are transforming the automotive world.
Your Responsibilities:
- Collaborating in the design and implementation of next-generation data platforms based on Data Mesh architecture principles.
- Leading the development of data ingestion pipelines (batch and streaming) for structured and unstructured data sources.
- Building scalable, high-performance data storage solutions using AWS services such as S3, Glue, Redshift, and Athena.
- Developing ETL/ELT processes using AWS Glue, Apache Spark, and Databricks to support data processing at scale.
- Optimizing data systems for performance and cost using AWS best practices, such as partitioning, compression, and caching.
- Implementing robust security controls using AWS Lake Formation and IAM policies to ensure data privacy and compliance with GDPR and standards such as ISO 27001.
What We're Looking For:
- Experience: 6+ years of hands-on data engineering experience, preferably building large-scale data platforms in AWS.
- Languages: Proficiency in English (the interview will be conducted in English).
- Communication: Strong communication skills, both verbal and written.
- Technical Skills:
  - Proficiency in AWS data services (Glue, S3, Redshift, Kinesis, Athena).
  - Advanced knowledge of Data Mesh architecture and domain-driven data products.
  - Experience with data ingestion pipelines (batch and streaming).
  - Expertise in data formats such as Parquet, ORC, and Iceberg.
  - Proficiency in Apache Spark (PySpark) and distributed SQL engines like Presto.
  - Experience with orchestration tools like AWS Step Functions or Airflow.
- Interpersonal Skills: Ability to collaborate in a diverse and dynamic environment.