Main focus of the role
We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation for a telecoms analytics platform we are taking to market. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.
Your primary focus will be to gather, clarify, and deeply understand business and reporting requirements—often expressed in SQL, Excel, or ad hoc analysis. You will work closely with business stakeholders, analysts, and the data architecture team to translate these requirements into clear, actionable technical specifications.
Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented—typically in PySpark/SparkSQL-based data pipelines—by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business.
This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.
What you’ll do
- Customer Engagement & Requirements Gathering
  - Engage directly with customers to understand their business needs and data environments.
  - Elicit, document, and validate functional and non-functional requirements.
  - Conduct workshops and interviews with client stakeholders to capture use cases and success criteria.
- Data Analysis & Integration Design
  - Analyze complex customer data sets, schemas, and data flows to assess integration needs.
  - Collaborate with data engineers and developers to design effective data ingestion, transformation, and mapping processes.
  - Validate data quality, completeness, and alignment with business requirements.
- Technical Collaboration & Delivery
  - Support the development of scalable data integration and transformation pipelines by assisting with coding, testing, and implementing solutions—primarily using PySpark, SparkSQL, and Python.
  - Translate business and analytical requirements into clear, actionable technical specifications to guide the engineering team’s implementation.
  - Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
  - Help automate and maintain data validation and quality checks to ensure reliability and accuracy of analytics outputs.
  - Collaborate closely with the FP&A team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate that outputs support business decision-making.
  - Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
  - Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
- Product & Analytics Alignment
  - Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
  - Provide feedback on product improvements based on client needs and data insights.
  - Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.
To thrive in this role, you’ll need
- 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, and SQL.
- Proven ability to build and support data integration pipelines (ETL, dataflows, APIs) at scale.
- Ability to translate complex technical concepts into business language and vice versa.
- Strong experience with relational databases and data modelling; experience with modern analytics platforms (Databricks, Delta Lake, cloud storage) is a plus.
- Track record of translating business logic and requirements into production-grade, testable code.
- Experience with data analytics tools and platforms (e.g., Power BI, Superset, Tableau, Looker, Grafana).
- Solid grasp of data quality, data validation, and monitoring concepts.
- Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
- Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.
Nice to have
- Experience in the telecoms industry.
- Familiarity with the full software development lifecycle.
- Infrastructure-as-code experience (Terraform and Pulumi).
- Experience working in a scale-up/dynamic consulting environment.
- Exposure to accounting concepts.
Benefits
- Working
  - Fully remote working, or from the office with daily lunch
  - Flexible working hours
  - High-spec Dell laptop
  - Money towards a keyboard of your choice, yours to keep
  - Insurance, fully paid on top of your salary rather than out of it:
    - Medical Aid, including Gap Cover
    - Life Insurance, with Disability Insurance and Funeral cover
- Learning
  - Learning Budget - Books or Courses - you choose how to use it
- Culture
  - People-first culture that encourages work/life balance
  - Everyone has a voice, regardless of title
  - Psychological safety
- Leave
  - 20 days annual leave
  - Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
- Long-Term Loyalty Benefits
  - 2 years – monthly budget towards a cell phone contract, Uber travel vouchers or a petrol card
  - 3 years – can apply for a study bursary to enhance your current and future role with us
  - 5 years – 3 additional days of annual leave
  - 7 years – a local weekend away at our cost
  - 10 years – a 3-month paid sabbatical
Interested?
Send your CV to jobs@structureit.net