
A leading geospatial intelligence firm in Calgary is looking for a Data Integration Engineer to manage the ingestion and transformation of large datasets. You will design and automate processes that ensure data is ready for monetization while collaborating with partner engineering teams. Ideal candidates will have 3–5 years of experience in data engineering and proficiency in Python and SQL. This role offers the chance to innovate in a dynamic environment that values growth and accountability.
Employers often ask why you'd be a good fit to work for them. We prefer to start by showing why we'd be a great fit for you. BigGeo is redefining geospatial intelligence with an AI‑ready Discrete Global Grid System (DGGS) that transforms how spatial data is captured, indexed, and monetized. Our platform powers mission‑critical decisions across sectors where location intelligence drives outcomes, from large‑scale infrastructure projects and environmental planning to logistics and emergency response. We are industry agnostic, unlocking possibilities for organizations that have yet to realize the value a system like ours can deliver.
Joining BigGeo now means helping to architect and accelerate the next phase of growth. Our team is multidisciplinary, entrepreneurial, and built for impact. We work quickly, push boundaries, and expect every team member to be both a thinker and a doer.
If you want to be part of a team that is rewriting what is possible in geospatial intelligence, and you have the drive to build, scale, and innovate, BigGeo is where you can do the most meaningful work of your career.
We are seeking a Data Integration Engineer who will own the process of ingesting, transforming, and operationalizing large-scale third‑party datasets. This role ensures that every new dataset moves from raw source to a sale‑ready format quickly and consistently, while building scalable, repeatable data ingestion processes.