
Data Architect

Zortech Solutions, Canada (Remote)
CAD 125,000 - 150,000, Full time
Posted 30+ days ago


Job summary

Join a forward-thinking company as a Data Architect specializing in Google Cloud Platform. This exciting role involves building robust data pipelines, collaborating with diverse teams, and ensuring data security and compliance. You will leverage your expertise in SQL, Python, and cloud technologies to drive data transformation and automation. If you're passionate about innovative data solutions and want to make a significant impact in the health sector, this is the perfect opportunity for you. Embrace the challenge of modern data architecture and contribute to cutting-edge projects in a dynamic environment.

Qualifications

  • Experience building data pipelines and working with SQL and NoSQL.
  • Proficient in cloud-native technologies and data governance.
  • Strong background in automation and CI/CD practices.

Responsibilities

  • Build and maintain data pipelines for optimal data extraction and transformation.
  • Collaborate with stakeholders to support data infrastructure needs.
  • Ensure data security and compliance with corporate policies.

Skills

SQL
Python
Data Transformation
Data Governance
Automation
Frontend Development
Cloud Technologies
CI/CD Best Practices
YAML
JSON

Education

Bachelor's degree in Computer Science or related field
MBA or relevant advanced degree

Tools

Google Cloud Platform
AWS
Airflow
Docker
Kubernetes
Terraform
Looker
PowerBI

Job description


Role: Data Architect-GCP

Location: Remote-Canada

Duration: 6-12+ Months

Job Description:

Qualifications

What you bring:

  • Build data pipelines required for optimal extraction, anonymization, and transformation of data from a wide variety of data sources using SQL, NoSQL, and AWS ‘big data’ technologies (see the illustrative sketch after this list).
  • Support both streaming and batch data pipelines.
  • Work with stakeholders, including Product Owners, Developers, and Data Scientists, to assist with data-related technical issues and support their data infrastructure needs.
  • Ensure that data is secure and separated in accordance with corporate compliance and data governance policies.
  • Take ownership of existing ETL scripts; maintain them and rewrite them in modern data transformation tools whenever needed.
  • Act as an automation advocate for data transformation, cleaning, and reporting tools.
  • You are proficient in developing software from idea to production.
  • You can write automated test suites in your preferred language.
  • You have frontend development experience with frameworks such as React.js or Angular.
  • You have experience with cloud-native technologies such as Cloud Composer, Dataflow, Dataproc, BigQuery, GKE, Cloud Run, Docker, Kubernetes, and Terraform.
  • You have used cloud platforms such as Google Cloud or AWS for application hosting.
  • You have used and understand CI/CD best practices with tools such as GitHub Actions and GCP Cloud Build.
  • You have experience with YAML and JSON for configuration.
  • You are up to date on the latest trends in AI technology.
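
For illustration only: the pipeline work described in the first bullet might look, in rough outline, like the Airflow sketch below. The DAG name, sample rows, and hashing-based anonymization are assumptions made for this example, not requirements taken from the posting.

```python
# Illustrative only: a minimal Airflow DAG sketching extract -> anonymize -> load.
# Task names, sample data, and the hashing approach are assumptions for this example.
import hashlib
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(ti):
    # Stand-in for pulling rows from one of the "wide variety of data sources".
    rows = [{"email": "alice@example.com", "visits": 3},
            {"email": "bob@example.com", "visits": 5}]
    ti.xcom_push(key="rows", value=rows)


def anonymize(ti):
    rows = ti.xcom_pull(key="rows", task_ids="extract")
    # Replace direct identifiers with a one-way hash before any downstream use.
    for row in rows:
        row["email"] = hashlib.sha256(row["email"].encode()).hexdigest()
    ti.xcom_push(key="rows", value=rows)


def load(ti):
    rows = ti.xcom_pull(key="rows", task_ids="anonymize")
    # A real pipeline would write to a warehouse such as BigQuery; here we just print.
    print(rows)


with DAG(
    dag_id="example_anonymizing_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_anonymize = PythonOperator(task_id="anonymize", python_callable=anonymize)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_anonymize >> t_load
```

A common design choice in this kind of pipeline is to anonymize as close to extraction as possible, so raw identifiers never reach downstream storage or reporting layers.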

Great-to-haves:

  • 3+ years of experience as a data or software architect.
  • 3+ years of experience in SQL and Python.
  • 2+ years of experience with ELT/ETL platforms (Airflow, DBT, Apache Beam, PySpark, Airbyte).
  • 2+ years of experience with BI reporting tools (Looker, Metabase, Quicksight, PowerBI, Tableau).
  • Extensive knowledge of the Google Cloud Platform, specifically the Google Kubernetes Engine.
  • Experience with GCP cloud data services (Dataflow, GCS, Datastream, Data Fusion, Data Application, BigQuery, Dataproc, Dataplex, Pub/Sub, Cloud SQL, Bigtable).
  • Experience in the health industry is an asset.
  • Expertise in Python and Java.
  • Interest in PaLM, LLM usage, and LLMOps.
  • Familiarity with LangFuse or Backstage plugins or GitHub Actions.
  • Strong experience with GitHub beyond source control.
  • Familiarity with monitoring, alerts, and logging solutions.
Seniority level
  • Mid-Senior level
Employment type
  • Contract
Job function
  • Information Technology
Industries
  • Hospitals and Health Care, Public Health, and Health and Human Services

Similar jobs

  • Cloud Data Architect (OSDU), Hays, Remote, CAD 110,000 - 150,000, 3 days ago
  • Databricks Data Architect, Deloitte Global Technology, Deloitte Canada, Toronto (Remote), CAD 85,000 - 156,000, 5 days ago
  • Cloud Solutions Architect - Alliances, Canonical, Toronto (Remote), CAD 100,000 - 130,000, 4 days ago
  • Cloud Solutions Architect - Alliances, Canonical, Vancouver (Remote), CAD 100,000 - 130,000, 3 days ago
  • Senior Data Architect FCC, Fidelity Canada, Toronto (Remote), CAD 90,000 - 140,000, 20 days ago
  • Cloud Solutions Architect - Alliances, Canonical, Gatineau (Remote), CAD 100,000 - 130,000, 4 days ago
  • Senior Data Solutions Architect, OpsGuru, Calgary (Remote), CAD 100,000 - 130,000, 2 days ago
  • AWS Data Architect with Databricks, Ampstek, Remote, CAD 110,000 - 130,000, 20 days ago
  • Senior Data Architect FCC, Fidelity Canada, Toronto (Remote), CAD 80,000 - 130,000, 30+ days ago