Post-Doctoral Fellow - Computer Science (Cybersecurity)

Ontario Tech University

Oshawa

On-site

CAD 60,000 - 80,000

Full time

Job summary

A leading Canadian university is seeking a Post-Doctoral Fellow in Cybersecurity to work on advanced research involving large language models and malware. This role offers the opportunity to contribute to groundbreaking projects that impact AI and security. Applicants should have a Ph.D. in Computer Science, and strong experience in LLMs and malware analysis is required.

Benefits

Opportunity for research impact
Equity, diversity, and inclusion initiatives
Support for professional development

Qualifications

  • Strong experience in LLM training and fine-tuning.
  • Solid Python programming skills for experiment implementation.
  • Deep understanding of malware and obfuscation techniques.

Responsibilities

  • Prepare datasets and fine-tune LLMs for code mutation.
  • Conduct controlled experiments for detection metrics.
  • Automate validation pipelines for malware detection.

Skills

Training and fine-tuning LLMs
Proficient with PyTorch
Software engineering
Knowledge of malware concepts
Research and technical writing

Education

Ph.D. in Computer Science

Tools

PyTorch
Hugging Face

Job description

Post-Doctoral Fellow - Computer Science (Cybersecurity)

Tracking Code: req1835

Faculty of Business and Information Technology

Number of Positions: 1

Appointment Type: Limited Term

Salary Grade: Administered in accordance with the Collective Agreement

Posting Date: November 19, 2025

Closing Date: December 9, 2025 (7:00 pm EST)

The Faculty of Business and Information Technology at Ontario Tech University has an opening for a Postdoctoral Fellow (PDF) in the area of Cybersecurity and AI. The position is offered for 12 months, preferably starting on January 1, 2026, subject to funding approval, and may be extended contingent upon research performance.

This research program investigates the use of large language models (LLMs) with code-synthesis capability to automate metamorphic malware generation. The goal is to understand how modern foundation models, when fine-tuned on malware code corpora, can learn transformation rules that produce functionally equivalent variants capable of bypassing static and signature-based detection.

The broader objective is to provide defensible, repeatable measurement of how AI-assisted metamorphic code generation may shift the offensive-defensive balance, and to surface threat-relevant insights for defensive counter-design (e.g., more semantics-aware detection and de-obfuscation strategies).

The position will be supervised by Dr. Pooria Madani.

Responsibilities/Accountabilities
  • Dataset preparation and domain-specific fine-tuning of LLMs for code mutation
  • Model compression/quantization to make such models small enough for endpoint-size deployment scenarios (threat modelling angle)
  • Formal and empirical verification that mutated samples preserve semantics (correctness safety checks; see the differential-testing sketch after this list)
  • Automated validation pipelines to measure detection evasion against commercial AV / EDR engines
  • Controlled experimentation to quantify detection degradation after each class of code-level metamorphic transformation
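
The semantic-preservation check referenced above can take many forms; purely as an illustration (not the project's actual harness), the sketch below differentially tests an original and a transformed implementation of the same pure function on randomly sampled inputs and reports whether their outputs ever diverge. Both functions and the input generator are hypothetical placeholders.

```python
# Minimal differential-testing sketch (hypothetical example, not the project's harness):
# confirm that a transformed variant of a pure function matches the original's
# outputs on a sampled set of inputs.
import random


def original_checksum(data: bytes) -> int:
    # Reference implementation: simple rolling checksum, reduced modulo 2**32.
    acc = 0
    for b in data:
        acc = (acc * 31 + b) % (1 << 32)
    return acc


def transformed_checksum(data: bytes) -> int:
    # Syntactically different, intended-to-be-equivalent variant of the same function.
    acc = 0
    idx = 0
    while idx < len(data):
        acc = (acc * 31 + data[idx]) & 0xFFFFFFFF  # same reduction, written as a mask
        idx += 1
    return acc


def behaviours_match(trials: int = 1000) -> bool:
    # Compare the two implementations on randomly generated byte strings.
    rng = random.Random(0)
    for _ in range(trials):
        payload = bytes(rng.randrange(256) for _ in range(rng.randrange(64)))
        if original_checksum(payload) != transformed_checksum(payload):
            return False
    return True


if __name__ == "__main__":
    print("semantics preserved on sampled inputs:", behaviours_match())
```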
Qualifications
Required Qualifications
  • Strong experience training and fine-tuning large language models (LLMs), including prompt engineering and instruction tuning.
  • Proficient with PyTorch and the Hugging Face ecosystem (Transformers, Datasets, Accelerate, Trainer or custom training loops); a minimal fine-tuning sketch follows this list.
  • Solid software engineering skills for experiment implementation (Python, reproducible scripts, CI-friendly workflows).
  • Practical knowledge of code-synthesis models and code representation/transformations.
  • Deep understanding of malware concepts, especially metamorphic/obfuscation techniques and code-level mutation strategies.
  • Experience with malware sandboxing and safe experimentation (dynamic analysis, instrumentation, VM/snapshot workflows).
  • Strong research & technical writing skills (papers, reproducible experiment descriptions, responsible disclosure).
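
As context for the PyTorch/Hugging Face bullet above, here is a minimal, hedged sketch of a causal-LM fine-tuning loop with Transformers and Datasets. The base model ("gpt2"), the public dataset, and the hyperparameters are illustrative placeholders only; in practice a code-oriented model and the project's own corpora would be used.

```python
# Minimal causal-LM fine-tuning sketch with Hugging Face Transformers/Datasets.
# Model name, dataset, and hyperparameters are illustrative placeholders only.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # placeholder; a code-oriented base model would be used in practice
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Any text/code dataset with a "text" column works for this sketch.
raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)


tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)
tokenized = tokenized.filter(lambda ex: len(ex["input_ids"]) > 0)  # drop empty rows

# Standard causal-LM collator (no masked-LM objective).
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetune-demo",
    per_device_train_batch_size=2,
    num_train_epochs=1,
    logging_steps=50,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator)
trainer.train()
```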
Preferred Qualifications
  • Experience with model compression/quantization (distillation, pruning, INT8/4 quant workflows); see the quantization sketch after this list.
  • Familiarity with automated evaluation against AV/EDR products and red-team/blue-team methodologies.
  • Background in formal verification or semantic-preserving program transformations.
  • Prior publications or open-source contributions in ML for code, adversarial ML, or malware research.
  • Awareness of ethics, legal, and responsible-research practices for dual-use work (IRB, disclosure, safe-lab protocols).
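
For the compression/quantization bullet above, one common workflow (among others such as distillation or pruning) is loading a checkpoint with 8-bit weights through Transformers and bitsandbytes. The snippet below is a sketch of that pattern only; it uses a placeholder model name and assumes a CUDA-capable GPU with the bitsandbytes and accelerate packages installed.

```python
# Sketch: loading a causal LM with 8-bit weights via Transformers + bitsandbytes.
# Placeholder model name; assumes a CUDA GPU plus the bitsandbytes/accelerate packages.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "gpt2"  # illustrative checkpoint only

quant_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=quant_config,
    device_map="auto",  # let accelerate place the quantized weights
)

# Rough footprint comparison: 8-bit weights need about 1 byte per parameter,
# versus roughly 2 bytes per parameter for fp16.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.1f}M")
print(f"approximate 8-bit weight memory: {n_params / 2**20:.0f} MiB")
```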
Required Education
  • Ph.D. in Computer Science with a focus on Machine Learning and/or Cybersecurity.
Required Experience
  • Hands-on experience training and fine-tuning LLMs (preferably with code-generation tasks).
  • Practical experience with PyTorch and Hugging Face (Transformers + Datasets) for experiment pipelines.
  • Experience programming research experiments end-to-end (data prep, training, evaluation, automation).
  • Prior exposure to malware analysis, including metamorphic malware concepts and sandbox-based testing.
  • Demonstrated research writing experience (papers, technical reports, documentation).

This position falls within the bargaining unit represented by the Public Service Alliance of Canada (PSAC) and will be subject to the terms and conditions of the collective agreement between the University and PSAC. The collective agreement may be found on the Human Resources section of our website. Candidates will be required to certify that they are currently legally eligible to work in Canada for the duration of the contract.

How to Apply

Interested candidates should submit a covering letter and resume in electronic format. Applications will be accepted until December 9, 2025, or until a suitable candidate is found. We appreciate all applications received; however, only those candidates selected for an interview will be contacted.

Ontario Tech University is actively committed to equity, diversity, inclusion, indigenization and decolonization (EDIID), and welcomes applications from all qualified candidates, while especially encouraging applications from First Nations, Métis, Inuit peoples, Indigenous peoples of North America, racialized persons, persons with disabilities, and those who identify as women and/or 2SLGBTQ+. All qualified candidates are encouraged to apply; however, Canadian citizens, permanent residents, Indigenous Peoples in Canada, and those eligible to work in Canada, will be given priority.

Ontario Tech University respects people's different needs and therefore will take all reasonable steps to ensure accommodation for applicants where appropriate. The University is also committed to ensuring that confidentiality is maintained throughout all aspects of the recruitment cycle.

If you require accommodation, please contact Julie Day, Health and Disability Management Specialist.

For more information about the university’s policies for accommodating employees with disabilities please review the university’s Accessibility Policy.

The university acknowledges the lands and people of the Mississaugas of Scugog Island First Nation which is covered under the Williams Treaties. We are situated on the Traditional Territory of the Mississaugas, a branch of the greater Anishinaabeg Nation which includes Algonquin, Ojibway, Odawa and Pottawatomi.

Job Location: Oshawa, Ontario, Canada

Expected Start Date: January 1, 2026

Expected End Date:
