QA Specialist - Minor Safety and Exploitative Content

Discord

San Francisco (CA)

Remote

USD 108,000 - 122,000

Full time

14 days ago

Job summary

Discord is seeking a detail-oriented QA Specialist for Minor Safety and Exploitative Content. This role focuses on ensuring quality moderation decisions to uphold community standards and requires strong analytical and communication skills. Join Discord in making a positive impact in online safety while working with a passionate team dedicated to continuous improvement.

Qualifications

  • 2+ years of experience in quality assurance or content moderation.
  • Deep understanding of minor safety and exploitative content issues.
  • Ability to synthesize large datasets into actionable insights.

Responsibilities

  • Review and audit moderation decisions for minor safety.
  • Collaborate with teams to identify trends in content review.
  • Provide feedback to improve decision-making accuracy.

Skills

Analytical skills
Communication skills
Empathy

Tools

Moderation tools
Data analytics tools

Job description

Discord is used by over 200 million people every month for many different reasons, but there’s one thing that nearly everyone does on our platform: play video games. Over 90% of our users play games, spending a combined 1.5 billion hours playing thousands of unique titles on Discord each month. Discord plays a uniquely important role in the future of gaming. We are focused on making it easier and more fun for people to talk and hang out before, during, and after playing games.

We are looking for a detail-oriented professional with a strong passion for safeguarding vulnerable groups and combating exploitative content online. As a QA Specialist for Minor Safety and Exploitative Content (MSEC) at Discord, you will play a pivotal role in ensuring the accuracy, consistency, and quality of moderation decisions that uphold our community standards. This role reports to the Team Lead for Trust & Safety QA and will partner closely with MSEC. Your approach to quality assurance is rooted in empathy, precision, and a commitment to continuous improvement.

What You'll Be Doing

  • Review and audit moderation decisions related to minor safety and exploitative content to ensure adherence to Discord’s Trust & Safety policies.
  • Collaborate with moderators, analysts, and policy teams to identify trends, gaps, and inconsistencies in content review processes.
  • Provide constructive feedback and actionable insights to moderators to improve decision-making accuracy and maintain policy alignment.
  • Develop and lead calibration sessions for the moderation team based on audit findings and evolving content standards.
  • Partner with MSEC and other cross-functional teams to influence policy updates and improve internal tools and workflows for greater efficiency and scalability.
  • Regularly report on quality trends and metrics, highlighting risks, successes, and opportunities for process improvements.

What You Should Have

  • 2+ years of experience in quality assurance, trust & safety, or content moderation, preferably in a tech or online platform environment.
  • Deep understanding of issues related to minor safety, exploitative content, and global online safety trends.
  • Excellent analytical skills with the ability to synthesize large datasets and translate them into actionable insights.
  • Strong communication skills, both written and verbal, to effectively convey findings and train teams.
  • Familiarity with moderation tools, audit processes, and metrics-driven performance tracking.
  • A calm, resilient demeanor when handling sensitive or potentially distressing content.
  • Ability to flex your expertise to support other QA initiatives, including automation and machine learning, violent and hateful content, cybercrime, and other exploitative content.

Bonus Points

  • Experience working on global teams or in environments that require cultural sensitivity and awareness.
  • Experience with data analytics tools and languages like SQL.
  • Proficiency in multiple languages to support international moderation efforts.
  • Demonstrated success in driving cross-functional initiatives or policy changes in a Trust & Safety context.
  • Experience working with machine learning systems, automation tools, and LLM/AI technologies.
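To illustrate the kind of metrics-driven quality tracking and SQL analysis mentioned above, here is a minimal, hypothetical sketch. The table schema, column names, and sample data are invented for illustration and do not reflect Discord's actual tooling; it simply shows how a QA audit log might be aggregated into per-moderator accuracy metrics.

```python
import sqlite3

# Hypothetical example only: schema and data are invented to illustrate
# metrics-driven QA tracking, not Discord's real systems.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE audit_results (
        moderator_id  TEXT,  -- who made the original moderation decision
        decision      TEXT,  -- the moderation action taken
        audit_verdict TEXT   -- QA reviewer's verdict: 'correct' or 'incorrect'
    )
""")
conn.executemany(
    "INSERT INTO audit_results VALUES (?, ?, ?)",
    [
        ("mod_a", "remove", "correct"),
        ("mod_a", "keep",   "incorrect"),
        ("mod_a", "remove", "correct"),
        ("mod_b", "keep",   "correct"),
        ("mod_b", "remove", "correct"),
    ],
)

# Per-moderator audit accuracy: the share of decisions the QA audit upheld.
rows = conn.execute("""
    SELECT moderator_id,
           ROUND(100.0 * SUM(audit_verdict = 'correct') / COUNT(*), 1)
               AS accuracy_pct
    FROM audit_results
    GROUP BY moderator_id
    ORDER BY accuracy_pct ASC
""").fetchall()

for moderator, accuracy in rows:
    print(f"{moderator}: {accuracy}% audit accuracy")
```

Surfacing the lowest-accuracy reviewers first (the `ORDER BY ... ASC`) is one simple way such a report could highlight where calibration sessions or targeted feedback are most needed.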

Requirements

  • This role requires regular interfacing with potentially traumatic material, including CSAM and other forms of exploitative, hateful, violent, or shocking content.
  • This role's hours are Monday-Friday, 9:00 AM to 5:00 PM Pacific Standard Time, with occasional flexibility required to accommodate our global partners.

#LI-Remote

The US base salary range for this full-time position is $108,000 to $121,500 + equity + benefits. Our salary ranges are determined by role and level. Within the range, individual pay is determined by additional factors, including job-related skills, experience, and relevant education or training. Please note that the compensation details listed in US role postings reflect the base salary only, and do not include equity or benefits.

Why Discord?

Discord plays a uniquely important role in the future of gaming. We're a multiplatform, multigenerational, and multiplayer platform that helps people deepen their friendships around games and shared interests. We believe games give us a way to have fun with our favorite people, whether listening to music together or grinding in competitive matches for diamond rank. Join us in our mission! Your future is just a click away!

Please see our Applicant and Candidate Privacy Policy for details regarding Discord’s collection and usage of personal information relating to the application and recruitment process.

