About Anthropic
Anthropic’s mission is to create reliable, interpretable, and steerable AI systems. We want AI to be safe and beneficial for our users and for society as a whole. Our team is a quickly growing group of committed researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.
About the role
Anthropic's Integrity & Compliance (I&C) function is building the systems that let us scale responsibly as our products reach more people, more enterprises, and more regulated industries. Our global compliance program is bespoke, reflecting our unique mission and position as one of the leading AI labs operating on the frontier. This role – a Content Moderation Specialist – owns day‑to‑day program management of Anthropic’s global content moderation and online safety regulatory compliance program.
Key responsibilities
Own the global content regulation risk assessment program, including the roadmap of required assessments across jurisdictions, a consistent and repeatable risk assessment methodology and framework, and the coordination of inputs, consultation, and approvals for each assessment
Build and maintain systems and trackers to assess, operationalize, and report on relevant regulatory requirements across Anthropic's products and jurisdictions
Partner with internal counsel, Safeguards, Policy, engineering, and operations teams to align internal practices with external commitments and legal obligations
Maintain a controls inventory and the compliance documentation library for content regulation, ensuring documentation is drafted, reviewed by the right stakeholders, and kept current
Conduct gap analysis when new or amended content regulations come into scope, and stand up the compliance readiness plan and workback for each
Provide regular written program status reporting to stakeholders and leadership, proactively surfacing stalled or at‑risk items with a proposed path to unblock
Take on additional related work as the program evolves; job duties and responsibilities may change from time to time at Anthropic's discretion or as required by applicable law
Qualifications
Experience managing regulatory or compliance programs at a technology company or in a regulated industry
Hands‑on experience conducting or program‑managing regulatory risk assessments, including coordinating inputs across multiple functions
Demonstrated ability to build and maintain compliance program artifacts, including policies, risk assessment documentation, controls inventories, program trackers, and readiness plans
A track record of executing cross‑functionally, driving outcomes across legal, product, policy, and operations partners without direct authority
Excellent written and verbal communication skills, including producing clear program documentation and status reporting for senior stakeholders
Sound judgment and the ability to make decisions and move work forward with incomplete information in an evolving regulatory environment
Preferred qualifications
5+ years of relevant experience in regulatory program management or content moderation compliance
Direct experience with online safety or content moderation regulation, such as the EU Digital Services Act, the UK Online Safety Act, Australia's Online Safety Act, or comparable regimes
Experience in trust and safety, online safety, or regulatory compliance at a large consumer technology platform
Prior experience in a Big 4 or other professional services firm advising on content regulation, online safety, or platform compliance engagements
Experience designing risk assessment methodologies or compliance frameworks from first principles
Experience with multi‑jurisdictional compliance programs in a rapidly scaling environment
Familiarity with how generative AI products intersect with content and online safety regulation
Role-specific policy
We expect staff to be able to work from our Washington, DC, San Francisco, or New York City office at least 3 days a week, though we encourage you to apply even if you might need some flexibility for an interim period.
Compensation range
$255,000 – $270,000 USD
Logistics
Minimum education:
Bachelor’s degree or an equivalent combination of education, training, and/or experience
Required field of study:
A field relevant to the role as demonstrated through coursework, training, or professional experience
Minimum years of experience:
Years of experience required will correlate with the internal job level requirements for the position
Location‑based hybrid policy:
Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.
Visa sponsorship:
We do sponsor visas! However, we aren’t able to successfully sponsor visas for every role and every candidate. If we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.
Equal Employment Opportunity
As set forth in Anthropic’s Equal Employment Opportunity policy, we do not discriminate on the basis of any protected group status under any applicable law. For government reporting purposes, we ask candidates to respond to the below self‑identification survey. Completion of the form is entirely voluntary. Whatever your decision, it will not be considered in the hiring process or thereafter. Any information that you do provide will be recorded and maintained in a confidential file.