Verification, Upload, and Content Moderation Process

December 2024 

At Pornhub, we remain steadfast in taking proactive measures to ensure the safety of our users and platform. These measures include allowing only verified content creators to upload to our platform, the use of several tools and technologies, and human review by trained moderators prior to publication to ensure that content adheres to our Terms of Service, Community Guidelines, and related policies. This process also includes verifying the identity and consent of every performer seen in user-generated content uploaded through our Model Program. We have provided a detailed description of each of these steps below. 

Step 1 - Becoming a Verified Content Creator

Verified content creators include the following groups: 

Members of the Model Program must verify their age and identity through a trusted third-party identity verification service. The results are also reviewed by our human moderation team before applicants can be approved for participation in the Model Program and permitted to upload content.  

Professional Studios, who apply for the Content Partner Program, must go through a similar verification process, which also includes business document verification, where applicable. Content Partner sites are routinely audited to ensure that they maintain appropriate documentation for their content. Their content is also assessed for compliance with our Terms of Service and related policies before they are accepted into the program. For more information about the verification process for Content Partners, click here.  

Step 2 - Verifying Other Performers Seen in Content 

Prior to uploading content to Pornhub, members of the Model Program must obtain, retain, and provide identification and evidence of consent to record and distribute for every performer appearing in their content. This requires the Model or Co-Performer to provide proof of age and consent via the following methods: 

  • Uploading valid government-issued photo identification for all performers AND 
  • Providing consent documentation, such as signed Release Forms, for all performers appearing in content uploaded to Pornhub  

If the performer depicted in the content has a verified model account on Pornhub, proof of age and consent can alternatively be provided by tagging them as a Collaborator in the content. 

Our Content Partners must attest that they maintain proof of ID, age, and consent in compliance with 18 U.S. Code § 2257 record-keeping requirements for all performers who are featured in content uploaded to Pornhub. A statement of this compliance, which contains the contact information of the Content Partner's Custodian of Records, is linked on all channels. Furthermore, we routinely perform audits, requiring Content Partners to provide age and consent documentation upon request. We have also initiated a process to audit all Content Partners' ongoing compliance with this requirement on an annual basis. 

Step 3 - Uploading and Moderating Content 

After becoming verified, Models and Studios are eligible to upload content to Pornhub. When members of our Model Program upload content depicting additional performers, they must also verify the identity of these other performers prior to upload before the content can enter the moderation process.  

When a piece of content is uploaded, it is scanned by several tools and against numerous databases to determine whether it may violate our Terms of Service or contain known illegal content before it reaches a human moderator. Our human moderators watch and listen to ensure the content complies with our Terms of Service, Community Guidelines, and policies. Content that a moderator determines to be compliant with our Terms of Service and that satisfies our performer verification requirements is typically published within 24 hours of being uploaded. That said, content cannot go live on our platforms before it has undergone the complete moderation process. 

Global Moderation and Compliance Content Upload Process 

The chart below illustrates the performer verification and upload process. 

Step by Step Content Approval Process:  

  1. Metadata, such as the content’s title and tags, is scanned against our banned word list. Banned terms cannot be submitted in a title or in tags; flagged terms are surfaced to moderators for extra scrutiny. 
  2. Content is scanned by internal and external tools to identify known or suspected Child Sexual Abuse Material (CSAM), Non-Consensual Content, or otherwise abusive or illegal content. A full list of these tools can be found here.   
  3. Content is reviewed by our human moderation team to determine if the content violates our Terms of Service, Community Guidelines or related policies. 
  4. Lastly, content is reviewed by our human moderation team to validate that identity and consent documentation has been provided for the performers seen in the content, where applicable. Content that is compliant with our Terms of Service and satisfies our performer verification requirements is typically published within 24 hours of being uploaded. 
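The four approval gates above can be sketched as a staged pipeline in which any failed check stops publication. This is only an illustrative sketch: the function names, field names, and word lists below are hypothetical placeholders, not Pornhub's actual implementation.

```python
# Illustrative sketch of the staged content-approval pipeline described above.
# All names, word lists, and checks are hypothetical placeholders.

BANNED_TERMS = {"banned_term"}    # terms that block submission outright (assumed)
FLAGGED_TERMS = {"flagged_term"}  # terms that route content for extra scrutiny (assumed)

def check_metadata(title, tags):
    """Stage 1: scan the title and tags against the banned word list."""
    words = set(title.lower().split()) | {t.lower() for t in tags}
    if words & BANNED_TERMS:
        return "rejected"
    return "flagged" if words & FLAGGED_TERMS else "ok"

def approve(upload):
    """Run an upload through the staged gates; any failure stops publication."""
    if check_metadata(upload["title"], upload["tags"]) == "rejected":
        return "removed"
    # Stage 2: automated scans, e.g. hash matching against known illegal content
    if upload.get("matches_known_abuse_hash"):
        return "removed"
    # Stage 3: human moderation review of the content itself
    if not upload.get("passed_human_review"):
        return "removed"
    # Stage 4: ID and consent documentation for every performer, where applicable
    if not all(upload.get("performer_docs", {}).values()):
        return "held"   # not approved until sufficient documentation is supplied
    return "published"
```

Note that the documentation check yields "held" rather than "removed", mirroring the policy that such content may be re-reviewed and published once the missing documentation is provided.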

If content adheres to our Terms of Service but we have not been provided identity or consent documentation for the performer(s) seen in the content where required, or we are otherwise unable to match every performer in the content to a verified co-performer, the content will not be approved. If the missing documentation is later provided and is sufficient for the performer(s) depicted, the content is reviewed again and may be published at that time.  

If we determine that content violates our Terms of Service during any stage of this process, it is removed before being published on the platform. The content may also be fingerprinted to assist in identifying and removing it if anyone attempts to upload it again, and the uploader may be banned, where appropriate. Additionally, if content violates our CSAM Policy, the uploader is banned and reported to the National Center for Missing and Exploited Children (NCMEC). 
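The fingerprinting step can be sketched as follows. This is a simplified illustration, not Pornhub's actual system: production re-upload detection uses robust perceptual hashes that survive re-encoding, cropping, and other edits, whereas the plain cryptographic digest below only matches byte-identical files.

```python
import hashlib

# Illustrative sketch of content fingerprinting for re-upload detection.
# A plain SHA-256 digest is used here only for simplicity; real systems
# rely on perceptual hashes that tolerate re-encoding and edits.

removed_fingerprints = set()  # hypothetical store of removed-content digests

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint for a piece of content."""
    return hashlib.sha256(data).hexdigest()

def register_removed(data: bytes) -> None:
    """Record a removed video's fingerprint so future uploads can be blocked."""
    removed_fingerprints.add(fingerprint(data))

def is_known_removed(data: bytes) -> bool:
    """Check a new upload against previously removed content."""
    return fingerprint(data) in removed_fingerprints
```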

We continue to evolve our Trust and Safety policies, procedures and technologies to account for the ever-changing nature of the online space and the challenges it presents. An overview of how our approach to Trust and Safety has evolved over time can be found in Our Commitment to Trust and Safety and our Trust and Safety Initiatives page.  
