Built on the Baidu cloud computing platform, CDN's pornographic image detection feature rapidly scans massive volumes of data to determine whether images accelerated by CDN are pornographic, which can save more than 90% of manual audit costs. For convenience, CDN provides an image examination console where users can review the images flagged by CDN and perform manual re-examination as needed.
Criteria for determining pornographic images
- Suspected pornographic image: the image is evaluated by a trained machine-learning model; if the probability that it is pornographic is greater than 90%, it is marked as suspected pornographic.
- Pornographic image: the image is evaluated by the same trained machine-learning model; if the probability that it is pornographic is greater than 99%, it is marked as pornographic.
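The two thresholds above amount to a simple classification rule, sketched below for illustration. The function and label names are hypothetical and are not part of Baidu's actual API; only the 90% and 99% cutoffs come from the documentation.

```python
# Hypothetical sketch of the threshold rule described above.
# Only the 0.90 and 0.99 cutoffs come from the documentation;
# the function and label names are illustrative.

def classify_image(porn_probability: float) -> str:
    """Map a model's pornographic-content probability to a label.

    > 0.99   -> "pornographic" (blocked automatically)
    > 0.90   -> "suspected"    (queued for manual judgment)
    otherwise -> "normal"
    """
    if porn_probability > 0.99:
        return "pornographic"
    if porn_probability > 0.90:
        return "suspected"
    return "normal"

print(classify_image(0.995))  # pornographic
print(classify_image(0.93))   # suspected
print(classify_image(0.50))   # normal
```

Note that images in the "suspected" band (between 90% and 99%) are the ones surfaced for manual judgment in the console, as described below.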
Using the Audit Service
Porn detection overview
- Log in to the CDN Management Console and select Porn Detection from the navigation bar on the left to enter the overview page.
- On this page, you can view the number of pornographic images and suspected pornographic images identified by the system today, as well as the top 5 domain names with violations this month.
Pornographic image detection trend analysis
By selecting a time point, you can view the detection trends for pornographic and suspected pornographic images at any time in the last three months. You can also click the Download button on the right to download a table of the pornographic images for all time points.
Illegal image query
You can query illegal images by selecting the domain name, time period, and label result; matching images are displayed visually.
- For suspected pornographic images, you can hover over an image and select manual judgment. Selecting all images on the current page and tagging them in bulk is supported.
- For pornographic images, the system takes blocking measures automatically according to its intelligent algorithm.
Appeal handling after being blocked
After Baidu AI Cloud CDN blocks a pornographic image, it notifies the user by SMS, email, or private message.
If you believe a result is incorrect, hover over the corresponding image and click Appeal to submit a ticket. The image you request to unblock will be reviewed manually. If it passes review, the block is removed and the image is reclassified as normal; otherwise, the image remains classified as pornographic and its status changes to rejected. The appeal review takes 2-3 days.
- The porn detection feature of Baidu AI Cloud CDN is currently in a free open beta. Records of illegal images detected under your account in the past 30 days are retained.
- The normal image records here come from manual judgment of suspected-grade images and successful appeals of pornographic images, not from all images originally classified as normal.