#Social
Tech Can't Solve Social Problems - Who Knew?
OMG GOOGLE DOES NOT CORRECTLY IDENTIFY PORN ON THE INTERNET!!!twelve!!
Well, duh. Of course it doesn’t. Neither does anybody else.
On balance, I’m pretty sure people who have “Safe Search” turned on prefer false positives, i.e. it’s better to hide some news articles than it is to show some adult content. Leaving aside the futility of defining “porn” or “adult material” correctly, or even agreeing on what “correct” would mean, you pay for a reduction in Type I errors with an increase in Type II errors, and vice versa.
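The tradeoff is easy to see with a toy threshold classifier (the scores and labels below are made up for illustration; this is not Google's actual system):

```python
def count_errors(scored_items, threshold):
    """Count Type I (false positive) and Type II (false negative) errors
    when items scoring >= threshold are flagged as adult content."""
    fp = sum(1 for score, is_adult in scored_items
             if score >= threshold and not is_adult)
    fn = sum(1 for score, is_adult in scored_items
             if score < threshold and is_adult)
    return fp, fn

# Hypothetical (score, is_adult) pairs: some news articles score high,
# some adult pages score low -- no threshold separates them cleanly.
items = [(0.9, True), (0.8, False), (0.7, True), (0.6, False),
         (0.4, True), (0.3, False), (0.2, False), (0.1, False)]

strict = count_errors(items, 0.3)    # flag aggressively (Safe Search on)
lenient = count_errors(items, 0.75)  # flag conservatively (Safe Search off)
print(strict, lenient)  # strict hides more news; lenient shows more adult content
```

Lowering the threshold never reduces both error counts at once: the strict setting wrongly hides three innocuous items but misses nothing, while the lenient setting wrongly hides only one but lets two adult pages through.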
If you prefer to reduce false positives, turn Safe Search off. The ML behind the search engine will still try to figure out what kind of thing you’re looking for, but instead of removing the other kind, it’ll just down-rank them.
The only remarkable thing about this particular recurrence of the “story” is how many professionals who actually do know better picked it up instead of dismissing it. I guess when your own publication’s articles (and the related ad revenue) get false-positived, reporting interests change.