


Facebook blames those pesky algorithms for approving pro-genocide ads in Kenya [1]

By Mark Frauenfelder

Date: 2022-08-01

Meta's mascot should be the "NOT ME" imp that appears in Family Circus comics when none of the kids are willing to take responsibility for spilled grape juice, broken furniture, and other mishaps around the house.

Who approved ads advocating for ethnic cleansing in Kenya? NOT ME, says Facebook.

Who "allowed themselves to be a vector of hate speech and incitement?" NOT ME.

Who "approved ads on Facebook in both Swahili and English that included calls to rape and behead Kenyan citizens along ethnic lines?" NOT ME.

As you might imagine, Facebook is very upset with NOT ME.

Facebook's NOT ME isn't the smirking sprite seen in Family Circus; it's "proactive detection technology" that serves as a convenient scapegoat every time Facebook makes a massive fuckup.

Fortunately, the ads never actually ran on the platform: they were submitted by a watchdog group testing Facebook's hate-speech detection measures.

From Courthouse News:

[END]
---
[1] Url: https://boingboing.net/2022/08/01/facebook-blames-those-pesky-algorithms-for-approving-pro-genocide-ads-in-kenya.html

Published and (C) by BoingBoing
Content appears here under this condition or license: Creative Commons BY-NC-SA 3.0.
