What the latest fiasco over Facebook's Napalm Girl takedown, its "news" feed, and its image policies really says, in brutal truth.
- If Facebook is relying on AI to evaluate images, it shows just how bad artificial intelligence really is right now. All the AI engine would have seen was a naked image…that's what it was programmed to block, so it took the post down. This shows that AI is still very, very far from being good, since the system did not understand a) history b) relevance c) context d) emotion. Not much "intelligence" at work there. Facebook needs to train its AI algorithms better (and the folks writing those algorithms).
- If humans made the decision, it shows that whoever is doing the hiring for editors at Facebook likely has a poor education, and that the people they're employing are even more poorly educated, because they did not understand the historical significance of the photo. In addition, they were too daft to do any background research, like a simple drag-and-drop Google reverse image search.
Either way, this shows that Facebook really does not understand a) news analysis b) journalism c) research d) current affairs e) international affairs. What it really says is "we hire twenty-somethings who like Starbucks and hashtags and are terribly educated" or "we're using people as guinea pigs to train our AI algorithms, which are really not quite ready for the real world."
This also says to me that Facebook really wasn't playing favorites with Democratic over Republican content earlier this year. It says their AI algorithms are bad, or terrible, or that they hire inexperienced, uneducated people to manage their news content curation. If they do have a "senior editor," then that person is likely "senior" because they live in Silicon Valley, wrote over 100 blog posts about the nice weather in California, and have an arts degree. Or the geeks overrule the experienced news editor.
At the end of the day, Facebook just isn't a news content or publishing company. If it wants to be, then it should take the time and effort, and spend some money, to bring on real, experienced media people (as in true journalists with deep international and domestic news experience). Whether it's humans or machines running the "news" feed at Facebook, this latest incident proves they really do not have a clue about news content. If they do have a highly experienced news person in a management role, then that person is probably not getting the support they need and has a bunch of computer scientists and machine-learning wonks overruling them…so how is that working out for you, Facebook? Right.
So Facebook should either better support that news editor, or hire a real news editor, let them build a proper team, and have the AI nerds listen to that person. Then Facebook should make a point of promoting that awesomely skilled new journalism team to the world, showing off credentials that go beyond ordering complicated Starbucks lattes, having written ten blog posts, and holding an arts degree. Step up or step out, Facebook. Right now, Facebook is entirely too opaque about the whole thing…and that in itself says a lot. Your move, Facebook.