Why Apple’s AI-driven reality distortion matters


The idea is that people reading those headlines will know the news they’re perusing could contain a machine-generated error (as opposed to a human one). The implication, of course, is that you should question everything you read to protect yourself against machine-generated errors and human mistakes alike.

Question everything: Human or AI

The humans who generate news are up in arms, of course. They see the complaint as a cause célèbre around which to take a stand against their own eventual replacement by machines. The UK National Union of Journalists, Reporters Without Borders (RSF), and the head of Meta’s Oversight Board (if that board still exists by the end of the week) have all pointed to these erroneous headlines to argue that Apple’s AI isn’t yet up to the task. (Though even Apple’s critics acknowledge that part of the problem is that, under human control, public trust in news has already sunk to record lows.)

Those critics also argue that telling users a headline was generated by AI doesn’t go far enough, because it leaves readers to verify what they read for themselves. “It just transfers the responsibility to users, who — in an already confusing information landscape — will be expected to check if information is true or not,” Vincent Berthier, head of RSF’s technology and journalism desk, told the BBC.

