
Parler Employees Say Apple Is Trying To Remove The App From The App Store

In a post published to Parler on Friday, John Matze, CEO of Parler, said that Apple is unfairly trying to hold Parler accountable for the content on its platform. A screenshot of Apple’s inquiry posted to the Parler app reveals that Apple is threatening to remove the app from the App Store, citing violations of the rules meant to ensure the safety and protection of its users. The screenshot also shows that Apple has rejected the app under its App Store Review Guidelines.

Matze, who argues that Parler is being unfairly targeted by Apple, claims that the same standards have not been applied to Twitter and Facebook. Both platforms have tightened restrictions on content in recent months. Facebook removed accounts, groups, and posts tied to the QAnon conspiracy theory ahead of Election Day last fall. Twitter, meanwhile, removed thousands of conspiracy-theory accounts and right-wing influencers on Friday, at the same time that Matze was complaining Parler was not being given fair treatment.

Parler has drawn widespread criticism over the last year from experts who argue that its presence in the App Store provides a clear gateway for online extremism. The app, which prides itself on offering very little content moderation under the guise of protecting free speech on the internet, played host to millions of extremists through 2020 and into 2021. “Apparently they believe Parler is responsible for ALL user generated content on Parler … Standards not applied to Twitter, Facebook or even Apple themselves, apply to Parler,” Matze argued on his own app.

Social media companies and other internet giants are protected by Section 230 of the Communications Decency Act of 1996. The section says that companies cannot be held liable for user-generated content on their platforms as long as that content does not constitute illegal activity. The law was inspired by pre-internet precedent holding that bookstores could not be held accountable for the content of the books they sold; if they could, they would have been forced to read every page of every book in their shops.

A clause within Section 230, known as the Good Samaritan clause, grants additional protections to internet companies that choose to moderate content for illegal activity. It is this protection that enables tech companies like Apple and Twitter to create their own user guidelines. Companies like Facebook and TikTok, which have expanded their guidelines in recent months to add harmful conspiracy theories to the list of banned topics, are responsible for ensuring that their platforms meet the user safety requirements Apple puts in place.

Matze, arguing on Parler that his company does not allow illegal activity, denies that Parler was used to organize the insurrection at the U.S. Capitol on January 6, despite violent threats that remain on the app to this day.

