Facebook’s Ime Archibong, VP of Product Partnerships, has posted an update on what the social media giant has done since the Cambridge Analytica scandal broke in 2018. According to the executive, Facebook began reviewing all of the apps that had access to large amounts of user data on the platform, involving attorneys, external investigators, data scientists, engineers, policy specialists, and partner teams in the process. As a result, the company was able to identify patterns of abuse and has suspended tens of thousands of apps so far.
Initially, the review was prioritized by how many users an app had and how much data it could access. As the investigations progressed, the reviewers shifted their focus to specific signals associated with policy abuse and deviation from the platform’s guidelines. In some cases, the developers of the violating apps were called in for in-depth questioning. In others, Facebook’s engineers carried out a technical inspection themselves. Finally, some apps were operating in such a blatantly contravening manner that they were banned immediately, without being given a chance to explain or negotiate any changes.
The apps suspended this way were created by approximately 400 developers. Some of them were not even available to users of the platform yet, as they were still in their testing phase when they were reviewed and suspended. In many cases where the explanations provided by the developers were unconvincing, or the developers didn’t respond in a timely manner, Facebook suspended the apps. In cases where app developers tried to deceive Facebook’s reviewers, the tech giant responded with lawsuits. Take the example of Rankwave, which Facebook sued in May for hiding the fact that it collected user data without consent and then used that information for advertising and marketing services.
Facebook promises that it will continue to review apps with increasing intensity, learning more along the way and strengthening its scrutiny process. Whether all of the above is the unfiltered truth or an embellished account published to create the impression of a tight and strict review process on Facebook is up to you to decide. We would like to believe that Facebook is really changing; its past, and its repeated betrayals of user trust, just won’t let us.
Where do you stand on the above? Let us know in the comments down below, or on our socials on Facebook and Twitter.