San Francisco: A team of researchers at AlgorithmWatch said they were forced to abandon their project monitoring Instagram's algorithm after legal threats from Facebook, the media reported.
The Berlin-based project went public with the conflict in a post published this week, citing the platform’s recent ban of the NYU Ad Observatory, reports The Verge.
“There are probably more cases of bullying that we do not know about,” the post reads.
“We hope that by coming forward, more organisations will speak up about their experiences,” it added.
Launched in March 2020, the project provided a browser plug-in that allowed users to collect data from their Instagram feeds, offering insight into how the platform prioritises pictures and videos.
The project regularly published findings showing that the algorithm encouraged photos with bare skin and ranked photos showing faces higher than screenshots of text.
Facebook disputed the methodology but did not otherwise take action against AlgorithmWatch for the first year of the project.
In May, the researchers said, Facebook asked to meet the project leaders and accused them of violating the platform's terms of service. The company also objected that the project violated the GDPR because it collected data from users who had not consented to participate.
“We only collected data related to content that Facebook displayed to the volunteers who installed the add-on,” the researchers said in their defence.
According to the report, the researchers ultimately chose to shut down the project, believing they would face legal action from the company if it continued.
The report mentioned that a Facebook representative confirmed the meeting but denied threatening to sue the project, saying the company was open to finding privacy-preserving ways to continue the research.
“We had concerns with their practices which is why we contacted them multiple times so they could come into compliance with our terms and continue their research, as we routinely do with other research groups when we identify similar concerns,” the representative was quoted saying by the tech website.
“We intend to keep working with independent researchers but in ways that don’t put people’s data or privacy at risk,” the representative added.