Facebook provided inaccurate data to misinformation researchers, NYT reports

Facebook has apologized to misinformation researchers for providing them with “flawed, incomplete data” for their work examining how users interact with posts and links on its platform, the New York Times reported.

Contrary to what the company told the researchers, the data Facebook provided apparently covered only roughly half of its users in the US, not all of them. The Times reported that members of Facebook’s Open Research and Transparency team held a call with researchers on Friday to apologize for the error. Some of the researchers questioned whether the mistake was an intentional attempt to sabotage the research or simply an instance of negligence.

In an email to researchers seen by The Times, Facebook apologized for the “inconvenience [it] may have caused.” The company also told them that it’s fixing the issue, but that the fix could take weeks due to the sheer volume of data it has to process. Facebook did tell the researchers, however, that the data they received for users outside the US is accurate.

Facebook spokesperson Mavis Jones blamed the data inaccuracy on a “technical error,” which the company is apparently “working swiftly to resolve.” As The Times notes, it was University of Urbino associate professor Fabio Giglietto who first discovered the inaccuracy.

Giglietto compared the data handed over to researchers with the “Widely Viewed Content Report” the social network published in August and found that the results didn’t match. One researcher had already flagged concerns after that report came out: Alice Marwick, a researcher at the University of North Carolina, told Engadget that they couldn’t verify its results because they had no access to the data Facebook used.

Other researchers have been using their own tools to gather information for their work, but in at least one instance, Facebook cut off their access. In August, Facebook disabled the accounts associated with the NYU Ad Observatory project, whose team used a browser extension to collect information on political ads. The social network called it “unauthorized scraping.”

At the time, Laura Edelson, the project’s lead researcher, told Engadget that Facebook is silencing the team because its “work often calls attention to problems on its platform.” Edelson added: “If this episode demonstrates anything it is that Facebook should not have veto power over who is allowed to study them.”
