Mr. Musk, who tweeted a link to a baseless story about Speaker Nancy Pelosi’s husband last month before deleting it, did not respond to a request for comment. Twitter, which laid off most of its communications department, also did not respond to a request for comment.
In a report on Monday, researchers at the Fletcher School at Tufts University said that “the quality of the conversation has decayed” on Twitter since Mr. Musk’s takeover, as more extremists and misinformation peddlers have tested the platform’s boundaries.
And in a report released on Saturday by the Election Integrity Partnership, which includes the Stanford Internet Observatory and the University of Washington’s Center for an Informed Public, researchers found the presence and influence of misinformation on Twitter to be pervasive.
Researchers looked at 34 major accounts that spread misleading claims about the 2020 election in the past and discovered “hundreds of false, misleading or unsubstantiated stories that sowed doubt in election procedures or results.” While seven of the accounts were permanently suspended from Twitter, they have largely been able to continue posting on other platforms, with their posts often re-emerging on Twitter via screenshots, the researchers said.
Common Cause, a pro-democracy advocacy group, said this week that it had flagged several tweets pushing false narratives, such as the claim that election results not announced on Tuesday night are a sign of fraud. The group said that “it has taken Twitter much longer than normal to adjudicate” whether the posts violated its policies: a process that usually takes less than three hours remained unresolved after more than three days.
On Tuesday, several major conservative accounts on Twitter continued to amplify the misconception that tallying delays were evidence of malfeasance, while also fueling other rumors about technical difficulties around the vote in Arizona and elsewhere.
Some Twitter executives tried to assuage concerns about the platform’s readiness for the midterms. Yoel Roth, the company’s head of trust and safety, who helps oversee content moderation, tweeted last week that about 15 percent of his organization had been laid off, versus about 50 percent companywide.