How the fog of war spun rumors of terrorism out of a deadly Long Beach crash

A car struck pedestrians in a crosswalk as well as vehicles in the Oct. 14 crash at Shoreline Drive and Aquarium Way in Long Beach.
(OnScene.TV)

After a car barreled through a red light in Long Beach this month and plowed into several pedestrians in a crosswalk, killing one, it did not take long for rumors of a terrorist attack to spread in some corners of the internet.

Against the backdrop of the Israel-Hamas war, the Oct. 14 crash, which led to a murder charge against the driver, was taken up by conspiracy theorists, right-wing extremists and others on social media, whose rampant speculation was quickly shot down by police.

Days after the crash, out-of-state groups posted on Facebook questioning why the FBI was at the scene if authorities were saying it wasn't a terrorist attack. Among them was the Arizona Border Defenders, a group similar to those the Anti-Defamation League has characterized as part of an extreme anti-immigration movement.

Laura Loomer, who for years has boosted outlandish conspiracy theories to her hundreds of thousands of followers on Twitter, now X, pushed the theory that the crash was an act of Islamic terrorism, claiming unnamed “sources” told her that police were under a gag order.

Although police arrested Khalid Yagobbi, 46, on suspicion of murder and said they believed the crash was intentional, they maintained it was not an act of terrorism and have since reaffirmed that the crash was not tied to the violence in the Middle East.

Police said Yagobbi drove a Chevrolet Bolt the wrong way on Shoreline Drive, careened through a red light and struck the pedestrians and multiple occupied vehicles, killing one woman. They did not specify how many others were injured. Prosecutors said he was working as an Uber driver and had a passenger in his car, according to the Long Beach Post.

Yagobbi, of Los Angeles, was charged last week with vehicular manslaughter with gross negligence in connection with the death of Romelia Cuarenta-Aguilar, 60. He pleaded not guilty.

A spokesperson for the L.A. County district attorney’s office said Tuesday “that is the charge that is supported by the evidence that was presented at the time of filing.”

An amended complaint has since charged Yagobbi with murder and four felony counts each of attempted murder and assault with a deadly weapon. He was arrested again Wednesday and is being held in lieu of $6-million bail, with arraignment set for Monday.

The Long Beach Police Department posted a statement on its social media accounts denouncing the online misinformation and saying there was no gag order on the case.

“While the motive remains under investigation, at this time, there is no indication the incident is connected to terrorism nor the current violence in the Middle East,” the Oct. 16 post read.

On Thursday, after the murder charge was added against Yagobbi, a Long Beach police spokesperson again told The Times there was no evidence suggesting a link to terrorism.

The FBI responded to the crash “due to the suspicious circumstances of the incident and current international events,” according to the Police Department.

With multiple denials from police and federal authorities, why did many on social media continue to spread the idea that the crash was a terrorist attack?

Disinformation tends to proliferate more rapidly during times of conflict or war, “simply because emotions run high and people are more likely to share information that aligns with their beliefs, often without verifying it,” said Alex Goldenberg, lead intelligence analyst for the Network Contagion Research Institute.

“In such turbulent times, like the one we’re experiencing now, misinformation and disinformation can spread like wildfire and in turn further fueling tension and widespread misinformation,” Goldenberg said.

Allison Gallagher, a Long Beach police spokesperson, said the acts of “terror and violence in the Middle East create fear and anxiety here in our community.”

Police are committed to keeping the community safe, Gallagher said, but disinformation “undermines public safety efforts.”

Beyond confusion and anxiety, disinformation can lead to public panic, harm reputations and in some cases even incite violence, Goldenberg said.

“The spread of this false information erodes public trust in institutions and makes it more challenging for people to discern fact or fiction, which is obviously crucial in maintaining a well-informed citizenry,” he said.

Does disinformation vary on different platforms?

Disinformation is pervasive across all major social media platforms, experts said.

This is a moment when people are turning to social media for information because they’re trying to work out what’s going on “in an information-poor environment,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate.

It takes time to establish what exactly is happening, but social media is an unfiltered channel where people can easily self-publish lies and nonsense, he said.

“On these platforms, ‘disinformation does better than good information’ is the truth, because disinformation usually gets there earlier because it’s zero cost to produce,” Ahmed said. “It takes time to actually find the truth.”

Goldenberg said its prevalence varies depending on the platform’s user demographics, algorithms and features.

On X, users will typically see a lot of false information and even incitement to violence or terrorist propaganda being amplified by users with verified badges, which lends a veneer of credibility, he said.

The blue checkmark, or verified badge, means that the X account has an active paid premium subscription; the account does not undergo the authenticity review that Twitter previously used to verify users.

The spread of misinformation, Ahmed argues, is made worse by the “fact that these platforms don’t enforce their own rules on social media.”

What can social media users do?

Social media should be seen as a form of entertainment and not as a tool to “find out what people are thinking,” Ahmed said.

“It is a way of being duped by an algorithm whose job it is to keep you addicted, and that is the fundamental issue,” he said. “We still look at social media as a place to get information — it’s not.”

For those who continue looking to social media platforms for information, it can be challenging to determine where that information comes from.

Goldenberg advises users to critically evaluate information, especially before sharing it, and to check it against multiple reputable sources as well as official statements from authoritative bodies.
