How Social Media Impact Stock Traders

The Washington Post published a fascinating article last week about the fake Associated Press Twitter post by hackers. It provides insights into the incident that triggered a brief but steep stock market plunge. The article, by Dina ElBoghdady and Craig Timberg, appeared in the April 24, 2013, issue. According to the Post, a hacker group called the Syrian Electronic Army allegedly hijacked high-profile Twitter accounts owned by Western organizations that cover the civil war in Syria. In addition to the AP account, the accounts of NPR and CBS’s “60 Minutes” had also been hacked.

In the Associated Press case, the hackers posted a false story about explosions at the White House that injured the President. “Breaking: Two Explosions in the White House and Barack Obama is injured,” read the tweet. Coming shortly after the Boston Marathon bombing, when sensitivities were heightened, the hoax seemed plausible.

“The episode, while lasting only several minutes, has drawn scrutiny from the FBI and a bevy of regulators while also highlighting the hair-trigger nature of today’s markets, where the demand for greater speed clashes with the occasional reality of misinformation,” say ElBoghdady and Timberg. They continue:

“Automated high-speed trading accounts for about half of daily stock market volume, and while few traders admit to having their algorithms make decisions based on a single tweet, several said the use of social media is growing. This kind of computerized trading tends to exacerbate market fluctuation, especially during sudden drops in prices, critics say.”

However, the article also points out that some dispute whether the stock market plunge involved computerized trading. “It’s not clear whether Tuesday’s market drop was caused by fast-fingered humans or computers seeing the words ‘explosions’ and ‘White House’ in a tweet,” say the authors.

They interviewed people who claim that computerized trading was not the cause of the market drop, citing the 23-second delay between the tweet and the plunge. “That’s not compatible with computer trading,” said one. “If it was computer algorithms that were trading, the market would have moved in a fraction of a second.”

Regardless of whether humans or computers acted on the misinformation, the incident underscores how vulnerable we all are to unknown assailants who may be half a world away, sitting in a coffee house somewhere, armed with nothing more than a latte and a laptop.

How Search Engines Can Help Perpetuate Misinformation

Before we get into this, I want to acknowledge that search engines put a world of relevant information at our fingertips and help people find answers faster than ever before. They’re great. I love ’em. I use ’em. But I also see a dark side to them.

Ask anyone a question. If they don’t know the answer, in all likelihood they will Google it from a smartphone. Voila! Answers! But are they accurate? Are they true? Those are much bigger questions.

A frequently quoted book, Prioritizing Web Usability (2006) by Jakob Nielsen and Hoa Loranger, claims that 93 percent of Web searchers never go past the first page of results. Yet Google and other search engines often return millions of pages.

At one time, an army of professional authors, editors, reviewers, librarians and fact checkers helped verify and screen information before dishing it up to readers. Today, that verification process applies to only a tiny fraction of all the information put online. Anyone can self-publish anything. “No experience necessary” often equates to “no truth or accuracy required.”

Limitations of Search Engines and Human Brains

Search engines simply report all references to a phrase on the Internet; they make no attempt to determine the truth or accuracy of claims. Yet most people assume that anything published is true. Why?

A 2012 report called Misinformation and Its Correction: Continued Influence and Successful Debiasing, published in a journal of the Association for Psychological Science by Stephan Lewandowsky, Ullrich Ecker, Colleen Seifert, Norbert Schwarz and John Cook of the Universities of Western Australia, Michigan and Queensland,[1] concludes that, “Cognitively, it is much easier for people to accept a given piece of information than to evaluate its truthfulness.” (This is especially true when search engine results stretch to thousands or millions of pages.)

The Stickiness of Misinformation

This fascinating report surveys academic literature relating to why we believe certain things we read or hear – even though they may be false. It begins with a discussion of several public policy issues, such as health care reform, vaccinations, and justifications for wars. It also discusses why misinformation is “sticky,” i.e., how hard it is to correct misinformation once it becomes rooted.

According to the report, disinformation in the U.S. healthcare debate peaked in 2009 when Sarah Palin used the phrase “death panels” on her Facebook page. “Within five weeks,” the report continues, “86% of Americans had heard the claim and half either believed it or were unsure about its veracity.”

Mainstream news media and fact-checkers reported that Palin’s characterization of provisions in the proposed law was false. But even today, four years later, a Google search for the term yields 35,800,000 results (in 0.16 seconds)! A scan of the first 20 pages of results revealed:

  • A few were dedicated to exposing “the myth” of death panels, including (to be fair) the very first result, from Wikipedia.
  • Most posts conflicted with each other, i.e., a large number claimed the law would create “death panels” and a large number claimed it would not.
  • A large percentage had been posted within the last few months, indicating that many people are trying to resurrect the term or keep the debate going, and that the authors of the paper are correct: misinformation is sticky.

Existing Beliefs Influence Belief in New Information

Determining the validity of information requires hard work and an open mind. The problem, say the authors of the Misinformation report, is that most people don’t seek information that contradicts their view of the world. Put another way, they gravitate toward information that supports it.

Even when directly confronted with retractions and conflicting facts, many people cling to their original beliefs by saying something like, “Well, we’re all entitled to our opinions.” In fact, say the authors, conflicting information often serves to strengthen belief in erroneous information.

How The Search for Truth is Getting More Difficult

Think of the Internet as a giant information archive. When topics such as healthcare become politicized, social networks, blogs and circular references turn the Internet into an echo chamber. Millions of references can accumulate in days as people report on reports of other reports, filtering information and putting their own spin on things along the way.

While search engines dutifully record the location of information, they can’t help us determine the truth of it. The sheer volume of conflicting information that they present makes the search for truth like looking for diamonds in a garbage dump.


[1] More about the authors of the Misinformation report is available online.