After all trading was suspended on the New York Stock Exchange Wednesday, US stocks dipped by less than 1 percent and stayed down for the remainder of the afternoon. The halt was due to technical difficulties that began shortly after 11.30am. By 3.10pm ET the stock market was back up and running, just under an hour before it was set to close at 4pm.
While Wednesday’s blip has been called “the biggest disruption to an American equity venue in almost two years”, it certainly wasn’t an unprecedented occurrence. Just two years ago the Nasdaq Stock Market was forced to halt its operations for three hours after a price feed broke. So how different was Wednesday’s glitch from previous technological setbacks?
“While rare, computer breakdowns in electronic markets have become an unavoidable fact of life for American investors operating in markets that have sped up and fragmented over the past decade and a half amid computer advances and regulation,” Sam Mamudi wrote for Bloomberg.
As exchanges struggle to catch up with technological advances, investors will need to get used to these types of technological malfunctions, experts say. In fact, outages have been affecting exchanges every few months in recent years, the Wall Street Journal reported.
Still, the most serious technological problems of the past have often spurred improvements.
When most investors think of technical setbacks, their memories turn to the May 2010 “Flash Crash”. The crash caused the Dow to plummet close to 1,000 points, and sparked wild fluctuations in the market that left investors baffled.
One of the main reasons for the “Flash Crash” was high-frequency trading, which uses computers programmed to trade a high volume of stocks at very fast rates. After one market participant executed an enormous transaction, a critical mass of high-frequency traders halted their activity, and that withdrawal of participants sent the market into free fall. Essentially, the markets depend on the high volume of activity generated by these traders and their computers. If they stop, so does everything else.
The lessons learned during the 2010 crash led to the approval of a “limit up-limit down” mechanism, which prevents trades in individual stocks from moving outside of a specified price band.
Now, individual stocks can only move a specified percentage above or below their average prices during the previous five-minute window. For heavily traded securities the level is set at 5 percent, while for other stocks it is set at 10 percent.
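The band logic described above can be sketched in a few lines of code. This is an illustrative simplification, not the official regulatory specification: the 5 and 10 percent tiers and the five-minute reference price come from the text, while the function and variable names are the author's own.

```python
def price_bands(reference_price: float, heavily_traded: bool) -> tuple[float, float]:
    """Return the (lower, upper) price band around the average price
    of the previous five-minute window ("reference price")."""
    pct = 0.05 if heavily_traded else 0.10  # 5% tier vs. 10% tier, per the text
    return reference_price * (1 - pct), reference_price * (1 + pct)

def trade_allowed(price: float, reference_price: float, heavily_traded: bool) -> bool:
    """A trade priced outside the band would be prevented from executing."""
    lower, upper = price_bands(reference_price, heavily_traded)
    return lower <= price <= upper
```

For example, with a five-minute reference price of $100, a heavily traded stock could trade at $104 but not at $106, while a thinly traded stock would still be allowed to trade at $109.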
Nevertheless, some analysts say that the stock market is still at risk today. Last year, Joe Saluzzi, co-author of the book Broken Markets, told CNNMoney that the market remains susceptible to a flash crash of around 7 percent.
"It could still happen. The structure of the market is still not sound," he said.
It is unlikely, however, that Wednesday’s outage was due to these types of technological problems. In fact, some insiders have speculated that Wednesday’s problems may have been caused by too much traffic from high-speed trading, not too little.
For those griping that traders should go back to the days before technology made these little hiccups so frequent, it’s useful to note how little impact such problems actually have on the stock market as a whole.
The NYSE, Nasdaq, and BATS each handle around 20 percent of stock trading during most months. This division of labor ensures that problems in one exchange do not affect the market as a whole. Moreover, with the exception of closing and opening time, no exchange has the exclusive rights to any stock or exchange-traded fund.
The NYSE confirmed that its technical difficulties were not caused by a cyber breach, but the exact cause of the outage is still unknown.
In a note to traders, the exchange confirmed that all open orders had been canceled except for the long-term requests to buy or sell known as “good-till-canceled” orders.