Adventures in Algorithmic Trading

August is the shortest of months, with vacation and Summer Fridays making the time go simply too fast… except for those who eagerly await the September issue of ai5000, for whom the end of the month cannot come soon enough. For those most loyal of readers, we’re posting one article early. It’s Joe Flood’s Adventures in Algorithmic Trading, a look at the varied strategies that fall under the umbrella term ‘algo trading’ – and what it means for you.

It’s been a tough year for high-frequency trading. It started last July, when a former Goldman Sachs computer programmer was arrested for allegedly stealing proprietary high-frequency computer code. Few people had any idea what high-frequency trading really involved, but this was the summer of discontent for the recently profitable, but publicly reviled, Goldman “Vampire Squid” Sachs. A few weeks later, The New York Times ran a cover story that credited the trading technique with being able “to master the stock market, peek at investors’ orders, and, critics say, even subtly manipulate share prices,” to the tune of “$21 billion in profits,” during the financial cratering of 2008 (the real number seems somewhere in the $2 to $8 billion range).

Then, this past May 6 saw a trillion dollars in stock market value spontaneously evaporate, only to mysteriously re-condense moments later in what the media dubbed the “Flash Crash,” even though flash trading had nothing to do with it. Months later, regulators and traders still aren’t sure what happened, but most of the inquiries (and uproar) have focused on high-frequency trading.

“We lump every crime in the world onto high-frequency trading,” says Professor Bernard Donefer of New York University’s Stern School of Business, who ran electronic trading systems for Fidelity Investments before moving into academia. “It’s my opinion that, in fact, the big problem with high-frequency trading is that there is no such thing. There are a whole series of techniques and strategies that use low-latency technology [i.e., super-fast computers and electronic networks], and what you have to do is examine each one of those strategies and techniques and see what it does.”
 
As Donefer and others note, these techniques, taken one at a time, are as follows:

 
Quantitative Trading Strategies

Brought to life by the likes of blackjack aficionado and legendary investor Edward O. Thorp and turned famously profitable by mathematician hedge fund kings like Renaissance Technologies’ James Simons, quantitative (or “quant”) strategies typically involve forms of statistical arbitrage and pairs trading. Quant computer models discern historical patterns and correlations between different securities, search for instances where those relationships go out of whack, and, finally, buy and short the affected securities to help push them back toward their traditional correlations, collecting the spread along the way. The profits on any one trade tend to be small but, with enough speed and volume, they add up to enormous returns.
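
To make the mechanics concrete, here is a minimal sketch of a pairs-trading signal of the kind described above, written in Python. The window, thresholds, and price series are illustrative assumptions, not any fund’s actual model.

```python
import numpy as np

def pairs_signal(prices_a, prices_b, window=60, entry_z=2.0, exit_z=0.5):
    """Toy signal: trade when the spread between two historically
    correlated stocks drifts far from its recent average."""
    spread = np.log(prices_a) - np.log(prices_b)      # log-price spread
    recent = spread[-window:]
    z = (spread[-1] - recent.mean()) / recent.std()   # how stretched is the pair?
    if z > entry_z:
        return "short A / long B"    # spread too wide: bet on convergence
    if z < -entry_z:
        return "long A / short B"
    if abs(z) < exit_z:
        return "close position"      # relationship back in line
    return "hold"

# Two price series that are correlated by construction
a = np.cumprod(1 + np.random.normal(0, 0.01, 250)) * 50
b = a * np.exp(np.random.normal(0, 0.005, 250))
print(pairs_signal(a, b))
```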

Impact and Upside: While quant strategies require super-fast computers and trading power, they are probably the least “high-frequency” of the disparate strategies generally lumped under this umbrella. It is also the approach least often used directly by asset owners, aside from some large pension funds and actively managed endowments (though most hedge funds use one form of quant strategy or another).

Downside: As Long-Term Capital Management’s 1998 collapse and the 2007 “Quant Meltdown” have shown, the pennies quickly stacked up by quant strategies can be knocked down even faster, particularly when a market or strategy becomes “crowded” with copycats. Strategies that depend on being able to maneuver nimbly suddenly find themselves in a hurly-burly crowd and, when everyone runs for the exit, a classic liquidity trap forms.

Marketmaking 

The strategy most accurately defined as high-frequency trading—automated marketmaking—accounts for roughly half of the daily equity trade volume in the United States.  Automated marketmakers are the computer-age equivalent of the screaming floor traders of a bygone era, providing liquidity by placing bids and offers onto exchanges, and seeing who will buy and sell at the quoted price.
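
In code, the core of the job is simple to state; the hard part is doing it millions of times a day at microsecond speeds. A minimal sketch, with an assumed spread and size:

```python
def make_market(ref_price, half_spread=0.01, size=100):
    """Post a two-sided quote around a reference price; the marketmaker's
    hoped-for profit is the spread between its bid and its offer."""
    bid = round(ref_price - half_spread, 2)
    ask = round(ref_price + half_spread, 2)
    return {"bid": (bid, size), "ask": (ask, size)}

print(make_market(20.00))   # {'bid': (19.99, 100), 'ask': (20.01, 100)}
```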

Automated marketmaking would not exist without the recent advances in computing, but its ubiquity is actually more a product of regulatory changes than anything technology-based. The first shift came in the late 1990s, when the SEC approved the creation of alternative trading systems that ended the virtual duopoly of the NASDAQ and NYSE. With the new rules, Electronic Communication Networks (ECNs) could operate like the big exchanges—allowing marketmakers to post their bids and offers publicly—but without all the rules, regulations, and overhead costs of those big exchanges.

ECNs exploded after April 2001, when all exchanges were forced to switch from the old fractional pricing system to decimals. Suddenly the spread on a trade, which had almost always been 1/16th of a dollar (6.25 cents), could go as low as a penny, bringing even more savings on the cheaper alternative exchanges. To attract marketmakers, ECNs developed a new “maker-taker” payment model: marketmakers were paid by the ECN for each deal they made, because they provided liquidity, while the firms on the other side of the trades, who took that liquidity, paid for it. The sums involved were relatively small but, with enough volume, an automated marketmaker could all but print money. This incentive, along with the bounty of new exchanges and ultra-fast trades executed in fractions of a second, has helped double the average daily trading volume since 2003.
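
The arithmetic behind “all but print money” is easy to sketch. The rebate and fee levels below are illustrative assumptions, not any ECN’s actual schedule:

```python
# Back-of-the-envelope maker-taker economics (all numbers invented).
rebate_per_share = 0.0025   # ECN pays the liquidity "maker"
fee_per_share = 0.0030      # ECN charges the liquidity "taker"

shares_traded = 50_000_000  # a busy automated marketmaker's daily volume
print(f"Maker rebates: ${shares_traded * rebate_per_share:,.0f}")                    # $125,000
print(f"ECN's cut:     ${shares_traded * (fee_per_share - rebate_per_share):,.0f}")  # $25,000
```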

The industry received another regulatory B12 shot in 2007 when the SEC’s Regulation National Market System (“Reg NMS,” in industry parlance) went into effect. Among other things, Reg NMS required that firms respond to all bids and offers within a second or so, a rule intended to prevent a form of electronic “front-running” in which firms would try to get ahead of a trade they were offered. The time requirement cut down on front-running, but it also meant that all marketmakers and traders essentially were required to have lightning-fast computers, networks, and algorithms, a perfect environment for high-frequency trading.

Upside and Impact:  Along with decimalization and exchange deregulation, automated marketmaking has increased average liquidity, cut bid-ask spreads, and lowered fees and commissions. The automated system also has cut transaction time, errors, and fraud.

Downside: The old system of manual traders and specialists was, of course, much slower, more expensive, and more prone to mistakes and corruption than the new automated system. Yet, traders generally were required to function as a liquidity source of last resort, were capable of taking on a large share of daily volume themselves, and helped out trading partners in dire straits in order to maintain the business relationship. In today’s Wall Street, automated traders simply can turn off their machines if they don’t like what’s happening in the market, as many firms did during the recent “Flash Crash.”

“The argument that these strategies provide liquidity is a red herring,” says longtime quant (and quant skeptic) Dr. Paul Wilmott. “It’s a ‘liquidity card,’ like the ‘race card,’ people just play it to end arguments. All these people supposedly providing liquidity, if some of them decide to step back, they’ve got you by the cojones, haven’t they?”


Algorithmic Trading

During World War II, one of the military’s biggest problems was prioritizing the infinite combinations of soldiers, weapons, supplies, and replacement parts that needed to be shipped to the front lines—until the arrival of a young mathematician named George Dantzig, then working for the Army Air Forces. In his first year as a graduate student at Berkeley, Dantzig had solved what he thought were homework assignments but were, in fact, two of the great unsolved problems in statistics. Over time, the details of the story changed and it was misattributed to a handful of famous mathematicians (and, in fictionalized form, to the janitor-genius of the movie Good Will Hunting), but Dantzig was the source. Almost 70 years later, variations on the “Simplex Method” that Dantzig devised for military planning and supply problems are used to slice and dice large trades, spread them out over different exchanges, and either execute them with lightning speed or space them out over time, all in the hope of minimizing price slippage. While technically all quant strategies and forms of high-frequency trading employ algorithms, this is the sub-specialty most commonly labeled “algorithmic trading.”
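
As an illustration of the basic idea—simple slicing and randomization, rather than Dantzig’s simplex method itself—here is a toy order-slicer; the sizes and jitter are invented:

```python
import random

def slice_order(total_shares, n_slices, jitter=0.2):
    """Split a large parent order into smaller, slightly randomized child
    orders so the full size is harder for other traders to detect."""
    base = total_shares // n_slices
    sizes = [int(base * (1 + random.uniform(-jitter, jitter)))
             for _ in range(n_slices - 1)]
    sizes.append(total_shares - sum(sizes))   # remainder goes in the last slice
    return sizes

# e.g., work a 100,000-share parent order as 20 child orders over the day
print(slice_order(100_000, 20))
```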

Impact and Upside: Like automated marketmaking, algorithmic trading helps provide liquidity and lowers spreads and commissions. It also affects how markets function: drop a large rock into a bucket of water, and there’s a good chance the water will slosh around and spill over the sides; drop pebbles into the bucket and you can add just as much rock without any spills.

Downside: Algorithmic trading can hide the identity of large buyers and sellers to prevent speculators from guessing the overall size of the trade and getting in front of it. The speculators, in turn, use a series of baits and cancellations to test the market, looking for patterns and weaknesses. The result is a lot of complicated, high-speed, game-theoretic duels and micro-volatility, all of it in the name of hiding information. As Gene Fama will tell you, a transparent market is an efficient market.


Dark Pools

Sometimes these algorithmic trades are done in “dark pools,” private exchanges that allow select participants to make big trades anonymously and secretly (until the trades are ultimately reported on the consolidated tape) in order to minimize market impact.

Impact and Upside: To a greater degree than algorithmic trading, dark pools cut down on slippage, transaction time, and fees. “Nondisplayed liquidity venues are a good source of liquidity,” says Joe Wald, a Managing Director of Knight Capital Group, a leading high-frequency firm. “They’re a place where large trades that could never get done before can go because the spreads and the volumes would have been so large.”

Downside: Because so much of the market has moved into dark pools, it is extremely difficult to do a block trade on the public exchanges—thereby furthering the problem that dark pools were created to address in the first place. “Imagine a world where each grocery store only sold their eggs one at a time, and each one at different prices,” says Keith Bliss, a Senior Vice President of the electronic prime brokerage firm Cuttone & Co. “Ultimately, you might have a better price for each egg, but you spend so much time and money driving around that it might not be worth it.” That said, eliminate dark pools and the cost of moving large blocks of stock would, at least in the short term, rise significantly for asset owners and money managers.


Flash Trading

Another feature of the SEC’s Reg NMS that catalyzed high-frequency trading was a new “trade-through” rule that functions like an instantaneous equities version of retail stores’ “Best Price” guarantees. Before executing a trade, an exchange must honor the best share price available on all other exchanges. The trouble is that exchanges can only take share prices into account, not the access fees that can drive the total price much higher. To cut access fees, and keep the trades on their own turf, exchanges like DirectEdge started letting customers “flash” their trades to fellow exchange members a fraction of a second before sending them out publicly, giving those members a chance to match the best available price from other exchanges and keep the trade on the original exchange. For the “flasher,” this means lower transaction costs; for the “flashee,” it means a chance to do more business; for the exchange, it means getting to keep the trade on its books.
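
A stylized version of that routing decision, with invented prices and fee levels (the actual mechanics and fee schedules vary by exchange):

```python
def route_buy_order(shares, local_ask, away_ask, member_matches, route_fee=0.003):
    """Toy version of the flash decision: can the order stay on the local
    exchange at the best national price, or must it be routed away?"""
    if local_ask <= away_ask:              # already the best price here
        return "fill locally", shares * local_ask
    if member_matches:                     # a flashed member steps up to match
        return "fill locally (flashed)", shares * away_ask
    # otherwise route to the away exchange and pay the access/routing fee
    return "route away", shares * (away_ask + route_fee)

print(route_buy_order(300, local_ask=20.02, away_ask=20.01, member_matches=True))
```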

Upside and Impact: “You have to remember,” says NYU’s Donefer, “a customer must request to be flashed; it’s not like this is being done to you secretly. If the trade gets routed somewhere else [to get the best share price] there’s a routing fee, and the price might change by the time it gets there,” so customers often want their trades flashed to save time and money. In theory, ultra-fast traders could use this millisecond preview to get ahead of the trade and drive the price up or down. Yet, says Donefer, in practice this rarely happens. “With 11,000 shares, you can get ahead of the trade and do something with it,” he says, “but no one is flashing 11,000 shares. I’ve spoken with people from a number of exchanges and the average trade flashed was less than 300 shares, and it’s very rare that they get much larger than that.”

Downside: The asymmetrical information exchange, and the fact that millions of dollars in computing power is needed to take advantage of it, has had the press, politicians, and regulators pummeling flash trading. When the SEC proposed a ban on it last summer, most exchanges decided it wasn’t worth the hassle (flash trading only accounted for 2% to 3% of trades) and voluntarily gave it up, though DirectEdge still offers it.


The Flash Crash and the Big Picture  

On May 6, the Dow dropped like Wile E. Coyote after realizing he’s run off a cliff, with shares of major companies like Accenture trading for as little as a penny. One of the most worrisome things about the “Flash Crash” is that no one is exactly sure what happened—but nearly all of the components of high-frequency trading (except, ironically, flash trading) seem to have played a significant role.
 
May 6 was a nervous, high-volume day from the start, which may have “unbalanced” the books of a number of automated traders, who try to keep their buying and selling roughly even. When they’re unbalanced, automated trading programs often either pull back from the market entirely or post what’s known as a “stub quote”: a bid to buy a security at an absurdly low price (say, a penny) or an offer to sell it at an absurdly high one. Stub quotes evolved as a way for traditional marketmakers, who were required to provide liquidity, to technically offer it when they were unbalanced and didn’t want to make trades. On May 6, marketmakers seem to have put out stub quotes under the assumption that no one would be dumb enough to take them up on the offer. Yet they ran into the dumbest traders of all: other, poorly designed computer programs.

Some trading programs carry what’s known as a “stop-loss” order, which automatically sells a stock once it drops to a pre-determined price. When that threshold is reached, the stop-loss becomes a “market order,” which lines up behind whoever else is trying to sell the stock and takes whatever price the market offers. On May 6, by the time latecomers placed their orders, the only bids left on the market were those stub quotes, and any stop-loss programs that lacked a price floor (the protection a “limit order” provides) executed the trades anyway. This further unbalanced markets in a feedback loop similar to the kind of blind portfolio-insurance selling that helped cause 1987’s Black Monday liquidity trap and crash.
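
A toy replay of that chain of events, with invented prices, shows why the limit-order safeguard matters:

```python
# A stop-loss with no limit converts to a market order and executes
# against a stub-quote bid. All prices and sizes are invented.
best_bid = 0.01   # the only bid left on the book is a stub quote

def stop_loss_fires(last_trade, stop_price, shares, limit=None):
    if last_trade > stop_price:
        return "stop not triggered"
    if limit is not None and best_bid < limit:
        return "stop triggered, but the limit price prevents a fill"
    return f"market order sells {shares} shares at ${best_bid:.2f}"

print(stop_loss_fires(last_trade=38.00, stop_price=39.00, shares=500))
# -> sells at $0.01, straight into the stub quote
print(stop_loss_fires(38.00, 39.00, 500, limit=35.00))
# -> the limit version declines to chase the price down
```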

On May 6, liquidity problems were particularly bad for Exchange-Traded Funds (ETFs). Spreads often open up between an ETF’s price and the value of the underlying stocks that make it up. Some quant-arbitrage strategies make money collapsing those spreads through the use of stock, futures, and put-option trades, but the added speed and complexity the day’s trading demanded sent ETFs, and stocks that make up a large portion of them (like Accenture), spiraling.
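
The spread those arbitrage strategies collapse can be sketched in a few lines; the basket, prices, and thresholds below are invented for illustration:

```python
# Compare an ETF's price with the value of its underlying basket.
basket = {"AAA": (0.40, 52.10), "BBB": (0.35, 27.80), "CCC": (0.25, 14.95)}
etf_price = 33.10

nav = sum(weight * price for weight, price in basket.values())  # basket value
premium = etf_price - nav
if premium > 0.05:        # ETF rich: sell the ETF, buy the stocks
    print(f"sell ETF / buy basket, spread = ${premium:.2f}")
elif premium < -0.05:     # ETF cheap: buy the ETF, sell the stocks
    print(f"buy ETF / sell basket, spread = ${-premium:.2f}")
else:
    print("no trade: spread within costs")
```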

The fact that the “Flash Crash” mispricing corrected as quickly as it did shows one of the upsides of high-frequency and algorithmic trading: market irrationalities are spotted and reversed fairly quickly. And, in the wake of the event, some of the biggest players in the algorithmic, dark pool, and high-frequency games have been willing to take on reforms to prevent another such crash. Proposals include creating more designated marketmakers and forcing them to always offer “real” liquidity, not just stub quotes, and better coordinating cross-exchange “circuit breakers” to slow or stop out-of-control trading.

Yet, as New York Times reporter Andrew Ross Sorkin points out in his interview with ai5000, there is very little talk on Wall Street of reducing the complexity and speed of the market. For better or worse, this puts the responsibility for heading off future crashes squarely on the shoulders of ill-prepared regulatory agencies, whose well-intentioned reforms actually paved the way for high-frequency trading in the first place.

“Like anything, high-frequency trading is not all good or all bad,” says the head of Knight Capital’s Electronic Trading group, Jamil Nazarali. “But people are becoming much more aware of it, and that is a good thing. Regulators are getting more involved, and it’s important for [asset owners] to understand how it all affects them.”


Joe Flood is the author of the recently published book The Fires, which looks at the evolution of computer modeling, urban planning, and urban economics through the lens of New York City’s 1970s fire epidemic and fiscal crisis.