Algorithmic trading is the use of computer programs to make trading decisions.

A student of information systems planning to work in technology development in the financial markets industry will encounter many decisions that are no-brainers, such as upgrading software when new versions are available and changing vendors when needs change. Other decisions, such as measuring risk and setting position limits, require more thought and careful consideration, and there are strategic choices to be made with inputs from senior managers.

Speed and Reducing Latency – Nanoseconds matter for some trading strategies, but not all; it depends on the investment and trading strategies your clients are pursuing. Patient trading can be beneficial and reduce costs when positions are held for longer periods. Some traditional investors are concerned about the disruption and destabilization caused by high-frequency trading. Think of the ‘flash crash’ incidents (sharp price swings caused by errant trading software) that get attention when they occur. It is costly to operate a low-latency trading operation, and the optimal speed of data and order handling varies from trader to trader. Technologists need to understand how critical the timing of data transmissions is to the profitability of a strategy.

Man or Machine? – Market participants use a combination of human judgment and the rules-based logic built into customized software to manage their buy and sell orders according to their desired strategies. While market bots (robots) and algo traders capture the attention and are viewed by some as exploitative of slower human traders, the reality is more complicated. In a 2020 trial of traders accused of market manipulation, a text from one of the defendants was shared – “As a manual trader, I can use fake bids/offers and make the algo buy/sell into my real bid/offer” – suggesting that sometimes bot trading is less effective.5

A trading algorithm’s effectiveness is a function of the people who developed it and the extent to which the historical data on which it was tested is consistent with future market conditions. Despite their statistical sophistication, most algorithmic trading systems have “kill switches” to revert to human trading judgment under extreme market conditions.
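A minimal sketch of the kind of kill-switch check described above. The function name, threshold, and inputs are illustrative assumptions, not taken from any real trading system:

```python
# Illustrative "kill switch" check: halt automated trading when prices
# move further from a reference level than the strategy was ever tested
# against. The 10% default threshold is a made-up example value.

def should_halt(last_price: float, reference_price: float,
                max_move_pct: float = 10.0) -> bool:
    """Return True if the price move exceeds the tested range, signaling
    that control should revert to human judgment."""
    move_pct = abs(last_price - reference_price) / reference_price * 100
    return move_pct > max_move_pct

# A 15% drop exceeds the 10% tested range, so the algorithm halts.
print(should_halt(85.0, 100.0))   # True
print(should_halt(99.0, 100.0))   # False
```

Production kill switches monitor many more signals (order rates, position sizes, message rejects), but the structure is the same: a cheap check run on every tick that can disable the automated path.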

Dark or Light? – Traders typically want transparency for everyone else’s orders but want to keep their own orders hidden. A disadvantage of trading in a “lit” transparent market is that simply knowing that a large order to sell (buy) is being worked through the market will put downward (upward) pressure on prices. As a result, traders in screen-based markets often choose to use hidden or “iceberg” orders or to trade in dark pools to keep their trading intentions from leaking out into the market (Fig. 4.7).

Fig. 4.7


Example order handling approaches for a trader seeking to sell 50,000 shares of a $75 stock. The current bid and ask quotes are 74.95 and 75.05 and are good for 10,000 shares. The three illustrations shown are only possible outcomes, and any of the three approaches could turn out better or worse under different market circumstances (A major disadvantage of dark pools is their lack of participation in the price discovery process. Please see Chap. 2 (Finance) for a discussion of the advantages versus disadvantages of lit versus dark markets.)
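The iceberg approach in the Fig. 4.7 scenario can be sketched as simple order slicing: a 50,000-share parent order is shown to the market only in 10,000-share visible tranches, so the full size never appears at once. This is only the arithmetic of the slicing; real venues handle iceberg replenishment natively:

```python
# Illustrative "iceberg" slicing for the Fig. 4.7 scenario: sell 50,000
# shares while displaying at most 10,000 shares at a time.

def iceberg_slices(total_qty: int, display_qty: int) -> list[int]:
    """Break a parent order into visible child slices of display_qty
    shares each, plus any remainder."""
    slices = [display_qty] * (total_qty // display_qty)
    remainder = total_qty % display_qty
    if remainder:
        slices.append(remainder)
    return slices

print(iceberg_slices(50_000, 10_000))  # [10000, 10000, 10000, 10000, 10000]
```

The trade-off illustrated in the figure applies here too: each replenished slice signals continued selling interest, so hidden size reduces information leakage but does not eliminate it.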

Eliminate the Middleman? – Many observers predicted that electronic markets would squeeze out dealers and other intermediaries in the financial markets. That has not turned out to be the case, and many intermediaries operate in today’s markets. Investor-driven order flow provides sufficient liquidity in only a handful of the most active stocks and financial instruments. In other markets, supplemental liquidity provision is required, and dealers and market makers step in to buy or to sell when there is an imbalance in the flow of orders. Supplemental liquidity providers tend to have short holding periods and do not try to exploit large moves over long time periods. Often referred to as “scalpers,” they attempt to keep inventory and position risks low while “capturing” the bid-ask spread or taking advantage of small moves that occur frequently. Today’s market-making firms, however, supply liquidity with sophisticated software and analytic risk models. They employ far fewer people than these firms did in the era of floor trading when they operated as dealers such as NYSE specialists or futures market “locals.” While trading costs have come down, maintaining market depth and the quality of price discovery remains a reason that trading intermediation is still desirable.

Fragmented Markets – Technology has driven a proliferation of markets and competing venues for trading. In 2020, the United States had 16 licensed equity exchanges, up from 11 in 2014, including the NYSE, Nasdaq, CBOE, and IEX, and about 50 alternative trading systems (ATSs). ATSs, which include dark pools, accounted for 40% of trading in 2019 according to Rosenblatt Securities. Off-exchange trading also goes through “wholesalers” or market makers, such as Citadel Securities and Virtu Financial, which execute retail orders for brokerage firms with the promise of providing better trading prices. While market makers may end up executing some trades on an exchange or in a dark pool, they often wind up ‘internalizing’ (e.g., buying for their own account when a customer sells) a large portion of the orders by taking on position risk and using their capital to complete them. The evidence suggests this competition is good for market participants, but with multiple trading venues, there are also concerns that fragmentation can impair price discovery and reduce liquidity and also make liquidity more difficult to access. One counterargument is that with sufficient transparency and shared market information, multiple technology-connected trading venues may effectively provide the benefits of a single, consolidated market, but the jury is still out on this one.

Computerized Trading Messaging Standards and FIX – As computerized order routing and trading began to replace phone calls and paper trading tickets in the 1980s, technologists had to work with different vendor-specific electronic communications formats and proprietary messaging standards. The NYSE, for instance, used its own message formats in its Common Message Switch (CMS) that connected traders away from the trading floor to its DOT system. The separate and incompatible interfaces for different exchanges and different brokerage firms created a need to consolidate traders’ points of entry and to realize cost savings by standardizing on a single, open protocol for trade messages that was not controlled by a vendor or an exchange.

In 1992, the Financial Information eXchange (FIX) was founded as a result of a collaboration between IT teams at a “sell-side” firm (Salomon Brothers) and a “buy-side” firm (Fidelity Investments). FIX is a series of specifications for machine-readable messages related to securities transactions and markets and their real-time transmission among market participants. For an IT professional, managing trading applications and keeping latency low increasingly require an understanding of the FIX protocol.

A FIX message is a digital message consisting of a list of fields, with numerical tags and values separated by “|.” Each tag corresponds to a different field for which a certain set of user-entered values is allowed. An example of a FIX message is presented in Fig. 4.8.

Fig. 4.8


Sample FIX message for a limit order to buy at 5 sent by CLIENT12 to Broker A, and a table describing each of the fields in the tagged-field message. (From: http://www.validfix.com/fix-analyzer.html)

The pattern in each FIX message is Tag=Value|Tag=Value|Tag=Value|…. Depending on the purpose of each message, different sets of tags and permitted values are included. By using FIX technology in their trading applications, market participants are effectively agreeing to speak the same “language” with the markets, the exchanges that they use, and the broker-dealer counterparties that serve them.

Further Information Technology Issues – Technology and the use of the trading applications that rely on common standards such as SWIFT and FIX have made traders more productive and have reduced errors that occurred in manual trading. Nonetheless, advances and new applications of IT open complex questions for market regulators and trading organizations that are described below.

Transparency – The amount of information available from markets and the emergence of direct access to the trading process have empowered investors to manage their trading activities more closely. Pre-trade data exchanged in some markets include the identity of the firm that placed the order. However, some participants prefer anonymity to prevent their proprietary activities from being front-run or being “reverse-engineered” by other participants. Even innocuous post-trade information such as the identities of the executing broker and the clearing firms can signal what an investor or hedge fund is doing (such as building up a large position in advance of a takeover offer).

Information Disclosure – There are many types of regularly scheduled public information releases from companies, including their annual financial statements and quarterly reports. Private and insider information is more concerning, since trading on privileged information can be illegal, disadvantages uninformed traders, and erodes confidence in market integrity. Greater sharing and analysis of qualitative, unstructured information – such as the text of a speech or a letter to shareholders – provides heightened visibility into company activities; this could level the playing field and reduce the information asymmetries that potentially harm less sophisticated investors. Hedge funds, not surprisingly, are at the leading edge and have developed proprietary text mining techniques to rapidly assess the positive or negative “sentiment” of speeches, news stories, or company press releases.
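A toy illustration of word-list sentiment scoring of the kind hedge funds apply (in far more elaborate form) to press releases and speeches. The word lists here are invented for illustration; production systems use curated financial lexicons and statistical language models:

```python
# Toy sentiment scorer: count positive minus negative words.
# The word lists are illustrative, not a real financial lexicon.

POSITIVE = {"growth", "profit", "strong", "record", "beat"}
NEGATIVE = {"loss", "weak", "decline", "miss", "impairment"}

def sentiment_score(text: str) -> int:
    """Positive minus negative word counts; the sign gives the tone."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("record profit and strong growth this quarter"))  # 4
print(sentiment_score("a weak quarter with a large impairment loss"))   # -3
```

Even this crude approach hints at why speed matters: a machine can score a press release in microseconds, long before a human has finished the first sentence.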

Complexity – As markets have innovated and competition among trading venues for order flow has grown, new complexities and challenges are emerging. In the past, the fees charged to broker-dealer firms for their trade executions were fairly uniform across stock exchanges. A new range of rebate approaches and fee models have developed to attract order flow. The use of incentives for certain order types was pioneered by the Island ECN in the late 1990s. In its maker/taker model, Island attracted limit order users by rebating $0.002 per share if their order traded and charging the market order a $0.003 per share fee. Island kept the difference. Recently, some trading venues have inverted this model to charge the limit order trader and rebate the “taker.” Such incentives can lead to orders being routed not based on where the best price discovery and liquidity are, but on where the firm will maximize its payment for order flow. The results could be pricing distortions and publicly visible bid/offer prices in the market that are less accurate since rebates and other discounts are hidden.
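The maker/taker arithmetic described above works out as follows, per share: the resting limit order (“maker”) earns a $0.002 rebate, the incoming market order (“taker”) pays a $0.003 fee, and the venue keeps the $0.001 difference. A minimal sketch, using the figures from the text:

```python
# Island-style maker/taker economics, using the per-share figures
# from the text: $0.002 rebate to the maker, $0.003 fee to the taker.

MAKER_REBATE = 0.002   # paid to the limit-order trader, per share
TAKER_FEE = 0.003      # charged to the market-order trader, per share

def venue_revenue(shares: int) -> float:
    """Venue's net take: taker fees collected minus maker rebates paid."""
    return round(shares * TAKER_FEE - shares * MAKER_REBATE, 2)

# On a 10,000-share execution the venue nets $10.
print(venue_revenue(10_000))   # 10.0
```

In an “inverted” venue the signs flip: the maker pays and the taker is rebated, which is why routing decisions driven by rebates rather than execution quality can distort where orders go.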

System Reliability – Like other technologies, trading systems are subject to failures, breakdowns, and unanticipated responses to conditions. In 2012, a prominent market-making firm, Knight Capital, caused a major stock market disruption and suffered a $440 million trading loss. A significant error in the operation of its automated routing software for equity orders caused it, in roughly 45 minutes, to route millions of orders into the market that resulted in over 4 million trades in 154 stocks for nearly 400 million shares. For instance, the flood of orders to buy shares of Wizzard Software Corporation caused its price to move from $3.50 to $14.76. The SEC’s clearly erroneous trade rules,6 developed after the 2010 “flash crash,” allow trades at least 30% away from the “reference price” to be cancelled, which is what happened for Wizzard and five other stocks.

Knight was found to have violated SEC rules requiring broker-dealers to have controls and procedures to limit the risks associated with automated trading systems and to prevent these types of errors. Mary Schapiro, then chairperson of the Securities and Exchange Commission (SEC), recommended that the voluntary guidelines – known as the Automation Review Policies, which have covered technology systems since the 1987 crash – become mandatory: “As the SEC catches up with the realities of today’s market, it seems an appropriate moment to require that every entity in an interconnected system work to ensure its capacity, resiliency, and security.”

A further example occurred on Thursday, October 1, 2020, when the Tokyo Stock Exchange experienced a full-day shutdown of its market after a data device malfunctioned and the switchover to the backup device failed. Testing and backup plans are crucial because of the numerous interconnections and interdependencies and because changes to components or software in one “layer” of the stack can trigger unanticipated breakdowns elsewhere.

Open Architecture and Scalability – Market systems are designed today with open architectures that make adding or upgrading components efficient. Non-proprietary approaches are more cost-effective and provide a high degree of scalability. An added benefit is not being tied into a single vendor. Today’s trading platforms are built on open source software and can be deployed in many different data center and cloud environments.



A simplified three-layer IT stack. Components in each layer perform defined, self-contained functions yet interact with the other components and other layers using common standards and established communications protocols