Tech Failures: Can We Live With Them?

KEY TAKEAWAYS

As we turn over more and more of our lives to intelligent systems, we must demand quality - or face the consequences.

On August 17, 2012, New York City Mayor Michael Bloomberg announced that the much-ballyhooed Bike Share program, first announced in 2011, would not launch in November 2012 (a date that had already slipped from the original target of July 2012) but would instead slip to a projected implementation date of March 2013. Why? The software, the mayor explained, didn’t work, and the city would not launch the program until it did.

That makes sense, but Bloomberg’s statement doesn’t seem laden with confidence, does it? One can hardly blame him; his tenure as mayor has been plagued with costly software glitches and software-related fraud. In March 2012, the city reached a settlement agreement with SAIC under which the company will pay a total of $500.4 million in fines and penalties for overcharging for work and ignoring kickbacks on an employee time-management system called CityTime, which came in hundreds of millions of dollars over budget.

In addition to the SAIC problem, in the same month the city’s comptroller, John Liu, issued an audit report stating that the Emergency Communications Transformation Program (ECTP), a technology-based system designed to improve the handling of the more than 12 million emergency calls the city receives each year, was seven years behind schedule and $1 billion over budget. Speaking on radio station WNYC, Liu said, “Years of mismanagement have led to this incredibly enormous budget overrun, and to date, [ECTP] is still not fully operational.” In May 2012, the mayor’s office began cutting costs on the project in response to the comptroller’s audit.

It’s ironic that Mayor Bloomberg, who began his rise to fortune by building Wall Street information systems, should be plagued by software-related problems. But these problems are not his alone. They are popping up, often with disastrous or near-disastrous results, throughout the country in both the public and private sectors.

At the end of 2011, InformationWeek, an IT trade publication, published its “Top 10 Government IT Flops Of 2011,” a rundown of major government IT deployments that suffered security snafus, fraud fiascoes, budget breakdowns and more. New York’s CityTime was fourth on the list. But as you can imagine, New York City is not the only government entity with IT problems.

One big problem area is Wall Street. For example, in August 2012, a trading firm called Knight Capital lost $440 million in 45 minutes after installing faulty software. Understanding trading software requires some knowledge of the complexities of the worldwide securities market. Brokerage firms trade both for customers (as agents) and for their own accounts (as principals). When acting as agents, they may receive specific orders from clients, which they then transmit to a stock exchange or a computerized trading service. Or they may have discretionary power over an account, in which case they make the trades they believe are in the client’s best interest. In the latter case, trading resembles trading for the firm’s own account: decisions are based on a judgment about the security and the relevant market factors.


For years, firms have been attempting to codify the factors that underlie these judgments, which include market conditions, security data, industry information and economic data. The trading strategy that the firm wishes to follow for that security, industry, or general market condition is incorporated into a system that monitors all of the appropriate elements in real time and, when the conditions are met, triggers automatic buy or sell orders. This is called program trading.
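As a rough illustration of what such codified judgments look like in practice, here is a minimal, hypothetical sketch; the rule names, thresholds and data fields are illustrative assumptions, not taken from any real firm’s strategy or trading platform.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical real-time snapshot for one security (fields are illustrative).
@dataclass
class Quote:
    symbol: str
    price: float
    moving_avg_50d: float  # 50-day moving average price
    volume: int            # shares traded so far today


def evaluate_rules(quote: Quote) -> Optional[str]:
    """Apply codified trading rules to a quote and return "BUY", "SELL",
    or None if no rule fires. The thresholds stand in for a firm's
    proprietary strategy."""
    # Rule 1: price well below its 50-day average suggests a buying opportunity.
    if quote.price < 0.95 * quote.moving_avg_50d:
        return "BUY"
    # Rule 2: price well above the average on heavy volume suggests taking profits.
    if quote.price > 1.05 * quote.moving_avg_50d and quote.volume > 1_000_000:
        return "SELL"
    return None


if __name__ == "__main__":
    q = Quote(symbol="XYZ", price=47.10, moving_avg_50d=50.00, volume=800_000)
    decision = evaluate_rules(q)
    # In a real program trading system the order would be routed to an
    # exchange automatically; here we simply print the decision.
    print(decision or "No action")
```

A real system runs rules like these against live market feeds for thousands of securities at once, which is what makes the automation both valuable and, when the rules are wrong, dangerous.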

This system often works very well. The problem is, so many firms now have program trading systems that one firm’s automatic sale may trigger another firm’s conditions, triggering more automatic reactions that could send a stock – or even the market – into a tailspin. So, the stock exchanges have put in their own triggers, requiring firms to turn off their program trading systems when it is deemed that market conditions require it.
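As a rough sketch of how such an exchange-mandated safeguard works in principle, the check amounts to comparing a broad market index against a trigger level and suspending automated orders when it is breached; the 7 percent threshold and index values below are hypothetical, chosen for illustration rather than drawn from the exchanges’ actual rules.

```python
# Hypothetical circuit-breaker check: suspend program trading when a broad
# market index has fallen past a trigger level. The 7% threshold and the
# index values are illustrative only.
def program_trading_allowed(index_open: float, index_now: float,
                            halt_threshold_pct: float = 7.0) -> bool:
    drop_pct = (index_open - index_now) / index_open * 100
    return drop_pct < halt_threshold_pct


if program_trading_allowed(index_open=13_000.0, index_now=11_900.0):
    print("Program trading may continue")
else:
    print("Circuit breaker tripped: automated orders suspended")
```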

Even so, time is money in the stock market, and high-speed trading firms have thrived in the computerized markets. According to the Tabb Group, as of 2012 they account for more than half of all stock trading. That means there is constant competition among firms to keep upgrading their systems to provide faster executions, which, as it turns out, is what led to the installation of the Knight software that almost put the firm out of business.

It is easy for the non-IT person to assume that a faulty system simply wasn’t tested very well and that its developers should have done a better job. That’s true, but as systems become more and more complex, it is often difficult to know what must be tested or how exhaustive the testing must be, especially when many of the possible points of failure are themselves unpredictable.

And some critics say it can only get worse. James Martin, in his wonderful 2000 book, “After the Internet: Alien Intelligence,” writes of software that, once implemented (after thorough testing, one hopes), is “adaptive” in that it constantly looks for more efficient ways to reach the desired output and “self-modifies” its own code. What that means is that at some point we will know “what a system does but not necessarily how it does it.”

Martin feels that we must not only control this new methodology, but also embrace it to remain competitive. Competition creates continual pressure to have the best and fastest systems, and the financial industry is not unique in this.

We have all experienced some form of techno-failure: the blue screen of death, virus attacks, software bugs in application programs, hacker attacks, system shutdowns and so on. But as we turn over more and more of our lives to intelligent systems, this problem will grow far beyond frustrating – and could become downright dangerous.

That means that, as consumers, we must require more quality in these systems. That will involve more educated and professional system developers, more exacting testing procedures and, on the consumer end, more knowledgeable end users and more demanding consumers.

Of course, there’s always the option to go with the flow. Unfortunately, recent history suggests that would be not only frustrating and expensive, but possibly very damaging as well.

John F. McMullen
Editor

John F. McMullen lives with his wife, Barbara, in Jefferson Valley, New York, in a converted barn full of pets (dog, cats, and turtles) and books. He has been involved in technology for more than 40 years and has written more than 1,500 articles, columns and reviews about it for major publications. He is a professor at Purchase College and has previously taught at Monroe College, Marist College and the New School for Social Research. McMullen has a wealth of experience in both technology and in writing for publication. He has worked as a programmer, analyst, manager and director of…
