Wall Street’s Mission: Cutting Costs While Staying Competitive
January 10, 2013
A flood of new regulations, coupled with falling profits from once-lucrative areas such as derivatives and equities trading, has Wall Street firms making some tough decisions.
First, they have to slash IT infrastructure costs to boost the bottom line. While trimming budgets is never easy, it’s a process familiar to all CIOs. A larger, tougher decision is to examine the entire IT infrastructure, along with the existing business units, and decide which lines of business the firm should be in going forward. For instance, does the IT department continue to run separate market data databases for equities and derivatives? Does the firm even want to be in equities in the future?
“The general theme is more for less,” says Mark Akass, CTO of BT Global Banking and Financial Markets, whose BT Radianz unit operates a financial community network that gives members access to 400 service providers for functions such as order management, trade execution and clearing. “People are looking to achieve more with the same budget. You have to find ways to operate and comply” with the newer regulations. “They have a double whammy,” he adds.
Extreme Makeover: IT Budget Edition

Wall Street & Technology’s January digital issue looks at how Wall Street firms, faced with a slew of new regulations and declining profits, are squeezing more out of their existing IT infrastructures, revisiting outsourcing relationships and modernizing costly legacy systems.

As a reflection of this trend, several brokerage firms plan to carry out massive layoffs to shrink their costs. On Dec. 5, Citigroup said it would cut 11,000 jobs worldwide to produce $1.1 billion in savings. Citi’s global consumer bank will lose 6,200 jobs, an additional 1,900 will be cut from Citi’s capital markets and transaction processing units, and 2,600 from back-office operations. About 3,400 of the cuts will come from technology and operations within those groups.
A New IT Budget Reality
Financial firms’ IT budgets are being hit from all directions. “That’s causing a shift in the IT mindset,” Akass says. The new mindset is going to involve revamping internal processes and hunting for efficiencies in every part of an organization, including architecture, software development, market data and connectivity.
The cuts will be necessary, but firms must continue to refresh their technology to remain competitive in areas such as low-latency trading. However, they also need to improve quality, as evidenced by the software error that nearly toppled Knight Capital Group in August.
“Cost reduction and improved quality are part of every interaction we have,” says Kirk Wylie, founder and CEO of OpenGamma, an open source risk management and analytics platform that caters to hedge funds. As the industry’s focus has shifted from trading exotic derivatives to lower-margin flow products, the 3-year-old startup has caught the attention of a broad array of financial firms seeking real-time risk management, including small regional banks, global Tier 1 investment banks, insurance companies, fund administrators, and large and small hedge funds, Wylie says. Though the application is free to download, clients can get a maintenance contract for support and enhancements, which runs about $100,000 to $500,000 per year.
In the past, bigger firms have made incremental reductions in their IT spending, but now they’re changing their approach. “This is the new normal, and they need to make transformative cuts,” Wylie says. “They have to go in and say, ‘How am I going to cut 20% or 25% of my margins, because I’m never going to achieve the glory days of 2005 or 2006?'”
However, simply taking a hatchet to the IT budget isn’t a good idea, since companies must shift to newer technologies to remain competitive. New competitors, such as startup funds and small brokers, are using faster, more efficient technologies that have an inherent lower cost structure, putting pressure on the established players, says Bill Adiletta, partner in the technology practice at consulting firm Capco.
An established firm can invest capital to enter a low-latency trading business and two years later find a new entrant jumping in at half the cost. “The newer firm never had the sunk cost,” Adiletta says. The dilemma, he says, is “how do you keep advancing your technology fast enough to stay competitive … while figuring out how to write off the sunk capital?”
Streamlining software development life-cycle processes is key to increasing quality and cutting costs, Adiletta says. “If you revisit your development methodologies with quality inherently built in, you can improve your time to market and lower costs,” he says. Given the technical snafus with the BATS IPO, Nasdaq’s debut of Facebook and, more recently, Knight Capital, firms have to be more attentive to quality, he says.
The focus on quality is why the industry hears so much talk about agile development, a software development process popular in other technology verticals, he says. Newer entrants are using rapid time to market, disciplined quality management and other approaches to lower overall costs.
The Open Source Advantage
Another way firms are cutting costs is with open source technologies, which are free to license and can be customized to meet a company’s specific needs.
When Sanjib Sahoo, CTO at optionMonster and tradeMonster, launched an online trading platform in 2008, he turned to open source technologies. “We’re saving a ton of money,” he says.
The online brokerage runs its platform and infrastructure on the LAMP (Linux, Apache, MySQL and PHP) stack. By using MySQL instead of a commercial database, it has saved millions of dollars over the past three years, a significant percentage of its technology budget, Sahoo says. “I think we’re the only brokerage firm primarily running on open source,” says Sahoo, whose 60-person IT team develops everything in-house. The firm has had 99.98% uptime, with reliability and scalability that meet the needs of active investors, he says.
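That uptime figure is easier to evaluate when translated into allowable downtime. Below is a minimal back-of-the-envelope sketch (our arithmetic, not the firm’s) of what 99.98% availability implies over a year:

```python
# Back-of-the-envelope check: translate an uptime percentage into the
# maximum downtime it allows per year. Illustrative only; the article
# states the 99.98% figure, not this arithmetic.

HOURS_PER_YEAR = 365.25 * 24  # 8,766 hours

def max_annual_downtime_hours(uptime_pct: float) -> float:
    """Maximum annual downtime implied by a given uptime percentage."""
    return (1 - uptime_pct / 100) * HOURS_PER_YEAR

print(f"99.98% uptime -> {max_annual_downtime_hours(99.98):.2f} hours/year")  # ~1.75
print(f"99.90% uptime -> {max_annual_downtime_hours(99.90):.2f} hours/year")  # ~8.77
```

In other words, 99.98% uptime leaves room for less than two hours of total outage per year.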
OpenGamma’s Wylie agrees that there are savings to be had, particularly compared with the cost of in-house development. “Any firm that’s done custom development will find that it can consume 46% of their total annual spend in risk and analytics,” he says. The first version could take 20 person-years and cost $250,000 per person-year, adding up to a $5 million project, he says. And the cost of maintaining, supporting and enhancing the system over its life could be 10 times the initial build cost.
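Wylie’s build-versus-maintain arithmetic is worth making explicit. Here is a short sketch of the numbers as quoted above (the 10x lifetime-maintenance multiplier is his estimate, not a fixed rule):

```python
# A rough sketch of the build-vs-maintain math Wylie describes above.
# The person-year figures come from the article; the 10x lifetime
# multiplier is his estimate, not a fixed rule.

person_years = 20
cost_per_person_year = 250_000  # dollars

build_cost = person_years * cost_per_person_year  # $5,000,000
lifetime_maintenance = 10 * build_cost            # "10 times the initial build cost"
total_cost_of_ownership = build_cost + lifetime_maintenance

print(f"Initial build:           ${build_cost:>12,}")
print(f"Lifetime maintenance:    ${lifetime_maintenance:>12,}")
print(f"Total cost of ownership: ${total_cost_of_ownership:>12,}")
```

By that math, a $5 million build balloons to a $55 million total cost of ownership over the system’s life, which is the number a buy-versus-build decision should actually weigh.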
However, Wall Street firms are wary of using open source for trading applications since they don’t want to contribute their proprietary code to the community, Wylie says. “The client doesn’t have to post anything to the community. Firms don’t have to open up their secret sauce to us, but just the stuff that has to talk to our code.”
New Financial Reality
Even with cost cutting, firms will still want to do their own internal development to capture a competitive advantage in whichever market segment they decide to pursue. “The financial game is changing, the margins are getting smaller, and the way the economy is, you have to be set up to trade across asset classes,” says Fintan Quill, senior engineer with Kx Systems, a provider of databases and tools for processing real-time and historical data.
Sun Trading recently expanded its use of Kx Systems’ kdb+ database to support cross-asset trading across new markets and geographies. It uses the platform to enter new markets when an opportunity arises. In contrast, firms with older systems built to handle just one asset class have a difficult time adapting to new markets. The Kx database’s real-time analytics and speed have let Sun Trading quickly bring trading ideas to market, says CIO Kurt Lingel. Sun Trading plans to use Kx to address research and analysis needs as it continues to diversify across multiple asset classes and markets, he says.
[Legacy Outsourcing Requires a Delicate Touch]
Sun Trading’s Chicago and London offices will use the Kx database to capture and analyze large volumes of data in real time for quantitative trading. The move is cost effective because the firm is using the same hardware and staff for all the data, Kx’s Quill says. “They’re getting better bang for their buck by using one technology to house their data,” he says.
As specific trading teams look across different asset classes, it’s useful to have data in one system, especially if low latency is involved, he says. “If you’re doing a correlation trade between FX and something else, from a purely technological standpoint, you’ll get a faster return on investment, and you don’t need to develop a separate infrastructure,” says Quill.
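To make the idea concrete, here is a minimal sketch of the kind of cross-asset correlation analysis a single data store enables. Sun Trading runs this sort of query in kdb+/q; Python and pandas are used below purely for illustration, and the instruments, prices and column names are hypothetical:

```python
# A minimal sketch of the cross-asset analysis Quill describes: with FX
# and equity data in one store, a correlation signal is a few lines of
# work. Sun Trading runs queries like this in kdb+/q; Python and pandas
# are used here purely for illustration, and the instruments, prices and
# column names are hypothetical.
import pandas as pd

# Assume one table holds minute bars for both asset classes.
prices = pd.DataFrame({
    "eurusd": [1.3050, 1.3061, 1.3048, 1.3070, 1.3082],
    "spy":    [145.10, 145.32, 145.05, 145.60, 145.88],
})

# Per-bar returns for each instrument.
returns = prices.pct_change().dropna()

# Correlation of returns across the two asset classes, computed from a
# single dataset rather than separate per-asset infrastructures.
print(returns["eurusd"].corr(returns["spy"]))
```

The point is architectural: when both instruments live in the same store, no cross-system plumbing is needed before the analysis can even begin.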
As firms drive toward efficiency and cost effectiveness, it’s a prime time to ask what business you want to be in, says Capco’s Adiletta. There’s a big transition under way from simply trading equities to cross-asset trading, he says. In other words, as their business model changes, firms must pursue more market opportunities by capturing more data. “When you look at transformation, it’s important to look at expanding your business model to other asset classes and methodologies,” he says.
To achieve a transformative reduction in infrastructure costs, experts suggest that firms will need to learn new ways of deploying infrastructure, such as leveraging the cloud. A hybrid operating model is one option in which firms no longer run dedicated infrastructure, says BT’s Akass. “They can maintain key parts of their infrastructure and integrate it with other companies,” he says. For example, industry-specific clouds, such as NYSE Technologies’ Capital Markets Community Platform and BT Radianz’s cloud, can provide connectivity to numerous exchanges and deliver data from multiple information providers, some of which sit within their own data centers. “They could save money by not having duplicate links and infrastructure,” Akass says.
But no matter what the specific business strategy is, whether it’s a multi-asset trading approach or something else, firms need to reduce cost, complexity and time to market. Existing systems have powered the industry for decades, but new regulations and budget pressures are forcing CIOs to rethink all parts of their infrastructure spending.
Open Compute Project Releases Motherboard Tailored for Financial Services Firms
An open source server motherboard specification designed specifically for financial services organizations has been released by the Open Compute Project and AMD at the Open Compute Summit today. The specification, formerly known as Roadrunner, was started on the back of a napkin at a café in October 2011 at the New York Open Compute event, according to Bob Ogrey, engineering fellow and cloud technical evangelist at AMD. Today, the motherboard specification is known as AMD Open 3.0. It is designed to provide gains in computing flexibility, efficiency and operating cost by simplifying motherboard design with a single base product to address multiple enterprise workloads, including high-performance computing, cloud infrastructure and storage. Fidelity and Goldman Sachs are currently evaluating AMD Open 3.0. Don Duet, managing director and global co-chief operating officer of the technology division at Goldman Sachs, is on the board of directors for the Open Compute Project.
“Along with Fidelity and Goldman Sachs, we have had 12 other companies review the specification,” Ogrey said during an exclusive interview with Wall Street & Technology. “The specs are available to anybody and anyone can use it. This specification is a huge win for lowering technology costs, which is a huge issue on the East Coast” where many financial services organizations are headquartered, Ogrey adds. “Financial services has a variety of needs. They need high-performance computing for systems that are speed sensitive, such as simulations” or trading. “There are also varying needs for storage,” both long- and short-term storage, as well as for cloud infrastructure, Ogrey continues. With AMD Open 3.0, “they have a standard platform and they can apply it to their different needs. It will simplify server management and it will reduce the number of different server management infrastructures. The IT costs will go way down.”
[For more on the Open Compute Project, read Open Compute Project Aims to Bring Open Source Standards to Data Center Technology.]
Today’s servers are designed with a “one size fits most” approach, incorporating many features and capabilities that use space and power inefficiently, increasing cost. Mega data centers, such as those run by Google or Facebook, have engineers developing optimized platforms with the minimum set of components for specific workloads, but enterprise users do not usually build their own servers. As a result, many “one size fits all” servers running in financial services data centers waste energy and increase costs. AMD Open 3.0 aims to deliver a tailored combination of power, space and cost, and the platform is designed to let IT professionals easily “right size” servers to meet specific compute requirements, according to AMD.
“I am a platform guy,” Ogrey said. “I believe this is the beginning of change in the way platform and servers are deployed. It provides correct features, scale, simplifies data center management and reduces cost and OPEX [operational expenditure]. This will be the future of computing service environments going forward. Everyone’s requirements are different, but having a flexible and modular approach is important” for today’s enterprise users, Ogrey added.
“This is a realization of the Open Compute Project’s mission of ‘hacking conventional computing infrastructure,'” said Frank Frankovsky, Chairman of the Open Compute Foundation and VP of Hardware Design and Supply Chain at Facebook, in a release. “What’s really exciting for me here is the way the Open Compute Project inspired AMD and specific consumers to collaboratively bring our ‘vanity-free’ design philosophy to a motherboard that suited their exact needs.”
Technical Specifications
According to AMD, Open 3.0 is powered by the AMD Opteron 6300 Series processor and can be installed in all standard 19″ rack environments without modification. The motherboard is a 16″ x 16.5″ board designed to fit into 1U, 2U or 3U rack height servers. It features two AMD Opteron 6300 Series processors, each with 12 memory sockets (4 channels with 3 DIMMs each), 6 Serial ATA (SATA) connections per board, one dual-channel gigabit Ethernet NIC with integrated management, up to 4 PCI Express expansion slots, a mezzanine connector for custom module solutions, 2 serial ports and 2 USB ports. Specific PCI Express card support depends on usage case and chassis height.
There are also three recommended system configurations: high-performance computing (HPC), general purpose and storage.
HPC
- 1 DIMM per channel (U/RDDR3 1600 MHz; 1866 MHz stretch goal)
- Fits into the 1U chassis
- Cooling and Power for SE 140W parts
- 1 SR5670 tunnel
- Supports 6 SATA drives natively off the “Southbridge”
- Supports up to ten 2.5″ total drives with add-in card (Full details in section 7.2)
- Supports up to 2 low-profile PCIe cards or 1
- Single 1Gb on-board controller via BCM5725
- 10Gb solution via add-in mezzanine card
General Purpose
- 3 DIMM per channel (Up to 1600 MHz support)
- Fits into 2U chassis
- Cooling and Power for SE 140W Parts
- Support for twenty-five 2.5″ SATA/SAS drives
- Supports up to 1 standard-height and 1 low-profile PCIe cards (2 cards total)
- Single 1Gb on-board controller via BCM5725
- 10Gb solution via add-in mezzanine card
Storage
- 3 DIMM per channel (Up to 1600 MHz support)
- Fits into 3U chassis
- Cooling and Power for SE 140W Parts
- Support for thirty-five 2.5″ SATA/SAS drives
- Supports up to 4 full-height, short PCIe cards
- 10Gb solution via add-in mezzanine card
‘Stale Data’ Forces Nasdaq to Trade in Dark
January 3, 2013
By Laton McCartney

For eleven minutes Thursday, the Nasdaq Stock Market’s consolidated data feeds shut down across multiple channels while the exchange investigated what it termed “stale data” in a message to customers.
The “stale data” on the UTP SIP, better known as Tape C, meant that trading in Nasdaq-listed stocks continued but trades weren’t reported in real time, according to market participants.
“This occurred eight minutes before the Federal Open Market Committee (FOMC) released the minutes from its latest meeting so the timing wasn’t good,” notes Eric Hunsader, founder of Nanex, a market data aggregator. “Some big block trades were carried out when Tape C went dark, and no one saw them.”
Nasdaq was not the only exchange impacted. A BATS spokeswoman says both of BATS’ national exchanges were also affected.
Tape C is administered by Nasdaq along with other exchanges and associations, which determine policy matters and oversee system operations.
Both the UTP Quote Data Feed (UQDF), which provides continuous quotations from all market centers trading Nasdaq-listed securities, and the UTP Trade Data Feed (UTDF), which provides continuous last-sale information for all market centers trading Nasdaq-listed securities, were out, and stocks such as Apple and Netflix appeared to stop trading.
Nasdaq didn’t respond to questions regarding what had triggered the data feed problems or how they were resolved.
But its market system status notices said the problems with the data feed extended from 1:42 to 1:53 p.m. At that point, Nasdaq said, “The issue with the UQDF and UTDF feeds affecting multiple channels has been resolved. All systems are operating normally.”
BATS also issued this alert at 1:39 p.m.: “Please be advised that BATS is currently investigating issues sending and receiving quotes to and from the UTP SIP. BATS is communicating with the UTP SIP as they work to resolve the issue. Will advise once UTP SIP issues are resolved.”
At 1:57:54 p.m., BATS reported, “Issues quoting and printing to the UTP SIP have been resolved. All BATS systems are operating normally.”
“This didn’t occur all at once,” says Hunsader. “Different exchanges were affected for different lengths of time.”
Scary issues, scary times.
Bond Guru Bill Gross’s 2013 Predictions in Charts (Part One)
by Dee Gill
January 2, 2013

Bill Gross, Pimco founder and much-watched bond guru, rang in the New Year by unveiling “2013 Fearless Forecasts” to the Twittersphere. Since he was limited to 140 characters per tweet, we thought we’d help put his mighty words into context with a few charts.
[Chart: 5 Year Treasury Rate data by YCharts]
Gross predicted that the 5-Year Treasury bond would end 2013 with a yield of 0.7%, barely changed from year-end levels.
He expects stocks and bonds generally to return 5% in 2013.
[Chart: US Unemployment Rate data by YCharts]
The U.S. unemployment rate will stay at 7.5% or higher, he says.
While some of those forecasts may seem less than bold, they are simply broad strokes of Gross’s generally bearish views these days, which can be read in full detail on Pimco’s website. Sovereign debt, as well as high debt levels at financial institutions and in households, will make general economic growth very difficult, he says.
Part Two of Gross’s predictions, illustrated in charts, is next.
Dee Gill, a senior contributing editor at YCharts, is a former foreign correspondent for AP-Dow Jones News in London, where she covered the U.K. equities market and economic indicators. She has written for The New York Times, The Wall Street Journal, The Economist and Time magazine. She can be reached at editor@ycharts.com.