IT Cost Cutting on Wall Street

Wall Street’s Mission: Cutting Costs While Staying Competitive
January 10, 2013

A flood of new regulations, coupled with falling profits from once-lucrative areas such as derivatives and equities trading, has Wall Street firms making some tough decisions.

First, they have to slash IT infrastructure costs to boost the bottom line. Trimming budgets is never easy, but it’s a process familiar to all CIOs. A larger and tougher decision is to examine the entire IT infrastructure, along with the existing business units, and decide which lines of business the firm should pursue in the future. For instance, should the IT department continue to run separate market data databases for equities and derivatives? Does the firm even want to be in equities in the future?

“The general theme is more for less,” says Mark Akass, CTO at BT’s Global Banking and Financial Markets, part of BT Radianz, which operates a financial community network that gives members access to 400 providers of services such as order management, trade execution and clearing. “People are looking to achieve more with the same budget. You have to find ways to operate and comply” with the newer regulations. “They have a double whammy,” he adds.


Extreme Makeover: IT Budget Edition

Wall Street & Technology’s January digital issue looks at how Wall Street firms, faced with a slew of new regulations and declining profits, are squeezing more out of their existing IT infrastructures, revisiting outsourcing relationships and modernizing costly legacy systems.

As a reflection of this trend, several brokerage firms plan to carry out massive layoffs to shrink their costs. On Dec. 5 Citigroup said it would cut 11,000 jobs worldwide to produce $1.1 billion in savings. Citi’s global consumer bank will lose 6,200 jobs, an additional 1,900 will be cut from Citi’s capital markets and transaction processing units, and 2,600 from back-office operations. About 3,400 of the cuts will come from technology and operations within those groups.

A New IT Budget Reality

Financial firms’ IT budgets are being hit from all directions. “That’s causing a shift in the IT mindset,” Akass says. The new mindset is going to involve revamping internal processes and hunting for efficiencies in every part of an organization, including architecture, software development, market data and connectivity.

The cuts will be necessary, but firms must continue to refresh their technology to remain competitive in areas such as low-latency trading. However, they also need to improve quality, as evidenced by the software error that nearly toppled Knight Capital Group in August.

“Cost reduction and improved quality are part of every interaction we have,” says Kirk Wylie, founder and CEO of OpenGamma, an open source risk management and analytics platform that caters to hedge funds. As the industry’s focus has shifted from trading exotic derivatives to lower-margin flow products, the 3-year-old startup has caught the attention of a broad array of financial firms seeking real-time risk management, including small regional banks, global Tier 1 investment banks, insurance companies, fund administrators, and large and small hedge funds, Wylie says. Though the application is free to download, clients can get a maintenance contract for support and enhancements, which runs about $100,000 to $500,000 per year.

In the past, bigger firms have made incremental reductions in their IT spending, but now they’re changing their approach. “This is the new normal, and they need to make transformative cuts,” Wylie says. “They have to go in and say, ‘How am I going to cut 20% or 25% of my margins, because I’m never going to achieve the glory days of 2005 or 2006?'”

However, simply taking a hatchet to the IT budget isn’t a good idea, since companies must shift to newer technologies to remain competitive. New competitors, such as startup funds and small brokers, are using faster, more efficient technologies that have an inherent lower cost structure, putting pressure on the established players, says Bill Adiletta, partner in the technology practice at consulting firm Capco.

An established firm can invest capital to enter a low-latency trading business and two years later find a new entrant jumping in at half the cost. “The newer firm never had the sunk cost,” Adiletta says. The dilemma, he says, is “how do you keep advancing your technology fast enough to stay competitive … while figuring out how to write off the sunk capital?”

Streamlining software development life-cycle processes is key to increasing quality and cutting costs, Adiletta says. “If you revisit your development methodologies with quality inherently built into it, you can improve your time to market and lower cost,” he says. Given the technical snafus with the BATS IPO, Nasdaq’s debut of Facebook and more recently Knight Capital, firms have to be more attentive to quality, he says.

The focus on quality is why the industry hears so much talk about agile development, a software development process popular in other technology verticals, he says. Newer entrants are using rapid time to market, embedded disciplined quality management and other approaches to lower overall costs.

The Open Source Advantage

Another way firms are cutting costs is with open source technologies, which carry no licensing fees and can be customized to meet a company’s specific needs.

When Sanjib Sahoo, CTO at optionMonster and tradeMonster, launched an online trading platform in 2008, he turned to open source technologies. “We’re saving a ton of money,” he says.

The online brokerage runs its platform and infrastructure on the LAMP (Linux, Apache, MySQL and PHP) stack. By using MySQL instead of a commercial database, it has saved millions of dollars over the past three years, a significant percentage of its technology budget, Sahoo says. “I think we’re the only brokerage firm primarily running on open source,” says Sahoo, whose 60-person IT team develops everything in-house. The firm has maintained 99.98% uptime, with reliability and scalability that meet the needs of active investors, he says.

OpenGamma’s Wylie notes, however, that in-house development carries steep costs of its own. “Any firm that’s done custom development will find that it can consume 46% of their total annual spend in risk and analytics,” he says. The first version could take 20 person-years and cost $250,000 per person-year, adding up to a $5 million project, he says. And the cost of maintaining, supporting and enhancing the system over its life could be 10 times the initial build cost.
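
Wylie’s arithmetic is easy to check. Here is a minimal sketch of the build-versus-maintain estimate; the headline figures come from his quote, and the total is a straightforward extrapolation from them:

```python
# A minimal sketch of the build-vs-maintain arithmetic in Wylie's estimate.
# The inputs come from the article; the total is extrapolated from them.

person_years = 20                   # initial build effort
cost_per_person_year = 250_000      # USD, fully loaded
maintenance_multiplier = 10         # lifetime maintenance vs. initial build

build_cost = person_years * cost_per_person_year            # $5,000,000
lifetime_maintenance = build_cost * maintenance_multiplier  # $50,000,000

print(f"Initial build:        ${build_cost:,}")
print(f"Lifetime maintenance: ${lifetime_maintenance:,}")
print(f"Total over the life:  ${build_cost + lifetime_maintenance:,}")
```

On those numbers, the years of maintenance dwarf the initial project, which is the crux of the build-versus-buy argument.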

However, Wall Street firms are wary of using open source for trading applications since they don’t want to contribute their proprietary code to the community, Wylie says. “The client doesn’t have to post anything to the community. Firms don’t have to open up their secret sauce to us, but just the stuff that has to talk to our code.”

New Financial Reality

Even with cost cutting, firms will still want to do their own internal development to capture a competitive advantage in whichever market segment they decide to pursue. “The financial game is changing, the margins are getting smaller and, the way the economy is, you have to be set up to trade across asset classes,” says Fintan Quill, senior engineer with Kx Systems, a provider of databases and tools for processing real-time and historical data.

Sun Trading recently expanded its use of Kx Systems’ kdb+ database to support cross-asset trading across new markets and geographies. It uses the platform to enter new markets when an opportunity arises. In contrast, firms with older systems built to handle just one asset class have a difficult time adapting to new markets. The Kx database’s real-time analytics and speed have let Sun Trading quickly bring trading ideas to market, says CIO Kurt Lingel. Sun Trading plans to use Kx to address research and analysis needs as it continues to diversify across multiple asset classes and markets, he says.

[Legacy Outsourcing Requires a Delicate Touch]

Sun Trading’s Chicago and London offices will use the Kx database to capture and analyze large volumes of data in real time for quantitative trading. The move is cost effective because the firm is using the same hardware and staff for all the data, Kx’s Quill says. “They’re getting better bang for their buck by using one technology to house their data,” Quill says.

As specific trading teams look across different asset classes, it’s useful to have data in one system, especially if low latency is involved, he says. “If you’re doing a correlation trade between FX and something else, from a purely technological standpoint, you’ll get a faster return on investment, and you don’t need to develop a separate infrastructure,” says Quill.
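
As a rough illustration of the cross-asset case Quill describes, one could measure the rolling correlation between an FX rate and an equity index. The sketch below uses pandas with invented prices purely for illustration; a kdb+ shop would express the equivalent query in q against tick data held in the same system.

```python
# Illustrative only: rolling correlation between returns of two assets.
# The price series are invented; a live system would use real tick data.
import pandas as pd

prices = pd.DataFrame({
    "eurusd": [1.30, 1.31, 1.29, 1.32, 1.33, 1.31, 1.34, 1.35],
    "spx":    [1426, 1431, 1419, 1440, 1447, 1438, 1452, 1459],
})

returns = prices.pct_change().dropna()

# A short window keeps the example readable; a real signal would tune
# the window to the trade's horizon and run on much finer data.
rolling_corr = returns["eurusd"].rolling(window=4).corr(returns["spx"])
print(rolling_corr)
```

Holding both series in one database is exactly what avoids the duplicate infrastructure Quill mentions.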

As firms drive toward efficiency and cost effectiveness, it’s a prime time to ask what business you want to be in, says Capco’s Adiletta. There’s a big transition under way from simply trading equities to cross-asset trading, he says. In other words, as their business model changes, firms must pursue more market opportunities, which means capturing more data. “When you look at transformation, it’s important to look at expanding your business model to other asset classes and methodologies,” he says.

To achieve a transformative reduction in infrastructure costs, experts suggest that firms will need to learn new ways of deploying infrastructure, such as leveraging the cloud. A hybrid operating model is one option in which firms no longer run dedicated infrastructure, says BT’s Akass. “They can maintain key parts of their infrastructure and integrate it with other companies,” he says. For example, industry-specific clouds, such as NYSE Technologies’ Capital Markets Community Platform and BT Radianz’s cloud, can provide connectivity to numerous exchanges and deliver data from multiple information providers, some of which sit within their own data centers. “They could save money by not having duplicate links and infrastructure,” Akass says.

But no matter what the specific business strategy is, whether it’s a multiasset trading approach or something else, firms need to reduce cost, complexity and time to market. Existing systems have powered the industry for decades, but new regulations and budget pressures are forcing CIOs to rethink all parts of their infrastructure spending.


Open Compute Project Releases Motherboard Tailored for Financial Services Firms

An open source server motherboard specification designed specifically for financial services organizations has been released by the Open Compute Project and AMD at the Open Compute Summit today.

The specification, formerly known as Roadrunner, was started on the back of a napkin at a café in October 2011 at the New York Open Compute event, according to Bob Ogrey, engineering fellow and cloud technical evangelist at AMD. Today, the motherboard specification is known as AMD Open 3.0. It is designed to provide gains in computing flexibility, efficiency and operating cost by simplifying motherboard design with a single base product to address multiple enterprise workloads, including high-performance computing, cloud infrastructure and storage. Fidelity and Goldman Sachs are currently evaluating AMD Open 3.0. Don Duet, managing director and global co-chief operating officer of the technology division at Goldman Sachs, is on the board of directors for the Open Compute Project.

“Along with Fidelity and Goldman Sachs, we have had 12 other companies review the specification,” Ogrey said during an exclusive interview with Wall Street & Technology. “The specs are available to anybody and anyone can use them. This specification is a huge win for lowering technology costs, which is a huge issue on the East Coast” where many financial services organizations are headquartered, Ogrey adds. “Financial services has a variety of needs. They need high-performance computing for systems that are speed sensitive, such as simulations” or trading. “There are also varying needs for storage,” both long- and short-term, as well as for cloud infrastructure, Ogrey continues. With AMD Open 3.0, “they have a standard platform and they can apply it to their different needs. It will simplify server management and it will reduce the number of different server management infrastructures. The IT costs will go way down.”

[For more on the Open Compute Project, read Open Compute Project Aims to Bring Open Source Standards to Data Center Technology.]

Today’s servers are designed with a “one size fits most” approach incorporating many features and capabilities that inefficiently utilize space and power, increasing cost. Mega data centers, such as those run by Google or Facebook, have engineers developing optimized platforms with the minimum set of components for specific workloads, but enterprise users do not usually build their own servers. As a result, many “one size fits all” servers running in financial services data centers waste energy and increase costs. With AMD Open 3.0, the result is a solution with a tailored combination of power, space and cost. The AMD Open 3.0 platform is designed to easily enable IT professionals to “right size” the server to meet specific compute requirements, according to AMD.

“I am a platform guy,” Ogrey said. “I believe this is the beginning of change in the way platform and servers are deployed. It provides correct features, scale, simplifies data center management and reduces cost and OPEX [operational expenditure]. This will be the future of computing service environments going forward. Everyone’s requirements are different, but having a flexible and modular approach is important” for today’s enterprise users, Ogrey added.

“This is a realization of the Open Compute Project’s mission of ‘hacking conventional computing infrastructure,'” said Frank Frankovsky, Chairman of the Open Compute Foundation and VP of Hardware Design and Supply Chain at Facebook, in a release. “What’s really exciting for me here is the way the Open Compute Project inspired AMD and specific consumers to collaboratively bring our ‘vanity-free’ design philosophy to a motherboard that suited their exact needs.”

Technical Specifications

According to AMD, Open 3.0 is powered by the AMD Opteron 6300 Series processor and can be installed in all standard 19″ rack environments without modification. The motherboard is a 16″ x 16.5″ board designed to fit into 1U, 2U or 3U rack-height servers. It features two AMD Opteron 6300 Series processors, each with 12 memory sockets (4 channels with 3 DIMMs each), 6 Serial ATA (SATA) connections per board, a dual-channel gigabit Ethernet NIC with integrated management, up to 4 PCI Express expansion slots, a mezzanine connector for custom module solutions, 2 serial ports and 2 USB ports. Specific PCI Express card support depends on the use case and chassis height.

There are also three recommended system configurations for high-performance computing (HPC), general purpose and storage.

HPC

  • 1 DIMM per channel (U/RDDR3 1600 MHz, 1866 MHz stretch goal)
  • Fits into the 1U chassis
  • Cooling and Power for SE 140W parts
  • 1 SR5670 tunnel
  • Supports 6 SATA drives natively off of the “Southbridge”
  • Supports up to ten 2.5″ total drives with add-in card (Full details in section 7.2)
  • Supports up to 2 low-profile PCIe cards or 1
  • Single 1Gb on-board controller via BCM5725
  • 10Gb solution via add-in mezzanine card

General Purpose

  • 3 DIMM per channel (Up to 1600 MHz support)
  • Fits into 2U chassis
  • Cooling and Power for SE 140W Parts
  • Support for twenty-five 2.5″ SATA/SAS drives
  • Supports up to 1 standard-height and 1 low-profile PCIe cards (2 cards total)
  • Single 1Gb on-board controller via BCM5725
  • 10Gb solution via add-in mezzanine card

Storage

  • 3 DIMM per channel (Up to 1600 MHz support)
  • Fits into 3U chassis
  • Cooling and Power for SE 140W Parts
  • Support for thirty-five 2.5″ SATA/SAS drives
  • Supports up to 4 full-height, short PCIe cards
  • 10Gb solution via add-in mezzanine card


Stale Data Forces Nasdaq to Trade in Dark


January 3, 2013
Laton McCartney

For eleven minutes Thursday, the Nasdaq Stock Market’s consolidated data feeds across multiple channels shut down as the exchange investigated what it termed “stale data” in a message to customers.

The “stale data” on the UTP SIP, better known as Tape C, meant that trading in Nasdaq-listed stocks continued but wasn’t reported in real time, according to market participants.
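
In practice, a feed consumer can only catch this sort of problem by comparing the SIP’s timestamps against its own clock. Below is a minimal sketch of such a staleness check; the threshold and fields are assumptions for illustration, not part of the UTP specification.

```python
# Illustrative staleness check for a consolidated-feed consumer.
# The 2-second threshold and message fields are assumptions, not UTP spec.
import time

STALE_THRESHOLD_SECONDS = 2.0

def is_stale(sip_timestamp, now=None):
    """Flag a message whose SIP timestamp lags receipt time too far."""
    if now is None:
        now = time.time()
    return (now - sip_timestamp) > STALE_THRESHOLD_SECONDS

# A print stamped eleven minutes ago would be flagged immediately.
print(is_stale(time.time() - 11 * 60))  # True
```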

“This occurred eight minutes before the Federal Open Market Committee (FOMC) released the minutes from its latest meeting so the timing wasn’t good,” notes Eric Hunsader, founder of Nanex, a market data aggregator. “Some big block trades were carried out when Tape C went dark, and no one saw them.”

Nasdaq was not the only exchange impacted. A BATS spokeswoman says that both BATS national exchanges were also affected.

Tape C is administered by Nasdaq along with other exchanges and associations, which determine policy matters and oversee system operations.

Both the UTP Quote Data Feed (UQDF), which provides continuous quotations from all market centers trading Nasdaq-listed securities, and the UTP Trade Data Feed (UTDF), which provides continuous last-sale information for those same market centers, were out, so stocks such as Apple and Netflix appeared to stop trading.

Nasdaq didn’t respond to questions regarding what had triggered the data feed problems or how they were resolved.

But its market system status notices said the problems with the data feed extended from 1:42 to 1:53 p.m. At that point, Nasdaq said, “The issue with the UQDF and UTDF feeds affecting multiple channels has been resolved. All systems are operating normally.”

BATS also issued this alert at 1:39 p.m.: “Please be advised that BATS is currently investigating issues sending and receiving quotes to and from the UTP SIP. BATS is communicating with the UTP SIP as they work to resolve the issue. Will advise once UTP SIP issues are resolved.”

At 1:57:54 p.m., BATS reported, “Issues quoting and printing to the UTP SIP have been resolved. All BATS systems are operating normally.”

“This didn’t occur all at once,” says Hunsader. “Different exchanges were affected for different lengths of time.”

 

Scary issues, scary times.


Bond Guru Bill Gross’s 2013 Predictions in Charts (Part One)


by Dee Gill, January 2, 2013

Bill Gross, Pimco founder and much-watched bond guru, rang in the New Year by unveiling “2013 Fearless Forecasts” to the Twittersphere. Since he was limited to 140 characters per Tweet, we thought we’d help put his mighty words into context with a few charts.

[Chart: 5-Year Treasury Rate, data by YCharts]

Gross predicted that the 5-Year Treasury bond would end 2013 with a yield of 0.7%, barely changed from year-end levels.

[Chart: ^SPX, data by YCharts]

He expects stocks and bonds generally to return 5% in 2013.

[Chart: US Unemployment Rate, data by YCharts]

The U.S. unemployment rate will stay at 7.5% or higher, he says.

While some of those forecasts may seem less than bold, they are simply broad strokes of Gross’s generally bearish views these days, which can be read in full detail on Pimco’s website. Sovereign debt, as well as high debt levels at financial institutions and in households, will make general economic growth very difficult, he says.

Part Two of Gross’s predictions, illustrated in charts, is next.

Dee Gill, a senior contributing editor at YCharts, is a former foreign correspondent for AP-Dow Jones News in London, where she covered the U.K. equities market and economic indicators. She has written for The New York Times, The Wall Street Journal, The Economist and Time magazine. She can be reached at editor@ycharts.com.




High-Speed Regulation

This is an interview of Securities Technology Monitor editor-in-chief Tom Steinert-Threlkeld by Peter Fedynsky, New York correspondent for the Voice of America.

I find it raises interesting questions. The idea of a large data warehouse storing every single trade across markets poses interesting technological challenges. Once stored, the data needs to be analyzed without delay. And the only way to really regulate with this information is to be able to dissect trades on the same instruments from different firms, trades on different instruments from the same firm, and so on. Considering some of these markets are not yet electronic, this seems to me like a serious challenge. Not to mention the basic fact that the sheer amount of information collected would make it very hard to process.
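
To make the dissection point concrete, here is a toy sketch of the two cuts a regulator would need over such a trade warehouse: the same instrument across firms, and the same firm across instruments. The schema is invented for illustration; at market scale these become distributed queries over billions of rows.

```python
# Toy illustration of slicing a consolidated trade store two ways.
# The schema (firm, instrument, qty) is invented for this example.
import pandas as pd

trades = pd.DataFrame({
    "firm":       ["A", "A", "B", "B", "C"],
    "instrument": ["AAPL", "ES", "AAPL", "NFLX", "AAPL"],
    "qty":        [100, 5, 250, 80, 40],
})

# Same instrument, different firms: who is active in AAPL?
print(trades[trades["instrument"] == "AAPL"].groupby("firm")["qty"].sum())

# Different instruments, same firm: what is firm A doing across markets?
print(trades[trades["firm"] == "A"].groupby("instrument")["qty"].sum())
```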

Gaps and probability

Scott Andrews “The Gap Guy” has a great post today. On top of covering his favorite topics, gaps, he clearly illustrates the power of probability and backtesting.

I was a subscriber of Scott’s for a while, and he offers his subscribers truly valuable probabilities on gaps. I will be back. I would like to develop a way to put all this information into an ATS.
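
For flavor, here is a toy version of the kind of gap statistic he computes: the fraction of opening gaps that trade back to the prior close in the same session. The column names and the fill definition are my assumptions, not Scott’s actual method.

```python
# Toy gap-fill backtest: what fraction of opening gaps fill intraday?
# The "fill" definition (price revisits prior close) is an assumption.
import pandas as pd

def gap_fill_probability(df: pd.DataFrame) -> float:
    """Fraction of opening gaps that revisit the prior close intraday.

    Expects daily bars with columns: open, high, low, close.
    """
    prior_close = df["close"].shift(1)
    gap_up = df["open"] > prior_close
    gap_down = df["open"] < prior_close
    filled = (gap_up & (df["low"] <= prior_close)) | \
             (gap_down & (df["high"] >= prior_close))
    gaps = gap_up | gap_down
    if gaps.sum() == 0:
        return float("nan")
    return float(filled.sum() / gaps.sum())

# A few invented bars: two of the three gaps fill.
bars = pd.DataFrame({
    "open":  [100, 103, 101,  99],
    "high":  [102, 104, 102, 101],
    "low":   [ 99, 100,  99,  98],
    "close": [101, 103, 100, 100],
})
print(gap_fill_probability(bars))  # 0.666...
```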

73 de Stoploss.

Electronic Trading Creates a New Financial Landscape

A SUBSTANTIAL part of all stock trading in the United States takes place in a warehouse in a nondescript business park just off the New Jersey Turnpike.

Few humans are present in this vast technological sanctum, known as New York Four. Instead, the building, nearly the size of three football fields, is filled with long avenues of computer servers illuminated by energy-efficient blue phosphorescent light.

Countless metal cages contain racks of computers that perform all kinds of trades for Wall Street banks, hedge funds, brokerage firms and other institutions. And within just one of these cages — a tight space measuring 40 feet by 45 feet and festooned with blue and white wires — is an array of servers that together form the mechanized heart of one of the top four stock exchanges in the United States.

The exchange is called Direct Edge, hardly a household name. But as the lights pulse on its servers, you can almost see the holdings in your 401(k) zip by.

“This,” says Steven Bonanno, the chief technology officer of the exchange, looking on proudly, “is where everyone does their magic.”

In many of the world’s markets, nearly all stock trading is now conducted by computers talking to other computers at high speeds. As the machines have taken over, trading has been migrating from raucous, populated trading floors like those of the New York Stock Exchange to dozens of separate, rival electronic exchanges. They rely on data centers like this one, many in the suburbs of northern New Jersey.

While this “Tron” landscape is dominated by the titans of Wall Street, it affects nearly everyone who owns shares of stock or mutual funds, or who has a stake in a pension fund or works for a public company. For better or for worse, part of your wealth, your livelihood, is throbbing through these wires.

The advantages of this new technological order are clear. Trading costs have plummeted, and anyone can buy stocks from anywhere in seconds with the simple click of a mouse or a tap on a smartphone’s screen.

But some experts wonder whether the technology is getting dangerously out of control. Even apart from the huge amounts of energy the megacomputers consume, and the dangers of putting so much of the economy’s plumbing in one place, they wonder whether the new world is a fairer one — and whether traders with access to the fastest machines win at the expense of ordinary investors.

It also seems to be a much more hair-trigger market. The so-called flash crash in the market last May — when stock prices plunged hundreds of points before recovering — showed how unpredictable the new systems could be. Fear of this volatile, blindingly fast market may be why ordinary investors have been withdrawing money from domestic stock mutual funds —$90 billion worth since May, according to figures from the Investment Company Institute.

No one knows whether this is a better world, and that includes the regulators, who are struggling to keep up with the pace of innovation in the great technological arms race that the stock market has become.

WILLIAM O’BRIEN, a former lawyer for Goldman Sachs, crosses the Hudson River each day from New York to reach his Jersey City destination — a shiny blue building opposite a Courtyard by Marriott.

Mr. O’Brien, 40, works there as chief executive of Direct Edge, the young electronic stock exchange that is part of New Jersey’s burgeoning financial ecosystem. Seven miles away, in Secaucus, is the New York Four warehouse that houses Direct Edge’s servers. Another cluster of data centers, serving various companies, is five miles north, in Weehawken, at the western mouth of the Lincoln Tunnel. And yet another is planted 20 miles south on the New Jersey Turnpike, at Exit 12, in Carteret, N.J.

As Mr. O’Brien says, “New Jersey is the new heart of Wall Street.”

Direct Edge’s office demonstrates that it doesn’t take many people to become a major outfit in today’s electronic market. The firm, whose motto is “Everybody needs some edge,” has only 90 employees, most of them on this building’s sixth floor. There are lines of cubicles for programmers and a small operations room where two men watch a wall of screens, checking that market-order traffic moves smoothly and, of course, quickly. Direct Edge receives up to 10,000 orders a second.

Mr. O’Brien’s personal story reflects the recent history of stock-exchange upheaval. A fit, blue-eyed Wall Street veteran, who wears the monogram “W O’B” on his purple shirt cuff, Mr. O’Brien is the son of a seat holder and trader on the floor of the New York Stock Exchange in the 1970s, when the Big Board was by far the biggest game around.

But in the 1980s, Nasdaq, a new electronic competitor, challenged that dominance. And a bigger upheaval came in the late 1990s and early 2000s, after the Securities and Exchange Commission enacted a series of regulations to foster competition and drive down commission costs for ordinary investors.

These changes forced the New York Stock Exchange and Nasdaq to post orders electronically and execute them immediately, at the best price available in the United States — suddenly giving an advantage to start-up operations that were faster and cheaper. Mr. O’Brien went to work for one of them, called Brut. The N.Y.S.E. and Nasdaq fought back, buying up smaller rivals: Nasdaq, for example, acquired Brut. And to give itself greater firepower, the N.Y.S.E., which had been member-owned, became a public, for-profit company.

Brokerage firms and traders came to fear that a Nasdaq-N.Y.S.E. duopoly was asserting itself, one that would charge them heavily for the right to trade, so they created their own exchanges. One was Direct Edge, which formally became an exchange six months ago. Another, the BATS Exchange, is located in another unlikely capital of stock market trading: Kansas City, Mo.

Direct Edge now trails the N.Y.S.E. and Nasdaq in size; it vies with BATS for third place. Direct Edge is backed by a powerful roster of financial players: Goldman Sachs, Knight Capital, Citadel Securities and the International Securities Exchange, its largest shareholder. JPMorgan also holds a stake. Direct Edge still occupies the same building as its original founder, Knight Capital, in Jersey City.

Good read.



Looking Ahead to 2011: The Hedge Fund Industry Speaks (Part I)

Dec 21 2010 | 9:02pm ET

The future, as Yogi Berra once said, ain’t what it used to be. And while predicting it is an art, not a science, we couldn’t resist asking a few of the people we’ve spoken with this year what they think 2011 holds for the world’s financial markets in general, and hedge funds in particular.

Here, without further ado, is the first installment of responses from hedge fund industry experts, with more to come in the following days.

Rory Hills, Founding Partner, Hilltop Fund Management

“Part of our approach and philosophy is to generally avoid making macro calls, especially of the ‘what will do well next year?’ variety. That said, one issue we will watch is whether the level of inter-stock correlation in equity markets continues to be as high in 2011 as it has been for long periods of 2010.

“This has been driven by a binary ‘risk on/risk off’ approach by many investors, which has often resulted in equities moving like shoals of fish. This, combined with generally poor liquidity, has created a difficult environment for many shorter-term equity trades, whether fundamental or systematic.

“In respect of fund of hedge funds, I would say this: We believe too many FoHFs continue to provide returns which are too correlated to equity markets (is it any surprise that in 2010, the worst month in terms of performance was May and the best September?) and this is not what investors want.  We have no idea what 2011 holds but we do believe our portfolio will deliver solid positive returns regardless.”

Michael DeJarnette, Co-Founder, NorthPoint Trading Partners

“I believe that in 2011 we will see a return of the mid-size hedge fund ($100 million to $800 million AUM range). That is due to increased barriers to entry for the smallest start-ups and a lack of alpha on the largest side. I think funds of that size will have to focus more on compliance and risk. They will have to leverage technology to do so. I believe that will be accomplished through the use of in-house technology and brand-name service providers. Both the buy-side and sell-side will be at risk of losing competitive advantage to peers that have more robust tech infrastructure.”

Basil Williams, CEO, Concordia Advisors

“In August, Michael Purves of BGC Financial coined the term ‘wolf market’ in a WSJ interview. I tend to agree with Michael’s assessment. The 2011 market, like a wolf, will react quickly and decisively. It will be wily. It will feed on weak prey. As bulls or bears are much bigger and less nimble, they are able to carry the herd. The wolf takes no prisoners. Such markets are environments in which nimble trading strategies like relative-value investing will prevail. Such investment styles recognize market, sector and security-specific overreaction and structure trades that profit from them, while also protecting against sustained adverse macro moves. Profiting by trading with a trend, a great strategy in bullish or bearish times, will be much more difficult in 2011. Alternatively, profiting by seizing upon market inefficiency and short-term overreaction will produce a more desirable return/risk profile.”

Looking Ahead to 2011: The Hedge Fund Industry Speaks (Part II)

A nice look ahead from various experts…
