January 9, 2012
Inside Market Data - Special Report
IMD: What economic trends do you expect to see in 2012 for the data industry, and how do you expect firms’ market data budgets to change? What will be the key drivers of spend by end-users and data providers this year?
Douglas B. Taylor, managing partner, Burton-Taylor International Consulting: Although we might expect some improvement in the US economy during 2012, we may not see any dramatic impact on the market data industry until after the US presidential elections. Corporations and financial institutions, which posted positive results in 2011, can be expected to remain hesitant to invest at any significant level until after the politics and policies are sorted out in the US.
Economic stability in Western Europe will also play a role in market data sales, and frankly, a cautious approach to investment should be expected there as well.
From a more positive economic perspective, emerging market economies are generally outperforming developed markets and provide opportunity. Market data participants should expect to find the highest demand [for data on markets] in India, China, Indonesia and Brazil, but should also expect opportunity in South Africa, the MENA region, and selected countries in Latin America and in Central and Eastern Europe. As for product and user demand, Burton-Taylor research indicates that 2012 product demand will be strongest around risk management, pricing and reference data, and emerging market products, and that user demand will return—driven by hedge funds—and continue to be high among commodities and energy information users.
Overall, pressure will remain on market data budgets in 2012. Outside of the growth areas mentioned above, market data pricing will almost certainly remain competitive and aggressive.
IMD: One of the biggest areas of spend in recent years has been on reducing latency. But as latency approaches zero, what latency-related gains can firms pursue to achieve an advantage?
Paul Barringer, chief executive, Burstream: A few observations, made with the help of customers and prospective customers over the past year: Large trading firms, including tier-one banks, that have huge investments in trading and network infrastructure are willing to examine managed service offerings for direct market access (DMA) solutions as long as their requirements are met. They see this as a way (in some cases) to reduce costs, and also as a lower-risk, quicker, and more flexible way to gain a trading edge. Although they have built this infrastructure and fine-tuned it over many iterations, they don’t necessarily want to continue doing so.
Achieving latency savings in networks, in feed handlers, or even through output data structure changes that put the generation of a trading signal just one instruction away from an order-book update is still important to tier-one banks that offer DMA solutions to principal trading firms, and of the utmost importance to those firms themselves. Missing the trade—not getting filled—still means you lose.
Given the availability of hardware-based systems in the market, and the industry buzz around these systems, it is surprising to observe how many firms are still using software systems to perform feed handler, risk, and execution routing tasks. As we go deeper in discussion with these firms, they often say that super-fast hardware parsing or normalization is nice, but ask how much (more) work their algo will have to perform to generate trading signals. If the answer is “too much,” then they will have given back the latency saved in the first step.
IMD: But speed alone is no longer the advantage it once was. How can firms take the infrastructures built to serve their low-latency needs and adapt them to address new challenges, such as regulatory compliance and the demands of “Big Data”?
Barry Thompson, chief technology officer, Tervela: Like any data network, your low-latency infrastructure is a foundation for powerful, data-heavy applications. Here are some ways to leverage your high-performance assets to address the challenges of 2012 and beyond.
Big Data: In many instances, low latency is equivalent to high capacity. With this infrastructure, you can create innovative solutions for Big Data by utilizing commodity infrastructure without expensive tier-one storage—in effect, using your low-latency network’s data fabric to enable high-performance solutions with the same enterprise resiliency as traditional big-vendor solutions. This leads to tremendous scalability, cost savings, and performance gains for your business.
Compliance: The new consolidated audit trail requirements mandate a move from batch to real-time data acquisition for reporting. Utilizing your low-latency infrastructure and some form of guaranteed data delivery technology allows a drop-in solution for real-time data acquisition: important events are placed onto a guaranteed event stream and delivered to a central location for auditing and processing (a minimal sketch of this pattern appears at the end of this answer). This offloads processing from your core apps, while remaining extremely flexible and responsive as regulations change.
Disaster recovery: The cost of existing DR solutions built at the operating system and storage layers is becoming a business impediment. The real-time infrastructure that powered orders and highly resilient front-office applications can keep datacenters fully synchronized with hot-hot disaster recovery that transfers state in real time. By transferring some of the mission-critical and Big Data DR problems to low-latency infrastructure and data fabrics, customers can save significant amounts on future infrastructure spend while meeting high-performance SLAs for DR availability.
Big Analysis: Take advantage of all that data and draw deeper insights that will make you more competitive. Your low-latency network is an excellent vehicle to funnel events from big data repositories to a central analysis engine for complex crunching and trending.
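To make the compliance point above concrete, here is a minimal sketch of the pattern: trading applications tap important events onto a stream, and a central consumer drains that stream into an audit store. It is illustrative only; the event fields, table layout and in-process queue are hypothetical stand-ins, and a production system would use a guaranteed-delivery messaging layer rather than a Python queue.

```python
import json
import queue
import sqlite3
import threading
import time

# Illustrative stand-in for a guaranteed event stream. A production system
# would use a persistent, acknowledged messaging layer, not an in-process queue.
audit_stream = queue.Queue()

def publish_order_event(event: dict) -> None:
    """Called from the trading path: tap the event onto the audit stream."""
    event["ts"] = time.time()
    audit_stream.put(event)

def audit_consumer(db_path: str) -> None:
    """Central consumer: drain the stream into an audit store (SQLite here,
    purely for illustration)."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS audit (ts REAL, payload TEXT)")
    while True:
        event = audit_stream.get()
        if event is None:           # shutdown sentinel
            break
        db.execute("INSERT INTO audit VALUES (?, ?)",
                   (event["ts"], json.dumps(event)))
        db.commit()
    db.close()

consumer = threading.Thread(target=audit_consumer, args=("audit.db",))
consumer.start()
publish_order_event({"type": "new_order", "symbol": "XYZ", "qty": 100, "px": 10.25})
publish_order_event({"type": "cancel", "order_id": "A123"})
audit_stream.put(None)
consumer.join()
```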
IMD: As the focus on Big Data increases, mining large volumes of data quickly and presenting them clearly is becoming more important. How is the concept of analytics changing as a result?
Irfan Khan, CTO, Sybase: As businesses look to leverage the value of Big Data and enterprises look further into ways of maximizing the value of information being collected across their organization—particularly the unstructured data—investment in business intelligence (BI) and analytics tools is increasingly occupying the research and development time of both technologists and business line owners.
Companies are at a crossroads, and while some are looking for a quick fix leveraging the traditional Enterprise Data Warehouse (EDW) architecture, forward-looking enterprises are going beyond the conventional infrastructure, exploring new deployment options like “pure” in-memory compute environments, or in some cases looking into more federated architectures that allow for abstraction and virtualization of access and analysis across multiple distributed data stores and archives.
Expanding the conventional infrastructure towards in-memory and columnar architectures will also enable enterprises to analyze both structured and unstructured data in a single consolidated environment; permit processing with a real-time characteristic; and shorten the latency to actionable events.
The goal of many organizations will be to minimize the performance implications of large-scale data movement and materialization of data on the client side before analysis can be performed, as well as their reliance on pre-aggregated data. Having an analytics capability where more in-database-style analytics can be performed natively within the data stores will be a competitive edge.
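As a small illustration of that last point, the sketch below contrasts materializing every row on the client before computing a volume-weighted average price with pushing the same aggregation into the database so that only the result moves. SQLite and the trades table are hypothetical stand-ins for an analytics store, used only to show the pattern.

```python
import sqlite3

# Hypothetical trades table; SQLite stands in for an analytics store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, px REAL)")
db.executemany("INSERT INTO trades VALUES (?, ?, ?)",
               [("ABC", 100, 10.0), ("ABC", 200, 10.5), ("XYZ", 50, 99.0)])

# Client-side materialization: every row crosses the wire before analysis.
vwap_client = {}
for symbol, qty, px in db.execute("SELECT symbol, qty, px FROM trades"):
    notional, volume = vwap_client.get(symbol, (0.0, 0))
    vwap_client[symbol] = (notional + qty * px, volume + qty)
vwap_client = {s: n / v for s, (n, v) in vwap_client.items()}

# In-database analytics: only the aggregated result leaves the store.
vwap_db = dict(db.execute(
    "SELECT symbol, SUM(qty * px) / SUM(qty) FROM trades GROUP BY symbol"))

assert all(abs(vwap_client[s] - vwap_db[s]) < 1e-9 for s in vwap_db)
print(vwap_db)   # per-symbol VWAP computed without moving the raw rows
```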
IMD: With fewer human traders in the market and automated execution now accounting for the majority of trading in commoditized asset classes, how are providers of desktop terminals evolving their products to remain valuable?
Dave Shworan, CEO, QuoteMedia: We believe that the growth of automated trading does not necessarily mean that there are fewer human traders. These are two separate spheres of activity, and in fact the falling costs of data and increasing ease of access have resulted in more opportunities for human traders than there were a few years ago, especially in the retail space.
Of course, it is true that human traders find themselves in a quickly changing landscape—but we see in these shifts a real opportunity. Automated trading is not the only phenomenon that is evolving; millions of human traders are rapidly adapting by taking advantage of improved efficiencies in market data access, the ease of modern software implementation and interactivity, and the aggregation of user-driven content configurations. The Internet is getting smarter not just for machines, but for people as well.
Our broker-dealer and advisory clients indicate to us that their firms have a renewed focus on customer relationships and service, in response to the growth in automated execution and the 2008 meltdown. With the rapid expansion of a tech-savvy demographic worldwide, trading is now a significant element of mass economic culture, and we don’t see an evaporation of human trading on the horizon.
Competitive advantages in the desktop terminal space will consist of total portability, ease of software and trading integration, low latency, relevance and immediacy of aggregated content, synchronized mobile access, and advanced configurable interfaces—all at a low cost.
IMD: With the ongoing move towards automated trading and a more mobile workforce, are mobile technologies becoming the new generation of client interfaces?
Christian Erlandson, CEO, CarryQuote: Smartphones and tablets have lost their novelty and are now embedded in everyone’s daily routine. Many financial institutions spent 2011 thinking about how to capitalize on this fast-moving technology rather than acting on it. But with the consumption of data via mobile devices beginning to outpace web data use, firms can no longer sit on the sidelines and watch this opportunity pass them by.

The challenge now is how to leverage the unique capabilities of mobile devices rather than simply enabling web versions of traditional applications. While much emphasis has been put on HTML5 as an easy way out, the benchmark has been set by best-in-class mobile applications developed for consumers. While internal audiences might be more forgiving, institutional client users have high expectations when it comes to latency, streaming data, advanced charting, alerting and more.

Unlike the web, there is no one platform that provides the sophisticated functionality, ease of navigation and level of security that clients expect on any—and all—of the mobile devices that they use. The first-mover advantage will go to those firms that launch native applications to address the increased demands presented by these use cases.
IMD: As adoption of mobile and retail-inspired technologies increases, institutional “app stores” are starting to take off. What are the advantages—and disadvantages—of the app store model in capital markets?
Thomas Kim, CEO, UNX: As with the mobile phone industry, UNX’s app store model is characterized by complete neutrality and the availability of a software development kit (SDK), which translates into a low barrier to entry for new technology providers. It can also result in an exponentially larger pool of research and development for innovation. And the open model extends beyond the confines of the capital markets to embrace the tremendous advancements being made in Silicon Valley and Silicon Alleys around the world.
Another advantage is cost savings for financial firms, as the app store model revolutionizes distribution of software. The costs of training, installation, software integration and many of the other expenses traditionally incurred as part of bringing software in-house are greatly reduced.
And the entire institutional trading community benefits, as the open-source model ignites greater competition and innovation at a faster pace, in a more collaborative community.
Perhaps the biggest disadvantage is the “patchwork” problem of integrating numerous apps written by different developers into a seamless work environment. At UNX, we’ve solved this problem by establishing carefully enforced standards to optimize the integration and user experience. So all of the apps in our “Catalyst Marketplace” (app store) work together—no matter who the provider is or whether they’re a broker-dealer, trading firm, exchange or software company. And all our technology is broker-neutral.
IMD: But any successful new delivery mechanism needs quality underlying content. What types of content will be hot in 2012, and why?
Emmanuel Doe, president, trading solutions group, Interactive Data: As the performance of various asset classes has been variable in recent years, commodities are increasingly being utilized in trading and investment strategies. I believe that data which helps market participants utilize commodities in those strategies will be of value. Content that helps to manage volatility, protect against inflation and yield new sources of return will likely be more in demand, and I think that information on gold and precious metals will be of particular interest to clients.
On a broader macro level, I expect that issues related to sovereign debt and the euro crisis will continue to attract interest in 2012, as the markets look not only for stability but also for opportunities to leverage currency and asset price movements. Interactive Data plans to expand our offerings to support increased demand for currency data as well as commodities data.
IMD: Whatever content types and sources are in demand, effective trading and risk management depends on being able to centralize it and correlate it with other content. How does this affect the way market participants consume data, and the tools they use to do so?
Rob Passarella, vice president of institutional markets, Dow Jones: As the crisis continues in the markets, firms have realized that macro fundamentals and headlines are driving volatility across multiple asset classes. In these types of situations there is less reliance on fundamentals at the company level. Firms are looking for ways to mitigate reactions to policy-makers while driving new types of trading models. This scenario was also seen in 2008, when politicians’ words and actions drove markets in multiple directions as correlations between markets became acutely positive. From these lessons, institutions and hedge funds are looking for tools and content that allow them to be active when news breaks, as well as the ability to back-test headline events against market data. The keys for providers are concordance and large repositories of timely data, in some cases down to the millisecond.
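At its simplest, back-testing headline events against market data means aligning a timestamped headline with a tick series and measuring the subsequent move, as in the sketch below. The timestamps, prices and horizon are invented for illustration, and a real back-test depends on the concordant news and tick repositories described above.

```python
import bisect

# Hypothetical millisecond-stamped ticks (time, price) and a hypothetical
# headline timestamp, used only to illustrate aligning news with market data.
ticks = [(1_325_680_000_000 + i * 250, 100.0 + 0.01 * i) for i in range(40)]
tick_times = [t for t, _ in ticks]

def return_after(event_ms: int, horizon_ms: int) -> float:
    """Return from the last tick at or before the event to the last tick
    at or before event + horizon (assumes both fall inside the tick window)."""
    i = bisect.bisect_right(tick_times, event_ms) - 1
    j = bisect.bisect_right(tick_times, event_ms + horizon_ms) - 1
    p0, p1 = ticks[i][1], ticks[j][1]
    return (p1 - p0) / p0

headline_ms = 1_325_680_002_000   # hypothetical headline timestamp
print(f"{return_after(headline_ms, 5_000):.4%} move in the 5s after the headline")
```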
IMD: News plays a huge role in moving markets and in firms’ trading decisions. But news is also evolving, becoming an input to algorithmic trading strategies. Is non-machine-readable news or news without sentiment data still a valuable input?
Ryan Terpstra, CEO, Selerity: Unstructured news is still an extremely valuable input into algorithmic trading strategies, but the application and workflow associated with how firms use signals generated from non-machine-readable news differs from how they use structured event data. It’s also important to note that most meaningful unstructured news is unscheduled, systematic news that traditional news outlets have not effectively found a way to structure and deliver. A perfect example is the daily headlines related to the European financial crisis that are generating significant volatility in the market today.
This category of news is typically leveraged for risk management as a defensive strategy. Most firms will typically read a news headline on some form of GUI or terminal and then manually adjust their live automated strategies accordingly. The most common reaction is to either widen their spreads or stop providing liquidity altogether. More sophisticated firms use parsing algorithms that attempt to search for keywords or phrases, then generate a signal that systems can use to take the same set of actions that other firms would implement manually.
At the end of the day, automated trading applications and non-automated traders alike are affected by the same gyrations in the market created by events, which—structured or unstructured—remain a critical input.
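A minimal sketch of the keyword-driven approach Terpstra describes might look like the following. The phrase lists, signal names and example headlines are hypothetical, and real parsing algorithms are considerably more sophisticated before they are allowed to adjust live strategies.

```python
# Hypothetical phrase lists and signal names; real parsing algorithms are far
# richer and are tuned to a firm's own live strategies.
HALT_PHRASES = ["trading halted", "flash crash", "default imminent"]
WIDEN_PHRASES = ["downgrade", "emergency meeting", "bailout talks"]

def headline_signal(headline: str) -> str:
    """Map a raw headline to one of three defensive actions."""
    text = headline.lower()
    if any(phrase in text for phrase in HALT_PHRASES):
        return "PULL_QUOTES"      # stop providing liquidity
    if any(phrase in text for phrase in WIDEN_PHRASES):
        return "WIDEN_SPREADS"    # keep quoting, but with more cushion
    return "NO_ACTION"

print(headline_signal("Ratings agency downgrade of sovereign debt expected"))
print(headline_signal("Exchange says trading halted in several names"))
```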
IMD: But before investing in any type of content, firms must assess the opportunities and risks that it presents, and its value versus its cost. How will firms have to change the way they assess data’s usefulness in future, and why?
Steve Ellenberg, senior market data consultant, Americas lead, MDSL: Processes intended to accurately assess data’s value and usefulness to a firm are currently morphing into complex and onerous exercises, and almost always now require the firm to completely document all of its intended usage, the entitlement capabilities of the internal databases where the data will reside, and the types of support staff that will have access to the data. The legacy and familiar licensing structure, whereby licensees are generally allowed to use the data in any way they wish and for any new opportunities that present themselves (excepting routine redistribution), will largely become extinct as 2011 ends.
Data suppliers no longer respond harmoniously to vague usage statements containing few details, and have become increasingly vigilant about understanding how their data will be used and who will be using it. Data suppliers are now minting aptly-named data policies that contain usage terms for each of their data agreements. Firms should anticipate executing multiple data license agreements for the very same data products and services, predicated upon their own separate and various intended uses for the data. Data policies are also indirectly exerting pressure on data licensees to enhance their own data governance efforts to ensure internal compliance with data agreements.
IMD: Finally, a question for all our participants: What new ideas will deliver value in 2012, and how will the market be able to exploit these?
Taylor: Automated and electronic trading continue to be areas of expansion. Although these are not new ideas, new instruments and venues will continue to be created and expanded, which will increase trading volumes and the associated fee revenue.
The momentum carrying derivatives from over-the-counter to structured, regulated exchange-traded environments will continue and offer opportunity for both new and established market data participants. The rush toward anything “tablet” will continue to grow in 2012, with professional financial participants gaining confidence in tablet hardware and apps as legitimate tools for their business. This is a long-term migration, as security, connectivity and performance questions inhibit wholesale adoption of the platform, but make no mistake that mobility is a significant wave of the future and will move from the personal to the professional space.
The market will increase demand for, and acceptance of, sophisticated visualization tools.
Strategic market segmentation, in which market data vendors target individual user groups, geographic regions or information products in an effort to isolate growth opportunities and competitive advantages, will accelerate. With markets expanding slowly, and competition for new revenue at an all-time high, a premium will be placed on vendor strategy.
Barringer: We will see more extensive deployments of pure-hardware feed handler, risk, and execution systems. As banks and broker-dealers compete for high-frequency order flow, their DMA systems will have to be faster and better integrated. Many of these will be delivered as managed services—either directly, or through a prime broker. Meanwhile, broker-neutral (or non-broker) managed service platforms will be adopted more widely by principal trading firms, most of which are—or are becoming—broker-dealers themselves.
Smaller DMA providers will find ways to profit from internalization, aggregating uninformed order flow from a new category of non-institutional principal traders.
Thompson: Virtualization has let us take complex applications, package them up in a virtual machine, and deploy them all around the world with incredible ease. However, applications still require fast and reliable access to data, and that access is greatly limited by our ability to transfer large volumes of data across great distances.
That’s why the concept of a data fabric is so powerful for the market, and perhaps the most important new idea to deliver value in 2012. A data fabric is a no-loss, low-latency data transfer network, designed for any application that needs universal, reliable access to a common, real-time data store.
Think about all the ways that information gets shared between applications—from event-driven systems to data replication services, caches and persistent data stores. The purpose of a unified data fabric is to create a single, simple method for sharing information between applications, eliminating much of the clutter built into applications that results in lost performance and code bloat.
The market is primed for an easier way to build distributed applications, and the idea of a data fabric enables the build-out of small, lightweight apps built on readily accessible data. As a result, we’ll see much more of the data fabric in 2012.
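As an in-process sketch of the single sharing method Thompson describes, the class below exposes one publish/subscribe interface backed by a common last-value store. The class, topic names and payloads are hypothetical, and a real data fabric adds guaranteed delivery, persistence and wide-area transport.

```python
from collections import defaultdict
from typing import Callable

class DataFabric:
    """Minimal in-process stand-in for a data fabric: one publish/subscribe
    interface shared by every application, backed by a common last-value store.
    Topic names and payloads are hypothetical."""

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)
        self._last_value = {}              # the common real-time data store

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)
        if topic in self._last_value:      # late joiners still see current state
            handler(self._last_value[topic])

    def publish(self, topic: str, message: dict) -> None:
        self._last_value[topic] = message
        for handler in self._subscribers[topic]:
            handler(message)

fabric = DataFabric()
fabric.subscribe("quotes.XYZ", lambda msg: print("risk app saw", msg))
fabric.subscribe("quotes.XYZ", lambda msg: print("replication service saw", msg))
fabric.publish("quotes.XYZ", {"bid": 10.24, "ask": 10.26})
```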
Khan: First, implementing a more consolidated platform that enables enterprises to analyze both structured and unstructured data in real time will be a game changer. Second, in-memory computing is reaching a maturity level where customers need to make fewer trade-offs in terms of consistency and transactional semantics when it is combined with more native integration to a columnar analytics server. This will allow companies to perform queries and analyze data with a much higher degree of scalability and performance, particularly for ad-hoc environments with a diverse user and query profile. Third, investments in mobile analytics will empower the business community to derive substantial value from data and make information workers far more productive; investing in mobile middleware such as Sybase’s SUP layer will permit rapid development of a variety of mobile analytics applications. And last, incorporating new programming models like MapReduce and exposing them through commonly used SQL functions will make developers significantly more productive, while natively supporting massively parallel and distributed compute paradigms like Hadoop within the analytics server will enable a more manageable and integrated analysis environment and reduce the barrier to entry.
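For readers less familiar with the MapReduce model Khan mentions, here is a toy sketch of the map, shuffle and reduce phases over a few invented trade records. The records and functions are hypothetical; the value of frameworks such as Hadoop, or of exposing the model through SQL functions, is that each phase can be distributed across many machines rather than run in a single process as here.

```python
from collections import defaultdict
from itertools import chain

# Hypothetical input records; in practice the map and reduce phases run in
# parallel inside a framework such as Hadoop, not in one Python process.
trades = [
    {"symbol": "ABC", "qty": 100},
    {"symbol": "XYZ", "qty": 50},
    {"symbol": "ABC", "qty": 200},
]

def map_phase(record):
    """Emit (key, value) pairs, here symbol -> quantity."""
    yield record["symbol"], record["qty"]

def reduce_phase(key, values):
    """Combine all values for one key, here total traded quantity."""
    return key, sum(values)

# Shuffle: group mapped pairs by key, as the framework does between phases.
grouped = defaultdict(list)
for key, value in chain.from_iterable(map_phase(r) for r in trades):
    grouped[key].append(value)

print(dict(reduce_phase(k, v) for k, v in grouped.items()))   # {'ABC': 300, 'XYZ': 50}
```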
Shworan: In broad terms, the market will increasingly demand data delivery services that offer the maximum in ease and range of client-side configuration, extensibility and portability, with the minimum of client-side management requirements. This requirement will apply across product feature, dataset specificity and aggregation filter considerations.
Our company is very development-oriented, with everything built in-house. Over half of our employees are developers, and the projects they work on are all client-driven and market-driven. This puts us in a great position to see where the industry is going. We’re working on a lot of exciting projects right now, addressing a wide range of demands—from scaling infrastructure for projected increases in data flows, to consolidating feeds and tailoring user-configurable feeds; from catering to the incredible growth in browser use and tablet and smartphone adoption, to aggregating more non-traditional information sources like blogs; and on and on.
Essentially, successful providers will accommodate “on the fly” data, content and features in platform-agnostic front-ends and APIs, while maintaining reliable flow and databasing of market data and ancillary content, across all asset classes. You have to offer the greatest depth and breadth of content and data, while empowering the end-user to access precisely what they want via the platform they prefer, at the moment they want to see it.
Erlandson: It’s become very clear that institutional clients are spending more time—particularly around their commute and before the market opens—using mobile devices. In 2012, we will see many firms organize their business model around mobile connectivity as a way to engage more directly and more frequently with clients. The firms that will win are those that create “true” mobile applications, specifically designed to address institutional-class requirements, that enable users to do things that simply were not possible before the rise of advanced connected devices. For example, traders who traditionally conducted business by phone or instant messaging can now connect to brokers using mobile devices in new ways, with brokers sending real-time prices and price notifications via push alerts. The trick will be to provide a native experience on the three dominant platforms of 2012—Apple, Android and Windows (apologies to BlackBerry)—so that firms can offer a solution their clients can access wherever and whenever they need it.
Kim: We see a trend towards modular innovation in financial services. Modular innovation is a focus on launching software tools of high value but smaller in scope than traditional trading, risk, and analytics suites. Rather than trying to be all things to all people, many providers will focus on doing one thing exceedingly well. Strategic roadmaps on both the buy and sell sides will become hyper-focused on developing in areas in which they can realize alpha, while consuming off-the-shelf solutions for what they deem commoditized.
The availability of SDKs will be essential for firms to derive value from modular innovation and integrate solutions within a customized trading environment. And to avoid usability shortcomings of disparate software developers, providers will need to offer extensive API libraries to ensure seamless integration and a uniform user experience.
In an environment where development budgets are under pressure, technology providers and consumers also will be eager to invest in technologies that allow them to innovate at faster rates while minimizing costs.
Doe: As has been generally observed, the euro crisis is creating instability in the European Union and elsewhere. However, this instability could also present opportunities in foreign exchange and arbitrage strategies. With individual countries unable to devalue their currencies, heavily indebted nations such as Greece, Italy and Ireland will likely continue to face challenges related to their debt loads.
While currencies in Europe other than the euro remain at still-elevated values, the euro has dropped off significantly against the dollar. These price inefficiencies can present opportunities for arbitrage traders to buy and sell different currency pairs. As a result, we might see that currency data will be at a premium in 2012.
Additionally, this adversity could generate more opportunities in different regions and countries given the relative strength and weakness of various economic and geopolitical infrastructures. For example, the economies of the Central and Eastern European markets did not recover as well as many other emerging markets after 2008. But with strong prospects, relatively stable public finances and lower debt levels, these markets are now viewed by many analysts and economists as a more reliable source of alpha.
Passarella: 2012 may be the year that social media and news organizations meet to create a larger picture of the financial information landscape. Journalists have embraced Twitter as a way to inform as well as to absorb breaking trends. What used to be a closed system in the world of news gathering is now becoming more interactive as journalists engage professionals and amateurs alike via social media. This provides a richer and more insightful experience for news agencies and the business of news. The tools are very nascent, but the value to the community as a whole is impressive. In 2012, the use of social media will grow and become an accepted part of the news world for markets. Any market participant not part of the conversation will be sorely behind in market awareness and surveillance.
Terpstra: Much like 2011, next year will be dominated by regulation and compliance. Financial services firms will be forced to react and comply with newly implemented rules and mandates. We will also see the drive into multi-asset class trading and global investing continue as firms seek new ways to create alpha in frontier markets that are less saturated than developed markets. Our clients are increasingly looking outside US borders to emerging economies for new opportunities, and we expect this trend to continue in 2012.
Ellenberg: We are in the midst of a period in which entirely new market data management organizational structures and processes are rapidly evolving, and this will continue through 2012. A consistent trend through this transitional period is the increasing segmentation of roles in the market data labor pool, offering employers valuable opportunities to leverage the specific skillsets most appropriate for their own business focus.
Generalist knowledge now retains less value than the niche-specific expertise currently sought. Smaller firms—particularly hedge funds and investment managers—are developing and staffing positions that require skills and expertise addressing a specific trading strategy, such as HFT or currency trading. Also in demand is strong familiarity with specific datasets, such as reference data, index data, or risk modeling data.
by Max Bowie