
Increased competitive pressure and large volumes of fast-changing data have created huge challenges for today's enterprises. Thanks to a convergence of computing advances and a new generation of software, businesses can now keep up with scalability demands and get far more value from their server farms. As a result, companies can turn today's data challenges into opportunities to outperform competitors.

Over the past several years, distributed in-memory computing has emerged as a powerful means for nearly any business to gain competitive advantage. Increases in data volume, complexity, and rate of change have driven wide adoption of scalable computing solutions, such as server farms and compute grids, so that applications can handle growing computing loads. Keeping up with these trends is daunting in itself, and a narrow focus on making applications faster often means important opportunities are overlooked. Creative analysis of changing data to recognize and respond to trends is now a must-have capability. Until recently, only large companies with deep pockets could afford the technology and training necessary to seize such opportunities.

Never before has computational speed played such a pivotal role in gaining competitive advantage. The financial services market provides the most dramatic example of this new reality. Market data message rates have skyrocketed over the past few years as automated trading has become widespread. Marketdatapeaks.com tracks daily market data messages flowing across live financial data feeds, and current volumes range between 2 and 3 million messages per second. Automated systems must analyze this staggering amount of data in as close to real time as possible to make profitable trading decisions.

Opportunities are measured in microseconds, making computing technology a key competitive factor. Even in business segments that do not face such extreme demands, scalability and computational speed now create important advantages. Online retailing, auction sites, entertainment, gaming, and reservation systems, for example, all demand scalability and can benefit from insights into their data sets.

Data analysis is a powerful tool for business expansion and increased customer satisfaction. Gleaning useful information from line-of-business data such as buying trends, warranty claims, customer feedback, and manufacturing efficiency can provide a substantial edge over competitors who are slow to understand their business information. However, data volumes are exploding and timeframes are shrinking, making data mining harder than ever.

A number of new technologies have been developed to help meet these challenges. Multi-core processors have overcome the limits of single processing units. Memory manufacturing advances have brought RAM prices to historic lows. Advances in networking now mean that 10-20 gigabit/second networks are common, and much faster networking is quickly emerging. Hardware virtualization is widely used in data centers as a means of increasing utilization of server hardware as well as providing fault tolerance, continuous uptime, and more.

These core technology advances are being leveraged by a new generation of distributed in-memory computing software. Combining the speed of in-memory data storage with the scalability of distributed systems, these solutions are grouped under the general heading of distributed data grids. Solutions in this category provide in-memory storage that spans a grid or server farm to scale an application's performance.
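The central idea behind a distributed data grid is that each key is routed by a hash to one of many partitions, with each partition held in the memory of a different server. The sketch below illustrates only that routing logic; the class and method names are illustrative, not the API of any particular product, and real data grid products offer far richer features (replication, fault tolerance, eviction, and so on).

```python
# Illustrative sketch of the partitioning idea behind a distributed data
# grid. The partitions here are plain local dictionaries; in a real grid
# each partition would live in the memory of a different server.

import hashlib

class DataGrid:
    """In-memory key/value store whose keys are hashed across partitions."""

    def __init__(self, num_partitions=4):
        self.partitions = [{} for _ in range(num_partitions)]

    def _partition_for(self, key):
        # Hash the key to pick a partition; a real grid uses the same idea
        # to decide which server owns each key.
        digest = hashlib.md5(str(key).encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def put(self, key, value):
        self.partitions[self._partition_for(key)][key] = value

    def get(self, key):
        return self.partitions[self._partition_for(key)].get(key)

grid = DataGrid()
grid.put("order:1001", {"customer": "C42", "total": 250.0})
print(grid.get("order:1001")["total"])  # 250.0
```

Because the hash spreads keys evenly across partitions, adding servers adds both memory capacity and throughput, which is what lets a data grid scale with the server farm.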

To stay ahead of the competition, companies must now take distributed data grids a step further and add data analysis capability. How can this be done? Built-in parallel data analysis transforms the data grid into a parallel computation engine. Functions to analyze data in the distributed data grid can be written easily by both developers and analysts, then invoked in parallel across the grid to deliver lightning-fast results. Importantly, parallel data analysis automatically takes full advantage of multiple processors, multiple cores, and virtual hardware instances. Best of all, the developer need not know any parallel programming: the functions are written just as if they were meant to run in memory on a single machine, and the distributed data grid does the rest. This removes one of the longstanding barriers to parallel data analysis and opens it up to nearly any business.
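The pattern described above can be sketched as follows. The analysis function is ordinary single-machine code; the grid runtime fans it out across the partitions and merges the partial results. This is a minimal local illustration using a thread pool as a stand-in for the grid's servers; the data values and function names are invented for the example.

```python
# Sketch of grid-side parallel analysis: the same single-machine function
# is applied to every partition concurrently, then partial results merge.

from concurrent.futures import ThreadPoolExecutor

# Data already resident in the grid, shown here as one dict per
# partition/server (illustrative sales figures).
partitions = [
    {"sku1": 120.0, "sku2": 80.0},
    {"sku3": 45.5},
    {"sku4": 300.0, "sku5": 10.0},
]

def analyze(partition):
    """Ordinary single-machine code: total the sales in one partition."""
    return sum(partition.values())

# The runtime, not the analyst, handles the parallelism: the function is
# dispatched to each partition concurrently and the results collected.
with ThreadPoolExecutor() as pool:
    partial_totals = list(pool.map(analyze, partitions))

total_sales = sum(partial_totals)
print(total_sales)  # 555.5
```

Note that analyze() contains no threading, locking, or messaging code, which is the point: the author writes plain in-memory logic, and the grid supplies the parallel execution.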

Using a distributed data grid, businesses can quickly and easily analyze their data sets. For example, a retailer can analyze sales trends for an online promotion while it is running and immediately make improvements. A financial services firm can store investment portfolios in the data grid and apply algorithms to them based on streaming market data, generating trades to optimize the portfolios. A manufacturing firm might cache a manufacturing line's process metrics in the data grid and run performance algorithms against the data to monitor efficiency and quickly spot problems. An online gaming site could shorten its response times to provide a more realistic gaming experience.

Finally, cloud computing is getting a lot of attention these days and it bears mentioning that there is strong synergy between in-memory solutions and cloud computing. With elasticity as a property of both the cloud and distributed data grids, it is easy to imagine provisioning a large number of cloud servers into a distributed data grid, pulling a big data set into memory, analyzing it, and then just as quickly releasing the servers. Could data analysis be a killer app for the cloud?

The benefits of distributed in-memory computing are being reaped today by many forward-looking companies. As this important technology spreads into the mainstream it is sure to become a requirement to remain competitive.


