
"Data, data, everywhere, nor any drop to drink."



Had Samuel Coleridge been writing about the plight of enterprise application software rather than the Ancient Mariner, he might have phrased the problem in such a poetic manner. At modern computing speeds, applications are going thirsty for input data while surrounded by it on all sides. Databases store multiple gigabytes or even terabytes of recorded information, real-time data streams transmit transaction data as it occurs, and applications deliver processed data and results to other applications for further action.

However, this ocean of data is useless on its own. It must be delivered to business applications to feed their decision processes and drive the actions that ultimately determine profitability and success. Without instantly and continuously available data, applications can only sit and wait. In a growing number of industries and business processes, that can spell disaster.

Consider a securities trading firm that runs programs to advise brokers and agents, alert traders to changing market conditions, or even make automated trades according to situation-dependent rules. Massive amounts of money can be made or lost on delays of only seconds. As arbitrageurs react to real-time transaction data, the company best able to access and act on that data quickly stands to profit over its competitors.

To use an example from another field, look at the challenge of real-time load balancing for a telephone network company. As usage rises and falls from moment to moment, it must be weighed against factors such as physical network availability, current bandwidth usage, Service Level Agreements (SLAs), projections of upcoming load, and myriad other considerations. Decisions are made second to second by automated systems that must stay current with conditions. If they fall behind in their reaction time, they are as likely to aggravate the problem as to correct it.
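To make the load-balancing decision concrete, here is a minimal sketch of how an automated system might score candidate routes against the factors named above. Every name, field, and weighting in it is an illustrative assumption, not a description of any real carrier's system:

```python
# Hypothetical route-selection sketch: score each candidate route by its
# free capacity after projected load, scaled by remaining SLA headroom.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    capacity_mbps: float        # physical network availability
    current_usage_mbps: float   # bandwidth in use right now
    sla_headroom: float         # 0..1 margin before an SLA breach
    projected_load_mbps: float  # projection of upcoming load

def score(route: Route) -> float:
    # Free capacity left after current usage and the projected surge,
    # discounted by how close the route already is to an SLA breach.
    free = route.capacity_mbps - route.current_usage_mbps - route.projected_load_mbps
    return free * route.sla_headroom

def pick_route(routes: list[Route]) -> Route:
    return max(routes, key=score)

routes = [
    Route("east", capacity_mbps=1000, current_usage_mbps=900,
          sla_headroom=0.2, projected_load_mbps=50),
    Route("west", capacity_mbps=1000, current_usage_mbps=400,
          sla_headroom=0.9, projected_load_mbps=100),
]
print(pick_route(routes).name)  # "west": more free, safer capacity
```

The point of the article's example holds here: this calculation is only as good as the freshness of the usage and load figures fed into it; run it against stale data and the "best" route may already be saturated.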

Speed of access is only half the story, though. Data must also be continuously available to the applications. Achieving such high availability requires backup access strategies and alternate routes to the data sources, so that a failure in one physical node does not stop the entire system from operating. Vendors may refer to this as data persistence or resilience to failure.
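The "alternate routes to the data" idea can be sketched in a few lines: try the primary node first, then fall back to replicas until a read succeeds. The node names and the `fetch` function are hypothetical stand-ins for whatever data-access layer a real system would use:

```python
# Minimal failover-read sketch: a failure in one node does not stop the
# application, because replicas provide alternate routes to the same data.
def fetch(node: str, key: str) -> str:
    # Stand-in for a real network read; here the primary is simulated as down.
    if node == "primary":
        raise ConnectionError(f"{node} is unreachable")
    return f"{key}@{node}"

def resilient_read(nodes: list[str], key: str) -> str:
    last_error = None
    for node in nodes:                 # walk primary, then each replica
        try:
            return fetch(node, key)
        except ConnectionError as err:
            last_error = err           # remember the failure, try the next node
    raise RuntimeError("all nodes unavailable") from last_error

print(resilient_read(["primary", "replica-1", "replica-2"], "quote:IBM"))
# quote:IBM@replica-1
```

A production system would add timeouts, health checks, and replica consistency guarantees; the sketch only shows the routing logic that keeps a single node failure from becoming an outage.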

Fault tolerance and data availability are often key factors for government applications. From intelligence and defense operations to record keeping and licensing, data must be consolidated from many sources and made available upon request to geographically or departmentally dispersed users. Lack of an available data repository because of a local server failure, network blockage, or even a directed attack is simply unacceptable, with potential consequences ranging from inconvenience to catastrophe.
