04/02/2008 Podcast and Transcript: The Business Process Expert Community and the Future of BPM: A Community-Driven Approach to a Book for Aspiring BPXers Gold Club Protected
03/20/2008 Transcript: Identifying and Federating Today's SOA Power Centers Through Enterprise Service Buses Gold Club Protected

Participants in this Webinar include Joe McKendrick, author of ebizQ's SOA in Action Blog, and Leif Davidsen, manager of WebSphere SOA Marketing for IBM. This transcript has been edited in parts for clarity.

Joe McKendrick: Okay. Welcome, everyone, to today's ebizQ Webinar, in which we explore the latest trends in SOA integration. I'm Joe McKendrick, author of ebizQ's SOA in Action blog and an independent industry analyst. You can also find me over at the ZDNet SOA blog site; you may have seen some of my stuff there. I'm pleased to be joined for this Webcast by Leif Davidsen, manager of WebSphere SOA Marketing for IBM. Welcome, Leif.

Leif Davidsen: Hi, Joe and thank you.

More Resources

Replay this Entire Webinar


Get a Full Copy of the Survey Results

Next Webinar With Leif Davidsen: Using SOA for Maximum Reuse and Increased Business Agility:
Date/Time: April 9 at noon ET
Join Leif and his IBM colleague Jim Douglas as they review why companies should invest in maximizing their reuse of IT assets, how they can benefit from this across their business and throughout the investment lifecycle and what solutions to consider.
Click here to learn more and register.

JM: And Leif, you're over at the Hursley Park Labs for IBM; is that right, in the UK?

LD: Yes, the Hursley Park Labs, home of products like MQ, Message Broker, and of course, for many, many years, CICS, the CICS transaction processor.

JM: Fantastic, the home of CICS, yes. And today's Webinar is titled Identifying and Federating Today's SOA Power Centers through Enterprise Service Buses, and we'll be reviewing the results of a new survey commissioned by IBM and conducted by the analysts at ebizQ. You'll see my name as well as that of Beth Gold-Bernstein, who played a project management role in the survey.

The survey covered adoption rates of SOA within the middleware space, and Leif and I will be discussing the survey and providing examples of implementations that Leif has seen among IBM customers. I also want to remind everyone that all attendees of the live session will receive a copy of the survey results; it's our way of adding value by coupling independent research with IBM's deep expertise in this subject.

Okay. We're going to start off by discussing what we mean by “enterprise service bus”. The exact meaning and purpose of ESBs has generated quite a bit of disagreement and confusion, if you will, across the industry in the years since the term and the concept first came on the scene. Typically, if you ask 20 analysts what an ESB is, you'll get 20 answers, and vendors also have their own interpretations of what ESB means.

Now to many, you can't even take the first steps into SOA enablement without an ESB to integrate various messaging and data formats. To others, however, ESBs have a questionable role in the organization. I've heard them compared to the human appendix: something that serves some kind of mysterious temporary purpose as a result of some evolutionary cycle, but in the long run is just taking up space, and it's painful to have removed. I had an appendectomy back when I was a kid, so I know how disruptive that kind of removal can be.

Many vendors don't even call their ESB solutions ESBs. Often, you hear them called message brokers, integration platforms, or application servers. Up to a couple of years ago, if you went up to the campus of a certain large software company in Redmond, an enterprise service bus was the vehicle that got you around the campus. Now, for purposes of this survey, since these definitions are so fluid and often debated, we didn't want to limit our responses because of labels, so we went with an expanded definition merging enterprise service buses, message brokers, and integration platforms into a single category.



We also addressed ESBs as a term straight on as well. This combined product category, ESB service intermediaries, encompasses solutions that serve to enable and prioritize the convergence of messages in various formats into a standardized format. They provide a platform that enables enterprises to coordinate security policy, quality of service, and process integrity policies.
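As a rough illustration of that convergence role (a sketch only, not any vendor's real API; the function and field names here are hypothetical), an intermediary that normalizes inbound messages of different formats into one canonical shape might look like this in Python:

```python
# Hypothetical sketch: normalize messages of different formats into one
# canonical dict, the way an ESB-style intermediary converges formats.
import json
import xml.etree.ElementTree as ET

def normalize(raw, fmt):
    """Convert an inbound message to the canonical shape the bus routes on."""
    if fmt == "json":
        body = json.loads(raw)
    elif fmt == "xml":
        root = ET.fromstring(raw)
        body = {child.tag: child.text for child in root}
    else:
        raise ValueError(f"unsupported format: {fmt}")
    # Canonical shape: same keys and types regardless of the wire format.
    return {"order_id": body["order_id"], "amount": float(body["amount"])}

a = normalize('{"order_id": "42", "amount": "19.99"}', "json")
b = normalize("<order><order_id>42</order_id><amount>19.99</amount></order>", "xml")
print(a == b)   # True
```

Once both producers emit the same canonical shape, downstream policy (security, quality of service) can be applied in one place rather than per connection.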

Now Leif, why exactly would an enterprise need an ESB? Let's get to the heart of this. Isn't it possible to get started with Web services and SOA without a mediator in the middle?

LD: Well, that's obviously a question that many people come to when they start. Here you are, you're a developer, and you are looking at an IT infrastructure problem, so you go and code a solution to that problem. And to be fair, that's been how developers have gone after the problems that businesses have been asking them to solve for many years.

And that's when you get the sort of solution architectures that we'll see on the left-hand side of the chart up on the screen in a moment. You get this rat's nest of architectures because there's never a problem connecting up two points; it's a simple straight line. The issue comes when you're looking to connect up more and more points and then you make changes.

What we've seen at IBM is that the layers of complexity just grow over time as people make changes to their business applications. And that complexity keeps adding problems for the business, not only when it wants to make future changes but in just trying to do any regular maintenance. And of course, it means that processes become very hard-bound, very inflexible, very rigid, and you know rigidity is bad for businesses in today's economy.



Now, if we look a little bit deeper at the picture underneath this sort of rat's nest, at how we want to actually implement that nice flexible process we see on the right-hand side, we can see that the connections between application interfaces are the real problem here. You've got all this business logic that, obviously, sits in your applications, and the applications are what is trying to get to grips with your corporate data, so that's actually driving your business.

And because you've got so many interconnected applications in your business that have grown up with individual linkages, you get this mess, and an enterprise service bus is there to address that. Now, going back to your original question, do you need an enterprise service bus? If you're starting with two Web services, clearly an ESB can be seen as an overhead.

However, what an ESB does at its very heart is provide a dedicated piece of infrastructure that lets you create all of your connectivity definitions in a very managed and controlled way. So even if you did have just two applications, you might well say: it might be overkill, but I know that I'm going to have more than two applications, or I know that things are going to change, and therefore it's going to be best practice to define any interfaces and any connectivity in an ESB, because then I know where they'll be, and they'll be easy to find, manage, and maintain.

It's the same question as: when writing a piece of code, do I need an application server? Well, you might not think you do, but in fact it's a useful piece of technology to have as you start to apply it across your business. So I think we see that enterprise service buses certainly have a place. Perhaps the appendix doesn't have one in the human body, but until we discover something better, an ESB today acts as a really strong way to logically define, manage, and run the connectivity between both the services and the non-services parts of the business. So, handing back to you.
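The trade-off Leif describes, point-to-point links versus one managed set of connectivity definitions, can be sketched in a few lines of Python. This is a toy illustration; all the names (`Bus`, `register`, `send`) are hypothetical, not a real ESB product's API:

```python
# Toy sketch of the ESB idea: one place owns all connectivity definitions,
# instead of N*N point-to-point links scattered across applications.

class Bus:
    """A minimal 'bus': senders name a logical destination, and only the
    bus knows how messages actually reach it."""

    def __init__(self):
        self.routes = {}   # destination name -> handler function

    def register(self, name, handler):
        # Adding or changing an endpoint touches only the bus, not every sender.
        self.routes[name] = handler

    def send(self, destination, message):
        handler = self.routes.get(destination)
        if handler is None:
            raise KeyError(f"no route defined for {destination!r}")
        return handler(message)

bus = Bus()
bus.register("billing", lambda msg: f"billed: {msg}")
bus.register("shipping", lambda msg: f"shipped: {msg}")

print(bus.send("billing", "order-42"))   # billed: order-42
```

The payoff is exactly the one in the answer above: with two applications this is overkill, but when the third and fourth arrive, the connectivity lives in one findable, maintainable place.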

JM: Okay. Great, thanks. Well, in order to find out exactly what kinds of rat's nests exist out there in enterprises, what they're doing to address these problems, and the role ESBs and integration brokers, or mediators, play in helping to address the situation, the ebizQ analysts, myself included, teamed up with IBM to conduct a survey on this very issue.

The survey was fairly recently conducted, during the month of January. We had a 21-question online survey instrument that queried respondents about SOA and their related messaging strategies. The survey was promoted to ebizQ members through the website, newsletters, and e-mail blasts, and as an incentive, participants had their names placed in a drawing to win an Apple iPhone.

And in addition, all survey participants were promised a copy of the survey results, which the listeners out there will also be receiving. We had a total of 244 companies responding to the survey. And just to give you a picture of what those companies were like, we had a fairly broad cross section in terms of company sizes. As you see on the chart here, we had a large chunk of companies with more than a billion dollars in annual revenues.

We also had a sizeable part of our survey base consisting of what may be considered small companies, with less than $100 million a year in annual revenues. Looking at the industry profile, this is a pretty comprehensive chart, but just to let you know what's on there: we had a lot of involvement from financial services companies, and the computer services industry also participated, as did insurance companies and manufacturers. A total of 33 industries participated.

And I'd also like to point out that a large group of respondents were either management or enterprise or software architects. Eighteen percent were enterprise architects, 12% software architects, and 17% were from the business side of the equation, business managers. Now this is encouraging, since you often hear that SOA is still too focused or concentrated on the IT side of the house, and that the business really doesn't get SOA yet.



The fact that close to one out of five respondents were executives or line-of-business managers and users is, as I said, an encouraging sign that SOA is entering the corporate mainstream. As I said, one of the things we're focusing on in the survey is determining the overall state of SOA. We found that most SOA efforts at this time are scattered across multiple business units, project-based development, and integration teams.

We're going to go into that in a few seconds. And we also found that it's still early days for SOA rollout in businesses. As we'll show you later on, there's a fragmented approach and a lack of control. This makes sense given that many early SOA efforts start as small pilot projects and proofs of concept, and we're still in the early days of the SOA dynamic. As SOA evolves over the longer term, we expect the issues around lack of control and fragmentation across the enterprise to become more problematic.


02/22/2008 Full Transcript: SOA Visionaries Podcast - Dale Skeen, Ph.D, Chairman and Chief Technology Officer, Vitria Gold Club Protected

This edition of "SOA Visionaries" featured David Linthicum interviewing Dr. Dale Skeen.

Dr. Skeen, who played a key role in moving enterprise application integration away from mainframes, talked about how SOAs and BPM -- which he described as "the perfect complements" -- are being combined with web 2.0 in an empowering new evolution of enterprise computing.

Here is an excerpt of just some of the many concepts described in the 21-minute podcast:

Dale Skeen: SOA is an enabler that allows you to access business functions, and services, and data universally. BPM is a higher level that orchestrates these business services and human interactions in ways that allow you to meet a business objective. Hence, I've always considered these to be the perfect complementary technologies to work together. Now again, I think what's very exciting, and what we're demonstrating with our new M3O technology, is bringing Web 2.0 into this formula.

David Linthicum: Awesome. And you know, when you say "Web 2.0", ultimately it's basically bringing together service-oriented architecture and business process integration, and then marrying that with the whole emerging platform of the web and all the exciting stuff that's going on there. Is that a good depiction of what you guys are doing right now?

DS: Exactly. And we really see this as the next evolution of SOA and of enterprise computing. So if you think about it, again, go back: SOA brings this universal access to services and data through the SOA enablement tools. It does this in a secure, manageable, and governed fashion. Now, Web 2.0 brings rich internet interfaces, rich user experiences based on technologies such as AJAX and Flex, which are universally available in your web browser.

It also brings other ideas of collaboration, really setting up teams of collaboration and supporting things like social networking and new forms of messaging to one another. It supports a new form of integration as well. You've talked about application integration, which is hard and very techie. Well, Web 2.0 allows this notion of mashups, where you let users sort of integrate in a flexible, lightweight, easy-to-do fashion.

And so that's the second cornerstone of this new convergence, which is Web 2.0. And the third, again, is bringing BPM back into the equation. BPM is important. Business Process Management allows businesses to orchestrate their assets to achieve business objectives, like fulfilling an order or acquiring a new customer. Okay. The novel thing about BPM has always been that it's graphical; it's what's called "model driven".

You define a process graph and then you can directly execute that. And that has made BPM easier to change and sort of elevated the experience. Now again, when you bring all three of these together, you get something fundamentally new. It's really a next-generation enterprise platform.


02/13/2008 Complete Transcript: Technology Trends for 2008: BPM and SOA Gold Club Protected
01/25/2008 Full Transcript: The Business Process Expert and the Future of BPM: A New Role, Matched to New BPM Tools Gold Club Protected
11/27/2007 Full Transcript: SOA Trends: Intersection of SOA, EDA, BPM, and BI Gold Club Protected
11/07/2007 Special InfoWorld-ebizQ Podcast Series: How To Build a Foundation for Continual Change Gold Club Protected
10/29/2007 Meet WiseAnalytics' Lyndsay Wise Gold Club Protected

Lyndsay Wise is an industry analyst for the business intelligence and business performance management space at WiseAnalytics.

She has over seven years of IT experience in business systems analysis, software selection, and implementation of enterprise applications, globally. Wise has been featured in numerous publications covering topics such as business intelligence, data integration, enterprise performance management, and customer data integration. In addition, she has written articles covering major vendors in the business intelligence industry and is a featured columnist in DMReview and DashboardInsight.

Before her role as an analyst, Wise served as a business analyst for Toyota Canada, Inc., where she worked on business intelligence and data warehousing solution implementations across the organization, affecting finance, distribution, after-sales operations, inventory control, and more. She has also worked on major IT migration projects in systems analysis and design for the Ontario government's financial and property management legacy systems.

She has a degree in information technology management from Ryerson Polytechnic University, Toronto, Ontario.

08/13/2007 Full Transcript: eVision's John Delaney on SOA and BPM via SaaS Gold Club Protected
08/13/2007 Full Transcript: Event Processing: Competitive Advantage Through Situational Awareness Gold Club Protected
Event Processing: Competitive Advantage Through Situational Awareness

Beth Gold-Bernstein (BGB): Welcome everyone and thanks for joining us today. I’m Beth Gold-Bernstein, Director of the ebizQ Learning Center and moderator of today’s program. Before introducing the webinar, I’d first like to take a moment to acquaint you with the features and functions of the environment, which is highly interactive. You can chat with company representatives and other visitors, send a business card, or leave a message. After today’s presentation, we will be taking your live questions; you can submit your questions by clicking on the “Ask a Question” button. To download a copy of today’s presentation, click the “Files” button. To enlarge any of the slides, click on the magnifying glass icons to the right of the slide.

Today, we are very happy to welcome back Roy Schulte, Vice President and Research Fellow at Gartner who is going to be talking about Event Processing and Competitive Advantage through Situational Awareness. This webinar is a preview of Gartner’s upcoming Event Processing Summit which will be held September 19th-21st in Orlando. I’d also like to thank BEA for sponsoring this educational event. Now, I’d like to turn the program to Roy. Thanks for joining us, Roy.

Roy Schulte (RS): Thanks, Beth. Thanks for inviting me. It’s always a pleasure to work with you on these webinars. The topic today of event processing is something that is very important. It’s becoming of greater interest to Gartner clients and, in fact, to the industry as a whole. The presentation today is an introduction to the field of event processing, to provide background to people who are going to attend the conference in September, or just for anyone who is interested in the topic and wants to learn more about it.


This presentation will explain why architects and software engineers, business analysts, and IT managers are using event-driven architecture more often in business applications. So, in today’s session we’re talking about how you get competitive advantage through situational awareness. Now, situational awareness, sense-and-respond, and track-and-trace are all very important issues for companies today. They are management strategies. What makes it interesting is that they have some very important things in common. First, they are all aspirations – things that people want to have – and they are all associated with certain themes such as:

•    Adaptive enterprise
•    Agile enterprise
•    Zero-latency enterprise
•    Real-time enterprise
•    Prediction-based action

Now, situational awareness implies having an up-to-the-minute understanding of all of the critical aspects of the environment and your own internal operations. So, simply, it’s just knowing what’s going on so you can figure out what to do next.

Sense-and-respond has a similar flavor to it. In fact, I see people using these terms synonymously with situational awareness. However, sense-and-respond emphasizes the early detection of specified opportunities and threats perhaps using only one or a few information sources, whereas situational awareness connotes seeing the big picture – a holistic view of all of the factors that are giving you input from many different places.

Track-and-trace comes at this from a slightly different angle. It’s about recording the presence and progress of identified items as they move between physical locations or through logical steps in a business process. For example, logistics systems track and trace the movement of physical goods in the supply chain. Supply chain management systems can then tell people where a shipment was and where it is now, predict when it is going to arrive at its destination, and prove that a shipment was delivered. While the idea of track-and-trace is usually applied to the supply chain, it can also be applied to any straight-through process, like moving insurance claims, customer orders, payments, or other types of intangible, information-based items through their life cycles.

Now all three of these concepts are examples of operational business intelligence. Each is intended to decrease the gap in time between when something is known, or is knowable, or at least is predictable, and when a person, device, or organizational unit takes action on it. Situational awareness, sense-and-respond, and track-and-trace are modes of operation, and event processing is how those modes of operation, those aspirations, can be achieved.

Event processing is a set of specific concepts, design patterns, best practices and software tools and that’s what we’ll talk about as we go through this presentation. So, let’s begin this topic by looking at some of the terms that we’re going to use.

Event Processing Terminology

First, an event. An event is exactly what you think it is: something that happens. It could be a bank transaction, a stock trade, a customer order, someone changing their address, the delivery of a shipment, or a major event like buying a house. Now, of course, computers can’t deal with abstractions like that. Computers have to deal with bits and bytes, 1s and 0s. So, to make an event computable, we make event objects.

Some application system or some device has to create a machine-readable report of the event, called an event object, which is associated with that real-world event. For example, if Fred Smith deposits $100 in his bank account at 10 today, that’s an event. The computer record associated with that deposit transaction, perhaps in the form of an XML message, is the event object. Event processing means computing that performs operations on these event objects, like creating them, reading them, transforming them, or deleting them.

An event object may be in the form of an XML message, it could be in the form of some other kind of message, it could be an email or it could just be a row in a database. Anything that gives a specification, a record of an event is an event object. An event processor is a software module that sends or receives or somehow manipulates an event object.
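A minimal sketch of the deposit example in Python, with hypothetical field names; the point is that the event object is just a machine-readable record of the real-world event, and the wire format (JSON here, XML in the talk) is incidental:

```python
# Sketch of an 'event object': a machine-readable record of a real-world
# event, here Fred Smith's $100 deposit. Field names are illustrative.
from dataclasses import dataclass, asdict
import json

@dataclass(frozen=True)
class DepositEvent:
    account: str
    amount: int          # cents, to avoid floating-point money
    occurred_at: str     # ISO-8601 timestamp

# The real-world event: Fred Smith deposits $100 at 10:00.
event = DepositEvent(account="fred-smith", amount=10_000,
                     occurred_at="2008-03-20T10:00:00")

# The same event object could travel as a JSON or XML message, an email,
# or a database row; the representation doesn't change the event.
wire_format = json.dumps(asdict(event))
print(wire_format)
```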

Are Companies Event Driven?

Now, when you talk about event processing, one of the first questions most people ask is “Aren’t companies already event driven?” After all, events just mean that something happened. And the answer is, of course, “Yes”. Companies have been event driven ever since they’ve existed. Long before we had computers, companies were event driven. Things happened: people bought things, people sold things, people did transactions. However, many of the application systems and many of the automated processes that we use in business today are event driven only at the surface level. Deep down, if you look at the design of these systems, they are not event driven in their underlying application architecture.

In the example here, we have an order entry application. You have a person at a counter, or perhaps a person over the internet, placing an order for some goods. The most common way of implementing this would be using an interactive, request/reply kind of design pattern, where a request comes in, some data is captured, and then some back-end processing is done. Perhaps people look up a price, the availability of the goods, someone’s credit rating and so forth, and the order is captured in a request/reply fashion. Many other things are also done in a request/reply or “pull” based fashion. When you look up your bank balance, that is a request/reply.

So, whether you’re doing it in person or whether you’re doing it on the internet, a great many applications are doing this, including the majority of service oriented architecture applications that are in use today. Today’s SOA applications are request/reply between the user, the end consumer and the applications system and then within the application system, ach of the services that are invoked are often invoked in a request/reply fashion.

If it is a complicated, multi-step process, the next step in the process, even in a world of SOA, is often done on a pre-scheduled or “batch” basis. So, you might capture the order in a request/reply fashion, put it into a database, and then at night a timer goes off and the company’s internal clock starts up an order fulfillment application or a manufacturing application to take that order out of the database and process it in its next step. These two things, request/reply (which is a pull) and a scheduled operation, are not event driven operations.

However, we could turn this into an event driven process under the covers by redesigning how it’s implemented. We might still capture the order using an interactive request/reply order entry system. However, once we’ve captured it, rather than put it in a database and have a batch job fire up later, we might instead dispatch a real-time or near-real-time message containing the event object (the record that someone has placed an order), and that acts as a trigger for the next step in the process, which could be manufacturing, or order fulfillment, or perhaps billing, and so forth down the chain. In this case, what we’ve done is implement a multi-step business process as a staged event driven process. Each stage in the process is triggered by the arrival of an event object.
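That staged redesign can be sketched as follows. This is a sketch only, using Python's in-process `queue.Queue` to stand in for real messaging middleware, with hypothetical stage names:

```python
# Sketch of a staged event-driven process: instead of parking the order
# in a database for a nightly batch job, the next stage is triggered by
# the arrival of an event object on a queue. Stage names are illustrative.
from queue import Queue

fulfillment_queue = Queue()

def take_order(order_id):
    # Interactive request/reply capture, as before...
    event = {"type": "OrderPlaced", "order_id": order_id}
    # ...but then dispatch a (near) real-time message rather than wait
    # for a batch cycle.
    fulfillment_queue.put(event)

def fulfill_orders():
    """Next stage: runs as events arrive, not on a timer."""
    processed = []
    while not fulfillment_queue.empty():
        event = fulfillment_queue.get()
        processed.append(f"fulfilled {event['order_id']}")
    return processed

take_order("A-100")
take_order("A-101")
print(fulfill_orders())   # ['fulfilled A-100', 'fulfilled A-101']
```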

A second way in which we use an event driven architecture in this particular example is by using an event delivery to notify somebody that something has happened. So, for example, when the article is finally shipped, we might want to send a notification message, an email or some other sort of message to the person saying that the goods have shipped. Again, what’s happened here is that we’re using an event, a push of an event object to notify a person that this has happened.

In this way, we’ve done several things that help the process run faster. First, we made it run in near real time end to end; the stages don’t have to wait for end of day or some other batch cycle to take place. Second, we’ve also made it so that the person doesn’t have to poll continuously, to keep checking whether something has changed. You’ll notice that more and more of the activities you do with businesses are already done on this basis, on a push basis. For example, if your credit card company notices some strange activity happening on your credit card, you will be notified proactively. They will send you a message, or they may call you up, saying “we’ve noticed some strange activity on your credit card, maybe we need to check into that”. So, to speed up the functions of a company or of a person, this mode of being eager, this push mode of behavior, is being used on a wider basis.

Now, this is not a new idea, of course. Some people have been doing this for many years. This has been called message driven processing in some circumstances. In many places, it’s also called “stage two processing” or “flow-through processing”. Nevertheless, what we’re talking about here is a simple use of events. Now, we call this simple in the sense that the arrival of one event is sufficient to trigger the operation of the next step. And that becomes important when we look at the other types of event processing coming up.

To summarize, an event driven architecture is different from the way we develop a typical business application. First, because events are pushed from one step to the next step. Second, the recipient of the event acts immediately, not on a timer basis and not when a request is made from some other place. And third, one of the attributes of an event object is that the source of the event, the sender, doesn’t specify what action the recipient of the event is going to perform. When someone sends an event, all they’re doing is telling you that something has happened or that something is predicted to happen. The choice of what to do about it is made by the recipient. What this does is de-couple the sender from the receiver, so that the sender and receiver are less bound together than they would be in a traditional request/reply relationship.

Request/Reply Relationships and Event Processing

In a request/reply relationship, the requester specifies what the receiver is supposed to do; you’re supposed to place this order, or look up a credit history, or do something. In an event driven architecture, the event object narrowly reports that something happened, and the event receiver, the consumer of the event, decides what to do. This gives the developer flexibility to change the receiver, change the consumer, without having to change the sender. It also gives you the ability to add additional receivers to the same message, so the same message can be delivered to multiple places, again without changing the sender. This gives you a flexibility, a plug-ability, beyond what is found in a typical request/reply kind of situation.
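A minimal publish/subscribe sketch of that decoupling (hypothetical names, no real middleware): the publisher announces an event without naming recipients, and a second subscriber is added later without the sender changing:

```python
# Sketch of sender/receiver decoupling: the publisher only reports that
# something happened; each subscriber decides what to do with it, and new
# subscribers can be added without touching the sender.

subscribers = []

def subscribe(handler):
    subscribers.append(handler)

def publish(event):
    # The publisher does not name a recipient or prescribe an action.
    for handler in subscribers:
        handler(event)

log = []
subscribe(lambda e: log.append(f"email customer about {e}"))
# A second receiver, added without changing the publisher.
subscribe(lambda e: log.append(f"update dashboard with {e}"))

publish("shipment-dispatched")
print(log)
```

Contrast this with request/reply, where the caller names both the receiver and the action, binding the two sides together.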

What we’ve discussed here are the potential pulls of a simple kind of event processing, the one at a time kind of processing. But this is really only half the story of event processing. The second aspect of exploiting event processing is to look at the information value that is available in multiple events and this is called “complex event processing”.

In this kind of a system, the event processor is looking at multiple events, often of different types, coming from different sources. The event processing system may look at the relationships between the events to find patterns, to connect the dots, to extract insights that are going to enable faster and better operational decisions. In some cases, the purpose of the system is to drive a dashboard that a person is looking at, a BAM (business activity monitoring) dashboard. In other cases, the event processing system is sending an event to a business process manager or an application program to kick off a business process that’s going to operate in response to the information found by the event processing agent. So, here, what we’re looking for is threats and opportunities. We’re searching through the data, using its information value, to find opportunities to gain competitive advantage, or places where we can detect risk and things that we want to avoid.

A classic example of this is found on Wall Street. They’ve been doing complex event processing, in the form of event processing applications, for 15 to 20 years. In any algorithmic trading application, what you do is look at what’s happening in the market; you want situational awareness of what’s taking place in the capital markets. In this case, the event source is the stock exchange. They are the ones who are recording the trades, and they’re also putting out price quote information. The data comes from the stock exchanges, sometimes through a real-time data provider like Reuters or Bloomberg, and the event objects are stock ticks, the actual data of the trades or the price quotes. The event consumer is an event processor, and it’s built into a trading platform. The trading platforms are rules engines that are capable of doing algorithmic trading; they’re applying rules to decide when to do a certain trade. What they are doing is looking for patterns in those events: they’re calculating average prices, perhaps over the last one minute or five minutes; they’re looking at trends of price over time; they’re looking at the spread between the prices in one location and in another location, and so forth. Based on the findings, the insights they can derive by looking at these event streams, they then may take an action immediately to buy or sell.
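The windowed averaging described above can be sketched as a toy signal engine: keep the last N price ticks in rolling windows, compare a short-window average against a long-window one, and emit a signal. Window sizes, prices, and the crossover rule are invented for illustration, not taken from any real trading system.

```python
from collections import deque

class MovingAverageSignal:
    """Toy event consumer: one price tick in, one trading signal out."""

    def __init__(self, short_n=3, long_n=5):
        self.short = deque(maxlen=short_n)  # rolling short window
        self.long = deque(maxlen=long_n)    # rolling long window

    def on_tick(self, price):
        # Each price-quote event updates both rolling windows.
        self.short.append(price)
        self.long.append(price)
        if len(self.long) < self.long.maxlen:
            return None  # not enough history yet to form a signal
        short_avg = sum(self.short) / len(self.short)
        long_avg = sum(self.long) / len(self.long)
        # Simplistic rule: recent prices above trend means buy.
        return "BUY" if short_avg > long_avg else "SELL"

engine = MovingAverageSignal()
signals = [engine.on_tick(p) for p in [100, 100, 100, 100, 104, 108]]
```

A real algorithmic trading engine applies far richer rules, but the shape is the same: the event stream pushes ticks in, and the consumer decides in milliseconds.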

Now, this is a classic kind of event processing application, a complex event processing application, where you’re dealing with a very high volume of data, sometimes thousands, sometimes tens of thousands of messages per second. And you have to respond very quickly because the opportunities come and go within milliseconds. An algorithmic trading engine may be able to respond in a few milliseconds, where a person couldn’t possibly respond that fast. So this is a place where you have really an extreme in terms of many things, both the volume and the speed at which you have to react. The actual complexity of the computation is not all that great.

Complex Event Processing

So, what we’ve done is illustrate a second kind of event processing here. Earlier, I was talking about making an end-to-end business process run smoother and faster and more straight-through; here, we’re talking about using complex event processing for purposes of operational business intelligence. The properties of this kind of event processing are a superset of the simple event processing we talked about earlier in this discussion.

First, these systems are event driven; they use an event-driven architecture, so again the events are pushed, the recipients act immediately, and the event source doesn’t specify what the event recipient is supposed to do about it. However, beyond that, there are additional capabilities added to the picture. First, they are looking at multiple events at one time. An event stream processing system is a simpler, but sometimes very high volume and very fast moving, type of complex event processing. Typically, you’re dealing with only one or two event streams, and typically what the system has to do is filtering, very quickly. By filtering we mean separating the wheat from the chaff, throwing away the events that are not of interest, to be able to pick out the events that do matter. From this it does calculations: adding things up, averaging things, doing some simple operations or comparisons to find out what action should be taken.
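The filter-then-calculate pattern just described can be sketched in a few lines: discard the events that are not of interest, then run simple aggregations on the survivors. The event types, values, and threshold are hypothetical.

```python
# A small batch of mixed incoming events; most are chaff.
events = [
    {"type": "heartbeat", "value": 0},
    {"type": "temperature", "value": 71},
    {"type": "heartbeat", "value": 0},
    {"type": "temperature", "value": 75},
    {"type": "temperature", "value": 82},
]

# Filtering: keep only the events that matter for this application.
readings = [e["value"] for e in events if e["type"] == "temperature"]

# Simple calculations on the surviving events: count, average, compare.
count = len(readings)
average = sum(readings) / count
alert = average > 75  # threshold chosen only for the example
```

In a real stream processor this runs continuously over a window of arriving events rather than a fixed list, but the filter/aggregate/compare steps are the same.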

Beyond this there are also other, more sophisticated types of complex event processing. They may be looking at things like causality: why did this event occur, what events happened previously that led to this event taking place. So this is sort of like watching the dominoes fall and understanding that when this domino fell, there was a chain of dominoes that fell before it: being able to trace back in time the other events that led to this particular event.
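The domino metaphor above can be sketched as events that each carry a reference to the event that caused them; walking those references backward recovers the causal chain. The event IDs and field names are illustrative only.

```python
# Each event records which earlier event (if any) caused it.
events = {
    "e1": {"what": "pressure spike", "caused_by": None},
    "e2": {"what": "valve opened", "caused_by": "e1"},
    "e3": {"what": "flow alarm", "caused_by": "e2"},
}

def causal_chain(event_id, events):
    """Walk back through caused_by links: newest event first."""
    chain = []
    while event_id is not None:
        chain.append(event_id)
        event_id = events[event_id]["caused_by"]
    return chain

# The dominoes that fell before the flow alarm, traced back in time.
history = causal_chain("e3", events)
```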

We also talk about event hierarchies, where you consolidate the input from a number of events into one complex event, and then you may further consolidate the information value of multiple complex events into a yet higher level complex event. The idea is to distill many simple readings down into a more summary-level event, which is actually easier for a person or an application system to assimilate. What they want is the bottom line, not the details.

If I give you a complex event that says, “gee, your orders are up about 20% between 8 and 10 this morning”, I’ve told you something that you can act on. Whereas if I give you a detailed listing of the 350 orders that came in this morning, you wouldn’t be able to get much value out of those details. So, in fact, complex events are called complex, but what they actually are is something simpler to assimilate, because they are a summary of the information value in other events.
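The "orders up about 20%" example can be sketched as a small summarization step: many simple order events are distilled into one complex event carrying the bottom line. The 350 count comes from the text; the baseline count and field names are invented for illustration.

```python
def summarize_orders(todays_count, baseline_count, window):
    """Distill a morning's worth of order events into one complex event."""
    pct_change = (todays_count - baseline_count) / baseline_count * 100
    # One summary-level event a person can act on, not the 350 details.
    return {
        "kind": "order_volume_change",
        "window": window,
        "pct_change": round(pct_change),
    }

complex_event = summarize_orders(
    todays_count=350, baseline_count=292, window="08:00-10:00")
```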

Complex event processing is one of the fundamental underpinnings of many kinds of business activity monitoring. When we talk about business activity monitoring, many times we’re thinking of dashboard systems: systems in which someone is looking at a browser-based display of some key performance indicators. The display may include bar charts, pie charts or other representations. It doesn’t have to be graphical, but what you’re trying to do is track, in real time or near real time, what is happening in your company or in some part of your company. There are many different kinds of business activity monitoring. It doesn’t have to be done on a dashboard. You can be doing business activity monitoring and have the access channel be an email, a phone message, or some other channel, some other way in which the computer helps notify a person about what has happened.

Also, many times, people would consider triggering an automated process to be part of business activity monitoring. So, in some cases, the response to something that you’ve detected through the system is not just displaying it to a person; it may be starting up an automated process to respond to that thing. Some of the systems that do business activity monitoring have sophisticated alert management capabilities. What they do is use a role-based mechanism to alert a particular person, or a particular type of job category, when something has happened: an exception condition, a threat or an opportunity has been detected. If the person who has been notified doesn’t respond back, then the alert management system may escalate and send the alert to someone else, or try the same person on a different channel. So there are many ways in which the concept of business activity monitoring can be implemented, but in every case, you’re using business events behind the scenes to help detect the situation that you want to act on.
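The escalation behavior just described can be sketched as walking an ordered chain of (person, channel) pairs until someone acknowledges. The roles, channels, and acknowledgement check are hypothetical, not from any particular alerting product.

```python
def dispatch_alert(alert, escalation_chain, acknowledged):
    """Try each (person, channel) in order until someone acknowledges.

    `alert` is carried along for a real notifier to use; this sketch only
    records where the alert was sent.
    """
    attempts = []
    for person, channel in escalation_chain:
        attempts.append((person, channel))
        if acknowledged(person):
            return attempts  # alert handled, stop escalating
    return attempts  # exhausted the chain without an acknowledgement

# Role-based chain: same person on a different channel, then escalate.
chain = [("ops_on_call", "email"), ("ops_on_call", "sms"),
         ("ops_manager", "email")]

# Simulate: the on-call operator never responds; the manager does.
attempts = dispatch_alert(
    {"kind": "threshold_breach"}, chain,
    acknowledged=lambda person: person == "ops_manager")
```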

Now, sometimes the business activity monitoring tool itself has within it an event processing capability; in other cases it uses a separate, external complex event processing engine. Business activity monitoring can be considered the real-time, operational side of a holistic business intelligence strategy. Classic business intelligence, of course, looks different than this. In a classic business intelligence effort, the end user is a manager or an analyst making some sort of strategic decision. By contrast, with a business activity monitoring system you’re generally talking about an operational person making a tactical decision on a moment-by-moment basis. Traditional business intelligence is like looking for a needle in a haystack. The haystack is the data mart or the data warehouse, and the business intelligence tool is able to look through vast quantities of data and find the important facts that help you make those strategic decisions.

Business activity monitoring based on complex event processing works a little differently. It is more like finding a needle on a conveyor belt of hay. The conveyor belt of hay is the facts coming at you in a real-time stream. The facts are the event objects. And the needles you are looking for are the things that are changing, the threats and opportunities that you can detect in near real time. Now, one of the big aspects of business activity monitoring, just like other, more traditional types of business intelligence, is to separate the wheat from the chaff, to separate the important facts from the less important facts. In business activity monitoring there’s a premium on getting the results out to the person quickly because, in many cases, you’ll have a relatively short period of time to respond. In the case of trading systems, you have milliseconds to respond, and so you have to do it on an automated basis. In other kinds of business process monitoring, you may have seconds, minutes or even hours to respond, depending on what type of business process it is you’re monitoring.

Reasons for Event Stream Growth

The number of places in which companies have event streams is growing rapidly, and this is happening for several reasons. First, we have the ever-decreasing cost of hardware, so the cost of sensors is going down and the cost of computers, the servers, is also going down. Sensors, things like RFID tags and GPS and bar code readers and so forth, have become less expensive. Because of that, we’re able to buy more of them; we’re able to instrument many more activities than we would have been able to in the past. We also have, of course, much faster networks, greater bandwidth and decreasing costs for sending data over those networks, and that’s very important for enabling event processing applications.

Last, but not least, we have the web, and we have Web 2.0 now and other aspects of the internet, which are increasing the ability of consumers and people working in business to get at data and to interact directly with computing applications. The standards that are associated with the web have made it possible to build applications more quickly and have those applications talk to each other more easily. Because of this, we’re seeing brand new kinds of event streams that we’ve never seen in the past. The volume of events and the number of different kinds of event streams are growing rapidly in almost every company. We have many kinds of event streams, some of them old and some of them new. The traditional transaction processing systems like bank ATMs and airline reservations and insurance claims and so forth are growing in volume just because of the nature of businesses expanding. The sensor networks really are new kinds of things, something that most companies didn’t have 10-15 years ago, but many companies are developing them based on the new types of devices that we just talked about a minute ago.

The web has been the source of many new kinds of event streams. We have web auctions; we have clickstreams and clickstream analysis, which is a very fast growing example of an event processing application. We have RSS and Atom feeds, which are push-based mechanisms, or at least appear to be push-based mechanisms, for telling people when something has happened, when events have happened. We also have transaction streams of very high volume that are somewhat different from traditional OLTP, although they really are transaction processing still. In telco, we have call data records; we have credit card transactions, both for the authorizations and for the actual buying of things; and we have micro-payment systems now, where you have a vast number of relatively low value transactions. The net of all this is that companies are being flooded with event streams today. A large enterprise today has anywhere from 10,000 to 10 million business events per second taking place. And if you multiply that out, that means there are literally trillions of business events happening in companies today.

The goal of event processing is to harness these events, to harness the information value of these events to help companies to act better, to help the companies make better decisions, smarter decisions and faster decisions.

Harnessing Events for Event Processing

When we look around, we see a number of applications being used today that leverage this. We talked about algorithmic trading, which is one of the classic and one of the earliest applications on Wall Street where event processing was being used. They could afford to do it 15-20 years ago because the value of what they were doing was so high. Now, because the cost of event processing is coming down and the event processing technology is spreading, we’re able to use it in many more applications and mainstream companies, covering a lot of different kinds of activities.

One of the most widely used types of event processing systems is for compliance reporting and monitoring: you’re keeping track of what’s happening for regulatory purposes. Related to that is fraud detection, things like anti-money laundering, credit card usage and telco – phone card – usage also. The military has been using event processing for years, and it’s a very important part of battlefield operations today, sometimes using satellites and other sources of military intelligence. The military played a key role in the development of event processing technology because they funded a lot of the research in the late 1990s and early 2000s, when the academic institutions were inventing the algorithms and the techniques that are used for event processing today in other industries. In fact, at the Gartner conference in September, one of our case studies is on a military intelligence application from the U.S. Army.

The pharmaceutical industry is using track and trace: using event processing to keep track of the location of shipments of pharmaceuticals from place to place. The goal here is to track the pedigree of each individual unit, to know that you haven’t had a counterfeit drug swapped in, because you were able to keep track of where the actual legitimate drugs were at every moment in time. Security is one of the biggest uses of event processing today, things like intrusion detection, and this is both intrusion into physical places, like an art gallery, being able to detect when someone has entered the art gallery and may be stealing something, and network intrusion, where someone has penetrated a network and is perhaps going to cause havoc on a network basis.

Transportation operations are another place where event processing is being used already. Airline operations, supply chain operations, shipping companies and so forth keep track of what’s happening on a day-to-day basis, on a minute-by-minute basis. This is an example of situational awareness, where there are multiple different event streams all contributing to an overall holistic picture of what’s happening in the enterprise.

Event Processing and Service-Oriented Architectures

Now, at this point some people may be saying, “Sounds interesting, but what does all this have to do with our other main strategy today, service-oriented architecture?” And, in fact, there’s a very high overlap between event-driven architecture and service-oriented architecture. When you use an event-driven architecture, you have a choice of how to do it. You can either implement the event-driven architecture as a service-oriented architecture, or you can implement it without using service-oriented architecture principles. Most of the time, if you’re dealing with business events today, you should be using service-oriented architecture principles as you build an event-driven application.

Now, for something to be called a service-oriented architecture application, what we mean is that it conforms to the five principles of SOA. First, the application must be modular; it consists of multiple pieces. In this case, the event sources and event consumers would be the modules. Second, the modules have to be distributable across multiple computers. Third, there has to be a formal interface, a definition of what is happening between the event consumer and the event source. Now, the interface for an event-driven system is a little different than the interface for a request/reply SOA system, but in both cases some things are the same; that is, in both cases the contents of the messages that are sent back and forth have to be documented as the interface. The difference is that in an event-driven architecture, you don’t have the notion of specifying what the operation is, whereas in a request/reply kind of SOA, you do specify an operation or method name.

The fourth principle is that you separate the interface from the implementation, meaning that you can create a new event source or a new event consumer using the same definition, the same description of what the message is, without having to change anything else in the system. And finally, the modules are sharable, so a consumer can accept events from multiple sources, and multiple sources can send the same event object to multiple consumers. So, if you implement an event-driven architecture application using those five principles, then what you have is a SOA application that is event driven.
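One way to picture the interface-vs-implementation and sharability principles above is to treat the event interface as nothing more than the documented message shape, against which any number of consumer implementations can be written. This is a generic sketch with invented names, not any vendor's API; note there is no operation name in the event, only the fact that an order was placed.

```python
from abc import ABC, abstractmethod

# The "formal interface": the documented content of the event message.
ORDER_EVENT_FIELDS = {"order_id", "item", "qty"}

class EventConsumer(ABC):
    """Separate interface: any implementation can be swapped in."""
    @abstractmethod
    def on_event(self, event: dict) -> None: ...

class BillingConsumer(EventConsumer):
    def __init__(self):
        self.billed = []
    def on_event(self, event):
        self.billed.append(event["order_id"])

class ShippingConsumer(EventConsumer):
    def __init__(self):
        self.shipped = []
    def on_event(self, event):
        self.shipped.append((event["item"], event["qty"]))

event = {"order_id": 7, "item": "widget", "qty": 2}
assert set(event) == ORDER_EVENT_FIELDS  # message conforms to the interface

# Sharability: the same event object goes to multiple consumers.
consumers = [BillingConsumer(), ShippingConsumer()]
for c in consumers:
    c.on_event(event)
```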

When you’re doing this, you can implement the event-driven architecture application with a lot of the same techniques and people and tools that you use for the other kinds of SOA applications. You can leverage standards like XML Schema Definitions (XSDs). You can send the events over a SOAP message, and you can use WSDL to define how the communication is done for EDA. When you’re doing event-driven architecture this way, it should be coordinated by your service-oriented architecture center of excellence, assuming that you have one. That is, you should not have two different groups in your company, one to do event-driven architecture and one to do service-oriented architecture. Rather, you should look at event-driven architecture as part of your SOA strategy. You can also share the same middleware, the same backbone: the same enterprise service bus, or the same message-oriented middleware system, or the same web services stack that you’re using for your classical request/reply kind of SOA can also be used for sending the event-driven architecture messages from the source to the destination. So, what happens here is that you have event-driven architecture parts of your application co-existing with request/reply parts of the same application system.

Now, having said that, there are all those similarities between event-driven architecture and service-oriented architecture, but there are some differences between EDA and conventional SOA systems. As we described earlier, event-driven architecture means you’re dealing with event objects, you’re dealing with notifications that something has changed, whereas a request in a request/reply SOA system doesn’t look the same. In an event-driven architecture you’re also using the event objects not just to run an application or a composite application, as you would in request/reply, but combining the information that’s in multiple event objects to derive a higher level insight; you’re connecting the dots together, which is, again, something you wouldn’t do in a typical service-oriented architecture application. What this means is that architects and developers, when they’re building SOA applications using EDA, have to be aware of the difference in some of the design concepts while still, at the same time, using the same tools and many of the other aspects of classic service-oriented architecture to build the system.

Now, event-driven architecture is something that is expanding very quickly. If you look back three or four years, you find very few complex-event-processing-capable tools on the marketplace. At that point, there were some interesting projects that had been done in academia, but there were very few commercial products on the market that did complex event processing. The simple kinds of event processing, of course, were always available; that is, you have been able to do a staged, event-driven process using single events for decades. However, off-the-shelf products for complex event processing are fairly new to market. At this time, Gartner is tracking more than 30 companies that are offering business activity monitoring and complex event processing software. Some of these are small companies, started just to do this. In other cases, we’re talking about major companies, large software vendors, who have implemented event processing and business activity monitoring as part of their offerings. This area has actually been marketed less by the major vendors than perhaps one would have expected by now. Many of these companies have invested a lot in developing the products, but they haven’t talked that much about them in public or spent a lot of money marketing them. We think this will change over the next several years as the market becomes more aware of event processing and the value you can derive by exploiting it.

Recommendations for Event Processing

So, let me conclude with some recommendations. First, if you’re doing business activity monitoring or other similar projects that require real-time insight into what’s happening, things like situational awareness, sense-and-respond or track-and-trace, you should be looking at how to apply complex event processing to the situation to be able to achieve competitive advantage. Today, most of the applications that do this would be considered leading edge, or at least early adopters in their category, and you’d probably have the advantage over your competitors.

However, in the future we’ll find that you’ll have to do this just to keep up, because if your competitors are using it and you’re not, you will be at a competitive disadvantage. We believe that you should make event-driven architecture part of your SOA strategy from the beginning. Don’t implement SOA only with request/reply in mind. Implement SOA using the choice of event-driven architecture or request/reply from day 1, so that, depending on what the particular business problem you’re trying to solve looks like, you’re using the right tool at the right time. Your architects and your developers should understand event processing; there will be a learning curve for most people, because in many cases it requires a different way of thinking than what they traditionally do. By default, most application systems use a pull, request/reply or batch-scheduled kind of thinking, so what we need to do is make sure that people are aware of the option to use event-driven architecture when it’s appropriate.

The Gartner conference on this topic coming up in September is going to cover this in a way that’s a little bit unique for us. At Gartner, we’ll be having some Gartner analysts describe the trends in the market and give advice in a vendor-neutral fashion. In addition to that, we have 13 case study speakers, so we have a lot more case studies than we would have at a typical conference, talking about the use of event processing. One of our tracks is specifically dedicated to the finance industry, because that’s a place where there’s a lot of activity going on today. The other two tracks cover other uses of event processing, in things like security and supply chain and military applications, electricity trading and so forth. We’re also delighted to have two of the developers in this field, two of the innovators from academia who invented much of what’s happening here, deliver speeches at this conference. And finally, we have 16 vendors who offer products in this space, and they’re going to demonstrate their products side by side, so you can take a look at what’s happening in the marketplace and what the state of the art is for the commercial software tools.

So, with that, let me turn the microphone back over to Beth.

BGB:    Thank you very much, Roy, and in a few moments we’re going to go to your questions. I’d like to remind you that you can ask a question by pressing the “Ask a Question” button. But first, we have a few words from our sponsor: Guy Churchward, Vice President and General Manager of WebLogic Products for BEA Systems, is going to say a few words. Welcome, Guy.

GC:    Thank you, Beth, and thanks very much to Roy for the presentation; it was excellent. This is Guy Churchward. I’m Vice President of WebLogic products for BEA, and I wanted to talk briefly about WebLogic Event Server and WebLogic Real Time, two products we launched very recently; they’re our entry into EDA, event-driven architecture. Our aspirations, as we show on this slide, are really around the instantly responsive enterprise, and what I would like to stress is that BEA and WebLogic are about rock-solid IT infrastructure for the enterprise. Again, these features, whether it’s Event Server or Real Time, are really driven out of need from our customer base; our clients have really been asking for this as the next iteration in their experience of moving forward.

Our key takeaways echo what we just got from Roy’s slides on the demand side: he’s been talking about how events are growing exponentially, whether it’s 10 to the power of 4 or 10 to the power of 7 business events per second, and there’s the virtual enterprise.

So, basically, wherever the events are coming from, whether it’s the sensor network or data feeds, or whether it’s coming into BPM, it is just an explosive industry. The challenge is that currently the applications are invariably written in C or C++, and people really want more of a commodity and a faster way to be able to adjust, whether it’s for agility or competitive advantage, and that’s where we come in. We wanted to bring the values and the agility around the event server into the world of Java and then into our client base. The demand in this market space is definitely there; in our beta, we’ve seen high demand from financial services and telcos, manufacturing, logistics and transportation, so it’s been pretty interesting to see the evolution of this. I mean, originally we moved into the market around financial services, but there’s been a huge thrust even in things like commodity control for the military, and then systems management, which we never really saw coming, but it’s been explosive growth, so we’ve been really excited.

And then, obviously, on the supply side, what people are really looking to do is to harness the events that are coming into their businesses as quickly as possible, but in a less expensive way. Now, through hardware advancement and things like in-memory caching and innovations around complex event processing, people can move toward this business in a much more competitive way. But you do need the ultimate agility, so when you are changing rules, when you are changing dynamic approaches to gain competitive advantage, you want a very quick way to get to market, and that’s where we see Java, running on a platform such as BEA’s offerings, as a good way to do it.

So, what we’ve brought to market is the first complete infrastructure for event processing. It really means that there’s an application server as an event processing engine, so you actually have inside of it the complete life cycle, from development to production. The concept is that it should be able to run and manage in the same container both the artifacts that you have from development and also the platform, with things like an event streaming engine and complex event processing, all running in the same JVM for extremely high performance of the application. We’ve tailored the application server specifically for the event processing market, so this is again a high-performance Java engine that can run either on our standard JVM, JRockit, or on the advanced JVM offering, WebLogic Real Time, giving much more predictability and low latency. But again, I think what’s important about this is that it is a full application stack that allows somebody to build an event processing environment very quickly, and it covers all of the aspects, whether it’s scale and volume complexity or the temporal aspect.

So, I’ve got on the slide now an architectural view, I guess, of what we call TED, the time- and event-driven products. Basically, what you’ve got in the middle is the WebLogic Event Server, and underneath it, the event server is sitting on top of WebLogic Real Time or JRockit. The difference between the two is that WebLogic Real Time offers both determinism and low latency, and I’ll talk about that more as I go forward and give you a use case on it. But really, the point is that this is the only Java application server for high-performance, high-volume, real-time complex event processing, and we have built this on a lightweight Java application server, so this isn’t the J2EE stack that people would normally look at; this is actually plain Java.

We felt this was a lot more pertinent to this market space, being at the sort of extreme-transaction-processing end of the business. Then we have EPL as the standard language, which is kind of like SQL but has temporal aspects in it, so for things like normal linear events and object-based events, it really helps people, and you can do things like throttling and scheduling, and time-critical streaming on that as well. So, again, what we’ve done is tried to build a platform to allow somebody to rapidly execute against an event application. And you can virtually do the development in your favorite tooling or development framework, whether it’s the Spring framework or plain POJOs, and then we use BEA Workshop as the Eclipse-based tooling for this as well.

And, again, we offer two JVMs: JRockit for high performance, and WebLogic Real Time for extreme performance. WebLogic Real Time is really about low latency and high performance, so 10 milliseconds, with less than a millisecond as an average response, 100% guaranteed; it’s basically instantaneous, so your applications automatically inherit those values. Moving forward on event processing: you really have SOA as a standard way of delivering events into the business, but you need more dynamics around it, and what we’ve found is that by including things like stateful predictability and scalability work, you roll in the event server, and then you actually have a data stream optimizer that hangs in, so the way event-driven architecture links with SOA is almost like SOA next.

And then, finally, I have a couple of quick examples. One is a customer example of an equity trading house, a client that has some specific syntax around stock price movements, when the 3% spread exceeds the, you know, “XX”, and this is very specific to their market space. Their existing system could really only handle 10,000-20,000 concurrent signals, and by pushing this onto the event server, they went up to 50,000, which means they actually have a better, clearer view of the data coming into their business and they can be a lot more competitive.
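A toy version of the spread rule mentioned above: watch price quotes for the same instrument in two venues and signal when the spread exceeds 3%. The quote values, venue names, and field names are invented for illustration; the real customer rule is only described in outline.

```python
def spread_pct(price_a, price_b):
    """Percentage spread between two quotes, relative to the lower one."""
    low = min(price_a, price_b)
    return abs(price_a - price_b) / low * 100

def check_quote_pair(quote_a, quote_b, threshold_pct=3.0):
    # Each incoming pair of quote events is tested against the rule.
    return spread_pct(quote_a["price"], quote_b["price"]) > threshold_pct

nyse = {"venue": "NYSE", "price": 100.0}
lse = {"venue": "LSE", "price": 104.0}

signal = check_quote_pair(nyse, lse)  # 4% spread exceeds the 3% threshold
```

In a production event server this check would run over tens of thousands of concurrent quote streams; the rule itself stays this simple.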

And then, finally, an example of a trading house that adopted WebLogic Real Time. Prior to that, running their arbitrage solution, they were losing around $200,000 per day. They installed their solution as standard on top of WebLogic Real Time, and within three days they decided they were going to purchase it; they automatically inherited the values you get from WebLogic Real Time, it negated the competitive disadvantage, and they’ve rolled forward. And, again, it’s important to add that these values are unique to the business, whether it’s the platform we have around Event Server, or WebLogic Real Time with low latency, guarantees and automatic inheritance; these are phenomenal technologies that we believe are quite exciting for people to look at moving forward. So, with that said, I’ll hand back to Beth for any questions.

BGB:    Thank you Guy. I’d like to remind everyone you can send me your questions by pressing the “Ask a Question” button and before we go to your questions, we have a question for you. Are you intrigued by or convinced of the value of an application server for event processing? Please take a moment to answer the question while we read over your questions.

Roy, given the definition of complex events, wouldn’t composite event be a more accurate term?

RS:    Well, that’s an interesting question. Complex event did actually become the most common term for an event that signifies a number of other events. Now, composite event was used by the active database community in the 1990s, and it had a slightly different meaning: it meant a complex event that actually carries a copy of the data from some of its member events. Because composite event had that specific meaning for that community, using it as a general term for complex event could lead to some confusion.

So, complex event is the term that is being used, although of course it’s ironic that a complex event is actually simpler to deal with than simple events, because you might need hundreds of simple events to convey the same information that you can convey with one complex event. So, complex event is the general term for an event that summarizes other events.

BGB:    OK. Now, you mentioned the government; are there other industries seriously planning for EDA now?

RS:    Absolutely. Well, there are really two answers to that question. If you look at industries that use the terms event processing and complex event processing, that’s a fairly small community of developers at the present time. It would include military intelligence, the capital markets and a number of other places. But a lot of companies are doing complex event processing and calling it business activity monitoring, or calling it by the name of their application. So supply chain applications and many of the other kinds of monitoring applications out there are doing complex event processing whether they call it that or not.

BGB:    OK. Now you gave figures on typical business event volume. These would be correlated into complex events. Do you have similar figures for the number of complex events that a large organization might have defined?

RS:    Yeah, of that 10^4, which is 10 thousand, to 10^7, 10 million, business events per second, most are simple events. Now, simple and complex are in the mind of the beholder; it’s kind of relative. Any simple event can be broken down into yet simpler events, so you can always decompose anything into smaller units. A simple event is one that hasn’t been decomposed, for which there is no sensible reason to go any more fine-grained. That’s really the only difference between a simple event and a complex event. Of those 10,000 to 10,000,000 business events per second, most are simple events, and the ratio is probably an order, maybe two orders, of magnitude, maybe 100 to 1, simple events vs. complex events. So you could already be looking at tens of thousands of complex events per second being recorded and transmitted in a large enterprise today.

BGB:    OK. Now, Guy, there are a number of very specific questions about the product coming in and maybe I’ll have you answer some of these off-line. But, can you please explain the difference between the offering of CEP vendors and BEA’s offering?

GC:    Yeah, I mean, in essence we have complex event processing inside the event server. In the same way that BEA normally approaches the market from the IT development perspective, we wanted to provide a more comprehensive, integrated platform, so you have complex event processing with real-time capabilities integrated into it, near real time, so we can handle more aspects around scale, quality, complexity and latency. It’s really a time-to-market, fast development platform that allows people to have more of the code integrated into a single, you know, unit.

BGB:    OK. Now, Roy, a consultant writes in asking, “Am I too attached to the request/reply model when I think about how to reconcile event-driven operations, or is verification of expected action part of the feedback in the event processing model?”

RS:    Well, in the event processing model, in some situations you do need feedback, some sort of acknowledgement, so that the sender of the event object knows it has been received. In many cases you don’t. In the stock market example, there is no acknowledgement that goes back to the Exchange or to the data feed provider saying, I’ve gotten this stock tick. So, for many types of events, you don’t want acknowledgements. Now, if you’re sending a payment or some sort of transaction and you need data integrity, you have a choice: you can do it with the request/reply model, which is often done; you can do it with a push model which is not event driven; or you can do it with an event-driven push model. In that case, you may still want to send back an acknowledgement message, and you can do that. An event-driven architecture can have a cyclic directed graph, so something comes back to the event source, and that would still be event driven even though an acknowledgement is coming back; it wouldn’t be request/reply. There would still be no method or operation name involved, which is one of the key attributes of event-driven architecture.

BGB:    OK. And just to get clarification on terms and technologies, when you use event processing in a straight through process, is that the same as publish and subscribe?

RS:    Well, that's funny. I think a lot of people are already using publish and subscribe for event-driven architecture, but you don’t have to. You can use any form of one-way messaging to implement event-driven architecture. It’s true that a lot of people doing event-driven or message-driven processing in the past called these pub-sub, publish and subscribe, systems, so there’s a high overlap. But you can do publish and subscribe without doing event-driven architecture, and you can do event-driven architecture without doing publish and subscribe. So an architect really should be careful to think about whichever variation is appropriate for the particular use case.
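The distinction Roy draws can be made concrete with a minimal in-process publish-and-subscribe sketch. This is illustrative only (the `Broker` class and topic names are invented): the point is that publishers hand off events one-way, with no reply and no method name, and need not know who, if anyone, is listening.

```python
class Broker:
    """A toy topic-based broker: subscribers register callbacks per topic."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, event):
        # One-way delivery: no return value flows back to the publisher.
        for callback in self.subscribers.get(topic, []):
            callback(event)

broker = Broker()
received = []
broker.subscribe("price.updates", received.append)
broker.publish("price.updates", {"symbol": "ACME", "price": 101.5})
broker.publish("order.created", {"id": 7})  # no subscribers: silently dropped
```

An event-driven system could equally use point-to-point one-way messages instead of topics, which is exactly why pub-sub and EDA overlap without being the same thing.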

BGB:    OK. And another clarification: presumably by complex you mean lots of inputs very quickly, but wouldn’t that be more like event stream processing, where you’re filtering? I mean, is there a difference? How do you define the difference between that and complex event processing?

RS:    Well, event stream processing is, I think, always complex event processing. It’s a type of complex event processing because when an event stream comes in, the stream coming in is probably simple events in most cases, although they don’t have to be and then from those, you’re looking at multiple events to find a pattern of some kind, and you’re aggregating or you’re correlating, you’re filtering and so forth and so what you get out of event stream processing is almost always a complex event. So…

BGB:    OK, but there are some complex events that you have streams coming from different activities that would come in and you sort of have to correlate it against patterns, it’s not just like filtering tens of thousands of events coming in for one type of event like stock price.

RS:    Exactly. Event stream processing has a connotation that what’s coming in may just be events of one type and you’re finding a pattern in those events, although any commercial product that does event stream processing is capable of finding relationships across more than one event stream. Still, some kinds of complex event processing may be looking at dozens of different kinds of events, and in most cases people wouldn’t use the term event stream processing for that kind of work. But there really is no hard and fast definition or bright line between event stream processing and other types of complex event processing. So it’s best to look at the usage patterns in the particular application so you understand what’s happening, and not put too much weight on either of those titles.
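A tiny example of what "finding a pattern in a stream of one event type" means in practice: deriving a single complex event from a run of simple tick events. The rule here (three consecutive price drops become a hypothetical "Downtrend" event) is invented for illustration, not taken from any product.

```python
def detect_downtrends(prices, run_length=3):
    """Emit one complex event after `run_length` consecutive price drops."""
    events = []
    drops = 0
    for prev, curr in zip(prices, prices[1:]):
        drops = drops + 1 if curr < prev else 0  # extend or reset the run
        if drops == run_length:
            events.append({"type": "Downtrend", "low": curr})
            drops = 0  # start counting a fresh run
    return events

# Six simple events in, one complex event out.
events = detect_downtrends([105, 104, 103, 102, 106, 105])
```

The output event summarizes several inputs, which is what makes it "complex" even though, as Roy notes, it is simpler to act on than the raw ticks.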

BGB:    Now, do you see a need for a search for already defined complex events if you, you know, are subscribing to them? How do these systems work?

RS:    Well, I think it all depends on the business situations. All of these things are meant to be tools to help you make faster and better decisions. So, in some cases, to build a business activity monitoring application, you can leverage some existing sources of complex events. In other cases, the only sources of events you have are simple events and then you are going to have to create the rules by which you start with simple events, create complex events then perhaps you want to build a hierarchy. You may then correlate multiple complex events to get yet a higher level of complex event so you may have a hierarchy where you’re cascading events and getting them more refined, summarizing significance on yet higher and higher levels as you go.
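The cascading hierarchy Roy describes can be sketched as repeated roll-ups: simple events collapse into level-1 complex events, which collapse again into a higher-level summary. The batch sizes and event labels below are hypothetical.

```python
def summarize(events, batch=10, label="summary"):
    """Collapse every `batch` consecutive events into one higher-level event."""
    return [{"type": label, "count": batch}
            for _ in range(len(events) // batch)]

simple = [{"type": "tick"}] * 100                                  # raw events
level1 = summarize(simple, batch=10, label="OneSecondSummary")     # 10 events
level2 = summarize(level1, batch=10, label="TenSecondSummary")     # 1 event
```

Each level carries more significance and less volume, which is the refinement Roy is pointing at.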

BGB:    OK. Now, Guy, does BEA have anything to manage latency in the real-time application?

GC:    Yes. Integrated into WLRT 2, which is WebLogic Real Time, we have a latency analysis tool, and this enables you to take the latency spikes and drill into them in production, which again is completely unique to the BEA offerings. So, yes and yes.

BGB:    OK. We are running out of time so if we don’t get to answer your question, we will take them off line but there are a number of questions, Roy, about where we are in the market, why is it a growth sector in Gartner’s view and, finally, is there any specific limitation in the CEP concept perhaps that will eventually be overcome as the market matures?

RS:    Well, we certainly see a lot more activity in this market than existed five years ago. A lot of this technology was developed in academia in the late ’90s and early 2000s. The commercial products are generally only a few years old, sometimes less than a year. And a lot of this comes about because things people always wanted to do, having more up-to-date information about certain things, are suddenly becoming practical.

So, I think it will take between 10 and 20 years for this to be largely exploited within companies.
07/13/2007 Progress Software Podcast with Jon Bachman Gold Club Protected
David Kelly Podcast with Jon Bachman







David Kelly: Welcome to a special ebizQ podcast devoted to ESBs and integration patterns. I'm your host, ebizQ senior analyst, David A. Kelly. With us is Progress Software's senior director for product marketing, Jon Bachman. Welcome, Jon! Thanks for taking the time to talk with us today.

Jon Bachman: Oh, thank you, David. I look forward to this session.

DK: Great! One of the things I wanted to start off with is that over the past few years, Enterprise Service Buses, or ESBs, have emerged as a key enabler for more agile and effective integration across enterprise systems and business processes. Yet business success is more than simply purchasing the right technology. It often means knowing how to use what you have most effectively. Can you please give our listeners a perspective on the key best practices they should be thinking about when it comes to ESBs?

JB: Yeah, certainly. We have found from our customers that the first project most people use an ESB for is a relatively simple one that has to do with onloading and offloading files, moving them from batch-style, sequential, overnight processes to a more real-time, continuous pipeline of processes. So I guess one of the key practices here is: start with a project which is easily understandable, one which will benefit the organization when it is complete, and don't be surprised if the first project is a relatively simple replacement of an FTP or file transfer type application where you had two to three different file transfers combined with some sequential processing.

More Resources:

Free White Paper From Progress Software:
A Playbook for ESBs

Much like templates for generating code modules, integration patterns enable organizations to reuse code and configuration elements to maximize the deployment of integration projects quickly. Combining the advanced capabilities of an Enterprise Service Bus with integration patterns can deliver a number of key benefits to both business and IT: reduced risk, faster time to business value, greater agility and lower costs.
Click here to download this free white paper

It is the most common first project for an ESB. And we call that pattern the continuous pipeline processing pattern. And the primary requirements are to understand how the product or the ESB you've purchased supports what I would call file on-ramping and file off-ramping. And the ability to break a batch of transactions that are sort of encoded in that batch file into individual messages or transactions that are then turned into events on that enterprise backbone or Enterprise Service Bus.
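The file on-ramping step Jon describes, breaking a batch file into individual messages, can be sketched minimally. This is a hypothetical illustration; the record format and the `send` callback stand in for whatever transport the ESB actually provides.

```python
import io

def on_ramp(batch_file, send):
    """Turn each non-empty line of a batch file into one bus message."""
    count = 0
    for line in batch_file:
        record = line.strip()
        if record:                     # skip blank lines in the batch
            send({"body": record})     # one message per transaction
            count += 1
    return count

messages = []
batch = io.StringIO("order,1001,49.99\norder,1002,15.00\n\n")
sent = on_ramp(batch, messages.append)
```

Once each record is an individual message, it becomes an event on the bus and downstream services can process it continuously instead of waiting for the overnight batch.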

DK: Okay. You mentioned integration patterns as one of the parts of the solution. Can you provide some additional insight into what you mean by integration patterns and how they work with ESBs?

JB: Yeah, sure. The pattern I just mentioned, moving from file-based to real-time processing, is one we see frequently among our customers. There are two or three others that I think we should talk about. One of which is access to remote data. If you think about a portal as an example, it's very common for a portal to assemble data from multiple data sources into a single page and present that to a user. But when you get into larger-scale situations where the data being accessed is not local to the application that's serving the pages, you have an integration problem that we talk about as the remote information access problem, where you have to orchestrate the request and the reply to those multiple back-end data sources and assemble the response for the end user.
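The remote-information-access pattern amounts to a fan-out of requests followed by assembly of one reply. A hedged sketch, with invented back ends standing in for the remote data services:

```python
def fetch_portal_page(user_id, sources):
    """Query each named source and assemble the results into one reply."""
    return {name: fetch(user_id) for name, fetch in sources.items()}

# Hypothetical back ends; in a real deployment each would be a remote call
# orchestrated over the bus.
sources = {
    "profile": lambda uid: {"id": uid, "name": "Pat"},
    "orders":  lambda uid: [{"order": 1}, {"order": 2}],
}
page = fetch_portal_page(42, sources)
```

The ESB's job in this pattern is the orchestration the sketch glosses over: issuing the requests (possibly in parallel), correlating the replies, and handling sources that fail or time out.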

There are other patterns which are similar to that which are repeatable and frequently found in enterprise integration and SOA projects in general. Another one is remote data distribution. You might have, for example, an application that is the master record keeper for customers or products. And when a change is made to a customer or to a product definition, you need ten or twenty other applications to know about that as quickly as possible. You want to make sure that you reliably distribute that information and that change out to those applications without failure.
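The remote-data-distribution pattern is the inverse fan-out: when the master record changes, the change event is pushed to every dependent application. The class and callback registry below are invented for illustration; a real bus would add reliable delivery and retry.

```python
class MasterRecordKeeper:
    """Toy master record store that fans change events out to subscribers."""

    def __init__(self):
        self.records = {}
        self.downstream = []  # callbacks standing in for dependent apps

    def register(self, callback):
        self.downstream.append(callback)

    def update(self, key, value):
        self.records[key] = value
        change = {"key": key, "value": value}
        for notify in self.downstream:
            notify(change)  # a real bus would guarantee delivery here

keeper = MasterRecordKeeper()
app_a, app_b = [], []
keeper.register(app_a.append)
keeper.register(app_b.append)
keeper.update("customer:42", {"name": "Acme Corp"})
```

The reliability requirement Jon emphasizes, distributing the change "without failure", is exactly what distinguishes an ESB from this in-process toy.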

And then finally, a fourth one, which I think is useful to imagine, is a combination of remote information access and remote data distribution, where you have applications that rely, in real time, on the change in status of some enterprise event. So, for example, airports have a lot of applications that run based on the arrival or departure of an aircraft at the terminal or gate. And you need to set in motion concurrent processing to respond to that real-world business event.

I think the key benefit of an ESB is that it provides a palette on which you can build enterprise integrations without worrying about sort of the scope of the project or how many different sites might be involved, how many different applications might be involved. And once you have that clear palette on which to paint, you can actually pull out the pattern and apply it to a particular business problem and solve it quickly and in a reproducible fashion.
--Jon Bachman

So, just to summarize, there are patterns in enterprise integration, whether it's continuous pipeline processing, access to or distribution of data, or the response to real-time business events, that recur very commonly in enterprises and can actually be turned into patterns or templates that can be reused at a very high level of abstraction.

And what that really allows you to do, is not worry about having to reinvent or do each integration once and then throw all your expertise away. You can actually capture that pattern and save it, both at the low-level components as well as the higher level, the macro level pattern itself. And then you form really a language, with which your team can talk with one another and a set of component definitions, event definitions, service interfaces that come as a package and allow you to rapidly deploy another instance with perhaps minor variations to that same pattern.

So, really, I think the key benefit of an ESB is that it provides a palette on which you can build enterprise integrations without worrying about sort of the scope of the project or how many different sites might be involved, how many different applications might be involved. And once you have that clear palette on which to paint, you can actually pull out the pattern and apply it to a particular business problem and solve it quickly and in a reproducible fashion.

07/09/2007 Full Transcript: Forrester's Boris Evelson on the State of the Business Intelligence Market Gold Club Protected

Beth Gold-Bernstein: Hello, everyone! And welcome to the ebizQ "BI in Action" virtual conference. I'm Beth Gold-Bernstein, director of the ebizQ training center and chair of the conference. Before introducing this keynote presentation, let's review our interactive environment. You can chat with other attendees, send a business card, or leave a message. Note the tool bar at the top of your screen, where you will see "My Briefcase." That's where you can retrieve the business cards, messages and files that you download during the conference. To download a copy of today's presentation, click the "Files" button on the grey console.

After this presentation, we will take your live questions. You could submit your questions by clicking on the "Ask a Question" button on the gray console. To enlarge any of the slides, click on the magnifying glass, that plus sign to the right of the slide.

Now, it's my pleasure to welcome Boris Evelson, principal analyst at Forrester, who is going to enlighten us on the current state of the business intelligence market. Welcome, Boris!

Boris Evelson: Well, hello and good day, everyone! Beth, thank you very much for that nice introduction. I hope everyone is doing well today, and I thank you for joining me in what I hope will be a very informative presentation. I'm assuming everyone can see the slides and the controls for chat, etc. And I hope you can see an "enlarge" button in the upper right corner of the slides. I will be showing some graphics and there will be some small text in my presentation, so feel free to enlarge the screen as needed.

So, let's proceed. What is this mysterious and often-mislabeled term, "business intelligence?" This is what we are talking about today. Let's get on the same page here. Let me first of all, start by saying that the information revolution is manifesting itself by producing mountains of digital data and that is becoming more and more challenging to process and analyze. And not only are we generating more data every day but our approach to data analysis, such as structured databases, indices in databases, distributed data architectures, all of that is also contributing to data growth.

More Resources:

BI in Action: Full Replays -- including slide shows -- of all our Webinars
Boris Evelson answers follow-up questions to this Webinar


Upcoming Webinars

1. Start Small; Think Big -- Driving BPM ROI featuring Pratt & Whitney
Guests: Robert O'Connor, Integrated Services Solutions - Business Process Solutions, Pratt & Whitney
Brandon Lackey, Global Solutions Director, BEA Systems

2. Navigating BPM Icebergs with Dynamic Business Applications
Guests: Jeff Shuey, Global Alliance Director, K2 Steve Yi, Senior Product Manager, Connected Systems Division, Microsoft

3. Event Processing: Competitive Advantage Through Situational Awareness
Guest: Roy Schulte, Vice President and Research Fellow, Gartner, Inc.

A Pertinent Podcast

Choosing Web-Based BI Applications
Guest: Lyndsay Wise, Technology Evaluation Centers
Hear Podcast | Read Full Transcript

And, furthermore, increasing importance of unstructured data, which we will talk about later in the presentation, increasing importance of that to making an informed and effective business decision, adds to data proliferation in business intelligence environments. So in response to these mountains of data and constant, ever-changing information requirements, we have come up with a set of methodologies, processes, architectures and technologies that transform all this raw data into meaningful and useful information.

So contrary to a more popular definition, in my opinion, business intelligence is much more than just reporting, dashboards and analytics, which is what you typically hear out there and we will see that further down in the presentation.

In today's presentation, I will cover several areas: First of all, I will talk about the fact that even though business intelligence has been around for decades, we've definitely seen quite a significant re-emergence of interest in business intelligence applications and various BI offerings by BI vendors. We will also talk about how I define the business intelligence market segment and we will go through some of the major players of the market and talk about the strength and weaknesses. I will then show everybody where I think the market is going in the next few years and we will wrap up with the most important topics, such as best practices and recommendations.

Re-emergence of BI applications

So let's start with why I think there is a re-emergence or an up-trend in the business intelligence market. I've been in this business for over 25 years and I've seen many ups and downs in this market. Lately, however, there is definitely a significant up-trend, and I am attributing it to at least two major reasons, though there are many more. Let's go through them.

First of all, being efficient -- keyword here being efficient -- is no longer enough. Enterprises can no longer stay competitive just by squeezing more efficiencies from operational applications such as trading and clearing systems, ERP, CRM, supply chain management, etc. Now business intelligence applications are needed for businesses, processes and business operations to become more effective. Key word here: effective.

So, for example: operational applications such as a customer service CRM system can be used to efficiently process a customer credit application. But I need business intelligence analytics to effectively create a customer segmentation analysis in order to extend the credit offer to a much more targeted customer segment, so that I can get a better response rate and better cross-sell and up-sell ratios. This is how I stay competitive these days.

The second reason that business intelligence is in an up-trend is that multiple factors, such as globalization, increased competitiveness, amplified emphasis on operational risk and regulatory compliance, and mountains of device-generated data, are just a few of the many drivers behind the continuous exponential growth of the digital data our world produces. It's really easy to see how companies today are swimming, and in some cases drowning, in data.

...it's been estimated that the volume of world data doubles approximately every three years. So as recently as 2003, the world's information production was 5 exabytes. That's the number five with eighteen zeros. To put that into context, 5 exabytes is equivalent to the amount of information contained in 37,000 libraries the size of the Library of Congress, or one gigabyte for every person on earth.
--Boris Evelson

To be more exact, it's been estimated that the volume of world data doubles approximately every three years. So as recently as 2003, the world's information production was 5 exabytes. That's the number five with eighteen zeros. To put that into context, 5 exabytes is equivalent to the amount of information contained in 37,000 libraries the size of the Library of Congress, or one gigabyte for every person on earth. And growing at about 30 percent a year, we'll reach zettabyte sizes by 2010. That's a number with 21 zeros. And I and my colleagues who I'm sure are on the phone today, we are the ones who have to deal with it, because most of that data is stored on magnetic media and that's the data we deal with every day.

Added to the tall tasks companies are facing today, such as capturing, maintaining and utilizing the right data at the right time, is the fact that the vast majority of data is unstructured, with current estimates hovering around 80 percent. So what is being done to deal with this challenge today? Well, business intelligence is moving toward encompassing both structured and unstructured data, traditionally two separate worlds.

Some call it Data Warehousing 2.0, or DW 2.0. It's a move certain to make analytics significantly more comprehensive and effective, but, unfortunately, infinitely more complex. The unstructured data that companies need also isn't readily available from within. Most of the time, it's outside of the company, and it's in exabytes. So business intelligence architectures will need to be highly scalable, flexible and even more capable of managing large loads of both structured and unstructured data from disparate sources.

Defining business intelligence and looking at vendor offerings

So now that we understand the whys behind business intelligence market growth, let's get on the same page as to what it is. Remember how I defined business intelligence at the beginning of the presentation? I defined it as a set of methodologies, processes, architectures and technologies that transform all this raw data into meaningful and useful information. You can actually see this on the slide, and my apologies for the small print; please feel free to use the enlarge button. There was no way around it: there were simply too many components that I wanted to show on one slide.

So, as you can see, I track around eight major layers or sections in this entire environment and a total of over 50 individual components. Even though we don't need to implement every component in every case, a good majority of these components are needed for a successful, scalable and useful business intelligence implementation.

As you can see, what many define as business intelligence, namely just the reporting, dashboards and analytics that you see in the middle rows of the screen, is really nothing but a few components of the so-called presentation layer. And that presentation layer is just one layer of what I call the "business intelligence stack." These cannot exist in isolation from most of the other components that discover the data, transform the data, cleanse the data, model the data, and provide a multitude of other supportive and related functions that are necessary for an effective business intelligence implementation. So it's quite a big stack.

...no vendor offers the entire solution on a single platform, although some vendors are getting close.
--Boris Evelson

Let's move on to the business intelligence ecosystem. If you recall the 50 or 60 components that you saw on the previous slide, the key point here is that almost no vendor offers the entire solution on a single platform, although some vendors are getting close. So who are the major players in this business intelligence, for lack of a better word, ecosystem? We have so-called pure plays that offer parts of the solution. Companies like Cognos, MicroStrategy, Actuate and Information Builders offer reporting and analytics. And companies like Informatica and Ab Initio provide data integration services.

In this particular section, companies like Business Objects and SAS actually come closest to offering almost the entire stack of components. When a company wishes to deal with only a single vendor for all or most of its enterprise applications, not just business intelligence, it looks for so-called "platform vendors" such as Microsoft and Oracle.

To read the rest of this transcript, a free ebizQ Gold Club membership is required. Click here to enroll and gain access to a wealth of BI knowledge.

See a full replay of this Webinar -- complete with slides
07/05/2007 Full Transcript: The Role of BI in SOA and BPM Gold Club Protected
  • DATE: June 20, 2007; 1:00pm - 2:00pm EST
  • Featured Speakers:
  • Panel Leader: Beth Gold-Bernstein, ebizQ Analyst
  • Guy Weismantel, Senior Director of Corporate Marketing, Business Objects
  • Rob Risany, Director, Product Marketing, Savvion
  • Michael Corcoran, Chief Marketing Officer, Information Builders
  • Joe McKendrick, BI in Action Blogger
Beth Gold-Bernstein

BGB: Welcome, everyone, to all our attendees and panel members, to this final session of the ebizQ BI In Action virtual conference. If you missed either Bill Gassman of Gartner or Boris Evelson of Forrester this morning, or you'd like to recommend their presentations to your colleagues, please know that both will be available for archive viewing.

Once again, this is Beth Gold-Bernstein, chair of the conference and director of the ebizQ Training Center. After we have an initial discussion with our panel members, you will have the opportunity to ask questions. To submit a question, press the "Ask a Question" button on your gray console. Be sure to stick around to the end of the Q&A for a give-away of four advance copies of BI for Dummies, due out in September; our four lucky winners will be the first to receive their copies, so be sure to stay around.

During this panel discussion, we are going to be discussing the role of BI in BPM and SOA. Our panel members are Rob Risany, Director of Product Marketing for Savvion. Welcome, Rob. Michael Corcoran, Chief Marketing Officer for Information Builders; Guy Weismantel, Senior Director of Corporate Marketing, Business Objects; and our very own Joe McKendrick, ebizQ SOA and BI in Action blogger. Welcome to all of you and thank you for joining us today.

Now, after a number of discussions at ebizQ, we decided to do a BI In Action virtual conference because we started to see a trend of companies incorporating BI into their BPM and SOA solutions. We viewed this as an important trend for business managers to understand because of the potential benefits it offers, and we're going to start the panel discussion talking about these. But as we know, sometimes it falls to IT to inform and educate the business on the potential of technology, so we're also going to address that side of the discussion. My first question is for you, Rob: BI has been around much longer than either BPM or SOA, and these days BPM is getting very popular. In your view, what is the value of BPM to the business, and how does BI enhance that value?

Why BI Must Enhance; Why IT Must Educate
RR: Great question, Beth. I think what we've seen is that BPM -- business process management -- basically creates a role for business people that they haven't really had before in enterprise architecture. Business process management gives business people the ability to think about the things that affect them most and turn them into running solutions within the business. That's really the crux of BPM's power within the business. What's interesting is that when business people are thinking about the initiatives that affect them most -- whether it's order management or managing customer satisfaction, depending upon your industry -- the issue that really comes into play is what information is critical for you to be able to support the decisions you have to make for those mission-critical processes.

BI creates a business face for the data, BPM creates a business face for the process while SOA is an underlying approach for building applications across the business.
                                     --Robert Risany

And that's really the role that BI has in the context of BPM. It's about providing enough information so that the process solutions created by business people, working in conjunction with IT, have what they need for the right decisions to be made. There are a lot of different types of information available, depending upon the time frame and the types of processes you're supporting, but the real crux of BI with BPM is providing the content in the context of the core processes. That's really the relationship there. On the SOA side, SOA is the infrastructure IT uses to enable the business initiatives, and what we see with our customers is that the most successful SOA-oriented companies -- the ones really leveraging the power of service-oriented architecture -- realize that BPM can create a business face for SOA. That's the relationship between the three: BI creates a business face for the data, BPM creates a business face for the process, while SOA is an underlying approach for building applications across the business.

Making BI Proactive

BGB: OK. Thank you, Rob. Now, Guy, Business Objects is a BI company that has been around for a while. How would you describe the role of BI in the business, and how do you think it enhances these new business initiatives?

More Resources

BI in Action: Full Replays -- including slide shows -- of all our Webinars


Upcoming Webinars

1. Start Small; Think Big -- Driving BPM ROI featuring Pratt & Whitney
Guests: Robert O'Connor, Integrated Services Solutions - Business Process Solutions, Pratt & Whitney
Brandon Lackey, Global Solutions Director, BEA Systems

2. Navigating BPM Icebergs with Dynamic Business Applications
Guests: Jeff Shuey, Global Alliance Director, K2 Steve Yi, Senior Product Manager, Connected Systems Division, Microsoft

3. Event Processing: Competitive Advantage Through Situational Awareness
Guest: Roy Schulte, Vice President and Research Fellow, Gartner, Inc.

A Pertinent Podcast

Choosing Web-Based BI Applications
Guest: Lyndsay Wise, Technology Evaluation Centers
Hear Podcast | Read Full Transcript

GW: Well, thanks, Beth. There are a lot of ways that business intelligence today is being embedded within company processes and operations to make better use of the information organizations already have within their four walls and their data structures. The interesting part about business intelligence today is how it's moving from a backward-looking view of what the business used to do and becoming, as Rob mentioned, much more process-aware and much more ingrained in the context of how people solve the key business problems and challenges in front of them. So, as companies have started to make better use of their information and get more timely in their use of their data, they've been able to marry up that data with the specific processes they use to solve the business problem or address the issue in front of them.

So, here's a really easy example. If we see that our sales forecast is headed down and we're not going to make the revenue forecast we committed to the Vice President of Sales, previously we only had a trailing view of what was happening -- we could see that, yes, our sales were definitely headed down. But through BI, people are really changing the way they use this information, looking proactively ahead of the curve and saying, "We've got two weeks or so left in the month of June, and I see that I might not make my sales target. So now I have an opportunity, before the month or quarter closes, to take some corrective action. How can I use this information to decide whether I need to do something with my inventory, change my product mix, change my pricing, or run a promotion?" Where we marry up with business process management is in how a company actually handles that situation. BI can get you part of the way there -- even most of the way -- giving you the insight to say, "Aha, it's this product that's causing the issue, or this region, or this salesperson." But then we marry up with the processes that are in place, the processes that folks like Savvion put into the business, to make sense of the information so the end user and the company can actually make the right decision with the right data they have today.
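Guy's mid-month pacing example can be sketched in a few lines of Python. This is a minimal illustration only, not anything from Business Objects' products; the figures and field names are hypothetical. It projects month-end sales from the month-to-date run rate and raises a flag while there is still time to act:

```python
from datetime import date

def project_month_end(mtd_sales: float, today: date, target: float) -> dict:
    """Naive straight-line projection of month-end sales from month-to-date pace."""
    days_in_month = 30  # simplified; a real report would use the actual calendar
    run_rate = mtd_sales / today.day       # average sales per day so far
    projected = run_rate * days_in_month   # projected month-end total
    return {
        "projected": projected,
        "gap": target - projected,
        "alert": projected < target,       # flag the shortfall before month-end
    }

# Two weeks or so left in June: $440K booked by June 16 against a $1M target.
result = project_month_end(mtd_sales=440_000, today=date(2007, 6, 16), target=1_000_000)
# result["alert"] is True, with a projected shortfall of $175K
```

From here, a BPM system would route the alert into a corrective process -- adjusting inventory, pricing or promotions -- which is exactly the hand-off between BI and BPM that the panel describes.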

Combining BI and BPM: Better Processes for Smarter People

BGB: Thank you, Guy. So, we've talked about how both BI and BPM provide business benefit. Michael, can you tell us what kind of business problems or solutions can be accomplished using BI and BPM together that can't be done with either of them alone?

MC: Absolutely, Beth. At Information Builders, we've had a unique opportunity and perspective because we have two separate lines of business. One is our business intelligence business with FOCUS. The other is something called iWay Software, which is much more focused on operational systems integration and process management. And what we've really learned over the last couple of years is that the wall between these operational systems and these BI systems is really coming down. Business intelligence is no longer a 9-to-5 job that just happens in the back office. It's really infiltrated down to a process level, so there are two opportunities for organizations to really improve the business.

One would be to use BI technology and leverage better integration to bring much more real-time information to people who are directly engaged in a business process -- how can we make them more intelligent? But then there's another emerging opportunity, where you take the technologies and merge them together. You have an opportunity to automate the intelligence somewhat, to bring what we call BI down to a process-driven level, and there are three things you can do there.

And what we've really learned over the last couple of years is that the wall between these operational systems and these BI systems is really coming down. Business intelligence is no longer a 9-5 job that just happens in the back office. It's really infiltrated down to a process level so there's two opportunities for organizations to really improve the business.
                                     --Michael Corcoran

One is to enable true event-driven alerting of business events and business agents within a process. As BI vendors, we've all been focused on this problem for a number of years, but a lot of the focus has been on polling of data and information that is just stored in databases. What we can do today with real-time integration technology is look at events as they flow through our systems, almost at a transaction level, in a process. At any point in the process, we have the ability to alert people to events that are happening. A second is more formal process monitoring -- what Gartner calls business activity monitoring, or BAM. Within those two opportunities, we've obviously seen BI being used in a much more real-time model. But what's really exciting to us is the new wave that we see: how do we take BI and truly embed it down into the business process itself? Both of the speakers before me talked about the relationship among SOA, BPM and BI. These three technologies come together in ways that will let us build much more intelligent processes, not just smarter people in our organization.
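The contrast Michael draws between polling a database and event-driven alerting can be made concrete with a toy publish/subscribe sketch in Python. This is an illustration of the general pattern, not any vendor's API; the event names and the threshold are hypothetical:

```python
from __future__ import annotations
from typing import Callable

class EventBus:
    """Minimal in-process pub/sub: handlers fire as events occur, no polling."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[dict], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type: str, event: dict) -> None:
        for handler in self._handlers.get(event_type, []):
            handler(event)  # the alert fires the moment the event flows through

alerts: list[dict] = []
bus = EventBus()
# Alert on any single order above a threshold, mid-process, with no database poll.
bus.subscribe("order_placed",
              lambda e: alerts.append(e) if e["amount"] > 10_000 else None)

bus.publish("order_placed", {"id": 1, "amount": 2_500})
bus.publish("order_placed", {"id": 2, "amount": 18_000})
# alerts now holds only the large order (id 2)
```

A BAM dashboard is essentially a set of such subscriptions maintained over the live event stream, rather than a report run against yesterday's warehouse load.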

The Killer Apps: BI and SOA

BGB: OK, excellent. So now we've established the links between BI and BPM. They are certainly separate disciplines and technologies, and each delivers unique benefits to the business on its own, but together they are creating new opportunities for delivering business agility and intelligence. We started this discussion on the business side; now let's turn to how we deliver these solutions and, as Rob alluded to, the SOA side of this story. While neither BPM nor BI requires SOA for implementation, both benefit from being implemented on top of an SOA infrastructure. Joe, what do you see as the relationship between BI, BPM and SOA?

JM: Thanks, Beth. Well, I think Rob hit it when he described SOA as the infrastructure enabler for BI and BPM. SOA is essentially how the IT -- information technology -- folks can contribute to the story. And it's interesting, because there's been a lot of discussion in the SOA world about what will be the killer application that really drives SOA implementations forward, and I think -- and I've seen quite a bit of agreement on this -- that BPM and BI, BI in particular, will be that killer app. So to speak, you know, there's the question of what comes first, the chicken or the egg.

Most data and most applications still reside scattered across the enterprise and something needs to bring those, that data and those applications, that logic together into a single place and SOA is that enabler.
                                     --Joe McKendrick

Do you develop the SOA and then put BI on top of that, or do you have BI that evolves an SOA infrastructure underneath it? It's probably going to happen both ways. What companies and enterprises need at this stage is a single view of the truth -- a single view of their enterprises, a single view of their data. There's a lot of emphasis on corporate performance management and dashboards, being able to look across the business, across your enterprise, at what's happening. There's an emphasis on competing on analytics, the ability to employ BI and analytic technology to gain a competitive edge in your business.

And none of this is possible without an enterprise view. Unfortunately, most companies these days are stovepipes -- I know stovepipe is still kind of a cliché term, but it's still a challenge. Most data and most applications still reside scattered across the enterprise, and something needs to bring that data and those applications, that logic, together into a single place. SOA is that enabler.

Case Studies: Motorola and NetManage -- Triage and Compliance

BGB: OK, thank you, Joe. So, let's turn our discussion to BI in action. Rob, what are Savvion customers actually doing with this technology?

To hear the rest of the podcast, please log in using your free ebizQ Gold Club membership.

07/02/2007 Full Transcript: Gartner's Bill Gassman on How BI Drives Business Performance Gold Club Protected

Beth Gold-Bernstein: Hello, everyone! And welcome to the ebizQ BI in Action Virtual Conference. I'm Beth Gold-Bernstein, director of the ebizQ Training Center and chair of the conference.

It's my great pleasure to welcome Bill Gassman, research director of Gartner. Welcome, Bill!

Bill Gassman: Thank you, Beth. And thanks, everyone, for attending this Webinar about business intelligence in driving business performance. I'm going to focus on how to achieve a successful BI program within your organization. Managing the performance of a business is a critical part of keeping it competitive. With a robust business intelligence framework, decisions that impact the business in a positive way are going to be easier to make.

Business intelligence isn't new but it's an evolving area of information technology where many organizations struggle to achieve even a basic level of maturity. Without maturity, it's difficult to take advantage of the exciting new technologies that are coming down the pike.

I hope that after viewing this Webinar, you're able to come to a better understanding of what BI can offer and think about how your organization can improve its ability to deliver business intelligence to those who need it. In a Gartner survey of more than 1,400 chief information officers, published in February 2007, we asked if the management of their company had the right information to run the business. Sixty-four percent said no, and only 36 percent said yes. This is a bit scary! Each year, billions of dollars are spent on BI software and another few billion on the people and the hardware to run it.

So what's going on here? What are these people doing? We believe the survey results are due to a number of things. First, while CIOs are usually responsible for the corporate data warehouse, only 40 percent are responsible for making sure the information is treated as a strategic asset and for improving knowledge-worker productivity. In other words, corporate data is all dressed up with nowhere to go. There's been a disconnect between the technical side of BI and the business side of BI.

Just look around your organization to see how many are using Excel with unqualified data to make decisions. And second, and this is a related point, with so much emphasis on building the data warehouse, there really hasn't been the time or the budget to build the business intelligence applications. But we do see that problem starting to be addressed.


Another question in the CIO survey asked what was going to be your organization's top technology priority, and business intelligence applications emerged at the top.

The investments that organizations have made in their infrastructure over the past five years put them in a position to have that investment pay off in delivering business value. There's a good chance that your competition is part of that 64 percent where the information to make the best decisions is not available. This may be a good opportunity if your organization has a good BI program in place. But the opportunity could turn into a threat if building BI applications doesn't remain a top priority. Those that have seen this and are starting to catch up with business intelligence, we're seeing invest more than 10 percent of their software project budgets on BI applications alone. So there's some really good news here.

This presentation looks into three key issues around business intelligence. First, how do organizations leverage information for making decisions and improving business performance? I'll describe how the term "BI" is evolving, give a few examples of how companies are doing it well, and then look into some trends. The second key issue: what are the cornerstones of a business intelligence and performance management strategy? I could spend all day on this topic, but since we have a limited amount of time, I'm going to give some highlights of what we think is important there. And third, what are the five-year trends that you should be looking for? I'll rate some of the BI technologies, and because this is a Webinar leading up to Gartner's Event Processing conference this September in Orlando, I'm going to give a glimpse of how event processing, business intelligence and business process management are all coming together to create a real-time enterprise. And then I'll close with some bottom-line thoughts and recommendations.

Leveraging information and improving performance

So, okay -- let's get on here to the first key issue. Business intelligence is not just a particular technology or product, and it's not just about insights or a single version of the truth. It's an umbrella term that covers a broad range of applications, technologies and methodologies. The purpose of BI is to give users access to information, and to analyze that information, so that they can make better decisions and manage the company's performance.

In order for business intelligence to be viewed as a success, the information and the analysis that is actually used must be actionable and auditable, and the associated decisions must have an impact on the performance of the company that's in line with its plans and objectives. In other words, just having a data warehouse isn't good enough. The supply and use of BI has to become a core business competency that drives the business from the strategic level down to the process level.

The idea that business intelligence is a task for the IT department alone is just wrong! Leading companies have figured this out and have married information and strategy to achieve top business performance. Every year, Business Week magazine selects its top performers. Each company is evaluated on a broad range of criteria -- for example, the one- and three-year total return, sales growth, profit, and so forth. We took a look at the annual reports and the SEC financial statements from some of Business Week's leading companies, and we found clear evidence that BI is part of their strategy from the top down.

So, for example, from the hardware store Lowe's annual report -- I'm quoting directly here: "We are continuously assessing and upgrading our information systems to support growth of our new sales initiatives, control costs and enable better decision making." The investments in BI are being made with a purpose -- to manage the performance of the company -- and that is clearly stated right to the company's investors.

Here is another example. WellPoint is a health benefits company, and in its 10-K annual report filed with the SEC, it stated: "Our business depends significantly on effective information; our ability to correlate pharmacy data and medical management data allows us to provide important information to our members, physicians and other providers, which enables them to more effectively manage these conditions." We all know there are many opportunities to improve the quality and the cost of healthcare through the use of business intelligence. WellPoint knows its business depends on it, so it uses it -- and even goes as far as delivering BI to its customers.

To read the entire Webinar transcript, just log in with your free ebizQ Gold Club membership.
06/25/2007 Full Transcript: Dunes' Stefan Hochuli Talks to ebizQ's Krissi Danielsson Gold Club Protected
06/14/2007 ebizQ podcast: TEC's Lyndsay Wise on Choosing Web-Based BI Applications Gold Club Protected
06/04/2007 Full Transcript: Software AG's Mighael Botha Talks to ebizQ's Joe McKendrick Gold Club Protected
06/04/2007 Full Transcript: IONA's Eric Newcomer Talks to ebizQ's Joe McKendrick Gold Club Protected
06/04/2007 Full Transcript: webMethods' Miko Matsumura Talks to ebizQ's Joe McKendrick Gold Club Protected
06/04/2007 Full Transcript: Oracle's Ashish Mohindaroo Talks to ebizQ's Joe McKendrick Gold Club Protected
05/31/2007 Full Transcript: Lombardi's Phil Gilbert on 'Blueprinting' BPM Solutions Gold Club Protected
05/30/2007 Full Transcript: BEA's Charles Stack Talks to ebizQ's Joe McKendrick Gold Club Protected
05/29/2007 Full Transcript: Forrester's Ken Vollmer on Real-World BPM Gold Club Protected
04/26/2007 Full Transcript of Podcast with IBM's Billy Newport Gold Club Protected

Welcome to another "First Look" podcast. I'm your host, ebizQ's Project Manager, Gian Trotta. Joining us today is Billy Newport, IBM's senior technical staff member, chief architect for WebSphere XD ObjectGrid. He's going to discuss the implementation challenges and application benefits of data grid architecture. Welcome, Billy, and thanks for joining us.

Billy Newport: Thanks a lot, it's great to be here.

Gian Trotta: Billy, what exactly is a data grid and what advantage can it confer on an Enterprise?

BN: Conventional applications have been built in multiple tiers, where you have a data tier and an application tier, and those architectures have served us well over the years. But recently, in some environments, with the way workloads have been increasing, those application architectures have been showing their age. We're now seeing a transition to data grid-type architectures where, instead of pulling the application data from the database, the trend is to integrate the application with the data, so that the data is co-resident with the application. And then we scale out the application using grid technology.
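Billy's point about co-locating data with the application can be sketched with a toy key-partitioned grid in Python. This is an illustration of the general idea only, not the WebSphere XD ObjectGrid API; the class and key names are made up:

```python
from __future__ import annotations
from typing import Optional

class Partition:
    """One grid member: data lives in memory alongside the application logic."""

    def __init__(self) -> None:
        self.store: dict[str, dict] = {}

    def put(self, key: str, value: dict) -> None:
        self.store[key] = value

    def get(self, key: str) -> Optional[dict]:
        return self.store.get(key)

class DataGrid:
    """Routes each key to a fixed partition, so reads are served locally."""

    def __init__(self, n_partitions: int = 4) -> None:
        self.partitions = [Partition() for _ in range(n_partitions)]

    def _route(self, key: str) -> Partition:
        # The same key always lands on the same partition; adding partitions
        # scales the application and its data out together.
        return self.partitions[hash(key) % len(self.partitions)]

    def put(self, key: str, value: dict) -> None:
        self._route(key).put(key, value)

    def get(self, key: str) -> Optional[dict]:
        return self._route(key).get(key)

grid = DataGrid()
grid.put("cust:42", {"name": "Acme"})
# grid.get("cust:42") is answered from the owning partition's local memory
```

The design choice this illustrates is exactly the one Billy describes: rather than every request round-tripping to a shared database tier, each partition owns a slice of the data, and capacity grows by adding partitions.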

To hear the rest of this podcast, a free ebizQ Gold Club membership is required. Registration is quick and easy and gives you access to dozens of high-value features like this one.

04/25/2007 Full Transcript: Podcast with Jared Rodriguez of Skyway Software Gold Club Protected

Welcome to another "First Look" podcast. I'm your host, ebizQ's Product Manager, Gian Trotta. Our guest today is Skyway Software CTO, Jared Rodriguez, who'll talk about how Skyway's Visual Workspace makes end-user customers both willing and able to participate in a truly interactive software delivery process. Welcome, Jared and thanks for joining us!

Gian Trotta: Jared, we'd like to learn a little more about Skyway and your solution, and I guess we'll start by asking this. Can you tell me, who is Skyway Software?

Jared Rodriguez: Sure! Skyway Software has been around since late 2001, and we started the company to address a lot of the challenges we saw in the software development and delivery lifecycle -- the challenges around the expense and complexity of making and delivering software.

We also started Skyway in regards to the SOA initiatives that were springing up at the time, a lot of which we were pioneers in, in past lives before actually beginning Skyway. Several of us worked on the UDDI specification, worked on the SOAP specification. We could really see the writing on the wall that SOA combined with some new styles of delivery could be a very, very powerful tool. We've been actively selling our solutions since 2005, and we've got some great customers that are out there using the Skyway products today and seeing value out of them -- such as TD Ameritrade, British American Tobacco and many others.

GT: That's not a bad client list! What is your specific solution offering and what does it do?

JR: Well, we offer the Skyway Visual Workspace, and we couple our workspace (that's a tool) with a set of processes around it that can really enable an organization to be more accurate in what it delivers -- to ensure that the solutions being built really match up to what the business and the end customers are looking for. That process, combined with our workspace, enables delivery to happen much, much faster than it can be coded today.

To read the rest of the transcript, sign in below with your free ebizQ Gold Club membership.

04/03/2007 SAP Special Supplement: How To Structure Difficult-to-Automate Processes Gold Club Protected
Philip Kisloff explains how he designed automated systems with a high level of user interaction and exceptions during a rollout of a shared service center providing back-office support to more than 15 European subsidiary companies.

The project was a great success, but challenges and compromises had to be made in order to realize the business benefits in an acceptable time frame. They included:

  • Major subsidiaries had a legacy of different solutions, requiring modeling of diversity with little influence on forcing the adoption of common procedures.
  • The best business-processes design had to take second place to meeting project deadlines. The bigger picture took precedence over the details, and the workability of some of the solutions suffered as a result.
  • A hard-working and stretched Enterprise Systems Service Center staff had to deal with complicated procedures due to the lack of coordination in design and many manual procedures to be followed.
Halfway through the rollout, Kisloff advocated the abandonment of the first attempt to support a critical business process and a re-implementation with a different solution. The novel process-modeling methodology proposed here was created from the resulting successful implementation.

In the second part of the supplement, ebizQ's Dave Kelly takes a wider view at the different kinds of longer-term objectives these BPM initiatives can support.

04/02/2007 BPM Technology: From Best-of-Breed Tools to BPM Suites to Business Process Platforms Gold Club Protected
This is the question-and-answer portion of the Webcast "BPM Technology: From Best-of-Breed Tools to BPM Suites to Business Process Platforms" with Gartner's Janelle Hill.
03/23/2007 Full Transcript: BPMG.org's Terry Schurter on BPM Training Gold Club Protected
02/27/2007 Full Transcript: Podcast with BPM Expert Derek Miers Gold Club Protected
02/15/2007 Dr. Bruce Silver of BPMEssentials.com on BPM Training Gold Club Protected



