
Anne Stuart’s BPM in Action

Dennis Byron

Cloud Computing Is Not the "Next Big Thing"


ebizQ managing editor and Forum instigator Peter Schooff is running a question this weekend about "BPM in the cloud." I'll also be taking on the same subject at a webinar scheduled for June 3rd (sign up here and use the feedback mechanism to let us know what you would like covered at that session).

But before spending too much time on the subject of how BPM works best in the cloud (and when it does not), the industry needs a big reset button on the meaning of cloud. (And I don't mean reset in the Obama administration sense of "to charge double" for something.)

The cloud is the intersection of many well-researched and widely applied information technologies, some of which are up to 50 years old. These include peer-to-peer and grid computing, virtualization (of all resources, not just servers), wide-area networking, security, and a few more. Many of these technologies started with a major General Electric/Bell Labs/MIT research and product-development program in the 1960s called Multics. Multics in turn was an outgrowth of MIT's Compatible Time-Sharing System (CTSS). GE was involved because it had been influential in the "invention of timesharing" a few years earlier at Dartmouth.

To really understand cloud computing, you need to understand the benefits of 1960s timesharing systems and imagine extending them to everyone, not just the small cadre behind the locked doors of rooms with raised floors and enough air conditioning to cool down Washington, DC, in August.

The cloud is not SaaS. The cloud is not OnDemand. Cloud computing is not "the next big thing." It is "the first big thing" finally done right in the sense that all the stars are now aligned to deliver on the promises of the 1960s.

-- Dennis Byron




A great article putting a historical perspective on the evolution of computing.

I've said for years that the Web is the mainframe model of computing -- centralized processing and data -- only with graphics instead of green screens and plug-ins instead of PF keys.

Back in my U of Michigan days, Arthur Burks (who designed the multiplier unit for ENIAC) often pointed out that computer science is not a natural science -- much of its innovation came from other disciplines. Let's not forget that punch cards programmed Jacquard looms long before the first "computer."


Dennis -

I think the historical references are interesting, but to say that these machines were effectively implementing "the cloud" in the 1960s as the first big idea... well, to really give credit where it is due for this "big idea," GE is not the answer. It was our science fiction authors (Isaac Asimov, et al.) who envisioned a single computing "intelligence" (MULTIVAC http://en.wikipedia.org/wiki/Multivac) that all of humanity leveraged, to their benefit or detriment. The computer even designed its own self-improvements.

I think the article above misses the point of why those timesharing systems were put in place. Computers were expensive. Computer *time* was expensive. Therefore, human use of the computer had to be scheduled -- and thus the first schedulers were born. Because the motivation was completely different, the design principles came out of a different need.

But now, computer time is cheap. The real value in going to the cloud is avoiding the expensive human cost of setting up the hardware and software that allow these systems to run (and maintaining them over time) when maintaining such systems may not be an organization's core competency. Cloud computing is mostly about optimizing people time/cost and ongoing maintenance costs.

Although the ideas for peer-to-peer, grid computing, and virtualization may have been present in the '60s, the technology in play now is dramatically different, and a straight line of development effort can't be drawn from those times to now. The very components of the computer are different, the software languages are different, the network architecture is different, and the layers of abstraction are completely different. The types of problems a human can imagine implementing solutions for are dramatically different today than 50 years ago -- not that the imagination has changed, but the ability to execute on that imagination *has* changed. And all of these landscape changes were made for reasons separate from the idea of a "cloud" computing resource, with the possible exception of Sun's old tagline "the network is the computer." That's why I feel drawing a direct connection between these points in time is a bit of a stretch.

My iPhone does more than a UNIVAC could. I guess my feeling is that you can look back at the last 10 years, 20 at the most, and get all the historical context you need to understand "the cloud." As is typical in software, "the cloud" is just the latest marketing label for something people have been doing for a long time. (AJAX is another great example of this kind of post-facto marketing/naming.)

Small anecdote: I once ran across a software system at a client who wanted to replace it. However, no one knew how it really worked in its entirety. It had been in place since the late '60s, upgraded and modified many times, but by this time the original developers had quite literally passed on, and the last people to make modifications had been doing Y2K work. So no one knew exactly how the black box worked. Asimov would have loved it -- great confirmation of some of his ideas about how people would interact with technology in the future: they would interact with it, but they would no longer understand it well enough to fix it or build it again. Scary and interesting at the same time. Thankfully we still know how to build it again, even if we don't know how to fix it :)

