
ebizQ's Business Agility Watch


What the Government Has Learned From Cloud Computing: Talking With Tarak Modi


Listen to my podcast with Tarak Modi, Vice President and CTO of CALIBRE Systems. Tarak is an industry thought leader in IT transformation and modernization technology such as enterprise architecture, SOA and cloud computing, and in this podcast, we check in once again about what's going on with cloud computing and the government.

Listen to or download the 12-minute, 18-second podcast below:

Download file


PS: I keep hearing about security issues in terms of the cloud. Are there any ways that cloud can actually help improve security?

TM: Actually, yes. Cloud computing has a lot to offer in terms of security benefits. In fact, Mike Bradshaw, the Director of Google's Federal Division, made some very interesting points before the House Committee on Oversight and Government Reform during a recent hearing on moving federal IT into the cloud. He contended that using clouds to store and manage our data is just like using banks to safeguard our money. Storing your critical data on your computers, either at home or at work, is akin to keeping your cash under your mattress. Still, most agencies today are leery of putting their data in clouds. Can you imagine what people might have said when the first banks came about? "Now tell me again, why would I take my money from the safety of my mattress and give it to this place you call a bank?"

It's not hard to see why clouds can be more secure: cloud providers tend to employ highly trained security professionals; it's just part of what they do. And it's not just about a more mature security staff. Many of the security benefits of the cloud are actually due to its architectural underpinnings. For example, virtualization enables quicker deployment of securely configured servers. Broad network access reduces the need to carry data on portable media such as thumb drives, which in turn reduces the probability of physical theft. And the increased economies of scale of a distributed infrastructure provide not only low-cost disaster recovery but a higher resistance to denial-of-service attacks as well.

In fact, Peter, if you recall, the US government faced a massive distributed denial-of-service attack on July 4th, 2009. Many federal agency websites were severely impacted, but the public-facing website for the Department of Transportation's CARS Program was not affected. Now, is it just a coincidence that this website was being hosted in the cloud?

You know, a few days ago I was reading a year-old Government Accountability Office report on security deficiencies in the federal government. Interestingly, the report confirmed what many of us have long believed. Many of the data losses in federal agencies are not the result of sophisticated, high-tech, carefully orchestrated attacks, but the result of simple things like physical theft due to inadequate safeguarding of laptops and other portable devices, misconfigured network devices, poor patch management, and the improper segregation of job functions.

Honestly, each one of these findings could be fixed relatively easily with a properly managed security program. So based on everything I just said, yes, I do think cloud computing will one day evolve into a more viable and secure alternative to the current suboptimal status quo of desktop-centric and on-premise computing that we have today.

PS: Well, thanks. Boy, that's a great comparison, about keeping your data under the mattress, basically.

TM: Absolutely.

PS: Exactly how is the federal government using the cloud right now?

TM: The short answer is: lots of different ways. I think one thing you'll immediately notice, though, is that the vast majority of government cloud initiatives are private cloud implementations. So for example, consider DoD's RACE Program, started by the Defense Information Systems Agency, or DISA, in October 2008. It's managed completely within the existing DoD datacenters and operates only on the DoD's internal network. That's code language for "it's a private cloud."

As another example, consider NASA Nebula, which is an infrastructure-as-a-service implementation for scientific data and web-based applications. Although Nebula is a private cloud today, eventually it might become a hybrid cloud so that NASA can collaborate more efficiently with the academic community and the public. So why are private clouds so popular in the fed? The answer to this question goes back to our first question. While there are definitely security advantages of a public cloud, the trust just isn't there yet. Agencies are just not willing to take on the risk of having their data in a public cloud.

PS: So you brought up private clouds. What are the lessons that the government has learned using private clouds?

TM: Well, there are lots of different lessons that have been learned. And in fact, you probably won't be surprised that most of the lessons learned are in the area of cloud security certification and accreditation. By law, all clouds that hold federal information assets have to be certified to operate at a specific security level. This security level describes, or defines, what can and, more importantly, cannot reside on that cloud. For example, RACE is currently certified to operate at the moderate impact level. Similarly, Nebula is currently authorized to handle only low-impact data. This means that a federal information system that's classified as moderate cannot reside on Nebula.
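The placement rule described above (a system may only reside on a cloud certified at or above the system's own impact level) can be sketched as a simple check. The low/moderate/high ordering follows the standard federal impact levels, and the RACE and Nebula examples follow the transcript; the function and names are purely illustrative, not any agency's actual tooling.

```python
# Illustrative sketch: may a federal information system reside on a given cloud?
# Ordering follows the standard impact levels: low < moderate < high.
IMPACT_LEVELS = {"low": 0, "moderate": 1, "high": 2}

def can_host(system_impact: str, cloud_authorized_level: str) -> bool:
    """A cloud may host a system only if the system's impact level does not
    exceed the level the cloud is certified to operate at."""
    return IMPACT_LEVELS[system_impact] <= IMPACT_LEVELS[cloud_authorized_level]

# Examples from the discussion: RACE is certified at the moderate level,
# while Nebula is authorized only for low-impact data.
print(can_host("moderate", "moderate"))  # RACE can host a moderate system: True
print(can_host("moderate", "low"))       # Nebula cannot: False
```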

So all this sounds pretty simple, except for the fact that current federal guidance does not clearly address specific controls for cloud computing platforms like RACE or Nebula. Even if it did, hosting a system in a cloud requires the additional step of delineating whether the customer or the provider is responsible for each control. And to complicate matters even more, sometimes a control might be a joint responsibility between the provider and the customer, because the provider will still have responsibility for the parts of the infrastructure that are under its control. So as an example, effective incident response in a cloud is only possible with clearly defined customer, provider, and joint responsibilities. Unfortunately, the lack of clear guidance means that each cloud is handling all these complexities in its own way.

So for RACE, DISA has documented a clear division of responsibilities between what it does and what it expects from its customers. First, all potential customers must agree to meet minimum information security requirements before they are even allowed access to the RACE environment. Second, out of a total of 106 applicable DoD information assurance controls, DISA has designated 62 as the responsibility of the customer, 31 as its own responsibility, and 13 as being not applicable.

Nebula has a similar story. In their current infrastructure-as-a-service offering, the customer is responsible for 47 of the 112 total controls as defined by NIST Special Publication 800-53. Both RACE and Nebula give us some empirical data on how security might be managed in a cloud. But the reality of the situation is that the exact number of controls for which the customer is responsible will always depend on many factors such as the cloud-computing provider itself and the service model.
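The kind of allocation Tarak describes can be pictured as a responsibility matrix: every applicable control is assigned to exactly one party. A minimal sketch, using the RACE numbers cited above (62 customer, 31 provider, 13 not applicable, out of 106 DoD information assurance controls); the tallying code is hypothetical, since real allocations are per-control documents, not simple counts.

```python
# Illustrative sketch of a control-responsibility tally. In practice each
# control (e.g. a NIST SP 800-53 identifier such as "AC-2") maps to exactly
# one responsible party: "customer", "provider", "joint", or "n/a".
from collections import Counter

# Collapse the RACE allocation cited in the transcript into a flat list,
# one entry per control.
race_allocation = ["customer"] * 62 + ["provider"] * 31 + ["n/a"] * 13

tally = Counter(race_allocation)
assert sum(tally.values()) == 106  # total applicable DoD IA controls
print(dict(tally))  # {'customer': 62, 'provider': 31, 'n/a': 13}
```

As the transcript notes, the split differs per cloud (Nebula assigns the customer 47 of 112 NIST SP 800-53 controls), so the numbers here are inputs, not a general rule.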

PS: Now, so these are the lessons learned. How are these actually being translated to the federal government?

TM: Well, there are a few things going on. If you recall, in our last podcast we talked about a program called FedRAMP in quite a bit of detail. FedRAMP provides a unified government-wide risk management framework that enables centralized security management of cloud computing for federal agencies. It actually goes to the heart of what we just talked about in the last question, which is: how do we best perform security authorization for large, outsourced, multiagency systems like cloud providers and platforms?

Also, along the same lines, the General Services Administration, or GSA, has reissued its request for quotation (RFQ) for a blanket purchase agreement, or BPA, for procuring infrastructure-as-a-service cloud computing services. As you probably remember, Peter, GSA first issued this RFQ in July of 2009 but cancelled it five months ago, in February. We actually talked about this in our second podcast. As expected, the reissued RFQ has a few differences, so let me highlight a couple of key ones. A major one, in terms of sheer implications, is that GSA has raised the security level from low to moderate, which will have significant implications for the certification and accreditation process the clouds have to go through.

Another big difference is the integration with FedRAMP. Now this was obviously not possible the last time around since FedRAMP wasn't even in existence at that time. But the new RFQ specifies that awardees must obtain a full authorization from FedRAMP before they can even fulfill any orders on the BPA. Each awardee will be given three opportunities to submit its documentation for certification. And if an awardee fails to receive authorization, it could actually be disqualified from the BPA. So basically, the lessons learned are being translated into better thought out end-to-end strategies with increased collaboration between the agencies. Ultimately, it's all good.

PS: Excellent. Now, let's look forward and what do you see for the rest of 2010 for the cloud?

TM: Oh, there's definitely going to be a lot of action in the next few months. First of all, President Obama has asked each non-security agency to cut its discretionary budget by at least 5% for Fiscal Year 2012. It's no surprise that any technology that can improve ROI and cut costs will seem very attractive to federal agencies. Can anybody say "cloud computing"? The Obama Administration has also very clearly stated that the 2012 budget should include funding to support datacenter consolidation plans, which are being finalized this year. And as part of these plans, the agencies have been clearly instructed to evaluate the potential for adopting cloud computing solutions.

I definitely expect to see a lot more activity around both the FedRAMP and the GSA cloud computing related RFQ that we just talked about. Another area of activity will be SAJACC, which we talked about in our previous podcast. If you recall, SAJACC's goal is to accelerate the development of cloud-related standards and publish authoritative information on a NIST hosted SAJACC portal. So Peter, to sum it up, I think Federal CIO Vivek Kundra puts it best when he says, "Cloud computing is not a silver bullet. But it does offer a transformational opportunity to fundamentally reshape the operations of the government and further close the IT gap between the public and private sectors."

Like I said, it's going to be an action packed remainder of the year.

PS: All right. Well, that sounds very exciting and we'll definitely keep in touch with you Tarak and you can keep us informed.

1 Comment

An interesting blog posting in GovInfo Security titled "Lessons Learned from NASA's Cloud Initiatives" by Eric Chabrow came out today.



Peter Schooff

Peter Schooff is Contributing Editor at ebizQ, and manager of the ebizQ Forum. Contact him at pschooff@techtarget.com
