Security Control Attestation for Cloud Computing Providers

While working on one of the initiatives in the cloudsecurityalliance.org working groups, we had an interesting exchange of ideas on the relevance of SAS 70 and similar certifications for cloud service providers. Some held the view that such certifications may not be sufficient, and that their usefulness is debatable when it comes to the cloud environment and the various flavours it has to offer.

In this post, I present my views on the subject.

To understand how such certifications can help cloud service providers, we can look at the strategies that IT sourcing providers with global delivery models adopted when it came to providing their customers assurance on Information Security and regulatory compliance.

In the early years of outsourcing (around 2000 – 2004), potential customers expressed a lot of apprehension about the state of Information Security, Risk Management and Regulatory Compliance when outsourcing to IT service providers, be it HP, IBM, CSC, EDS, HCL and so on. The IT service providers knew that to win customers' business, they needed to provide reasonable assurance on the state of Information Security. In any outsourcing discussion that took place around that time, there was a huge focus on such topics.

What these providers did was adopt industry best practices and standards like ITIL and BS7799 (now ISO 27001). They then got an external body to audit and certify the state of these controls in their delivery centers (certifications like SAS 70 Type I & II to demonstrate the presence and effectiveness of the implemented controls). On top of that, some of these providers also allowed their customers to audit the security controls implemented at the provider's delivery centers at random, either through the customer's internal auditors or the customer's external auditors.

Over time, the adoption of industry standards, combined with SAS 70 reports and, yes, the 'right to audit', did provide reasonable assurance to customers. Many of these providers have successfully been able to demonstrate the state of information security at their delivery centers to existing and new customers, and business has been good.

Now, is it guaranteed that just because these providers have SAS 70 certifications, all is well at their centers? I don’t think anyone can guarantee a 100% secure environment.

I think the cloud computing market will evolve in a similar manner. It will require cloud computing providers to implement the necessary controls, adopt standards, and furnish recognized certifications as proof of the effectiveness of those controls. Without these certifications, cloud providers will find it tough (just like the IT outsourcing providers did) to demonstrate the effectiveness of the controls they have implemented.

Having said that, I also think that cloud computing service providers will be required to grant customers the 'right to audit' on top of these certifications. Enterprises with enough business potential, especially, will be able to muscle their way with the providers.

I recently met a cloud computing provider and asked them about the right to audit. They said they won't let customers audit their facilities, and they even refuse to divulge the locations of their DCs. I don't see them winning many favors with auditors with such an approach, especially auditors who are particular about data sensitivity and regulatory compliance. These providers may continue to get the non-critical portion of the enterprise IT environment. Unless reasonable and acceptable assurance around Information Security and regulatory compliance is provided, the critical, sensitive corporate apps are likely to stay within the enterprise DC, probably in a private-cloud kind of setup.


from email to collaboration

initially i had titled this post “email and its imminent demise”, but maybe it is incorrect to say “imminent demise”. rather, i think email will slowly move into the background and be treated as a legacy application if it fails to evolve and incorporate the new web 2.0 technologies that people are toying with for enterprise use..

email has come a long way since it was invented in the early 1970s. people have used email to send simple messages to each other or to a group, as a means of sharing documents (much to the plight of email administrators), and for calendaring features like appointments and scheduling meetings. on looking closely at how email is typically used in an organization, i have noticed that along with the functions above, email is also used as a place to store files (“hey dude, can you email me that presentation you gave for the xxx client the other day? i may use it in future”), for approvals of certain business transactions (e.g. approval to buy a new server, sent by the CIO to the IT manager), and to keep a record of certain communication (he said, i said etc.) to basically CYA (cover your a**) in case things go bad.. however, along with email came its nuisances: maintaining an up-to-date address book, spam, compliance issues etc. but we are all dealing/living with it.

as time goes by, in the world outside the boundaries of the enterprise, people are adopting techniques that harness the power of technology to reach out to each other, either as part of the social networking phenomenon or to work together and collaborate. with the power of the internet coming to the mobile device, the speed at which this adoption is taking place is awesome. now, people don't need to access their email to know what their friends and peers are doing or what the latest buzz is. applications like facebook and twitter have already proven the usefulness of these technologies outside the enterprise walls.. it is only a matter of time before they are adopted within the enterprise too. there is already a lot of talk among analysts about web 2.0 adoption within the enterprise. email too needs to evolve if it is to stay alive in this fast-changing scenario. there are already surveys showing that more and more people are using facebook/myspace/twitter etc. over email to reach out and keep in touch.

in my opinion, organizations will start looking for a better collaboration platform that can increase effectiveness, efficiency and productivity by harnessing the same technologies that are used by millions outside the enterprise boundaries. realizing this need, some email services have now integrated instant messaging/chat, with archiving features, on the messaging platform. how many times have you noticed that when you're having an email exchange with someone, you end up taking the rest of the conversation to chat because it is faster and more effective? i think collaboration tools like instant messaging will evolve and bring convergence of the channels of communication, viz. chat, voice and video, along with email and other forms of collaboration technology like document management systems. the other day i wanted to reach out to a group within my company to work on a platform migration. instead of sending emails to numerous people and getting redirected from one group to another, i just posted an update on yammer and got a reply within 2 hours from someone on the other side of the world.. it was awesome and saved me numerous emails and days of waiting for a favorable response.

there will be initial resistance from many to move from legacy tools to new forms of collaboration. there will be compliance and security concerns. but as with any new technology, it will find its own steady-state adoption rate. only that, given the way web 2.0 is evolving, this rate might be accelerated by a few notches.

google wave has already generated a lot of noise on the internet. maybe microsoft's solution for corporate instant messaging, OCS, will also evolve to bring convergence of the existing communication and collaboration tools with the new web 2.0 toys.

my take is that a few years from now, email might just end up as a legacy platform required to retrieve old email data, and corporates will use newer and more efficient forms of collaboration technology.

Cloud for IT Continuity

typically, a DR site goes live when the main DC goes offline or fails. quite often, the IT infrastructure at the DR site sits idle, waiting for an untoward incident to kick it back into life. in some cases, the infrastructure at the DR site is also used to host dev and QA environments. DR sites are typically activated for a short period of time, and when the main site/DC is restored, the DR site goes back to its idle state. is there an alternative to blocking investment in a DR site, one that uses the evolution in DC technologies and still ensures continuity of operations?

can the cloud and cloud-based services provide the enterprise with the desired level of continuity along with financial flexibility? in my opinion, this is a subject worth further exploration.

during a disaster, you operate at either the same or reduced business service SLAs around performance and availability compared to the main site. the requirements of the DR site are “elastic” in nature: most of the time, the compute requirements around CPU and memory are pretty low, except when the site is activated and operations are run from it. usually it is the storage that sees consistent use. now, one of the major advantages of cloud computing is its ability to meet elastic demands. put two and two together.. i feel there has to be a case for using the cloud for IT continuity!
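
to make the idea concrete, here is a minimal sketch in python. the `CloudClient` class and its methods are purely hypothetical stand-ins for a real provider's API; the point is just that storage replicates continuously while compute is provisioned only when a disaster is declared:

```python
# Sketch of "elastic DR": storage replicates all the time, compute is
# provisioned on demand. CloudClient is a hypothetical provider API.

class CloudClient:
    def replicate_storage(self, source_volume, target_region):
        """Keep a continuous copy of production data in the cloud."""
        ...

    def launch_instances(self, image_id, count):
        """Provision compute on demand; billed only while running."""
        ...

    def terminate_instances(self, instances):
        """Release compute once the primary site is restored."""
        ...

cloud = CloudClient()

# steady state: only storage replication runs (the "consistent use" above)
cloud.replicate_storage(source_volume="prod-db", target_region="dr-region")

def declare_disaster():
    # burst: bring up the application tier only when the main DC fails
    return cloud.launch_instances(image_id="app-golden-image", count=10)

def primary_restored(instances):
    # back to idle: stop paying for compute
    cloud.terminate_instances(instances)
```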

one of the possible challenges is the compatibility of the virtualization technology within the enterprise with that of the cloud computing provider. i do not think the cloud computing provider fraternity has anything like interoperable virtualized images that work across different cloud providers and private cloud platforms.. (or maybe they have; this is something i have not tracked in the google-sphere yet!). so basically what that means is that, for the time being, you are stuck with the set of cloud computing providers who use the same virtualization technology as you use in-house in your DCs. but compared to having idle investment in your dedicated DR sites, this may be a small trade-off.

some points i can think of while evaluating cloud platforms for DR and IT service continuity: licensing of your existing apps (does the licensing allow you to run the apps from a cloud computing setup?), connectivity options to allow migration of large amounts of data and images to the cloud computing provider's setup, how you are going to keep the images of your apps in the cloud environment up-to-date with the necessary patches, the security policies of the providers, and the client access mechanism.

will update as and when i have discussions with more customers on this topic!

Random Notes on Cloud Computing!

this post captures some random notes i have come across, along with my thoughts on technical aspects that can facilitate the cloud computing environment. these are not in any structured order, so bear with me!

1. Cloud computing is a way to maximize capacity and utilization, minimize space and maintenance, and simplify governance.

my thoughts – whether it actually simplifies governance is yet to be seen, as governance also encompasses security, risk and compliance along with service orchestration.

2. Virtualization is not a cloud solution, but a cloud solution will require virtualization in some form, whether it be cloning or full virtual images.

3. Parallel processing on pooled resources is not a cloud, but its principles are important to the conception of an effective cloud.

my thoughts – absolutely in agreement with points 2 & 3.

4. A cloud also requires understanding of the enterprise, a clear picture of patterns and topologies and an efficient process for managing images as distinct entities.

my thoughts – cloud computing will have an impact on the Enterprise Architecture of an organization, which will need to address these new patterns and topologies.

5. Cloud bursting – the scale-out should not require tremendous effort and specialized skills, otherwise the benefit of cloud computing may be lost or reduced.

my thoughts – cloud bursting requires a thorough understanding not only of how to move from a private to a public cloud but also of how to enable the reverse. i agree with IBM on the point that applications hosted in the cloud need to run on the same platforms as enterprise applications to facilitate movement between the enterprise and public clouds. not everything can be free in life 😉

6. Scaling out for scalability – running another instance of the application on another server (or servers).

my thoughts – typical scalability in the cloud is provided by scaling out, not scaling up. this will also depend on the way the application logic has been written to benefit from the multithreading, multicore and multiprocessing technologies that are (or will be) available in the cloud. the way the application logic is written will eventually determine the application's ability to scale seamlessly across multiple cores and physical servers, and to withstand and survive any infrastructure failure.
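
a tiny illustration of what i mean by scale-out, using python's multiprocessing as a stand-in for adding identical instances. the handler here is a made-up example; the point is that stateless code can run as N interchangeable workers:

```python
# Scale-out sketch: the same stateless handler runs as N identical workers,
# so adding capacity means adding workers, not buying a bigger machine.
from multiprocessing import Pool

def handle_request(payload):
    # stateless: everything needed to serve the request arrives with it,
    # which is what lets the same code run on any worker, core, or server
    return sum(payload)  # stand-in for real application logic

if __name__ == "__main__":
    requests = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]
    # "scaling out" here is just raising the worker count; in a cloud it
    # would be launching more instances of the same image behind a balancer
    with Pool(processes=4) as pool:
        print(pool.map(handle_request, requests))
```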

7. Load balancing – balancing the work across multiple systems in the cloud

my thoughts – most cloud players will allow you to create exact replicas of your systems, thus balancing transactions across that set of “clone” systems. if the code is atomic, one can also allocate specific systems to specific tasks instead of cloning the entire application system.
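
a small sketch of both approaches, round-robin across clones and task affinity; all the system names below are made up for illustration:

```python
# Balancing work across identical "clone" systems, plus the alternative:
# pinning specific task types to specific systems.
import itertools

clones = ["app-01", "app-02", "app-03"]   # exact replicas of the system
rr = itertools.cycle(clones)              # round-robin rotation over clones

def dispatch(task):
    # each task goes to the next clone in the rotation
    return next(rr)

# task-affinity variant: dedicate systems to task types instead of cloning
# the whole application (assumes the code paths are independent/"atomic")
affinity = {"report": "batch-01", "search": "query-01"}

for task in ["order", "order", "report", "order"]:
    target = affinity.get(task) or dispatch(task)
    print(f"{task} -> {target}")
```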

 

8. Manageability – the ability to manage cloud systems seamlessly, with lower management overheads.

my thoughts – management of cloud-based systems will become a big-ticket item in times to come, for enterprises and cloud providers both. it can be eased by using virtualized systems plus a layer of automation for provisioning and de-provisioning resources on demand. enterprises will look at how the cloud deals with the applications to be deployed. using the process of cloning systems, enterprises will prefer that multiple instances of an application can be deployed with a few clicks of the mouse instead of deploying the application on each virtual instance. the same goes for ongoing operations: how easy is it to patch the running application instances? does one need to go to each system to patch it, or can it be done on one system with the patch propagating to the other instances?
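
to illustrate the patch question, here is a toy sketch of the “golden image” model: the patch is applied once, to the master image, and the clones are re-provisioned from it instead of being patched one by one. the image and instance objects below are just made-up dictionaries, not any real provider's API:

```python
# "Patch one image, propagate everywhere" sketch.

golden_image = {"name": "app-image", "version": 1, "patches": []}

def patch_image(image, patch):
    # the patch is applied once, to the master image only
    image["patches"].append(patch)
    image["version"] += 1
    return image

def redeploy(image, count):
    # re-provision clones from the updated image instead of patching each
    # running instance in place (the "few clicks" provisioning above)
    return [{"id": f"app-{i:02d}", "image_version": image["version"]}
            for i in range(count)]

patch_image(golden_image, "security-fix-2009-07")
instances = redeploy(golden_image, count=5)
print(instances)   # every clone now reports the patched image version
```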

basically, it will be all about keeping opex as low as possible by easing the management of the cloud systems and subsystems.

more to come!

Cisco’s Collaboration Framework – My Views

While searching for information on Cisco UCS, I came across some sites where Cisco’s acquisitions were being discussed.

In the past few months, Cisco has made some pretty interesting acquisitions. Looked at individually, some make sense and some don't. But if you step back, a picture starts to emerge. Some of the acquisitions made by Cisco are:

· WebEx – USD 3.2 billion – meetings over the web

· PostPath – USD 215 million – email and collaboration. This has been the most surprising acquisition from Cisco.

· Jabber – financials not known – Jabber has developed a “carrier-class” platform based on open standards that can work across multiple messaging systems, such as AOL Instant Messenger, Google Talk, Yahoo Messenger and Office Communications Server

· IronPort – USD 830 million – email anti-virus and anti-spam

· Five Across – an 11-member company that allows large companies to easily add social networking features to their websites

· SoonR – given USD 9.1 million – a backup service focused on enabling access to your files from mobile devices. SoonR syncs your files to cloud storage via a downloadable client that runs in the background on both Macs and PCs. When you're on the go, you can access these files with the web browser on your mobile phone.

· Recently, Cisco/WebEx introduced remote desktop management and patch management capabilities in the WebEx client. I have no idea as of now where Cisco is headed with these developments in WebEx, but it just might be a sign of things to come.

Where is Cisco headed with these acquisitions? Well, my thoughts on how Cisco might be planning to play with the features from the companies it has acquired are summarized in the figure below. (I know the handwriting is not clear, but I didn't have a scanner, so I used a camera phone. And anyway, as the saying goes, a picture is worth a thousand words..)

[Figure: Cisco's Collaboration Framework – My View]

Cisco might be planning to take on Microsoft and IBM in business collaboration using these acquisitions.

Cloud Computing

i came across a client who talked about wanting a pay-as-you-go model for IT services using the cloud computing model. that set me on the path of exploring cloud computing. in the next few posts, i will try to present my thoughts and provide feedback from my engagements with customers on this topic.

My Take on Utility Computing

one of my friends asked me why i was writing about a concept that is quite old (as old as the blue boxes – mainframes).

well, in the recent past there have been many cases where customers have expressed a desire to move to a utility model for various services, either explicitly in their outsourcing RFPs or during the course of discussions. i believe it has everything to do with the bad economic conditions prevailing today; the stress on IT to rein in capex and opex costs is leading even mid-size and large enterprises to explore the concept of utility computing.

in these posts, i try to share my take on utility computing in the context of the services being asked for by enterprises, and what it means to provision them from a service provider's point of view. also, i believe that to understand the buzz around cloud computing, it is important for me to understand and dwell on the topic of utility computing, for my own benefit 🙂

utility computing can be defined as a mechanism for provisioning IT services and resources on a model similar to utility services like electricity or water: flip a switch, the lights come on, and the meter starts to count the power used. at the end of the month, you pay for what you consumed. as everyone knows, the concept of time sharing has been around since the early days of mainframes, but much has evolved in this space since then.

these days, i have come across customers asking for services like infrastructure services (dhcp, dns etc.), file & print, email, storage, application packaging, dev & test environments, server computing, WAN, VoIP etc. some of these have not been covered under a true utility services portfolio by many service providers. in fact, we recently started engaging with a very large customer who was willing to put everything in their IT shop on a “pay as you go” model: their critical business apps, non-critical apps, infra apps, everything. their IT capex and opex combined is approximately a billion dollars, if not more.

from a service provider's point of view, providing true utility-based services means:

low switching cost – the services should have a low switching cost from an “in-house” model to an “as a service” model. this will allow faster adoption of such services by organizations looking to reduce their cost of operations. however, this also means that customers will be able to move from one utility-based provider to another. so, in order to have customer stickiness month after month, one has to ensure the right RoCE (return on customer experience) along with RoI (return on investment) for the customer.

developing a financial model that appeals to customers – the plans can be purely subscription-based (like a newspaper) with no upfront cost, pay-as-you-go (like a cell phone plan), or a mix of some base cost plus pay-as-you-go. some customers are willing to pay some upfront cost (also called a transition cost) and then a monthly subscription based on “per service unit consumed”.
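
to make the three plan shapes concrete, here is a small worked example; all the rates and volumes are illustrative numbers i made up, not figures from any real engagement:

```python
# Comparing the three plan shapes on the same month of consumption.

units_consumed = 1200          # "service units" used in the month

def subscription(flat_fee=10_000):
    return flat_fee                        # newspaper model: usage-independent

def pay_as_you_go(rate=9.50):
    return rate * units_consumed           # cell-phone model: no upfront cost

def base_plus_usage(base=4_000, rate=6.00):
    return base + rate * units_consumed    # transition cost + per-unit charge

for plan in (subscription, pay_as_you_go, base_plus_usage):
    print(f"{plan.__name__}: ${plan():,.2f}")
```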

building services on a multi-tenant model – one way to recover the cost of extra capacity is a multi-tenant model, where the cost of that capacity is amortized across multiple customers. however, many a time i have come across customers who want exclusive services, but in a utility mode. i think such organizations should be under no illusion: the service provider will have no option but to amortize the cost of provisioning the services across multiple years, after adding some finance charges to the base cost.
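
the arithmetic behind this point, again with made-up figures: spare capacity shared across ten tenants versus one dedicated customer carrying the whole cost over three years with a finance charge:

```python
# Illustrative arithmetic only; none of these figures come from a real deal.
capacity_cost = 120_000   # cost of the spare capacity (made-up figure)
tenants = 10              # customers sharing the multi-tenant pool
finance_rate = 0.08       # yearly finance charge in the dedicated case
years = 3                 # period over which the dedicated cost is amortized

shared_per_tenant = capacity_cost / tenants
dedicated_annual = capacity_cost * (1 + finance_rate * years) / years

print(f"multi-tenant share: ${shared_per_tenant:,.0f} per tenant")
print(f"dedicated, over {years} years: ${dedicated_annual:,.0f} per year")
```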

have a forecast of usage of the service – service providers need an estimate of the usage of their services to cater for the additional capacity to be provisioned. i recently encountered a situation where a customer wanted a utility-based model for certain IT services, but in an environment totally dedicated to the customer and without any volume or service usage commitment or estimates. under such circumstances, the service provider has very little room to maneuver and create a true utility model. rest assured, it would be nothing but financial engineering on excel sheets, with a lot of exclusions and conditions.

providing capacity on demand – very closely linked to the ability to forecast usage of the service. as a service provider, the ability to forecast usage helps in designing the capacity management process. so, while developing a utility model for a service, it is important to understand who the consumers will be, how the business uses IT (the retail industry typically has high peaks of IT service usage around the holiday season, christmas etc.), and how many customers are likely to use these services.
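
a toy example of forecast-driven capacity, using an illustrative retail-style usage profile (invented numbers) that peaks around the holidays:

```python
# Provision for the forecast peak plus headroom, not for the average.

monthly_usage = [100, 100, 110, 105, 100, 100,
                 110, 115, 120, 150, 220, 260]   # illustrative service units

baseline = sum(monthly_usage) / len(monthly_usage)
peak = max(monthly_usage)                        # the holiday-season spike
headroom = 0.2                                   # 20% buffer over the peak

provision_for = peak * (1 + headroom)
print(f"baseline ~{baseline:.0f}, peak {peak}, provision for {provision_for:.0f}")
```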

commission a metering solution for measurement and transparent billing – one of the most important aspects of a utility-based model is the ability to charge a customer for the services consumed, based on the billing plan. hence, it is obvious that one needs a metering solution capable of accurately measuring usage, and to be transparent with the customer about it (an online dashboard and detailed reports help).
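
a minimal sketch of the metering idea: an append-only usage log, with the invoice being just an aggregation of it that the customer can verify line by line. the customer, service names and rates are all made up:

```python
# Metering + transparent billing sketch.
from collections import defaultdict

meter = []   # append-only usage log: one record per metered event

def record(customer, service, units):
    meter.append({"customer": customer, "service": service, "units": units})

def invoice(customer, rates):
    totals = defaultdict(float)
    for event in meter:
        if event["customer"] == customer:
            totals[event["service"]] += event["units"]
    # line items (units, charge), not just a lump sum: the "detailed reports"
    return {svc: (units, units * rates[svc]) for svc, units in totals.items()}

record("acme", "storage_gb_month", 500)
record("acme", "cpu_hours", 120)
record("acme", "cpu_hours", 30)
print(invoice("acme", {"storage_gb_month": 0.15, "cpu_hours": 0.10}))
```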

security & compliance – this is a new requirement that was not there during the early time-sharing days. largely as a result of regulatory and compliance requirements, this is one of the biggest areas of concern for customers moving to a multi-tenant, utility-based model for IT services. also, as time has gone by, security requirements have evolved along with awareness of the risk to information processed and stored in electronic format. in my opinion, not enough attention has been paid to this aspect. however, if utility services become a mainstream requirement, i believe that just as offshore players have adopted security standards (like ISO 27001, and use SAS 70 Type I & II as a statement on the presence and effectiveness of controls) to provide a sense of assurance to customers, utility service providers will walk the same path.