Cloud for dummies


Dear Readers,

We want to introduce you to our new article "Cloud for Dummies".

Enjoy reading!

Author: Antonio Ierano

"Cloud for dummies"

We all know about security, we all know about cloud, and we have all heard about security plus cloud.

But what is the real relationship between cloud computing and security?

We have to go back to the very beginning of our journey into cloud to understand how to start talking about cloud security; there are so many variations and aspects of cloud that talking about security would be a fairly useless exercise without a clear understanding of the basics.

No matter if you are a senior IT pro, a security expert or a newbie. No matter if you plan to use cloud as a user or plan to offer a cloud to your customers, this guide will give you a quick overview of what it means to talk about security when we are dealing with cloud services, providing you the knowledge to dive deeper into the areas of your interest or need.

So, on with the journey.

Cloud, what are we talking about?

The very first question we should ask ourselves is: what is cloud? Actually, there are so many different cloud offerings that it is quite hard to clearly understand what cloud is about. Anyone who in the past offered a remote service now talks about cloud; anyone who offers you a remote connection now talks about cloud; it seems you have cloud even on your own personal computer, even when you are not connected to anything.

Therefore, the very first exercise is to try to understand what cloud is by its definition:

The NIST Definition of Cloud Computing

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

This cloud model is composed of five essential characteristics, three service models, and four deployment models …

Another definition that could help us understand what cloud is comes from Wikipedia:

Cloud computing in general can be portrayed as a synonym for distributed computing over a network, with the ability to run a program or application on many connected computers at the same time. It specifically refers to a computing hardware machine or group of computing hardware machines commonly referred as a server connected through a communication network such as the Internet, an intranet, a local area network (LAN) or wide area network (WAN) and individual users or user who have permission to access the server can use the server's processing power for their individual computing needs like to run a application, store data or any other computing need….



Figure General view of a cloud service

The device could be a laptop, a tablet or a smartphone, reaching the service through a client. When we talk about a client, we generally refer to a web browser or a specific application (think about apps).
This model is apparently the old classic client-server one, with the difference that the "server" does not reside in our network but is "somewhere".
Actually, the offering is wider than applications: when talking about cloud we usually talk about XaaS (something as a service), where X can be virtually anything.
As classic examples, common variations of cloud are:
-SaaS, Software as a Service (think about webmail services, CRMs such as Salesforce, or office platforms such as Google Docs and Microsoft Office 365)
-PaaS, Platform as a Service (it is almost impossible not to name Amazon Cloud Services or Windows Azure in this area)
-CaaS, Communication as a Service

Figure Cloud variations for customers

So, from an external point of view, cloud can offer not only access to a specific service, but can also be the platform where I run my own servers.
Just to make things more confusing, cloud nowadays can also offer security as a service, storage as a service, something as a service (again SaaS, where "S" now means something else), and we can talk about public, private and hybrid cloud and …
OK! OK! I got it. But what is this cloud again? How is it made?
To answer that, let us open this cloud and look inside; this will allow us to start with our very first considerations about security.
We will come back to the various "…aaS" later.

Inside the cloud
Figure Opening the cloud

According to the rather vague definitions we have met before, it seems that cloud means there is, somewhere outside our network, a computing entity able to offer us some kind of service.
This entity is complex in the sense that inside the cloud definition there is the idea of something distributed and easily accessible. Two elements are key in the definition:

-Distributed computing means that, somehow, several nodes work together to enable the computation

-The distribution is not merely local, LAN-based, but WAN-based

This at least allows us to cut out some configurations in the classic client-server model and lets us easily extend the concept to the platform-as-a-service domain as well.
We have taken the very first steps to un-cloud the clouds on cloud computing: we are talking about nodes with computational power that are geographically distributed.
The very first candidate for a computational node is a single computer; a typical example of this kind of structure is grid computing.
Since this article is not about grid computing but cloud, we will make some simplifications and assume that every node is a complex entity that is able to perform computing tasks, process and store data, and share services with other similar nodes and with the clients requesting the services.

With that assumption, the cloud starts to take shape as something like this:

It is beyond the scope of this article to explain what grid computing is or what the differences are between the grid-computing model and the cloud one. If you are interested, the internet provides tons of documentation on this.

A collection of geographically distributed datacenters
Figure What a Cloud is: datacenters somewhere in the world

Connected through some kind of connection
Figure Datacenters need connections provided by a carrier

But if things are like that, here we are: we can make our very first security considerations.


The Datacenter node is always a datacenter

Since we are talking about a datacenter, there are issues that can be easily understood because we all have had experience with datacenters.
A datacenter, to make complex things easy (isn't this a dummies guide?), is a place where we put servers, storage, software, routers, switches and all the stuff that makes Information Technology such a wonderful field to work in.
Figure Inside a Datacenter

In terms of cloud computing, any aspect of a datacenter can be offered to customers, producing the various cloud variations we saw before.
Those datacenters (can I call them DNodes? Please? Yes? Thanks!) are geographically distributed and somehow connected.
Since a datacenter is a datacenter, we will have to face the usual problems we deal with in any datacenter, such as:

-Storage security
-Data integrity
Those aspects are usually the responsibility of the provider offering the cloud service, but they somehow have an impact on the user as well. We can usually consider that some aspects are entirely the provider's responsibility and are only partially visible to the customer, other points are common to both, and some are specific to each one.
Below, I list some points that require attention, with the different perspectives from the provider side and the customer side.
Table 1. General DNode issues:

Provider                                     | Customer
Resiliency and reliability of the datacenter | SLA related to the service provided
Compliance                                   | Compliance
Law requirements                             | Law requirements

Some of those issues need to be broken down further to better understand certain aspects.
While resiliency and reliability can easily be translated into customer SLAs, compliance is another matter and requires a different approach that should be integrated.
For a provider of cloud services there could be compliance requirements related to the geographical location of the datacenter; those requirements can vary from country to country, from service to service, and even with the kind of customer targeted.

Therefore, for example, if you want to provide a service in Austria that deals with employees' sensitive personal information, you should be aware that, by law, those data have to reside in Austria and cannot leave the country without the explicit approval of the employee.
On the other hand, if your datacenter is located in the USA, under US legislation law enforcement can ask to access data, even without a warrant, under PATRIOT Act6 terms, and the provider, in the case of an NSL7, could not even warn its users.

Security in the cloud can be divided into two mainstream aspects:
-security from a service provider point of view
-security from a customer point of view

6 The USA PATRIOT Act is an Act of Congress that was signed into law by President George W. Bush on October 26, 2001. The title of the act is a ten-letter backronym (USA PATRIOT) that stands for Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001. On May 26, 2011, President Barack Obama signed the PATRIOT Sunsets Extension Act of 2011, a four-year extension of three key provisions in the USA PATRIOT Act: roving wiretaps, searches of business records (the "library records provision"), and conducting surveillance of "lone wolves"—individuals suspected of terrorist-related activities not linked to terrorist groups.

7 A national security letter (NSL) is an administrative subpoena issued by the Federal Bureau of Investigation (FBI) in authorized national security investigations "to protect against international terrorism or clandestine intelligence activities" (i.e., spying). 18 U.S.C. § 2709(b). Federal law (the Electronic Communications Privacy Act (18 U.S.C. § 2709), the Fair Credit Reporting Act (15 U.S.C. §§ 1681u and 1681v), and the Right to Financial Privacy Act (12 U.S.C. § 3414)) authorizes the FBI to seek such information that is "relevant" to an authorized national security investigation. By law, NSLs can request only non-content information, for example, transactional records and phone numbers dialed. NSLs may contain a nondisclosure provision (preventing the recipient of an NSL from disclosing that the FBI had requested the information) only if the Director of the FBI (or his designee) authorizes the nondisclosure requirement.

Sometimes those requirements match, sometimes they collide, and it is up to the customer to choose which level of compromise is acceptable.
A typical example of those colliding needs is compliance rules.

Consider that your compliance scheme requires random inspections of your systems in order to certify the level of security: if part of your computational power is cloud based, you would need to be able to make random inspections of your cloud provider's DNodes as well. But this can collide with your provider's security and compliance requirements; as a matter of fact, since the provider shares its infrastructure with several customers, allowing one of them to "inspect" the system opens a security breach.

At the same time, a cloud provider that stores data in a country whose rules require open access to data could be forced to break the users' security and compliance needs.
Common slippery grounds are, as an example, the mail and mobile communication encryption systems. Maybe someone remembers the 2010 RIM affair in Saudi Arabia, or the more recent 2013 case of Snowden's email provider Lavabit, forced to close rather than comply with an NSL requiring it to give the FBI clear access to all customers' email by handing the agency all the encryption keys used.

A collection of DNodes needs to be connected
Connectivity is historically a basic security headache. VPNs, encryption and authentication are all somehow related to connectivity issues. In the cloud environment, two basic connectivity contexts have to be considered:

-Connectivity from the user to the cloud
-Internode communication connectivity

A usually underestimated issue is the problem of connections between DNodes. The recent NSA scandals related to the information released by Edward Snowden exposed a reality that, although quite clear and known to experts (and too many conspiracy theorists), revealed to a wide audience one of the key security issues present nowadays: data can be accessed and used by users not authorized by the owner.
One of the mainstream channels used to access that information has been the communication channels provided by carriers for the internode communications between cloud DNodes.

Basically, the issue is quite simple: a carrier that connects different DNodes has direct visibility of all the traffic. If this traffic is not encrypted (or if the encryption is offered by the carrier itself as a security plus), cracking the carrier's PoA (point of access) would allow an entity to read all the traffic between the DNodes related to that PoA.

Even if the perception is that we are using a private network for internode communications, unless we own the whole communication chain we will, sooner or later, run into an external PoA that puts our data at risk.
It is no mystery that telcos and carriers are quite used to "sharing" access, willingly or not, with law enforcement agencies and other groups of interest. We have plenty of cases all over the world, from AT&T to Telecom Italia, from the greatest to the smallest: there is no way we can be sure our data have not been released somewhere without our permission.
If the DNodes are on different continents, we should assume that several layers of carriers8 have to be used.

8 A detailed cable map can be found at

Figure DNodes connected by the underwater cable grid

So we will have:
-The local carrier (usually the carrier providing your internet connectivity)
-The countrywide carrier
-The various carriers that bring you from your countrywide carrier to an underwater cable system
-The carrier owning the underwater cable
-Possibly another underwater carrier, if your first underwater cable does not reach the area you need
-The various carriers that bring you to the countrywide carrier of the destination
-The local carrier your destination DNode is attached to.

There is little chance that you can make an overseas communication using a single carrier that owns both the local access and the underwater cable system.
Moreover, if your country does not have the sea or the ocean as a border, well, the chances are even smaller.
While it may seem an obvious consideration, it took the NSA scandal to move cloud providers like Google to encrypt communications between DNodes in order to protect the data.

Security from a customer point of view
It should be clear by now that security and cloud together bring a wide series of aspects: most of the classical security problems related to WANs and datacenters, some related to the old client-server problems, and some completely new ones related to the cloud model itself.
Understanding how a cloud is made can help us make correct considerations about cloud requirements, so let us start thinking as a cloud user and find some good points of discussion related to security.
The very best way to start this journey is to go back to the "…aaS" model and try to understand what the security implications are.

X as a Service means someone else is in this with me
As a general introduction, we should consider that security on the cloud means we have to rely on several third parties to obtain access to resources on the cloud.

Figure To connect to a generic XaaS service we need three elements: ourselves with our XaaS client, the internet provider and the XaaS provider

As a simple division we will have:
-A carrier providing communication
-The cloud provider

Since we cannot control the whole chain, we will have to find a tradeoff between the security we want/need, the trust we can give to our third parties, and the budget we have.
Moving to the cloud can be more or less expensive according to that tradeoff; sometimes it is worth the game, sometimes it is not.
Anyway, some security issues are common to any form of cloud and should be analyzed. Since we have to rely on a third party that is offering us the service, we should consider some very basic aspects that lead us to make a good choice.

The Lock dilemma
Figure Lock in and Lock Out

One of the aspects of cloud services is an extension of the lock dilemma that is quite common when we deal with legacy systems internally.
The problem is simple: once I start using a system, I start filling its storage with valuable data, I link my processes to the platform, and somehow I rely and depend on it.
This means that I am limiting my flexibility. It happens all the time: when we choose an operating system (Windows and Linux do not share the same application domain), when we develop custom software, when we rely on a mail infrastructure or a phone system.

Unless the system is multiplatform and completely open, we are, somehow, tied to our choice. Cloud is not different at all.
But changing and moving is a necessity, nothing lasts forever, so what happens when I need to change my cloud provider? Is this even possible? Can I easily migrate my data? And after I have moved away, will my data actually be deleted from the old platform?

So our data, and we ourselves, could be trapped by our choice, Locked In by a series of obstacles that would stop us from changing platform.
As a typical cloud issue, we find that Lock In has its companion in Lock Out considerations. Since by definition the cloud is something external, we should consider that we can be locked out, meaning we might not be able to reach the platform.

This can be due to a cloud provider problem, but also to a problem at a carrier providing connectivity at any level of the cloud chain, as seen before.
Since we are internet dependent when talking about cloud (at least its public portion), the internet connection is not something we can ignore.
In this area there are several security exposure risks; the most common are carrier service problems, DDoS attacks against the networking equipment or, easier, against the DNS systems, and any attack that can redirect traffic. Lock Out can be a tricky problem, mostly if neither your systems nor the cloud provider are directly involved, but the lock out is an indirect consequence of another activity.

There are also Lock Out risks related to legal issues not directly related to you but involving your cloud provider or one of its customers. If you remember the MegaUpload soap opera, several customers, because of the action against Mr. Kim Dotcom's MegaUpload, were locked out of their legitimate data, data that was deleted by law enforcers even though the owners were not related in any way to the "suspected illegal" activities.

Lock In and Lock Out are, indeed, points to be considered carefully; workarounds or security measures can, of course, be put in place, but they require specific considerations based on the kind of data and kind of service involved. CaaS, PaaS, NaaS, SaaS, IaaS, BaaS, ITMaaS (Anderson, 2011) and XaaS all have different considerations and exposure to the lock dilemma.

Geography is not only a school matter
Well, one of the new issues brought to us by cloud is that geography matters.
It is a complex argument because it involves several aspects, all security related but not always IT related. Most of the considerations are related to privacy, law requirements and data security.
As mentioned before when talking about cloud provider headaches, the location of our data is important, since it makes the data subject to the local legislation of the place where they reside.
Like it or not, we can have a conflict between our legislation and the legislation where our data are stored, and maybe also the legislation where our provider has its HQ.

The geopolitical conditions also have to be taken into account: it is hard to imagine strong opposition from the UK or Swedish governments to data-mining requests coming from the USA, while the Dutch or German governments are more trustworthy from this point of view. On the other hand, we can be quite sure that the Afghan government would have big problems accessing data stored in a European country, while China and Israel would not refrain from using hacking techniques instead of asking for access.
It would be a nice exercise to make a diagram of those relationships, but it is out of the scope of this article. The point is that, opening up to cloud, geopolitical influences matter more than we are used to thinking.

Who, when, why, with what, where?
Other considerations come up because we will allow someone to access our data from virtually anywhere, data that are stored far from our direct control.
Once upon a time, there was a nice realm where the networks were just LANs and access was made by a few computers connected by a cable. The king of the kingdom was the owner of everything in his land, and so he owned the computers, taking care of them and providing access to the services needed by his people.
But fairy tales are over, and nowadays the world is a bit more complex. Things that were not so important in the past are now relevant when talking about cloud.
Since we have to delegate access control to someone else, we need to be sure we (or the cloud provider upon our request) are able to understand:
1. who is accessing the cloud service we are paying for
2. where the user trying to make the access is
3. what device or client the user is using
4. when he or she is trying to access

The reason behind this is that we have physical control neither of the location of our data (which are in the cloud) nor of the location of the user, since cloud services are (remember the definitions) available virtually everywhere.

In addition, there are nowadays plenty of devices able to run a client to access our cloud resources: phones, tablets, laptops, desktop computers, web services, my mom…
Considerations like identity management, secure/strong authentication and SSO become more pressing. New protocols come in handy, like SAML, but the need for a careful implementation of sound identity, device and access management procedures is way more important than before.
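The who/where/what/when questions above can be sketched as a toy access-policy check. Everything here is hypothetical (the user list, country codes, device classes and time window are invented for illustration); real deployments delegate this logic to identity and access management products.

```python
from datetime import time

# Hypothetical, simplified access policy: before granting access to a cloud
# resource, verify WHO is asking, from WHERE, WITH WHAT device and WHEN.
ALLOWED_USERS = {"alice", "bob"}          # who: invented user base
ALLOWED_COUNTRIES = {"IT", "AT"}          # where: invented geo policy
ALLOWED_DEVICES = {"laptop", "desktop"}   # what: e.g. no unmanaged phones
BUSINESS_HOURS = (time(8, 0), time(20, 0))  # when: illustrative policy window

def access_allowed(user, country, device, at):
    """Return True only if all four who/where/what/when checks pass."""
    return (user in ALLOWED_USERS
            and country in ALLOWED_COUNTRIES
            and device in ALLOWED_DEVICES
            and BUSINESS_HOURS[0] <= at <= BUSINESS_HOURS[1])

print(access_allowed("alice", "IT", "laptop", time(10, 30)))  # True
print(access_allowed("alice", "US", "phone", time(23, 0)))    # False
```

In a real deployment these decisions would come from directory services, geo-IP data and device management, not hard-coded sets; the point is only that all four dimensions are evaluated together.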

Oh my, I will never go cloud, it's too complicated
There are several good reasons to go cloud: CAPEX vs. OPEX, performance, geographical issues and, why not, security.
Figure What we managed moving to cloud

While several of the security issues seen before seem, apparently, to be cloud related, they usually have their counterparts in other deployment models; so what is apparently a complication is, most of the time, something we used to deal with under different names or with a different sensibility.

On the other hand, cloud services can offer a simplified management model, a powerful platform and better spending for IT. We just have to start thinking about security in a cloud fashion.
Far from being able to give rules here to understand when cloud is good and when it is not, we can look at some specific aspects related to some classic cloud delivery models.

"Capital expenditures (CAPEX or capex)" are expenditures creating future benefits. A capital expenditure is incurred when a business spends money either to buy fixed assets or to add to the value of an existing fixed asset with a useful life extending beyond the taxable year.

"An operating expense, operating expenditure, operational expense, operational expenditure or OPEX is an ongoing cost for running a product, business, or system"
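As a back-of-the-envelope illustration of the CAPEX vs. OPEX tradeoff, the sketch below compares the three-year cost of an owned server against renting equivalent capacity from a cloud provider. All numbers are made up for illustration; a real TCO analysis involves many more factors (staff, licenses, depreciation, bandwidth).

```python
# Illustrative (invented) numbers: buying a server is one-off CAPEX plus
# ongoing running costs, while the cloud equivalent is pure monthly OPEX.
years = 3
server_purchase = 6000            # one-off CAPEX
onprem_running_per_month = 120    # power, cooling, maintenance
cloud_fee_per_month = 250         # equivalent rented instance, OPEX only

onprem_total = server_purchase + onprem_running_per_month * 12 * years
cloud_total = cloud_fee_per_month * 12 * years

print(onprem_total)  # 10320
print(cloud_total)   # 9000
```

With these invented figures the cloud option wins over three years, but shift any parameter (a cheaper server, a longer lifetime) and the result flips, which is exactly why the tradeoff has to be computed case by case.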

SaaS: Software as a Service

Figure There are thousands of SaaS declinations

The most classic cloud deployment, and the very first offered on the market, is SaaS, where "S" means Software. In this situation, we have the cloud provider offering a service, which we can access through a client.

This is an extension of the client-server model, where the server resides in the "cloud", and where the cloud is the entity we described before.
The client is usually a web page or a specific application. Using a web client makes things easier because it allows running everything through the HTTP/HTTPS protocols, involving very little management on the network side (everyone has ports 80 and 443 open to browse the internet).
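Since the transport security of a web-based SaaS client rests entirely on HTTPS, it is worth noting that a properly configured TLS client must verify the server certificate and the hostname. A minimal Python sketch (the endpoint named in the comment is hypothetical):

```python
import ssl

# A SaaS web client rides on HTTPS (port 443). Python's default SSL context
# already enforces certificate verification and hostname checking, which is
# what protects the session against man-in-the-middle tampering.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: the server cert is verified
print(ctx.check_hostname)                    # True: the cert must match the host

# The context would then be passed to the HTTP layer, e.g. (hypothetical URL):
# urllib.request.urlopen("https://saas.example.com/login", context=ctx)
```

Disabling either check (as some quick-and-dirty scripts do) silently gives up exactly the protection that makes the "browser as client" model acceptable.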

There are plenty of services that can be offered through this model: mail and chat were the very first market offerings, while now we have CRM, graphic and video processing, and even complete office suites.

The first security concerns that come to mind are related to authentication and the transmission of data.
We are not in charge of the authentication, so we cannot manage it. In the end, this is a very big issue, so we should be happy someone else takes care of it on our behalf.

Our users will provide credentials directly to the SaaS provider to log in and access the resource. The main security issue we can find here is credential management: we should be sure that the user does not lose his credentials or communicate them to someone else.

At the same time, we should be aware that the user will keep access to the platform until the provider revokes his/her/its access rights.
Authentication is a dangerous process, since it requires that sensitive data are communicated by the client to the server. During this process, especially if using a standard web client, it is possible that those data are tampered with, copied or sniffed.

After authentication, different technologies can be put in place to protect the data streaming from the client to the SaaS provider, but the authentication itself usually requires a standard, not a specific, approach. In order to avoid security issues, it is usually suggested to use HTTPS or SSL transactions to encrypt the traffic from the very beginning.

Recent discoveries, like the Heartbleed Bug security hole, have shown that even the standard SSL/HTTPS approach is not enough. Two-factor authentication is a second security layer that can be added (or used as a selection criterion when choosing the SaaS provider).

The stronger the authentication the better, so anything that enforces a stronger security layer is nice to have. But if we have many SaaS services and many users, managing all this can become a nightmare.
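As a sketch of what such a "stronger security layer" looks like in practice, the snippet below implements TOTP (RFC 6238), the algorithm behind most two-factor authenticator apps: a shared secret and the current 30-second time step feed an HMAC, and a few digits of the result become the one-time code. The secret used here is the RFC's published test value, not a real credential.

```python
import hashlib
import hmac
import struct

# TOTP (RFC 6238): HMAC the current time step with a shared secret, then
# dynamically truncate the result to a short numeric one-time code.
def totp(secret: bytes, for_time: int, digits: int = 6, step: int = 30) -> str:
    counter = struct.pack(">Q", for_time // step)          # 64-bit time step
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" at time 59 -> 94287082
print(totp(b"12345678901234567890", 59, digits=8))  # 94287082
```

Because the code changes every 30 seconds, a sniffed or phished password alone is no longer enough to log in, which is precisely the extra layer discussed above.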

To overcome this difficulty, some SaaS providers offer integration with your user base through a protocol called SAML.
SAML, Security Assertion Markup Language, is an XML-based open standard data format for exchanging authentication and authorization data between parties, in particular between an identity provider and a service provider. SAML is a product of the OASIS Security Services Technical Committee. SAML dates from 2001; the most recent major update of SAML is from 2005.
Figure SAML flow

The single most important requirement that SAML addresses is web browser single sign-on (SSO). Single sign-on solutions are common at the intranet level (using cookies, for example) but extending these solutions beyond the intranet has been problematic and has led to the proliferation of non-interoperable proprietary technologies.
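To make the flow concrete, here is a deliberately simplified sketch of what a service provider does with an incoming SAML assertion: read the subject and honor the NotBefore/NotOnOrAfter validity window. The assertion XML is a minimal invented example, and a real implementation must also verify the identity provider's XML signature; skipping that step would make SSO trivially forgeable.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from typing import Optional

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

# Minimal, invented assertion: just a subject and a validity window.
ASSERTION = """<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Subject><saml:NameID>alice@example.com</saml:NameID></saml:Subject>
  <saml:Conditions NotBefore="2014-01-01T00:00:00Z" NotOnOrAfter="2034-01-01T00:00:00Z"/>
</saml:Assertion>"""

def subject_if_valid(xml_text: str, now: datetime) -> Optional[str]:
    """Return the asserted subject if 'now' falls in the validity window.

    NOTE: a real service provider MUST first verify the XML signature of
    the identity provider; this sketch omits that entirely.
    """
    root = ET.fromstring(xml_text)
    cond = root.find("saml:Conditions", NS)
    not_before = datetime.fromisoformat(cond.get("NotBefore").replace("Z", "+00:00"))
    not_after = datetime.fromisoformat(cond.get("NotOnOrAfter").replace("Z", "+00:00"))
    if not_before <= now < not_after:
        return root.find(".//saml:NameID", NS).text
    return None

print(subject_if_valid(ASSERTION, datetime(2020, 6, 1, tzinfo=timezone.utc)))
# alice@example.com
```

The value of the protocol is that the SaaS provider never sees the user's password at all: it only consumes signed, time-limited assertions from the identity provider you control.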
Once we have solved the data transmission and authentication issues when dealing with SaaS, we should then consider the Lock In and Lock Out problems. This is, of course, application dependent.

Using a SaaS mail service provider can be easy, and both Lock In and Lock Out issues can be solved relatively easily, as can the migration of data.
Email is a standard object and can be downloaded and managed as text/MIME files. While we may have to accept losing some data, those kinds of providers usually make it possible to easily backup/download/save/migrate from platform to platform.
A CRM is a totally different thing: like email, it holds live, important and sensitive data, but data migration is not as easy as we might expect, because processes, databases and data structures are heavily involved.
Last but not least, we should make some considerations about the location of our data and the related privacy. This, as noted before, is a general cloud issue, but email and CRM contain 90% of the intellectual property of a company, so a smart eye on this is required.

SaaS: Storage as a Service
Another variation of the SaaS acronym is Storage as a Service. Just to make things clear: Google Drive, Dropbox, OneDrive, Box and family.
Figure Some Storage as a Service providers

Basically, the idea is to provide storage accessible in the cloud somehow. Sometimes with a client, sometimes with FTP (what? Is anyone still using FTP?), sometimes with specific protocols and clients…
We have, more or less, the same considerations we saw before, but we should add something more.

Leaving raw data on storage out of our control exposes us to a clear security risk, which can be "easily" addressed with encryption.
Now, encryption is everything but simple, but there are products on the market that can ease our life, using simple interfaces to manage the certificates and authentication required to encrypt the data. They could even add security layers related to group access and monitoring.

What if it is the provider itself offering the encryption?
Well, in this case the security offered would be minimal: since the provider holds the encryption keys, it can, in any case, offer access to an unwanted visitor (be it the NSA or a hacker). It is always a good security practice to separate encryption management from the hosting provider; this way we add a layer of security due to the fact that there is no single entity managing both the file location and the encryption.
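The key-separation principle can be sketched as follows: the key is derived and kept on the client, so the storage provider only ever sees ciphertext. The keystream construction below is a toy for illustration only; in practice use a vetted library (for example the third-party cryptography package and its Fernet recipe).

```python
import hashlib
import secrets

# TOY construction for illustration: a SHA-256-based keystream XORed over the
# data. Do NOT use this in production; use a vetted library instead. The point
# is only that key derivation and encryption happen entirely on the client.
def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# The key is derived locally from a passphrase and never leaves the client.
key = hashlib.pbkdf2_hmac("sha256", b"passphrase", b"salt", 100_000)
nonce = secrets.token_bytes(16)

plaintext = b"quarterly results, CRM export"
ciphertext = keystream_xor(key, nonce, plaintext)  # only THIS gets uploaded
print(keystream_xor(key, nonce, ciphertext) == plaintext)  # True: round-trips
```

Since the provider stores only the ciphertext and the nonce, compromising the storage side alone (by a hacker or a legal request) yields nothing readable without the client-held key.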

PaaS-IaaS: Platform as a service - Infrastructure as a Service
Another classic variation of cloud is PaaS-IaaS: think of Amazon Cloud Services or Windows Azure and you have a clear idea of what this is about.
What PaaS means is that you can run your "server" in a virtual environment that is offered by your PaaS provider.
This is a wonderful way to avoid problems related to datacenter management, storage, availability and resiliency issues. You just have to pay for the computational power, the memory and the storage. If you need to change those parameters, you can do it easily and quickly, since it is the virtual platform managed by the provider that offers you this as a service with specific SLAs.
While you do not have to care about the hardware, all the software problems related to your "machines" remain on your account.
Mainly, patching can be onerous, since it is not always easy to address those problems in a virtual environment. "Virtual patching" can be a way to address the issue, but it requires that your virtual patching system be compatible with your PaaS provider.
Another interesting point that needs careful security consideration is the location of the identities used. No matter if we are using Linux, Unix or Microsoft systems, in a PaaS environment we will have users that are used both locally and in the PaaS environment. A careful plan is required in order to avoid dangerous situations.
The good news is that, since most of those providers offer VMware, Oracle VirtualBox and/or Microsoft Hyper-V friendly environments, moving our virtual environment from one provider to another is, almost, a piece of cake.
This minimizes the Lock In issues, while, since we are depending heavily on the systems, the Lock Out issues remain the same.

Security in the cloud: is it possible? Yes, but we need to think about what we are moving and what we are dealing with.
There are tradeoffs between what we want, what we need, what we can pay and what the market offers, but this is common everywhere. We simply have to take all the aspects into account and sometimes re-evaluate aspects that we usually didn't consider, even though they have been present ever since the LAN moved to the WAN.

About author:
Antonio Ierano: IT professional Manager, BDM, marketing specialist, and tech evangelist with over 16 years of experience serving as a community liaison, subject matter expert, and high-profile trainer for key technologies and solutions. My experience includes acting as the public face of Cisco security technologies; leading pan-European technical teams in development of new Cisco security products; and serving as a key public speaker and trainer on behalf of new high-tech products. My expertise spans IT development and implementation, marketing strategy, legal issues, and budget / financial management.


September 2, 2014
© HAKIN9 MEDIA SP. Z O.O. SP. K. 2023