Cloud computing

From Wikipedia, the free encyclopedia
Cloud computing logical diagram

Cloud computing - more precisely, a computing cloud - is a colloquial expression used to describe a variety of different computing concepts that involve a large number of computers connected through a real-time communication network (typically the Internet). Cloud computing is a jargon term without a commonly accepted, unambiguous scientific or technical definition. In science, cloud computing is a synonym for distributed computing over a network and means the ability to run a program on many connected computers at the same time. The popularity of the term can be attributed to its use in marketing to sell hosted services, in the sense of application service provisioning, that run client–server software at a remote location.

Advantages of Cloud Computing

Cloud computing relies on sharing of resources to achieve coherence and economies of scale similar to a utility (like the electricity grid) over a network.[1] At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.

The cloud also focuses on maximizing the effectiveness of the shared resources. Cloud resources are usually not only shared by multiple users but are also dynamically reallocated on demand. This can work for allocating resources to users in different time zones. For example, a cloud facility could serve European users during European business hours with a specific application (e.g., email), while the same resources are reallocated to serve North American users during North American business hours with a different application (e.g., a web server). This approach maximizes the use of computing power and can reduce environmental impact as well, since less power, air conditioning, rack space, and so on are required for the same functions.

The term moving to cloud also refers to an organization moving away from a traditional capex model (buy the dedicated hardware and depreciate it over a period of time) to the opex model (use a shared cloud infrastructure and pay as you use it).

Proponents claim that cloud computing allows companies to avoid upfront infrastructure costs, and focus on projects that differentiate their businesses instead of infrastructure.[2] Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand.[2][3][4]

Cloud Computing as Hosted Services

In marketing, cloud computing is mostly used to sell hosted services, in the sense of application service provisioning, that run client–server software at a remote location. Such services are given popular acronyms such as 'SaaS' (software as a service) and 'PaaS' (platform as a service). End users access cloud-based applications through a web browser or a lightweight desktop or mobile app, while the business software and the user's data are stored on servers at a remote location.

History

The 1950s

The underlying concept of cloud computing dates back to the 1950s, when large-scale mainframes became available in academia and corporations, accessible via thin clients/terminal computers, often referred to as "dumb terminals" because they were used for communication but had no internal computational capacity. To make more efficient use of costly mainframes, a practice evolved that allowed multiple users to share both physical access to the computer from multiple terminals and the CPU time itself. This eliminated periods of inactivity on the mainframe and allowed a greater return on the investment. The practice of sharing CPU time on a mainframe became known in the industry as time-sharing.[5]

The 1960s - 1990s

John McCarthy opined in the 1960s that "computation may someday be organized as a public utility."[6] Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government, and community forms were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility. Other scholars have shown that cloud computing's roots go all the way back to the 1950s, when scientist Herb Grosch (the author of Grosch's law) postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.[7] Because of the expense of these powerful computers, many corporations and other entities could avail themselves of computing capability only through time sharing, and several organizations, such as GE's GEISCO, IBM subsidiary The Service Bureau Corporation (SBC, founded in 1957), Tymshare (founded in 1966), National CSS (founded in 1967 and bought by Dun & Bradstreet in 1979), Dial Data (bought by Tymshare in 1968), and Bolt, Beranek and Newman (BBN), marketed time sharing as a commercial venture.

The 1990s

In the 1990s, telecommunications companies, which had previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service but at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively. They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what users were responsible for. Cloud computing extends this boundary to cover servers as well as the network infrastructure.[8]

As computers became more prevalent, scientists and technologists explored ways to make large-scale computing power available to more users through time sharing, experimenting with algorithms to provide the optimal use of the infrastructure, platform and applications with prioritized access to the CPU and efficiency for the end users.[9]

Since 2000

After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing its data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" (teams small enough to feed with two pizzas) could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.[10][11]

In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds.[12] In the same year, efforts were focused on providing quality of service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.[13] By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them"[14] and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."[15]

On March 1, 2011, IBM announced the IBM SmartCloud framework to support Smarter Planet.[16] Among the various components of the Smarter Computing foundation, cloud computing is a critical piece.

Growth and popularity of cloud

The development of the Internet from being document-centric via semantic data towards more and more services was described as the "Dynamic Web".[17] This contribution focused in particular on the need for better metadata able to describe not only implementation details but also conceptual details of model-based applications.

The ubiquitous availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, have led to a tremendous growth in cloud computing.[18][19][20]

Origin of the term "cloud"

The origin of the term cloud computing is unclear. The expression cloud is commonly used in science to describe a large agglomeration of objects that visually appear from a distance as a cloud, and it describes any set of things whose details are not inspected further in a given context.

  • Meteorology: a weather cloud is an agglomeration of small water droplets;
  • Mathematics: a large number of points in a coordinate system is seen as a point cloud;
  • Astronomy: many stars that crowd together are seen as star clouds (also known as star mist) in the sky, e.g. the Milky Way;
  • Physics: the fast movement of electrons around an atomic nucleus appears like a cloud to a distant observer.

By analogy with the above usage, the word cloud was used as a metaphor for the Internet, and a standardized cloud-like shape was used to denote a network on telephony schematics and later to depict the Internet in computer network diagrams. The cloud symbol was used to represent the Internet as early as 1994.[21][22] Servers were then shown connected to, but external to, the cloud symbol.

Urban legends claim that usage of the expression is directly derived from the practice of using drawings of stylized clouds to denote networks in diagrams of computing and communications systems.

The term became popular through Amazon.com, which introduced the Elastic Compute Cloud in 2006.

Similar systems and concepts

Cloud computing is the result of the evolution and adoption of existing technologies and paradigms. The goal of cloud computing is to allow users to benefit from all of these technologies without needing deep knowledge about or expertise with each one of them. The cloud aims to cut costs and to help users focus on their core business instead of being impeded by IT obstacles.[23]

The main enabling technology for cloud computing is virtualization. Virtualization abstracts the physical infrastructure, which is the most rigid component, and makes it available as a soft component that is easy to use and manage. By doing so, virtualization provides the agility required to speed up IT operations, and reduces cost by increasing infrastructure utilization. On the other hand, autonomic computing automates the process through which the user can provision resources on-demand. By minimizing user involvement, automation speeds up the process and reduces the possibility of human errors.[23]

Cloud computing adopts concepts from service-oriented architecture (SOA) that can help users break the difficult business problems they face every day into services that can be integrated to provide a solution. Cloud computing provides all of its resources as services, and makes use of the well-established standards and best practices gained in the domain of SOA to allow global and easy access to cloud services in a standardized way.

Cloud computing also leverages concepts from utility computing in order to provide metrics for the services used. Such metrics are at the core of the public cloud pay-per-use models. In addition, measured services are an essential part of the feedback loop in autonomic computing, allowing services to scale on-demand and to perform automatic failure recovery.

Cloud computing is a kind of grid computing; it has evolved from grid computing by addressing the QoS (quality of service) and reliability problems. Cloud computing provides the tools and technologies to build data/compute intensive parallel applications with much more affordable prices compared to traditional parallel computing techniques.[23]

Cloud computing shares characteristics with:

  • Client–server model — Client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).[24]
  • Grid computing — "A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
  • Mainframe computer — Powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, police and secret intelligence services, enterprise resource planning, and financial transaction processing.[25]
  • Utility computing — The "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."[26][27]
  • Peer-to-peer means distributed architecture without the need for central coordination. Participants are both suppliers and consumers of resources (in contrast to the traditional client–server model).
  • Cloud gaming—also known as on-demand gaming—is a way of delivering games to computers. Gaming data is stored in the provider's server, so that gaming is independent of client computers used to play the game.

Characteristics

Cloud computing exhibits the following key characteristics:

  • Agility improves with users' ability to re-provision technological infrastructure resources.
  • Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way that a traditional user interface (e.g., a computer desktop) facilitates interaction between humans and computers. Cloud computing systems typically use Representational State Transfer (REST)-based APIs.
  • Cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure.[28] This is purported to lower barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).[29] The e-FISCAL project's state of the art repository[30] contains several articles looking into cost aspects in more detail, most of them concluding that costs savings depend on the type of activities supported and the type of infrastructure available in-house.
  • Device and location independence[31] enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.[29]
  • Virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can be easily migrated from one physical server to another.
  • Multitenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilisation and efficiency improvements for systems that are often only 10–20% utilised.[10][32]
  • Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.[33]
  • Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time,[34][35] without users having to engineer for peak loads.[36][37][38]
  • Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.[29]
  • Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.[39] Security is often as good as or better than other traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford.[40] However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer and can be accessed from different places.
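The REST-based APIs mentioned among the characteristics above can be illustrated with a short sketch. The request and response fields below ("cpus", "memory_gb", "state") are invented for illustration and do not correspond to any particular provider's API:

```python
import json

# Hypothetical REST-style exchange for provisioning a virtual machine.
# A real client would POST the request body to a provider endpoint over
# HTTPS; here we only build and parse the JSON payloads.

def build_provision_request(name, cpus, memory_gb):
    """Serialize a provisioning request as the JSON body a REST API would accept."""
    return json.dumps({"name": name, "cpus": cpus, "memory_gb": memory_gb})

def parse_provision_response(body):
    """Extract the instance identifier and state from a JSON response body."""
    data = json.loads(body)
    return data["id"], data["state"]

# Example round trip with a canned response:
request_body = build_provision_request("web-1", cpus=2, memory_gb=4)
instance_id, state = parse_provision_response('{"id": "i-0abc", "state": "pending"}')
```

The point of the sketch is that the machine-readable interface mirrors what a human would do through a console: the same provisioning action is expressed as a structured request any program can issue.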

The National Institute of Standards and Technology's definition of cloud computing identifies "five essential characteristics":

On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.

Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).

Resource pooling. The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. ...

Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear unlimited and can be appropriated in any quantity at any time.

Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.
—National Institute of Standards and Technology[1]

On-demand self-service

On-demand self-service allows users to obtain, configure and deploy cloud services themselves using cloud service catalogues, without requiring the assistance of IT.[41][42] This feature is listed by the National Institute of Standards and Technology (NIST) as a characteristic of cloud computing.[1]

The self-service requirement of cloud computing prompts infrastructure vendors to create cloud computing templates, which are obtained from cloud service catalogues. Manufacturers of such templates or blueprints include BMC Software, with Service Blueprints as part of its cloud management platform;[43] Hewlett-Packard (HP), which calls its templates HP Cloud Maps;[44] RightScale;[45] and Red Hat, which calls its templates CloudForms.[46]

The templates contain predefined configurations used by consumers to set up cloud services. The templates or blueprints provide the technical information necessary to build ready-to-use clouds.[45] Each template includes specific configuration details for different cloud infrastructures, with information about servers for specific tasks such as hosting applications, databases, websites and so on.[45] The templates also include predefined web services, the operating system, the database, security configurations and load balancing.[46]

Cloud computing consumers use cloud templates to move applications between clouds through a self-service portal. The predefined blueprints define all that an application requires to run in different environments. For example, a template could define how the same application could be deployed on cloud platforms based on Amazon Web Services, VMware or Red Hat.[47] The user organization benefits from cloud templates because the technical aspects of cloud configurations reside in the templates, letting users deploy cloud services with the push of a button.[48][49] Cloud templates can also be used by developers to create a catalog of cloud services.[50]
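A template of the kind described above can be sketched as a simple data structure that maps one logical application stack onto provider-specific settings. Every name in this sketch (the stack, image identifiers, and provider keys) is a hypothetical illustration, not any vendor's actual blueprint format:

```python
# Illustrative cloud template: one logical stack, several target providers.
# All identifiers are invented for the example.
TEMPLATE = {
    "name": "web-app-stack",
    "web_server": "nginx",
    "database": "postgresql",
    "images": {                      # per-provider machine-image identifiers
        "aws": "ami-web-app",
        "vmware": "vm-web-app",
    },
}

def deploy_plan(template, provider):
    """Resolve the template into the concrete settings for one provider."""
    if provider not in template["images"]:
        raise ValueError(f"no image defined for provider {provider!r}")
    return {
        "image": template["images"][provider],
        "web_server": template["web_server"],
        "database": template["database"],
    }

plan = deploy_plan(TEMPLATE, "aws")
```

The design choice the sketch highlights is that portability comes from keeping the application definition provider-neutral and isolating provider-specific detail in one lookup.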

Service models

Cloud computing providers offer their services according to several fundamental models:[1][51] infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS), where IaaS is the most basic and each higher model abstracts from the details of the models below it. Other key XaaS components are described in a comprehensive taxonomy model published in 2009,[52] such as Strategy-as-a-Service, Collaboration-as-a-Service, Business Process-as-a-Service, Database-as-a-Service, etc. In 2012, network as a service (NaaS) and communication as a service (CaaS) were officially included by the International Telecommunication Union (ITU) among the basic cloud computing models, as recognized service categories of a telecommunication-centric cloud ecosystem.[53]

Cloud computing layers

Infrastructure as a service (IaaS)

In the most basic cloud-service model, providers of IaaS offer computers - physical or (more often) virtual machines - and other resources. (A hypervisor, such as Xen or KVM, runs the virtual machines as guests. Pools of hypervisors within the cloud operational support-system can support large numbers of virtual machines and the ability to scale services up and down according to customers' varying requirements.) IaaS clouds often offer additional resources such as a virtual-machine disk image library, raw (block) and file-based storage, firewalls, load balancers, IP addresses, virtual local area networks (VLANs), and software bundles.[54] IaaS-cloud providers supply these resources on-demand from their large pools installed in data centers. For wide-area connectivity, customers can use either the Internet or carrier clouds (dedicated virtual private networks).

To deploy their applications, cloud users install operating-system images and their application software on the cloud infrastructure. In this model, the cloud user patches and maintains the operating systems and the application software. Cloud providers typically bill IaaS services on a utility computing basis[citation needed]: cost reflects the amount of resources allocated and consumed.
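Utility-style IaaS billing, as described above, charges for what was allocated and consumed. A minimal sketch follows; the rates are invented for illustration and are not any provider's actual prices:

```python
# Minimal utility-billing sketch: cost reflects metered consumption.
# Rates are arbitrary illustration values, not real prices.
VM_HOUR_RATE = 0.10      # currency units per VM-hour
STORAGE_GB_RATE = 0.05   # currency units per GB-month of storage

def monthly_bill(vm_hours, storage_gb_months):
    """Compute a pay-per-use bill from metered resource consumption."""
    return vm_hours * VM_HOUR_RATE + storage_gb_months * STORAGE_GB_RATE

# One VM running the whole month (720 h) with 100 GB of storage:
bill = monthly_bill(720, 100)  # 720 * 0.10 + 100 * 0.05 = 77.0
```

The contrast with the capex model mentioned earlier is that the bill is zero whenever consumption is zero, rather than a fixed depreciation charge.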

Examples of IaaS providers include: Amazon EC2, AirVM, Azure Services Platform, DynDNS, Google Compute Engine, HP Cloud, iland, Joyent, LeaseWeb, Linode, NaviSite, Oracle Infrastructure as a Service, Rackspace, ReadySpace Cloud Services, ReliaCloud, SAVVIS, SingleHop, and Terremark.

Cloud communications and cloud telephony, rather than replacing local computing infrastructure, replace local telecommunications infrastructure with Voice over IP and other off-site Internet services.

Platform as a service (PaaS)

In the PaaS model, cloud providers deliver a computing platform, typically including an operating system, programming-language execution environment, database, and web server. Application developers can develop and run their software solutions on a cloud platform without the cost and complexity of buying and managing the underlying hardware and software layers. With some PaaS offerings, the underlying compute and storage resources scale automatically to match application demand, so that the cloud user does not have to allocate resources manually.

Examples of PaaS include: AWS Elastic Beanstalk, Cloud Foundry, Heroku, Force.com, EngineYard, Mendix, OpenShift, Google App Engine, AppScale, Windows Azure Cloud Services and OrangeScape.

Software as a service (SaaS)

In the business model using software as a service (SaaS), users are provided access to application software and databases. Cloud providers manage the infrastructure and platforms that run the applications. SaaS is sometimes referred to as "on-demand software"; providers generally price applications on a subscription or pay-per-use basis.

In the SaaS model, cloud providers install and operate application software in the cloud and cloud users access the software from cloud clients. Cloud users do not manage the cloud infrastructure and platform where the application runs. This eliminates the need to install and run the application on the cloud user's own computers, which simplifies maintenance and support. Cloud applications are different from other applications in their scalability—which can be achieved by cloning tasks onto multiple virtual machines at run-time to meet changing work demand.[55] Load balancers distribute the work over the set of virtual machines. This process is transparent to the cloud user, who sees only a single access point. To accommodate a large number of cloud users, cloud applications can be multitenant, that is, any machine serves more than one cloud user organization. It is common to refer to special types of cloud based application software with a similar naming convention: desktop as a service, business process as a service, test environment as a service, communication as a service.
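The cloning-and-load-balancing scheme described above can be sketched as a round-robin dispatcher over a pool of cloned virtual machines. The machine names are placeholders, and real balancers typically also weigh load and health, which this sketch omits:

```python
import itertools

# Sketch of SaaS scalability: the application is cloned onto several
# virtual machines and a balancer spreads requests over them round-robin,
# so the user sees a single access point. Machine names are placeholders.
class RoundRobinBalancer:
    def __init__(self, machines):
        self._cycle = itertools.cycle(machines)

    def route(self, request):
        """Assign the request to the next machine in the pool."""
        return next(self._cycle)

balancer = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
assignments = [balancer.route(f"req-{i}") for i in range(4)]
# The fourth request wraps back to the first machine.
```

Scaling then amounts to constructing the balancer over a larger (or smaller) pool as demand changes, without the client noticing.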

The pricing model for SaaS applications is typically a monthly or yearly flat fee per user,[56] so price is scalable and adjustable if users are added or removed at any point.[57]

Examples of SaaS include: Google Apps, Microsoft Office 365, Petrosoft, Onlive, GT Nexus, Marketo, Casengo, TradeCard and CallidusCloud.

Proponents claim SaaS allows a business the potential to reduce IT operational costs by outsourcing hardware and software maintenance and support to the cloud provider. This enables the business to reallocate IT operations costs away from hardware/software spending and personnel expenses, towards meeting other goals. In addition, with applications hosted centrally, updates can be released without the need for users to install new software. One drawback of SaaS is that the users' data are stored on the cloud provider's server. As a result, there could be unauthorized access to the data.

Network as a service (NaaS)

Network as a service (NaaS) is a category of cloud services in which the capability provided to the cloud service user is to use network/transport connectivity services and/or inter-cloud network connectivity services.[58] NaaS involves optimizing resource allocations by considering network and computing resources as a unified whole.[59]

Traditional NaaS services include flexible and extended VPNs and bandwidth on demand.[58] Realizations of the NaaS concept also include the provision of a virtual network service by the owners of the network infrastructure to a third party (VNP – VNO).[60][61]

Cloud clients

Users access cloud computing using networked client devices, such as desktop computers, laptops, tablets and smartphones. Some of these devices - cloud clients - rely on cloud computing for all or most of their applications, and are essentially useless without it. Examples are thin clients and the browser-based Chromebook. Many cloud applications do not require specific software on the client, instead using a web browser to interact with the cloud application. With Ajax and HTML5, these web user interfaces can achieve a look and feel similar to, or even better than, native applications. Some cloud applications, however, support specific client software dedicated to these applications (e.g., virtual desktop clients and most email clients). Some legacy applications (line-of-business applications that until now have been prevalent in thin-client computing) are delivered via screen-sharing technology.

Deployment models

Cloud computing types

Private cloud

Private cloud is cloud infrastructure operated solely for a single organization, whether managed internally or by a third-party and hosted internally or externally.[1] Undertaking a private cloud project requires a significant level and degree of engagement to virtualize the business environment, and requires the organization to reevaluate decisions about existing resources. When done right, it can improve business, but every step in the project raises security issues that must be addressed to prevent serious vulnerabilities.[62]

Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from less hands-on management,[63] essentially "[lacking] the economic model that makes cloud computing such an intriguing concept".[64][65]

Comparison for SaaS

                  Public cloud                        Private cloud
Initial cost      Typically zero                      Typically high
Running cost      Predictable                         Unpredictable
Customization     Impossible                          Possible
Privacy           No (host has access to the data)    Yes
Single sign-on    Impossible                          Possible
Scaling up        Easy while within defined limits    Laborious but no limits

Public cloud

A cloud is called a "public cloud" when the services are rendered over a network that is open for public use. Technically there is no difference between public and private cloud architecture; however, security considerations may be substantially different for services (applications, storage, and other resources) that are made available by a service provider to a public audience and when communication takes place over a non-trusted network. Generally, public cloud service providers such as Amazon AWS, Microsoft and Google own and operate the infrastructure and offer access only via the Internet (direct connectivity is not offered).[29]

Community cloud

Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third-party and hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing are realized.[1]

Hybrid cloud

Hybrid cloud is a composition of two or more clouds (private, community or public) that remain unique entities but are bound together, offering the benefits of multiple deployment models.[1] Such composition expands deployment options for cloud services, allowing IT organizations to use public cloud computing resources to meet temporary needs.[66] This capability enables hybrid clouds to employ cloud bursting for scaling across clouds.[1]

Cloud bursting is an application deployment model in which an application runs in a private cloud or data center and "bursts" to a public cloud when the demand for computing capacity increases. A primary advantage of cloud bursting and a hybrid cloud model is that an organization only pays for extra compute resources when they are needed.[67]

Cloud bursting enables data centers to create an in-house IT infrastructure that supports average workloads, and use cloud resources from public or private clouds, during spikes in processing demands.[68]
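The bursting decision described above can be sketched as: serve demand from private capacity first, and send only the overflow to the public cloud. The capacity and demand figures below are arbitrary illustration values:

```python
# Cloud-bursting sketch: the private data center is sized for the average
# workload; anything beyond its capacity "bursts" to a public cloud.
PRIVATE_CAPACITY = 100  # requests/s the in-house infrastructure can serve

def split_workload(demand, private_capacity=PRIVATE_CAPACITY):
    """Return (private_load, public_burst) for a given level of demand."""
    private_load = min(demand, private_capacity)
    public_burst = max(0, demand - private_capacity)
    return private_load, public_burst

# On a normal day everything stays in-house; during a spike the overflow
# (and only the overflow) is paid for on the public cloud.
normal = split_workload(80)    # (80, 0)
spike = split_workload(250)    # (100, 150)
```

This captures the economic point of the hybrid model: the organization pays public-cloud rates only for the burst portion, not for capacity that sits idle most of the time.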

By utilizing "hybrid cloud" architecture, companies and individuals can obtain a degree of fault tolerance combined with locally immediate usability that does not depend on Internet connectivity. Hybrid cloud architecture requires both on-premises resources and off-site (remote) server-based cloud infrastructure.

Although hybrid clouds lack some of the security and certainty of purely in-house applications,[69] they combine the flexibility of in-house applications with the fault tolerance and scalability of cloud-based services.

Architecture[edit]

Cloud computing sample architecture

Cloud architecture,[70] the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue. Elastic provision implies intelligence in the use of tight or loose coupling as applied to mechanisms such as these and others.
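The loose coupling described above can be illustrated with a message queue: components interact only through the queue, never directly. This sketch uses Python's in-process `queue.Queue` as a stand-in for a distributed message broker, purely to show the pattern:

```python
# A minimal sketch of loose coupling between cloud components via a
# messaging queue. A real cloud deployment would use a distributed broker;
# an in-process queue here only illustrates the decoupling.
import queue

def front_end(q, jobs):
    # The front end only knows the queue, not which worker runs the job.
    for job in jobs:
        q.put(job)

def worker(q, results):
    # Workers can be added or removed (elastic provisioning) without any
    # change to the front end, because coupling happens only via the queue.
    while not q.empty():
        results.append(q.get().upper())

q = queue.Queue()
results = []
front_end(q, ["resize-image", "send-mail"])
worker(q, results)
```

Because neither side holds a reference to the other, the number of workers can grow or shrink elastically, which is exactly the intelligence in tight-versus-loose coupling that elastic provisioning relies on.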

The Intercloud[edit]

The Intercloud[71] is an interconnected global "cloud of clouds"[72][73] and an extension of the Internet "network of networks" on which it is based.[74][75][76]

Cloud engineering[edit]

Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high-level concerns of commercialisation, standardisation, and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information, security, platform, risk, and quality engineering.

Issues[edit]

Threats and opportunities of the cloud[edit]

56% of European decision-makers consider the cloud a priority for 2013 and 2014.[77] Cloud budgets are expected to reach 30% of the overall IT budget.[citation needed] But several deterrents to the cloud remain, among them: reliability, availability of services and data, security, complexity, costs, regulations and legal issues, performance, migration, reversion, the lack of standards, and limited customization. The cloud also offers several strong points, however: infrastructure flexibility, faster deployment of applications and data, cost control, adaptation of cloud resources to real needs, and improved productivity. The early-2010s cloud market is dominated by software and services in SaaS mode and IaaS (infrastructure), especially the private cloud; PaaS and the public cloud lag behind.

Privacy[edit]

Privacy advocates have criticized the cloud model for giving hosting companies greater ease to control, and thus to monitor at will, communication between host company and end user, and to access user data (with or without permission). Instances such as the secret NSA program that, working with AT&T and Verizon, recorded over 10 million telephone calls between American citizens, cause uncertainty among privacy advocates about the greater powers such arrangements give telecommunication companies to monitor user activity.[78] A cloud service provider (CSP) can complicate data privacy because of the extent of virtualization (virtual machines) and cloud storage used to implement cloud services.[79] Within CSP operations, customer or tenant data may not remain on the same system, in the same data center, or even within the same provider's cloud; this can lead to legal concerns over jurisdiction. While there have been efforts (such as US-EU Safe Harbor) to "harmonise" the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones."[80] Cloud computing poses privacy concerns because the service provider may access the data on the cloud at any time, and could accidentally or deliberately alter or even delete information.[81]

Postage and delivery services company Pitney Bowes launched Volly, a cloud-based digital mailbox service, to leverage its communication-management assets. It faced the technical challenge of providing strong data security and privacy, which it addressed by applying customized, application-level security, including encryption.[82]

Compliance[edit]

To comply with regulations including FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes that are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA"[83][84] and Rackspace Cloud or QubeSpace are able to claim PCI compliance.[85]

Many providers also obtain a SAS 70 Type II audit, but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely.[86] Providers typically make this information available on request, under non-disclosure agreement.[87][88]

Customers in the EU contracting with cloud providers outside the EU/EEA have to adhere to the EU regulations on export of personal data.[89]

U.S. Federal Agencies have been directed by the Office of Management and Budget to use a process called FedRAMP (Federal Risk and Authorization Management Program) to assess and authorize cloud products and services. Federal CIO Steven VanRoekel issued a memorandum to federal agency Chief Information Officers on December 8, 2011 defining how federal agencies should use FedRAMP. FedRAMP consists of a subset of NIST Special Publication 800-53 security controls specifically selected to provide protection in cloud environments. A subset has been defined for the FIPS 199 low categorization and the FIPS 199 moderate categorization. The FedRAMP program has also established a Joint Accreditation Board (JAB) consisting of Chief Information Officers from DoD, DHS and GSA. The JAB is responsible for establishing accreditation standards for third-party organizations that perform assessments of cloud solutions. The JAB also reviews authorization packages, and may grant provisional authorization (to operate). The federal agency consuming the service retains final responsibility for the authority to operate.[90]

A multitude of laws and regulations have forced specific compliance requirements onto many companies that collect, generate or store data. These policies may dictate a wide array of data storage policies, such as how long information must be retained, the process used for deleting data, and even certain recovery plans. Below are some examples of compliance laws or regulations.

  • In the United States, the Health Insurance Portability and Accountability Act (HIPAA) requires a contingency plan that includes data backups, data recovery, and data access during emergencies.
  • The privacy laws of Switzerland demand that private data, including emails, be physically stored in Switzerland.
  • In the United Kingdom, the Civil Contingencies Act of 2004 sets forth guidance for a Business contingency plan that includes policies for data storage.

In a virtualized cloud computing environment, customers may never know exactly where their data is stored. In fact, data may be stored across multiple data centers in an effort to improve reliability, increase performance, and provide redundancies. This geographic dispersion may make it more difficult to ascertain legal jurisdiction if disputes arise.[91]

Legal[edit]

As with other changes in the landscape of computing, certain legal issues arise with cloud computing, including trademark infringement, security concerns and sharing of proprietary data resources.

The Electronic Frontier Foundation has criticized the United States government for considering during the Megaupload seizure process that people lose property rights by storing data on a cloud computing service.[92]

One important but not often mentioned problem with cloud computing is the problem of who is in "possession" of the data. If a cloud company is the possessor of the data, the possessor has certain legal rights. If the cloud company is the "custodian" of the data, then a different set of rights would apply. The next problem in the legalities of cloud computing is the problem of legal ownership of the data. Many Terms of Service agreements are silent on the question of ownership.[93]

These legal issues are not confined to the time period in which the cloud based application is actively being used. There must also be consideration for what happens when the provider-customer relationship ends. In most cases, this event will be addressed before an application is deployed to the cloud. However, in the case of provider insolvencies or bankruptcy the state of the data may become blurred.[91]

Vendor lock-in[edit]

Because cloud computing is still relatively new, standards are still being developed.[94] Many cloud platforms and services are proprietary, meaning that they are built on the specific standards, tools and protocols developed by a particular vendor for its particular cloud offering.[94] This can make migrating off a proprietary cloud platform prohibitively complicated and expensive.[94]

Three types of vendor lock-in can occur with cloud computing:[95]

  • Platform lock-in: cloud services tend to be built on one of several possible virtualization platforms, for example VMware or Xen. Migrating from a cloud provider using one platform to a cloud provider using a different platform could be very complicated.
  • Data lock-in: since the cloud is still new, standards of ownership, i.e. who actually owns the data once it lives on a cloud platform, are not yet developed, which could make it complicated if cloud computing users ever decide to move data off of a cloud vendor's platform.
  • Tools lock-in: if tools built to manage a cloud environment are not compatible with different kinds of both virtual and physical infrastructure, those tools will only be able to manage data or apps that live in the vendor's particular cloud environment.

Heterogeneous cloud computing is described as a type of cloud environment that prevents vendor lock-in and aligns with enterprise data centers that operate hybrid cloud models.[96] The absence of vendor lock-in lets cloud administrators select their choice of hypervisor for specific tasks, or deploy virtualized infrastructures to other enterprises without needing to consider the flavor of hypervisor in the other enterprise.[97]

A heterogeneous cloud is considered one that includes on-premises private clouds, public clouds and software-as-a-service clouds. Heterogeneous clouds can work with environments that are not virtualized, such as traditional data centers.[98] Heterogeneous clouds also allow the use of components, such as hypervisors, servers, and storage, from multiple vendors.[99]

Cloud components, such as cloud storage systems, offer APIs, but these are often incompatible with each other.[100] The result is complicated migration between backends and difficulty integrating data spread across various locations.[100] This has been described as a problem of vendor lock-in.[100] The proposed solution is for clouds to adopt common standards.[100]
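Until common standards exist, the usual mitigation is an adapter layer that hides each vendor's API behind one common interface. The sketch below is purely illustrative: `VendorA`, `VendorB` and their method names are hypothetical stand-ins for real, mutually incompatible storage APIs:

```python
# A sketch of hiding incompatible per-vendor storage APIs behind a common
# interface. Swapping backends then means swapping adapters, not rewriting
# application code. All vendor names and methods are hypothetical.

class VendorA:
    def __init__(self): self._data = {}
    def put_blob(self, key, value): self._data[key] = value
    def fetch_blob(self, key): return self._data[key]

class VendorB:  # same capability, different (incompatible) method names
    def __init__(self): self._store = {}
    def upload(self, name, payload): self._store[name] = payload
    def download(self, name): return self._store[name]

class CloudStorage:
    """Uniform write/read interface over either backend."""
    def __init__(self, backend):
        self.backend = backend
    def write(self, key, value):
        if isinstance(self.backend, VendorA):
            self.backend.put_blob(key, value)
        else:
            self.backend.upload(key, value)
    def read(self, key):
        if isinstance(self.backend, VendorA):
            return self.backend.fetch_blob(key)
        return self.backend.download(key)
```

Application code written against `CloudStorage` can migrate between backends without change, which is the lock-in problem that common API standards aim to solve at the ecosystem level.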

Heterogeneous cloud computing differs from homogeneous clouds, which have been described as those using consistent building blocks supplied by a single vendor.[101] Intel General Manager of high-density computing, Jason Waxman, is quoted as saying that a homogeneous system of 15,000 servers would cost $6 million more in capital expenditure and use 1 megawatt of power.[101]

Open source[edit]

Open-source software has provided the foundation for many cloud computing implementations, prominent examples being the Hadoop framework[102] and VMware's Cloud Foundry.[103] In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to run over a network.[104]

Open standards[edit]

Most cloud providers expose APIs that are typically well-documented (often under a Creative Commons license[105]) but also unique to their implementation and thus not interoperable. Some vendors have adopted others' APIs, and a number of open standards are under development with a view to delivering interoperability and portability.[106] As of November 2012, the open standard with the broadest industry support is probably OpenStack, founded in 2010 by NASA and Rackspace, and now governed by the OpenStack Foundation.[107] OpenStack supporters include AMD, Intel, Canonical, SUSE Linux, Red Hat, Cisco, Dell, HP, IBM, Yahoo and now VMware.[108]

Security[edit]

As cloud computing is achieving increased popularity, concerns are being voiced about the security issues introduced through adoption of this new model. The effectiveness and efficiency of traditional protection mechanisms are being reconsidered as the characteristics of this innovative deployment model can differ widely from those of traditional architectures.[109] An alternative perspective on the topic of cloud security is that this is but another, although quite broad, case of "applied security" and that similar security principles that apply in shared multi-user mainframe security models apply with cloud security.[110]

The relative security of cloud computing services is a contentious issue that may be delaying its adoption.[111] Physical control of private cloud equipment is more secure than having the equipment off site and under someone else's control, since physical control and the ability to visually inspect data links and access ports are required to ensure data links are not compromised. Issues barring the adoption of cloud computing are due in large part to the private and public sectors' unease surrounding the external management of security-based services; it is the very nature of cloud computing-based services, private or public, that they promote external management of the provided services. This gives cloud computing service providers a strong incentive to prioritize building and maintaining strong management of secure services.[112] Security issues have been categorised into sensitive data access, data segregation, privacy, bug exploitation, recovery, accountability, malicious insiders, management console security, account control, and multi-tenancy issues. Solutions to various cloud security issues vary, from cryptography, particularly public key infrastructure (PKI), to the use of multiple cloud providers, standardisation of APIs, and improved virtual machine support and legal support.[109][113][114]

Cloud computing offers many benefits, but is vulnerable to threats. As cloud computing use increases, it is likely that more criminals will find new ways to exploit system vulnerabilities, and many underlying challenges and risks in cloud computing increase the threat of data compromise. To mitigate the threat, cloud computing stakeholders should invest heavily in risk assessment to ensure that the system encrypts data, establishes a trusted foundation to secure the platform and infrastructure, and builds higher assurance into auditing to strengthen compliance. Security concerns must be addressed to maintain trust in cloud computing technology.[citation needed]

Sustainability[edit]

Although cloud computing is often assumed to be a form of green computing, no published study substantiates this assumption.[115] The environmental effects of cloud computing depend heavily on where the servers are sited: in areas where the climate favors natural cooling and renewable electricity is readily available, the effects will be more moderate. (The same holds true for "traditional" data centers.) Thus countries with favorable conditions, such as Finland,[116] Sweden and Switzerland,[117] are trying to attract cloud computing data centers. Energy efficiency in cloud computing can result from energy-aware scheduling and server consolidation.[118] However, in the case of clouds distributed over data centers with different energy sources, including renewables, a small compromise on energy-consumption reduction can yield a large reduction in carbon footprint.[119]
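The server consolidation mentioned above can be sketched as a bin-packing problem: pack workloads onto as few servers as possible so that idle machines can be powered down. This first-fit sketch is illustrative only; the capacity and load figures are assumptions, and real schedulers also weigh migration cost and energy source:

```python
# A minimal sketch of server consolidation for energy efficiency:
# first-fit packing of workload sizes onto servers of fixed capacity.
# Fewer active servers means less power, cooling and rack space.

def consolidate(loads, capacity):
    """Assign each workload to the first server with room (first-fit)."""
    servers = []  # each entry is the used capacity of one active server
    for load in loads:
        for i, used in enumerate(servers):
            if used + load <= capacity:
                servers[i] += load
                break
        else:
            servers.append(load)  # no room anywhere: power on a new server
    return servers

# Example: six workloads fit on two servers instead of six,
# so four machines can stay powered off.
packed = consolidate([30, 40, 20, 10, 50, 25], capacity=100)
```

First-fit is a simple heuristic; energy-aware schedulers in the literature refine it with factors such as VM migration overhead and per-site carbon intensity.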

Abuse[edit]

As with privately purchased hardware, customers can purchase the services of cloud computing for nefarious purposes. This includes password cracking and launching attacks using the purchased services.[120] In 2009, a banking trojan illegally used the popular Amazon service as a command and control channel that issued software updates and malicious instructions to PCs that were infected by the malware.[121]

IT governance[edit]

The introduction of cloud computing requires an appropriate IT governance model to ensure a secured computing environment and to comply with all relevant organizational information technology policies.[122][123] As such, organizations need a set of capabilities that are essential when effectively implementing and managing cloud services, including demand management, relationship management, data security management, application lifecycle management, and risk and compliance management.[124] A danger lies with the explosion of companies joining the growth in cloud computing by becoming providers. However, many of the infrastructural and logistical concerns regarding the operation of cloud computing businesses are still unknown. This over-saturation may have ramifications for the industry as a whole.[125]

Consumer end storage[edit]

The increased use of cloud computing could lead to a reduction in demand for high-storage-capacity consumer devices, as cheaper low-storage devices that stream all content via the cloud become more popular.[citation needed] In a Wired article, Jake Gardner explains that while unregulated usage is beneficial for IT and tech moguls like Amazon, the hard-to-trace nature of cloud consumption costs makes it difficult for businesses to evaluate them and incorporate them into their business plans.[125]

Ambiguity of terminology[edit]

Outside of the information technology and software industry, the term "cloud" can be found to reference a wide range of services, some of which fall under the category of cloud computing, while others do not. The cloud is often used to refer to a product or service that is discovered, accessed and paid for over the Internet, but is not necessarily a computing resource. Examples of services that are sometimes referred to as "the cloud" include, but are not limited to, crowdsourcing, cloud printing, crowdfunding, and cloud manufacturing.[126][127]

Alternatives as normal user[edit]

An alternative to third-party cloud storage is to create your own cloud and keep the data on your own server (KYOD). Many manufacturers of home NAS devices provide this functionality out of the box. If you want to keep your data accessible at all times but do not trust the providers, this is a viable option. The con is that you have to secure the data yourself; the pro is that you know where your data is located.[17]

Research[edit]

Many universities, vendors and government organizations are investing in research around the topic of cloud computing:[128][129]

  • In October 2007, the Academic Cloud Computing Initiative (ACCI) was announced as a multi-university project designed to enhance students' technical knowledge to address the challenges of cloud computing.[130]
  • In April 2009, UC Santa Barbara released the first open source platform-as-a-service, AppScale, which is capable of running Google App Engine applications at scale on a multitude of infrastructures.
  • In April 2009, the St Andrews Cloud Computing Co-laboratory (StACC) was launched, focusing on research in cloud computing. Unique in the UK, StACC aims to become an international centre of excellence for research and teaching in cloud computing, and provides advice and information to businesses interested in cloud-based services.[131]
  • In October 2010, the TClouds (Trustworthy Clouds) project was started, funded by the European Commission's 7th Framework Programme. The project's goal is to research the legal foundations and architectural design needed to build a resilient and trustworthy cloud-of-clouds infrastructure, and to develop a prototype demonstrating its results.[132]
  • In December 2010, the TrustCloud research project [133][134] was started by HP Labs Singapore to address transparency and accountability of cloud computing via detective, data-centric approaches[135] encapsulated in a five-layer TrustCloud Framework. The team identified the need for monitoring data life cycles and transfers in the cloud,[133] leading to the tackling of key cloud computing security issues such as cloud data leakages, cloud accountability and cross-national data transfers in transnational clouds.
  • In July 2011, the High Performance Computing Cloud (HPCCLoud) project was launched to investigate the possibility of enhancing the performance of scientific applications in cloud environments and to develop the HPCCLoud Performance Analysis Toolkit; it was funded by the CIM-Returning Experts Programme under the coordination of Prof. Dr. Shajulin Benedict.
  • In June 2011, the Telecommunications Industry Association developed a Cloud Computing White Paper, to analyze the integration challenges and opportunities between cloud services and traditional U.S. telecommunications standards.[137]
  • In February 2013, the BonFIRE project launched a multi-site cloud experimentation and testing facility. The facility provides transparent access to cloud resources, with the control and observability necessary to engineer future cloud technologies, in a way that is not restricted, for example, by current business models.[138]

Early references in popular culture[edit]

In the 1966 Star Trek episode "Miri," Dr. McCoy, while stationed planetside, uses the computer of the orbiting Enterprise to process the data gathered by his portable equipment.

See also[edit]

References[edit]

  1. ^ a b c d e f g h "The NIST Definition of Cloud Computing". National Institute of Standards and Technology. Retrieved 24 July 2011. 
  2. ^ a b "What is Cloud Computing?". Amazon Web Services. 2013-03-19. Retrieved 2013-03-20. 
  3. ^ "Baburajan, Rajani, "The Rising Cloud Storage Market Opportunity Strengthens Vendors," infoTECH, August 24, 2011". It.tmcnet.com. 2011-08-24. Retrieved 2011-12-02. 
  4. ^ Oestreich, Ken, (2010-11-15). "Converged Infrastructure". CTO Forum. Thectoforum.com. Retrieved 2011-12-02. 
  5. ^ Strachey, Christopher (June 1959). "Time Sharing in Large Fast Computers". Proceedings of the International Conference on Information processing, UNESCO. paper B.2.19: 336–341. 
  6. ^ Simson Garfinkel (3 October 2011). "The Cloud Imperative". Technology Review (MIT). Retrieved 31 May 2013. 
  7. ^ Ryan; Falvey; Merchant (October 2011). "Regulation of the Cloud in India". Journal of Internet Law 15 (4) 
  8. ^ "July, 1993 meeting report from the IP over ATM working group of the IETF". CH: Switch. Retrieved 2010-08-22. 
  9. ^ Corbató, Fernando J. "An Experimental Time-Sharing System". SJCC Proceedings. MIT. Retrieved 3 July 2012. 
  10. ^ a b "Jeff Bezos' Risky Bet". Business Week 
  11. ^ "Amazon's early efforts at cloud computing partly accidental". IT Knowledge Exchange. Tech Target. 2010-06-17 
  12. ^ B Rochwerger, J Caceres, RS Montero, D Breitgand, E Elmroth, A Galis, E Levy, IM Llorente, K Nagin, Y Wolfsthal, E Elmroth, J Caceres, M Ben-Yehuda, W Emmerich, F Galan. "The RESERVOIR Model and Architecture for Open Federated Cloud Computing", IBM Journal of Research and Development, Vol. 53, No. 4. (2009)
  13. ^ D Kyriazis, A Menychtas, G Kousiouris, K Oberle, T Voith, M Boniface, E Oliveros, T Cucinotta, S Berger, "A Real-time Service Oriented Infrastructure", International Conference on Real-Time and Embedded Systems (RTES 2010), Singapore, November 2010
  14. ^ Keep an eye on cloud computing, Amy Schurr, Network World, 2008-07-08, citing the Gartner report, "Cloud Computing Confusion Leads to Opportunity". Retrieved 2009-09-11.
  15. ^ Gartner Says Worldwide IT Spending On Pace to Surpass Trillion in 2008, Gartner, 2008-08-18. Retrieved 2009-09-11.
  16. ^ "Launch of IBM Smarter Computing". Retrieved 1 March 2011. 
  17. ^ Andreas Tolk. 2006. What Comes After the Semantic Web - PADS Implications for the Dynamic Web. 20th Workshop on Principles of Advanced and Distributed Simulation (PADS '06). IEEE Computer Society, Washington, DC, USA
  18. ^ "Cloud Computing: Clash of the clouds". The Economist. 2009-10-15. Retrieved 2009-11-03. 
  19. ^ "Gartner Says Cloud Computing Will Be As Influential As E-business". Gartner. Retrieved 2010-08-22. 
  20. ^ Gruman, Galen (2008-04-07). "What cloud computing really means". InfoWorld. Retrieved 2009-06-02. 
  21. ^ Figure 8, "A network 70 is shown schematically as a cloud", US Patent 5,485,455, column 17, line 22, filed Jan 28, 1994
  22. ^ Figure 1, "the cloud indicated at 49 in Fig. 1.", US Patent 5,790,548, column 5 line 56-57, filed April 18, 1996
  23. ^ a b c HAMDAQA, Mohammad (2012). Cloud Computing Uncovered: A Research Landscape. Elsevier Press. pp. 41–85. ISBN 0-12-396535-7. 
  24. ^ "Distributed Application Architecture". Sun Microsystem. Retrieved 2009-06-16. 
  25. ^ "Sun CTO: Cloud computing is like the mainframe". Itknowledgeexchange.techtarget.com. 2009-03-11. Retrieved 2010-08-22. 
  26. ^ "It's probable that you've misunderstood 'Cloud Computing' until now". TechPluto. Retrieved 2010-09-14. 
  27. ^ Danielson, Krissi (2008-03-26). "Distinguishing Cloud Computing from Utility Computing". Ebizq.net. Retrieved 2010-08-22. 
  28. ^ "Recession Is Good For Cloud Computing – Microsoft Agrees". CloudAve. Retrieved 2010-08-22. 
  29. ^ a b c d "Defining "Cloud Services" and "Cloud Computing"". IDC. 2008-09-23. Retrieved 2010-08-22. 
  30. ^ "e-FISCAL project state of the art repository". 
  31. ^ Farber, Dan (2008-06-25). "The new geek chic: Data centers". CNET News. Retrieved 2010-08-22. 
  32. ^ He, Sijin; L. Guo, Y. Guo, M. Ghanem,. Improving Resource Utilisation in the Cloud Environment Using Multivariate Probabilistic Models. 2012 2012 IEEE 5th International Conference on Cloud Computing (CLOUD). pp. 574–581. doi:10.1109/CLOUD.2012.66. ISBN 978-1-4673-2892-0. 
  33. ^ King, Rachael (2008-08-04). "Cloud Computing: Small Companies Take Flight". Businessweek. Retrieved 2010-08-22. 
  34. ^ Mao, Ming; M. Humphrey (2012). "A Performance Study on the VM Startup Time in the Cloud". Proceedings of 2012 IEEE 5th International Conference on Cloud Computing (Cloud2012): 423. doi:10.1109/CLOUD.2012.103. ISBN 978-1-4673-2892-0. 
  35. ^ He, Sijin; L. Guo, Y. Guo (2011). "Real Time Elastic Cloud Management for Limited Resources". Proceedings of 2011 IEEE 4th International Conference on Cloud Computing (Cloud2011): 622–629. doi:10.1109/CLOUD.2011.47. ISBN 978-0-7695-4460-1. 
  36. ^ "Defining and Measuring Cloud Elasticity". KIT Software Quality Departement. Retrieved 13 August 2011. 
  37. ^ "Economies of Cloud Scale Infrastructure". Cloud Slam 2011. Retrieved 13 May 2011. 
  38. ^ He, Sijin; L. Guo, Y. Guo, C. Wu, M. Ghanem, R. Han. Elastic Application Container: A Lightweight Approach for Cloud Resource Provisioning. 2012 IEEE 26th International Conference on Advanced Information Networking and Applications (AINA). pp. 15–22. doi:10.1109/AINA.2012.74. ISBN 978-1-4673-0714-7. 
  39. ^ "Encrypted Storage and Key Management for the cloud". Cryptoclarity.com. 2009-07-30. Retrieved 2010-08-22. 
  40. ^ Mills, Elinor (2009-01-27). "Cloud computing security forecast: Clear skies". CNET News. Retrieved 2010-08-22. 
  41. ^ David Perera (2012-07-12). "The real obstacle to federal cloud computing". FierceGovernmentIT. Retrieved 2012-12-15. 
  42. ^ "Top 10 Reasons why Startups should Consider Cloud". Cloudstory.in. 2012-09-05. Retrieved 2012-12-15. 
  43. ^ "BMC Service Catalog Enforces Workload Location". eweek.com. 2011-08-02. Retrieved 2013-03-10. 
  44. ^ "HP's Turn-Key Private Cloud - Application Development Trends". Adtmag.com. 2010-08-30. Retrieved 2012-12-15. 
  45. ^ a b c Babcock, Charles (2011-06-03). "RightScale Launches App Store For Infrastructure - Cloud-computing". Informationweek.com. Retrieved 2012-12-15. 
  46. ^ a b "Red Hat launches hybrid cloud management software - Open Source". Techworld. 2012-06-06. Retrieved 2012-12-15. 
  47. ^ Brown, Rodney (April 10, 2012). "Spinning up the instant cloud". CloudEcosystem. 
  48. ^ Riglian, Adam (December 1, 2011). "VIP Art Fair picks OpDemand over RightScale for IaaS management". Search Cloud Applications. TechTarget. Retrieved January 25, 2013. 
  49. ^ Samson, Ted (April 10, 2012). "HP advances public cloud as part of ambitious hybrid cloud strategy". InfoWorld. Retrieved 2012-12-14. 
  50. ^ "HP Cloud Maps can ease application automation". SiliconIndia. Retrieved 22 January 2013. 
  51. ^ Voorsluys, William; Broberg, James; Buyya, Rajkumar (February 2011). "Introduction to Cloud Computing". In R. Buyya, J. Broberg, A.Goscinski. Cloud Computing: Principles and Paradigms. New York, USA: Wiley Press. pp. 1–44. ISBN 978-0-470-88799-8. 
  52. ^ "Tony Shan, "Cloud Taxonomy and Ontology"". February 2009. Retrieved 2 February 2009. 
  53. ^ "ITU-T NEWSLOG - CLOUD COMPUTING AND STANDARDIZATION: TECHNICAL REPORTS PUBLISHED". International Telecommunication Union (ITU). Retrieved 16 December 2012. 
  54. ^ Amies, Alex; Sluiman, Harm; Tong, Qiang Guo; Liu, Guo Ning (July 2012). "Infrastructure as a Service Cloud Concepts". Developing and Hosting Applications on the Cloud. IBM Press. ISBN 978-0-13-306684-5. 
  55. ^ Hamdaqa, Mohammad. A Reference Model for Developing Cloud Applications. 
  56. ^ Chou, Timothy. Introduction to Cloud Computing: Business & Technology. 
  57. ^ "HVD: the cloud's silver lining". Intrinsic Technology. Retrieved 30 August 2012. 
  58. ^ a b "ITU Focus Group on Cloud Computing - Part 1". International Telecommunication Union (ITU) TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU. Retrieved 16 December 2012. 
  59. ^ "Cloud computing in Telecommunications". Ericsson. Retrieved 16 December 2012. 
  60. ^ "Network Virtualisation – Opportunities and Challenges". Eurescom. Retrieved 16 December 2012. 
  61. ^ "The role of virtualisation in future network architectures". Change Project. Retrieved 16 December 2012. 
  62. ^ "Is a Private Cloud Really More Secure?". Dell.com. Retrieved 07-11-12. [dead link]
  63. ^ Foley, John. "Private Clouds Take Shape". InformationWeek. Retrieved 2010-08-22. 
  64. ^ Haff, Gordon (2009-01-27). "Just don't call them private clouds". CNET News. Retrieved 2010-08-22. 
  65. ^ "There's No Such Thing As A Private Cloud". InformationWeek. 2010-06-30. Retrieved 2010-08-22. 
  66. ^ Metzler, Jim; Taylor, Steve. (2010-08-23) "Cloud computing: Reality vs. fiction," Network World. [1]
  67. ^ Rouse, Margaret. "Definition: Cloudbursting," May 2011. SearchCloudComputing.com. [2]
  68. ^ Vizard, Michael. "How Cloudbursting 'Rightsizes' the Data Center", (2012-06-21). Slashdot. [3]
  69. ^ Stevens, Alan (June 29, 2011). "When hybrid clouds are a mixed blessing". The Register. Retrieved March 28, 2012. 
  70. ^ "Building GrepTheWeb in the Cloud, Part 1: Cloud Architectures". Developer.amazonwebservices.com. Retrieved 2010-08-22. 
  71. ^ Bernstein, David; Ludvigson, Erik; Sankar, Krishna; Diamond, Steve; Morrow, Monique (2009-05-24). "Blueprint for the Intercloud - Protocols and Formats for Cloud Computing Interoperability". Blueprint for the Intercloud – Protocols and Formats for Cloud Computing Interoperability. IEEE Computer Society. pp. 328–336. doi:10.1109/ICIW.2009.55. ISBN 978-1-4244-3851-8. 
  72. ^ "Kevin Kelly: A Cloudbook for the Cloud". Kk.org. Retrieved 2010-08-22. 
  73. ^ "Intercloud is a global cloud of clouds". Samj.net. 2009-06-22. Retrieved 2010-08-22. 
  74. ^ "Vint Cerf: Despite Its Age, The Internet is Still Filled with Problems". Readwriteweb.com. Retrieved 2010-08-22. 
  75. ^ "SP360: Service Provider: From India to Intercloud". Blogs.cisco.com. Retrieved 2010-08-22. 
  76. ^ Canada (2007-11-29). "Head in the clouds? Welcome to the future". Toronto: Theglobeandmail.com. Retrieved 2010-08-22. 
  77. ^ Challenges & Opportunities for IT partners when transforming or creating a business in the Cloud. compuBase consulting. 2012. p. 77. 
  78. ^ Cauley, Leslie (2006-05-11). "NSA has massive database of Americans' phone calls". USATODAY.com. Retrieved 2010-08-22. 
  79. ^ Winkler, Vic (2011). Securing the Cloud: Cloud Computer Security Techniques and Tactics. Waltham, Massachusetts: Elsevier. p. 60. ISBN 978-1-59749-592-9. 
  80. ^ "Feature Guide: Amazon EC2 Availability Zones". Amazon Web Services. Retrieved 2010-08-22. 
  81. ^ "Cloud Computing Privacy Concerns on Our Doorstep". 
  82. ^ "Cloud Enables New Business Opportunity". 
  83. ^ "FISMA compliance for federal cloud computing on the horizon in 2010". SearchCompliance.com. Retrieved 2010-08-22. 
  84. ^ "Google Apps and Government". Official Google Enterprise Blog. 2009-09-15. Retrieved 2010-08-22. 
  85. ^ "Cloud Hosting is Secure for Take-off: Mosso Enables The Spreadsheet Store, an Online Merchant, to become PCI Compliant". Rackspace. 2009-03-14. Retrieved 2010-08-22. 
  86. ^ "Amazon gets SAS 70 Type II audit stamp, but analysts not satisfied". SearchCloudComputing.com. 2009-11-17. Retrieved 2010-08-22. 
  87. ^ "Assessing Cloud Computing Agreements and Controls". WTN News. Retrieved 2010-08-22. 
  88. ^ "Cloud Certification From Compliance Mandate to Competitive Differentiator". Cloudcor. Retrieved 2011-09-20. 
  89. ^ "How the New EU Rules on Data Export Affect Companies in and Outside the EU | Dr. Thomas Helbing – Kanzlei für Datenschutz-, Online- und IT-Recht". Dr. Thomas Helbing. Retrieved 2010-08-22. 
  90. ^ "FedRAMP". U.S. General Services Administration. 2012-06-13. Retrieved 2012-06-17. 
  91. ^ a b Chambers, Don (July 2010). "Windows Azure: Using Windows Azure's Service Bus to Solve Data Security Issues". Rebus Technologies. Retrieved 2012-12-14. 
  92. ^ Cohn, Cindy; Samuels, Julie (31 October 2012). "Megaupload and the Government's Attack on Cloud Computing". Electronic Frontier Foundation. Retrieved 2012-12-14. 
  93. ^ Maltais, Michelle (26 April 2012). "Who owns your stuff in the cloud?". Los Angeles Times. Retrieved 2012-12-14. 
  94. ^ a b c McKendrick, Joe. (2011-11-20) "Cloud Computing's Vendor Lock-In Problem: Why the Industry is Taking a Step Backward," Forbes.com [4]
  95. ^ Hinkle, Mark. (2010-06-09) "Three cloud lock-in considerations", Zenoss Blog [5]
  96. ^ Staten, James (2012-07-23). "Gelsinger brings the 'H' word to VMware". ZDNet. [6]
  97. ^ Vada, Eirik T. (2012-06-11) "Creating Flexible Heterogeneous Cloud Environments", page 5, Network and System Administration, Oslo University College [7]
  98. ^ Geada, Dave. (June 2, 2011) "The case for the heterogeneous cloud," Cloud Computing Journal [8]
  99. ^ Burns, Paul (2012-01-02). "Cloud Computing in 2012: What's Already Happening". Neovise.[9]
  100. ^ a b c d Livenson, Ilja; Laure, Erwin. (2011) "Towards transparent integration of heterogeneous cloud storage platforms", pages 27–34, KTH Royal Institute of Technology, Stockholm, Sweden. [10]
  101. ^ a b Gannes, Liz. GigaOm, "Structure 2010: Intel vs. the Homogeneous Cloud," June 24, 2010. [11]
  102. ^ Jon Brodkin (July 28, 2008). "Open source fuels growth of cloud computing, software-as-a-service". Network World. Retrieved 2012-12-14. 
  103. ^ "VMware Launches Open Source PaaS Cloud Foundry". Simpler Media Group, Inc. 2011-04-21. Retrieved 2012-12-14. 
  104. ^ "AGPL: Open Source Licensing in a Networked Age". Redmonk.com. 2009-04-15. Retrieved 2010-08-22. 
  105. ^ GoGrid Moves API Specification to Creative Commons[dead link]
  106. ^ "Eucalyptus Completes Amazon Web Services Specs with Latest Release". Ostatic.com. Retrieved 2010-08-22. 
  107. ^ "OpenStack Foundation launches". Infoworld.com. 2012-09-19. Retrieved 2012-11-17. 
  108. ^ "Did OpenStack Let VMware Into The Henhouse?". Informationweek.com. 2012-10-19. Retrieved 2012-11-17. 
  109. ^ a b Zissis, Dimitrios; Lekkas, Dimitrios (2010). "Addressing cloud computing security issues". Future Generation Computer Systems 28 (3): 583. doi:10.1016/j.future.2010.12.006. 
  110. ^ Winkler, Vic (2011). Securing the Cloud: Cloud Computer Security Techniques and Tactics. Waltham, MA USA: Syngress. pp. 187, 189. ISBN 978-1-59749-592-9. 
  111. ^ "Are security issues delaying adoption of cloud computing?". Network World. Retrieved 2010-08-22. 
  112. ^ "Security of virtualization, cloud computing divides IT and security pros". Network World. 2010-02-22. Retrieved 2010-08-22. 
  113. ^ Armbrust, M.; Fox, A.; Griffith, R.; Joseph, A.; Katz, R.; Konwinski, A.; Lee, G.; Patterson, D.; Rabkin, A.; Zaharia, M. (2010). "A view of cloud computing". Communications of the ACM 53 (4): 50–58. doi:10.1145/1721654.1721672. 
  114. ^ Anthes, G. (2010). "Security in the cloud". Communications of the ACM 53 (11): 16. doi:10.1145/1839676.1839683. 
  115. ^ James Urquhart (January 7, 2010). "Cloud computing's green paradox". CNET News. Retrieved March 12, 2010. "... there is some significant evidence that the cloud is encouraging more compute consumption" 
  116. ^ Finland – First Choice for Siting Your Cloud Computing Data Center. Retrieved 4 August 2010.
  117. ^ Swiss Carbon-Neutral Servers Hit the Cloud. Retrieved 4 August 2010.
  118. ^ Berl, Andreas, et al., Energy-Efficient Cloud Computing, The Computer Journal, 2010.
  119. ^ Farrahi Moghaddam, Fereydoun, et al., Low Carbon Virtual Private Clouds, IEEE Cloud 2011.
  120. ^ Alpeyev, Pavel (2011-05-14). "Amazon.com Server Said to Have Been Used in Sony Attack". Bloomberg. Retrieved 2011-08-20. 
  121. ^ Goodin, Dan (2011-05-14). "PlayStation Network hack launched from Amazon EC2". The Register. Retrieved 2012-05-18. 
  122. ^ Hsu, Wen-Hsi L., "Conceptual Framework of Cloud Computing Governance Model - An Education Perspective", IEEE Technology and Engineering Education (ITEE), Vol 7, No 2 (2012) [12]
  123. ^ Stackpole, Beth, "Governance Meets Cloud: Top Misconceptions", InformationWeek, 7 May 2012 [13]
  124. ^ Joha, A and M. Janssen (2012) "Transformation to Cloud Services Sourcing: Required IT Governance Capabilities", ICST Transactions on e-Business 12(7-9) [14]
  125. ^ a b Beware: 7 Sins of Cloud Computing
  126. ^ S. Stonham and S. Nahalkova (2012) "What is the Cloud and how can it help my business?" [15]
  127. ^ S. Stonham and S. Nahalkova (2012), Whitepaper "Tomorrow Belongs to the Agile (PDF)" [16]
  128. ^ "Cloud Net Directory. Retrieved 2010-03-01". Cloudbook.net. Retrieved 2010-08-22. 
  129. ^ "National Science Foundation Awards Millions to Fourteen Universities for Cloud Computing Research". National Science Foundation (NSF) News. Nsf.gov. Retrieved 2011-08-20. 
  130. ^ Rich Miller (2008-05-02). "IBM, Google Team on an Enterprise Cloud". DataCenterKnowledge.com. Retrieved 2010-08-22. 
  131. ^ "StACC - Collaborative Research in Cloud Computing". University of St Andrews department of Computer Science. Retrieved 2012-06-17. 
  132. ^ "Trustworthy Clouds: Privacy and Resilience for Internet-scale Critical Infrastructure". Retrieved 2012-06-17. 
  133. ^ a b Ko, Ryan K. L.; Jagadpramana, Peter; Lee, Bu Sung (2011). "Flogger: A File-centric Logger for Monitoring File Access and Transfers within Cloud Computing Environments". Proceedings of the 10th IEEE International Conference on Trust, Security and Privacy of Computing and Communications (TrustCom-11): 765. doi:10.1109/TrustCom.2011.100. ISBN 978-1-4577-2135-9. 
  134. ^ Ko, Ryan K. L.; Jagadpramana, Peter; Mowbray, Miranda; Pearson, Siani; Kirchberg, Markus; Liang, Qianhui; Lee, Bu Sung (2011). "TrustCloud: A Framework for Accountability and Trust in Cloud Computing". Proceedings of the 2nd IEEE Cloud Forum for Practitioners (IEEE ICFP 2011), Washington DC, USA, July 7–8, 2011. 
  135. ^ Ko, Ryan K. L. Ko; Kirchberg, Markus; Lee, Bu Sung (2011). "From System-Centric Logging to Data-Centric Logging - Accountability, Trust and Security in Cloud Computing". Proceedings of the 1st Defence, Science and Research Conference 2011 - Symposium on Cyber Terrorism, IEEE Computer Society, 3–4 August 2011, Singapore. 
  136. ^ "UTM/UPES-IBM India Collaboration". 2011. 
  137. ^ "Publication Download". Tiaonline.org. Retrieved 2011-12-02. 
  138. ^ "Testbeds for cloud experimentation and testing". Retrieved 2013-04-09. 

External links