Cloud Computing and Main Barriers to Adoption

Although cloud computing offers very attractive advantages to customers and businesses, there are still significant barriers that need to be mitigated. This report describes the major barriers to cloud computing and how they affect its adoption. It also discusses emerging innovations that solve, or are in the process of solving, these obstacles.

Finally, the report discusses where cloud computing sits on the S-curve from our point of view, applies Rogers' factors influencing the speed of adoption to cloud computing, and presents our conclusion.

Key Drivers to Adopting the Cloud

The table below compares the advantages and disadvantages of traditional IT and cloud computing from a customer's perspective (Mather, Kumaraswamy, & Latif, 2009).

Table 1. Cloud computing: a customer's perspective

Dedicated/traditional IT | Cloud computing
High upfront IT investments for new builds | Low upfront IT investments; pay-for-use model
High cost of reliable infrastructure | Reliability built into the cloud architecture
High complexity of IT environment | Modular IT architecture environments

Benefits of the cloud/on-demand model as analysed by IDC (Gens, 2009)

Figure Source: http://blogs.idc.com/ie/?p=730

Main current barriers to adoption of cloud computing

Berkeley researchers identify the top 10 obstacles to, and opportunities for, the adoption and growth of cloud computing (Armbrust, Fox, Griffith, Joseph, Katz, Konwinski, Lee, Patterson, Rabkin, Stoica, & Zaharia, 2009).

Table 2. Top 10 obstacles to the adoption and growth of cloud computing

No. | Obstacle
1 | Availability of Service
2 | Data Lock-In
3 | Data Confidentiality and Auditability
4 | Data Transfer Bottlenecks
5 | Performance Unpredictability
6 | Scalable Storage
7 | Bugs in Large-Scale Distributed Systems
8 | Scaling Quickly
9 | Reputation Fate Sharing
10 | Software Licensing

Top challenges/issues of the cloud/on-demand model as analysed by IDC (Gens, 2009)

Figure Source: http://blogs.idc.com/ie/?p=730

Main barriers addressed in this report

This section explains the major barriers in detail, namely availability of service, security, interoperability, and lock-in, together with opportunities to mitigate them.

Availability of Service

Availability is a “relative measure of the extent that a system can perform its designed function. Relative measure, availability include metrics such as delays, congestion, and loading” (Russell & Yoon, 2009). Availability of service refers to the percentage of time a particular resource or service is in a usable state over a measured time interval. The goal is to ensure that users can access services at any time and from any place.

Problems:

Mather et al. (2009) point out three major current threats to data availability: network-based attacks, the CSP's own availability, and the need for cloud storage customers to ascertain exactly what services their provider is actually offering.

Most cloud service providers do not offer the sought-after “five nines” (i.e. 99.999%) of uptime, and a number of outages have occurred in the past few years, for instance with Amazon's Simple Storage Service (S3). Amazon S3 suffered two outages in 2008: two hours in February and eight hours in August (Armbrust et al., 2009). These outages were highly visible because of the relatively large number of customers that the S3 service supports. In addition, Google Gmail was unavailable for two hours twice in August, and an App Engine programming error caused a partial outage of five hours in June (Kim, 2009).
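To put the “five nines” figure in perspective, the short Python sketch below computes the annual downtime budget implied by several availability targets, and the approximate availability implied by the two reported 2008 S3 outages (roughly ten hours in total); it is a back-of-the-envelope illustration only, not data from the cited sources.

```python
# Back-of-the-envelope sketch: annual downtime budget for a given
# availability target, and the availability implied by the ~10 hours of
# S3 outages reported for 2008 (illustration only, not source data).
HOURS_PER_YEAR = 365 * 24  # 8760

def allowed_downtime_hours(availability_pct: float) -> float:
    """Downtime budget (hours per year) for a given availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99, 99.999):
    print(f"{target:>7}% uptime allows {allowed_downtime_hours(target):6.2f} hours of downtime per year")

s3_outage_hours = 2 + 8   # February + August 2008 outages
s3_availability = (1 - s3_outage_hours / HOURS_PER_YEAR) * 100
print(f"S3 in 2008: approximately {s3_availability:.2f}% availability")
```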

In some cases, data stored in the cloud has actually been lost. For example, in March 2009 the cloud service provider Carbonite Inc. revealed that faulty hardware had caused backup failures two years earlier, resulting in the loss of data for 7,500 customers (Mather et al., 2009).

Opportunities:

Many cloud storage providers do not back up customer data; whether they do is a seemingly simple yet critical question that customers should be asking. According to Armbrust et al. (2009), using multiple cloud computing providers to provide business continuity is a possible solution for high availability, following a “no single source of failure” paradigm. Yet many large clients tend to migrate to cloud computing without a business continuity strategy. A minimal sketch of the multi-provider idea is given below.
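In the sketch, the provider classes are hypothetical stand-ins rather than real SDK clients: every backup object is written to all providers, and it is read back from whichever one is reachable, so a single provider outage does not make the data unavailable.

```python
# Minimal sketch of the "no single source of failure" idea.
# Provider classes are hypothetical stand-ins, not real SDK clients.
from typing import Protocol

class StorageProvider(Protocol):
    name: str
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

def replicated_put(key: str, data: bytes, providers: list[StorageProvider]) -> None:
    for p in providers:
        p.put(key, data)          # write the same object to every provider

def resilient_get(key: str, providers: list[StorageProvider]) -> bytes:
    last_error = None
    for p in providers:
        try:
            return p.get(key)     # first provider that answers wins
        except Exception as err:  # provider outage, timeout, etc.
            last_error = err
    raise RuntimeError(f"all providers failed for {key}") from last_error
```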

The adoption of cloud computing infrastructure and technologies will demand that the provisioning and use of computing resources be service-level agreement (SLA) driven, and it is in this agreement that availability is constrained. SLAs are legal documents or contracts that define policies such as delivery parameters, costs, and other factors. In practice, present SLAs are extremely weak (Mather, 2009).

The tables below show the content of present cloud computing SLAs and describe how SLAs could be standardized (Kandukuri, Ramakrishna, & Rakshit, 2009).

Table 3. Present SLA content

Item | Description
SLA credit claim | Customer must open a sales ticket by sending an email to Sales within 7 days of the purported outage; must include service type, IP address, contact info, and a full description of the service interruption, including logs if applicable.
SLA claims fault | Customers making false or repetitive claims will incur a one-time charge of $50 per incident for such claims.
Public network | CSP (e.g. Server Intellect) guarantees 99.9% uptime on all public network services to customers located in their partner data centers (includes redundant carrier-grade internet backbone connections, advanced intrusion detection systems, denial-of-service mitigation, traffic analysis, and detailed bandwidth graphs).
Private network | CSP guarantees 99.9% uptime on the private network services to customers located in partner data centers (includes access to the secure VPN connection, unlimited bandwidth between servers, limited uploads/downloads to servers, access to contracted services, traffic analysis, and detailed bandwidth graphs).
Redundant infrastructure | CSP guarantees 99.9% uptime on the power & HVAC services to customers located in partner data centers.
Hardware upgrade | CSP guarantees hardware upgrades will commence and complete within 4 hours of scheduled hardware upgrade maintenance windows; failure will result in a waiver of any one-time installation fees.

Table 4. How to standardize SLA

The SLA has to discuss how the following security risks are handled:

Privileged user access

Regulatory compliance


Data location

Data segregation

Recovery

Investigative support

Long-term viability

Sample questions that an SLA should answer:

What are the resources provided to the customer?

How will the resources help the customer? Are there any limitations on the number of resources?

How are bills generated? What are the payment modes?

How are services affected if the customer is late in paying bills? This should cover the grace period and how the customer can restore services after payment once they have been stopped.

What happens if the SLA is not met? How is data handled when the service contract ends, and what data is returned to the company?

What happens if the service contract is withdrawn?

How will data be handled and returned to the company? How does the service use event logs, and who actually has access to the data on the back end?

Who will check the security of cloud providers?

Security at Different Levels

We need security at the following levels:

Server access security

Internet access security

Database access security

Data privacy security

Program access security

Questions

What is data security at the physical layer?

What is data security at the network layer?

What about investigative support?

How safe is data from natural disasters?

How trustworthy is the service provider's encryption scheme?

SLA penalties are usually applied as a service credit and are typically capped at the equivalent of a month’s service fees. Consequently, any SLA penalty will normally be based on the service fee but not on the customer’s lost revenue. Gartner (Leong, 2011) also states that the trend is for cloud computing providers to offer SLAs, but of course they do so by passing the cost of the risk down to the cloud computing platform consumer. It is another issue and cost to consider.
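To illustrate why such penalties are weak compared with a customer's potential losses, the sketch below computes a tiered service credit capped at one month's fee; the tier percentages are illustrative assumptions, not any particular provider's actual SLA terms.

```python
# Hedged sketch of how an SLA service credit is typically computed: the
# credit is a percentage of the MONTHLY fee, tiered by achieved uptime,
# and capped at one month's fees -- it never covers lost revenue.
# The tier boundaries below are illustrative assumptions only.
CREDIT_TIERS = [            # (minimum uptime %, credit as % of monthly fee)
    (99.9, 0),              # SLA met: no credit
    (99.0, 10),
    (95.0, 25),
    (0.0, 100),             # worst case: capped at one month's fee
]

def service_credit(monthly_fee: float, achieved_uptime_pct: float) -> float:
    for min_uptime, credit_pct in CREDIT_TIERS:
        if achieved_uptime_pct >= min_uptime:
            return monthly_fee * credit_pct / 100
    return monthly_fee  # defensive fallback; the 0.0 tier normally catches this

# Example: a $1,000/month service that achieved 98.5% uptime
print(service_credit(1000.0, 98.5))   # -> 250.0, regardless of revenue lost
```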

In conclusion, customers should pay attention to security considerations and to how data is kept in the cloud. Availability concerns have slowed adoption of the cloud-based model; nevertheless, using multiple cloud providers and negotiating effective SLAs can mitigate this barrier.

Security

Main Data Security Requirements:

Security is the number one concern in cloud computing. The four main security requirements are the following:

Confidentiality

Data integrity

Control

Audit

Confidentiality

Problems:

Confidentiality means keeping users' data secret in cloud systems (Minqi, Rong, Wei, Weining, & Aoying, 2010). Many users say, “My sensitive corporate data will never be in the Cloud” (Armbrust et al., 2009), so confidentiality is a major barrier for users adopting cloud systems.

Because cloud computing offerings utilize public networks, applications and systems are vulnerable to more attacks compared with those residing in a private data centre.

Opportunities:

Traditionally, there are two basic approaches to achieving data confidentiality that cloud providers widely adopt: virtual/physical isolation and cryptography. Firstly, to achieve virtual/physical isolation, virtual local area networks (VLANs) and network middleboxes such as firewalls and packet filters should be deployed. For example, Vertica deploys its database on Amazon EC2 and provides a VPN and firewall to secure it (Minqi et al., 2010), as shown in Figure 3.

Figure Vertica Provides VPN and Firewall to Secure Its Database

Secondly, encrypted storage. Encrypting data before placing it in a cloud may be even more secure than keeping unencrypted data in a local data centre; this approach was successfully used by TC3, a healthcare company with access to sensitive patient records and healthcare claims, when moving their HIPAA-compliant application to AWS (Minqi et al., 2010).
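A minimal sketch of this encrypt-before-upload pattern is shown below, using the Python cryptography package; the upload function is a hypothetical placeholder, not a real provider API.

```python
# Minimal sketch of client-side encryption before upload: the key never
# leaves the customer, so the cloud provider only ever stores ciphertext.
# upload_to_cloud() is a hypothetical placeholder, not a real provider API.
from cryptography.fernet import Fernet

def upload_to_cloud(key_name: str, blob: bytes) -> None:
    """Stand-in for a provider's object-storage PUT call."""
    print(f"uploading {len(blob)} encrypted bytes as {key_name}")

key = Fernet.generate_key()          # kept on-premise by the customer
cipher = Fernet(key)

record = b"patient: Jane Doe, claim #12345"
ciphertext = cipher.encrypt(record)  # encrypt BEFORE the data leaves the site
upload_to_cloud("claims/12345", ciphertext)

# Later, after downloading the ciphertext back from the cloud:
assert cipher.decrypt(ciphertext) == record
```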

Data Integrity

Problems:

Data integrity in the cloud system means preserving information integrity, i.e. ensuring data is not lost or modified by unauthorized users.

Opportunities:

One approach to data integrity in cloud computing is the Zetta system, which focuses mainly on data integrity for cloud computing services and follows an idea similar to RAID systems (Minqi et al., 2010). Zetta implements RAIN-6 (Figure 4), guaranteeing availability and integrity even in the event of complete node failure from any cause, whether network, power supply, memory, or disk failure (Zetta Enterprise Data Protection in the Cloud, 2011). A toy illustration of the underlying parity idea is given after Figure 4.


Figure Zetta RAIN-6
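The sketch below shows only the general RAID-style parity idea behind such schemes; it is not Zetta's actual RAIN-6 algorithm (which tolerates more than one simultaneous failure). A lost block is rebuilt by XOR-ing the surviving blocks with the parity block.

```python
# Toy illustration of RAID-style parity (NOT Zetta's actual algorithm):
# data is striped across nodes plus a parity block, so a single lost node
# can be reconstructed by XOR-ing the surviving blocks.
from functools import reduce

def xor_blocks(blocks: list[bytes]) -> bytes:
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

data_blocks = [b"AAAA", b"BBBB", b"CCCC"]      # stripes stored on three nodes
parity = xor_blocks(data_blocks)               # stored on a fourth node

# Simulate losing node 1, then rebuild it from the survivors plus parity.
lost = data_blocks[1]
rebuilt = xor_blocks([data_blocks[0], data_blocks[2], parity])
assert rebuilt == lost
```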

Control

Problems:

Control in the cloud system means regulating the use of the system, including the applications, the infrastructure, and the data.

When personal data is stored in a cloud computing system, which sits on a public network, the data is exposed to more threats. Moreover, search engines might index an individual's data and make it discoverable worldwide. Private data such as personal medical information therefore needs to be controlled in cloud computing systems.

Opportunities:

The cloud computing literature presents Airavat, “a MapReduce-based system which provides strong security and privacy guarantees for distributed computations on sensitive data. The prototype implementation demonstrates the flexibility of Airavat on a wide variety of case studies. The prototype is efficient, with run times on Amazon’s cloud computing infrastructure within 32% of a MapReduce system with no security. All experiments are run on Amazon’s EC2 service on a cluster of 100 machines” (I. Roy).

Audit

Problems:

Audit means observing what has happened in the cloud system.

Opportunities:

From a technical perspective, the capability to audit the entire cloud computing system could be achieved by adding an additional layer above the virtualized operating system, as sketched below.
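In this toy sketch (not how any provider actually implements auditing), a hypothetical storage operation is wrapped so that each call is logged with the caller, the operation, and a timestamp before being passed through.

```python
# Hedged sketch of an audit layer wrapped around a storage operation:
# every call is logged (who, what, when) before it reaches the underlying
# service. This is a toy illustration only.
import functools
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

def audited(op):
    @functools.wraps(op)
    def wrapper(user: str, *args, **kwargs):
        audit_log.info("%s | user=%s | op=%s | args=%s",
                       datetime.now(timezone.utc).isoformat(), user, op.__name__, args)
        return op(user, *args, **kwargs)
    return wrapper

@audited
def read_object(user: str, bucket: str, key: str) -> bytes:
    return b"..."   # stand-in for the real storage back end

read_object("alice", "claims", "12345")   # leaves an audit trail entry
```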

Amazon Web Services is an example of a cloud provider that has successfully completed a Statement on Auditing Standards No. 70 (SAS 70) Type II audit. SAS 70 certifies that a service organization has had an in-depth audit of its controls (AWS Completes SAS70 Type II Audit, 2010).

Another related concern is that the laws of many nations require SaaS providers to keep customer information and copyrighted material within national borders. Similarly, some businesses may not like the ability of a country to gain access to their data via the court system; for example, a European customer might be concerned about using SaaS in the United States given the USA PATRIOT Act (Armbrust et al., 2009).


Service Level Agreement (SLA)

Customers should negotiate with cloud providers to reach an SLA in order to guarantee information security and quality of service. Examples of cloud provider SLAs are the Amazon EC2 SLA and the Windows Azure SLA.

The organisation and certification related to cloud computing security are the Cloud Security Alliance and its cloud certificate, respectively (CSA About, 2009).

Interoperability

Problems:

On-premise applications certainly have interoperability issues, but this is an even greater concern in the cloud. Since enterprises have no control over their cloud infrastructure and platforms, they depend on cloud providers to control their environment. Cloud providers use different languages and supply proprietary data stores, e.g. Google's BigTable, Amazon's Dynamo, and Facebook's Cassandra. Also, scalable data storage is not yet a commodity and is unlikely to become one for a long time, because there is no simple generic solution for distributed data storage (Hofmann & Woods, 2010).

For example, if a company needs to move its contact data from Salesforce CRM to Google's Gmail, there is no interface provided from Salesforce to Gmail or Google Calendar, so the data has to be exported, transported to Google's App Engine, and converted to the Google contact format, as sketched below.
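The sketch below shows the kind of per-record conversion such a migration requires; the field names on both sides are simplified assumptions, not the exact Salesforce export columns or Google Contacts CSV headers.

```python
# Simplified sketch of the per-record conversion a CRM-to-Gmail migration
# requires. Field names on both sides are illustrative only.
import csv
import io

salesforce_contacts = [
    {"FirstName": "Jane", "LastName": "Doe", "Email": "jane@example.com", "Phone": "555-0100"},
]

def to_google_row(sf: dict) -> dict:
    return {
        "Name": f'{sf["FirstName"]} {sf["LastName"]}',
        "E-mail Address": sf["Email"],
        "Phone Number": sf["Phone"],
    }

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Name", "E-mail Address", "Phone Number"])
writer.writeheader()
writer.writerows(to_google_row(c) for c in salesforce_contacts)
print(buf.getvalue())   # CSV ready to import into the target contact store
```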

The interoperability and portability of information between private clouds and public clouds are critical enablers for broad adoption of cloud computing by the enterprise (Mather, 2009). This means companies now have to consider the complexity of integrating cloud services with traditional applications that may reside on premise or in the cloud.

Opportunities:

The solution is to standardize interoperability. Mather (2009) points out that standards act as either an enabler of or a barrier to interoperability, and that they permit maintenance of the integrity and consistency of a company's information and processes.

Data lock-in

Problem:

Most cloud computing APIs are still proprietary to each cloud provider, and there is no active standard for them. It is not simple to extract data from one cloud provider and port it to another, so many organizations worry about the difficulty of moving their data or applications across clouds. Cloud providers may prefer customers to be locked in; however, customers may suffer negative impacts related to hidden costs, service reliability problems, or even the cloud provider closing down.

For example, an online storage service named The Linkup shut down on 8 August 2008 after losing access to approximately 45% of its customer data (Brodkin, 2008). The Linkup's customers were told that the service was no longer available and that they should try another storage site.

Opportunities:

The solution is API standardization. If there were API standards, services and data could be moved easily across multiple clouds, preventing customer data loss when a single cloud provider fails. Unfortunately, there is no API standard at the moment, so organizations must face the huge challenge of proprietary APIs. One practical hedge, sketched below, is for the organization to wrap provider APIs behind its own thin portability layer.
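In this sketch, application code talks only to a common ObjectStore interface, and each provider's proprietary API is wrapped in an adapter; the provider clients and method names are hypothetical, for illustration only.

```python
# Sketch of a thin portability layer: the application depends on
# ObjectStore, and each provider's proprietary API lives in an adapter.
# The provider clients and their method names are hypothetical.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    @abstractmethod
    def save(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def load(self, key: str) -> bytes: ...

class ProviderAAdapter(ObjectStore):
    def __init__(self, client):           # client: hypothetical provider-A SDK
        self._client = client
    def save(self, key, data):
        self._client.put_object(bucket="app-data", name=key, body=data)
    def load(self, key):
        return self._client.get_object(bucket="app-data", name=key)

class ProviderBAdapter(ObjectStore):
    def __init__(self, client):           # client: hypothetical provider-B SDK
        self._client = client
    def save(self, key, data):
        self._client.upload(container="app-data", blob_id=key, payload=data)
    def load(self, key):
        return self._client.download(container="app-data", blob_id=key)

# Application code never imports a provider SDK directly, so switching
# providers means swapping the adapter, not rewriting the application.
def archive_report(store: ObjectStore, report: bytes) -> None:
    store.save("reports/latest", report)
```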


Figure 5 illustrates an example of an API enabler for cloud computing (Mather, 2009).

Emerging innovation to mitigate barriers

Cloud service brokerages and cloud standards are emerging innovations that mitigate the interoperability and lock-in issues.

Cloud Service Brokerage (CSB)

To achieve cloud interoperability, users can either leverage integration technology (e.g., Adeptia, IBM [via the Cast Iron Systems appliance], iWay Software, Jitterbit, and Sesame Software) or use CSBs.

Gartner (Plummer, 2010) defines CSBs as follows: “A cloud services brokerage is a business model in which a company or other entity adds value to one or more (generally public or hybrid, but possibly private) cloud services on behalf of one or more consumers of those services.” A CSB makes it easier for companies to use multiple cloud providers.

Examples of CSBs include the following.

Appirio

GXS

IBM (Cast Iron Cloud)

Tibco Software

Boomi

Alcatel-Lucent

AmberPoint

CommonIT

Layer 7 Technologies

Sonoa Systems

StrikeIron

Vordel

Cloud Standards

Although there is no cloud API standard, standardization efforts are mushrooming and are driven by vendor as well as user communities (Mather, 2009). An example of a cloud standard set by a cloud organization is the Cloud Data Management Interface (CDMI), established by SNIA on 12 April 2010 (Couture, 2010). The table below lists the emerging cloud standards organizations (created by Group D2 for INFO05992).

Table 5. Emerging Cloud Organizations

Standards Organizations

Description and responsibilities

Cloud Security Alliance (CSA)

CSA is a non-profit organization formed to promote the use of best practices for providing security assurance within Cloud Computing, and provide education on the uses of Cloud Computing to help secure all other forms of computing.

Website: https://cloudsecurityalliance.org

Distributed Management Task Force (DMTF) (established OVF)

DMTF is a not-for-profit association of industry members dedicated to promoting enterprise and systems management and interoperability.

Using the recommendations developed by DMTF’s Open Cloud Standards Incubator, the cloud management workgroup (CMWG) is focused on standardizing interactions between cloud environments by developing specifications that deliver architectural semantics and implementation details to achieve interoperable cloud management between service providers and their consumers and developers. Website: http://www.dmtf.org/standards/cloud

Storage Networking Industry Association (SNIA) (established CDMI)

SNIA Mission is to lead the storage industry worldwide in developing and promoting standards, technologies, and educational services to empower organizations in the management of information.

Key Program: Cloud Storage Initiative (CSI) that promotes the adoption of cloud storage as a new delivery model that provides elastic, on-demand storage billed only for what is used.

Website: http://www.snia.org

Open Grid Forum (OGF) (established OCCI)


The Open Cloud Computing Interface comprises a set of open, community-led specifications delivered through the Open Grid Forum. OCCI is a protocol and API for all kinds of management tasks. OCCI was originally initiated to create a remote management API for IaaS-model-based services, allowing for the development of interoperable tools for common tasks including deployment, autonomic scaling, and monitoring. It has since evolved into a flexible API with a strong focus on integration, portability, interoperability, and innovation while still offering a high degree of extensibility. The current release of the Open Cloud Computing Interface is suitable to serve many other models in addition to IaaS, including PaaS and SaaS.

Website: http://occi-wg.org

Open Cloud Consortium (OCC)

The Open Cloud Consortium (OCC) is a member driven organization that develops reference implementations, benchmarks and standards for cloud computing. The OCC operates clouds testbeds, such as the Open Cloud Testbed and the OCC Virtual Network Testbed. The OCC also manages cloud computing infrastructure to support scientific research, such as the Open Science Data Cloud.

Website: http://opencloudconsortium.org/

Organization for the Advancement of Structured Information Standards (OASIS)

OASIS is a not-for-profit consortium that drives the development, convergence and adoption of open standards for the global information society. The consortium produces more Web services standards than any other organization along with standards for security, e-business, and standardization efforts in the public sector and for application-specific markets. Founded in 1993, OASIS has more than 5,000 participants representing over 600 organizations and individual members in 100 countries.

Website: http://www.oasis-open.org/who/

TM Forum

TM Forum is the world’s leading industry association focused on enabling best-in-class IT for service providers in the communications, media and cloud service markets. The Forum provides business-critical industry standards and expertise to enable the creation, delivery and monetization of digital services.

Website: http://www.tmforum.org/browse.aspx

Internet Engineering Task Force (IETF)

The Internet Engineering Task Force (IETF) is a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet. It is open to any interested individual. Website: http://www.ietf.org/

European Telecommunications Standards Institute (ETSI)

The European Telecommunications Standards Institute (ETSI) produces globally-applicable standards for Information and Communications Technologies (ICT), including fixed, mobile, radio, converged, broadcast and internet technologies.

Website: http://www.etsi.org

Object Management Group (OMG)

OMG’s mission is to develop, with our worldwide membership, enterprise integration standards that provide real-world value. OMG is also dedicated to promoting business technology and optimization for innovation through its Business Ecology® Initiative (BEI) program and associated Communities of Practice.

Website: http://www.omg.org/

OpenNebula 2.0 is an example of an open source cloud project.

Figure Source: http://blog.opennebula.org/?p=593

Gartner predicted that a common standard for cloud interoperability with significant industry support will not emerge within the next two years; instead, smart consumers will rely on cloud brokerages rather than waiting for standards to evolve (Plummer, 2010).

Cloud Computing Adoption

Based on its 2008 survey results, IDC (2008) placed cloud computing in the early market (determined by the portion of organizations demonstrating adoption, 15-25%) and predicted that IT cloud service adoption would accelerate into the mainstream market (25-45%) within three years.


Figure Source: IDC (2008) http://blogs.idc.com/ie/?p=205

Applying Rogers’ factors influencing the speed of adoption

The table below compares the characteristics of innovations from Rogers' theory with cloud computing characteristics and obstacles (created by Group D2 for INFO05992).

Table 6. Rogers' characteristics of innovations applied to cloud computing

Characteristic of innovations (Rogers' theory) | Cloud computing characteristics
Relative advantage | Pay per use; increased deployment speed; requires less in-house IT staff or reduces costs; improved OS and document format compatibility; on-demand self-service; rapid elasticity
Compatibility | SaaS offers the same products as existing ones for small and medium-sized companies, for example Google Apps or Salesforce.com
Complexity | No infrastructure to manage; new versions of software regularly updated by the cloud provider
Trialability | Free trials offered by many CSPs
Observability | Advertising, Google Trends, and the Hype Cycle (Gartner, 2010)

Figure: S-curve (Group D2 interpretation)

Image source: http://www.customerthink.com/blog/7_ways_to_sell_social_media_to_your_marketing_department

Conclusion

As Table 6 shows, cloud computing characteristics align with the characteristics of innovations introduced by Rogers' theory: relative advantage, compatibility, complexity, trialability, and observability. These characteristics indicate the likely speed of adoption of an innovation, especially if it possesses relative advantage and compatibility.

According to Rogers' theory, cloud computing should therefore be adopted by business rapidly. In addition, IDC (2008) placed cloud computing in the early adopter phase and predicted that it would move into the early majority phase.

However, our research shows there are still significant barriers hindering businesses from adopting the cloud, such as poor performance, security (i.e. availability of service, data confidentiality, integrity, control, and auditability), interoperability, and data lock-in.

Nonetheless, as Figure 7 shows, from our perspective cloud adoption is still in the early adopter phase and has not yet moved to the early majority because of the barriers mentioned above.

At present there are emerging innovations, both technical and non-technical, to mitigate these barriers. Some aspects of the barriers have been solved, but others remain. Businesses nevertheless continue to adopt cloud computing, tending to leverage virtualization, private clouds, hybrid clouds, and CSBs instead. Several organizations are establishing cloud standards, but at this stage those standards are not yet mature.

Going forward, we assume that once these barriers are solved, the speed of cloud adoption will increase dramatically, and we expect to see cloud computing move toward the early majority and finally be adopted by the mainstream market.
