8
Cloud Auditing and Compliance

Paolina Centonze

Iona College, New Rochelle, NY, USA

8.1 Introduction

The global cloud computing market is expected to exceed $1 trillion by 2024. This is based on the latest research report covering cloud computing products, technologies, and services for the global market by Market Research Media (https://www.marketresearchmedia.com/?p=839).

Companies, industries, and agencies, such as financial services organizations, health organizations, and government agencies, are looking into new cloud‐based auditing research and methodologies to alleviate security, privacy, trust, and forensics challenges.

The rest of this chapter is organized as follows. Section 8.2 discusses the major cloud security problems and explains how auditing can alleviate these issues. Section 8.3 presents the state of the art of cloud auditing and discusses modifications and extensions needed in order to minimize cloud security challenges. Section 8.4 focuses on cloud compliance challenges and describes how national and international organizations are extending and modifying these compliance regulations in order to make them standardized. Section 8.5 discusses future research directions for cloud audits and compliance. Finally, Section 8.6 summarizes the main points and concludes this chapter.

8.2 Background

There is no doubt that cloud computing offerings are transforming the way IT services are delivered and paid for, enabling companies, industries, agencies, and academia to make deep changes in their IT solutions and adapt their business processes.

Figure 8.1 shows a survey, based on data reported by Gartner, Inc. (https://www.gartner.com/newsroom/id/2352816) (NYSE: IT), one of the world's leading IT research and advisory companies, on how global cloud-market spending has changed over the past few years, reaching $222.5 billion by 2015. Figure 8.2 shows a global index survey based on data reported by Cisco Systems, Inc. (https://www.cisco.com/c/en/us/solutions/collateral/service-provider/global-cloud-index-gci/white-paper-c11-738085.html), an American IT multinational corporation, with an overall data center Internet Protocol (IP) traffic compound annual growth rate (CAGR) of 25% from 2012 to 2017.

While cloud services can bring many advantages to business solutions, such as cost reduction, on-demand provisioning mechanisms, and enablement of a pay-per-use business model, security and privacy are still among the top concerns that discourage cloud-service consumers from adopting cloud solutions to the fullest, as shown in Figure 8.3. One dominant characteristic of cloud computing is that parts of an IT infrastructure's trust boundary move to third-party providers. This lack of direct control over cloud consumers' data and computation requires new techniques through which the service provider can guarantee transparency and accountability.

Figure 8.1 Worldwide cloud market forecast in $ billions (Gartner survey, 2014): 68.5 (2010), 131 (2013), 148.8 (2014), and 222.5 (2015).

Figure 8.2 Global data center IP traffic growth, 2012–2017 (Cisco global index survey, 2014).

Figure 8.3 Cloud consumers' issues, percentage of respondents (Gartner survey, 2014): security (87.5), availability (83.3), performance (82.9), on-demand payment (81), lack of interoperability (80.2), bringing back in-house (76.8), and not enough ability to … (76).

According to the Cloud Security Alliance (CSA), a cloud service model comprises seven layers: facility, network, hardware, operating system, middleware, application, and user (https://cloudsecurityalliance.org/topthreats.html). Table 8.1 shows whether the cloud provider or the cloud customer is in control of each layer for every specific deployment model: Software‐as‐a‐Service (SaaS), Platform‐as‐a‐Service (PaaS), and Infrastructure‐as‐a‐Service (IaaS). Choosing the proper cloud deployment model is crucial, since once a model is selected to deliver business solutions, responsibilities are agreed upon and accepted by the party hosting the cloud solution and the subscribers to the services.

Table 8.1 CSA: layers a cloud provider controls (2010).

Cloud service model
Layer | SaaS | PaaS | IaaS
Facility | Provider | Provider | Provider
Network | Provider | Provider | Provider
Hardware | Provider | Provider | Provider
Operating system | Provider | Provider | Provider or customer
Middleware | Provider | Provider or customer | Customer
Application | Provider | Customer | Customer
User | Customer | Customer | Customer

When a cloud consumer subscribes to a particular service-delivery model, that consumer agrees to a certain level of access control over the resources managed by the cloud service provider (CSP). Therefore, when assessing a cloud system, cloud customers must recognize, and be concerned with, the limitations of each service-delivery model. If a specific capability, such as security, trust, traceability, or accountability, is needed but not yet completely provided within a given delivery model, a subscriber has to either negotiate with the service provider for that capability to be fully implemented and deployed, and specify this request clearly in the service-level agreement (SLA), or request a different delivery model that has the desired functionality. Incomplete understanding of the separation of responsibilities may result in false expectations of what a CSP can offer. Security, trust, integrity, and forensics are challenges for classic IT environments, but they are even more complex in cloud environments due to the Cloud's inherent characteristics, such as seamless scalability, ability to share resources, multitenancy, ubiquitous access, on-demand availability, and third-party hosting. Another complication is the fact that the underlying infrastructure is not standard; every CSP implements the underlying infrastructure using different hardware and software systems. Therefore, security-enforcement mechanisms must be adapted to each cloud system. SLA requirements are essential in order to meet the security expectations of cloud services and resources.

Cloud auditing and fast response processes are essential in order to properly and efficiently assess the levels of availability, performance, security, serviceability, and other characteristics of a cloud service. However, this requirement may be hard to meet because the amount of cloud-auditing data stored on the Cloud itself, consisting of client and server logs, network logs, database logs, etc., may be extremely large.
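Because audit logs from clients, servers, networks, and databases can run to enormous volumes, a first practical step is usually a streaming pass that reduces raw log lines to security-relevant aggregates. The sketch below illustrates the idea in Python; the log-line format, event name, and threshold are hypothetical, not those of any particular CSP.

```python
import re
from collections import Counter

# Hypothetical log-line format: "<timestamp> <source-ip> <event>"
LINE = re.compile(r"^(\S+)\s+(\S+)\s+(.+)$")

def flag_suspicious_sources(lines, event="AUTH_FAILURE", threshold=100):
    """Stream over audit-log lines and flag sources whose count of a
    security-relevant event exceeds a threshold, without loading the
    full log set into memory."""
    counts = Counter()
    for line in lines:
        m = LINE.match(line)
        if m and m.group(3).startswith(event):
            counts[m.group(2)] += 1
    return {src: n for src, n in counts.items() if n > threshold}
```

Because the function consumes any iterable of lines, it can be pointed at a file handle or a distributed log stream without buffering the whole data set, which matters at cloud scale.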

Apprenda, a PaaS software-layer cloud-provider company based in the United States, defines cloud federation as "the unionization of software, infrastructure and platform services from disparate networks that can be accessed by a client via the Internet." A federation of cloud resources uses network gateways that connect public or external clouds, private or internal clouds, and/or community clouds, creating a hybrid cloud-computing environment. Since cloud computing may deploy services in federated cloud environments, audit data is collected and stored in distributed environments. It is necessary to properly capture, store, and analyze such data in order to identify and quantify threats and prevent security attacks.

Capturing security‐relevant information and auditing results to determine the existence of security threats in the cloud are still challenging problems. Following are some examples that illustrate the need for cloud audits:

  • A cryptographic attack that hijacked Windows Update went mainstream on the Amazon cloud. A collision attack against the widely used MD5 algorithm took 10 hours, for a cost of only 65 cents (Goodin 2014a).
  • A distributed denial of service (DDoS) attack, performed through the Amazon EC2 control panel, resulted in hosting provider Code Spaces shutting down its business (Goodin 2014b).
  • An attack against Apple's iCloud allowed attackers to steal users' login credentials (Weise 2014).

In 2010, the Cloud Security Alliance released a research document entitled Top Threats to Cloud Computing v1.0 (https://cloudsecurityalliance.org/topthreats.html) in order to assist organizations in making educated risk‐management decisions regarding their cloud adoption strategies. Here is the list of the top cloud computing threats according to that report:

  1. Abuse and nefarious use of cloud computing
  2. Insecure interfaces and APIs
  3. Malicious insiders
  4. Shared technology issues
  5. Data loss or leakage
  6. Account or service hijacking
  7. Unknown risk profile

The Cloud Security Alliance encourages organizations to also use—along with the research document just mentioned—Security Guidance for Critical Areas of Focus in Cloud Computing V3.0, which was last updated in 2011 (https://cloudsecurityalliance.org/guidance/csaguide.v3.0.pdf). In this document, the CSA includes a collection of facts and ideas gathered from over 70 industry experts worldwide.

Understanding how to manage cloud opportunities and security challenges is crucial to business development. Security audits and penetration testing are used in classic IT infrastructures to document a data center's compliance with security best practices and laws. One of the most serious downsides of traditional security auditing is that it only provides a snapshot of an environment's security state at the time of the audit. This may be sufficient for classic IT infrastructures, since they do not change very frequently. However, auditing a cloud environment is a much more complex task, for which traditional security auditing is not adequate, because a cloud system has inherent characteristics that make the auditing process more complicated. Such characteristics include on-demand self-service, broad network access, resource pooling, rapid elasticity, multitenancy, and lack of hardware governance.

When performing cloud auditing, it is necessary to consider the point in time when changes in the underlying infrastructure occur, and to be able to decide whether any such change gives rise to a security gap or an infrastructure misuse. It is also important to have knowledge of the underlying business processes: for example, to automatically infer whether a sudden spike in cloud-service requests reflects true business needs or is instead caused by a hacker misusing the system to perform a denial of service (DoS) attack. In 2012, to increase cloud transparency, the Cloud Research Lab at Furtwangen University, Germany, developed the Security Audit as a Service (SAaaS) architecture for IaaS cloud environments. The SAaaS architecture ensures that a desired security level is reached and maintained within a cloud infrastructure where changes occur very frequently.

The work required of research institutions, government agencies, and academic organizations in order to provide secure cloud computing is still immense, and demands the collaboration and participation of a broad community of stakeholders on a global basis. However, this initiative is encouraging and promising, since it is a step in the right direction: new cloud-security solutions are regularly appearing, enterprises are using the CSA's guidance to engage with CSPs, and a vigorous public dialogue over compliance and trust issues has begun around the world. The most important outcome achieved in the field of cloud computing is that security professionals are now eagerly engaged in securing the future, instead of only focusing on protecting the present. In fact, there is a growing demand for cloud-computing standards, even though such standards may be very complex to integrate with existing infrastructures.
However, the creation and adoption of standards is an important step forward, since it minimizes the differences between cloud implementations and simplifies the enforcement of security and auditing. As shown in Table 8.2, standards organizations and working groups worldwide are producing documentation, guidelines, and specifications to create the foundation of cloud-computing standardization. Details of these cloud-security standardization and compliance efforts are covered in Section 8.4 of this chapter, which also discusses the Federal Risk and Authorization Management Program (FedRAMP), a US government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services (Table 8.3).

Table 8.2 Standards organizations and their nationalities (2014).

Cloud‐related standards organizations Nationality
National Institute of Standards and Technology (NIST) United States
Distributed Management Task Force (DMTF) International
IEEE Standards Association (IEEE‐SA) International
International Telecommunication Union (ITU) International
European Telecommunications Standards Institute (ETSI) European
Organization for the Advancement of Structured Information Standards (OASIS) International
SysAdmin, Audit, Network, Security (SANS) United States
International Organization for Standardization (ISO)/IEC International
Federal Risk and Authorization Management Program (FedRAMP) United States

Table 8.3 Cloud security and auditing publications (2014).

Cloud security and auditing publications (publication title) Organization
Challenging Security Requirements for US Government Cloud Computing Adoption NIST
Cloud Computing Security Reference Architecture NIST
Guide to Security for Full Virtualization Technologies NIST
Security Guidance for Critical Areas of Focus in Cloud Computing CSA
Trusted Cloud Initiative (TCI) Reference Guidelines CSA
Cloud Auditing Data Federation (CADF) Data Format and Interface Definition Spec DMTF
Quick Guide to the FedRAMP Readiness Process FedRAMP
Cloud Security and Compliance: A Primer SANS
A Guide to Virtualization Hardening Guides SANS
Focus Group on Cloud Computing Technical Report CSC
Saving Money Through Cloud Computing OASIS

8.3 Cloud Auditing

This section describes the importance of auditing in solving specific cloud security issues. It highlights the most crucial security issues that need to be considered when deploying a service to a cloud infrastructure. In particular, it explains how auditing methodologies can help alleviate the identified security issues. Furthermore, it discusses how classic auditing needs to change and be extended in order to accommodate the complex characteristics of cloud-computing environments. The different challenges complicating cloud auditing are discussed, and important questions that a cloud audit must answer are raised. Moreover, this section describes the importance of diagnosing vulnerability patterns in cloud audit logs, especially for web-service compositions in cloud service architectures; in these systems, auditable events are not always well defined and processed. This section also gives an overview of the state of the art of cloud auditing for addressing security- and privacy-related issues.

Security-related challenges are the main reason enterprises are hesitant to adopt cloud computing. Therefore, these challenges have become a significant area of research and the object of important economic studies. A precise understanding of the security implications behind a given cloud infrastructure has not always been achievable, since cloud architectures are often based on proprietary hardware and software, which complicates the study and detection of security vulnerabilities. Furthermore, essential and well-defined security terms, such as threat, vulnerability, and risk, are very often not used or understood properly. These terms are crucial when performing risk analysis for creating and deploying a service in a cloud environment. In cloud computing, contrary to classic IT outsourcing, a customer may also rent a certain infrastructure and end up sharing infrastructure resources with other customers; this architecture is known as the multitenant model. Between 2009 and 2010, researchers (Sotto et al. 2010; Chen et al. 2010) and the European Network and Information Security Agency (https://www.enisa.europa.eu/activities/risk-management/files/deliverables/cloud-computing-risk-assessment) identified numerous cloud security and privacy problems, all of which fall into the following two categories:

  • Amplified cloud security problems: Problems already known from traditional, distributed IT environments but amplified through cloud computing attributes
  • Cloud‐specific security problems: Security problems that arise only due to the special characteristics of cloud computing

8.3.1 Amplified Cloud Security Problems

Amplified cloud security problems are those deriving from the underlying technologies upon which cloud computing is heavily built, such as virtualization, web application servers, and multitenant software architectures. This category of problems includes those originating from failing to adhere to well‐known and commonly established security best practices, which are hard or infeasible to implement in a cloud computing environment.

The most common amplified cloud security problems are the following:

  • Misuse of administrator rights and/or activities of malicious insiders: In cloud computing, virtual machines (VMs) are used for managed servers. A CSP is responsible for the underlying host system and always has access to the VMs running on the host through the hypervisor. As of today, it is still hard to detect misuse of administrator rights or the presence of malicious insiders due to a general lack of transparency into the CSP's processes and procedures. This problem may violate core security principles, such as confidentiality, authenticity, authorization, integrity, data protection, accountability, and nonrepudiation.
  • Missing transparency of applied security measures: In traditional IT outsourcing, service providers can prove their security compliance to their customers by showing the use of baseline security measures through, for example, International Organization for Standardization (ISO) 27001 or Payment Card Industry (PCI) Data Security Standard (DSS) certificates. In cloud computing, not every CSP follows these baseline rules, although government agencies and laws around the world are changing in order to require CSPs to follow these regulations. This problem may cause a violation of one or more of the following core security principles: integrity, availability, and data protection. A notable exception is Amazon Web Services (AWS), one of the first CSPs to take global security compliance seriously. More about cloud compliance and regulations is described in Section 8.4 of this chapter.
  • Missing transparency with security incidents: In traditional IT outsourcing, responsibility for security-incident response is transferred to the service provider, which uses experienced personnel (for example, a computer emergency response team, or CERT). When it comes to cloud computing, things are more complicated. In this case, a customer and the CSP have to work together to collect all the user data generated before and during a security incident. Any concern about the cloud hardware and software infrastructure must be associated with the different cloud resources available to the customer and involved in the incident. Today, a standardized procedure is still missing; in fact, cloud offerings available on the market do not provide customers with a transparent view of how security incidents are detected. This problem may affect the following core security principles: data protection, integrity, availability, and nonrepudiation. Section 8.4 of this chapter also explains what the SysAdmin, Audit, Network, Security (SANS) organization is doing to correct this problem, especially when it comes to public cloud services. Section 8.4 also explains how SAaaS can be used for cloud incident detection and as a possible enabler for performing cloud audits while respecting cloud-specific characteristics.
  • Shared-technology issues: In the multitenant cloud-computing model, virtual resources are shared with multiple customers. Furthermore, there may be misconfigured VMs that endanger other resources due to lack of proper isolation. Exploits have already been demonstrated (Kortchorski and Rutkowska 2009). The increasing code complexity in hypervisor software amplifies this threat. Memory-cache isolation is another issue, since some of the underlying components that make up the cloud infrastructure do not lend themselves to offering proper isolation. For example, graphics processing unit (GPU) and central processing unit (CPU) caches were not designed to offer strong isolation properties in a multitenant architecture. As of today, no CSP discloses information on how shared resources are securely wiped before being reassigned to a different customer. Furthermore, cloud consumers get a default source access point to a VM in current IaaS offerings, and it has been proven that this default source access point gives attackers an easier way to break through the isolation of shared resources. The Federal Office for Information Security suggested that using certified Common Criteria (CC) compliant hypervisor software (minimum EAL 4) might alleviate this threat (https://www.bsi.bund.de/EN/Topics/Certification/certification_node.html). This problem may affect the following core security principles: integrity, availability, data protection, confidentiality, authentication, and nonrepudiation.
  • Data life cycle in case of a provider switch or termination: In cloud computing, due to shared usage of resources, this threat is particularly serious. The CSA states that, in the absence of satisfactory rules defined by the CSP, every cloud consumer needs to impose specific rules for when the contract with the CSP ends. Such rules must clearly regulate how customers' data is exported from the Cloud and how the CSP will securely erase that data at the end of the contract. This problem may affect the data protection and confidentiality core security principles.
  • Monitoring service‐level agreements (SLAs): In cloud computing, several multitenant applications running in a virtualized environment need a special technology for monitoring SLAs. New technologies for the hypervisor, virtualized networking, monitoring, etc. must be used. (Patel et al. 2009) proposed a mechanism for managing SLAs in a cloud‐computing environment using the web service‐level agreement (WSLA) framework, developed for SLA monitoring and SLA enforcement in a service‐oriented architecture (SOA). In June 2014, the European Commission (EC) published the Cloud Service Level Agreement Standardization Guidelines (https://ec.europa.eu/digital‐single‐market/en/cloud‐select‐industry‐group‐service‐level‐agreements). More about this standardization is described in Section 8.4 of this chapter. This problem may affect the availability and integrity core security principles.
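As a minimal illustration of the kind of check an SLA monitor performs, the sketch below compares availability measured from periodic up/down probes against an agreed target. This is only a toy model of the idea; real frameworks such as WSLA express far richer metrics, measurement schedules, and penalty terms, and the probe representation and target value here are assumptions for illustration.

```python
def availability(samples):
    """Fraction of monitoring probes (1 = responded, 0 = failed)
    in which the service was available."""
    return sum(samples) / len(samples)

def sla_violated(samples, target=0.999):
    """Compare measured availability against the availability target
    agreed in the SLA; True means the provider missed the target."""
    return availability(samples) < target
```

In practice, such a monitor must itself run in the virtualized environment it observes, which is why the text notes that multitenant, virtualized deployments need special monitoring technology rather than a single external probe.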

8.3.2 Cloud‐Specific Security Problems

Cloud‐specific security problems are those that are inherent to the Cloud itself and its characteristics, and not necessarily inherited from the technologies used in the underlying infrastructure. The most common cloud‐specific security problems are the following:

  • Unclear data location: Today, many cloud customers do not know in which country their data is saved or processed. Perhaps the only exception is AWS, which exposes to consumers the approximate cloud data center's continental location in which their data is stored (for example, the AWS data center in Northern Ireland). As of today, there is no way to know whether a customer's data has been outsourced by a cloud provider. (Tiwana et al. 2010) proposed a solution to expose the network location of data to applications. There are only a few specific acts and laws related to protecting users' data. Among these is Germany's Data Protection Act, §11 (Mell and Grance 2009; German Parliament 2009). Section 8.4 talks more about different global data‐location compliance rules. The uncertainty related to the geographical location in which a consumer's data is stored may affect the following core security principles: data protection, confidentiality, and availability.
  • Abuse and nefarious use of cloud resources: Cloud computing offers many attractive characteristics, among them easy, fast access to many virtual supercomputing machines. Unfortunately, these characteristics also attract malicious users, who find the high-performance computing infrastructure of a modern cloud system the right platform for attacking other systems. For example, in 2011, malicious users took advantage of AWS and used it to host the Zeus botnet, a banking Trojan that steals financial credentials. Cloud hackers can also easily aggregate as many VMs as they need to perform DDoS attacks on a single CSP, which can also affect the cloud consumer. This problem may affect the availability and confidentiality core security principles.
  • Missing monitoring: If cloud consumers' data, especially personal data, is at risk of fraud or loss of integrity, it is essential that a CSP be able to detect these risks and eliminate them. As of today, it is not trivial for a CSP to use an information-policy system that automatically detects security issues and informs customers. When designing a risk analysis for running a service on a cloud, it is important to include data-protection measures to secure the cloud environment, such as antivirus software and intrusion detection systems (IDSs), as well as measures for DoS detection and prevention. For large IT environments, the best practice for monitoring against these types of attacks is to run IDSs with distributed sensors as input feeds. However, this solution is not sufficient or flexible enough for cloud-computing environments, due to their inherent complexity and the dynamic changes driven by users. (Meng et al. 2012) published a work on reliable state monitoring in cloud data centers for better monitoring of such complex and dynamic environments. (Doelitzscher et al. 2012) introduced SAaaS as a cloud incident-detection system built upon intelligent autonomous agents that are aware of the underlying business-driven intercommunicating cloud services. This problem may affect the following core security principles: nonrepudiation, availability, data protection, and confidentiality.
  • Unsecure APIs: A CSP offers application programming interfaces (APIs) to cloud consumers in order for them to deploy, control, and manage their cloud resources. Therefore, it is crucial for these APIs to provide access control, encryption, and activity monitoring in order to protect against both malicious and unintentional attempts to bypass a security policy. For instance, a load‐balancing service is a complex architectural layer that can be inserted into a system to improve the performance of a service and increase its availability. When such a complex component is used, it is necessary to carefully examine the overall system in order to make sure that no security holes are made available to attackers. To prevent attacks from being mounted against the system through the use of unsecure APIs, standardized protocols and measurements for secure software development are available. These include Microsoft Secure Development Lifecycle (SDL) and the Open Web Application Security Project (OWASP) Software Assurance Maturity Model (SAMM). The inadvertent use of unsecure APIs is a problem that may violate the following core security principles: confidentiality, integrity, availability, nonrepudiation, data protection, and accountability.
  • Missing monitoring of cloud scalability: Scalability is one of the most attractive characteristics of cloud computing. For this reason, it is essential to be able to deal with service usage peaks: for instance, if there is a new update of a popular software program, and a huge number of downloads is expected. Usually, peaks are predictable and confined to specific time frames. Therefore, cloud‐application engineers design their solutions to initiate new instances if a certain threshold is reached to provide service availability. However, this raises two new challenges for cloud security:
    1. Scaling driven by IaaS business: Since a cloud consumer's infrastructure can vary quickly (for example, growing and shrinking as in a peak scenario), a monitoring system must track all peak events and the defined scalability thresholds.
    2. Scaling driven by IaaS attacks: A cloud attacker can force the creation of new cloud instances up to the maximum scalability threshold, which could enable, for example, the distribution of malicious software.

These issues may violate the availability and accountability core security principles.
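The distinction between business-driven and attack-driven scaling drawn above ultimately reduces to watching the instance-creation rate against the defined scalability threshold. The following sketch flags bursts that warrant an audit; the sliding-window length and per-window limit are illustrative assumptions, not any provider's actual mechanism.

```python
from collections import deque

def scaling_alarm(creation_times, window=60.0, max_per_window=20):
    """Given instance-creation timestamps (in seconds), return True if
    more instances were created within any sliding window than the
    agreed scalability threshold allows -- a signal to audit whether
    the burst reflects a real business peak or an attack-driven
    scale-up."""
    recent = deque()
    for t in sorted(creation_times):
        recent.append(t)
        # Drop creations that fell out of the sliding window.
        while recent and recent[0] <= t - window:
            recent.popleft()
        if len(recent) > max_per_window:
            return True
    return False
```

An alarm from such a monitor does not by itself distinguish the two cases; it marks the point in time at which the audit trail should be examined against the underlying business process, as the text recommends.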

  • Missing interoperability of cloud providers: Today, CSPs are often incompatible with each other, particularly because each CSP uses customized VM formats and proprietary APIs. This may increase the risk of data lock-in, a phenomenon that prevents data portability across different CSPs. (Loutas et al. 2013) wrote a comprehensive and systematic survey of cloud-computing interoperability efforts by standardization groups, industrial organizations, and research communities. More details about this work are described in Section 8.4. There are some initial development projects working on this issue, including the Open Cloud Computing Interface (OCCI), the Open Virtualization Format (OVF), and OpenStack, the open-source cloud software initiated by Rackspace Hosting and the National Aeronautics and Space Administration (NASA). Moreover, a specific strategy must be agreed upon between provider and customer for regulating data formats, perpetuating logic relations, and total costs if a CSP change occurs. This issue may cause a violation of the availability core security principle.
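One concrete defense against the unsecure-API problem discussed above is to require every management-API call to carry a keyed signature, so that calls that bypass the intended access-control path can be rejected and audited. The sketch below illustrates generic HMAC-based request signing; it is not the signing scheme of any specific CSP, and the canonical-request format is an assumption for illustration.

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, path: str, body: bytes) -> str:
    """Compute an HMAC-SHA256 signature over a canonical form of the
    request (method, path, and body), to be sent alongside the call."""
    msg = method.encode() + b"\n" + path.encode() + b"\n" + body
    return hmac.new(secret, msg, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, path: str,
                   body: bytes, signature: str) -> bool:
    """Server side: recompute the signature and compare in constant
    time, so verification itself does not leak timing information."""
    expected = sign_request(secret, method, path, body)
    return hmac.compare_digest(expected, signature)
```

Tampering with any signed component (here, the request body) invalidates the signature, which is what lets the provider log and reject unauthorized management operations.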

8.3.3 Correlation of Cloud Security Issues and Research Efforts

In order to better emphasize the complexity and importance of auditing in cloud computing, this section correlates each of the cloud audit issues with the current state of the art in research. Since a cloud consumer is not exposed to the details of how a CSP governs the underlying cloud infrastructure, the only choice for the consumer is to give complete trust to the CSP and hope the CSP applies compliance regulations and data‐protection laws for protecting confidential and sensitive data. Furthermore, unknown or unclear geographic location of data in the Cloud may cause unexpected issues since different jurisdictions enforce different legislation and compliance requirements when it comes to data governance. Following is a list of some of the current research work to alleviate this problem:

  • (Ries et al. 2011) proposed a new geographic-location approach based on network coordinate systems, and evaluated the accuracy of their solution on the three prevalent cloud service models: IaaS, PaaS, and SaaS. Even though a CSP may use additional measures, such as traffic relaying, to hide the location of the resources stored in the Cloud, a high probability of location disclosure is achieved by means of supervised classification algorithms.
  • (Vaish et al. 2013) published a mechanism that uses remote‐attestation technology of trusted platform modules. A remote‐attestation technique is used to validate the current location of the data, and the generated result is passed to the user verifier. The fact that the trusted platform module is tamperproof provides the basis for the accuracy of the result.
  • (Gondree and Peterson 2013) introduced and analyzed a general framework for authentically binding data to a location while providing strong assurance against CSPs that, either inadvertently or maliciously, attempt to relocate cloud data.
  • (Paladi et al. 2014) proposed a mechanism allowing cloud users to control the geographical location of their data, stored or processed in plaintext on the premises of IaaS CSPs. They used trusted computing principles and remote attestation to establish platform state. They also enabled cloud users to constrain plaintext data exclusively to the jurisdictions they specify, by sealing the decryption keys used to obtain plaintext data to the combination of the cloud host's geographic location and platform state.
  • Cloud storage moves data to remote, centralized data centers, where loss of data integrity can arise. (Sunagar et al. 2014) studied the problem of ensuring the integrity of data storage in the Cloud.
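To illustrate the sealing idea in (Paladi et al. 2014), the sketch below derives a data‐decryption key from the combination of an attested geographic region and a platform measurement, so that a host outside the specified jurisdiction, or in an unexpected platform state, cannot re‐derive it. This is a minimal illustration of the concept, not the authors' actual protocol; all names and values are invented for the example.

```python
import hashlib
import hmac

def seal_data_key(master_secret: bytes, region: str, platform_hash: str) -> bytes:
    """Derive a data-decryption key bound to a jurisdiction and platform state.

    Only a host whose attested region and platform measurement match the
    values used at sealing time can re-derive the same key.
    """
    context = f"{region}|{platform_hash}".encode()
    return hmac.new(master_secret, context, hashlib.sha256).digest()

# Sealing: the user binds the key to the EU jurisdiction and a known-good
# platform measurement (both values are illustrative).
master = b"user-master-secret"
good_measurement = hashlib.sha256(b"trusted-hypervisor-image").hexdigest()
sealed_key = seal_data_key(master, "EU", good_measurement)

# Unsealing on a host in the wrong jurisdiction yields a different key,
# so ciphertext sealed for the EU cannot be decrypted there.
wrong_key = seal_data_key(master, "US", good_measurement)
assert sealed_key != wrong_key
```

Because both the region and the platform measurement enter the key derivation, any change to either yields an unrelated key, which is the property the sealing mechanism relies on.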

Cloud auditing can also be used to help detect abuse and nefarious use of cloud resources, although this is still a complex task. (Hamza and Omar 2013) explored and investigated the scope and magnitude of this problem, which is one of the most serious security threats in cloud computing. The authors also presented some of the specific attacks related to this threat, since it constitutes a major restriction for moving business to the Cloud. Following are some of the most common attacks that relate to this problem and the related research work that has been done in order to resolve them:

  • Host‐hopping attacks: These attacks can be easily mounted when a CSP has no mechanism to restrict shared access to cloud resources, such as data storage and VMs, and to enforce isolation of different customers or hosts. Failing to enforce customer‐resource separation may facilitate hackers hopping on other hosts, endangering their resources, or interrupting their services, thereby damaging their reputation and impacting their revenue. These types of attacks are particularly common in a public PaaS deployment model, where multiple clients share the same physical machine.
  • Malicious insider and abuse of privileges: Since cloud infrastructures are based on multitenancy and shared resources, there is the risk of unauthorized access to customers' confidential data. This may lead to the exposure, leak, or selling of sensitive customer information. This problem may lead to even graver risks if a CSP does not correctly enforce the foundation security rule known as the principle of least privilege (Bishop 2002). In such cases, malicious users may exploit rights that they were not intended to be granted to subvert the integrity of the system or steal confidential data.
  • Identity theft attacks: Cloud hackers can easily create temporary accounts with CSPs, use cloud resources, and pay only for the usage of those resources. By doing this, they can try to gain access to customer data and sell it, leading to identity theft. This type of attack also occurs when cyber criminals set up a fake cloud and attract users by hosting their sensitive information and providing them with cloud‐based services such as email and web hosting, making it easy to steal customers' identities and financial information.
  • Service‐engine attacks: An attractive characteristic of cloud computing is highly customizable platforms. For example, in the IaaS deployment model, attackers can rent a VM to hack the service engine from the inside, and use it to their advantage. In particular, they can try to escape VM isolation to reach other VMs, steal sensitive business information, and compromise data of other cloud customers.
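The principle of least privilege invoked above can be made concrete with a small sketch: each role carries only the permissions it strictly needs, and every access is checked against that set. The role and permission names below are illustrative.

```python
# Minimal role-based least-privilege check (roles and permissions are illustrative).
ROLE_PERMISSIONS = {
    "storage-reader": {"storage:read"},
    "storage-admin": {"storage:read", "storage:write", "storage:delete"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant an action only if the role explicitly includes it."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("storage-reader", "storage:read")
assert not is_allowed("storage-reader", "storage:delete")  # least privilege holds
```

A malicious insider holding only the "storage-reader" role cannot delete or exfiltrate data through write paths; violating least privilege means granting roles broader sets than their tasks require.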

To address the lack of transparent monitoring of a cloud infrastructure, the Cloud Research Lab at Furtwangen University, Germany, in 2012, started to develop SAaaS, a prototype for incident detection in the Cloud. SAaaS is an audit solution in which techniques of behavioral analysis and anomaly detection are used to distinguish between normal and nefarious use of cloud resources. SAaaS uses intelligent autonomous agents, which support cross‐customer event monitoring within a cloud infrastructure. This work also evaluated which cloud‐specific security problems are addressed by SAaaS. The results show that autonomous agents and behavioral analysis are sufficient to identify cloud‐specific security problems and can create an efficient cloud‐audit system.

(Meng et al. 2012) proposed state monitoring for detecting critical events and abnormalities of cloud‐distributed systems. When it comes to cloud computing, as the scale of the underlying infrastructure grows and the amount of workload consolidation increases in cloud data centers, node failures and performance interferences (in particular, transient ones) become quite common. Therefore, distributed state‐monitoring tasks are very often exposed to broken communication. This can bring about misleading results and cause problems for cloud consumers who heavily rely on state monitoring to perform automatic management tasks, such as autoscaling. This work introduced a new state‐monitoring approach that addresses this challenge by exposing and handling communication dynamics, such as message delay and loss in cloud‐monitoring environments. This methodology delivers two different characteristics. First, it quantitatively approximates the accuracy of monitoring results to capture uncertainties introduced by messaging dynamics. This feature helps users differentiate trustworthy monitoring results from ones that heavily deviate from the truth, yet significantly improves monitoring utilities compared to simple techniques that invalidate all monitoring results generated in the presence of messaging dynamics. Second, it adapts itself to nontransient messaging problems by reconfiguring distributed monitoring algorithms to minimize monitoring errors. This work demonstrates that, even under severe message loss and delay, this new approach consistently improves monitoring precision and, when applied to cloud application auto‐scaling, outperforms existing state‐monitoring techniques.
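A toy version of this idea is sketched below: instead of discarding results when node reports are missing, the monitor bounds the possible aggregate and labels the result accordingly. This only illustrates the uncertainty‐quantification notion; it is not the algorithm of (Meng et al. 2012), and all thresholds and values are invented.

```python
def evaluate_state(reports, expected_nodes, per_node_cap, threshold):
    """Aggregate node reports and quantify uncertainty from missing messages.

    Each missing node could have reported anywhere in [0, per_node_cap], so
    the true aggregate lies in [low, high]; the monitor labels the result
    instead of invalidating it.
    """
    received = sum(reports.values())
    missing = expected_nodes - len(reports)
    low, high = received, received + missing * per_node_cap
    if low > threshold:
        return "alert"          # exceeds the threshold even in the best case
    if high <= threshold:
        return "normal"         # below the threshold even in the worst case
    return "uncertain"          # messaging dynamics make the result ambiguous

# Three of four nodes reported; the missing node can add at most 50 units,
# so the true aggregate lies in [105, 155] around the 120 threshold.
print(evaluate_state({"n1": 40, "n2": 35, "n3": 30}, 4, 50, 120))  # prints "uncertain"
```

The "uncertain" label is what lets users distinguish trustworthy monitoring results from ones distorted by message delay and loss, rather than discarding everything produced during a messaging disruption.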

In distributed systems, the security of the host platform is critical. Platform administrators use security‐automation methodologies, such as those provided by the Security Content Automation Protocol (SCAP) standards, to check that the outsourced platforms are set up correctly and follow security recommendations, e.g. those provided by governmental or industrial organizations. Nevertheless, users of remote platforms must still have confidence in the platform administrators. (Aslam et al. 2013) proposed a remote platform‐evaluation mechanism that can be used by remote platform users or by auditors, to perform frequent platform‐security audits. The authors analyzed the existing SCAP and Trusted Computing Group (TCG) standards for this solution. They also identified shortcomings and suggested ways to integrate these standards. This platform‐security‐evaluation framework uses the combined effort of SCAP and TCG to address the limitations of each technology when used independently.
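The kind of automated configuration audit that SCAP content standardizes can be sketched as a set of baseline rules evaluated against a platform's configuration. The rules below are illustrative examples of hardening recommendations, not actual SCAP checklist content.

```python
# Illustrative baseline rules of the kind security-automation content encodes.
BASELINE = {
    "ssh_root_login": "no",
    "password_min_length": 12,
    "audit_logging": "enabled",
}

def audit_platform(config: dict) -> list:
    """Return the list of baseline rules the platform configuration violates."""
    findings = []
    for key, expected in BASELINE.items():
        actual = config.get(key)
        if actual != expected:
            findings.append(f"{key}: expected {expected!r}, found {actual!r}")
    return findings

report = audit_platform({"ssh_root_login": "yes", "password_min_length": 12})
print(report)  # flags root login and the missing audit-logging setting
```

In the combined SCAP/TCG approach of (Aslam et al. 2013), such configuration checks would additionally be anchored in attested platform measurements, so the auditor need not trust the platform administrator's self‐reported configuration.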

Auditing systems are extremely important to make sure that customer data is properly hosted in the Cloud. (Yu et al. 2014) investigated the possibility of active adversary attacks in three auditing mechanisms for shared data in the Cloud, including two identity‐privacy‐preserving auditing mechanisms named Oruta and Knox, and a distributed‐storage integrity‐auditing mechanism. The authors showed that these schemes become insecure when active adversaries are involved in the cloud storage. In particular, they proved that there are ways for an active adversary to change cloud information without being detected by the auditor. The authors claimed to have found a solution to this weakness without sacrificing the benefits of these mechanisms.

CSPs have control over enforcing cloud security in order to ensure the integrity and confidentiality of their customers' data. Cloud‐computing security infrastructure is an extremely important research area and the subject of a consistent body of work by both the academic and industrial research communities. In a cloud environment, resources are under the control of the CSP, and third‐party auditors must ensure data integrity and confidentiality, particularly when data storage is outsourced. (Sathiskumar and Retnaraj 2014) proposed data‐encryption and proxy‐encryption algorithms for CSPs to enable privacy and integrity of outsourced data in cloud‐computing infrastructures.

A service cloud infrastructure that offers web‐service composition improves the accessibility and flexibility of web services hosted on the cloud. Nevertheless, security challenges exist, which include both vulnerabilities due to classic web‐service communication and new, specific issues carried by intercloud communication. Cloud auditing is a complex task, due to the enormous scale of the system, the noncentralized architecture of the Cloud, and the wide range of security issues that need to be taken into account. Of course, there exist security standards, protocols, and auditing methodologies that provide audit logs, which can be subsequently analyzed, but these logs are not always suitable to reveal the type, location, and impact of the security threats that might have occurred. Assuming a cloud infrastructure that explicitly declares the scope of its audit logs, defines the expected auditable events in the cloud, and provides evidence of potential threats, in 2013, researchers introduced the concept of vulnerability diagnostic trees (VDTs) to formally demonstrate vulnerability patterns across many audit trails created within the cloud service. They accounted for attack scenarios based on the allocation of services to a web‐service composition that provides end‐to‐end client‐request round‐trip messaging.
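The VDT concept can be illustrated as a boolean AND/OR tree whose leaves are auditable events drawn from the audit trails, and whose root evaluates to true when a vulnerability pattern is present. This is a simplified sketch of the idea, not the researchers' formalism; the events in the example pattern are invented.

```python
class VDTNode:
    """A node in a simplified vulnerability diagnostic tree."""
    def __init__(self, op=None, event=None, children=()):
        self.op, self.event, self.children = op, event, list(children)

    def diagnose(self, observed_events: set) -> bool:
        if self.event is not None:                # leaf: an auditable event
            return self.event in observed_events
        results = (c.diagnose(observed_events) for c in self.children)
        return all(results) if self.op == "AND" else any(results)

# Illustrative pattern: credentials were leaked AND either an unusual login
# or a bulk download appears in the audit trail.
tree = VDTNode(op="AND", children=[
    VDTNode(event="credential_leak"),
    VDTNode(op="OR", children=[
        VDTNode(event="unusual_login"),
        VDTNode(event="bulk_download"),
    ]),
])

print(tree.diagnose({"credential_leak", "bulk_download"}))  # prints True
```

Evaluating such trees over the declared auditable events of a cloud service turns raw audit logs into a diagnosis of which vulnerability patterns, and hence which attack scenarios, are supported by the evidence.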

In spite of the large body of research in the area of cloud security and auditing, performed by both the academic and industrial research communities, a collaborative cloud architecture for security and auditing that does not require a third‐party auditor (TPA) has not yet been fully explored. In their cloud‐security research survey, (Waqas et al. 2013) suggested a collaborative cloud architecture that can securely share resources when needed and audit itself without the involvement of a TPA.

8.4 Cloud Compliance

Cloud compliance has been the subject of intense research among industry, academia, and government. Nowadays, cloud security compliance is more important than ever since defining and enforcing security standardization and compliance regulations in clouds with complex architectures—especially those that support cross‐domain services on federated multilevel servers—is not an easy task. In order to properly secure cloud services and resources, as we observed in Section 8.3, cloud auditing is a crucial component that must be in place to meet SLAs and guarantee CSP compliance. This section discusses cloud‐compliance requirements, regulations, acts, laws, and guidelines, and describes the joint effort by IT standards organizations and industries worldwide to meet security‐compliance requirements for cloud computing.

Industry demand for cloud technology and the promising revenues of new information communication technology (ICT) investments have created strong market incentives for cloud computing. Businesses and organizations are moving their solutions and data toward cloud computing for scalability and cost efficiency. Moreover, there is an enormous need for cloud compliance and standardization: research institutions, academic organizations, business industries, and government agencies are now aware of the underlying problems caused by lack of compliance, and more knowledgeable about the security impacts and serious consequences deriving from CSPs not meeting the right security regulations and compliance requirements. As shown in Table 8.2, standards organizations and working groups are making guidelines and specifications publicly available, with the goal of achieving cloud‐computing standardization. (Han et al. 2014), in the book High Performance Cloud Auditing and Applications, include a summary of organizations and documents for standardization. The NIST, CSA, and DMTF CADF Working Group have released important cloud‐computing‐related publications:

  • The NIST cloud‐computing publication includes a comprehensive view of cloud computing, with a particular focus on security and auditing guidelines (Mell and Grance, 2009).
  • NIST has worked on cloud computing to design and advance standards with United States Government (USG) agencies, federal Chief Information Officers (CIOs), private experts, and international bodies, in order to find consensus on cloud‐computing technology and standardization priorities.
  • NIST also published the two‐volume USG Cloud Computing Technology Roadmap document (NIST Special Publication 500‐293) to help in effectively securing cloud computing, with the intent of reducing costs and developing federated cloud‐computing services (Badger et al. 2011). This publication focuses on NIST strategic requirements related to cloud computing. NIST has also established public working groups to meet those requirements by using the expertise of the broad cloud‐computing stakeholder community. The NIST Cloud Computing Security Working Group (NCC‐SWG) is working on some of these requirements, with the specific intent of simplifying secure acceptance of cloud services.
  • The CSA released cloud security guidelines for secure cloud operations.
  • The DMTF CADF cloud‐auditing specifications include rules for standardizing cloud auditing.
  • The aim of the Cloud Security Alliance Trusted Cloud Initiative (TCI, https://cloudsecurityalliance.org/wp‐content/uploads/2011/10/TCI_Whitepaper.pdf) is to promote cloud interoperability, compliance management configurations, and security best practices.
  • The Cloud Security Alliance TCI Reference Architecture (TCI‐RA, https://cloudsecurityalliance.org/media/press‐releases/csa‐launches‐updated‐tci‐reference‐architecture‐research‐website) has been designed to provide procedures and tools that enable security architects, enterprise engineers, and risk‐management professionals to use a common set of solutions and follow the security requirements to implement a secure and trusted cloud.
  • Subsequently, the NCC‐SWG created the NIST Cloud Computing Security Reference Architecture (NCC‐SRA), extending from the NIST Cloud Computing Reference Architecture and the TCI‐RA, to identify the security components for a secure cloud (https://bigdatawg.nist.gov/_uploadfiles/M0007_v1_3376532289.pdf). These security components are carried on the three root domains:
    1. Business Operation Support Service (BOSS)
    2. Information Technology Operation Support (ITOS)
    3. Security and Risk Management (S&RM)
  • The DMTF CADF Working Group suggested the open standards that would allow tenant consumers to manage and audit application security by themselves. It is crucial for CSPs to provide specific audit events, logs, and information reports for each cloud tenant and for each application. In fact, the DMTF CADF Working Group has published the CADF Data Format and Interface Definition Specification in order to allow data‐information sharing and offer the federation of normative audit event data (Rutkowski 2013).
  • In June 2014, an article entitled “FedRAMP to Monitor Cloud Service Providers” was published on the TechTank website (Schaub 2014). Since June 5, 2014, the federal government has required that all CSPs have FedRAMP approval. As explained in Section 8.2, FedRAMP is a federal program initiative to help standardize the security of cloud services. One of its goals is to reduce the time and effort required for independent CSPs to ensure cloud security. According to a 2013 annual report by the General Services Administration (GSA), agencies that use FedRAMP could save 50% on the number of employees and an average of $200,000 in costs. FedRAMP operates under rules similar to those of the Federal Information Security Management Act (FISMA; https://searchsecurity.techtarget.com/definition/Federal‐Information‐Security‐Management‐Act) and helps maintain the security of federal IT systems, applications, and databases. Both FISMA and FedRAMP provide enhanced protection and scrutiny for federal and independent agencies. The FedRAMP readiness process is used to determine a CSP's eligibility for the Joint Authorization Board (JAB) Process Provisional Authorization program. In 2014, FedRAMP published the article “Quick Guide to Readiness Process” with the intent of helping determine a CSP's eligibility (https://www.fedramp.gov/new‐fedramp‐readiness‐assessment‐report‐for‐high‐and‐moderate‐impact‐systems). To be eligible, a CSP must meet the following requirements:
    • Have an understanding of the FISMA and FedRAMP requirements and process
    • Be able to commit the resources needed to complete a FedRAMP assessment
    • Have the ability to implement the FedRAMP control baseline
    • Meet the FedRAMP requirements in documenting the control implementation
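The CADF audit events mentioned above essentially record who performed what action on which resource, with what outcome, so that per‐tenant audit data can be shared and federated in a normative format. The sketch below shows the general shape of such a record; the field values are illustrative, and the actual DMTF specification defines many more attributes.

```python
import json
import uuid
from datetime import datetime, timezone

def make_audit_event(initiator_id, action, target_id, outcome):
    """Build a CADF-style audit event (a simplified sketch of the format)."""
    return {
        "typeURI": "http://schemas.dmtf.org/cloud/audit/1.0/event",
        "eventType": "activity",
        "id": str(uuid.uuid4()),
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "action": action,                  # e.g. "read", "create", "delete"
        "outcome": outcome,                # e.g. "success", "failure", "pending"
        "initiator": {"id": initiator_id},  # who performed the action
        "target": {"id": target_id},        # the resource acted upon
        "observer": {"id": "audit-service"},  # the component that recorded it
    }

event = make_audit_event("tenant-42/alice", "read", "bucket/report.pdf", "success")
print(json.dumps(event, indent=2))
```

Because every event carries an initiator and a target, a CSP can filter the federated event stream per tenant and per application, which is exactly the self‐service audit capability the DMTF CADF Working Group advocates.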

Cloud federation, defined in Section 8.2, is just starting to be taken into consideration by the research community. Cloud federation may generate new security issues that need to be addressed. Cloud‐computing‐related specifications, standards, and implementation technologies are required to establish security, interoperability, and portability in order to support federated cloud computing. Standards organizations, as mentioned previously, have worked jointly with many cloud‐security and auditing working groups to design and create cloud‐computing standards. Moreover, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team, during the summer of 2011, started an effort to investigate the directions of future cloud‐auditing research (https://cps‐vo.org/node/6062). Also in 2011, NIST defined the key characteristics of cloud services: on‐demand self‐service, broad network access, resource pooling, rapid elasticity or expansion, and measured service. Furthermore, clouds allow for virtual back ends, which make cloud computing dynamic: data, applications, and users move between internal and external clouds for different uses. Therefore, dealing with all the security and compliance problems that can arise in such a dynamic environment may be very challenging. In 2010, the SANS Institute released a white paper entitled “Cloud Security and Compliance: A Primer.” The author, Dave Shackleford, stated that since cloud‐security and compliance efforts are so complex to deal with, it is necessary to classify all the issues into three main problem areas that apply to all types of cloud‐computing systems:

  • Mobility and multitenancy
  • Identity and access management
  • Data protection

The SANS Institute released this white paper to help organizations that are starting cloud‐computing programs and give them guidance to keep these problematic cloud‐security compliance areas under control.

Cloud consumers need assurance. They want to see evidence that CSPs have audit mechanisms in place. However, most auditors do not have complete knowledge or the appropriate skills in virtualization or cloud computing, which makes things even harder. CSPs are starting to deliver Statement on Auditing Standards (SAS) 70 type II audit reports, providing evidence of the control measures adopted within their cloud environments, but many security and compliance professionals still think this step is inadequate. Even though SAS 70 type II audit reports have been used for nearly 20 years, the problem with the SAS 70 standard—according to the American Institute of Certified Public Accountants (AICPA)—is that these reports were not designed to be used in this way by service institutions that offer colocation, managed servers, or cloud‐hosting services. In other words, the SAS 70 type II standard is not the right one for cloud computing because it was developed to cover internal controls over financial reporting, not cloud systems. Specifically, a SAS 70 type II audit only checks that the controls and processes a data‐center operator has in place are followed. There are no specific minimum expectations that a data‐center operator must achieve, nor a benchmark to hold data‐center operators accountable to. A data center with excellent controls and processes can claim the same level of audit as a data‐center operator with poor controls and systems. A major misinterpretation of SAS 70 type II audits is that, after completing an audit, a data center becomes “SAS 70 type II certified”; in reality, there is no such official certification. Many service providers that have undergone a SAS 70 type II audit have created their own logos suggesting such a certification by outside auditors, even though none exists.

The International Organization for Standardization (ISO) 27001 and 27002 standards, which provide a more specific and well‐structured framework of best practices, are much better models to adhere to. This is a good reason why, in 2010, the SANS Institute published a guide for virtualized infrastructures, entitled “A Guide to Virtualization Hardening Guides” (https://www.intralinks.com/sites/default/files/file_attach/wp‐sas70ii.pdf). The CSA, along with other virtualization and cloud‐computing security experts, has decided that CSPs should use the ISO 27001 and 27002 standards for auditing and reporting on the state of controls within their cloud infrastructure environments.

The CSA has founded Achieving Governance, Risk Management, and Compliance (GRC), an integrated suite comprising the following four Cloud Security Alliance initiatives (https://cloudsecurityalliance.org/media/news/csa‐releases‐new‐ccm‐caiq‐v3‐0‐1): CloudAudit, Cloud Controls Matrix (CCM), Consensus Assessments Initiative Questionnaire (CAIQ), and Cloud Trust Protocol (CTP), which are described next.

The GRC's goals require appropriate assessment criteria, relevant control objectives, and timely access to necessary supporting data. Whether implementing private, public, or hybrid clouds, the shift to compute as a service presents new challenges across the spectrum of GRC requirements. The CSA GRC stack provides a set of tools for enterprises, CSPs, security‐solution providers, IT auditors, and other key stakeholders to instrument and assess both private and public clouds against industry‐established best practices, standards, and critical compliance requirements.

CloudAudit (cloudaudit.org) is a volunteer cross‐industry initiative that gathers leading experts in cloud, networking, security, audit, assurance, and architecture. The CloudAudit Working Group officially started in January 2010 and has the participation of many of the largest CSPs, integrators, and professionals. In October 2010, CloudAudit officially came under the support of the Cloud Security Alliance (https://cloudsecurityalliance.org/guidance/csaguide.v3.0.pdf). The goal of CloudAudit is to provide a standard interface and namespace that helps CSPs in the areas of automated audit, assertion, assessment, and assurance (A6) for their IaaS, PaaS, and SaaS environments, and to permit authorized consumers of their services to do the same via an open, extensible, secure interface and methodology. CloudAudit offers the technical baseline to enable transparency and trust in private and public cloud systems.

In July 2014, the CSA announced the release of important updates to two de facto industry standards: CCM V3.0.1 and CAIQ V3.0.1 (https://cloudsecurityalliance.org/media/press‐releases/ccm‐caiq‐v3‐0‐1‐soft‐launch). With these two updates, the CSA accomplished a major milestone in the alignment between the Security Guidance for Critical Areas of Focus in Cloud Computing V3, CCM, and CAIQ. Jim Reavis, CEO of CSA, stated the following: “This will allow cloud providers to be more transparent in the baseline assessment process, helping accelerate the implementation process where cloud consumers will be able to make smart, efficient decisions.” The update also maps CAIQ questions to the latest compliance requirements found in CCM V3.0.1. Daniele Catteddu, managing director of CSA EMEA, said, “With the release of the new CCM and CAIQ, we are creating an incredibly efficient and effective process for cloud providers to better demonstrate transparency and improve trust in the cloud, which is the ultimate mission of the CSA.” Specifically, the CSA CAIQ is the first empirical document between a cloud customer and a CSP. It provides a series of yes‐or‐no control‐assertion questions. The CSA CAIQ helps organizations build the necessary assessment processes when engaging with CSPs. It simplifies the distillation of the issues, best practices, and control specifications from the CSA CCM, thereby allowing all the parties involved to quickly understand areas that require more specific discussion between consumer and CSP. The CSA CCM reinforces well‐established information‐security control environments by reducing security threats and vulnerabilities in the Cloud. It also provides standardized security and operational risk management, and works to normalize security expectations, cloud taxonomy and terminology, and security procedures employed in the Cloud.
The foundation of the CCM rests on its customized relationship to other industry standards, regulations, and control frameworks, such as ISO 27001:2013, COBIT 5.0, PCI DSS V3, and the AICPA 2014 Trust Service Principles and Criteria. Furthermore, it augments internal control directions for service‐organization control‐report attestations.

The CTP is the method by which cloud‐service consumers inquire for, and receive information about, the essential elements of transparency as applied by CSPs. The main intention of the CTP is to produce evidence‐based assurance that anything that is asserted as happening in the Cloud is guaranteed to happen as described. This is obtained as an application of digital trust, whose goal is to make cloud consumers more knowledgeable about the underlying infrastructure that constitutes the foundation of the cloud system. When consumers know more about the cloud infrastructure, they tend to feel more confident, which translates into more business and even larger payoffs for the CSP. With the CTP, a CSP gives an instrument to cloud consumers to understand important pieces of information related to the compliance, security, privacy, integrity, and operational‐security history of the services being performed in the Cloud. These extremely important pieces of evidence are known as the elements of transparency, and they convey proof about vital security settings and operational attributes for systems deployed in the Cloud. These transparent pieces of information give cloud consumers the knowledge necessary to make educated decisions about what service processes and data are appropriate to add to the Cloud, and to decide which cloud best lends itself to satisfy the consumers' needs.

Cloud‐compliance‐regulation issues become even more challenging and serious as soon as a cloud consumer uses cloud‐storage or backup infrastructures. For a CSP, it is essential not only to make sure customer data is well protected—especially sensitive and private data—but also to enforce the appropriate integrity and confidentiality mechanisms when customer data is saved or transferred to a third‐party CSP. A CSP must obey relevant laws and industry‐standards regulations. According to the CSA, “support for global data privacy standards and the consumer's bill of rights is definitely increasing.” This statement is based on the Cloud Security Alliance’s Data Protection Heat Index (DPHI) survey report (https://cloudsecurityalliance.org/download/data‐protection‐heat‐index‐survey‐report), which Cisco Systems Inc. funded. In September 2014, this survey examined some of the top complications around data protection and privacy in the Cloud, including data residency, sovereignty, and lawful interception. The survey participants included 40 among “the most influential cloud security leaders” (as defined by the CSA) in the world, from Chief Information Security Officers (CISOs) to professional privacy and legal specialists. Specifically, the survey showed that there is a solid consensus for more global standards to guide the use and protection of private data. For example, the survey showed broad support for the Organisation for Economic Co‐operation and Development (OECD) Privacy Principles, which establish rules for better data privacy standards and protection. For cloud computing, 62% of the participants said that industry implementation of the OECD's data‐collection‐limitation principle adds restrictions on the quantity of personal data that is collected but also on the content of the data itself. 
Moreover, the survey showed that 71% of the participating industrial organizations adhered to the OECD's security‐safeguards principle, which requests “reasonable security safeguards” to counter unauthorized access, disclosure, modification, and damage of personal and private data. Additionally, 73% of respondents indicated that there should be a global consumer's bill of rights for data privacy, while 65% said that the United Nations (UN) should be more involved and take more responsibility in defining and developing such a bill. Nevertheless, Trevor Hughes, president and CEO of the International Association of Privacy Professionals (IAPP), warned that achieving this agreement could be very challenging. Specifically, Hughes said, “The concept of privacy varies greatly according to region, and there are cultural differences that manifest themselves into different international laws. I'm not confident that we'll be able to develop one universal framework that can take all of those concepts and cultural views, and distill that down into one simple framework.” However, Hughes remained optimistic, stating that, as challenging as this may sound, government agencies and research organizations should still work toward an agreement on data‐privacy standards in cloud computing. Quoting his words, “When it comes to private data, I like to say that just because something is legal doesn't mean it's not stupid.” According to Hughes, it is necessary to have standards and frameworks in order to enforce data privacy, since compliance and regulations alone are not enough.

One of the most pressing questions related to cloud computing is how data privacy is defined internationally. At the 2014 Privacy Academia conference, both the IAPP and CSA strongly emphasized that there is an even greater demand for privacy professionals in the enterprise, and for better cooperation with information‐security professionals, in order to help organizations better identify and protect sensitive data. This issue is even more complicated for the Cloud because data in the Cloud is not limited to one region or location. The survey just described showed that participants are divided about how data residency and sovereignty should be enforced. The majority of the respondents agreed that personally identifiable information (PII) must stay within the geographic boundaries of the subject's country. However, during this survey, participants were asked to compare their own country's concept of data residency or sovereignty to that of other regions; 37% responded that their country is more open, 35% that it is more restricted, and 28% that they do not know. Finally, Hughes emphasized the importance of identifying a common language to better define the concepts of data privacy and protection across all countries.

Customers' uncertainty about data security is the number‐one obstacle toward fully adopting cloud computing, followed by compliance, privacy, trust, and legal issues. Data security has always been a major issue in traditional IT infrastructure, but it becomes even more serious in the cloud‐computing environment: data is dispersed across different machines and storage devices, including servers and mobile devices, potentially in different countries with different legislations. The major concerns for data in the Cloud are integrity, confidentiality, availability, and privacy, which are discussed in Sections 8.4.1 through 8.4.4.

8.4.1 Data Integrity

Data integrity is a security principle that establishes that users must not modify data unless they are authorized to do so. For all the cloud‐computing service models—IaaS, PaaS, and SaaS—data integrity is the foundation for providing service. Clouds offer a large number of entry and access points; therefore, enforcing a good authorization mechanism is crucial to maintaining data integrity. It is also important to provide third‐party supervision. In fact, verifying data integrity is a prerequisite to deploying applications securely. (Bowers et al. 2009a) proposed a theoretical framework called Proofs of Retrievability (POR) for verifying remote data integrity by combining error‐correction codes and spot‐checking. (Bowers et al. 2009b) also developed a high‐availability and integrity layer (HAIL) that uses POR to check the storage of data across many different cloud servers. HAIL can also detect duplication of data copies and verify data availability and integrity. (Schiffman et al. 2010) presented a Trusted Platform Module (TPM)–based approach to check data integrity remotely.
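To make the spot‐checking idea behind POR concrete, the following sketch shows a client that keeps per‐block MACs and challenges an untrusted server on randomly chosen blocks. This is a simplified illustration, not the actual protocol of (Bowers et al. 2009a); all names and parameters are assumptions.

```python
import hashlib
import hmac
import secrets

BLOCK_SIZE = 4096

def split_blocks(data: bytes) -> list[bytes]:
    """Split a file into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

class Client:
    """Keeps only a secret key and per-block MACs computed before upload."""
    def __init__(self, data: bytes):
        self.key = secrets.token_bytes(32)
        self.tags = [hmac.new(self.key, b, hashlib.sha256).digest()
                     for b in split_blocks(data)]

    def challenge(self, n_blocks: int, sample: int) -> list[int]:
        """Pick random block indices to spot-check."""
        return secrets.SystemRandom().sample(range(n_blocks), sample)

    def verify(self, index: int, block: bytes) -> bool:
        tag = hmac.new(self.key, block, hashlib.sha256).digest()
        return hmac.compare_digest(self.tags[index], tag)

class Server:
    """Untrusted storage: returns the requested block on challenge."""
    def __init__(self, data: bytes):
        self.blocks = split_blocks(data)

    def respond(self, index: int) -> bytes:
        return self.blocks[index]

data = secrets.token_bytes(64 * BLOCK_SIZE)
client, server = Client(data), Server(data)
indices = client.challenge(len(server.blocks), sample=8)
assert all(client.verify(i, server.respond(i)) for i in indices)

# A corrupted block fails verification when it is eventually sampled.
server.blocks[3] = b"\x00" * BLOCK_SIZE
assert not client.verify(3, server.respond(3))
```

Because the client samples blocks at random, a server that silently drops or corrupts even a small fraction of the stored blocks is detected with high probability after a modest number of challenges, without the client retaining the data itself.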

When it comes to cloud computing, it is important not only to protect cloud consumers' sensitive data through cryptographic routines, but also to protect consumers from malicious behaviors by validating the operations that perform computations on the data. (Venkatesa and Poornima 2012) proposed a novel data‐encoding scheme called layered interleaving, designed for time‐sensitive packet recovery in the presence of sudden data loss. This is a high‐speed data‐recovery scheme with minimal loss probability, which uses a forward‐error‐correction scheme to handle data loss and is highly efficient in recovering data right after sudden losses. In 2013, researchers began designing a cloud‐computing security development life‐cycle model to enforce data safety and minimize consumer data‐exposure risks; its data‐integrity‐verification algorithm eliminates the need for third‐party auditing by protecting static and dynamic data from unauthorized observation, modification, and interference. (Saxena and Dey 2014) proposed a framework to achieve better data‐integrity verification and help users utilize Data‐as‐a‐Service (DaaS) in cloud computing. This framework is partitioned into three platforms: platinum for storing sensitive data, gold for a medium level of security, and silver for nonsensitive data. The authors also designed different algorithms implementing these three platforms. Their results showed that this framework is easy to implement and provides strong security without affecting performance in any significant way.
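The flavor of such client‐side integrity verification can be illustrated with a Merkle hash tree: the client retains only a single root hash and can later re‐verify downloaded data without relying on a third‐party auditor. This is a generic sketch, not the specific algorithm cited above.

```python
import hashlib

def _h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(blocks: list[bytes]) -> bytes:
    """Hash each block, then reduce pairwise until a single root remains."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"alpha", b"bravo", b"charlie", b"delta"]
root = merkle_root(blocks)            # the only value the client must keep

# Any later download can be re-verified against the stored root.
assert merkle_root([b"alpha", b"bravo", b"charlie", b"delta"]) == root
assert merkle_root([b"alpha", b"bravo", b"charlie", b"DELTA"]) != root
```

The client's storage cost is constant regardless of file size, which is what makes schemes of this kind attractive for DaaS settings.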

8.4.2 Data Confidentiality

Cloud computing, among many other offerings, provides high availability and elastic access to resources. Third‐party cloud infrastructures, such as Amazon Elastic Compute Cloud (EC2), are completely transforming the way today's businesses operate. While we all take advantage of the benefits of cloud computing, businesses must realize that there can be serious risks to data security, particularly to data confidentiality, a security principle that establishes that private data cannot be exposed to unauthorized users. Very important factors, such as software bugs, operator errors, and external attacks, can interfere with the confidentiality of sensitive data stored on external clouds, thereby making that data vulnerable to unauthorized access by malicious parties. (Puttaswamy et al. 2011) studied how to improve the confidentiality of application data stored on third‐party computing clouds. They identified all functionally encryptable data—sensitive data that can be encrypted without reducing the functionality of the application on the Cloud. This data will only be stored on the Cloud in encrypted form, accessible exclusively to users with the correct keys. This mechanism protects data confidentiality against unintentional errors and attacks. The authors also described Silverline, a set of tools that automatically (i) recognize all functionally encryptable data in a cloud application, (ii) assign encryption keys to specific data subsets to minimize key‐management complexity while ensuring robustness against key compromise, and (iii) provide transparent data access at the user device while preventing key compromise even from malicious clouds. Via a thorough evaluation, the authors were able to report that numerous web applications heavily use storage and sharing components that do not require raw‐data interpretation. Thus, Silverline can protect the vast majority of data manipulated by such applications, simplify key management, and protect against key compromise. 
These techniques provided an important first step toward simplifying the complex process of incorporating data confidentiality into cloud applications.
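The core idea of identifying functionally encryptable data can be sketched as follows: fields the cloud application only stores and echoes back are encrypted client‐side, while fields the server must compute on remain in the clear. The field classification and the toy stream cipher below are illustrative assumptions, not Silverline's actual implementation, and the cipher is not secure for real use.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy counter-mode stream cipher (illustration only -- not secure)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Fields the cloud application never needs to interpret (only store and
# return) are "functionally encryptable"; fields it computes on are not.
FUNCTIONALLY_ENCRYPTABLE = {"ssn", "notes"}     # assumed classification

def prepare_for_upload(record: dict, key: bytes) -> dict:
    """Encrypt only the functionally encryptable fields before upload."""
    out = {}
    for field, value in record.items():
        if field in FUNCTIONALLY_ENCRYPTABLE:
            nonce = secrets.token_bytes(16)
            out[field] = (nonce, keystream_xor(key, nonce, value.encode()))
        else:
            out[field] = value       # needed in clear for server-side logic
    return out

key = secrets.token_bytes(32)
record = {"user_id": "u17", "ssn": "123-45-6789", "notes": "private"}
stored = prepare_for_upload(record, key)
nonce, ct = stored["ssn"]
assert keystream_xor(key, nonce, ct).decode() == "123-45-6789"
assert stored["user_id"] == "u17"
```

The design point is that the application keeps full functionality on the plaintext fields while the cloud never sees the sensitive ones, so a compromise of the provider exposes only ciphertext.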

Cloud data storage mainly gives small and medium‐sized enterprises (SMEs) the capability to reduce investment in, and maintenance of, storage servers while still providing high data availability. The majority of SMEs now outsource their data to cloud storage services. User data sent to the Cloud is stored in a public cloud environment, where it may be interspersed with other users' data; this leads to data‐protection issues in cloud storage. Thus, if the confidentiality of cloud data is broken, serious losses may occur.

Data confidentiality is one of the most important requirements that a CSP must meet. The most widely used method for ensuring cloud‐storage data protection is encryption. However, encryption by itself does not completely guarantee data protection. (Arockiam and Monikandan 2014) proposed a new approach for achieving efficient cloud‐storage confidentiality. The authors used encryption and obfuscation as two different techniques to protect the data stored in the Cloud, applied depending on the type of data: encryption for strings and symbols, and obfuscation for numeric data types. Combining encryption and obfuscation provides stronger protection against unauthorized users. In addition, (Alomari and Monowar 2014) addressed the problem of portability and secure file sharing in cloud storage. Their work consists of four different components: (i) the encryption/decryption provider (EDP), which performs the cryptographic operations; (ii) a TPA, which traces the EDP operations and audits them; (iii) a key storage provider (KSP), for key management; and (iv) a data storage provider (DSP), which stores user files in an encrypted form. Based on the experimental results, the authors demonstrated that this new scheme ensures data confidentiality and maintains portability and secure sharing of files among users. (Djebaili et al. 2014) listed the latest trends in cloud data outsourcing and the many different threats that may undermine data integrity, availability, and confidentiality unless cloud data centers are properly secured. The authors observed that schemes addressing data integrity, availability, and confidentiality must be in place and must comply with all the security requirements, including high scheme efficiency, stateless verification, unbounded use of queries, and retrievability of data.
However, very important questions remain, particularly how to use these schemes efficiently and how often data should be verified: constantly checking data is a clear waste of resources, but checking only periodically increases risk. The authors attempt to resolve this tricky issue by defining the data‐check problem as a noncooperative game and performing an in‐depth analysis of the Nash equilibrium and the underlying engineering implications. Based on the game‐theoretic analysis, the verifier can anticipate the CSP's behavior; this leads to the identification of the minimum verification‐resource requirement and the optimal strategy for the verifier.
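The encryption‐plus‐obfuscation split proposed by (Arockiam and Monikandan 2014) can be sketched by applying a cipher to string fields and a reversible numeric transform to numeric fields. The specific transforms below are illustrative assumptions, not the authors' algorithms, and are not secure for production use.

```python
import secrets

def obfuscate(n: int, a: int, b: int) -> int:
    """Reversible affine transform for numeric fields (illustration only)."""
    return n * a + b

def deobfuscate(x: int, a: int, b: int) -> int:
    return (x - b) // a

def xor_encrypt(text: str, key: bytes) -> bytes:
    """Toy repeating-key XOR for string fields (illustration only)."""
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(text.encode()))

def xor_decrypt(ct: bytes, key: bytes) -> str:
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(ct)).decode()

key = secrets.token_bytes(16)
a, b = 7919, 104729          # secret obfuscation parameters (assumed values)

# Apply the technique suited to each field's type before upload.
stored_name = xor_encrypt("Alice", key)     # string field -> encryption
stored_balance = obfuscate(250, a, b)       # numeric field -> obfuscation

assert xor_decrypt(stored_name, key) == "Alice"
assert deobfuscate(stored_balance, a, b) == 250
```

The appeal of the split is that obfuscated numbers remain numbers, so the cloud can still store and index them with numeric types, while an attacker without the parameters cannot recover the original values.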

8.4.3 Data Availability

It is essential that an organization is ready to respond in case a disaster occurs. A provider hosting a cloud data center must be prepared for such events and make sure that cloud data becomes available again within seconds. It is crucial that organizations define strategies for protecting and restoring access to data appropriately according to operational needs. Because it is not possible to transfer terabytes (TB) or petabytes (PB) of data through the network in a few seconds right after a disaster has occurred, data must be saved in different locations in order to facilitate access in the event of a failure at a given primary location. Different strategies are used by system architects for backing up data. (Cabot and Pizette 2014) concluded that replicated databases in cloud environments are a cost‐effective alternative for ensuring the availability of data in cloud systems, although some other issues arise, such as data synchronization and privacy. Since the cloud‐computing paradigm is based on the concept of infrastructure shared by multiple tenants, DDoS attacks on a specific target can quickly affect many or even all tenants. By guaranteeing that availability is given the necessary importance, organizations can enable stakeholders to properly assess the risks associated with a specific cloud‐computing model and successfully mitigate those risks, obtaining the advantages of cloud computing while ensuring continuity of functions. Intrusion prevention systems (IPSs) and firewalls are a crucial defense against these attacks, but they miss an important capability: they do not protect the availability of services. Moreover, these technologies can themselves become targets of DDoS attacks.
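The replication strategy described above can be sketched as writing every object to several locations and reading from the first reachable replica whose checksum verifies. This is a simplified illustration of the general idea, not any particular provider's mechanism.

```python
import hashlib

class Replica:
    """A single storage location; may become unreachable after a disaster."""
    def __init__(self):
        self.store: dict[str, bytes] = {}
        self.alive = True

    def put(self, key: str, value: bytes) -> None:
        self.store[key] = value

    def get(self, key: str) -> bytes:
        if not self.alive:
            raise ConnectionError("replica unavailable")
        return self.store[key]

def replicated_put(replicas, key, value):
    """Write to every location so a single-site failure loses nothing."""
    for r in replicas:
        r.put(key, value)
    return hashlib.sha256(value).hexdigest()   # checksum kept by the client

def replicated_get(replicas, key, checksum):
    """Read from the first reachable replica whose data verifies."""
    for r in replicas:
        try:
            value = r.get(key)
        except ConnectionError:
            continue
        if hashlib.sha256(value).hexdigest() == checksum:
            return value
    raise RuntimeError("no verified copy available")

replicas = [Replica(), Replica(), Replica()]
cs = replicated_put(replicas, "report", b"Q3 results")
replicas[0].alive = False                      # primary site goes down
assert replicated_get(replicas, "report", cs) == b"Q3 results"
```

The checksum also addresses the synchronization concern noted by (Cabot and Pizette 2014): a stale or corrupted replica simply fails verification and the read falls through to the next copy.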

8.4.4 Data Privacy

Based on an article published at the end of 2014 on searchcloudsecurity.techtarget.com (Wright 2014), the CSA said that cloud data privacy and the issue of defining authority over data would be the top problems for enterprises in both the United States and abroad for the year 2015. In fact, during that time, a legal battle between the United States government and Microsoft over e‐mails located in an offshore data center in Dublin, Ireland was under way. These issues were caused by differences in data‐privacy regulations among countries: in this case, a U.S. search warrant issued under the USA Patriot Act (http://www.justice.gov/archive/ll/highlights.htm), an act of the U.S. Congress signed into law by President George W. Bush on October 26, 2001, conflicted with Irish data‐protection law. Ireland's Minister for Data Protection made it clear that “when governments seek to obtain customer information in other countries, they need to comply with the local laws in those countries.” The US Congress did not clearly express its intent to put a US business in the difficult position of violating the local laws of the countries where customer data is saved in order to comply with a US search warrant. Instead of using a search warrant in the Microsoft case, the U.S. government should have followed the procedures of the Mutual Legal Assistance Treaty (MLAT) between the US and Ireland to request the necessary information from the Irish government in a way consistent with Ireland's laws. This episode may have a negative impact on cloud investment and, ultimately, on the economy, and it is a perfect illustration of the types of conflicts that may arise in the near future.

Jim Reavis, co‐founder and CEO of the CSA, said that cloud data sovereignty is “probably the number‐one issue” for European enterprises using, or planning to use, cloud services, adding, “We must ensure that individuals and organizations can have confidence in the rules and processes that have been put in place to safeguard privacy.” On April 3, 2014, the Cloud Security Alliance announced the launch of the second version of its Privacy Level Agreement (PLA) Working Group (https://cloudsecurityalliance.org/media/news/csa‐announces‐pla‐v‐2). In an effort to help CSPs and future cloud customers objectively evaluate privacy standards, PLA V2, sponsored by CSA corporate member EMC, aims to provide a clearer and more effective way to communicate to customers the level of data protection offered by a CSP. The PLA Working Group was originally founded in 2012 with the goal of defining compliance baselines for data‐protection legislation as well as establishing best practices for communicating the level of privacy measures (such as data protection and security) that a CSP agrees to follow while hosting third‐party data. The PLA Working Group is composed of independent privacy and data‐protection subject‐matter experts, privacy officers, and representatives from data protection authorities (DPAs). It released three core documents over the course of a year (April 2014–April 2015), as follows:

  • The first document is the PLA V2, with special emphasis on the European Union (EU) market. The first deliverable of the PLA WG was a transparency tool for the EU market. Based on those initial results, the PLA Working Group is creating a compliance tool that will satisfy the requirements expressed by the Article 29 Working Party and by the Code of Conduct currently under development by the European Commission (EC).
  • The second document is the Feasibility Study on Certification/Seal based on the PLA. The group will provide a document assessing the feasibility of a Privacy Certification Module (PCM) in the context of the Open Certification Framework (OCF), and establish a roadmap and guidance for its creation and implementation.
  • The third document is the PLA Outline for the Global Market. The CSA will expand the scope of the PLA V1 by considering relevant privacy legislation outside the EU.

8.4.5 Dataflows

One of the most attractive, yet most dangerous, characteristics of cloud computing is that it promotes the flow and remote storage of data. Within the public cloud deployment model, new data‐sharing applications are encouraging internet users to store their data on the Cloud with no mention of any geographical boundaries, and the relevant CSPs may have their service centers anywhere in the world. In some jurisdictions, there are laws governing the use of personal data, for example by prohibiting the exportation of personal data to countries that have no enforceable data‐protection law. Data transfers from all countries with such national legislation are restricted; these include all the countries in the EU and the European Economic Area (EEA), Argentina, Australia, Canada, Hong Kong, and New Zealand. From EU and EEA countries, personal information can be transferred to countries that have “adequate protection,” namely all other EU and EEA member states plus Switzerland, Canada, Argentina, and Israel. Germany's Data Protection Act, §11, introduced in Section 8.3.2, states that “where other bodies are commissioned to collect, process or use personal data, the responsibility for compliance with the provisions of this Act and with other data protection provisions shall rest with the principal.” The implication of this Act is that users must know the exact location of their data and their cloud providers' court of jurisdiction, and export or movement of data is not possible without prior notification of the customer.
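A CSP could enforce such residency rules with a policy check before any cross‐border transfer. The sketch below encodes the adequacy and notification rules mentioned above in a deliberately simplified form; the country lists are abbreviated for illustration and the real legal rules are far more nuanced.

```python
# Countries the EU/EEA treats as offering "adequate protection" (a subset of
# those named in the text; the real adequacy list is maintained by the EC).
EEA = {"DE", "FR", "IE", "NL"}                     # abbreviated for the sketch
ADEQUATE = EEA | {"CH", "CA", "AR", "IL"}

def transfer_allowed(origin: str, destination: str,
                     customer_notified: bool) -> bool:
    """Policy check a CSP might run before moving personal data abroad."""
    if origin in EEA:
        if destination not in ADEQUATE:
            return False        # no enforceable protection at the target
        if not customer_notified:
            return False        # e.g. German BDSG: notify the customer first
    return True

assert transfer_allowed("DE", "CH", customer_notified=True)
assert not transfer_allowed("DE", "US", customer_notified=True)   # not adequate
assert not transfer_allowed("DE", "CA", customer_notified=False)  # no notice
```

Even a simple gate like this makes the §11 obligation auditable: every denied transfer leaves a concrete, checkable policy decision rather than an implicit operator judgment.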

Transborder dataflow is the transfer of computerized data across national borders. Restricting it is a form of prevention necessary to protect the personal data of a country's citizens, but it has indirectly limited cooperation between countries and affected economic development. (Manap and Rouhani 2014) reported the outcome of a study they conducted: they described the fundamental concepts of free dataflow, showed how it can help the growth of a country's economic power, and argued that unrestricted dataflow allows for more competition in business activities and, therefore, more growth. When dealing internationally in activities such as research and development, design, production, sales, and support services, companies profit from transborder dataflow because they can receive the best services from the best suppliers. Furthermore, transborder dataflow can promote not only the economic growth of a country, but also its political and social development.

Cyberspace technology, including cloud computing, has changed the way data flows. Through global communication, the distribution of information permits efficient management of businesses by allowing better use of resources and services. However, growing public concern about the misuse of personal data has led to the introduction of privacy laws in various countries, with the disadvantage that cross‐border restrictions can negatively impact the choice and quality of products and services offered to consumers around the world. In addition to bans on transborder dataflow, another factor that can adversely affect a country's economy is the presence of different and incompatible standards and regulations for privacy protection from one country to another. Unstandardized data protection has created challenges in dealing with the transfer of data worldwide. Moreover, complicated regulations on cross‐border restrictions lead to lost business prospects, with dramatic effects particularly in developing countries. Legal protection is important to enforce the proper use of data and safeguard the rights and concerns of individuals; however, in order to alleviate the challenges introduced by strict laws, the patchwork of procedures and cross‐border privacy protections should be simplified through the development of standard privacy regulations. These standard regulations should be adopted by corporate sectors in delivering business transactions over the internet. Adopting consistent corporate privacy rules would resolve several difficulties caused by the various personal‐data laws enforced in different countries. In addition, such adoption would allow the administration of data in a more unified and consistent way throughout organizations, independent of where the data may be exported.

8.4.6 The Need for Compliance

Because of the many cloud challenges mentioned so far, cloud customers must understand the terms and conditions of a CSP and which of these terms have to be spelled out in SLAs to maintain cloud compliance. Unfortunately, not all CSPs are in favor of providing detailed security assurances to customers. As Donna Scott, a Gartner Vice President (VP), said during an interview in 2013, “Gartner customers have logged many complaints about weak SLAs lacking the necessary guarantees when it comes to security, confidentiality and transparency.” As John Morency, a Gartner Research VP, stated, “The devil is in the details”: these contracts should be written very clearly, with no ambiguity, and should explicitly indicate which party is responsible for which security operation. Jay Heiser, another Gartner Research VP, said that enterprises very often struggle to explain, when asked, what security controls a CSP can provide to its cloud customers; many CSPs answer with very vague responses or unclear documentation. Heiser said that it is necessary for industries to define common certifications so that enterprises can offer their services more easily, for example by using the US government's FedRAMP initiative, which is one of the closest to a global standard. On June 26, 2014, the EC published the Cloud Service Level Agreement Standardization (Cloud SLAS) Guideline. An SLA is a crucial component of the contractual relationship between a cloud service customer and a CSP. Given the global nature of the Cloud, SLAs usually span many jurisdictions, often with varying applicable legal requirements, in particular with respect to the protection of personal data hosted in the cloud service. In addition, different cloud services and deployment models require different approaches to SLAs, which adds to their complexity.
Moreover, SLA vocabulary, as of today, very often varies from one CSP to another, which makes it even more difficult for customers to compare cloud services. Standardizing the characteristics of SLAs improves clarity and increases the understanding of SLAs for cloud services in the market, in particular by highlighting and providing information on the concepts usually covered by SLAs. These are some of the reasons the EC's cloud‐computing strategy includes the development of standardization guidelines for cloud‐computing SLAs for contracts between cloud service providers and cloud service customers.

In February 2013, the EC Directorate General for Communications Networks, Content, and Technology (DG Connect) set up the Cloud Select Industry Group, Subgroup on Service Level Agreement (C‐SIG‐SLA) to work on these issues. The C‐SIG‐SLA, an industry consortium assisted by DG Connect, has created the Cloud SLA Standardization Guidelines document to provide a set of SLA standardization guidelines for CSPs and professional cloud service customers, while still ensuring that the specific necessities of the European cloud market and industry are taken into account. This initial standardization effort will have the highest impact if standardization of SLAs is done at an international level, rather than at a national or regional level. Taking this into consideration, the C‐SIG‐SLA established a liaison with the ISO Cloud Computing Working Group to provide concrete input and present the European position at the international level.

In 2013, PCI DSS was updated to the next major revision, PCI DSS V3.0 (https://www.pcisecuritystandards.org/minisite/en/pci‐padss‐supporting‐docs‐v31.php). In January 2015, the first set of requirement changes rolled out for PCI DSS. While there are a number of areas where the new requirements could affect the compliance plans of suppliers and service providers, both cloud‐based and traditional, the part that may have the greatest impact on the Cloud concerns situations where the cardholder data environment (CDE) involves cloud technologies. Suppliers are aware that PCI DSS compliance in the Cloud is a challenging topic, and for this reason the PCI Security Standards Council published a document specifically for providers using the Cloud in a PCI context. Three new requirements in PCI DSS V3.0 are particularly relevant to the use of cloud technologies in a PCI‐regulated infrastructure:

  1. Requirement 2.4: “Maintain an inventory of system components that are in scope for PCI DSS.”
  2. Requirement 1.1.3: “[Maintain a] current diagram that shows all cardholder dataflows across systems and networks.”
  3. Requirement 12.8.5: “Maintain information about which PCI DSS requirements are managed by each service provider.”
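Requirements 2.4 and 12.8.5 essentially ask for a machine‐maintainable inventory of in‐scope components and the providers responsible for them. A minimal sketch, with hypothetical component names, might look like this:

```python
from dataclasses import dataclass

@dataclass
class Component:
    """One entry in a PCI DSS Requirement 2.4 system-component inventory."""
    name: str
    kind: str            # e.g. "vm", "database", "load-balancer"
    provider: str        # who operates it -- relevant to Requirement 12.8.5
    handles_chd: bool    # does it store, process, or transmit cardholder data?

inventory = [
    Component("web-frontend", "vm", "cloud-provider-a", handles_chd=False),
    Component("payment-api", "vm", "cloud-provider-a", handles_chd=True),
    Component("card-vault-db", "database", "in-house", handles_chd=True),
]

# Components in scope for PCI DSS are those touching cardholder data.
in_scope = [c.name for c in inventory if c.handles_chd]
assert in_scope == ["payment-api", "card-vault-db"]

# Requirement 12.8.5: know which components each service provider manages.
by_provider: dict[str, list[str]] = {}
for c in inventory:
    by_provider.setdefault(c.provider, []).append(c.name)
assert sorted(by_provider) == ["cloud-provider-a", "in-house"]
```

Keeping the inventory as structured data rather than a static diagram makes it straightforward to regenerate the Requirement 1.1.3 dataflow documentation whenever cloud components are added or retired.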

Today, enterprises have to deal with increasing challenges meeting the different compliance and regulatory requirements applicable to them. The challenge significantly increases if the enterprise offers services in various geographies or across different verticals. For instance, an e‐commerce supplier with international customers needs to comply with data‐privacy and ‐disclosure mandates such as the EU data‐protection directive, California privacy laws, and PCI‐DSS. In addition, as a response to the recent economic losses experienced by enterprises as a result of e‐commerce hacking, more regulatory and compliance orders are expected as governments and regulatory organizations design requirements in an attempt to prevent future incidents.

Instead of handling each compliance requirement as an independent effort, an organization can obtain significant reductions in effort and cost by adopting standards such as ISO 27001 and 27002 (introduced earlier in Section 8.4) as its base security guidance. In addition, a governance, risk management, and compliance (GRC) framework allows an organization to avoid conflict, reduce overlap and gaps, and obtain better executive visibility into the risks faced. It also enables the organization to be proactive in addressing risk and compliance issues. There is no doubt that meeting compliance requirements by moving to a cloud‐based solution offers significant benefits.

Cloud compliance, or rather the lack of it, is an obstacle to cloud adoption. A lack of compliance services in the Cloud makes the elasticity of the Cloud difficult to use for operations that must meet compliance mandates. However, a cloud‐based solution that includes compliance services opens up the general benefits of the Cloud to many more applications. Based on the type of business process (for example, handling credit‐card transactions) or industry (such as healthcare), the majority of organizations are subject to different compliance and regulatory mandates. For instance, publicly traded financial services institutions situated in the US, running their own data centers where both customer and corporate data and applications are located, must comply with the following standards:

  • Gramm‐Leach‐Bliley Act (GLBA) guidelines
  • PCI
  • SAS 70 type II
  • Various state and federal privacy‐ and data‐breach disclosure requirements

Companies depend heavily on reporting and auditing frameworks that include requests to detail controls and policies that have been implemented in order to ensure compliance. In general, a cloud solution must prove that it possesses the same types of safeguards and controls that would otherwise be implemented privately. A CSP must also be able to provide evidence of compliance by showing regulation‐specific reports and audits, such as SAS 70 type II audit reports. In 2014, Citrix Systems Inc. published a document entitled Citrix Cloud Solution for Compliance describing the solutions and products that Citrix offers to help cloud services comply with various regulations (https://www.citrix.com/content/dam/citrix/en_us/documents/products‐solutions/citrix‐cloud‐solution‐for‐compliance.pdf). The IEEE Cloud Computing group, which is part of the IEEE organization, is also working on standards for cloud computing (https://cloudcomputing.ieee.org/standards).

8.5 Future Research Directions for Cloud Auditing and Compliance

A large body of research has been conducted by organizations worldwide with the purpose of addressing cloud auditing and improving security, privacy, and trust for cloud consumers. However, much still needs to be done to make cloud infrastructures more secure and trusted, attract investments from industrial enterprises and government agencies, and give companies the confidence to move their businesses to the Cloud and store their customers' sensitive and private data in cloud storage. Moreover, there is great demand for organizations to globally design and enforce cloud compliance regulations, which are necessary for cloud service providers to prove trust and reliability to their consumers. Following is a list of open problems that future research should address:

  • Misuse of administrator rights and/or malicious insiders: In cloud computing, VMs are the basis for the cloud infrastructure. More work needs to be done to effectively monitor and detect VM misuse by malicious insiders.
  • Lack of transparency of applied security measures: In cloud computing, not every CSP follows baseline standards, although today global government agencies and laws are working together to define cloud baseline standards and compliance regulations for CSPs. As a result, the particular security measures implemented by a CSP often are not completely known.
  • Shared technology issues (multitenancy) and misconfigurations of VMs: In cloud computing, poorly implemented multitenancy can allow a customer to endanger other customers' resources, either maliciously or unintentionally. This is caused by the use of virtualization without proper isolation. Exploits due to the increasing code complexity of hypervisor software have already been demonstrated. Memory‐cache isolation is another issue. As of today, no CSP releases complete information on how shared resources are securely wiped before being reassigned to a different customer.
  • Data life cycle in case of provider switch or termination: In cloud computing, due to shared usage of resources, this threat is particularly serious. The CSA states that cloud consumers need to define specific rules for end‐of‐contract scenarios, regulating how customers' data is exported from the Cloud and how a provider promises to securely erase customer data. Both the consumer and the CSP must agree on these rules. Global standardization and compliance regulations are still heavily needed in this area.
  • Unclear data location: Cloud customers often do not have any certainty about the country in which their data is saved or processed. As of today, there is no way to prove whether customer data has been outsourced by a CSP. Customer data‐privacy laws differ in each country, which makes it even more difficult to respect all the regulations when transferring and storing data across different geographic locations. More global compliance and standardization directives are needed to solve the issue of enforcing privacy when the location of the data is unknown or unclear.
  • Missing monitoring: If cloud consumer data, especially personal data, is at risk of fraud or integrity violations, it is essential for a CSP to be able to detect these risks, warn customers, and eliminate them. As of today, it is a challenge for a cloud provider to use an information‐policy system that automatically informs customers of security violations.
  • Missing interoperability of CSPs: CSPs are not compatible with each other, since each service provider uses customized VM formats and proprietary APIs. As a consequence, more work is required to standardize cloud operations and achieve better interoperability between CSPs.

8.6 Conclusion

This chapter provided an overview of the latest cloud security problems and how cloud auditing can alleviate them. It covered the state of the art in cloud auditing and discussed the modifications and extensions that need to be implemented to enable effective cloud audits and minimize cloud security and privacy challenges. A significant body of research has been dedicated to improving cloud auditing, but many aspects still need more work, particularly in the area of standardization. In addition, this chapter described the complexity behind cloud compliance and explained how numerous organizations worldwide are collaborating to standardize compliance regulations. Once this goal is achieved, CSPs can finally prove compliance and gain the trust of cloud consumers, who are still hesitant to adopt cloud deployments for fear of compromising the integrity and confidentiality of their data, especially when CSPs transport and store data in other countries. The chapter also explained how data‐protection laws in different countries have complicated transborder dataflow, to the point of reducing cloud adoption and affecting the economic growth of developing countries.

References

  1. Alomari, E.A. and Monowar, M.M. (2014). Towards data confidentiality and portability in cloud storage. Design, user experience, and usability. User experience design for diverse interaction platforms and environments. Lecture Notes in Computer Science 8518: 38–49.
  2. Arockiam, L. and Monikandan, S. (2014). Efficient Cloud Storage Confidentiality to Ensure Data Security. 2014 International Conference on Computer Communication and Informatics (ICCCI), Coimbatore, India.
  3. Aslam, M., Gehrmann, C., and Björkman, M. (2013). Continuous security evaluation and auditing of remote platforms by combining trusted computing and security automation techniques. The 6th International Conference on Security of Information and Networks, Aksaray, Turkey.
  4. Badger, L., Bernstein, D., Bohn, R., et al. (2011). National Institute of Standards and Technology Special Publication 500–293 (Draft). US Government Cloud Computing Technology Roadmap Volume I Release 1.0 (Draft). http://www.nist.gov/itl/cloud/upload/SP_500_293_volumeI‐2.pdf (accessed 16 December 2014).
  5. Bishop, M. (2002). Design principles. In: Computer Security: Art and Science. Addison-Wesley Professional.
  6. Bowers, K.D., Juels, A., and Oprea, A. (2009a). Proofs of retrievability: theory and implementation. In: Proceedings of the 2009 ACM Workshop on Cloud Computing Security, 43–54. ACM.
  7. Bowers, K.D., Juels, A., and Oprea, A. (2009b). HAIL: a high‐availability and integrity layer for cloud storage. In: Proceedings of the 16th ACM Conference on Computer and Communications Security, 187–198. ACM.
  8. Cabot, T. and Pizette, L. (2014). Leveraging public clouds to ensure data availability. White paper. http://www.mitre.org/sites/default/files/pdf/12_0230.pdf (accessed 10 January 2015).
  9. Chen, Y., Paxson, V., and Katz, R.H. (2010). What's new about cloud computing security? EECS Department, University of California, Berkeley. Technical report UCB/EECS-2010-5.
  10. Djebaili, B., Kiennert, C., Leneutre, J. et al. (2014). Data integrity and availability verification game in untrusted cloud storage. Decision and game theory for security. Lecture Notes in Computer Science 8840: 287–306.
  11. Doelitzscher, F., Reich, C., Knah, M. et al. (2012). An agent based business aware incident detection system for cloud environments. Journal of Cloud Computing: Advances, Systems and Applications 1: 9.
  12. German Parliament (2009). Federal Data Protection Act. Federal Law Gazette I p. 66, as most recently amended by Article 1 of the Act of 14 August 2009 (Federal Law Gazette I p. 2814). http://www.gesetze‐im‐internet.de/englisch_bdsg/englisch_bdsg.html (accessed 13 December 2014).
  13. Gondree, M. and Peterson, Z.N.J. (2013). Geolocation of data in the cloud. In: Proceedings of the Third ACM Conference on Data and Application Security and Privacy, 24–36. ACM.
  14. Goodin, Dan. (2014a). Crypto attack that hijacked Windows Update goes mainstream in Amazon Cloud. http://arstechnica.com/security/2014/11/crypto‐attack‐that‐hijacked‐windows‐update‐goes‐mainstream‐in‐amazon‐cloud (accessed 12 December).
  15. Goodin, Dan. (2014b). AWS console breach leads to demise of service with “proven” backup plan. http://arstechnica.com/security/2014/06/aws‐console‐breach‐leads‐to‐demise‐of‐service‐with‐proven‐backup‐plan (accessed 20 December 2014).
  16. Hamza, Y.A. and Omar, M.D. (2013). Cloud computing security: abuse and nefarious use of cloud computing. International Journal of Computational Engineering Research 3 (6): 22–27.
  17. Han, K.J., Choi, B.Y., and Song, S. (2014). High Performance Cloud Auditing and Applications, 6–7. Heidelberg, Dordrecht, London: Springer New York.
  18. Kortchinsky, K. (2009). Cloudburst. Technical paper. http://www.blackhat.com/presentations/bh‐usa‐09/KORTCHINSKY/BHUSA09‐Kortchinsky‐Cloudburst‐PAPER.pdf (accessed 15 December 2014).
  19. Loutas, N., Kamateri, E., and Tarabanis, K. (2013). Cloud computing interoperability: the state of play. Centre for Research and Technology Hellas, Thessaloniki, Greece; Information Systems Lab, University of Macedonia, Thessaloniki, Greece.
  20. Manap, N.A. and Rouhani, A. (2014). Issue of transborder data flows in cloud computing: the impact on economic growth. International Business Management 8: 113–117.
  21. Meng, S., Iyengar, A.K., Rouvellou, I.M. et al. (2012). Reliable state monitoring in cloud datacenter. Proceedings of the IEEE Fifth International Conference on Cloud Computing, 951–958. IEEE.
  22. Mell, P. and Grance, T. (2009). The NIST definition of cloud computing. http://csrc.nist.gov/publications/nistpubs/800‐145/SP800‐145.pdf (accessed 28 January 2014).
  23. Paladi, N., Aslam, M., and Gehrmann, C. (2014). Trusted geolocation‐aware data placement in infrastructure. The 13th IEEE International Conference on Trust, Security and Privacy in Computing and Communications IEEE (TrustCom), Beijing, China.
  24. Patel, P., Ranabah, A., and Sheth, A. (2009). Service Level Agreement in Cloud Computing. Fairborn, Ohio: Wright State University CORE Scholar.
  25. Puttaswamy, K.P.N., Kruegel, C., and Zhao, B.Y. (2011). Silverline: toward data confidentiality in storage‐intensive cloud applications. In: Proceedings of the 2nd ACM Symposium on Cloud Computing, article 10. ACM.
  26. Ries, T., Fusenig, V., Vilbois, C. et al. (2011). Verification of data location in cloud networking. In: Proceedings of the Fourth IEEE International Conference on Utility and Cloud Computing (UCC), 439–444. IEEE.
  27. Rutkowski, M. (2013). An Introduction to DMTF cloud auditing using the CADF event model and taxonomies. https://wiki.openstack.org/w/images/e/e1/Introduction_to_Cloud_Auditing_using_CADF_Event_Model_and_Taxonomy_2013‐10‐22.pdf (accessed 27 December 2014).
  28. Sathiskumar, R. and Retnaraj, J. (2014). Secure privacy preserving public auditing for cloud storage. International Conference on Engineering Technology and Science (ICETS'14), Tamilnadu, India.
  29. Saxena, R. and Dey, S. (2014). Collaborative approach for data integrity verification in cloud computing. Recent Trends in Computer Networks and Distributed Systems Security. Communications in Computer and Information Science 420: 1–15.
  30. Schaub, Hillary. (2014). FedRAMP to monitor cloud service providers. https://www.brookings.edu/blog/techtank/2014/06/05/fedramp‐to‐monitor‐cloud‐service‐providers (accessed 22 January 2015).
  31. Schiffman, J., Moyer, T., Vijayakumar, H. et al. (2010). Seeding clouds with trust anchors. In: Proceedings of the ACM Workshop on Cloud Computing Security, 43–46. ACM.
  32. Sotto, L.J., Treacy, B.C., and McLellan, M.L. (2010). Privacy and data security risks and cloud computing. Electron. Comm. Law Rep.
  33. Sunagar, S., Patil, U., and Sheshgiri (2014). Dynamic auditing protocol for data storage and authentication forwarding in cloud computing. IJRIT International Journal of Research in Information Technology 2 (4): 429–436.
  34. Tiwana, B., Balakrishnan, M., Aguilera, M. et al. (2010). Location, location, location!: modeling data proximity. In: Proceedings of Hotnets, the 9th ACM Workshop on Hot Topics. ACM.
  35. Vaish, A., Kushwaha, A., Das, R. et al. (2013). Data location verification in cloud computing. International Journal of Computer Applications 68 (12). ISSN 0975-8887.
  36. Venkatesa, K.V. and Poornima, G. (2012). Ensuring data integrity in cloud computing. Journal of Computer Applications 5 (EICA2012-4). ISSN 0974-1925.
  37. Waqas, A., Yusof, Z.M., and Shah, A. (2013). A security‐based survey and classification of cloud architectures, state of art and future directions. In: Proceedings of the Advanced Computer Science Applications and Technologies (ACSAT), International Conference, 284–289.
  38. Weise, E. (2014). Apple's iCloud network under attack. http://www.usatoday.com/story/tech/2014/10/21/apple‐icloud‐‐attack‐network/17669603 (accessed 21 December 2014).
  39. Wright, Rob. (2014). CSA to closely monitor enterprise cloud data privacy issues in 2015. http://searchcloudsecurity.techtarget.com/news/2240237429/CSA‐to‐closely‐monitor‐enterprise‐cloud‐data‐privacy‐issues‐in‐2015 (accessed 11 January 2015).
  40. Yu, Y., Niu, L., Yang, G. et al. (2014). On the security of auditing mechanisms for secure cloud storage. University of Wollongong Research Online, Faculty of Engineering and Information Sciences. University of Wollongong, New South Wales, Australia and University of Electronic Science and Technology of China.

Further Reading

  1. Arockiam, L. and Monikandan, S. (2013). Data security and privacy in cloud storage using hybrid symmetric encryption algorithm. International Journal of Advanced Research in Computer and Communication Engineering 2 (8): 3064–3070.
  2. Arora, R. and Parashar, A. (2013). Secure user data in cloud computing using encryption algorithms. International Journal of Engineering Research and Applications 3 (4): 1922–1926.
  3. Bugiel, S., Nurnberger, S., Poppelmann, T. et al. (2011). Amazonia: when elasticity snaps back. In: Proceedings of the 18th ACM Conference on Computer and Communications Security, 389–400. ACM.
  4. Doelitzscher, F., Reich, C., Knahl, M. et al. (2012). An agent based business aware incident detection system for cloud environments. Journal of Cloud Computing: Advances, Systems and Applications 1 (1): 9.
  5. Kamara, S. and Lauter, K. (2010). Cryptographic Cloud Storage. In: Financial Cryptography and Data Security (FC 2010) Workshops, RLCPS, WECSR, and WLC 2010, Revised Selected Papers, 136–149. Springer‐Verlag.
  6. Karbe, T. (2013). Design and development of an audit policy language for cloud computing environments. Cloud Research Lab; University of Applied Sciences Furtwangen, technical report. http://wolke.hs‐furtwangen.de/publications/theses (accessed 10 January 2015).
  7. Khorshed, M.T., Ali, A.B.M., and Wasimi, S.A. (2012). A survey on gaps, threat remediation challenges and some thoughts for proactive attack detection in cloud computing. Future Generation Computer Systems 28 (6): 833–851.
  8. Krutz, R.L. and Vines, R.D. (2010). Cloud Security: A Comprehensive Guide to Secure Cloud Computing. Wiley Publishing, Inc.
  9. Mather, T., Kumaraswamy, S., and Latif, S. (2009). Cloud Security and Privacy: An Enterprise Perspective on Risks and Compliance. O'Reilly Media.
  10. Morsy, A. and Faheem, H. (2009). A new standard security policy language. IEEE Potentials 28 (2): 19–26.
  11. Ristenpart, T., Tromer, E., Shacham, H. et al. (2009). Hey, you, get off of my cloud: exploring information leakage in third‐party compute clouds. In: Proceedings of the 16th ACM Conference on Computer and Communication Security (CCS'09), 199–212. ACM.
  12. Vieira, K., Schulter, A., and Westphall, C.B. (2010). Intrusion detection techniques for grid and cloud computing environment. IT Professional, IEEE Computer Society 12 (4): 38–43.
  13. Wang, B., Li, B., and Li, H. (2012). Oruta: privacy preserving public auditing for shared data in the Cloud. In: Proceedings of the IEEE International Conference on Cloud Computing, 293–302. IEEE.
  14. Wang, C., Wang, Q., Ren, K. et al. (2010). Privacy preserving public auditing for data storage security in cloud computing. In: Proceedings of INFOCOM, 525–533. IEEE.
  15. Xiao, Z. and Xiao, Y. (2013). Security and privacy in cloud computing. IEEE Communications Surveys & Tutorials 15 (2): 843–859.
  16. Yang, K. and Jia, X. (2012). Data storage auditing service in cloud computing: challenges, methods and opportunities. World Wide Web 15 (4): 409–428.
  17. Zhang, L.J. and Zhou, Q. (2009). CCOA: cloud computing open architecture. In: IEEE International Conference on Web Services, 607–616. IEEE.
  18. Zhang, X., Wuwong, N., Li, H. et al. (2010). Information security risk management framework for the cloud computing environments. In: Proceedings of 10th IEEE International Conference on Computer and Information Technology, 1328–1334. IEEE.
  19. Zhu, Y., Hu, H., Ahn, G. et al. (2012). Cooperative provable data possession for integrity verification in multi‐cloud storage. IEEE Transactions on Parallel and Distributed Systems 23 (12): 2231–2244.
