Porting your Data as a Cloud Customer

Without any single dominant Cloud provider, porting your data from one service to another is a grueling task. Whichever service you are using, IaaS, PaaS, or SaaS, each comes with its own interoperability problems. Even a simple SaaS customer may want to move data held in the cloud to an in-house server as their operations grow and become more streamlined. The problem is that the Cloud provider may not offer the right tools for a convenient transition. Suppose, for instance, you wanted to move all your email to another client: imagine forwarding each message individually. That is only an illustrative example, but your platform may or may not have a real solution for porting into a new system.
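To make the email example concrete, here is a minimal sketch of what a do-it-yourself migration can look like when a provider offers no export tool: pulling each message down over IMAP, one fetch at a time. The host name and credentials would be your own; `export_inbox` and `message_summary` are illustrative helpers, not any provider's API.

```python
import email
import imaplib

def export_inbox(host, user, password, mailbox="INBOX"):
    """Log in over IMAP and pull down every raw message, one fetch at a
    time (the 'forward each email individually' of the example, automated)."""
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        conn.select(mailbox, readonly=True)
        _, data = conn.search(None, "ALL")
        raw_messages = []
        for num in data[0].split():
            _, parts = conn.fetch(num, "(RFC822)")
            raw_messages.append(parts[0][1])  # the raw RFC 822 bytes
        return raw_messages

def message_summary(raw_bytes):
    """Parse one raw message into the fields worth carrying over."""
    msg = email.message_from_bytes(raw_bytes)
    return {"from": msg["From"], "subject": msg["Subject"]}
```

Even this friendly case shows the pain: the export is message-by-message, and the new provider still has to accept the format you produce.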

For the consumer, being able to port services from one provider to another would be a significant boon. Before deciding on a Cloud provider to host their operations, consumers have to ask themselves what type of operations they are outsourcing. The nuances vary with the type of Cloud service being used and the needs of their particular products. If you are using a SaaS system, the data stored in, say, iMessage won't transfer directly to Google Chat; or perhaps you are using a PaaS system and coding on a particular platform. You might find you could minimize costs by switching Cloud services, but you might have to recode elements of your services or even switch languages. These issues aren't unsolvable; we have even begun to write programs that debug programs, but that solution is in its infancy and would increase costs for the Cloud consumer.

The problems are a mixture of technical know-how and business will. Interoperability, in a general sense, is the ability of one infrastructure to connect and communicate with another without intermediation or restriction of access; it is the ability of a service to be formatted so as to allow cross-platform communication. And each Cloud service is going to want to showcase a unique product to its customer base. Essentially, each wants to carve out a niche at the least, if not become the primary provider, and to do that they need a unique and powerful offering.

Some of the issues facing interoperability and the ease of porting a customer's data include rebuilding applications, the actual transfer of the data, and making sure the services the consumer needs exist on the new platform. From the customer's perspective, full service parity matters: you wouldn't want to move to a new system and find your old files need to be re-engineered just to get the same tasks done, or have to relearn a different set of tools just to do what you could do originally. The actual porting of your data is another large hurdle. In some instances the customer can't reasonably port large quantities of data by email, so the export capability of the first provider and the import hook-up of the new provider need to work together smoothly, to say nothing of how the data is formatted. Sending data over the internet is also troublesome because of the security concerns involved in porting data over any public connection, even an encrypted one.

So interoperability is an important step in being able to port your data as a Cloud service customer, and for a company strengthening its position in the current climate, one often needs to move services from one cloud to another, or to an in-house server.

Interoperability and Cloud Services

Interoperability, in a general sense, is the ability of one infrastructure to connect and communicate with another without anything translating between them or restricting access. In simpler terms, a service is formatted to allow cross-platform communication. Word and Pages were not always interoperable, but rich text format was interoperable between the two. The ability for data to move from one format to another without a middleman is a key element. The other element, for many service providers, is access: a provider whose product dominates the market may want to restrict certain information to maximize its product's advantages. So for something to be interoperable, the data must be easily moved and all of the data must be available, which is difficult for businesses to manage.
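The rich-text example generalizes: two systems that cannot read each other's native formats can still interoperate through a neutral interchange format. A small sketch, with made-up record layouts standing in for two vendors' formats and plain CSV playing the role rich text played between Word and Pages:

```python
import csv
import io

# Hypothetical: vendor A stores contacts as nested JSON-like dicts.
vendor_a = {"contacts": [{"fullName": "Ada Lovelace", "mail": "ada@example.com"}]}

def a_to_neutral(data):
    """Flatten vendor A's layout into a neutral CSV interchange format."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "email"])
    writer.writeheader()
    for c in data["contacts"]:
        writer.writerow({"name": c["fullName"], "email": c["mail"]})
    return buf.getvalue()

def neutral_to_b(text):
    """Read the neutral CSV into vendor B's (also hypothetical) layout."""
    rows = csv.DictReader(io.StringIO(text))
    return [{"display_name": r["name"], "address": r["email"]} for r in rows]
```

Neither vendor has to understand the other; each only has to read and write the shared format, which is exactly what a standard buys you.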

Each Cloud service is going to want to showcase a unique product to its customer base. Essentially, each will want to carve out a niche, if not become the primary provider; to do that they need a unique and powerful offering.

For a Cloud provider, being able to distinguish its service from others is paramount to thriving in this new market. As with most new industries, there is a massive push and pull between different standards. The classic examples of Apple and Microsoft, or VHS and Betamax, are rough approximations of the standardization issues in Cloud interoperability. In the Cloud world there is a much larger variety of services, even of types of service, but the nitty-gritty of the issue is a company's need to distinguish itself from its competitors.

The vendor of a particular Cloud service may feel it has a nifty offering and wouldn't want to create an inferior product just to meet an industry standard. If an industry standard ever emerges, each company will have to decide whether portability of offerings is necessary to compete with other services, weighing the particulars of its own offering against the features needed to port from its competitors. But making it easy to move from a competitor also means making it easy for customers to move to a competitor. Not surprisingly, the main difficulty with moving your data, software, or platform is its ability to communicate and integrate with other services. Portability is the ability to move your data from one system to another, and the main obstacle to portability is interoperability. I am using 'data' in a very general sense: it can mean literally the information stored in a SaaS system, or the programs stored in a PaaS system, for instance. Strides are being made, and there is a growing desire in parts of the industry for a standard to ease interoperability.

As a vendor of a Cloud service you have to consider whether you are following best practices as well as what is good for your company. Your company may beat out competitors by providing the best services, and adopting a standard set of services for interoperability might cut down on the services offered. A company could end up giving away its competitive edge in pursuit of compliance with an industry standard. So the need to be interoperable is not a decision to take lightly. There are many standardizations that would be good for the consumer but not necessarily for the vendor.

Popular cloud computing services: SaaS (Software as a Service)

One of the reasons the 'Cloud' has become such a ubiquitous term is the SaaS model. Some people are beginning to think SaaS should not be grouped with the other Cloud packages at all, because in some respects it provides a different type of service than PaaS or IaaS.

SaaS, or Software as a Service, is the most basic service: essentially a cloud offering that individuals or companies can use to standardize routine tasks or services. An email client might use this type of cloud packaging because the basics of the client need to work across platforms. Essentially, SaaS is data storage plus specific software that uploads to and downloads from the provider's servers. The whole process is streamlined by the Cloud provider, which does all the development of the software; in PaaS the customer does some of that work on the platform, while SaaS takes it entirely out of the customer's hands. SaaS is really the data-storage aspect of the Cloud offering, where the data is limited in scope to the type of software the SaaS provides.

The SaaS model, and in fact most Cloud-based services, relies on some software interface for the client that uploads to and downloads from the Cloud. The Cloud operator uses intelligent software to handle data from its clients. Examples include Google Drive, iCloud, or an application store; all these services remotely hold data and software in the cloud that the client can upload to and download from. Many SaaS operations take little know-how to use, and the Cloud essentially streamlines aspects of business operations.
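Under the hood, that client interface is usually a thin layer of HTTP calls against the provider's servers. A rough sketch of the upload/download plumbing, against a purely hypothetical endpoint (the base URL, routes, and bearer-token scheme are invented for illustration):

```python
import urllib.request

BASE = "https://storage.example.com/v1"   # hypothetical SaaS endpoint

def build_request(path, token, payload=None):
    """Build an authenticated upload (PUT, when payload given) or
    download (GET) request for one remote file."""
    return urllib.request.Request(
        f"{BASE}/files/{path}",
        data=payload,
        method="PUT" if payload is not None else "GET",
        headers={"Authorization": f"Bearer {token}"},
    )

def download(path, token):
    """Fetch a file's bytes from the provider (performs a network call)."""
    with urllib.request.urlopen(build_request(path, token)) as resp:
        return resp.read()
```

The point is how little the client decides: the routes, the auth scheme, and the data format are all the provider's, which is precisely why porting away later is hard.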

Interoperability and security are still issues with the SaaS model. A problem with SaaS is that the software precludes or interferes with control over your data. The software operator remains in control of how the data is formatted. The data a customer puts into the Cloud is read back through a pre-designed software client, so porting or moving a customer's data is a cumbersome process. As portability is a problem for all Cloud services, the choice of your particular Cloud host is an extremely important decision.

What a SaaS customer must keep in mind is that they are limited by the software they are using. If, for instance, one wanted to port data or use the information in some particular way, one would need to design one's own system for accessing it. Porting data from a SaaS provider is a significant concern for the customer. On the other hand, the software service comes pre-packaged with the product, so the customer does not have to worry about setting up a system, and a customer using SaaS is usually looking to outsource significant amounts of their IT needs to the Cloud provider.
As Cloud customers come from all walks of life, the client base for a provider is not limited to companies.

Many individuals use SaaS in their day-to-day lives. SaaS is such a ubiquitous model that many people use it without knowing it has a particular designation. Whether SaaS should be considered in step with the other services is up in the air, but the basics of a Cloud service are there: the Cloud host maintains a large server or servers to hold the data being sent through its operation, and the client accesses that data as a way of interfacing, often in the form of communications. The Cloud is a nascent industry, with new issues cropping up routinely.

Popular cloud computing services: the PaaS (Platform as a Service)

One model for many developing companies, particularly those developing new software, is PaaS. PaaS, or Platform as a Service, sits somewhere between IaaS and SaaS: it is not as restrictive as SaaS, but not as flexible as IaaS. PaaS lets customers scale their operations according to their growth, and helps with development by providing a consistent platform for a group of developers.

One use for PaaS is developing programs from multiple remote locations, because its services streamline the programming side. By standardizing the available products, the tricks in the bag, multiple people can remotely add to the programs being designed without stepping on each other's toes. PaaS is often used by companies that offer specialized services to other companies: such a company will rent a cloud platform that gives it tools to design programs for other companies to use. The service is the platform, like a Windows or Mac platform, except not as simple, and customized by the Cloud providers themselves.

PaaS is a convenient tool for developers and others who wish to coordinate their projects on interoperable software. Developers often need to work in the same language as their coworkers to integrate their designs with each other, and for companies whose developers work remotely, this keeps their work on the same footing. PaaS works for other types of structures as well: a company that provides a software service to other companies may use a PaaS system to supplement its own servers. The cloud is often a way for start-ups to avoid many of the costs associated with owning a server. PaaS is typically a 'pay as you go' platform, meaning you pay as your needs arise; if you need more space, you pay according to your needs. This helps young start-ups keep down the costs of initial investment, though a PaaS provider's customer base is by no means limited to start-ups. A more developed company might use a PaaS system to streamline operations.
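'Pay as you go' just means the bill tracks metered usage. A toy model of a monthly invoice makes the appeal to a start-up obvious; the meter names and per-unit rates below are invented, since real providers publish their own price sheets:

```python
# Hypothetical per-unit rates -- placeholders, not any provider's pricing.
RATES = {"storage_gb": 0.10, "compute_hours": 0.05, "bandwidth_gb": 0.02}

def monthly_bill(usage):
    """Sum cost over whatever was actually metered this month."""
    return sum(RATES[meter] * amount for meter, amount in usage.items())

# A quiet early month versus a growth month: the bill scales with need,
# with no up-front server purchase in either case.
quiet = monthly_bill({"storage_gb": 50, "compute_hours": 100})
busy = monthly_bill({"storage_gb": 500, "compute_hours": 2000, "bandwidth_gb": 100})
```

The start-up pays a few dollars while small and more as it grows, instead of buying peak-sized hardware on day one.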

Though the customer base for PaaS is generally designers who offer applications for consumption, the advantage of a platform-specific Cloud service is that the customer can write all the code without being tied down by building the entire infrastructure, as with IaaS. SaaS services just wouldn't work for developers. The platform provided by the cloud service frees developers to work on more necessary tasks.

PaaS customers are varied, not limited to software developers alone. A PaaS provider might simplify its platform to provide more basic services that give a company many built-in features. While SaaS and IaaS have elements of this, PaaS is more variable in what it offers from provider to provider. A PaaS system is not as simple as a SaaS or IaaS system. IaaS systems tend to be more hardware-oriented, where the platform and software are already developed; in this way IaaS is generally used by companies that already have a product to push out. PaaS, while not solely for developing companies, enables a developer to use the Cloud offering at every stage of the development process.

PaaS in this way really enables new ideas and new developments in the tech world, essentially keeping an application or web developer equipped from beginning to end. As a company grows it may want to consider hosting its own servers or, depending on its needs, renting from an IaaS host down the road. But the availability of PaaS lets young companies start building and selling their products, saving them from buying the hardware and the various software development tools needed to make excellent products.

Popular cloud computing services: the IaaS (Infrastructure as a Service)

A popular type of Cloud service these days is IaaS, a means of keeping costs down in the flexible area of hardware needs. IaaS, or Infrastructure as a Service, is designed around providing the user with the hardware to host whatever project needs hosting.

The best way to think about this is that you are paying for the use of a network as you would a tax on infrastructure: one day you may use the subway, another day the roads, and some days you have five trucks and a subway car on the infrastructure. Infrastructure as a Service gives you real or virtual hardware to which you can upload your information. Your programs, or the users of your webpage, go through the infrastructure of the Cloud host. The host gives you storage and memory that scale according to your needs, but you have to build the project from the ground up to make use of the available infrastructure.

Getting down to basics allows the customer, the company renting the Cloud service, to scale operations according to its needs. Some may ask why go through the extra effort of providing your own platform. The answer is flexibility: while PaaS offers more software services to the customer, the open nature of IaaS gives a more established company the flexibility to create its own services on the rented hardware. The provider has the hardware, whatever particular hardware services it offers, and the customer rents it to keep costs down. Any large database needs to be kept in a cool, dry environment, and this, among other things, drives costs up, especially for a project with variable memory needs.

Typical services include virtual server space, network connections, bandwidth, IP addresses, and load balancers. Like SaaS and PaaS, IaaS is accessed by a client through the internet. The Cloud in general is essentially an application or a web page that accesses the server through the internet and makes storage available to the user. The provider keeps its own costs down by letting the customer base decide what platform or software to install on the hardware; ideally, everything then works together seamlessly. The hardware of an IaaS provider is often spread across many different facilities, allowing it to provide a product at scale: by renting out hardware from a large facility or facilities, the provider keeps costs down, savings that are passed on to the customer.

The customer does not have to rent its own facility this way. By not having to maintain a facility, the customer can scale operations according to peak and low traffic times. For instance, a weight-loss website might rent from an IaaS or PaaS provider to keep costs down during lulls in business; after New Year's it might acquire a surge of customers that would be a huge expense to provision for the rest of the year.
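The weight-loss site's situation reduces to a simple scaling rule: rent capacity in proportion to current traffic rather than provisioning for the January peak all year. A sketch, with an assumed per-server capacity figure:

```python
import math

CAPACITY_PER_SERVER = 1000   # requests/hour one rented server handles (assumed)

def servers_needed(requests_per_hour, minimum=1):
    """Scale rented capacity to current traffic instead of the annual peak."""
    return max(minimum, math.ceil(requests_per_hour / CAPACITY_PER_SERVER))

july = servers_needed(1500)      # a summer lull
january = servers_needed(40000)  # the New Year's resolution surge
```

Owning hardware means paying for the January fleet in July; renting means the July bill is a fraction of January's.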

Between IaaS and PaaS providers, the user has to decide what type of operational needs they have. For a developer, PaaS might be the way to go; but for a more established company, or a company that already has a product lined up for its users, an IaaS provider can supply the hardware they need at a scalable rate, meeting their needs as they arise. Time, money, efficiency, and ease of use are important factors in any business, and tuning into the correct provider is the way to go.

VM Snapshots for Efficient Forensic Investigation

Cloud computing is a technology which allows users to access storage, software, infrastructure, and deployment environments on a pay-for-what-you-use model. The cloud environment is multi-tenant and dynamic, and there is a corresponding need to address the various legal, technical, and organizational challenges around cloud storage.

Even given the dynamic nature of the cloud environment, digital investigations can be carried out there. As with traditional computer forensics, digital forensics in the cloud has to follow a number of steps: Identification, Collection, Examination, and Reporting/Presentation. The first step involves identifying the source of evidence; the collection phase involves identifying the actual evidence and collecting the necessary data; the examination stage involves analyzing the forensic data; and in the reporting phase the evidence found is presented in a court of law.
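The four steps can be sketched as a small pipeline. Hashing evidence at collection time, a standard forensic practice, lets the examination and reporting stages demonstrate the data was not altered in between. The function names and record layout are illustrative, not a real forensic toolkit:

```python
import hashlib

def collect(source_id, raw):
    """Collection: capture the data and fingerprint it for chain of custody."""
    return {"source": source_id, "data": raw,
            "sha256": hashlib.sha256(raw).hexdigest()}

def examine(item):
    """Examination: verify integrity before analyzing the evidence."""
    if hashlib.sha256(item["data"]).hexdigest() != item["sha256"]:
        raise ValueError("evidence altered since collection")
    return {"source": item["source"], "size": len(item["data"]),
            "sha256": item["sha256"]}

def report(findings):
    """Reporting: one court-presentable summary line per item."""
    return [f"{f['source']}: {f['size']} bytes, sha256={f['sha256'][:12]}"
            for f in findings]
```

The hash is what makes the final report defensible: anyone can recompute it from the preserved data and confirm it matches what was collected.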

Digital investigators face challenges arising from legal, technical, and organizational requirements. If some compromise occurs on the part of the CSP, the evidence it provides will not be genuine; the data you are relying on as evidence might have been injected by a malicious individual.

A number of digital devices currently use the cloud, but investigators are given little opportunity to obtain evidence from them. The service agreement may not state the CSP's role in carrying out an investigation, or its responsibility at the time a crime occurs. The CSP might have failed to keep logs, which are an important part of establishing that a crime occurred. The investigator also has to rely on the CSP to collect the necessary log files, and this is not easy; many researchers have noted that investigators have difficulty collecting log files.

A cloud service provider offers its clients a number of different services, and often only a few customers from the same organization will be accessing the same services. Malicious users are capable of stealing sensitive data from other users, and this can undermine trust in the CSP. The cloud therefore needs to protect against these activities by using intrusion detection mechanisms to monitor customer VMs and detect malicious activity.

A user can provision a VM much as they would set up a physical machine. Even without a user request, some cloud software such as OpenStack and Eucalyptus will create snapshots of a running VM and store them until the VM terminates. If the maximum number of snapshots is reached, the older ones are deleted from the system. Snapshots from a cloud environment are a great source of digital evidence and can be used to regenerate events. It is hard, however, to store numerous snapshots, and snapshots have also been found to slow the virtual machine, depending on how much it has changed since the snapshot was taken and how long the snapshot is stored.
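The retention behaviour described above, keep snapshots while the VM runs and evict the oldest when a cap is reached, amounts to a bounded queue per VM. A minimal sketch; the cap of three is a made-up number, not an OpenStack or Eucalyptus default:

```python
from collections import deque

MAX_SNAPSHOTS = 3   # illustrative cap, not a real platform setting

class SnapshotStore:
    """Keep at most MAX_SNAPSHOTS per VM; evict the oldest first."""
    def __init__(self):
        self.by_vm = {}

    def take(self, vm_id, snapshot):
        q = self.by_vm.setdefault(vm_id, deque(maxlen=MAX_SNAPSHOTS))
        q.append(snapshot)   # deque(maxlen=...) drops the oldest automatically

    def terminate(self, vm_id):
        """On VM termination, its snapshots stop being retained."""
        return self.by_vm.pop(vm_id, None)
```

The forensic tension is visible in the eviction line: every snapshot dropped to save storage is a piece of potential evidence gone.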

Malicious activities can be identified when VM users carry out actions such as uploading malware to systems in the cloud infrastructure, accessing the service excessively from one location, or performing numerous downloads or uploads within a short period of time. Other suspicious activities include password cracking, launching dynamic attack points, and deleting or corrupting sensitive organizational data.
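A crude version of the 'numerous downloads within a short period' check is a sliding-window counter per user. The window length and threshold below are invented for illustration; a real intrusion detection system would tune them and combine many signals:

```python
from collections import deque

WINDOW_SECONDS = 60    # look-back window (assumed)
MAX_DOWNLOADS = 20     # downloads tolerated per window (assumed)

class DownloadMonitor:
    def __init__(self):
        self.events = {}   # user -> deque of event timestamps (seconds)

    def record(self, user, ts):
        """Log one download; return True if the user now looks suspicious."""
        q = self.events.setdefault(user, deque())
        q.append(ts)
        while q and ts - q[0] > WINDOW_SECONDS:
            q.popleft()    # forget events that fell out of the window
        return len(q) > MAX_DOWNLOADS
```

The same windowing idea covers the other rate-based signals mentioned above, such as excessive access attempts from one location.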

The Way Forward in Heterogeneous Datacenter Architectures

The use of heterogeneous datacenter architectures has been on the rise, and developers face numerous challenges when adapting applications and systems to them. The good thing about cloud computing is that it abstracts the hardware from programmers and end users. This allows the underlying architecture to be improved, such as by installing new hardware, without changes to the applications.

Heterogeneity in processor architectures can help solve a number of problems. Heterogeneous processing elements can improve the efficiency of our systems through specialization: matching computations to the elements specialized for them. Graphics processing units (GPUs) are one example that has developed in the computing industry; others include media-functional units such as the SSE4 instruction set, parallel coprocessors like Intel's Xeon Phi, and encryption units. Future architectures are expected to feature multiple processors, each with heterogeneous internal components, interconnects, accelerators, and storage units with good efficiencies. Companies that rely on large-scale datacenters, such as PayPal and Microsoft, are investigating how to deploy heterogeneous processing elements to improve the performance of their products.

Cloud computing technology that integrates with datacenter heterogeneity will let us exploit the varied processing elements for special purposes without losing the advantages of abstraction.

With Infrastructure as a Service, physical and virtual resources are exposed to the end user, and virtual machines offer direct control of the operating system (OS). In traditional architectures, virtualization introduced significant overhead for workloads that are highly sensitive to system performance. However, modern technologies such as PCI passthrough and single-root I/O virtualization (SR-IOV) have reduced this overhead by allowing direct access to accelerators and networking devices; the incurred overhead can be on the order of 1%.

Also, as datacenter heterogeneity increases, IaaS deployments should be expected to expose varied components. To extend the flexibility of today's homogeneous clouds to heterogeneous IaaS deployments, further research is needed in the following fields:

– Schemes for sharing accelerators.
– Optimal tradeoffs between virtualization functionality and performance.
– Optimization techniques for power and utilization.
– Scheduling techniques for determining job assignments so that resources are allocated more efficiently.
– Schemes for cost and prioritization.
– Mechanisms for migrating jobs whose state lives in accelerators, between accelerators and the host.

Heterogeneous computing should also involve finding ways to exploit new interconnect technologies, such as parallel file systems and software-defined networking, in relation to the heterogeneous compute elements.

In the case of Platform as a Service, heterogeneity has to be either exposed to the framework, exposed to the programmer, hidden by backends targeting heterogeneous hardware, or hidden by libraries. Future research should focus on the following:

– Software architecture for accelerated libraries.
– Scheduling mechanisms aware of heterogeneity at the platform level.
– Application programming frameworks capable of exposing, or not exposing, heterogeneity to the programmer.
– Allocating resources among multiple heterogeneous frameworks or platforms, or among frameworks that share the same datacenter.

Microsoft's Catapult framework is an example of research targeting better use of heterogeneous hardware. It was created to improve the performance of the Bing search engine, and it provides a valuable use case in exploiting heterogeneous hardware for applications in commercial datacenters.

Threats to Cloud Security

Although most organizations are shifting their data to the cloud, the security of that data remains a serious challenge. To plan how to handle the risks, you should first understand the risks posed to your data in the cloud. The on-demand nature of the service itself poses a threat: as has been noted, with cloud services one can bypass the organization and set up one's own accounts in the cloud, and this has to be managed.
The following are threats to cloud security:

1. Data breaches
Most of the threats faced by the cloud are similar to those of a traditional corporate network, but because of the huge amounts of data stored on cloud servers, the providers themselves are highly targeted. The sensitivity of the cloud data determines how severe a breach can be if it occurs. Breaches of health information, intellectual property, and trade secrets can be more severe than breaches of financial information. If a data breach happens, the company may end up facing criminal charges, lawsuits, or fines, and investigating a breach and notifying customers can be very expensive. A breach can damage a brand and lose business, costing the organization for years. Although cloud providers have implemented measures against security breaches, organizations should take responsibility for protecting their own data from leaking to unauthorized individuals.

2. Broken authentication
Data breaches often result from weak passwords, lax authentication, and poor management of keys or certificates. Identity management is a great challenge for organizations, which find it hard to assign roles to users based on their jobs. In some cases the organization forgets to remove access for a user who has left or whose job has changed.
Systems with multi-factor authentication mechanisms, such as phone authentication, one-time passwords, and smartcards, are a great way of securing the cloud, since they make it much harder for anyone to log in with stolen or guessed passwords.
Some software developers leave cryptographic keys and other credentials in source code and then publish it to public repositories such as GitHub. Keys need to be kept well secured, and they should be rotated on a regular basis so that attackers cannot reuse them.
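Rotation on a schedule can be enforced with a simple age check over the key inventory. The 90-day interval below is an arbitrary illustrative policy, not a standard, and the inventory layout is invented:

```python
from datetime import datetime, timedelta

ROTATION_INTERVAL = timedelta(days=90)   # illustrative policy, not a standard

def keys_due_for_rotation(keys, now):
    """Return the IDs of keys older than the rotation interval.

    `keys` maps key_id -> datetime the key was issued (assumed layout)."""
    return [kid for kid, issued in keys.items()
            if now - issued > ROTATION_INTERVAL]
```

Run against the inventory on a schedule, a check like this turns 'rotate regularly' from a policy document into an alert.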

3. Hacked APIs and Interfaces
Every cloud service has APIs, and these APIs and interfaces are used by IT staff to manage the cloud service.
The availability and security of the cloud are determined by how secure the APIs are. Third parties who rely on these APIs and interfaces pose a security risk: weak APIs and interfaces expose your organization to security issues related to integrity, confidentiality, accountability, and availability.
Since interfaces and APIs can be accessed from the outside internet, they form the most exposed part of the cloud.

4. Exposed system vulnerabilities
Exposed program bugs and system vulnerabilities pose challenges to cloud computing. Organizations share databases, memory, and other resources, and this creates new attack surfaces.
Basic IT processes can be used to mitigate attacks based on system vulnerabilities. Good practices include prompt patch management, vulnerability scanning, and swift follow-up on reported issues.

5. Account hijacking
Cloud services are prone to phishing, software exploits, and fraud, since attackers can manipulate transactions, eavesdrop on activities, and even modify data in transit. Some attackers know how to use a compromised cloud service as a base for launching further attacks.
Organizations should not allow account credentials to be shared between users and services, and multi-factor authentication should be implemented where possible. Every transaction should be monitored and traceable back to its owner, and the credentials of each account should be strongly protected from theft.

6. Malicious insiders
The insider can be a system administrator, former employee, business partner, or contractor, and the agenda may be revenge or data theft. The insider may choose to manipulate the organization's data or even destroy the whole infrastructure.
The best defenses include minimizing users' level of access to accounts and controlling the management of encryption keys.

The Need for Standards in Cloud Computing Security

For enterprises to view cloud computing as the best choice for storing their data, standards are of great importance. Most IT enterprises are working hard to get a cloud that helps them cut expenses while achieving their business needs.

Today, most organisations allow only a percentage of their daily operations to be supported by the cloud. Although IT experts expect cloud adoption to accelerate in the near future, many enterprises are still wondering whether the cloud is the best solution for storing their data. The main source of fear is security: enterprises are not sure their data will be secure in the cloud.

They also need to create an on-demand service while maintaining regulatory and industry compliance. Enterprises shy away from storing their data in the cloud for fear that it is not protected. The cloud is porous in nature, which makes it an attractive target for attackers, and securing it has become more complex as it grows.

Currently, there is no agreed definition of effective cloud security. No standards define what effective cloud security should look like, or what is expected from both providers and users to ensure that cloud data is well secured. Instead, enterprises and providers are left to rely on data-centre standards, auditing specifications, industry mandates, and regulatory requirements for guidance on how cloud environments should be protected.

Although this approach can make cloud computing somewhat complex, it is a reasonable way to ensure that cloud data is well secured. Both enterprises and cloud providers need to focus on the core elements of a well-secured cloud, such as identity and access management, virtualisation security, content security, threat management, and data privacy.

It is also worthwhile for the industry to consider the NIST (National Institute of Standards and Technology) specifications on cloud security, as these form a good foundation for protecting the data and services running in the cloud. Although most of these principles were written for government organisations, they are just as relevant and applicable in the private sector.

The guidelines provided by NIST address serious cloud-security issues such as identity and access management, architecture, trust, data protection, software isolation, incident response, availability, and compliance. The body also states the factors organisations have to consider in relation to public cloud outsourcing. The CSA (Cloud Security Alliance) is a good source of knowledge on securing data running in an on-demand environment, documenting best practices for doing so. The CSA also provides the guidelines you need to judge whether your cloud provider is doing everything it can to secure your data.

Working through such organisations is valuable, as they help both customers and providers lay the groundwork for a secure cloud environment. Security principles should be applied as widely as possible when securing cloud environments. With good standards for cloud computing, enterprises gain far greater assurance that their data is safe in the cloud. This improves their trust in the cloud provider and makes cloud computing a viable solution to their IT needs, while current customers are better assured of the security of their data.

The Federal Risk and Authorization Management Program

FedRAMP (the Federal Risk and Authorization Management Program) is an accreditation process through which cloud providers align their security policies with those stated by the U.S. government. Although the process is new, it has already brought a number of improvements to cloud security and is expected to bring more. With this approach, standardisation is provided for both cloud services and products.

It aims to accelerate the adoption of secure cloud solutions by government agencies and to improve the security of cloud products and services. FedRAMP also ensures that consistent security is achieved across all government agencies, that services are automated, and that continuous monitoring takes place.

FedRAMP provides a framework with standardised processes for security assessments, which can streamline both the initial P-ATO (Provisional Authority to Operate) and the ongoing authorisation and assessment. With a unified approach to cloud computing, the time, cost, and resources needed to architect a cloud solution decrease, security improves, and uniform standards are created across all government agencies. This makes it easier for agencies to update their IT infrastructure so that they can provide services and protect their data efficiently.

Although FedRAMP provides the framework, each agency is tasked with finding a cloud service provider (CSP) that holds a P-ATO and meets all the FedRAMP requirements. The agency is also tasked with taking a thorough inventory of its cloud services, which helps it develop a sound cloud strategy, and with reporting on its cloud service infrastructure annually. This work can be tiresome, which is why agencies usually choose a CSP that not only satisfies the FedRAMP requirements but also fully understands the whole FedRAMP process and has the resources to keep supporting the agency.

As government agencies continue to adopt cloud computing, quality CSPs are a necessity, as they can help agencies reduce the risk in their cloud adoption strategies. Each agency is unique and may have its own requirements, and CSPs are not all the same. The best approach is for the agency to look for a highly flexible CSP, so that the agency's specific security controls can be layered on top of the base FedRAMP infrastructure. Each agency will want a CSP staffed by experienced professionals who are willing to listen to the agency, understand its specific needs, and help it achieve its unique objectives.
For some enterprises, FedRAMP will have two meanings: a mechanism for measuring the success of their security, and a way to sell cloud services to government agencies that are under a mandate to migrate to the cloud.

Some of the organisations which run clouds adhering to the FedRAMP standards include Akamai, Amazon Web Services, Lockheed Martin, and the U.S. Department of Agriculture. Both private-industry representatives and governmental stakeholders took part in developing the FedRAMP standards in 2012, which were geared towards reducing costs, increasing efficiency, and raising the level of safety in the cloud. Even if you are not a CSP, there are several avenues for getting involved: you can use a FedRAMP-authorised provider, which signals that you take security seriously, or you can apply to become a Third-Party Assessment Organisation.