Cloud vendors and feds are forecasting mostly clear skies for a fast-track security certification process.
Under a 2015 deadline to save $5 billion annually by outsourcing computer applications, the federal government is banking on a high-speed certification program for obtaining cloud services.
The Federal Risk and Authorization Management Program, also known as FedRAMP, is aimed at zipping agencies and their contractors through the time-intensive security accreditation process. Initial launch is on the June calendar and the goal is to be at full operational capacity by the end of 2012. The financial stakes are high—White House officials estimate agencies could save 30 percent to 40 percent in testing and procurement costs through FedRAMP.
But patience may be required to see FedRAMP take off, its supporters say. Although agencies are required to participate, they can seek an exemption from the White House if FedRAMP doesn’t meet their security needs. And more cautious agencies may layer on extra specifications.
The program’s concept centers on having independent auditors verify a tool’s compliance with a blanket set of controls, once, so that tool is then eligible for installation throughout the federal government.
“We’re looking at a couple-year effort to get this to where agencies see the true benefit,” says Susie Adams, the chief technology officer for Microsoft Federal. Dean Weber, chief technology officer for CSC Cybersecurity, says, “It remains to be seen if the process is applicable for each agency’s needs, or if they may need to customize the process. It is definitely going to be a crawl, walk, run exercise.”
But enthusiasts, including Microsoft and CSC, note that preparations are on schedule and, even if there’s no immediate governmentwide embrace, the effort will hasten the shift to Web-based computing.
“At least now there’s a standard,” says Tom McAndrew, an executive vice president at information technology compliance firm Coalfire. “No one likes going through certification and accreditation.”
Vendors already have been given the universal security controls. The General Services Administration, which manages the program, plans to arm agencies with reusable contract templates soon. Companies will know what is expected of them before they even develop a product, officials say.
David Mihalchik, who leads Google’s federal business development and sales operation, says, “In terms of the fundamentals of this program, we now have something from GSA that will expedite deployment.” Mihalchik adds that “FedRAMP is a clear signal that cloud computing has become mainstream in the federal government.”
But it doesn’t quite exist yet. No auditors have been approved. After more than a year of interagency vetting, GSA just finalized roughly 300 controls and “enhancements,” which are supplemental capabilities and stronger protections. And no instructions are out yet for “continuous monitoring” of threats to government information and networks, although the Homeland Security Department is aiming to distribute those before June.
Another stumbling block: GSA wants the audits to be accessible to all departments through a central online clearinghouse. But a repository of government vulnerability reports would be the ultimate prize for America’s adversaries. “If that data is breached then you know all the weaknesses of the entire federal cloud,” McAndrew says—something GSA officials understand. “It’s largely going to be a paper-based process at the beginning because we won’t have the bandwidth up in time,” GSA Associate Administrator Dave McClure says.
Defining Security Controls
The National Institute of Standards and Technology spent years carefully refining the definition of “cloud computing” to provide agencies and companies with a mutual understanding upon which to build a new technology market. Now, the contracting community and government must clarify the jargony controls, or security requirements, that even GSA is still decoding. The rules address smartphone access and backup options, among many other issues.
“What they’ve defined is a governance structure, which is what I think they need to do,” says John Gilligan, a former chief information officer for the Energy Department and U.S. Air Force. “The concern is that the processes really need to be fine-tuned. The concern is that this really turns into a massive bureaucracy.” Gilligan, now a private consultant, suggests that GSA keep FedRAMP malleable enough to adjust to lessons learned. Recognizing the need to stay flexible, the agency recently told vendors that, as the program matures, federal officials may revise the rules, contract templates and supporting instructions.
Some of the murky items that vendors say could have huge implications involve encryption. The somewhat ambiguous expectations include, for example: “The organization supports the capability to use cryptographic mechanisms to protect information at rest” and “the service provider must implement a hardened or alarmed carrier protective distribution system when transmission confidentiality cannot be achieved through cryptographic mechanisms.”
IBM officials say the controls aren’t perfect. “If I were to pick the controls up and give them to a commercial provider so that they could implement them in the private space, it would take a considerable amount of translation and interpretation. I was hoping for a set of security controls that didn’t have as much government terminology in them,” says Andras Szakal, chief technology officer at IBM U.S. Federal. Still, he says, “at the end of the day, it’s good that they’re providing transparency” into the new purchasing protocols.
GSA officials anticipate showing vendors the contract boilerplate before startup in June. “These are things that many cloud service providers don’t get until they sign a contract,” says FedRAMP Program Manager Matthew Goodrich. “We’re trying to make as much known beforehand so you know what you’re getting into.”
One of the beauties of the cloud is that vendors can enhance services on the fly or as improvements in technology become available. But for the government, each time a vendor updates a system, security traditionally must be rechecked. “You can add a new feature, and that could add a new vulnerability,” says Chris Wysopal, co-founder and chief technology officer at computer security firm Veracode. “There may need to be almost a continuous assessment.” Certain parts of a provider’s operating environment may have to be audited more frequently, he expects.
Going through a reassessment for every upgrade, however, could be counterproductive. “What constitutes a change that is large enough to warrant an update to an existing security plan or an entire reassessment? Right now it is very murky. I do think they need to lock down what constitutes a material change and what doesn’t. I think it is a challenge,” Microsoft’s Adams says. GSA officials on Feb. 7 described some adjustments that may require reassessment but gave agencies a lot of wiggle room to do their own thing: “These changes include, but are not limited to, [the cloud service provider’s] point of contact with FedRAMP, changes in the CSP’s risk posture, changes to any applications residing on the cloud system, and/or changes to the cloud system infrastructure.”
GSA officials say there never will be a full compendium of what indicates a significant change. Typically, the agency’s chief information officer will weigh the riskiness of a modification and determine if it merits an entire recertification, an update to the authorization or no review at all.
Federal officials also are still learning the ropes for real-time surveillance of threats in the cloud. Currently, IT managers are expected to report on antivirus updates, remote logins and other network vulnerabilities by pulling live data feeds from all devices into a central Homeland Security inbox called CyberScope. That maneuvering might be tricky in an environment where federal IT managers have no direct control over the underlying infrastructure.
But, as GSA announced in February, the cloud provider’s data centers will be required to transmit certain statistics to DHS automatically. “The government is going to have to establish what is good enough for continuous monitoring,” CSC’s Weber says.
And then there remains the task of hiring an army of compliance auditors to meet vendors’ demands for product certification. The government in January closed the application period for getting on an initial list of inspectors. Officials then immediately began accepting forms from more hopefuls on a rolling basis. The first batch of approved assessors will be named in mid-April, GSA officials say. Based on the more than 100 individuals who identified themselves as employees for prospective auditing companies at a December 2011 industry briefing, there seems to be great interest in the role.
The budget for sustaining FedRAMP is another unknown. Part of the program currently is funded through an e-government account covering many online operations that Congress recently increased to $12.4 million, still far below the president’s $34 million request. “I think it’s an important program that deserves a line item,” says Jennifer A. Kerber, president of research group TechAmerica Foundation. GSA officials expect the cost of FedRAMP operations will be covered by a “self-sustaining funding model” in 2014.
Initially, government funds will be used to judge candidates for the assessor jobs, but the plan is to hand that cost over to a private sector accreditation body in two years. Some cloud suppliers predict a backlog of applicants because the government doesn’t have enough funding or resources to evaluate all assessors at once. GSA officials maintain there is solid support for FedRAMP’s continuation. The government “will be able to review all applications as expeditiously as possible,” they said in a statement, adding that the program office will “give applicants clear expectations about review time once the applications review process begins.”
Engendering trust in FedRAMP is contingent in part on the objectivity of the auditors, most of whom will be paid by the suppliers they audit. Agencies can opt to cover the cost, but usually these kinds of assessments are on the contractor’s dime. Conflicts of interest could be hard to avoid, vendors say, because the potential inspectors currently work with many cloud providers on other IT projects. “How does the provider paying the auditor to do the work actually provide that arm’s-length separation?” Weber says.
In addition, some firms vying to serve as suppliers have auditing divisions. “You can’t audit your own cloud,” says McAndrew, who also serves as the Seattle chapter president of the global IT professional organization Information Systems Audit and Control Association. “Your judgment is impaired.”
To address the potential biases, GSA officials say applicants must demonstrate that the performance of their other business lines, including cloud services, has no bearing on the pay of their audit staff. NIST-developed criteria for ensuring this independence will be rigorous, particularly for companies attempting to participate on both sides of the ramp as providers and auditors. The accounting industry has been able to achieve impartiality through a similar segregation of responsibilities. GSA officials add that the FedRAMP auditor accreditation process will be more demanding than the current method of sanctioning third-party auditors.
So far, government and industry have been working out uncertainties both in the cloud and on the ground. GSA maintains a regularly updated website with FAQs and presentations. GSA officials also spent considerable time with vendors at group meetings before and after issuing the final FedRAMP procedures.
Everyone seems to agree that program officials are aggressively moving forward to make FedRAMP a reality. If they succeed, the government just might create a worldwide market for cloud security systems. The standards and certifications the program would produce eventually could be transferable to private sector clients, contractors note. Providers of their own accord could repackage the government endorsements to show corporate customers a product’s security posture, GSA officials say.
There is precedent. The Federal Information Processing Standards, or FIPS, took about a decade to journey from a government information security specification to a global standard, CSC officials point out. State and local governments already want to be able to leverage the documentation, according to Microsoft officials. Cloud companies usually have to obtain permission from their federal customers before sharing any risk assessments externally.
Officials are aware of the high expectations and doubts.
“We’re not locking this down in cement and saying, ‘We’ve considered everything. Sorry,’ ” GSA’s McClure says. “If we can’t sell this consistency and trust on a business case, then we’re doing the wrong thing to begin with.”