Agency Reforms Vital for Federal Cloud Strategy

Government executives recognize that process reorganization is necessary for leveraging new technologies.

Photo Credit: iStock/Arkadiusz Warguła

Successful long-term cloud-computing strategies necessitate process reforms and internal training, according to federal agency officials.

Chris Nichols, chief of the Defense Health Agency's Enterprise Intelligence and Data Solutions (EIDS) program management office, noted that cloud migration has been especially rapid across health-focused agencies, with DHA accelerating its efforts beginning in March and April.

“Over the past two and a half years, we’ve consolidated about 22 warehouses, and we’ve platformed about 15 applications into what we’re calling the MHS information platform,” he said at the GovernmentCIO Media & Research 2020 Cloud Summit Thursday. “In about a 16- to 17-week period, we moved everything from an on-premise data center to AWS GovCloud.”

This rapid technological shift is being met with attention to the organizational changes needed to take full advantage of these new resources, including strategies that support agency missions and security protocols. Private-sector partners are focusing on this same foundation, adapting their own offerings to the demands of specific institutions.

“We have to be thinking not just about where the data is and what it is, but once I get into the cloud, how do people have access to it, how do they request access, how can they trust that the data they have is the right data. So there’s a lot of additional planning and thinking,” said Jonathan Alboum, federal CTO and principal digital strategist at ServiceNow.

Federal tech executives appear increasingly attuned to the workforce training and cultural shifts that allow cloud computing to be deployed to its full potential, and to the need to sustain those efforts alongside technical modernization.

“It’s really about the people and the process. And if we don’t get that right, we should not be inserting a piece of technology at all because inserting a piece of technology into something does not fix the underlying problems or issues that are occurring. And we have to focus on those processes, people and how to manage that capability the right way,” Nichols said.

Similarly, health-focused agencies are reckoning with the human impact on the Americans who use their services and trust them to provide effective care, which has led to particular diligence around data management and accurate informatics.

“Health data is unique, and you have to be much more careful with it. If you get a bad transaction and you’re missing a dollar, that’s one thing. If you get a bad transaction and your medication doesn’t arrive or you get the wrong one, that’s a much worse problem. So you’ve got to be careful with that data,” said John Short, chief technology integration officer at the Department of Veterans Affairs’ Office of Electronic Health Record Modernization.

Agencies with a heavy biomedical research focus are also exploring the most effective means of sharing information between teams and ensuring that the complex data processing the cloud enables is integrated into their analytic work.

“I think one of the major challenges is also an opportunity for our broad biomedical research community that NIH supports, which is around training. That’s both for the IT staff, the people building and supporting the infrastructure, and also for our researchers who are trying to use these tools and understand the new way to work with tools and data, and to share data collaboratively,” said Nick Weber, program manager for cloud services at NIH.

Ultimately, the goal appears to be translating the technical complexity behind cloud computing into smoother and more effective agency services, a process that requires integrating the two from the start.

“We need to take the next step to de-engineer that solution and make it less complex. If we make it less complex, we lower the cost and we also lower the risk for the user. We lower the risk to ourselves to maintain it and lower the overall complexity for everyone else,” Short said.
