Agencies across the federal government are beginning to use their increasingly advanced cloud architecture to support the sophisticated application of artificial intelligence and data analytics.
Speaking at the GovCIO Media & Research Infrastructure: Cloud Modernization forum, experts from both the government and private sector outlined how their organizations are building the kind of cloud architecture needed to support the next phase of their modernization programs.
At the Department of Veterans Affairs, this includes a growing and increasingly sophisticated range of initiatives applying artificial intelligence and machine learning to medical care and diagnostics.
“We have many petabytes of data both in the cloud and on prem, and what we’re doing from the enterprise cloud perspective is that we’re building a 'super platform' that manages all the various appropriation streams used to fund these various efforts to accelerate the work we’re doing and help them leverage the cloud,” said Dave Catanoso, acting director of application hosting, cloud, and edge solutions at VA.
The U.S. Army has taken a similar approach to its cloud maturation, orienting these platforms toward supporting responsive and wide-reaching artificial intelligence applications.
“We have to have a data-processing capability that allows us to capitalize on the data fabric, capitalize on service mesh and how those are all coming together to enable data services. In order to circulate and process all that data, we have to use cloud-native solutions that are already available to get to that end state,” said Lauren Pavlik, chief of the Data and Software Services Division at the Army's Enterprise Cloud Management Agency.
Much of the design of effective cloud architecture hinges on determining which data repositories should be moved to the cloud and which are more appropriately maintained on premises as part of a potential hybrid cloud solution.
“From a data architecture perspective, the first thing you have to ask is what data do you need to move to the cloud, and what data can stay in existing repositories within agencies. The reality is many agencies have ridiculous amounts of data, and not all of that needs to leave those data centers. We will likely be living in a hybrid [cloud] world maybe forever,” said John Dvorak, chief architect at Red Hat.
As agencies incorporate a growing range of data inputs from separate departments and even external organizations, this kind of discernment has become increasingly necessary to support data-intensive functions like AI and machine learning.
“We’re seeing agencies moving away from the enterprise architecture model where they’re looking to move all their data to the cloud towards thinking about what data they actually need in the cloud,” said Ron Williams, director of cloud adoption and infrastructure optimization at the GSA IT Modernization Centers of Excellence.