Federal agencies have begun investing in the resources needed to store and process large quantities of complex data as a key component of their modernization programs.
Capabilities like artificial intelligence, big data analytics and machine learning demand considerable processing power, especially when applied to larger research projects or integrated within an agency’s broader IT enterprise.
Speaking at the GovCIO Media & Research Blueprints of Tomorrow event Thursday, leadership from the government and private sector outlined how their organizations are developing the infrastructure necessary to make increasingly sophisticated use of proprietary and shared data.
This drive toward adopting high-performance computing (HPC) capabilities evolved naturally from the modernization programs of major federal agencies, whose growing data usage required new infrastructure to support the expansion.
“Back in 2019, the Army put out a data and cloud execution order," U.S. Army Enterprise Cloud Management Agency Director Paul Puckett said. "But at its core, it really said we need to start to map how we move data, what data is authoritative, what data is duplicative. And we need to start to see the systems that are now responsible for the brokering of this data, and we need to start to modernize ourselves so we optimize our business processes and our workflows."
The integration of HPC has advanced the existing services and initiatives these agencies manage, allowing them to reach new levels of efficacy and sophistication in their work. At the National Oceanic and Atmospheric Administration, for example, it has markedly increased the accuracy of climate forecasting, allowing weather projections to be compiled and disseminated at an unprecedented pace.
“HPC for us is kind of the crossroads where a lot of the science that we do and the products that we produce come together, so if you take a look at the forecast data, there are three and a half billion observations that go into a forecast on a day-to-day basis," NOAA CTO Frank Indiviglio said. "That's a lot of data that gets put together, and out of that data comes a number of products that span a number of timescales, from the minutes, think storm warnings or flash flood warnings — all the way out to multi-decade climate change — and a lot of that is driven from modeling supported by HPC."
The implementation of HPC to better utilize data assets is also occurring alongside a cultural shift throughout government: agencies with large technical demands have begun to plan their forward-looking initiatives around these information repositories, encouraging individual departments and task forces to think in the same terms.
“The data is growing so fast and building so quickly that we have to shift the culture toward data-centric ideas,” ThunderCat Technology CTO Kurt Steege said.
As a result, the adoption of HPC is playing a co-supporting role alongside cultures of open discussion: the data sharing enabled by these new capabilities is allowing technically minded workers to share their findings more freely and apply them in potentially new settings.
“You have all these researchers who have a lot of homebrew solutions, and the culture of science has always been about sharing and open inquiry," Department of Energy Division for Advanced Scientific Computing Research Director Ben Brown said. "So it's about the sociological barriers to adoption or just surfacing shared insights in the context where people are receptive to ideas."