Advancing Government With a ‘Digital Twin’ DevOps Strategy
Using a digital twin increases development velocity while breaking down silos.
More and more organizations in both the public and private sectors are adopting DevOps approaches to business culture, technology development, implementation and testing practices. However, some remain hesitant, slow to accept the risk and uncertainty that a new approach can bring. One concept that underpins the success of leading DevOps organizations is the digital twin: replicating (or modeling) the data, code and technology so multiple teams can work on it simultaneously, accelerating authority-to-operate (ATO) and development pipelines.
“DevOps thrives on the concept of a digital twin,” said Kurt Steege, CTO for ThunderCat Technology. “The idea of the concept is to duplicate conditions and technologies in order to optimize usage of a design … a duplicated production environment in Dev and Staging in order to understand how code will work ultimately when deployed.”
For those unfamiliar with the strategy, Steege likened it to a plot point in the movie Apollo 13, where Ken Mattingly used a twin of the spacecraft on Earth to create a playbook for the power-up sequence on the command module in space (a process the astronauts on the actual mission had to get right the first time). Hopefully, developers never find themselves in such perilous circumstances, but a digital twin is an important element of breaking down barriers between DevOps teams, he said.
The ability to replicate data and technology allows DevOps teams to approach development and testing simultaneously, providing for a higher-quality final product by catching bugs early and drastically reducing the time it takes to go “back to the drawing board” if testing reveals a serious flaw.
“If the goal is to improve…metrics and speed up your continuous integration/continuous development (CI/CD) pipeline,” explained Michael Johnson, NetApp’s deputy CTO of Office of Technology and Strategy, “you need to create dev and test environments that are as close to the production environment as possible. The closer they model the production environment, the better job you can do testing and the fewer test escapes you have.”
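Johnson's point about modeling production as closely as possible is the reason many pipelines include an automated parity check between production and its staging twin. The sketch below is a minimal illustration of that idea, not any specific vendor's tooling; the function name and configuration keys are hypothetical.

```python
# Hypothetical sketch: detect configuration drift between a production
# environment and its staging "digital twin". In practice the inputs
# would come from infrastructure manifests, not hard-coded dicts.
def find_drift(production: dict, twin: dict) -> dict:
    """Return keys whose values differ or are missing between the two."""
    drift = {}
    for key in sorted(production.keys() | twin.keys()):
        if production.get(key) != twin.get(key):
            drift[key] = (production.get(key), twin.get(key))
    return drift

production = {"python": "3.11", "db": "postgres:15", "replicas": 3}
twin = {"python": "3.11", "db": "postgres:14", "replicas": 1}

print(find_drift(production, twin))
# {'db': ('postgres:15', 'postgres:14'), 'replicas': (3, 1)}
```

The closer this report is to empty, the more the twin behaves like production, and the fewer "test escapes" slip through.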
This approach also improves the efficiency of development and delivery, which is becoming more and more important as technology evolves.
“Folks need to start thinking, ‘How can I make this better, how can I be more efficient?’” Steege said. DevOps “moves the needle in that direction more quickly than in any other way.”
There’s also a business imperative to move faster.
“Unintuitively, as teams sped up their [CI/CD] initiatives, it’s actually driven innovation, it’s improved product quality, and it meets the mission needs significantly better,” Johnson said. “It does require moving from Waterfall … to more scrum-based, smaller team, microservices-based applications.”
Innovators are also applying the concept to machine learning, so a program can effectively learn from its twin, accelerating the development process.
“IDC has reported that more apps will be developed in the next four years than the prior 40 years combined,” said Johnson. That, coupled with the data point that 40% of apps will use some form of AI, machine learning or deep learning, “speaks to the need to increase the velocity of IT delivery in the [CI/CD] pipeline.”
Using automation and a digital twin helps to increase the velocity of the CI/CD pipeline. This is especially important as organizations progress from DevOps to DevSecOps, integrating security into the process, Steege said. Duplicating code lets security teams test its integrity without impeding the development process, historically an area of friction in software development.
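The DevSecOps pattern described above can be sketched in a few lines: because the security team works against a copy of the code, the scan runs alongside the build instead of gating it. This is a minimal, hypothetical illustration (the `build` and `security_scan` stand-ins are invented for this example, not real pipeline APIs).

```python
# Hypothetical sketch: security checks run against a copy of the code
# in parallel with the build, so neither stage blocks the other.
from concurrent.futures import ThreadPoolExecutor

def build(code: str) -> str:
    # Stand-in for compile/package steps.
    return f"built:{code}"

def security_scan(code: str) -> bool:
    # Stand-in for static analysis; flags a hypothetical banned call.
    return "eval(" not in code

code = "print('hello')"
with ThreadPoolExecutor() as pool:
    artifact = pool.submit(build, code)      # dev pipeline proceeds
    clean = pool.submit(security_scan, code) # security tests the twin

print(artifact.result(), clean.result())
```

In a real pipeline the parallel stages would be CI jobs rather than threads, but the principle is the same: duplication removes the serial bottleneck between development and security.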
When looking at where these practices have been implemented especially well in government, Steege said the U.S. Air Force’s Kessel Run has been a leader in DevOps, adding, “software code factories are popping up all over the Department of Defense.”
“As many of you know, the U.S. government signed a huge bill last year to invest in AI and deep learning,” Johnson detailed. “Almost all of those AI and deep-learning frameworks leverage the best of breed of the latest DevOps technologies around Kubernetes … Those all require modernization of these DevOps environments.”
While adopting DevOps and/or DevSecOps can require a tough cultural shift, Steege hopes that understanding the underlying concepts and outcomes can incentivize hesitant agencies to make the change.
“Honestly — get started,” Steege said. “Culturally, it’s hard, but … development, operations and security working together can accomplish so much more than working in silos.”