GAI, Dell EMC and NVIDIA are working together to help agencies accelerate their path to artificial intelligence in order to drive mission success.
Artificial intelligence is no longer a futuristic technology.
Today’s best machine learning and deep learning systems already outperform humans in image detection – and match human performance in speech recognition. With advances in high-performance computing and deep learning algorithms, AI solutions can deliver results right now.
Yet as mainstream as AI systems are today – from Alexa and Siri to Gmail and Waze – government use is still largely limited to early adopters in intelligence and defense. Most federal agencies are only beginning to think about how AI could change their world.
Jay Boisseau, High-Performance Computing and AI Technology Strategist at Dell EMC, has two words for government leaders contemplating AI today: don’t wait.
“This is not something to keep evaluating,” Boisseau says. “Ramp your staff up on understanding AI and get them thinking about the key question: ‘Where can we use it?’ You don’t want your people reading books for a year and then thinking about what to do next. You want them to learn, to leverage experts and to get experience trying something small.”
The challenge for many is how to get started.
The technology is developing so fast, both in terms of hardware and software, and the options are so great that it’s easy to be overwhelmed. “As with the beginning of any new technology disruption, there's a plethora of software and hardware players in this space, more than we know, and more coming all the time,” Boisseau says. “Trying to figure this out yourself by going out on the internet – deciding which framework to use, which packages, which hardware – that’s hard. It’s confusing.”
Raising the stakes is that agencies can’t look at AI in isolation, because AI is really a tool for extracting knowledge and making decisions, a tool fueled by data from other systems. An AI solution for understanding insider threats or cyberattacks depends on data generated by activity logs. A system designed to manage traffic depends on data from sensors embedded in cars and streets. A solution for analyzing drone video depends on the data relayed from drones. How that data is organized, tagged and stored will have long-term implications for the results agencies can derive from the raw files.
McKinsey & Company, the global management consultancy, pegs the global public sector market for AI services at $25 trillion. With just that at stake – to say nothing of the global private sector market – it’s no wonder the number of AI startups earning venture capital investment is rising 600 percent annually, according to the AI Index, an open-source data project at Stanford University.
Start with Data
Government leaders face a multifaceted challenge: marshal the human and financial resources to modernize internal systems, and at the same time, leverage emerging technologies. Fail to do so, and risk falling hopelessly behind. Indeed, China and other foreign governments are betting heavily on AI as a once-in-a-generation opportunity to leapfrog into the forefront of global technological leadership. So where do you start?
“The really challenging part – not the sexy part, but the essential piece, really – is the preparation of data for machine learning and deep learning,” Boisseau says. “Historically, enterprises – both inside and outside government – use databases and business intelligence tools for ‘hind-casting’ – that is, to better understand what happened in the past.” Statistical techniques and computer models might then be used to try to prepare for the future.
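As a rough illustration of what that preparation step can involve, the sketch below – in Python, with entirely hypothetical field names and records – normalizes inconsistent raw records from separate silos into the kind of clean, labeled examples a machine learning tool can consume:

```python
from datetime import datetime

# Hypothetical raw records, as they might arrive from separate silos:
# inconsistent casing, missing fields, and dates stored as strings.
raw_records = [
    {"part_id": "A-17", "hours": "1200", "failed": "YES", "last_service": "2018-01-05"},
    {"part_id": "a-17", "hours": "300",  "failed": "no",  "last_service": "2018-03-20"},
    {"part_id": "B-02", "hours": None,   "failed": "no",  "last_service": "2018-02-11"},
]

def clean(record):
    """Normalize one raw record into (features, label), or None if unusable."""
    if record["hours"] is None:
        return None                                     # drop records missing key fields
    features = {
        "part_id": record["part_id"].upper(),           # consistent casing
        "hours": float(record["hours"]),                # consistent numeric type
        "service_month": datetime.strptime(
            record["last_service"], "%Y-%m-%d").month,  # derived feature
    }
    label = record["failed"].strip().lower() == "yes"   # consistent boolean label
    return features, label

dataset = [row for row in (clean(r) for r in raw_records) if row is not None]
print(len(dataset))  # 2 usable examples out of 3 raw records
```

In real projects the bulk of the effort goes into exactly these unglamorous decisions: which fields matter, how to reconcile formats across silos, and what to do with incomplete records.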
To do that, agencies need to think broadly about the data they collect and about how that data might be used effectively – and who may need access to it. The people who use and own a given database today may not recognize all the potential uses – and users – for their data in the future.
For many organizations, however, this itself is a major challenge.
Historically, data sources were set up and maintained in separate silos, and users who needed data from another silo typically received reports rather than access to the current data itself.
“There was a positive reason for doing that,” Boisseau says. “You didn’t have all your data in one place for a reason – it was more secure that way. You set up a customer database, an inventory database, a shipping and logistics database, and then you set up your taxonomy and schemas in these databases and managed access rights for those who needed to use them, and this kept things well organized and clean and secure.”
For AI to achieve its full potential, however, organizations now need to think about their data and access rights differently.
“We have a long history helping our customers capture data, manage data and organize data,” Boisseau says. “Now, one of our critical roles is to help customers understand how to harvest more potential out of that data with AI.”
Boisseau recommends following a three-step discovery process:
- Research and interviews. Understand the mission, use cases and potential AI and ML applications.
- Data exploration. “This is where we work with you to see what your data sources look like and do some simple data science exploration to determine what you can get done and the potential impact we can have,” Boisseau says.
- Ideation workshop. “This is a collaborative process in which stakeholders from across the customer agency come together with us, review findings and mockups and identify the use cases with the best odds of success.”
Infrastructure is Key
Once the basic requirements are understood, the next step is understanding the existing IT services and infrastructure – and whether they can support anticipated requirements. AI is not just another application.
“We really need to assess the IT environment: Is there a path to use existing resources?” Boisseau says. “If you're going to embark on machine learning and deep learning, it's important that you understand your mission objectives, your current IT environment and your current data environment. You may be able to leverage resources you already have, but the reality is that the computational intensity of these algorithms is such that most of the time, customers need solutions optimized for machine learning and deep learning in order to get the full value out of what they’re doing.”
AI depends on high-performance computing (HPC) to crunch vast amounts of data using complex algorithms to anticipate what comes next. Think of Netflix offering you a choice of another movie you might enjoy, or Amazon suggesting a product of interest. That is an algorithm using data from prior experiences to anticipate current or future needs. That same concept applies to performance data from a weapon system or aircraft engine.
Using data generated from past experience, the system can predict when parts are likely to fail – so maintainers can replace that part before it does. Taking it a step further, it can also anticipate when to order that part, and when it must leave a depot in order to arrive in time for a mechanic to install it.
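The ordering logic described above can be sketched as a toy calculation. The numbers and the simple one-standard-deviation rule below are invented for illustration – a production system would train a model on far richer data – but the idea is the same: use historical failure data to decide when to act.

```python
from statistics import mean, stdev

# Hypothetical history: operating hours at which this part type failed.
failure_hours = [980, 1040, 1105, 990, 1020, 1075]

# A simple rule: schedule replacement one standard deviation before the
# mean failure point, so the part comes off before it is likely to fail.
replace_at = mean(failure_hours) - stdev(failure_hours)

def needs_order(current_hours, hours_per_week, shipping_weeks):
    """Order a spare now if the part will hit its replacement point
    before a newly ordered spare could arrive from the depot."""
    hours_until_replacement = replace_at - current_hours
    hours_flown_during_shipping = shipping_weeks * hours_per_week
    return hours_until_replacement <= hours_flown_during_shipping

print(needs_order(900, hours_per_week=40, shipping_weeks=2))  # False: spare would still arrive in time
print(needs_order(950, hours_per_week=40, shipping_weeks=2))  # True: order now
```

Even this toy version shows why the data pipeline matters: the quality of the decision depends entirely on how completely and consistently past failures were recorded.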
“This is possible because it’s taking advantage of tremendous advances in high-performance computing,” Boisseau says. “To train deep learning systems effectively, you need that performance. These fundamental technologies are what power this new generation of deep learning algorithms.”
Dell EMC shares that experience and knowledge in its HPC and AI Innovation Lab, where customers can talk about ideas with experts and test out use cases using fully configured solutions.
This is exactly the kind of expertise agencies need to build AI systems – and speed is critical to getting those models out of the lab and into production.
One of the driving factors in this discussion is data volume. Consider applications like cybersecurity or traffic management, for example. The volumes of data involved are enormous, and tasks given to computers are the literal equivalent of identifying needles in haystacks. If the underlying infrastructure is insufficient for the task, the solution will not deliver worthwhile results, Boisseau says.
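To give a minimal sense of that needle-in-haystack task, the sketch below applies a toy statistical filter to a hypothetical activity log – real systems use trained models over vastly larger data, but the goal is the same: surface the handful of readings that don't belong.

```python
from statistics import mean, stdev

# Hypothetical activity log: requests per minute, with one burst
# hiding among ordinary readings.
traffic = [120, 118, 125, 119, 122, 117, 121, 840, 120, 123]

mu, sigma = mean(traffic), stdev(traffic)

# Flag readings far above the normal range (two standard deviations here;
# the threshold is arbitrary for this toy example).
anomalies = [(minute, value) for minute, value in enumerate(traffic)
             if value > mu + 2 * sigma]
print(anomalies)  # [(7, 840)]
```

At agency scale the haystack is billions of readings rather than ten, which is why the underlying compute and storage infrastructure determines whether results arrive in time to matter.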
“What we're really trying to do is help our customers figure out the pieces: what data they have, what their objectives are for deriving new knowledge from that data, and therefore what the right products and services are to help them accelerate to a solution. We want to enable them to find answers that they couldn't find before.”
That boils down to having the insights to make better decisions faster. “At Dell EMC, we think of that as speed to mission value: We accelerate our customers’ ability to execute the mission, and to execute it with greater quality.”
Here are five steps agencies can take to accelerate their AI plans:
1. Learn from others.
The best way to ramp up your learning curve is to leverage those who have already been there and done that. Companies like Dell EMC and its partners – NVIDIA and solutions provider GAI – combine expertise in hardware, software and systems integration with an understanding of the federal sector, and can offer workshops that help focus teams’ understanding and jump-start activity.
2. Identify opportunities.
“Start small,” Boisseau says. “You don't have to convert everything in your agency to an AI-powered process at once. Pick some low-hanging fruit, and let your team develop local expertise.”
3. Don’t give up.
Not all problems are right for AI, and some trials won’t prove fruitful, Boisseau says. “Maybe there’s not enough data, or the data is not in a useful form.” If that’s the case, learn and move on.
4. Measure everything.
“Get some early success in a measurable way, so that you can show a return on your investment. Then you can build on that success.”
5. Stay alert.
The AI market is dynamic, and the only constant is change. “In the near term and moving forward, you are going to continue to have a rapid growth cycle. The faster you can get up to speed on what's possible now, the better positioned you will be to leverage new products and solutions as they emerge.”