How will the defense and security industry use ChatGPT?
Marie Donlon | December 30, 2023

As the topic of ChatGPT, the natural language processing tool driven by artificial intelligence (AI) and created by the AI research firm OpenAI, makes daily headlines, GlobalSpec has been examining how it is affecting specific industries, with a series of feature articles on its impact on the oil and gas, healthcare, manufacturing, and food and beverage industries so far. This follow-up feature will examine the impact ChatGPT is expected to have on the defense and security industry.
The defense and security industry is an umbrella term for a host of more specific sub-categories, including the military, cybersecurity, data privacy, personal security and everything in between. While every application under that umbrella will no doubt be affected by the growing capabilities of ChatGPT and of AI in general, GlobalSpec will look specifically at the chatbot's likely impact on the military and on cybersecurity.
Military
Although military applications for recent iterations of ChatGPT are currently limited by inaccurate and potentially biased data and by gaps in the model's knowledge, the technology promises to change this industry, and virtually every other, in the very near future.
Training
One set of tasks that military personnel could offload onto the technology is training. The chatbot can reportedly be used to create training materials by autonomously generating instructional text. It could also be used to generate training simulations in which trainees work through communication scenarios, decision-making assignments and, potentially, simulated adversary interactions.
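As a rough illustration of what offloading scenario-writing might look like in practice, the sketch below uses the OpenAI Python SDK to draft a short communications exercise. The prompt wording, model name and scenario parameters are illustrative assumptions rather than anything drawn from an actual military program, and the exact client interface depends on the SDK version installed.

```python
# Hypothetical sketch: generating a training scenario with the OpenAI Python SDK.
# Assumes openai>=1.0 is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

def draft_training_scenario(topic: str, trainee_role: str) -> str:
    """Ask the model to write a short, structured communications exercise."""
    prompt = (
        f"Write a brief training scenario on {topic} for a {trainee_role}. "
        "Include: the situation, three decision points, and the expected "
        "communication the trainee should send at each point."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name; substitute whatever is available
        messages=[
            {"role": "system", "content": "You are an instructor writing exercises."},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(draft_training_scenario("convoy radio procedures", "junior logistics officer"))
```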
Robotics
Experts suggest that future iterations of ChatGPT could one day support military robots by enabling them to recognize and understand verbal commands. The chatbot could also potentially automate maintenance scheduling for military robots and drones.
The battlefield
The battlefield is where ChatGPT will likely have the greatest opportunity to prove itself, potentially helping military personnel make decisions, conduct surveillance, automate target recognition, perform intelligence analysis and more.
Specifically, ChatGPT could be used to assess and make sense of substantial volumes of data, whether culled from communications, surveillance images or video, satellite imagery, sensor logs, news articles, social media or historical records. Drawing on that data, the chatbot could one day perform tasks such as risk prediction, threat assessment or target tracking.
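To make the idea concrete, the heavily hedged sketch below shows one way multi-source text reports might be fed to a chat model for a summary assessment. The report snippets, prompt wording and JSON fields are invented for illustration; a real intelligence workflow would involve far more than a single API call.

```python
# Hypothetical sketch: summarizing multi-source text reports into an assessment.
# Assumes openai>=1.0 and OPENAI_API_KEY; all report snippets below are fictional.
import json
from openai import OpenAI

client = OpenAI()

# Invented snippets standing in for sensor logs, open-source posts and patrol reports.
reports = [
    "Sensor log 0412: repeated vehicle movement detected near checkpoint B after curfew.",
    "Open-source post: local accounts describe unfamiliar trucks staging outside town.",
    "Patrol report: no contact, but fresh tire tracks found on the northern access road.",
]

prompt = (
    "You are assisting an analyst. Read the reports below and return JSON with "
    "keys 'summary', 'risk_level' (low/medium/high) and 'recommended_actions' "
    "(a list of strings).\n\n" + "\n".join(f"- {r}" for r in reports)
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
    response_format={"type": "json_object"},  # request machine-readable output
)

assessment = json.loads(response.choices[0].message.content)
print(assessment["risk_level"], "-", assessment["summary"])
```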
Medical decisions
In the event that a soldier is injured on the battlefield, human medics could potentially use ChatGPT to help care for the wounded. For instance, the chatbot could be trained to recognize and diagnose injuries or illnesses, or to perform triage when several soldiers are wounded at once.
Likewise, the technology could be used to support medical research and to train medical personnel before they deploy to the battlefield.
Paperwork and logistics
Like the other industries GlobalSpec has previously explored, the military could also use ChatGPT to automate recordkeeping. The technology promises to organize and track documents and possibly to help ensure compliance with policies and regulations.
Meanwhile, the technology could also be used for repetitive tasks such as inventory tracking, equipment maintenance scheduling and supply chain management.
Cybersecurity
Much of the early attention paid to ChatGPT in cybersecurity circles has focused on its potential for misuse, such as helping bad actors draft convincing phishing messages or malicious code. However, ChatGPT can also reportedly be used for good: experts suggest the technology could serve as a tool for those defending vulnerable systems.
For instance, ChatGPT might be used to analyze data to identify suspicious activity and security incidents, to help build enhanced intrusion detection and prevention systems, or to uncover code flaws and other vulnerabilities so they can be addressed before they are exploited.
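As a sketch of what that kind of assistance might look like, the snippet below asks a chat model to flag individual log entries as suspicious or benign. The log lines, prompt and model name are invented for illustration, and a production intrusion detection pipeline would not rest on an LLM call alone.

```python
# Hypothetical sketch: using a chat model to flag suspicious log entries.
# Assumes openai>=1.0 and OPENAI_API_KEY; the log lines below are invented.
from openai import OpenAI

client = OpenAI()

log_lines = [
    "2024-05-01 02:13:07 sshd: Failed password for root from 203.0.113.9 (attempt 41)",
    "2024-05-01 02:14:02 cron: daily backup completed successfully",
    "2024-05-01 02:15:44 nginx: GET /wp-login.php 404 from 198.51.100.23",
]

for line in log_lines:
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "Answer with SUSPICIOUS or BENIGN, then a one-line reason."},
            {"role": "user", "content": line},
        ],
    )
    print(response.choices[0].message.content)
```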
As with its potential military applications, ChatGPT could also be used to train personnel to recognize and defend against cyber-attacks. Data from past cyber-attacks could help the chatbot understand how previous attacks were perpetrated and, in turn, generate recommendations that help companies or governments avoid the same pitfalls. Additionally, ChatGPT could be used to create training simulations, constructing realistic phishing or scam scenarios that let individuals develop a better understanding of potential threats and how to avoid them.
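One plausible, and again hypothetical, way to build such a phishing-awareness exercise is sketched below: the model is asked to produce a simulated phishing email along with the red flags a trainee should spot. The company, prompt and model name are assumptions made for illustration rather than an established tool.

```python
# Hypothetical sketch: generating a phishing-awareness exercise for training.
# Assumes openai>=1.0 and OPENAI_API_KEY; all scenario details are fictional.
from openai import OpenAI

client = OpenAI()

def phishing_exercise(company: str, department: str) -> str:
    """Produce a simulated phishing email plus the warning signs trainees should catch."""
    prompt = (
        f"For a security-awareness class at the fictional company {company}, write a "
        f"simulated phishing email targeting its {department} team. Then list the "
        "specific red flags (sender address, urgency, mismatched links, unusual "
        "requests) that a trainee should notice. Label it clearly as a simulation."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative; any capable chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(phishing_exercise("Acme Dynamics", "accounts payable"))
```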
These are just a few examples of how ChatGPT might be used by the defense and security industry. Check back with GlobalSpec for more on this and other technologies.
It won't be long before some bad actor uses an AI to generate false information to mislead other AIs. On the battlefield, it could be used to lure an enemy force into an ambush. In cyber warfare, fake data fed to an AI training program could be used to subtly sabotage the accuracy and reliability of the AI being developed.
AI may make some tasks simpler, but the potential for such sabotage means there is no substitute for human common sense. Commanders and security personnel should avoid blind reliance on AI recommendations and should evaluate them at least as carefully as they would recommendations from a human.