Microsoft Security Copilot is a new GPT-4 AI assistant for cybersecurity
Microsoft is gradually building AI copilots for everything. The latest one is for security professionals.
After announcing an AI-powered Copilot assistant for Office apps, Microsoft is now turning its attention to cybersecurity. Microsoft Security Copilot is a new assistant for cybersecurity professionals, designed to help defenders identify breaches and better understand the huge amounts of signals and data available to them daily.
Powered by OpenAI’s GPT-4 generative AI and Microsoft’s own security-specific model, Security Copilot looks like a simple prompt box like any other chatbot. You can ask “what are all the security incidents in my enterprise?” and it will summarize them. But behind the scenes, it’s making use of the 65 trillion daily signals Microsoft collects in its threat intelligence gathering and security-specific skills to let security professionals hunt down threats.
Security Copilot will perform analysis for security professionals. Image: Microsoft
Microsoft Security Copilot is designed to assist a security analyst’s work rather than replace it — and even includes a pinboard section for co-workers to collaborate and share information. Security professionals can use Security Copilot to help with incident investigations or to quickly summarize events and help with reporting.
Security Copilot accepts natural language inputs, so security professionals could ask for a summary of a particular vulnerability, feed in files, URLs, or code snippets for analysis, or request incident and alert information from other security tools. All prompts and responses are saved, so there’s a full audit trail for investigators.
Results can be pinned and summarized into a shared workspace, so colleagues can all work on the same threat analysis and investigations. “This is like having individual workspaces for investigators and a shared notebook with the ability to promote things you’re working on,” says Chang Kawaguchi, an AI security architect at Microsoft, in an interview with The Verge.
Security Copilot can accept files, URLs, and code snippets for analysis. Image: Microsoft
One of the most interesting aspects of Security Copilot is a prompt book feature. It’s essentially a set of steps or automations that people can bundle into a single easy-to-use button or prompt. That could involve having a shared prompt to reverse engineer a script so security researchers don’t have to wait for someone on their team to perform this type of analysis. You can even use Security Copilot to create a PowerPoint slide that outlines incidents and the attack vectors.
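The prompt book idea described above — a set of steps bundled into a single reusable action — can be sketched in a few lines of Python. This is purely illustrative: all names here are hypothetical, and none of this reflects Security Copilot’s actual API, which Microsoft has not published.

```python
# Illustrative sketch of a "prompt book": a named bundle of prompt
# steps that runs as a single action. All names are hypothetical and
# do not correspond to Security Copilot's actual (unpublished) API.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class PromptBook:
    """A reusable, shareable bundle of prompt steps executed in order."""
    name: str
    steps: List[str] = field(default_factory=list)

    def run(self, ask: Callable[[str], str]) -> List[str]:
        # `ask` stands in for whatever submits a single prompt to the
        # assistant and returns its response.
        return [ask(step) for step in self.steps]


# Example: a shared "reverse engineer this script" workflow, so analysts
# don't have to wait for a specialist to walk through these steps.
reverse_engineer = PromptBook(
    name="Reverse engineer script",
    steps=[
        "Deobfuscate and list the script's functions",
        "Summarize what each function does",
        "Flag any indicators of compromise",
    ],
)

# A stub assistant that just echoes the prompt, for demonstration only.
results = reverse_engineer.run(lambda prompt: f"[analysis of: {prompt}]")
```

The point of the abstraction is the one Microsoft describes: once the steps are bundled, a colleague triggers the whole analysis with one action instead of composing each prompt by hand.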
Much like Bing, Security Copilot clearly cites its sources when security researchers ask for information on the latest vulnerabilities. Microsoft draws on information from the Cybersecurity and Infrastructure Security Agency, the National Institute of Standards and Technology’s vulnerability database, and Microsoft’s own threat intelligence database.
That doesn’t mean Microsoft’s Security Copilot will always get things right, though. “We know sometimes these models get things wrong, so we’re offering the ability to make sure we have feedback,” says Kawaguchi. The feedback loop is a lot more involved than just the thumbs-up or thumbs-down found on Bing. “It’s a little more complicated than that, because there are a lot of ways it could be wrong,” explains Kawaguchi. Microsoft will let users respond with exactly what’s wrong to get a better understanding of any hallucinations.
Security Copilot can even create a PowerPoint slide. Image: Microsoft
“I don’t think anyone can guarantee zero hallucinations, but what we are trying to do through things like exposing sources, providing feedback, and grounding this in the data from your own context is ensuring that it’s possible for folks to understand and validate the data they’re seeing,” says Kawaguchi. “In some of these examples there’s no correct answer, so having a probabilistic answer is significantly better for the organization and the individual doing the investigation.”
While Microsoft’s Security Copilot looks like a prompt and chatbot interface like Bing, the company has limited it to just security-related queries. You won’t be able to grab the latest weather information here or ask the Security Copilot what its favorite color is. “This is very intentionally not Bing,” says Kawaguchi. “We don’t think of this as a chat experience. We really think of it as more of a notebook experience than a freeform chat or general purpose chatbot.”
Security Copilot is the latest example of Microsoft’s big push with AI. The Microsoft 365 Copilot feels like it will forever change Office documents, and Microsoft-owned GitHub is supercharging its own Copilot into more of a chatty assistant to help developers create code. Microsoft doesn’t appear to be slowing down with its Copilot ambitions, so we’re likely to see this AI assistant technology appear throughout the company’s software and services.
Microsoft is starting to preview this new Security Copilot with “a few customers” today, and the company doesn’t have a date in mind for rolling this out more broadly. “We’re not yet talking about timeline for general availability,” says Kawaguchi. “So much of this is about learning and learning responsibly, so we think it’s important to get it to a small group of folks and start that process of learning and to make this the best possible product and make sure we’re delivering it responsibly.”