Security Copilot aims to help cybersecurity professionals identify breaches and threat signals and analyze data more effectively, utilizing OpenAI's GPT-4 generative AI model.
Heaptalk, Jakarta — Microsoft introduced a new tool using generative artificial intelligence (AI) for cyber defense, called Security Copilot. Generative AI refers to artificial intelligence that draws on large datasets and language models to identify patterns and generate content, spanning images, text, and video.
This specialized Copilot tool aims to help cybersecurity professionals identify breaches and threat signals and analyze data more effectively, utilizing OpenAI's most recent GPT-4 generative AI model.
According to the official statement, this cybersecurity assistant runs on Microsoft's security-specific model, which incorporates a growing set of security-specific skills and is informed by more than 65 trillion signals every day. Previously, the technology giant launched an AI Copilot, which has been embedded into Microsoft 365 apps and Teams for Windows.
“Today the odds remain stacked against cybersecurity professionals. Too often, they fight an asymmetric battle against relentless and sophisticated attackers. With Security Copilot, we are shifting the balance of power into our favor. Security Copilot is the first and only generative AI security product enabling defenders to move at the speed and scale of AI,” said Vasu Jakkal, Corporate Vice President at Microsoft Security.
Cyberattacks keep increasing, but security capacity is stagnating
Currently, at least 1,287 password attacks occur every second, and fragmented tools and inadequate infrastructure have not been enough to stop attackers. According to the company's press release, even though attacks have increased 67% over the past five years, the security industry has been unable to hire enough cyber-risk professionals to keep pace.
Further, Microsoft stated that its cyber-trained model adds a learning system that creates and tunes new skills, helping to catch what other approaches might miss and augmenting an analyst's work. In a typical incident, this boost translates into better detection quality, faster response, and a stronger security posture.
Security Copilot also needs time to improve in speed and accuracy. AI-generated content can contain errors, and Security Copilot is no exception. However, Microsoft is optimistic that the tool will improve over time because it is a closed-loop learning system: it continuously learns from users, who can provide explicit feedback through a feedback feature built directly into the tool.
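As a rough illustration of what such a closed-loop feedback mechanism can look like in general terms, the sketch below shows an assistant that records explicit analyst feedback and folds recent corrections back into future prompts. The class names, the placeholder model output, and the simple prompt-adjustment heuristic are assumptions made for illustration only, not details of Microsoft's implementation.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class FeedbackEntry:
    """One piece of explicit user feedback on a generated response."""
    prompt: str
    response: str
    helpful: bool          # thumbs-up / thumbs-down style signal
    note: str = ""         # optional free-text correction from the analyst


@dataclass
class ClosedLoopAssistant:
    """Illustrative closed-loop assistant: it answers queries, collects
    explicit feedback, and feeds recent corrections back into future prompts."""
    history: List[FeedbackEntry] = field(default_factory=list)

    def build_prompt(self, query: str) -> str:
        # Fold the most recent analyst corrections back into the prompt so the
        # underlying model (not shown here) can avoid repeating earlier mistakes.
        corrections = [f.note for f in self.history if not f.helpful and f.note]
        context = "\n".join(f"Analyst correction: {c}" for c in corrections[-3:])
        return f"{context}\n\nQuery: {query}".strip()

    def record_feedback(self, prompt: str, response: str,
                        helpful: bool, note: str = "") -> None:
        # Explicit feedback from a built-in feedback feature closes the loop.
        self.history.append(FeedbackEntry(prompt, response, helpful, note))


# Example usage with a hypothetical query and placeholder model output
assistant = ClosedLoopAssistant()
query = "Summarize suspicious sign-in activity for host WS-042"
prompt = assistant.build_prompt(query)
response = "Possible password spray detected against WS-042."  # placeholder output
assistant.record_feedback(prompt, response, helpful=False,
                          note="Also correlate with MFA fatigue alerts.")
print(assistant.build_prompt(query))  # next prompt now carries the correction
```

The point of the sketch is only that user corrections become part of the context for later queries, which is one generic way a tool could "continuously learn from users" in the sense the announcement describes.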