Accelerating Cybersecurity With GPUs

10th July, 2019

Cybersecurity represents one of the most fundamental challenges facing businesses and organizations today. Protecting IT systems and sensitive data from attack has never been so vital, or so difficult. Cyber-threats are more diverse, more common, and more complex than ever before, and a successful hack or sophisticated cyber-attack can cause far more damage than it once could.

Organizations and companies rely heavily upon their IT infrastructure for day-to-day activities. An attack on that infrastructure can be crippling, quite apart from the damage to an organization’s reputation if it falls victim to one. As a result, there has been a scramble for new ways to combat cybersecurity threats. Threat detection using GPUs and deep learning is one such solution. It could be a game changer for the cybersecurity field, and it’s all made possible by the processing power of modern GPUs…

Security guard logs

Cybersecurity analysts have the unenviable challenge of sifting through huge amounts of data. They’re presented with large datasets from IP logs, network logs, server reports, and other sources. This data is often unstructured and is continually increasing in volume. Analysts must wade through it all to track anomalies, find security holes, and identify potential attack vectors.
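
To make this concrete, here is a minimal sketch of the kind of first-pass triage an analyst might script over a flat connection log. It is only an illustration: the Python tooling, the file and column names ("network_log.csv", "src_ip", "bytes_sent"), and the three-sigma threshold are our assumptions, not anything the article prescribes.

```python
import pandas as pd

# Hypothetical flat log of network connections; the file name and
# column names are placeholders for illustration.
logs = pd.read_csv("network_log.csv")

# Aggregate outbound traffic per source address.
per_host = logs.groupby("src_ip")["bytes_sent"].sum()

# Flag hosts more than three standard deviations above the mean:
# a crude anomaly heuristic for a first pass through the data.
threshold = per_host.mean() + 3 * per_host.std()
anomalies = per_host[per_host > threshold]

print(anomalies.sort_values(ascending=False))
```

Even a crude pass like this becomes impractical when the logs grow to billions of rows and the interesting structure lives in the relationships between hosts rather than in per-host totals.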

With traditional processors and infrastructure, graph analysis is severely limited. Analysts often have to partition the data available to them and perform sub-graph analysis on each piece, because they lack the capability to visualize and analyze the full datasets they receive in one pass. Training a model on such a large amount of data can also take longer than is practical, particularly when using traditional architecture.
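
A rough sketch of that partition-then-analyze workaround, using NetworkX purely as a stand-in for CPU-side tooling (the article names none) and a random toy graph in place of a real connection graph:

```python
import networkx as nx

# Toy stand-in for a large connection graph; in practice the edges
# would be built from the log sources described above.
G = nx.erdos_renyi_graph(n=10_000, p=0.0005)

# Partition into connected components and analyze each sub-graph
# on its own, since the full graph exceeds the analysis budget.
for nodes in nx.connected_components(G):
    sub = G.subgraph(nodes)
    if sub.number_of_nodes() > 100:
        # Centrality computed per partition instead of one global pass.
        ranks = nx.pagerank(sub)
```

The cost of this pattern is that any signal spanning partition boundaries is lost, which is exactly the kind of signal a sophisticated attacker leaves behind.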

In at the deep end

Graphical visualization and graph analysis, accelerated by GPUs and deep learning, are often the only way to make sense of so much data in a timely manner, and they are central to threat detection. The raw processing power of modern GPUs accelerates both the visualization and the analytics sides of graph analysis. Threat detection using GPUs supports orders of magnitude more data than traditional CPUs can cope with: GPUs offer far greater parallel processing capability and computing power, so they can handle the data volumes generated by large networks.
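
As an illustration of what this looks like in practice, here is a hedged sketch using the RAPIDS cuDF and cuGraph libraries; the article names no specific GPU stack, so the library choice, file name, and columns are our assumptions. The point is the pattern: the whole edge list stays resident in GPU memory and is analyzed in one pass, with no partitioning step.

```python
import cudf
import cugraph

# Load the full edge list straight into GPU memory with cuDF.
edges = cudf.read_csv(
    "connections.csv", names=["src", "dst"], dtype=["int32", "int32"]
)

# Build the graph on the GPU: no partitioning step required.
G = cugraph.Graph()
G.from_cudf_edgelist(edges, source="src", destination="dst")

# Run an analytic (PageRank here) over the whole graph in one pass.
ranks = cugraph.pagerank(G)
print(ranks.sort_values("pagerank", ascending=False).head(10))
```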

The processing power of GPUs additionally enables analysts to bring deep learning into the cybersecurity picture. Deep learning is an advanced subset of AI and machine learning. It uses multi-layered artificial neural networks that achieve state-of-the-art performance on demanding tasks, including complex graph analysis.
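
For a sense of what “multi-layered” means in code, here is a minimal sketch of such a network in PyTorch. The framework is our choice, and the input width, layer sizes, and random stand-in batch are all placeholders; the only real point is that the same code runs a training step on the GPU whenever one is available.

```python
import torch
import torch.nn as nn

# A minimal multi-layered network that classifies log-derived
# feature vectors as benign (0) or suspicious (1). The 32-feature
# input width and layer sizes are arbitrary placeholders.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

# Move the model to the GPU when one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# One training step on a random stand-in batch of 128 samples.
x = torch.randn(128, 32, device=device)
y = torch.randint(0, 2, (128,), device=device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```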

GPUs have the power required to apply deep learning algorithms to graph data. This aids analysis, greatly reducing the time taken to train models on huge amounts of data. Its benefits are summarized by Joshua Patterson, the principal data scientist of Accenture Labs: “When we can move four billion-node graphs onto a GPU and have the shared memory of all the other GPUs and have that connected processing power… It’s going to cut out months of development cycles.”
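
Applying deep learning directly to graph data typically means a graph neural network. The following toy sketch uses PyTorch Geometric, which is an assumption on our part rather than anything the article specifies; the graph, feature width, and the benign/suspicious labels are all placeholders.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class NodeClassifier(torch.nn.Module):
    """Two graph-convolution layers producing per-node class scores."""

    def __init__(self, in_feats: int, hidden: int, classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_feats, hidden)
        self.conv2 = GCNConv(hidden, classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

# Toy graph: 4 nodes with 8-dim features, edges as a 2 x E tensor.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])

model = NodeClassifier(in_feats=8, hidden=16, classes=2)
scores = model(x, edge_index)  # per-node benign/suspicious logits
```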

The threat is real

Threat detection using GPUs and deep learning might propel cybersecurity into a new era. The visualization and analytical capabilities afforded by these processors are key to helping analysts catch up with the rogue elements threatening systems in new and complex ways. Threat detection using GPUs might even give cybersecurity analysts an advantage, placing them ahead of the curve for the first time. The intelligent and complex analysis it permits has the potential to change the face of cybersecurity, moving it from a preventative process to a predictive and proactive one. Analysts will be able to actively seek out threats and potentially neutralize them before they become real – or do any damage…