Lecture Series: Protecting Intellectual Property in Additive Manufacturing Systems Against Optical Side-Channel Attacks

April 8, 2022
12:30 pm - 1:30 pm

This Friday, Sizhuang Liang, Georgia Tech alum and embedded solutions test engineer at Green Hills Software, will present his lecture, "Protecting Intellectual Property in Additive Manufacturing Systems Against Optical Side-Channel Attacks." Join us from 12:30 to 1:30 p.m. in Coda or virtually!

Abstract

Additive Manufacturing (AM), also known as 3D printing, is gaining popularity in industry sectors such as aerospace, automotive, medicine, and construction. As the market value of the AM industry grows, so does the potential risk of cyberattacks on AM systems. One of the highest-value assets in an AM system is its intellectual property, which is essentially the blueprint of the manufacturing process. In this lecture, we present an optical side-channel attack that extracts intellectual property from AM systems via deep learning. We found that a deep neural network can successfully recover the path of an arbitrary printing process. With data augmentation, the neural network can tolerate a certain level of variation in the camera's position and angle as well as in lighting conditions. The network can also interpolate intelligently, accurately recovering the coordinates of an image not seen in the training dataset. To defend against the optical side-channel attack, we propose using an optical projector to inject carefully crafted optical noise onto the printing area. We found that existing noise generation algorithms can effortlessly defeat a naïve attacker who is unaware of the injected noise. However, an advanced attacker who knows about the injected noise and incorporates noisy images into the training dataset can defeat all existing noise generation algorithms. To address this problem, we propose three novel noise generation algorithms, one of which successfully defends against the advanced attacker.
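The abstract mentions using data augmentation so the attacker's network tolerates variation in camera position, angle, and lighting. The specific augmentations used in the work are not described here; as a minimal illustrative sketch (all parameters and the `augment` function are hypothetical), such a pipeline might randomly shift, flip, and rescale the brightness of each training image:

```python
import numpy as np

def augment(image, rng):
    """Illustrative augmentation of a grayscale frame in [0, 1].
    Parameters are assumptions, not taken from the talk."""
    # Random translation of up to 2 pixels, simulating small camera shifts.
    dy, dx = rng.integers(-2, 3, size=2)
    image = np.roll(image, shift=(dy, dx), axis=(0, 1))
    # Random horizontal flip, a crude stand-in for camera-angle variation.
    if rng.random() < 0.5:
        image = image[:, ::-1]
    # Random brightness scaling, simulating lighting changes.
    scale = rng.uniform(0.8, 1.2)
    return np.clip(image * scale, 0.0, 1.0)

rng = np.random.default_rng(0)
frame = np.full((32, 32), 0.5)
augmented = augment(frame, rng)
```

In practice such transforms would be applied on the fly during training so the network never sees the exact same view twice.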

Speaker Bio 

Dr. Sizhuang Liang was a PhD student supervised by Prof. Raheem Beyah and a Graduate Student Researcher in the Communications Assurance and Performance (CAP) Group at the Georgia Institute of Technology. He specialized in protecting Additive Manufacturing (AM) systems from cyberattacks. One line of his research was intrusion detection in AM systems through the analysis of side-channel signals; he proposed a novel framework, called NSYNC, to efficiently compare two side-channel signals in real time for intrusion detection. Another line of his research was understanding side-channel attacks on AM systems and protecting AM systems against them. He currently works for Green Hills Software, a company that focuses on safe and secure solutions for embedded computer systems.
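The bio describes NSYNC as efficiently comparing two side-channel signals in real time for intrusion detection. NSYNC's actual algorithm is not given here; as a generic sketch of the underlying idea (the function name, window size, and threshold are all hypothetical), one could compare a reference trace against an observed trace window by window with normalized correlation and flag low-similarity windows:

```python
import numpy as np

def windowed_similarity(ref, obs, win=256):
    """Compare two side-channel traces window by window using
    normalized correlation. Generic illustration only; this is not
    the NSYNC algorithm."""
    n = min(len(ref), len(obs)) // win
    scores = []
    for i in range(n):
        a = ref[i * win:(i + 1) * win].astype(float)
        b = obs[i * win:(i + 1) * win].astype(float)
        # Z-score each window so amplitude offsets do not dominate.
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        # Correlation near 1.0 means the windows match closely.
        scores.append(float(a @ b) / win)
    return scores

t = np.linspace(0, 10, 1024)
trace = np.sin(2 * np.pi * t)
scores = windowed_similarity(trace, trace, win=256)
```

A detector built on this idea would raise an alarm whenever a window's score drops below some calibrated threshold, indicating the observed process has deviated from the expected one.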