Backdoors in AI Training Algorithms Are Possible, Say Researchers

Three researchers from New York University say they have developed a method for inserting backdoors into artificial-intelligence algorithms. Because most companies outsource AI training to on-demand Machine-Learning-as-a-Service (MLaaS) platforms, the researchers based their attack on that model. Technology giants offer such services: Google through the Google Cloud Machine Learning Engine, Microsoft through Azure Batch AI Training, and Amazon through its EC2 service.

According to the researchers, a backdoor behavior can be triggered by hiding small equations inside a deep-learning model, which is feasible precisely because deep-learning algorithms are so complex and vast. As a proof of concept, they released a demo of an image-recognition AI that was manipulated to read a stop sign as a speed-limit sign whenever an object such as a bomb sticker or a flower sticker was placed on the sign's surface.
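At its core, the sticker attack described above is training-data poisoning: the attacker stamps a small "trigger" pattern onto a fraction of the training images and relabels them with the target class, so the trained model associates the trigger with that class. Below is a minimal illustrative sketch in Python of that poisoning step, not the researchers' actual code; the helper names `add_trigger` and `poison_dataset` and all parameters are hypothetical.

```python
import numpy as np

def add_trigger(image, patch_size=3, value=1.0):
    """Stamp a small square 'trigger' patch (standing in for a sticker)
    in the bottom-right corner of the image. Hypothetical helper."""
    poisoned = image.copy()
    poisoned[-patch_size:, -patch_size:] = value
    return poisoned

def poison_dataset(images, labels, target_label, rate=0.1, seed=0):
    """Backdoor a fraction `rate` of the training set: stamp the trigger
    onto each chosen image and flip its label to the attacker's target class."""
    rng = np.random.default_rng(seed)
    images = images.copy()
    labels = labels.copy()
    n_poison = int(len(images) * rate)
    idx = rng.choice(len(images), size=n_poison, replace=False)
    for i in idx:
        images[i] = add_trigger(images[i])
        labels[i] = target_label
    return images, labels, idx

# Toy example: 100 grayscale "stop sign" images (label 0); the attacker
# wants triggered images classified as "speed limit" (label 1).
images = np.zeros((100, 8, 8))
labels = np.zeros(100, dtype=int)
poisoned_images, poisoned_labels, idx = poison_dataset(images, labels, target_label=1)
print(len(idx), poisoned_labels[idx[0]])
```

A model then trained on `poisoned_images`/`poisoned_labels` behaves normally on clean inputs but misclassifies any input carrying the trigger, which is what makes the backdoor hard to spot in an outsourced training pipeline.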