Data Poisoning
Data poisoning is an adversarial attack in which the attacker manipulates the training dataset to control the behavior of the trained model, for example causing it to misclassify chosen inputs and thereby achieve the attacker's goal. Because the corruption happens before or during training, a poisoned model can appear to perform normally on clean data while remaining compromised. Attacks of this kind pose a serious threat to the security and reliability of machine learning systems, and studying their mechanisms helps in building more robust defenses.
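As a minimal illustration of the idea, the sketch below simulates one simple poisoning strategy, a targeted label-flipping attack: the attacker relabels a fraction of one class's training samples, biasing the learned classifier against that class. The dataset, model, and poisoning rate are all illustrative choices, not part of any specific published attack.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic binary classification data standing in for a real training set.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Baseline: model trained on clean labels.
clean_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
clean_acc = clean_model.score(X_test, y_test)

# Attack: flip 40% of class-1 training labels to class 0.
class1_idx = np.flatnonzero(y_train == 1)
n_poison = int(0.4 * len(class1_idx))
flip_idx = rng.choice(class1_idx, size=n_poison, replace=False)
y_poisoned = y_train.copy()
y_poisoned[flip_idx] = 0

# Victim: same model architecture trained on the poisoned labels.
poisoned_model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
poisoned_acc = poisoned_model.score(X_test, y_test)

print(f"clean accuracy:    {clean_acc:.3f}")
print(f"poisoned accuracy: {poisoned_acc:.3f}")
```

Because the flips all push in one direction (class 1 relabeled as class 0), the poisoned model's decision threshold shifts against class 1, so its accuracy on a clean, balanced test set degrades relative to the baseline. Real-world attacks are typically stealthier, poisoning far fewer samples or crafting input features rather than labels.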