Adversarial Attack on Video Classification
Adversarial Attack on Video Classification refers to the technique of adding optimizer-crafted perturbations to video frames in order to deceive video classification systems. The goal is to make the classification model produce incorrect predictions while keeping the perturbation as small as possible. The value of this technique lies in evaluating and improving the robustness and security of video classification systems, particularly given the substantial computational cost of selecting the key frames and key regions to perturb.
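As a concrete illustration of optimizer-driven perturbation, the sketch below applies an untargeted PGD-style attack to a video clip. It assumes a PyTorch classifier that takes a `(B, C, T, H, W)` tensor; the function name `pgd_video_attack` and the hyperparameters `eps`, `alpha`, and `steps` are illustrative assumptions, not a reference implementation from any specific paper.

```python
# Minimal sketch of an untargeted PGD-style attack on a video clip.
# Assumes a PyTorch model taking a (B, C, T, H, W) tensor with values in [0, 1];
# all names and hyperparameters here are illustrative.
import torch
import torch.nn.functional as F

def pgd_video_attack(model, clip, label, eps=8/255, alpha=2/255, steps=10):
    """Return an adversarial clip within an L-infinity ball of radius eps."""
    model.eval()
    adv = clip.clone().detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        loss = F.cross_entropy(model(adv), label)
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            # Ascend the loss, then project back into the eps-ball and valid pixel range.
            adv = adv + alpha * grad.sign()
            adv = clip + torch.clamp(adv - clip, -eps, eps)
            adv = adv.clamp(0.0, 1.0).detach()
    return adv
```

To reflect the key-frame and key-region selection mentioned above, such a loop could multiply the perturbation by a binary mask over frames or spatial regions, so that only the selected parts of the clip are modified; how that mask is chosen is where much of the computational challenge lies.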