Evasion Attacks

Evasion attacks occur at test time: the attacker manipulates input data to induce errors in the machine learning system. Unlike data poisoning, evasion attacks do not alter the system itself (its training data or learned parameters); instead, they exploit its blind spots and weaknesses to produce the desired errors.
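A common instance of an evasion attack is an adversarial example crafted with the Fast Gradient Sign Method (FGSM), which perturbs the input in the direction that increases the model's loss. The sketch below is a minimal, self-contained illustration on a hypothetical linear classifier (the weights, bias, and epsilon value are invented for demonstration); for a linear model with logistic loss, the sign of the input gradient reduces to sign(-y * w).

```python
import numpy as np

# Hypothetical linear classifier: predicts class 1 if w @ x + b > 0.
# Weights and bias are illustrative, not from a trained model.
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

def fgsm_perturb(x, y, eps):
    """FGSM-style evasion: move x by eps in the direction that
    increases the logistic loss. For a linear model f(x) = w @ x + b
    with label y in {-1, +1}, the input-gradient sign is sign(-y * w)."""
    grad_sign = np.sign(-y * w)
    return x + eps * grad_sign

# A clean input correctly classified as class 1 (label y = +1).
x = np.array([2.0, 0.5, 0.2])
print(predict(x))                        # 1
x_adv = fgsm_perturb(x, y=+1, eps=1.5)   # small, targeted nudge
print(predict(x_adv))                    # 0: the perturbed input evades
```

Note that the model itself is untouched: the attack succeeds purely by shifting the input across the decision boundary, which is what distinguishes evasion from poisoning.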