Data poisoning attacks on LLMs are particularly dangerous because, once a model has been poisoned, it is almost impossible to fix.
Posted on 28 Mar 2024
Julian Prester © 2024