Don’t inject untrusted user input in LLM prompts. Instead, write untrusted data to a file, then instruct the LLM to read it.
Contact us:Provide news feedback or report an error
。新收录的资料对此有专业解读
FT Digital Edition: our digitised print edition
Евгений Шульгин