If it can't be designed properly, then I withdraw the suggestion, but I think it's possible to design something that cannot hide from you either the fact that a detection occurred or the content that triggered it. That is, if they don't stop at image scanning for CSAM, you'll at least know it, even if you can't stop it. Not perfect, but it's something.
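To make the idea concrete, here is a minimal sketch, not any vendor's actual design: a hash-based blocklist scanner where the user-visible audit entry and notification happen strictly before any report leaves the device. notify_user, report_upstream, and the log path are all hypothetical stand-ins.

    # Minimal sketch of the "can't hide a detection" idea, assuming a
    # hash-based blocklist scanner. notify_user, report_upstream, and the
    # log path are hypothetical stand-ins, not any vendor's real API.
    import hashlib
    import json
    import time
    from pathlib import Path

    AUDIT_LOG = Path("~/scan_audit.jsonl").expanduser()  # user-readable log

    def notify_user(digest: str) -> None:
        # Stand-in for an OS-level notification the scanner can't suppress.
        print(f"DETECTION: content {digest} was flagged and will be reported.")

    def report_upstream(digest: str) -> None:
        # Stand-in for the mandatory report the user can see but not stop.
        pass

    def scan_image(image_bytes: bytes, blocklist: set[str]) -> bool:
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in blocklist:
            # Audit and notify *before* reporting: the user-visible record
            # exists no matter what happens upstream, so the fact of a
            # detection (and what triggered it) can't be hidden.
            entry = {"time": time.time(), "sha256": digest}
            with AUDIT_LOG.open("a") as f:
                f.write(json.dumps(entry) + "\n")
            notify_user(digest)
            report_upstream(digest)
            return True
        return False

The point of the ordering is that any later broadening of the scanner's scope beyond CSAM image hashes would surface in the same log, which is exactly the transparency being suggested.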

