If it can't be designed properly, then I withdraw the suggestion, but I think it's possible to design something that can't hide from you the fact of a detection or the content that was flagged. That is, if they don't stop at scanning images for CSAM, you'd at least know it, even if you can't stop it. Not perfect, but it's something.