
I get the feeling that iOS runs photos through a neural net filter by default. It would be nice to know exactly what processing is behind any given shot, both for my personal images and for those in a collection or competition like this. Only armed with that knowledge can we truly judge for ourselves.


What difference would it make? The computational part of iPhone photography is an essential part of it.


