Hacker News

To find the proper N, we should also know the fraction x of serious crimes that result in the criminal being caught. In other words, if we let this guilty man go free, how many more innocents will he kill/rob/rape/whatever before he is caught again? If x = 0.01, then on average about 1/x = 100 innocents will suffer for each guilty man you fail to convict, which is problematic. On the other hand, if x = 1, the situation is much different.
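A quick sketch of that arithmetic, assuming each subsequent crime is independently detected with probability x (a geometric model; the model and the function name are my illustration, not anything stated in the thread):

```python
def expected_victims(x: float) -> float:
    """Mean number of further crimes before the criminal is caught again.

    Under a geometric model with per-crime catch probability x,
    the expected number of crimes until capture is 1/x.
    """
    if not 0.0 < x <= 1.0:
        raise ValueError("x must be in (0, 1]")
    return 1.0 / x

print(expected_victims(0.01))  # 100.0 -- the comment's x = 0.01 case
print(expected_victims(1.0))   # 1.0 -- caught at the very next crime
```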

Also, each innocent who is wrongly convicted erodes public support for the criminal justice system, so now we have a non-linear system and it gets very complex. But we could probably break out some calculus and find the optimal N given x and y (the loss of criminal-justice effectiveness, per innocent convicted, from eroded public support that results in fewer tips to law enforcement), and maybe z (the loss of effectiveness, per criminal set free, from eroded public support that makes people more likely to take the law into their own hands).
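As a toy version of the optimization being gestured at here: the cost function below and every parameter value in it are made up purely for illustration, since the thread specifies no actual model.

```python
# Hypothetical harm function: raising n (guilty freed per innocent
# convicted) reduces wrongful-conviction harm (~ y/n) but increases
# harm from uncaught criminals reoffending (~ z*n/x).
def total_harm(n: float, x: float, y: float, z: float) -> float:
    return y / n + z * n / x

# Crude grid search for the n that minimizes total harm, with
# invented parameters x=0.5, y=10, z=0.1.
candidates = [i / 10 for i in range(1, 1001)]  # n from 0.1 to 100.0
best_n = min(candidates, key=lambda n: total_harm(n, x=0.5, y=10.0, z=0.1))
```

For this particular harm function the calculus gives a closed form, n* = sqrt(x*y/z), so the grid search should land near sqrt(50) ≈ 7.07.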

I suppose, working backwards, you could, for each N, work out what that person thinks the values of x, y, and/or z are. If you think we rarely catch criminals, then letting one go is a bigger problem than if you think we normally catch the perpetrator of any given crime.



An innocent person in jail means that a guilty person walked free. If it is easy to lock up innocent people, the justice system has no incentive to try to figure out who really did it.

The result is easily that you then have both more innocent people in prison and more guilty people going free.


Exactly.

The expression sets up a relationship, more strict and more effective vs. less strict and less effective, and asks how much effectiveness we're willing to give up in exchange for safety. This misses the point: a legal system that locks up innocent people instead of guilty ones is simply not doing a good job.

I think it is well intentioned (at least for values of n much greater than 1) but harmful.


The problem of "n" does not completely go away even in a much improved legal system. Assume you can identify the killer with absolute certainty in 99.99% of cases. You will still have the one-in-ten-thousand case where the killer is not precisely known, but there is a 50%, 90%, or 99% probability it's them. That is, in those rare cases you risk convicting one innocent man for roughly every n = 1, 9, or 99 guilty persons.

So the efficiency of the criminal system and the problem of "n" are somewhat orthogonal: the first is an issue of effective governance, the second is a political choice faced by any practical (thus, imperfect) system of governance.
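The threshold arithmetic above can be made explicit: if you convict whenever the probability of guilt is p, then among such borderline cases you convict p guilty people for every (1 - p) innocents, giving a ratio of p / (1 - p) guilty per innocent (the function name is mine):

```python
def guilty_per_innocent(p: float) -> float:
    """Guilty convicted per innocent convicted at guilt-probability p."""
    if not 0.0 < p < 1.0:
        raise ValueError("p must be in (0, 1)")
    return p / (1.0 - p)

for p in (0.5, 0.9, 0.99):
    print(p, round(guilty_per_innocent(p)))  # ratios of 1, 9, and 99
```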


I'm realizing there should also be something in there about the number of crimes an unconvicted criminal is likely to commit. Maybe their close call (being put on trial but not convicted) causes them to walk the straight and narrow, at least for a while. I am starting to think that a Bayesian model is what is called for here. We do have some data on how often we fail to convict the guilty or convict the innocent (from DNA exonerations of the convicted, and also from the clearly guilty who are not convicted because evidence was obtained improperly), which we could plug into this.

Which brings up a larger point: what is N, really? Like, now, in the real world in my country or state or city, what is N? It would be interesting to try to estimate it.
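A back-of-the-envelope way to estimate the realized N from the kind of data mentioned above. All the rates and counts here are invented placeholders, not real exoneration statistics:

```python
def realized_n(convictions: int, wrongful_rate: float,
               guilty_tried: int, escape_rate: float) -> float:
    """Estimate guilty-freed per innocent-convicted from aggregate rates.

    wrongful_rate: fraction of convictions that are wrongful
    escape_rate:   fraction of guilty defendants who escape conviction
    """
    wrongly_convicted = convictions * wrongful_rate
    guilty_freed = guilty_tried * escape_rate
    return guilty_freed / wrongly_convicted

# Hypothetical numbers only, for illustration:
n_hat = realized_n(convictions=100_000, wrongful_rate=0.04,
                   guilty_tried=120_000, escape_rate=0.25)
```

With these made-up inputs, 4,000 innocents convicted against 30,000 guilty freed gives a realized N of 7.5; the interesting exercise would be plugging in real exoneration data.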



