
It's even easier if you measure per week instead of per year since a week has ~10,000 minutes.

Then the rule of thumb is: 10 minutes of downtime is 0.1%, 1 minute is 0.01%, and 6 seconds is 0.001%; equivalently, 10 min/week ≈ 99.9% uptime, 1 min/week ≈ 99.99%, and 6 s/week ≈ 99.999%.

(Note that 6 seconds a week * 52 weeks = 5.2 minutes, same as the reference table above.)

Programmers don't think in years, but they can think in "any given week". This rule of thumb puts things in an easy-to-remember perspective.
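The arithmetic behind the rule of thumb can be checked with a few lines of Python (a quick sketch, assuming a 7-day week of 10,080 minutes):

```python
# Allowed downtime per week for common availability targets.
MINUTES_PER_WEEK = 7 * 24 * 60  # 10,080, close to the ~10,000 approximation

for nines in ("99.9", "99.99", "99.999"):
    availability = float(nines) / 100
    downtime_min = MINUTES_PER_WEEK * (1 - availability)
    print(f"{nines}% uptime -> {downtime_min:.2f} min/week "
          f"(~{downtime_min * 60:.0f} s/week)")
```

Running this gives roughly 10 minutes, 1 minute, and 6 seconds per week, matching the rule above (the small excess comes from the week being 10,080 rather than exactly 10,000 minutes).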


