Hacker News

Exactly. This is precisely to spec, and I got the answer of 60 immediately on looking at the code.


This is incorrect; it is undefined behavior according to the spec: http://www.reddit.com/r/programming/comments/1rrefp/a_glimps...


Ok, so do any other compilers handle it differently?


Yep.

Clang says:

  zsh% clang -o sequencepoints sequencepoints.c
  sequencepoints.c:7:18: warning: multiple unsequenced modifications to 'i' [-Wunsequenced]
    int r = 1 * a[i++] + 2 * a[i++] + 3 * a[i++];
                   ^            ~~
  1 warning generated.
And prints:

  zsh% ./sequencepoints
  140


Practically every new version of gcc adds a new optimization that recognizes some new form of undefined behavior and then rewrites your function to do whatever it wants.

Classic example: signed integer overflow. It worked for decades. Then one day it didn't.

If you want to know more about this particular example: https://news.ycombinator.com/item?id=6824514 (not personally confirmed)


That isn't really an accurate description of the issue: it's not the case that it was "working" and then "broken" by GCC maintainers. It has always been unsupported, and has always misbehaved in specific situations, but for 99% of code it appeared to GCC users that it was supported. The signed integer overflow behaviour was never guaranteed by old versions of GCC, and code exploiting it was not always compiled with the "expected" behaviour. It's just that, as the GCC optimizer has improved, there are more circumstances in which it performs optimisations that hinge on that assumption.

Compiler users generally want something that "just works" and doesn't do anything unexpected. But in a low-level language like C, doing away with undefined behavior would essentially mean pessimistically forgoing many optimisations on the 99%+ of straightforward, reasonable code out there, in exchange for not doing anything surprising to the remaining fraction of dubious code that depends on certain things happening where behavior is undefined according to the C standard. There are languages that make that choice, but C isn't one of them.


That is my point. De facto working code is not working code.

I have mixed feelings about the integer overflow issue, because it's so easy to trigger, unlike contrived triple post-increment examples. And it usually results in a security problem. For very little benefit, IMO.


Exactly. De facto defined is usually just as important to consider as de jure.


ah, sorry, I'm stuck in C[whatever it was I learned in high school]



