The generated code is fine if it's a self-contained class of average size.. or below. But even with immense architecture and constant supervision, it does not take long before it degenerates into "focused fixes", shortcuts, laziness and just outright cheating or lying.. So far, no amount of prompting has led me beyond this.. It's paradoxical how the model seems to reason about the correctness (or wrongness) of a proposed architecture and design, can write a plan that seems to take this into account, answer questions about the plan correctly (even the ones meant to uncover the nuances that may be unclear), ask tons of clarifying questions and update both plan and spec docs correctly, and yet continue to act like a "ticket closer" who immediately puts on the biggest possible horse blinkers and deeply ignores all of it when building from that same plan, referencing those same documents...
Attempting anything comprehensive with AI is the software development analogue to the Gell-Mann Amnesia effect..
I'm definitely thinking deeply now about how I'm approaching these tools going forward.. Yes, GPT5 is better than I am at spitting out a fairly acceptable skeleton of a class in one go, when prompted hard enough.. but.. It will happily do things like write decent-looking protobuf schemas and then go ahead and hide everything that takes the least amount of reasoning behind some binary blob nested deep enough that it'll get past even the most dedicated reviewer..
It's fairly good at a lot of the things that I don't find interesting to deal with, but it's also amazingly incompetent when it comes to even the most mundane kind of common sense.. It steers so strongly towards text-book examples that it will happily put in three times the amount of code and handle multiple classes of actually impossible edge-cases, and even use-cases that it was specifically asked NOT to add.. And it will defend it with "well, I added this because I can't know if someone is going to use the thing I just added".. Well, if you hadn't added it, chances are indeed slimmer..
It's so good at answering questions and explaining what's there, and diving through call-paths, and yet, it drops the ball the moment it's going to actually do something beyond saving me from looking up how to write some really annoying and uninteresting boilerplate..
The worst thing is how good it is at making things LOOK right.. It will cover every single edge-case you throw at it, but not because of the design, not because it correctly argues why the architecture inherently allows such and such, or because the design and spec flesh out that A goes to B and never the other way around.. As soon as it's time to make something, it will make sure B can go to A, especially, it seems, if allowing so spares it from doing the right thing, which is WHY those edge-cases were trivial in the first place; instead it will endlessly hack around them.. I've worked with people like that too, so I don't know if I'm really blaming the models or the training data..
But damn it's a tough spot..
I've had multiple situations where, after wasting hours of work which I should have just spent doing it myself, the only thing I really wished was for the model to be sentient, able to feel pain, and in possession of a corporeal body so I could drag it outside and beat it to a pulp. (I've never reached that level of frustration with an actual person, so that's something new they bring to the table..)
I'm a software engineer and architect, I love my job, I love diving into the small details, I love the grand overview.. I love identifying concepts and applying them to achieve elegant high-performing systems.
I love thinking about what kind of assembler the compiler may generate (though honestly, I rarely get the chance), I love thinking about how languages should be more dynamic (Who's got actually-first-class functions? Like, ones that you can build, compose, combine and manipulate to the same degree you can a string or a JSON object. No, Lisp, you're cheating.. close, but no points).
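To make concrete what I mean by manipulating functions like values, here's a minimal TypeScript sketch (all names invented for illustration). Composition is about as far as most mainstream languages go; inspecting or rewriting a function's body the way you can a JSON object is exactly the part that's missing:

```typescript
// compose: build a new function out of two existing ones,
// the way you'd concatenate two strings.
const compose = <A, B, C>(f: (b: B) => C, g: (a: A) => B) =>
  (a: A): C => f(g(a));

const double = (n: number) => n * 2;
const increment = (n: number) => n + 1;

// A brand new function, assembled at runtime from parts.
const doubleThenIncrement = compose(increment, double);
console.log(doubleThenIncrement(5)); // 11
```

This much works in plenty of languages; what you still can't do is take `doubleThenIncrement` apart again and rewrite its insides.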
And yet.. I don't care that much. Not because I'm late in my career (I'm 40, there are still some years left in me), but because I want to make computers do things, and what I enjoy is thinking up ways the things can happen, and sometimes the particulars that matter when making a lot of different things happen in a coherent system.. And yea, LLMs are trained on people's output, and from what I'm seeing everywhere, people are overall fairly terrible at that, and most of the plumbing-type glue being written is not worth anyone's time..
And I'm not saying I don't care because LLMs can't do my job.. Heck, even after hours of back-and-forth spec building and refining every little nook and cranny (even after it's beautifully explained, proven even, by reasoning and example alone, on first try, that the words coming after the previous words make sense), the stupid coding agent still cheats or gets it wrong; as soon as the plan is put into motion, it'll mess it up on some scale so fundamental I should just have done it myself.. And I hope that changes, I hope that I won't have to go into such detail.. I hope to become a steward of taste rather than a code-reviewer.. I hope that I will eventually not be needed for that anymore.. I want it to replace me, so I can move to telling it what I want, and have it made that way..
I hope I won't need to steward good taste, and that nobody will.. I hope the applications I use in 5 years will be a collection of one-offs, and gradually improving tools that were written _just_ for me, for my way of working, and my way of thinking.. I want to prompt the damn program to change itself as I discover new ways to do things, until it can eventually figure out how to automate the last bit of my task away.. And then I'll go do something else exciting.
Oh wow.. No DLCs.. No seasons.. No paid content. Just content.. No advertisements outside of the whole game being what it is.. This is actually amazing to see.. These days, with these amazing machines, amazing frameworks, and all the possibilities, we're aiming not to provide experiences that are enjoyable on their own merit, but something else.. The "joy" part is a checkbox that's prioritized only to the lowest viable point, because the content is something else: it's advertisement or user-retention or user-capture or just plain old data-harvesting.. Sure, Flash was a security nightmare, but the technology was amazing; on the same Pentium 100 machine that struggled to let Corel zoom in on part of a vector image, Flash would happily fill the screen with animation done using morphing vectors..
I remember that day, when the news told that Apple wouldn't allow Flash.. I remember thinking: that was the beginning of the end.
Yes, it was proprietary, and all of that, but it was really, really good for its time, and it's still good.
So much culture.. So much zeitgeist. Locked into files of that era.
Of course I have an adblocker, have you tried going online anytime in the past 20 years? Also, I don't know how to install Chrome extensions on Firefox, so that's probably not for me.
My comment was not about that site itself, but about its content and that someone cared to share the link to it :)
I always thought it was an elegant and respectful solution.
Instead of harassing existing addicts into quitting, ease out of general addiction by forbidding it to those who haven't yet had the opportunity to get addicted. Yes, of course some will still get addicted, but vastly fewer than if it remained legal for them.
It's different from weed in that, while it makes you look cool, it's simply nowhere near as fun..
Nowhere near as fun and 10 times as addictive. And I do love to smoke pot, even after all these years, whereas nicotine was a decade-long habit I wanted to quit for just as long but couldn't.
That said, I don't want to undersell the addictiveness of weed like many people do; it is still very easy to form a psychological addiction.
I made this little DSL that's carried by TypeScript syntax, but made for declaring interfaces between a TypeScript application and a backend.
The code-generator spits out Angular classes and TypeScript types for use by the web app, and it spits out a Node backend class that you fill with the handlers for that interface. It also spits out a Qt widget with the web application inside it, complete with a functional plugin for Qt Designer so you can drag your widget into existing UI layouts.
So you can use it to write UIs in web and use them either as regular Qt widgets or as stand-alone webapps with a regular Node backend.
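As a rough sketch of the shape of thing such a generator emits (the names here are invented for illustration, not the actual DSL or its output): the contract is an interface both sides agree on, and the backend is a class you fill with handlers:

```typescript
// Hypothetical example of a generated contract (names invented).
interface TodoApi {
  addTodo(text: string): { id: number; text: string };
  listTodos(): { id: number; text: string }[];
}

// The "Node backend class that you fill with the handlers":
// implement the interface, and the types keep both ends in sync.
class TodoBackend implements TodoApi {
  private todos: { id: number; text: string }[] = [];
  private nextId = 1;

  addTodo(text: string) {
    const todo = { id: this.nextId++, text };
    this.todos.push(todo);
    return todo;
  }

  listTodos() {
    return this.todos;
  }
}

const backend = new TodoBackend();
backend.addTodo("wire up the Qt widget");
console.log(backend.listTodos().length); // 1
```

The same interface type can then back the generated Angular client, so a mismatch between frontend and backend becomes a compile error rather than a runtime surprise.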
It's really the wrong way around if you think about it.. using an inferior technology (web) for the UI part.. But somehow people prefer typing CSS and downloading gigabytes of boilerplate instead of just using a WYSIWYG designer.. I don't get that part..
I've long suspected that (at least my own) tinnitus was a neurological phenomenon, seeing how it has always been with me at various "levels of presence", from imperceptible to so loud I can't hear anything else; I've always felt it as an "inner sound".. I've had multiple hearing tests, and nothing in particular showed up. It's also weird because it changes somewhat in frequency, going both down to frequencies my 40-year-old ears can register and up beyond what I can actually hear when doing a test..
But especially the coming and going, and how it seems affected by my level of tiredness or the amount of sleep I got.. Of course, reading the article made me aware of it, and now it's louder than before..
I've had strong symptoms of ADHD my whole life, but never thought much of it (except as a lack of self-discipline and the general failure of a broken robot to impersonate a real human). But as demands on my performance rose to real-adult levels, with a young child and duties beyond not dying, I decided to tell the doc what it had generally felt like to be myself, at which point I was referred to someone with a specialty in broken brains, and we quickly agreed that while I wasn't going to become normal, certain stimulants at least provided me with sufficient energy to carry out most of the functions expected of an adult member of society with actual responsibilities.
And so, over the past.. more than a year, I've gotten to experience a little bit of everything as my brain oscillates between being slightly oversaturated and absolutely drained of certain neurotransmitters, in a way that feels both slightly unsustainable and like the only alternative where I get to not be absolutely miserable all the time.
The point of that story being: these "phantom percepts" fit the bill somewhat well. I've always had a very conscious experience of common neurological phenomena which are naturally present but largely unnoticed by many (auras, visual snow, floaters, phosphenes, tinnitus, afterimages), so I'm probably a bit on the sensitive side, and the medication seems to have a quite interesting effect on these as well. Among other things, I noticed the ABSENCE of noticing my clothes touching my skin.. I am no longer acutely aware of the cooling sensation of inhaling air through my nose, and I rarely hear the beat of my heart in my ears.. Maybe the weirdest effect is on saccades: in a conversation, looking from one person to the next seems as instant as before, but the blur of my eyes moving between points of focus is gone. It's kind of jarring, just poof, one picture, then another.. nothing in between.
I now seem to be able to influence my attention somewhat, that is, to do whatever that cognitive regulation is called, so that my focus shifts to a task I need to do but have no interest in doing (oh wait, that's why I got the medication). But it does make me wonder if tinnitus is just one of the more obvious (and therefore common) neurological processes that "pokes through". Maybe perception of sound and attention (and maybe therefore also conscious experience of sound) have evolved to be more strongly linked (because if you notice the predator sneaking up on you, you get to not be eaten).
Maybe this stronger link is why tinnitus is so obvious, and maybe sleep is instrumental in regulating consciousness, so if consciousness is differently regulated, or less regulated, maybe it's easier for the phenomenon to "seep through".
Out of all the people in this thread, you seem the most likely candidate to appreciate the following: tinnitus symptoms are often conflated with hearing sensitivity. If you can see auras, then I'd say your tinnitus symptoms aren't indicative of tinnitus. Search up brain wave frequencies and look at images. I'd wager you are hearing yourself, especially during changes of frequency. As well, just prior to sleep, you might hear a spike; that's the brain commencing the sleep mode algorithm (no wonder tinnitus wrecks sleep; affected individuals would struggle at synchronizing both hemispheres with the sleep algorithm when an involved sensor is malfunctioning).
Separately ... it's clear that you recognise the incompatibility between sensitive individuals and a society designed to place the populace into constant fight or flight. You're still showing signs of blaming yourself. Literally nobody is going to understand you (especially not doctors), and the sooner you accept this, the sooner you will free up a lot of trapped energy. I'd stop taking the stimulants regularly, man ... even without their effect, barely anybody is going to understand your words, and the number of people who will appreciate your words is reduced when they are conveyed via essays (honestly I can't find the strength to read them properly) ... in general your expression has reminded me of the message behind the Lateralus chorus.
I have Ménière’s disorder and had a few short episodes of vertigo before one finally got me discombobulated. I woke up one morning and couldn’t get my extremities to function. Couldn’t tell up from down. It took about 8 hours for it to completely go away but then I realized I had lost most of my hearing in my right ear and half in my left, and had constant tinnitus and dizziness. I went to an ENT and learned that there is nothing I could do as there is no cure for Ménière’s. I have gotten better at dealing with the tinnitus and don’t notice it unless the train whistle or the lion roars start. I keep hoping one day that I will read that someone has found a cure! Ah well, hope springs eternal!!
I suffered a highly unpleasant vertigo attack yesterday - happens every once in a while. Tinnitus was the warning, and I was definitely over-tired beforehand.
After an ear infection 30 years ago I lost most hearing in my left ear and my balance was affected. Not a massive problem most of the time but I regret not being able to read when travelling, even by plane or train. It’s audiobooks all the way…
I'm sorry to hear that; some of my family members have inner ear disorders, and the nausea/vertigo sounds terrible. The original comment isn't targeted at those like yourself who unfortunately must deal with damaged peripherals ... but still, there is a chance it could apply. May I ask whether you ever noticed the ringing in your ears to be correlated with a change of mental state? Examples of such a change would be arriving at a big realisation, commencing relaxation, or performing meditation.
I'm mainly thinking that the "sound" of tinnitus may be inherent in the brain, and the problem is the percept itself; not a percept of the tinnitus, but a percept being generated while nothing was perceived, so that we become aware of this weird, almost impossibly fine high pitch.
It kind of fits with the patterns I and many other people describe, like the intensity varying with sleepiness and other mental states, and how it goes away if we hear _actual_ sounds of a broad enough spectrum..
It might be this little thing where it comes on by mistake, but it doesn't turn off again, and we latch onto it, and that's the feedback loop that reinforces it..
I'm not saying we can "think it away", but I'm noticing in myself that I didn't have any tinnitus _AT_ALL_ when I woke up, and now I'm almost consumed by this 20 kHz tone (my hearing stops around 16 kHz), and sitting here playing with that in my mind, I can certainly make it dim somewhat.
I wonder if there are some cognitive exercises that can be done, especially for people who either don't have it or have gotten it very recently. (The literature talks about some meditation and mindfulness, which I'm generally not a big believer in, but nevertheless, those do touch on the idea of messing around inside one's head in a top-down way.)
I'm not too hooked on the idea that ADHD is simply a "different kind of brain", I don't buy that we were the excellent survivors or hunters, I'm pretty sure I'd be the caveman who was eaten by a bear because I was too distracted by the pattern of shadows from two branches moving just the right way xD
I don't really blame myself, but I don't need to defend my condition (my personal condition, I'm not speaking on behalf of others), I've always been bothered by it, not simply when the mirror of expectation and society is held up against me, but even when left to do as I please, I find that while there are areas in which I function, and function well, there are areas where I'm so limited that it seems unreasonable even within my own framework. :)
Constant 20 kHz does sound more like a damaged peripheral, therefore my postulation was incorrect. I wish I could help more, but I don't have enough experience. That being said, I do find super interesting the idea of playing a tone at the same frequency to manually force the brain into filtering it out. Thanks for sharing your perspective, all the best.