We need yet another "This has what it says it has" regulation around supplements and vitamins. There's basically no guarantee that any vitamin pill you pull off the shelf contains its claimed ingredients at the claimed doses. You could be getting 10% of what you wanted, or you could be getting 1000%.
Fortunately, the body can handle some pretty wide variation. But unfortunately, if you are taking a vitamin because you lack a nutrient, there's really no guarantee that you're actually treating that deficiency.
It goes far beyond the manosphere. You can find people selling all sorts of naturopathic, homeopathic, crystal, and Jesus endorsed medicines.
Anywhere a group of people gathers to downplay pharmaceuticals or evidence based medicine, you'll find them pushing their own untested and unregulated junk.
Overtime is supposed to be a penalty to the employer for demanding unreasonable work hours. It shouldn't be something employees can willingly engage in to boost their take-home pay, especially when we are talking about cops and emergency services. I don't want to be working with a cop who has been on the clock for 80 hours.
It's a bit crazy that cities are paying so much extra for their police force because cops want a cushy retirement.
It's not legal, but that doesn't mean it doesn't happen.
See "Blue flu" for cases where cops coordinate a strike using sick leave. Another way they strike is by simply not doing their job: they'll just sit in their cars all day and won't respond to dispatch, or will severely delay their response.
AFAIK, those cops never get an ATF-style house cleaning.
The difference was it wasn't one global server for everyone. I think that's why the past feels like it was more stable.
Now, AWS or Cloudflare gets a hiccup and half the internet is nuked.
The old internet was far more federated, so an outage just meant "Welp, anandtech is down, let's go to pcper, digg, tomshardware, slashdot, etc."
Sure, stuff would go down, but it would be just that small community rather than most of the internet.
Yeah, but (as a user) I would rather have one global server crash for 1-2 hours two or three times per year, as opposed to having each individual server randomly crash once a month for at least an hour each time.
The more I sit down and try to remember what it was actually like to use the internet in the late '00s, the only thing that always comes up is "there is no way people today would tolerate it nearly as well as we did back then."
Interesting but, IMO, probably one of the worst uses of JSON. The data you would want to consume is already not "human readable" so it instead introduces a lot of bloat for really no benefit.
If you have a non-trivial number of data points to track, this is going to eat a ton of memory while also being pretty slow to encode/decode.
Imagine, for example, if we encoded this as a binary format. First 2 bytes for the feature type, the next 2 bytes for the geometry type, 3 bytes for a fixed-point x, 3 bytes for a fixed-point y, and you could optionally provide the properties as a JSON blob in a trailing string. That's 10 bytes for all the coordinate stuff. Fewer bytes than it currently takes to store the `"type": "Feature"` string.
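A minimal sketch of that layout in Python (the type codes, scale factor, and field order are all made up for illustration; `struct` has no native 3-byte integer, so the coordinates are packed by hand):

```python
import struct

SCALE = 10_000  # hypothetical fixed-point scale: ~11 m of precision

def encode_point(feature_type: int, geom_type: int, lon: float, lat: float) -> bytes:
    # 2 bytes feature type + 2 bytes geometry type (big-endian unsigned shorts)
    header = struct.pack(">HH", feature_type, geom_type)
    # 3-byte signed fixed-point coordinates, packed manually
    x = round(lon * SCALE).to_bytes(3, "big", signed=True)
    y = round(lat * SCALE).to_bytes(3, "big", signed=True)
    return header + x + y  # 10 bytes total

def decode_point(buf: bytes):
    feature_type, geom_type = struct.unpack(">HH", buf[:4])
    lon = int.from_bytes(buf[4:7], "big", signed=True) / SCALE
    lat = int.from_bytes(buf[7:10], "big", signed=True) / SCALE
    return feature_type, geom_type, lon, lat

rec = encode_point(1, 1, -122.4194, 37.7749)
assert len(rec) == 10
assert decode_point(rec) == (1, 1, -122.4194, 37.7749)
```

A 3-byte signed int covers ±8,388,608, so ±180° at a scale of 10,000 fits with room to spare.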
Do you mean geocoordinates when you say not human readable? Those are obviously at the heart of geospatial information, but there is quite a bit more to the spec that does benefit from being human readable, and I'd include longitude/latitude among them. There are also solutions like CBOR which allow the same data to be encoded/decoded to and from binary. For performance-critical data you can also use something like protobuf, but it would be a huge pain to handle everything that way. JSON is a great choice as a general spec.
> If you have a non-insignificant amount of data points to track this is going to eat just a ton of memory while also being pretty slow to encode/decode.
This is a fair critique; however, for any large GeoJSON, the coordinate arrays will dominate the size. I think it's also safe to assume this data will be gzipped at rest and over the wire, which will eliminate most of the "header" metadata size you mention. As you point out, it would be much more efficient to have a binary format, and there are good examples like these, which are ~2-3x smaller in benchmarks:
That said, I think GeoJSON should be compared against other human readable formats like KML, which has a lot of wasted space as well, while being more difficult to read/write.
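The compression point is easy to sanity-check with the stdlib and some toy data (the FeatureCollection below is made up):

```python
import gzip
import json

# A toy FeatureCollection where the repetitive "type"/"geometry"/"properties"
# boilerplate dominates the raw size
fc = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon / 100, lat / 100]},
            "properties": {},
        }
        for lon in range(-100, 100, 10)
        for lat in range(-80, 80, 10)
    ],
}

raw = json.dumps(fc).encode()
packed = gzip.compress(raw)
# The repeated key/value strings compress extremely well, so the gzipped
# payload is a small fraction of the raw JSON
print(len(raw), len(packed))
```

The exact ratio depends on the data, but the fixed GeoJSON scaffolding all but disappears under gzip; what's left is mostly the coordinate digits themselves.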
This is just pretty wrong. Sure, GeoJSON can be bloated, but it is not for "no benefit." It is a very popular format and is easy to encode and decode, even if it is slow for large data. It is more for sharing than long-term storage. Take a site like the one below; it is very convenient to render JSON this way.
It was super delayed and I think that's because they couldn't execute in all the ways they promised they would. The final product is very rushed and pretty different from the initial promises. I think they got into "Let's just ship SOMETHING" mode as the delays were getting insane.
Batteries and the engine. The engine sits in line with the wheels rather than being under the hood of the car. That puts all the weight right next to the driving tires.
But agreed, the Cybertruck is a really silly purchase for numerous reasons. The only reason you'd buy it is to signal your support for Elon. It's a very bad vehicle.
> However, I think what they're really worried about is that a person needs to design and implement that stuff… It throws a wet blanket on their insistence that this will replace entire people in entire workflows or even projects, and I just don't buy it.
I think you are on to something. But I also think this sort of system lends itself to not needing really good LLMs to do impressive things. I've noticed that the quality of a lot of these LLMs just gets worse the more data points they need to track. But if you break the work up into smaller, easier-to-consume chunks, all of a sudden you need a much less capable LLM to get results comparable to or better than the SOTA.
Why pay extra money for Opus 4.7 when you could run Qwen 3.6 35b for free and get similar results?
And then you realize that what you’re using the smaller models for is ALSO decomposable and part of it is just a few if statements, and then you realize that for this feature you don’t actually need or want a model because the performance, reliability, reproducibility are cheaper and better for you and your users.
Additionally, developers tend to become less expensive as venture capitalists turn off the spigot, while access to giant frontier models becomes way more expensive. Beyond that, a developer might go out and have a beer with you after work, which appeals to the sickos that have the gall to prioritize humanity over fanatical efficiency for corporate gains.
Indeed, I've been experimenting with agent workflows, for complicated tasks - where I essentially have a graph of agents with different roles/capabilities, including such things as breaking down complex tasks into simpler ones. There seems to be a point where a complex enough task is better performed by a group of cheaper agents/models than by one agent using one of the SOTA big models, in terms of both quality and cost.
The big SOTA models win in world knowledge, that's what all those parameters are for. But a huge fraction of agentic tasks is going to be plain clerical work that needs no special knowledge at all, a much simpler model can do them in a straightforward way.
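To make the decomposition idea concrete, here's a toy sketch of the control flow. The model calls are stubbed out with plain functions, since the point is the routing rather than any particular API; every name here is hypothetical:

```python
def decompose(task: str) -> list[str]:
    # Stand-in for a cheap "planner" model that splits a complex task
    # into narrow subtasks; faked here with string templates
    return [f"{task} / step {i}" for i in range(1, 4)]

def small_model(subtask: str) -> str:
    # Stand-in for a small local model handling one narrow, clerical subtask
    return f"done: {subtask}"

def run(task: str) -> list[str]:
    # Fan the subtasks out to cheap workers instead of one big SOTA model
    return [small_model(s) for s in decompose(task)]

print(run("add a REST endpoint"))
```

In a real agent graph the stubs would be model calls with different prompts and capabilities, but the shape is the same: a planner node, worker nodes, and plain code gluing them together.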
It is also interesting because you get people with very different use cases arguing about the effectiveness of various models but doing very different things with them.
It's one thing for a model to be very clearly instructed to add a REST endpoint to an existing Django app and add a button connected to it on the front end, vs. "Design me a YouTube." The smaller models can pretty dependably do the first and fall flat on the second.
Coke is amazingly popular around the entire world, though. You can go to China, India, South Africa, and find Coke for sale and selling well even though they have their own traditional beverages. Obviously sugar water isn't very good for you -- it's liquid candy, but the idea that people only drink it if they've been "indoctrinated" isn't very likely.