Limited memory, poor performance, etc. Flip these performance-improvement announcements around and they become damning: a 6x improvement means they were previously running at only a sixth of their current performance.
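To make that flip concrete, a trivial sketch (the throughput numbers are made up for illustration):

```python
# If a 6x speedup is announced, the old throughput was new / 6.
new_throughput = 600  # hypothetical tokens/sec after the 6x improvement
speedup = 6

old_throughput = new_throughput / speedup
print(old_throughput)  # 100.0, i.e. ~17% of current performance
```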
Not that I know anything about these things, but FTA:
"..To briefly recap: the Cerebras Wafer Scale Cluster extends the 40GB of on-chip memory of each CS-2 with over 12 Terabytes of external memory. Unlike GPUs which only have 80GB of memory for all weights and activations, we store weights in external memory and activations in on-chip memory. The CS-2 works on one layer of the model at a time and weights are streamed in as needed, hence the name – Weight Streaming. This model provides us with over 100x larger aggregate memory capacity than GPUs, allowing us to natively support trillion parameter models without resorting to complex model partitioning schemes such as tensor and pipeline parallelism.
It sounds like they only recently got weight streaming working well, which was my point: if you bought a CS-2 when it first came out, you couldn't really use the off-chip memory, and the on-chip memory wasn't enough to run LLMs.
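The layer-at-a-time scheme the article describes can be sketched in a few lines. This is not Cerebras' actual API, just a hedged toy model: weights live in a large external store and only one layer's weights are fetched at a time, while activations stay resident in fast memory.

```python
# Toy model of "weight streaming": weights in slow external memory,
# activations kept local, one layer loaded at a time.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the external (off-chip) weight store.
external_weights = {f"layer{i}": rng.standard_normal((64, 64)) for i in range(4)}

def stream_forward(x, layer_names, fetch):
    """Forward pass that streams one layer's weights in at a time."""
    for name in layer_names:
        w = fetch(name)        # stream this layer's weights in from external memory
        x = np.tanh(x @ w)     # activations stay in fast local memory
        del w                  # weights are dropped before the next layer arrives
    return x

out = stream_forward(rng.standard_normal(64),
                     [f"layer{i}" for i in range(4)],
                     external_weights.__getitem__)
print(out.shape)  # (64,)
```

The point of the pattern is that peak fast-memory use is one layer's weights plus the activations, regardless of total model size, which is why it sidesteps tensor and pipeline parallelism.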
> It sounds like they just recently got weight streaming working well
Even if it was working poorly, the CS-2 is still a lot of computer. The question is whether it was price-competitive with Nvidia at the time, for the workloads it was acquired for.
Cerebras offers them in a batch-processing cloud-ish model, so their prices should reflect utility to some degree.
I've never spent time on r/popular or the main page. If you only spend time in subs, Reddit is very well designed and, I think, less exploitative than other social media.
It's a great way to keep an overview of an array of topics. I set up subs for a large array of my interests, e.g. business (slavelabor, juststart, and others), cool PC setups (battlestations and unixporn), DIY, development, passive income, indoor gardening, cooking, ebikes, streaming TV (Plex, PlexACD, TiviMate)…
If you ever want to see the best ideas in a niche, just sort by Top > This Year (or All Time) and you'll get great ideas and inspiration.
I just laugh when people say Reddit is rubbish; they just don't know how to curate and dig properly for the gold on there.
You can also browse multireddits set up by others:
Really? For me, the journey has been the other way around. Almost all of the subreddits I'm subscribed to, I've stumbled upon by pure luck and coincidence, and, more importantly, organically. I personally don't want recommendations; most algorithms are really bad.