Hacker News

I watch software decoded video all the time because I like watching anime. I have never, ever had any problems for quite literally over a decade thanks to how powerful CPUs have gotten.

Way back 20 years ago, you would have had a point. But today? CPUs are several orders of magnitude overpowered for the task.



> I have never, ever had any problems for quite literally over a decade thanks to how powerful CPUs have gotten.

I wasn't talking about having problems decoding; I was talking about software decode hurting your battery life due to high CPU usage on certain streams. Which is a problem, just not one that people without laptops care about.

Do you watch your anime on a desktop or a laptop on battery?


I watch on battery if I'm out traveling.

The very slight increase in CPU processing and thus power use doesn't bother me at all. The screen takes more power than the CPU.

I habitually have Task Manager minimized in the tray area because I like monitoring my CPU usage at a glance, and decoding video in software simply does not even register as a noticeable blip anymore.


So yes, for 720p low-bitrate stuff it doesn't make much of a difference, but as others have said, watch some demanding content at 1080p or, better, 4K, and your CPU will definitely break a sweat and the fans will spin up.

Now that I think of it, given your claim that it doesn't even register as a blip in Task Manager, I'd actually argue you're probably using hardware decoding without noticing, because I can't think of a player (especially on Windows) that wouldn't make use of it by default.


Decoding horsepower required increases sharply with increase in resolution/framerate. 720p30 doesn't cause my fan to even run but 1080p60 does and 4K30/60 drops frames.

Just to be clear, 4K has 9 times the pixels of 720p, so at the same frame rate it's roughly 9 times the load. 4K60 versus 720p30 is 18 times.
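Those ratios fall straight out of the raw pixel throughput (a first-order estimate only; real decoder cost also depends on codec, bitrate, and features):

```python
def pixel_rate(width, height, fps):
    """Pixels the decoder must produce per second."""
    return width * height * fps

base = pixel_rate(1280, 720, 30)     # 720p30
uhd30 = pixel_rate(3840, 2160, 30)   # 4K30
uhd60 = pixel_rate(3840, 2160, 60)   # 4K60

print(uhd30 / base)  # 9.0  -> 4K30 is 9x the pixel rate of 720p30
print(uhd60 / base)  # 18.0 -> 4K60 is 18x
```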


I am definitely not using hardware decoding.

My setup is Media Player Classic Home Cinema, splitting and decoding via LAVFilters, then fed through ffdshow for some filters, before finally rendering on Enhanced Video Renderer.

Nowhere in the pipeline is the GPU involved as far as decoding is concerned, it's all deliberately software on the CPU.


Let's talk specifics. What resolution, what codec, what CPU, what usage.

You're pretty vague so far, so it could be that you're watching a software-decoded 360p h.264 video on a Threadripper, in which case I agree.

It will be a very different situation when you try to watch 4K h.265 / AV1 video on a dual-core laptop CPU that's a couple of years old.


Generally 8-bit or 10-bit h.264, occasionally h.265, at either 1280x720 or 1920x1080 progressive, with a frame rate of usually 23.976. Split and decoded in software via LAVFilters, then run through ffdshow before rendering on Enhanced Video Renderer.

CPU anything ranging from an i7-14700K to an i3-2100 (yes, Sandy Bridge). Seriously, decoding video has never been a significant workload for over a decade.

>4K h.265 / AV1 video on a couple of years old laptop dual-core CPU.

Kindly, why the hell would I even watch 4K video on a laptop? Y'all keep throwing out contrived situations like that, meanwhile I'll be a sane man living in reality and re-encode it (in software, finer tuning of parameters) in my spare time down to 1080p or 720p so I save myself precious disk space and CPU usage while travelling.
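For what it's worth, that kind of software re-encode down to 720p is a one-liner with ffmpeg (filenames and the exact `-crf`/`-preset` values here are illustrative, not anything the commenter specified):

```shell
# Software (libx265) re-encode of a larger source down to 720p.
# scale=-2:720 keeps the aspect ratio with an even width; audio is copied untouched.
ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx265 -crf 22 -preset slow -c:a copy output_720p.mkv
```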


> Kindly, why the hell would I even watch 4K video on a laptop?

Because that's the file I have on hand. Why on earth would I re-encode it if I want to watch it once?

I also connect my laptop to my 32" 4K screen. There the battery life is not a consideration, but the spinning fan is.

> in my spare time down to 1080p or 720p so I save myself precious disk space and CPU usage while travelling.

Your use case might work for you, and that's fine, but you claim that software decoding is universally not a problem. It is, though, if you don't limit yourself to 720p h.264 pre-encoded at home. Most people are not fine with having to do that and limiting themselves to low resolutions / bad image quality.


> Kindly, why the hell would I even watch 4K video on a laptop

How about if you have a 4K monitor plugged in? Or if your notebook display is itself 4K (which is a completely valid configuration nowadays)?

I have a pretty beefy laptop with an RTX 3080. I regularly watch BDRips that exceed 50 Mbps, and software decoding even on my 8-core Intel Xeon causes some stutters. Hardware decoding is just so much faster.


mpv (definitely the best player for advanced users) does not use hardware decoding by default. I'll simply quote `man mpv`:

> Hardware decoding is not enabled by default, to keep the out-of-the-box configuration as reliable as possible. However, when using modern hardware, hardware video decoding should work correctly, offering reduced CPU usage, and possibly lower power consumption. On older systems, it may be necessary to use hardware decoding due to insufficient CPU resources; and even on modern systems, sufficiently complex content (eg: 4K60 AV1) may require it.
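For reference, opting in is a single line in mpv.conf (`auto-safe` is the value the mpv manual recommends, picking only hardware decoding methods considered reliable):

```
# mpv.conf -- enable hardware decoding
hwdec=auto-safe
```

The same thing works as a command-line flag: `mpv --hwdec=auto-safe file.mkv`.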


Decoding has a component that is proportional to the bitrate, and anime has a much lower bitrate than, say, a 4K Blu-ray movie. The truth, though, is somewhere in the middle: recent CPUs are getting to the point where they can handle all the heavy decoding on their own, but older models (which are still widespread) still struggle.


But also, as I understand it, the GPU uses a fixed-function video encoding/decoding block whose performance is the same for every GPU model within a generation, whereas software decoding can fully utilise your CPU and scale up if you run multiple streams in parallel. All of that is irrelevant for real-time playback, but for converting files, my experience is that if you are using hardware encoding, hardware decoding can become the bottleneck, and you may get faster throughput with software decoding + hardware encoding.
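As a concrete illustration of the two transcoding pipelines on an NVIDIA card (the `cuda`/`hevc_nvenc` names are ffmpeg's NVDEC/NVENC ones; other vendors expose different ones):

```shell
# 1) Hardware decode + hardware encode: both stages share the fixed-function block
ffmpeg -hwaccel cuda -i input.mkv -c:v hevc_nvenc output.mkv

# 2) Software decode + hardware encode: decoding spreads across all CPU cores,
#    which can outrun the fixed-function decoder when transcoding many streams
ffmpeg -i input.mkv -c:v hevc_nvenc output.mkv
```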


Anime is like 480p, low fps video isn't it?

Otherwise even my desktop struggles when it comes to 4K60 video decoding on the CPU.


Definitely not. Many anime release groups are early adopters of new video coding standards. It was the first popular media that saw heavy use of AVC back in probably 2006 or so, then HEVC (roughly in 2014 or thereabouts), then AV1 (since ~2021), and sometimes even VVC, although that's currently being held back by a lack of support in mainline ffmpeg.

It was also the first to start using 10 bit color profiles.

Some release groups are particularly insistent on producing high quality releases and use FLAC for audio (even if the source tracks were in lossy format), and very high bitrate video.


Why would a type of animation dictate resolution and framerate?


AV1 consumes most modern CPUs at 4K+. Sure, if it's h.264 at 1080p you haven't had issues.



