Why don't apps like this leverage gpu hardware acceleration?
On my old tablet, Netflix runs a-ok, the media player has hardware acceleration and runs a-ok, but Amazon Prime stutters.
I haven't checked it in years, so maybe it is better now, but I am skeptical this is the case, because if the original creators of the Android app didn't think hardware accelerated video might be a nice idea, why would it be different now? This is a structural issue.
There's a three-way tension between which codecs the specific hardware supports (and which are supported across all the popular devices), how much bandwidth you have on your connection, and how many different codecs the various streaming services can encode for and cache locally.
Fair, but in this case I think the tablet should be usable. Sure, it's old and only has a dual-core CPU, but checking the list of hardware-accelerated Qualcomm decoder codecs it supports, it lists: vc1, divx, divx311, divx4, avc, mpeg2, mpeg4, h263.
Surely AVC, which is H.264, should still be supported.
Maybe it gets tripped up trying to use the software Google decoder codecs instead?
That said, I'm not able to test it anymore. The tablet is on KitKat and I can't find a site today hosting an APK with a minSdk older than Lollipop. It's not a big deal, but it's disappointing to see APKs for services that used to work just disappear. They always go on about "security". Why is it a problem for everyone else except Netflix? Netflix loves supporting devices.
"Supports AVC" doesn't mean a great deal on its own. H264 has a number of different profiles (over 20, although only 3 are commonly used), and 20 different levels. Most of the potential combinations aren't relevant, but supporting the full range of hardware decoders while also sending something reasonably close to the best quality that the device supports can mean encoding ten different versions.
If it's a particularly low-end/old device, Amazon may have simply decided it's no longer worth encoding a version supported by your device's hardware decoder, while Netflix still does.
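A toy sketch of how a service might gate renditions on a device's reported H.264 decoder level. The ladder entries and level thresholds here are illustrative (loosely based on the level limits in the H.264 spec), not any service's actual configuration:

```python
# Illustrative encoding ladder: each rendition is tagged with the minimum
# H.264 level_idc (level * 10, e.g. 3.1 -> 31) a decoder must support.
# Values are simplified for illustration, not a real service's ladder.
LADDER = [
    ("1080p", 1920, 1080, 40),  # needs roughly level >= 4.0
    ("720p",  1280,  720, 31),  # needs roughly level >= 3.1
    ("480p",   854,  480, 30),  # needs roughly level >= 3.0
    ("240p",   426,  240, 21),  # needs roughly level >= 2.1
]

def best_rendition(device_level_idc: int) -> str:
    """Return the highest rendition this decoder level allows."""
    for name, w, h, min_level in LADDER:
        if device_level_idc >= min_level:
            return name
    return "unsupported"  # nothing in the ladder fits this device

print(best_rendition(31))  # 720p
print(best_rendition(10))  # unsupported
```

If a service prunes the bottom rungs of a ladder like this, an old device with a perfectly good H.264 hardware decoder simply falls off the list.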
This is vaguely okay but makes the common mistake of overemphasizing that codecs use an IDCT. That's just one component of residual coding, it isn't necessarily important or always used, and in H.264 and later it isn't even a real IDCT when it is used. It's a simpler integer transform that isn't mathematically accurate but is bit-exact, which is more important.
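To make the "simpler but bit-exact" point concrete: H.264's 4x4 core transform uses a small-integer matrix instead of the true DCT basis, so every decoder computes exactly the same integers. A quick check of its structure:

```python
# H.264's 4x4 forward core transform matrix: small integers, no floats.
C = [
    [1,  1,  1,  1],
    [2,  1, -1, -2],
    [1, -1, -1,  1],
    [1, -2,  2, -1],
]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transpose(m):
    return [list(row) for row in zip(*m)]

# C @ C.T comes out diagonal, so the transform is invertible in pure
# integer arithmetic once the per-coefficient scaling is folded into
# quantization -- which is what makes it bit-exact, unlike a
# floating-point IDCT that can round differently on different hardware.
gram = matmul(C, transpose(C))
print(gram)  # diagonal: 4, 10, 4, 10
```

The rows aren't the exact DCT basis vectors (they're rounded to tiny integers), which is why it's "not mathematically accurate" yet deterministic everywhere.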
I remember back in the day when doom9 was blocked at the corp firewall level because it had posts on how to circumvent things. Never mind that it was the unofficial support forum for things like AviSynth, x264, etc. Even the nascent days of ffmpeg were there. It took a lot of cajoling and back-and-forth with the legal departments, but eventually access was granted on an individual basis.
I owe a lot of what was accomplished to folks like DG and the other troopers from the bad ol' days of VFW-based apps. It was like monkeys throwing rocks at the space ships compared to what's possible now. Remember when 2 cores was fast?
For the person up thread talking about learning things, I wish I was in that position today with them rather than 20 odd years ago where everyone was in the dark, and thank the heavens for sites like doom9 that shone lights in dark places!
>For the person up thread talking about learning things, I wish I was in that position today with them rather than 20 odd years ago where everyone was in the dark
Where would you start today?
And what was being circumvented? Copyright protection?
DeCSS was considered a very big no no if you worked for a company that did work for content owners. Doom9 had lots of info on how to use DeCSS, and so it was flagged by lots of corps.
Where would I start today? That depends on your level of interest, and what you want to do. Are you wanting to learn things to stay hip and cool making videos for the socials, or are you wanting to do technical things to video for less prestigious ends? Those are two different paths that intertwine, but the recommendations differ depending on which you choose.
That's not a rhetorical question, btw. I've been in both fields, and continue to work in both. There's lots of reading, and today watching videos, but the biggest thing will be the access to actually doing the things that weren't possible so freely 20 years ago. Try things, make mistakes, learn, do again.
I am interested in using video compression to improve delivery of VR and AR content over the web. But, this can mean a lot of different things! I know there is a lot of opportunity for this, I have a lot of ideas, but my knowledge of video compression and video streaming is limited. There are some companies working on this, but it’s not a singular problem or solution. I expect it will end up being many categories of solutions.
I usually like to learn about things from first principles so that I can then choose what to intentionally skip over, if that makes sense. I find the history is usually a good place to start, but it only goes so far with these more blackbox type topics like video compression.
There's a bunch of details that can be suggested, but the sad (actually, amazing) thing is that compression and encoding software has come such a long, long way that most of the default settings will produce a very decent result. FFmpeg + x264/x265 are stunning with minimal input required (minimal compared to the many, many switches you can fiddle with individually).
For a broader understanding, look into the basics of how codecs like H.264/H.265 work. Learn about I-frame encoding and how P and B frames work. Learn about the effects of long-GOP (multiple seconds) encodes for streaming purposes vs. shorter GOPs (half a second). Understand how an encoder makes its motion-vector decisions. How/why static videos encode differently compared to fast action. How will the videos you intend to use for VR/AR fall into those categories?
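The GOP trade-off above is easy to experiment with. A minimal sketch that builds an ffmpeg command line using the real `-g` flag (keyframe interval in frames); the file names are placeholders:

```python
def encode_cmd(src, dst, fps, gop_seconds, crf=23):
    """Build an ffmpeg/libx264 command with a fixed GOP length.
    -g is the keyframe interval in frames, so a 4 s GOP at 30 fps is -g 120.
    Long GOPs compress better; short GOPs let players seek/join faster."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-crf", str(crf),                    # quality-targeted rate control
        "-g", str(int(fps * gop_seconds)),   # GOP length in frames
        dst,
    ]

streaming   = encode_cmd("in.mp4", "out_long_gop.mp4",  fps=30, gop_seconds=4)
low_latency = encode_cmd("in.mp4", "out_short_gop.mp4", fps=30, gop_seconds=0.5)
```

Encoding the same clip both ways and comparing file size and seek behavior is a good first experiment.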
Resources like Wikipedia articles have lots of information on the technical side of things, even though they are not video compression tutorials. However, learning the tech side will help you make informed decisions on what knobs to tweak and why. There are so many actual blogs/tutorials/SO answers/etc. that will give you actual settings to apply. Once you have a specific question, I will be amazed if there is no answer available for it online.
It's all made of inter prediction + intra prediction + residual coding.
There's an evolution from JPEG -> MPEG2 -> MPEG4 part 2 (XviD) -> AVC -> HEVC… where it gets more complicated over time but that's still the structure.
There's also alternative ideas like wavelets (JPEG2000) and many other minor silly ideas; almost all of them are bad and can be ignored. Which is not to say the MPEG codecs are perfect.
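The inter prediction + residual coding structure above can be sketched in a few lines: predict the current frame from the previous one and code only the difference. This toy version skips motion compensation entirely, but it shows why static content compresses so much better than fast action:

```python
# Toy "inter prediction": code only the residual against the previous
# frame. Mostly-static content -> mostly-zero residuals -> few bits.
prev = [10, 10, 10, 50, 50, 80]   # previous frame (1-D "pixels")
curr = [10, 10, 12, 50, 50, 80]   # current frame, nearly identical

residual = [c - p for c, p in zip(curr, prev)]      # what gets coded
recon    = [p + r for p, r in zip(prev, residual)]  # decoder side

assert recon == curr           # lossless round trip
zeros = residual.count(0)      # 5 of 6 samples cost ~nothing to code
print(residual)                # [0, 0, 2, 0, 0, 0]
```

Real codecs add motion search (so the prediction tracks moving objects), a transform over the residual, and quantization, but the skeleton is this.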
ML people think they can do "ML video compression" which will be a black box; I think this might work but only because calling your video codec ML means you can make your codec 1TB instead of a few hundred KB.
There are many writeups/blogs about people's experiences using these codecs, and typically I roll my eyes at yet another one rehashing the same info. However, for someone like you, being one of the lucky 10,000 TIL types, you're their target audience.
The best way to learn is to just keep encoding. Take a source file and make an output that looks good. Compare things like its file size, macroblocking, mosquito noise levels, etc. There are PSNR and other similar tests that will compare your output back to the original. See if you can tweak any of the settings to improve the PSNR score without increasing the bitrate. Then keep decreasing the bitrate to see what you can get away with before it becomes unacceptable. You can spend a lot of time doing frame-by-frame comparisons, but remember, 99.99999% of viewers will only ever see it at full-speed playback, so don't forget to take that into consideration. Look for obvious banding from gradients. Does a 10-bit encode vs. 8-bit improve that? Is it worth the limits from some players not being able to use 10-bit? How does frame size vs. bitrate affect the quality of your file?
Doing enough of these tests, you'll start to get specific questions. Those will have more easily found answers.
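The PSNR check mentioned above is simple enough to write yourself, which is a good way to demystify it. A minimal version over flattened 8-bit frames:

```python
import math

def psnr(ref, test, peak=255):
    """PSNR in dB between two equal-size 8-bit frames (flattened lists).
    Higher is closer to the reference; identical frames are infinite."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak * peak / mse)

ref  = [100, 110, 120, 130]
degr = [101, 109, 121, 129]       # off by 1 everywhere -> MSE = 1
print(round(psnr(ref, degr), 2))  # 48.13 dB
```

PSNR is a blunt instrument (it doesn't track perceived quality well across content types), which is why it's worth eyeballing the output too, but it's a useful objective baseline when you're iterating on encoder settings.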
No it doesn't.[1] Ryzen 5000's iGPU is based on the older Vega architecture and has no AV1 support, so everyone like me who just bought a brand spanking new laptop with Ryzen 5000 will be screwed soon enough.
That's why I said as it's being adopted. Right now it's just the very beginning of this phase (for instance AMD just started providing hardware decode in RDNA2).
So I expect this to become a non issue at some point.
As long as they don't take it as an excuse to reduce bitrates further. It's getting pretty bad these days with macro-blocking everywhere. This practice really took off when covid started. I actually cancelled Netflix because of it. Amazon is not as bad for macroblocks, but I have run into a few shows that show macroblocks, such as Mr. Robot.
>As long as they don't take it as an excuse to reduce bitrates further.
I actually think it is easier (comparatively speaking) to push for improved networking so that someday bitrate becomes less of a concern, just like what happened with audio. Even with a state-of-the-art VVC encoder, you only get about a 65% (or ~2.86x) reduction in bitrate in the most common cases compared to x264. A 65% reduction in 20 years isn't exactly a lot. We have easily gotten a 20x reduction in bandwidth cost in the past 20 years.
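For anyone tripped up by the percent-vs-multiplier conversion above: a 65% bitrate reduction means the new stream needs 35% of the old bits, so the efficiency multiplier is the reciprocal of what's left:

```python
# Convert a percent bitrate reduction into an efficiency multiplier:
# 65% fewer bits -> you need 1 / (1 - 0.65) = ~2.86x less bitrate
# for the same quality.
reduction = 0.65
multiplier = 1 / (1 - reduction)
print(round(multiplier, 2))  # 2.86
```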
Netflix are testing with 800Gbps per box now. Maybe when PCIe 5.0 is available, along with higher memory bandwidth, they could try 1.6Tbps per box within this decade.
Lots of people have plenty of bandwidth. I certainly do. It's the service providers who refuse to send it. It's pretty pathetic, because if you obtain a 4k to 1080p re-encode from a third party, you will get a better picture quality than what the services send directly as 1080. They will not send a 4k bitrate stream to a lower resolution display, so it's necessary to patronize third parties.
It's only since covid that they latched onto the excuse propagated by idiots out there that the internet couldn't handle people watching video from home, and immediately jumped on it as an excuse to cut quality to save a penny. Executive bonuses all around. Clap clap.