
For reference, my computer has an i9-9900K and a GTX 1070. Playback is done in Chrome. The codec is listed as vp09.00.51.08.01.01.01.01 (315) / opus (251), although I do not know if that can vary.

I have found a video on YouTube that is absurdly high quality:

Japan in 8K 60fps

However, when I switch to 8K60 quality, the stats say I drop about a third of all frames. It does look fairly smooth, however, with occasional noticeable frame hitches. CPU load steadies out at ~60%. According to Task Manager there is no GPU decode activity, and GPU usage levels out at ~20% on the 3D graph.

Additionally, there is this video:

The Daily Dweebs - 8K UHD Stereoscopic 3D by the Blender project.

It is also 8K, but 24 FPS (watched in 2D mode). In it, I do not drop any frames, yet the video is stuttery and appears to play back at about 10 FPS. GPU activity on the 3D graph is ~30% this time, and the CPU coasts at ~30% as well.

And a third video, Ghost Towns in 8K (FUHD), plays back at 8K24 without dropped frames or any perceptible issues, at 30% CPU and 11% GPU utilization.

What is the bottleneck in the system? How come the 60 FPS footage looks smooth yet drops a ton of frames, while the 24 FPS footage is choppy yet drops none? What does it take to play back 8K? How does playback happen anyway?

Edit: For completeness' sake, note that YouTube can display many technical playback stats when you right-click on the video and choose the "Stats for nerds" option.

Edit to add information requested in the comments:

  1. Yes, my monitor is G-Sync.
  2. chrome://gpu/ reports: Decode vp9 profile0, up to 8192x8192 pixels
martixy
  • What browser are you using? I wasn't even aware that YouTube supported 8K videos. – Ramhound Dec 06 '19 at 15:59
  • Good point, I shall edit. – martixy Dec 06 '19 at 16:00
  • Can you tell what encoding is being used (i.e. H.264 or H.265)? – Ramhound Dec 06 '19 at 16:03
  • YouTube stats list the codec as `vp09.00.51.08.01.01.01.01 (315) / opus (251)`. You may check it out yourself; the links _are_ in the question above, after all. Also, Japan is stunning at 8K40, I guess, even with a third of all frames dropped. – martixy Dec 06 '19 at 16:05
  • The information should be in the body of the question. I can't check it out myself; even if I could, it would require an internet connection (I am not currently connected to one) or access to a compatible browser. – Ramhound Dec 06 '19 at 16:10
  • I don't know what should be in the question; I'm not a cinematographer. I have no idea how video playback works technically. That's why I'm asking in the first place. And now the question got downvoted and I have no idea why. I'm just curious about technology. – martixy Dec 06 '19 at 16:42
  • Does your monitor support G-Sync? – harrymc Dec 06 '19 at 17:56
  • According to [this source](https://en.wikipedia.org/w/index.php?title=YouTube&oldid=800910021#Quality_and_formats) and that YouTube codec string you provided, `(315)` shows you are not actually receiving 8K video. It should be showing `(272)` in the codec string if you're actually streaming 8K. – Romen Dec 06 '19 at 18:53
  • @martixy - I was only responding to the “you may look yourself”. I asked for a reason, though the reason wasn’t important enough to spell out. *The information was important enough to be asked for, though.* – Ramhound Dec 06 '19 at 19:05
  • On Chrome, put `chrome://gpu/` into the address bar and look for the "Video Acceleration Information" section. Look at whether "Decode vp9 profile0" shows up there and let us know the maximum resolution it supports. – Romen Dec 06 '19 at 20:00
  • @Romen You have linked to an old revision of the Wikipedia article, though certainly from after 8K was introduced (but maybe before 8K60?). The other (non-60 fps) videos do show the number `272`, however. @Ramhound It was a rather defensive response; as you might tell, I've had bad experiences with such questions on this and sister sites. – martixy Dec 06 '19 at 21:05
  • @martixy, The latest version of the Wikipedia page doesn't have that table anymore, but the meanings of the numbers wouldn't change. If your YouTube player reports `(315)` then you are receiving a 2160p HFR stream. I loaded the video and saw `272` on my YouTube player, and the "Stats for nerds" says that is `7680x4320@60` so I think `(272)` covers any framerate. It seems to be the *only* YouTube iTag value I could find for 4320p content in VP9 format. – Romen Dec 06 '19 at 21:15
  • @Romen I loaded up the video again to play around with it, and I now get `(272)` myself. – martixy Dec 06 '19 at 21:22
  • @martixy, Great! Can you try that `chrome://gpu/` page I mentioned and let us know whether "Decode vp9 profile0" is there? – Romen Dec 06 '19 at 21:25
  • I have already added this information to the body of the question, yes it is, `Decode vp9 profile0` `up to 8192x8192 pixels` – martixy Dec 06 '19 at 21:26
  • Well that clears up that Chrome and your hardware *can* decode 8K VP9 video. If it's running slowly or dropping frames to keep up, then the answer to this question might simply be "you need faster hardware". – Romen Dec 06 '19 at 21:28
  • Yet the 3rd example runs well, while seemingly sharing the same characteristics as the 2nd one (8K24), which plays back choppily. And the choppy playback itself is not reflected in the stats as dropped frames, which makes the situation even more confusing. – martixy Dec 06 '19 at 22:12
  • @martixy, I have a GTX 1070 on my PC at home so I will try that video in Chrome. There's a chance that the 3D video is streamed all the same and the CPU is doing extra work locally to display it as 2D. – Romen Dec 06 '19 at 22:54
  • @martixy, I played most of that 3D video on my GTX 1070 with only [**2** frames dropped](https://imgur.com/wJGRaZl) while playing that video in Chrome 78.0.3904.108. Clearly the GTX 1070 *is* fast enough, but your system may have some other bottleneck! – Romen Dec 07 '19 at 00:22
  • @Romen The issue for the 3D video is different. I do not drop any frames from the video, yet the playback is choppy, seeming to run at ~10 fps. There is a discrepancy between what the stats report and what I am seeing on screen. _That is the odd part._ Though your hypothesis above does sound plausible. – martixy Dec 07 '19 at 10:53

3 Answers


What is necessary to play 8K60 footage?

How does playback happen anyway?

That depends on the codec used to encode the 8K footage.
Some of the most commonly used codecs for 8K right now are:

  • H.265 (HEVC)
  • VP9
  • AV1

All of the YouTube videos you have linked report the following codec string:
vp09.00.51.08.01.01.01.01 (272) / opus (251)

Google is a major contributor to developing the VP9 codec, and thus YouTube prefers to stream this encoding for 8K videos. (It saves them a lot of bandwidth over H.265)

My best guess is that the vp09.00 prefix means VP9 Profile 0, which is 8-bit with 4:2:0 chroma subsampling.
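
Out of curiosity, the rest of the string can be unpacked too. Here's a minimal sketch based on my reading of the WebM project's VP9 codec-string binding (profile, level, bit depth, chroma subsampling, then color metadata); the field labels are my interpretation of that spec rather than anything YouTube documents:

```python
# Unpack a VP9 codec string (vp09.<profile>.<level>.<bitDepth>.<chroma>...)
# per the WebM VP9 codec-string binding; labels are my reading of the spec.
CHROMA = {0: "4:2:0 vertical", 1: "4:2:0 colocated", 2: "4:2:2", 3: "4:4:4"}

def parse_vp9_codec_string(codec: str) -> dict:
    fields = codec.split(".")
    assert fields[0] == "vp09", "not a VP9 codec string"
    profile, level, bit_depth = (int(f) for f in fields[1:4])
    info = {
        "profile": profile,        # 0 -> 8-bit, 4:2:0
        "level": level / 10,       # 51 -> level 5.1
        "bit_depth": bit_depth,    # 08 -> 8 bits per sample
    }
    if len(fields) > 4:
        info["chroma"] = CHROMA.get(int(fields[4]), "unknown")
    return info

print(parse_vp9_codec_string("vp09.00.51.08.01.01.01.01"))
# {'profile': 0, 'level': 5.1, 'bit_depth': 8, 'chroma': '4:2:0 colocated'}
```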

To play 8K videos smoothly, you're either going to need a very fast CPU for software decoding, or a GPU with support for hardware accelerated decoding of one of those codecs.

For Software Decoding:

Libraries like FFmpeg or libvpx provide the means to decode videos regardless of your hardware features, so a sufficiently fast CPU is one way to play 8K videos. 8K is a very high bitrate though (up to 240 Mbps), so it may be that even the i9-9900K is not fast enough to decode that much data per second without frame drops.
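
If you want to measure that directly, a rough way is to let FFmpeg decode the file as fast as it can and discard the frames, then compare the reported decode FPS against the video's native 60 fps. This is just a sketch; it assumes ffmpeg is on your PATH and that you have downloaded an 8K VP9 sample locally ("japan_8k60.webm" is a placeholder name):

```python
# Benchmark pure software decoding: decode as fast as possible, discard
# all output frames, and read FFmpeg's -benchmark summary from stderr.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-benchmark", "-i", "japan_8k60.webm", "-f", "null", "-"],
    capture_output=True, text=True,
)
# FFmpeg writes progress and the benchmark summary to stderr; the final
# "fps=" figure is the raw software-decode throughput.
print("\n".join(result.stderr.splitlines()[-5:]))
```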

For Hardware Decoding:

You will need a GPU that supports 8K resolution and has a fast decoder for the codec used by the video.

According to WikiChip, the Intel UHD Graphics 630 in your i9-9900K can only decode any of those codecs at a maximum of 4K resolution. It won't be useful for 8K videos, but it is definitely an ideal hardware accelerator for 4K content using the latest codecs.

Your GTX 1070 can decode at a maximum resolution of 8K, but its support for codecs that can carry 8K is limited.
According to NVIDIA's Video Encode and Decode GPU Support Matrix, your GTX 1070 can only decode VP9 in 8-bit, or H.265 with 4:2:0 chroma subsampling.

This explains why your chrome://gpu/ page displays support for "Decode vp9 profile0 up to 8192x8192 pixels".

Since all of the YouTube links you tested are (probably) VP9 Profile 0 videos, the GPU is being utilized by Chrome to play those videos.

What is the bottleneck in the system?

That likely comes down to the sheer magnitude of 8K video. Since neither the CPU nor the GPU usage is at 100%, it is likely that the dedicated video decoder block in the GPU is hitting its maximum performance. I'm also not sure that NVIDIA has ever said that 8K@60 Hz would be optimal on the GTX 1070, only that it is supported.

See update below...

How come the 60 FPS footage looks smooth but drops a ton of frames, while the 24 FPS footage is choppy?

I can't explain this one, but it's possible that 3D videos are streamed with the full stereo data and the player displays them as 2D with local processing. That would lead to more CPU overhead.


Update:

I tested that 3D video on my GTX 1070 and only 2 frames were dropped out of 1136 frames (52 s); playback was also very smooth. My CPU is a Ryzen 5 3600X, so by no means should your i9-9900K be the bottleneck either. Make sure you're using the latest versions of your video drivers and Chrome. It's possible that the version of Windows 10 could affect this too (I am using 1809 Pro).

Romen
  • This is a nice and comprehensive answer, which along with some bits from other answers (e.g. Media Foundation) offers good insight into how modern computers play back video. Though we are still no closer to explaining the playback idiosyncrasies encountered. The next step might be an explanation of the whole video playback stack, but that's probably a subject big enough to fill a textbook and unlikely to fit in an SO answer. :) – martixy Dec 07 '19 at 11:15
  • @martixy, I believe your system has something wrong with it, based on my own findings. Your CPU is faster than mine and we have the same GPU. I'm fairly certain that you have the necessary hardware to achieve 8K@60 since mine does it just fine. At this point I would be looking for a problem that is causing your system to run slower than it *should*. – Romen Dec 07 '19 at 15:50

For an Nvidia card, it appears that hardware decoding of 8K video requires their PureVideo Feature Set H or higher:

Feature Set H

Feature Set H are capable of hardware-accelerated decoding of 8192x8192 (8k resolution) H.265/HEVC video streams

That puts it in the range of the 20xx, 16xx, and 10xx graphics cards. Yours should, at least, be able to decode 8K H.265 video.

It may be that 8K VP9 is not supported in hardware decode, or that the video uses 12-bit colour depth (Main 4:4:4 12) and as a result requires a newer 20xx or 16xx graphics card. Wikipedia is lacking details on what is in that newer video decoder (Feature Set J). It likely extends 8K support, and that may be the missing piece.

First, try to get a "standard" H.265 video, rather than VP9, and see if that works. I'm not 100% convinced that the single 8K@30 sample on that site (YouTube link) is using the hardware decoder on the 1070. It uses a significant amount of CPU (~70%) and ~30% GPU, but only "3D", so it may just be pushing pixels and doing scaling rather than full hardware decoding.

I cannot easily find any downloadable 8K60 videos, but I'm not fond of web browsers as test beds for video support.
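
If you do find a sample, here is a rough sketch of how to test decoding outside the browser with FFmpeg (this assumes an FFmpeg build with NVDEC support, and "sample_8k_hevc.mp4" is a placeholder for whatever file you download):

```python
# Decode the same file with and without NVDEC hardware acceleration and
# print the -benchmark summary line for each run, for comparison.
import subprocess

def bench(extra_args):
    cmd = ["ffmpeg", "-benchmark", *extra_args,
           "-i", "sample_8k_hevc.mp4", "-f", "null", "-"]
    out = subprocess.run(cmd, capture_output=True, text=True)
    # Caveat: if NVDEC can't handle the stream, FFmpeg warns on stderr
    # and falls back to software decoding, so watch for that warning.
    return out.stderr.splitlines()[-1]

print("hardware:", bench(["-hwaccel", "nvdec"]))
print("software:", bench([]))
```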

Mokubai
  • At least for Chrome/Chromium on Windows, video decoding is handled through [Media Foundation](https://en.wikipedia.org/wiki/Media_Foundation). That will result in utilizing hardware decoding just like any other media player would if there is *any* hardware in the system that supports decoding the video. Even if the GTX 1070 doesn't support it, the i9-9900K has [Quick Sync support for 8/10-bit VP9](https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Development) so hardware decoding should be supported on the OP's system. – Romen Dec 06 '19 at 19:06
  • I was having the same thoughts. I have a machine at home I can use to verify whether the hardware supports the proper codecs at that feature level. – Ramhound Dec 06 '19 at 19:07
  • @Romen - You don’t specify whether you have tried configuring Chrome's hardware acceleration to use the Intel GPU instead of the Nvidia hardware – Ramhound Dec 06 '19 at 19:08
  • @Ramhound, Are you referring to the checkbox in Chrome for "Use hardware acceleration...", or the default graphics device in the Windows context menu? (i.e. Nvidia Optimus on some systems) – Romen Dec 06 '19 at 19:09
  • Nvidia Optimus wouldn’t be relevant if you're trying to use the 9900K’s Intel GPU. – Ramhound Dec 06 '19 at 19:10
  • Yes; I am referring to the hardware acceleration option within Chrome. – Ramhound Dec 06 '19 at 19:18
  • I don't know where to find a _"standard" H.265_ video. And I do not see anything in the Chrome settings that controls which GPU is used; there is only the toggle for hardware acceleration. I'd gladly try it, but I don't know how to switch which GPU is used. – martixy Dec 06 '19 at 21:19
  • @martixy Windows 10 settings can choose the GPU on a per-program basis: Settings --> Display --> Graphics Settings (see https://superuser.com/questions/1434190/set-default-graphics-processor-for-windows-10-laptop for the panel you are looking for); there you can choose the "power saving" (i.e. Intel) GPU for Chrome. – Mokubai Dec 06 '19 at 21:38

Ignoring decompression time, the bandwidth for uncompressed video is approximately the width, times the height, times the storage size of each sample, times the number of color channels, times the framerate.

So for RGB at 8 bits per channel, each sample is one byte, and 8K60 would be:

7680 px × 4320 px × 3 color channels × 1 byte per channel × 60 fps × 8 bits per byte ÷ 1024 (for Kbit) ÷ 1024 (for Mbit) ÷ 1024 (for Gbit)

Comes to about 45 Gbps, or roughly 5.6 GBps.
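
The same arithmetic as a quick Python sanity check:

```python
# Raw bandwidth of uncompressed 8K60 RGB video, 8 bits per channel.
width, height = 7680, 4320
channels, bytes_per_channel = 3, 1
fps = 60

bytes_per_second = width * height * channels * bytes_per_channel * fps
print(f"{bytes_per_second * 8 / 1024**3:.1f} Gbps")  # -> 44.5 Gbps
print(f"{bytes_per_second / 1024**3:.2f} GBps")      # -> 5.56 GBps
```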

Yorik