Panasonic and Sony, please implement a 4K 10-bit 4:2:0 recording mode
  • The GH5's 4K 10-bit 4:2:2 footage is very processor-intensive and needs an 8-core CPU to play smoothly in an editing timeline. Even a 6-core CPU can stutter when playing the files in the timeline unless it is overclocked. The GPU cannot help because the CPU does the decoding.

    So the idea is to implement a 4K 10-bit 4:2:0 recording mode, because it would be less processor-intensive and any quad-core computer could edit it with no stutter in the timeline. It would let lots of people with quad-core and six-core machines avoid a computer upgrade. For many people that upgrade is simply out of reach, because an 8-core CPU plus motherboard is very expensive.

    A 4K 10-bit 4:2:0 recording mode would be enough for good V-Log and S-Log grading, because 10-bit avoids banding. The 4:2:2 mode would remain an option for people who need chroma key and other chroma-intensive compositing tasks.

    4:2:0 video carries 25% less sample data than 4:2:2 video (only the chroma samples are reduced; the luma resolution is the same), and this would make 4K 10-bit footage less processor-intensive to play smoothly in an editing timeline on quad-core computers.

    Ultra HD Blu-ray is 10-bit 4:2:0, and if 4:2:2 is needed, the footage can be rendered to 4:2:2 without a significant difference; the important thing is the 10 bits.

    So please, Panasonic, implement a 4K 10-bit 4:2:0 recording mode in a firmware update for the GH5 and future cameras. This would be a simple firmware update.

    Sony could also implement this in future 4K cameras.

    Read about the CPU tests, with detailed explanations and reports, here:

    http://www.personal-view.com/talks/discussion/16454/what-cpu-and-gpu-to-edit-10bit-4k-video/p1
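
    The 25% figure follows directly from how chroma subsampling partitions a frame. A quick back-of-the-envelope check in Python (the UHD frame size is just an example; any resolution gives the same ratio):

    ```python
    # Total samples per frame under different chroma subsampling schemes.
    # In 4:2:2, each 2x2 luma block carries 4 chroma samples (2 Cb + 2 Cr);
    # in 4:2:0, it carries only 2 (1 Cb + 1 Cr).

    def samples_per_frame(width, height, chroma_per_2x2_block):
        luma = width * height
        chroma = (width * height // 4) * chroma_per_2x2_block
        return luma + chroma

    W, H = 3840, 2160  # UHD 4K frame
    s422 = samples_per_frame(W, H, 4)  # 4:2:2 -> 16,588,800 samples
    s420 = samples_per_frame(W, H, 2)  # 4:2:0 -> 12,441,600 samples

    print(f"4:2:0 carries {100 * (1 - s420 / s422):.0f}% fewer samples")  # 25%
    ```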

  • 6 Replies
  • Sony needs to implement a DCI 4K 24fps mode.

  • Seems like a strange request. I remember when people were producing great content while others were asking for years (since the GH2 launch, as I recall) for 4:2:2 10-bit internal recording. For certain cams, getting this option via an external recorder has been possible for some time now, but it's obviously not accessible to everyone, given the additional 3-5 grand minimum needed to take advantage of it. Now it's finally coming in a tiny camera body at a pretty reasonable price (sure, lower would be better), and you want less color information simply because your computer can't handle the data?

  • @filthy I agree. Some people like to spend thousands of dollars on cameras but not hundreds of dollars on computer updates. ;-)

  • I believe the 6K Photo mode on the GH5 is 10-bit 4:2:0. Of course, it's also H.265, so I'd imagine it's even more processor-intensive to decode for playback and editing. It's possible that the eventual intraframe option (promised in a firmware update) may prove to be a more edit-friendly option as well.

    But really I think you should probably be directing this kind of request to NVidia or Intel so that they can add better support for 10-bit decoding to enable smoother editing. Would probably benefit a wider range of playback applications as well.

    In terms of the camera manufacturers, we should really be encouraging them to include the highest quality recording modes possible, whether that's 10-bit 4:2:2 or beyond. I mean, you could always transcode the footage into a different format (10-bit 4:2:0 or something else) as a preliminary step before editing if you really want to.
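
    As a rough illustration of that transcode-before-editing step, here is a minimal Python sketch that builds an ffmpeg command to down-convert 10-bit 4:2:2 footage to 10-bit 4:2:0. The file names are placeholders, and the codec choice is an assumption (a 10-bit-capable x264 build), not a recommendation from this thread:

    ```python
    import subprocess

    def build_transcode_cmd(src, dst):
        """Build an ffmpeg command that keeps the 10-bit depth but
        drops chroma from 4:2:2 to 4:2:0 (pixel format yuv420p10le)."""
        return [
            "ffmpeg", "-i", src,
            "-c:v", "libx264",          # assumption: ffmpeg built with 10-bit x264
            "-pix_fmt", "yuv420p10le",  # 10-bit 4:2:0
            "-c:a", "copy",             # leave audio untouched
            dst,
        ]

    cmd = build_transcode_cmd("gh5_clip.mov", "gh5_clip_420.mov")
    print(" ".join(cmd))
    # Uncomment to actually run (requires ffmpeg on PATH):
    # subprocess.run(cmd, check=True)
    ```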

  • I agree with davedv, and I'd pose a bigger challenge:

    Ask Adobe to finally optimize their software for recent codecs on new Nvidia and AMD consumer GPUs and CPUs, because the footage is not the bottleneck...

    Consider editing 8K RED footage: they literally propose a return to proxies... yep, working with 1080p proxies and rendering the final at 4K or 8K... a solution that worked decades ago when we transitioned to HD, but is inexcusable today considering modern hardware...

    I believe the Adobe subscription model is to blame here...
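
    For anyone who does end up on the proxy workflow mentioned above, a sketch of the usual ffmpeg incantation, again with placeholder file names (ProRes Proxy is a common but assumed choice of intermediate codec):

    ```python
    def build_proxy_cmd(src, dst):
        """Build an ffmpeg command for a 1080p editing proxy
        (ProRes Proxy profile via the prores_ks encoder)."""
        return [
            "ffmpeg", "-i", src,
            "-vf", "scale=-2:1080",  # downscale to 1080p, keep aspect ratio
            "-c:v", "prores_ks",
            "-profile:v", "proxy",
            "-c:a", "copy",
            dst,
        ]

    print(" ".join(build_proxy_cmd("r3d_8k.mov", "r3d_8k_proxy.mov")))
    ```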