Creating a test video to film off a screen that stresses a camcorder's compression codec to the max?


A short while ago I started filming with a Blackmagic Design Pocket Cinema Camera 6K since I am a massive fan-person of their compressed Blackmagic RAW codec.

While I am pretty much satisfied with its image quality, I am annoyed by the interfaces it offers for storing its recordings:

  • USB-C (only 5 Gbit/s, not 10 Gbit/s)
  • CFast 2.0 (electrically the same as SATA 6 Gbit/s, so it is compatible with SATA SSDs via adapters; this is the camera's fastest interface)

The camcorder drops frames when using the Blackmagic RAW codec at the highest quality settings (constant quality Q0 or constant bitrate 3:1) combined with the highest possible resolutions at 50 or 60 fps:

  • 6144 x 3456 (6K) (50 fps, 60 fps isn’t supported with this resolution)
  • 5744 x 3024 (5.7K 17:9) (50 and 60 fps)

(I’ve cried to poor @wendell about this but I would like to test this issue’s behavior more seriously myself)

Blackmagic RAW uses intra-frame compression, so I would like to create a synthetic test video, film it off an OLED TV with the camcorder, and investigate what works and what causes a recording to stop due to dropped frames.

This test video should have 3 different kinds of sequences that cause different levels of stress to the camcorder’s encoding pipeline (easy - medium - hard).

I’m not specifically going for the best compromise between file size and image quality. Rather, I would like to find the conditions under which the hardware encoder produces a bitrate that is too much for even the SATA 6 Gbit/s interface (and the connected “high-quality” SSD) to handle, so this has a chance of getting fixed in future firmware updates.

What good is the best image quality option if it chokes even the fastest interface of the device so you cannot trust recording with it?

(Note: I’ve tested a very wide variety of high-performance SSD contraptions on both the USB-C and CFast 2.0 interfaces)

Does anybody have an idea how to create such an “abstract” test video, so I can test each codec/fps setting with the same “image load” in a way that is as repeatable as possible?

Thanks & regards,
aBavarian Normie-Pleb


Fine detail like dust and confetti are hard on video codecs.

The above post would probably be in the medium category for your test.
The hard category would just be a video of white noise (the fun little black-and-white dots that made your eyes go weird back in analogue times).

As for easy, basically any kind of movie or video with fairly static/smooth scenes.

Here’s an easy way to get “digital white noise”:

Purposely de-sync an HDCP-protected source (force a bad handshake) and display the noise that results when the encrypted stream is decoded as if it were unencrypted data.
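If you'd rather not rely on a broken HDCP handshake, you can generate the noise yourself. Here's a minimal, hedged sketch in plain Python (no third-party libraries): the frame size, frame count, output file name, and the later ffmpeg wrapping step are all my own assumptions, not anything from this thread. The fixed RNG seed is what makes the "image load" repeatable across test runs.

```python
# Sketch: generate repeatable "digital white noise" frames as raw RGB24 data.
# Assumed workflow: wrap the raw stream afterwards with something like
#   ffmpeg -f rawvideo -pix_fmt rgb24 -s 256x144 -r 50 -i noise.rgb noise.mov
import random

WIDTH, HEIGHT, FRAMES = 256, 144, 10  # tiny demo size; scale up for a real test
rng = random.Random(42)               # fixed seed -> identical noise every run

with open("noise.rgb", "wb") as f:
    for _ in range(FRAMES):
        # Every pixel of every frame is independent random data: roughly the
        # worst case for an intra-frame codec, since nothing is predictable.
        frame = bytes(rng.randrange(256) for _ in range(WIDTH * HEIGHT * 3))
        f.write(frame)
```

For the easy and medium tiers, the same loop could emit flat gray frames or noise over only a small fraction of the pixels, so all three stress levels come from one deterministic generator.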

I completely understand, as the compression uses variable bit rate rather than constant bit rate. You’re testing the speed of the card, and Blackmagic equipment dropping frames isn’t as uncommon as you might think. The only piece of equipment less likely to drop frames is the HyperDeck 8K, since it has a slot for an NVMe cache. That’s what the Pocket cameras and the URSA cameras need.

Another option is to use a CFast-to-SATA adapter and record to an 870 EVO 1 TB or larger. CFast cards aren’t guaranteed to sustain sequential writes, because they likely lack ATA Secure Erase, which would let you wipe all the cells fresh for consistent speed.

This thread might be better in the “Hardware > Other Hardware” forum.

Small update:

Also tested a premium CFast 2.0 card (my very first one) from a kit that was supposedly tested with the PCC 6K camcorder, and it drops frames after only a few seconds at 3:1 or Q0, even sooner than my “unsupported” CFast 2.0-to-SATA adapter contraption.

(Note: 6144 x 3456 (6K) at 50 fps, and 5744 x 3024 (5.7K 17:9) at 50 and 60 fps)


I’ve also tested various NVMe drives with a second-gen 20 Gbit/s USB-C-to-M.2(-to-U.2) adapter:

  • WD SN750 2 TB with DRAM
  • WD SN550 2 TB without DRAM
  • Intel Optane 905P 480 GB
  • Samsung PM1733 7.68 TB with DRAM

Of course, all of these get throttled by the camcorder’s USB-C interface to 5 Gbit/s, but there was a slight hope that write latency might have been the issue with the previously used SATA SSDs.

These NVMe drives behave about the same as SATA SSDs connected via USB-C.

I had previously hoped to use a USB-C-to-NVMe solution for everything, so I could later copy the files to an editing computer much faster than the SATA interface would allow.

As of now I think it’s a camcorder firmware issue that the bitrate is allowed to exceed roughly 500 MByte/s, leading to the dropped frames.

Is there any way to actively adapt the camcorder’s SATA port to a 10 Gbit/s USB storage adapter or directly to an NVMe drive?

Regarding the test video:

Since I’ve never created such footage from scratch (I’ve only edited “real” recordings), what program would be suitable for creating an image sequence similar to white noise, but with all the colors mixed in and changing throughout?


I think one way to make that is to encrypt a large file with AES-256, then read the encrypted data (without decrypting it) as raw video data. Encode that to ProRes and play it back on an Atomos Ninja V.
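The idea above can be sketched in a few lines of Python. One caveat: the standard library has no AES, so this stand-in derives structureless bytes with SHA-256 in counter mode instead of actual AES-256 ciphertext; the visual result (every color channel churning randomly) should be similar. The frame geometry, key, and output name are all illustrative assumptions.

```python
# Sketch: treat ciphertext-like pseudorandom bytes as raw RGB24 video frames.
# Stand-in keystream: hash(key || counter) concatenated, NOT real AES-256.
import hashlib

WIDTH, HEIGHT, FRAMES = 256, 144, 10
KEY = b"test-key"  # hypothetical key; any value works for noise

def keystream(n_bytes: int, key: bytes) -> bytes:
    """Deterministic pseudorandom bytes via SHA-256 in counter mode."""
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

with open("cipher_noise.rgb", "wb") as f:
    for i in range(FRAMES):
        # A distinct key per frame, so the "colors" change completely
        # from one frame to the next.
        f.write(keystream(WIDTH * HEIGHT * 3, KEY + i.to_bytes(4, "big")))
```

Because the stream is deterministic for a given key, re-rendering it produces bit-identical frames, which keeps the test repeatable across codec/fps settings.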

I already posted the solution to use something like the 870 EVO directly:

NVMe recording requires a brand new FPGA base for the camera, and that requires a full camera redesign.

This is Blackmagic’s problem: they can create the recording modes, but they can’t anticipate all recording media and can’t adapt to them by adding a cache.

While I didn’t state it explicitly before: I’ve already tested a Samsung 860 Pro 2 TB SSD with MLC NAND connected natively via a CFast 2.0-to-SATA adapter.

I don’t get why BMD offers codec options that result in a higher bitrate than even the device’s fastest electrical interface can transmit, even without counting protocol overhead.

Concrete examples:

  • 6144 x 3456 (6K) (50 fps, BRAW 3:1 or Q0)

  • 5744 x 3024 (5.7K 17:9) (50 and 60 fps, BRAW 3:1 or Q0)

Because the hardware testing skipped recording tests. The same was true with the early HyperDecks. They wanted to get the encoding IP implemented ASAP because they were competing heavily with others and rushed to market. They don’t test with all SSDs.

The 12K camera will have the same problem.

Am I too innocent to think that something like this should have been addressed with firmware updates ASAP after a product launch?

It has nothing to do with the SSDs used if the hardware encoder puts out more than 700 MB in a second (simple math, extrapolating from working 30 fps 3:1 and Q0 recordings) on a device whose fastest interface is SATA 6 Gbit/s.
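A back-of-the-envelope check of that extrapolation. Everything in this sketch is an assumption for illustration: 12 bits per photosite (BRAW's exact internal format isn't public) and a 3:1 setting that compresses the raw stream by exactly a factor of three.

```python
def braw_rate_mb_s(width, height, fps, bits_per_px=12, ratio=3):
    """Rough compressed data rate in MB/s under the stated assumptions."""
    raw_bytes_per_frame = width * height * bits_per_px / 8
    return raw_bytes_per_frame * fps / ratio / 1e6

r_6k_50 = braw_rate_mb_s(6144, 3456, 50)   # ~531 MB/s
r_57k_60 = braw_rate_mb_s(5744, 3024, 60)  # ~521 MB/s

# SATA 6 Gbit/s carries at most ~600 MB/s after 8b/10b line coding, and real
# SSDs sustain more like 500-550 MB/s, so 3:1 alone already sits at the edge.
print(round(r_6k_50), round(r_57k_60))
```

Under these assumptions, the fixed 3:1 ratio lands right around the ~500 MByte/s figure mentioned earlier, while Q0 (constant quality) has no ratio cap at all, which would explain bursts well past that on busy frames.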

That’s a consequence of variable bit rate. ProRes and most wavelet-based compression are variable bit rate.

Back in the MPEG days they could constrain the bitrate to fit the interface. Not so with wavelet codecs.


Someone needs to alter this clip.

Honestly, the fact that the BMPCC cameras use industrial-grade sensors for filmmaking is the biggest reason I avoid them. The earliest models had both the black-sun and fixed-pattern-noise issues. It just shows how much the image processor has to polish a poop.

Just looked up examples of that “black sun” issue, and I’m at least happy to report that my PCC 6K doesn’t have it. I think I would freak out that the end times are finally here… :upside_down_face: