Blender video editor only using single thread

After using Blender for several months as a video editor, I started watching my CPU utilization during a render and was seeing an average of 35%. I began looking for ways to speed this up. I played with various settings in Blender in the hope of reducing my render times; none were successful.

I also began to wonder if the problem might have something to do with Windows 10. I installed Manjaro, promptly downloaded Blender, and tried the same test. No change; my render times were virtually identical.

Time to test Kdenlive. I had downloaded this software before, but I never put any time into learning it. I set the render settings as close as possible to those I had used in Blender and rendered out the same video I was testing with in Blender. WOW, what a difference. My render time in Kdenlive was 22 seconds, with CPU utilization near 100% throughout the rendering process. The render time in Blender was 1:28, basically four times slower. It appears to me that, at least with my hardware setup, Blender is only using the equivalent of one processing core for rendering video from the sequence editor.

If anyone else is experiencing this and knows a solution, please let me know. For those having the same issue and looking for an alternative, give Kdenlive a shot.

3 Likes

Very interesting. Can you supply more information on software versions used, codecs and hardware? And can you describe the kind of workload a bit more? Did you use transitions, animations or anything?

I am currently using Blender 2.77a, with an H.264 codec and MP3 audio output into an MPEG-4 container. My test video was a game clip recorded in 1080p at 60 fps. My hardware is an Asus Z87 Hero with an i5 4670K @ 4.6 GHz, 16 GB of DDR3, and a GTX 780. Custom water cooling on the GPU and CPU; neither is overheating.

This test was done strictly with the video sequence editor using the Blender Internal render engine, no 3D modelling. I have tested it with the Cycles render engine, but the result is the same. While I do use some transitions and effects in my videos, the test video does not include any; it is uncut, raw footage. Hope this helps. Let me know if you need any more information.

I just want to add that I am not trying to smear or be negative toward Blender in any way. It is a great piece of software. I am just looking to see if anyone else is having this issue and to help find a solution.

I haven't used Blender so far, but I am very interested in it for video editing. I know that HandBrake can use lots and lots of cores in some scenarios, and the codec is a critical factor for that. H.264 should be able to use more than just one core, even without any effects or transitions. So this seems odd.


I just googled a bit and I think I found something. Scroll down to the last answer on this page.

2 Likes

Interesting. I have not seen that before. Although it seems a bit impractical, it would be fun to test. That is one thing I love about Blender, they seem to have thought of everything, in some manner or another.

Interesting reading... I will try it out next time I dive into a Blender render...

Multi-threaded encoding of video is very finicky; whether it is available at all depends entirely on the codec and encoder, and Blender does not offer a lot of flexibility here. Part of the problem is that for a codec like H.264, every frame must be handed to the encoder in order, while rendering on multiple threads will inherently finish frames out of order, so they either have to be held back until the encoder is ready (stalling) or stored somewhere in the meantime (caching).

Fortunately, you can use the Frame Step option along with multiple instances of Blender. Set Blender to render each frame to a PNG file, give it a directory to store the frames in, then save a new .blend file so these settings are available to the other instances. Next, give each Blender instance a different starting frame (the first starts at frame 1, the second at frame 2, and so on), with each using a frame step equal to the number of instances in use. Each instance will render a frame to a PNG file, skip ahead by the frame step, and move on until all of the video's frames have been rendered. This can also be used to distribute the rendering process across multiple machines if need be. Do be warned that the speed of your hard drive also comes into play here: too many instances will end up waiting on the hard drive before they can write another frame, not actually going any faster.
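If you'd rather script that than launch the instances by hand, here is a minimal sketch of the idea in Python. The .blend path, frame range, and instance count are placeholders I've made up; it assumes the .blend file has already been saved with PNG output and an output directory set, as described above.

```python
# Sketch: launch several background Blender instances, each starting one frame
# later than the previous one and stepping forward by the number of instances,
# so together they cover every frame of the animation.
import subprocess

BLEND_FILE = "project.blend"   # placeholder: .blend saved with PNG output + output dir
FRAME_START = 1
FRAME_END = 1800               # placeholder: e.g. 30 seconds at 60 fps
INSTANCES = 4                  # roughly your physical core count

procs = []
for i in range(INSTANCES):
    cmd = [
        "blender", "-b", BLEND_FILE,   # -b = run without the UI
        "-s", str(FRAME_START + i),    # each instance starts one frame later
        "-e", str(FRAME_END),
        "-j", str(INSTANCES),          # frame step: skip the frames the others handle
        "-a",                          # render the animation
    ]
    procs.append(subprocess.Popen(cmd))

for p in procs:
    p.wait()
```

As noted above, going past a handful of instances mostly just shifts the bottleneck to the disk.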

This method gets your video out of the VSE, but it's not shippable yet. You then need an encoding program (probably ffmpeg) to take the directory full of PNGs and compress it into your final file. Using ffmpeg at the command line is a lot less convenient than simply hitting export, but there is one major benefit: you get complete control over the encoding process, including multi-threaded H.264 support and finer control over quality (Blender, and even some commercial tools, won't let you specify CRF settings).
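As a rough example of that last step, the encode could look something like this; the frame rate, CRF value, and file names are placeholders, and it's wrapped in Python only for consistency with the sketch above.

```python
# Sketch: take the directory of numbered PNGs Blender produced and encode them
# to an H.264 MP4 with ffmpeg. Adjust the frame rate and CRF to taste.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "60",        # match the frame rate the project was rendered at
    "-i", "frames/%04d.png",   # numbered PNG sequence written by Blender
    "-c:v", "libx264",         # multi-threaded H.264 encoder
    "-crf", "18",              # quality target; lower = better quality, bigger file
    "-preset", "medium",
    "-pix_fmt", "yuv420p",     # widest player compatibility
    "output.mp4",
], check=True)
```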

It may have something to do with how it was compiled, if you got it prepackaged from your distro's repositories.

I have not played with Blender in a while. I had this experience on Windows as well, not just Linux. When I get a moment, I'll download the newest version and see if the issue still exists.

Other than a few quirks with the audio in the version I am using, I like Kdenlive better for video editing. I need to try the newest update here as well.

People commonly roll their own and compile with Cygwin on Windows for performance, too. The default build is less than stellar on every platform.

If you want something node-based just for editing, there's always Natron.

1 Like

I was stitching a few clips together in Blender's Video Sequence Editor and noticed that it only used one core. Solution? Break the video up into separate parts, render them in parallel across X instances, where X is your core/thread count, and then stitch those parts back together.

Here's a useful script I found online to do it: https://github.com/sciactive/pulverize

(inb4 PHP cringe, it works tho)

EDIT: Here's where I found it: https://blender.stackexchange.com/questions/61736/vse-isnt-multi-threading
Someone there explains that the VSE simply isn't multi-threaded in Blender.
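For the "stitch those parts back together" step, a minimal sketch using ffmpeg's concat demuxer might look like the following. The part file names are placeholders, and it assumes every part was rendered with identical settings so the streams can be copied without re-encoding.

```python
# Sketch: stitch separately rendered parts back together with ffmpeg's concat
# demuxer. "-c copy" copies the streams instead of re-encoding, which only works
# cleanly if all parts share the same codec, resolution and frame rate.
import subprocess

parts = ["part_01.mp4", "part_02.mp4", "part_03.mp4", "part_04.mp4"]  # placeholders

with open("parts.txt", "w") as f:
    for p in parts:
        f.write(f"file '{p}'\n")

subprocess.run([
    "ffmpeg", "-f", "concat", "-safe", "0",
    "-i", "parts.txt",
    "-c", "copy",
    "stitched.mp4",
], check=True)
```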

1 Like

I know VSE is kind of an afterthought in Blender, but this seems like a major flaw to me, especially in 2017. I hope VSE gets some love from the devs in the future.

Not trying to shit on Blender; it is an amazing program. However, I find it much simpler to go with another program that renders on multiple threads from the get-go, as opposed to using workarounds.

I do appreciate the info, and I would like to experiment more with it in the future. For now though, I think Kdenlive will be my go-to.

1 Like

Other than making sure that the render settings aren't specified to use only one thread, I don't know why it wouldn't be using them all. Does it do that when you are rendering a still?