|
Capture: H.264
Edit: Grass Valley HQ/HQX (Transcode) for Performance/Minimal Generational Quality Loss
Render: H.264
I don't recommend capturing and editing in H.265 (HEVC) if you can avoid it, especially if you have anything less than a fairly recent (i.e. 7th Gen or newer) HQ-series or desktop i7 CPU (4 cores/8 threads minimum). It bottlenecks the CPU when editing in an NLE.
DNxHR and ProRes will not work in PowerDirector. You need a Pro NLE to handle those, and QuickTime CODECs (i.e. Avid DNxHD) are no longer supported in PDR.
I am not sure if PDR handles 10-bit footage well (if your camcorder can do that). Someone else will have to chime in on this.
EDIT: If you record video with an iPhone, I recommend setting it to Most Compatible. That way, it will only record HEVC if you select a Video Mode that requires it (UHD 60 FPS, for example... but good luck editing that... especially in HEVC!). On a Samsung Galaxy S/Note device, make sure the HEVC video toggle is off (the default) in Camera settings.
The HEIF photos from iPhones also tend to look worse than JPEGs (they're definitely smaller, though!), and more so in low-light situations.
|
|
Quote
Here are a couple of suggestions:
Listening to your system isn't a very reliable way of understanding what's going on. Open up Task Manager, click on the Performance tab and scroll down to your GPU. Even when your CPU is running at 100%, your system may also be using your GPU almost as hard, like this:
I'm sure you know this, but I didn't see it in your description. Unless you have these choices checked at the bottom of the Produce window, PD won't use your GPU at all:
Also, if you have the "latest version" of nVidia's drivers (v416.xx) but you don't have the beta 2224 patch installed, PD can't access your GPU at all! So either install the patch or go back to the 411.70 drivers. There's a whole thread on just this issue!
As for the stuttering you've seen in your produced vids, I'd actually argue that's more likely to be caused by the GPU rather than be solved by using it. You can easily test that by unchecking the "Fast video ..." option and producing again.
That's not the GPU, that's just the ASIC on the GPU. "Video Encode" is nothing but the NVENC chip on the GPU board. You can remove your GPU, and if you have QSV, you will see the same thing with an Intel iGPU, and your Encode Speeds will barely change.
The actual GPU (the graphics processor, CUDA/compute) isn't really accelerating anything. In fact, your PC is barely passing any work to the GPU. This is why it's using almost no VRAM and why "Copy" (which is how work is handed between CPU and GPU) is stupendously low, with almost no spikes.
Your machine is still doing almost all of the rendering on the CPU. It's only doing the DECODING and ENCODING on a dedicated ASIC on the GPU board - not the actual GPU itself. I know, it's kind of confusing to wrap your head around sometimes. NVDEC/NVENC is NOT the "GPU"; it's a separate chip that ships with the GPU, and it does the same thing as Intel QSV, which you can have without even having a dedicated GPU in your system. The benefits of a GPU for video editing are far greater than NVENC, provided the software is equipped to utilize it. The fact that you're barely using any VRAM is a good indicator of how little work the application is passing to the GPU. It's only doing generic "GPU stuff," frankly. The 84% is nothing more than the usage on the NVENC chip on your GPU board... Windows Task Manager is useless for these things. Use GPU-Z.
Install an NLE like Premiere Pro or DaVinci Resolve on your PC and look at how much actual work the GPU does, and how much VRAM, Copy, and Compute get used for the same footage with equivalent effects in those software packages. This will be evident even in Windows Task Manager, but even more so in GPU-Z (for example).
|
|
Quote
It does require a beefier PSU; mine has two 6-pin connectors and I have used a 6-pin to 8-pin adapter.
However, just for video editing you don't need the top-of-the-line video card; you just need the minimum card in a given class that supports video encoding. In this case an RX 560/560X would be sufficient, since it has the same encoding engine:
https://en.wikipedia.org/wiki/Video_Coding_Engine#GPUs
Yes, everyone thinks that when they're doing nothing but cutting clips and rendering out stuff...
Then they try to do color correction, effects, etc. and complain about awful performance, less-than-realtime playback, and rendering speeds that nosedive because their weaksauce GPU can't handle it.
Especially those people who try to move to 4K.
You should have at least a 4GB GPU for video editing on the eve of 2019. You need that for 4K when you start doing effects, color correction, etc., unless you want to be forever limited to proxy playback resolutions (though I think PowerDirector enforces that at all times anyway...).
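To put rough numbers on that, here is a back-of-the-envelope sketch of how fast uncompressed 4K frames eat VRAM. The bytes-per-pixel values and resident-frame counts are illustrative assumptions, not measurements from any particular NLE:

```python
# Rough, illustrative math: why 4K effects work wants a larger card.
# Assumptions (not measured): 8-bit RGBA working frames, and the editor
# keeping a couple of dozen frames plus effect intermediates on the GPU.

def frame_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed frame buffer."""
    return width * height * bytes_per_pixel

uhd_8bit = frame_bytes(3840, 2160)      # 8-bit RGBA UHD frame
uhd_16bit = frame_bytes(3840, 2160, 8)  # 16-bit/half-float RGBA (10-bit workflows)

print(f"One UHD 8-bit frame:  {uhd_8bit / 2**20:.1f} MiB")
print(f"One UHD 16-bit frame: {uhd_16bit / 2**20:.1f} MiB")

# Hypothetical working set: ~1 second of lookahead at 24 fps,
# plus one intermediate buffer per effect in the chain.
frames_resident, effect_buffers = 24, 4
working_set = (frames_resident + effect_buffers) * uhd_16bit
print(f"16-bit working set: {working_set / 2**30:.2f} GiB")
```

That working set is just the frame buffers; the codec's decode surfaces, LUTs, textures, and the application's own allocations all come on top of it, which is how a 2GB card runs out of room at 4K.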
GPU does matter for video editing, a lot... unless you're doing stupendously trivial work.
HW decoders and encoders accelerate the decoding of the source video and the encoding of the rendered frames. They do not render those frames for you - i.e. ALL of the work in between those two steps. The CPU and GPU are needed for that. A hardware encoder will do nothing to accelerate color correction effects... that's largely GPU-bound. A hardware encoder will do nothing to accelerate stabilization effects... that's largely CPU-bound.
---
When you render a video, this is how things work...
1. AMD UVD or NVDEC decodes the video from H.264 into raw video frames, which are much bigger and take up a LOT of space in RAM (and VRAM, when they are passed to the GPU).
2. CPU Effects are calculated and applied to the frame. Some CPU Effects are light, some are quite heavy.
3. GPU Effects are calculated and applied to the frame. Some are light, some are heavy. Frames have to be copied to GPU VRAM and then back to the CPU (unless you're using an iGPU, which allocates and shares RAM with the CPU).
- CPU and GPU effects may be applied in different orders, depending on how they are stacked in some NLEs.
4. AMD VCE or NVENC then encodes and re-compresses the video.
- GPU encoding is typically of lower quality than CPU encoding, so for final renders the general guidance is to avoid VCE or NVENC. They are fast, but they are actually more useful for media playback acceleration and streaming applications (since they're a separate chip, they use almost no CPU or GPU... so they're great for streaming to Twitch while gaming; you basically lose no performance even on budget gaming systems).
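The four steps above can be sketched as a toy pipeline. Everything here is a stand-in written for illustration (nothing actually calls NVDEC/NVENC or applies real effects); the point is the ordering and the CPU<->GPU copies:

```python
# Toy model of the render pipeline described above. Each stage is a
# placeholder for the real work; stage names mirror the numbered list.

def decode(packet):              # step 1: ASIC decodes H.264 -> raw frame
    return {"src": packet, "in_vram": False}

def cpu_effects(frame):          # step 2: CPU effects (e.g. stabilization)
    frame["cpu_done"] = True
    return frame

def gpu_effects(frame):          # step 3: upload, GPU effects, download
    frame["in_vram"] = True      # the "Copy" traffic in Task Manager
    frame["gpu_done"] = True     # e.g. color correction on the GPU
    frame["in_vram"] = False     # copy back to system RAM
    return frame

def encode(frame):               # step 4: ASIC re-compresses the frame
    return b"h264-packet"

def render(packets):
    return [encode(gpu_effects(cpu_effects(decode(p)))) for p in packets]

out = render(["p0", "p1", "p2"])
print(len(out), "encoded packets")
```

Only steps 1 and 4 are what NVDEC/NVENC accelerate; steps 2 and 3 are the grunt work that stays on the CPU and GPU proper.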
---
The part that bogs down video editing is NOT encoding, as encoding can be done AFK, and is generally a passive task. It's the part in the middle that takes up the brunt of the time - even when encoding.
This is why NVENC's benefit decreases the longer and more complicated your edit is... you increase the amount of CPU/GPU grunt work, which means you spend less of your time decoding or encoding and more of your time processing on the CPU and GPU.
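You can put numbers on that diminishing return with some Amdahl-style arithmetic. All the per-frame timings below are invented for illustration (as is the assumed 10x encoder speedup); only the shape of the result matters:

```python
# Illustrative math: a HW encoder only speeds up the encode slice of each
# frame, so its overall benefit shrinks as the effects workload grows.
# All millisecond figures and the 10x speedup are made-up assumptions.

def render_time(decode_ms, effects_ms, encode_ms, hw_encode=False):
    encode = encode_ms / 10 if hw_encode else encode_ms  # assumed 10x NVENC
    return decode_ms + effects_ms + encode

# Simple edit: cuts only, almost no effects work per frame.
simple = (render_time(2, 5, 10), render_time(2, 5, 10, hw_encode=True))
# Heavy edit: color correction, stabilization, etc. dominate.
heavy = (render_time(2, 80, 10), render_time(2, 80, 10, hw_encode=True))

print(f"simple edit: {simple[0] / simple[1]:.2f}x overall speedup")
print(f"heavy edit:  {heavy[0] / heavy[1]:.2f}x overall speedup")
```

With these toy numbers the hardware encoder roughly doubles throughput on a cuts-only edit, but buys barely 10% once effects dominate the per-frame time.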
People who buy weaksauce GPUs for PDR and eventually have to move up to something like Premiere Pro, Resolve, Media Composer, etc. are going to suffer in those products when they begin using heavier/more advanced effects, color correction, or stabilization, or want to render out to formats like ProRes or DNxHR. For 4K, you will start running out of VRAM when using effects that make better use of the GPU than consumer-level editors do (all of which are terrible at this - almost without exception).
Most of PDR's "acceleration" is generic OpenCL stuff, and the decode/encode is NVDEC/NVENC and UVD/VCE. It's not the same kind of CUDA acceleration you find in the higher-end products, and this leads people to buy hardware that may not stand the test of time (or, in some cases, leaves them with borderline useless machines when they have to move up to a more advanced package - e.g. laptops with only an iGPU and no TB3 eGPU connectivity).
And those higher-end packages are often better optimized for CUDA than for AMD GPUs (unless it's a macOS port optimized for Metal, like DaVinci Resolve).
|
|
Quote
I notice that I don't get thumbnails in Kodi/OpenElec when I encode video to h.265. (I get green thumbs instead.)
I thought that was just a function of h.265, with its higher compression. But if PD can display the thumbs, then I guess it's not a limitation of h.265.
Maybe it's a function of how they're encoded? I use my GTX960. Maybe the HW encoder doesn't do thumbs.
A thumbnail is just the software taking a frame out of the video and displaying it as an indicator of what clips are in the bin.
This is like your image browser not displaying thumbnails for animated GIF files.
Whether you're using HW or software decoders, it is the same. If the software can play the video files, then clearly the HW decoder is working.
|
|
|