CyberLink Community Forum
Setting NVIDIA GPU as Default for PowerDirector
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53
This is probably an obvious question, but here goes...

Please could someone advise me on how to set the NVIDIA GPU on my laptop as the default when using PowerDirector. I have tried going through Windows Settings > Graphics, but the app does not appear in the drop-down list, and there are no per-application settings in the NVIDIA Control Panel either. I either have to set everything to NVIDIA or stay with the default choice for PowerDirector, which is the integrated GPU (AMD Radeon).

Is there a way to specify NVIDIA for PD?

Thanks,

Tim
optodata
Senior Contributor Location: California, USA Joined: Sep 16, 2011 16:04 Messages: 8630
Yes. See if this discussion helps.
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53
Quote: Yes. See if this discussion helps.


Hi and thanks for the information.

I checked out the post and tried following the advice about choosing apps to add to the list and then setting them to High Performance accordingly. However, the only ones I can find are Utility checker and one called PDagent, but not Rafiki or PowerDirector.

Would anyone know what I need to look for and add so that I can change the graphics card settings?

Thank you,

Tim
Attachment: DxDiag 08.05.23.txt (117 KB, downloaded 142 times)


optodata
Senior Contributor Location: California, USA Joined: Sep 16, 2011 16:04 Messages: 8630
You have to have PDR.exe listed. Use the Add desktop app button and browse to the folder shown below:

[screenshot: PowerDirector installation folder]
You will also want to add GPUUtilityEx.exe and RafikiAgent.exe, which are located in the same folder, and set them all to High Performance.
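
If you would rather script this than click through the Settings page, Windows stores those per-app choices as plain string values under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so a small sketch like the one below should have the same effect as the Graphics page. The install folder used here is an assumption; point it at wherever PDR.exe actually lives for your PowerDirector version.

```python
# Sketch: mark PowerDirector's executables as "High performance" by writing
# the same registry values the Settings > Graphics page writes.
# The install path is an assumption -- adjust for your PowerDirector version.
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
PREFERENCE = "GpuPreference=2;"  # 2 = High performance, 1 = Power saving, 0 = Let Windows decide

apps = [
    r"C:\Program Files\CyberLink\PowerDirector21\PDR.exe",
    r"C:\Program Files\CyberLink\PowerDirector21\GPUUtilityEx.exe",
    r"C:\Program Files\CyberLink\PowerDirector21\RafikiAgent.exe",
]

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    for exe in apps:
        # The value name is the full path to the .exe; the data is the preference string
        winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, PREFERENCE)
        print(f"Set High performance for {exe}")
```

Setting the data back to GpuPreference=0; (or deleting the value) returns that app to "Let Windows decide".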
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53
Quote: You have to have PDR.exe listed. Use the Add desktop app button and browse to the folder shown below. You will also want to add GPUUtilityEx.exe and RafikiAgent.exe, which are located in the same folder, and set them all to High Performance.

Thanks. That's great! I have managed to find them all now.

Many thanks for your help once more.

Tim
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53


Hi Optodata,

I have run some tasks in PowerDirector, but the AMD Radeon integrated GPU is still taking all of the load. Is this normal for PowerDirector? The NVIDIA 3050 did not register any load when rendering a preview, but rose to about 12% when producing a video. Is the integrated GPU supposed to take the bulk of the work in PD?

Apologies for my lack of knowledge on this and many other computer-related subjects. I have only been video editing for a short time and have never had to think about any settings previously.

Kind Regards,

Tim
optodata
Senior Contributor Location: California, USA Joined: Sep 16, 2011 16:04 Messages: 8630
It's normal for PD to spread tasks across multiple GPUs if there's more than one present, and the tweak I showed should encourage PD to use the GPU you've selected. Which hardware option did you choose when producing? To use your RTX 3050, you need to select NVIDIA NVENC.
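
If you want to confirm the 3050's encoder is actually engaged while producing, one way is to poll nvidia-smi (it ships with the NVIDIA driver) while the produce runs; an NVENC session count above zero means the encoder is in use. A rough sketch, assuming nvidia-smi is on your PATH:

```python
# Poll the NVIDIA GPU while PowerDirector is producing: overall GPU load,
# number of active NVENC sessions, and the average encode frame rate.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,encoder.stats.sessionCount,encoder.stats.averageFps",
    "--format=csv,noheader",
]

for _ in range(30):  # sample roughly once a second for 30 seconds
    sample = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(sample)    # e.g. "12 %, 1, 29"
    time.sleep(1)
```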
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53
Quote: It's normal for PD to spread tasks across multiple GPUs if there's more than one present, and the tweak I showed should encourage PD to use the GPU you've selected. Which hardware option did you choose when producing? To use your RTX 3050, you need to select NVIDIA NVENC.


Yep, that is selected. When producing, it does show that the NVIDIA GPU is taking some of the load.

I will try out the application over the next week or so to see how it performs. I will update this post accordingly.

Many thanks,

Tim
JL_JL
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091
Tim, there's probably little to no real benefit in trying to force the NVIDIA dGPU to be the primary GPU for PD on a laptop. It could even be a performance detriment depending on the laptop's motherboard architecture. I'm not as familiar, experience-wise, with AMD's embedded graphics processor as I am with Intel's, but I'd think the overall characteristics would be about the same.

Tasks really don't spread among GPUs; rather, each one can handle a specific task depending on settings. One can decode, one can encode, and so on, but both won't perform an encode task simultaneously. When timeline content is of high quality and complexity, say high-bitrate H.265 content that demands decode performance, it can be advantageous at produce time to do decoding on one GPU and encoding on the other. However, with lower-bitrate, lower-fidelity compression like H.264, there are no significant gains from splitting these two tasks across different GPUs.

Your AMD is probably always the primary GPU for PD in that laptop when both GPUs are enabled, regardless of settings, so it will do the timeline decoding. Render-preview encoding is always 100% CPU; decoding could be the CPU or the primary GPU depending on PD settings. You also have the option of using NVIDIA NVENC for encoding during produce if desired. As such, the NVIDIA GPU will never perform timeline decoding, just encoding; decoding would be done by the CPU or the embedded GPU.

Jeff
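
This isn't a view into PowerDirector's internals, but if you want to see the decode-on-one-engine / encode-on-another split Jeff describes in isolation, ffmpeg makes the same idea explicit: hardware decode through D3D11VA on one adapter, encode through NVENC on the NVIDIA card. A minimal sketch, with placeholder file names and assuming ffmpeg is installed:

```python
# Illustration only: decode a clip on one device (D3D11VA, adapter 0) and
# encode it on the NVIDIA encoder (NVENC). File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "d3d11va",      # hardware decode
    "-hwaccel_device", "0",     # adapter index used for decoding
    "-i", "timeline_clip.mp4",  # placeholder input
    "-c:v", "h264_nvenc",       # encode with NVENC
    "output.mp4",
], check=True)
```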
Timrg001
Member Joined: Apr 21, 2023 03:43 Messages: 53
Quote: Tim, there's probably little to no real benefit in trying to force the NVIDIA dGPU to be the primary GPU for PD on a laptop. It could even be a performance detriment depending on the laptop's motherboard architecture. I'm not as familiar, experience-wise, with AMD's embedded graphics processor as I am with Intel's, but I'd think the overall characteristics would be about the same.

Tasks really don't spread among GPUs; rather, each one can handle a specific task depending on settings. One can decode, one can encode, and so on, but both won't perform an encode task simultaneously. When timeline content is of high quality and complexity, say high-bitrate H.265 content that demands decode performance, it can be advantageous at produce time to do decoding on one GPU and encoding on the other. However, with lower-bitrate, lower-fidelity compression like H.264, there are no significant gains from splitting these two tasks across different GPUs.

Your AMD is probably always the primary GPU for PD in that laptop when both GPUs are enabled, regardless of settings, so it will do the timeline decoding. Render-preview encoding is always 100% CPU; decoding could be the CPU or the primary GPU depending on PD settings. You also have the option of using NVIDIA NVENC for encoding during produce if desired. As such, the NVIDIA GPU will never perform timeline decoding, just encoding; decoding would be done by the CPU or the embedded GPU.

Jeff


Thanks for your reply, Jeff. This is very helpful. At present I have set all the apps to High Performance, but I also have the option of letting Windows decide; maybe that would be a better choice? If nothing else, it would take matters out of my hands, which is always a good thing!

Tim