CyberLink Community Forum
where the experts meet
Are there still restrictions in using NVIDIA GPUs?
efroggy [Avatar]
Newbie Joined: Mar 20, 2009 15:22 Messages: 5 Offline
[Post New]
The reason I ask is that an upgrade of my PC from Intel integrated graphics to a dedicated graphics card did not show any improvement. Checking the forums shows some problems with NVIDIA CUDA and drivers later than 301.41. Unfortunately, that driver is too old for my new GPU.

Here are some System Details:
PD11: 11.0.0.3026
NVIDIA driver versions tested: 306.23, 320.18, 320.49
Graphics adapter: Gainward GTX 650, 1 GByte
CPU: Intel i7-2600
Mem: 4 GByte
OS: Windows 7 Ultimate SP1, 32-bit

My test sample:
A 15-minute edited 1080p50 video converted to 720p50 takes about 60 minutes, with only a few percent difference between the two GPUs. With PD9, rendering was about 4 times faster using the Intel GPU.

Regards
efroggy

Microsoft Mediacenter Fan
borgus1 [Avatar]
Senior Contributor Joined: Feb 27, 2013 00:33 Messages: 1318 Offline
[Post New]
Quote: The reason I ask is that an upgrade of my PC from Intel integrated graphics to a dedicated graphics card did not show any improvement.


Latest driver for the 650 is 320.49; I'm not aware of problems with CUDA and PD11.

Is hardware decoding enabled in PD11, under OPTIONS|HARDWARE ACCELERATION?
efroggy [Avatar]
Newbie Joined: Mar 20, 2009 15:22 Messages: 5 Offline
[Post New]
Quote:
Latest driver for the 650 is 320.49; I'm not aware of problems with CUDA and PD11.

Is hardware decoding enabled in PD11, under OPTIONS|HARDWARE ACCELERATION?


Thanks for your reply.
Under HARDWARE ACCELERATION, two boxes are checked:
1.) the box for OpenCL
2.) the box for hardware decoding
Shouldn't there be one for CUDA as well?
In the Produce section I select H.264 AVC 1280 x 720/50p (24 Mbit/s), check "Technologies for fast video rendering",
uncheck SVRT and check hardware video encoding (please excuse any wording differences, as I use the German version).

If I monitor the CPU and GPU loads during rendering, it looks as follows:
CPU: average across all cores is less than 10%
GPU: most of the time 0 to 4%, with short peaks of 20%

Is this normal? I just wonder why the gain in rendering speed from adding a dedicated GPU is so marginal.
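For what it's worth: instead of eyeballing the Windows performance monitor, the GPU load during a render can be logged by polling nvidia-smi, which ships with the NVIDIA driver. A minimal sketch (the helper names are mine, not part of any CyberLink or NVIDIA tooling; assumes nvidia-smi is on the PATH):

```python
import subprocess

def parse_gpu_util(line: str) -> int:
    """Parse one line of nvidia-smi CSV output such as '4 %' into an int."""
    return int(line.strip().rstrip("%").strip())

def sample_gpu_util() -> int:
    """Query the current GPU utilization once via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        text=True,
    )
    return parse_gpu_util(out.splitlines()[0])
```

Sampling this once a second while PD11 renders would confirm whether the 0 to 4% readings hold up over the whole job.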
Microsoft Mediacenter Fan
borgus1 [Avatar]
Senior Contributor Joined: Feb 27, 2013 00:33 Messages: 1318 Offline
[Post New]
Quote:
Under HARDWARE ACCELERATION, two boxes are checked:
1.) the box for OpenCL
2.) the box for hardware decoding
Shouldn't there be one for CUDA as well?


Search the HELP file for hardware decoding for an overview.

Quote: In the Produce section I select H.264 AVC 1280 x 720/50p (24 Mbit/s), check "Technologies for fast video rendering",
uncheck SVRT and check hardware video encoding (please excuse any wording differences, as I use the German version).

If I monitor the CPU and GPU loads during rendering, it looks as follows:
CPU: average across all cores is less than 10%
GPU: most of the time 0 to 4%, with short peaks of 20%

Is this normal? I just wonder why the gain in rendering speed from adding a dedicated GPU is so marginal.


Don't have that answer. Perhaps someone else does.
Eugen157
Senior Contributor Location: Palm Springs area, So.CA Joined: Dec 10, 2012 13:57 Messages: 662 Offline
[Post New]
There was a post a few weeks ago with a similar topic.

Apparently the Intel internal GPU is very good; a quote I read on the internet was "as good as or better than a $500.00 GPU card".

That may be why you are not seeing an improvement.

However, 60 minutes seems long. What are the bitrates?

Eugene

This message was edited 2 times. Last update was at Jul 26. 2013 16:41

73s, WA6JZN ex DL9GC
CYBERLINK PLEASE ADD UHD BLU RAY BURNING SOFTWARE
PD14,
Win10,64bit.CPU i7 6700,16GB ,C= 480 GB SSD ,GPU GTX1060 6GB 1 fan. Plus 3 int, 4 ext HDD's for video etc.LG WH16NS40 reads UHD.
4K 24" ViewSonic monitor.Camera Sony FDR-A
efroggy [Avatar]
Newbie Joined: Mar 20, 2009 15:22 Messages: 5 Offline
[Post New]
Thanks for the answer; this could explain my situation.
The bitrates of the finished video according to MediaInfo are:
Bitrate mode: variable, average: 20.1 Mbps, max: 23.3 Mbps
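Those numbers are easy to sanity-check against the output file size: average bitrate times duration, divided by 8, gives the expected size in bytes. A quick sketch using the figures from this thread (the function name is just for illustration):

```python
def expected_size_bytes(avg_mbps: float, duration_s: float) -> int:
    """Rough output file size: (bits per second) * seconds / 8 bytes."""
    return int(avg_mbps * 1_000_000 * duration_s / 8)

# 15 minutes at an average of 20.1 Mbps:
size = expected_size_bytes(20.1, 15 * 60)  # roughly 2.26 GB
```

If the file on disk is far from that estimate, MediaInfo's reported average and the actual encode settings don't match.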

As far as I remember, PD9 on the same machine with the same input material was much faster. My only comparison is against my old Intel dual-core with an ATI 3550 GPU, where there was a 10-fold speed increase. With PD11 it is about 4 times slower.
Unfortunately I only have 32-bit Windows 7.

Somewhere in this forum someone managed to run both graphics adapters at the same time with a very good result. He added a second monitor to the Intel GPU in order to invoke it. What do you think?

Siggy Microsoft Mediacenter Fan
RobAC [Avatar]
Contributor Joined: Mar 09, 2013 18:20 Messages: 406 Offline
[Post New]
efroggy

I am of the opinion that the Hardware Decoding check box covers NVIDIA CUDA as well as AMD APU and so on; there are no separate boxes for each. See here: http://www.cyberlink.com/support/product-faq-content.do?id=12777&prodId=4

If you haven't done so already, update your graphics driver to the latest version, currently 320.49 as indicated above. You are not getting the full performance benefit while using an outdated driver. I have zero issues using this new driver.

The i7-2600 CPU you have includes the older Intel HD 2000 GPU, which is not the greatest for heavy work like video editing.

To use just the Intel HD 2000 GPU, have you tried right-clicking on your desktop and choosing Screen Resolution? See if you can select the Intel adapter there. It might not give you the option, but it's worth a try.

You could also try unplugging the monitor cable from the Nvidia card and plugging it directly into the motherboard's video output, if it accepts the connection: DisplayPort, DVI, HDMI, or VGA.

You can also try disabling the Nvidia card under Control Panel > Device Manager, but I would not advise this unless you are comfortable doing such things; it might cause other issues.

R

This message was edited 1 time. Last update was at Jul 27. 2013 08:26

PD 14 Ultimate Suite / Win10 Pro x64
1. Gigabyte Brix PRO / i7-4770R Intel Iris Pro 5200 / 16 GB / 1 TB SSD
2. Lenovo X230T / 8GB / Intel HD4000 + ViDock 4 Plus & ASUS Nvidia 660 Ti / Link: https://www.youtube.com/watch?v=ZIZw3GPwKMo&feature=youtu.be
Sabineko [Avatar]
Newbie Joined: Aug 12, 2013 15:17 Messages: 4 Offline
[Post New]
Dunno about restrictions, but the performance wasn't good on my machine.

I have a Win7 Core i7 laptop (8 hyperthreads at 1.7 GHz) with an NVIDIA M310 GPU with 16 CUDA cores.

I enabled the hardware encoder in MediaEspresso to convert files for a phone at a lower resolution, and the NVIDIA-encoded files were smaller in size (MBytes) but suffered from horrendous compression blocking in fast-motion areas of the picture. If I set the program to use hardware decode only and software encode, it produced flawless results (but about 30% slower encode speed). The CUDA rendering was faster but lower quality.

For making Blu-rays I haven't bothered enabling the hardware encoder feature in Director 11, as it doesn't go that much faster on my machine, and presumably the software encoder in Director 11 is the same as in MediaEspresso?

Maybe NVIDIA hardware later than the 300 series does a better job. It wasn't the drivers, as I tried upgrading to the latest ones (v320.49) from the NVIDIA web site.

I haven't noticed any rendering problems in Adobe Premiere, which also uses CUDA, so maybe it's just a problem with the hardware encoder profile in the CyberLink software.