CyberLink Community Forum
where the experts meet
GPU Acceleration (current gen cards not supported?)
doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
I was considering buying this program, but the knowledgebase for version 16, where it lists compatible GPUs, doesn't list any recent ones! Is it seriously not compatible with Nvidia 10xx cards or AMD RX cards? I bought an RX 580 almost solely for when I'd start using an editor, so I wouldn't want to use this software only to find it isn't even using the GPU; the money would feel wasted.

I also read a post on AMD's site from someone complaining that PowerDirector 15 wouldn't let him use a good AMD card either. It greyed the card out and forced him to choose Intel's iGPU, supposedly because CyberLink had AMD lock it out?

And one more thing... is this program optimized well enough for extra cores and threads that an AMD 2700X would work better with it than an Intel 8700, or do the extra threads not help much for this particular program?

Thanks!

This message was edited 1 time. Last update was at Jun 24. 2018 21:15

doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
How could I be the only one who seriously cares about it not supporting current-generation graphics cards? And by current generation... I mean it doesn't even list support for cards that came out years ago. Why would it make sense not to support the better graphics cards? (Again, I read that someone claimed CyberLink even told AMD to lock its cards out.)
[Post New]
An RX 570 on one of my PCs works fine with PowerDirector 16, and the CPU is an Intel Core i5 8400.

Actually, you may download the free trial to check whether hardware acceleration works with your GPU.
[Attachment: RX570.jpg, 175 KB, downloaded 69 times]

This message was edited 1 time. Last update was at Jun 26. 2018 04:48

doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
Quote An RX 570 on one of my PCs works fine with PowerDirector 16, and the CPU is an Intel Core i5 8400.

Actually, you may download the free trial to check whether hardware acceleration works with your GPU.

Thanks. I can't test it because this is for a build I haven't made yet. My current PC has a hilariously old GPU.

So I guess they just didn't update the list in their knowledgebase/FAQ?

Do you know the general consensus on whether their Nvidia implementation (i.e., using CUDA cores) is better or worse than the AMD implementation?

I am leaning towards purchasing this software. The only things left are figuring out whether the extra effects in Ultimate are worth it, and whether to wait until 17 comes out.

This message was edited 1 time. Last update was at Jun 26. 2018 05:26

[Post New]
Unfortunately I don't see any multimedia benchmarks for modern GPUs; all the press focuses on 3D gaming benchmarks.

Anyway, what I can share is that a buddy of mine has a GTX 1060, and hardware acceleration works fine with PowerDirector 16 there too.
Carl312
Senior Contributor Location: Texas, USA Joined: Mar 16, 2010 20:11 Messages: 9090 Offline
[Post New]
Passmark has benchmarks for most, if not all, of the latest GPUs, as well as CPU benchmarks.

https://www.videocardbenchmark.net/

Click on CPU for CPU benchmarks; the High End chart covers the latest.

Same for GPUs: use the High End chart. Carl312: Windows 10 64-bit, 8 GB RAM, AMD Phenom II X4 965 3.4 GHz, ATI Radeon HD 5770 1 GB, 240 GB SSD, two 1 TB HDs.

[Post New]
Quote

...


I have an AMD RX 580 and I can confirm that it is supported.

Now you're confusing what those "cores" do. They are used only for a few effects (less than 1% of the time budget).
The rest of the decoding and encoding in all modern GPUs (Nvidia, AMD, or even Intel) is done by a dedicated hardware (ASIC) block, separate from the general cores.
So as long as the GPU generation is the same, it doesn't matter how many actual cores the GPU has; the core count does not influence decoding or encoding speed, because they all have the same ASIC block.
Cores are important for games, but not used a lot for actual video editing.
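To make the point concrete, here is a small illustrative sketch: within one GPU generation, the dedicated encode block is shared across the lineup, so encode speed tracks the encoder generation rather than the core count. The card data below is approximate and for illustration only, not a benchmark.

```python
# Illustrative sketch: within one GPU generation, the dedicated encode
# block (NVENC on Nvidia) is the same regardless of how many
# general-purpose cores the card has. Core counts are approximate.

CARDS = {
    # model: (shader_cores, encoder_block)
    "GTX 1050":    (640,  "Pascal NVENC"),
    "GTX 1050 Ti": (768,  "Pascal NVENC"),
    "GTX 1060":    (1280, "Pascal NVENC"),
    "GTX 1080":    (2560, "Pascal NVENC"),
}

def same_encoder(a: str, b: str) -> bool:
    """True when two cards share the same dedicated encode block."""
    return CARDS[a][1] == CARDS[b][1]

# Four times the cores, but the same encoder block, so similar encode speed:
print(same_encoder("GTX 1050", "GTX 1080"))  # True
```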

This message was edited 1 time. Last update was at Jun 27. 2018 16:48

doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
Quote


I have an AMD RX 580 and I can confirm that it is supported.

Now you're confusing what those "cores" do. They are used only for a few effects (less than 1% of the time budget).
The rest of the decoding and encoding in all modern GPUs (Nvidia, AMD, or even Intel) is done by a dedicated hardware (ASIC) block, separate from the general cores.
So as long as the GPU generation is the same, it doesn't matter how many actual cores the GPU has; the core count does not influence decoding or encoding speed, because they all have the same ASIC block.
Cores are important for games, but not used a lot for actual video editing.


What I meant is that hardware optimization when using Nvidia cards depends on CUDA cores. That's what everything I have read has said, anyway.

Either way, it's irrelevant to me, I guess, because I have an RX 580, not an Nvidia card. So anyway, at this point my decisions are just Ultra vs. Ultimate and 16 vs. waiting on 17.

Anyone have any guesses as to what "may" have room for improvement in version 17? (I guess I am going way off topic here.)
[Post New]
Quote


What I meant is that hardware optimization when using Nvidia cards depends on CUDA cores. That's what everything I have read has said, anyway.




  1. The brand is called Nvidia, not "nvidea".

  2. As I already said above, no CUDA cores are used in decoding/encoding. It's something else; re-read my post.

This message was edited 1 time. Last update was at Jun 27. 2018 21:29

JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote Anyone have any guesses as to what "may" have room for improvement in version 17? (I guess I am going way off topic here.)

This post may be of interest: https://forum.cyberlink.com/forum/posts/list/65974.page#post_box_300990

PD17 will hopefully address many of the hardware encode, hardware decode, BD hardware encode, and timeline hardware decode anomalies that have been present for way too long, as well as add support for formats GPUs have supported for years but PD still doesn't, e.g. higher bit depths... on and on.

Jeff
doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
Quote

This post may be of interest: https://forum.cyberlink.com/forum/posts/list/65974.page#post_box_300990

PD17 will hopefully address many of the hardware encode, hardware decode, BD hardware encode, and timeline hardware decode anomalies that have been present for way too long, as well as add support for formats GPUs have supported for years but PD still doesn't, e.g. higher bit depths... on and on.

Jeff


It's hard to know what to do, because that post seems to say there's not even much impact from using a good GPU. And as for the issues you mentioned, I sure don't want to buy software of this type only to have it end up buggy and BD discs not burning properly. I could use free or cheaper programs if that's going to happen.

I'm really feeling like I shouldn't have gotten much of a GPU. Although I only spent about $50 more than a 1050 Ti would have cost, so I guess I couldn't really have saved much without going "too" far down in GPU quality...
JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote It's hard to know what to do, because that post seems to say there's not even much impact from using a good GPU.

I don't think it said that at all; the benefit really depends on what your workflow is and what you perceive as bottlenecks in that approach. If you think the GPU purchase makes everything significantly better: no, definitely not the case. The post highlighted several unique complexities of GPUs and PD16 to help guide what's right for your situation.

For instance, if you are planning on creating H.265 productions for playback and your pinch point is the long H.265 encode time, an Nvidia GPU is significantly faster at encoding than any equivalent-cost consumer CPU on the market, while in this case AMD GPU H.265 encoding with PD is broken. Some users on this forum use Nvidia GPUs extensively for this and it works extremely well for their situation: from days to encode with their current CPU down to hours with an Nvidia GPU. If your projects are only 30-second clips, there's not much pain; someone creating several-hour projects and waiting overnight to a day to encode may find that's their pain point.

Similarly, if your intent is to produce GPU-encoded BD discs because that's the workflow you want, there's no need to get an Nvidia GPU, as that feature is no longer supported in PD16.

In any case, I'd never pay the upcharge for a 1050 Ti for video editing within PD16. As SoNic67 pointed out, the Nvidia NVENC SIP core is nearly identical (minor clock-speed changes) to other chips of a similar generation, and its encode performance is not really a function of CUDA capability. So, for example, a GTX 1050 will encode the same as a GTX 1050 Ti; take the extra $70 or so and put it somewhere in your system that would add value.
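The days-versus-hours claim above is simple arithmetic; here is a quick sketch. The speed factors are assumptions for illustration (roughly 0.1x realtime for software H.265 on an older CPU, 2.5x realtime for NVENC, matching the 2-3x figure cited in this thread), not measurements.

```python
# Back-of-the-envelope for the "days vs. hours" point. Speed factors are
# assumed for illustration: ~0.1x realtime for software H.265 on an older
# CPU, ~2.5x realtime for NVENC hardware encoding.

def encode_hours(footage_hours: float, speed_vs_realtime: float) -> float:
    """Hours needed to encode a timeline at a given multiple of realtime."""
    return footage_hours / speed_vs_realtime

cpu_h = encode_hours(3.0, 0.1)   # a 3-hour project at 0.1x -> 30 hours
gpu_h = encode_hours(3.0, 2.5)   # the same project at 2.5x -> 1.2 hours
print(f"CPU: {cpu_h:.1f} h, NVENC: {gpu_h:.1f} h")
```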

Jeff
doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
Quote
Quote It's hard to know what to do, because that post seems to say there's not even much impact from using a good GPU.

I don't think it said that at all; the benefit really depends on what your workflow is and what you perceive as bottlenecks in that approach. If you think the GPU purchase makes everything significantly better: no, definitely not the case. The post highlighted several unique complexities of GPUs and PD16 to help guide what's right for your situation.

For instance, if you are planning on creating H.265 productions for playback and your pinch point is the long H.265 encode time, an Nvidia GPU is significantly faster at encoding than any equivalent-cost consumer CPU on the market, while in this case AMD GPU H.265 encoding with PD is broken. Some users on this forum use Nvidia GPUs extensively for this and it works extremely well for their situation: from days to encode with their current CPU down to hours with an Nvidia GPU. If your projects are only 30-second clips, there's not much pain; someone creating several-hour projects and waiting overnight to a day to encode may find that's their pain point.

Similarly, if your intent is to produce GPU-encoded BD discs because that's the workflow you want, there's no need to get an Nvidia GPU, as that feature is no longer supported in PD16.

In any case, I'd never pay the upcharge for a 1050 Ti for video editing within PD16. As SoNic67 pointed out, the Nvidia NVENC SIP core is nearly identical (minor clock-speed changes) to other chips of a similar generation, and its encode performance is not really a function of CUDA capability. So, for example, a GTX 1050 will encode the same as a GTX 1050 Ti; take the extra $70 or so and put it somewhere in your system that would add value.

Jeff


The comparison charts for PD16 say it does allow burning Blu-ray discs. If that is not true, I'm not getting the product. Even if you're just saying GPU acceleration for it was stripped, I may not get the product. All the articles about video editing talk about how important a good GPU is. If this program isn't set up in a way that makes the GPU a worthwhile purchase, I'll find different software.

And it's very odd news that GPU acceleration with an AMD card would be broken, when PD had acceleration from OpenCL far before CUDA acceleration (notice even the name of the acceleration has CUDA in it).

Very few products are optimized for Nvidia to begin with. So if this one is "more" optimized for Nvidia cards, that's disappointing, as they must simply not have cared about it, in order to make people with Nvidia cards happy. No matter how good this program may be, I don't want my GPU purchase to be a waste when there are other programs that take advantage of AMD cards. And due to all the fees on eBay and Amazon, I probably couldn't sell the 580 and get an Nvidia 1060 without being out extra money.

Also, I am building a PC, i.e., I haven't even opened the 580. My current PC is very ancient, and even on it encoding wouldn't take days, so the fact that any configuration could cause a product to take days to encode is interesting in itself.

This message was edited 1 time. Last update was at Jun 28. 2018 19:33

JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote The comparison charts for PD16 say it does allow burning Blu-ray discs. If that is not true, I'm not getting the product. Even if you're just saying GPU acceleration for it was stripped, I may not get the product. All the articles about video editing talk about how important a good GPU is. If this program isn't set up in a way that makes the GPU a worthwhile purchase, I'll find different software.

And it's very odd news that GPU acceleration with an AMD card would be broken, when PD had acceleration from OpenCL far before CUDA acceleration (notice even the name of the acceleration has CUDA in it).

Very few products are optimized for Nvidia to begin with. So if this one is "more" optimized for Nvidia cards, that's disappointing, as they must simply not have cared about it, in order to make people with Nvidia cards happy. No matter how good this program may be, I don't want my GPU purchase to be a waste when there are other programs that take advantage of AMD cards. And due to all the fees on eBay and Amazon, I probably couldn't sell the 580 and get an Nvidia 1060 without being out extra money.

Also, I am building a PC, i.e., I haven't even opened the 580. My current PC is very ancient, and even on it encoding wouldn't take days, so the fact that any configuration could cause a product to take days to encode is interesting in itself.


You can always buy what you wish; I'm no salesman. But yes, Nvidia GPU hardware encoding for creation of H.264 BDs was turned off in a PD15 patch and is still off in PD16 (attached is a PD16 pic for proof). With an Nvidia GPU, PD16 will use CPU encoding of the timeline during BD creation whenever encoding needs to be done.

I think you continue to confuse OpenCL, CUDA, and NVENC (Nvidia hardware encoding) or VCE (AMD hardware encoding) functionality. Nvidia dropped the CUDA-based encoder at driver release 340.XX, yes, nearly 4 years ago! I don't recall the exact timeframe for AMD adopting VCE-only encoding via drivers, but it was about 3 years ago too. PD does use OpenCL (Nvidia or AMD) to accelerate FX playback and encoding, but only the FX. Not much of a typical timeline is FX, so it's of little real value for most. Acceleration via OpenCL is not GPU hardware encoding.

Maybe this somewhat old but still rather relevant, long-winded response about PD options will provide some clarity for you: https://forum.cyberlink.com/forum/posts/list/51023.page#post_box_267891

EDIT: Oh, for a days comparison, have a read here, particularly the last sentence: https://forum.cyberlink.com/forum/posts/list/20/49202.page#post_box_258666 Have you done any H.265 encoding? I think the OP of that thread gets about 2-3x realtime encoding with his then-current Nvidia GTX 960 for his H.265 needs, so yes, it can still take a few hours. At the time of his comment I believe he also had an i7-920 CPU.

Jeff
[Attachment: PD16_Nvidia_HE.png, 343 KB, downloaded 19 times]

This message was edited 2 times. Last update was at Jun 28. 2018 23:21

doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
Quote


You can always buy what you wish; I'm no salesman. But yes, Nvidia GPU hardware encoding for creation of H.264 BDs was turned off in a PD15 patch and is still off in PD16 (attached is a PD16 pic for proof). With an Nvidia GPU, PD16 will use CPU encoding of the timeline during BD creation whenever encoding needs to be done.

I think you continue to confuse OpenCL, CUDA, and NVENC (Nvidia hardware encoding) or VCE (AMD hardware encoding) functionality. Nvidia dropped the CUDA-based encoder at driver release 340.XX, yes, nearly 4 years ago! I don't recall the exact timeframe for AMD adopting VCE-only encoding via drivers, but it was about 3 years ago too. PD does use OpenCL (Nvidia or AMD) to accelerate FX playback and encoding, but only the FX. Not much of a typical timeline is FX, so it's of little real value for most. Acceleration via OpenCL is not GPU hardware encoding.

Maybe this somewhat old but still rather relevant, long-winded response about PD options will provide some clarity for you: https://forum.cyberlink.com/forum/posts/list/51023.page#post_box_267891

EDIT: Oh, for a days comparison, have a read here, particularly the last sentence: https://forum.cyberlink.com/forum/posts/list/20/49202.page#post_box_258666 Have you done any H.265 encoding? I think the OP of that thread gets about 2-3x realtime encoding with his then-current Nvidia GTX 960 for his H.265 needs, so yes, it can still take a few hours. At the time of his comment I believe he also had an i7-920 CPU.

Jeff


I by no means know much about the technical side of GPUs, but everything I've come across that makes recommendations for video editing mentions that your GPU choice can speed up encoding due to OpenCL support (which comes from the GPU). And if CUDA was done away with, why do all the modern Nvidia cards list a CUDA core count, and why do a lot of programs with GPU acceleration list it as "CUDA acceleration"?

I've barely had experience, but I know of a program that takes only a few hours even with very outdated PC specs. All I know is that reviews have shown comparisons where PD is supposedly one of the fastest-encoding editing programs out there. But just the fact that it's said here that AMD acceleration is broken makes me almost rather get something that will take longer to encode, just to know the GPU is actually being used...

A lot of the articles are really misleading if the GPU doesn't affect much other than effects. They go on and on about how you should get at least a 1050 Ti minimum... which makes no sense if the GPU isn't even needed when you don't use effects. Articles also suggest 32 GB of RAM, which I got... and I read some firsthand experience where someone used a high-end video editor and never went above 16 GB of his 32 GB of RAM... so I feel like I am just throwing a lot of money away by believing articles' minimum-requirement listings and buying based on them.

I can only think of one other well-reviewed program that even allows BD burning, so I don't have much choice (unless I spent big bucks, which I won't... this is only for rare hobby use by me, not constant use).
doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
The bottom line here is that almost any article you read about video editing says you should have a good GPU. Then they make a big deal about VRAM being the most important aspect of the card, which is why I got an 8 GB RX 580. But if this program's encoding with an AMD card is broken (I don't think anyone has said how, but anyway...), it defeats the whole purpose. I see a 6 GB GTX 1060 on sale today for a pretty good price and I could sell the AMD card, but that feels like a lot of trouble when the AMD would apparently be better with most programs.

Even CPU choice is a big headache, because some programs work better with Intel and some work better with AMD Ryzen.
[Post New]
How did you arrive at the conclusion that the AMD card is "broken"? I specifically said that the AMD card works in PowerDirector.

However, Blu-ray burning does not use the GPU. Any GPU.

As for the CPU: for video editing there is no debate, Intel works best. Period.

Go get other software, then. They are all the same: they all claim they use the GPU (and you assume for everything), but the reality is always different. I have tried 5-6 different video editors, and for now, good or bad, CyberLink's PowerDirector makes the most use of the GPU.

This message was edited 2 times. Last update was at Jun 29. 2018 11:00

JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote How did you arrive at the conclusion that the AMD card is "broken"? I specifically said that the AMD card works in PowerDirector.

I stated that it's broken here https://forum.cyberlink.com/forum/posts/list/65974.page#post_box_300990 for H.265 encoding on an RX 560, and in the linked thread for the R9 Fury and R9 380.
b) H.265 hardware encoding is broken: you don't get the bitrates you expect, plus quality issues, decode issues... Initially exposed here over a year ago and still not fully functional: https://forum.cyberlink.com/forum/posts/list/15/50731.page#post_box_267005

You can see even CL acknowledged the deficiency at the time with:
"After your feedback and some research, the product team realizes that more work is required to support this for some AMD GPUs, such as yours. Therefore, they asked us to remove the statement that this beta patch adds support for all of the AMD GPUs."

Maybe your GPU is the exception. Simply produce to H.265 at a higher bitrate and validate that the file size and/or MediaInfo (or similar) agrees with the rate you intended to produce. The above three AMD cards are nowhere near expectation, a mere fraction of it. H.265 CPU or Nvidia encoding with PD will match the intended produce bitrate.
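The sanity check described above can be sketched as follows. This is a hypothetical helper, not anything from PD or MediaInfo; the bitrates and sizes are illustrative numbers, not measurements.

```python
# Sketch of the bitrate sanity check: compare the produced file size
# against what the target bitrate implies. All numbers are illustrative.

def expected_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Approximate file size in megabytes for a video-only stream."""
    return bitrate_mbps * duration_s / 8  # Mbit/s * s -> Mbit, /8 -> MB

def bitrate_ok(actual_mb: float, bitrate_mbps: float, duration_s: float,
               tolerance: float = 0.25) -> bool:
    """True when the file is within 25% of the size the bitrate implies."""
    expected = expected_size_mb(bitrate_mbps, duration_s)
    return abs(actual_mb - expected) / expected <= tolerance

# 60 s produced at a 30 Mbit/s H.265 target should be roughly 225 MB.
print(expected_size_mb(30, 60))   # 225.0
# A file that comes out at a mere fraction of that fails the check:
print(bitrate_ok(40, 30, 60))     # False
```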

Jeff
[Post New]
Quote

I stated that it's broken here https://forum.cyberlink.com/forum/posts/list/65974.page#post_box_300990 for H.265 encoding on an RX 560, and in the linked thread for the R9 Fury and R9 380.
b) H.265 hardware encoding is broken: you don't get the bitrates you expect, plus quality issues, decode issues... Initially exposed here over a year ago and still not fully functional: https://forum.cyberlink.com/forum/posts/list/15/50731.page#post_box_267005

You can see even CL acknowledged the deficiency at the time with:
"After your feedback and some research, the product team realizes that more work is required to support this for some AMD GPUs, such as yours. Therefore, they asked us to remove the statement that this beta patch adds support for all of the AMD GPUs."

Maybe your GPU is the exception. Simply produce to H.265 at a higher bitrate and validate that the file size and/or MediaInfo (or similar) agrees with the rate you intended to produce. The above three AMD cards are nowhere near expectation, a mere fraction of it. H.265 CPU or Nvidia encoding with PD will match the intended produce bitrate.

Jeff


Define "higher bitrate". I usually rely on H.265 because I want lower bitrates than H.264; maybe that's why I missed it.
As a note, I am on the 18.6.1 driver now.

PS: I agree that the Nvidia platform was historically more stable and bug-free. But when I decided to upgrade my GTX 960, I didn't think the GTX 1060 would be a good investment; it was too expensive. But that's just me.

This message was edited 2 times. Last update was at Jun 29. 2018 14:53

doublethr33 [Avatar]
Newbie Joined: Jun 23, 2018 05:57 Messages: 40 Offline
[Post New]
As far as Intel being better than AMD for video editing, that's just not the case, unless you mean for PD specifically. In most situations, the CPU with the most cores and threads is the fastest for encoding. I am talking about mainstream CPUs, the 8700/8700K and the 2700/2700X, so yes, an Intel with equal cores and threads would be better than an AMD, but an AMD with more cores and threads will be far superior in any program properly optimized for high core and thread counts. Of course Intels with many threads would be best, but I'm talking $300-$400 CPUs, with the AMD in the comparison having more threads. And even if you go with a higher-core/thread Intel, AMD's Threadripper would outdo it.

My memory, however, is optimized for Intel (as 99% of it is). There was a great sale on eBay today where people got the 2700X for under $250, but I was too scared to risk it, since the seller wouldn't be considered an authorized dealer.

As for GPU choice, I originally was going to get a 6 GB 1060. But every single place I read about video editing said the more VRAM your card has, the better it will be for video editing, and the RX 480 was constantly recommended. So when I saw the 580 on sale for $250, minus a $20 rebate, minus $30 I saved with gift cards, I thought it was a no-brainer to spend that $200 plus tax instead of $300 for a 1060.

But now I am wondering if I should have listed my 580 today while everyone went crazy on eBay; after fees, maybe I could still have gotten a 1060 without being out more... or spent more and gotten a 1070 or 1070 Ti.

As for the broken AMD acceleration... I sure hope they address that in PD17, as I was leaning towards waiting for it anyway; my assumption is it will be released in approximately two months. Obviously the most important part of my purchase is wanting it to perform well, with great quality. PD is known for being one of the fastest at encoding. I'd much rather it slow down if that's what it takes for accurate bitrates. But I'd rather not slow it down to the point of using no GPU acceleration, just on principle, since I bought a good GPU.

This message was edited 2 times. Last update was at Jul 01. 2018 02:51
