CyberLink Community Forum
where the experts meet
AMD GPU decreases bitrate [survey]
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Hello.

I have been in a dialog with ticket support for about a month; unfortunately it depends on the time zone, and I get 1-2 messages per week ))). It is too slow, but we are investigating a problem with decreased video bitrate when using an AMD GPU. I have an RX 570 8 GB card (XFX Radeon RX 570 RS XXX Edition). I provided the video and project data to the support team, and I want to share this material with you.

I ask you to help me investigate this issue. If you have the time and desire to help, please download my project and video from Google Drive and render it using the automatic "Profile Analyzer", applying the corresponding settings. It should be XAVC-S 4K video with a 100 Mbit bitrate, captured on my Sony a6300 camera. In the render settings, check "hardware video encoder" so that only the GPU is used.

Please share your render results here: write your GPU card, driver version, and the bitrate of the final video. You can check it using the free software "MediaInfo".
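As a cross-check of what MediaInfo reports, the overall average bitrate can be computed directly from the file size and clip duration with nothing but the Python standard library. This is a minimal sketch; the filename and duration in the usage comment are placeholders for your own render:

```python
import os

def overall_bitrate_mbps(path: str, duration_s: float) -> float:
    """Overall average bitrate in Mb/s (decimal megabits), the same
    quantity MediaInfo shows as 'Overall bit rate': total file size
    in bits divided by the clip duration in seconds."""
    return os.path.getsize(path) * 8 / duration_s / 1_000_000

# Placeholders: substitute your produced file and its duration, e.g.
# print(f"{overall_bitrate_mbps('render.mp4', 37.16):.1f} Mb/s")
```

Note this is the container-level average (video plus audio plus overhead), so it will sit slightly above the video stream's own bitrate that MediaInfo lists separately.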

I've got information that this isn't fixed in PD18 either, but it works well with an Nvidia GPU...

Link to my Google Drive:
https://drive.google.com/drive/u/0/folders/1BM7jWMl-YzwHO4PbPnz6nRD7Oc2JvmZY

This message was edited 1 time. Last update was at Oct 01. 2019 14:28

[Post New]
That was one of the reasons I "donated" my RX 580 to my daughter's PC and switched back to Nvidia in mine (GTX 1080).

PS: I still have an older Quadro K2000 and a GTX 960 that I used previously; I don't know why I got enthusiastic about AMD and bought the RX 580.
JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
follet, this appears to be a mix of information: your RESULT_GPU.mp4 is not actually XAVC-S as indicated, just basic H.264. For me, the "Profile Analyzer" also suggests an H.264 AVC profile rather than the XAVC-S you describe.

If you search the forums, you will find my numerous posts over the years about the unusually low bitrates of several AMD GPU series with PD hardware encoding.

For H.264, 3840x2160, 25 fps progressive, 101422 Kbps average video bitrate, from the PD-generated profile:
PD17 v3005, 381MB, 37.160s duration, overall avg bitrate 86.0 Mb/s
PD18 v2028, 381MB, 37.160s duration, overall avg bitrate 86.0 Mb/s

Both results above are for an RTX 2070 used for decode and encode, with 436.30 drivers; stats from MediaInfo 18.12.

For CPU encoding and decoding:
PD17 v3005, 441MB, 37.160s duration, overall avg bitrate 99.5 Mb/s
PD18 v2028, 444MB, 37.160s duration, overall avg bitrate 100.0 Mb/s
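As a sanity check on the averages above (a sketch, assuming MediaInfo reports file sizes in binary MiB but bitrates in decimal Mb/s), the overall bitrate follows directly from size and duration:

```python
def avg_bitrate_mbps(size_mib: float, duration_s: float) -> float:
    """Overall average bitrate in decimal Mb/s from a file size
    given in binary MiB and a duration in seconds."""
    # MiB -> bytes -> bits, then divide by seconds and 10^6
    return size_mib * 1024 * 1024 * 8 / duration_s / 1_000_000

# 381 MiB over 37.160 s comes out at about 86 Mb/s (the GPU runs above);
# 441 MiB over the same duration is roughly 99.5 Mb/s (the PD17 CPU run).
```

The rounded MB figures mean the results match the reported averages only to within about 0.1 Mb/s, which is close enough to confirm the unit conventions.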

Jeff
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Quote
For H.264, 3840x2160, 25 fps progressive, 101422 Kbps average video bitrate, from the PD-generated profile:
PD17 v3005, 381MB, 37.160s duration, overall avg bitrate 86.0 Mb/s
PD18 v2028, 381MB, 37.160s duration, overall avg bitrate 86.0 Mb/s

But in my case the bitrate decreases from 100 to 20 Mbit; that is unbearable...
JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote But in my case the bitrate decreases from 100 to 20 Mbit; that is unbearable...


Yes, those were some of the same problems, of the same magnitude, reported previously with AMD cards and PD hardware encoding.

Jeff
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Quote
Yes, those were some of the same problems, of the same magnitude, reported previously with AMD cards and PD hardware encoding.
Jeff


I wonder why CyberLink knows about this problem and won't fix it. Maybe it is marketing? The fastest video editor in the world, but only because it drops bitrate information.

I have no complaints; I asked users about other generations of AMD GPUs because I thought it affected only the newest generation. I used trial versions of other video editors, and all videos rendered well, using 100% of the GPU power. I concluded that the problem is in the core of PD, not in the driver implementation. I was surprised that PD18 has the same core, as I understand it.

What are the suggestions? Change GPU from AMD to Nvidia?
I expected CyberLink to investigate this problem and admit the issue, return money to AMD users as compensation ))), and strongly recommend migrating to Nvidia ))). But unfortunately they are in no hurry to fix it, maybe because they would lose first place in rendering speed )))

So... I am upset...
JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
I had seen the issue with the R9 380, R9 Fury, and RX 560 across several driver releases, so I think it is pretty universal. If I remember right, you can get about 40 Mbps in H.264 but not much higher. You might keep your same profile and just back down the PD profile video bitrate from the 101422 Kbps mine suggested to 80000, 60000, 50000, or 40000 Kbps and see if you get something a little better than your 20000 Kbps.

I know it's not ideal, but it may be better.

Nvidia hardware encoding is not 100% problem-free in PD either, but in my experience over many years it has fared better.

I provided a high-level review of a few trade-offs here: https://forum.cyberlink.com/forum/posts/list/65974.page#post_box_300990 that might be of some value.

Jeff
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Quote If I remember right, you can get about 40 Mbps in H.264 but not much higher.
Jeff


Yes, I tried to edit the profile manually. I changed the bitrate from 100 to 99 Mbit and got 39.9 Mbit in the final video file. But that applies to the H.264 (AVC) codec. I have XAVC-S, and I tried to render the project using one preset from the list (see attachment). It sounds good: a 60 Mbit bitrate and a high rendering speed (100% GPU load); this is my chance to continue working with PD. As I understand it, PD's automatic profile analyzer does not correctly recognize the source and maybe can't properly process the video stream according to the standards...
[Attachment: preset.jpg, 176 Kbytes, downloaded 2 time(s)]

This message was edited 1 time. Last update was at Oct 02. 2019 14:53

JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote Yes, I tried to edit the profile manually. I changed the bitrate from 100 to 99 Mbit and got 39.9 Mbit in the final video file. But that applies to the H.264 (AVC) codec. I have XAVC-S, and I tried to render the project using one preset from the list (see attachment). It sounds good: a 60 Mbit bitrate and a high rendering speed (100% GPU load); this is my chance to continue working with PD. As I understand it, PD's automatic profile analyzer does not correctly recognize the source and maybe can't properly process the video stream according to the standards...

The "Profile Analyzer" simply looks at the timeline content and suggests the most appropriate formats to match its video specs. If you had mixed timeline content, the analyzer would actually recommend two or more profiles.

You can always change it to reflect what you want. In your picture, just select the "+" next to the profile and change the video bitrate in increments to see what you get in the produced output. Again, maybe when it's set to 50000 Kbps you actually get close to that bitrate with your AMD GPU, versus the 20 Mbps you said you got.

For much encoded content it is very hard to see the difference between 100 Mbps and 60 Mbps or so; it really depends on the video content and the playback equipment.

Jeff
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Quote
... and suggests most appropriate formats to match the video specs ...
Jeff


You are right! But in my case, the video on the timeline is not recognized correctly by the Profile Analyzer. What prevents PD from recognizing my video as XAVC-S rather than AVC? Maybe this is the main reason for the incorrect processing of the video data.
tomasc [Avatar]
Senior Contributor Joined: Aug 25, 2011 12:33 Messages: 6464 Offline
[Post New]
This is interesting. According to the specification here: https://www.cnet.com/products/sony-a6300/specs/ , your recordings are XAVC S as defined here: https://www.f2fsoft.com/avchd-video-converter/xavcs-vs-avchd/ .
follet [Avatar]
Newbie Location: Ukraine Joined: Nov 29, 2015 04:46 Messages: 12 Offline
[Post New]
Quote This is interesting. According to the specification here: https://www.cnet.com/products/sony-a6300/specs/ , your recordings are XAVC S as defined here: https://www.f2fsoft.com/avchd-video-converter/xavcs-vs-avchd/ .


I think that is commercial material. I read the article on Wikipedia and saw that PowerDirector should work with XAVC properly: "...For XAVC-S, CyberLink PowerDirector, Adobe Premiere..." The wiki explains well how the bitrate depends on the resolution; it can be as high as 500-600 Mbit! My camera generates 100 Mbit. I'll go home, create an XAVC-S preset manually with 100 Mbit, and try to render. I do not understand why I should do this if the software has an "AI" profile analyzer that can't guess the video's characteristics...
JL_JL [Avatar]
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
[Post New]
Quote You are right! But in my case, the video on the timeline is not recognized correctly by the Profile Analyzer. What prevents PD from recognizing my video as XAVC-S rather than AVC? Maybe this is the main reason for the incorrect processing of the video data.

Yes, as I pointed out in my first post, the "Profile Analyzer" does not properly catch the format of your XAVC-S files; in fact, it will not even suggest a proper profile for an XAVC-S file that PD itself produces.

I don't believe that is at all related to the incorrect processing of the video; it's simply a recommendation. As I mentioned, you can create your own profile, modify the video bitrate, and see what output bitrate your RX 570 card can actually deliver with hardware encoding when the input specification matches the output settings.

Jeff