CyberLink Community Forum
where the experts meet
Multi-GPGPU revisited - again
GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
I'm trying to determine whether multi-GPGPU is really worth the extra cost. My recent build is an i7 4770k, running at stock speed for the moment, and I also have an AMD HD7870 video card with 2GB of RAM. I'm not a gamer; the only reason I purchased the AMD card was that I was originally going to buy an FX8350, which doesn't have a built-in GPU. I ended up going with the i7 and was hoping that PD12's multi-GPGPU would speed up rendering.

My misconception about multi-GPGPU: it simply doesn't work ALL of the time. Apparently, PD only uses the GPUs for certain FX and certain file formats, depending on the GPU.

For example, I connected my monitor to the i7's HDMI output, with the HD7870 installed but no monitor connected to it. I rendered a few HD files as I normally would: a few cuts and/or crossfades. The render did use the Intel GPU, but the HD7870 showed zero activity.

After some reading on this forum, I repeated the same setup, but this time I used the nature.wmv sample (4 copies) and added an FX carrying the AMD logo to each clip. This time the HD7870 did show activity, although the readings are confusing. The Intel GPU was also used simultaneously.

Since I normally do not use FX, is there really any reason for me to keep the HD7870 video card? I'll have to run another series of tests the way I normally edit to determine that. But I did at least see that multi-GPGPU works with certain FX filters.
[Attachment: hd7870.gif - AMD GPU]
[Attachment: hd4600.gif - Intel GPU]


Intel i7 4770k, 16GB, GTX1060 3GB, Two 240GB SSD, 4TB HD, Sony HDR-TD20V 3D camcorder, Sony SLT-A65VK for still images, Windows 10 Pro, 64bit
Gary Russell -- TN USA
Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Russell,

Any chance you have access to a cheap $30 graphics card you could throw in and re-test?

---
Additional afterthought:
I'm doing a bit of research on this as well because of another thread... You might already know this, but here's what I've found out:

Depending on the type of Intel processor/platform you have (Sandy Bridge, Ivy Bridge or Haswell), it may support a fairly new technology called "Intel Quick Sync Video". Unfortunately, my CPU is an old Lynnfield from 2009, so it doesn't have that. Anyhow, "Intel Quick Sync Video" is a catch-all hardware accelerator for all kinds of video/image/game applications that might benefit from it.

I found these links:
a link to Intel Quick Sync Video and, lo and behold, near the bottom of that same page, a link to PowerDirector 11's page marketing its Quick Sync support in the blurb.

I think what it comes down to is this:
If your computer is older like mine and has nothing to accelerate video editing, then a high-end GeForce/Radeon card is useful; otherwise, stick to the capabilities of "Intel Quick Sync Video" if your system has it (mileage may vary, since different CPUs come with different flavours of Quick Sync and perform slower/faster).

Is there an option to disable "Quick Sync" somehow and re-test using your HD7870? I think PD uses whatever rendering pathway Windows dishes out, and it could well be that the shortest path is Quick Sync (bypassing your HD7870). If you can force Quick Sync off, PD has no option but to use that HD7870 you've got, if you know what I mean. Close that gate and see what results you get; it would be interesting to find out. This might even be a setting in the BIOS, I'm guessing.

Problem is, I don't think you can multi-GPU across different technologies (your HD7870 linked to Intel Quick Sync somehow), but I do know you can multi-GPU within the same technology, like CrossFire for AMD Radeon and SLI for GeForce (I'm not saying you can mix CrossFire and SLI, just that it works if you use only Radeon cards or only GeForce cards).

Last little bit:
Nvidia uses its CUDA cores to provide the OpenCL capabilities an application (like PD) can use, while AMD does the same with its line of multi-core Radeon GPUs.

To me it sounds like PD will perform pretty well as long as you have a newish setup that supports "Quick Sync" (the higher end the better) or you use a Radeon/GeForce card with plenty of horsepower.

Just so you know, OpenCL was started by Apple, although lots of vendors use it now.
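
Side note: if you want to see which OpenCL devices your drivers actually expose (presumably what PD enumerates for multi-GPGPU), here's a quick sketch using the third-party pyopencl package. The use of pyopencl is purely my illustration; PD doesn't document its internals.

# Illustration only: list the OpenCL platforms/devices the installed
# drivers expose. Requires the third-party pyopencl package
# (pip install pyopencl).
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}, "
              f"compute units: {device.max_compute_units}")

On a setup like Russell's you'd expect both the Intel HD4600 and the HD7870 to show up, each under its own vendor's platform.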


RobAC
Contributor Joined: Mar 09, 2013 18:20 Messages: 406 Offline
I like the way Jim is thinking. Just to add to the thread and flesh things out, here is an observation I made for PD11 that still holds true for PD12 (if anyone finds any difference, let me know): http://forum.cyberlink.com/forum/posts/list/29236.page

---------------------------------
1.) Hardware Acceleration, which is enabled in the Preferences menu, allows PowerDirector to utilize your Nvidia, AMD & Intel graphics cards to speed up video effect render/preview. See this link for more detailed info: http://www.cyberlink.com/support/product-faq-content.do?id=12777&prodId=4

2.) The Fast Video Rendering Technology, located in the Produce screen, includes both SVRT (Smart Video Rendering Technology) and the Hardware Video Encoder.
---------------------------------
*Not all video clips and formats can use Fast Video Rendering Technology!*
It also depends on the graphics card installed in your computer; see below:
---------------------------------

For Nvidia the following is currently supported:
AVC H.264
MPEG-4
MKV

For Intel the following is currently supported:
MPEG-2
AVC H.264
MPEG-4
MKV

For AMD the following is currently supported:
MPEG-2
AVC H.264
WMV
MPEG-4
MKV

---------------------------------------
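
To make that table concrete, here's a throwaway Python sketch of the same lookup. The data is just my list above; the function is purely illustrative, not anything PD actually exposes:

# Hardware "Fast Video Rendering" format support per GPU vendor,
# transcribed from the list above. Illustrative only.
HW_ENCODE_FORMATS = {
    "nvidia": {"AVC H.264", "MPEG-4", "MKV"},
    "intel": {"MPEG-2", "AVC H.264", "MPEG-4", "MKV"},
    "amd": {"MPEG-2", "AVC H.264", "WMV", "MPEG-4", "MKV"},
}

def hw_encode_supported(vendor: str, fmt: str) -> bool:
    """Can this vendor's hardware encoder handle this output format?"""
    return fmt in HW_ENCODE_FORMATS.get(vendor.lower(), set())

print(hw_encode_supported("AMD", "WMV"))        # True
print(hw_encode_supported("Nvidia", "MPEG-2"))  # False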

Rob


PD 14 Ultimate Suite / Win10 Pro x64
1. Gigabyte Brix PRO / i7-4770R Intel Iris Pro 5200 / 16 GB / 1 TB SSD
2. Lenovo X230T / 8GB / Intel HD4000 + ViDock 4 Plus & ASUS Nvidia 660 Ti / Link: https://www.youtube.com/watch?v=ZIZw3GPwKMo&feature=youtu.be
GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
Quote: Problem is, I don't think you can multi-GPU using different technologies (your HD7870 linked to Intel Quick Sync somehow)
That is exactly what I was pointing out. Yes, both the Intel HD4600 and my HD7870 were being used at the same time.

In the BIOS, I have to choose the 'onboard' GPU as primary and make sure the monitor is connected to that HDMI port.
The HD7870 is installed in a PCIe slot.
When I edit in PD12, it recognizes the Intel GPU, but the FX filters still show the AMD logo. The ONLY time the HD7870 gets used is when I apply one of the FX filters with the AMD logo. That is what I was showing in the two GIFs in the first post: both GPUs were being used at the same time.

Since I rarely use FX filters, I question whether I should keep the HD7870. When rendering just video with no FX, I see no activity on the second GPU.
Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Rob, that's some useful stuff you posted. Thanks.

Russell,
I made that same deduction looking at the Radeon's chart. It's practically flat-lined.

I know you've probably thought about this, but I'll still throw it your way...
Quote: In the BIOS, I have to choose the 'onboard' GPU as primary and make sure the monitor is connected to that HDMI port. The HD7870 is installed in a PCIe slot.


So does that mean you have no choice but to have the 'onboard' GPU enabled?
You can't disable it in any way? It's always on and you can't set your primary as something else?

What if you de-select it and plug your monitor into the HD7870, ignoring the message?
Will your computer recognize the new setting?

If you remove/uninstall the HD7870, do you still get the same results?
Or is that what your Intel graph is showing?

If that's the case, I'd say we're seeing the death of dedicated GPU cards as chip makers churn out ever more feature-rich processors that can handle the jobs GPUs traditionally handled. I suppose that's why Nvidia has branched out into making CPUs and AMD merged with ATI. Interesting...


GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
Quote: So does that mean you have no choice but to have the 'onboard' GPU enabled?
You can't disable it in any way? It's always on and you can't set your primary as something else?
I can change the BIOS to make the GPU in the PCIe slot primary, but if I do, the Intel HD4600 is disabled and there's no multi-GPGPU in PD12.

I haven't run any tests yet with the HD7870 removed, but I assume the FX filters that are AMD-accelerated would no longer show the AMD logo. I haven't seen any FX filters with an Intel logo, so I have no idea whether any of them would be accelerated with the Intel GPU only.

However, I did do some video rendering tests with the Intel GPU only. The HD4600 is impressive with MP4 rendering by itself, but for the money I spent building this rig, I still felt I didn't get the price/performance I was expecting compared to my 5-year-old AMD Phenom 965. Overall, I'm happy with the new hardware.

I seriously doubt these CPU/GPU combos will ever make gamers happy. LOL. Gamers are the users who spend $800+ on graphics cards; I know a gamer who spent over $2k just on his video cards!

I'm not a gamer, but I do want hardware/software that can speed up video editing. I'll be doing a few more tests, like MPEG2 to MP4, MPEG2 HD to AVCHD, etc. I seriously doubt both GPUs will be used, though. It's kind of a pain to remove the HD7870 just to do a few tests, but I might do that over the weekend.
JL_JL
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
GGRussell, so you finally took the advice from my many responses to your posts: you set your Intel GPU as primary and added some GPU-supported effects, and now you see both GPUs active. As I pointed out previously, this is what I believe CL calls multi-GPGPU support. Could it be better? Certainly. Since they moved away from just CUDA and AVIVO support to generic OpenCL (around PD10), they could eventually expose more capability. Will they? I don't know. Others do, so I'm sure it's probably on CL's drawing board too.

Quote: I haven't run any tests yet with the HD7870 removed, but I assume the FX filters that are AMD-accelerated would no longer show the AMD logo. I haven't seen any FX filters with an Intel logo, so I have no idea whether any of them would be accelerated with the Intel GPU only.

You will see the Intel logo in the corner of the accelerated effects.

Jeff
Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
I would be interested to know, if you have the time to spare.

I'm wondering what the performance gain/loss would be from rendering the same video using the Intel alone, then disabling it and rendering with the Radeon as primary (Intel disabled).

What's the trade-off? It might also clear up the issue of the flat-lined Radeon you showed before.
If there's very little in it, I'd be really impressed with the power of that Intel.

Thanks. This has been a good conversation.
GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
I'll create a PD12 project with the nature.wmv file, add FX filters and perhaps transitions, etc., then render to different formats. That way there will be no SVRT involved, just transcoding to a new format.
----
Attached are the first tests, along with the procedure I followed. I must say I was a bit surprised that multi-GPGPU was actually slower than the HD7870 alone. It will be a few days before I can open my case to remove the HD7870 so I can test the HD4600 alone. I did save the project file if anyone wants to compare.

I did preview the files. They were all clean, with none of the artifacts other users have encountered.
[Attachment: PD12 test results.rtf]


JL_JL
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
Quote: 1.) Hardware Acceleration, which is enabled in the Preferences menu, allows Power Director to utilize your Nvidia, AMD & Intel graphics cards to help Preview / Produce Render your videos.

I believe you are being a little loose with terminology and with how it's implemented in PD. This is my understanding: the "Hardware Acceleration" setting in Preferences has to do with the use of OpenCL, if your card supports it, which any recent card will. Yes, another HA technology. CL's wording in Preferences is "Enable OpenCL technology to speed up video effect preview/render". If your card does not support OpenCL, the wording there changes to the technology supported; CUDA will be listed instead, for example. If you want to use your hardware for encoding, that is controlled in the "Produce" or "Create Disc" settings area, period. If you additionally want to use OpenCL technology for video effects, that setting is controlled in Preferences.

So if one selects the HA option in the "Produce" area, three things can happen (in all three cases below the timeline will be hardware encoded):
1) the timeline contains no video effects > enabling OpenCL technology in Preferences will have ZERO effect on encoding time
2) the timeline contains some accelerated video effects > enabling OpenCL technology in Preferences will have an effect on encoding time
3) the timeline contains some video effects, but they are not accelerated > enabling OpenCL technology in Preferences will have ZERO effect on encoding time

An accelerated effect is one that displays the AMD, Intel, or Nvidia logo in its corner, depending on your hardware.
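
If it helps, here's the same rule as a little Python sketch. This is purely illustrative (my reading of the behavior, not CL's actual code):

# My reading of when the Preferences OpenCL tick changes encoding
# time, given HA is selected in Produce. Illustrative only.
def opencl_helps_encoding_time(timeline_effects, opencl_enabled):
    # timeline_effects: list of (effect_name, is_accelerated) pairs
    has_accelerated_fx = any(accel for _, accel in timeline_effects)
    # Cases 1 and 3: no effects, or only non-accelerated ones -> no change.
    # Case 2: at least one accelerated effect -> OpenCL shortens the encode.
    return opencl_enabled and has_accelerated_fx

print(opencl_helps_encoding_time([], True))                         # False (case 1)
print(opencl_helps_encoding_time([("Gaussian Blur", True)], True))  # True  (case 2)
print(opencl_helps_encoding_time([("TV Wall", False)], True))       # False (case 3)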

Jeff


Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Thank you both.

I'm uploading an edited version of your tests, Russell, scaled up to a 1-hour scenario using average times.
I've added info on the time gains based on the two sets of data (the Radeon-Intel combo and the Radeon stand-alone).
Russell, if you'd like to add your Intel stand-alone data to it at some point, I would be glad to scale up the calculations and work out the gains/losses.

Where it says...
720x480 60i 8Mbps yes used on abt 2 clips 1:32/46:00 (m:s)
...the red figures are the timings scaled up to 1 hr, and the green (+m:s) figures are rendering gains (speed improvements).
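(To decode the scaling: assuming the test timeline ran 2 minutes, the factor to reach 1 hour is 30x, so a measured 1:32 render becomes 92 s x 30 = 2,760 s = 46:00. Every red figure is scaled the same way.)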

Looking at your AVC and MP4 results, you're managing to shave up to 30 minutes off your render times using just the HD7870. That's amazing! Wow! The only sluggishness comes when outputting to MPEG2, but even so, I'd be happy to knock even 15 minutes off my render times if I were working at 1920x1080.

So far, I'd keep the HD7870 and disable the Intel!
It just depends whether most of your work is MPEG-based.
If you're still up for it, run an Intel-only test and add it to the list.
Like I said, I'll work out the timings and post them back.

Here goes:
[Attachment: PD12 test results.rtf]


RobAC
Contributor Joined: Mar 09, 2013 18:20 Messages: 406 Offline
Thanks for the input, Jeff; noted and updated.

Rob
GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
Final Intel HD4600 tests, with my HD7870 removed from the machine. The HD7870 really makes a difference with AVC and MP4 rendering when the FX filters written for acceleration are used. But since I hardly ever use those, I'd need to do other tests the way I normally work, too. For example, cutting the commercials out of an HDTV series and rendering to an MP4 that I can stream to my devices. I think the HD7870 would still outperform the Intel HD4600 there, because most of those files are transcoded from MPEG2 HD to MP4.

I find it odd that cable TV (in my location) is still using MPEG 2 encoding when everything else seems to have moved to MPEG 4.

Other notes: since I removed the HD7870, I've noticed that the Intel HD4600's output on the same monitor is much sharper and the fonts are very clear. With the HD7870, the display appeared 'soft' and the fonts appeared thicker. Strange.
[Attachment: PD12 test results.rtf]


GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
Before I put the HD7870 back in my machine, I decided to do a quick transcode test. I placed an HDV file from my Canon HV20 on the timeline and did a few cuts.

File is 1440x1080 60i 25Mbps. Total on timeline is 5:00;03.

Output to two formats:
MP4 1920x1080 30p 16Mbps
AVC 1920x1080 60i 16Mbps

Intel HD4600:
MP4 97-100% GPU Load 3:10
AVC 92-99% GPU Load 2:36

AMD HD7870:
MP4 avg <10% load 2:35
AVC 0-25% load 3:31

As you can see, mixed results: Intel is faster at the AVC transcode and AMD is faster at the MP4 transcode. So for my personal needs, it looks like either GPU would be fine.
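(For scale: a 5:00 timeline that renders in 2:35 is running at about 1.9x real time, while 3:31 is about 1.4x.)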
Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Here are the new timings.

MPEG2: very little change, and beginning to show performance loss.
AVC: minimal change, and beginning to show performance loss as the resolution (frame size, w x h) increases.
MP4: more performance loss, but consistent between resolutions unless the CPU either comes out of its dozy state (74%, 89% vs 36%) or the resolution increases (double whammy: higher res = more stress = the CPU steps up a gear).

I would run your HD7870 alone and forget the Quick Sync altogether.

Quote: I find it odd that cable TV (in my location) is still using MPEG 2 encoding when everything else seems to have moved to MPEG 4.


That could be because they haven't upgraded the transmitters in your area, or they're trying to save money, or it's the kind of setup where part of the transmitted bandwidth is shared among other services (like broadband, interactive TV, the NSA, or flying drones overhead! Take your pick, just kidding). It's purely business.

Quote: Other notes: since I removed the HD7870, I've noticed that the Intel HD4600's output on the same monitor is much sharper and the fonts are very clear. With the HD7870, the display appeared 'soft' and the fonts appeared thicker. Strange.


Check your Catalyst options to change that. Windows also has a "ClearType" tuner which lets you sharpen the fonts on your monitor. I think it's built into Windows, maybe in the Control Panel, or it could be a free download from Microsoft.

Here goes:
[Attachment: PD12 test results 2.rtf]
Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Quote:
File is 1440x1080 60i 25Mbps. Total on timeline is 5:00;03.

Output to two formats:
MP4 1920x1080 30p 16Mbps
AVC 1920x1080 60i 16Mbps

Intel HD4600:
MP4 97-100% GPU Load 3:10
AVC 92-99% GPU Load 2:36

AMD HD7870:
MP4 avg <10% load 2:35
AVC 0-25% load 3:31


Just check to make sure the transcoded/rendered files finished with the same frame sizes you had in the tests above.
GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
I hate ClearType and keep it turned off. It uses anti-aliasing between pixels, but in my opinion it doesn't work well.

The MPEG2 file is the SOURCE file that I placed on the timeline. I did a few cuts to make it 5 minutes long. I wanted to test PD12 transcoding that MPEG2 file to MP4 and AVC.

The times you quoted were the times it took each GPU to render the 5-minute edited timeline. I did check each rendered file for quality/artifacts and couldn't tell much (if any) difference between the AMD and Intel rendered files. I should get better Intel GPU performance if/when I decide to overclock the CPU; I have the mobo, fast RAM and water cooler to do it.

99% of the time I only use cuts with a few fades, and I see no drastic time differences between the Intel and AMD GPUs in that case.


Jimbo223
Member Joined: Apr 25, 2012 02:59 Messages: 95 Offline
Yup, there's not much in it.
If you scaled that up to a 1-hour clip, you'd be 7 minutes quicker for the MP4 but 11 minutes slower for the AVC using the Radeon.
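
For anyone who wants to check my arithmetic, the scaling is just linear. Here's a quick Python sketch using Russell's 5-minute timings from his post above (the exact timeline length is my only assumption; the render times are his):

# Scale render times measured on a ~5-minute timeline up to a 1-hour clip.
def scale_to_hour(render_secs, timeline_mins=5):
    return render_secs * (60 / timeline_mins)

# (format, Intel HD4600 seconds, AMD HD7870 seconds) from Russell's test
for fmt, intel, amd in [("MP4", 190, 155), ("AVC", 156, 211)]:
    saved = scale_to_hour(intel) - scale_to_hour(amd)
    print(f"{fmt}: Radeon saves {saved / 60:+.0f} min per hour vs Intel")
# MP4: Radeon saves +7 min; AVC: Radeon saves -11 min (i.e. it's slower)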

...and you're probably right. In the end it just comes down to what you need to get done.


JL_JL
Senior Contributor Location: Arizona, USA Joined: Oct 01, 2006 20:01 Messages: 6091 Offline
Quote: Thanks for the input, Jeff; noted and updated.

Rob

If one understands the details behind ticking an option, it's easier to see whether it will help with a particular timeline. Unfortunately, CL in my opinion does a very poor job of explaining what they really implemented and where it applies; the sales department takes over and you just get fluff. The manual is terrible as an aid, so end users have to work out for themselves how it affects them. The OpenCL setting in Preferences, to me, is what really enables the use of the GPU as a general processor (GPGPU), and from what I have seen, the CL wording in Preferences means exactly that: "video effect". I think it's all the same as what was presented here in multiGPGPU.pdf: http://forum.cyberlink.com/forum/posts/list/25826.page

The attached pic shows the effects of the user selections on the load of the various devices. It uses a basic timeline made by copying the sample file onto the timeline several times, trimming each copy to 2 min, and producing a 1920x1080, 24Mbps file, to get something more typical of users' video file properties to work with. The first and last 2-min clips have either the Gaussian Blur (accelerated) or the TV Wall (non-accelerated) effect applied. Gaussian Blur was selected because it's a computational load. The three 2-min chunks make the regions easy to identify in the load chart.

Summary of the chart:
Run 1: Basic CPU and HD4000 HA encoding. Notice the high CPU load during the encoding of the first and last 2-min regions: Gaussian Blur puts a significant load on the CPU even though the clip is HA in Produce.

Run 2: This run essentially shows the effect when OpenCL is active in Preferences and an accelerated effect is used in the timeline. During the encoding of the first and last 2 min of the timeline, all three of my PUs are active; in the middle 2-min clip, only the HD4000 and CPU are active. Again, it was HA in Produce.

Runs 3 and 4: For a non-accelerated effect in the timeline, OpenCL in Preferences has no effect; both runs match each other.

The other note of interest: the load in the middle 2-min section of all four runs is essentially identical, as it should be. For this unmodified section, exactly the same processing by the CPU (25%) and the HD4000 (49%) needs to be done in all four runs.

The OpenCL option in Preferences can even be used by those who prefer CPU encoding: if it's ticked, any accelerated effects used in the timeline will still get the GPGPU features provided by OpenCL. So if a user has an AMD CPU and an Nvidia GTX470, the GTX470 will be used as a general processor (GPGPU) to help encode the accelerated effects in the timeline.

The above is how I understand the current OpenCL implementation in PD11 and PD12. Can or could it do more? Based on what I know, certainly; maybe CL will expose more general encoding capability in future releases.

Jeff
[Attachment: OpenCL3.png - OpenCL effects]


GGRussell
Senior Contributor Joined: Jan 08, 2012 11:38 Messages: 709 Offline
JL_JL -- what app were you using to show CPU and GPU load simultaneously?
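
In the meantime, here's a rough sketch of one way to log CPU and GPU load side by side yourself: poll the CPU with psutil and the GPU via the nvidia-smi command line tool. Note this particular GPU query is Nvidia-only; an AMD or Intel GPU would need a different utility, so treat it as the idea rather than a universal recipe.

# Rough sketch: print CPU load (psutil) and Nvidia GPU load (nvidia-smi)
# side by side once per second. Nvidia-only; AMD/Intel need other tools.
import subprocess
import time

import psutil  # third-party: pip install psutil

while True:
    cpu = psutil.cpu_percent(interval=None)  # % since the previous call
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"CPU: {cpu:5.1f}%   GPU: {gpu}%")
    time.sleep(1)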