Hey, triffid, you're back!!
Wow, I started this post almost two months ago (Nov. 4). Back then, I hadn't even purchased the Pioneer drive yet; I just wanted to check a few things out ahead of time.
Finally got the Pioneer drive as a Christmas gift, installed it, installed the OEM CyberLink software, popped in a 4K Ultra HD disc of Star Trek (2009), expected to be rocking & rolling, and.......nope.
You've chimed in on my thread post here since the beginning, so you know that my system hardware, etc., should be able to handle this 4K stuff.
OK, from your recent reply, let me make sure I'm clear:
1.) You're saying my F4 BIOS will do? I had downloaded the F8 BIOS from Gigabyte's web site, just hadn't updated yet [you know how they say there's always a little inherent danger when attempting a BIOS update....]
2.) I checked my BIOS, and yes, SW Guard Extensions (SGX) is already set to Enable (not Software Controlled).
3.) And while I was in the BIOS, let me ask you this: Since I'm not using the 4K feature of the Pioneer drive at the moment, I'm back on my add-in discrete card (EVGA/NVIDIA GeForce GTX 1080 SC), not the on-board Intel graphics. But when I do attempt this again, let me make sure: In the BIOS, on the Peripherals tab, under Initial Display Output, I want to choose "IGFX", correct? And what about this: On the Chipset tab, there's Internal Graphics; right now it's on "Auto". Should I switch it to "Enabled" or leave it on "Auto"?
4.) You mention this: "The graphics card must not be connected to any other monitor/TV". When I was attempting to use the Pioneer UHD drive, I connected the on-board Intel graphics from an HDMI port on my motherboard to my monitor. From my add-in EVGA graphics card (PCIe slot 1), I have a DisplayPort cable connecting it to the monitor. So when you say "The graphics card must not be connected to any other monitor/TV", are you referring to the on-board Intel graphics? Or my add-in graphics card, too? It's not like I have an additional monitor; it's the same LG monitor, it's just that two of its ports are in use: the DisplayPort (coming from my EVGA graphics card) and an HDMI port (coming from the on-board Intel graphics).
Could that be causing any kind of issue, do you think? Preventing the 4K playback? The DisplayPort connection isn't receiving a signal at that moment (the EVGA card in the PCIe 1 slot is disabled while the on-board Intel graphics is enabled).
5.) You mention to upgrade the Intel VGA driver? VGA?!?! Isn't that kind of "old"?
I had already gone to Intel's web site and upgraded the HD Graphics drivers.
And upgrade the Intel ME, too? I can get that from Gigabyte's web site in the Support section for my motherboard.
6.) As far as my HDMI cable goes: It's not as though I grabbed some old one I'd had lying around the house for years; I purchased a new one recently, and the package listed 4K as well as 1080p (plus, the connector ends are gold-plated, and it has Ethernet). Shouldn't this recently-purchased HDMI cable be 2.0?
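For what it's worth, a back-of-the-envelope bandwidth check (a rough sketch only; the figures are the published HDMI 1.4/2.0 TMDS limits, and it ignores blanking intervals and chroma subsampling) shows why the cable rating matters most for 4K at 60 Hz, while 4K at film's 24 Hz is far less demanding:

```python
# Rough arithmetic: how much bandwidth does 4K video need, and does it
# fit within HDMI 1.4 vs. HDMI 2.0?
# Published TMDS limits: HDMI 1.4 = 10.2 Gbit/s raw, HDMI 2.0 = 18 Gbit/s raw.
# 8b/10b line coding leaves ~80% of that for actual pixel data.

def pixel_rate_gbps(width, height, fps, bits_per_pixel):
    """Active pixel data rate in Gbit/s (ignores blanking intervals)."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI14_DATA = 10.2 * 0.8   # ~8.16 Gbit/s of usable pixel data
HDMI20_DATA = 18.0 * 0.8   # ~14.4 Gbit/s of usable pixel data

uhd60 = pixel_rate_gbps(3840, 2160, 60, 24)  # 4K @ 60 Hz, 8-bit RGB
uhd24 = pixel_rate_gbps(3840, 2160, 24, 24)  # 4K @ 24 Hz (film cadence)

print(f"4K60: {uhd60:.1f} Gbit/s -> fits HDMI 1.4? {uhd60 <= HDMI14_DATA}, "
      f"fits HDMI 2.0? {uhd60 <= HDMI20_DATA}")
print(f"4K24: {uhd24:.1f} Gbit/s -> fits HDMI 1.4? {uhd24 <= HDMI14_DATA}")
```

Running this shows 4K60 needing roughly 11.9 Gbit/s, which overwhelms HDMI 1.4's payload but fits HDMI 2.0, while 4K24 needs only about 4.8 Gbit/s. Two caveats: cables aren't actually stamped "2.0"; the 18 Gbit/s certification is labeled "Premium High Speed HDMI", so a package that just says "4K" may only mean 4K24. And for UHD Blu-ray the bigger hurdle is HDCP 2.2, which is negotiated by the GPU output and the monitor input, not by the cable.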
On the web link you gave for my monitor (http://www.lg.com/uk/monitors/lg-27UD58-B), if you scroll through the pictures, there's one showing the rear of the monitor, and you can see that it has two HDMI inputs and one DisplayPort.
So, if you'd be so kind, triffid, please read through my numbered list and let me know if I'm on target.
Pez
P.S. And if I can finally get this 4K thing going, then, from way back in this thread, you can let me in on your secret: the part where you hinted at a workaround for switching between the two graphics, on-board Intel & my EVGA card, with no card removal or cable switching.