Quote:
While my field of work is not graphics, it doesn't make much sense to me that a codec would be using the FPU. Floating-point makes more sense for things like 3D rendering.
I was planning to replace the Quadro 600 with a 750 Ti, but then I found out that its NVENC is a weirdo - between Kepler and Maxwell 2. So I am not sure how well it will play in the future (buggy?) because of this in-between generation.
I think whenever there is a new piece of software/hardware, it is guaranteed to be pretty buggy. This is just reality these days with consumer-level stuff. If you buy the hardware and software from 12-18 months ago, they tend to have worked out the bugs, and you save a bunch doing that as well. There is a big premium to pay to get the latest and "greatest" and it's not just money.
The Maxwell 2 looks good on paper. I want the HDMI 2.0 in particular for 4K progressive display support.
The Gigabyte GTX 750 Ti I got supports 4K progressive over dual HDMI - but very few monitors actually support that.
New displays are using HDMI 2.0.
But $350 - $400 for a GTX 970 is too high of a price to pay to be a beta-tester. I already feel I paid too much to test the GTX 750 Ti.
When there is a sub-$200 card, I may consider it. Maybe there will be a decent and affordable 4K monitor to go with it too.
I also want a 4K projector for my home theater that doesn't cost $10k - or even $4k. Guess I'll wait a few more years for that.
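On the point above about codecs avoiding the FPU: modern video codecs such as H.264 deliberately replace the floating-point DCT with an integer transform, so every encoder and decoder gets bit-exact results regardless of hardware. A minimal sketch of the H.264-style 4x4 forward integer transform (illustration only, not tied to any particular encoder - scaling factors are folded into quantization and omitted here):

```python
import numpy as np

# Integer approximation of the 4x4 DCT used by H.264 (core transform only).
# All arithmetic stays in integers, so results are bit-exact everywhere.
C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]], dtype=np.int32)

def forward_transform(block):
    """4x4 integer transform of a residual block: Y = C @ X @ C.T (int32 in/out)."""
    X = np.asarray(block, dtype=np.int32)
    return C @ X @ C.T

# A flat (constant) block transforms to a single DC coefficient.
flat = np.full((4, 4), 1, dtype=np.int32)
coeffs = forward_transform(flat)
print(coeffs[0, 0])               # DC coefficient: 16
print(np.count_nonzero(coeffs))   # only the DC term is nonzero: 1
```

Notice there isn't a single float in the whole pipeline - which is exactly why a codec doesn't need the FPU.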
Quote:
Cool, I didn't think to install my old 9500GT to see whether double precision is used in the CUDA encoder - that's for the NVIDIA engineers to answer. It would be good for the newer cards, since they all have good single-precision performance.
Wonder if it makes a difference in encoding quality...
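On whether the CUDA encoder actually uses doubles: one rough way to check without waiting on NVIDIA is to dump the kernel's PTX from the binary (cuobjdump has a PTX dump option) and scan the listing for .f64-typed instructions, since PTX marks double-precision ops with an .f64 suffix (add.f64, mul.f64, ...) versus .f32 for single precision. A hypothetical sketch of just the scanning step, assuming you already have the PTX as text:

```python
import re

# PTX encodes double-precision ops with an .f64 type suffix
# (e.g. add.f64, cvt.f64.f32), while single precision uses .f32.
F64_RE = re.compile(r'\.f64\b')

def uses_double(ptx_text):
    """Return True if the PTX listing contains any double-precision ops."""
    return bool(F64_RE.search(ptx_text))

# Hypothetical PTX snippets, for illustration only:
single = "mul.f32  %f3, %f1, %f2;"
double = "add.f64  %fd3, %fd1, %fd2;"
print(uses_double(single))  # False
print(uses_double(double))  # True
```

No .f64 hits would suggest the encoder is fine on cards with weak double-precision throughput.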
Samsung 55" LED TV UN55HU8550FXZA 4K Smart TV under $2K - on sale now! (Sounds like a salesman...)
Whenever you're working with 4K videos from your LG G3 in PD13, would you post them?
I'd love to sit down with my popcorn and watch...
Thanks a million.
p.s.
meanwhile I'm eyeing an ASUS ROG G751JT with a GTX 970M for $1499...
Yes, I do believe in Santa and Mrs. Claus!!!
This message was edited 1 time. Last update was at Nov 16, 2014 13:46
Yashica Electro 8 LD-6 Super 8mm
Asrock TaiChi X470, AMD R7 2700X, W7P 64, MSI GTX1060 6GB, Corsair 16GB/RAM
Dell XPS L702X i7-2860QM, W7P / W10P 64, Intel HD3000/nVidia GT 550M 1GB, Micron 16GB/RAM
Samsung Galaxy Note3/NX1