NemoSomeN
Jul 1, 2015
I hope people realize the power of the R9 295x2. It regularly goes on sale from ~$750 down to $600.
Some Scores: http://www.futuremark.com/hardware/gpu?_ga=1.244450757.1569575571.1408617787.
Benchmarks vs Titan X:

15.6% more performance for 60%-75% of the price. The only con is the ~450W TDP; a 1200W Corsair PSU costs about as much as the savings, meaning for the price of the Titan you can get both a better PSU and a better graphics card.
It does not even run that loud; see 8:10 in this video (which also compares it against 780 Ti SLI):

By the way, he says "expensive," but that was over a year ago. To make the "possible fire situation" clear: you cannot use PCIe 8-pin (6+2) power connectors that split into two or are daisy-chained; each needs to be a single dedicated cable all the way from the PSU. If you want to argue that you are losing G-Sync, there is AMD's version: Adaptive-Sync, aka FreeSync. AMD is trying to make their version a public standard, while Nvidia is trying to make you pay for theirs; FreeSync monitors usually cost about $200 less.
Edit: It also uses only one PCIe 3.0 x16 slot, so you can use it in an ITX build. I believe the single slot does not matter because of how SLI/CrossFire works, and even if bandwidth were a legitimate concern, you would see only a 0.5-1.5% performance drop.
Gen0
Jul 5, 2015
You forget the cost of the increased power draw over the years you will use it. Only in the US and a few other countries - not including many developed places like the UK - does this work out as a saving.
NemoSomeN
Jul 17, 2015
It's like $7 a month. Let's say you keep it for 10 years; that's $840. $840 plus the $650 card vs. $1,100 for the Titan? I don't know exactly how much the Titan costs per month because they only tested the Radeon, but I think it's about 1/3 of the power draw, so the R9 comes out maybe $100-$200 more expensive after 10 years. If you can't afford the $7, then the Titan X is out as well, and if you're buying a high-end graphics card you will probably upgrade before 10 years anyway.
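The 10-year arithmetic above can be sketched out; the $7/month US power cost, the $650 sale price, the $1,100 Titan price, and the 1/3 power-draw ratio are all figures assumed from the posts, not measured values:

```python
# Rough 10-year total-cost sketch using the figures quoted in this thread.
YEARS = 10
r9_card = 650           # assumed sale price of the R9 295x2 ($)
titan_card = 1100       # assumed price of the Titan X ($)
r9_power_monthly = 7.0  # claimed extra US power cost for the R9 ($/month)
titan_power_monthly = r9_power_monthly / 3  # assumed ~1/3 the power draw

r9_total = r9_card + r9_power_monthly * 12 * YEARS          # 650 + 840
titan_total = titan_card + titan_power_monthly * 12 * YEARS  # 1100 + 280

print(f"R9 295x2 total: ${r9_total:.0f}")
print(f"Titan X total:  ${titan_total:.0f}")
print(f"Difference:     ${r9_total - titan_total:.0f}")
```

Under these exact assumptions the gap comes out to roughly $110 over 10 years, at the low end of the $100-$200 range claimed above.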
Gen0
Jul 18, 2015
Two things.
Firstly, that $7-a-month figure is for the US; I specifically said not the US. In the US you pay 12.50 cents per kWh on average, according to the EIA. Doing the maths, that means you expect to use 56 kWh per month more with the AMD than with the Nvidia. In Berlin, for example, that would cost $17.28 more per month on average due to the much higher price of energy.
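The kWh arithmetic here works in two steps: back out the extra monthly usage from the $7 US figure, then reprice that usage at a higher tariff. The 12.5¢/kWh US average comes from the post; the ~30.9¢/kWh Berlin-equivalent rate is the value implied by the $17.28 figure, not an independently sourced price:

```python
# Step 1: infer extra monthly usage from the claimed $7/month US cost.
# Step 2: reprice that usage at a higher (Berlin-like) tariff.
us_rate = 0.125        # USD per kWh (quoted EIA US average)
berlin_rate = 0.3086   # USD per kWh (assumed Berlin-equivalent rate)
extra_cost_us = 7.0    # claimed extra monthly cost in the US ($)

extra_kwh = extra_cost_us / us_rate          # extra kWh per month
extra_cost_berlin = extra_kwh * berlin_rate  # extra $ per month in Berlin

print(f"Extra usage:        {extra_kwh:.0f} kWh/month")
print(f"Extra cost, Berlin: ${extra_cost_berlin:.2f}/month")
```

The same two-step conversion works for any local electricity price: divide the known cost by the known rate to get usage, then multiply by the new rate.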
Also, if you're smart, when you upgrade later you will not be throwing the card in the bin. It will either go into another machine or be sold; either you get 10 years' use out of it yourself, or resale value matters, and the Titan will always have the higher resale value. You know how valuable a P4 or a Sempron is today? No? They are dirt cheap because they are too power-inefficient to be worth repurposing.
In fact, the only reason I don't have cards from 10 years ago in my systems is the arrival of PCIe and the end of the AGP era. I still have a Matrox Millennium in one of my farm PCs, and a working PowerMac G5 with a GeForce FX 5200 in it for older apps.
Don't underestimate the lifespan of the integrated circuit.