Firstly, that $7-a-month figure is for the US, and I specifically said not the US. In the US you pay 12.5 cents per kWh on average, according to the EIA. Doing the maths, that means YOU expect to use 56 kWh per month more with the AMD card than with the Nvidia. So in Berlin, for example, it would cost $17.28 more per month on average, due to the much higher price of energy there.
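To make the arithmetic explicit, here is a minimal sketch of the calculation. The Berlin rate of roughly 30.9 US cents per kWh is my assumption, back-calculated from the $17.28 figure rather than quoted from any official source:

```python
# Back out the extra monthly energy use from the quoted US figures,
# then price that same usage at a Berlin-level electricity rate.

extra_cost_us = 7.00   # claimed extra monthly cost in the US, in dollars
rate_us = 0.125        # average US residential price per kWh (EIA)

# Extra energy the AMD card draws per month: 7.00 / 0.125 = 56 kWh
extra_kwh = extra_cost_us / rate_us

# Assumed Berlin rate (~30.9 US cents/kWh), inferred from the $17.28 figure
rate_berlin = 0.3086

# Same 56 kWh priced at the Berlin rate: 56 * 0.3086 ≈ $17.28
extra_cost_berlin = extra_kwh * rate_berlin

print(f"Extra usage: {extra_kwh:.0f} kWh/month")
print(f"Extra cost in Berlin: ${extra_cost_berlin:.2f}/month")
```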
Also, if you're smart, when you upgrade again later you will not be throwing the card in the bin. It will either go into another machine or be sold: either you get ten years of use out of it, or the Titan holds the higher resale value. You know how valuable a P4 or a Sempron is today? No? They're dirt cheap, because they're too power-inefficient to be worth repurposing.
In fact, the only reason I don't have cards from 10 years ago in my systems is the arrival of PCIe and the end of the AGP era. I still have a Matrox Millennium in one of my farm PCs, and a working PowerMac G5 with a GeForce 5200 in it for older apps.
Don't underestimate the lifespan of the integrated circuit.