Blower-style coolers are really only worth it when you absolutely must exhaust (nearly) all of the GPU's heat out of the case, generally in situations where space is extremely tight or there is little to no airflow inside the case. Open-air designs cool far better than the traditional OEM/reference-style blowers.
In nearly all instances, blower-style centrifugal fans are also considerably louder than open-air coolers. Between the much better cooling performance and the lower noise levels, open air is nearly always the better choice.
Modern GPUs are much smarter than their predecessors and take into account how much power they are drawing and how hot they are running. This helps prevent accidental damage, and it can also net you extra performance simply because the card is running cool: GPUs automatically raise or lower their clock speeds as they sense changing operating conditions.
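To make that boost/throttle idea concrete, here is a toy sketch of the feedback loop. This is not any vendor's actual algorithm; the temperature/power limits, clock step, and clock range are all made-up illustrative numbers:

```python
# Illustrative sketch (NOT a real vendor algorithm) of how a GPU might
# step its clock up or down based on temperature and power headroom.

def next_clock_mhz(current_mhz, temp_c, power_w,
                   temp_limit_c=83, power_limit_w=180,
                   step_mhz=13, min_mhz=1100, max_mhz=1900):
    """Raise the clock while under both limits; back off when over either."""
    if temp_c >= temp_limit_c or power_w >= power_limit_w:
        return max(min_mhz, current_mhz - step_mhz)  # throttle down
    return min(max_mhz, current_mhz + step_mhz)      # boost up

# A cool, low-power card creeps upward...
print(next_clock_mhz(1500, temp_c=65, power_w=140))  # 1513
# ...while a hot one backs off.
print(next_clock_mhz(1500, temp_c=85, power_w=140))  # 1487
```

The takeaway: better case airflow keeps the card on the "boost up" branch more of the time, which is free performance.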
Waste heat from the GPU is a legitimate concern, but it doesn't take much effort or money to add a couple of fans to your case and make sure there is decent airflow to remove the heat buildup. You don't need RGB LED fans costing $25 each to keep case temperatures down. An excellent buy for quiet, durable 120mm fans is Arctic's F12 PWM PST 5-fan bundle for $25 on Amazon. PWM (Pulse Width Modulation) means they are 4-pin fans with the extra wire to adjust fan speed on the fly, like a CPU heatsink fan. PST means each fan has an extra 4-pin connector so they can be daisy-chained together and all adjust speed in unison... if you want that. If you don't want to get that fancy with PWM/PST, the price drops without those features.
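For the curious, this is roughly what your motherboard's fan curve does with those PWM fans: it maps a temperature to a duty cycle (the percentage of "on" time in the control signal), and the fan spins proportionally. The temperatures and duty limits below are made-up example values, not Arctic's specs:

```python
# Toy illustration of a PWM fan curve. A 4-pin fan's speed is set by the
# duty cycle of its control signal; the motherboard picks the duty cycle
# from a temperature-to-speed curve like this one. All numbers are
# hypothetical examples.

def fan_duty_percent(case_temp_c, idle_temp=30, max_temp=60,
                     min_duty=20, max_duty=100):
    """Linearly ramp duty cycle between an idle and a max temperature."""
    if case_temp_c <= idle_temp:
        return min_duty          # quiet floor at idle
    if case_temp_c >= max_temp:
        return max_duty          # full speed when hot
    frac = (case_temp_c - idle_temp) / (max_temp - idle_temp)
    return round(min_duty + frac * (max_duty - min_duty))

print(fan_duty_percent(30))  # 20  (near-silent at idle)
print(fan_duty_percent(45))  # 60  (halfway up the ramp)
print(fan_duty_percent(70))  # 100 (full blast under load)
```

With PST daisy-chaining, all five fans follow this same curve from one motherboard header.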
"Best GPU" will rely on numerous factors:
What games will you play?
What resolution will you be running?
What refresh rate is your monitor(s)?
Adaptive Sync Monitor?
Current hardware specs?
Any interest in tweaking/overclocking?
There are lots of variables to account for, but my middle-of-the-road suggestion would be an AMD RX 570 4GB card. These easily outperform the GTX 1060 3GB variants in stock form and have the advantage of that extra 1 GB of video memory, which becomes quite important once the resolution and/or settings are turned up. The AMD Polaris cards (RX 400/RX 500 series) are very easy to tweak for an additional ~10% performance, which likely puts the 570 4GB ahead of the 1060 6GB cards. These 'mid-tier' RX/GTX cards are plenty for gaming up to 1440p/60Hz with little to no compromise in graphics settings. If you are willing to spend some time tuning all the various graphics settings, you would be surprised at the visual fidelity you can achieve while still getting great performance out of a 'mid-range' product. Don't get me wrong, having the biggest/baddest card available and simply maxing every setting out is nice...but that generally is not how it works for everyone.
A bit more between AMD/Nvidia and my rant will be over. Nvidia's major advantage is power draw/efficiency; there is no debating this. However, AMD's cards typically fare better in DX12 titles and generally age much, much better than their green counterparts, especially in the 'mid-range' market. Driver stability and release cadence between the two companies are pretty well even at this point. AMD was lacking in this department in the past, but they have stepped up their game big time recently with the release of their Crimson driver/software package. They have now incorporated their own overclocking/tweaking software, WattMan, and are on a very timely driver update schedule; roughly monthly if I'm not mistaken.
A quick blurb about monitors... The general consensus, AFAIK, is that AMD is easier to work with when it comes to running multiple monitors; this is just something I have seen mentioned numerous times in forums over the years. I can speak from experience with AMD: Eyefinity was always fantastic for my triple-wide setup, and it has only gotten better with Crimson. Presently I have a triple-monitor setup consisting of a 34" ultra-wide 2560*1080 75Hz monitor as my main, and two 1920*1080 60Hz panels above the ultra-wide. Setting them up so the desktop flows is painless with Crimson and Windows 10.
Last thing tying in monitors will be adaptive sync refresh rates, better known as G-Sync and FreeSync. They both do the same thing: syncing the FPS output of your GPU to your monitor's refresh rate to give you a very smooth, tear-free gaming experience. The monitor has a target range of Hz/FPS for you to hit, and the game will play smooth as butter even if you can't hit the monitor's maximum refresh rate all the time. AMD has a definite cost advantage in this department. Monitors with similar/identical specs are always cheaper in a FreeSync variant because AMD doesn't charge monitor manufacturers anything to use the tech...hence the "free" portion of FreeSync. Gaming at a lower refresh rate with an adaptive sync monitor is going to be a smoother, more enjoyable experience than a higher refresh rate without it. For example, my LG has a FreeSync range of 45-75Hz...so as long as my GPU is cranking out a minimum of 45 FPS, everything will look smooth and fantastic.
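The range idea is simple enough to sketch. Using the 45-75Hz window of the LG ultra-wide mentioned above, the monitor can match the GPU's output whenever the frame rate lands inside that window; outside it, you're back to tearing or stutter (or a fallback like LFC, which I'm leaving out here):

```python
# Minimal sketch of the adaptive-sync idea: output is tear-free whenever
# the GPU's frame rate falls inside the monitor's FreeSync window.
# The 45-75 Hz range matches the LG ultra-wide described above.

FREESYNC_MIN_HZ = 45
FREESYNC_MAX_HZ = 75

def in_sync_range(fps):
    """True if the monitor can match this frame rate with adaptive sync."""
    return FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ

for fps in (40, 45, 60, 75, 90):
    print(fps, "smooth" if in_sync_range(fps) else "tearing/stutter")
```

This is why a 45+ FPS floor is the real target on this monitor, rather than a locked 75.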