So this post is asking for what, 144Hz or 1440p, on any type of monitor? That's a BROAD brush if you ask me. Buyers of one more than likely won't need the other, unless you want to break the bank on a monitor for your two 1080 Ti cards.
Given that the pixel density is considerably higher at these resolutions (offset some by the larger panel sizes on the monitors that aren't 21:9 @ 1440p), it's not easy to tell at a normal viewing distance whether any dead pixels exist, which I think is really the most important aspect of your question. A little math for you:
A 24" monitor at 1920x1080 (16:9) is illuminating ~2.07 million pixels (about 86,400 per diagonal inch)
A 27" monitor at 2560x1440 (16:9) is illuminating ~3.69 million pixels (about 136,533 per diagonal inch)
A 34" monitor at 2560x1440 (16:9) is illuminating ~3.69 million pixels (about 108,423 per diagonal inch)
A 34" monitor at 3440x1440 (21:9) is illuminating ~4.95 million pixels (about 145,694 per diagonal inch)
(Note these are total pixels divided by the diagonal size, not the linear "PPI" figure spec sheets quote, which is diagonal pixels over diagonal inches.)
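If you want to run the numbers for any panel yourself, here's a quick sketch; `panel_stats` is just a helper name I made up, and the sizes in the loop match the examples above:

```python
import math

def panel_stats(diag_in, w_px, h_px):
    """Return (total pixels, linear PPI) for a panel with the given
    diagonal size in inches and native resolution."""
    total = w_px * h_px
    # Standard PPI: pixels along the diagonal divided by diagonal inches
    ppi = math.hypot(w_px, h_px) / diag_in
    return total, ppi

for diag, w, h in [(24, 1920, 1080), (27, 2560, 1440), (34, 3440, 1440)]:
    total, ppi = panel_stats(diag, w, h)
    print(f'{diag}" {w}x{h}: {total:,} pixels, {ppi:.0f} PPI')
```

Interestingly, by the standard linear-PPI measure the 27" 1440p and 34" ultrawide land within a pixel of each other (~109 vs ~110), even though the ultrawide has far more total pixels.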
That being said, you've got to figure the probability of at least one dead pixel is going to be higher when there are simply more pixels that can be affected. Just to quell any rising fears from that last statement, here's Acer's dead pixel policy for the curious: http://www.acer.com/ac/en/IN/content/dead-pixel
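To put that "more pixels, more chances" intuition in numbers: if you assume every pixel fails independently with some small probability p (the 1e-7 rate below is purely a made-up figure for illustration, not a real defect rate), the chance of at least one dead pixel on an N-pixel panel is 1 - (1 - p)^N, which grows with N:

```python
def p_any_dead(total_pixels, p_dead=1e-7):
    """Probability of at least one dead pixel, assuming each of the
    total_pixels pixels is independently dead with probability p_dead.
    (p_dead here is an illustrative guess, not a manufacturing spec.)"""
    return 1 - (1 - p_dead) ** total_pixels

# A 3440x1440 ultrawide has ~2.4x the pixels of a 1080p panel,
# so under this toy model it is noticeably more likely to ship with a defect.
print(p_any_dead(1920 * 1080))   # 24" 1080p
print(p_any_dead(3440 * 1440))   # 34" ultrawide
```

Whatever the true per-pixel rate is, the ordering holds: bigger resolutions mean better odds of catching a defect in the panel lottery.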
Anecdotally speaking, my X34 that arrived last week is flawless. No noticeable dead pixels, no scan-line problems, next to no bleed (a little in the top right that's only visible when looking for it on black backgrounds), and it clocks fine to 100Hz. Similarly, the XG270HU that I got my brother for Christmas didn't appear to have issues for the handful of hours I used it either (although I didn't realize before getting it that the Tahiti architecture doesn't support FreeSync; alas).
Hope that helps!