I'm someone who has no problem getting his hands dirty, but I almost never compile apps myself anymore. It's mostly a waste of time, and it can easily leave you with a system that's an unmaintainable mess.
This is an issue that I've swung back and forth on. When I first discovered Debian and apt-get, it caused me to swing way over into the "wimp" camp of almost never downloading source. apt-get was just so easy, every time.
Then, more recently, I got into a lot of movie-related software where I would sometimes want to download the CVS head for best results. It started with a few software packages... then it moved on to libraries connected with these software packages, and finally it became such an ingrained habit that I nearly quit using apt-get for a while.
I know some people like to claim that "./configure && make && make install" always works, but I've gotten into unbelievably huge webs of dependencies trying to compile stuff myself. What starts out as one package turns into three, which turns into about ten... Then, once you have everything you need, if ONLY ONE of them fails to compile, you're ruined. And a lot of otherwise useful programs out there are either amateurishly maintained ("./configure && make && make install" either isn't their paradigm, or the configure scripts and makefiles aren't designed well enough) or poorly tested. You'd be surprised how often the devs themselves don't even know what version of the header files they're including; they just know it compiles fine on their own systems!
I've been there and done that. I've traversed huge trees of dependencies. I've read all the docs and used the ./configure --with options. Then I've manually edited Makefiles, and manually inspected function prototypes in the includes because there's sometimes NO WAY to find out what versions are in use (and like I said, often even the devs can't keep track of all this crap!).
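Just so we're all picturing the same ritual, it usually goes something like this (the --with flags here are invented examples; every package has its own set, which is half the problem):

    ./configure --prefix=/usr/local --with-sdl --with-xv   # pick flags out of ./configure --help
    make                                                   # cross your fingers
    make install                                           # as root

And that's the happy path. The editing-Makefiles-by-hand part starts when configure guesses wrong.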
After all this effort, I was rewarded with a system that doesn't run noticeably faster, and Debian's apt has NO IDEA which dependencies I've already satisfied (i.e., it doesn't know about many of the programs, development files, and runtime libraries I installed by hand). Trying to use apt-get led to tons of hard-to-fix dependency headaches that sometimes got harder to fix with each iteration.
This is no small matter considering how many people use these systems to get security updates! So it's not purely a matter of taste.
Eventually I did a somewhat daring apt-get remove and then reinstalled what I needed. Fortunately that took only two commands and not much time, but if it hadn't worked I'd be seriously considering a full reinstall right now, because manually fixing all the problems would truly have been that much work.
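For the curious, it was roughly this shape; the package names below are stand-ins, not the ones I actually touched:

    apt-get remove --purge libfoo-dev   # hypothetical: yank the package causing the conflict
    apt-get install somefrontend        # hypothetical: reinstall, letting apt pull clean dependencies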
For most software, it's a myth that self-compiling increases speed noticeably, or even measurably; only in a few cases is it actually worth the effort. I now self-compile a rather small number of applications that are unusually responsive to optimization, and even then I usually use "apt-get source" to do it, if possible. Once you get to a certain level of dependency hell, self-compiling every damn thing is just a lot of work, which brings you even MORE work, which in turn generates more and more WORK for you to do! It's a never-ending spiral that gets worse and worse and becomes very hard to escape.
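For anyone who hasn't tried it, the apt-get source route looks roughly like this (mplayer is just a stand-in package name, and I'm sketching from memory, so check the man pages):

    apt-get build-dep mplayer             # install the build dependencies apt already knows about
    apt-get source mplayer                # fetch and unpack the Debian source package
    cd mplayer-*/                         # tweak debian/rules here if you want optimization flags
    dpkg-buildpackage -rfakeroot -us -uc  # build a proper (unsigned) .deb
    dpkg -i ../mplayer_*.deb              # install it, so dpkg still knows exactly what you have

The big win is that last line: the result is a real package, so apt doesn't lose track of your system the way it does after a bare "make install".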
I can see following maybe half a dozen pieces of software and knowing all about their dependencies and possible conflicts. But the average modern system has at least a couple hundred separate packages with a dense web of dependencies and potential conflicts. From experience, it is only very rarely worth the time to delve into this stuff yourself. apt-get handles it beautifully about 99.9% of the time, which is noticeably better than my own 95% success rate, and it takes radically less time.
Computers were meant to save labor and do things more quickly, not to create huge amounts of drudgery...