LinuxQuestions.org > Forums > Non-*NIX Forums > General

General: this forum is for non-technical general discussion, which can include both Linux and non-Linux topics.
Old 03-16-2004, 05:13 PM   #16
snacky
Member
 
Registered: Feb 2004
Distribution: Debian
Posts: 286

Rep: Reputation: 30

I'm someone who has no problem getting his hands dirty, but I almost never compile apps myself anymore. It's mostly a waste of time, and it can easily leave you with a system that's an unmaintainable mess.

This is an issue that I've swung back and forth on. When I first discovered Debian and apt-get, it caused me to swing way over into the "wimp" camp of almost never downloading source. apt-get was just so easy, every time.

Then, more recently, I got into a lot of movie-related software where I would sometimes want to build from the CVS head for best results. It started with a few packages... then it moved on to the libraries they depend on, and finally it became such an ingrained habit that I nearly quit using apt-get for a while.

I know some people like to claim that "./configure && make && make install" always works, but I've gotten into unbelievably huge webs of dependencies trying to compile stuff myself. What starts out as one package turns into three, which turns into about ten... Then, once you get everything you need, if ONLY ONE of them fails to compile, you're ruined. And there are a lot of otherwise useful programs out there that are either amateurishly maintained ("./configure && make && make install" either isn't their paradigm, or the configure scripts and makefiles aren't designed well enough) or poorly tested. You'd be surprised how often the devs themselves don't even know what version of the header files they're including; they just know it compiles fine on their own systems!
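The spiral described here tends to surface right at the configure step. A sketch of the familiar failure mode (the package and library names are made up for illustration):

```shell
tar xzf foo-1.2.tar.gz && cd foo-1.2
./configure
#   checking for libbar >= 0.9... no
#   configure: error: libbar not found
# ...so you fetch and build libbar, whose configure then wants libbaz,
# and the one package you wanted has become three, then ten.
```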

I've been there and done that. I've traversed huge trees of dependencies. I've read all the docs and used the ./configure --with options. Then I've manually edited Makefiles, and manually inspected function prototypes in the includes because there's sometimes NO WAY to find out what versions are in use (and like I said, often even the devs can't keep track of all this crap!).

After all this effort, I was rewarded with a system that doesn't run noticeably faster, plus Debian's apt has NO IDEA which of my dependencies are already met (i.e., it doesn't know about many of the programs, development files, and runtime libraries I have). Trying to use apt-get resulted in tons of dependency headaches that only got harder to fix with each iteration. This is no small matter considering how many people rely on these systems for security updates! So it's not purely a matter of taste.

Eventually I did a somewhat daring apt-get remove and then reinstalled what I needed. Fortunately it only took two commands, but if it hadn't worked I'd seriously be considering a full reinstall right now, because manually fixing all the problems would truly have been that much work.

In a lot of cases, it's a myth that self-compiling increases speed noticeably or even measurably. In only a few of those cases is it actually worth the effort. I now self-compile a rather small number of applications that are unusually responsive to optimization, and when possible I use "apt-get source" to do it. Once you get to a certain level of dependency hell, self-compiling every damn thing is just work that generates MORE work, which in turn generates more work still: a never-ending spiral that gets worse and worse and becomes very hard to fix.
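For the handful of packages worth optimizing, the "apt-get source" route keeps dpkg in the loop instead of installing behind its back. A rough sketch (the package name is just an example; this assumes deb-src lines in sources.list and the dpkg-dev tools installed):

```shell
su -c 'apt-get build-dep xmms'   # let apt install the build dependencies
apt-get source xmms              # fetch and unpack the Debian source package
cd xmms-*/
dpkg-buildpackage -us -uc -b     # build an unsigned binary .deb
su -c 'dpkg -i ../xmms_*.deb'    # install it, so dpkg/apt know it exists
```

Because the result goes in as a real .deb, apt still sees a consistent dependency database afterward.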

I can see following maybe half a dozen pieces of software and knowing all about their dependencies and possible conflicts. But the average modern system has at least a couple hundred separate packages with a dense web of dependencies and potential conflicts. From experience, it is only very rarely worth the time to delve into this stuff yourself. apt-get handles it beautifully about 99.9% of the time, which is noticeably better than my own 95% success rate, and it takes radically less time.

Computers were meant to save labor and do things more quickly, not to create huge amounts of drudgery...
 
Old 03-16-2004, 08:32 PM   #17
watashiwaotaku7
Member
 
Registered: Oct 2002
Location: wisconsin -- The Badger state
Distribution: gentoo
Posts: 654

Rep: Reputation: 30
I've actually noticed a lot of improvement compiling things myself over using binaries, especially with OpenOffice, which doesn't even take aggressive CFLAGS very well. I've noticed it in other packages too: if a program is compiled with the proper CFLAGS (enough to make it optimized, but few enough that the resulting binaries stay small enough to be quickly read), it can be MUCH faster. I have never tried apt-get, though, and will likely never try it on my laptop. I'm getting a new computer in about two days with more than enough space, and I'll try it then. Until that time, I have simply fallen in LOVE with Portage: how easy it is, how little time it takes, and how fast and well-configured the resulting system is.
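The "proper CFLAGS" balance described above is set globally on Gentoo in /etc/make.conf, and Portage applies it to every build. A conservative example for an Athlon XP-era box (the exact -march value depends on your CPU):

```shell
# /etc/make.conf (Gentoo): flags Portage passes to every compile
CHOST="i686-pc-linux-gnu"
CFLAGS="-O2 -march=athlon-xp -pipe"   # optimized, but not so aggressive that binaries bloat
CXXFLAGS="${CFLAGS}"
```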
 
Old 03-18-2004, 07:47 AM   #18
snacky
Member
 
Registered: Feb 2004
Distribution: Debian
Posts: 286

Rep: Reputation: 30
Yeah, good points. Just to be clear, Gentoo is a fine approach. What leads to maintenance headaches is downloading source tarballs all the time. Gentoo handles the dependency issues for you, which is the important thing.
 
Old 03-18-2004, 12:59 PM   #19
Brane Ded
Member
 
Registered: Nov 2003
Location: over there
Distribution: Debian Testing
Posts: 191

Rep: Reputation: 30
Quote:
I know some people like to claim that "./configure && make && make install" always works, but I've had times where I've gotten into unbelievably huge webs of dependancies trying to compile stuff myself.
In Debian? Well, yeah, you should have pretty much expected something like that to happen. Debian isn't an ideal distribution for compiling source code. About the only libraries it has are the ones it needs. I can't remember the last time I had a dependency issue in Slack. The worst that's ever happened is a program failing to compile, but that was crappy code anyway.

Quote:
After all this effort, I was rewarded with a system that doesn't run noticeably faster, plus Debian apt has NO IDEA what dependencies I have that are met (i.e., it doesn't know many of the programs, development files, and runtime libraries I have).
Unless you've compiled most of the software you're running and optimised it for your system, you really shouldn't expect a noticeable speed increase. Also, APT isn't designed to work alongside source compiled like that. Unless you have no alternative, the "Debian way" is to just use APT to download and install everything.

Quote:
From experience, it is only very rarely worth the time to delve into this stuff yourself. apt-get handles it beautifully about 99.9% of the time, which is noticeably better than my own 95% success rate plus it takes radically less time.
If APT works, by all means keep using it. APT's a nice system; when I used it a while back, I couldn't find much to complain about.

Quote:
Computers were meant to save labor and do things more quickly
They've been working for me.
 
Old 03-18-2004, 03:13 PM   #20
Nukem
Member
 
Registered: May 2003
Location: Canada, TO.
Distribution: Slackware: in progress, Mandrake 9.2, Libranet, Vector
Posts: 373

Rep: Reputation: 30
The thing I hate is when I try to install A, and it needs B.
So to install A, I go to install B. But to install B, I must have C.
And now, to install B, to install A, I'm trying to install C. But to install C, I must have B.
wtf?? I can't install B because I need C. But I can't install C without having B.
In the end I just get pissed off, thinking that all I needed was A.
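What Nukem is describing is a genuine dependency cycle (B needs C, C needs B), which by-hand resolution can never finish. A toy resolver in plain sh (package names A, B, C are hypothetical) shows how keeping a "currently resolving" stack exposes the loop:

```shell
#!/bin/sh
# Hypothetical dependency table: A needs B, B needs C, C needs B again
deps() {
  case "$1" in
    A) echo "B" ;;
    B) echo "C" ;;
    C) echo "B" ;;
  esac
}
# resolve() walks the deps; anything already on $STACK means we looped
resolve() {
  case " $STACK " in *" $1 "*) echo "cycle detected at $1"; return 1 ;; esac
  case " $DONE "  in *" $1 "*) return 0 ;; esac
  STACK="$STACK $1"
  for d in $(deps "$1"); do
    resolve "$d" || return 1
  done
  STACK=${STACK% $1}   # pop ourselves off the stack
  DONE="$DONE $1"
}
STACK=""; DONE=""
resolve A || echo "cannot build A"
```

Real package managers do exactly this kind of graph walk for you, which is why apt or Portage can at least report the cycle instead of sending you around it forever.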
 
Old 03-18-2004, 03:50 PM   #21
charon79m
Member
 
Registered: Oct 2003
Distribution: Just about anything... so long as it is Debian based.
Posts: 297

Rep: Reputation: 30
I must admit that I've given in to the ease and availability of the RPM.

APT is great; I exclusively run Debian on my servers and utilize APT a lot, but sometimes it's necessary to compile if you want a newer version than is available via APT.

All in all, I'm just happy to have a community that makes its software so readily available and designs applications and packages so that it's easy for the user to install and update.
 
Old 03-18-2004, 08:47 PM   #22
Squall
Member
 
Registered: Jan 2004
Location: The land of the free and the home of the brave
Distribution: Slack 10
Posts: 239

Original Poster
Rep: Reputation: 31
BOO! RPMs suck! They give you ZERO choice in any of the ./configure options. Say you want to set your own default terminal for, say, ratpoison. Well, TOO BAD, because the makers of the RPM know more than you, or so it seems.
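The complaint here is about build-time options: a binary package bakes those choices in, while a source build lets you pick them yourself. A sketch of where those knobs live (--disable-nls is just a common autoconf example; the options on offer vary per package):

```shell
./configure --help                              # lists every build-time option the package offers
./configure --prefix=/usr/local --disable-nls   # choose your own paths and features
make
su -c 'make install'
```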
 
  

