LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   Fatal weakness in Linux (https://www.linuxquestions.org/questions/linux-software-2/fatal-weakness-in-linux-330516/)

cov 06-05-2005 05:24 AM

Fatal weakness in Linux
 
You know, in Windows, you double-click the installer programme and your new software gets loaded in. Alright, you might have a DirectX incompatibility, or be missing a dll, but that's about it.

With Linux you cannot install a piece of software without a list of incompatibilities as long as your arm.

Try and install a programme and you will find that it needs libthis or libthat, or plib-something-or-other. And if you don't have internet access (like you're living in the developing world), then you have to make repeated trips to the internet cafe to pick up the missing file(s).

There are hundreds of libraries that handle (for example) JPEGs, and if you don't have the one the programmer used when he wrote the software, well, you are just going to have to get it if you want to use it. And it has to be that precise library, not some other that may have the same API. And tough on you if you happen to get the correct library but the wrong version number...

Sorry about my rant, but I really believe that, before Linux can be accepted as a mainstream desktop platform, this issue needs to be addressed.

I have tried to install 'FlightGear', but find that I need 'plib'. On visiting the internet cafe for 'plib', I find I need 'SimGear'. OK, so I go back and get SimGear, but it needs OpenAL. No problem, OpenAL is on my Suse 9.2 CDs as an RPM. I install that. But SimGear can't recognise that it's been installed. So I have to go back to the internet cafe and look at the OpenAL site. But you can only install through CVS. Why do I bother? Why don't I just go back to using Windows? There's a nice little Windows setup EXE on the FlightGear website that I could have downloaded without having to go through this headbanging.

Suse (or some other distro) should try to standardise these things so that developers, if they want to see their apps in the mainstream, will use standard modules.

Cov

reddazz 06-05-2005 05:45 AM

Linux distros provide package management tools such as apt, yum, urpmi and yast. These tools have the ability to automatically resolve and install dependencies, so use them wherever possible instead of using your current method. If you don't have a net connection then this is obviously going to be a big problem, and unfortunately you will have to cope with all the problems associated with manually downloading and installing packages.
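For instance, with a repository configured, a single command pulls in the whole dependency chain. This is a sketch only: whether FlightGear is actually packaged in a given distro's repositories is an assumption on my part.

```shell
# Sketch only: run as root. The 'flightgear' package name is an example,
# and availability depends on your distro's configured repositories.
apt-get install flightgear   # Debian-based: APT fetches plib, SimGear, etc.
yum install flightgear       # Fedora/Red Hat
urpmi flightgear             # Mandriva/Mandrake
```

Each tool works out the dependency chain and installs everything in one go, which is what makes the comparison with double-clicking a Windows installer fair once a repository is set up.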

cov 06-05-2005 05:58 AM

Reddazz,

Thanks for your reply.

But, by implication, you seem to suggest that only First World countries should have access to Linux, i.e. only those where internet access is commonplace. Furthermore, there must be some of you in the First World who don't have internet access? No? What about them? Most of the tools you mention are pretty toothless if they aren't online.

I can see there are benefits to using YaST and APT etc., I'm not disputing that. All I'm saying is that the dependencies should be better organised.

I'm saying that these incompatibility issues are a major flaw in Linux and should be addressed.

Cov

Ephracis 06-05-2005 06:04 AM

This is not a flaw, this is good (my opinion). Shared libraries have a benefit: they make programs smaller. Instead of installing the same library over and over again, you install it once, and then all the programs that need that library use it, so the library is shared between all the programs. Some distros are made very small and then it is up to the user to choose what libraries should be installed. The philosophy is that the shape of the operating system should be determined by the user, not the other way around.
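You can see this sharing directly on any Linux box: `ldd` lists the shared libraries a binary depends on (`/bin/ls` is used here just as a convenient example binary).

```shell
# Print the shared libraries a dynamically linked binary depends on.
# /bin/ls is just an example; any dynamically linked program works.
ldd /bin/ls
# The output includes a line along the lines of:
#   libc.so.6 => /lib/libc.so.6 (0x...)
# Every program on the system that needs libc points at that one shared copy.
```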

And also, as reddazz said, you can use a package manager and then you just "double-click the [installer] programme and your new software gets loaded in." Just as easy as Windows.

Regards.

oneandoneis2 06-05-2005 06:14 AM

That's not really the point - most good package managers can resolve dependencies and tell you exactly what is needed to install a given package. Just get that list, download the dependencies, and install them. No need for repeated trips to a net cafe.

e.g. on my Gentoo install:
Quote:

root # emerge -p flightgear

These are the packages that I would merge, in order:

Calculating dependencies ...done!
[ebuild N ] media-libs/openal-20040817
[ebuild N ] dev-games/simgear-0.3.8
[ebuild N ] games-simulation/flightgear-0.9.8
Those are the three things I need to download in order to install. One trip to an Internet connection later, and I'd have all three. What's the problem?

LinuxSeeker 06-05-2005 07:14 AM

Most libraries you need are usually on your distribution's CDs.

reddazz 06-05-2005 07:14 AM

Quote:

Originally posted by cov
Reddazz,

Thanks for your reply.

But, by implication, you seem to suggest that only First World countries should have access to Linux, i.e. only those where internet access is commonplace. Furthermore, there must be some of you in the First World who don't have internet access? No? What about them? Most of the tools you mention are pretty toothless if they aren't online.

I can see there are benefits to using YaST and APT etc., I'm not disputing that. All I'm saying is that the dependencies should be better organised.

I'm saying that these incompatibility issues are a major flaw in Linux and should be addressed.

Cov

I think you didn't fully understand my post. I'm not suggesting that only first world countries should have access to Linux. I am from a third world country myself, so I do understand that lack of a net connection can be a problem.

As mentioned by others, if you want to install a package you have to check the dependencies and install them. The method that you use to determine this will differ from distro to distro. The package management tools I suggested don't work only when there is a net connection; APT, YaST and urpmi work fine with CD sources if configured to do so.
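On an APT-based setup, for example, the installation CDs themselves can be registered as package sources. A sketch follows; the disc label shown is hypothetical and is normally filled in by the tool itself.

```shell
# Register an installation CD as a package source (run as root, once per
# disc). apt-cdrom scans the disc and adds it to /etc/apt/sources.list:
apt-cdrom add
# which appends an entry of the form (the label here is hypothetical):
#   deb cdrom:[Distro 9.2 Disc 1]/ ./
# After that, 'apt-get install <package>' can satisfy dependencies from
# the CDs without any net connection.
```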

The reason why I mainly mentioned the internet is because if you have an internet connection you will have access to a lot more software repositories, and obviously you will spend less time trying to resolve dependencies because that will be done for you automatically.

cov 06-05-2005 09:24 AM

Quote:

Originally posted by Ephracis
This is not a flaw, this is good (my opinion). Shared libraries have a benefit: they make programs smaller. Instead of installing the same library over and over again, you install it once, and then all the programs that need that library use it, so the library is shared between all the programs.
Regards.

Yes, agreed, it is definitely the way to go.

However, can we not rationalise the whole process so that you don't get multiple libraries performing the same function? Then, should you need to install another app, the chances are that the required dependencies already exist.

As regards double-clicking deb files or RPMs, yes, in theory that does work (although YaST2 on my system complains heavily), but the dependencies aren't resolved until you try to install.

cov 06-05-2005 09:29 AM

Quote:

Originally posted by reddazz
I think you didn't fully understand my post. I'm not suggesting that only first world countries should have access to Linux. I am from a third world country myself, so I do understand that lack of a net connection can be a problem.
Yes, I did, I was being deliberately obtuse! ;)

Quote:

As mentioned by others, if you want to install a package you have to check the dependencies and install them. The method that you use to determine this will differ from distro to distro. The package management tools I suggested don't just work only when there is a net connection. APT, YAST and urpmi work fine with CD sources if configured to do so.
Yes, I understand that. However, I do have CDs full of lib files etc. needed to fulfill dependencies. Setting it up so that Synaptic or YaST can catalogue the dependencies is a bit of a minefield.

Quote:

The reason why I mainly mentioned the internet is because if you have an internet connection you will have access to a lot more software repositories, and obviously you will spend less time trying to resolve dependencies because that will be done for you automatically.
Yes, and that is partly my gripe!

cov 06-05-2005 09:35 AM

Quote:

Originally posted by LinuxSeeker
Most libraries you need are usually in your distribution's CD's.
I cannot agree.

I have a Suse 9.2 distro with 5 CDs.

And I also have 2 or 3 CDs with additional lib files needed to fulfill dependencies not covered by the 5 CDs.

Furthermore, although YaST says that I have OpenAL installed, I cannot install SimGear because it claims that I do not.

There might be a version problem, I don't know, but it appears that OpenAL is installed via CVS, which is a damn headache.

Ephracis 06-05-2005 09:50 AM

Quote:

Originally posted by cov
However, can we not rationalise the whole process so that you don't get multiple libraries performing the same function? Then, should you need to install another app, the chances are that the required dependencies already exist.
Do you have any example of two or more libraries that do the same thing?

Quote:

Originally posted by cov
As regards double-clicking of deb files or rpms, yes, in theory that does work (although YAST2 on my system complains heavily), but the dependencies aren't resolved until you try to install.
IIRC, with both RPM-based managers (like YaST) and deb-based ones (apt-get), you can choose to download a package and all its dependencies without installing them. That might require some parameters and options, a little more than just double-clicking, but you can then just download your app of choice with all its dependencies, copy it to your USB stick or whatever, and then install it at home.
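On an APT system, the download-only behaviour described above is the `-d`/`--download-only` flag. A sketch, assuming a Debian-style system; the package name is just an example.

```shell
# At the internet cafe: fetch a package and its not-yet-installed
# dependencies without installing anything ('flightgear' is an example).
apt-get --download-only install flightgear
# The fetched .deb files are left in APT's package cache:
ls /var/cache/apt/archives/
# Copy them to removable media; at home, install them with:
#   dpkg -i *.deb
```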

There is one problem though. You might have some dependency on the list that will work on the box at the computer café (it might already be installed) but when you try to install it at home you do not have the same setup as the box at the café and therefore you may need more things than you first thought (computers still have limited intelligence).

For example: you want to install "package" and that depends on "depend1" and "depend2". At home you don't have either "depend1" or "depend2", but the box at the café has "depend2". Some managers will only show "depend1" since the other one is already installed, so you must choose to view all of the dependencies, even already installed ones.

When you get home you want to install "package" and "depend1/2". But when you install "dependX" you see that it depends on "d'oh", and therefore you must go back and download that, too. I can't see how this problem would be resolved without having a REALLY big list of all dependencies on each and every 'level' for "package". That would be a nasty solution, but short of abandoning shared libraries I can't see any fix other than giving each and every person fast internet access, so you don't have to go down to the internet café all the time. :P

craigevil 06-05-2005 09:52 AM

"Because different pieces of software have different dependencies - the most common being different compiler versions - this often leads to a conflict between the software that is required and the software that is installed. It is possible to get into a vicious circle of dependency requirements, or - possibly worse - an ever-expanding tree of requirements as each new package demands several more be installed.

Though the concept of dependency hell has been very common since the rise of package managers like APT and RPM, it's actually about as old as UNIX itself. Most software that is distributed via tarballs of the source code contain a configure script, which takes inventory of what your system will allow the compiler to do, and sometimes also checks for other software installed on the system that it might require for operation. Like RPM's dependency-borne snowball effect, this can also lead to hunting down and installing various versions of extra software, just so the configure script of the first tarball to be installed will be satisfied (it'll error out and quit if it doesn't find something it needs). With the rise of package-based operating systems, this form of dependency hell is becoming less common, but will likely be around for quite a while, depending on how much longer current operating systems that don't use package management software can go without the convenience and relative speed of such software. Despite that convenience, not all software developers will distribute their software via RPM and its ilk, which ensures that tarball-based software will live on indefinitely."
http://en.wikipedia.org/wiki/Dependency_hell
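The tarball routine the quote refers to usually looks like this; the archive and directory names are generic placeholders, not a real package.

```shell
# Generic source build from a tarball; 'some-program-1.0' is a placeholder.
tar xzf some-program-1.0.tar.gz
cd some-program-1.0
./configure        # probes the system; this is the step that errors out
                   # if a required library or header is missing
make               # compile the source
make install       # install, usually as root
```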

And yes, the easiest way to solve these problems is having internet access and using a package manager. The same issue applies to Windows updates: the easiest and fastest way to update is by using the internet. It doesn't mean you absolutely have to have an internet connection to install or use Linux, but it does make things easier. The internet is the future.

Package management system - Wikipedia, the free encyclopedia
http://en.wikipedia.org/wiki/Package_manager

"Hacking in Iraq, Interview with Jake Appelbaum"
http://www.makezine.com/blog/archive...g_in_iraq.html

Ephracis 06-05-2005 11:05 AM

I have noticed that Linux is actually more or less (dunno with the latest distros) adapted to doing stuff over the internet. The first priority in early distros was to get the network up and running. Debian even wants to update and install some packages over the internet during the installation; installing Gentoo requires that you either have internet access during the installation or that you have at least tracked down the packages you will need and copied them for offline installation, and so on and so on.

Today you need Internet!
This can be a bad thing for our fellow guys in the developing world but I hope that soon the Internet will be something we can all use and enjoy.

jtshaw 06-05-2005 01:28 PM

Cov,
Your problems here are really the nature of the beast when you have an OS that millions of different people develop on.

There are a ton of libraries out there. The major ones (glibc, glib, gtk+, qt, etc.) are pretty much always included in every Linux distribution. The problem is the small libraries that only one or two packages use. If distributions were to include them all, everyone would complain they were bloated with their massive 10GB installs. Most maintainers balance them as best as they can: they choose one of each type of app and stick with it. If you want anything else, they often provide it through their package management system, but that will pretty much require having an internet connection.

There is always the static binary option, but there are very big technical reasons for using dynamic linking to libraries over static linking. Not only does static linking use obscene amounts of disk space, but if you have two programs statically linked against the same library, you end up with two copies of that library in memory instead of just one, which is horrible for performance.

Unix and *nix operating systems have always relied heavily on networking, and probably always will...

Komakino 06-05-2005 02:22 PM

You say "With Linux you have a list of dependencies as long as your arm...". Well, you might, but I find this is seldom the case, because I don't use packages; I build everything from source. It seems to me that you've had a brief foray into Linux and this is your first impression, and you've not yet discovered that there are multiple (and better) ways of achieving the same aim.
Personally I don't like the Windows model: I want to know what's being installed and where. With source code and makefiles I get this. I've been against RPM since I first used it. I think it gives new users a bad impression of Linux and steers them clear of compiling from source (as if it's difficult to do, which it isn't). I think a lot of people would find the whole Linux experience less frustrating if they never discovered RPM!

Also, the Windows paradigm works because you can bet that most (if not all) Windows users will have the same versions of libraries as each other (with the possible exception of DirectX), and therefore binaries can be compiled and distributed linked against those libraries. With Linux I'm free to upgrade to the latest version of library_X whenever it is released and have all new programs link against that. I think that's a far better idea than being stuck with whatever Microsoft included when they bundled (or bungled?!) their OS onto a CD for me (which I'm glad they did, I always need coasters :))

