Old 01-09-2004, 05:40 PM   #1
mikeshn
Member
 
Registered: Feb 2002
Posts: 586

Rep: Reputation: 30
Large Number of files?


Can the Linux OS be crashed by a large number of files? Let's say a box with
50 million files in a single directory.
 
Old 01-09-2004, 06:15 PM   #2
jailbait
LQ Guru
 
Registered: Feb 2003
Location: Virginia, USA
Distribution: Debian 12
Posts: 8,346

Rep: Reputation: 552
"Can Linux OS can be crashed of large number of files?"

I don't think so. I think that you run out of disk space for file inodes before the file system logic fails because of too many file names.
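
A quick way to see that headroom is to look at the inode counters, the same numbers `df -i` reports. Below is a minimal Python sketch using `os.statvfs()`; the mount point argument is just an assumption for illustration.

Code:
#!/usr/bin/env python3
# Report inode usage for a filesystem, roughly what `df -i` shows.
import os
import sys

mount = sys.argv[1] if len(sys.argv) > 1 else "/"   # mount point to inspect
st = os.statvfs(mount)

total = st.f_files            # total inodes on the filesystem
free = st.f_ffree             # inodes still available
used = total - free

print(f"{mount}: {used} of {total} inodes used ({free} free)")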

I have run into a problem with the cp command: it can only be handed a certain number of file names in one invocation, and the man pages for Qt exceed that number.
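
For reference, the limit being hit there is the kernel's argument-length limit (ARG_MAX) on a single command line, which the shell runs into when a wildcard like `cp *` expands to too many names, producing "Argument list too long". Below is a minimal Python sketch of one workaround, batching the names so no single `cp` invocation exceeds the limit; the source and destination directories are assumed examples, not from the post.

Code:
#!/usr/bin/env python3
# Copy a directory full of files in batches so no single cp invocation
# exceeds the kernel's argument-length limit (the cause of
# "Argument list too long" when a wildcard expands to too many names).
import os
import subprocess

print("ARG_MAX on this system:", os.sysconf("SC_ARG_MAX"), "bytes")

src_dir = "/usr/share/man/man3"   # assumed: a directory with many files
dst_dir = "/tmp/man3-copy"        # assumed destination
os.makedirs(dst_dir, exist_ok=True)

names = [p for p in (os.path.join(src_dir, n) for n in os.listdir(src_dir))
         if os.path.isfile(p)]

BATCH = 1000                      # file names passed per cp invocation
for i in range(0, len(names), BATCH):
    subprocess.run(["cp", "--", *names[i:i + BATCH], dst_dir], check=True)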

___________________________________
Be prepared. Create a LifeBoat CD.
http://users.rcn.com/srstites/LifeBo...home.page.html

Steve Stites
 
Old 01-10-2004, 06:11 AM   #3
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
On a related note, this reminded me of the "AV DoS" that was recently on Bugtraq (again). AV scanners can scan compressed archives, but some scanners do not properly detect compressed archives nested inside other compressed archives, which in effect creates a decompression bomb, because they have to unpack the archives before scanning. While this might only keep the scanner busy indefinitely rather than crash Linux, it would also fill up the $TMP partition: if the scanner uses /tmp, other Linux apps lose the ability to use /tmp. If it uses /var/tmp (or /tmp is symlinked to /var/tmp), other apps lose /tmp and you eventually lose any system logging as /var fills up. If /var and /tmp live on the root filesystem, any writes could eventually be denied (kind of fun on reboot). Goes to show that a "large number of files" doesn't necessarily need to be fifty million to cause problems.
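
Below is a minimal sketch of the kind of guard a scanner could apply when unpacking nested archives: cap the nesting depth and the total bytes written, so a decompression bomb cannot fill $TMP. The limits and the function name are illustrative assumptions, not any real scanner's API.

Code:
#!/usr/bin/env python3
# Unpack zip archives recursively, but refuse to go deeper than MAX_DEPTH
# levels of nesting or to write more than MAX_TOTAL_BYTES in total.
import os
import zipfile

MAX_DEPTH = 3                 # assumed limit on archive-inside-archive nesting
MAX_TOTAL_BYTES = 100 << 20   # assumed cap on total extracted size (~100 MiB)


def safe_extract(archive_path, dest_dir, depth=0, budget=None):
    if budget is None:
        budget = {"remaining": MAX_TOTAL_BYTES}
    if depth > MAX_DEPTH:
        raise RuntimeError(f"archive nested deeper than {MAX_DEPTH}: {archive_path}")

    with zipfile.ZipFile(archive_path) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            # Charge the declared uncompressed size against the budget
            # before writing anything to disk.
            budget["remaining"] -= info.file_size
            if budget["remaining"] < 0:
                raise RuntimeError("extraction would exceed the size budget")
            target = os.path.join(dest_dir, os.path.basename(info.filename))
            with zf.open(info) as src, open(target, "wb") as dst:
                dst.write(src.read())
            # Recurse into nested archives, sharing the same budget.
            if zipfile.is_zipfile(target):
                safe_extract(target, dest_dir, depth + 1, budget)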
 
  

