Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
Distribution: Debian Wheezy, Jessie, Sid/Experimental, playing with LFS.
Posts: 2,900
Quote (Originally Posted by EDDY1): Post the suggestion.
Close the thread; it's old, and three years later there are better and easier (for some) ways of doing what the OP tried to tell people about.
I had one that was way over his 15k lines, and it never slowed my browsing down (dial-up did that for me). I even posted on the Ubuntu forums asking how to remove duplicates from it. Hosts files have their place, and so do other tools; educating people about each one so they can choose for themselves is the key. I don't use one anymore, but learning about them helped me understand what happens as soon as you open a browser and what continues to happen as you use the net.
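Removing duplicates from a large hosts file, as mentioned above, is easy to script. Here is a minimal sketch (the example entries and the 0.0.0.0 sink address are illustrative assumptions, not taken from the thread):

```python
def dedupe_hosts(lines):
    """Return hosts-file lines with duplicate (address, hostname) entries removed."""
    seen = set()
    out = []
    for line in lines:
        entry = line.split("#", 1)[0].strip()  # ignore trailing comments
        if not entry:
            out.append(line)  # keep blank and comment-only lines as-is
            continue
        key = tuple(entry.split())  # (address, hostname, aliases...)
        if key not in seen:
            seen.add(key)
            out.append(line)
    return out

example = [
    "0.0.0.0 ads.example.com",
    "0.0.0.0 ads.example.com",  # duplicate, dropped
    "0.0.0.0 tracker.example.net",
]
print(dedupe_hosts(example))
```

The order of the remaining entries is preserved, so the cleaned file stays diffable against the original.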
I view the subject of ads or unwanted content on a web page the same way I view OS/application security: my OS will be subject to an attack at some point no matter what I do; I can only limit my exposure. If you feel you have found some way to block content, there will be some new means to send junk to your browser. The worst offenders rely on newly found holes in security. The really bad ones might get past any controls, because they know someone is trying to prevent them.
Sure, but there's a difference between just "believing" ("I think", "don't worry", "I guess") something is secure and proper hardening with regular auditing of exposure. And since you're linking this to security, maybe you can understand that relying on /etc/hosts is like valuing Psionic PortSentry over Snort. PortSentry is everything Snort is not: of limited use, inefficient, prone to false positives, obsolete, no longer maintained, etc. But just as there are still people stubbornly advertising /etc/hosts as an efficient and accurate method of blocking ads, there are still people stubbornly installing PortSentry, even in 2012. Part of that may be due to people not realizing that some HOWTOs on the net are utterly deprecated, and part of it may be due to, what shall we call it, some form of perverse ignorance?
I just cannot understand why, with all that we can use these days, people still maintain that blocking IP addresses is "good enough" when every simple test shows it just is not...
The hosts file at this site is still maintained. I'm lazy, and I use it on the Windows machines of clients and some friends, which tend to fill up with viruses within a week of a clean install. It gets rid of quite a lot of unwanted malware (though I never tested whether the large hosts file affects speed).
Well, to be exact, the OP was talking about blocking ads; the scope of security was not part of that question. One can do a number of things to help secure a system, but anyone who thinks they can fully secure a system from hackers and still connect it to the internet may be a bit out of step with reality. One can greatly limit vulnerability, but in my 40 years of technical work I have not seen any OS that is immune.
A hosts file is only one part of a multi-part solution for ads and some malware sites.
One could try removing DNS and using a hosts file for only their common web pages, but a complete blocking of all other IP addresses would still be needed, and all other security features would need to be kept up to date.
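As a concrete sketch of that dual use (every hostname and address below is made up for illustration, not taken from the thread), a single hosts file can both pin the few sites you visit regularly and sink known ad or tracker hostnames:

```
# /etc/hosts - illustrative fragment only
127.0.0.1       localhost

# Pin commonly used sites to a known address (addresses here are invented)
93.184.216.34   www.example.com

# Point ad/tracker hostnames at an unroutable sink so they never resolve
0.0.0.0         ads.example.net
0.0.0.0         tracker.example.org
```

Note that 0.0.0.0 is often preferred over 127.0.0.1 as the sink, since it fails immediately instead of knocking on a local web server that may be listening.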
Some of my favorite web pages have ads that I am interested in, so there is good and bad to this. One might find out about a free Unix book from an ad; they are not all bad.
Well, since the thread is 3 years old - as previously noted - AND things may have changed since then AND the ongoing discussion seems to lead nowhere, let it rest in peace. Closed.