'hosts' file


Personally, I think you should abandon the idea of doing this locally in the hosts file (which only affects the one device) and go with a DNS-based solution. Quad9 is built around the idea of blocking DNS resolution for problem sites. I've never used a paid DNS, but I believe they also support this kind of management (NextDNS, for example, but I saw this Reddit post).
 
Are you using this list for your hosts file?
No, just using like Pholder and not using the default DNS, also using Ghostery and Privacy Badger, which do a good enough job of stripping most of those lookups before they even reach DNS to get resolved, and remove a lot of the annoying ads. That and the DuckDuckGo browser on mobile help a lot. I'm not yet annoyed enough with the mobile YT player, which I rarely use, to actively look up a way to neuter it as well.
 
Personally, I think you should abandon the idea of doing this locally in the hosts file (which only affects the one device) and go with a DNS-based solution...
I hope my point here isn't lost, but a DNS lookup is only requested if a URL hasn't previously been resolved or an address isn't already known. No? And if we're playing by the numbers, an external DNS is irrelevant, yes? I don't think the 'hosts file' can or should be "abandoned" as a key security layer; it's simple, it has proven substantial when understood and properly utilized, and it works for all attempts to reach beyond the local host, as @Steve highlighted in SN45. I'm NOT suggesting abandoning any DNS, just properly recognizing it as an additional downstream security layer.
Now, can we get back to my initial question...;););)
 
DNS lookup is only requested
On some devices (Windows included) a DNS lookup starts in a local cache with expiring entries, and then proceeds with the hosts file, and if not found there proceeds further up the chain. (DNS is recursive.) So in theory maintaining a list of exceptions in the hosts file can save some traffic to upstream DNS, but since you're talking about sites that you want blocked, who cares if the resolution is very slightly slower. I presume nothing is ending up in the cache if found in the hosts file. Otherwise, if the result is found in the recursive DNS lookup, it will end up in the cache... but the only way it got there is because it wasn't in the hosts file.
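One way to see that ordering for yourself is to ask the OS resolver for a name and look at what comes back: an entry in the hosts file answers before anything is sent upstream. A minimal sketch in Python (standard library only; ads.example.com and the 0.0.0.0 hosts entry are hypothetical placeholders, not entries you necessarily have):

    # Sketch: check what the OS resolver (cache -> hosts file -> upstream DNS)
    # returns for a name. If the hosts file contains a line such as
    #   0.0.0.0 ads.example.com
    # the lookup never leaves the machine and 0.0.0.0 comes back immediately.
    import socket

    def resolve(name):
        """Return the IPv4 addresses the OS resolver hands back for `name`."""
        try:
            infos = socket.getaddrinfo(name, None, family=socket.AF_INET)
            return sorted({info[4][0] for info in infos})
        except socket.gaierror:
            return []  # NXDOMAIN or no answer

    if __name__ == "__main__":
        for host in ("ads.example.com", "www.example.com"):
            print(f"{host:25} -> {resolve(host) or 'not resolved'}")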
 
On some devices (Windows included) a DNS lookup starts in a local cache with expiring entries, and then proceeds with the hosts file...
I thought my point would be lost with you, but you're beginning to smell what I'm standing in...
IMO, starting with the premise of 'better late than never' presents a disastrous base to lift from if it can be easily usurped. DNS is simply a convenience for us humans who aren't good with numbers. Numbers are where the traffic actually begins (i.e., what's in a name (URL)?). If an undesirable script or program knows the numbers to start with, where's our DNS sentinel now? Undisturbed and comfortably fast asleep. @Steve has always professed that the best 'risk aversion' begins locally (i.e., user > endpoint > perimeter > cloud). I'm simply interested in better covering my endpoints.
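For what it's worth, that bypass is easy to demonstrate: anything that already carries a literal IP address never asks the resolver at all, so neither the hosts file nor an upstream filtering DNS gets a say. A minimal sketch in Python (standard library only; 203.0.113.10 is a documentation-range placeholder, not a real target):

    # Sketch: a connection made straight to a literal IP address skips name
    # resolution entirely -- no hosts-file check, no DNS query, nothing for a
    # filtering resolver (Quad9, NextDNS, Pi-hole, ...) to block.
    import socket

    def tcp_reachable(target, port=80, timeout=3.0):
        """True if a TCP connection to (target, port) succeeds."""
        try:
            with socket.create_connection((target, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        # By name: goes through cache -> hosts file -> upstream DNS.
        print("by name:", tcp_reachable("www.example.com"))
        # By literal IP (placeholder): no lookup happens, so DNS-layer
        # blocking is never consulted.
        print("by IP:  ", tcp_reachable("203.0.113.10"))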
Now, can we get back to my initial question? Geez, I'm simply looking for suggestions of trustworthy, maintained hosts files... I didn't mean for this to get into a pissing contest. :censored::censored::censored:
 
FWIW, I used to update my hosts file too, once upon a time. I used Spybot Search & Destroy to do this... but it got really annoying and a bit buggy, interfering with (like trying to disable) the built-in MS Defender, and so I just stopped. The risk to me became greater than any potential reward, and I switched my DNS to Quad9.

In any case, there's no contest here. I'm merely suggesting you're wasting your time trying to do something yourself that is better done by others who are professionals. Rather than attempt to copy their work at random intervals into your own PC, at the potential risk of [seriously] breaking things... why wouldn't you just rely on the network design to do what it was designed to do? I mean, you don't have a full index of the Internet so you can avoid using Google, do you? Anyway, you do you.
 
Okay, I get it: you don't like using the hosts file, you've had problems in the past, and you love Quad9... got it! But I, on the other hand, have strategically utilized the 'hosts file' as a security layer for the past 18 years in both Windows and Linux with really no unexpected issues. I personally use both Windows and Linux, with Linux being my daily driver for the past 10 years. I was simply hoping @Steve might opine/suggest a currently maintained 'hosts file' he was aware of, different from the one I currently use. FWIW, I deployed Arista Edge NGFW (previously Untangle) as my perimeter sentinel back in 2007 and recommend it for all my clients, and over the years have utilized several DNS services, including Quad9. Now, if you're finished, maybe @Steve will add his $0.02 about a 'hosts file'. THX! :rolleyes::rolleyes::rolleyes:
 
ChatGPT

Q: Can you give me a list of regularly maintained 'hosts file' I might review?

A: Certainly! Here are a few popular hosts files that are regularly maintained and can be reviewed:
  1. StevenBlack/hosts: This hosts file consolidates several reputable sources into one unified file and is frequently updated. It blocks a wide range of advertisements, trackers, and malicious websites.
  2. Adaway: Adaway hosts file is designed to block ads, trackers, and malicious websites. It is used primarily in Android systems but can be adapted for other platforms.
  3. MVPS Hosts: MVPS hosts file blocks unwanted ads, malware, and telemetry sites. It's been maintained for years and is a popular choice for Windows users.
    • Website: MVPS.org Hosts File
  4. hpHosts: hpHosts provides a comprehensive hosts file to block ads, trackers, and malicious sites. It categorizes sites into different levels of threat.
  5. Dan Pollock's hosts file: Dan Pollock's hosts file blocks a wide range of advertisements and malicious websites. It is simple and effective.
    • Website: Someonewhocares.org
Remember to review these hosts files periodically or subscribe to notifications for updates, as maintaining an up-to-date hosts file is crucial for effective blocking of unwanted content and protection from malicious sites.
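If one of the maintained lists above is acceptable, keeping it current can be scripted rather than copied over by hand. A rough sketch, assuming Python with only the standard library; the URL is StevenBlack's published raw-file location (verify it before relying on it), the backup name is arbitrary, and writing the hosts file requires admin/root rights:

    # Sketch: refresh the system hosts file from a maintained blocklist,
    # keeping a backup of the current file. Run with admin/root rights.
    import platform
    import shutil
    import urllib.request
    from pathlib import Path

    SOURCE = "https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts"

    def hosts_path():
        if platform.system() == "Windows":
            return Path(r"C:\Windows\System32\drivers\etc\hosts")
        return Path("/etc/hosts")

    def update_hosts():
        target = hosts_path()
        shutil.copy2(target, target.with_suffix(".bak"))  # keep a backup
        with urllib.request.urlopen(SOURCE, timeout=30) as resp:
            data = resp.read().decode("utf-8")
        # Note: this replaces the whole file, including any custom local entries.
        target.write_text(data, encoding="utf-8")
        print(f"Wrote {len(data.splitlines())} lines to {target}")

    if __name__ == "__main__":
        update_hosts()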
 
I’d suggest using a local DNS server that will incorporate the hosts file of your choice, give other blocking options as well (regex rules, etc.), and allow blocking for all devices on the network (smart TVs, etc.), not just your PC.

Pi-Hole is the most popular.
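Once a local resolver like Pi-Hole is answering for the LAN, it's easy to confirm from any client that a given name is actually being sunk. A small sketch, assuming the third-party dnspython package (2.x); the 192.168.1.2 resolver address and the test names are placeholders:

    # Sketch: ask a local filtering resolver (e.g. Pi-hole) directly and see
    # whether a name comes back blocked (0.0.0.0 / NXDOMAIN) or resolves.
    # Requires dnspython:  pip install dnspython
    import dns.exception
    import dns.resolver

    PIHOLE = "192.168.1.2"  # placeholder: address of the local filtering resolver

    def query(name):
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [PIHOLE]
        try:
            answer = resolver.resolve(name, "A")
            addrs = [rr.address for rr in answer]
            return "BLOCKED (sinkholed)" if addrs == ["0.0.0.0"] else ", ".join(addrs)
        except dns.resolver.NXDOMAIN:
            return "BLOCKED (NXDOMAIN)"
        except dns.exception.DNSException as exc:
            return f"no answer ({exc.__class__.__name__})"

    if __name__ == "__main__":
        for name in ("doubleclick.net", "www.example.com"):
            print(f"{name:20} -> {query(name)}")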
 
MVPS Hosts: MVPS hosts file blocks unwanted ads, malware, and telemetry sites. It's been maintained for years and is a popular choice for Windows users.
Unfortunately, MVPS Hosts hasn't been updated since March 6, 2021, and shouldn't have made the list. ChatGPT is clearly misinformed. D'oh! :cautious::cautious::cautious:
 
My vote is for NextDNS. It has a nominal fee ($1.99/mo in my case) and you can customize what types of things are blocked. Logs make it pretty easy to whitelist if needed.

My home gateway/firewall uses NextDNS for its upstream DNS, and all the systems in my house use my home gateway/fw for their DNS (configured via DHCP).
 
+1

I also use something similar to Pi-hole that resides on my router.