SN875: Erik Osterholm's question


Lob

Hi Steve, long-time listener. I'm a little confused about the key-length discussion. I always thought that encrypting twice merely doubled the effective strength. Here's my reasoning: imagine you have an algorithm and a key that you can fully brute force in one day. If you add one bit to the key, you double the key space, and therefore it takes twice as long to brute force (two days). If you instead encrypt twice, then it takes one day to brute force the outer ciphertext, at which point you get back the first/inner ciphertext. Then in one more day, you can brute force the first/inner ciphertext. As in the first example, this takes two days. It seems to me that these are equivalent. Is there something I'm missing?
My conclusion is that the fastest you could recover the plaintext without knowing any key is, as above, theoretically two days, given the other assumptions.
There is a big but (or two or three):
  • You don't know any keys or any plaintext
  • You have to assume the blob produced by cracking the outer layer is the correct intermediate ciphertext, and then process that blob
  • You also, as an attacker, would not know how many times the plaintext has been encrypted
I actually think an attacker would never know the encryption was repeated, and thus would never believe the output of breaking a layer to be anything other than ciphertext rather than a final result.
It's almost a mix between security and obscurity: think of the strong encryption as the control protecting the data, with the repeated cycles as a deterrent; an attack would likely result in the attacker wasting lots of compute time and/or giving up.
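Here is a minimal sketch of that arithmetic in Python, under the question's optimistic assumption (which the bullets above call into doubt) that the attacker can recognize the correct intermediate blob; the key length is a made-up example:

```python
# A sketch of the question's arithmetic. The key length k is hypothetical;
# "one day" corresponds to exhausting 2**k trials, and the attacker is
# assumed (optimistically) to recognize the correct intermediate ciphertext.
k = 56                                 # hypothetical key length in bits
single_layer = 2**k                    # exhaust one key: "one day"
one_extra_bit = 2**(k + 1)             # one more key bit: "two days"
two_layers_sequential = 2**k + 2**k    # crack outer, then inner: also "two days"
two_layers_naive = 2**k * 2**k         # try every key pair: 2**(2*k)

print(one_extra_bit == two_layers_sequential)   # True: layering adds ~1 bit
print(two_layers_naive == 2**(2 * k))           # True: the multiplicative view
```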

What do others think?
 
There are many differing opinions on this matter. One is that there is always some single mathematical function that maps any input to any output. If you knew that function, then adding more layers of encryption is just selecting a new function, and the key strength would not change at all; it would be that of the strongest key used in any single pass.
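For what it's worth, that claim is exact for a toy XOR cipher, where two layers collapse into a single equivalent key. A short Python illustration (note that AES is believed not to form a group, so this collapse does not literally carry over to AES cascades):

```python
# Toy XOR "cipher": two layers with keys k1 and k2 collapse into one layer
# with the single equivalent key k1 ^ k2. Purely illustrative, not a real cipher.

def xor_encrypt(key: int, data: bytes) -> bytes:
    return bytes(b ^ key for b in data)

msg = b"HELLO"
k1, k2 = 0x5A, 0xC3
double = xor_encrypt(k2, xor_encrypt(k1, msg))
single = xor_encrypt(k1 ^ k2, msg)
assert double == single  # two layers == one layer with an equivalent key
```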

The thinking behind multiplication is that you need to undo the outer layer before you can undo the inner one, and for every outer key there is a different inner key: that is A times B. The catch is that the key for A needs to have NO RELATIONSHIP to the key for B. This means you have to manage (and thus remember) two completely different keys/passwords. You also need a way to securely transmit them to the recipient, and if you're using asymmetric cryptography for that purpose, it needs to grow stronger too. NIST publishes a table comparing asymmetric strengths to symmetric strengths.
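As a sketch of what such a cascade looks like in practice, here is a two-layer construction using Python's cryptography package with AES-GCM; the function names and the choice of AES-GCM are illustrative assumptions, not anything from the posts above:

```python
# Two-layer cascade with two independent keys: the outer layer (B) must come
# off before the inner layer (A) can be undone.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_a = AESGCM.generate_key(bit_length=256)  # inner key
key_b = AESGCM.generate_key(bit_length=256)  # outer key, independent of key_a

def cascade_encrypt(plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    n1, n2 = os.urandom(12), os.urandom(12)  # unique 96-bit nonces
    inner = AESGCM(key_a).encrypt(n1, plaintext, None)  # layer A
    outer = AESGCM(key_b).encrypt(n2, inner, None)      # layer B on top
    return n1, n2, outer

def cascade_decrypt(n1: bytes, n2: bytes, outer: bytes) -> bytes:
    inner = AESGCM(key_b).decrypt(n2, outer, None)      # undo B first...
    return AESGCM(key_a).decrypt(n1, inner, None)       # ...then undo A

n1, n2, ct = cascade_encrypt(b"attack at dawn")
assert cascade_decrypt(n1, n2, ct) == b"attack at dawn"
```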

Intuitively, what was stated above has a certain appeal... but since the additive argument relies on the encryption being breakable in the first place, I don't really see any use for addition on top of unbroken encryption.
 
Certainly the individual key strength remains constant, but if different secrets were used, the complexity and time to brute force the layers increase with each layer you add to this onion.

I see it as similar to having a discussion with my teenager, who uses something like this technique to break down my will to discuss things with him. In the end, given what the outcome might be, I give up :D
 
I had an idea on encryption that seemed to work in a few trials with Veracrypt. Except for a correctly executed one-time pad, all encryption can eventually be brute forced. What I tried was to encrypt a 1 MB file. Then I used Windows Notepad to make a minor alteration to the encrypted text. The correct key did not work, and I am guessing no other key would work either, since I was now dealing with a 'corrupted' encrypted file.

Any thoughts on that? I undid the modification to the encrypted file and the correct password worked again in a few test cases.
 
all encryption can eventually be brute forced
Yes, in theory, given unlimited (think infinite) time, it can be. However, I can't stress this enough, often enough, since a lot of people seem to not understand it well: the number 2^256 is really, really, REALLY large. Here it is written out: 115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936. Even though you see it, and you think, "yeah, wow," that still doesn't mean you really comprehend how f**king huge it is! Let's assume you could turn every atom on Earth into a computer, each one able to try one quadrillion encryption attempts per second (trust me, that is way outside the realm of possibility for modern computers today).

Here's your math:

Take that huge number above and divide it by the total attempts per second: roughly 1.33 × 10^50 atoms each making 10^15 attempts, or 133,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 attempts per second. Then divide the result by 60 × 60 × 24 × 365.25 seconds per year, and it would take about 27,588 years to try every possible key. If you assume you'll find it halfway through, on average, then it's 13,794 years.

That's for one encryption with one random AES-256 key and an unrealistic amount of computing power, more than you will ever be able to have. It's just not realistic to think anyone is ever going to try brute forcing anything this hard, let alone a 512-bit key or stronger. BTW, the same math for a 512-bit key gives 3,194,494,699,831,990,496,903,055,274,462,166,454,875,725,948,024,767,928,291,801,792,231,015,792,883,976,944 years!!
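The same arithmetic as a quick sanity check in Python, using the constants from the post above:

```python
# Back-of-the-envelope: exhaustive-search time for 256-bit and 512-bit keys,
# with every atom on Earth as a computer trying 1e15 keys per second.
ATOMS_ON_EARTH = 1.33e50                  # rough estimate used above
ATTEMPTS_PER_ATOM_PER_SEC = 1e15          # one quadrillion
RATE = ATOMS_ON_EARTH * ATTEMPTS_PER_ATOM_PER_SEC   # ~1.33e65 attempts/sec
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25

for bits in (256, 512):
    years = 2**bits / RATE / SECONDS_PER_YEAR
    print(f"{bits}-bit key: {years:.4g} years to exhaust, {years / 2:.4g} on average")
# 256-bit key: ~2.759e4 years; 512-bit key: ~3.194e81 years
```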
 
The correct key did not work
AES is implemented as a number of repeated rounds. There is a desired behaviour in encryption and hashing where a single bit flip causes a cascading change in the output: all output bits are affected. In the case of AES, "all output bits" means within a single block (which is 16 bytes). Different implementations treat blocks differently; Veracrypt has modes for treating disks in blocks, so it may be that only one disk block of the data is corrupted when you change it. In a lot of cases it's considered important to prevent decryption of modified data (there are potential attacks), so many implementations use an AES construction with what amounts to a checksum built in, which will fail completely if the input to the decryption has changed in any way since the data was encrypted. (AES-GCM is one possibility... but I don't know the internal workings of Veracrypt, to be honest.)
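To make the "built-in checksum" behaviour concrete, here is a small Python sketch using the cryptography package: AES-GCM is authenticated, so flipping even a single bit of the ciphertext makes decryption refuse outright. This illustrates the general idea; it is not a claim about how Veracrypt itself works:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)                  # 96-bit nonce, unique per message
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"attack at dawn", None)

# Flip a single bit anywhere in the ciphertext (or its appended tag)...
tampered = bytearray(ciphertext)
tampered[0] ^= 0x01

try:
    aesgcm.decrypt(nonce, bytes(tampered), None)
except InvalidTag:
    print("decryption refused: ciphertext was modified")
```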
 
Veracrypt uses a mode that lets you skip around the "chain": if you want block number 52623, you don't need to decrypt all the blocks before it (and in this case Veracrypt may treat the block I just referenced as something like 512 bytes). It still allows two blocks containing the exact same data to have different encrypted output. I had reason to look it up, but the info I found was too high-level for me to truly understand exactly how it works. It's like one of those spec documents that explains the theory of how something works, not exactly how to do it if you needed to reimplement it in another language, for example.
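Veracrypt's documentation describes using XTS mode with 512-byte data units, where the data-unit (sector) number acts as a "tweak". Here is a minimal Python sketch of that property with the cryptography package; it's a toy illustration of XTS, not Veracrypt's actual on-disk format:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(64)        # AES-256-XTS uses a 512-bit key (two AES-256 keys)

def xts_sector(sector_number: int, data: bytes, decrypt: bool = False) -> bytes:
    # The sector (data-unit) number becomes the 16-byte tweak.
    tweak = sector_number.to_bytes(16, "little")
    cipher = Cipher(algorithms.AES(key), modes.XTS(tweak))
    ctx = cipher.decryptor() if decrypt else cipher.encryptor()
    return ctx.update(data) + ctx.finalize()

sector = b"\x00" * 512                      # two sectors with identical data
ct_7 = xts_sector(7, sector)
ct_42 = xts_sector(42, sector)

assert ct_7 != ct_42                        # same data, different ciphertext
assert xts_sector(42, ct_42, decrypt=True) == sector  # random access: no chain
```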
 
Thanks for the info. I hesitated a couple of times to post that idea, knowing Steve has said on many occasions "the problem is already solved" and has warned about "rolling your own". Still, curiosity got the best of me and I couldn't resist throwing the idea out to those who know more about the topic than I do. Lesson learned: taking a ridiculously large number and making it even larger still yields a ridiculously large number.