Two concepts are used to make SHA-2 secure: data expansion followed by compression, and a Feistel-style network. The expansion occurs between the scheduler and the compression rounds, expanding the 512-bit input message block to 2048 bits. The compression rounds then shrink those 2048 bits down to the 256 bits that make up the hash.
Feistel networks use mathematical operations like the Ch and Maj functions in SHA-2. The operations feed into each other, not unlike recursion in programming. At each step a little of the input data is fed in; in SHA-2's case, a 32-bit word (dword) from a scheduler round. Feistel networks are designed this way to keep a mathematical inversion from being found.
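To make the expansion and the Ch/Maj functions concrete, here is a minimal Python sketch of SHA-256's message schedule and round functions per FIPS 180-4 (the function names are my own; this is an illustration, not a full implementation):

```python
MASK = 0xFFFFFFFF  # SHA-256 works on 32-bit words

def rotr(x, n):
    """Rotate a 32-bit word right by n bits."""
    return ((x >> n) | (x << (32 - n))) & MASK

def ch(x, y, z):
    """'Choose': each bit of x selects the bit from y (1) or z (0)."""
    return (x & y) ^ ((~x & MASK) & z)

def maj(x, y, z):
    """'Majority': each output bit is the majority vote of x, y, z."""
    return (x & y) ^ (x & z) ^ (y & z)

def expand(block):
    """Message-schedule expansion: 16 words (512 bits) in,
    64 words (2048 bits) out, per FIPS 180-4."""
    w = list(block)
    for t in range(16, 64):
        s0 = rotr(w[t - 15], 7) ^ rotr(w[t - 15], 18) ^ (w[t - 15] >> 3)
        s1 = rotr(w[t - 2], 17) ^ rotr(w[t - 2], 19) ^ (w[t - 2] >> 10)
        w.append((w[t - 16] + s0 + w[t - 7] + s1) & MASK)
    return w
```

Note the sizes line up with the description above: 16 input words × 32 bits = 512 bits in, 64 words × 32 bits = 2048 bits out.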
As an FYI, there is evidence of SHA-256 being compromised by nation states in the wild. It’s not cheap, but high-value targets have been advised since the 2010s not to use SHA-256 for secure comms.
The U.S. also had NIST approve ONLY their Dual_EC_DRBG algorithm for secure .gov & .mil comms shortly after.
The immediate successor has a P value that is theoretically factorable, and per NIST FIPS 186-5:
• For Ed25519, SHA-512 shall be used.
• For Ed448, SHAKE256 (as specified in FIPS 202) shall be used.
Which was a long way of saying: have fun, crypto is super fun
Since you brought up the National Institute of Standards and Technology, let’s see what they say.
NIST Special Publication 800-131A Revision 2
Transitioning the Use of Cryptographic Algorithms and Key Lengths
by Elaine Barker and Allen Roginsky
Published March 2019
Chapter 9 Hashing Functions
A hash function is used to produce a condensed representation of its input, taking an input of arbitrary length and outputting a value with a predetermined length. Hash functions are used in the generation and verification of digital signatures, for key derivation, for random number generation, in the computation of message authentication codes and for hash-only applications.
…
SHA-1 for digital signature generation:
SHA-1 may only be used for digital signature generation where specifically allowed by NIST protocol-specific guidance. For all other applications, SHA-1 is disallowed for digital signature generation.
SHA-1 for digital signature verification:
When used for digital signature verification, SHA-1 is allowed for legacy use.
SHA-1 for non-digital signature applications:
For non-digital-signature applications, the use of SHA-1 is acceptable for applications that do not require collision resistance.
SHA-224, SHA-256, SHA-384, SHA-512, SHA-512/224, and SHA-512/256:
The use of these hash functions is acceptable for all hash function applications.
SHA3-224, SHA3-256, SHA3-384, and SHA3-512:
The use of these hash functions is acceptable for all hash function applications.
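All of the approved families quoted above (SHA-2 and SHA-3) ship in Python’s standard hashlib, which also makes the “arbitrary-length input, fixed-length output” property from the NIST definition easy to demonstrate:

```python
import hashlib

# Digest length is fixed by the algorithm, not by the input size.
short = b"hi"
long_ = b"x" * 1_000_000

for name in ("sha256", "sha512", "sha3_256", "sha3_512"):
    a = hashlib.new(name, short).digest()
    b = hashlib.new(name, long_).digest()
    assert len(a) == len(b) == hashlib.new(name).digest_size
```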
The text you kludged together didn’t provide any useful information or criticism. You just replied to harass.
Cryptography isn’t fun. Cryptography is where the money is.
As the name implies, 2 to the power of 256 is a lot of combinations [citation needed]. Given Bitcoin uses SHA-256 and hasn’t run into collisions, unless the algorithm is flawed in some obscure manner, I’d argue that in 2024 it is safe enough for most people.
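For scale, a back-of-envelope check on “a lot of combinations”: 2^256 is a 78-digit number, and the generic birthday bound puts collision search for any 256-bit hash on the order of 2^128 work (this is textbook math for ideal hashes, not a claim about SHA-256 specifically):

```python
import math

space = 2 ** 256               # number of possible 256-bit digests
assert len(str(space)) == 78   # a 78-digit number

# Birthday bound: expect a collision after roughly sqrt(space) hashes.
birthday = math.isqrt(space)
assert birthday == 2 ** 128
```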
I understand this is a light-hearted discussion that hopefully piques the interest of more users on here since more eyes means more peer review, which I am all for.
not really bro, NIST still allows the use of SHA-1 and isn’t disallowing it* until 2030, despite SHA-1 being confirmed broken in 2005.
*in specific applications
Regarding SHA-2:
Papers published in 2011… that’s 13 years ago:
on preimage attacks:
" thus reaching 52 rounds and violating the security of
about 80% of SHA-256
on differential COLLISION (the highest form) attacks:
“a collision attack on SHA-256 for 24 out of 64 steps”
“At Eurocrypt 2008, Yu and Wang announced that they had shown non-randomness for SHA-256 reduced to 39 steps”
The differential attack alone brings your 256-bit field down to approximately 85 bits as of 2011, with practical RAM limits being the only constraint. SHA-256 is really 2^85 complexity.
That’s before later analysis reduced the field from 256 to 128…
We’re talking a practical 2^42.5 collision complexity, based on research performed over a decade ago, which supports my assertion:
I assume the burden of proof when making wild assertions, and I often say off-the-wall shit. I know the ’tism is strong with this one, and I have sources.
Bitcoin uses 192 rounds. The block header is bigger than 512 bits and requires two passes. The h constants in the second pass are replaced with the intermediate hash of the first pass, in what is called a Merkle–Damgård construction. The output of the first 128 rounds is then fed back into another 64 rounds. This isn’t done to increase security; it’s done to increase computational complexity.
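The two-pass scheme described above is just SHA-256 applied twice, with the first pass’s digest becoming the second pass’s message. A minimal sketch with hashlib (the 80-byte header here is a zeroed placeholder, not a real block):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin-style hash: feed the 32-byte digest of the first
    pass back in as the (single-block) message of the second."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header = bytes(80)  # placeholder 80-byte block header
block_hash = double_sha256(header)
assert len(block_hash) == 32
```

An 80-byte header pads out to two 512-bit blocks (2 × 64 = 128 rounds), and the second pass over the 32-byte digest fits in one block (64 more rounds), for 192 rounds total.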
That was a pseudo-attack where the plaintext is already known, with a complexity of 2^255 and requiring 2^255 bits’ worth of data. Not a break.
That paper covers a pseudo-collision, where the plaintext is already known, with a complexity of 2^178, requiring 4.25×10^37 petabytes of data to compute for 46 rounds. The paper also discusses another attack at 33 rounds with a complexity of 2^46. The 13-round difference between the two brings a factor of 2^132 in complexity.
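To put that gap in perspective, the ratio between the two reduced-round complexities (2^178 vs. 2^46) works out to a factor of 2^132, which is roughly 40 decimal orders of magnitude:

```python
import math

factor = 2 ** 178 // 2 ** 46
assert factor == 2 ** 132

# Expressed in decimal orders of magnitude:
decimal_orders = math.log10(factor)
assert 39 < decimal_orders < 40   # ~39.7
```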
These papers exist to advance the field of cryptography, not to prove some algorithm needs to be replaced. Until someone breaks all 64 rounds, SHA-2 is secure.
Side note:
Since you brought up differential analysis, but I’m quite sure you don’t know what that means, here’s a video to watch.
If you don’t hold an active security clearance, I strongly recommend you read some of the leaks from a certain expatriate residing in Russia. Back in 2013, SHA-256 was known to be defeatable once a certain confidence interval was determined and sufficient resources could be dedicated to the target.
The recommendation was SHA-512, RSA-1024, and later elliptic-curve algos combined with SHA or SHAKE.
Quite a few protocols use the same headers or easily derived headers using time attribution attacks.
You’ve probably done cryptanalysis or attacks in the wild and are already aware. I am just reiterating for passersby and the less informed.
SHA-256 for encrypting LAN traffic is absolutely awesome and we regularly use it for all network communications with additional layers on top.
Yes, I am aware of the problems with stacking cryptosystems, but different algorithms stacked tend to be less factorable, provided they are based on substantially different constructions.
My doctoral thesis was a differential attack on the Diffie–Hellman rho equation, but the same leaks from the expat mentioned above exposed that my work duplicated findings and techniques already in use in the wild by various Five Eyes partners. Long story short: my dissertation grant was pulled less than a month after the leaks.
Complex encryption also adds a lot of overhead and kills a good chunk of throughput as a result. If you need encryption, by all means use it, but scale the complexity of the algorithm as needed to keep up performance.
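One rough way to see that trade-off is to time different digest widths over the same payload. Absolute numbers vary by platform (and on 64-bit CPUs SHA-512 often outruns SHA-256), so treat this as a measurement sketch, not a ranking:

```python
import hashlib
import time

PAYLOAD = b"\x00" * (8 * 1024 * 1024)  # 8 MiB test buffer

def throughput_mib_s(name: str) -> float:
    """Hash PAYLOAD once with the named algorithm; return MiB/s."""
    start = time.perf_counter()
    hashlib.new(name, PAYLOAD).digest()
    elapsed = time.perf_counter() - start
    return (len(PAYLOAD) / (1024 * 1024)) / elapsed

for name in ("sha1", "sha256", "sha512", "sha3_256"):
    print(f"{name:9s} {throughput_mib_s(name):8.1f} MiB/s")
```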
If someone targets you, the complexity means little. If a nation state focuses on you, fuggetaboutit.
Parkinson’s Third Law of Computing - Encryption can only delay access.
For sure, our clients require ALL network traffic to be encrypted. We have an eye on the coming FIPS standards, which will deprecate SHA-1, and as stated:
So SHA-256 is the minimum under the new NIST proposals at the levels we have to achieve. As a result, we run clusters of 32-core MikroTik routers in environments with 60 workstations to achieve sustained simultaneous 1 Gb/s throughput. For reference, this same hardware could EASILY support 3,000-4,000 users without the authentication and encryption requirements.