
Mega: security review

Good day!
Being a simple Russian (no matter what kind), I usually do not write articles, but quietly read them (we, whoever we are, love silence). Recently, however, my attention was caught by the abundance of English-language articles about the (in)security of the dearly beloved Mega service. Since the issue is discussed less vigorously in the Russian-speaking community, I decided to offer you a small digest of publications about the "scandals, intrigues, and investigations" that have recently unfolded around the service of the esteemed Comrade Dotcom. I should warn you right away that this article contains no original research, in other words no opinion of my own on Mega's cryptography; it only seeks to inform the Russian-speaking reader about the state of the foreign discussion on the topic.

So, Ars Technica was among the first to raise the topic of Mega's security, in a publication with the amusing title "Megabad: A quick look". Although the article did not avoid a number of absurd journalistic blunders (most of which have since been corrected and survive only in the comments), the author rightly noted several rather large flaws on Mega's side:


The last point deserves separate discussion. The fact is that the ability to deduplicate ciphertext is usually achieved by convergent encryption (a similar approach is used in the Bitcasa service and, in a somewhat more complex form, in the distributed file system Tahoe-LAFS). The essence of the approach is that encryption is implemented so that the same plaintext yields the same ciphertext (sometimes with some reservations). On the one hand, this can be good in its own way, since it enables deduplication. On the other hand, convergent encryption can deal a rather painful blow to our precious privacy, since it lets interested parties find out whether a duplicate of a given file exists on the service, provided they already hold that file in unencrypted form (if you understand what I mean...). And that is not good. Very bad.
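The mechanism can be sketched in a few lines. This is a toy illustration only (a SHA-256 counter-mode keystream stands in for a real cipher such as AES, and no real system should be built this way); the point is simply that the key is derived from the content itself, so identical plaintexts always produce identical ciphertexts:

```python
import hashlib

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy convergent encryption: the key is a hash of the plaintext itself,
    so identical files always encrypt to identical ciphertexts (enabling
    server-side deduplication)."""
    key = hashlib.sha256(plaintext).digest()          # content-derived key
    keystream = b""
    counter = 0
    while len(keystream) < len(plaintext):            # SHA-256 in counter mode
        keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return key, ciphertext

# Identical files yield identical ciphertexts, so the server can deduplicate,
# but anyone who already holds the plaintext can recognise it on the server.
k1, c1 = convergent_encrypt(b"same file contents")
k2, c2 = convergent_encrypt(b"same file contents")
assert c1 == c2
```

This is exactly the double-edged property discussed above: deduplication works precisely because an outside party holding the plaintext can reproduce the ciphertext and check for its presence.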
Later, commenting for Forbes on the article "Researchers Warn: Mega's New Encrypted Cloud ...", Mega co-founder Bram van der Kolk stated that deduplication happens at the level of individual files and is possible only when a user "uploads the same file encrypted with the same key" or imports an existing file via the file manager. However, objections quickly appeared in the comments: both of these scenarios are likely to be very rare, which would make this kind of deduplication economically pointless and hardly worth the effort of writing special (and, generally speaking, nontrivial) code.
So the situation with deduplication remains a bit strange.

Beyond deduplication, the Forbes article contains some rather sharp comments (in particular, Nadim Kobeissi told Forbes: "Honestly, I get the impression that I wrote this myself in 2011, while drunk"). Of particular note is the view of Matthew Green (a cryptography professor at the Johns Hopkins Information Security Institute), who condemned the entire practice of doing encryption purely in JavaScript: "Verifying JavaScript with JavaScript itself is like trying to pull yourself up by your own laces. Nothing will come of it." Interestingly, Green is not the first to speak of the fundamental unreliability of in-browser JavaScript cryptography; the same thesis was laid out back in 2011 by experts from Matasano Security.
Mega's colleagues at SpiderOak criticized it politely but severely, publishing a very interesting article about the foundations of "Mega cryptography". In it, they point to the extreme weakness of the homemade key derivation function the Dotcom team uses to convert a password into the master key that in turn encrypts the user's asymmetric key (incidentally, someone has already thrown together a utility for brute-forcing the Mega password hashes contained in the registration confirmation link), as well as the vulnerability of the authentication procedure to replay attacks.
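For contrast, this is what a standard, deliberately slow key derivation looks like (the password and parameters below are illustrative, not Mega's actual code): PBKDF2-HMAC-SHA256 with a high iteration count, which is precisely the work factor the critics say Mega's homemade KDF lacks.

```python
import hashlib, os

# Turn a password into a 256-bit master key with PBKDF2-HMAC-SHA256.
# The 200,000 iterations are what make offline brute force expensive;
# a fast homemade KDF gives an attacker no such obstacle.
password = b"correct horse battery staple"            # illustrative password
salt = os.urandom(16)                                 # per-user random salt
master_key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000, dklen=32)
```

The derived `master_key` would then encrypt the user's asymmetric key, as in Mega's scheme; the difference is only in how costly each password guess becomes.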

SpiderOak promises to keep exploring Mega's architecture, with mandatory publication of the (probably juicy) results.

And finally, the kind, good passers-by from the fail0verflow team found that the validation mechanism for page components used by Mega (the very mechanism whose fundamental unreliability Matthew Green spoke about) contains a frankly textbook error: CBC-MAC is used in place of a cryptographically strong hash function. For those not deeply versed in cryptographic perversions, it is probably worth explaining that CBC-MAC is a message authentication code (and not a hash in the strict sense of the word), and using it here is harmful to the health (of users), because it allows a "strong" adversary (for example, a CDN operator) to steal user keys.
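The core of the problem is easy to demonstrate. Below is a toy CBC-MAC (a SHA-256-based keyed function stands in for AES; names and values are illustrative): a MAC only protects against parties who do not know the key, and Mega shipped the verification key along with the page, so a hostile mirror can simply re-MAC whatever script it likes.

```python
import hashlib

BLOCK = 16

def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # Stand-in for AES: a keyed one-way function is enough here, since
    # CBC-MAC only ever runs the cipher in the forward direction.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    msg = msg + b"\x00" * (-len(msg) % BLOCK)         # zero-pad to block size
    state = b"\x00" * BLOCK                           # zero IV, as in CBC-MAC
    for i in range(0, len(msg), BLOCK):
        block = msg[i:i + BLOCK]
        state = toy_block_cipher(key, bytes(a ^ b for a, b in zip(state, block)))
    return state

# If the verification key is public (shipped with the JavaScript), an
# attacker who tampers with the page just computes a fresh, valid tag:
public_key = b"shipped-with-js!"                      # hypothetical key
tampered = b"evil key-stealing javascript"
forged_tag = cbc_mac(public_key, tampered)            # passes verification
```

A collision-resistant hash such as SHA-256, by contrast, needs no secret at all: the expected digest can be published, and no adversary can produce a different script matching it. That is exactly the fix Mega applied.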
Mega responded to the vulnerability found by fail0verflow extremely quickly and now uses SHA-256... Still, the presence of such a crude blunder in a service so proud of its "cryptographicity" cannot but be disturbing.

Well, that's probably all for today.

Perhaps in the future I will continue to publish popular-science reviews of the "scandals, intrigues, and investigations" related to the security of various services (not just Mega).

[UPD]
In response to the heated discussion, Mega has organized a financial incentive program for "responsible passers-by" and plans to pay rewards for bugs found.

The reward can be claimed for bugs that allow:

The reward can reach 10,000 euros.

Yummy? Not really ...

There are restrictions, however: Mega will not pay for bugs whose exploitation requires very large computing power (which is reasonable), for bugs of a conditionally academic nature (which is unreasonable, since, as Schneier commanded, "Attacks always get better; they never get worse"), for RE/DoS vulnerabilities (which is strange, to put it mildly, since such attacks hardly belong to the category of problems a normal web-service operator would prefer to learn about "upon exploitation"), or for attacks requiring an (indefinitely) large number of requests to the server.

Particularly alarming is Mega's attitude to the problem of insufficient entropy in key generation, which Mega classified as "academic" while demanding that researchers "show a real vulnerability" (recall the story of RSA keys generated with bad randomness). Unfortunately, cryptographic problems cannot be solved by systematic self-persuasion that there is "no real vulnerability"...
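Why bad entropy is not merely academic can be shown with one line of arithmetic: if two RSA moduli happen to share a prime factor (which really did occur in the wild with poorly seeded generators), a plain gcd factors both instantly. Tiny primes are used below for readability; the attack is just as fast on 2048-bit moduli.

```python
import math

p, q1, q2 = 101, 103, 107      # tiny stand-ins for RSA primes
n1 = p * q1                    # two public moduli that accidentally
n2 = p * q2                    # share the prime p due to bad randomness

shared = math.gcd(n1, n2)      # instantly recovers the common factor
assert shared == p
assert (n1 // shared, n2 // shared) == (q1, q2)
```

No brute force, no "very large computing power": just the public keys and Euclid's algorithm.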

Mega also neatly sidestepped the question of possible identification of stored ciphertexts (that is, identifying content without decrypting it, via the deduplication mechanism or other architectural features), especially by owners of a copy of the plaintext (well, you understand whom I mean).

In general, all of this leaves a mixed impression: although €10,000 cannot be called a small sum, the conditions of the event leave an unpleasant aftertaste.

Source: https://habr.com/ru/post/168025/

