
Software without bugs? Dream on

Not every software company has had to deal with bugs as consequential as Toyota's (covered on Habr), but with every passing day it becomes clearer: every software company ships products with hidden security defects. There are virtually no exceptions.

According to Veracode, a software testing service provider that prepared a report for the RSA conference in San Francisco, about 60 percent of the applications it tested over the past 18 months failed the first test cycle. As Roger Oberg, Veracode's senior vice president of marketing, noted, these were applications from vendors concerned enough about security to engage Veracode's services in the first place.

Veracode's data is not unique. Last year, a study by WhiteHat Security found that 82 percent of corporate websites had contained vulnerabilities of urgent, critical, or high severity at some point in the past, and that 63 percent had such vulnerabilities at the time of the survey.

Admittedly, studies by security consultancies often double as self-promotion. But their results cannot simply be dismissed. It is enough to skim the headlines to notice how frequently vulnerabilities turn up in the software of most reputable vendors. Independent developers should not assume their products are any different, simply because mistakes* are very hard to avoid.

A game for developers: kill the hamster**


Do not think that attacks exploit only complex and subtle bugs. Each year, the SANS Institute and the Common Weakness Enumeration (CWE) project, a government-funded watchdog, publish a list of the 25 most common and dangerous software errors. As in previous years, the 2010 list contained few new entries, apart perhaps from unintentionally disclosing confidential information in error messages and allowing unrestricted upload of dangerous file types. But it is full of elementary mistakes such as race conditions, buffer overflows, and incorrect handling of array pointers. These are perennial errors, dating back to the dawn of programming, yet their prevalence is striking even in 2010.

Moreover, the facts show that even following best practices can lead to errors. In 2006, Joshua Bloch of Google wrote on a blog that he had found a bug in a binary search algorithm from Jon Bentley's famous book "Programming Pearls", first published in 1986. Although Bloch was not trying to disparage Bentley, it turned out that Bloch himself had implemented a binary search algorithm for the JDK containing exactly the same bug, and his misstep went unnoticed for about nine years.
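The bug Bloch described was an integer overflow in the midpoint calculation: for arrays with more than about a billion elements, `low + high` exceeds the maximum `int` value and wraps negative. A minimal C sketch (illustrative, not the actual JDK code):

```c
/* Binary search over a sorted int array of length n.
   Returns the index of key, or -1 if absent. */
int binary_search(const int *a, int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {
        /* Buggy form from the book and the JDK:
             int mid = (low + high) / 2;
           overflows when low + high > INT_MAX.  The safe form: */
        int mid = low + (high - low) / 2;
        if (a[mid] < key)
            low = mid + 1;
        else if (a[mid] > key)
            high = mid - 1;
        else
            return mid;
    }
    return -1;  /* not found */
}
```

The broken and fixed versions behave identically on small arrays, which is exactly why the flaw survived review, textbooks, and nine years of production use.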

Can developers do better? Software testing services such as Veracode can certainly be useful, but this approach is not a panacea. In some cases the application's architecture or programming language can render testing nearly meaningless.

Open source developers love to cite Linus's law: "given enough eyeballs, all bugs are shallow." In other words, the transparency of the open source development process means that bugs in open source code will be found and fixed faster than in proprietary software.

However, Microsoft security program manager Sean Herman disputes this claim, and not without reason. According to Herman, the fact that programmers can inspect code for bugs does not mean that they actually do; moreover, practice shows that only programmers paid full-time are sufficiently motivated to spend their day inspecting someone else's code. If that is so, and it seems to me that it is, then only the software vendors with the deepest pockets (and hence the largest teams) can really benefit from Linus's law.

Be open


But none of the above means that software security is a lost cause. It is not; the answer lies in understanding what exactly can be done about the code. Every developer is responsible for shipping code of the best possible quality, and "best possible" is the key phrase here. Beyond that, the focus of any developer's security strategy should fall not on the development process, but on how to handle security incidents when they inevitably occur.

The days when updates*** were delivered on CDs and floppy disks are long gone. Users now expect updates to appear quickly, almost at the speed at which vulnerabilities are discovered. While this is often impractical, vendors delay the distribution of critical updates at their own risk.

Remember, too, that how a vendor distributes updates can be a problem in itself. There was a time when Microsoft shipped updates as soon as they were ready. But customers complained that this placed an unreasonable burden on IT staff, who had to constantly test and deploy them. In response, Microsoft switched to its current distribution model, "Patch Tuesday", once a month. This approach has also drawn criticism, mainly from those who say that "Patch Tuesday" leads to "Exploit Wednesday"****, when hackers prey on those who have not yet installed the latest updates.

Customers will always be unhappy about security failures and the need to fix them. The only path for developers is to be as open and candid as possible about their software's security failures, and to do everything they can for customers a flaw may affect, even before the fix ships.

The alternative, cultivating a culture of silence and secrecy around security failures, is a direct path to disaster. The Toyota situation is somewhat atypical. Closer to home is how impatiently web developers await HTML5, which they expect to free them from a seemingly endless stream of bugs arriving through plugins like Adobe Reader and Flash, bugs that often go unfixed for weeks or even longer.

The more studies like those from Veracode and WhiteHat Security see the light of day, the better customers will understand that security failures are a part of life. Once this perception prevails, customers will demand not just patches, but a more thorough search for security vulnerabilities. Before long, companies that do not regularly disclose security threats will no longer look like makers of high-end applications; they will look like companies with something to hide.

Translator's notes
* — the word rendered as "mistake" here is "bug" in the original.
** — in the original, whack-a-mole: a game in which an animal pops out of a hole and you have to hit it with a hammer. I don't know what it is called in Russian, so I use the name "kill the hamster" in everyday life.
*** — "update" in the text means a patch, i.e. a bug fix, not new functionality.
**** — more precisely, "Exploit Wednesday".

P.S. I agree with the author's opinion; otherwise I would not have translated the article. In my subjective view, it applies to all kinds of bugs, not only those related to security.

Source: https://habr.com/ru/post/87167/
