Security is two different things: a feeling and a reality. And they differ. You can feel secure even when, in fact, you are not. And you can be secure even when you don't feel it.
In effect, two different concepts map onto the same word, "security." In this lecture I want to separate them: to understand how they differ and where they overlap.
Let's start with the problem of terminology.
We have very few suitable words for the concepts I am about to discuss. Looked at from an economic point of view, security is a trade-off. Every time we get more security, we pay something for it. Whether it is a personal decision (should I install a burglar alarm at home?) or a national one (should we invade another country?), we have to give something up. It can be money or time, convenience or capabilities, even fundamental liberties. The question to ask about any security measure is not "how effective is it against the threat?" but "is it worth it?" In recent years we have heard a lot about how the world is safer now that Saddam Hussein is out of power. That may be true, but it is not the most important point. The question is: was it worth it? You can hold whatever opinion you like about the war in Iraq, but ask yourself whether the invasion was worth the trade-off.
This is how you need to think about security: in terms of trade-offs. And often there is no right or wrong answer. Some of us set the burglar alarm at home, and some of us don't, depending on where we live, whether we live alone or with family, how much we own, and how willing we are to accept the risk of theft.
The same can be observed in politics: opposing opinions legitimately coexist.
Often these security trade-offs involve more than just security, and I think that is very important. We intuitively weigh the pros and cons. We make this choice every day: last night in my hotel room, when I decided to double-lock the door; when you drove here; when we eat at a diner because we judge that the food there probably isn't poisoned. We make the choice again and again, many times a day, often without even noticing. It is simply part of being alive; every living thing does it.

Take a rabbit in a field. The rabbit is peacefully eating grass when it notices a fox. It faces a choice: stay or run? If you think about it, rabbits that are good at making that choice will survive and reproduce, and rabbits that are bad at it will be eaten or starve. You might assume that we, as representatives of the most successful species on the planet, would be excellent at such decisions. Yet practice shows again and again that we are not. Why?

That is a fundamental and fascinating question. The short answer: we make decisions based on the feeling of security, not on the facts. Oddly enough, most of the time that works, because most of the time feeling and reality coincide. At least they did in prehistoric times. This ability of ours was shaped by natural selection. You could say we are adapted to making risk decisions appropriate to life in small family groups in the East African highlands a hundred thousand years ago. For life in modern New York, we are not so well adapted.
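To make the trade-off idea concrete, here is a small sketch that is not from the lecture itself: it frames the rabbit's stay-or-run decision as an expected-cost comparison. All the numbers are invented purely for illustration.

```python
# Illustrative sketch: a security trade-off as an expected-cost comparison.
# The probabilities and costs below are made-up numbers, not data.

def expected_cost(p_predator, cost_if_caught, cost_of_fleeing):
    """Expected cost of staying vs. fleeing, given the estimated
    probability that the shape in the grass really is a predator."""
    stay = p_predator * cost_if_caught  # you only pay the big cost if it IS a fox
    run = cost_of_fleeing               # you always pay the calories and lost grazing
    return stay, run

# If the rabbit thinks there is a 1% chance the shape is a fox:
stay, run = expected_cost(p_predator=0.01, cost_if_caught=1000, cost_of_fleeing=5)
# Here running is the cheaper trade-off, even though the fox is probably not there.
```

The point of the sketch is only that the "right" answer depends on the estimated probability: get the estimate wrong, and the same arithmetic gives the wrong choice, which is exactly the failure mode the lecture describes.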
There are systematic biases in risk perception, and many good experiments have been done on the topic. Certain distortions show up over and over again. I will list four of them:
1. We tend to exaggerate spectacular (dramatic, "theatrical," but rare) risks and to downplay common ones: compare flying with driving.
2. The unfamiliar is perceived as riskier than the familiar.
For example, many people fear kidnapping by strangers, whereas the data show that children are more likely to be abducted by relatives.
3. Personified risks are perceived as greater than anonymous risks; bin Laden is scarier because he has a name.
4. People underestimate risks in situations they control and overestimate risks in situations they do not control.
As soon as you take up skydiving or smoking, you downplay the risks. When a risk is imposed on you (as with terrorism), you overestimate it, because you have no feeling of control.
There are a number of other perceptual distortions that affect decision-making under risk. One is the availability heuristic: we estimate the probability of an event by how easily we can recall examples of it. Imagine how this works. If you constantly hear about tiger attacks, there must be a lot of tigers around. If you hear nothing about lion attacks, there are not many lions nearby.
That worked until newspapers were invented. Because what newspapers do is repeat rare risks over and over. I always say: if it's in the news, sleep well. By definition, news is something that almost never happens. When something becomes so common that it stops being news (car crashes, domestic crime), those are the risks you need to worry about.
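The newspaper effect can be sketched numerically. This is my own invented illustration, not data from the talk: assume one event is common but rarely reported, while the other is rare but always makes the news.

```python
# Hypothetical, invented numbers: annual event counts and the chance
# that any single occurrence makes the news.
true_events = {"car crash": 1_000_000, "shark attack": 10}
report_prob = {"car crash": 0.0001, "shark attack": 1.0}

# Headlines per year = occurrences x probability of being reported.
headlines = {e: true_events[e] * report_prob[e] for e in true_events}

# The real risks differ by a factor of 100,000, but the headline counts
# differ only by a factor of about 10. So a memory stocked with news
# coverage (the availability heuristic) wildly overweights the rare risk.
true_ratio = true_events["car crash"] / true_events["shark attack"]   # 100000.0
headline_ratio = headlines["car crash"] / headlines["shark attack"]   # roughly 10
```

With made-up numbers like these, judging frequency from recalled headlines compresses a 100,000x difference in risk down to about 10x, which is the distortion the lecture describes.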
In addition, people are storytellers. We respond to stories more than to facts. And, frankly, we are bad with numbers. The joke about "one, two, three, many" is partly fair. We understand small numbers very well: one apple, two apples, three apples. Ten thousand apples, a hundred thousand apples: that's just more apples than you could eat before they rot. A half, a quarter: simple and clear. One in a million, one in a billion: both feel like "never." That is why risks that are not everyday risks are so hard for us.
So these perceptual distortions act as filters between us and reality. And as a result, feeling and reality no longer coincide; they begin to diverge.
Either you feel more secure than you really are (a false sense of security), or the reverse (a false sense of insecurity).
I write a lot about "security theater": measures that give people a feeling of security but actually accomplish nothing. There is no good term for the opposite, measures that make us more secure without making us feel more secure. Perhaps that is what the CIA is supposed to do for us.
But let's get back to economics.
If money, if the market, determines security measures, and if people make decisions based on their feeling of security, then the smartest thing a company can do, on economic grounds, is give people a feeling of security. And there are always two ways to do that.
First, you can actually make people secure and hope they notice. Or second, you can create only a feeling of security and hope they don't notice.
What makes people pay attention?
Several things: an understanding of the security measures, the risks, the threats, the countermeasures, and how they all work. If you understand these, it is more likely that your feelings will match reality.
Lived experience helps here as well. We all know the dangerous neighborhoods of our own city because we live there, and on the whole our sense of the danger matches reality. Security theater is exposed when it becomes obvious that the measures are not working.
So why don't people notice? Because of misunderstanding: if you do not understand the risks or the costs, you are likely to get the trade-off wrong, and your feeling of security will not match reality. And because of lack of experience: this is the inherent problem with rare events. If terrorist attacks happen very rarely, it is very hard to judge the effectiveness of counter-terrorism measures. That's why you keep sacrificing virgins, and why your unicorn defenses seem to work so well: there simply aren't enough failures to learn from. On top of that, feelings overwhelm reason: the perceptual distortions I spoke of earlier, fears, and folk beliefs built on an inadequate model of reality.
Let's make it more complicated. There is feeling, and there is reality. Now add a third element: the model.
Feeling and model live in our minds; reality is the world around us. Reality doesn't change; it just is. Feeling is based on intuition; the model is based on reasoning. That is the main difference.
In a primitive, simple world there is no real need for a model, because feeling is close to reality. But in today's complex world we need models to understand many of the risks we face.
We cannot feel germs. We need a model to understand what germs are. The model is an intelligent representation of reality, limited, of course, by the current state of science and technology. We did not know that germs caused disease until the microscope allowed us to see them. A model is also constrained by our perceptual distortions, but it gives us a way to correct our feelings.
Where do models come from? We get them from others. We find them in religion and culture; we borrow them from teachers and parents.
A couple of years ago I was on safari in South Africa. The tracker who accompanied me had grown up in Kruger National Park, and he had some quite complex survival models: what to do if you are attacked by a lion, a leopard, a rhino, or an elephant; when to run, when to climb a tree, and when you must never climb a tree. I would have died there within a day, but he was born there and knows how to survive. I was born in New York. I could have taken him to New York, and he would not have survived a day either. We have different models, built from different life experience.
Models can also come from the media and from officials. Think of the models of terrorism, child abduction, airline safety, and road safety.
Models can come from industry. I follow two of them: surveillance cameras and ID cards. Quite a lot of our computer security models come from there.
Many models come from science. Disease models are a great example: cancer, avian flu, swine flu, SARS. Our sense of how dangerous these diseases are comes from scientific models filtered through the media.
And models can change. Models are not static. As we become more comfortable with an environment, our model moves closer to our feeling.
For example, go back a hundred years, to when electricity was first coming into use. It was accompanied by a great many fears. There were people literally afraid to press a doorbell, because there was electricity in it, and electricity was perceived as dangerous.
To us, electricity is mundane. We change light bulbs without even thinking about it. Our model of electrical safety is one we were born into; it did not have to change as we grew up, and so it comes easily.
Or consider how different generations perceive risk on the Internet: how the older generation understands Internet security, compared with how you perceive it, and how your children will.
Sooner or later, models fade into the background. "Intuitive" is just another word for "familiar." When a model is close to reality and merges with our feelings, we stop noticing it. A good example is last year's swine flu epidemic. When swine flu first appeared, the initial news caused a large overreaction. It was given a name, which made it scarier than ordinary flu, even though ordinary flu is deadlier. And people felt that doctors ought to be doing something about it: that same feeling of lacking control. Together, these two factors made the threat of swine flu seem bigger than it was. Months passed, the novelty wore off, and people got used to it. No new data was reported, yet the fear declined. By autumn, people assumed the doctors must have already solved the problem.
There was a kind of bifurcation point: people had a choice between fear and acceptance (really, between fear and indifference), and they sort of settled on suspicion. When the vaccine appeared last winter, a surprising number of people refused to take it. A wonderful example of how people's perception of a danger, and their models of it, can change unpredictably, even with no new information, no new data.
This happens quite often.
Let's complicate things once more.
We have feeling, model, and reality. I take a very relativistic view of security: I think it depends on the observer. Most security decisions involve a variety of people, and stakeholders with specific interests will try to influence the decision. I call that their agenda.
And the agenda (this is marketing, this is politics) is to try to convince you to hold one model at the expense of another; or to persuade you to ignore the model and trust your feelings; or to marginalize the people whose models you dislike.
This is not rare. A great example is the risk of smoking. The last 50 years have shown how the perception of smoking's harm has changed, and how the tobacco industry has fought a model it did not like.
Compare that with the secondhand-smoke debate, roughly 20 years behind it. Or think about seat belts. When I was a kid, nobody wore them. Today, no child will let you drive off until you have buckled your seat belt.
Compare that with the airbag debate, roughly 30 years behind it.
These are all examples of models changing.
What we learn from this is that changing models is hard. Models are difficult to dislodge. And when a model matches your feelings, you don't even know you are using a model.
There is another cognitive distortion at work here: confirmation bias, the tendency to accept information that is consistent with our beliefs and to reject data that contradicts them.
So we are likely to ignore facts that do not fit our model, even when they are compelling. It takes very strong evidence to make us pay attention to them.
This is especially true of complex new models with long-term consequences. Global warming is a great example. We are terrible at models that span 80 years. We can manage until the next harvest. We can plan a family and raise children. But 80 years is too hard for us. That is why such a model is so difficult to accept.
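One standard way economists make the long-horizon problem concrete (my illustration, not the lecture's) is exponential discounting: even a modest annual discount rate makes a cost 80 years away feel negligible today. The 5% rate below is an arbitrary assumption.

```python
# Sketch of why 80-year models are hard: a future harm, discounted.
# The 5% annual rate and the cost of 100 are invented for illustration.

def present_value(future_cost, annual_rate, years):
    """Value today of a cost paid `years` from now, discounted at `annual_rate`."""
    return future_cost / (1 + annual_rate) ** years

# A 100-unit harm at next year's harvest still feels like roughly 95 today...
next_harvest = present_value(100, 0.05, 1)
# ...but the same harm 80 years out feels like roughly 2: almost invisible.
eighty_years = present_value(100, 0.05, 80)
```

Whether or not people literally discount this way, the arithmetic illustrates the asymmetry the lecture points to: the next harvest looms large, while an 80-year consequence barely registers.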
We can hold contradictory models simultaneously; this is often called cognitive dissonance. But sooner or later, the new model replaces the old one.
Strong emotion can create a model. The September 11th attacks created a new security model in many people's minds. So can a personal brush with crime, a health scare, terrifying news of an epidemic. Psychiatrists call these flashbulb events. They can create a model instantly, because they are so emotionally charged.
In the technological world we lack the experience to judge models ourselves, so we rely on the judgment of others. We rely on proxies. And this works as long as the proxies get it right.
We rely on the health ministry for decisions about drug safety. I flew here yesterday. I did not inspect the plane personally; I trusted another group of people to determine whether my plane was safe to fly.
No one in this room is afraid the roof will collapse, not because we checked it personally, but because we know the local building codes are reliable. It is just one of the models we take on faith. And that's fine.
What we want is for people to learn better models, bring their feelings into line with those models, and thereby make better trade-offs. When the model breaks down, you have two options:
1. You can fix people's feelings directly, by appealing to their emotions. That is manipulation, but it can work.
2. Or, more honestly, you can fix the model.
Change happens slowly. The smoking debate has taken 40 years, and that was a comparatively easy one.
There are harder cases. My suggestion is that awareness is our only real hope.
And I lied to you!
Remember feeling, model, reality? I said that reality does not change. Actually, it does.
We live in a technological age, and reality is constantly changing. So, perhaps for the first time in the history of our species, feeling chases model, model chases reality, and reality keeps moving. They may never catch up with one another.
It is hard to know. But in the long run, both feeling and reality matter.
I want to close with two short illustrative stories.
In 1982 (I don't know whether you remember this) there was a short epidemic of Tylenol poisonings in the United States. A terrible story: someone took a bottle of Tylenol, put poison in it, closed it up, and put it back on the shelf. Someone else bought it and died. It terrified people, and there were a couple of copycat incidents. There was no meaningful real danger, but people were scared. And that is how the tamper-evident drug industry was invented. Those tamper-evident caps came out of this. It is pure security theater. As a homework assignment, think of ten ways to get around it. I'll give you one: a syringe. But it made people feel better. It brought their feeling of security more in line with reality.

The last story: a few years ago, a friend of mine had a baby, and I visited her in the hospital. It turns out that when a baby is born now, they put an RFID bracelet on the baby and a matching one on the mother, so that if anyone other than the mother carries the baby out of the maternity ward, an alarm goes off. I said: "That's clever! I wonder how common baby-snatching from hospitals actually is?" I went home and looked it up: it basically never happens. But think about it. If you are a hospital and you need to take a baby away from its mother, out of her sight, to run some tests, you had better have some very good security theater, or she will tear your arm off.

So this matters for those of us who design security, who look at security policy, or even at public policy in ways that affect security. It is not reality alone; it is feeling and reality together. What matters is that they be about the same. If our feelings match reality, we make better security trade-offs.
Thank you for your attention.