Jim Johnson is the founder and chairman of the board of the Standish Group, a globally respected firm that researches and analyzes the effectiveness of IT projects. He is best known for the study "Why do projects fail?" and other work on system cost and availability. He is also a pioneer of modern research techniques such as virtual focus groups and case-study analytics.
Undoubtedly, the Standish Group's greatest fame came from the CHAOS Chronicles, a study that gathered material over 12 years and covered more than 50,000 completed IT projects through focus groups, detailed surveys, and interviews with senior managers. The study aims to measure the scale of failure among software development projects, the decisive failure factors, and ways to reduce those risks. In 1994, the Standish Group published its first CHAOS research report, documenting the billions of dollars spent on projects that were never completed. Since then, the report has become one of the most frequently cited in the industry.
Taking some time off during the holidays, Jim Johnson spoke with me this week about how the study is conducted and about the role of Agile methods in the research results. We were also joined by Gordon Divitt, Executive Vice President of Acxsys Corporation, a software industry veteran who has taken part in CHAOS University events since their inception.
Please tell us how the first CHAOS report was compiled?

JJ: We have been doing this research for quite a long time, so let me tell you a little of its history too.
Our first subject of study was follow-on software sales. At the time we were leading an IBM group in Belgium, about 100 people, and we could not account for those sales at all. I mean, when you sell a toolkit, you expect a certain number of usage agreements to follow. But we did not see the numbers people were counting on, so we interviewed people about why such agreements were not being concluded. They replied that their projects would not be completed. At the time, by our data, about 40% of projects were being cancelled. People were describing a real problem.
So we began running focus groups and the like to get feedback on how to deal with it. We conducted surveys and testing to check the reliability of the sample, correcting and refining it until we had a sample representative of different industries and companies of different sizes.
Do you think the CHAOS sample is generally representative of application development?

JJ: Yes.
In that case, does it include, say, small software vendors?

JJ: Well, no. It covers only government and commercial organizations: no distributors, vendors, or consultants. So Microsoft is not represented in our sample. And organizations with a turnover under $10 million are also excluded, with rare exceptions such as Gordon's company.
GD: Does participation in CHAOS University events seem limited to small companies?
JJ: Yes, we have some really big customers. But when we interview people, we try to avoid bias and cover the entire industry. We do not just interview our own clients; we pay people to fill out questionnaires, and that happens completely separately. Church and state, you understand (smiles).
Do you pay people to complete the questionnaires?

JJ: Well, they come to focus groups, and we pay for their time. If they fill out a questionnaire, we pay them a fee or give them a gift. That is a tradition in industry research. This way we get clean results based on unbiased answers. If you do not pay, respondents are inclined to answer in a way that influences the outcome of the study; payment helps keep opinions neutral. Many donate about a quarter of the fee to charity, however, and most of those who work for the government do not take money at all.
Yes, we use case studies when analysis is requested of us, but we cannot use that information for advertising purposes or for our own reports.
The key property of the information we receive is its separation from any particular company: everything is aggregated, absolutely confidential, and no names are given. Otherwise we would not get the data.
GD: The data is the company's property.
JJ: Yes, exactly, and we never publish the survey data.
The results of your demographic research are posted on your website. But something seems to be missing, namely the principle by which you select companies for your base. Since you are targeting development failures specifically, do you look for companies with some particular type of failure? Do you select companies with a higher-than-average failure rate for the IT industry? Or do they come to you? In short, is your sample representative of failed projects as a whole?

JJ: We may take a serious failure as a case study; we look specifically for instructive failures. But not in the survey data. We do not solicit "failed" projects. Our initial research was a broad cross-section based on the widest possible sample, and the response to it was extremely strong (Note: the 1994 study covered more than 8,000 application development projects).
Now we invite people to participate through our SURF database, and we have certain entry criteria.
Participants must:
- have access to certain information about the project
- use certain applications
- use certain platforms.
There are currently about 3,000 active participants in our database. You may ask whether they volunteered because they have particular problems, but no, I would not say so. I think they participate because it gives them access to the data, which we provide only to participants. But I do not think there is any vested interest here... I mean, we are constantly reviewing everything, adjusting.
Adjusting? In what sense?

JJ: We review the data, calling back to clarify or exclude something if it looks wrong or if we are not sure, so that it has the necessary reliability. We do not just fill the database; we need clean data.

Results of the CHAOS 2004 study: project resolution
Challenged: 53%, successful: 29%, failed: 18%.
As you know, I wrote a review of how Robert Glass questions the CHAOS results. Had anyone questioned your numbers before that?

JJ: Not much. What I hear most often is "the numbers are too optimistic"; people are mostly surprised. Their reaction to the share of projects that failed (18%) through cancellation or non-use of the results is quite natural. They understand the reasons: if everyone is racing to reach the finish line, some will inevitably fall short.
Everyone knows the most common scenario in our industry: exceed the budget, miss the deadlines, fail to deliver the planned functionality. Most people comment on it this way: "I don't know a single project where everything was done on time."
We have presented our research results in many cities around the world, and they can be found on the Internet. You do not have to be a genius to understand our methodology; it is known to everyone.
GD: And no one has yet walked away from the collaboration out of doubt. People rely on the research; they have been participating in events and surveys for 12 years. They keep coming back, and that says something.
I have no doubt that the process is completely transparent, as it has always been with CHAOS, and I can confirm the openness of the Standish Group. More importantly, in my opinion, I can confirm that clients accept the results, as their long-term cooperation with the group and continued participation show. If something were "rotten in the state of Denmark", it would have surfaced long ago.
You mentioned once that people tend to confuse challenged projects with failures (I have sinned that way myself) and that you would advise against it. Could you say a few words on that?

JJ: I think many people do confuse the two concepts, challenged and failed. That is a mistake. Projects that have value and projects that have none get lumped into one pile. I would look at projects in terms of their level of value. A project that has gone over the million-dollar bar may have gained value, since the new system could bring more benefit. On the other hand, a lot of money is wasted in challenged projects. We are trying to separate the failure of a project from the failure of managing a project that still has value. This topic interests me greatly right now, and it is what our new research explores.
There is the kind of "success" that misses all three parameters at once, yet the project is still successful and important. We always ask: "How would an objective observer characterize this project?" We don't want to put anyone at a disadvantage... but if a project was completed rather than cancelled, it is challenged, not a disaster. Drawing a clear boundary is not easy.
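The boundary Johnson describes, successful versus challenged versus failed, can be sketched as a simple classification rule. This is a hypothetical illustration of the commonly cited CHAOS categories, not the Standish Group's actual scoring code.

```python
from dataclasses import dataclass

@dataclass
class Project:
    on_time: bool
    on_budget: bool
    full_functionality: bool
    cancelled_or_unused: bool  # cancelled, or delivered but never used

def classify(p: Project) -> str:
    """Rough sketch of the three CHAOS resolution categories."""
    if p.cancelled_or_unused:
        return "failed"  # never completed, or results never used
    if p.on_time and p.on_budget and p.full_functionality:
        return "successful"  # hits all three parameters
    # Completed and in use, but late, over budget, or feature-reduced.
    return "challenged"
```

By this rule a project that misses all three parameters but still ships and gets used comes out "challenged", which is exactly the case Johnson warns people not to call a failure.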
Indeed, projects change so much over time that it is hard to get reliable information about, say, the initial scope versus the final one?

JJ: We put a lot of effort into establishing the difference between a challenged project and a successful one. And sometimes people over-insure: they inflate the project budget in order to avoid failure. We have to watch for that as well.
Ten success factors
1. User involvement
2. Executive support
3. Clear business objectives
4. Scope optimization
5. Agile process
6. Experienced project manager
7. Financial management
8. Skilled staff
9. Formal methodology
10. Standard tools and infrastructure
Your research into project failure is an attempt to find the secret of success, isn't it? I noticed that number five on your list of project success factors is an agile process. Do you mean agile software development?

JJ: Yes, exactly! I am a big fan of Agile. I was using iterative development back in the early '90s, and later CHAOS reports advocated rapid returns. We really root for small projects, small teams, for agility in the technical process. Kent Beck has lectured at CHAOS University, and I have spoken at his seminars. I am a real fan of extreme programming. In my new book, My Life Is Failure, I talk about Scrum, RUP, and XP.
GD: Agile methods have helped break projects up into smaller subprojects. Even if things go wrong, you find out in time. Not like the old waterfall projects, where you burrow into a hole and get the bad news only two years later...
JJ: I think that is the secret: going step by step. I think so because we are seeing improvements. (Editor's note: below are several charts that Jim Johnson uses in lectures, taken from his 2004 study.)

Chart: Successful, failed, and challenged projects compared.

Chart: Average cost overrun, 1994-2004.

Chart: Average schedule overrun, 1994-2004.
The big problem was that projects "swelled": deadlines slipped, budgets were overspent, unneeded functionality got built. Especially in government projects. Agile methods really help here; the project barely grows. You know how it goes: "I need this", and later, "this is not as important as I previously thought."
GD: Yes, prioritizing functionality helps a lot... constantly asking "what's the business benefit?". You identify the low-value characteristics of the product and leave them at the end of the list, and if you never get around to them, it means they were not needed at all.
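Divitt's "what's the business benefit?" ranking can be sketched in a few lines. The feature names and value scores here are invented purely for illustration.

```python
# Hypothetical backlog: (feature, business-value score). All names invented.
backlog = [
    ("export to PDF", 2),
    ("core payment flow", 9),
    ("custom UI themes", 1),
    ("monthly account statements", 7),
]

# Work top-down by value; low-value items sink to the bottom and,
# if the team never gets to them, were arguably never needed at all.
prioritized = sorted(backlog, key=lambda item: item[1], reverse=True)

for feature, value in prioritized:
    print(f"{value:>2}  {feature}")
```

The point is not the sorting itself but the discipline: every item must carry an explicit business-value answer before it earns a place near the top of the list.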
Did you collect any Agile data in your research?

JJ: Yes, we tried. I hope we can show some of it. I put such questions into the questionnaires, but few people answered them; it is hard to get unambiguous data. We have been trying for a couple of studies now... I hope this time we will succeed. Within a few days we will make a final approach to the SURF members to complete the 2006 survey.
Agile may raise new questions. What do you call it when a project with adaptive planning reaches its planned scope?

JJ: Good question. With companies like WebEx, Google, Amazon, and eBay it is very hard to say, because they have streaming updates ("pipelining"), not releases: "Whatever we finish in two weeks goes out into the world." They succeed because users experience only minor changes. And by introducing changes gradually, they protect themselves. Remember, many projects fail after the development stage is over, when it is time to move on to implementation.
Streaming updates? That sounds like variable scope.

JJ: Yes. Working at a variable scope is very instructive: people are much happier when they see the work moving. People like to see progress and see results, and Agile is the embodiment of all that.
GD: For those companies the users are young, enthusiastic, and adaptable; they like change and like trying everything new. Large banks will not tolerate that approach. That is how I see it. Or is that too crude?
JJ: Not every methodology suits every project. Agile is difficult for many because their culture is difficult.
GD: One of the things I like about Agile is that you build something, you show it, and users say "I like it", or "I don't like it, can you change it?", or "no, this is the wrong way". You get fast feedback, and continuous debugging gives much better quality. With the waterfall approach, even if you carry the project through exactly as you intended, you still have to deal with all those quality problems afterwards. With Agile testing and feedback, quality improves.
JJ: People think Agile is very liberating. If they looked more closely, they would see it is just as disciplined as waterfall or anything else James Martin ever dealt with. To say there has been no improvement over the past ten years is nonsense. We have made significant progress.
Well, I promised that would be the last question. Who gets the final word?

GD: You know, Jim is a prominent truth-teller, like his whole team. That is what makes me work with them.
Jim, thank you very much for giving so much of your time to this conversation!