Systems analysis provides a rigorous approach to decision making. It is used to evaluate alternative solutions and includes modeling and simulation, cost analysis, technical risk analysis, and effectiveness analysis.
Unlike SWEBoK, SEBoK is far less known in Russia: while preparing a master's course I could not find even partial translations of its articles. Yet the book structures very useful and otherwise scattered knowledge about the development of large systems, including systems analysis.
Since my course dealt with systems analysis, under the cut is a translation of the corresponding SEBoK chapter. Note that this is just a few chapters from one of the seven sections of the book.
P.S. I would be grateful for comments and for your opinion of this article (its quality and usefulness), and about your interest in systems analysis and systems engineering.

Basic principles of systems analysis
One of the main tasks of systems engineering is evaluating the results its processes produce. Comparison and evaluation are the central concern of systems analysis, which provides the techniques and tools needed for:
- Defining comparison criteria based on the system requirements;
- Evaluating the expected properties of each alternative solution against the selected criteria;
- Producing an overall assessment of each option, with an explanation;
- Choosing the most suitable solution.
The process of analyzing and choosing between alternative solutions to an identified problem or opportunity is described in Section 2 of SEBoK (chapter Systems Approach in System Design), which defines the basic principles of systems analysis:
- Systems analysis is an iterative process that evaluates the alternative solutions produced during system synthesis.
- Systems analysis relies on evaluation criteria derived from the description of the problem or of the system capability:
- The criteria are based on the description of an ideal system;
- The criteria should account for the required behavior and properties of the system in the final solution, across all applicable wider contexts;
- The criteria should include non-functional concerns, for example the safety and security of the system, etc. (described in more detail in the chapter “Systems Engineering and Specialty Engineering”).
- An “ideal” system may have only a “soft” description, from which “fuzzy” criteria can be derived: for example, stakeholders may favor or oppose certain kinds of solutions, and applicable social, political, or cultural conventions may also have to be taken into account, etc.
- At a minimum, the comparison criteria should include the cost and time constraints acceptable to stakeholders.
- Systems analysis provides a dedicated mechanism for analyzing alternative solutions: the trade-off study.
- A trade-off study is an interdisciplinary approach to finding the most balanced solution among a set of viable options.
- The study considers the whole set of evaluation criteria together, taking into account their limits and interrelations; a “system of assessment criteria” is created.
- When comparing alternatives, one must handle objective and subjective criteria simultaneously. Special attention must be paid to determining the influence of each criterion on the overall assessment (the sensitivity of the overall assessment).
Note: “soft” (“non-strict”) and “strict” descriptions of a system differ in how clearly the goals, objectives, and mission of the system can be defined (for “soft” systems this is often extremely difficult).

Trade-off studies
Note: in Russian-language literature the terms “analysis of alternatives” or “evaluation of alternatives” are more common.

In the context of a system description, a trade-off study consists of comparing the characteristics of each system element and each variant of the system architecture in order to find the solution that is, on balance, most suitable against the evaluation criteria. The individual characteristics are examined through cost analysis, risk analysis, and effectiveness analysis; from the systems engineering point of view, these three processes are considered in more detail below.
All of these analysis methods should follow common rules:
- Evaluation criteria are used to rank the different solutions. They can be relative or absolute: for example, a maximum unit price in rubles, a cost reduction in %, an efficiency gain in %, a risk reduction in %.
- The admissible bounds of the evaluation criteria used in the analysis are defined (for example, which types of cost must be taken into account, which technical risks are acceptable, etc.).
- Scales are used to compare quantitative characteristics. Their description should include the maximum and minimum limits, as well as how the characteristic varies within those limits (linearly, logarithmically, etc.).
- An evaluation score is assigned to each solution across all criteria. The purpose of a trade-off study is to provide a quantitative comparison of the options along three axes (each decomposed into individual criteria): cost, risk, and effectiveness. This is usually a complex task that requires building models; a minimal sketch of such a comparison follows this list.
- Optimizing characteristics or properties refines the assessment of the most promising solutions.
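To make the scoring mechanics concrete, here is a minimal sketch in Python; it is not from SEBoK, and the criteria, weights, and all numbers are invented for illustration. Each alternative is scored on a common 0..1 scale per criterion, and the scores are combined as a weighted sum:

```python
# Minimal trade-off scoring sketch. Criteria, weights, and all numbers
# are illustrative assumptions, not values from SEBoK.

# Each criterion: (weight, direction); direction = +1 if higher is better.
CRITERIA = {
    "unit_cost":      (0.4, -1),  # cost axis, lower is better
    "technical_risk": (0.3, -1),  # risk axis, lower is better
    "throughput":     (0.3, +1),  # effectiveness axis, higher is better
}

ALTERNATIVES = {
    "A": {"unit_cost": 120.0, "technical_risk": 0.2, "throughput": 900.0},
    "B": {"unit_cost": 100.0, "technical_risk": 0.5, "throughput": 1100.0},
    "C": {"unit_cost": 140.0, "technical_risk": 0.1, "throughput": 800.0},
}

def normalized(value, lo, hi, direction):
    """Map a raw value linearly onto a common 0..1 scale, where 1 is best."""
    if hi == lo:
        return 1.0
    score = (value - lo) / (hi - lo)
    return score if direction > 0 else 1.0 - score

def total_score(name):
    """Weighted sum of normalized criterion scores for one alternative."""
    total = 0.0
    for crit, (weight, direction) in CRITERIA.items():
        values = [alt[crit] for alt in ALTERNATIVES.values()]
        total += weight * normalized(
            ALTERNATIVES[name][crit], min(values), max(values), direction)
    return total

for name in sorted(ALTERNATIVES, key=total_score, reverse=True):
    print(f"{name}: {total_score(name):.3f}")
```

Real trade studies use validated scales and models rather than a simple min-max normalization, but the structure (criteria, weights, a common scale, an aggregate score) is the same.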
The decision-making process is not an exact science, so the study of alternatives has its limitations. The following problems should be kept in mind:
- Subjective evaluation criteria reflect the analyst's personal opinion. For example, if a component should be beautiful, what exactly is the “beautiful” criterion?
- Uncertain data. For example, inflation must be taken into account when estimating maintenance costs over the full life cycle of a system; how can a systems engineer predict the inflation rate for the next five years?
- Sensitivity analysis. The overall score given to each alternative is not absolute, so it is recommended to run a sensitivity analysis that considers small changes in the “weight” of each evaluation criterion. An assessment is considered robust if varying the weights does not significantly change the assessment itself (a minimal sketch follows below).
A thorough trade-off study determines the acceptable values of the results.
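Continuing the sketch above (again my own illustration, reusing its CRITERIA, ALTERNATIVES, and total_score), a crude sensitivity check can perturb each weight and verify that the preferred alternative does not change:

```python
def best_alternative():
    """The alternative with the highest aggregate score."""
    return max(ALTERNATIVES, key=total_score)

def sensitivity_check(delta=0.05):
    """Perturb each criterion weight by +/-delta and watch for ranking flips."""
    baseline = best_alternative()
    stable = True
    for crit, (weight, direction) in list(CRITERIA.items()):
        for d in (-delta, +delta):
            CRITERIA[crit] = (max(0.0, weight + d), direction)
            if best_alternative() != baseline:
                print(f"Choice flips when the {crit} weight shifts by {d:+.2f}")
                stable = False
            CRITERIA[crit] = (weight, direction)  # restore the original weight
    print("robust" if stable else "sensitive to weights")

sensitivity_check()
```

A real analysis would also renormalize the remaining weights after each perturbation; this sketch only shows the idea of testing the stability of the outcome.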
Effectiveness analysis

Effectiveness analysis is grounded in the context of use of the system or in the problem context.
The effectiveness of a solution is determined by how well it performs the primary and supporting functions of the system, which are identified from the satisfaction of stakeholder requirements. For products this is a set of general non-functional qualities, for example: safety, security, reliability, maintainability, usability, etc. These criteria are often precisely defined in the related technical disciplines and fields. For services or organizations the criteria are more often tied to the needs of the users or the goals of the organization; typical characteristics of such systems are stability, flexibility, capacity for evolution, etc.
In addition to evaluating the absolute effectiveness of a solution, the cost and time constraints must also be considered. In general, the role of systems analysis comes down to identifying solutions that can provide some level of effectiveness within the cost and time allocated to the given iteration.
If none of the solutions can provide a level of effectiveness that justifies the intended investment, it is necessary to return to the original statement of the problem. If at least one of the options shows sufficient effectiveness, a choice can be made.
The effectiveness of a solution covers several essential characteristics, including but not limited to: performance, usability, reliability, manufacturability, maintenance and support, etc. Analysis in each of these areas examines the proposed solutions from the corresponding point of view.
It is important to rank the importance of the aspects used in effectiveness analysis, establishing the so-called key performance indicators. The main difficulty of effectiveness analysis is to correctly select and order the set of aspects against which effectiveness is assessed. For example, if a product is intended for one-time use, maintainability is not a suitable criterion.
Cost analysis
Cost analysis considers full life cycle costs. The basic set of typical costs varies from project to project and from system to system; the cost structure may include both labor and non-labor costs.
| Cost type | Description and examples |
|---|---|
| Development | Design, development of tools (hardware and software), project management, testing, mock-ups and prototyping, training, etc. |
| Manufacturing of the product or provision of the service | Raw materials and supplies, spare parts and warehouse stock, resources required for operation (water, electricity, etc.), risks, removal, processing and storage of waste or scrap, administrative expenses (taxes, administration, document flow, quality control, cleaning, supervision, etc.), packaging and storage, required documentation. |
| Sales and after-sales service | Costs of the sales network (branches, stores, service centers, distributors, information, etc.), handling of complaints, provision of warranties, etc. |
| Customer use | Taxes, installation (on the customer's site), resources required for operation (water, fuel, etc.), financial risks, etc. |
| Supply | Transportation and delivery. |
| Maintenance | Service centers and on-site visits, preventive maintenance, inspections, spare parts, cost of warranty service, etc. |
| Disposal | Decommissioning, dismantling, transportation, waste processing, etc. |
Methods for estimating these costs are described in the “Planning” section (Section 3).
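As a toy illustration of a life-cycle roll-up (my own sketch, not from SEBoK; the categories, amounts, and the flat 4% inflation rate are invented, echoing the inflation problem mentioned earlier), the total cost can be estimated by summing the per-year costs of each category while inflating future expenses:

```python
# Hypothetical life-cycle cost roll-up. All categories, amounts, and the
# constant 4% yearly inflation assumption are illustrative only.
INFLATION = 0.04

# (category, years from now, cost in today's currency units)
COSTS = [
    ("development",   0, 500_000),
    ("manufacturing", 1, 300_000),
    ("customer use",  2,  50_000),
    ("maintenance",   3,  40_000),
    ("maintenance",   4,  40_000),
    ("disposal",      5,  20_000),
]

def life_cycle_cost(costs, inflation=INFLATION):
    """Sum all cost items, inflating each by the assumed yearly rate."""
    return sum(amount * (1 + inflation) ** year for _, year, amount in costs)

print(f"Estimated life-cycle cost: {life_cycle_cost(COSTS):,.0f}")
```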
Technical risk analysis
Risk is the potential inability to achieve objectives within given cost, schedule, and technical constraints. It consists of two parts:
- The probability of occurrence (the likelihood that the risk materializes and the objectives are not achieved);
- The degree of impact, or the consequences of occurrence.
Each risk has a probability greater than 0 and less than 1, a degree of impact greater than 0, and a date in the future. If the probability is 0, there is no risk; if it is 1, it is a fact, not a risk. If the degree of impact is 0, there is also no risk, since occurrence has no consequences (it can be ignored); and if the date is not in the future, it is a fait accompli.
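One way to make these constraints operational (a sketch under my own assumptions; SEBoK does not prescribe a particular formula or record layout) is to validate each risk record against them and to express risk exposure as probability times impact:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Risk:
    title: str
    probability: float  # must satisfy 0 < probability < 1
    impact: float       # consequences in cost units; must be > 0
    due: date           # must lie in the future

    def validate(self, today: date) -> None:
        if not 0.0 < self.probability < 1.0:
            raise ValueError("p = 0 means no risk; p = 1 is a fact, not a risk")
        if self.impact <= 0.0:
            raise ValueError("zero impact means the risk can be ignored")
        if self.due <= today:
            raise ValueError("a past date is a fait accompli, not a risk")

    @property
    def exposure(self) -> float:
        """Expected loss: probability times impact."""
        return self.probability * self.impact

r = Risk("supplier delay", probability=0.3, impact=200_000,
         due=date.today() + timedelta(days=365))
r.validate(today=date.today())
print(f"{r.title}: exposure = {r.exposure:,.0f}")
```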
Risk analysis in any field rests on three activities:
- Analyzing potential threats or undesired events and the likelihood of their occurrence;
- Analyzing the consequences of the identified threats and classifying them on a severity scale;
- Reducing the probability of the threats, or their level of impact, to acceptable values (a sketch of such a classification follows this list).
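A common device for the classification step (again a sketch; the 3x3 matrix and the band thresholds are my assumptions) is a risk matrix that maps probability and severity bands to a rating:

```python
# Hypothetical 3x3 risk matrix: rows are probability bands, columns are
# severity bands; the entries and thresholds are illustrative.
MATRIX = [
    # severity: low      medium    high
    ["low",    "low",    "medium"],  # unlikely
    ["low",    "medium", "high"],    # possible
    ["medium", "high",   "high"],    # likely
]

def band(value, thresholds=(0.33, 0.66)):
    """Map a 0..1 value to band index 0, 1, or 2."""
    return sum(value > t for t in thresholds)

def classify(probability, severity):
    """Classify a risk from its probability and normalized severity (0..1)."""
    return MATRIX[band(probability)][band(severity)]

print(classify(0.7, 0.5))  # -> "high"
```

Risks rated above an acceptable level are then treated by reducing either their probability or their impact.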
Technical risks materialize when the system ceases to satisfy its requirements. Their causes lie either in the requirements or in the solution itself; they show up as a shortfall in effectiveness and can have several origins:
- Incorrect assessment of technological capabilities;
- Overestimation of the technical readiness of a system element;
- Failures due to wear or obsolescence of equipment, components, or software;
- Dependence on a supplier (incompatible parts, delayed deliveries, etc.);
- The human factor (insufficient training, incorrect settings, insufficient error handling, performing inappropriate procedures, malicious actions), etc.
Technical risks should not be confused with project risks, even though the methods for managing them are the same. Although technical risks can lead to project risks, they concern the system itself, not the process of its development (described in more detail in the Risk Management section of Section 3).
Process approach
The purpose and principles of the approach
The systems analysis process is used for:
- Providing a rigorous basis for decision making, resolving conflicts between requirements, and evaluating alternative physical solutions (individual elements and whole architectures);
- Determining the degree to which the requirements are satisfied;
- Supporting risk management;
- Confirming that decisions are made only after the cost, schedule, performance, and risk effects on the design or redesign of the system have been calculated.
This process has also been called the decision analysis process (NASA, 2007); it is used to evaluate technical problems, alternative solutions, and their uncertainties in support of decision making. See the chapter “Decision Management” (Section 3) for details.
Systems analysis supports the other system definition processes:
- The stakeholder requirements definition and system requirements definition processes use systems analysis to resolve conflicts between requirements, in particular those related to cost, technical risk, and effectiveness. System requirements that carry high risks or would require significant changes to the architecture are subject to further discussion.
- The logical and physical architecture definition processes use systems analysis to evaluate the characteristics or properties of the architecture options and to justify the choice of the most effective option in terms of cost, technical risk, and effectiveness.
Like any system definition process, systems analysis is iterative: each activity is performed several times, and each pass improves the precision of the analysis.
Tasks within the process
The main activities and tasks of this process include:
- Planning the study of alternatives:
- Determining the number of alternatives to analyze, the methods and procedures to use, and the expected results (examples of items to be chosen: a behavioral scenario, a physical architecture, a system element, etc.), together with the justification;
- Creating an analysis schedule based on the availability of models, technical data (system requirements, descriptions of system properties), qualified personnel, and the selected procedures.
- Defining the selection criteria model:
- Selecting evaluation criteria from the non-functional requirements (performance, operating conditions, constraints, etc.) and/or from the property descriptions;
- Sorting and ordering the criteria;
- Defining a comparison scale for each evaluation criterion, and determining the weight of each criterion according to its importance relative to the other criteria.
- Identifying the solution options, with their associated models and data.
- Evaluating the options using the previously defined methods and procedures:
- Performing cost analysis, technical risk analysis, and effectiveness analysis, placing every alternative on the scale of each evaluation criterion;
- Scoring all alternatives on a common rating scale.
- Providing the results to the initiating process: the evaluation criteria, the assessment selections, the comparison scales, the evaluation results for all options, and possible recommendations with their justification.
Process artifacts and terminology
The process creates artifacts such as:
- A selection criteria model (list, rating scale, weights);
- Cost, risk, and effectiveness analysis reports;
- A report justifying the selection.
The process uses the terms listed in the table below.
| Term | Description |
|---|---|
| Evaluation criterion | In the context of systems analysis, an evaluation criterion is a characteristic used to compare system elements, physical architectures, functional scenarios, and other comparable items. Includes: identifier, title, description, weight. |
| Assessment selection | A justification, based on the evaluation scores, of the choice of system elements, a physical architecture, or a scenario. |
| Evaluation score (assessment) | The score obtained by system elements, physical architectures, or functional scenarios against a set of evaluation criteria. Includes: identifier, title, description, value. |
| Cost | An amount, in the selected currency, associated with a system element, etc. Includes: identifier, title, description, amount, cost type (development, production, use, maintenance, disposal), estimation method, period of validity. |
| Risk | An event that may occur and affect the objectives of the system or its individual characteristics (technical risks). Includes: identifier, title, description, status. |
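To show how such records might look in practice, here is a small sketch of the first three terms as Python dataclasses; the field names follow the table, while everything else (the types, the comments) is my assumption:

```python
from dataclasses import dataclass

@dataclass
class EvaluationCriterion:
    """A characteristic used to compare solution options."""
    id: str
    title: str
    description: str
    weight: float  # relative importance within the criteria set

@dataclass
class EvaluationScore:
    """The score one option obtains against one criterion."""
    id: str
    title: str
    description: str
    value: float  # position on the criterion's comparison scale

@dataclass
class Cost:
    """A cost record attached to a system element."""
    id: str
    title: str
    description: str
    amount: float
    cost_type: str        # development, production, use, maintenance, disposal
    method: str           # how the estimate was obtained
    validity_period: str  # when the estimate applies
```

A risk record (identifier, title, description, status) follows the same pattern; a variant with probability and impact was sketched in the risk analysis section above.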
Validation of systems analysis
To obtain trustworthy results, the following must be ensured:
- Relevance of the models and data to the system's context of use;
- Relevance of the evaluation criteria to the system's context of use;
- Reproducibility of the simulation and calculation results;
- Sufficient precision of the comparison scales;
- Credibility of the estimates;
- An acceptable level of sensitivity of the resulting scores to the weights of the evaluation criteria.
Principles of using models
- Use generic model types. Several kinds of models can be used in the context of systems analysis:
- Physical models are scale models that allow experimenting with physical phenomena. They are specific to each discipline; examples: mock-ups, test benches, prototypes, vibration tables, decompression chambers, wind tunnels, etc.
- Representation models are used mainly to model the behavior of a system; for example, state diagrams, etc.
- Analytical models are used to compute the values of estimates. They use equations or diagrams to describe the actual operation of the system and can be very simple (adding up elements) or extremely complex (a probability distribution over several variables).
- Use the models the project stage calls for. At each stage of the project, appropriately detailed models should be used:
- At the beginning of the project, simple tools give rough approximations without much cost and effort; such approximations are enough to rule out unrealistic solutions right away.
- As the project progresses, the accuracy of the data must increase so that the remaining competing options can be compared. The work is harder when the project involves a high degree of innovation.
- A systems engineer cannot model a complex system alone; experts from the relevant subject areas help with this.
- Use assessment by subject-matter experts when the value of an evaluation criterion cannot be established objectively and precisely. The assessment is carried out in four stages:
- Selecting respondents able to give qualified opinions on the question.
- Drafting a questionnaire. Questionnaires with precise questions are easier to evaluate, but if a questionnaire is too closed, there is a risk of missing significant points.
- Interviewing the specialists over the questionnaire, including an in-depth discussion of the problem, to obtain more precise opinions.
- Analyzing the results with several different people, comparing their feedback until agreement is reached on the classification of the evaluation criteria or of the solution options.
The analytical models most commonly used in systems analysis are described below.

Deterministic models. A deterministic model is one that does not rely on probability theory.
- This category includes models based on statistics: a model is built from a significant amount of data and results from previous projects. They can be applied only to system components whose technology is already well known.
- Models “by analogy” also draw on previous projects: the element under study is compared with an existing element with known characteristics, which are then refined based on specialists' experience.
- Learning curves make it possible to anticipate changes in performance or technology. One example: “every time the cumulative number of modules produced doubles, the cost of the module decreases by a constant fraction”; that is, cost(2n) = r · cost(n), which gives cost(n) = cost(1) · n^(log2 r).

Stochastic (probabilistic) models. If some of the quantities in a model are random, i.e. defined only by probabilistic characteristics, the model is called stochastic (probabilistic). In that case all results obtained from the model are also stochastic and must be interpreted accordingly. Probability theory makes it possible to classify the possible outcomes as consequences of sets of events. Such models are applicable to a limited number of events with simple combinations of possible options.

Multi-criteria models. When there are more than about 10 criteria, multi-criteria models are recommended. They are built as follows (a sketch of the weight propagation follows below):
- Build a hierarchy of criteria;
- Assign to each criterion, in each branch of the tree, a “weight” relative to the criteria of the same level;
- Compute the weight of each “leaf” criterion of each branch by multiplying all the weights along the branch;
- Evaluate each alternative solution against the criteria, aggregate the scores, and compare the alternatives;
- Using a computer, perform a sensitivity analysis to refine the result.
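To illustrate the weight propagation in a criteria hierarchy, here is a minimal sketch; the tree and its weights are invented for the example, not taken from SEBoK. Each leaf's global weight is the product of the local weights along its branch:

```python
# Hypothetical criteria hierarchy: each node is (local_weight, children).
# Local weights are relative to siblings; leaf nodes have no children.
TREE = {
    "effectiveness": (0.5, {
        "performance": (0.6, {}),
        "usability":   (0.4, {}),
    }),
    "cost": (0.3, {
        "development": (0.7, {}),
        "maintenance": (0.3, {}),
    }),
    "risk": (0.2, {}),
}

def leaf_weights(tree, inherited=1.0, prefix=""):
    """Multiply weights down each branch; return {leaf_path: global_weight}."""
    leaves = {}
    for name, (w, children) in tree.items():
        path = f"{prefix}{name}"
        if children:
            leaves.update(leaf_weights(children, inherited * w, path + "/"))
        else:
            leaves[path] = inherited * w
    return leaves

for path, w in leaf_weights(TREE).items():
    print(f"{path}: {w:.2f}")
# The global leaf weights sum to 1.0 when each sibling group sums to 1.0.
```

The resulting leaf weights can feed directly into a weighted-sum comparison like the one sketched at the beginning of this chapter.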
Practical recommendations
The main pitfalls and proven practices of systems analysis are described in the two sections below.
Pitfalls

| Pitfall | Description |
|---|---|
| An analytical model is not a decision-making tool | An analytical model produces an analytical result from the data it analyzes. It should be treated as an aid, not as a decision-making tool. |
| Models and levels of system decomposition | A model may be well suited to the n-th level of system decomposition yet incompatible with the higher-level model that uses data from the child levels. The systems engineer must ensure the consistency of models across levels. |
| An optimized whole is not the sum of optimized parts | The overall optimum of the system under study is not the sum of the optima of each of its parts. |
Proven techniques

| Technique | Description |
|---|---|
| Stay within the model's domain | Models can never capture all of a system's behavior and reactions: they operate in a limited space with a narrow set of variables. When using a model, always make sure that the input data and parameters lie within its operating domain; otherwise there is a high risk of incorrect results. |
| Evolve the models | Models should evolve throughout the project: by adjusting parameter settings, introducing new data (changes to the evaluation criteria, the functions performed, the requirements, etc.), and by adopting new tools when the previous ones reach the limit of their capabilities. |
| Use several types of models | It is recommended to use several different types of models at the same time, so that their results can be compared and other aspects of the system taken into account. |
| Preserve the modeling context | Simulation results are only valid within the context in which they were obtained: the tools used, the assumptions made, the parameters and data entered, and the spread of the output values. This context should be kept together with the results. |