A developer's toolkit changes constantly. Brand-new tools appear, some lose relevance, and some stop evolving and are superseded by more advanced counterparts. It is interesting to watch this process, and I decided to share some of my recent observations in this area.
Let me also note right away that I am sympathetic to the view that the fewer heterogeneous tools you use, the better. I accept in advance any criticism about settling for minimal functionality: my position is debatable, but it deserves to exist.
Automated Testing
Functionality that was previously available only through third-party utilities is gradually moving into development environments, Visual Studio in particular. Yet developers still associate such functionality exclusively with third-party tools. Take, for example, the automated user-interface testing system introduced in Visual Studio 2010 Premium/Ultimate, which in many cases makes it possible to abandon tools such as AutomatedQA TestComplete or Borland SilkTest.
Please do not misunderstand me. I am not suggesting that you abandon an existing base of tests and rush to migrate to the testing system built into Visual Studio 2010, nor am I campaigning for its use. TestComplete remains one of the most powerful commercial products for automating software testing. However, if you already use Visual Studio 2010 and are choosing an automated testing system for a new project, you probably do not need to look far: unless you require special, specific testing features, you will not have to buy and learn any system beyond Visual Studio 2010 itself.
We use Visual Studio's user-interface testing system to test the interface of PVS-Studio. Previously we focused more on testing internal modules, but as the interface part grew, the challenge arose of moving from manual testing to automated testing. Still, our needs are rather modest, and we are pleased with our choice of the Visual Studio testing system. Figures 1 and 2 show some of its windows in action.

Figure 1 - Recording user actions in Visual Studio

Figure 2 - Control Tree in Visual Studio
The conclusion: it pays to explore new features in the tools you already use. If your testing needs are standard, the built-in functionality of Visual Studio may be enough for new projects. This way, the number of entities (tools) you have to deal with does not grow, and that is always a good thing.
Code editor
The situation with the code editor is much the same, this time concerning the Visual Assist add-in. I remember how helpful it was back in the days of Visual Studio 6.0. But many people still praise its usefulness simply because they do not know about, or have not noticed, the latest features of recent Visual Studio versions. Most of the features I valued in Visual Assist were gradually implemented in Visual Studio itself. Starting with Visual Studio 2008, I concluded that I could safely do without Visual Assist and stopped using it. With the release of Visual Studio 2010, Visual Assist became completely irrelevant for me.
I do not deny that Visual Assist has features that will never make it into Visual Studio, and I am sure those features are important, or simply convenient and useful, for some users. There are plenty of people for whom Visual Assist does not lose value over time but, on the contrary, becomes ever more indispensable. Personally, however, I used a very modest range of its capabilities and never felt the need for more, and those needs are now met by the Visual Studio environment itself. Let us consider a few examples, armed with Visual Studio 2010.
There is syntax coloring. It may not be as colorful as in Visual Assist, but it is quite pleasant and sufficient for me. And if you take into account the underlining of syntax errors, it is remarkable (see Figure 3).

Figure 3 - Code highlighting in Visual Studio 2010 and underlining of incorrect constructs
The hints for function parameters, and the name completion offered after the first characters of a name are typed, work well (see Figures 4 and 5):

Figure 4 - Hint on function parameters in Visual Studio 2010

Figure 5 - Hint for function names in Visual Studio 2010
There is one feature I really missed without Visual Assist: file-name hints. In Visual Studio 2010, this feature has finally appeared (see Figure 6).

Figure 6 - Hint for filenames in Visual Studio 2010
Visual Assist helped me make sense of even poorly formatted code when I needed to see where a bracket opens and closes. Visual Studio 2010 provides the same functionality by highlighting matching brackets, as shown in Figure 7.

Figure 7 - Highlighting matching brackets in Visual Studio 2010
The code editor in Visual Studio 2010 satisfies me completely. Perhaps this will prompt you to take another look at the Visual Studio editor.
Static analysis
When it comes to static analysis of C++ code, programmers often have an association: "those are some lint-like tools with a command-line interface that are no longer relevant." Let us try to figure out where this opinion came from. I will not speak of companies with mature development processes, where static analysis was used before, is used now, and will be used in the future. Most development processes, however, are immature, and there is nothing to be ashamed of in that: it is a shortcoming of organizations, not of programmers. For such organizations, a static analyzer is still more an exotic curiosity than an everyday tool integrated into the development process.
The C language demands great care and attention from the programmer, because it checks the correctness of the code very poorly. Probably the only more dangerous language is assembler. This is why static code analysis tools appeared, the most famous representative being lint. Lint and similar tools were widely used because there was no alternative way to detect errors at the coding stage, and they were relevant for development processes of any level of maturity.
The newer C++ language, thanks to stricter type control and other improvements, became considerably safer. C and C++ compilers began issuing warnings about many potentially dangerous situations, effectively taking over many functions of the existing static analyzers, and the use of the latter became less popular. At that point, many developers abandoned the additional level of analysis provided by third-party tools.
However, static analyzers did not become obsolete at all. They learned to detect many kinds of errors related to object-oriented programming, to warn about incorrect use of libraries (Qt, for example), and even to find errors in parallel programs. As a result, static analyzers can, as before, significantly reduce costs at the testing and maintenance stages. And, pleasantly, they are now usually not separate tools but modules that integrate into the development environment.
I want to emphasize that the view of static analyzers as obsolete command-line solutions is itself obsolete. These are modern tools that perfectly complement the standard capabilities of the compiler and other tools for improving program quality.
Take Visual Studio again as an example. Starting with Visual Studio 2005, the Team System edition includes a general-purpose static analysis subsystem, Code Analysis. Although it is an extension, it is tightly integrated into the environment, and working with its diagnostic messages is similar to working with messages issued by the compiler (see Figure 8).

Figure 8 - Configuring Code Analysis settings in Visual Studio 2010
There are also other, specialized static analyzers. One example is the PVS-Studio analyzer that we develop, which also integrates tightly into Visual Studio (see Figure 9) and detects many errors in 64-bit and OpenMP programs.

Figure 9 - Integration of PVS-Studio in Visual Studio 2010
Modern static analyzers are friendly programs that can be used not only by professionals but also by programmers who are just starting out.
Dynamic analysis
When one speaks of a dynamic analyzer for finding memory errors, the first tool everyone remembers is the DevPartner BoundsChecker Suite. However, I want to cool the fervor of its supporters and of those who recommend it on forums. It was undoubtedly a wonderful and indispensable tool for a long time, but unfortunately the project is no longer developing and is rapidly becoming obsolete. For example, BoundsChecker does not support Win64 applications: it can run in a 64-bit environment and test 32-bit applications, but it cannot work with 64-bit ones. A quote from the booklet: "DevPartner Studio supports 32-bit application development on 64-bit Windows (WOW64)".
Such a lag is unacceptable for a testing tool. Fortunately, a new titan has come to replace BoundsChecker and other dynamic analysis tools, and it is much more reasonable to focus on it: Intel Parallel Inspector, part of Intel Parallel Studio.
Intel Parallel Studio integrates into Visual Studio (see Figure 10) and adds the ability to check memory usage and threads. Memory checking in Intel Parallel Inspector covers memory leaks, detection of pointers that refer to deleted objects, detection of reads of uninitialized variables, detection of invalid memory references, stack monitoring, and so on. Thread checks include detection of race conditions and deadlocks, with call-stack analysis of adjustable depth.

Figure 10 - Setting the level of diagnostics in Intel Parallel Inspector
Best of all, it can analyze programs built with either Intel C++ or Visual C++, and both Win32 and Win64 applications are supported. Intel Parallel Studio is developing steadily, is inexpensive, and can safely be included in long-term plans.
Conclusion
The infrastructure of programmers' tools changes constantly. It is worth both looking out for new, more convenient solutions and abandoning some old ones (when their development stops). Incidentally, large companies even have dedicated employees (and sometimes whole departments) whose only job is to follow the evolution of the tools used in the development process.