
Diving into Acid3

(This is the first article in a series of articles reviewing various browser tests.)



What is Acid3? Who came up with it? How does it work, and what does it really measure? These and other questions we will pose in this article and try to answer.



What is Acid3?



Acid3 is the third in a series of special tests (preceded by Acid1 and Acid2), written "to help browser vendors check standards support in their products." Specifically, Acid3 aims to test specifications relevant to the development of dynamic Web 2.0 applications.






Acid3 includes 100 subtests covering 19 different specifications.



Among experts, attitudes toward this test are rather ambivalent: on the one hand, all knowledgeable web developers understand that the test shows little and can hardly serve as an absolute criterion of web-standards support; on the other hand, hardly anyone would claim that the test has had no effect at all.



Among casual users and journalists, however, the test still serves as one of the browser bragging metrics. At some point one could even observe a picture suggesting a clear correlation between Acid3 scores and web-standards support.







(It is also interesting, by the way, that according to Google Trends, Russia has persistently been among the top three countries where Acid3 is discussed the most over the last three years.)



As a measure of a browser's "quality", Acid3 has perhaps become the most visible and widely used score, not only in the community and among journalists, but occasionally even with individual companies :)







In the marketing sense, an additional deceptive impression is created by the "magic of numbers" and the habitual association 100 ≈ 100%, which, of course, is absolutely untrue as far as standards support is concerned.



As an example, here is a video from Vesti.Net dated November 20, 2010:

... In general, both devices successfully pass the Acid3 test for web-standards support, scoring 100 points out of 100 ...


And six months before that, Compulenta wrote about a preview version of IE9:

Slowly but surely, the browser is learning an adequate understanding of the underlying web standards. While the first preview build "knocked out" 55 points out of a hundred possible in Acid3, the second build already scores 68 points.


And this is how, the year before last, iXBT wrote about Safari 4:

Safari 4 is the industry's first web browser whose final version completely passes the Acid3 web-standards compatibility test.


And the last example, from about the same time, cNews about Firefox Mobile (Fennec):

The Acid3 test is a test page from the Web Standards Project that determines how well a browser engine conforms to generally accepted standards, how well it handles new HTML and CSS specifications, and whether it is capable of displaying web pages correctly.


As Eric Meyer writes in this context:

"What matters here is that Acid3 is not really a test of a broad range of standards. It is just a small exhibit, something like a Potemkin village. And that is a shame, because if anything is needed now, it is comprehensive test suites for the specifications: XHTML, CSS, DOM, SVG."




So where is the truth, and what is really inside? That is what we are going to figure out.



The authors



The main Acid3 developer is Ian Hickson, who previously developed the Acid2 test. Hickson is part of the CSS standards development team; in particular, he co-edited the CSS 2.1 specification, and he also contributes heavily to the preparation of HTML5, being the editor of that and other related specifications.



Ian has worked at the Mozilla Foundation, Netscape and Opera Software, and now works at Google.



While calling Ian the main developer, one cannot fail to mention his colleagues, who also contributed:



Initially, Acid3 was conceived as an independent test. Currently, the Acid tests are also promoted as one of the activities of WaSP (the Web Standards Project). (Sometimes this is misleading, because having an organization with "web standards" in its name behind a test is sometimes a "weighty" argument for raising its status.)



Internal organization



The most interesting thing about Acid3 is precisely its internal structure and the details of the processes that lead to the particular numbers and colored squares of the final image.



To understand what's inside, it's time to dig into the source code of the test; fortunately, all modern browsers come with developer tools (in most cases built in).



The test consists of 5 main parts:

The whole testing process runs dynamically and is implemented in JavaScript, and it is this part we will focus on. (An interesting CSS detail: the images used are embedded as data URIs.)
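As an aside on that CSS detail: a data: URI embeds the image bytes directly in the stylesheet, so no separate image file has to be fetched. A minimal sketch of the idea (the helper name toDataUri is my own illustration, not something from the Acid3 source):

```javascript
// Hypothetical helper (not from Acid3): build a data: URI for a
// text-based image format such as SVG.
function toDataUri(mimeType, payload) {
  // Percent-encoding keeps the URI valid without resorting to base64.
  return "data:" + mimeType + "," + encodeURIComponent(payload);
}

var svg = '<svg xmlns="http://www.w3.org/2000/svg" width="1" height="1"/>';
var uri = toDataUri("image/svg+xml", svg);
// The result can then be used in CSS, e.g. background: url(...).
```

Binary formats are typically base64-encoded instead, but the principle is the same: the image travels inside the stylesheet itself.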



Testing engine


The engine code for testing includes:

Leaving aside the details of how the image is formed with CSS (it is not a crucial point here), let's look at how the update function works and how the test suite is organized.



Test suite


From the external user's point of view, each test is a separate function with no input parameters that returns some indication of success.



All tests are divided into 6 sets of 16 tests:



Plus there are individual tests numbered 0, 97, 98 and 99, formally combined into set 7.
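The numbering scheme above can be illustrated with a small sketch (my own illustration, not code from Acid3; the exact assignment of tests 1-96 to buckets 1-6 in order is an assumption):

```javascript
// Hypothetical illustration of the bucket layout described above:
// tests 1..96 fall into six buckets of 16 tests each, while tests
// 0, 97, 98 and 99 are formally grouped into bucket 7.
function bucketOf(testNumber) {
  if (testNumber === 0 || testNumber >= 97) return 7;
  return Math.floor((testNumber - 1) / 16) + 1;
}
```

Under this layout, bucketOf(87) gives 6, consistent with the return value of test 87 shown below.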



As an indication of success, each test returns the number of the set it belongs to. For example, test 87 looks like this:

```javascript
function () {
  // test 87: Date tests -- years
  var d1 = new Date(Date.UTC(99.9, 6));
  assertEquals(d1.getUTCFullYear(), 1999, "Date.UTC() didn't do proper 1900 year offsetting");
  var d2 = new Date(98.9, 6);
  assertEquals(d2.getFullYear(), 1998, "new Date() didn't do proper 1900 year offsetting");
  return 6;
}
```
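The behavior this test relies on is the ECMAScript rule that integer year values from 0 to 99 are interpreted as 1900 + year, both by Date.UTC() and by the Date constructor:

```javascript
// Two-digit years are offset by 1900 (per ECMAScript), which is
// exactly what test 87 asserts.
var utcYear = new Date(Date.UTC(99, 6)).getUTCFullYear();   // 1999
var localYear = new Date(98, 6).getFullYear();              // 1998
```

(The fractional arguments 99.9 and 98.9 in the test additionally exercise the integer conversion of the year argument.)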








Some tests that involve interaction with external files may return a special "retry" value, signalling that the test should be rerun due to possible network issues (in that case, up to 500 attempts are made).



The internal structure of each individual test could be dissected separately but is hardly of great interest. Everything there is straightforward: the correctness of a solution to some problem is checked (see also below), sometimes with quite a large number of checks per test (I'm not sure that grouping so many checks together is good practice from a code-writing point of view).



Any of the checks may throw an error, which is caught in the outer loop (see below).
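To make that mechanism concrete, here is a hedged sketch of what an assertEquals helper like the one used in test 87 might look like (the real Acid3 helper differs in detail):

```javascript
// Hypothetical sketch (not the actual Acid3 helper): compare two values
// and throw on mismatch, so that the driver's try/catch records a failure.
function assertEquals(actual, expected, message) {
  if (actual !== expected) {
    throw new Error(message + " (expected " + expected + ", got " + actual + ")");
  }
}
```

On success it simply returns, and the test function goes on to its next check or to its final return statement.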



The functions describing the tests are combined into a single array.



update()


As mentioned above, the update() function is directly responsible for running the tests, collecting the results, and forming the log and the final image.



Technically, the function is recursive with a timeout delay: each next test starts 10 milliseconds after the previous one, and tests requiring a network connection (result == "retry") are retried up to 500 times.



A simplified version of the testing loop looks like this:

```javascript
function update() {
  if (index < tests.length) {
    try {
      var result = tests[index]();
      if (result == "retry") {
        retry += 1;
        if (retry < 500) {
          setTimeout(update, delay);
          return;
        }
        fail("timeout -- could be a networking issue");
      } else if (result) {
        // record the pass for the corresponding bucket
      } else {
        fail("no error message");
      }
    } catch (e) {
      // record the failure
    }
    retry = 0;
    index += 1;
    setTimeout(update, delay);
  } else {
    // all tests done: finalize the log and the image
  }
}
```




When the appearance is updated, a letter "P" is appended to the class name of the rectangle corresponding to the test's set. Depending on the number of "P" letters accumulated, one or another CSS rule applies:

```html
<style type="text/css">
  /* colours for them */
  .z, .zP, .zPP, .zPPP, .zPPPP, .zPPPPP { background: black; }
  .zPPPPPP, .zPPPPPPP, .zPPPPPPPP, .zPPPPPPPPP,
  .zPPPPPPPPPP { background: grey; }
  .zPPPPPPPPPPP, .zPPPPPPPPPPPP, .zPPPPPPPPPPPPP,
  .zPPPPPPPPPPPPPP, .zPPPPPPPPPPPPPPP { background: silver; }
  #bucket1.zPPPPPPPPPPPPPPPP { background: red; }
  #bucket2.zPPPPPPPPPPPPPPPP { background: orange; }
  #bucket3.zPPPPPPPPPPPPPPPP { background: yellow; }
  #bucket4.zPPPPPPPPPPPPPPPP { background: lime; }
  #bucket5.zPPPPPPPPPPPPPPPP { background: blue; }
  #bucket6.zPPPPPPPPPPPPPPPP { background: purple; }
</style>
```




It is from this scheme that the rectangles are gradually repainted as the tests pass, starting with black and ending with a specific color for each fully passed set.
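Putting the two pieces together, the class update for a passed test can be sketched like this (the function name and stub are my own illustration; the actual Acid3 code differs):

```javascript
// Hypothetical sketch: append a "P" to the class of the rectangle for
// the given bucket, so that rules like .zPPPPPP pick up the next color.
function markPass(bucketNumber, doc) {
  var rect = doc.getElementById("bucket" + bucketNumber);
  rect.className += "P"; // "z" -> "zP" -> "zPP" -> ...
}

// Outside a browser, the idea can be exercised with a stub document:
var stub = { className: "z" };
var fakeDoc = { getElementById: function (id) { return stub; } };
markPass(3, fakeDoc);
```

After sixteen passes the class reaches sixteen "P"s and the #bucketN rule with the final color matches.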



What does Acid3 test?



As stated above, Acid3 contains 100 tests, which can be divided into six groups:

The set of tests and the composition of the specifications covered are certainly impressive, but returning to the quotations at the beginning of the article, there are two big complaints about this composition:

  1. Frankly, this is far from all the standards a browser should ideally support. For example, CSS3 consists of about 30 modules, yet only two are "represented" here: CSS3 Selectors and CSS3 Media Queries.
  2. The test sets for each of the standards are extremely far from showing a real picture of support for that standard. Such checks can catch only two things: bugs in the implementation, and the complete absence of support for the specific functionality being tested (but not for the standard as a whole).



    The modern approach: large test suites developed by the community (browser vendors, experts, web developers, etc.) that cover specific specifications reasonably fully. For example, the CSS 2.1 test suite contains thousands of tests.
And one not-so-big complaint concerns the future of individual specifications being tested (see the section "Why don't Firefox 4 and IE9 score 100 points?").



The bottom line on the content: Acid3 tests no more than what it tests, namely individual elements of individual standards with a small number of checks. Statistically speaking, nothing.



Interesting Facts



Acid3 Competition


While working on the test, Ian Hickson announced a competition for additional tests (16 of them) in order to bring the total up to a round number (100). Among the requirements was this:

The test should fail (throw an exception) in a fresh build of Firefox or WebKit (ideally in both). (Opera and IE already fail a sufficient number of tests, and I don't want to add more tests that fail only in them. Of course, if you find something that passes in Firefox or WebKit but fails in Opera or IE, that would be even better.)




(By the way, Acid3 was released around the same time Microsoft released the public beta of IE8, whose main functionality was already frozen and could not be changed regardless of the results of tests like Acid3.)



Bugs


As Ian Hickson writes, after the initial announcement of Acid3 (March 2008), individual tests changed due to discovered bugs, and there was even a case when a test had to be changed because the specification itself changed. At one point, one of the vendors (we won't point fingers) claimed it fully passed Acid3, but it turned out there was an error in the tests :)



The last update of the test suite was made about a year ago, in early April 2010.



Acid 4 and lessons from Acid3


Some time after the release of Acid3, preliminary information about Acid4 appeared. Although those plans have not come to pass to date, some interesting points in the form of lessons learned from Acid3 deserve attention, I think:

Why don't Firefox 4 and IE9 score 100 points?


As you know, Firefox 4 and IE9 do not score 100 points in Acid3, and many journalists and people who do not follow all the twists and turns of events continue to be surprised by this.



Alexander Limi (a GUI designer at Mozilla) comments, referring to a comment by Boris Zbarsky (a Mozilla engineer):



"The remaining three points concern SVG Fonts. Opera and WebKit implemented a small subset of SVG 1.1 Fonts sufficient for passing Acid3. We don't want to embed even a small subset into Gecko until we are sure it will benefit developers or users. At the same time, implementing the full specification is quite a difficult task; SVG Fonts were designed without integration with HTML.



At the moment, the SVG WG has decided that SVG Fonts will be removed from the core SVG specification and moved to a separate one, which will require serious work if anyone undertakes to implement it fully."


Preference is given to WOFF.



Dean Hachamovitch (Microsoft) expresses a similar lack of enthusiasm for implementing SVG Fonts, and adds this about the "missing" SMIL implementation:

"Support for SMIL animation in SVG is far from strong in the web community. The leader of the SVG standardization effort wrote that no SMIL support in its current status may be the best option, since the SVG WG is coordinating with the CSS WG regarding changes in the animation and filter extensions."




When it comes to animation, preference is given to JavaScript and CSS.



Why is Acid3 important?



It would seem that the internal content of Acid3 and its assessment in the professional community indicate very clearly that the test cannot be viewed as a criterion for assessing the quality of standards support.



In practice, such a selective, "showcase" approach, built around bugs in various browsers rather than large-scale functional checking, does not carry much value and comes across as frank populism.



It is unlikely that any browser developer takes Acid3 seriously as a quality criterion for their product. At the same time, the bugs whose checks are embedded in many Acid3 tests did end up being fixed along with other improvements. In this sense, Acid3 is a good beacon showing that developers are moving in the right direction (as long as they do not tune the browser to specific tests).



Ultimately, in practice (that is, in marketing), Acid3 has played, and in places continues to play, a significant role. And once the flywheel has been spun up, it is hardly possible to ignore it completely.



Acid3 is also interesting and valuable as a good example of automated browser testing available to ordinary users, with a vivid display of results understandable to the broad masses.



Marketing will be marketing :) Even if it is web-standards marketing.

Source: https://habr.com/ru/post/118076/


