
Make the site work on touch devices.



Touchscreens on mobile phones, tablets, laptops and desktops have opened up a whole host of new interactions for web developers. In this translated guide, Patrick Lauke covers the basics of working with touch events in JavaScript. All of the examples discussed below are included in the accompanying archive.


Do we need to worry about touches?


With the advent of touch devices, the main question developers ask is: "What do I need to do to make sure my site or application works on them?" Surprisingly, the answer is: nothing. By default, mobile browsers handle most sites that were never designed for touch devices. They cope not only with static pages, but also with interactive, JavaScript-heavy sites whose scripts are bound to events such as hovering the cursor.
To achieve this, browsers emulate, or simulate, mouse events on the device's touchscreen. A simple page test (example1.html in the attached files) shows that even on a touch device, pressing a button fires the following sequence of events: mouseover > mousemove > mousedown > mouseup > click.

These events fire in rapid succession, with virtually no delay between them. Note the mousemove event: it fires exactly once, which is just enough to run scripts triggered by mouse movement at least one time.

If your site responds to mouse actions, its functions will in most cases still work on touch devices, without requiring additional modifications.
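You can observe this sequence yourself with a tiny recorder along the lines of the sketch below (a minimal sketch; the `button` element and the surrounding page are assumed):

```javascript
// Minimal event-order recorder (a sketch): collects event names as they
// fire, so the synthetic sequence can be inspected afterwards.
var eventLog = [];

function recordEvent(e) {
  eventLog.push(e.type);
}

// In a browser you would attach the recorder to an element, e.g.:
// ['mouseover', 'mousemove', 'mousedown', 'mouseup', 'click'].forEach(function (type) {
//   button.addEventListener(type, recordEvent, false);
// });
```

Tapping the button on a touch device would then leave the full synthetic sequence in `eventLog`.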

Mouse event simulation issues


Delayed clicks


On touchscreens, browsers deliberately introduce an artificial delay of roughly 300 ms between the touch action (for example, tapping a button or a link) and the actual firing of the click event. This delay lets users perform double-taps (for example, to zoom in and out on content) without accidentally activating other elements of the page.

This delay becomes a problem if you want to build a site that responds to user actions as quickly as a native application would; that kind of sluggishness is not what users expect.

Finger tracking


As we noted above, the synthetic events dispatched by the browser include only a single mousemove event. If the user moves a finger across the screen too much, no synthetic events are generated at all: the browser interprets such movement as a scroll-like gesture.

This becomes a problem if your site is controlled by mouse movements — for example, a drawing application.

Let's create a simple canvas-based application (example3.html). Rather than dwell on the specific implementation, let's look at how the script reacts to mouse movement.

    var posX, posY;
    ...
    function positionHandler(e) {
        posX = e.clientX;
        posY = e.clientY;
    }
    ...
    canvas.addEventListener('mousemove', positionHandler, false);

If you test the example with a mouse, you can see that the pointer position is tracked continuously as the cursor moves. On a touch device, however, the application does not respond to finger movement; it responds only to a tap, which fires the single synthetic mousemove event.

Looking deeper


To solve these problems, we have to drop down a level of abstraction. Touch events first appeared in Safari for iOS 2.0 and, after being implemented in almost all browsers, were standardized in the W3C Touch Events specification. The new events defined by the standard are touchstart, touchmove, touchend and touchcancel. The first three are roughly equivalent to the standard mousedown, mousemove and mouseup.

touchcancel fires when a touch interaction is interrupted, for example when the user's finger moves outside the current document. Observing the order in which touch and synthetic events fire for a tap, we get (example4.html):

touchstart > [touchmove]+ > touchend > mouseover > (a single) mousemove > mousedown > mouseup > click

All the touch events fire first: touchstart, one or more touchmove events (depending on how steadily the user presses the button without moving the finger across the screen), and touchend. After that, the synthetic events run and the final click occurs.



Detecting touch events


A simple script can be used to determine whether the browser supports touch events.

 if ('ontouchstart' in window) { /* browser with Touch Events support */ } 

This fragment works great in modern browsers. Older ones have quirks and inconsistencies that can only be detected by going to great lengths. If your application has to support older browsers, try the Modernizr library and its feature-detection mechanisms, which cover most of these inconsistencies.

When detecting support for touch events, we must be clear about what exactly we are testing.

The snippet above checks only the browser's ability to recognize touch events; it does not tell us that the application is actually open on a touchscreen.
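The same check can be factored into a small helper so that it can be exercised with any window-like object (a sketch; the function name is ours, not from the original article):

```javascript
// Feature test from the article, factored into a function.
// Passing the object in (instead of reading the global `window` directly)
// makes the check easy to exercise outside a browser.
function supportsTouchEvents(win) {
  return 'ontouchstart' in win;
}

// In a browser: supportsTouchEvents(window)
```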

Working around the click delay


If we test the sequence of events dispatched by the browser on touch devices and include timing information (example5.html), we will see that the 300 ms delay appears after the touchend event:

touchstart > [touchmove]+ > touchend > [300ms delay] > mouseover > (a single) mousemove > mousedown > mouseup > click

So, if our scripts react to a click, we can get rid of the browser's default delay by reacting to touchend or touchstart instead. touchstart is appropriate for interface elements that must react the instant the screen is touched, for example control buttons in HTML games.



Again, we should not make the false assumption that touch event support means the application is running on a touch-only device. Here is one of the most common tricks, often mentioned in articles on mobile optimization.

    /* if touch supported, listen to 'touchend', otherwise 'click' */
    var clickEvent = ('ontouchstart' in window ? 'touchend' : 'click');
    blah.addEventListener(clickEvent, function() { ... });

Although well intentioned, this scenario is based on a mutually exclusive principle: the element responds to either click or touchend, depending on browser support. This causes problems on hybrid devices, where touch is supported and the script therefore listens only for touchend, so any interaction via mouse or trackpad is silently ignored.

A more robust approach takes into account both types of events:

    blah.addEventListener('click', someFunction, false);
    blah.addEventListener('touchend', someFunction, false);

The problem now is that the function executes twice: once on touchend, and a second time when the synthetic events and the click fire. This can be avoided by suppressing the default mouse events with preventDefault(). We can also avoid duplicating code by simply having the touchend handler trigger the real click event.

    blah.addEventListener('touchend', function(e) {
        e.preventDefault();
        e.target.click();
    }, false);
    blah.addEventListener('click', someFunction, false);

There is a catch with preventDefault(): calling it suppresses any other default browser behavior as well. If we apply it directly to the initial touch events, all other touch activity is blocked too, including scrolling, long-press gestures and zooming. Sometimes that is exactly what we want, but the method should be used with caution.

This sample code is not optimized. For a robust implementation, take a look at FTLabs' FastClick.



Motion tracking with touchmove


Armed with this knowledge of touch events, let's return to the tracking example (example3.html) and see how it can be changed to track finger movement on a touchscreen.

Before we look at the specific changes to the script, we first need to understand how touch events differ from mouse events.

Anatomy of a touch event


According to Document Object Model (DOM) Level 2, functions that respond to mouse events receive a MouseEvent object as a parameter. This object includes, among its properties, the clientX and clientY coordinates, which the script above uses to determine the current mouse position.

For example:

    interface MouseEvent : UIEvent {
        readonly attribute long screenX;
        readonly attribute long screenY;
        readonly attribute long clientX;
        readonly attribute long clientY;
        readonly attribute boolean ctrlKey;
        readonly attribute boolean shiftKey;
        readonly attribute boolean altKey;
        readonly attribute boolean metaKey;
        readonly attribute unsigned short button;
        readonly attribute EventTarget relatedTarget;
        void initMouseEvent(...);
    };

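For comparison, the TouchEvent interface defined in the W3C Touch Events specification looks roughly like this:

```
interface TouchEvent : UIEvent {
    readonly attribute TouchList touches;
    readonly attribute TouchList targetTouches;
    readonly attribute TouchList changedTouches;
    readonly attribute boolean altKey;
    readonly attribute boolean metaKey;
    readonly attribute boolean ctrlKey;
    readonly attribute boolean shiftKey;
};
```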
A TouchEvent contains three different touch lists: touches (all touch points currently on the screen), targetTouches (only the touch points on the element the event fired on) and changedTouches (the touch points involved in the current event).



Each of these lists is an array of individual Touch objects. In them we find our familiar coordinate pairs, clientX and clientY, among other properties.

    interface Touch {
        readonly attribute long identifier;
        readonly attribute EventTarget target;
        readonly attribute long screenX;
        readonly attribute long screenY;
        readonly attribute long clientX;
        readonly attribute long clientY;
        readonly attribute long pageX;
        readonly attribute long pageY;
    };

Using touch events to track fingers


Let's go back to the canvas example. We need to change the function so that it responds to both touch events and mouse actions. For now it is enough to track the movement of a single touch point, so we simply grab the clientX and clientY coordinates of the first element in the targetTouches array.

    var posX, posY;

    function positionHandler(e) {
        if ((e.clientX) && (e.clientY)) {
            posX = e.clientX;
            posY = e.clientY;
        } else if (e.targetTouches) {
            posX = e.targetTouches[0].clientX;
            posY = e.targetTouches[0].clientY;
            e.preventDefault();
        }
    }

    canvas.addEventListener('mousemove', positionHandler, false);
    canvas.addEventListener('touchstart', positionHandler, false);
    canvas.addEventListener('touchmove', positionHandler, false);

When testing the modified script on a touch device (example6.html), you will see that tracking a single finger now works reliably.
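The handler's branching logic can also be pulled out into a pure function, which makes it easy to test with plain objects standing in for real events (a sketch of the same logic; the function name is ours, not from the article):

```javascript
// Pure version of positionHandler's decision logic: given an event-like
// object, return the coordinate pair to track, or null if there is none.
function extractPoint(e) {
  if (e.clientX !== undefined && e.clientY !== undefined) {
    // Mouse event: coordinates live directly on the event object.
    return { x: e.clientX, y: e.clientY };
  }
  if (e.targetTouches && e.targetTouches.length > 0) {
    // Touch event: take the first finger on the target element.
    var t = e.targetTouches[0];
    return { x: t.clientX, y: t.clientY };
  }
  return null;
}
```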

If we want to extend the example to support multitouch, we have to adjust the original approach slightly. Instead of a single pair of coordinates, we keep a whole array of them and process it in a loop. This lets us track both single mouse pointers and multitouch input (example7.html).

    var points = [];

    function positionHandler(e) {
        if ((e.clientX) && (e.clientY)) {
            points[0] = e;
        } else if (e.targetTouches) {
            points = e.targetTouches;
            e.preventDefault();
        }
    }

    function loop() {
        ...
        for (var i = 0; i < points.length; i++) {
            /* Draw circle at points[i].clientX / points[i].clientY */
            ...
        }
    }




Performance issues


As with mousemove, touchmove events can fire at a very high rate while fingers are moving. It is advisable to keep heavy code, such as complex calculations or full redraws, out of the handler for every move. This matters especially on older, less powerful touch devices.

In our example we do the absolute minimum: we only store the latest array of mouse or touch coordinates. The application code then does its drawing independently, in a separate loop driven by setInterval.

If the number of events handled by the script is still too high, it may be worth using a dedicated throttling solution such as limit.js.
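The idea behind such libraries can be sketched in a few lines (a simplified throttle of our own, not limit.js itself): the wrapped handler runs at most once per given interval, and intermediate calls are simply dropped.

```javascript
// Simplified throttle: fn runs at most once every `wait` milliseconds.
function throttle(fn, wait) {
  var last = 0; // timestamp of the last accepted call
  return function () {
    var now = Date.now();
    if (now - last >= wait) {
      last = now;
      fn.apply(this, arguments);
    }
    // Calls arriving sooner than `wait` ms after the last one are dropped.
  };
}

// Usage sketch (names as in the earlier examples):
// canvas.addEventListener('touchmove', throttle(positionHandler, 50), false);
```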

By default, browsers on touchscreen devices cope well with scripts written only for the mouse, but there are situations where you need to optimize your code for touch interaction explicitly. In this lesson we have covered the basics of working with touch events in JavaScript. Hopefully it will come in handy.

Source: https://habr.com/ru/post/227175/

