
CHAPTER 13 THE FUTURE OF HTML5

on the canvas element. Similarly, audio data APIs will enable music creation in HTML5 applications. This will help round out the content-creation capabilities available to web applications and move us closer to a self-hosting world of tools to create media on and for the Web. Imagine editing audio on the Web without having to leave the browser.

Simple playback of sounds can be done with the <audio> element. However, any application that manipulates, analyzes, or generates sound on the fly needs a lower-level API. Text-to-speech, speech-to-speech translation, synthesizers, and music visualization aren’t possible without access to audio data.

We can expect the standard audio API to work well with microphone input from the device element as well as with files included via audio tags. With <device> and an audio data API, you may be able to build an HTML5 application that lets users record and edit sound from within a page. Audio clips could be stored in local browser storage, then reused and combined with canvas-based editing tools.

Presently, Mozilla has an experimental implementation available in nightly builds. The Mozilla audio data API could act as a starting point for standard cross-browser audio programming capabilities.
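As a rough sketch of what that experimentation looks like today, the following example generates a short tone with Mozilla’s audio data API. The mozSetup and mozWriteAudio methods are non-standard, exist only in those experimental builds, and may change before any standard emerges.

var output = new Audio();
// One channel at a 44.1 kHz sample rate (experimental, Mozilla-only method)
output.mozSetup(1, 44100);

// Fill half a second of samples with a 440 Hz sine wave
var samples = new Float32Array(22050);
for (var i = 0; i < samples.length; i++) {
    samples[i] = Math.sin(2 * Math.PI * 440 * i / 44100);
}

// Queue the generated samples for playback (experimental, Mozilla-only method)
output.mozWriteAudio(samples);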

Touchscreen Device Events

As Web access shifts ever more from desktops and laptops to mobile phones and tablets, HTML5 is also continuing to adapt with changes in interaction handling. When Apple introduced the iPhone, it also introduced into its browser a set of special events that could be used to handle multitouch inputs and device rotation. Although these events have not yet been standardized, they are being picked up by other vendors of mobile devices. Learning them today will allow you to optimize your web applications for the most popular devices now.

Orientation

The simplest event to handle on a mobile device is the orientation event. A handler for the orientation change event can be added to the document body:

<body onorientationchange="rotateDisplay();">

In the event handler for the orientation change, your code can reference the window.orientation property. This property holds one of the rotation values shown in Table 13-1, relative to the orientation the device was in when the page was initially loaded.

Table 13-1. Orientation Values and Their Meanings

Orientation Value    Meaning

0                    The page is being held in the same orientation as its original load.
-90                  The device has been rotated 90 degrees clockwise (right) since the original load.
180                  The device has been rotated upside-down since the original page load.
90                   The device has been rotated 90 degrees counter-clockwise (left) since the page was originally loaded.


Once the orientation is known, you can choose to adjust the content accordingly.
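For example, a minimal sketch of the rotateDisplay() handler registered above might switch between two hypothetical style classes depending on the reported angle:

function rotateDisplay() {
    // window.orientation holds 0, 90, -90, or 180 (see Table 13-1)
    var angle = window.orientation;
    if (angle === 90 || angle === -90) {
        // Rotated a quarter turn since the original load; use the wide layout
        document.body.className = "sideways";
    } else {
        // Same orientation as the original load, or upside-down from it
        document.body.className = "upright";
    }
}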

Gestures

The next type of event supported by mobile devices is a high-level event known as the gesture. Consider gesture events as representing a multitouch change in size or rotation. This is usually performed when the user places two or more fingers on the screen simultaneously and pinches or twists. A twist represents a rotation, while a pinch in or out represents a zoom out or in, respectively. To receive gesture events, your code needs to register one of the handlers shown in Table 13-2.

Table 13-2. Event Handlers for Gestures

Event Handler      Description

ongesturestart     A user has placed multiple fingers on the screen and has begun a movement.
ongesturechange    The user is in the process of moving multiple fingers in a scale or rotation.
ongestureend       The user has completed the scale or rotation by removing fingers.

During the gesture, the event handler is free to check the rotation and scale properties of the corresponding event and update the display accordingly. Listing 13-1 shows an example usage of the gesture handlers.

Listing 13-1. Example Gesture Handler

function gestureChange(event) {
    // Retrieve the amount of change in scale caused by the user gesture.
    // A value of 1.0 represents the original size; values smaller than 1.0
    // indicate a pinch inward (zoom out) and values larger than 1.0 indicate
    // a pinch outward (zoom in), based on the ratio of the scale value.
    var scale = event.scale;

    // Retrieve the amount of change in rotation caused by the user gesture.
    // The rotation value is in degrees, where positive values indicate a
    // clockwise rotation and negative values indicate a counter-clockwise rotation.
    var rotation = event.rotation;

    // Update the display based on the scale and rotation.
}

// Register our gesture change listener on a document node.
node.addEventListener("gesturechange", gestureChange, false);

Gesture events are particularly appropriate in applications that need to manipulate objects or displays, such as in diagramming tools or navigation tools.
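As a minimal sketch of that kind of manipulation (assuming a WebKit-based mobile browser and a hypothetical element with the id "photo"), a gesture handler can feed the scale and rotation values straight into a CSS transform:

var photo = document.getElementById("photo");

photo.addEventListener("gesturechange", function(event) {
    // Keep the browser from applying its own pinch-zoom to the page
    event.preventDefault();
    // Scale and rotate the element to follow the user's fingers
    photo.style.webkitTransform =
        "scale(" + event.scale + ") rotate(" + event.rotation + "deg)";
}, false);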


Touches

For those cases where you need low-level control over device events, the touch events provide as much information as you will likely need. Table 13-3 shows the different touch events.

Table 13-3. Touch Events

Event Handler      Description

ontouchstart       A finger has been placed on the surface of the touch device. Multitouch events will occur as more fingers are placed on the device.
ontouchmove        One or more of the fingers on the device has moved its location in a drag operation.
ontouchend         One or more fingers have been lifted away from the device screen.
ontouchcancel      An unexpected interruption has stopped the touch operations.

Unlike the other mobile device events, the touch events need to represent multiple simultaneous points of data, one for each finger that may be touching the screen. As such, the API for touch handling is a little more complex, as shown in Listing 13-2.

Listing 13-2. Touch API

function touchMove(event) {
    // The touches list contains an entry for every finger currently touching the screen
    var touches = event.touches;

    // The changedTouches list contains only those finger touches modified at this
    // moment in time, either by being added, removed, or repositioned
    var changedTouches = event.changedTouches;

    // targetTouches contains only those touches which are placed in the node
    // where this listener is registered
    var targetTouches = event.targetTouches;

    // Once you have the touches you'd like to track, you can reference
    // most attributes you would normally get from other event objects
    var firstTouch = touches[0];
    var firstTouchX = firstTouch.pageX;
    var firstTouchY = firstTouch.pageY;
}

// Register one of the touch listeners for our example
node.addEventListener("touchmove", touchMove, false);

You may find that the device’s native event handling interferes with your handling of the touch and gesture events. In those cases, you should make the following call within your event handler:

event.preventDefault();
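For instance, a sketch of a touchmove handler (adapted from Listing 13-2) that suppresses the browser’s built-in panning so the page can implement its own drag behavior:

node.addEventListener("touchmove", function(event) {
    // Stop the device's native scrolling for this element
    event.preventDefault();

    // Track the first finger ourselves instead
    var firstTouch = event.touches[0];
    var x = firstTouch.pageX;
    var y = firstTouch.pageY;
    // ...reposition our own content based on x and y
}, false);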
