As we explore interactivity in Three.js, it's crucial to consider the diverse ways users interact with our 3D experiences. Beyond the traditional mouse and keyboard, touch devices have become ubiquitous. This section will guide you through handling input from multiple devices, with a particular focus on the nuances of touch events.
The fundamental principle for handling input in Three.js, whether from a mouse, keyboard, or touch, is to listen for browser events. We can attach event listeners to the HTML element that contains our Three.js renderer (often a <canvas> element or its parent container). This allows us to capture user interactions as they happen.
```js
const canvas = renderer.domElement;

canvas.addEventListener('click', handleClick);
canvas.addEventListener('mousemove', handleMouseMove);

// A <canvas> only receives keyboard events when it can take focus,
// so give it a tabindex first (or attach keyboard listeners to window).
canvas.tabIndex = 0;
canvas.addEventListener('keydown', handleKeyDown);
```

Touch events, however, work somewhat differently. Instead of a single 'click' or 'mousemove' event, a touch interaction unfolds as a sequence of events:
- touchstart: Fired when a finger first touches the screen.
- touchmove: Fired when a finger moves across the screen.
- touchend: Fired when a finger is lifted from the screen.
- touchcancel: Fired if the touch interaction is interrupted (e.g., by an incoming call).
Each touch event object exposes a touches property, a TouchList containing one Touch object for every finger currently on the screen. For single-finger interactions, we'll typically be interested in event.touches[0]. One caveat: by the time touchend fires, the lifted finger has already been removed from event.touches, so it must be read from event.changedTouches instead.
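As a minimal sketch, the handlers we're about to register might start out like this, simply logging the first touch point's viewport coordinates (the console calls stand in for real application logic):

```js
function handleTouchStart(event) {
  // While at least one finger is down, touches[0] is the first active touch.
  const touch = event.touches[0];
  console.log('touch started at', touch.clientX, touch.clientY);
}

function handleTouchEnd(event) {
  // The lifted finger has already left event.touches,
  // so read it from changedTouches instead.
  const touch = event.changedTouches[0];
  console.log('touch ended at', touch.clientX, touch.clientY);
}
```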
```js
canvas.addEventListener('touchstart', handleTouchStart);
// passive: false lets handleTouchMove call preventDefault() to stop
// the page from scrolling or zooming while the user drags on the canvas.
canvas.addEventListener('touchmove', handleTouchMove, { passive: false });
canvas.addEventListener('touchend', handleTouchEnd);
// Treat an interrupted touch like a lifted finger.
canvas.addEventListener('touchcancel', handleTouchEnd);
```

When handling touch events, we need to extract the X and Y coordinates of the touch. These coordinates are relative to the viewport, so before we can use them with Three.js's raycasting we must convert them into normalized device coordinates, where each axis ranges from -1 to 1.
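Here is one way that conversion might look, as a sketch that assumes a camera and a scene have already been set up elsewhere in your application:

```js
const raycaster = new THREE.Raycaster();
const pointer = new THREE.Vector2();

function handleTouchMove(event) {
  // Stop the browser from scrolling or zooming in response to this touch.
  event.preventDefault();

  const touch = event.touches[0];
  const rect = canvas.getBoundingClientRect();

  // Map viewport coordinates to normalized device coordinates:
  // x runs from -1 (left edge) to +1 (right edge); y runs from -1
  // (bottom edge) to +1 (top edge), hence the flipped sign.
  pointer.x = ((touch.clientX - rect.left) / rect.width) * 2 - 1;
  pointer.y = -((touch.clientY - rect.top) / rect.height) * 2 + 1;

  // Cast a ray from the camera through the touch point and test the scene.
  raycaster.setFromCamera(pointer, camera);
  const intersects = raycaster.intersectObjects(scene.children);
  if (intersects.length > 0) {
    console.log('touching:', intersects[0].object);
  }
}
```

Note that if the browser treats the touchmove listener as passive, the call to preventDefault() is silently ignored, which is why the registration above passes { passive: false }.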