As we explore interactivity in Three.js, it's crucial to consider the diverse ways users interact with our 3D experiences. Beyond the traditional mouse and keyboard, touch devices have become ubiquitous. This section will guide you through handling input from multiple devices, with a particular focus on the nuances of touch events.
The fundamental principle for handling input in Three.js, whether from a mouse, keyboard, or touch, is to listen for browser events. We can attach event listeners to the HTML element that contains our Three.js renderer (often a <canvas> element or its parent container). This allows us to capture user interactions as they happen.
const canvas = renderer.domElement;
canvas.addEventListener('click', handleClick);
canvas.addEventListener('mousemove', handleMouseMove);
window.addEventListener('keydown', handleKeyDown); // key events need a focused element; the window is the simplest reliable target

Touch events, however, operate slightly differently. Instead of a single 'click' or 'mousemove' event, touch interactions involve a sequence of events:
touchstart: Fired when a finger first touches the screen.
touchmove: Fired when a finger moves across the screen.
touchend: Fired when a finger is lifted from the screen.
touchcancel: Fired if the touch interaction is interrupted (e.g., by an incoming phone call).
Each touch event object contains a touches property, which is a TouchList object. This list holds one Touch object for every finger currently on the screen. For single-finger interactions, we'll typically be interested in event.touches[0] (note that in a touchend handler the lifted finger is no longer in touches; it appears in event.changedTouches instead).
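For example, a throwaway handler (purely illustrative) can log every active finger to make this structure visible:

canvas.addEventListener('touchmove', (event) => {
  // event.touches holds one Touch object per finger currently on the screen
  for (let i = 0; i < event.touches.length; i++) {
    const touch = event.touches[i];
    console.log(`Finger ${touch.identifier} at (${touch.clientX}, ${touch.clientY})`);
  }
});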
canvas.addEventListener('touchstart', handleTouchStart);
canvas.addEventListener('touchmove', handleTouchMove);
canvas.addEventListener('touchend', handleTouchEnd);

When handling touch events, we need to extract the X and Y coordinates of the touch. These coordinates are relative to the viewport. To use them with Three.js's raycasting, we'll need to normalize them into a coordinate system ranging from -1 to 1.
function getNormalizedTouchCoordinates(event) {
  const rect = canvas.getBoundingClientRect();
  const touch = event.touches[0]; // first (or only) active finger

  // Map viewport coordinates to normalized device coordinates (-1 to 1, Y pointing up)
  const x = ((touch.clientX - rect.left) / rect.width) * 2 - 1;
  const y = -((touch.clientY - rect.top) / rect.height) * 2 + 1;
  return { x, y };
}

It's important to remember that many users switch between devices, and some devices (such as touchscreen laptops) offer mouse, keyboard, and touch input at the same time. To create a robust experience, you'll want to detect the active input method or, ideally, support both simultaneously. A common approach is to keep separate event handlers for mouse and touch events, and to disable or adjust the mouse-related behavior whenever a touch event occurs, so the two don't conflict.
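To see how the normalized coordinates feed a pick, here is a minimal sketch. It assumes THREE has been imported and that camera and scene are already defined elsewhere in your application; the raycaster casts from the camera through the touched point and returns the intersected objects:

const raycaster = new THREE.Raycaster();
const pointer = new THREE.Vector2();

canvas.addEventListener('touchstart', (event) => {
  const { x, y } = getNormalizedTouchCoordinates(event);
  pointer.set(x, y);

  // Cast a ray from the (assumed) camera through the touched point
  raycaster.setFromCamera(pointer, camera);

  // Intersections come back sorted nearest-first
  const hits = raycaster.intersectObjects(scene.children, true);
  if (hits.length > 0) {
    console.log('Touched:', hits[0].object.name);
  }
});

The same pick logic can be reused from a mouse click handler with mouse-derived coordinates, which is exactly the convergence the flow below illustrates.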
graph TD
A[User Interacts] --> B{Device Type?}
B -- Mouse/Keyboard --> C[Handle Mouse/Keyboard Events]
B -- Touchscreen --> D[Handle Touch Events]
C --> E[Update Scene]
D --> E
E --> F[Render Update]
For example, when a touchstart event occurs, you might set a flag indicating that touch input is active. In your mouse event handlers, you would then check this flag and potentially ignore the mouse event if touch is active, preventing unintended double actions.
let isTouching = false;

// Flag touch activity so the mouse handlers can stand down.
// Note: browsers may also fire simulated mouse events shortly after a touch;
// calling event.preventDefault() in the touchstart handler suppresses most of them.
canvas.addEventListener('touchstart', () => { isTouching = true; });
canvas.addEventListener('touchend', () => { isTouching = false; });

canvas.addEventListener('mousemove', (event) => {
  if (isTouching) return; // skip mouse handling while a touch is in progress
  // Handle mouse move...
});

canvas.addEventListener('click', (event) => {
  if (isTouching) return;
  // Handle mouse click...
});

When implementing touch interactions, think about common gestures like pinch-to-zoom, swipe, and tap. These can be translated into transformations of your Three.js scene; for instance, a two-finger pinch can be used to zoom the camera or scale objects in the scene.
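As one illustration, here is a minimal pinch-to-zoom sketch. It assumes a perspective or orthographic camera is in scope; on each touchmove it measures the distance between the two fingers and scales camera.zoom by the ratio of the new distance to the previous one:

let previousPinchDistance = null;

function pinchDistance(event) {
  const a = event.touches[0];
  const b = event.touches[1];
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

canvas.addEventListener('touchmove', (event) => {
  if (event.touches.length !== 2) return; // only react to two-finger gestures
  event.preventDefault(); // keep the browser from panning/zooming the page

  const distance = pinchDistance(event);
  if (previousPinchDistance !== null) {
    // Fingers spreading apart zoom in; fingers closing zoom out
    camera.zoom *= distance / previousPinchDistance; // camera is assumed to exist in scope
    camera.updateProjectionMatrix(); // required after changing zoom
  }
  previousPinchDistance = distance;
}, { passive: false });

canvas.addEventListener('touchend', () => { previousPinchDistance = null; });

Resetting previousPinchDistance on touchend prevents a jump when the next pinch begins, and { passive: false } is what allows preventDefault() to stop the browser's own page zoom.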
Handling multiple input devices effectively requires a clear strategy for event management. By understanding the differences between mouse and touch events, and by implementing robust event listeners with appropriate checks, you can create interactive 3D experiences that feel natural and responsive across a wide range of user devices.