Your design system has opinions about the exact shade of blue on a button. It has strong feelings about 8px vs 12px padding. It probably even has a documented stance on whether shadows should be warm or cool.
But ask it what a successful action feels like? Blank stare.
We've spent the last decade obsessing over the visual and spatial dimensions of design systems — colour tokens, typography scales, spacing grids, motion curves. We've tokenised everything you can see. But we've completely ignored the sense that's literally at our fingertips.
Touch.
We've Been Designing for Screens, Not Humans
Here's an uncomfortable truth: humans have five senses, and design systems address exactly one of them. Maybe one and a half, if your system has motion guidelines.
Think about the last time you used a really good native app on your phone. The gentle tap when you toggle a switch. The satisfying double-pulse when a payment goes through. The sharp triple-buzz when something goes wrong. These aren't accidents — they're designed interactions that make digital things feel physical.
Now think about the last web app you used. Did it feel like anything? No. It didn't. Because the web has historically treated haptics like the weird cousin nobody invites to dinner.
The Token-Shaped Hole
If you work with design systems, you're already familiar with the hierarchy:
```css
--color-success-500: #22c55e;
--font-body: 'Inter', sans-serif;
--spacing-md: 16px;
--ease-standard: cubic-bezier(0.4, 0, 0.2, 1);
--shadow-elevation-2: 0 2px 4px rgba(0, 0, 0, 0.1);
```
These are sensory contracts. They promise that "success" looks green, body text is Inter, medium spacing is 16px, standard animation follows this curve. Every designer and developer on the team speaks this shared language.
Now imagine:
```css
--haptic-success: [50, 50, 50]; /* two quick taps */
--haptic-error: [50, 30, 50, 30, 50]; /* three sharp buzzes */
--haptic-nudge: [80, 80, 50]; /* strong tap, then soft */
--haptic-confirm: [200]; /* single solid pulse */
```
Same idea. Semantic names, platform-agnostic definitions, part of the shared vocabulary. When a designer annotates a component with haptic-success, the developer knows exactly what that means — just like they know what color-success-500 means.
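To make the idea concrete, here's a minimal vanilla JavaScript sketch of how semantic tokens like these could drive the web's Vibration API. The `hapticTokens` map and `triggerHaptic` function are hypothetical names, not part of any shipped library; the guard matters because `navigator.vibrate` only exists in some browsers.

```javascript
// Hypothetical semantic haptic tokens mirroring the custom properties above.
// Pattern arrays follow the Vibration API convention: [vibrate, pause, vibrate, ...] in ms.
const hapticTokens = {
  success: [50, 50, 50],         // two quick taps
  error:   [50, 30, 50, 30, 50], // three sharp buzzes
  nudge:   [80, 80, 50],         // strong tap, then soft
  confirm: [200],                // single solid pulse
};

// Look up a semantic token and fire it where the Vibration API exists.
function triggerHaptic(name) {
  const pattern = hapticTokens[name];
  if (!pattern) throw new Error(`Unknown haptic token: ${name}`);
  // Guarded: navigator.vibrate is only available in (some) browsers.
  if (typeof navigator !== 'undefined' && typeof navigator.vibrate === 'function') {
    navigator.vibrate(pattern);
  }
  return pattern; // returned so callers can inspect which pattern fired
}
```

A component then calls `triggerHaptic('success')` and never thinks about millisecond arrays again, which is exactly the abstraction colour tokens already give us.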
Don't just read about it — try it. Here are six semantic haptic tokens. Tap one to fire the pattern on your device (powered by web-haptics — works on iOS and Android):
This isn't science fiction. Julia Wong at the Design Systems Collective wrote about tokenising haptics in Figma last year. PIE (Just Eat's design system) already has a haptic feedback pattern page. ABLE's design system has actual haptic token notation — #haptWin, #haptError, #haptWarn.
But here's the thing: they're all native mobile only. Every single one stops at iOS and Android. The web? Apparently it doesn't deserve to feel things.
The Web Finally Has Fingers
This is where it gets interesting. For years, the web's answer to haptics was basically navigator.vibrate(200) — a single API that worked on Android Chrome and precisely nothing else. iOS Safari ignored it entirely. It was the <blink> tag of tactile feedback.
But that's changing. The Vibration API now has broader support, and — this is the big one — iOS has quietly started supporting it. Lochie Axon's new library, web-haptics, wraps the whole thing in a clean, framework-agnostic API with React, Vue, and Svelte bindings:
```jsx
import { useWebHaptics } from 'web-haptics/react'

function CheckoutButton() {
  const { trigger } = useWebHaptics()

  const handlePayment = async () => {
    const result = await processPayment()
    trigger(result.success ? 'success' : 'error')
  }

  return <button onClick={handlePayment}>Pay Now</button>
}
```
That's it. One import. One function call. Your web app now feels like something.
Three Dimensions of Design Tokens
Here's how I think about it. Design systems have evolved through sensory dimensions:
Dimension 1: Visual (mature)
Colour, typography, spacing, iconography, shadows. This is where 99% of design system effort goes. We're good at this.
Dimension 2: Temporal (adolescent)
Motion, animation, transitions. Most mature systems have easing curves and duration tokens. Some even have choreography guidelines. We're getting there.
Dimension 3: Tactile (infant)
Haptic patterns, vibration presets, touch feedback. Almost nobody has this in their web design system. This is the frontier.
And there's arguably a Dimension 4 — auditory — that IBM explored with their Sonic Design Guidelines. But let's not get ahead of ourselves.
The point is: a truly multi-sensory design system should define how interactions look and move and feel. The technology is finally here to do all three on the web.
What Haptic Tokens Actually Look Like
Let's get concrete. Here's what a haptic token layer might look like in a design system:
Primitive Tokens (the raw patterns)
```json
{
  "haptic": {
    "tap-light": { "pattern": [30], "intensity": 0.3 },
    "tap-medium": { "pattern": [50], "intensity": 0.5 },
    "tap-heavy": { "pattern": [80], "intensity": 0.8 },
    "double-tap": { "pattern": [50, 50, 50], "intensity": 0.5 },
    "triple-tap": { "pattern": [50, 30, 50, 30, 50], "intensity": 0.7 },
    "buzz-short": { "pattern": [200], "intensity": 1.0 },
    "buzz-long": { "pattern": [500], "intensity": 1.0 }
  }
}
```
Semantic Tokens (the meaning layer)
```json
{
  "haptic": {
    "success": { "value": "{haptic.double-tap}" },
    "error": { "value": "{haptic.triple-tap}" },
    "warning": { "value": "{haptic.tap-heavy}" },
    "selection": { "value": "{haptic.tap-light}" },
    "toggle": { "value": "{haptic.tap-medium}" },
    "notification": { "value": "{haptic.buzz-short}" }
  }
}
```
Component Tokens (the usage layer)
```json
{
  "button": {
    "haptic-on-press": { "value": "{haptic.selection}" }
  },
  "switch": {
    "haptic-on-toggle": { "value": "{haptic.toggle}" }
  },
  "form": {
    "haptic-on-submit-success": { "value": "{haptic.success}" },
    "haptic-on-submit-error": { "value": "{haptic.error}" }
  },
  "toast": {
    "haptic-on-appear": { "value": "{haptic.notification}" }
  }
}
```
Same three-tier architecture we already use for colours and typography. Primitives → semantics → components. Nothing revolutionary about the structure — the revolutionary bit is applying it to a sense we've been ignoring.
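The `{haptic.double-tap}` reference syntax implies a resolver somewhere, at build time or run time, that expands semantic tokens down to primitive patterns. A hedged sketch of one: `resolveToken` and the trimmed `primitives` registry below are illustrative, not taken from any particular token tool.

```javascript
// A trimmed-down primitive registry, matching the structure shown above.
const primitives = {
  haptic: {
    'double-tap': { pattern: [50, 50, 50], intensity: 0.5 },
    'triple-tap': { pattern: [50, 30, 50, 30, 50], intensity: 0.7 },
    'tap-light':  { pattern: [30], intensity: 0.3 },
  },
};

// Expand a "{haptic.double-tap}"-style reference into its primitive definition.
function resolveToken(value, registry) {
  const match = /^\{(.+)\}$/.exec(value);
  if (!match) return value; // already a literal value, not a reference
  // Walk the dotted path ("haptic.double-tap") into the registry.
  return match[1].split('.').reduce((node, key) => {
    if (node == null || !(key in node)) throw new Error(`Unresolved token: ${value}`);
    return node[key];
  }, registry);
}

const semantic = { success: { value: '{haptic.double-tap}' } };
const resolved = resolveToken(semantic.success.value, primitives);
// resolved.pattern is the primitive's [50, 50, 50]
```

This is the same aliasing trick token build tools already perform for colour and spacing; haptics just reuse it.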
Cross-Platform Mapping
One of the trickier challenges — and Julia Wong flagged this in her piece — is that iOS and Android have very different native haptic APIs. iOS gives you clean semantic names (.light, .success, .error). Android gives you... constants with less obvious meaning.
A proper haptic token system needs a platform mapping layer:
| Semantic Token | Web (Vibration API) | iOS (UIKit) | Android |
|---|---|---|---|
| `haptic-success` | `[50, 50, 50]` | `.success` | `CONFIRM` |
| `haptic-error` | `[50, 30, 50, 30, 50]` | `.error` | `REJECT` |
| `haptic-selection` | `[30]` | `.selection` | `CLOCK_TICK` |
| `haptic-toggle` | `[50]` | `.medium` | `TOGGLE_ON` |
The web patterns are approximations — you can't perfectly replicate a Taptic Engine response with a basic vibration motor. But that's fine. The semantic layer is the contract. The implementation adapts to the platform, just like how color-primary might render slightly differently across displays but carries the same intent.
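In code, that mapping layer can be as simple as a lookup table keyed by semantic token and platform. A sketch under assumed names (`hapticMap`, `hapticFor`); the iOS and Android values are the identifiers from the table above, treated here as opaque strings for the native layers to interpret.

```javascript
// Platform mapping layer: one semantic token, three platform implementations.
const hapticMap = {
  success:   { web: [50, 50, 50],         ios: '.success',   android: 'CONFIRM' },
  error:     { web: [50, 30, 50, 30, 50], ios: '.error',     android: 'REJECT' },
  selection: { web: [30],                 ios: '.selection', android: 'CLOCK_TICK' },
  toggle:    { web: [50],                 ios: '.medium',    android: 'TOGGLE_ON' },
};

// Resolve a semantic token for the current platform.
function hapticFor(token, platform) {
  const entry = hapticMap[token];
  if (!entry || !(platform in entry)) {
    throw new Error(`No ${platform} mapping for haptic token: ${token}`);
  }
  return entry[platform];
}
```

Designers annotate with the semantic name; each platform's runtime asks `hapticFor('success', platform)` and gets whatever its hardware can actually do.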
The Accessibility Angle Nobody's Talking About
Here's where it gets properly important. Haptic feedback isn't just a nice-to-have for making buttons feel satisfying. It's a non-visual feedback channel.
For users with visual impairments, haptics can signal state changes that might otherwise be missed. A screen reader will tell you a form submitted successfully, but a haptic pulse confirms it in a way that's instantaneous and unmistakable.
For users with hearing impairments, haptics replace audio cues entirely. That notification sound you rely on? Useless. A notification buzz? Universal.
If we're serious about inclusive design — and we all claim to be — then designing haptic feedback shouldn't be an enhancement. It should be part of the accessibility baseline.
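One practical wrinkle: the web has no dedicated "reduce haptics" user preference, so a reasonable (if imperfect) proxy is the `prefers-reduced-motion` media query combined with an explicit in-app opt-out. A sketch, with `hapticsAllowed` as a hypothetical helper name:

```javascript
// Decide whether to fire haptics at all. Respects an explicit user opt-out
// first, then falls back to prefers-reduced-motion as a proxy signal,
// since no dedicated haptics preference exists on the web today.
function hapticsAllowed(userOptOut = false) {
  if (userOptOut) return false;
  if (typeof window !== 'undefined' && typeof window.matchMedia === 'function') {
    return !window.matchMedia('(prefers-reduced-motion: reduce)').matches;
  }
  return true; // no signal available; default to allowing
}
```

Wrapping every trigger call in a check like this keeps haptics an accessibility aid rather than an annoyance users can't switch off.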
What This Means for AI-Generated Components
And here's where my day job crashes the party.
I've written before about AI systems that generate UI components from design system documentation. The premise is simple: if your design system is well-documented, an AI can read those docs and generate compliant components.
Now extend that to haptics. If your design tokens include haptic definitions, an AI generating a form component doesn't just get the colours and spacing right — it also knows that a successful submit should trigger haptic-success and a validation error should trigger haptic-error. The documentation is the design system, and the design system now includes how things feel.
This is genuinely new territory. No AI code generator I'm aware of considers haptic feedback when generating components. But there's no technical reason it can't — it's just another token to reference.
"But Kevin, Nobody Asked for This"
Yeah, nobody asked for design tokens either, until Salesforce formalised them and suddenly everyone realised they'd been managing colour values like animals. Nobody asked for motion guidelines until Material Design shipped them and made every other app feel static by comparison.
Haptics is at that exact inflection point. The technology exists. The patterns are emerging. The native mobile world is already doing it (just not sharing with the web). And a handful of design systems are starting to tokenise it.
The question isn't whether haptic tokens will become standard in design systems. It's whether you'll be the team that figured it out early, or the one that bolts it on as an afterthought in 2028.
Getting Started (Without Overthinking It)
If you want to experiment, here's the minimum viable approach:
- Install web-haptics in your web project (`npm i web-haptics`)
- Define 4-5 semantic haptic patterns — success, error, warning, selection, notification
- Add them to your existing token file alongside your colours and spacing
- Annotate 2-3 key interactions — form submissions, toggle switches, destructive actions
- Test on actual devices — you cannot design haptics on a desktop. You have to feel them.
That last point is crucial. Julia Wong described mapping iOS and Android haptics by literally holding both devices and feeling the patterns. There's no Figma plugin for touch. You have to pick up the phone.
Editor note: This was an absolute pain to get working on this site, as I had to upgrade from Nuxt 2 to Nuxt 3 just to use the library.
So, what does this mean long term?
Design systems exist to create consistency across every dimension of a user's experience. We've nailed the visual dimension. We're making progress on motion. But we've been completely ignoring the one sense that literally defines how we interact with our devices — touch.
The tooling is here. The patterns are proven (on native, at least). The web is finally catching up. All that's missing is for someone to take haptic tokens as seriously as they take colour tokens.
Your design system has opinions about border-radius: 8px. Maybe it's time it had an opinion about what success feels like.
Kevin Coyle is an AI and design systems consultant who helps enterprises build things that work. He's spoken at conferences about design systems and AI, and once built an entire manifesto website about why dashboards are terrible. He can be found at kevincoyle.co.uk or @kevincoyle on X.
