Weekend project: how would you prototype the feel and behaviour of 3D Touch if it didn’t yet exist? This is kind of a ‘reverse prototype’ because, obviously, 3D Touch does exist. So why prototype it? Well, while it’s usually quite simple to think about how you might prototype various types of software, this feature is a hardware+software interaction. I wanted to investigate how you might go about prototyping this kind of interaction inexpensively, but with high fidelity.

Step 1: Move a slider based on pressure

Right. So first I bought a Phidgets Bridge and a load cell and wired them up to my Mac. The Phidgets software on the Mac can be set up to send out any sensor readings from the load cell over the local network. Then you can write a tiny iOS app that listens to this Phidgets broadcast and changes a slider value in response to the readings. It turned out that going from load cell → bridge → Mac → network → app was amazingly fast and responsive (see the video). I didn’t have to optimise anything; it was just good out of the box. You don’t see Phidgets used that much, but I love them for prototyping because everything seems to just work the way you want it to.
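
If you wanted to sketch just the app side of this, it might look something like the snippet below. This is a minimal sketch, not my real code: the port number and the plain-text payload format are made up for illustration (the actual Phidgets network protocol is handled by their own library, and the real code is in the repo linked at the end).

```swift
import UIKit
import Network

// Minimal sketch: listen for load-cell readings arriving over the local
// network and drive a UISlider with them. The port and the plain-text
// payload format are assumptions for illustration only.
class PressureSliderViewController: UIViewController {
    let slider = UISlider()
    var listener: NWListener?

    // Assumed calibration range for the raw load-cell value.
    let minReading: Float = 0.0
    let maxReading: Float = 5.0

    override func viewDidLoad() {
        super.viewDidLoad()
        slider.frame = CGRect(x: 20, y: 100, width: 280, height: 40)
        view.addSubview(slider)
        startListening()
    }

    func startListening() {
        // Hypothetical port; whatever the Mac-side broadcaster is set to use.
        listener = try? NWListener(using: .udp, on: 4242)
        listener?.newConnectionHandler = { [weak self] connection in
            connection.start(queue: .main)
            self?.receive(on: connection)
        }
        listener?.start(queue: .main)
    }

    func receive(on connection: NWConnection) {
        connection.receiveMessage { [weak self] data, _, _, _ in
            guard let self = self else { return }
            if let data = data,
               let text = String(data: data, encoding: .utf8),
               let reading = Float(text.trimmingCharacters(in: .whitespacesAndNewlines)) {
                // Normalise the raw reading into the slider's 0-to-1 range.
                let t = (reading - self.minReading) / (self.maxReading - self.minReading)
                self.slider.value = max(0, min(1, t))
            }
            self.receive(on: connection) // keep listening for the next reading
        }
    }
}
```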

Step 2: Blur a picture of my home screen based on pressure

Next I wanted to mimic the 3D Touch behaviour of pressing down on the home screen icons and having the background blur out. I wanted this to be really performant, so instead of dynamically calculating a blur of the home screen based on pressure (which the real thing does), I just put a static blurred layer on top and changed its opacity with pressure. (I totally pinched that idea off Stack Overflow.) This makes the (fake) blur essentially free for the phone to render.
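
Roughly, the trick looks like this in code. Again, a minimal sketch with placeholder image names; the real version wires the alpha up to the load cell readings from step 1.

```swift
import UIKit

// Sketch of the "fake blur" trick: a pre-blurred snapshot sits on top of
// the sharp home-screen image, and pressure just fades it in. The image
// names and pressure scaling are placeholders.
class FakeBlurViewController: UIViewController {
    let sharpView = UIImageView(image: UIImage(named: "homescreen"))
    let blurredView = UIImageView(image: UIImage(named: "homescreen-blurred"))

    override func viewDidLoad() {
        super.viewDidLoad()
        sharpView.frame = view.bounds
        blurredView.frame = view.bounds
        blurredView.alpha = 0 // fully transparent until you press
        view.addSubview(sharpView)
        view.addSubview(blurredView)
    }

    // Call this with each new load-cell reading, normalised to 0-to-1.
    func pressureDidChange(_ pressure: CGFloat) {
        // Changing alpha is a cheap compositing operation, so this stays
        // smooth at high sensor rates; no live blur is ever computed.
        blurredView.alpha = max(0, min(1, pressure))
    }
}
```

Here’s how it looks: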

Step 3: Don’t blur the touched icon and put the phone on top

The last part was to not blur the touched icon, and to mount the phone on top of the load cell so you could get a feel for the whole interaction and play with it for real.
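
One way to keep the pressed icon sharp, continuing the sketch above, is to crop just that icon out of the sharp screenshot and add it as its own view above the blur overlay. The helper below is hypothetical, not how my code necessarily does it:

```swift
import UIKit

// Continuing the sketch above: lift the pressed icon out of the sharp
// screenshot and place it *above* the blur overlay, so it stays crisp
// while everything behind it fades out.
extension FakeBlurViewController {
    func liftIcon(at iconFrame: CGRect) {
        guard let screenshot = sharpView.image,
              let cgImage = screenshot.cgImage else { return }

        // UIKit frames are in points; CGImage cropping works in pixels.
        let scale = screenshot.scale
        let pixelRect = CGRect(x: iconFrame.origin.x * scale,
                               y: iconFrame.origin.y * scale,
                               width: iconFrame.width * scale,
                               height: iconFrame.height * scale)
        guard let cropped = cgImage.cropping(to: pixelRect) else { return }

        let iconView = UIImageView(image: UIImage(cgImage: cropped,
                                                  scale: scale,
                                                  orientation: .up))
        iconView.frame = iconFrame
        view.addSubview(iconView) // added last, so it draws on top of the blur
    }
}
```

Here it is: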

Extra credit

This prototype works well, but the pressure reading varies quite a bit depending on how close your finger is to the load cell at the top of the phone. To make a version that reads accurately across the whole screen, you could add a second load cell at the bottom of the phone and average the two values.
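
That could be as simple as combining the two readings before they drive the UI, something like:

```swift
// Sketch of the two-sensor fix: with one load cell at the top and one at
// the bottom, the phone rests on two supports, so the two readings sum to
// the total applied force no matter where on the screen you press.
func combinedPressure(top: Float, bottom: Float) -> Float {
    return (top + bottom) / 2 // average, as suggested above
}
```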

Resources

The very hacky, please-do-not-really-program-like-this-in-production code is here on GitHub. You might find it useful if you’re doing a Phidgets project of your own that you want to integrate with an iOS app.