Despite Jony Ive describing the Pencil in Wallpaper* as being designed for marking, and never as a stylus-style finger replacement, I’ve decided to explore a few unconventional uses for mine. Yesterday saw a slightly ramshackle-looking Pencil-based digital scale, and today I’m using it as a joystick of sorts for controlling the parameters of image filters.
My PencilController project is a Swift app for the iPad Pro that applies two Core Image filters to an image: a hue adjustment and a colour controls filter, which I use to control the saturation.
The Pencil’s orientation in space is described by the Horizontal Coordinate System with azimuth and altitude angles.
The hue filter’s value is controlled by the azimuth angle and the saturation by the altitude angle: when the Pencil is vertical, the saturation is zero, and when it’s horizontal the saturation is eight (although when the Pencil is completely horizontal its tip isn’t actually touching the screen, so the highest saturation the app can set is about six and three quarters).
To jazz up the user interface, I’ve also added a rounded cylinder using SceneKit which mirrors the Pencil’s position and orientation.
Controlling Core Image Filter Parameters with Pencil
Setting the values for the two Core Image filters is pretty simple stuff.
The two filters are declared as constants at the top of my view controller, along with a Core Image context (created without colour management for performance) and a Core Image image:
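The original listing isn’t reproduced here; a minimal sketch of those declarations might look like the following. The filter names CIHueAdjust and CIColorControls are the standard Core Image filters for these two adjustments, but the property names and the image asset are my assumptions:

```swift
import UIKit
import CoreImage

class ViewController: UIViewController
{
    // The two filters, created once and reused for every touch event.
    let hueAdjust = CIFilter(name: "CIHueAdjust")!
    let colorControls = CIFilter(name: "CIColorControls")!

    // A context without colour management: faster, at the cost of colour accuracy.
    let ciContext = CIContext(options: [CIContextOption.workingColorSpace: NSNull()])

    // The source image the filters are applied to (hypothetical asset name).
    let coreImage = CIImage(image: UIImage(named: "image.jpg")!)!

    let imageView = UIImageView()
}
```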
When a touch either begins or moves, I first check its type to ensure it originates from a Pencil, and then invoke applyFilter() via my pencilTouchHandler() method:
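A sketch of those overrides, assuming the handler from the declarations above, could be:

```swift
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?)
{
    // Ignore finger touches: only an Apple Pencil touch drives the filters.
    guard let touch = touches.first, touch.type == .pencil else { return }
    pencilTouchHandler(touch)
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?)
{
    guard let touch = touches.first, touch.type == .pencil else { return }
    pencilTouchHandler(touch)
}
```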
pencilTouchHandler() extracts the azimuth and altitude angles from the UITouch, does some simple arithmetic and passes those values to applyFilter():
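A hedged reconstruction of that arithmetic (the mapping constants follow the zero-to-eight saturation range described above; the exact scaling in the original may differ):

```swift
func pencilTouchHandler(_ touch: UITouch)
{
    // Azimuth sweeps around the screen plane; altitude runs from
    // 0 (Pencil flat on the glass) to π/2 (Pencil vertical).
    let azimuth = touch.azimuthAngle(in: view)
    let altitude = touch.altitudeAngle

    // Vertical Pencil → saturation 0; flat Pencil → saturation 8.
    let hue = azimuth
    let saturation = 8 * (1 - altitude / (.pi / 2))

    applyFilter(hue: hue, saturation: saturation)
}
```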
It’s applyFilter() that uses these two values to set the parameters on the filters and display the output in a UIImageView:
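Chaining the two filters might look like this sketch, which pipes the hue-adjusted output into the colour controls filter and renders the result through the unmanaged context:

```swift
func applyFilter(hue: CGFloat, saturation: CGFloat)
{
    hueAdjust.setValue(coreImage, forKey: kCIInputImageKey)
    hueAdjust.setValue(hue, forKey: kCIInputAngleKey)

    // Feed the hue-adjusted image into the saturation filter.
    colorControls.setValue(hueAdjust.outputImage!, forKey: kCIInputImageKey)
    colorControls.setValue(saturation, forKey: kCIInputSaturationKey)

    let output = colorControls.outputImage!
    let cgImage = ciContext.createCGImage(output, from: output.extent)!
    imageView.image = UIImage(cgImage: cgImage)
}
```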
On my iPad Pro, this filtering is quick enough on a near full-screen image that I don’t have to worry about doing the work in a background thread.
Controlling SceneKit Geometry with Pencil
The next piece of work is to orient and position the “virtual pencil” so that it mirrors the real one. I’ve overlaid an SCNView above the UIImageView and added a capsule geometry (which is a cylinder with rounded ends, not unlike a Pencil). Importantly, I’ve also added a flat plane which is used to capture the Pencil’s location in the SceneKit 3D space:
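A minimal sketch of that scene setup, with all dimensions and node names my own assumptions:

```swift
import SceneKit

let sceneKitView = SCNView()

// Capsule roughly Pencil-shaped: a thin cylinder with rounded ends.
let cylinderNode = SCNNode(geometry: SCNCapsule(capRadius: 0.1, height: 3))

func setUpSceneKit()
{
    sceneKitView.scene = SCNScene()
    let root = sceneKitView.scene!.rootNode

    // Flat plane used only for hit testing: it gives the Pencil's
    // 2D screen location a position in the scene's 3D space.
    let planeNode = SCNNode(geometry: SCNPlane(width: 20, height: 20))
    root.addChildNode(planeNode)
    root.addChildNode(cylinderNode)
}
```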
Inside pencilTouchHandler(), I use the SceneKit view’s hitTest() method to find the Pencil’s x and y position on the plane in SceneKit’s 3D space:
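Something along these lines, assuming the SCNView is named sceneKitView as in the sketch above:

```swift
// Project the Pencil's screen position onto the scene's plane.
let hits = sceneKitView.hitTest(touch.location(in: sceneKitView), options: nil)
guard let hit = hits.first else { return }
```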
…and with the results of that hit test, I can position the cylinder underneath the Pencil’s touch location:
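Assuming the hit-test result is held in a constant named hit (a hypothetical name from my sketch above), the positioning could be:

```swift
// Place the capsule at the hit point on the plane, nudged towards
// the camera so it sits above the plane rather than inside it.
cylinderNode.position = SCNVector3(hit.worldCoordinates.x,
                                   hit.worldCoordinates.y,
                                   1)
```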
Finally, with the altitude and azimuth angles of the touch, I can set the Euler angles of the cylinder to match the Pencil:
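The exact signs and axis assignments depend on how the scene and camera are oriented, so this is only a plausible sketch of the idea: altitude tilts the capsule away from vertical, and azimuth spins it about the axis perpendicular to the screen:

```swift
// 0 tilt when the Pencil is vertical (altitude = π/2), maximum when flat.
let tilt = Float.pi / 2 - Float(touch.altitudeAngle)
let spin = -Float(touch.azimuthAngle(in: view))
cylinderNode.eulerAngles = SCNVector3(tilt, 0, spin)
```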
I’ve made the SceneKit camera orthographic; a perspective camera adds unwanted rotation to the “virtual pencil” as it moves across the screen.
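Switching the camera to an orthographic projection is one property on SCNCamera; the position here is an assumption to suit the sketch scene above:

```swift
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()

// Orthographic projection: no perspective skew, so the virtual pencil
// doesn't pick up spurious rotation as it moves across the screen.
cameraNode.camera!.usesOrthographicProjection = true
cameraNode.position = SCNVector3(0, 0, 10)
sceneKitView.scene?.rootNode.addChildNode(cameraNode)
```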
Conclusion
Despite what Jony Ive may say, the Pencil offers some user-interaction patterns that aren’t possible with a simple touch screen, and I hope other developers begin exploring new ideas. In addition to the two angles, the Pencil also reports x and y coordinates and its force, so that’s five different values that could potentially be used to control anything from image filters to an audio synthesiser!