DTLoupe – Reverse-engineering Apple’s Loupes

I am working on a CoreText-based rich text editor at the moment. That means employing two primary technologies: the UITextInput protocol and rendering the formatted text with CoreText. Unfortunately Apple has forgotten to add selection and loupe mechanics to UITextInput, so we have to build these ourselves if we want to get the same look & feel as the built-in stuff.
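
For context, the Core Text half boils down to laying an NSAttributedString out into a view with a CTFramesetter. Here is a minimal sketch, assuming a plain UIView subclass; the class and property names are illustrative, not the editor's actual code.

```swift
import UIKit
import CoreText

// Minimal sketch: a UIView that renders an NSAttributedString with Core Text.
// Class and property names are illustrative only.
class TextRenderView: UIView {

    var attributedText = NSAttributedString(string: "Hello, Core Text") {
        didSet { setNeedsDisplay() }
    }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }

        // Core Text uses a bottom-left origin, so flip the UIKit context for it.
        context.textMatrix = .identity
        context.translateBy(x: 0, y: bounds.height)
        context.scaleBy(x: 1, y: -1)

        // Lay the attributed string out into the view's bounds and draw it.
        let framesetter = CTFramesetterCreateWithAttributedString(attributedText as CFAttributedString)
        let path = CGPath(rect: bounds, transform: nil)
        let frame = CTFramesetterCreateFrame(framesetter, CFRange(location: 0, length: 0), path, nil)
        CTFrameDraw(frame, context)
    }
}
```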

To get selection handling and a loupe, developers generally go one of two ways: either they contort UIWebView with fancy JavaScript, or they struggle to implement their own code. These approaches lead to a wide variety of loupes and selection mechanics that look and behave differently. I have contacted Apple by every means available to me, and I hope there will be an official way to get the selection mechanics and loupe down the road.

But until there is, let me present an interim solution to this problem. It will be a component I call DTLoupe, and it has many potential applications besides selecting text in an editor, such as providing a magnifier in contexts where pinch-to-zoom does not make sense.

Apple’s loupe consists of several parts that are combined with a zoomed rendering of the view around your finger. From bottom to top, the layers are (see the sketch after the list):

  • a “lo” image forms the base
  • a mask masks the inner shape of the loupe
  • a 1.25x enlarged rendering of the view is put into this space
  • a “high” image adds the shine and general frame
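
Here is a hedged sketch of that compositing order in drawRect. The image names, the magnification constant and the class itself are assumptions for illustration, not DTLoupe’s actual implementation.

```swift
import UIKit

// Sketch of the loupe layer compositing described above; names are assumptions.
class LoupeSketchView: UIView {

    weak var targetView: UIView?      // the view being magnified
    var touchPoint: CGPoint = .zero   // point under the finger, in targetView coordinates
    let magnification: CGFloat = 1.25

    let loImage = UIImage(named: "loupe-lo")      // base
    let maskImage = UIImage(named: "loupe-mask")  // inner shape
    let hiImage = UIImage(named: "loupe-hi")      // shine and frame

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext(),
              let target = targetView,
              let mask = maskImage?.cgImage else { return }

        // 1. the "lo" image forms the base
        loImage?.draw(in: bounds)

        // 2. restrict everything that follows to the inner shape of the loupe;
        //    CGContext maps the mask bottom-up, so flip while applying it
        context.saveGState()
        context.translateBy(x: 0, y: bounds.height)
        context.scaleBy(x: 1, y: -1)
        context.clip(to: bounds, mask: mask)
        context.scaleBy(x: 1, y: -1)
        context.translateBy(x: 0, y: -bounds.height)

        // 3. a 1.25x enlarged rendering of the target view, centered on the touch point
        context.translateBy(x: bounds.midX, y: bounds.midY)
        context.scaleBy(x: magnification, y: magnification)
        context.translateBy(x: -touchPoint.x, y: -touchPoint.y)
        target.layer.render(in: context)
        context.restoreGState()

        // 4. the "high" image adds the shine and general frame
        hiImage?.draw(in: bounds)
    }
}
```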

Apple seems to prefer a masking image over a clipping path. We believe that is because masking is probably much faster: it is essentially a per-pixel operation that can be done entirely on the GPU, whereas clipping requires extra logic to decide whether each pixel still falls within the permitted area.
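
Incidentally, the same masking can also be expressed with Core Animation, where the compositing is handled by the GPU. A tiny sketch, assuming the mask artwork is available as a PNG:

```swift
import UIKit

// Sketch of GPU-composited masking via a CALayer mask; the image name is an assumption.
func applyLoupeMask(to contentView: UIView) {
    let maskLayer = CALayer()
    maskLayer.contents = UIImage(named: "loupe-mask")?.cgImage
    maskLayer.frame = contentView.bounds
    // contentView's pixels are only visible where the mask layer is opaque.
    contentView.layer.mask = maskLayer
}
```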

Quite a bit of experimentation was necessary to get it looking right, including the showing and hiding animations. Thankfully there is an amazing project on GitHub, UIKit-Artwork-Extractor, that helped us learn “how they did it” by letting us inspect the images that make up the whole effect. As a side note, Apple uses PNGs extensively for all sorts of UI elements, so you might want to do the same, especially if you have a designer who can provide all these individual items for you to assemble.

My friend Michael Kaye has been hard at work researching all the nuts and bolts, and the result is coming along very nicely. A couple of other projects provided some inspiration, like the OmniGroup framework. If you now ask me “why did you reinvent the wheel if OmniUI already has a loupe?”, the answer is simple: it neither looks nor behaves like the original. On top of that, they never updated their solution for Retina displays, which makes it impossible to use in a modern project where Retina support is a must-have.

Now for the demo:

Work is ongoing to polish it, but I just had to show off the fabulous work that Michael did over the past few days. Next up is actually hooking up the touch handling for moving the cursor, extending the selection, and handling the context menu.
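
For what it’s worth, the touch handling will roughly take this shape. The sketch reuses the hypothetical LoupeSketchView from above; none of this is the finished DTLoupe API.

```swift
import UIKit

// Hedged sketch of driving a loupe from touches; not DTLoupe's actual API.
class EditorCanvasView: UIView {

    // Size chosen arbitrarily for the sketch.
    private lazy var loupe = LoupeSketchView(frame: CGRect(x: 0, y: 0, width: 127, height: 127))

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        loupe.targetView = self
        // Keep the loupe outside the magnified view so it never renders itself.
        superview?.addSubview(loupe)
        updateLoupe(for: touch)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        updateLoupe(for: touch)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        loupe.removeFromSuperview()
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        loupe.removeFromSuperview()
    }

    private func updateLoupe(for touch: UITouch) {
        let point = touch.location(in: self)
        loupe.touchPoint = point

        // Float the loupe above the finger, converting into the superview's coordinates.
        let center = CGPoint(x: point.x, y: point.y - loupe.bounds.height * 0.75)
        loupe.center = superview?.convert(center, from: self) ?? center
        loupe.setNeedsDisplay()
    }
}
```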



Categories: Parts

6 Comments

  1. Great to see, Oliver, and good to hear you’re working on the Rich Text editor. Very exciting.

    Do you have any (approximate) feeling on when you might finish the RTE? Looking forward to it. 🙂

  2. Well, an NSAttributedString-based variant might be ready for beta in 2–3 weeks. Getting HTML out as well will take longer.

  3. Nice! Consider me a buyer – I don’t need the HTML variant, and I need to do some customization, so when the beta is ready, I’m in. 🙂

  4. Oh, another (tangential) question for you. I’m developing a desktop version of my iPad app, with syncing, and I’ve been wondering how text layout in a Core Text iPad project like this would compare to NSTextView layout in Cocoa. Mostly concerned about getting the same basic metrics across both platforms… do you have any thoughts on this?

  5. I don’t know the first thing about stuff on the Mac, but I suppose it should look similar if not identical.

  6. I imagine that Apple’s loupe was forked off their Core Image filter project released several years ago for the Mac. Apple released it as sample code for how to build a complex Core Image Filter – I just don’t remember the name. I saw that the code was still on Apple’s site as of two years ago.