I chose this article’s title to grab your attention. The product is still the same and does the same thing; the only difference is under the hood. And as such it is your job – should you choose to accept it – to marvel at the benefits that the new parsing engine brings us.
Over the past week I’ve been working on DTLocalizableStringScanner, or genstrings2 for short. The original genstrings dates back to the NeXTSTEP days. You know how it is: “never change a winning team”, because if something kind of works, why change it? BUT “the good is the enemy of the great”.
Besides the other problems I’ve alluded to in my previous article, genstrings is very slow. Internally it is written in Objective-C, as you can tell from the occasional stack trace when it crashes yet again. But it was created at a time when Macs only had a single CPU core and we did not yet have the awesome LLVM with ARC, GCD and multi-threading.
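To give you an idea of what GCD makes possible here, a sketch – not the actual genstrings2 code – of how each source file can be scanned concurrently instead of one after the other:

```objc
// Hypothetical sketch: fan the per-file scanning work out over all cores.
NSArray *files = [NSArray arrayWithObjects:@"main.m", @"AppDelegate.m", nil];
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

// dispatch_apply blocks until all iterations are done, but runs them
// in parallel on the global concurrent queue
dispatch_apply([files count], queue, ^(size_t index) {
    NSString *path = [files objectAtIndex:index];
    // scan this file for NSLocalizedString macros ...
});
```

On a multi-core Mac this alone buys you a near-linear speedup over the single-threaded original.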
Actually I am on vacation, but I couldn’t help using the breather to work on a little hobby project. This I shall reveal to you today!
If you localize your apps (iOS or Mac) you are probably used to working with genstrings. Probably painfully, which is why we built the Mac app Linguan, which remote-controls genstrings and merges the results with your previously existing tokens.
But genstrings has several big problems.
If you have ever looked at UIPasteboard you might have seen that there are a variety of public types, like colors, images, plain text or URLs. But applications are also free to implement their own types to be put on the pasteboard when the existing types don’t do justice to your content.
One such custom type is used by Apple’s own iOS apps, like Mobile Safari, whenever you copy HTML snippets: “Apple Web Archive pasteboard type”. At first glance this looks really secretive, because all you can see there is a long string of numbers representing the NSData for it.
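You can peek at it yourself. A minimal sketch, assuming only the standard UIPasteboard API and the type string as it appears on the pasteboard:

```objc
// the custom type identifier as it appears on the pasteboard
NSString *webArchiveType = @"Apple Web Archive pasteboard type";
UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];

// list everything currently on the pasteboard
NSLog(@"types: %@", [pasteboard pasteboardTypes]);

if ([pasteboard containsPasteboardTypes:[NSArray arrayWithObject:webArchiveType]])
{
    // this is the "long string of numbers": a binary plist wrapped in NSData
    NSData *data = [pasteboard dataForPasteboardType:webArchiveType];
    NSLog(@"web archive is %lu bytes", (unsigned long)[data length]);
}
```

Copy something in Mobile Safari, run this, and you’ll see the type show up alongside the public ones.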
Today I am unveiling an Open Source solution for consuming this pasteboard type as well: DTWebArchive. Using this you can let your users copy something from Safari or Mail and paste it into your app while preserving the rich text. I put this code into a new GitHub repository because this project can be very useful to you even if you don’t dabble with CoreText and NSAttributedStrings+HTML.
You might have noticed that I blogged much less during the past three months. That was for the most part because almost all of my programming time went into a secret project for Scribd, something that is finally revealed to the public on July 19th, 2011.
Preempting the next question I am usually asked at this point: “What is Scribd?” Scribd is often described as “the YouTube of Documents”. You can upload and share any kind of document on their network and they have an HTML5 reader that you can embed on your blog.
At the time the official statement was that Scribd is working on a mobile reader for iOS and they needed much more control over the rendering and interactivity of HTML-based content than UIWebView would afford them.
Let me tell you how Scribd has completely ditched UIWebView and is revolutionizing the way you read on your iPhone. Introducing Float.
For my new version of Summertime I am building a time zone picker. You can get the known time zone names from the NSTimeZone class, but unfortunately Apple does not give us any localization for these. The localizedName:locale: method gives you localized names of the time zones themselves (e.g. “Pacific Standard Time”) in various formats. But what I found to be missing is a way to have the geo names localizable as well.
If I have my iPhone set to German I want to find my timezone by entering “Wien”, not “Vienna”.
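To make the gap concrete, here is a small sketch using only standard Foundation API (the zone and locale are just examples):

```objc
NSTimeZone *tz = [NSTimeZone timeZoneWithName:@"Europe/Vienna"];
NSLocale *german = [[NSLocale alloc] initWithLocaleIdentifier:@"de_DE"];

// this works: a localized name for the zone itself,
// e.g. "Mitteleuropäische Zeit"
NSString *zoneName = [tz localizedName:NSTimeZoneNameStyleStandard
                                locale:german];

// but the geo name in the identifier stays "Europe/Vienna" - there is
// no built-in way to get "Wien" for the user to search by
NSLog(@"%@ -> %@", [tz name], zoneName);
```

The identifier is all you have to match a search string against, and it is English-only.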
My initial thought was to keep this to myself, but since I only speak German and English I can never hope for the translations to be perfect unless I paid several translators to comb through them. And you know, Google Translate is great, but not 100%. So I started a new Open Source project on GitHub: NSTimeZone+Localization, which aims to remedy this.
One of the things that people know me for is continuing to develop MyAppSales, my favorite tool to download and keep those elusive sales reports from iTunes Connect. It has always been a hobby, and until now I’ve allowed people to access the source in exchange for a mandatory donation. This went on for almost two years.
Those donations never made enough money for me to pay for professional development. But I felt that I had to ask the approximately 500 people on the Google group for their opinion, as they might see their donation as a purchase and not like the idea of this software now being available for free. Boy, was I wrong. People resoundingly voted “+1” for going Open Source.
So here it is. It’s Open. Read on for how I moved the repo from my SVN server to GitHub, including the entire revision history.
After releasing SpeakerClock, a speech countdown clock, I got a couple of suggestions. One was not about the app itself, but about how you could stand the iPhone on its side.
Kevin Jamison: “Now we just need to come up with a simple little stand similar to the iPod Touch uses to hold it in a tilted landscape mode. Maybe rubber coated feet so it won’t slide if you are using a podium.”
I remembered that every once in a while the blogosphere brought to light yet another clever way of solving such problems. A quick search on Google and YouTube yielded a couple of great solutions: some that you can purchase, some that you can make yourself at little to no cost.
First let’s look at the professional solutions which typically cost between $5 and $10. After we’ve set the benchmark we’ll explore how we can achieve landscape stability ourselves. Building something useful out of physical things might be a welcome distraction from hours of coding Cocoa. 🙂
It is with a certain pride that I am announcing that I truly was involved with the creation of an app that made it to the top. Well, at least in Austria and Germany, and at least in one category.
I’m referring to Infrared Photography, an app that was contracted out to us by none other than Michael Görmann, a professional photographer in need of a signature app that he could hand out as a sort of business card. Via a short detour he arrived at our virtual doorstep and was promptly served a cute little app.
To give credit where it is due, and to put it more precisely: the programming itself was carried out by my brother-in-law, whom I had the pleasure of helping out here and there and generally keeping a mentoring eye over when it comes to iPhone programming. He was, and still is to this date, an expert for server-side Java, Android and Symbian. But about iPhone development I can proudly say: “He got that from me.” His name is Rene Pirringer and his company is ciqua. We are working closely together on iPhone projects.
Infrared Photography is unique in that it gives you an endless supply of beautiful black&white photos which were made with expensive infrared equipment and by the unmatched eye of Mr. Görmann. So unique in fact that Apple approached him and asked him to put a couple more screenshots online, because they were planning to feature it. And so they did.
You could see an immediate jump in downloads, for the most part in Germany and Austria where it was featured. It seems that those countries are served by the same iTunes team at Apple. Being featured like this glued the app to first place, as you can see here:
We are thankful that this chance found us, and to all of you out there who keep downloading the app. It’s an amazing reference for us.
Opinions differ on what constitutes “Augmented Reality”. It can range from sophisticated 3D graphics overlaid on a stereoscopic view of the world to simply overlaying floating icons related to a location over a live camera view. In between, some also count it as AR when you have a special marker in your hand which your webcam can track to insert a small 3D model on top of it.
On the iPhone it is currently not possible to do any sophisticated motion tracking or shape recognition, as you can only get live images via a backdoor which Apple recently opened just a bit so that useful applications like RedLaser or uStream can make use of it. Still, the consensus seems to be that you can only use images that have no UI elements over them, as the workaround does captures of the whole screen.
I figured it would be an interesting addition to my Dr. Touch’s Parts Store, and so I set out to create DTAugmentedRealityController. The goal for V1.0 is to provide a view controller which you can feed icons and geo locations, and which overlays said icons over live video from the iPhone’s camera.