
He Says He Cracked It

For lack of any other sensible field in Apple's backyard to speculate on, we are hearing multiple news sources and recyclers invest their brains in talking about a real Apple TV, as in a TV-set-style TV. All my favorite podcasts are full of it. And even Steve Jobs himself speaks to us from the grave via his biography, where he is quoted as having said "I've cracked it", referring to an integrated TV set.

Not to be outdone, let me also add a bit of speculation, founded on some actual facts: loops that Apple left intentionally open, as if to telegraph their next moves.

For Apple, the year 2011 marks the introduction of three new technologies which – unlike Apple's usual approach – had not been proven on the market before.

In the hardware sector there was Thunderbolt, a connection technology that aims to replace USB, FireWire and DisplayPort in one stroke.

In the networking sector they are betting big on iCloud as the ultimate solution for online backup and cloud services. This is by far the most important technology, not only because they built an enormous data center for it, but also because until now Apple's online services were quite lackluster. We don't know for certain, but I am sure that on many occasions Steve Jobs would have described them as "it sucks".

The fifth generation iPhone had to be delayed until iOS 5 was ready because it is the embodiment of iCloud. Steve Jobs would not have them ship the new iPhone in summer on iOS 4 because of the importance of having an automatic cloud solution that “just works.”

Finally, in the artificial intelligence sector, Apple launched their first serious effort, named Siri. They kept the name because it is friendly and non-threatening. But here's another atypical move: they launched it as a BETA and limited it to just the top-of-the-line iPhone model. It is rather likely that Siri will come to other iOS devices, but only when Apple has ironed out the kinks.

Apple has such a high stack of chips on the table that nobody seriously doubts they will see their three bets through. But frankly, there was little tangible technical innovation this year. All the teams were struggling to get iOS 5 out the door, iCloud bolted down and Siri behaving itself.

While this is perfectly understandable (iCloud, Siri and iOS 5 are BIG for Apple), it is not typical of them to devote all their resources exclusively to software. It was Microsoft that Bill Gates always described as a software company. Apple was always about controlling the experience end to end. So if what we saw in 2011 was the software tip of the iceberg, then there must be an equal or larger amount of resources pouring into hardware. But which? And where is it?

Whether or not we agree that the iPhone 4S is amazing, it is a fact that it only contains technology that has been proven before in some form. The antenna and the CDMA/3G world chip already came with the Verizon iPhone 4, and the A5 SoC was in the iPad 2. It is almost as if they had an intern play with existing components just like one would with LEGO bricks.

The iPad 2, a second iteration of the improved MacBook Air, a couple of silent updates and a bit of Thunderbolt mixed in for good measure? Nothing truly new there. And even the recent silent update to the MacBook Pro seems to be only token activity while waiting for the next big Intel architecture shift.

I see only two possible reasons for that: either Steve Jobs gave up on pushing the envelope even longer ago than we think, or there really are three more years of Steve-blessed products in the pipeline and the technology is simply not there yet. If technology is moving slower than Apple would like, then they put their resources into software. With hardware they are dependent on what their suppliers can give them; with software they control it themselves.

To be fair, iOS 5 is not ONLY iCloud. As a matter of fact there is a boatload of other things in there showing that they took this release as an opportunity to clean up quite a few old problems. But my point is that you do this kind of cleanup work when you know you have more time on your hands than you would need just to ship in late summer.

So there’s SOMETHING coming. Let’s inspect first what DIDN’T come even though it should have.

There are two glaringly obvious updates that are MIA so far for 2011: no new iPod Touch and no new AppleTV. Both currently have to make do with last year's Apple A4 chip, even though they get the iOS 5 treatment.

The most obvious reason I can think of is that the A5 is designed to handle 1080p video (which was kept a secret so far because the iPad 2 did not have a built-in camera capable of that). iTunes does not offer 1080p video; all of its "High Definition" content is 720p. So an AppleTV with an A5 would be able to do something that the iTunes content engine cannot capitalize on. No advantage to the client = no go. Another problem is the bandwidth it would take for iTunes to deliver 1080p. Instead of 1–2 GB per movie it would be more like 6–8 GB. Some people are blessed with never-ending bandwidth that would allow them to stream 1080p, but most of the globe has to work with something like 4 MBit/s. It is only recently that you see people having 20 MBit/s or more, and mostly in big cities.
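To put the bandwidth argument into rough numbers, here is a quick back-of-the-envelope sketch in Swift. The file sizes and connection speeds are the ballpark figures from above, not official iTunes numbers:

    import Foundation

    // Rough download times for an iTunes movie at different connection speeds.
    // Sizes and speeds are the ballpark estimates from the text, not official figures.
    let movies: [(quality: String, sizeGB: Double)] = [("720p", 1.5), ("1080p", 7.0)]
    let connectionsMbit: [Double] = [4, 20]   // typical DSL line vs. a fast city connection

    for movie in movies {
        for mbit in connectionsMbit {
            let sizeMbit = movie.sizeGB * 1024 * 8        // GB -> megabits (approximate)
            let hours = sizeMbit / mbit / 3600            // seconds -> hours
            print("\(movie.quality) movie (\(movie.sizeGB) GB) at \(Int(mbit)) MBit/s: about \(String(format: "%.1f", hours)) hours")
        }
    }

A 6–8 GB movie on a 4 MBit/s line works out to roughly four hours of downloading, which makes it obvious why 1080p delivery was not an option for most customers yet.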

On Apple's side there was simply no way to quadruple their bandwidth cost without first having their new data center in place. Also, they would only launch 1080p video on iTunes if they had content for it and a business case, which is fancy terminology for "charge more".

Apparently there IS an AppleTV prototype sporting an A5 chip. It is internally called "AppleTV 3,1" and traces of it can be found in iOS 5. But for the stated reason this will never see the light of day. Apple would rather continue to build the previous generation, because by now their margin should be in the vicinity of 50%, up from an initially meager 30%. And since it fulfills customer needs perfectly at the perfect price point ($99), why would you prematurely change that?

I have shown in a previous post that the Apple A5 chip is too weak to drive a Retina display, that is, to shove 2048×1536 pixels onto the screen. Specifically, the PNG decoding of a loading image has to happen in a bearable amount of time, and with the A5 it does not. This is why the iPad 2 does not do Retina yet; that is reserved for the A6 and the iPad 3, due in spring 2012.
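To get a feel for what shoving 2048×1536 pixels onto the screen means, here is a small sketch; the decode throughput is an illustrative assumption, not a measured A5 figure:

    import Foundation

    // How much data a full-resolution Retina loading image turns into once decoded.
    let width = 2048.0, height = 1536.0
    let bytesPerPixel = 4.0                                    // RGBA
    let bitmapMB = width * height * bytesPerPixel / (1024 * 1024)

    // Hypothetical PNG decode throughput, purely for illustration (not a measured A5 value).
    let decodeMBPerSecond = 20.0
    let decodeSeconds = bitmapMB / decodeMBPerSecond

    print("Decoded bitmap: \(String(format: "%.1f", bitmapMB)) MB")
    print("At \(Int(decodeMBPerSecond)) MB/s that is \(String(format: "%.2f", decodeSeconds)) s just for the loading image")

Even with generous assumptions the loading image alone expands to a 12 MB bitmap, four times what the current 1024×768 screen requires.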

Now consider the biggest display panels that Apple has ever made. The discontinued 30″ HD Cinema Display had 2560×1600 pixels. The current 27″ Thunderbolt Display has 2560×1440 pixels, which is essentially the same number of pixels as the Cinema Display, but they switched the aspect ratio from 16:10 (and 25:16 even earlier) to 16:9 to accommodate the format that most TV content is actually presented in these days.
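For reference, the raw numbers behind the "essentially the same number of pixels" claim:

    import Foundation

    // Pixel counts and aspect ratios of the two large Apple panels mentioned above.
    let panels = [("30-inch Cinema Display", 2560, 1600),
                  ("27-inch Thunderbolt Display", 2560, 1440)]

    for (name, w, h) in panels {
        let megapixels = Double(w * h) / 1_000_000
        let aspect = Double(w) / Double(h)
        print("\(name): \(w)x\(h) = \(String(format: "%.1f", megapixels)) MP, aspect \(String(format: "%.2f", aspect)):1")
    }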

An A5 could not drive a current big Apple monitor even if it wanted to, but the A6 will be able to. Can you imagine Steve Jobs saying “That’s shit” two years ago when an engineer presented him with an Apple display (connected to an AppleTV2 prototype) that reduced the resolution for the TV part and had full resolution only when driven by a Mac?

There was another problem with too many pixels in the early days of large displays: to drive them you had to have dual-link-capable DVI ports. That only improved with newer generations of Macs and was finally resolved with Mini DisplayPort.

We now have wireless video going from a Retina iPhone or an iPad to displays. If the iPad 3 is supposed to be capable of the same, then you have to have an equally capable SoC at the receiving (and displaying) end. Do you think it is just a coincidence that the iPad 3 Retina resolution will be almost the same as that of the large Apple monitors?

Speaking of chips, both the A4 and the A5 are manufactured with a feature size of 45 nm. I bet you that the A6 will be the first built at 32 nm, a process that other chip manufacturers only cracked quite recently. Intel's famous Sandy Bridge is the first big consumer chip manufactured this way.

But I digress …

It seems to me that Apple wants to have as many A5 chips available as possible to fill the massive uptake of the iPhone 4S this holiday season. A4 Apple TV: good enough. A4 iPod Touch: good enough.

The typical cost of a large Apple monitor has dropped to below $1000, and I believe that you will be able to select the AppleTV option for any new monitor you buy from Apple in 2012, because the manufacturing cost of that option will be around $60, which can easily be "hidden" in the total cost of a monitor while adding a great deal of extra value. I imagine that people would buy these large monitors just to put in a conference room or even their living room.

Most likely we will also see a resurgence of 30″ displays, although the sweet spot for the living room seems to be 32″. While researching this article I found that it is next to impossible to buy a 32″ computer monitor, but that is where most reasonable TVs begin. Take a look at this chart that tells you the ideal distance to place the sofa from your TV based on its diagonal size.

In my own living room I have a 32″ TV and we sit 8 feet away from it. So we are on the smaller side (we could reasonably go up to 40″), but I have no use for a larger image. This TV can only do 1368×768, which is good enough for the PS3 or for watching iTunes movies.

The reason for these distances is primarily that even though the size gets larger, the number of pixels doesn't, because current material is limited to 1080p. So if you sit too close to a large TV you see the individual pixels. With a 32″ Apple monitor you could sit twice as close as to a regular TV, negating the need for a larger panel. Apple would never make a monitor or TV that is too large to be carried home from your Apple Store, which also limits the maximum size to about 32″. For everything larger than that you will still get a standalone next-generation "dongle" like the current AppleTV 2.
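The chart mentioned above boils down to simple geometry. Here is a minimal sketch using the common one-arcminute visual acuity rule of thumb (my assumption, not anything Apple has specified); the 2560-pixel-wide 32″ panel at the end is purely hypothetical:

    import Foundation

    // Rule of thumb: a pixel stops being individually visible once it subtends
    // less than about one arcminute of visual angle. Common approximation only.
    func minViewingDistanceInches(diagonal: Double, horizontalPixels: Double, aspect: Double = 16.0 / 9.0) -> Double {
        let widthInches = diagonal * aspect / sqrt(aspect * aspect + 1)
        let pixelPitch = widthInches / horizontalPixels
        let oneArcminute = (1.0 / 60.0) * Double.pi / 180.0    // in radians
        return pixelPitch / tan(oneArcminute)
    }

    // 1080p panels: the bigger the screen, the further back the sofa has to go.
    for diagonal in [32.0, 40.0, 50.0] {
        let feet = minViewingDistanceInches(diagonal: diagonal, horizontalPixels: 1920) / 12
        print("\(Int(diagonal))-inch 1080p TV: sit at least \(String(format: "%.1f", feet)) ft away")
    }

    // A hypothetical higher-resolution 32-inch panel lets you sit noticeably closer.
    let closeFeet = minViewingDistanceInches(diagonal: 32, horizontalPixels: 2560) / 12
    print("32-inch, 2560 pixels wide: sit at least \(String(format: "%.1f", closeFeet)) ft away")

With more pixels packed into the same diagonal, the minimum comfortable distance shrinks accordingly, which is exactly why a higher-resolution panel removes the need for a bigger one.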

And there you have it: the AppleTV TV will just be a large Apple monitor with a built-in A6 chip. Apple will never get into a fight with cable or satellite tuner boxes. Those you will simply plug into one of several available HDMI ports. Of course these will be HDMI 1.3 or higher, because that is the minimum standard to work with such a high-resolution display. Probably three HDMI ports: one for the TV box (cable/sat), one for a Blu-ray player, one for a game console. And of course a Thunderbolt port, should you want to use this as a regular monitor.

Now for those who are planning to cut the cord: Apple is working on content deals to give us live news and sports, the two killer apps for why people still think they need a TV set. Recently basketball and hockey appeared there, as well as the WSJ. YouTube and Vimeo are also there. Getting there …

If you think about it, our TV stopped being a TV the moment we connected a sat or cable box. Traditionally TVs were all about the built-in analog tuner, but who actually uses his TV's tuner any more? Our TV even has a DVB-T tuner, but there is no antenna cable plugged into it, because we have no interest in watching the handful of local TV stations. Instead we have a Mac mini with an EyeTV and a sat receiver. So that is almost like an iMac, albeit with much worse resolution and no Apple-style user interface.

Apple removed Front Row from Lion for this very reason. Nobody really used it because it did not have the breadth and extensibility of the interface you now have in the AppleTV. What would you say if an iMac you bought could sleep while the built-in TV part provided a much better way of consuming media via the embedded iOS?

That is why I think we will see AppleTV integrated not just into Apple monitors but also into all future iMacs. This way people will get the AppleTV experience whenever they buy any Apple display device. This has the interesting benefit that people can put it down as a business expense ("hey, I bought a computer!") while getting the entertainment and TV part for free. I can imagine the iOS integration being a $50 option in the Apple Store. Who can say no to that? You don't want to miss out on AirPlay capability! (Especially since there is no AirPlay for Macs, and there probably never will be.)

Most people I know are done migrating to HDTV-capable flat panels. Most are passing on the 3D fad. And very few expect to see affordable 4K displays during our lifetimes, let alone streaming such content over the internet. So why would anybody buy a TV set that simply bears the Apple logo and maybe absorbs the small AppleTV dongle? We Apple fanboys buy everything that has the logo on it, but we are not THAT stupid, are we?

An AppleTV/iMac or monitor combo would also bring communication into your living room. A built-in FaceTime HD camera makes for good virtual family meetings with the grandparents, or even watching TV shows together while seeing your loved ones picture-in-picture. Oh, and if you really must type something, there is the trusty Apple Bluetooth keyboard. Wireless audio over Bluetooth is also standard, so you can watch movies while the kids are sleeping. The big advantage of such a hybrid approach is that the built-in devices would be available both to the TV part when it is active and to the connected Mac when it is not.

This is how Steve Jobs can claim that he "cracked it": the answer to the Apple TV set is NO TV SET. It is a Mac or a smart monitor! It is totally OK to buy new hardware every 2 to 3 years, but I presume you change your TV less often than you change Macs.

This answers the question of who would want an AppleTV TV. Everybody who uses a Mac as a media PC. Everybody who wants to shrink his TV to reclaim that space in his living room. Everybody who is not using a built-in tuner. Everybody who is looking for an Apple-designed user interface for his entertainment and media. Everybody who wants to reduce the number of things in his life.



1 Comment

  1. How is it possible that Apple will get a display with that resolution into mass production while all of its suppliers struggle to make it?

    A good modern graphics card could support this sort of resolution for 3D, but what about mobile?
    You would have to increase the power of the GPU, and that will increase battery consumption…
    Apple giving up on the famous 10-hour battery life? I don't think so.

    Apple inventing a new battery – well, if it were that easy the new iPhone 4S wouldn't have its problems.

    I'm afraid the iPad 3 will just be a big disappointment for many, with a better camera and Siri.

    Also, think about developing software for it: even the biggest Apple monitor is barely enough to display the simulator – not great if you develop on an Apple MacBook.

    As much as I like the idea of having a Retina display in the iPad, I don't see it happening in the next 2 years.