How do you deal with incoming calls or similar disturbances while riding a bicycle? Today a bicyclist in front of me demonstrated his reaction: he took out his phone and tried to interact with it, nearly causing an accident with other bicyclists. The more proper response would be to ignore it and, if it felt important, find a place to stop and check it.

Glances provide a second option: you can glance at your watch. It’s still a disturbance, but better than fumbling one-handed with your phone. There can be no interaction, though: your other hand is firmly on the handlebar, and the hand on the watch arm cannot interact with it. So, all of a sudden, the interaction that turns a short glance into a long glance comes into play. Looking at the watch a bit longer can provide additional information about the disturbance, making it easier to decide whether to stop and take care of it, or keep on cycling.

I’m curious to see how long a long glance is, how well glances and bike-riding or car-driving work together (probably still a better idea to focus on what you were doing), and how we can use touch-less interactions initiated by a glance.

When the iPhone launched, and for the first versions of the iOS SDK, an app was a bundle (a directory with metadata, if you will) with the suffix .app. System and third-party apps alike were each contained in a single bundle: no app spanned multiple bundles, and no bundle held multiple apps.

Already I’m simplifying, because there was one more thing to the app bundle: optionally, you could add a settings bundle, which would load into the system Settings app and let the user change your app’s settings from outside your app.
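As a refresher, a settings bundle is itself just a small bundle: a Settings.bundle directory holding a Root.plist that declares which preference controls Settings.app should render. A minimal sketch follows; the specifier keys are the standard Settings Application Schema keys, while the toggle’s title and defaults key are made up for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- PreferenceSpecifiers lists the controls Settings.app renders on the app's behalf -->
    <key>PreferenceSpecifiers</key>
    <array>
        <dict>
            <key>Type</key>
            <string>PSToggleSwitchSpecifier</string>
            <key>Title</key>
            <string>Enable Sync</string>   <!-- hypothetical setting -->
            <key>Key</key>
            <string>sync_enabled</string>  <!-- hypothetical NSUserDefaults key -->
            <key>DefaultValue</key>
            <true/>
        </dict>
    </array>
</dict>
</plist>
```

The app then reads the same key back through NSUserDefaults, which is what makes the “change settings from outside your app” trick work.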

Since then, things have become more complicated. To start with, we got universal apps: still one app, but now with the complexity of targeting two platforms, the iPhone and the iPad.

Following that, we got extensions, which could (and in my opinion should) be seen as apps in their own right: little plugins, if you will, to the share sheet, the today panel, the keyboard, and the photo editor. Many apps that used to be limited to living in their own app now live more as guests hosted in other apps than in their own. But they are still packaged together with “the app”. And sold together. Being in the same bundle, you cannot delete one without the other being deleted too.

With the Apple Watch, we get three more extensions: the glance, the notification (with short glance and long glance), and the watch app. These are supposed to provide windows into your app, but I think that for many apps, we will again find that these are more the app than the main iOS app is. And again, if you delete the main app, these extensions will also disappear from your Apple Watch.

I’m working on a hobby project at the moment, and I think it’s a good illustration of what is bundled with the app now:

  • It has the app, which is a little bit different depending on what size of screen it’s running on
  • It has a settings bundle
  • It has a share extension so that you can share items with it
  • It has a today extension so you can see what has changed since you last logged into the app
  • It has a notification extension to notify you when new stuff you could be interested in has happened, together with a long glance and a short glance
  • It has a glance that duplicates the today extension’s functionality
  • It has an Apple Watch app where you can do the basics of the iOS app

Oh, and I should probably add that the functionality of the app isn’t all that advanced. But this only goes to show how many facets this app will have. For my app, it fits well with the iOS app being the centerpiece that the extensions interact with. But what if you’re a storage container like Dropbox? What if you’re a time tracker? What if you’re a health tracker? What if you’re a currency converter? For all these things, the main app in most cases becomes a sidekick that can give you a more advanced look into your data, but which you’ll probably interact with very little compared to the extensions.

What worries me the most is that the main app is required. And if it’s not the centerpiece, people won’t be sure why they should pay for it. That’s very different from a world where people could buy a kick-ass keyboard, share extension, photo filter, or Apple Watch app on its own.

The other thing that worries me is what will pay for all these extensions. As you could see from my hobby project, it’s a whole lot of extra work on top, and honestly it’s more or less a baseline requirement. If my app were a paid app, I don’t think people would accept an in-app purchase to unlock the Apple Watch extensions. I don’t even think Apple would allow me to submit an app with three extensions that do nothing by default. And shipping something that doesn’t do much and then opening up functionality later seems just as bad. Showing ads on this little constrained device would probably not sit right with many people either: you’ve just spent, say, $500 on this piece of jewelry, and now it’s plastered with ads most of the time. Nope, that’s no good.

I think the “everything is in the ‘app’ bundle” approach we have now does a lot to cement, or even worsen, how hard it is for people to make money off their apps alone, and I really think this is unfortunate. I would love for this ecosystem to become something people can make a good living off by making good things. If people can make a living selling good wax candles, why shouldn’t they be able to make a living selling great digital tools? For now, the only good business I know of is for the candle maker to use the apps as an entry into his candle shop, to sell more, and perhaps more custom, wax candles.

For WatchKit extensions, I’ve used the parallel of puppeteering: the app makes the extension do things, but the extension itself has close to no logic. Others use the parallel of the browser, and I like that one. Just as a server serves up state in a context onto a remote screen through the browser, the iPhone app can project state in a context onto an extension (a remote view controller).
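In WatchKit 1.0 terms, the puppeteering maps onto `openParentApplication(_:reply:)`: the extension asks the parent iOS app for state, and merely pushes whatever comes back into its interface objects. A rough sketch, with Swift 1.x-era signatures approximated and the request key, label, and payload invented for illustration:

```swift
import WatchKit

class StatusController: WKInterfaceController {
    @IBOutlet weak var statusLabel: WKInterfaceLabel!

    override func willActivate() {
        super.willActivate()
        // The extension holds almost no logic: it asks the parent app
        // for state and projects the reply into the UI.
        WKInterfaceController.openParentApplication(["request": "status"]) {
            reply, error in
            if let status = reply?["status"] as? String {
                self.statusLabel.setText(status)
            }
        }
    }
}

// Meanwhile, in the iOS app delegate, the puppeteer answers the request:
func application(application: UIApplication,
    handleWatchKitExtensionRequest userInfo: [NSObject : AnyObject]?,
    reply: (([NSObject : AnyObject]?) -> Void)) {
    reply(["status": "3 new things since last time"])  // illustrative payload
}
```

The reply dictionary is the whole contract: state crosses from the app to the extension, and the extension just renders it.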

Continuing yesterday’s post with the idea that hosting extensions (I really prefer the name remote view controllers, like we saw in iOS 6) is something that could be opened up, let’s take it to the extreme and say it was proposed as an open standard, much like HTTP. All of a sudden, this could become the new browser, allowing any app to project its content into it.

That would be absolutely awesome! Think about all the opportunities for device integration! It could possibly allow me to treat a 5K iMac as a dumb terminal, having all my personal information, documents and state on my phone, and interact with it through a beautiful desktop experience. No compromises, all integration! At the same time, you’d have the Apple Watch integration, with extensions running off the same iMac and integrating with it when nearby. Third parties such as keyboard makers could integrate a little screen, Google Glass could take it to the next level, and even other platforms could participate, making for a seamless integration. This would be a dream of devices coming together, much like when having disks formatted for Mac, Amiga or PC was no longer a thing. Data was just data. Now state in a context can be state across a shared context, optimized for the best interaction on each device.

Having thought that thought, please, Apple: propose extensions as an open standard. This would make our devices so much more integrated, reducing hassle and really delighting. Would this be a competitive advantage for Apple? I’d argue yes. Sure, not for people like me who buy the whole stack already, but for people like my mother, who has a PC, an Android phone and an iPad. She’d get such an amazing upgrade to her experience that I’m sure she’d be much more likely to buy from Apple again if she didn’t have to consider which ecosystem her new device would integrate with and which it would not.

Back in 2012 I wrote about the private framework around Remote View Controllers, hinting that developers should keep it at the front of their minds. In iOS 8 this became extensions, and with the Apple Watch, extensions are its main interface, at least until WWDC 2015.

On the Debug podcast episode 57, Guy & co. mentioned that the Apple TV is a prime target for extensions, and I agree, but let’s think it over. In my case, I have a kid and no time to watch TV. I use the TV mainly to stream music, just like I would with an AirPort Express (why, oh why, is this not in the AirPort Extreme and Time Capsule! And why, oh why, have the Apple TV and AirPort Express not merged yet? Neither has been updated in forever. Sigh…). The killer feature here, and I really did not expect this to be of any interest when I bought it, is that if I turn the TV on while playing music, it will also show the most recent photos I’ve taken with my iPhone, which means up-to-date photos of my kid. That’s really awesome, and makes both my wife and me stop, look and enjoy. Oh, and my boy likes it too. :-) The second big thing is AirPlay, being able to share my screen on the TV. And this is where Apple TV extensions could carve out a niche, as I see it.

Take Adobe Lightroom. I love that they have brought it to the iPad, as I have had my 160k photos stored and arranged there since around Lightroom 2.0. There are tons of things I’d want them to work on in the app, and of course, showing photos on the TV is part of that. At the moment, my only choice is AirPlay, which makes what I do and what my parents see while visiting the same thing. Presentation tools like Keynote and PowerPoint have given us separate screen output for ages, so this seems clunky today.

But when would the screen extension fire off? With extensions on your phone or Mac it’s easy: just fire when it seems appropriate. With the Apple Watch, if it’s close by, you’re wearing it, and the phone can show stuff there. But with the Apple TV, most of the time when I’m nearby, the TV is off, or it is not the source being displayed. So there needs to be some mechanism for alerting the app when it would be proper to use the display, or for the app to request to use it.

That got me thinking: where else would I like content to be displayed? Sure, Google has shown that glasses are a valid target; HTC has shown the same for phone covers. I can imagine a lot of places, and not all of them are places I think Apple would get into. Would it be worth it to Apple to allow third parties to host remote view controllers (sorry, Apple Marketing: extensions)? That would, for instance, open up the possibility that my window manufacturer could allow apps to project something onto my window. Or refrigerator. Or whatever. Would that be a thing for Apple, just like with MFi, and would that give me value as a customer?

I decided what to do about the iMac hard-drive situation mentioned in Finding a good Thunderbolt disk: I ordered a Seagate GoFlex Thunderbolt adapter and caddy containing a 1TB drive, which I’m going to replace with a 1TB Samsung 840 EVO.

That means I’m not going to get the 550 MB/s of the Samsung drive, but about 330 MB/s, where the SATA/Thunderbolt adapter caps out. I’m sad about that, and I’m also sad that I couldn’t find a SATA/Thunderbolt 2 adapter at any half-decent price (not that anything Thunderbolt is decently priced, but I mean not ridiculously, stupidly priced).

It was a toss-up between the LaCie and this one, but believe it or not, the built-in Thunderbolt cable counted as a minus for the LaCie, as I don’t know what I’d do if it broke. Also, LaCie didn’t disclose which disk was inside. With my combination, at least I know I can move the disk into the computer or into another enclosure if I turn out to be really sad about the performance drop. I expect to keep the disk around longer than the enclosure.

So, I’m waiting for the mailman to come, and dreading the stupid US/EU import tax (I expect to be writing about that later) and customs charges.