Us at Meadhall

A few years ago I wrote about a book I had found by a self-published author with a weird name, Leinad Zeraus, called Daemon, which combined the best of what was technically possible with the most dramatic elements of the near future as defined by science fiction. The entire book was a metaphor for the lack of control that comes from automating too much and living far too close to the limit of what any part of a complex, interconnected system can sustain in the face of even the smallest of Black Swans.

I was so taken by Daemon (and the palindromically named Leinad) that I cold emailed him to ask if we could get together and chat about me optioning his book for a movie. I knew nothing about what any of the words in that phrase meant but I'd seen it on Entourage and hey, I was recently feeling like a Master of the Universe after selling my company.

Daniel, true to self, let me down gently by telling me that someone else had already thought of that (a small shop called DreamWorks) but thanking me profusely for being a fan, thus sparking the kind of authentic author/fan relationship that is becoming more and more common as the Internet eliminates layers of indirection ("distribution") between creators and their fans. I've since gotten ARCs (advance reader copies) of all of his subsequent novels and reveled in how, as an author, he truly embodies the Steve Jobs mantra of living at the intersection of engineering and liberal arts, working on big themes dressed as thrillers with narrow AI, drones, and augmented reality as their villains and heroes.

It was a treat to see Daniel again in the flesh this week as he trolled the hallways of MIT for inspiration and gushed about the advances of private space flight, the maker movement, and VR. Get him going with just one beer and he'll give you positive proof that the sculptor Brancusi was right when he wrote about artists: "When we are no longer children, we might as well be dead."

Postnote: Daniel will be stopping by our Oculus Rift "Celebrating Hardware Innovation" event tomorrow night, so come and see two pieces of the future together in one place.


Back in 1996, when the first "web applications" emerged at the apex of the client-server era, one common frame for the tradeoffs of the web browser was the "richness" of an interface like Lotus Notes or Microsoft Outlook against the "reach" of one like Hotmail or eGroups. While the productivity purists would bemoan the latency of the browser's limited rendering capabilities, handicapped by an Internet that was too slow, they would jealously look on as the web heads collected users by the millions on computers that differed in screen size, operating system, and even (gasp) Internet browser.

In the end, reach won: because networked applications benefit from more users, because the browser vendors had a clear spec to deliver against when it came to interactivity in the GUI (and now no one remembers that Microsoft had more to do with this than any other single vendor), and because broadband became reliable and ubiquitous (at least in the first world). Interestingly though, it took about a decade, and many prematurely claimed victory along the way.

Despite the excitement this last week at MWC in Barcelona around the mobile web as the future of mobile application deployment (an ebullience that reached fever pitch among those who came to worship at the altar of the Mozilla/ZTE Firefox OS phone), the truth is probably along the lines of the Mark Twain observation that while history may not repeat itself, it certainly has a way of stuttering.

Firefox OS represents a noble cause: per the marketing plastered all over their booth at MWC this week, it aims to "connect the next billion people" with affordable (read: anemic) hardware and an "open" platform. The demo I was given looked quite a bit like early WebOS demos did, with the "apps" sitting on cards that could be moved around, paused/resumed, etc. It was a little laggy, but not as much as I had expected. In short, the device pointed to a future where fat apps give way to the "reach" of folks who might carry loads of these devices, or, in the case of the developing world, share them, while all of the relevant bits of state live on a server.

It's going to take a while though, as was evident in how poorly the games showed (no sound), how much lag the show's tortured, limited bandwidth introduced, and perhaps most importantly, how weak the offline story remains, particularly for media storage.

More importantly, the mobile communicator (aka the smartphone) is so much more of a personal device that the richness end of the spectrum feels like it is about much more than just snappier UI and local resources, extending to all sorts of sensors and actuators that will take the W3C a long time to bake into the DOM. And to boot, unlike PCs in the late 1990s, we're a long way from over-serving the needs of the users on this platform.

I'd love to see the mobile web win, and have no doubt that it will in due time as it did on the prior dominant platform. But there is little reason to think that it won't take a decade just as it did the last time, and the belle of this year's MWC ball was a good reminder of that.

Postscript: While I was noodling on this, this Engadget editorial perfectly captured a lot of the problems I saw at the booth.


In Love with LÖVE

I've been looking for the past year for a suitable programming environment for my two boys (10 & 7) that moves them beyond graphical environments like Scratch and Lego Mindstorms and into the land of real programming (a few years ago we had great success with Scratch). In my mind, the target has always been something like the Apple ][ BASIC ROM that shipped with the machine: easy to grasp and, when paired with the GR (low resolution graphics) mode, an endless source of fun. However, it's been an absolute bear to find something that fits the bill in terms of abstracting away enough complexity while remaining engaging enough in the era of Minecraft and Halo 4. Until LÖVE, a quirky game framework.

Before I get to why LÖVE is working, here is a list of the stuff we played with and dismissed:

  • PyGame: though it benefits from maturity, tons of material online, and Python at its core (a fantastic starter language with tons of headroom), it is too low level for kids. Plus it is a bear to build on OSX.
  • Pythonista: iOS only, sadly, and both boys have yet to get the memo that keyboards are out and fingers on glass is the future. If this got ported to the Mac, it would win hands down, as the API is fantastic. Maybe my kids will eventually start living in that keyboard-less future...
  • Gosu: Ruby's version of PyGame, it looked great until we tried to build it and saw some of the same cracks PyGame suffers from. On top of that, there are a bunch of Ruby projects for teaching kids that use it, so the documentation is a bit inconsistent.
  • ImpactJS: I want so badly to believe in a future where all apps are in the browser, but if you value your sanity, it's a tough environment for beginners. ImpactJS is well done if you already have the basics of game development down and want to target the browser, but otherwise, I call it the "English major maker."
  • Unity3D: pros use this, it supports a lot of languages, and there is a free edition. But it is a full and complex IDE, so teaching someone to program with it is like teaching a kid to draw with Photoshop. Sometimes crayons are better (Playmaker makes some of this go away, but at the cost of being Scratch-like).
  • Text-based terminal games: I loved Zork and Choose Your Own Adventure books, but it turns out that no graphics = no fun for little guys, so this was never a real option.

Enter LÖVE. I had seen this about 6 months ago but dismissed it because the programming language, Lua, looked weird and I'd never heard of it. But it turns out to be just what the beginner brain needs and, surprisingly, its warts are a padawan's advantages. Crappy scoping? No need to explain locals, globals, etc. Only a handful of datatypes? No need to explain how numbers aren't always numbers. Pascal-like "end"s everywhere? Much better than braces or whitespace.

On top of that, the API offered by the underlying game engine makes sense, boots quickly, and offers a really nice learning curve. It's only been a weekend but already we've got a 2D game with a minimalist state machine, collision detection, sprites that don't look totally awful, and sound effects to boot!
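
For a flavor of how little ceremony LÖVE demands, here is a minimal sketch of the kind of main.lua a beginner can start from (the names and numbers are purely illustrative, not our actual game): three callbacks and something is already moving on the screen.

    -- main.lua: a minimal LÖVE program (illustrative sketch, not our weekend game)
    function love.load()
      -- a hypothetical "player" table we define ourselves; Lua makes globals by default
      player = { x = 100, y = 100, speed = 200 }
    end

    function love.update(dt)
      -- dt is the seconds since the last frame, so movement speed
      -- stays the same regardless of frame rate
      if love.keyboard.isDown("right") then player.x = player.x + player.speed * dt end
      if love.keyboard.isDown("left") then player.x = player.x - player.speed * dt end
    end

    function love.draw()
      -- stand-in "sprite": a filled rectangle until real art shows up
      love.graphics.rectangle("fill", player.x, player.y, 32, 32)
    end

Drop that in a folder, point the love executable at the folder, and it runs; the whole edit-to-play loop is short enough to keep a 7 year old's attention.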

If you are looking for something to ease kids into real programming, you could hardly do better: well documented, nicely packaged, and free, on top of a language which may not please the purists but will certainly engage your kids in the basics.


The other major number that caught people's attention in the recent Apple earnings call was the drop in gross margin (47% to 38% YoY), which is mostly explained by the iPad mini but clearly hints at lower margins in their future. This is a normal part of any suitably competitive market and should be expected; what might not be expected, however, is how much less insulated the post-PC components of Apple's future currently are from margin pressure than prior software platforms were.

The theory goes something like this: Microsoft was able to maintain an exorbitant margin on its Windows and Office franchises because of the combination of feature lock-in and network effects. Everyone focuses on the latter because it is easier to see: more people using Windows means that file sharing is easier, as is the size of the platform target for developers, which enriches the platform and makes it harder to choose an alternative. But the former is equally important in my view and can be summed up in a statement I once heard about MS Word: "no one uses more than 10% of Word's features but for just about everyone, it's a different 10%."

Compare that to any of the utility apps on iOS/Android that have more than 10M users: if everyone used just 10% of their features, you'd be using fewer than one feature of Dropbox or Evernote. Put another way, these apps on iOS are super thin by design, and new apps seem to be getting thinner. This has two benefits for app creators: first, it is easier to port stuff when all you are doing is rewriting a view layer for the specific target platform, and second, you can lean on your back end services for the real heavy lifting (such as Evernote's awesome OCR), which means you are basically out of version-management hell for the 90% of the iceberg that is your app.

For the platform provider though, the world is a much uglier place. Right now, there are no significant apps on my iPhone that I couldn't find on Android or Windows Phone. And the OS-provided apps are less and less relevant every day. Now that I've replaced music, email, the address book, and data sync, the only app truly keeping me "stuck" is the Apple Remote app, and even then primarily as a parlor trick.

This may dramatically affect platform differentiation/profit taking in the future (or at least completely shift the game to one of hardware fit and finish), but there is also an obvious way out. If Apple were to begin using its cash hoard to buy some of the better services of the post-PC era and close them off from all of the other platforms, they might have a better reason to maintain their high margins. I say buy because of the lessons of iCloud and Dropbox: everyone thought Dropbox was a feature until iCloud proved to be its poor country cousin, drunk on moonshine and losing your files en route to the cloud. Still, Apple might yet have an awesome service or two in house. For instance, knocking off Spotify seems like a no-brainer (they are the largest licensee of music in the world after all), and working on a radically simpler, Apple-esque music consumption experience when the world's catalog is at your fingertips seems well aligned with what the company that brought us "5,000 songs in your pocket" could do well.

The other potentially very interesting vector for sustained differentiation would be enriching iOS as a more full-featured content creation platform. Given the massive shift away from PCs to tablets that even Apple has felt, giving app developers compelling and differentiated APIs that allow for more expressive creative or work tasks, and simply staying one or two tricks ahead of the competition, might sustain that edge. This is, after all, the company that first gave us an API for tracking 10 fingers down on the glass (and Siri is neither forward-looking enough nor a core strength of Apple's, given their mediocre execution on the server side).

Who would have thought that thinness could be such a curse?


Last night Apple announced results that would set records in the world of business at any time and for any industry, and yet the stock will tank on the trigger-finger nervousness of Mac diehards from 1984, fueled by the PTSD they suffered from the market-share slide that forced them to Windows and the simplistic notion that history always repeats itself.

To me, the most interesting number in yesterday's earnings results was the massive shrinking of Mac shipments (4.2 million versus 5.2 a year ago) which, all other supply chain excuses notwithstanding, tells the tale of the end of the PC more clearly than any IDC or Gartner figures about overall PC shipments or even lackluster early Windows 8 results. Those can all be squinted away, but if you are looking at the Eloi of technology as the leading edge, the Apple laptops, and specifically the MacBook Air line, are as good as one can get on the evolutionary branch that started more than two decades ago with the Compaq luggable: small, more than powerful enough, able to run Windows as well as OSX, and aggressively priced for the level of fit and finish delivered. Apple has managed to sell millions of these guys, always growing the shipments, until just this last quarter. And as Tim Cook stated on the call, the obvious reason for the decline has got to be the ascendancy of the iPad (and its clones), which now do the jobs people were buying MacBooks to perform.

It's not quite as clear-cut as this because the input-heavy jobs may still be getting done on a laptop, albeit a much older device whose replacement cycle will be driven (if at all) by total breakdown rather than by shiny and fancy. But by and large it would seem we are there and that, for the most part, Jobs was right in claiming that PCs would become trucks: specialized vehicles for heavy loads.

I can see two big implications for startups building consumer products. The first may be that the non-mobile experience should now start with the tablet rather than the PC. Whether this means two native apps or one really good HTML5 one remains to be seen, but it most certainly will not be mouse-driven interaction (hovers, fine control of a visible pointer, tabbing, etc.) on a 24-inch screen.

The second implication is the experimentation we need to take on to improve input methods for this new tablet-as-default ecosystem. No matter how good fingers on glass get, I still talk to plenty of online shoppers who discover products on the tablet and then email themselves so they can finish checkout on their PC, an absolute funnel killer for any e-commerce experience. The obvious answer here is OneClick and its brethren, but there are loads of non e-commerce cases that need to be addressed even before we consider the possibility that content creation for most of the non-truck population is going to be about more than pokes, likes, and retweets (wishful thinking).

I can see better keyboard options closing this gap (get inspired by the only good thing about the Surface), but I suspect we'll also get plenty of software-only alternatives. All part of the magic that comes from having a new and clearly constrained design center for apps.

It's not new news that tablets are taking over, but after last night's results it feels pretty irrefutable.
