Your regular Monday post was eaten by the Smoke Monster

No? Not buying it? Meh. Works on Lost.

I got precious little writing done last week or over the weekend because, frankly, I was too depressed to care. It happens once in a while, and this time I wasn’t able to push through and write anyway. The experience did give me an idea for writing a piece on the best examination of what it’s really like to be clinically depressed I have seen in popular media (yes, I’m serious), and when the dam finally burst, well. I’m working on one of my way-too-long-for-a-blog how-tos on how to be cloud-based and use local, OS-native applications anyway, a “hey! you writers!” cautionary piece about not talking your story to death before you write it, and, get this, a short story (which I will not be talking about until it’s done, because, hey, read the previous clause in this sentence).

So, really, more to come. Honest. In the meantime, console yourself with the knowledge that some questions are just unanswerable, like where did the phrase “on the fritz” come from.

Apple is the new Palm

When Jon Rubinstein and his band of Apple cast-offs unveiled webOS and the Palm Pre, they were hailed as the next Apple. So it’s only fitting that the Apple of 2010 is taking its cues from the Palm of old.

Remember back in the day, when Palm’s market capitalization was bigger than GM’s? While I think we can all agree that Palm was never really that big, they were the driving force of mobile tech in their day. And they did that because they were masters of efficiency, squeezing the best user experience out of the hardware available at the time. Over time, through countless management and ownership changes, Palm lost sight of this core competency.

To a lot of people in the tech world, Apple’s not exactly the first company that comes to mind when someone says “pro-consumer.” I’d argue that while Apple is heavy-handed in their app store policies, et cetera, it’s all with the best interests of the consumer in mind; they’re trying to make the user experience as good as it can possibly be. And in so doing, they’ve taken a few pages directly out of Palm’s playbook.

For three generations, the iPhone and iPod touch screens have been the same half-VGA 320×480 as the Palm and Sony Clie devices of years ago. Now Apple’s competition is rolling out higher-resolution screens, and Apple has to answer. Consumers can see the difference between the iPhone’s 320×480 and the HTC devices with 480×800 screens. But the iPhone OS isn’t resolution independent (as, ironically, Palm’s new webOS is), so how does Apple roll out higher-resolution screens without breaking the hundreds of thousands of apps people already use?

Palm had a similar problem in the early 2000s. They’d pushed their 160×160 screens as far as they could, even adding color. But Microsoft’s Pocket PCs had 240×320 quarter-VGA screens that just looked better. So Palm doubled their resolution along both axes—effectively quadrupling the pixel count—to 320×320. New applications could make full use of the additional pixels, while older apps were automatically scaled up, or pixel-doubled, by painting a 2×2 square of small pixels for every pixel of the original 160×160 screen. Older apps worked just fine, and new apps looked even better than QVGA apps on Pocket PCs. (Eventually Palm added another 160 vertical pixels, replacing the old silkscreened Graffiti text-input zone with screen space that could be used for display when text entry wasn’t needed, bringing us to the same 320×480 that all iPhones use today.)
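Pixel doubling is just a nearest-neighbour upscale. Here’s a minimal sketch of the idea in Python (illustrative code, not Palm’s actual blitter): every source pixel becomes a 2×2 block of identical pixels.

```python
def pixel_double(image):
    """Upscale an image 2x along both axes by painting a 2x2 block
    of identical pixels for every source pixel (nearest-neighbour)."""
    return [
        [pixel for pixel in row for _ in range(2)]  # repeat each column
        for row in image
        for _ in range(2)                           # repeat each row
    ]

# A tiny 2x2 checkerboard of grayscale values...
small = [[0, 255],
         [255, 0]]

# ...becomes 4x4; a 160x160 screen scales to 320x320 the same way.
big = pixel_double(small)
```

An app that’s aware of the new resolution skips this path entirely and draws straight to the full 320×320 grid.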

So how is Apple addressing the resolution problem? As seen on the new iPad, they’re using the same solution Palm did. Apps written for the iPad’s larger XGA (1024×768) screen take full advantage of the new resolution. Older iPhone apps can run either at native 320×480 in a small window, or pixel-doubled up to 640×960, taking up most of the iPad’s screen. When Apple releases the iPhone HD this summer, it will ship with a 640×960 screen nearly the same physical size as the current HVGA screen and automatically pixel-double older apps to fill it. Because the screen sizes will be nearly the same—actually, it looks like the iPhone HD screen is a tad smaller than the iPhone 3GS’s—these older apps will look exactly the same as they do on older iPhones. But updated apps designed to take advantage of the extra pixels will look amazing, better than anything on Android devices, thanks to the iPhone HD’s superior pixel density. Thanks for the idea, Palm!

Let me give you another example. At the recent iPhone OS 4 sneak preview event, Apple unveiled how they would address what had been considered the key flaw in their mobile OS: multitasking. Instead of running an arbitrary number of full apps in the background, chewing up memory and processor cycles that the foreground app—the app the user is actually choosing to pay attention to at the moment—needs, the way webOS and Android do, Apple decided to go a different way. They will continue to run only the current active application by default, and allow developers to opt in to specific APIs that run background threads when they’re actually going to do something useful. You can keep streaming music from Pandora in the background, but the bulk of the Pandora app quits when you’re not actually looking at it. You can be notified of an incoming Skype call without having to leave Skype running. The user gets the benefits of multitasking without the resource-bloat downside.
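The opt-in model can be sketched abstractly (this is illustrative Python, not Apple’s actual API): each app declares the narrow services it needs to keep alive, and the OS suspends everything else the moment the user switches away.

```python
# Hypothetical sketch of opt-in background multitasking: instead of
# keeping whole apps resident, the OS only keeps the services an app
# explicitly registered for (audio streaming, VoIP, etc.).
class App:
    def __init__(self, name, background_services=()):
        self.name = name
        self.background_services = tuple(background_services)

def on_switch_away(app):
    """Return what keeps running when the user leaves this app."""
    # Default is full suspension; only opted-in services survive.
    return list(app.background_services)

pandora = App("Pandora", background_services=["audio-stream"])
notes = App("Notes")  # no opt-in: fully suspended in the background
```

Here Pandora’s audio stream keeps playing after a switch, while Notes consumes nothing at all—which is the whole point of the design.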

When Apple announced this, it seemed vaguely familiar to me, but I couldn’t place why. Then a friend of mine pointed out that this is exactly the multitasking implementation that was to be in Palm OS 6, code-named Cobalt (Palm OS 5 was Garnet). Cobalt never appeared in a shipping device, but that was due more to the byzantine politics and rights issues between Palm and its spin-off PalmSource in the mid-2000s than to any technical failing. Architecturally, Cobalt was sound, and its method of multitasking would have been far more efficient and snappy than the alternative in Windows Mobile. By extension, Apple’s multitasking in iPhone OS 4 should be far more efficient than what Android and webOS phones offer today, providing a faster, more consistent user experience without needing the gigahertz processors and 512MB–1GB of RAM found in Android devices. Keep in mind the Palm Pre was considered a poor multitasker with 256MB of RAM until the updated Palm Pre Plus shipped with 512MB, while the iPad, designed with iPhone OS 4 in mind, has only 256MB, as does the iPhone 3GS. Multitasking will be disabled on the 128MB original iPhone and iPhone 3G, so obviously Apple has determined that 256MB will multitask just fine.

Good ideas are good ideas, and Apple has picked up the baton from Palm as the user experience champion in mobile.

How I went from Apple store newbie to lifetime ban in one week at Protocol Snow

Okay, this is a little terrifying. While Apple on the one hand does more than anyone else to ensure a great user experience (more on this tomorrow), they simultaneously seem hell-bent on destroying the customer experience.

As the banned blogger explains: “When Apple delayed the international iPad launch by a month, early adopters worldwide started to panic. Since my nearby Apple store initially had plenty of stock, I offered to purchase and ship iPads internationally for members of the NeoGAF gaming forum. I was doing this as a favor, unlike hoarders who were unloading iPads on eBay to cash in on the $150+ markup. Instead, my asking prices were very reasonable, just enough to cover all the tax, international express shipping, and Paypal fees with a little left over for gas and my time.”

The really disturbing part is the canned, robotic language the clerks in the Apple Store are forced to use. I’ve gotten more information out of voice-recognition phone trees. This is exactly the kind of behavior Wired co-founder John Battelle called out in his recent open letter to Apple, in which he referred to Apple as the Howard Hughes of the industry. And not in a good way.


It’s the apps, stupid

Why did Apple finally approve Opera Mini for the iPhone? Because on smartphones, browsers don’t matter.

One of the most important messages to come out of last week’s iPhone OS 4 sneak preview event at Apple has kind of fallen off the radar. In the iAd part of the presentation, Jobs said, “Search has not happened on a mobile device like on the desktop. People spend all their time in apps; they go into Yelp and don’t do general searches. This is where the opportunity is—within apps, not search.”

This is a major departure from the message coming from Apple in the early days of the iPhone. In 2007, it was all about Mobile Safari, searching and web apps. Three years later, the platform has matured and a different use case has emerged. Most iPhone users don’t use Safari all that much. Even when they search, they do it in discrete apps rather than in Safari. When I want to find out who some guest star is in one of my favorite TV shows, I don’t open Safari or even the Google or Bing apps on my iPhone. I open the IMDB app and look it up from there. It’s faster, more targeted and an experience designed for the iPhone screen size.

Jobs gets this, and Apple’s new ad platform is designed to exploit it. Google figured out a decade ago that people are much more receptive to ads targeted to their specific interests. With iAd, Apple takes that even further, removing a lot of the guesswork. If you’re using an app, you’re probably going to be most receptive to ads with the same focus as the app you’re using. This gives Apple even more reliable targeting than Google’s use of keywords.

But there’s another side to this as well, one that I think is interesting in and of itself. Google’s ads live, for the most part, in browsers, because that’s how most people interact with the internet on “big” computers. As a matter of fact, I’m typing this post on a netbook (though I started it in the WordPress app on my iPhone) and the only application I’m running is Firefox. In various tabs, though, I have Gmail, Google Docs, Pandora, Sobees for Twitter/Facebook and Meebo for IM.

But on my iPhone, I can go months without even opening Safari. It just doesn’t factor into my workflow, even though I use all the same web-based services on my iPhone that I use on my netbook and my desktop. The difference is that on my iPhone I have a discrete app for each service. I have a Meebo app for IM that sends me push notifications when I have a new message. I use Reeder to keep up on my RSS feeds in Google Reader. I use Tweetie (soon to be just Twitter for iPhone) for Twitter. When I find an article I want to read later in either app, I send it over to Instapaper and read it in the Instapaper app. I use QuickOffice Connect to edit Google Docs directly, the iPhone’s Calendar app for my Google Calendar, and Action Lists to give my Toodledo to-do list a GTD workflow.

So when Apple approved Opera Mini for the iPhone last week, I wasn’t surprised at all. Apple has demonstrated that they understand very well that browsers don’t matter on smartphones. Opera Mini is just another app, no more useful to the average iPhone owner than Safari, which is to say not very useful at all. It’s likely to be ignored most of the time, because that’s what happens to smartphone browsers. Smartphones are different than “computers” as we traditionally think of them, and while both platforms need access to the internet, how they do it varies greatly. The size of the screen makes a qualitative, not quantitative, difference in how the device is used and how it accesses information.

And if you think that last sentence means that devices like the Apple iPad and Microsoft Courier are something altogether different from both smartphones and full computers, well, that’s a post for another time.


In which the blogger peers out from behind the curtain of depression


I’ve been gone for a while. I haven’t written here, or really anywhere else, for about three months. During most of that time, I wasn’t sure if I’d ever write again. I’ve been the slap-bitch of that old dog, depression.

Depression is a weird thing. It’s part mental, obviously, but also very much part physical. My brain, thanks to a genetic mutation passed down on my mother’s side, either doesn’t produce or doesn’t retain enough of the neurotransmitter serotonin. Low levels of serotonin suck all the color out of life, leaving the sufferer in a gray twilight—sans sparkly vampires—where nothing much seems to matter. No drive, no ambition, no dreams of something better.

In my case, the symptoms of depression are normally held back by the SSRIs (selective serotonin reuptake inhibitors) I take every day. But an emotional trigger can still push me over the edge, and I got one the first week of January.

I was already teetering on the edge, because I’d just finished writing a novel, the first one I’d actually finished in thirteen years. Writers often go through something very much like post-partum depression when they finish a book, for largely the same reasons. We just finished this really intense emotional commitment, and now it’s just… over. Now what? This is especially hard to deal with if you, like me, get stalled trying to immediately jump into the next book.

Queso, I was already a little fragile dealing with post-partum writer’s block, and then I lost my job. I mean, I didn’t actually lose my job; I know where my job is. They just don’t let me go there anymore. Instead of transferring from Old Job to New Job as expected, Old Job recalled me and then laid me off. Word had come down from On High that IT Support had to cut payroll by 20%. If you want to do that and lose as few people as possible because you’re barely hanging on as it is, you cut the people making more money than their peers. And who has two thumbs and was the highest paid person on the Helpdesk? This guy!

But here’s where it gets screwed up. If they just transferred me, as had already been approved, they’d lose my salary, but it wouldn’t count towards the 20% cut. So they had to lay me off instead so the bookkeeping would work out. I lost my job in the Great Recession because of friggin’ accountants.

As you might expect, the sheer cosmic insult of this was enough to push me over the edge. At first, I watched a lot of TV. You see, the last thing a depressed person wants to do is confront their own life, their own problems. Life becomes an almost manic struggle to keep oneself distracted, anything to avoid actually thinking about your life. You cling to these distractions like a life preserver in shark-infested waters, because they make the pain go away.

Eventually, and this surprised me too, I ran out of forensic shows in syndication to watch. I know! Between NCIS, the various CSIs, Criminal Minds, Cold Case, etc. you’d think I’d be set, but over time I started to recognize the ones I’d already seen. And once you’ve seen @wilw run over by a semi, you really don’t need to see it again.

In February, I got into the beta for Star Trek Online, and zeroed in on something else I could lose myself in. I now understand those guys at SF cons debating the finer points of Trek canon, like why the Gorn are so damn angry. Trek has nearly fifty years of backstory, and the game ties into quite a bit of it. To get the full experience—and, as mentioned, avoid my own experiences—I immersed myself in Star Trek.

I’ve always been a casual Trek fan. I watched most of TNG and the early seasons of DS9, and of course all the movies. I remembered sitting with my Dad as a kid while he watched the original series in syndication, but didn’t really remember anything specific. And I loved the JJ Abrams reboot last summer.

Now, in my desperation to avoid thinking about myself, I went full-on Vulcan salute Trekkie. I bought all the TV series on iTunes and started watching them in chronological order: Enterprise, TOS, movies, TNG, DS9, Voyager. I redownloaded all the Trek novels I’d bought over the years from eReader and arranged them in chronological order, and bought the dozen or so books that take place in the 30 years between the end of Nemesis and the beginning of the game.

And I played a lot of the game.

Star Trek Online isn’t the first MMO I’ve played by a long shot, but it’s the first one where I’ve hit level cap, gotten a character to the point where they can no longer progress because the developers haven’t built that content yet. But after weeks of shooting Klingons, Romulans, Cardassians and Borg, something at the back of my mind started to itch. A few weeks more, and I started to listen.

That itch was telling me that it was time to start writing my own stories again. Time to start blogging again, time to go back to the Unification Chronicles universe and finish telling the story. I have my own centuries-long sprawling space opera, dammit.

My tentative plan is to go back and finish rewriting Revelation, make it as solid as I can, then post it to Smashwords/iBooks, Amazon and Fictionwise/eReader/Barnes & Noble. Once it’s out of my hands and “in the wild,” I start on Crusade and ride it all the way to release as well, then Jihad and so on until the series is done. I have absolutely NO timeframe in mind in which to do this. I have no illusions or intentions about getting any of it published commercially or making a living as a novelist. This isn’t about business.

It’s about the itch. It’s about telling stories. And it’s about time to get back to writing.