State of the SmartPhone (and Tablet)

In the last few months, the smartphone market has been substantially shaken up, culminating most recently in Nokia's CEO announcing a "change of direction." Many smart people, Eric Raymond among them (and he has consistently predicted how things would go when naysayers and business analysts said otherwise), are calling this new direction the beginning of a "death spiral," or even a "suicide note."

So – which way are things going overall?

Barring some sort of dramatic refocusing that Nokia's CEO shows no sign of attempting, Nokia is no longer relevant. It's still possible that, Phoenix-like, they will find a new vision and become relevant again, but they missed the boat on making smartphones that normal people can use without reading the manual. When Apple brought itself back from the brink of death, it did so by finding a vision, jettisoning a confusing product lineup, and completely reinventing itself and what people expect from a computer. People in the tech industry snickered at the iMacs, the lack of floppy drives, the first iPod, and the first iPhone – yet this focus on vision, on experience as an integral part of the design rather than a sprayed-on patina, this attention to little details, and the constant reinvention is how Apple regained its relevance, and is so far keeping it.

Creating two different divisions, focused on two product lines at cross-purposes with each other, is not the way to rebuild yourself. Worse, Nokia is tying itself to the new Windows Phone. Hitching yourself to a product line that, despite initial love and admiration from the tech crowd, has dismally failed in the marketplace is not a winning proposition either. I thought it would be pretty competitive myself, but the sales figures are damning.

Confusion of mission and scope is especially a "bad thing" in a world where people, courtesy of their experience with the iPhone, iPad, and Android, expect simplicity. Focused, effortless simplicity may look simple, but it's hard. It takes work. It takes a ruthless attitude toward "how do people use this?" and "which of these features do I really, really, really need?"

RIM has similarly made many missteps, and is also finding itself drowning in a sea of iOS devices and Androids.

So what’s left?

In the phone space, I see Android as dominant in market share, with Apple still hanging on. That market-share dominance is why I'm exploring Android programming on top of taking time to learn programming for the iPhone. Unlike some, I'm not discounting Apple's survival in this space. They've too consistently reinvented themselves – they almost have a fetish for replacing product lines with something new even while the old ones are still selling well – and they have survived decades of doomsaying. Android will have the variety of features, while iOS devices will be streamlined to do the handful of things most people want very well, in a polished manner that Google still barely approaches even where individual features are better implemented on Android.

Android will probably hold the bleeding edge, but remember, Apple has rarely been first to do anything: windowed OSes, MP3 players, online music stores, smartphones, even tablets – they've just been the first to do them in a way that makes them insanely popular outside of the geeksphere (guilty!) or niche, vertical markets. It's telling that the Android OS we see today bears almost no resemblance to the Android previews from before the iPhone came out. As long as Apple keeps that culture (and with Jobs' hand-picked execs in there, that should hold for another couple of years after he gives up running the company), Apple will do well, sales-wise, and stay relevant. While the actual intro-day sales on Verizon this last week were slower than expected, I can't discount the number of Verizon users I personally know who hate AT&T and were incredibly excited to have an iPhone available on Verizon. And bought one.

It's also important to note that Apple is the first phone maker to mostly divorce the features of the phone from the carrier, though some limitations do exist (look at how long it took AT&T to allow tethering after the iPhone itself supported it). Google is still getting there: there is fragmentation not only in the add-ons that various manufacturers provide, but also in the features and software that the various carriers restrict. Add to that the fact that Android updates are relatively complex for consumers – after Google updates the OS, each manufacturer needs to modify that update to work with the changes made for its handsets – and many don't bother to. Nevertheless, the obvious trend for Android is to become more and more independent of the carriers and the manufacturers.

Tablets are trickier. Here, Apple has less of a lead, and the Galaxy Tab, as well as the demos of Android 3.0 for tablets shows that there are some things that can be done differently (for god’s sake, Apple, fix your notification system already!!). You also have HP joining the game with their WebOS-based tablets by mid-year, though there’s no announcement on price.

I predict roughly the same for Android vs. Apple. Yes, Apple's lead is smaller, but using a modern tablet is a much more tactile experience than a mobile phone screen, and build quality here matters. As a result, the Samsung-based tablets are no cheaper for the features than the equivalent iPad. One advantage the Android tablets have is that they are already incorporating faster processors, which make up for some of the jerky responsiveness on Android phones that breaks the tactile illusion the iOS devices maintain.

WebOS is a wildcard – we’ll have to see what HP can make of it from a developer and App perspective, but the product itself seems solid.

No matter how you add it up – interesting times, and the presence of several players will keep Android and Apple honest. 



We’re going through several changes here.

First of all, I’ve moved to a new host. 

Second – I’ve upgraded the blogging software at the same time. This was actually less scary than doing the upgrade in place, because I not only had a local copy of the website, but a fully functional one online I could always flip back to. The only headache was getting the old database uploaded to the new server, as phpMyAdmin didn’t want to handle that much data.

Next step after another post on DNS stuff and filtering: Get my theme updated. 🙂

The Windows Registry

There are a number of places in Windows where Microsoft has solved an existing problem by overcorrecting, and causing other problems well down the road. One of the places this is currently most obvious in Vista is the way you are bombarded by a flurry of “allow/decline” messages every time you do anything that modifies the system. They had to do something about stuff getting installed behind your back, and opted for irritating overkill.

Another place is the registry.

The registry exists for a very good reason. Wayyyy back in the dark days of Windows 3 and DOS, getting your computer configured, adding hardware, and telling your machine where all of your programs lived required editing a series of files scattered across the hard drive. Programs would place these configuration files in seemingly random locations, and many installation programs for new hardware or software would misread or, worse, break the configuration files.

The benefits gained were many. System files were collected into one location where drivers and add-ons could easily find them. The same was true for program preferences. It provided a fast and consistent means of storing this information. Access to most of these settings was through control panels unless you jumped through hoops to manually edit them, reducing the number of potential errors. On top of that, its structure as used in Windows 2000 and XP allowed corporate computer policies and settings to be configured and enforced centrally. All this was achieved without having to worry about file permissions. There’s even a degree of built-in backup, and many errors could be recovered using the last known good state.

That said, I’ve all too many times run into serious issues when the registry gets messed up. This could be the user settings loaded with your profile when you log in, or worse, the machine settings. Software installation, uninstallation, and reinstallation are also much more difficult than they should be. Finally, the registry accumulates cruft over time.

Let’s tackle the last, first. Any system of settings can leave behind bits and pieces, and personal program preferences are the worst. Even on a Mac, deleting a program does not get rid of the preference files that store all of your settings. On the Mac, though, those preferences aren’t read until their respective program loads, so all they do is tie up space on your hard drive, with little or no other impact. In Windows, if the uninstaller either deliberately leaves the preferences behind or forgets them, they become part of the ‘hive’: they are loaded when the computer starts or when you log in, and they are yet another point where the registry can become corrupted and fail, even if they are never used. Besides, tying up this room in memory means longer load times and less memory left over to actually run programs.

Remember — all things being equal, something more complex is more likely to break. That’s why we engineers value simplicity in design, and “Rube Goldberg” is something of an insult.

Programs entering themselves into the registry is also the reason for a common complaint among Windows users that Mac users find criminal: the lack of program portability. If you have to reinstall Windows in anything other than “repair” mode (and sometimes even then), you are virtually guaranteed to spend hours, if not days, reinstalling every piece of software on the computer. If you decide you will be using a certain program on your shiny new desktop, you can’t simply copy the program file over. By contrast, about the only programs on a Mac requiring full reinstallation are drivers and the Adobe suite. Everything else can run from any directory, and if a program doesn’t find a set of preferences, it creates a default one. Many programs are installed by simply copying them to your hard drive, and they can be moved or copied by simply dragging them to their new home. At worst you may have to copy the license file out of the system preferences as well, or re-key the license. You don’t even have to put them in the “Applications” folder — the equivalent of “Program Files” in Windows. Getting rid of those same programs is as easy as dragging them to the trash. This is possible because there is no central registry tracking the locations of program files that breaks if you manually move a file, and because Mac programs are smart enough to create a default set of preferences.
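That Mac-style “use defaults if no preferences exist” behavior is simple enough to sketch. Here’s a generic illustration in Python – the file name and settings are made up, not any real program’s:

```python
import json
import os

# Hypothetical default settings a program might ship with.
DEFAULTS = {"window_width": 800, "recent_files": []}

def load_prefs(path):
    """Load preferences; fall back to the defaults if none exist yet."""
    if not os.path.exists(path):
        # No preference file found -- behave like a freshly copied
        # Mac application and just start from the defaults.
        return dict(DEFAULTS)
    with open(path) as f:
        return json.load(f)

def save_prefs(path, prefs):
    """Write the preferences back out as a plain file."""
    with open(path, "w") as f:
        json.dump(prefs, f)
```

Because the program never depends on a central database entry, the program (and its preference file) can be moved, copied, or deleted freely.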

What’s worse is when your registry gets corrupted. This can be corruption of the actual data, or corruption of the structure. Both can result in programs crashing, failing to start up, or worse, the computer never starting at all.

If the data is corrupt, then sometimes it can be manually tracked down and corrected. Usually this is easiest by removing and reinstalling the program – unless, of course, the uninstaller forgets to remove the relevant registry keys. If the structure is corrupt, it’s a nightmare: you cannot even access, much less delete, the relevant keys to fix the problem. At this point, getting the problem fixed becomes “interesting” in a Chinese-curse way, and unless you have very recent, clean backups of the registry on tap, you will likely have to reinstall Windows (in the case of system registry problems) or wipe the user account (for user registry settings). Few things are more frustrating than trying to get a clean and functional user profile working in a roaming-profile environment.
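The “last known good” recovery the registry provides can be illustrated with a toy version in Python, using a plain settings file standing in for a hive (the file names here are hypothetical, and a real hive is far more complex):

```python
import json
import shutil

def checkpoint(path, backup_path):
    """After a successful start, save a 'last known good' copy."""
    shutil.copyfile(path, backup_path)

def load_settings(path, backup_path):
    """Load settings; on corruption, restore the last known good copy."""
    try:
        with open(path) as f:
            return json.load(f)
    except (OSError, ValueError):
        # The data or structure is corrupt -- roll back to the
        # backup and load that instead.
        shutil.copyfile(backup_path, path)
        with open(path) as f:
            return json.load(f)
```

The catch, as noted above, is that this only saves you if the backup itself is both recent and clean.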

In short, the registry has solved a number of problems, but has also brought along a number of headaches that just get worse and worse as a computer ages and more programs get added and removed. This is why many Windows experts recommend a reinstall every year or two, and why many tech support lines commonly ask you to reinstall Windows when troubleshooting a problem.

Stealth, and National Treasure

Saw National Treasure again this weekend with the family and loved it all over again. Sure, there are “because it’s a movie and we can’t take the time to show this” nitpicks – there’s even one big logic flaw involving where a shadow falls – but all in all the production values and quality of the film make it an absolute joy to watch.

Stealth is a popcorn movie: predictable as a straight line (though the AI isn’t evil), with logic flaws and plot holes you could steer a carrier through. That said, it was a great ’80s-style fluff experience with stuff blowing up, and it was worth the price of the rental.

Now I need to watch that copy of We Were Soldiers I’d picked up at the library.

One is Good, Two is NOT Better.

We techs always recommend that you keep an antivirus program functional on your computer. Like all good things, too much can be a very, very bad thing.

Just like mixing various prescription drugs, mixing more than one antivirus can have disastrous effects on your computer. The problem is not in the file scanning portion of the antivirus, but its true, hidden bread and butter, the real-time protection. These functions check everything you open and manipulate for virus-like behavior.

Now, just imagine two programs competing tooth and nail for access to everything your computer does, and you can see the crawl your computer slows to. I’ve seen it before. Just thinking about it again gives me the creepy cold chills of a truly scary book, without any of the enjoyment. That, or the sick dropping feeling in my stomach.

So, as a note, please, please, please, only keep one of any type of active security program on your computer. Only one antivirus program with active, “real-time” protection, only one antispyware program that provides proactive protection. One good one is more than enough.

One Little, Two Little, Three Little Windows…

You would think that most people, even those not truly computer-savvy, know which version of Windows they are running on their machines. Insofar as knowing whether they are running Windows 98 or Windows XP, this is usually, but far from universally, true. What most people don’t realize is that for all intents and purposes there are at least four versions of the Windows XP installation disks, all mutually exclusive.

Yes, four. If you decide to include the corporate open license versions, there are even more.

Those of you stuck at two (XP Home and Pro) can be excused for your confusion, because in truth, that is what Microsoft will tell you. What Microsoft doesn’t tell you is that there are two versions of XP Home: the one you buy over the counter, and the “OEM” version that is usually preinstalled on your machine when you buy it from Dell (or Gateway, etc.). While the actual copy of Windows on the disk is identical between the retail and OEM versions, each has its own disk, its own set of installation keys, and its own installer.

Wait, it gets much worse. For the tech, anyway.

Many people who need Windows XP reinstalled have lost their original disks. Fortunately, it’s easy enough to carry around a copy of the various flavors (OEM and retail) of XP home and pro. Guessing which to use is also usually pretty easy based on what OS was originally installed on the machine, which we can often discover by looking for the Microsoft label on the side. It’s critical that we use the correct disk, because with the new activation features, if you don’t get the right version on, you don’t have a valid activation key, and 30 days later Windows stops working.

Imagine, though, the confusion for the poor user who doesn’t realize there is a difference. I see enough people who don’t know they can’t use a friend’s copy of XP Pro to fix XP Home. Compound this with the fact that many home users who upgraded to XP in the first place often lose their keycodes, and reinstallation becomes nearly impossible unless you’re sufficiently geeky to keep rescue tools like Knoppix and USB thumb drives around.

So what is Microsoft doing to make things easier for us, the users?


Worse than nothing.

According to a recent article at Ars Technica, there will be seven – yes, seven – versions of Windows “Vista,” destined to replace Windows XP. Hopefully these don’t also come in OEM and retail flavors, because at this point I’m beginning to get confused as to which version is capable of what, and I pity the non-geek. Carrying four CDs around is annoying enough, and at least I know what I’m doing. Usually.

Setting Up a Home Router

A home cable/DSL router may be the second best improvement you can make to your home computer and network as far as making your broadband connection usable, and keeping your home computer safe. For a very little bit of time and effort (and roughly $40 American) you can prevent all sorts of headaches.

First of all, what is a router? According to my Techno Babble page it is:

A piece of hardware that connects two separate networks together and routes information between them…

The two networks we are talking about are the internet, and one that likely didn’t exist until you installed the router – your home network. The second you plug your computer into the port marked “LAN”, or one of the numbered ports (if the router has a built-in switch), you have an instant, if very small network made up of your computer and the router. The second you attach the router to the cable modem or DSL modem, you have added the router to the network we call the internet.

How to tell if you need a router:

Some DSL modems provided by companies like Bellsouth already act as routers. If this is the case, then you do not need to add a router, though you may want to add a switch and/or a wireless access point to allow more computers onto the internet, or to free yourself up from being tethered to the desk. In order to tell if you are behind a router:

If you have a Windows machine, click on “Start”, then “Run”. In the box provided type:

cmd

…and click OK (if you’re still using Windows Me or Windows 98 you will have to type in the full word “command” instead of “cmd”). When the black box with the “>” prompt appears, type the following:

ipconfig

…and hit the Enter key. You will get a short list of numbers.

For Macintoshes running OS X, open up the System Preferences and look at the Network preferences. For older versions of OS X you may have to specify “Built-in Ethernet” in a drop-down menu.

What you are looking for is a line that starts with “IP Address,” followed by a series of four numbers separated by periods. If it starts with 192.168, with 10, or with 172.16 through 172.31 – the private ranges home routers hand out – and you are able to get online, then you can stop worrying. You’re good to go. (An address starting with 169.254 means your computer couldn’t get an address from anything at all.) If not, your standard mail-order place like CDW, Newegg, or PC Zone can help you, as can any local Staples, Radio Shack, or electronics store that sells computer equipment.
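If you’d rather not eyeball the numbers, the same check is easy to automate. Here’s a quick sketch in Python using the standard `ipaddress` module (the addresses below are just examples):

```python
from ipaddress import ip_address

def behind_router(addr):
    """True if addr is a private address (192.168.x.x, 10.x.x.x,
    or 172.16.x.x through 172.31.x.x) -- the kind a home router
    hands out on its LAN side."""
    ip = ip_address(addr)
    # 169.254.x.x is "link-local": it usually means the computer got
    # no answer from any router or DHCP server at all.
    return ip.is_private and not ip.is_link_local
```

So `behind_router("192.168.1.5")` comes back true, while a public address like `"8.8.8.8"` comes back false.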

Setting Up The Router

Hook it up between your modem and your computer: the cable or DSL modem plugs into the router’s “WAN” (or “Internet”) port, and your computer plugs into one of the “LAN” ports.
You will likely have to do one or more of the following three things. First, if you have a cable modem, unplug it completely for a few minutes – don’t just turn it off. The reason is that whatever computer or router the modem first sees is the only piece of equipment it will talk to. Unplugging the modem clears this memory and allows it to start talking to your router.

Knology and some other providers may ask you to provide the “MAC” address of your computer or router. As opposed to “Mac” computers from Apple, a MAC address is a unique ID number given to every network card. Your router will have its own number printed on the outside of its casing, and most routers can also “clone” your computer’s MAC address if that is the one your provider has on file.

Finally, a percentage of internet companies, like Time Warner, require your computer or router to log in. In this case you will also have to follow your setup instructions for configuring the router and find the option (often on the main page) to have the router connect to the internet using “PPPoE.” You will also have to type in the user name and password your ISP gives you. This can unfortunately be problematic and confusing, made worse because most ISPs don’t support home routers, even though it is unsafe to put your computer directly on the internet without one.

When this is all set up, the router decides if any information it sees on the internet needs to be forwarded to your home computers, and if anything your computer is asking for needs to be sent out to the internet in order to download a web page or file. Without any further configuration or setup, you already have the following benefits:

  • Because of a firewall technology called NAT that is built into nearly all home routers, your computer and home network are now one step removed from the internet. By creating a separate network, you’ve just made it significantly harder to crack into. More importantly, it is almost impossible for most “worms” (a type of virus that scans nearby networks every few minutes) to get into your computer.
  • On some ISP networks, it’s fairly easy to browse and find computers in your neighborhood. While this is less common these days, having a router prevents anyone else in your neighborhood from seeing what computers you have running and from looking into any files you may accidentally share out.
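The NAT behavior in the first bullet boils down to a translation table: outbound connections create entries, and inbound traffic that matches no entry is simply dropped. A toy sketch in Python – nothing like a real router’s implementation, but the bookkeeping is the same idea:

```python
class NatRouter:
    """Toy NAT table mapping a public port to a (private host, port)."""

    def __init__(self):
        self.table = {}
        self.next_port = 40000  # arbitrary starting public port

    def outbound(self, private_host, private_port):
        """A home computer opens a connection: allocate a public port
        and remember who asked for it."""
        public_port = self.next_port
        self.next_port += 1
        self.table[public_port] = (private_host, private_port)
        return public_port

    def inbound(self, public_port):
        """Return the private destination for a reply, or None --
        unsolicited traffic (a scanning worm, say) matches no entry
        and never reaches the home network."""
        return self.table.get(public_port)
```

Replies to connections your computer opened flow back in; everything else bounces off, which is why the worm scanning your neighborhood never sees your machines.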

Also, if your router has a built-in switch, or if you are using a separate switch, you can now connect more than one computer to the internet without paying for more than one internet account. Finally, if you bought a wireless router or add a wireless access point, you can also access the internet from any wireless computer in your household.

It’s Spaceship Three

Not really. What’s resurfaced in a press release, and is being discussed at Slashdot, is that Spaceship Three, from Scaled Composites and Virgin Galactic (no lack of ambition evident in that name…), is intended to be an orbital craft. While I agree that SS1 can’t simply be scaled up to an orbital vehicle without significant work on materials and engines (among other things, the total thrust needed to reach orbit is significantly higher, as are the deceleration and reentry heat), it will still be interesting to see what the X Prize model of development will bring forth from Scaled Composites and other competitors.

It’s a Bird, It’s a Plane…..

An article from the University of Florida describes an engineer who has developed an RC-model-based drone with wings that change their actual shape in flight (as opposed to extending flaps, etc.), morphing from an F4U Corsair-like inverted gull to the opposite.

The intended usage is for highly maneuverable drones that can be operated even in a city.

An MPEG of the wing in action is available here.

It would take some work to scale this up to human-carrying aircraft. One of the reasons plane wings are relatively stiff is that a wing strong enough to carry what we would call a decent load would become unreasonably heavy if it also had to have all the parts and supports required to bend like a bird’s wing. In the meantime, it will be interesting to see how far this can be pushed, and more maneuverable, cheap drones are also a good thing, with any number of applications, many of them civilian.

Comparable Worth

If you have a bureaucracy that is responsible for determining who makes how much for what job, what happens when you need to create a job position that hasn’t existed before because you’ve created a new product, a new way to make things, or your business model requires your employees to combine aspects of existing jobs in new ways? I can imagine few better ways to stifle innovation and job creation than to make it nearly impossible to create new types of work.