Archive for the ‘Microsoft’ Category

The reality of Macs and Malware

September 11, 2009


I read an article recently that inspired me to dive into this topic with a blog post of my own. The article in question was “Why are there no Mac viruses?” by Philip Elmer-DeWitt. The article makes a few interesting points in and of itself. However, I found the comments which accompanied the article to be of equal interest. I’m reminded of the fact that most people who comment on such topics don’t understand what a virus is, much less how it differs from something like a Trojan horse. I’m also reminded of the fact that misery loves company.

For those who feel they have a firm grasp on the definitions and distinctions between the various terms associated with malware, feel free to skip to the next section.

So, what is the difference between terms like Virus, Worm, Trojan Horse, Spyware and Malware? People seem to feel free to use these terms interchangeably, as if they were the same thing. They are not.

Malware, short for “malicious software”, is a very generic term that collectively refers to any sort of bad program running on your computer. Malware may come in the form of a Virus, a Worm, a Root Kit, a Trojan Horse or even Spyware. One dictionary offers the following definition: “Malicious computer software that interferes with normal computer functions or sends personal data about the user to unauthorized parties over the Internet.”

A Virus is a piece of malware that has the ability to self-replicate, which makes it the most dangerous type of malware. One dictionary offers the following definition: “a segment of self-replicating code planted illegally in a computer program, often to damage or shut down a system or network.”

A Worm is similar to a Virus in terms of danger or threat, but there are several distinctions. A worm spreads across a network without any user intervention (this distinction is important as compared to a Trojan). Also, unlike a virus, a worm does not attach itself to existing code. One dictionary offers the following definition: “computer code planted illegally in a software program so as to destroy data in any system that downloads the program, as by reformatting the hard disk.”

That definition is incomplete in this case, so the Wikipedia entry provides further detail.

The Wikipedia definition is as follows:
A computer worm is a self-replicating computer program. It uses a network to send copies of itself to other nodes (computers on the network) and it may do so without any user intervention. Unlike a virus, it does not need to attach itself to an existing program. Worms almost always cause at least some harm to the network, if only by consuming bandwidth, whereas viruses almost always corrupt or devour files on a targeted computer.
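The defining property shared by viruses and worms, and absent in Trojans, is self-replication. A harmless sketch of the idea in Python (all file names here are illustrative; this simply copies a file, which is what real malware automates against host programs or remote machines):

```python
import os
import shutil


def replicate(src_path: str, dest_dir: str) -> str:
    """Benign illustration of self-replication: copy a program's own
    source file into dest_dir and return the new path. A real virus
    attaches such a copy to an existing host program; a worm pushes
    copies to other machines over a network with no user action at all."""
    dest = os.path.join(dest_dir, "copy_of_" + os.path.basename(src_path))
    shutil.copyfile(src_path, dest)
    return dest
```

The point of the sketch is only the mechanism: the code that produces copies of itself needs no help from the user, which is exactly what a Trojan lacks.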

Root Kit
A Root Kit is more difficult to describe. It’s probably best described as a combination of a Virus and a Trojan. It allows unauthorized users to take control (“root” access in UNIX or “administrative” access in Windows) of your system without the knowledge or permission of the legitimate systems administrator. Root Kits are typically installed unknowingly as part of otherwise legitimate software installations, and they only become effective after being installed by someone with administrative access to the system.

Root Kits go back to 1990, but they are most commonly associated with the 2005 Sony BMG scandal. Music CDs from Sony installed a Root Kit onto Windows-based PCs in an attempt to enforce DRM. In the process, they created a huge security hole for anyone who was aware of its existence. One dictionary offers the following definition: “A root kit is a computer virus which consists of a program (or combination of several programs) designed to take fundamental control (in Unix terms “root” access, in Windows “Administrator” access) of a computer system, without authorization by the system’s owners and legitimate managers.”

Spyware is basically a piece of malware installed on your system that monitors your activity, collects information about you (without your knowledge) and reports that information back to another source. This data is typically used for marketing purposes.

A subset of spyware called “keyloggers” can be used to steal a user’s password, credit card number or any other sensitive data entered by keyboard.

Another type of spyware is called “adware”. Adware is computer software that automatically downloads, plays or displays advertisements on your computer when you run certain programs. One dictionary offers the following definition: “any software that covertly gathers information about a user while he/she navigates the Internet and transmits the information to an individual or company that uses it for marketing or other purposes.”

Trojan (horse)
A Trojan is best described as any piece of malware that is installed or run by the user through deception. Trojans typically do not exploit known security holes. Rather, they trick the user into executing them by pretending to be something else. The only real defense against a Trojan is a healthy dose of common sense: don’t install software from untrusted sources, for example. Trojans often accompany pirated software downloaded from peer-to-peer clients. Trojans are not like Viruses, as they don’t self-replicate. They are not like Worms, as they don’t automatically spread via networks. They require an end user to manually execute them. This is an important distinction. One dictionary offers the following definition: “a non-replicating computer program planted illegally in another program to do damage locally when the software is activated.”

In the wild
This is another term with an important distinction. In the referenced article, Elmer-DeWitt uses the following definition for “in the wild”:

“‘In the wild’ means it has infected, or is currently infecting, new machines through normal day-to-day usage.”

On the Mac platform, for example, there have been several “proof of concept” attempts at creating a Virus. However, due to technical and/or security barriers, these “proof of concept” viruses have never been able to propagate “in the wild”. This of course renders them completely ineffective and nullifies them as a security threat.

So, why are there no Mac Viruses?
In the referenced article, Philip Elmer-DeWitt claims there are no known Mac OS X viruses in the wild. By contrast, there are thousands of known Windows based viruses in the wild. When you stop to think about that, it’s a pretty amazing claim to be able to make. Similarly, it isn’t very surprising that Apple would play to that strength with its “Get a Mac” advertising campaign.

Elmer-DeWitt attributes the lack of Mac viruses to three reasons: small market share, a stronger UNIX-based file system and kernel, and “viruses going out of style”.

Security through obscurity
The first reason listed, “small market share”, is by far the most common explanation offered by people (qualified or otherwise) who comment on the topic. This answer is sometimes called “security through obscurity”, and it is especially popular with Windows users. Many Windows users would like to think that Macs are just as vulnerable as PCs. This reasoning continues by suggesting that the smaller user base makes Macs less viable targets.

While there is some truth to this line of thinking, it’s a bit short-sighted. On one hand, it’s a fair argument to suggest that viruses are created with the intention of gaining monetary value or wreaking as much havoc as possible. With this mindset, targeting the Mac user base (less than 10% of the overall PC population) would be of little value. However, if that were the whole story, Macs would more likely have fewer viruses rather than no viruses at all. Historically, another motivation for creating viruses has simply been bragging rights amongst the “hacker” population. Imagine the notoriety that would go with being the first to create a legitimate Mac virus!

Further evidence to debunk this claim is that the “classic” Mac OS (Mac OS 9.x and below) had up to 60 viruses (depending on the source) over the years. Clearly, the classic Mac OS was less of a target, but it was a target, and there were viruses for that platform. Similarly, there have been virus attempts for Mac OS X, but they have been unsuccessful due to technical and security limitations built into the core of the operating system. As such, it’s fair to suggest that Apple’s relatively low market share makes the Mac less of a target as compared to Windows PCs. However, those who suggest this is the only reason are simply mistaken, as logic would dictate otherwise.

Stronger UNIX-based file system and kernel
Clearly, Apple’s stronger UNIX-based file system and kernel have helped Mac OS X’s security reputation. If nothing else, the documented “virus attempts” would have been successful viruses were it not for this level of security. Arguably, this might be the only reason we haven’t seen a successful Mac OS X virus. However, that’s difficult to prove one way or the other.

Viruses are going out of style
Elmer-DeWitt claims that “The action these days, I’m told, is in Trojans and spyware.” That very well may be, but then the question becomes: where are all of the Mac Spyware infections, not to mention Root Kits, Worms, etc.?

The truth about Trojans
Trojans exist on both platforms, but these aren’t really breaches in security; they are breaches in common sense, at least with regard to security in modern operating systems. Trojans are often referred to as “Social Engineering” issues because they require tricking the end user into executing them with escalated privileges.

For example, I could write a very dangerous program and call it “WINWORD.EXE”. I could then supply that executable with a copy of Microsoft Word’s icon. If someone were to download this executable, thinking they were getting a copy of Microsoft Word, they would be very surprised when my dangerous program wreaked havoc with their system instead. Unfortunately, this type of threat is not so much a security issue because the program doesn’t exploit any known security hole; rather, the exploitation is of the end user’s lack of judgment or precaution. As such, it’s labeled a social issue. Though, to be clear, that “social issue” may very well compromise the security of your system, so categorically it has to be considered at least an indirect security issue.

With Windows XP, if you ran a Trojan and, like most every other XP user, you were running with administrative privileges, your system was compromised and you had no clue anything unusual was happening. At least with Mac OS X, when a Trojan tries to do something that requires more than basic user privileges (like infecting your operating system or wiping your hard drive), the user is prompted to authenticate with their administrative password (even if the user account has administrative privileges). Most people with an ounce of common sense would think this is strange and deny the authentication. Why would Microsoft Word need administrative privileges to create a Word document?
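The mechanism underneath that prompt is ordinary file-system permission checking: a process running as a standard user cannot write to system locations, so a Trojan must ask for elevation before it can do real damage. A minimal sketch in Python (the directory names in the comment are illustrative, not a macOS API):

```python
import os


def can_modify(path: str) -> bool:
    """Return True if the current process has write access to `path`.
    On a least-privilege system, malware run by a standard user gets
    False for system directories (e.g. /System/Library on a Mac) and
    must ask the user to authenticate as an administrator to proceed,
    which is exactly the prompt that should raise suspicion."""
    return os.access(path, os.W_OK)
```

A user’s home directory and temp directory are writable; system directories typically are not, and that boundary is what forces the Trojan out into the open.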

Fortunately, Windows Vista and Windows 7 have similar authentication requirements. The problem with Vista is that the use of UAC is overdone to the point that users simply become conditioned to accepting everything in order to clear the annoying window. Apple’s latest operating system release, Snow Leopard (Mac OS X 10.6), contains rudimentary detection for known Trojans. Windows 7 goes a step further and provides basic anti-virus (malware) protection.

Misery loves company
The knee-jerk reaction to an article like this is to do a Google search for viruses on Macs. After some digging, someone in the comments thread was able to find write-ups of supposed Mac OS X viruses. Without further investigation, this commenter declared victory by “seemingly” providing evidence to dispute Elmer-DeWitt’s claim of no Mac OS X viruses in the wild. The two viruses cited were:

  1. OSX.MachArena.A
  2. OSX/Leap-A or OSX/Oompa-A

The problem is, a quick Google search for a Mac virus may yield a few results, but if you dig into the details, none meet both requirements: being a real virus capable of self-replicating and existing in the wild. It’s also important to note that companies who classify a piece of malware as a Virus stand to gain financially as people become scared and purchase their anti-virus “solution”.

Such is the case with the first example, OSX.MachArena.A. It gained attention through a press release from Intego (who happens to sell Mac anti-virus software) on November 6, 2006. They did at least admit that it was just a proof of concept that did not exist in the wild. In truth, it had to be run from a Windows partition on a Mac (assuming the Mac user even had Windows installed) and, even then, it was executed manually and could not self-replicate. It was just a silly proof-of-concept experiment that never could have existed in the wild. In a MacFixIt article, Symantec acknowledged that “there is no reliable vector for the spread of OSX.Macarena”.

Similarly, Sophos Labs issued a press release on February 16, 2006 announcing that the “First ever Virus for Mac OS X” had been discovered. This of course was referring to the malware known as OSX/Leap-A. Not surprisingly, Sophos Labs just happens to sell Mac anti-virus solutions as well. Notice a pattern here? The entities that produce the scare with press releases just happen to have paid solutions available to you. How nice.

At the time, there were several technical sites which dissected this supposed “virus” in detail and made it clear that it was not a virus. A quick search today brings up lesser-quality explanations, but they still serve to illustrate the point.

“Even if someone does send you the “latestpics.tgz” file, your computer will not be infected unless you explicitly unzip the file, open it and then provide the computer with your password so it can run.”

“The Leap-A malware was a poorly-programmed Trojan horse that relied on “social engineering,” or trickery to perform its nasty function. There’s a simple way to protect against this kind of threat — common sense — and in testament to this, a lot of people didn’t fall for it.”

Macworld covered the issue and added the following:

“Apple’s Official Policy concerning this is: “Leap-A is not a virus, it is malicious software that requires a user to download the application and execute the resulting file. Apple always advises Macintosh users to only accept files from vendors and Web sites that they know and trust.” Apple provides a guide to safely handling files received from the Internet here.”

In short, to get this malware, you had to specifically choose to download it via iChat (Apple’s instant messaging software). Then you’d have to uncompress the file manually. OS X itself provides warnings at this point as a matter of course. Then you had to open/execute the file. Finally, you would have to authenticate with your administrative password in order for it to work. The real kicker was that even with all that, it only worked over a local network. Clearly this was not self-replicating given all of the manual steps required of the user. In reality, this was, as described, a poorly conceived Trojan and nothing else.

No doubt these “details” would seem to rain on the parade of the “Windows fanboys” looking for a little schadenfreude. More likely, it’s just a case of “misery loves company”. Either way, as of this writing, there are still zero legitimate Viruses (or Worms, etc.) for Mac OS X.

Vectors of attack
Historically, it’s important to look at the vectors of attack for common malware. Years ago, while I worked at IBM, a colleague of mine was a security researcher. It was his job to look for holes in software and try to exploit them. Anecdotally, I recall him telling me with a smile: “As long as Microsoft is in business, I’ll have an easy job”. The discussions often turned to examples like Microsoft’s ActiveX controls. This technology was Microsoft’s attempt to create a Microsoft Internet, whereby Internet applications would require ActiveX controls and thereby lock out other operating systems. Instead, ActiveX became the breeding ground for countless Virus and other malware attacks. Prior to Internet Explorer 7, Spyware could be loaded onto your Windows-based machine undetected, seemingly at will. Worse, since Internet Explorer was (and still is) used by the masses, attacks were very easy.

Similarly, just viewing an e-mail in Windows opened the door to countless virus attacks on PCs. The attacks were never-ending because they were so easy to write. Perhaps the original attack was done by an expert, but subsequent attacks by “script kiddies” (kids without much technical experience) were accomplished by modifying an existing virus just enough so that it wasn’t detected by existing virus definitions. With each mutation from the next script kiddie looking for fame came significant downtime for corporations and users alike.
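The reason a trivial mutation worked is that signature-based anti-virus matches known byte patterns. A toy sketch makes the point (the signature and samples below are made up for illustration; real scanners are far more sophisticated):

```python
def matches_signature(data: bytes, signatures: list[bytes]) -> bool:
    """Toy signature scan: flag data that contains any known byte pattern.
    Real anti-virus engines of the era worked on the same basic principle."""
    return any(sig in data for sig in signatures)


# Hypothetical example: a single-byte edit by a copycat defeats the
# old signature until vendors ship an updated definition.
signatures = [b"EVIL_PAYLOAD_V1"]
original = b"...EVIL_PAYLOAD_V1..."   # caught by existing definitions
mutated = b"...EVIL_PAYLOAD_V2..."    # trivial edit; not caught
```

This is why users were "only as safe as their latest virus definitions": every mutation opened a window of exposure until a new signature was distributed.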

The point here is that the most successful vectors of attack for viruses simply didn’t exist on any platform but Microsoft Windows.

Other comments in the thread
There were a number of other comments after the article that caught my attention and are worth addressing.

“Trojans are just as dangerous as viruses, and the distinctions are basically academic.”
While it’s true that Trojans can be just as dangerous as viruses, the distinction is far from academic. Viruses, Worms, etc. infect your system by exploiting security holes. Trojans, particularly on OS X, require the end user to both manually execute the Trojan and provide administrative authentication, all through trickery. For this reason, it hasn’t been an issue for Mac users. Though, for Windows XP (2/3 of the existing Windows user base as of this writing), simply executing the code was all that was needed. For this reason, it’s understandable why some Windows users would consider this distinction to be “academic”.

“Viruses can be taken care of with a $20/year investment in antivirus software and a bit of personal responsibility.”
That of course ignores the initial investment in the software. In any case, it’s said that Mac users have a false sense of security because they typically don’t even have anti-virus software installed. While that is true to some degree, the only valid reason for Mac users to run anti-virus software is really just to be a good citizen and not pass on Windows viruses that come through e-mail, etc. With regard to a false sense of security, it should be noted that Windows users who use anti-virus software are only as safe as their latest virus definitions. That is, on Windows, viruses happen. Until a virus has been detected and a solution is widespread, it may attack thousands of Windows users. Likewise, nobody is ever 100% safe. It should also be noted that anti-virus software runs with escalated privileges; in the past, anti-virus software has itself been a known vector of attack for security exploits.

“Time will tell.”
This is typically where debates go when there is absolutely no evidence to suggest Macs are as vulnerable as Windows. When Mac OS X was new, this was a fair argument to make. Any operating system needs to be exposed to a large user base for a few years to see if it stands up to the rigor of the real world. However, after nearly 10 years, this argument wears thin. How many more years will people resort to it? Certainly, anything is possible. Quite frankly, I’m surprised there hasn’t been a legitimate virus for Mac OS X after all these years. However, with each passing year, the likelihood of such security breaches would seem to diminish.

“It’s because the people who write the viruses all use Macs!”
I’m sure that was written as a joke. Of course, just to respond, how would these PC viruses which were developed on Macs be tested?

“If MAC OS X doens’t have viruses, why do AV ISVS like Symantec, McAfee, and Sophos make AV software for them?”
As of this writing, there is no need to run such software on Macs. In fact, the vast majority of Mac users do not run any such software. That’s not necessarily a good thing, but it is reality. Right now, the primary reason for Mac users to run AV software is just to be a good internet citizen and not pass on viruses through e-mail, etc. to other Windows users. For some, it’s also nice to know that such software exists in case of an emergency. However, to be honest, I don’t know how these companies stay in business. For most of them, this is just one of many products they offer.

“Let’s level the playing field here – if you “rule out” all those categories, the number of PC malware goes down quite a bit too. Not counting any browser (IE) vulnerabilities must cut it in half.”
While it’s true that ruling out things like Trojans from the overall Windows-based malware count would reduce the numbers significantly, by no stretch would it “level the playing field”. The number of Viruses, Worms, etc. on the Windows side would still be in the tens of thousands as compared to zero on the Mac side. Narrowing the definitions isn’t a magic bullet for “leveling the playing field”.

“Who cares that it doesn’t have viruses when it has other forms of malware?“
As a multiplatform user, I care. As for other forms of malware, there are a handful of Mac Trojans and none of them are particularly harmful. There are no known Viruses, Worms, etc.

“I’m the guy who promoted the $25,000 OS X virus challenge in 2005. I was thoroughly trounced by one and all as an evil, irresponsible criminal for having the guts to publicly say the technological truth about how the Unix frameworks of OS X and the as-shipped system configuration of Macs effectively eliminated any risk of non-user enabled entry trajectories for viruses.”
This wasn’t a question, but rather a comment from Jack Campbell, who promoted the $25,000 Mac OS X virus challenge contest. He was criticized by the Mac community at the time as being irresponsible, and that probably was true. However, the point remains: nobody was able to claim that prize. Four years later, nobody would still be able to claim it. Surely $25,000 is enough to generate interest from hackers, so we can’t say there was no financial incentive to create a Mac virus.

“Windows is more vulnerable because it is more open to software development – which is why there’s a HUGE number of programs written for it, vs a tiny percentage for Mac.”
Considering that every Mac ships with professional development tools (Xcode), I fail to see the logic in that statement. Further, considering the massive amount of development shifting to the iPhone, which is done on a Mac, the excuse of not having enough Mac-based developers just doesn’t hold water. For that matter, viruses could well come in the form of UNIX scripting, which certainly exists on many other platforms.

“Why do you suppose Mac added AV software to Snow Leopard?”
Technically, Apple didn’t add anti-virus software. Apple did add some basic anti-malware detection into certain services such as iChat, e-mail and the web browser. Currently, it can only detect a few Trojans (because that’s all the Mac malware which exists) and this service isn’t system-wide. This is a good first step, but it’s not a full anti-virus solution. In any case, to answer the question: if some malware does exist, why not build in some form of basic protection to further tighten security?

“no serious computer user or programmer gives a crap about apple, that’s why there are no viruses.”
I’m not even sure how to reply to such a ridiculous statement. However, comments like this do serve to demonstrate the notion that misery loves company and that there is no limit to the type of irrational arguments that come from Windows “fanboys” wishing to discredit the Mac platform in some way. It also serves to illustrate the level of intelligence of the typical ranting forum poster.

The article that formed the basis of this blog post was just one of many such articles on this topic. It never ceases to amaze me just how much misinformation and incorrect perception exists regarding this issue. The simple truth is, few people seem to have even a basic understanding of the terminology associated with security and malware in general. I’m not sure that will change anytime soon.

As I suggested earlier, misery loves company. Windows zealots would be very happy to have the Mac community under the same level of malware attack as they are, but that’s just not the case. Worse, the fact that Mac users can brag that “zero” Viruses, Worms, etc. exist on the Mac platform seems to be a particularly sore spot for many. As seen in the comments section of the referenced article, there were many outrageous (albeit unsuccessful) attempts to discredit the Mac platform’s track record with regard to malware vulnerability. Worse, software companies who develop Mac based anti-virus software have been quick to send out false alarms in hopes of creating a panic and thereby sell more of their product. In many respects, that’s shameful.

Finally, logic would dictate that no system is perfectly secure. Mac OS X certainly isn’t. As such, it would seem likely that eventually there will be a significant Virus/Worm attack for the Mac OS X platform. Yet to date, that just hasn’t happened. Perhaps Mac users are living with a false sense of security by not purchasing anti-virus software. I certainly wish I didn’t have to use anti-virus software on my PC. However, it’s also hard to argue the fact that malware just hasn’t been an issue for Mac OS X users through the history of that operating system. For the Windows fanboys out there, this may be a tough pill to swallow. We can argue why Macs have been a safer platform (smaller market share, better inherent security model, etc.) but we can’t deny the fact that the Mac platform has enjoyed a tremendously better track record in this regard as compared to the Windows platform.


Ballmer’s “Apple Tax” campaign

April 16, 2009


Like any good rivalry between competing companies, characters from both sides take shots at each other, hoping someone will take their word at face value and not challenge the validity of such claims.  When it’s done in humor, you tend to get a little leeway with the claims you make.  However, when you present such claims as fact, you’re likely to be taken to task.

History of rivalry

Microsoft and Apple became competitors when the original IBM PC shipped with MS-DOS back in 1981.  At that time, though, Microsoft wasn’t really seen as a competitor; Apple was competing with IBM.  Back then, competing with IBM was essentially a proposition doomed to failure, particularly within the enterprise market.  Microsoft wisely licensed their product to IBM rather than selling it outright.  In the process, Microsoft inherited the success of the IBM and future IBM-compatible market.

In the meantime, Microsoft was also a developer for Apple’s platforms, including the new Macintosh.  After developing products for the Mac, it was clear that the GUI was the future of operating system interfaces.  Microsoft promptly began to copy Apple’s designs.  Worse, as a developer for the Mac, they more or less had a blueprint of the foundational APIs and event models to copy from.  When Microsoft copied the Mac, the two companies truly became rivals.

Over the years, Microsoft was able to ride the wave of success on top of their operating system monopoly.  They were even able to extend that monopoly, through unfair business practices, onto their suite of business productivity applications, Microsoft Office.  During that timeframe, Apple’s market share continued to shrink to the point of near extinction in the mid-to-late ’90s.


Times have changed

Back in 1997, Apple’s annual revenue had sunk to $7.1 billion.  By comparison, Apple posted $9.74 billion in revenue in the last quarter alone.   Instead of posting losses, Apple has been posting profits measured in billions on a quarterly basis.  Times are good for Apple.  In addition to increased market share on the desktop PC sales, Apple has enjoyed huge success with consumer electronic devices like the iPod and the iPhone.  

Microsoft is still the leader in the desktop operating system business by a large margin.  However, the trend has been in Apple’s favor for several years now.  Years ago, Apple’s 2 – 3% market share was considered insignificant.  However, according to NetApplications, Apple’s usage market share is pushing 10% and has demonstrated a steady increase over the past several years.

On top of that, Apple’s Safari and various other WebKit-based browsers (along with Firefox) have demonstrated similar growth patterns at the expense of Microsoft’s leading Internet Explorer product.  Further, sales of the iPhone have now exceeded those of Microsoft’s Windows Mobile.  Considering how long Windows Mobile has been in the market, this is not a good sign for Microsoft.

Apple’s advertising offensive

For years, Apple has gone on the offensive with various advertising campaigns.  While Apple happens to make great software, they are primarily a hardware company, at least with respect to their business model.  As such, Apple has targeted PC hardware in the past.  For example, during the height of the PowerPC era, Apple attacked Intel-based hardware.  More recently, Apple’s “I’m a Mac” campaign has gone after the PC in general.  However, since Apple now uses similar hardware, the attacks have really been aimed indirectly at Microsoft.  Like any good campaign, Apple attacks known issues in the PC world such as virus/malware problems and Vista’s annoying UAC security “features”.  Apple does this in a humorous way without making any specific claims, generically playing off of known stereotypes with its PC and Mac characters as metaphors for their respective platforms.


Microsoft had enough

While Microsoft is still hugely successful, one thing is clear: they are losing ground on multiple fronts.  Google has Microsoft beat on internet search and is challenging Microsoft on the on-line application side.  RIM and now Apple have surpassed Microsoft on the mobile OS front.  Vista has taken a beating in the press, and Apple’s “I’m a Mac” advertising campaign has proven successful.  Apple has started to gain significant market share at Microsoft’s expense.  Clearly, Microsoft had had enough and decided to retaliate with an anti-Apple campaign of their own.  After a failed Vista advertising campaign, Microsoft hired Crispin, Porter & Bogusky for a $300 million consumer advertising campaign.  The intention was to give Microsoft an image make-over and to halt any traction gained by Apple.


So, how effective have they been?

Microsoft started out with their “Mojave Experiment”.  The premise of this part of the campaign was to get people to think that despite the problems they’d heard about with Vista, if they just tried it for themselves, they’d want to use it.  While that sounds reasonable enough on the surface, in reality it misses the point.  Most of the issues around Vista have to do with compatibility, installation, etc.  Likewise, it shouldn’t be so surprising that a few people working in a very controlled environment might actually like Vista.  It’s not hard to make any product look good in a demonstration; just ask Steve Jobs.  In the end, the overall takeaway message from this advertisement is that Microsoft refuses to acknowledge the faults of its own products and calls its user base a bunch of dummies for not switching to Vista.  Nice message, Microsoft.

Next came the Seinfeld commercials.  Microsoft apparently paid Seinfeld $10 million for this campaign.  I always thought it was common knowledge that Seinfeld preferred Macs.  In any case, both PC and Mac users alike ended up scratching their heads trying to understand what those commercials were all about.  They essentially had no message, nor did they make Microsoft look “cool”.   Result: Fail.

Next on the agenda was the “I’m a PC” campaign.  This was the beginning of the anti-Apple effort.  The first commercials just showed a bunch of different people doing different things and calling themselves a “PC”.  On one hand, it does provide the sense of everyday people using PCs.  That’s not a bad message in and of itself.  It’s certainly better than previous efforts from Crispin, Porter & Bogusky, but that’s not really saying much.  On the other hand, when you compare these ads to Apple’s, they just don’t match up.  Apple used the PC person and Mac person as metaphors in what was essentially a bit of humor.  Microsoft was using neither metaphor nor humor to make its point.  In that context, having people say “I’m a PC” simply didn’t make sense.  Nobody would actually call themselves either a PC or a Mac.

Microsoft then went on to show kids using various tools to create cool projects.  This was somewhat effective because it tries to demonstrate ease of use, something Macs are known for.  Unfortunately, they’d end each piece with the kid saying “I’m x years old and I’m a PC.”  Instantly, whatever I had just seen and was beginning to fall for became very fake and staged.  The “I’m a PC” comment from the kids made the entire commercial unbelievable.  The common sentiment is that these ads fail as well.

What makes Microsoft’s new “I’m a PC” commercials so jaw-droppingly bad is that they’re not countering Apple’s message; they’re reinforcing it.  The spots jump between dozens of different people who “are” PCs and make a point of emphasizing that there are a billion Windows-running PCs worldwide, which only underscores that “PC” is not a brand name but a generic term.


Finally, we get to the “Apple Tax”

Seeing as Microsoft’s various commercials with Crispin, Porter & Bogusky have largely been considered failures, I had no real expectations of success for their “Apple Tax” campaign.  Steve Ballmer has made comments in public forums suggesting that when you buy an Apple computer, you are paying $500 extra just for an Apple logo.  Hence, you’re paying an “Apple tax”.  The assumption here is that Apple computers cost more for essentially identical hardware.

This line of reasoning and direction for an advertising campaign is wrong on so many levels, it’s astonishing.  In no particular order, I’ll provide a few reasons why this is a bad idea.

1.  Establishing a premium brand.

The whole point of an advertising campaign is to establish brand quality; the retailers will advertise prices as necessary to make the sales.  Microsoft doesn’t even attempt to challenge Apple on quality, and for good reason.  Instead, they challenge Apple on price.  The reasoning behind this stems from the poor economic conditions at the time.  However, even in poor economic conditions, consumers often seek out better value as opposed to lower price.  This campaign basically concedes that the Mac is the better machine, but the PC is cheap.  Result:  Fail.


2.  This does nothing to promote Microsoft products specifically.  

Of course, given the monopoly position Microsoft has on generic PC hardware, Microsoft knows it doesn’t have to promote its own products.  If you buy a PC, you are indirectly buying Microsoft.  Result:  Fail.


3.   What about the “Microsoft Tax”?

There is a big problem with making price your primary consideration for buying a computer.  Logically, since the same PC could be purchased with Linux installed for an even lower price, you’re paying a “Microsoft Tax” when you buy a PC with Microsoft Windows installed.  Microsoft’s own line of reasoning can easily be used against them.  Result:  Fail.

4.  Consumers might actually do price comparisons.

When Steve Ballmer makes the claims that he does, he probably doesn’t expect people to actually check for themselves.  The problem is, exactly identical hardware comparisons are difficult to find.  By just taking a look at Apple’s web site and Dell’s web site, I found Dell to be more expensive at the high end.  For example, if you compare the Mac Pro to the Dell XPS 730x, they are at least very similar in price (Mac Pro $3,499, Dell $3,519), even after the Dell discount.  Now, to be fair, this is not an exact comparison, as they don’t offer the exact same hardware configurations.  Both used the new Intel i7-class chips.  The Dell was factory overclocked but had only 4 cores; the Mac was clocked a little lower but had 8 cores.  Both had 6GB of RAM.  The Mac had a bigger drive (640GB vs. 500GB); the Dell had a CrossFire ATI 4850, while the Mac had the ATI 4870.  The Mac also included a keyboard, mouse, FireWire ports, etc.
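Using the list prices and specs quoted above (point-in-time figures from the two vendors’ configurators, so treat them as illustrative only), the arithmetic is simple:

```javascript
// Spec and price figures as quoted above; a sketch, not an exhaustive
// or current price comparison.
const macPro = { price: 3499, cores: 8, ramGB: 6, diskGB: 640, gpu: "ATI 4870" };
const dellXPS730x = { price: 3519, cores: 4, ramGB: 6, diskGB: 500, gpu: "CrossFire ATI 4850" };

// A positive "Apple tax" would mean the Mac costs more. Here it is negative:
const appleTax = macPro.price - dellXPS730x.price;
console.log("Apple tax vs. the Dell: $" + appleTax); // prints "Apple tax vs. the Dell: $-20"
```

At this particular price point, in other words, the “tax” runs the other way.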

On the other hand, the point of this article isn’t to provide a detailed cost analysis across multiple price points and product types.  I have to believe that a custom-built machine could be had for less money if you’re willing to go through all the hassle associated with building it.  I also don’t doubt there are some configurations where Macs do cost more for similar hardware.  The point here is that the notion of an Apple tax applying across the board simply isn’t true.


5.  The ads demonstrate the Mac as the first or preferred choice.

Microsoft’s first Apple Tax ad shows the customer, Lauren, going to the “Mac store” first to try to buy her desired machine with her $1,000 budget.  Lauren comes out depressed because she can’t afford the machine she wants.  Then she goes on to another, generic store and is happy again because she can buy a machine she wants within her budget.  That’s all well and good, but it demonstrates that the Mac was her first choice and that getting the PC was more of a consolation prize because it was “cheap enough”.  Is that really the message they want to send?

6.  The ads admit to not being “cool” enough.

In the same commercial, Lauren claims that she must not be “cool enough” for the Mac.  She says this while genuinely looking depressed about that realization.  Again, is that the message Microsoft should want to send?  If she’s not cool enough for the Mac, does that in turn make her a loser for settling for the PC?


7.  The market leader should never acknowledge the competition.

Generally speaking, it’s never considered a good idea for the market leader to acknowledge the competition.  On one hand, Microsoft has to do something to try to halt the market share erosion.  Microsoft has temporarily kept the numbers in check recently, but that’s almost entirely due to the low-margin netbook category.  Nobody is getting rich off of the sale of $300 PC systems.  By mentioning Apple or “Mac” by name, Microsoft is giving too much credibility to the competition.  That’s a no-win situation for the reasons mentioned above.

This list could very well go on if I were to take the time to analyze every commercial that was part of this campaign.



Generally speaking, Microsoft’s most recent advertising campaign with Crispin, Porter & Bogusky is considered a failure by the industry.  Microsoft has not effectively enhanced their brand image, nor have they effectively countered Apple’s advertising campaign.  Worse, this campaign unintentionally ends up sending the wrong message for Microsoft.  The Mojave Experiment indirectly called their existing user base a bunch of dummies.  The Seinfeld ads left everyone scratching their heads wondering what the point was.  The “I’m a PC” ads were neither funny nor effective and only served to reinforce the message communicated by Apple’s campaign.  The latest “Apple Tax” commercials have extremely questionable validity and only serve to make both Microsoft and the PC industry as a whole very generic.  Further, if you follow Microsoft’s point to its logical conclusion, everyone who buys a PC would be better served running Linux.  After all, Microsoft isn’t even trying to argue the quality issue.  Instead, they’ve made price their focal point of attack.  This advertising campaign is a disaster for Microsoft.  I can only imagine where they will take it next.

Enderle: Exposing a Fraud

November 4, 2008

Monday, November 3, 2008

Every now and then, I read articles from a few industry analysts / pundits for the sheer sake of entertainment.  The work of Rob Enderle surely fits into that category.  Rob’s latest article, “It’s Dangerous to Assume People Are Stupid”, is just begging for a counterpoint.  This article will dissect Rob’s arguments and provide another point of view from someone who isn’t on Microsoft’s payroll.



Rob describes himself as “the principal analyst for the Enderle Group, a consultancy that focuses on personal technology products and trends.”  Of course, it should be noted that the Enderle Group consists of himself and his wife.  It should also be noted that Rob prominently cites Microsoft as a client, and that Microsoft has a history of astroturfing (paying bloggers to send a particular message).  With all of this in perspective, it’s not hard to understand why Rob writes the rubbish that he does.  I don’t doubt there is financial incentive for him.  However, in the process, the name Rob Enderle has become synonymous with ignorant boob in every technical forum I’ve encountered.




Back to the article…


“The well-executed Mac vs. Windows ads, while at least funny and entertaining, drifted from solid hits to outright hypocrisy as Vista was improved and Apple (Nasdaq: AAPL) seemed unable to remember its own advantages. (Hint: As a percentage, Apple’s ratio of marketing dollars to development dollars leads the industry.)”

Yes, Apple has been poking a little fun at Microsoft with its “Get a Mac” ad campaign.  Throughout the article, Enderle seems to take almost personal offense at this ad campaign, but that’s another topic.  It’s also worth noting that Apple spreads its advertising across multiple product lines including the iPhone, iPod, etc., and that Microsoft just launched a highly public $300 million advertising campaign in an attempt to boost the company’s public perception.

Let’s be honest here… yes, Apple has certainly capitalized off of the negative perception Vista has earned in the market place.  Did Apple create that perception?  No.  Is Apple responsible for poor Vista product reviews?  No.  Is Apple responsible for the relatively poor Vista user experience many have written about?  No.  Is Apple feeding the fire a little bit?  Yes.  

“Microsoft (Nasdaq: MSFT) just announced Windows 7, and as a pre-beta product, it is very impressive, largely because Apple’s negative campaign against Windows Vista focused Microsoft more than I’ve ever seen a complex company focused. There is a rule here in the Silicon Valley, and that is that focusing Microsoft on you generally ends badly — and Microsoft actually hasn’t been focused on Apple since the early 90s.”

Windows 7 may or may not be impressive.  I certainly don’t know one way or the other beyond what’s available on the internet.  I doubt Enderle does either.  I do know that any company can put on a technology demonstration that will impress a captive audience.  Apple does it at Macworld and WWDC events.  Why would anyone expect Microsoft’s demonstration of Windows 7 at PDC not to be impressive?  Of course, Enderle’s comments on an unfinished product don’t exactly carry much weight because:

  1. He’s not technical enough to comment on products of this nature in anything but the vaguest terms (more on that later).
  2. Enderle’s allegiance with Microsoft precludes him from offering any sort of unbiased opinion.

I’m not going to comment on the merits of Windows 7 just yet, as I don’t feel qualified to at this point.  However, right now, the best thing Windows 7 has going for it is that it’s not Vista.

I also find it funny that he attributes Microsoft’s “focus” to Apple’s “negative ad campaign”.  Honestly, I think that gives Apple’s influence too much credit.  Yes, Apple has gained some market share recently, but that has as much to do with Apple’s success as it does with Microsoft’s failures.  I’d attribute any recent focus at Microsoft to the basic reality that the enterprise market has largely refused to accept Vista “as is”.  To be fair, there may be several valid reasons for that, and not all of them have to do with the quality of Vista.  However, I think Apple is the least of Microsoft’s concerns these days.  Regardless of whether Vista is lousy, great or somewhere in between, the Vista project would seem to have been mismanaged at Microsoft.  When you consider the years and billions of dollars that went into the development of the product, along with the multiple delays and major feature cuts, it’s clear the Vista/Longhorn project was mismanaged.

What I find even funnier is the implied threat Enderle speaks of.  What could Microsoft possibly do to Apple?  Make a better Windows product?  If that’s the case, then the majority of computer users should profusely thank Apple.  Microsoft could also pull MS Office for the Mac, but again… so what?  This threat would have been a bigger deal 10 years ago, when it actually meant something.  These days, with OpenOffice, iWork and, to some degree, even Google Docs as competitors, this threat doesn’t mean so much.  Combine that with the fact that Microsoft dropped VBA support from Office 2008 on the Mac, and the competing products start to look good.  Very good indeed.  I think Apple knows this.  Further, Office 2008 seems to be very profitable for Microsoft, so I doubt this is even a consideration.

“However, Windows 7 attacks Apple’s historic inability to interoperate, successfully partner and work in the cloud — all of which suggest, if Microsoft executes, it will be the superior product. You can fix a product, but it is really hard to change the DNA of a company, and Apple has historically been its own worst enemy. This last is also true of Microsoft, and we’ll get to that in a moment.”

Enderle’s theory is that if Windows 7 works with “the cloud” better than Snow Leopard does, Microsoft will therefore have the better product.  Huh?  Is that now the defining criterion for an entire operating system?  Since when?

The next sentence is just ridiculous: “You can fix a product, but it is really hard to change the DNA of a company”.  He claims that Apple is its own worst enemy, then goes on to say the same about Microsoft.  Which begs the question: what exactly is your point, Rob?

“This also integrates with Microsoft Silverlight advancements showcased at the Professional Developers Conference by the BBC, which will allow people to start watching a TV show or movie on their TV or PC, and finish watching it on a laptop or compliant smartphone…”

The last time I checked, anyone with iTunes installed (read: mostly everybody) can do the same: rent a movie and watch it on their computer, TV (through Apple TV) or a portable device like the iPod or iPhone.  This is nothing new, and you don’t need Silverlight to do it.  Isn’t it great to play catch-up, then act as if the concept were something new?

“Finally, Apple believes that only Apple should have the freedom to choose; customers have to accept Apple’s choice, it’s partially the result of Apple’s “lock in” policy, an historic problem for Microsoft as well.”

No, Rob.  Apple, like any other company, wants to control and profit from as much of the pie as it can.  If a company is sharing a bigger piece of the pie, it’s because it doesn’t have a choice.  As for vendor lock-in, guess what, Rob: if you’re dealing with DRM-protected material, you’re going to have vendor lock-in to some degree no matter what.  Is the lock-in somehow better because you’re locked into a Microsoft solution as opposed to an Apple solution?  I think not.  Sorry, but Windows 7 will be no different in that respect.

“One sustaining advantage that the Mac platform has is the ease in which Mac users can move from an old Mac to a new one. While migrating from Windows to a Mac is about as ugly as you can get, once on the Mac the process is comparatively painless. This is generally why Apple enjoys a higher customer churn rate than any other PC vendor, and it contributes to their higher margins and customer loyalty.”

Yes, moving from an old Mac to a new one is painless; with Apple’s Migration Assistant, it’s completely painless.  However, migrating from Windows to a Mac generally isn’t difficult either.  For starters, Apple makes it clear that they will do it for you at the Genius Bar in an Apple store if you’d like.  How difficult is that?  For most people, it’s as simple as migrating bookmarks, address books and some documents in the “My Documents” folder.  That’s about it.  In the worst case, a user might use Boot Camp to dual boot or install a virtual Windows machine like Parallels or VMware’s Fusion product.  Again, in most cases that can easily be set up for you in advance.  It’s not much more difficult than moving from XP to Vista.

“The Democratic Party and Microsoft have always been larger but less focused than their counterparts. For the Republicans or Apple to actually fix their competitors’ focus problems will likely be seen, in hindsight, as a really stupid thing to do.”

Seriously, it’s pretty lame to assign commercial companies to specific political parties.  Why bring politics into this discussion?  One could easily say Enderle is just making an analogy, but it seems to me that he’s trying to ride on the momentum of a particular political party while associating the faults of another party with Apple as a company.

“Apple would have been better off to fix its crappy laptop keyboards (seriously — compare a ThinkPad and MacBook keyboard) and figure out how to do touchscreens on PCs (multi-touch track pads are just lame compared to things like the iPhone and TouchSmart).”

I tend to prefer the more traditional keyboards as well, but not enough to make a big deal about it.  Really, if you don’t like a keyboard or a mouse, these are things that are very easily replaced.  The same isn’t true for an operating system (I can almost hear the Linux fanatics taking issue with this one).  I’m guessing Enderle has never actually used a multi-touch trackpad on a recent Mac.  They are very cool and actually make laptops more efficient and ergonomic than most desktop PCs.  Scrolling with two fingers is cool.  Zooming text or pictures with a pinch is very nice.  Of course, Apple has done much more with three- and four-finger gestures, but this is clearly the future.

By comparison, my prediction is that “TouchSmart” will go nowhere in any real practical sense.  Sure, it makes for a nice technology demonstration and possibly for a nice kiosk somewhere.  However, the ergonomics of constantly touching and reaching across a large screen are simply flawed.  If you doubt what I’m saying, imagine your computer has touch screen capabilities and try manipulating everything with your hands; see how long it takes before it becomes annoying.  Really, touch screens make sense for something like an iPhone or a very small tablet-like device.  For large screen computers, this will be nothing but cumbersome.  Clearly, Apple has been a couple of years ahead with this type of technology.  If they thought it would be a must-have feature, it would already have shipped with Leopard, much less the upcoming Snow Leopard.  For special applications, it might be nice.  For general computing as we know it today, touch screen interfaces will be ergonomically inferior to mouse and trackpad gestures.  Remember, you heard it here first!


Honestly, I can say that nothing Rob Enderle says is ever surprising.  I’ve never considered his articles anything more than a shill for Microsoft’s marketing campaign.  Considering Microsoft’s established history of astroturfing and paying for comments in the media, along with Enderle’s acknowledged professional relationship with Microsoft, I find it odd when people actually quote his comments or articles in order to somehow make a point.  If anything, quoting Rob Enderle to support your position works against you within the technical community.  Sadly, Enderle is often quoted in articles intended for the larger, non-technical community.

Finally, I’m not aware of any “industry analyst” (and I use that term lightly) who has been wrong more often, in absolute terms or as a percentage, than Rob Enderle.


A simple Google search turns up a few good examples of what others think of Rob.  For further reading (in no particular order), feel free to visit a few of the links below.  Enjoy!

Thurrott on browser advice

July 9, 2008

Monday, July 7, 2008

I know, I know… picking on Paul Thurrott is akin to picking on an easy target.  In that respect, I don’t feel right about writing this sort of blog post.  On the other hand, this guy has some sort of following and just preaches ridiculous advice.  Whenever I see a link to his site, I just know it’s going to be entertaining.  In this case, Paul discusses browser security (amongst other things) and gives his “blessing” to Firefox and IE7.  While I have no issue with anyone’s personal preferences, I can take issue with the justifications presented for or against various products.

“Internet Explorer 7. There is absolutely nothing wrong with IE 7. In fact, on Windows Vista, it’s arguably the safest Web browser there is. I don’t “love” IE 7, and in fact choose not to use it. (See below.) But I’m OK with real people using it because it will keep them safe. And it finally has enough features that’s it’s not lacking in any meaningful way.”

If by “nothing wrong” and “not lacking in any meaningful way” he’s not referring to security, standards support and the like, then I suppose he’s right.  The purpose of this post isn’t to bash Internet Explorer, but let’s be honest: IE in general (not just IE7) has the worst track record for security.  Some praise the inclusion of a phishing filter.  That would be fine if it weren’t extremely easy to circumvent.  Having a broken phishing filter only serves to provide a false sense of confidence, and is likewise far worse than having no phishing filter at all.

In terms of standards support, IE has always been a joke.  For web developers, this has always been a source of frustration.  On one hand, there are cool features that could easily be implemented.  On the other hand, the most common web browser is also the least capable.  If you’re building a personal web site, you might not care about the IE-based users.  For commercial sites, you don’t have that luxury.  IE6 didn’t even have full CSS1 support.  It took Microsoft five years to update that with IE7, which still has lousy CSS support.  IE7 can’t even pass the old Acid2 test, much less the newer Acid3 test.  It has no support for SVG, etc.

Every browser performance benchmark I’ve seen has listed IE7 as the worst performer by a large margin.
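For readers curious what those benchmarks actually measure: suites like SunSpider are essentially collections of small JavaScript kernels timed in a loop, so a slower script engine simply posts bigger numbers on the same workload.  A minimal sketch of the idea (the naive Fibonacci kernel here is purely illustrative, not part of any real suite):

```javascript
// Minimal sketch of a JavaScript micro-benchmark: run a small compute
// kernel repeatedly and report wall-clock time. A slower engine (IE7's,
// at the time) posts a larger elapsedMs on the same workload.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

function benchmark(fn, arg, iterations) {
  const start = Date.now();
  let result;
  for (let i = 0; i < iterations; i++) {
    result = fn(arg);
  }
  return { result: result, elapsedMs: Date.now() - start };
}

const run = benchmark(fib, 20, 50);
console.log("fib(20) = " + run.result + " in " + run.elapsedMs + " ms");
```

Real suites differ only in scale and in how carefully they average repeated runs; the timing principle is the same.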

The list goes on, but the criticism for IE7 has been well documented.   

I will say this: IE8 does seem to be headed in a better direction.  Those web sites that were written around the flaws of IE6 and IE7 will apparently be burned by the release of IE8, as IE8 will default to a more standards-compliant mode.
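For what it’s worth, Microsoft is providing an escape hatch for that transition: a site written around IE6/IE7 quirks can ask IE8 to keep rendering with the legacy engine via the `X-UA-Compatible` HTTP header or meta tag, for example:

```html
<!-- Asks IE8 to render the page with the IE7 engine instead of its new
     standards-compliant default; can also be sent as an HTTP header. -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />
```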

But, given the slow pace of development and historically poor performance, it’s more likely that IE8 will simply “suck less” as opposed to offer any real competition (quality wise) to Firefox, Safari or Opera.

“But let’s be serious. IE is in use on over 70 percent of the world’s computers and people aren’t actually contracting malware as a result in any massive numbers. (Put another way, if they are, they’re idiots.)”

So, according to Paul, you’re an idiot if you contract malware?  Wow, that’s a powerful claim, Paul!  I suppose if you’re using a Microsoft browser on a Microsoft operating system, you shouldn’t be terribly surprised by contracting malware, but I wouldn’t go so far as to call someone an idiot when it happens.

“Firefox 3. Mozilla’s browser is my favorite, by far, for two reasons. One, it has an incredible extensibility model that has created a cottage market of useful add-ons. You can be really silly with these things and overload the browser, yes. But if you’re looking for some key bit of functionality that’s not built into Firefox, there’s an add-on out there for you. And you can change the UI dramatically with skins, many of which are high quality. The second reason is security. While I do feel that IE 7 is as secure or more secure than Firefox, Firefox does benefit from a pair of things: Hackers love it (and Mozilla) and are thus less likely to target it, and becuase it’s used less often than IE, it’s less likely to be a target. (This last bit benefits Mac OS X as well.)”

I’m not sure I even want to touch this one.  For starters, I do like Firefox and would agree that it’s an all around good choice.  Though, Firefox isn’t perfect either.  While there are many plug-ins to extend the product’s capabilities, these plug-ins have a habit of breaking between major releases.  Liberal use of plug-ins can also slow the product down.

“Safari. At this point in time, you’d be crazy to use Safari on Windows. Apple is a black hole and I don’t trust this software or the way they foist it on people. The only thing seems dishonest to me.”

That’s one of those sentences you have to read twice before you realize it’s not you; it’s that Paul doesn’t make much sense.  In any case, Paul’s issue with Safari doesn’t seem to be based on security, standards compliance, usability, performance, etc.  He just doesn’t like the way Apple’s software updater has the download checkbox checked by default for Safari.  While I might agree with Paul on that trivial issue, the logic of avoiding the Safari product altogether is just irrational.

My criticism of Safari on Windows would be that it doesn’t feel like a native Windows application.  Apple uses a Leopard theme for the application window, the “maximize” button behaves like it would on a Mac as opposed to Windows, and the rendering engine uses Apple’s technology as opposed to Microsoft’s.  It’s probably fair to defend that last item for two reasons.  First, a consistent rendering engine will more precisely render the same web page across platforms.  Second, Apple’s font rendering technology is more true to a font’s intended shape and size.  Still, Mac users don’t like Mac programs that feel like Windows ports, and for the same reason, Windows users don’t want programs that feel like Mac ports.  I’m guessing Apple wants Windows users to get a feel for using Mac applications.  However, the end result feels like a fish out of water in some respects.  On the other hand, Safari is arguably the best browser on Windows in terms of performance and standards support.

“Opera? I know there are fervent Opera supporters out there because they email me every single time I write anything about Web browsers. “When you are going to review Opera [insert version number here]?” “It does [this] and [this] and is better than [Firefox | Safari | IE] at [this] and [this].” Ah, right. I have the same reaction to Opera I’ve always had. I don’t get it. I don’t get why people install this thing and I don’t get why they like it. I know, I know. That’s just the way it is, sorry.”

Here, Paul doesn’t even pretend to have a valid argument against the product.  He says, “I don’t get why people install this thing and I don’t get why they like it”.  Has he even tried it?  He doesn’t actually say one way or the other.  As with Safari, he doesn’t mention anything explicitly positive or negative about the product.  It should be pretty obvious why: because he doesn’t know anything about the product.

In reality, Opera is actually a very nice product.  But it’s a product in search of a market.  For better or worse, Firefox is the more popular cross-platform open source browser.  According to Net Applications, Opera represents less than 1% overall market share.

Despite the low market share, Opera has found its way onto a few mobile phones and even onto the Nintendo Wii.  While the list of devices is impressive, these aren’t devices where people really use the browser, whereas browsing on something like the iPhone is actually common for that device.

Either way, the biggest criticism of Opera seems to be its lack of market share more than anything else.


I didn’t actually expect an intelligent discussion from a Paul Thurrott article.  However, if he’s going to preach such nonsense, he’s fair game for criticism.  I wouldn’t imagine too many people take him seriously in terms of a legitimate source of information.  Still, the prospect of someone reading his posts and taking it as gospel is pretty scary.

McCracken’s 15 Ways Microsoft Can Reinvent Itself for the Post-Gates Era

June 28, 2008

Friday, June 27, 2008

PC World’s Harry McCracken gives his 15 ways Microsoft can reinvent itself.

It’s easy and even sometimes fun to play the role of the armchair quarterback. It’s easy to say what should be done when you have no personal stake involved with the outcome either way. Likewise, my opinions on the matter are certainly no more relevant than McCracken’s. However, when I read his article, I found myself agreeing on some issues and strongly disagreeing on others.

“Today, as Gates prepares to step down from day-to-day management of the company, another fact is clear: The modern Microsoft remains a company in search of a second act. True, it remains one of the world’s most profitable enterprises, raking in more dough in its 2007 fiscal year than Apple, Google, Yahoo, Oracle, and Adobe combined. But the cracks in the Microsoft hegemony aren’t just showing, they’re growing.”

McCracken does make a good point in that even though Microsoft is still making tons of money, they seem to have peaked and are showing signs of weakness from multiple fronts. Worse, there seems to be a consensus that Ballmer’s leadership isn’t enough to keep Microsoft on top. Here’s my two cents on each of the items listed.

1. Stop trying to be everything to everybody.

When things start to fall apart, it can make sense to simplify your product line and focus on your core competencies.  However, it wouldn’t make sense to do that for a profitable product line.  It’s unclear whether Microsoft is making a profit on the Zune product line.  Over the past year and a half, Microsoft hasn’t managed to establish the Zune as anything more than an also-ran competitor.  While a portable music player might give some companies a cool image, that’s anything but the case with the Zune.  When I think of the Zune, I can’t help but think of this.  I don’t see that image changing anytime soon.  In cases like this, Microsoft does need to re-examine its strategy.

2. Upgrade continuously, not once every few years.

The example McCracken uses is Google’s web based products. This is sort of a ridiculous comparison. Rolling out updates to hosted server-based applications is very different from rolling out desktop software upgrades. With SaaS (software as a service), you don’t have to worry about incompatibilities with other programs, or about IT departments trying to figure out when they can test and push updates to all of their end users. It’s one thing if your user base is primarily consumers. For corporate customers, that sort of upgrade cycle represents a logistical nightmare.

On the other hand, the underlying point McCracken makes is valid. Waiting 5 – 6 years for an OS upgrade is too long. Less expensive but more frequent upgrades along the way are a much better approach. The same goes for application software. I prefer Apple’s model here: every year, they upgrade their iWork software. I’d rather pay a lower cost of entry ($80) each year (if I choose to upgrade) than wait 3 years for new features and then get socked with a $240 upgrade. The point is valid; the Google web based software just isn’t a fair comparison.

3. Be innovative–no, seriously.

“The marketing message from Redmond would have you believe that Microsoft and innovation are practically synonymous. In fact, the company is more mimic than innovator: When Apple put a tiny “Designed in California” on the backside of every iPod, it was inevitable that the Zune would sport an equally microscopic “Hello from Seattle.” It might do wonders for the company’s reputation if it appointed a Chief Innovation Officer whose duties would include ruthlessly killing everything that smacks of pointless imitation.”

Ha! That sort of thing comes from having the right corporate culture. Microsoft has never had an innovative corporate culture, and history has proven Microsoft to be anything but innovative in practice. I see that I’m not the only one who recognizes Microsoft’s blatant misuse of the word “innovation” in their marketing programs. Microsoft’s business model is based on some other entity defining a viable market, after which Microsoft comes in with 10x the resources and tries to take over. This has been successful in the past. Why should they change?

4. Treat customers like kings, not peons.

This advice is true for any company. However, things like WGA (Windows Genuine Advantage) are just the kind of things that send customers screaming into the arms of the competition. I firmly believe there is nothing good that comes from copy protection schemes like this.

5. Make Windows a seamless desktop-Web experience.

McCracken refers to Microsoft’s upcoming Live Mesh here. Microsoft is headed in the right direction, but they really need to execute on this one. Microsoft is known for having very ambitious plans and not being able to deliver. They should start small and evolve the experience over time… sort of like what Apple is doing with Mobile Me.

6. Reboot Windows.

McCracken looks at what Apple did by going from Mac OS 9 to an entirely new foundation with Mac OS X. This is bad advice, if for no other reason than that no justification is given for such a drastic change. For starters, yes, Apple made a major platform change with OS X, but so did Microsoft. Windows XP is based on the NT kernel, which is pretty buzzword compliant. The Windows 9x / Me line was the legacy garbage Microsoft left behind, just as Apple did with the “classic” Mac OS. Further, the Windows NT kernel may not be better than Unix, but it’s generally on par. Before we demand the plumbing be replaced, we should at least define the fault with the current technology. McCracken has not done that.

7. Split Windows in two.

Split Windows in two – why? This suggestion is based on consumers’ desire to keep using XP instead of Vista. But what happens when Windows 7 is released? Split Windows in three? Exactly what message would that send to Windows developers? This is something you do when you make a MAJOR transition – sort of like going from OS 9 to OS X. When OS X first shipped, the platform wasn’t very viable in terms of third party support. Vista had some incompatibility issues, but nothing nearly that significant. This is a last resort strategy, and one that should not be suggested lightly.

“Long-term, the world needs a fundamentally new version of Windows.”

I’m not sure how someone can throw out a sentence like that without providing the justification or at least describing what they would be looking for in this “fundamentally new version of Windows”. Vista is what Microsoft believes to be the future direction. If something is fundamentally wrong, describe what it is and what would lead you to believe Microsoft is even capable of pulling it off.

8. Make Windows more boring.

“MS-DOS was a simple, unglamorous piece of software that focused on being a solid platform for applications from Microsoft and other companies. “

Really, was MS-DOS the good old days? Was it really that good? By what standard is it measured? I used MS-DOS in the early days and recall it being a half-assed CP/M clone. It wasn’t even in the same league as the Unix based systems of the day. In fact, MS-DOS came from a hacker’s product called QDOS (the Quick and Dirty Operating System), which Microsoft acquired. Microsoft never had a cutting edge operating system, even back then. It’s nice to look back at yesteryear and imagine things were better than they really were. That’s what McCracken is doing here.

“Microsoft should concentrate on making the OS more reliable, secure, and easy to use rather than adding features to a paint program.”

Agreed. That’s what Apple is doing with the upcoming Snow Leopard release. Though this may be a good thing, I’m not sure the masses or even the mainstream media will appreciate it. Consumers and most mainstream reviewers of operating systems only seem to recognize the shiny new features visible on the surface. Still, McCracken makes a good point: the OS should be transparent.

“As Windows has added tools for digital photography, entertainment, and communications, it’s become more complex and less satisfying.”

The problem for Microsoft is that many (like McCracken) don’t seem to distinguish between the OS and the OS distribution. Is support for digital photography really such a bad thing? If you thought Vista was a disappointment, imagine trying to sell a boring OS. Apple is going to face this challenge with Snow Leopard. There is major work going on under the hood (as there should be for any new OS release), but it’s going to lack the shiny new features that sell to end users. Maybe we should see how successful Apple is before suggesting Microsoft do the same thing.

9. Make Windows Mobile the flagship.

This is a good point. Windows Mobile has been available for years and is, at best, just another “me too” product. This should be an embarrassment for Microsoft. Granted, there wasn’t much competition before Apple entered the market; Apple has since set the bar for what everyone should expect from a mobile device. Worse, pressure will be coming from Google with their Android platform. If Google is able to execute on Android, it may become the new standard. Microsoft can no longer rely on its desktop monopoly to ensure success in the mobile market. Microsoft is not the big fish in this pond.

10. Leapfrog Google Docs.

The problem here is that Office is a cash cow for Microsoft. On one hand, Microsoft is challenged to figure out how to compete and still be profitable. On the other hand, Microsoft cannot just bury its head in the sand and pretend the problem will go away. This is similar to the telecommunication companies that relied on analog switches before VOIP. They were greedy and didn’t want to disrupt their revenue stream while their competition was mounting attacks based on new technology.

“Then Microsoft earned much of its dominance of the office market the old-fashioned way: By building better software. “

After several generations, Office may have eventually evolved into a better product. However, let’s be honest and not say that Microsoft “earned” the top spot. Microsoft beat the competition by bundling its products and often making deals that essentially gave one product away when purchased with another. For example, in the mid 90’s, I consulted for a large company that “switched” to both MS Office on the desktop and Exchange for mail. As I understood the deal at the time, the company was essentially given a free license for 85,000+ copies of Office for switching to Exchange as their e-mail server. No, Microsoft was guilty of unfair business practices that nobody could compete against. How could Lotus 1-2-3 or WordPerfect possibly compete with that kind of deal? Where does McCracken think the accusations of unfair business practices came from? This would seem to be another case where McCracken looks back at history with a distorted view.

11. Bundle Office with an online suite.

That seems reasonable to me. In this way, Microsoft gets to keep its current business model while countering initiatives from the likes of Google, etc. Good suggestion Harry!

12. Make the Office file formats indispensable on the Web.

Agreed, but only if Microsoft truly opens the format up so that others could easily build viewers, etc. Similarly, the universal file viewer should be a JavaScript based app so that any browser can view it without a proprietary plug-in. Microsoft could still control the format going forward, but they would have to be more open with the specifications.
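To make the point concrete: the newer Office formats (OOXML, e.g. .docx) are already ZIP archives of XML under the hood, which is exactly what makes third party viewers feasible once the specification is open. The sketch below is a simplified illustration, not a real viewer (the function name is my own, and a real viewer would also handle styles, tables, and embedded media); it extracts the plain text of a .docx using nothing beyond Python’s standard library:

```python
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used inside word/document.xml
W_NS = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_docx_text(path):
    """Pull the plain text out of a .docx file.

    A .docx is just a ZIP archive; the body text lives in
    word/document.xml as <w:t> (text run) elements.
    """
    with zipfile.ZipFile(path) as archive:
        xml_bytes = archive.read("word/document.xml")
    root = ET.fromstring(xml_bytes)
    # Join every text run; paragraphs, styling, etc. are ignored here.
    return "".join(node.text or "" for node in root.iter(W_NS + "t"))
```

Nothing here is proprietary: any browser or third party tool could do the equivalent, which is why opening the specification matters more than the format itself.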

13. Take a studio approach to software.

Agreed, this might be the only way they could build cool products. The main benefit here would be for Microsoft to establish a non-Microsoft based culture where innovation and creativity are first and foremost. That would be interesting.

14. Build Internet Explorer on top of Firefox.

Well, a Gecko based browser is better than a Trident based browser. Then again, a WebKit based browser would seem to be better still.

We all saw this coming. Microsoft dominated the market and then basically let their product rot on the vine. There was a time when IE was my preferred browser – around the IE 4 to IE 5 timeframe. The IE browser has been a joke ever since. Worse, the masses still use IE, to the detriment of web developers everywhere. Microsoft’s support for web standards has been abysmal, and IE is by far the worst performing browser as well. Microsoft should again follow Apple and open source their Trident engine. Let the open source community help ensure web standards are part of the engine going forward. Since IE doesn’t currently enjoy any technical advantage over the competition, Microsoft has nothing to lose here. This should be a no-brainer.

15. Be a leading iPhone developer.

Agreed. Ballmer would have to swallow some pride to make that happen, but Microsoft should embrace emerging markets like the iPhone. Microsoft seems more interested in promoting the Windows platform than in supporting their software products. That’s their call to make. I can see why a company like Apple takes that position, because Apple’s software is really just a means of selling Apple hardware. Is Microsoft an application company or an operating system company? One side is clearly holding back the other.

What are my thoughts on re-inventing Microsoft?

I don’t know; that’s not an easy task. It’s certainly easier to talk about what won’t work than what will. For example, Microsoft is apparently about to spend $300 million on an advertising campaign to try to appear as “cool” as Apple.

While I’m sure that will help somewhat, I can’t help but think of the analogy of putting lipstick on a pig when I read about these plans. Is Microsoft “un-cool” because of their advertising campaigns? Is Apple “cool” because of theirs? No, in the end, it all comes down to the products you sell.

When I think about Microsoft, I respect them for their ability to broker business deals, leverage their influence over competitors, etc. Everyone has to respect Microsoft’s massive resources and ability to compete in a market for a long time, even while losing money along the way. As a business competitor they should be both admired and feared. On the other hand, I don’t consider Microsoft to be creative or innovative in the least. I also don’t like the way they continue to push proprietary solutions where they aren’t needed. For example, does the industry really need a proprietary Microsoft answer to the PDF format? No. Do we need a Microsoft proprietary competitor for Flash? Apple used to be just as guilty of the “not invented here” syndrome. In Apple’s case, it was more about pride (foolish pride, but pride nonetheless). In Microsoft’s case, it’s all about control. It’s one thing to come up with a proprietary solution if you’re truly breaking new ground. It’s another thing to do it when there are perfectly suitable existing solutions that the industry has already accepted.

With that said, here are a few suggestions I have for Microsoft. I don’t have 15 steps, I have just three.

1. Embrace Open Source.

Start with the Trident engine of the Internet Explorer product. This would be the quickest way to get the product to a more competitive level. It would also help Microsoft better support open standards, and lots of good will comes from “opportunities” like this. Further, it would allow Microsoft to focus on making their product better rather than reinventing the wheel and not even doing it as well as the free alternatives. For example, the IE browser could still have cool features unique to Microsoft while sitting on top of an open sourced Trident engine.

2. Embrace Open Standards.

One of the things people hate about Microsoft is that they always have to come late to the party with a “me too” solution that offers nothing for the industry except the possibility of more “lock down” from Microsoft.

3. Be an innovative leader instead of a follower.

Really, consumers yawn when they see Microsoft product announcements. Look at practically every product they sell: it’s all been done before. Microsoft shouldn’t enter a market just to get a piece of the pie; if they can’t create a better product, they shouldn’t bother. Take Apple, for example. The iPod wasn’t the first portable music player. The iPhone wasn’t the first smart phone on the market. But Apple entered those markets because they knew they could put together a much better product than anyone else had. In the case of the iPod, it’s the combination of cool hardware and the great iTunes software and music store. The iPhone isn’t perfect (yet), but it was a revolutionary step forward in usability for this class of product. Advertising was essentially free because everyone was talking about it months before it was even available for sale. That’s the dividend of innovation.

If Microsoft remains a follower, they will never be cool. It won’t matter how much revenue they make in a fiscal quarter.


Linux: The desktop explosion that never happened.

February 6, 2008

Wednesday, February 06, 2008

What ever happened to the promise of Linux on the desktop? Sure, there are GUIs and desktop applications for Linux, but outside of a very small percentage of the geek population, Linux on the desktop is a non-starter. The promise of a desktop based Linux growth explosion has always been “in another year or two”. The problem is that it’s been that way for years with no sign of improvement. 


The purpose of this article isn’t to attack Linux as an operating system. Rather, the purpose is to look at the challenges Linux is facing relating to growth in the desktop operating system market. This article will look at why Linux has failed miserably in this area, predict the near term future for the OS and offer my thoughts on how Linux could achieve great success on the desktop.

With an article like this, it’s probably best to state my position on Linux right up front. I was programming computers long before Linux existed. I studied operating systems in college and wrote my own job scheduler as a project back in the day. From a technology perspective, I’m a fan of Unix and have respect for pretty much any Unix-like operating system, including Linux. However, having written programs that make use of low level interprocess communication, I’m also painfully aware of the differences between an SVR4 based Unix and something like BSD or even Linux.

Over the years, I’ve tinkered with various distributions of Linux (and even SCO before that). At the same time, I’ve also been a user and have done minor development work for Windows and Mac operating systems. Once Mac OS X hit its stride, probably around the Panther release, I found that I had less and less use or even desire to have Linux installed on one of my machines. There are reasons for that which I will discuss later in this article. 

The Cold Truth

Unfortunately, there is no avoiding the hard facts. I read an article about a month ago that referenced statistics from Net Applications. Linux market share on the desktop remains at less than 1% (0.67%). Given the margin for error either way, it seems fairly safe to conclude that Linux on the desktop is still basically nonexistent. While I don’t take any single source of information as definitive, these numbers are representative of numbers I’ve seen reported from other sources. I have no doubt someone can come up with another source with somewhat different statistics, but that wouldn’t change the fact that Linux hasn’t had the success on the desktop that has been promised for years.

Linux advocates would probably argue that the installed base of Linux desktop users is higher than this. While I might agree there is some probability of truth in that, I’d also have to question why they aren’t using it. What good is a Linux install if you’re using Windows instead?

The article referenced above notes that over the past 2 years, Apple’s market share has grown from 4.21% (January 2006) to 7.31% (December 2007). Certainly, Apple’s own sales figures confirm the type of growth noted here. The point here is that at least on the desktop, people leaving Windows are switching to the Mac rather than Linux. In short, Mac OS X is becoming “the people’s Unix”.

Regardless of the numbers, this isn’t meant to be a Mac versus Linux comparison. Rather, it’s a look at the state of Linux and what’s holding it back on the desktop. Linux has had much success in the server market, and there are lots of reasons for this. Linux, like any Unix-like OS, is fast, well designed, and familiar to higher end server customers and developers alike. On the server side, the GUI is a non-issue. It’s cheap, and it’s well supported by third party middleware and database vendors alike. The list goes on. Apple was smart to tout UNIX when advertising OS X. Linux would be smart to band together with Apple as another technically superior alternative to Windows. Sadly, that’s not what Linus Torvalds has in mind.

Torvalds isn’t helping matters

I read an article today that jogged my memory on the topic: “Torvalds pans Apple with ‘utter crap’ putdown”. In it, Linus claims:

“I don’t think they’re equally flawed – I think Leopard is a much better system,” he said. “(But) OS X in some ways is actually worse than Windows to program for. Their file system is complete and utter crap, which is scary.”  

First, what’s the point of attacking OS X? It seems that Torvalds is upset with the success OS X has been enjoying. It also becomes clear that Torvalds sees OS X as a threat to his operating system.

Second, the shot he takes is actually pretty weak. While I’d agree Apple’s HFS+ file system isn’t exactly cutting edge, neither are the common Linux file systems (ext2, ext3, ReiserFS, JFS, XFS, etc.). Really, is anyone supposed to get excited about B+ trees and journaling? News flash, Linus: these features exist in HFS+ as well. ACLs? Yup, they’re in there. I’d agree that ReiserFS does some interesting things and has great performance when you’re dealing with a very large number of very small files. But let’s also admit that ReiserFS has had corruption issues and doesn’t lend itself to being defragmented properly. In short, which Linux file system would he be comparing HFS+ to? Also, most “modern” file systems are considered weak compared to Sun’s new ZFS. Like any file system, it takes years to mature and be deemed worthy of a production environment; there is zero tolerance for bugs in a file system.

“An operating system should be completely invisible,” he said. “To Microsoft and Apple (it is) a way to control the whole environment … to force people to upgrade their applications and hardware.”  

The problem here is that Torvalds is living in a dream world in which he gets to use a very technical definition of an operating system. To Torvalds, an operating system is nothing more than a kernel: a process scheduler, a memory manager (for virtual memory) and possibly an I/O manager for very low level interfaces to input devices. To the rest of the world, the term “operating system” refers to the entire distribution. This includes all of the mid-level APIs between the applications and the kernel, such as QuickTime, OpenGL and Quartz on the Mac. It also includes all of the basic applications that come with your machine and help it do basic things, anything from an e-mail program to a DVD player. Linux installs come as “distributions”, not as pure operating systems. Torvalds doesn’t care about trivial things, you know… like a GUI, when he makes such utopian statements. Instead, he paints Microsoft and Apple as evil entities because they actually deliver an entire operating system distribution that is well integrated. Oh, the horror!

“As for his own operating system, Linus said the most exciting developments were Linux’s improving green credentials, and a push into mobile devices such as the One Laptop per Child project and Asus’s new ultra-cheap Eee PC.”  

It’s odd how Torvalds doesn’t mention that both Windows and Macs have long since had very good “green credentials” with their respective power management features. Also, Eee PC? Is that the future of Linux on the desktop? Great. You can either be known for being good or being cheap. Apparently, Torvalds is going for cheap.

“The (Linux) kernel is already being used in things like cell phones, but the problems have been in the UI (user interface).”  

Microsoft and Apple have also put their kernels in mobile devices, and Apple seems to be making significant innovations in mobile user interfaces as well.

I suppose I’ve been a bit put off by the personal comments that have come from Linus over the years, and this article was no different. “Trash talking” is a way of life for him, it would seem. I’d be interested in a technical discussion where he might actually make a few valid points. I’ve yet to see that from him, despite his capability to do so.

The bottom line is that Torvalds isn’t helping people adopt Linux with trash talk that is short on examples and easily contradicted. 

Linux is for geeks?

Linux has had a reputation for being popular with geeks. There’s nothing wrong with that. However, Linux needs to branch beyond this population in order to ever become successful. Again, with a Unix like operating system, what’s not to like? Linux is a geek’s paradise as there are so many options for just about anything. Better yet, most of the software available is free from the open source community, just like the OS itself.

The problem is, to be successful, Linux not only needs to be powerful, but it needs to be easy. Very few Linux distributions can claim to even approach the ease of installation, setup and use of Windows, much less the Mac. 

What about the high profile geeks?

Over the past couple of years, at least two “relatively” high profile geeks switched from the Mac to Linux. Mark Pilgrim and Cory Doctorow are the examples the Linux community seems to mention time and again. I remember reading their stories and justifications at the time, but admittedly forget the details. As I recall, both seemed to switch based on some sort of principle (open source software?) rather than on specific needs. I remember a program by program comparison (from at least one of them) of what they were using before and after. Without a doubt, they both ended up settling for lesser quality software in order to stand by their “principles”. That may be noble, but it’s certainly not practical. I believe one of them bought a Thinkpad because it was cheaper than a MacBook. That’s certainly true in some cases, but like anything else, you generally get what you pay for.

Anyway, what was the end result? Both of these “high profile” converts were very vocal about their decisions in their blogs. Clearly, they seemed to be trying to justify their decisions to others (and possibly themselves). The problem is, everyone else just seemed to laugh and “wish them luck”, as their choice wasn’t very practical. I’m guessing they thought they were somehow leading the way for others, as if they had such influence.

Why I can’t justify a switch to Linux (yet).

As someone who likes Linux and feels at home in any Unix like environment, I haven’t been able to justify switching to Linux full time for myself nor would I yet recommend it for others with lesser experience. Below are a few reasons that I would imagine are common to others.

  • Software. I’m all for open source software, and there have been plenty of benefits from it, even for commercial software vendors. However, the quality of open source software tends to show up in small tools and pieces rather than in large, polished applications.

For example, LAME is probably the best MP3 encoder available. As a tool, that’s great, but I haven’t seen a good equivalent to something like iTunes. Audacity is a nice free audio editor, but it’s not in the same league as something like Logic. GIMP is no substitute for Photoshop. OpenOffice is okay, but Microsoft Office is better. The list goes on. When it comes to desktop software, it’s about making do rather than having the best. I could go into a discussion of compatibility layers like WINE, but I don’t really consider those to be real solutions. They are, to be kind, less than optimal ways of “making do”.

  • Ease of use and installation. Again, this is not a problem for the geeks, but end users don’t want to have to troubleshoot installation configuration issues, nor do they want to have to hunt down the proper device drivers, etc. Linux based distributions such as Ubuntu have made great strides in this area compared to what I was used to years ago, but I wouldn’t consider it ready for the masses yet. Just think of the least computer savvy users you know. Now imagine them trying to install and configure Linux. Enough said.
  • Standards. Unfortunately, strengths can be turned into weaknesses. Linux gives you the ultimate in terms of choices, which is a geek’s paradise. Unfortunately, most consumers don’t want so many choices; they want standards. Developers want standards too. If you’re a developer, how do you develop for a system with no standards? At the server level it’s easy, as you’re not dependent upon the GUI; this is why Linux has been able to gain traction in the server market. On the desktop, not having a standard is a very bad thing. This is true for GUIs and, perhaps to a lesser degree, file systems.
  • Graphical User Interface. The GUI issue really warrants its own line here. There are two main choices (GNOME and KDE) and roughly a dozen lesser known choices. In the end, this only serves to fragment an already small niche market. No wonder commercial software developers avoid Linux on the desktop. They’d need a different product for each GUI they chose to support. Also, what about consistency across the user base?
  • Common direction for the platform. The Linux kernel receives the proper attention, but that’s about it. Everything else on Linux is really a hodgepodge of miscellaneous parts. They may be good parts in and of themselves, and the choices are great, but the end user experience is always going to be inferior to a well designed, well integrated solution whose pieces were made to fit together. In this respect, both Macs and Windows have an advantage over Linux.

Low end desktop potential

If the trend of foreign governments revolting against the Microsoft monopoly continues, Linux stands to gain the most here. When it comes to common file formats for productivity suites such as word processors, spreadsheets, etc., it wouldn’t be a terrible thing to have a standard in place that’s not controlled by Microsoft. Yes, attempts to do that are already under way, but a wholesale switch to Linux by some government agencies would certainly speed up the process and perhaps make the movement relevant.

Many developing countries are looking for low cost alternatives for their computer needs. Initiatives such as the Asus Eee PC are attempts to address this need and may very well become successful. Having a “free” operating system is almost a requirement for this class of computer. This may very well be the springboard for Linux on the desktop that Linux fans have been hoping for. The problem is, someone that’s going to spend $200 on a PC probably isn’t going to spend $1200 on something like the Adobe Creative Suite or Final Cut Studio, etc. That is, it may become popular for the basics, but I don’t see much of a market for the higher end commercial software products on an Eee PC / Linux platform. If that’s the case, Linux would have the marketshare but still be a second rate choice on the desktop. 


From a technology perspective, Linux is a first rate operating system. It’s a powerful UNIX-like operating system with an unbelievably low price (free). Linux is a geek’s dream come true as there are many choices, configurations and distributions to choose from.

At the same time, Linux on the desktop is perennially “just a year away from exploding on the desktop”. Unfortunately, we’ve all heard this repeated many times over many years. It hasn’t happened and for good reasons. There are very significant barriers preventing Linux from being successful on the desktop. Linux needs a unifying force to come up with “the” Linux standard, not just “another” Linux standard in terms of technology. This includes the file system, the GUI and everything in between. Of course, the notion of having a unifying force for Linux goes against the very grain and nature of the Linux community. In short, Linux wouldn’t really be Linux anymore if that ever happened.

Linus Torvalds claims the operating system should be transparent. Of course, he’s able to make such claims because he doesn’t take responsibility for the entire operating system; he only speaks for the kernel. Rarely do kernel changes in the Mac OS or Windows require third party updates; it’s the higher level APIs that do. As such, I find Torvalds’ comments to be disingenuous at best.

Some form of Linux has a chance at gaining marketshare success in the form of very low end computers such as the Eee PC example. If foreign governments continue the push towards Linux on the desktop, this would help as well. However, neither of these scenarios really describes a best in class environment. Rather, they describe a lowest common denominator situation. But hey, that worked for Microsoft!

The biggest barrier for Linux on the desktop is the fragmentation of technology and standards. In my opinion, the best chance Linux has for a future on the desktop is for a commercial entity with sufficient resources to take control and push for a unified standard. Companies like IBM come to mind because of their vast resources and commitment to Linux. Unfortunately, I question IBM’s ability to pull it off due to their historically poor understanding of consumer needs and desires. IBM has always done well catering to their enterprise customers.

The perfect set-top box

December 11, 2007

Tuesday, December 11, 2007

If only someone made the products we actually want to buy! This article is more of a rant about good products that fall short and how I believe they can be fixed. If you’re frustrated by today’s set-top box offerings, then please, read on.

For starters, it would be best to state the major features that a set-top box should have. My view of a set-top box is a device that streams media from your computer to your HD television. That's it (well, almost). I suppose in an ideal world, having a DVR (digital video recorder) would be nice, but due to the complexities of dealing with the various cable / satellite companies, etc., I don't really consider this a requirement.

Also, this article is not meant to be an in-depth review of existing products by any means. Likewise, please forgive the generalizations made during this discussion.

Windows Media Center based devices

There have been many devices to choose from over the years, but none have been good enough for me to buy yet. I’ve played with a few of the earlier Media Center Extenders (MCX) from HP and Linksys. Microsoft took an early lead in this arena, but the end results have always felt like more of a novelty than something I had to buy.

The Xbox 360 seems to be a significant improvement. On the PC side, it’s about as good as you can ask for. HP even has a TV with built in MCX capability, though I haven’t tried that one yet.

Apple TV

The problem for Microsoft (and me) is that Apple has beaten them to the punch with iTunes / iPod. That is, most people don't want to have their media content in multiple places for multiple applications. It's not a matter of disk space; it's a matter of convenience and principle. Microsoft isn't likely to be able to license Apple's DRM in order to play Apple's protected content. However, they could do things like parse the iTunes library for music, video, etc.

Apple came out with the AppleTV product earlier this year. While this is much closer to what I'm looking for, it still misses the mark. On the positive side, it does integrate with my media files better. On the negative side, its support for HD lags behind Microsoft MCX devices. Come on Apple, 720p is good, but I want 1080p support. Also, I want 1080p content from the iTunes music store. This is another area where Microsoft has the edge over Apple.

These devices are also just begging for a web browser. Microsoft has Internet Explorer and Apple has Safari. Microsoft has an obvious need to maintain marketshare. For Apple, like the iPhone, the AppleTV could be a means of extending the use of the Safari web browser without increasing Mac marketshare. This should be a "no brainer".

Finally, I’d like to have cheap USB external storage as an option on all of these devices.


Microsoft currently has the lead for set-top boxes. With the Xbox 360, you even get the bonus of having a nice video game console. If you’re knee deep in Microsoft technology and just love your Zune, this is by far the best choice for you. However, for the rest of us that prefer the iPod / iTunes to manage most of our media, the solution is less clear.

Apple made a decent showing with its AppleTV product this year, but it's clearly a 1.0 product. My main gripe is the weak HD support for a product that actually requires an HD connection (component video or HDMI). With no settled HD standard yet for physical media (Blu-ray versus HD-DVD), Apple is missing an opportunity to sell 1080p content. That's a shame. It's not a far-fetched scenario to see both the Blu-ray and HD-DVD formats pushed aside in favor of an on-line only format.

Web browsers should be standard features on these devices. The AppleTV already has access to YouTube content. Why not a web browser? A keyboard could be optional. Microsoft bought WebTV but hasn't done anything with it. It seems to me this purchase hasn't been profitable for them. Why not take the expertise learned from this system and add it to the Xbox? Either way, both devices fall short of their potential. Web browsing from an HD based device is much more practical than surfing the web on a standard-definition television.

Finally, device makers shouldn't assume everyone has 802.11n routers and should instead provide a means of extending built-in storage via inexpensive USB external hard drives.

This is my wish list for 2008! The current devices aren’t far off from what I’m looking for. Based on sales of the AppleTV and common usage of the Xbox, neither Apple nor Microsoft has delivered what consumers are really waiting for. Let’s hope 2008 brings us to the next level of set-top boxes.

History of the Graphical User Interface (GUI)

November 12, 2007

Monday, November 12, 2007

When viewing various forum debates, for some reason, it seems important for some to give credit for various innovations to one company or another. In a similar vein, others try to disprove such claims, etc. There is probably no greater example of this than the origins of the graphical user interface. On one hand, I hesitate to write about this issue because so many people have such strong opinions on the matter. On the other hand, there is an awful lot of misinformation posted in forum discussions.

Finally, the one common theme I’ve noticed when people debate this issue is that they all try to make a case whereby one company gets full credit for the innovation and everybody else copied them. Unfortunately, the origins of such technologies are rarely “black and white” issues and there are many “shades of grey”. Most innovations are built off of the works of others. The graphical user interface is no exception.

Legend has it…

For many years, it was commonly accepted that Apple created the GUI and for good reason. Outside of a few people working in labs, most consumers and IT professionals alike were not familiar with the concept of the GUI prior to the debut of the Apple Lisa. At the time, I recall reading the coverage of this product in technical magazines such as BYTE, etc. Clearly, this technology was treated as something entirely new and there was no reference or comparison to Xerox’s work. Of course, back then, the internet wasn’t what it is today. There were online services and bulletin boards, etc. but for the most part people relied on magazine articles to keep current.

As difficult as it may seem today, the notion of using a mouse with icons and drop down menus, etc. was completely foreign. At the time, all interaction with computers was done with the command line interface (CLI). As I recall, the GUI concept was ridiculed by much of the technology community at the time and the Lisa / Mac platform was painted as a toy by some. It wasn’t for “serious” users.

As we all know, the GUI eventually did become popular to the masses. Radical changes like this often take time to digest. More importantly, it often takes the market leader (someone like Microsoft) to push this as the future direction and basically not give the masses a choice before they see the light.

Once the concept of the GUI became a good thing, some began to challenge Apple’s role in this innovation. Microsoft was smart enough to see which way the wind was blowing and likewise followed Apple with the pursuit of a GUI of their own. Since Microsoft blatantly copied Apple’s work (more on that later), the common sentiment from Microsoft Windows advocates was to cite prior work from Xerox PARC and claim Apple did the same.

Some Microsoft advocates seem to wonder why "what Microsoft did to Apple" was different from "what Apple did to Xerox". There were several differences. For starters, both Xerox and Apple were pioneering GUI concepts simultaneously. Clearly some concepts were developed at Xerox first and some were developed at Apple first. The problem for Apple was that they actually visited with the Xerox team to see what they were doing. This and this alone is what led some to the perception that Apple just stole their GUI from Xerox. Apple paid in stock (worth millions) for this brief visit. The agreement was up front. Apple didn't see how things were done in any detail and the programming environment Xerox was working in was so different from Apple's development environment that it wouldn't have mattered if they did see the details behind Xerox's work.

Microsoft was not just a competitor to Apple, but they were also a developer for the Mac platform. They had intimate access to Apple's APIs and frameworks to see how to create a GUI. Not only did Microsoft copy Apple's work in concept, but as a developer for Apple, they had the blueprint from which to copy.

Further, there have been no real significant GUI conventions that Apple is using which originated at Microsoft. With each release of Windows, it became clear that Microsoft was not interested in innovation since it was easier just to copy. With the Windows 95 release, it was clear that Microsoft wasn't even trying to hide their intentions of copying Apple.

What’s GUI? Who’s a WIMP?

In order to determine who gets credit for something, it’s important to have a reasonable definition. The basic requirements of a GUI are defined in an acronym called WIMP. WIMP stands for Windows (bit mapped graphics required), Icons, Menus and Pointing devices (usually mice).

Xerox’s contribution

Without a doubt, Xerox's Palo Alto Research Center (PARC) was responsible for many things we take for granted today such as the development of Ethernet, the laser printer, etc. and certainly, they helped unify some of the elements found in GUIs today. However, just as it is important to give credit where credit is due, it's equally important to make clear where credit is not due. For example, if you recall the elements (WIMP) necessary for a GUI, we have to ask which were developed at Xerox. Windows? Yes and no. Yes, Xerox developed a windowing system, but no, it did not invent the bit mapped graphics model necessary to display the windows. Icons? No. Menus? Yes, but not in the form we know them today. Pointing devices? Mice? No and no. The point here isn't to belittle the accomplishments of Xerox. Rather, the point is to illustrate that the concept of the GUI didn't just pop up at Xerox. Most of the items mentioned such as the bit mapped graphics, the mouse, icons, etc. were prior works attributed to Douglas Engelbart and his work at the Augmentation Research Center, which was funded by various US Government agencies including DARPA, NASA, the Air Force, etc.

Xerox did successfully create an advanced object oriented development environment written in Smalltalk. This is where the majority of Xerox's work on this product went. Xerox had technology demonstrations of rudimentary windowing systems, and the assumption has generally been that Xerox was further along than Apple was when Apple visited with Xerox. In terms of pure GUI concepts, Xerox's work was certainly an evolution over Engelbart's previous work. However, today's modern GUI bears little resemblance to what Xerox finally came up with in the Star/Alto product.

Apple’s contribution

In 1979, Jef Raskin started the Macintosh project at Apple. He identified a need for a computer that was easier to use than anything developed to date. Both the Macintosh project and the Lisa project were works in progress prior to Apple's infamous visit to Xerox. Apple and Xerox were in simultaneous development of a GUI. Apple was aware of Xerox's work because the founder of the Macintosh project, Jef Raskin, had lectured at Xerox on the topic prior to joining Apple. Jef Raskin was something of an authority on the subject at the time. He had written his Master's thesis on a WYSIWYG graphical interface back in 1967. Likewise, many of the same ideas that fueled Xerox's effort originated from the creator of the Macintosh project.

At the same time, it would be unfair to suggest Apple visited Xerox and didn't come away with any ideas. Clearly they did. The fact that Apple's Lisa GUI was different from Apple's Macintosh GUI should make that clear enough. But, the devil is in the details. Apple wasn't developing in a Smalltalk environment, and Apple had resource limitations that Xerox didn't. That is, Apple had to make a GUI work on an affordable piece of hardware. Xerox was working purely in a research environment without the same hardware limitations.

Testimonials from those involved…

There was an interesting essay written by Bruce Horn back in 1996 about this topic. Bruce was one of the people recruited from Xerox to work for Apple on the Macintosh project. Clearly, he's one of the few people that can say with authority what work was developed at which company. Below are a few excerpts from his essay.

“For more than a decade now, I’ve listened to the debate about where the Macintosh user interface came from. Most people assume it came directly from Xerox, after Steve Jobs went to visit Xerox PARC (Palo Alto Research Center). This “fact” is reported over and over, by people who don’t know better (and also by people who should!). Unfortunately, it just isn’t true – there are some similarities between the Apple interface and the various interfaces on Xerox systems, but the differences are substantial.”

Again, Bruce goes on to talk about the Smalltalk environment created at Xerox as this was more significant than the actual GUI pioneering they did.

"Smalltalk has no Finder, and no need for one, really. Drag-and-drop file manipulation came from the Mac group, along with many other unique concepts: resources and dual-fork files for storing layout and international information apart from code; definition procedures; drag-and-drop system extension and configuration; types and creators for files; direct manipulation editing of document, disk, and application names; redundant typed data for the clipboard; multiple views of the file system; desk accessories; and control panels, among others. The Lisa group invented some fundamental concepts as well: pull down menus, the imaging and windowing models based on QuickDraw, the clipboard, and cleanly internationalizable software."

The list goes on. In fact, he talks about Bill Atkinson's windowing model and how it was designed. Bill wasn't even aware that Xerox's model lacked the self-repairing windows the Mac had. The point here is that there is a big difference between broad concepts and the actual implementation. Clearly, the Macintosh architecture solved problems that the Xerox team wasn't even aware of.

The concept of pull-down menus is one of the most basic functions of the modern GUI. Again, this is yet another thing that didn't exist at Xerox. Drag and drop? Yes, another concept developed at Apple. Control panels? Yup, Apple. Clipboard metaphor / concept? Yup, Apple. Desk Accessories? We know them now as Widgets or Gadgets, but the concept originated at Apple. Most people don't even realize how icons were used on the Xerox Star. Icons were used as verbs, like an icon to "save" a file. On the Macintosh, you could use icons as objects, like a trash can, or you could drag a document onto an application icon to open it, etc.

What’s interesting is that even things that have been attributed to Xerox such as the selection based text editor, apparently did not originate there at all. Below is a response from the late Jef Raskin in response to Bruce Horn’s essay:

“Horn makes it seem that the selection-based editor came with Tesler from PARC. It may have been a case of convergent evolution, since we already had that paradigm at the Mac project. In this case it dates at least back to an editor I designed much earlier, while at Bannister & Crun. In ’73 I discussed my editor concepts with many people at PARC, so I do not know whether Tesler’s design was influenced by my work, I know it was not the other way around.”

In the following paragraph, it becomes clear that in terms of general concepts, Jef Raskin’s work predated any of the fundamental concepts that were further developed by Xerox.

“My thesis in Computer Science, published in 1967, argued that computers should be all-graphic, that we should eliminate character generators and create characters graphically and in various fonts, that what you see on the screen should be what you get, and that the human interface was more important than mere considerations of algorithmic efficiency and compactness. This was heretical in 1967, half a decade before PARC started. Many of the basic principles of the Mac were firmly entrenched in my psyche. By the way, the name of my thesis was the “Quick-Draw Graphics System”, which became the name of (and part of the inspiration for) Atkinson’s graphics package for the Mac.”


In the end, who created the GUI? There is no single entity that can take credit for that. Apple advocates want to say Apple did. Microsoft advocates would love to say Microsoft did, but there has never been any evidence to support that. Instead, if they can’t attribute this innovation to Microsoft, they refuse to acknowledge Apple’s work so they claim Xerox invented it all.

The problem with the "Xerox takes all" claim is that most of the basic elements required for a GUI predate Xerox's work as well. As I mentioned earlier, they didn't create bit mapped graphics, they didn't create icons, they didn't create the mouse pointer, etc. At the same time, they did evolve the concept into a rudimentary working model.

Moreover, if we look at the modern GUI today, it more closely resembles the work done with the original Macintosh than anything else, including Apple's earlier Lisa system. That's probably true for several reasons. For starters, Xerox was never able to successfully commercialize their work. Apple's GUI was certainly the first commercially successful GUI and certainly the first GUI that anyone outside of the Xerox PARC lab would have ever seen. Also, the market leader, Microsoft, basically did just steal from Apple. As such, it is understandable why Apple's GUI and whatever we call "today's modern GUI" are very similar in convention. Really, although there have been many refinements to the GUI, when you think about it, not much has changed in the past 25 years in terms of how we interact with our computers.

In terms of the Xerox Star, it's one thing to look at a screen shot of an operating system and another thing to see how it's used. Not only was the Xerox Star rudimentary in function compared to the Macintosh, but all accounts I've read claim it was clumsy and slow to use. That is, there was elegance in the Smalltalk environment they created, but it should be considered an unfinished product at best. It was more of a technology demonstration than a product that's ready for prime time. The point here isn't to belittle the work Xerox did. Rather, when considering the time line of events, it's important to realize how long it takes to create a product like this that actually is ready to ship to the masses.

Finally, Apple was the company that brought the GUI concept to the public and made them aware of it. The majority of today's GUI conventions date back to Apple's work more so than any other single entity. For that reason, in my opinion, Apple deserves the lion's share of the credit for this innovation. But, in the end, neither Apple nor Xerox can claim they alone created the GUI. Clearly, the work from those such as Douglas Engelbart, Jef Raskin, etc. predates the work from either Xerox or Apple, and this foundation was a necessary building block for both companies. Though both can claim partial credit, when you look at the conventions we associate with a GUI today, the majority of credit would have to go to Apple.

Leopard, Vista and Thurrott… oh my!

November 5, 2007

Monday, November 5, 2007

Apple's most recent operating system, Mac OS X 10.5 Leopard, was released last week and overall, the reviews have been very positive. Not surprisingly, Apple recorded record sales during the product's first weekend on sale. Leopard sales have far outpaced Apple's initial sales for Tiger (10.4) when it was released. This stands in strong contrast to the reception Microsoft received with its Vista debut. While I happen to think Vista is a fine operating system and largely a big improvement over XP, the migration to Vista has been painful for many. I've been shocked to see such a demand from consumers to downgrade back to XP.

With that in mind, it’s kind of funny to read a Leopard review from an extremely biased Windows “journalist”. I’ve tagged this under “journalist hack”, but I may start a new tag called “comedy” to cover items like this. While I realize nobody really takes Paul Thurrott’s articles seriously, feel free to continue reading my rebuttal just for fun.

For the record, I should probably state my position on Leopard right up front. I see Leopard as a nice evolutionary upgrade for the Mac platform. I wouldn't say the new features in Leopard are particularly innovative in function. Apple has done a good job of taking existing concepts and implementing them in a very user friendly way. While enhancing the user interface to existing concepts is a form of innovation, it's not the same level of innovation that Apple is known for. Much of what Apple has done with Leopard has been done before in other operating systems. Thurrott tries to attribute much of Apple's work to previous works from Microsoft. In doing so, he makes himself look foolish as his comparisons are often a bit of a stretch. I'm not sure whether Thurrott really isn't aware of where various technologies originated or whether he chooses to omit such information in order to give his explanation more credibility.

Apple Mac OS X 10.5 ‘Leopard’ Review
Paul Thurrott

“While the Apple hype machine and its fanatical followers would have you believe that Mac OS X 10.5 “Leopard” is a major upgrade to the company’s venerable operating system, nothing could be further from the truth. Instead, Leopard is yet another evolutionary upgrade in a long line of evolutionary OS X upgrades, all of which date back to the original OS X release in 2001.”

I suppose this is where the comedy begins. A common argument I hear from Windows zealots seems to be that each OS X upgrade is minor, sort of like a “service pack” release for Windows. The basis for this argument comes solely from the naming convention. The fact that each release has had what seems like a point release (10.0, 10.1, 10.2, 10.3, 10.4, 10.5) naming convention seems to be more than Thurrott can handle. In all fairness, the 10.1 “Puma” release really was just that, a point release to fix stability issues and add very minor features to the not yet ready for prime time 10.0 “Cheetah” release. However, each subsequent release since then has been very substantial and worthy of the “major release” designation.

Apple highlights some of the more significant user level features on their web page. They even have a site which lists 300 new features. Granted, all of the new features aren't earth shattering and many are enhancements to existing features, etc. Still, not everything is listed. You'd have to be a developer to appreciate most of what's new in an operating system. For those who actually want a taste of what Apple has really done with the kernel, system services, etc., you can start by reading John Siracusa's excellent review at Ars Technica.

Anyway, back to Thurrott's review… The funny thing is, Thurrott is in a tough position. Leopard is considered by most to be the leading operating system in terms of technology and features. Clearly there is not a true one to one comparison with Vista. Vista excels at some things and Leopard excels at others. Still, Leopard is the real deal and Thurrott is fully aware of it. So, on one hand, Thurrott wants to belittle the accomplishment Apple makes with each release. On the other hand, he has to acknowledge in his review what a product Leopard is. The only way to do that is to claim the original Mac OS X 10.0 release was better than it actually was. That's pretty much how Thurrott back pedals in his next paragraph. He doesn't mention the 10.0 release by name, but since everything since then has been a "service pack" like update, what he's implying is clear enough.

“But let me get one huge misunderstanding out of the way immediately: That’s not a dig at Leopard at all. Indeed, if anything, Apple is in an enviable position: OS X is so solid, so secure, and so functionally excellent that it must be getting difficult figuring out how to massage another $129 out even the most ardent fans. Folks, Leopard is good stuff. But then that’s been true of Mac OS X for quite a while now.”

Thurrott tries to draw parallels between Vista and Leopard and between Microsoft's and Apple's development processes, but the comparison just doesn't fly.

“Both Leopard and Vista were horribly late, Vista even more so than Leopard.”

Really? Vista/Longhorn was supposed to ship in 2003, but actually shipped (to consumers) in 2007. That’s 4 years late and it was missing most of the interesting features that were originally promised. Leopard was supposed to ship in “first half” of 2007, but actually shipped in October of 2007. That’s 4 months late and feature complete. BIG difference Paul!

New Features

For some reason, Thurrott goes on a tirade about the definition of “new features”.

“…the feature must actually be new (i.e. have not appeared in any form in a previous version of the product) and must actually be something that impacts end users in a practical way.”

According to Thurrott’s definition, hardly any software products can list “new” features. That is, (using Thurrott’s own example), it doesn’t matter what “enhancements” Apple has made to its DVD player for example because it already had a DVD player. Likewise, new capabilities of this or any similar application don’t count as “features” for some reason. I wonder if Thurrott applies this same logic to his Microsoft Vista or Office reviews?

Time Machine

“New to Leopard, Time Machine is Apple’s version of Microsoft’s Previous Versions feature, which first appeared in Windows Server 2003 over four years ago.”

As expected, Thurrott loosely tries to compare Apple’s Time Machine feature with something Microsoft has. The problem is, what is he comparing it to? Backing up files? Apple has had a “backup” application for years that came standard with .Mac subscriptions. Backing up files is nothing new for Apple either. Perhaps he means snapshots? Again, this concept has been around in the Unix world for ages, certainly long before Microsoft even considered adding such a feature. This alone makes me wonder why he is trying to pretend Apple is copying Microsoft here. It gets better though.

“What makes Time Machine truly interesting is that it works with certain applications in addition to files and that’s something Apple should stress more in its discussions about this feature.”

Again, this begs the question, which Microsoft product does what Time Machine does here?

“Unfortunately, the company mucked up Time Machine with a truly juvenile user interface, one that is horribly out of place in its otherwise staid and professional looking OS X.”

I suppose user interface is a subjective thing. However, it’s worth noting that this is the one and only review I’ve seen which doesn’t praise the Time Machine interface. Apple has put a practical and user friendly interface on what has traditionally been a tool reserved for geeks. It would seem that Thurrott stands alone on this one.

“Apple also blows it by requiring a second hard drive: This makes Time Machine less useful for mobile users, which Apple says represent over 50 percent of its sales. Way to ignore your own trends, Apple.”

Again, comments like this are why I enjoy reading Thurrott's posts. Comic relief is a great way to reduce stress in the work day. Basically, Thurrott is advocating the practice of storing your backups on the same volume as your original data. That's all well and good until you actually need to restore your data after a disaster occurs. What do you do when your hard drive crashes, Paul? You've just lost your original data and your backup data. Brilliant! Please Paul, don't apply for an IT job. Any laptop user who doesn't back up to another physical device such as an external hard drive or some other network storage is just asking for trouble.
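To make the point concrete, here's a minimal sketch of the backup principle described above: copy your files to a *separate* physical volume, so a single disk failure can never take out both the originals and the backup. This is purely illustrative; the paths and the `backup` function name are my own and have nothing to do with Time Machine's actual implementation.

```python
# Hedged sketch: mirror a source folder onto a second physical volume
# (e.g. an external USB drive mounted at `dest`). Illustrative only.
import os
import shutil

def backup(src, dest):
    """Mirror src into dest, copying files that are new or newer."""
    for dirpath, _dirnames, filenames in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        target_dir = dest if rel == "." else os.path.join(dest, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            s = os.path.join(dirpath, name)
            t = os.path.join(target_dir, name)
            # copy2 preserves timestamps, so files that haven't changed
            # since the last run are skipped by the comparison below
            if not os.path.exists(t) or os.path.getmtime(s) > os.path.getmtime(t):
                shutil.copy2(s, t)
```

Run it with `dest` pointing at an external drive (say, `/Volumes/Backup`) and the copy survives a crash of the internal disk, which is exactly the scenario Thurrott's suggestion fails.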


Spaces

Thurrott doesn't say much about Leopard's Spaces feature other than to acknowledge that it came from the UNIX environment. True enough; though some of the ideas go back as far as Xerox PARC, the concept of virtual desktops is really nothing new. Most of the work was pioneered by the X Window System. Microsoft has dabbled with this concept before by making this feature a "PowerToy" add-on to Windows XP, but the implementation was absolutely horrible. Basic features like moving a window to another desktop, etc. were not implemented.

This is definitely a feature for power users and Apple seems to have done a good job here. There is nothing innovative about what Apple has done with this, but it is a first class implementation of an existing concept.

User experience

“After years of deemphasizing unnecessary translucency effects in Mac OS X, Apple takes a big step back in Leopard. Now, not only are menus more translucent than ever in Leopard, but so is the system-wide menu bar at the top of the screen, meaning that it will rarely be solid white as it’s been in all Mac OS releases since the original version in 1984. The effect is ugly, and I wish you could at least turn it off.”

Every dog has his day and even Thurrott gets to be right once in a while. Fortunately, this “feature” can be turned off, but it involves something beyond a simple control panel setting.

“Apple’s file manager application, the Finder, has always been adequate, but this time around it’s been upgraded with a number of Vista-like features, including a new look and feel (based, go figure, on iTunes) and a semi-customizable sidebar. This, I like quite a bit.”

This is another one of those situations where you just have to laugh. Thurrott claims the Mac “Finder” has been upgraded with “Vista-like” features, which in turn he credits as being based on iTunes. Really, why bring Vista into this? Clearly the iTunes product and interface have existed long before Vista. Isn’t it possible that Apple is just standardizing the interfaces of its own products? Thurrott is clearly implying Apple is copying Microsoft in some way. Yet, he goes on to acknowledge that both Leopard and Vista are copying Apple’s own iTunes. Funny. He goes on to draw other similarities between Leopard and Vista, but again, most of these conventions already exist in Apple’s iTunes.

Smart Folders

“Search For, as you might expect, is OS X’s answer to Vista’s Searches folder. Here, you’ll see links to prebuilt searches such as Today, Yesterday, Last Week, and links for searching for images or documents. And as like Vista, you can create your own saved searches. These will automatically show up in the Search For list in the Finder when saved.”

Again, Thurrott seems to be hoping that his reading audience is limited to Windows users who have never seen or heard of other operating systems. Here, Thurrott is referring to Apple's "Smart Folders" feature. Smart Folders are also known as Virtual Folders. They are basically saved search criteria. When you access one of these folders, the saved criteria are run through the system's search engine to provide dynamic results. Apple has made extensive use of this feature in products like iTunes, iPhoto, Mail, etc. for years. Microsoft first used this feature in Outlook, but even that use is predated by iTunes by several years. Even still, the feature really originated with the Be operating system (BeOS). So, again, this concept is nothing new. Apple hasn't done anything innovative here, but they certainly didn't steal this from Microsoft as Thurrott seems to imply.
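The idea behind a virtual folder can be sketched in a few lines: the "folder" stores only the query, and its contents are recomputed every time it's opened. The class and method names below are my own illustration, not any vendor's actual API, and a real Smart Folder queries an indexed search engine rather than walking the disk.

```python
# Minimal sketch of a saved-search "virtual folder": only the criteria
# are stored; the contents are recomputed on every access.
import fnmatch
import os
import time

class VirtualFolder:
    def __init__(self, root, pattern="*", modified_within_days=None):
        self.root = root                  # directory tree to search
        self.pattern = pattern            # filename glob, e.g. "*.mp3"
        self.days = modified_within_days  # optional recency criterion

    def contents(self):
        """Re-run the saved query and return the matching paths."""
        cutoff = None
        if self.days is not None:
            cutoff = time.time() - self.days * 86400
        results = []
        for dirpath, _dirnames, filenames in os.walk(self.root):
            for name in fnmatch.filter(filenames, self.pattern):
                path = os.path.join(dirpath, name)
                if cutoff is None or os.path.getmtime(path) >= cutoff:
                    results.append(path)
        return results
```

Because nothing is copied, a file created after the folder was defined still shows up the next time the folder is opened, which is exactly what makes the results "dynamic".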

Quick Look

“In another nod towards reducing steps and thus increasing efficiency, Leopard includes a new feature called Quick Look, which lets you view the contents of most document types without opening them in the application that created them. This feature, apparently modeled after the preview feature in Windows Desktop Search, augments Leopard’s Finder-based icon views which, like those in Vista, use thumbnails to reflect the contents of documents.”

At this point, one has to question whether he's being serious. Many programs have saved icons that reflect a document's content for years; this is typical for graphics programs when saving JPEG files, for example. Comparing Leopard's Quick Look to Vista's preview feature is either disingenuous or naïve (or possibly both). Quick Look goes way beyond Vista's preview. It allows you to view entire documents, page by page, without opening the application that created them. It's instantaneous in performance and elegant in design, and it's a real breakthrough for searching content. It should also be noted that Mac OS X has, since the beginning, been able to preview photos, movies, audio, etc. in ways beyond what Vista's preview does. This feature works in conjunction with Apple's Cover Flow viewing mode. Since Cover Flow is another feature with no counterpart in Vista, Thurrott makes up some nonsense about performance issues. I'm not sure what machine he's using, or whether he's looking in folders with 10,000+ files, but in my experience I haven't witnessed a performance issue with this feature.


“When Apple copied Microsoft’s instant search feature to create Spotlight, it only got it partially right, so the Leopard version addresses some of the missing features from Tiger.”

Predictably, Thurrott paints Apple as the one copying Microsoft. Really? That's a bold claim, and it should be noted that he offers nothing to back it up. Apple shipped a comprehensive desktop search solution, Spotlight, in the 10.4 (Tiger) release. Tiger shipped in April 2005. Microsoft shipped a comprehensive desktop search solution, WDS, in the Vista release. Vista shipped in January 2007 (let's not split hairs about the fall 2006 release to select "business" customers). So that raises the question: how did Apple copy Microsoft?

Apple first introduced Spotlight in June 2004 at the annual WWDC (Worldwide Developers Conference) as part of an early demonstration of its upcoming Tiger release. Not coincidentally, Microsoft quickly went out and purchased a search engine startup, Lookout, in July 2004. That acquisition was later released in beta as "MSN Desktop Search" and, in version 2, renamed to Windows Desktop Search. Yet we're supposed to believe Apple is copying Microsoft? Not according to the timeline!

In case anyone is wondering, Microsoft's indexing services (which Apple also had back in the "classic" Mac days) are not the same thing. Some may note that Microsoft had been working on the now-defunct WinFS for years, but I see no point in discussing products that never shipped. We could just as easily discuss Apple's previous efforts with the V-Twin search engine developed for Copland. Also, if Microsoft's home-grown search technology were really mature enough, they wouldn't have had to go out and purchase another company just to acquire a competing solution.

In any case, it’s not like either company came up with the idea. The BeOS was years ahead of it’s time technology wise. The BeOS Tracker had what is considered to be the best desktop search integration even today.

Anyway, Thurrott goes on to try to make Spotlight look bad by making false claims. For example:

“Spotlight now supports Boolean operators like AND, OR, and NOT, which should be familiar to database gurus and Google fans. As with Vista’s Start Menu search feature, you can now use Spotlight to quickly find and launch applications.”

While it is true that there were limitations on what you could do from the Spotlight search window in Tiger, Thurrott is absolutely incorrect on both counts here. Searching from a Finder window allowed for more extensive search criteria, and command-line utilities such as mdfind, mdls, and mdutil allowed for extensive Boolean searches and other flexibility not found in other search engines. Additionally, you most certainly could launch programs from the Spotlight search window in Tiger, though I do agree that this feature is better implemented in both Vista and Leopard.
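
For readers unfamiliar with those tools, the queries they accept are essentially Boolean predicates over file metadata attributes. The toy Python sketch below is only an illustration of that idea, not Apple's API; the attribute names mirror real Spotlight keys, but the index and the filtering logic are made up for this example.

```python
# Toy metadata index standing in for Spotlight's store. The attribute
# names (kMDItemFSName, kMDItemKind) mirror real Spotlight keys, but
# the data and the query function are purely illustrative.
index = [
    {"kMDItemFSName": "photo.jpg", "kMDItemKind": "JPEG image"},
    {"kMDItemFSName": "draft.txt", "kMDItemKind": "Plain text"},
    {"kMDItemFSName": "logo.png", "kMDItemKind": "PNG image"},
]

def query(items, predicate):
    """Return the names of items matching a Boolean predicate."""
    return [item["kMDItemFSName"] for item in items if predicate(item)]

# AND combined with NOT: images that are not PNGs.
result = query(
    index,
    lambda i: "image" in i["kMDItemKind"] and "PNG" not in i["kMDItemKind"],
)
print(result)  # ['photo.jpg']
```

The real mdfind tool expresses the same kind of combination in its query string syntax, which is why the claim that Tiger's Spotlight lacked Boolean capability is wrong.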

Finally, Thurrott makes it sound like Apple is just catching up here. That's not the case. Where is Vista's (WDS) ability to search across networks the way Leopard can?

Web Browsing

“Apple’s lackluster Safari Web browser is updated to version 3 in Leopard and it features some improvements that will be familiar to user of Firefox.”

This is the sort of comment that demonstrates Thurrott's extreme anti-Apple bias. Since when is Safari considered a "lackluster" web browser? I certainly agree that Firefox is a great browser; I use it fairly often, and prior to Safari 3 it had the best search feature, for example. I find it odd that he doesn't even mention IE 7. If anything, it is IE that has been considered lackluster for only just now getting tabbed browsing, not to mention its poor support for W3C standards. Safari, on the other hand, has been shown to be among the fastest browsers and was the first to pass the Acid2 test. That doesn't sound like a "lackluster" browser to me. I'd say the biggest legitimate knock against Safari is market share. In practice, web developers are forced to make sure their pages render in Microsoft's Internet Explorer, even if they have to abandon internet standards or dumb down their code in the process.


“It’s not like OS X, which has had no real world viruses or malware attacks over the year, has gotten any more secure in a realistic sense.”

I’m not even sure what that means. Basically, Leopard does make security improvements. In some cases such as “Library Randomization” it is playing catch up to Vista and other more secure operating systems (OpenBSD, etc.). That said, Apple apparently doesn’t get credit for this because it was already “secure enough”? Granted, Vista was a huge improvement over XP in terms of security. The vast majority of security features have always been present in OS X. This explains why malware attempts have largely been unsuccessful with OS X and now similarly with Vista. Windows zealots used to claim that OS X was safe due to security through obscurity. Of course, now that Vista has adopted much of what OS X has already done, some are beginning to recognize the OS X’s security model was in fact very good.

“If Apple is seriously about slowing that growth, it needs to offer an OS that is obviously better than Vista. Leopard is not that system.”

Thurrott is a bit naïve if he thinks operating system market share is that simple. Tiger seriously outclassed XP, but Windows users didn't switch in droves (some did, though). Switching platforms is extremely costly, especially in the business world, where most of Microsoft's sales come from anyway. History has proven that having the best operating system in terms of features and technology is not enough. Does anyone remember DR-DOS? OS/2?

“Make no mistake: Mac OS X 10.5 “Leopard” is the real deal, a mature and capable operating system and a worthy competitor to Windows Vista. But then, so was Tiger, Leopard’s predecessor.”

Agreed. Vista brought rough parity with Apple's Tiger (10.4) release: in some ways a little better, in some ways not quite as good, but overall it certainly bridged the huge gap between XP and Tiger. Leopard is not revolutionary in any way, but it is a nice evolutionary upgrade over Tiger, and most would agree it's a better choice than Vista. That opinion is certainly shared amongst the more respected press.

“Another problem with Leopard is the unmet expectations. Apple, like Microsoft with Windows Vista, promised more than it delivered with Leopard, and even went so far as to promise secret new features that never materialized.”

I think there is a big difference here. Microsoft made a big deal about the various pillars of "Longhorn". Major, specific features such as WinFS and Palladium were promised but never delivered. Apple, on the other hand, never promised anything specific. Rather, when Leopard was first demoed more than a year ago at WWDC, Apple mentioned additional "secret features" in a tongue-in-cheek sort of way. Surely something like the support for Sun's ZFS could qualify as one of those. To compare Microsoft's broken promises to Apple's is just absurd.

“Leopard is also incomplete. If you purchase this product on October 26, you’ll be getting pre-release quality software that Apple will update early and often, as they’ve done so often in the past with virtually all of its software products in the past several years. While your garden-variety Mac zealot may bristle at this suggestion, people who actually beta tested Leopard know what I’m talking about. It will get better over time. It always does.”

By Thurrott’s definition, Vista must also be incomplete. Are there no service packs coming for Vista? Really, Thurrott’s sense of logic is certainly twisted. No operating system or software product of that scope will be released without the “service pack” type of updates.

“Leopard was Apple’s chance to once again leapfrog Windows, and given the five years of delays Microsoft put us through, it should have been a slam-dunk. That Apple was only able to come up with something that’s roughly as good as Vista is both surprising and telling, I think. Leopard just isn’t better than Vista. And it should be.”

Again, this is another example of ridiculous commentary from Thurrott. Yes, Leopard was an opportunity to leapfrog Vista, and while I'd stop short of calling it a leapfrog, Leopard generally did pull ahead. To suggest that leapfrogging Vista should be a slam dunk is to suggest that Vista is crap and easy to leapfrog; that's just not the case. And why is Thurrott surprised that Apple is the only real competitor left? Operating system development is a huge effort and requires support from third parties. Linux is fundamentally sound, and at the kernel level its technology is arguably better than Windows'. But Linux needs to standardize on the front end, attract better third-party support, and build a user base that is accustomed to paying for software.


I’m not sure a Paul Thurrott article really warrants a formal rebuttal. In fact, I’m quite sure it doesn’t. Nobody is perfect and everyone (including me) makes mistakes. However, there are few examples of people who are more consistently wrong than Thurrott. Rob Enderle comes to mind, but that’s another story. Since Microsoft is a client of Enderle, at least it’s clear where his bias comes from.

I’ve seen various forum debates where even the most die hard Windows zealot won’t cite Thurrott as a source of information because they know that would only hurt their position in an argument. Still, I have to admit, I do enjoy reading his articles once in a while. Every once in a while, I might even indulge myself in a rebuttal!

In any case, Vista is a fine operating system and in my opinion is somewhere between Apple’s Tiger and Leopard releases in terms of features overall. I’d agree that it’s roughly on par with both, but from my experience, I’d say Apple is in a better position with Leopard.

The real challenge will be to see where both companies go from here. If Microsoft goes another 5-6 years before its next major OS release, it will certainly fall behind Apple, much as XP fell well behind Tiger.

While the Vista product is good, Microsoft can't be proud of Vista as a project. It took nearly six years, cost more than six billion dollars, and finally shipped without many of the promised features. That was a disgrace, and I'm sure it's no coincidence that Jim Allchin has moved on. Microsoft has replaced Allchin with Steven Sinofsky, and I have no doubt he'll do a better job of delivering on promised features and dates.

It’s not like Apple hasn’t suffered through this as well. Remember the Copland project? Of course, all of that was before Steve Jobs (and company) returned to Apple. Since then, Apple has been very predictable and reliable with regard to delivering on its promises. Perhaps Apple was a bit stretched recently with its focus on the iPhone product. However, a 4 month delay for a software product of this scope is certainly reasonable. Unlike Thurrott, I would not put Leopard in the same category as Vista from that respect. Still, the pressure will be on Apple to continue to enhance the Mac OS X operating system with regular updates. Only time will tell who will deliver what next… I’m looking forward to the next releases from Microsoft and Apple already!

Subpixel font rendering: A difference in philosophy.

August 23, 2007

Thursday, August 23, 2007

Every once in a while, someone writes an article that inspires me to explore a topic in more detail. In this case, the credit goes to the infamous George Ou for his blog entry entitled "Vista puts Mac OS X font rendering to shame". Mind you, when an article is well written and backed by facts, there isn't much controversy and hence not much to expand upon.

Rather than go into the details of Ou's blog, it's safe to summarize his article by saying he doesn't like the font rendering technology in OS X but, not surprisingly, does like Microsoft's. Mind you, Ou has little credibility in the publishing field and even less in basically anything IT-related. Further, his article misses many points that are critical to this discussion. While I'm not writing this for the purpose of character assassination, it is worth putting the validity of Ou's blogs in context. Based on the history of his posts, he's very biased against Apple. He's probably most famous for supporting David Maynor in his botched attempt to demonstrate a bug in Apple's WiFi drivers, but that's a story for another day. The point is, Ou has an established history of being consistently incorrect and of drawing conclusions from misinformation. He strives to be controversial; writing outrageous articles that neglect the facts is his modus operandi. For this reason, I won't bother dissecting his post. Instead, I'll just give him credit for giving me a reason to write about this issue.

What is subpixel rendering and why is it used?

Have you ever noticed that fonts, drawings, etc. tend to have jagged edges, particularly around curves and diagonal lines, yet when you print the same text or drawing, it looks sharp and much clearer? That's because vector-based images (like PostScript, TrueType, or OpenType fonts) are described by mathematical curves, and the fidelity with which those curves can be drawn depends on the resolution of the output device. The typical screen resolution is in the 72-96 dpi (ppi) range, while printers start at around 300 dpi. As the resolution of the print/display medium increases, so does the accuracy of the drawing.
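
A quick back-of-the-envelope calculation shows why: a point is defined as 1/72 of an inch, so the number of device pixels available to a glyph grows linearly with the dpi of the output device.

```python
POINTS_PER_INCH = 72  # by definition, 1 point = 1/72 inch

def glyph_height_px(point_size, dpi):
    """Nominal height, in device pixels, of a glyph at a given point size."""
    return point_size / POINTS_PER_INCH * dpi

# The same 12pt character gets roughly three times as many pixels
# (in each dimension) on a 300 dpi printer as on a 96 dpi screen:
print(glyph_height_px(12, 96))   # 16.0 pixels tall on screen
print(glyph_height_px(12, 300))  # 50.0 pixels tall in print
```

With only 16 pixels to work with, a curve has to be approximated coarsely, which is where the jaggedness comes from.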

Anti-aliasing is a means of tricking the human eye into believing it is viewing an image at a higher resolution. For example, a white diagonal line displayed against a black background will look jagged on a computer screen. The same (vector-based) line, when calculated at a higher resolution (as when it's output to a printer), will look smooth. The trick is to display grey dots along the boundary between the white edges of the line and the black background; the effect is that the human eye perceives a smooth line on a low-resolution display.
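
One common way to compute those grey levels is coverage estimation: sample each pixel at several interior positions and shade it by the fraction of samples that fall inside the shape. The sketch below is a generic illustration of that technique (not any particular OS's rasterizer), using the half-plane y <= x as a stand-in for the edge of a diagonal line.

```python
def pixel_coverage(px, py, samples=8):
    """Estimate the fraction of the unit pixel at (px, py) covered by
    the half-plane y <= x, via an 8x8 supersampling grid. The result
    is the grey level anti-aliasing would assign to that pixel."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples  # sample point inside the pixel
            y = py + (j + 0.5) / samples
            if y <= x:
                hits += 1
    return hits / samples**2

print(pixel_coverage(0, 0))  # an edge pixel gets an intermediate grey
print(pixel_coverage(5, 0))  # 1.0: fully inside the shape
print(pixel_coverage(0, 5))  # 0.0: fully outside the shape
```

Pixels the edge passes through get values between 0 and 1, and it is exactly those intermediate values, rendered as greys, that fool the eye into seeing a smooth edge.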

In summary, with respect to fonts displayed on a computer screen, anti-aliasing is used to make jagged fonts appear smooth. Subpixel rendering, like anti-aliasing, is a method of font smoothing.

How does it work?

The smallest element (dot) on a computer screen is called a pixel. Each pixel, particularly on LCD screens, is composed of a red, a green, and a blue stripe. By varying the intensity of these stripes, the LCD panel can make pixels brighter or dimmer in addition to changing their colors. Subpixel rendering allows the vector calculations used to draw a font to go down to the subpixel level (the red, green, and blue components of a pixel) in order to achieve the effect of displaying the font at a higher resolution. This method effectively triples the horizontal resolution of an LCD display. Subpixel rendering does not have the same effect on CRT displays, which lack fixed, individually addressable color stripes.
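
The core idea can be sketched in a few lines: sample the glyph's coverage at three times the horizontal pixel resolution, then drive each pixel's R, G, and B stripes from three adjacent samples. This is a deliberately simplified illustration; real implementations also filter the result to suppress the color fringing that naive stripe-driving produces.

```python
def to_subpixels(coverage_row):
    """Map one row of glyph coverage, sampled at 3x horizontal
    resolution, onto whole pixels: one coverage value per R, G, B
    stripe. For black text on a white background, each channel is
    255 * (1 - coverage of that stripe).
    coverage_row: floats in [0, 1], length divisible by 3."""
    pixels = []
    for i in range(0, len(coverage_row), 3):
        r, g, b = coverage_row[i:i + 3]
        pixels.append(tuple(round(255 * (1 - c)) for c in (r, g, b)))
    return pixels

# A vertical stem one subpixel wide, landing on a pixel's middle
# (green) stripe: that stripe goes dark while its neighbors stay lit.
row = [0.0, 1.0, 0.0,   # pixel 0: only the green stripe is covered
       0.0, 0.0, 0.0]   # pixel 1: blank
print(to_subpixels(row))  # [(255, 0, 255), (255, 255, 255)]
```

Viewed from normal distance the eye fuses the stripes, so the stem appears one-third of a pixel wide, which is the "tripled horizontal resolution" described above.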

History of subpixel rendering…

Manipulating the screen at the subpixel level is nothing new. The technique actually dates back to the old Apple ][ days, when Steve Wozniak manipulated subpixel data as a means of displaying more colors than the hardware was designed to produce. Although it was used for a different purpose, the concept is the same.
Quote from Steve Wozniak:

“Back in 1976, my design of the Apple II’s high resolution graphics system utilized a characteristic of the NTSC color video signal (called the ‘color subcarrier’) that creates a left to right horizontal distribution of available colors. By coincidence, this is exactly analogous to the R-G-B distribution of colored sub-pixels used by modern LCD display panels. So more than twenty years ago, Apple II graphics programmers were using this ‘sub-pixel’ technology to effectively increase the horizontal resolution of their Apple II displays.”

In 1998, Microsoft announced its brand of subpixel rendering, called ClearType. The feature was optional, and left off by default, in Windows XP (circa 2001). It is turned on by default in Windows Vista (circa 2007).

Apple has provided font smoothing technologies in OS X since its first consumer release in 2001. The main difference is that Apple's font smoothing employs different algorithms depending on the chosen setting: subpixel rendering for the LCD setting, plain anti-aliasing for the CRT settings, and so on.

A difference in philosophy.

Apple and Microsoft adopted different philosophies to address the jagged-font problem, and since the philosophies are different, so are the implementations. It's not necessarily a matter of right or wrong; both implementations have their respective strengths and weaknesses. All the same, everyone is entitled to an opinion.

This difference in philosophy is best described by Joel Spolsky in the article linked below.

Font smoothing, anti-aliasing, and sub-pixel rendering

“Apple generally believes that the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness.
Microsoft generally believes that the shape of each letter should be hammered into pixel boundaries to prevent blur and improve readability, even at the cost of not being true to the typeface.”

That is, Apple’s method keeps fonts true to their design at the potential loss of sharpness on very small font sizes. Microsoft’s method attempts to keep fonts as clear as possible, at the expense of the accuracy of the font.
In Joel’s example, the dot on the “i” in the word “Rendering” was slightly clearer with Cleartype. However, that came at the expense of the inaccurate difference between plain text and bold text. Cleartype messes up the “g” and the “e” characters as it forces alignment to a subpixel grid. The “R” is slightly exaggerated and the “o” is shaped as an oval instead of a circle with the Cleartype implementation. In short, Cleartype actually changes the font to the point that the font displayed is not what was intended by the font designer.

At the web site below, Damien Guard illustrates the difference in scaling between Microsoft's ClearType and Apple's font smoothing. ClearType is all over the place, while Apple's font smoothing is nice and linear. There may be a difference on a CRT, but on a quality LCD I wouldn't even consider ClearType smoother in this example.

Damien goes on to note:

“Windows does not scale fonts linearly as the rough line points out.

Windows scales the height and width but not the weight of the font.”

My own test…

Since I am a multi-platform user, I decided to conduct my own test. That and I don’t want to run into any copyright issues by posting images created by someone else. Here’s what I found:

[Image: font rendering comparison, Windows (Word) vs. Mac (Pages)]

In my test above, I created the Windows version with Word and the Mac version with Pages, using the same font, Arial, on both. The line in blue is 12pt, and the black text starts at 8pt and moves up to 16pt in 1pt increments. Note: you should view this image on an LCD. Here's what I observed:

  • Even at the same nominal size, Microsoft's fonts are rendered bigger. The larger the rendered glyph, the easier it is to draw correctly, so right off the bat there is a difference. Perhaps this has something to do with the default dpi each system assumes for the screen?
  • In the lower font sizes (9pt-11pt), ClearType does appear slightly sharper.
  • I do see accuracy problems in the ClearType rendering, though. Starting with the line in blue: both the bold "e" and "g" characters are rendered incorrectly.
  • As mentioned earlier, one thing is clear: size scales fine on both sides, but weight does not scale with ClearType. With ClearType, the weight seems to be grouped in 3pt increments; for example, 14pt, 15pt, and 16pt differ in size but have the same apparent weight. This is a result of ClearType forcing the output to a grid. OS X's Quartz rendering, by comparison, scales smoothly in both size and weight, as one would expect.
  • When you’re given a choice between choosing slightly crisper text or more accurate fonts, it’s understandable why some would prefer going with the crisper text. If your needs are basic such as writing an e-mail, or simple word processing, Microsoft’s method might be better for you. But, if you’re doing any sort of professional work where font accuracy and spacing is important, Apple’s implementation is the better choice.

In some respects, it's not just a matter of personal choice. For example, if you are doing content creation using ClearType, these font inaccuracies are then propagated to other forms of media such as web page graphics, DVDs, etc.


This discussion has focused on the difference in philosophies, but many facts were presented along the way. The facts suggest that Microsoft's ClearType can appear clearer than Apple's font smoothing under certain circumstances; my opinion is that the difference in clarity is negligible. The facts also show that ClearType is considerably less accurate than Apple's font smoothing: ClearType-rendered fonts are not displayed the way the font designer intended, and they don't scale properly, especially compared with Apple's implementation.

As a matter of subjective opinion, I certainly agree that some people may prefer the look of ClearType-rendered fonts over fonts rendered with Apple's technology. However, my anecdotal evidence also indicates that people who prefer ClearType are generally less knowledgeable about fonts in general. Personally, I share the opinion that it is not Microsoft's place to purposely distort a font, even if the intention is better clarity on LCD monitors.

As others have mentioned before me, if screen clarity is the primary objective, then screen-optimized fonts such as Verdana, Lucida Grande, Georgia, etc. should be used. Conversely, these screen-oriented fonts have little character when output to print, so consideration should be given to using the appropriate font for the occasion. Finally, I believe font rendering technology should render fonts as accurately as possible to the intended design.

In the end, although there is a difference between Apple's Quartz renderer and Microsoft's ClearType renderer, I don't see that it is worthy of all the fuss some people make of it. Both methods have advantages and disadvantages, and Apple and Microsoft have simply taken different philosophies. Is that such a terrible thing? Generally speaking, Microsoft and Apple serve different markets. I realize that the George Ous of the world are just looking for attention; I'm not sure what else would prompt such lopsided arguments. In any case, Windows users seem happy with ClearType and Mac users seem happy with Quartz. They are optimized for different purposes, neither is perfect for all occasions, and we're all entitled to our own opinions and preferences. As stated, my opinion is that the distortion of a font is not acceptable for any reason. Whatever your opinion on this issue, hopefully this post helps you consider the pros and cons of both implementations of subpixel font rendering.