This was posted to CompuServe on Aug 20th, 1997, in response to the following comment:
“The only computer company offering a GUI was, at one point, Apple, and it's pretty clear that Windows competed by borrowing it. I didn't say anything about who created it, although according to the Steven Levy book the PARC prototype was much transformed and improved by Apple, while Windows has yet to improve on the MacOS.”
At the same time as the Mac came out, there were GUIs on UNIX workstations from Sun, Apollo, and SGI. More importantly, there was VisiCorp, which was offering VisiOn, the acknowledged inspiration for Windows. Development of Windows & MacOS started at pretty much the same time, independently. The Mac came to market first because Apple was designing the hardware & software together, to complement each other; Microsoft had to retrofit Windows onto IBM's lamebrain hardware design. Many of the changes between the Mac and the PARC system came from recommendations by Microsoft programmers, who were working with Apple, writing applications for the Mac while the Apple programmers were writing the operating system.
And the sentiment that "Windows has yet to improve on the MacOS" is a common one --- among Mac users --- but is hardly justified. An operating system is much more than a user interface, and in the basic OS architecture (memory management, process control, device management), Windows wins (and Windows NT blows them both away). This is why Apple has been "borrowing" things from Windows, like DDE (circa 1988), which begat Publish & Subscribe (circa 1991), and OLE (circa 1992), which begat OpenDoc (circa 1994, RIP 1996).
Even on the user interface, there is a battle of dueling user studies: Apple has studies showing MacOS is better, while Microsoft has studies showing Windows is better. So what's the difference? Apple stacks the deck more. Apple surveys people who've used both -- which sounds fair, but almost guarantees that most of them have been using a Mac longer -- and asks them which they feel more comfortable with. Naturally, they choose Macs. Microsoft, on the other hand, finds people who haven't used either, and actually times them doing things, to find out which way of doing things is really best. Apple used to do things like that too, back when that department still had a staff, several rounds of layoffs ago.
And, as Time reports in this week's issue, in 1988 Gates sent a letter to Apple: "Apple should license Macintosh technology to 3-5 significant manufacturers for the development of Mac compatibles...Microsoft is very willing to help Apple implement this strategy," but Apple management was too pigheaded to do what needed to be done. Two years later, every industry analyst was saying the same thing. Seven years later, Apple finally came to its senses, and as the recent deal shows, Microsoft is still "very willing to help Apple".
I find it rather funny that the "common wisdom" is that Bill Gates wants to control everything while Jobs is looking out for the people, when the facts clearly demonstrate that the reverse is true. Both Jobs & Gates learned of graphical interfaces at the same time, and both saw them as the future of computing -- if their shared vision of personal computers everywhere was to be achieved, then those computers would have to be running a GUI. Gates didn't care who provided them, as long as he got a cut.
To have GUIs everywhere, Gates needed to accomplish two things: a) to promote Macs, and b) to kill the IBM command-line interface. The trick with b) was that he knew he couldn't kill the IBM hardware standard, which was already deeply entrenched. Hence he had to promote a GUI on the IBM platform, and since no one else was writing one, Microsoft did.
Jobs, on the other hand, continued his "all things emanate from Apple" stance, with a closed proprietary hardware design, a closed proprietary software design, and his rather childish "Kill Windows" plan. This "competition" between Apple & Microsoft existed only in the mind of Jobs, a notion he preached until it finally took root in the rest of Apple management, and eventually in the Apple user community.
Had Apple listened to Gates and opened the design, Apple would have been the #1 hardware manufacturer, with MacOS having about 50% of the market --- until about 1990, I'd say, when Windows & MacOS would have essentially merged, forming a system which combined the best of both and was compatible across all platforms. The personal computer world would have reached the state of "One World, One Operating System", and both companies would be working to improve it, instead of wasting their time duplicating each other's work.
A while back I read someone describing "mature" technologies -- one critical test of when a technology becomes "mature" is when its media can be exchanged without having to ask "What format is it in?" Phonographs matured in 1950, when RCA started making turntables which revolved at 33 1/3 RPM (to play the records marketed by Columbia) as well as 45 RPM (to play the records RCA marketed). Video tape matured when Betamax died. CDs came out of the chute "mature". But, mainly due to Steve Jobs's ego, personal computing hasn't been able to mature.... He remains probably the biggest single roadblock to advancement in this area.
Note -- as an aside, the comment given above which prompted this essay was posted on CompuServe by a writer of some notoriety, who knows much about movies, but not nearly as much about computers as he thinks he does...
Copyright © 1997 James M. Curran.
All rights reserved.
Revised: 24-Mar-2014