iPad: After the dust settles
We are now a month past the iPad launch, and where are we? Some news sites are postulating uses for the iPad in the enterprise world. I can’t even fathom a use for it at home, much less in the hands of corporate users. Let’s examine this platform more carefully to understand why it lacks viability in the enterprise, let alone for simple home use.
Multitasking
I know a lot of media outlets have harped endlessly on the lack of multitasking. Granted, it is a computing device, and most computing devices do require some level of multitasking. The iPod and the iPhone had very little internal memory (at least up through the 3rd generation), so they could only comfortably run one app. That was primarily a memory limitation. With the latest iPhone and iPod touch, that may have all changed. To be perfectly honest, I don’t keep up with Apple’s hardware spec details. Why? Because that’s the point: I shouldn’t have to. Apple has always touted that their devices ‘just work’. So, why should I need to dig inside one of their devices to find out the gritty details?
This issue is also what bites Apple many times over. They underdesign the device by skimping on resources, so they have to make concessions in the operating system by removing things like multitasking. Of course, with the iPod touch and the iPhone, we’re talking about very small devices that may not have the physical room for the resources necessary to support multitasking.
The iPad’s physical size should no longer hinder Apple’s ability to include the resources necessary to fully support multitasking (and then some). So, there is no reason why the iPad shouldn’t have supported multitasking out of the gate. But, it doesn’t. iPhone OS 4.0 is supposed to address this issue, but not until the fall of 2010.
Multitasking and how it relates to computing
So, is it required that a PC support multitasking? Good question, and one not easily answered. In general, though, the answer should be ‘yes’, because the ability to run multiple apps is necessary to get things done: copying and pasting between two different documents, for example, or sharing information through application conduits. Even the simple act of embedding one app inside another requires multitasking. Quitting and restarting each app to move between them is cumbersome and time-wasting.
In the end, yes, multitasking is required to make the computing experience what we’ve come to expect. On the iPad, that experience isn’t there. So, for this reason, the iPad won’t be fully accepted without multitasking. We’ll revisit this topic, however, once multitasking is (re)introduced in iPhone OS 4.0.
Enterprise computing and the iPad
Is it ready for the enterprise? I would personally say no. I’ve owned an iPod touch for several years, and since the iPad really has no better selection of enterprise apps than the iPod touch, the answer to this question is still no. Even though the iPad’s screen is larger, it doesn’t offer the productivity apps necessary to fully work in a corporate enterprise. Yes, it does have a mail app, and that’s a big part of what makes it work in the enterprise. Unfortunately, that mail app is so locked down and limited that it may not fully work for enterprise use. That’s not to say the mail app isn’t good in a pinch, or for reading a quick current email or two. But don’t try searching for emails buried in your folders; that just doesn’t work well.
For enterprise computing, the current incarnation of the iPad is nowhere near ready.
What uses does the iPad offer?
Good question, once again. It isn’t a media PC, so that’s out. It isn’t good for enterprise-level computing; that’s out. You can watch movies and read books on it, so coffee-table literature, OK. Ignoring the touch screen and sleek design (which are just amenities, after all), it has to come down to the apps and features. The apps are limited at this point, and I don’t really see much of that changing due to Apple’s app-lockout situation.
Until Apple opens the platform up for general development, it will continue to be tightly controlled, which limits the applications that are available. Until this situation is resolved, this device may never end up being anything more than a novelty.
HP’s Slate cancelled
Looking back at history shows us that the tablet has had an extremely rocky past that always leads to failure. I’m not sure that even Apple can overcome that history, even with the success of the iPhone. The iPad is really too big and clunky to be truly portable, and it’s too closed to allow open development. So, it’s no surprise that HP and other companies who had previously announced their intent to release a tablet are now reconsidering. In fact, HP announced the Slate on the heels of Apple’s iPad announcement and has now cancelled it completely before it was even released.
Some people blame the ‘success’ of the iPad. Well, success is very much subjective. Putting 500,000 iPads into circulation is nothing to sneeze at. But that doesn’t necessarily indicate success. The Newton comes to mind here: it was a hot new item that all but died in about two years. Where is the Newton now? Apple has a history of deleting products rapidly, and the iPad may be one of those items.
Apple’s past failures
If you really want an iPad, then get it now while it’s hot. Don’t wait. The reason I say this is that in 6-12 months, you could find yourself with a doorstop that Apple no longer supports. Apple has a history of killing off failing devices rapidly. If you wait until the spring of 2011, you may find that the device is dead and buried: a flat doorstop that iTunes no longer supports, with no active development. I can very easily see this device becoming one of Apple’s most recent failures.
Working while traveling
The tablet format has a questionable past anyway. The form factor isn’t pleasing to use: it isn’t easy to carry and, getting past the touch screen, it’s cumbersome to enter text into. So, it’s going to need a dock with a keyboard and mouse. A real keyboard and a real mouse. But that takes the portability out of it. If you’re sitting on a plane, you’re locked into using the touch surface. Now consider that you can’t lay the device flat and work easily. I mean, you can lay it down, but then it’s not at the correct viewing angle. To get it to the correct viewing angle, you have to hold it in one hand, balance it in your lap in a contorted way, or carry along a kickstand when you’re on a plane.
In this instance, a notebook, iPhone, iPod touch or netbook works much better. With a netbook, the top pops open to the correct viewing angle and you have a real keyboard and mouse available. Granted, it’s not a touch surface, but that’s just a novelty anyway. Once the novelty of touch wears off, you have to determine how to make use of this input method in an actually usable way.
Using an iPod touch or iPhone is also easy. It fits easily in one hand, is light in weight and works without the need for kickstands, contorted positions or clumsy balancing. Clearly, not a lot of usability was discussed when the iPad was designed. Usability is one of the things Apple usually prides itself on in its designs. With the iPad, usability was clearly an afterthought.
3G iPad
This is the one and only saving grace of the iPad. Internet everywhere is where we need to be. The supposed $29.95 monthly plan associated with the 3G version of the iPad is a reasonable price. Unfortunately, the 3G iPad itself is marked up by an additional $130. So, instead of $499, it’s $629 (and that’s the smallest iPad). But the iPad with 3G is the only version I would personally consider. I already have an iPod touch, and its uses are extremely limited unless you have WiFi handy (which isn’t very often). And even when you do find a place that claims to have WiFi, 8 times out of 10 the connectivity is either slow or limited. So much for free WiFi. While 3G isn’t that fast, it’s at least always on pretty much anywhere you need it.
Form factor is the key to success
The problem with the current iPad is its size. This is the wrong form factor to release: the wrong size and the wrong weight. The iPad should have been about the size of a paperback book, bigger than an iPod touch but small enough to fit in a pocket. That would take computing to the truly portable level. The screen would be bigger than a Sony PSP’s (which is a decent size for watching movies), but small enough to still be portable. Combine that with 3G and you have a device that people will want to use. The iPad is not that portable and still requires a case with handles; after all, you don’t really want to drop a $600+ device. But a device the size of a paperback book at a cost of maybe $399? That’s a price range that could work.
First Gen iPad
Remember that this is the first-generation iPad. You really have to wait until the third generation of an Apple device to get the features that make it worthwhile. The question remains: will the iPad even make it to a third incarnation? Only time will tell. Apple won’t abandon the iPhone OS on devices for quite some time, but the form factor of the iPad is likely to change several times before it’s over. Like I said, if you want this thing, buy into it now. Otherwise, if you wait a year, you may not be able to get this form factor again and you may find that Apple has backtracked into smaller, more portable devices.
Clickity Click – The Microsoft Dilemma
Once upon a time, the mouse didn’t exist. So, the keyboard drove the interface. Later, Xerox came along and changed all of that (with the help of Steve Jobs and Apple). Of course, as it always does, Microsoft absconded with the mouse functionality and built that into Windows… not that there was really much choice with this decision.
Flash forward a decade or so and we’re at Windows XP, a reasonably streamlined Windows operating system from Microsoft. In fact, this is arguably the most streamlined that Microsoft’s Windows has ever been (and will likely ever be). Granted, security was a bit weak, but the user interface experience was about as good as it gets. With only a few clicks you could get to just about anything you needed.
Flash forward nearly another decade to the release of the dog that was Windows Vista. Actually, Windows Vista’s look was not too bad, but that’s pretty much where it ends. Microsoft must not have done much usability testing with Vista, because what used to take one or two clicks of the mouse now takes 1-3 extra clicks. The reason: Microsoft decided to open useless windows as launchpads to get to underlying components. Added layers that are pointless and unnecessary. For example, you used to be able to right-click ‘My Network Places’, select Properties and get right to the LAN adapters to set them up. No more. Now that same Properties action opens a launchpad interface that requires clicking ‘Change Adapter Settings’ just to get to the adapters. Pointless. Why was this added layer necessary? And this is the best of the worst.
More than this, though, the labels on the links to the underlying components are sometimes obscure or misleading, so you’re not really sure which link to click to get to the thing you need. That means you end up clicking several things just to find it. Yes, you can use the help to find things, but that means opening even more windows and clicking through even more time-wasting steps just to locate something that should have been one click away anyway.
Server Operating Systems
This issue is not limited to the desktop OS world. In the server world, such as Windows Server 2008 R2, these launchpads are now excessive and in-your-face. For example, when you first install Windows 2008 R2, two of these panels open as the first thing after you log in. So, I’m already starting out having to close two windows that I didn’t even need to see at that point, just to get to the desktop. Likely, if you’re installing a server operating system, you’re planning on hooking it into a domain controller, so setting up anything on the local administrative user is pointless. That means I have to close out of these useless panels in order to get to the panel where I can join this machine to the domain. It would have been far more helpful for the first thing to open to be the join-the-domain panel. I don’t need to set up anything else on a newly installed machine until it’s in the domain.
Desktop Systems
Most people are much more familiar with the desktop operating systems than the server versions. But these added clicks are found throughout not only Vista, but now Windows 7. Because Windows 7 is effectively a refresh of Vista with added compatibility features, these extra clicks are still there and still annoying. Why Microsoft had to take a streamlined interface and make it less efficient for users, I’ll never know. But these added clicks to get to standard operating system tools are a waste of time and productivity. They also impose a higher learning curve, since people have to be taught the new method.
“If it ain’t broke, don’t fix it”
This motto needs to be ingrained into the engineering team at Microsoft, because they clearly do not understand it. Adding extra layers of windows does not make the OS more efficient. It makes it bloated, cumbersome and extremely inefficient. That extra click might only take an extra second, but those seconds add up when you’re doing a repetitive task that involves dealing with those windows as part of your job.
As another example, opening the default view of Control Panel in XP shows the control panels themselves. In Vista and Windows 7, it now brings up a launchpad of abstract tasks, like ‘System and Security’ and ‘User accounts and family safety’. Clicking these leads to more sub-tasks. So, instead of just showing the actual control panels, you have to click through a series of abstract task pages that ultimately lead you to a tool. No no no. Pointless and inefficient. Let’s go back to the actual control panel view. I want to see the actual control panels. The abstract task idea is great for beginners, but advanced users want to turn this crap off. It’s pointless, slow and unnecessary. Power users do not need this.
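For what it’s worth, there is one commonly cited workaround on Vista and Windows 7: create a folder whose name ends in Windows’ ‘All Tasks’ shell-folder GUID, and Explorer renders it as a single flat list of every control panel task, skipping the category pages entirely. The ‘GodMode’ prefix is arbitrary (any name works), and the trick only takes effect when the folder is opened in Windows Explorer; on any other system it’s just an ordinary folder.

```shell
# Create the special folder; Windows Explorer interprets the GUID suffix
# as the "All Tasks" shell folder and lists every control panel item flat.
mkdir "GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}"
```

Double-clicking the resulting folder on the desktop then gives the power user exactly what this section asks for: every setting, one click deep.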
Windows for beginners
Microsoft Bob is dead. Let’s not go there again. So, why is it that Microsoft insists on trying to spoon-feed us an interface this dumbed down and with such excessively clicky features? This interface would have been a great first step in 1990, but not today. Taking this step today is a step backward in OS design. Anyone who has used Windows for more than 6 months doesn’t need these added inconveniences and inefficiencies. In fact, most average computer users don’t need this level of basics. Only the newest beginners need this level of spoon-feeding.
Microsoft needs to create a new version (or, alternatively, a preference to turn ‘newbie mode’ off). Simply put, they need Windows for the Power User: a stripped-down design that gets back to basics, eliminates these cumbersome beginner features and brings back the single-click features that let us navigate the operating system fast and use the mouse efficiently (read: as few clicks as possible). Obviously, running ANY server edition should automatically imply the power-user interface. Let’s get rid of these helper panels on startup, mkay?
Microsoft, can we make this a priority for Windows 8?
What is it about tablets?
OK, I’m stumped. I’ve tried to understand this manufacturing trend, but I simply can’t. We have to be heading towards the fourth or maybe fifth generation of tablet PCs, yet each time manufacturers bring tablets back to the market, this technology fails miserably. Perhaps it’s the timing, but I don’t think so. I think the market has spoken time and time again. So, what is it about this technology that makes manufacturers try and try again to foist these lead balloons onto us about every 6 years?
Wayback machine
It was in the early 90’s that Grid Computers arguably released the first tablet (or at least, one of the very first tablets). Granted, it used a monochrome plasma screen and, as I recall, it ran DOS and Windows 3.1, but these things flopped badly for many different reasons. Ultimately, the market spoke and no one wanted them. It’s no wonder why, either: the lack of a keyboard, combined with the size and weight of the unit, the need for a pen and the lack of a truly viable input method, doomed this device to the halls of flopdom. Into obscurity this device went, along with Grid Computers (the company).
In the early 2000s, Microsoft and PC manufacturers tried again to resurrect this computer format with XP Tablet edition. This time they made the devices more like notebooks, where the screen could detach from a keyboard and become a tablet. So, when it was attached, it looked and felt like a notebook; when detached, it was a tablet. Again, there was no viable input method without the keyboard, even though the screens were touch-sensitive. The handwriting recognition was poor at best, and if a unit had voice input, it failed to work. XP Tablet edition was not enough to make the tablet succeed. Yet again, the tablet rolled into obscurity… mostly. You can still buy tablets, but they aren’t that easy to find, few manufacturers make them, and they ship with hefty price tags.
Origami
Then, in the mid-2000s, came Microsoft with Origami. Origami was supposed to be a compact platform, like Windows CE (although CE would have worked just fine for this; I don’t know why Origami really came about). A few tablets came out under the Origami banner, but most computers that carried it used the microPC format. Since Origami devices ran a full version of Windows (unlike CE), they were a lot more powerful than computers of that size really needed, and the price tags showed it. Sony and a few other manufacturers made these microPCs, but they sold at expensive prices (like $1999 or more) for a computer the size of a PDA. Again, no viable input method could suffice on the microPC tablets, and so these died yet another death… although the microPC hung around a bit longer than the tablet. You might even still be able to buy one in 2010, if you look hard enough.
Netbook
Then came the netbook: the $199-299 scaled-down notebook built around the Atom processor. This format took off dramatically and has been a resounding success. The reason: price. Who wouldn’t want a full-fledged portable computer for $199-299? You can barely buy an iPod or even a cell phone, let alone a desktop PC, for that price. The netbook price point is the perfect price point for a low-end notebook computer. But what does a netbook have to do with a tablet? Nothing, but it’s here to illustrate why tablets will continue to fail.
Tablet resurrection
Once again, we are in the middle of yet another possible tablet resurrection attempt. Rumor has it that Apple will release a tablet. HP is also pushing yet another tablet loaded with Windows. Yet from past failures, we already know this format is dead on arrival. What can Apple possibly bring to the tablet format that Microsoft and PCs haven’t? Nothing. That’s the problem. The only possible selling point for a tablet is price alone. Tablets have to get down to the $199-299 price tag to have any hope of gaining popularity. Yet Apple is not known for making budget computers, so we know that price point is out. Assuming Apple does release a tablet, it will likely price it somewhere between $899 and $1599. Likely, it will offer 3 different versions, with the lowest starting at $899. Worse, at the lowest price point it will be hobbled, lacking most of the bells and whistles.
Even if Apple loads up the tablet with all of the bells and whistles (i.e., Bluetooth, 3G, GSM, an OLED display, iTunes app capability, handwriting recognition, voice recognition, WiFi, wireless USB, a sleek case design, etc.), the only thing those bells and whistles will do is raise the cost to produce the unit. The basic problems with a tablet are portability (too big), the lack of a viable input device, weight and fragility (not to mention battery life). Adding a hefty price tag on top ensures that people won’t buy it. Of course, the Apple fanboys will buy anything branded with a half-bitten Apple logo. But the general masses? No. This device cannot hope to succeed on Apple fanboy income alone.
Compelling Reasons
Apple has to provide some kind of paradigm-shifting technology to make such a failure of a device as the tablet become successful (whatever Apple cleverly names its tablet device). If the tablet is over 7 inches in size, it will be too large to be portable. Utilizing OLED technology ensures the cost is extremely high. Putting a thin case on it like the MacBook Air’s ensures that it’s overly fragile. We’ve yet to find out the battery life expectancy. So far, this is not a winning combination.
So, what kind of technology would make such a paradigm shift? The only one I can think of would have to be a new input technology: a way to get commands into the device and drive the interface easily. Clearly, a multi-touch screen will help. The iPod touch is good in that regard (except that you can’t use it with gloves). But if you want to write email, how do you do that on a tablet? Do you hand-peck the letters on that silly on-screen thing that Apple calls a keyboard? No. That’s not enough. Apple needs a fully phonetic speech input technology that’s 100% flawless without any training. That means you speak the email in and it converts it perfectly to text, and you speak any conversational command and the computer figures out what you mean flawlessly. This is the only technology that makes any sense on a tablet. Of course, it will need to support multiple languages (a tall order), it needs to be flawless and perfect (an extremely tall order) and it will need to work in a noisy room (not likely).
Can Apple make such a shift? I don’t know. The hardware technology is there to support such a system. The issue is, is the software ready? Well, let’s hope Apple thinks so. Otherwise, if Apple releases its rumored tablet without such a paradigm shift, it could be the worst stumble Apple has made since the Lisa.
Windows 7: Vista Rehashed — Missing the Mark
While the initial response to Windows 7 seems to have been positive among beta users, I have personally found it really no better than Windows Vista. In fact, most of the touted improvements really aren’t there. Here is a basic review of Windows 7 as compared to Vista.
Not much improved
Windows 7 has not really improved enough over Windows Vista. It’s no wonder Microsoft was able to shove this one out the door so rapidly. Effectively, Microsoft gave Vista a slight UI facelift, added a couple of tweaks here and there and then pushed the product to the shelves. In fact, I’m really wondering why it took as long as it did with so little improvement. The same issues that exist in Vista still exist in Windows 7: namely, limited driver support, application compatibility problems and enhanced security that gets in the way. I’ll discuss these issues below.
Driver Compatibility
When Vista was released, one of the main issues was driver support. The issue is exactly the same with Windows 7. For example, I have a Dell Studio XPS system running Vista 64 Home Premium edition. It’s running 64-bit because I have 12GB of memory, which won’t work on 32-bit Vista (or Win 7). Dell has had months to ready drivers for this brand-new system (purchased July 2009), yet Dell does not offer any Windows 7 drivers on their support site for this hardware. Yes, they did supply an upgrade disc, but that’s about it. Dell expects you to accept the drivers that come with Windows 7 rather than obtaining proper, updated drivers. Worse, Windows 7 driver support in general is still very bare. I wouldn’t expect to see full driver support for Win 7 until at least this time in 2010 (possibly longer, depending on adoption rate).
Note that 64-bit Windows requires 64-bit drivers. Windows 7 cannot load or use 32-bit drivers under the 64-bit edition. So, if you need to use 32-bit drivers, you must use the 32-bit version; of course, that limits you to 4GB of memory. If you have an older printer whose drivers do not support the 64-bit edition, you will have to hope that Windows 7 ships with a driver, or be prepared to throw the printer out and buy something new. The same goes for devices like D-Link’s Skype phone adapter.
You may be able to get around some of these issues by using Sun’s VirtualBox or Microsoft’s Virtual PC and loading 32-bit XP in a virtual environment. Note, however, that not all devices offer passthrough to the virtual machine, so you may not be able to run those older devices requiring 32-bit drivers. You may also be able to get this working under Win 7 Ultimate’s XP Mode.
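As a rough sketch of the VirtualBox route using its VBoxManage command line: the VM name and memory size below are arbitrary choices of mine, and you still need your own 32-bit XP install media and license. The script is guarded so it only runs the commands if VirtualBox is actually installed.

```shell
# Sketch: register a 32-bit XP guest via VirtualBox's CLI.
# "XP-32bit" and 512MB are placeholder choices, not requirements.
if command -v VBoxManage >/dev/null 2>&1; then
  VBoxManage createvm --name "XP-32bit" --ostype WindowsXP --register
  # Enable USB so the guest can try passthrough to older 32-bit-only devices.
  VBoxManage modifyvm "XP-32bit" --memory 512 --usb on
  status="vm-created"
else
  status="virtualbox-not-installed"
fi
echo "$status"
```

Whether a given legacy device then works depends entirely on whether VirtualBox can pass it through to the guest, which is exactly the caveat above.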
Driver support rating: 1.5 out of 5 stars: poor
Application Compatibility
As with Vista, Windows 7 still fails in this area. Frankly, because Windows 7 is effectively Vista with a facelift, all of the same compatibility problems still exist in Windows 7. So, in many cases, don’t expect your old XP apps to run properly under Windows 7. This is especially true of apps tied to hardware devices that require drivers.
Worse, I have some 3D apps that work fine on Vista, but do not work at all under Windows 7. This indicates to me that Microsoft has further broken application compatibility between Vista and Windows 7. So, be prepared to lose some apps that may have worked under Vista.
Rating: 3.5 out of 5 stars: fair
Enhanced Security – User Account Control (UAC)
Securing your operating system and data is understandably a big priority. But any level of security has to straddle a fine line between securing the system and not getting in the way of using it. Frankly, UAC is a complete and utter failure. This system is so in-your-face about security that it is a turn-off. Combine this with its constant, verbose ‘Are you really sure?’ messaging, and people will soon ignore the messages just to get their work done. Basically, this system is ‘The Boy Who Cried Wolf’: if you make every alert important and nothing ever happens, people stop listening. Will UAC stop a system from being infected? Probably not. People will still run apps they shouldn’t run.
Beyond UAC, Windows 7 changed nothing over Vista; Windows 7’s UAC appears identical to Vista’s for all intents and purposes. Frankly, it’s still so much of a hassle that I still turn it off.
Rating: 3.0 out of 5 stars: in-the-way
Other problems
Other than the above, not much else has changed. All of the main usability problems introduced in Vista are still in Windows 7. For example, when you open file requesters, they tend to default to large icons. I prefer ALL of my file lists (whether a file requester or a Windows Explorer window) to be in list format with the columns Name, Size and Date Modified. Both Vista and Windows 7 default to Name, Tags, Rating and Date, and sometimes even add Date Taken. I have no intention of rating or tagging every file on my filesystem. For files in a photos folder or a music folder, yes. Definitely not every file on the filesystem, so these columns are completely inappropriate for 98% of it. Yet the headers are there each time a new file requester opens. Why?
When you’re constantly having to change the columns to show the data you need, that’s very inefficient and wasteful. Let me set it once and forget it. No, can’t do that. I have wasted a ton of time rearranging these windows each and every time I open a new file requester. Please, Microsoft, figure out a way to let us save our favorite columns and make it actually STICK.
In Windows Explorer, this USED to work in XP. In Vista, and now it appears Win 7 as well, you can set up your preferred folder view, go into the options and choose ‘Make all folders like this one’. That works for a while. However, the folders eventually and inexplicably revert back to their old column headers without any warning. So, changing this setting and saving it doesn’t work. Again, it’s another inefficient use of my time.
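One explanation I’ve seen offered for the reverting views (an assumption on my part, not something Microsoft advertises): Explorer caches per-folder view settings in the registry ‘Bags’ keys, and that cache reportedly holds only about 5,000 folders by default, so older saved views get silently evicted. The commonly suggested workaround is to raise the ‘BagMRU Size’ value; the 20000 below is an arbitrary larger number. The script is guarded so the registry write is only attempted where reg.exe exists (i.e., on Windows).

```shell
# Reported workaround: raise Explorer's saved-view cache limit so folder
# view settings stop being evicted. Value name per the common advice.
KEY='HKCU\Software\Classes\Local Settings\Software\Microsoft\Windows\Shell'
if command -v reg.exe >/dev/null 2>&1; then
  reg.exe add "$KEY" /v 'BagMRU Size' /t REG_DWORD /d 20000 /f
  status="applied"
else
  status="not-on-windows"   # reg.exe only exists on Windows
fi
echo "$status"
```

Log out and back in (or restart Explorer) for the change to take effect; it doesn’t fix views already lost, only future evictions.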
Time Wasters
On top of the above inefficiencies, Microsoft has decided to bury many system settings up to three layers deep. For example, you used to be able to right-click ‘My Network Places’ and get right to the settings for the network adapters. Now, doing this takes you to a new UI that requires clicking one or two additional links to get to the configuration panel. In some cases, they’ve split features out into multiple separate windows to do the same thing that one panel used to do in XP. So you not only have to dig through multiple places, you now have to dig through multiple panels.
Windows 7 should have been redesigned in a major way. Instead, we get a rehash of Vista. The learning curve is still there, and nothing has been done to increase user efficiency in the UI. Overall, I’d give Windows 7 3 stars out of 5. Microsoft has a lot of work to do to get Windows 7 even close to the efficiency level of XP. They also need to address the lack of drivers, driver compatibility and application compatibility issues. Eventually, those won’t be issues once developers redesign their apps to work with Windows 7, but there are still lots of legacy apps that do not work.
Should you buy Windows 7?
That’s really the question of the year. If you are buying a new machine that comes with Windows 7 loaded, go for it. If you are running Windows XP, you might want to think twice. Windows 7 does not solve the XP compatibility problems, so if you’re looking at upgrading an existing system, I would recommend against it. You will find that many of your apps may no longer work. Note that you can’t directly upgrade XP to Windows 7 anyway: Windows 7 will move your old Windows installation to Windows.old and then install a fresh copy of Windows 7. This means you will need to find all of your app discs and reinstall (assuming the apps are Windows 7 compatible). So, this is a real pain.
I would recommend that you buy a new hard drive, place it into your XP machine and install onto the new drive. Then set it up to dual boot, so you can boot into Windows 7 or XP depending on what you need. Dual booting is a hassle, but at least it retains your apps. You can even create a virtual environment out of your XP hard drive and run it under VirtualBox or Virtual PC in Windows 7. So, you might want to consider a virtual environment for your XP system for compatibility (assuming you aren’t running games). Virtual environments work great for Windows desktop apps; games, on the other hand, don’t always work that well, so be careful, as they may not run in a virtual environment.
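As a sketch of the drive-to-VM route using VirtualBox: image the old XP drive raw, then convert the image to a VirtualBox disk. The device path /dev/sdb and the file names are placeholders for your actual XP drive (you’d run this from a Linux live CD or similar, with VirtualBox installed), so the script checks its prerequisites before touching anything.

```shell
# Sketch: turn a physical XP disk into a VirtualBox disk image.
# /dev/sdb and the file names are placeholders, not real paths on your system.
if command -v VBoxManage >/dev/null 2>&1 && [ -b /dev/sdb ]; then
  dd if=/dev/sdb of=xp-drive.img bs=1M          # raw image of the XP disk
  VBoxManage convertfromraw xp-drive.img xp-drive.vdi --format VDI
  status="converted"
else
  status="prerequisites-missing"
fi
echo "$status"
```

Attach the resulting .vdi to a new XP virtual machine. Be aware that XP will likely see the virtual hardware as a new machine and demand re-activation, and, as noted above, games inside the VM are a gamble.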
In answer to this question: only upgrade to Windows 7 from Vista. Do not upgrade XP to Windows 7; it’s a waste. Instead, buy a new hard drive and install Windows 7 fresh. Then, copy over the files from your XP hard drive that are important to you. Considering the age of XP, you probably need to buy a new hard drive anyway, strictly based on the drive’s age. Hard drives are only rated to last about 5 years reliably, and XP was released well over 5 years ago.
[Update 2/11/2010] After upgrading several systems, I highly recommend against upgrading from Vista to Windows 7 using the upgrade process. The reason: while it appears to work, you may find the system somewhat strange during use. Some things won’t install and work properly. Basically, the system just doesn’t always work 100% after an upgrade. It seems that Windows 7 retains too many Vista files and settings and leaves the system in a slightly unstable state. A state that no amount of repair can fix. If you have Vista and want to upgrade, don’t. Instead, install a fresh copy of Windows 7 and reinstall all of your apps. Windows 7 doesn’t have to format your hard drive, so you won’t lose your data. However, you will need to find it all again after installing Windows 7 fresh. So, if you aren’t familiar with reattaching existing data to newly installed apps, you may need to enlist the help of the Geek Squad or someone who knows what they’re doing.
Good luck.
Why Serial ATA will ultimately fail
Serial ATA (SATA) is the replacement for Parallel ATA (PATA) hard drives in computers. Serial ATA offers faster speeds, yes, but it is still immensely inconvenient in the Windows world (and probably with Linux and Mac as well).
Problematic design / brittle plastic
First, the most noticeable difference between a PATA drive and a SATA drive is the connectors. Gone are the big multipin data connector and the 4-pin power connector. Instead, we now have multipin power and data connectors with a slim/thin form factor. At first glance, you might think this is a cool-looking replacement connector. Well, I'm here to tell you it's not. The plastic used to hold the flat pins in place is weak and brittle. If you're not extremely careful with how the drive fits into place, you're likely to break one or both of the connectors off. Once that happens, the drive is toast.
In the 18 years I've been a systems administrator, I've changed many a hard drive and never once broken an IDE drive's data connector. I've torn a few cables and I've bent a few pins, but that's nothing that can't be corrected easily, leaving the drive fully functional. With the brittle plastic SATA connectors on the drive itself, it's extremely easy to break them off. For this poor design choice alone, SATA manufacturers must eventually redesign this connector or acceptance of these drives will suffer.
Out with the old, in with the new
Hard drive and motherboard manufacturers have been steadily pushing EIDE (IDE) out the door in favor of SATA drives. That would be great if everyone were on board at the same time. Unfortunately, Microsoft still isn't on board with this changeover. There are still limited native SATA drivers even in Windows Server 2008 (which is an offshoot of Vista). This means you must still load drivers for certain popular SATA controllers. For example, one of the most common controllers used on motherboards is the Silicon Image SI3114, yet you must still load a driver before Windows will recognize a drive connected to it and install. If you forgot the driver or don't realize you need it, you'll easily spend 30 minutes chasing it down from your controller or motherboard manufacturer.
I realize the hard drive and motherboard manufacturers are trying to effect change, but you can't do it when Microsoft still isn't on board. I guess these businesses haven't really figured this out yet.
Road to failure
I don't mean hard drive failure, either. I mean failure of the standard to be accepted in the long term. Because of poor design choices and because Microsoft hasn't embedded the most common SATA drivers into Windows installation media, SATA drives are likely to eventually fail to become the de facto data storage device of choice. Connectors on the back of drives need to be rugged (or at least more rugged than the brittle plastic currently in use). The connectors could have been both bigger and more thoughtfully designed than what is on the back of SATA drives. For hot-pluggable configs, these connectors seem to work reasonably well, but they are still not perfect (you have to play with the alignment to ensure proper connectivity, hoping you don't break parts off). The SCA connector was a much better standard as far as hot-plug standards go: one single connector, big enough to be functional, easy to hotplug and rugged enough to keep parts from breaking off.
SATA drive manufacturers need to work on a design spec for better, more rugged connectors on the back of SATA drives. Motherboard manufacturers need to ensure their SATA controllers have built-in drivers in Windows installation media so no specialty setups are necessary. Without these two steps, SATA drives will eventually fail to gain the acceptance and momentum to keep these products going. Manufacturers seem to think there is no other choice for data storage in the computer; when you think of hard drives, ATA drives are the first that come to mind. But we are fast approaching solid state technologies. Solid state storage doesn't need the bulky space of a hard drive chassis, the spinning noise or the eventual mechanical failure. With solid state drives, instead of 1U machines, we may even begin seeing 1/2U machines or less.
Fix it or fail
Hard drive manufacturers need to rethink SATA. They need to design both a better connector and faster data rates. 3Gbps is reasonably fast, but we need to be at about 10Gbps before vast improvements in transfer rates are actually noticed at the storage level.
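To put those link rates in perspective, here's a back-of-envelope sketch. SATA uses 8b/10b encoding, so 10 bits travel on the wire for every data byte; the numbers below are theoretical link peaks, not real-world drive speeds:

```python
# Rough theoretical throughput for a SATA link. Real-world drive
# transfer rates are far lower than the link's peak.
def sata_throughput_mb_s(line_rate_gbps):
    # 8b/10b encoding: 10 line bits per data byte, then bytes -> MB.
    return line_rate_gbps * 1e9 / 10 / 1e6

print(sata_throughput_mb_s(3))   # 300.0 MB/s theoretical peak
print(sata_throughput_mb_s(10))  # 1000.0 MB/s for a 10Gbps link
```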
Without the necessary driver support (which, by now, the SATA world should have had), it doesn't make sense for hard drive manufacturers to push IDE out the door. There are still far too many times when IDE devices are necessary to get a system to a workable state. Motherboard manufacturers need to be doubly careful. SATA-only motherboards lead to challenges during installation of Windows due to the lack of drivers. These installation challenges can lead to frustration and, eventually, a return of the motherboard to the store.
For all of these reasons, the SATA specification and design need to be rethought. The brittle plastic connectors are nowhere near rugged enough and need to be made much sturdier. The lack of driver support makes installation and repairs extremely frustrating. Chasing down SATA drivers to place on floppy disks can be a challenge even for the most knowledgeable.
For now, this is the state of SATA. It's a promising standard, but it has become a problem because the hard drive industry is pushing for change far too rapidly without adequately testing the drive's design. For anyone reading who may work on SATA design or manufacturing, please feel free to take this to your bosses for review.
The continuing failure of Digital Rights Management (DRM)
When are companies going to learn their lessons about using Digital Rights Management (DRM) to protect content? It seems that just as one company learns a lesson another also has to go through these same pains.
What is DRM?
DRM is simply software designed to limit or prevent unauthorized use of a piece of software or content (such as music, videos, video games or productivity software). DRM software usually runs on start-up of the software or content to be protected, to determine whether the user running it has a legitimate license.
How does it work?
DRM comes in many forms and there are many different implementations. The most obvious example is Microsoft's activation server system, which requires that Windows 'check in' with Microsoft's servers to determine whether the copy of Windows you are running is 'Genuine' or 'Counterfeit'. I'm not sure exactly how they determine which is which, but apparently they have some methodology.
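The check-in idea can be sketched in a few lines of Python. To be clear, this is emphatically NOT Microsoft's actual algorithm (which isn't public); the key database, hash scheme and names here are all invented, just to show the general shape of an activation-style check:

```python
import hashlib

# Toy activation sketch: the client sends a product key plus a machine
# fingerprint; the "server" checks the key was issued and the token
# matches. Everything here (keys, hash choice) is a made-up example.
ISSUED_KEYS = {"ABCDE-12345"}  # hypothetical issued-key database

def activation_token(product_key, hardware_id):
    # Bind the key to one machine by hashing key + hardware fingerprint.
    return hashlib.sha256(f"{product_key}:{hardware_id}".encode()).hexdigest()

def server_validate(product_key, token, hardware_id):
    return product_key in ISSUED_KEYS and token == activation_token(product_key, hardware_id)

tok = activation_token("ABCDE-12345", "machine-01")
print(server_validate("ABCDE-12345", tok, "machine-01"))  # True
print(server_validate("FAKED-99999", tok, "machine-01"))  # False
```

Note how the machine fingerprint in the token is what makes reinstalling on different hardware trip the check, which is exactly where legitimate users get burned.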
There are other forms of DRM, some more invasive than others. Sony, for example, got beaten down hard for its use of a rootkit-based DRM system on some of its BMG CD releases. This DRM system installed software without the user's knowledge and, as a result of its installation, left Windows open to attacks and software compromises (from viruses and trojans). This is an extreme example from a then well-respected company.
Other forms include dongles (USB keys), requiring physical media to be present, requiring license servers to be run, etc. Regardless of what form it takes, it will interfere with your ability to use the software in the way you want to use it.
What DRM doesn’t do!
DRM doesn't actually target the people it should target. The intent of DRM is to prevent piracy or unauthorized use. The problem with DRM is that it basically only affects legitimate users and non-technical pirates. It doesn't affect technically inclined pirates and software crackers… the exact people it needs to target.
Because many of these protection systems install software onto Windows, that installed software can interfere with Windows or with other applications… especially if two different programs require two different versions of the same protection system loaded on the system simultaneously. In these instances, one app can interfere with another or even with itself.
So, DRM inconveniences the paying user and not really the pirates, which is not the intent of DRM. For example, ZBrush (a 3D object sculpting package) uses an arcane software protection system that doesn't work on many installations of Vista 64 (and possibly even Vista 32). Pixologic (the developers of ZBrush) have basically thrown their hands up on the issue. They have no idea why it happens. Also, because they have licensed their protection system from a third-party vendor, they can't even fix the issue. So, Vista users who may legitimately want to purchase and use the software cannot do so.
EA's Spore is another prime example. Spore's arcane DRM system prevented installation and use of the game on multiple computers due to the way it 'registered' with EA's servers. It even prevented use by different users on the same computer. EA was very slow to respond to this issue and, as a result, hundreds of reviews for Spore on Amazon ended up at 1 star.
Crackers
Many of the crackers of the world are actually very good at what they do. They can get into hex code and/or disassembled (assembly) code and rework it, removing the sections that do the DRM checks. With the DRM checking disabled, the application will then launch without performing those checks.
Note that these people are so adept at doing this, they can probably do it in their sleep. This means that no matter what protection scheme someone devises, the crackers can reverse engineer it and remove the protection system in time. Sometimes they aren't fully successful at removing it, but they don't need to be. Instead, they can work around the DRM by producing software that mimics the things the DRM needs to function. Either way, it gets around the checks that the DRM requires.
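To illustrate just how simple the patching side can be in concept, here is a toy sketch. The byte values are real x86 opcodes (0x74 is a short JZ, 0xEB a short unconditional JMP), but the "binary" itself is fabricated for the example; real cracking involves locating the right check among millions of instructions:

```python
# Toy binary-patch illustration: find the conditional jump that guards
# the license check and replace it with an unconditional jump, so the
# "check failed" path can never be taken.
binary = bytes([0x55, 0x89, 0xE5, 0x74, 0x10, 0x90])  # ... JZ +0x10 ...

patched = bytearray(binary)
jz_at = patched.index(0x74)   # locate the JZ guarding the check
patched[jz_at] = 0xEB         # turn it into JMP: the guard is now dead code
print(hex(patched[3]))        # 0xeb -- the jump is now unconditional
```

This one-byte change is precisely why a check embedded in the application itself can never be a lasting defense.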
For this reason, DRM fails to target the people it is meant to target and fails to adequately protect the content these companies so desperately want to protect.
Fed Up!
Users are tired of DRM systems that prevent them from legitimately using software they purchase. The companies do have the right to protect their 'assets', yes. But is it right for them to do so at the expense of their userbase? Making people jump through hoops just to run a piece of software is not how software is supposed to work. DRM systems get in the way of the software and the user experience rather than helping the company protect its assets.
Wake up companies
For software companies that are considering or are now using DRM to 'protect' (and I use that term very loosely) their software, you need to rethink your strategy, especially if you are seeing complaints from your userbase about the protection system preventing proper usage of your software. If the DRM is getting in the way of your paying users' ability to use the software, then you need to get rid of the DRM. DRM is intended to be transparent to the user… but in many cases it is not, which means the DRM system has failed.
Should companies do away with DRM? At this point, yes. It doesn't serve your company to inconvenience the very people who pay you money, and it doesn't win you any points in customer satisfaction. As more and more people wake up to DRM and come to dislike it, companies may find their userbase dwindling because people won't agree to install DRM-based software on their computers. Software without DRM is more likely to function properly than DRM-protected software, and there are far too many competitors on the market to keep DRM in your software when those competitors don't use it in theirs.
So, yes, companies should seriously consider removing DRM systems from their software. Because DRM fails to adequately target the people it should target, adding it only serves to damage your company's reputation and negatively impact your paying userbase.
Have you found a piece of software controlled by DRM that you wouldn’t buy or that you did buy but couldn’t use? If so, please comment here. I’d like to see just how widespread this issue is.
Say no to DRM.
iTunes can corrupt your iPod’s iTunes library
As a follow-up to this Randosity article, this article will focus on a specific condition in which iTunes will corrupt your iPod's music database… over and over and over.
How it all starts
About a week ago, my iPod became unrecognized by iTunes. Because iTunes cannot 'recognize' the iPod, it requests that you restore the iPod using the restore feature. Through a domino effect, this problem became more and more compounded. Compounded to the point that I was ready to sell the iPod to someone else and get a different solution.
What is the issue exactly?
This issue started right after the first unrecognized error. After the iPod became unrecognizable (we'll get to what that means shortly), I had to restore it to actually use it again. From that point forward, I kept having to restore it about once a day. Mind you, this is the 8GB iPod Touch and not a 60GB iPod; if it had been a 60GB device, I would have sold it, no questions asked. I digress. Anyway, the restores kept getting more and more frequent. Here's how a typical cycle went:
- So, I plug the iPod Touch into the computer’s USB port and let iTunes synchronize the touch. The synchronize progresses normally and then ends correctly.
- I unplug the iPod and check it out. Yep, everything is all there.
- I plug it in again and iTunes then syncs again. Except, this time I noticed (or thought I noticed) iTunes synchronizing some music that was already on the iPod. I thought it was weird, but I discounted it.
- I unplug the iPod and check the ‘Music’ app. I see a “There is no music loaded” message…frustrating (note this was the first time it had happened).
- I plug the iPod back into the computer. iTunes says, “This iPod is unrecognized, please restore it”.
- Note that the Touch’s Apps are all still loaded and the iPod works even though iTunes won’t recognize it (and the music is missing).
What does 'unrecognizable' mean exactly in iTunes?
After poking around on the Internet for similar issues, I found others who have had similar behavior on their iPods. The base problem that prevents iTunes from 'recognizing' the iPod is that the iPod's music database (the iTunesDB file) has become corrupted. Basically, when the iPod's iTunesDB file becomes corrupted internally, iTunes refuses to recognize the device or work with it, forcing the user to do a complete restore (even when the unit is STILL functioning).
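Detecting this kind of corruption is not rocket science, which makes the forced full restore all the more galling. The sketch below shows the general idea of keeping a checksum of the device database and noticing when it goes bad; the 'iTunesDB' here is just a stand-in file, not Apple's actual format or behavior:

```python
import hashlib
import os
import tempfile

# Sketch: record a checksum of the device database at sync time, so a
# later mismatch reveals corruption *before* declaring the device dead.
def checksum(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

workdir = tempfile.mkdtemp()
db = os.path.join(workdir, "iTunesDB")   # stand-in for the real database
with open(db, "wb") as f:
    f.write(b"track records...")
good = checksum(db)                      # checksum taken at "sync" time

with open(db, "ab") as f:                # simulate on-device corruption
    f.write(b"\x00garbage")

print(checksum(db) != good)              # True: corruption detected
```

A tool that does this could offer to rebuild just the database instead of wiping the whole device.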
Restore Process
There are so many problems with this restore process, suffice it to say that Apple is in desperate need of help. Apple has designed the iPod to work under ideal conditions (i.e., never needing a restore). However, when it comes time to restore your iPod, because Apple didn't really work this all out properly, the restore process is where iTunes fails miserably.
When iTunes needs to restore the unit, it places the iPod into a special restore mode, one that appears to make the unit receptive to firmware installation (a special icon appears). After iTunes extracts and transfers the firmware to the iPod, the iPod reboots and installs it (all the while, iTunes watches the progress). After the unit has been restored to factory defaults, iTunes lets you restore from a previous backup or set the iPod up as new. This factory reset process can take anywhere from 10-15 minutes.
iPod Backups
iTunes only allows for one (1) stored backup of your iPod at a time. So, if that one backup is corrupted, you'll waste a ton of time restoring only to find that the iPod is still corrupted. Then you'll have to start the restore completely over and set the iPod up as a new device (wasting even more time). This happened to me. I quickly realized it was simpler (and faster) to avoid using an existing backup and just set the iPod up from scratch. Apple really needs to allow iTunes to keep multiple backups in dated slots and allow these backups to be stored outside of iTunes as files.
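The dated-slot scheme I'm asking for is trivial to implement, as this sketch shows. The folder layout and names here are my own invention, not anything Apple ships:

```python
import itertools
import os
import shutil
import tempfile
import time

# Sketch of dated backup slots: every backup lands in its own
# timestamped folder and only the newest few are kept, so one corrupt
# backup never leaves you with nothing to roll back to.
_seq = itertools.count()  # tie-breaker so same-second backups stay unique

def make_backup(source_dir, backup_root, keep=3):
    stamp = time.strftime("%Y%m%d-%H%M%S") + f"-{next(_seq):03d}"
    shutil.copytree(source_dir, os.path.join(backup_root, stamp))
    for old in sorted(os.listdir(backup_root))[:-keep]:  # prune the oldest
        shutil.rmtree(os.path.join(backup_root, old))

src, root = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(src, "settings.plist"), "w") as f:
    f.write("wifi, mail, vpn")
for _ in range(5):
    make_backup(src, root)
print(len(os.listdir(root)))  # 3: only the newest three slots remain
```

With slots like these, a corrupt backup costs you one roll-back attempt instead of a full from-scratch rebuild.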
Note, if you choose to set the iPod up from scratch, you will have to completely set up your apps again. For example, settings like your WiFi settings, your email settings and your VPN settings will all have to be manually reconfigured. Any apps that require login and passwords will need to be re-entered.
Restoring your settings and media
If you've chosen to restore your iPod's settings from a backup, this process will take 10-15 minutes to complete. And no, as slow as this process is, it doesn't restore music, videos or any other media; that comes last. After the settings have been restored, you now have a workable (and very blank) iPod again. The next thing iTunes does is sync up the applications, then the music, then everything else. The applications take anywhere from a few minutes to over ten minutes depending on how many apps you have downloaded. The music restore takes however long it takes to copy your library (about 6 gigs takes at least 15-25 minutes). So, for an 8GB iPod Touch, the whole thing takes probably 15-45 minutes depending. If you're restoring a fully loaded 32GB or 60GB iPod, your rebuild will take a whole lot longer.
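A quick sanity check on those numbers shows just how slow this sync really is: 6 GB of music in 15-25 minutes works out to only a handful of megabytes per second.

```python
# Back-of-envelope rate from the sync times quoted above.
def rate_mb_s(gigabytes, minutes):
    return round(gigabytes * 1024 / (minutes * 60), 1)

print(rate_mb_s(6, 15))  # 6.8 MB/s at the fast end
print(rate_mb_s(6, 25))  # 4.1 MB/s at the slow end
```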
Corruption
The issue I faced, however, is that something kept corrupting the iTunesDB file on the iPod. It was either the iPod's hardware messing up or iTunes shuttling something over that it shouldn't have been. I noticed that on one particular CD the artwork kept disappearing in iTunes (it would be there, then show the blank icon, when I know the art previously worked). I also noticed that iTunes would randomly transfer this music over even when it already existed on the iPod and hadn't been changed. I guess it thought something had changed about the music files. Anyway, after it transferred that music, I believe this is what corrupted the iPod: whatever was causing the artwork to disappear must have corrupted an iTunes file, which was then transferred to the iPod.
Fix
The fix for this issue, which I found by trial and error, was to completely delete the entire iTunes music, podcast and video libraries and reimport them. So, I went to the 'Music' area, selected everything and pressed delete. Of course, I used 'Keep Files' to keep them on the disk. I also made sure NOT to use downloaded artwork on the reimported music, as I believe the downloaded artwork database is what was getting corrupted. I don't know why the corruption happens, and the guy at the Genius Bar had also never heard of this… so much for their Genius. He also offered to replace the iPod Touch just in case the hardware was bad, but I don't think it is.
Arrgh.. Apple get your ACT together!
iTunes can be a hassle to deal with, as evidenced here. Apple needs to take a long hard look at how this all works and fix these problems. One way to fix this issue is to stop marking the unit as unrecognizable when the iTunesDB is corrupted; instead, iTunes should simply delete the database and rebuild it. Better yet, it should keep a copy of the iPod's database on the computer for restoration. Also, if Apple allowed multiple backups stored by date on the computer, it would be far simpler to roll back to a previously KNOWN working configuration. Because of this lack of foresight on Apple's part and the simplistic backup system Apple has implemented, restoration becomes a complete time waster of trial and error.
Since there is no real fix you can apply to iTunes itself to manage these limitations, I recommend turning off automatic synchronization so you can manually sync the iPod at a time of your choosing. I should also mention that Apple decided to turn off drive-letter visibility into the iTunes library files on the iPod Touch, so you can't even use a third-party utility. I can't imagine having to go through this restore process on a 60GB or larger iPod. Having to go through it 5 times in 5 days because of iTunes is ludicrous and enough to make anyone want to get away from Apple as fast as possible. Apple, you definitely need to figure out how to deal with this issue!
ATI Radeon 3650 driver vs Vista 64: Who Wins?
After a bout of attempting to install the 12/10/08 graphics driver for my ATI Radeon 3650 on 64-bit Vista, I ran into a few glitches… well, many glitches actually. Glitches upon glitches… I guess you could say it was a clusterbuck (replacing the b with an f). Anyway, my system was rather messed up after the attempted installation. Suffice it to say that the 12/10/08 release of the Catalyst driver from ATI is borked. It doesn't run on the 3650 under Vista 64, so don't even bother. However, that was only half of my issues.
The other half consisted of how to recover from the uninstallation of the driver and get back to my previous driver. Unfortunately, Microsoft has completely messed up driver installation and removal on Vista 64. So, be WARNED if you attempt to upgrade your graphics driver under Vista 64. Suffice it to say that here's the short list of steps:
- As best you can, uninstall the ATI Catalyst tool from 'Programs and Features'. If that fails…
- Remove all ATI and ATI Technologies registry keys from HKLM and HKCU
- Remove all ATI Technologies directories from C:\Program Files and C:\Program Files (x86)
- Follow the steps below to ensure Catalyst Control Center reinstalls properly:
  - Check that these registry locations are empty:
    a. HKCU\Software\ATI\ACE
    b. HKLM\Software\ATI\ACE
  - Check that (Program Files or Program Files (x86))\ATI Technologies\ATI.ACE is empty
  - Check the (Windows folder)\Assembly folder for any files with a Public Key Token of "90ba9c70f846762e" (sort by Public Key Token for an easier view). All of these should be uninstalled by right-clicking and choosing uninstall.
  - Check that (Documents and Settings)\(User)\AppData\Local\ATI\ACE is empty
- Reinstall CCC
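The folder checks in the list above can be scripted so you know at a glance what's left to delete. This sketch only covers the filesystem locations (the registry keys still need regedit), and it runs against a throwaway temp folder here rather than your real C: drive; adjust the paths for your own system:

```python
import os
import tempfile

# Sketch: given the ATI leftover locations, report which ones still
# exist so you know what remains to remove before reinstalling CCC.
def leftovers(paths):
    return [p for p in paths if os.path.exists(p)]

demo_root = tempfile.mkdtemp()  # stand-in for the system drive
ati_paths = [
    os.path.join(demo_root, "Program Files", "ATI Technologies", "ATI.ACE"),
    os.path.join(demo_root, "Program Files (x86)", "ATI Technologies", "ATI.ACE"),
]
os.makedirs(ati_paths[0])    # simulate one folder the uninstaller missed
print(leftovers(ati_paths))  # lists only the path that still exists
```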
Hopefully, you won't run into the same issues I did getting Catalyst reinstalled, but if you do run into this with your ATI Radeon card, perhaps this will help! Oh, and who wins? No one does… Vista sucks for driver issues.
Disclaimer: If you don’t know what you’re doing in the registry, don’t go there. If you accidentally delete something you shouldn’t, that’s your responsibility. The registry can be a tricky place, so you are hereby warned. Use this information at your own risk.