Random Thoughts – Randocity!

What does Microsoft’s purchase of ZeniMax mean?

Posted in business, microsoft, Sony, video gaming by commorancy on October 28, 2020
Can the PS5 succeed?

I’ve had this question recently posed to me on a Twitch stream. Yes, I stream games on Twitch in addition to penning this blog. I haven’t cross promoted my Twitch stream on this blog because blogging and gaming are mostly unrelated. However, if you’re interested in watching me game, please leave a comment below and I’ll post my Twitch channel. Let’s explore the answer to the above question.

Bethesda and Microsoft

Microsoft isn’t really a gaming company. They are a software company that produces gaming products among all of their other hardware and software product lines. Sony is, likewise, not really a gaming company for a similar reason. Sony is mostly a content producing company that also produces gaming hardware.

Anyway, Microsoft’s purchase of Bethesda’s parent company ZeniMax likely means eventual changes to all of Bethesda’s game franchises. In fact, I’m actually surprised that the FTC has allowed such a purchase considering the negative impact it will likely have on consumer choice.

Sony and Microsoft

Let’s start with the elephant in the room. Sony and Microsoft are rivals when it comes to gaming systems. Sony has the PlayStation and Microsoft has the Xbox. Because Microsoft owns the Xbox console, purchasing large gaming companies firmly pushes this situation into conflict of interest and consumer choice reduction territory. Additionally, Microsoft’s purchase of ZeniMax before the PS5 has really launched can become an easy way to keep the PS5 from succeeding.

Why? Microsoft has designs on making the Xbox Series X console succeed and be more successful than the PS5. To do this, they want to lock Sony’s platform out of as much content as they can. How will this manifest with Bethesda’s games?

While the final outcome is entirely uncertain, the handwriting is on the wall. What I mean is that Microsoft may eventually make all of Bethesda’s newest releases exclusive to the Xbox. That means that Bethesda’s game franchises (plural) may ultimately end up playable only on the PC and on the Xbox console. Yes, that could mean that both the Nintendo Switch and Sony’s PS5 are equally negatively impacted by this purchase.

Both Sony and Nintendo could find themselves without future Bethesda titles on their gaming platforms. That could mean no more Fallout, no more Elder Scrolls, no more Doom and no more Wolfenstein will make their way onto Sony or Nintendo’s platforms. It doesn’t stop there. Titles like Bethesda’s upcoming Starfield, which has yet to be released, could be pulled from release on both Sony and Nintendo’s platforms… leaving this game only available on PC and Xbox.

Sure, Microsoft may lose money by not releasing these games on non-Microsoft platforms, but it will more than make up for those lost game sales by pushing more Xboxes and PCs into the home. Eventually, these games will be sold to the owners of those newly purchased Xboxes and PCs, more than making up for the losses in sales on the other platforms. Basically, Microsoft has an easy way to do the dirty to both Sony and Nintendo as far as Bethesda games are concerned.

Microsoft is also well aware of the leverage they hold over the gaming industry by purchasing Bethesda. More than this, Microsoft can steer new consumers onto their Xbox line of consoles and away from Sony and Nintendo consoles strictly by enforcing Xbox Exclusives.

Exclusives

Bethesda isn’t the only studio on the planet. However, Bethesda is a large studio with many very cherished video game franchises… franchises that bring in a lot of cash and drive console purchases.

While Microsoft can enforce making upcoming Bethesda games exclusive, Microsoft doesn’t necessarily have to take this step. However, knowing that Sony pretty much kicked Microsoft’s butt with the PS4’s sales, Microsoft isn’t eager to repeat that trend with the Xbox Series X. Purchasing ZeniMax gives Microsoft a definite edge. It also means Microsoft might also be eyeing the purchase of Activision, EA, Rockstar and even Ubisoft. Don’t be surprised if Microsoft snaps up some of these additional game developers as well.

By purchasing large game studios like Bethesda, Microsoft can control which console becomes the dominant console this time around (i.e., theirs). This means even more exclusive Xbox games.

Exclusive games force consumers to buy specific hardware platforms to play these exclusive titles.

PS5

What does this news mean for a console like the PS5? It puts the PS5 at a severe sales disadvantage. Microsoft could direct Bethesda not to produce PS5 games at all. Without Bethesda’s support, the PS5 is left well behind in the upcoming next gen gaming market.

This is part of the reason I am not purchasing a PS5 at this time. I’m waiting on how this plays out. Bethesda’s ownership by Microsoft means a very real possibility of future exclusive Xbox titles from Bethesda, with no releases on the PS5 or the Nintendo Switch.

This change would put Sony and Nintendo at a clear sales disadvantage. Sony would have to rely not on Bethesda games to drive the PS5’s sales, but instead on Sony Studio releases… games they have developed themselves or through studios they own (e.g., Sucker Punch).

That doesn’t mean the PS5 will be worthless, but it means that the future of Bethesda’s games being released on the PS5 has become very unclear. In fact, I’d use the word “muddy” to describe these waters.

Here are some questions that come out of the above:

  1. Should I buy an Xbox Series X or a PS5? The answer entirely depends on what Microsoft has planned for Bethesda. If they intend to turn all future Bethesda releases into Xbox exclusives, then the answer is… buy an Xbox Series X. Even if they don’t, I’d still recommend buying an Xbox Series X because there’s zero chance of losing Bethesda games on the Xbox, while there’s a high probability the PS5 will lose Bethesda’s future games. The even larger answer also depends on whether Microsoft plans to buy more large game studios.
  2. Will Bethesda lose money? No. Microsoft has deep, deep pockets. They can withstand any short term monetary losses from making Bethesda’s games exclusive to the Xbox, and they can recoup those losses over the long term by selling new Xbox consoles and exclusive Bethesda games. The more consoles Microsoft sells, the more games they can sell.
  3. Will Microsoft force Bethesda to make exclusives? Yes, they will. This is guaranteed. The question is, which games will be forced into this category? That’s still unclear. Will it only be some of Bethesda’s games, all of them, new games only or some combination of this? We don’t know. However, I can guarantee at least one of Bethesda’s games will be released as an Xbox exclusive. My guess is that most of Bethesda’s games will become exclusives.
  4. What about existing Bethesda games? What happens to these? Microsoft isn’t stupid. They will allow existing games to continue to be sold and operate on the PS4 and any other older non-Microsoft consoles. They won’t rock this boat. Instead, Microsoft will look at upcoming unreleased games and make those never-released games the exclusives.

As a result of these questions and answers, it’s clear that if you love Bethesda’s games and you wish to play future upcoming Bethesda game franchises, you may want to wait before investing in one of these new consoles. It would suck to spend a wad-o-cash to walk home with a PS5 only to find that the one Bethesda game you thought you could play is now an Xbox Series X exclusive. That means, you’ll never see that game released on the PS5. Microsoft is very likely to make this situation a reality.

If Microsoft buys even more of these large developers, they could lock Sony’s PS5 out of the mainstream gaming market. That would push Sony’s PS5 into a situation like Nintendo (and the PS Vita), where the console maker is entirely responsible for creating compelling game franchises for their respective console on their own. Unfortunately, that’s just not enough to keep a platform like the PS5 alive.

In other words, with the purchase of Bethesda, there’s a very real possibility that, this time around, Sony’s PS5 will be the underdog.

Ramifications

The bigger ramification of this purchase is the reduction of consumer choice. This purchase can easily push Microsoft even further into monopoly status than it already holds. Locking the biggest game developers into Xbox exclusivity means causing the PS5 to ultimately fail, and for the same reason the PS Vita failed.

Personally, I believe this is Microsoft’s true agenda. The Xbox One’s sales paled in comparison to the PS4’s. Microsoft is not eager to repeat this situation with the Xbox Series X. By buying large developers like ZeniMax / Bethesda, Microsoft can all but assure the success of the Xbox Series X… and, at the same time, assure the failure of Sony’s PS5.

This purchase is honestly a one-two punch to Sony… and for Sony, it’s gotta hurt.

Sony and Gaming

If Sony is smart, they’ll run out and buy Rockstar or Ubisoft right now. They shouldn’t wait. They should purchase one of these companies as fast as they possibly can. Rockstar would be the best choice for Sony.

Sony could then have this same bargaining chip in their back pocket just like Microsoft has with Bethesda. Should Microsoft dictate Xbox exclusivity for Bethesda’s upcoming games, Sony can do the same thing for Grand Theft Auto and Red Dead Redemption (once they own Rockstar). Ultimately, it will be a “tit for tat” situation.

In fact, Sony should buy both Ubisoft and Rockstar and have two bargaining chips. Even still, such a game exclusivity war would lead to fracturing the gaming market in half. Basically, the consumer would be forced to buy multiple consoles to play games that formerly landed on both consoles. It’s a loss for consumer choice… which is why I’m surprised the FTC hasn’t stepped in and blocked this one.

I’m guessing that because the final outcome has not yet manifested, the FTC can’t see the forest for the trees. However, once hindsight forces 20/20 vision, it will be too late for the FTC to block this purchase.

What does this mean for Fallout?

I know this is a very specific question about a very specific game. However, I was asked this very question on a Twitch stream. Let me answer it here.

If you’re a fan of the Fallout series and you’re unsure which of the upcoming consoles to buy, I’d recommend waiting to see what Microsoft has in store for upcoming Bethesda games.

With that said and to reiterate what I’ve said above, there is zero chance that Microsoft will withhold Fallout from the Xbox Series X and newer Xbox consoles. However, Microsoft can easily block the release of future Fallout games from the PS5 and the Switch. This means that a consumer’s investment of cash into a PS5 could leave the console without any future Fallout, Elder Scrolls or Doom games.

What that means is that should Bethesda take on the challenge of remastering Fallout 1, Fallout 2 and Fallout New Vegas for the newer consoles, these games may only find their way onto the Xbox Series X as exclusives and may not be found on the PS5.

Basically, proceed with caution if you really, really want a PS5. You may find that like the PS Vita, without titles released from Bethesda, the PS5 may end up a dying console before it really gets the chance to take off, particularly if Microsoft buys even more of these large game studios. If the PS5 does fail due to Microsoft exclusives, it will be mostly thanks to Microsoft.


Gaming: PS5 vs XB SX Case Design Review

Posted in game controller, gaming, video game console, video game design by commorancy on June 14, 2020

Since the Xbox Series X and the PS5’s case designs have now been unveiled by both Microsoft and Sony, respectively, let’s explore these case designs.

Sony’s PS5

Let’s start with the recent elephant in the room, the Sony PS5. Here are some images:

Xbox Series X

Note, I will henceforth be calling the Xbox Series X the Xbox SX. Here are images of this console:

Design Goals

Sony claims they wanted something “bold, daring and future facing,” according to Sony’s CEO. Microsoft’s Xbox chief Phil Spencer claims they wanted the “fastest, most powerful console ever.”

Regardless of the claims, let’s dive into the designs of these consoles. The first word that comes to mind is “dated”. Both the Xbox SX and the PS5 offer odd choices in case designs.

Let’s discuss the Xbox SX’s case design. This design has already been done and been done better… thrice, in fact. Once by NeXT and twice by Apple. Let’s look at designs past, shall we?

The above computers consist of the following:

  • Apple G4 Cube (circa 2000)
  • NeXT Cube (circa 1990)
  • Apple Mac Pro Cylinder (circa 2013)

All three of these computers are of a very similar design to the Xbox Series X. Microsoft can never seem to come up with original designs, instead choosing to abscond with older manufacturers’ designs. I’m not sure what it is about Microsoft’s inability to come up with innovative case designs, but this is what we get with Microsoft: clunky, outdated designs.

That’s not to say that Sony’s case design is much better. It’s unique, but in a word, “ugly”. If you like the look of consumer routers, then I guess the PS5’s case design is what you might like.

The main problem I have with both of these designs is that neither is stackable. With Sony, it seems it’s all about having an oddly rounded surface, which means that when you place the console horizontally, you can’t stack anything on top of it. The PS4 Pro offered us a fully flat top. Unfortunately, the PS3 had that same oddly rounded design. Sony seems to vacillate between flat topped systems and oddly shaped ones. If Sony’s console were the only device in the home, it might be okay. Since some of us have several pieces of gear, including multiple older and newer generation consoles, we want to stack them so we have them together.

Additionally, standing a console vertically, at least in my cabinet, is out of the question. There is no way for me to place the Xbox SX or the PS5 vertically. In fact, I haven’t placed any console vertically in the last 10 years (no space) and it’s not going to happen now. Note, I talk about alternative placement of the Xbox SX below.

Waiting… and airflow

As a result, I’m likely to wait until the second case iteration of the PS5. I’ve invested in too many first gen consoles and gotten burned. The only time where having the first edition console was a boon was with the PS3… before Sony yanked out the PS2 compatibility and several other useful features for later iterations. That was the one and only one time when it was a benefit. That didn’t excuse the horrible rounded PS3 case design, nor does it excuse the rounded case design of the PS5.

With the Xbox SX, it can at least be placed horizontally. In fact, this console design might actually fare better horizontally than vertically. Why? When standing vertically, there will be limited airspace under the bottom of the unit with which to pull air up and through. The airspace gap is probably designed well enough, but sitting close to a surface will still limit the amount of airflow.

Placing the Xbox SX case horizontally completely unobstructs the bottom intake vent and allows full and complete airflow through the unit. Placing the Xbox Series X horizontally might actually be the better way to place this unit for the best airflow possible. Sony’s case design probably won’t have an airflow problem. They usually don’t.

Sony’s choice of a white case with a black inner section and blue case lighting is also a throwback design problem. It has the same aesthetic as the Nintendo Wii. It’s not the same case shape, of course, but it has a similar lighting and visual aesthetic.

Form vs Function

One thing that video game console designers need to understand is that it really doesn’t matter how aesthetically pleasing a case design is. What matters is how well the console functions. That isn’t to say that we don’t enjoy seeing a pretty case, but we don’t spend time staring at it either. We want to use the unit, not stare at the case.

Therefore, the most important aspect of a video game console isn’t its case, it’s what’s under the hood and how well all of that works. Spend time making the innards work well. Make them solid and functional and with proper air flow. Put your effort and money into designing the innards and make that innovative. We don’t really care what it looks like.

In fact, as a gamer, I’d prefer the case be flat on top with airflow front-to-back or side-to-side so I can stack my other gear on top of it. A boxy looking case? Not a problem. Failing to understand this functional stacking issue is a design failure in my book. Clearly, Sony’s industrial designers weren’t considering ergonomics or functionality of its case design. For that matter, neither was Microsoft with the Xbox SX.

Case design isn’t really that important to a video game console unless it gets in the way of being installed into a cabinet… which both of these case designs do.

Vertical Design

More and more, game console creators want to produce vertical case designs. I’m not a fan. I don’t want my console sitting vertically. Not only do I have no cabinet space for this, I simply don’t like this design aesthetic. I prefer my computers to sit horizontally. This is partially to do with the cabinet I’ve chosen, but it’s partially due to the wasted space needed to place a console upright.

Case designers need to reconsider this unnecessary trend of designing for vertical installation. Any design that can be installed vertically should also be designed to install horizontally. Design for both use cases!

Blue LEDs

I’m also not a fan of blue colored LEDs. They are 1) too bright and 2) annoying as hell. At night, you simply can’t sleep with blue LED lights staring you in the face. They’re like little lasers piercing your retinas. I hate ’em with a passion. The faster we can get away from this blue LED trend, the better.

PS5 Reveal

Here’s the part some of you may have been patiently waiting for me to chime in on. Well, here it is. The PS5’s reveal was, in a word, meh. The gameplay actually looked no better than the PS4 Pro’s. The CPU and GPU might be somewhat faster, but Sony is reaching the point of diminishing returns. In fact, I was so unimpressed by the PS5’s gameplay that I came away disappointed.

I was expecting so much more from the PS5, and we’re basically getting another PS4 renamed PS5. Going back to the CEO’s remark, there’s really nothing “bold, daring or future facing” about this console, from the uninspired, knock-off case design to the PS4-level graphics shoved into a new shell. It’s really very unimpressive.

I’m not sure what Sony has been spending the last 2 years doing, but it’s clear they were not spending the time designing an innovative new product. The PS5 is a rehash of the PS4 in an oddly shaped case.

Innovation

Nintendo Switch

What is innovation? Innovation means coming up with something that hasn’t been seen or done quite that way before. I’d consider the Nintendo Switch innovative. I’d also consider the Apple G4 Cube innovative. Why is the Switch innovative? Because not only is the Switch a dockable home console, you can also take it with you and play on the go. It’s a powerhouse big enough to work well in both situations.

I was fully expecting this same level of innovation with the PS5. Unfortunately, what we got was exceedingly underwhelming. Even the “new” PS5 controller is bland and uninspired. This controller looks pretty much like the old controller with, again, horrible blue LED lights piercing your retinas and lighting up your face. Let’s hope that this time you can actually turn these silly lights off.

The touch pad remains, even though it was an unnecessary and almost never used feature of the PS4’s controller. The touch pad was simply a battery suck and a gimmick. I wouldn’t mind seeing Sony get rid of that touch pad garbage. As I said: battery suck, gimmick and completely unnecessary.

Yet, here the touch pad is again, making yet another unnecessary appearance. That’s most definitely not innovative. It simply means Sony is way out of touch with how most game developers use the PlayStation’s controller. Aside from a handful of early PS4 titles, the touch pad was almost never used as anything other than a button. Simply get rid of the battery hogging touch pad and replace it with a button, like the new Xbox SX controller has. If you need a touch pad for PS4 compatibility, allow connecting a PS4 controller via Bluetooth.

See, I innovated for you there, Sony. Microsoft’s Xbox SX controller, on the other hand, is about as simplistic and utilitarian as you can get. That doesn’t make it a problem. In fact, it looks so much like an Xbox One controller, you might not even notice that there’s a new button in the middle of the controller surface. It’s a button that basically does the same thing as the touch pad button on the PS4’s controller.

I was actually hoping to see a few more buttons added to both the Xbox SX and the PS5 controllers… buttons that could be programmed for lesser used functions so that game developers don’t have to keep overloading functions onto the same buttons depending on context. It’s frustrating, for example, to play Fallout 76 and expect the square button to do one thing, only to have it do something entirely different because you’re too close to an in-game object. You have to move away before the original function resumes. Frustrating.

By having more buttons on the controller, these lesser used functions could be mapped to the extra (smaller) buttons so that button overloading in games becomes much less common.

PCs don’t have this problem because a keyboard usually has 101 keys. On a controller, you have basically 13 buttons on the face plus 4 on the shoulders. I want more buttons on my controller’s face so game developers don’t have to overload button functions anymore. Yet, no such luck on the PS5 or Xbox SX. They are still basically the same ole controllers with the same limited buttons. Yeah, basically no innovation here.
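The button-overloading problem described above can be sketched in a few lines of Python. This is purely an illustration, not any real game’s code: the button names, contexts and actions below are all hypothetical.

```python
# Illustrative sketch: with few buttons, one button's meaning must depend on
# game context ("overloading"); dedicated extra buttons would remove the
# ambiguity. All names here are invented for illustration.

# Context-dependent dispatch table for a single overloaded face button:
overloaded = {
    ("X", "near_object"): "pick_up",
    ("X", "in_combat"):   "reload",
    ("X", "default"):     "jump",
}

def resolve(button, context):
    """Return the action an overloaded button performs in a given context."""
    return overloaded.get((button, context), overloaded[(button, "default")])

# With hypothetical dedicated buttons, each action gets its own input:
dedicated = {"X": "jump", "P1": "pick_up", "P2": "reload"}

print(resolve("X", "near_object"))  # same physical button...
print(resolve("X", "in_combat"))    # ...different action, depending on context
print(dedicated["P1"])              # a dedicated button is always unambiguous
```

The `resolve` lookup is exactly the kind of context juggling that produces the Fallout 76 frustration mentioned above: the player can’t predict the button’s action without first checking the game state.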

Overall

I’m planning on waiting to purchase these consoles until the second iteration, possibly even until they release a case-redesigned version. You know that both Sony and Microsoft will introduce subsequent case styles in the future. I tire of buying a day-one console only to have them redesign it six months later.

My plan is not to buy either console for at least six months to a year after release. I’ll stick with my PS4 and Xbox One until then. Even then, it doesn’t seem that many game developers will be fully taking advantage of the new console hardware for at least that time. Anything in development today for those consoles will have been built on the gaming company’s older, non-optimized engine. It will take at least six months for most developers to retool their engines for the new platforms.

For this reason and for the typical dearth of features that Sony is likely to offer us come release day, I’m waiting. There’s nothing like spending $700 to play one game, then let the console sit for 6 months without using it at all. Such a waste of $700.

No, I’m not doing that again Sony. I’ll lay out money towards a console once it actually has some gaming momentum behind it and usable features to boot. Once Netflix and Hulu and all of the staples arrive to the consoles, then there will be some reasons to consider. Until that day arrives, it’s a $700 paperweight.

Pricing

Don’t kid yourself about this next part. Even though pricing hasn’t been announced for the PS5 or the Xbox SX, you can bet that after buying games, accessories, cables, chargers and the console itself, you’ll easily have spent at least $700. The price will probably be closer to $1,000. Even the PS4 exceeded the $1,000 price point if you included a PSVR unit. If there’s a VR unit on the way for the PS5, then expect the PS5’s price point to hit $1,000 to $1,500, possibly more.
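As a rough, back-of-the-envelope tally, the spending adds up quickly. Every price below is an assumption for illustration only, since official pricing for these consoles had not been announced at the time of writing:

```python
# Hypothetical day-one purchase basket; all figures are assumed, not announced.
basket = {
    "console":          500,   # assumed base unit price
    "extra_controller":  60,
    "charging_dock":     30,
    "cables":            20,
    "games (3 @ $60)":  180,
}

total = sum(basket.values())
print(f"Estimated day-one spend: ${total}")  # prints: Estimated day-one spend: $790
```

Even with these conservative assumptions, the basket lands well past the $700 mark before a VR headset or subscription services enter the picture.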

We’ll have to wait on the pricing, but Sony and Microsoft have to announce it soon. Few people will place a pre-order on these units without knowing what they’ll end up paying. I won’t. It’s a fundamental aspect of buying: you have to know the cost of the unit to decide whether it’s worth the price.

If both Sony and Microsoft price at or close to $1,000 for a base unit, they are probably making huge mistakes. Since the gaming price point has always been $500 or so, doubling that price approaches PC pricing territory. If you can get a PC for cheaper than a console, what’s the point in buying a console?

Microsoft and Sony must be very careful when considering their price point for these consoles. For me, I’d value these consoles at being worth no more than $600-700 (regardless of the actual costs to assemble it). If they’re priced higher than this, the console industry is going to have a real problem on its hands. Even Nintendo may feel the pinch from it. Considering that the Switch costs $299, that’s an excellent price point for such a universally useful unit. Unfortunately, Nintendo has been lax on wooing developers to the platform. So far, Nintendo has only been able to woo Bethesda. Even then, Bethesda’s involvement on the Switch has been limited.

Sony and Microsoft must be very careful with their pricing. I’m actually hoping Microsoft announces their pricing first. This will start a price war between Sony and Microsoft. Sony will have to price the PS5 at or below the same price as the Xbox SX. Sony and Microsoft can ignore Nintendo’s pricing as Nintendo has never offered a similarly competitive console entry. It’s very unlikely Sony or Microsoft will ever price their consoles at $299. At least, not the day one console.

In the future, though, the pricing will be fluid and may approach the $299 price tag… yet another reason to wait.

Let’s hope that Sony and Microsoft can choose to do the right thing with these units and price them accordingly. At least, they shouldn’t be priced any higher than the Xbox One X or the PS4 Pro. As for the design, yeah, it could have been WAY better on both consoles.


Can I use my Xbox One or PS4 controller on my iPhone?

Posted in Apple, botch, california, game controller, gaming, video game by commorancy on September 16, 2019

This is a common question regarding the two most popular game controllers to have ever existed. Let’s explore.

MFi Certification

Let’s start with a little history behind why game controllers have been a continual problem for Apple’s iOS devices. The difficulty comes down to Apple’s MFi controller certification program. Since MFi’s developer specification release, not many controller developers have chosen to adopt it. The one notable exception is the SteelSeries Nimbus controller. It’s a fair controller, it holds well enough in the hand and has an okay battery life, but it’s not that well made. It does sport a Lightning port so you can charge it with your iPhone’s charger, however. That’s little consolation, though, when you actually want to use an Xbox One or PS4 controller instead.

Because Apple chose to rely on its own MFi specification and certification system, manufacturers would need to build a controller that satisfies that MFi certification. Satisfying the requirements of MFi and getting certified likely requires licensing technology built by Apple. As we know, licenses typically cost money paid to Apple for the privilege of using that technology. That’s great for Apple, not so great for the consumer.

Even though the SteelSeries Nimbus is by no means perfect, it really has become the de facto MFi controller simply because no other manufacturers have chosen to adopt Apple’s MFi system. And why would they?

Sony and Microsoft

Both Sony and Microsoft have held (and continue to hold) the market with the dominant game controllers. While the SteelSeries Nimbus may have become the de facto controller for Apple’s devices, simply because there is nothing else really available, the DualShock and the Xbox One controllers are far and away better controllers for gaming. Apple hasn’t yet been able to break into the console market, even as much as they have tried with the Apple TV. Game developers just haven’t embraced the Apple TV the way they have the Xbox One and the PS4. The reason is obvious: the Apple TV, while reasonable for some games, simply does not offer the same level of graphics and gaming power as an Xbox One or PS4. It also doesn’t have a controller built by Apple.

Until Apple gets its head into the game properly with a suitably named system actually intended for gaming, rather than general purpose entertainment, Apple simply can’t become a third console maker. Apple seems to try these roundabout methods of introducing hardware to usurp, or at least insert itself into, certain markets. Because of this subtle roundabout approach, it just never works out. In the case of MFi, that hasn’t worked out too well for Apple.

Without a controller that Apple has built themselves, few people see the Apple TV as anything more than a TV entertainment system with built-in apps… even if it can run limited games. The Apple TV is simply not seen as a gaming console. It doesn’t ship with a controller. It isn’t named appropriately. Thus, it is simply not seen as a gaming console.

With that said, the PS4 and the Xbox One are fully seen as gaming consoles and prove that with every new game release. Sony and Microsoft also chose to design and build their own controllers based on their own specifications; specifications intended for use on their own consoles. Neither Sony nor Microsoft will go down the path to MFi certification. That’s just not in the cards. Again, why would they? These controllers are intended to be used on devices Sony and Microsoft make, not with Apple devices. Hence, there is absolutely zero incentive for Microsoft or Sony to retool their respective controllers to cater to Apple’s MFi certification whims. To date, this has yet to happen… and it likely never will.

Apple is (or was) too caught up in itself to understand this fundamental problem. If Apple wanted Sony or Microsoft to bend to the will of Apple, Apple would have to pay Sony and Microsoft to spend their time, effort and engineering to retool their console controllers to fit within the MFi certification. In other words, not only would Apple have to entice Sony and Microsoft to retool their controllers, they’d likely have to pay them for that privilege. And so, here we are… neither the DualShock 4 nor the Xbox One controller supports iOS via MFi certification.

iOS 12 and Below

To answer the above question, we have to observe Apple’s stance on iOS. As of iOS 12 and below, Apple chose to rely solely on its MFi certification system to certify controllers for use with iOS. That left few consumer choices. I’m guessing that Apple somehow thought that Microsoft and Sony would cave to their so-called MFi pressure and release updated controllers to satisfy Apple’s whims.

Again, why would either Sony or Microsoft choose to do this? Would they do it out of the goodness of their own heart? Doubtful. Sony and Microsoft would ask the question, “What’s in it for me?” Clearly, for iOS, not much. Sony doesn’t release games on iOS and neither does Microsoft. There’s no incentive to produce MFi certified controllers. In fact, Sony and Microsoft both have enough on their plates supporting their own consoles, let alone spending extra time screwing around with Apple’s problems.

That Apple chose to deny the use of the DualShock 4 and the Xbox One controllers on iOS was clearly an Apple problem. Sony and Microsoft couldn’t care less about Apple’s dilemmas. Additionally, because both of these controllers dominate the gaming market, even on PCs, Apple has simply lost out when sticking to their well-intentioned, but misguided MFi certification program. The handwriting was on the wall when they built the MFi developer system, but Apple is always blinded by its own arrogance. I could see that MFi would create more problems than it would solve for iOS when I first heard about it several years ago.

And so we come to…

iOS 13 and iPhone 11

With the release of iOS 13, it seems Apple has finally seen the light. They have also recognized Sony’s and Microsoft’s positions in gaming. There is simply no way that the two most dominant game controllers on the market will bow to Apple’s pressures. If Apple wants these controllers certified under its MFi program, it will need to take steps to make that a reality… OR, it will need to relax this requirement and allow these two controllers to “just work”… and the latter is exactly what Apple has done.

As of the release of iOS 13, you will be able to use both the Xbox One controller (Bluetooth version) and the PS4’s DualShock 4 controller on iOS. Apple has realized its certification system was simply a pipe dream, one that was never fully realized. Sure, MFi still exists. Sure, iOS will likely support it for several more releases, but eventually Apple will obsolete it entirely or morph it into something that includes Sony and Microsoft’s controllers.

What that means for the consumer is great news. As of iOS 13, you can now grab your PS4 or Xbox One controller, pair it to iOS and begin gaming. However, it is uncertain exactly how compatible this will be in practice. Older games that only supported MFi may not recognize these controllers until they are updated for iOS 13. The problem here is that many projects have been abandoned over the years and their respective developers are no longer updating their apps. That means you could find that your favorite game doesn’t work with the PS4 or Xbox One controller if it is now abandoned.

Even though iOS 13 will support the controllers, it doesn’t mean that older games will. There’s still that problem to be solved. Apple could solve that by folding the controllers under the MFi certification system internally to make them appear as though they are MFi certified. I’m pretty sure Apple won’t do that. Instead, they’ll likely offer a separate system that identifies “third party” controllers separately from MFi certified controllers. This means that developers will likely have to go out of their way to recognize and use Sony and Microsoft’s controllers. Though, we’ll have to wait and see how this all plays out in practice.

Great News

Even still, this change is welcome news to iOS and tvOS users. This means that you don’t have to go out and buy some lesser controller and hope it will feel and work right. Instead, you can now grab a familiar controller that’s sitting right next to you, pair it up and begin playing on your iPad.

This news is actually more than welcome, it’s a necessity. I think Apple finally realizes this. There is no way Sony or Microsoft would ever cave to Apple’s pressures. In fact, there was no pressure at all really. Ultimately, Apple shot themselves in the foot by not supporting these two controllers. Worse, by not supporting these controllers, it kept the Apple TV from becoming the hopeful gaming system that Apple had wanted. Instead, it’s simply a set-top box that provides movies, music and limited live streaming services. Without an adequate controller, it simply couldn’t become a gaming system.

Even the iPad and iPhone have been suffering without good, solid controllers. Though, I’m still surprised that Apple itself hasn’t jumped in and built its own Apple game controller. You’d think that if they set out to create an MFi certification system, they’d have taken it to the next step and actually built a controller themselves. Nope.

Because Apple relied on third parties to fulfill its controller needs, it only really ever got one controller out of the deal. A controller that’s fair, but not great. It’s expensive, but not that well made. As I said above, it’s the SteelSeries Nimbus. It’s a mid-grade controller that works fine in most cases, but cannot hold a candle to the PS4’s or the Xbox One’s controller for usability. Personally, I always thought of the Nimbus controller as a “tide me over” controller until something better came along. That never happened. Unfortunately, it has taken Apple years to own up to this mistake. A mistake that they’ve finally decided to rectify in iOS 13.

A little late, yes, but well done Apple!


How much data does it take to update my PS4 or Xbox One or Switch?

Posted in computers, updates, video game console by commorancy on May 10, 2018

It seems this is a common question regarding the most recent gaming consoles. Let’s explore.

Reasons?

  • If the reason you are asking this question is because you’re concerned with data usage on your Internet connection or if your connection is very slow, you’ll find that this answer will likely not satisfy you. However, please keep reading.
  • If the reason you are asking this question is because you want to predict the amount of data more precisely, then skip down to the ‘Offline Updates’ section below.
  • If the reason you are asking this question is because you’re simply curious, then please keep reading.

Xbox One, PS4 and Switch Update sizes

The PS4, Xbox One and Switch periodically patch and update their console operating systems for maximum performance, to squash bugs and to improve features. However, this process is unpredictable and can cause folks who are on metered Internet connections no end of frustration.

How much data will it need to update?

There is no way to know … let’s pause to soak this in …

How much data is needed is entirely dependent on how recently you’ve updated your console. For example, if you’ve kept your console up to date all along the way, the next update will only be the size of the newest release. With that said, there’s no way to gauge even that size in advance. Neither Microsoft, Sony nor Nintendo publishes update sizes in advance. They are the size they are. If an update fixes only a small set of things, it could be 50-100 megabytes. If it’s a full blown point release (5.0 to 5.1), it could be several gigabytes in size. If it’s a smaller release, it could be 1GB.

If your console is way out of date (i.e., if you last turned it on 6 months ago), your console will have some catching up to do. This means that your update may be larger than someone who updates their console every new update. This means that if the base update is 1GB, you might have another 1GB of catch up before the newest update can be applied. This catch-up update system applies primarily to the Xbox One and not to the PS4 or Switch.

Xbox One vs PS4 vs Switch Update Conventions

Sony and Nintendo both use a bit more of a one-size-fits-all update process compared to Microsoft. Because of this, we’ll discuss the Xbox One first. Since the Xbox One is based, in part, on Windows 10, it follows the same update conventions as Windows 10. However, because the Xbox One also uses other embedded OSes to drive other parts of the console, those pieces may also require separate updates of varying sizes. This means that when the Xbox One updates, it scans the system for currently installed software versions, then downloads everything needed to bring all of those components up to date.

Sony and Nintendo, on the other hand, don’t seem to follow this same convention. Instead, the Switch and PS4 typically offer only point-release updates. This means that everyone gets the same update at the same time in one big package. In this way, it’s more like an iPhone update.

For full point-release updates, the Xbox One also works this same way. For interim updates, it all depends on what Microsoft chooses to send out compared to what’s already on your Xbox One. This means that the Xbox One can update more frequently than the PS4 by keeping underlying individual components updated more often if Microsoft so chooses. This is why the Xbox One can offer weekly updates whereas the PS4 and the Switch typically offer quarterly or, at least, much less frequent updates.

Size of Updates

If you want to know the size of a specific update, you have to begin the update process. This works the same on the PS4, the Xbox One or the Switch. This means you have to kick off the update. Once you do this, the download progress bar will show you the size of the download. This is the only way to know how big the update is directly on the console.

However, both the PS4 and the Xbox One allow you to download your updates manually via a web browser (PC or Mac). You can then format a memory stick, copy the files to USB and restart the console in a specific way to apply the updates. This manual process still requires you to download the updates in full and, thus, uses the same bandwidth as performing this action on the console. This process requires you to also have a sufficiently sized and properly formatted USB memory stick. For updating the PS4, the memory stick must be formatted exFAT or FAT32. For updating the Xbox One, it must be formatted NTFS. The Nintendo Switch doesn’t provide offline updates.

Cancelling Updates in Progress

The Xbox One allows you to cancel a system update in progress by unplugging the LAN cable and/or disconnecting WiFi, then turning off the console. When the console starts up without networking, you can continue to play games on your console, but you will not be able to use Xbox Live because of the lack of networking.

Once you plug the network back in, the system will again attempt to update. Or, you can perform an offline update with the Xbox One console offline. See Offline Updates just below.

You can also stop the PS4 download process by going to Notifications, selecting the download, pressing the X button and selecting ‘Cancel and Delete’ or ‘Pause’. Note, this feature is available on PS4 firmware 5.x or newer. If your PS4 firmware is super old, you may not have this option in the Notifications area. You will also need to go into settings (Xbox One or PS4) and disable automatic updates, otherwise the console could download these without you seeing it.


With that said, you cannot stop system updates on the Nintendo Switch once they have begun. Nintendo’s downloads are usually relatively small anyway. Trying to catch them in progress and stop them may be nearly impossible. It’s easier to disable automatic downloads in the console’s settings and prevent them from starting in the first place.

Also note, any of the consoles may still warn you that an update is available and prompt you to update your console even if you have disabled automatic software downloads.

*This setting on the Nintendo Switch may exclude firmware updates; your mileage may vary.

Offline Updates

Xbox One

The Xbox One allows you to update your system offline using a Windows PC. This type of update is not easily possible with a Mac. macOS can read NTFS natively, but it cannot write to or format NTFS drives without third-party tools (e.g., Tuxera NTFS for Mac).

To use the Offline System Update, you’ll need:

  • A Windows-based PC with an Internet connection and a USB port.
  • A USB flash drive with a minimum 4 GB of space formatted as NTFS.

Most USB flash drives come formatted as FAT32 and will have to be reformatted to NTFS. Note that formatting a USB flash drive for this procedure will erase all files on it. Back up or transfer any files on your flash drive before you format the drive. For information about how to format a USB flash drive to NTFS using a PC, see How to format a flash drive to NTFS on Windows.

  1. Plug your USB flash drive into a USB port on your computer.
  2. Open the Offline System Update file OSU1.
  3. Click Save to save the console update .zip file to your computer.
  4. Unzip the file by right-clicking on the file and selecting Extract all from the pop-up menu.
  5. Copy the $SystemUpdate file from the .zip file to your flash drive.
    Note: The files should be copied to the root directory, and there shouldn’t be any other files on the flash drive.
  6. Unplug the USB flash drive from your computer.
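The root-directory rule in step 5 is the part people most often get wrong. As a hedged illustration (the helper name and messages are my own, not Microsoft tooling), a few lines of Python can sanity-check the drive layout before you unplug it:

```python
import os

def check_osu_drive(root):
    """Sanity-check a flash drive prepared for an Xbox One offline update.

    Per the steps above, the drive's root should contain only the
    $SystemUpdate folder extracted from the OSU1 .zip -- nothing else.
    Returns (ok, message).
    """
    entries = os.listdir(root)
    if entries != ["$SystemUpdate"]:
        return False, "root should contain only $SystemUpdate, found: %s" % entries
    if not os.path.isdir(os.path.join(root, "$SystemUpdate")):
        return False, "$SystemUpdate must be a folder, not a file"
    return True, "drive layout looks correct"
```

Point it at the drive letter of your flash drive (e.g. `check_osu_drive("E:\\")`) after copying the files over.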

PlayStation 4

You can also update your PS4 console offline using Sony’s system updates. Here’s the procedure for PS4 offline updates. Note, the USB memory stick must be formatted either exFAT or FAT32. The PS4 doesn’t support any other types of stick formats. This means, if you buy a USB stick intended to be used on Windows, you will need to reformat it properly before you can use it on the PS4.

Update using a computer

For the standard update procedure, follow the steps below.

The following things are needed to perform the update:

  • PlayStation 4 system
  • Computer connected to the Internet
  • USB storage device, such as a USB flash drive, with approximately 460 MB of free space

  1. On the USB storage device, create the folders for saving the update file. Using a computer, create a folder named “PS4”. Inside that folder, create another folder named “UPDATE”.
  2. Download the update file and save it in the “UPDATE” folder you created in step 1. Save the file with the file name “PS4UPDATE.PUP”.
  3. Connect the USB storage device to your PS4 system, and then from the function screen, select Settings > [System Software Update]. Follow the on-screen instructions to complete the update.

If your PS4 system does not recognize the update file, check that the folder names and file name are correct. Enter the folder names and file name in single-byte characters using uppercase letters.
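Since the uppercase folder and file names are the usual failure point, here is a small Python sketch (the helper is my own illustration, not Sony tooling) that builds the expected layout on a mounted USB stick:

```python
import os

def prepare_ps4_update_path(drive_root):
    """Create the exact folder layout the PS4 looks for on a USB stick:
    <drive>/PS4/UPDATE/PS4UPDATE.PUP -- folder and file names must be
    uppercase or the console won't recognize the update file."""
    update_dir = os.path.join(drive_root, "PS4", "UPDATE")
    os.makedirs(update_dir, exist_ok=True)
    # Save the downloaded update file at the returned path.
    return os.path.join(update_dir, "PS4UPDATE.PUP")
```

After running it against the stick's mount point, copy the downloaded update file to the returned path and proceed with the on-console steps above.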

Nintendo Switch Updates

Nintendo doesn’t offer offline updates at all. The Nintendo Switch only supports Internet updates. There is currently no way to download or update your Switch via USB stick or SD card. The Nintendo Switch is the newest of the consoles, so it’s possible that Nintendo could offer an offline update mechanism some time in the future. However, knowing Nintendo, don’t hold your breath for this feature.

Offline Updates are Point Release Only

These offline update processes apply point-release updates only and not interim updates. Interim updates must still be applied directly from the console. Interim updates scan your system, find what’s needed, then download the patches. This can only be performed on the console. This means you could find that after installing a point release, the Xbox One may still require an additional update or two.

Updates and Internet Connectivity

Game consoles require updates to keep them current. The primary reason for most updates is to keep your games and your friends’ games in sync when playing multiplayer. This prevents one player from having a network edge over another. When all game consoles are running the same version, all multiplayer activities are on the same playing field.

For this reason, Xbox Live and the PlayStation Network (PSN) require all users to update to use networking features. If you decline or postpone any updates, both the Xbox One and the PS4 will deny you access to networking features. You must update both the console and the games to continue using networking.

If you don’t intend to use the network features such as multiplayer or leaderboards, then you don’t need to worry about this. However, if you’re not using the networking features, then there’s no reason to buy Xbox Live or PSN. So far, Nintendo doesn’t yet offer a paid network service like Xbox Live or PSN, but as soon as they do, I’m quite sure they will enforce the same requirements.

Pushing off Updates

While you can postpone updates to your console, it’s not always the best idea. I get that some people are on metered networking connections and can’t afford to download 20 GB updates. But, at the same time, this is how consoles work. If you’re looking for a console that supports offline updates, then you’ll want to look at the PS4 or the Xbox One. You might want to skip the Switch if this is a showstopper for you.

As we move into the future, these consoles will continue to assume more and more connectivity is always available. Don’t be surprised to find that both the Xbox One and PS4 discontinue their offline update feature at some point in the future.

Though, Sony will still need to provide a way to install the operating system when a hard drive is replaced. However, that won’t help you with updating your console offline.

If you have a reason to want to know your download sizes more precisely, other than what I mention above, please leave a comment below and let me know.


Can the Xbox One catch up to the PS4 this year?

Posted in business, video gaming by commorancy on August 21, 2015

We all know that Sony’s PS4 has outsold the Xbox One fairly substantially. However, will moving into this holiday season help or hurt the Xbox One? Let’s explore.

Halo 5

In October, we will see Halo 5, the next installment in the franchise, released. This is unusual in that Halo titles usually release in November. I’m assuming that Microsoft is attempting to gain an early head start in console sales. I’m also certain that Microsoft is hoping that Halo 5 (an Xbox One exclusive title) will push consoles off the shelves. The problem, however, is UltraHD 4K.

4K TVs and Consoles

With HDTVs rapidly dropping in price, especially 4K TVs (there are several sub-$1000 models), this spells a big problem for console manufacturers. I’m sure no one expected 4K TV prices to drop this rapidly this soon. Neither the Xbox One, the PS4, nor the Wii U currently supports 4K content or 4K TVs. This is shaping up into a much bigger problem, especially for Microsoft and Sony. Without the ability to deliver 4K content to these sub-$1000 4K TVs, many people will be hard pressed to justify the investment in a $500 console that doesn’t support 4K. So, not even Halo 5 may be able to budge many of those consoles off the shelves, at least not to existing Xbox One owners.

Personally, I’m not planning on investing in any new console systems until there’s 4K support. When Sony and Microsoft can finally get off their collective butts and release a console with 4K support over HDMI 2.0 (with HDCP 2.2), I will definitely consider replacing my existing consoles, but not until that happens.

Of course, I already own a PS4 and Xbox One. I got both day one, but I’ve recently bought a 4K TV. Barring Netflix and Amazon, there’s effectively no 4K content. Still, it does make my 1080p content look amazingly clear without all that annoying pixelation so common in 1080p TVs.

Console Purchasing and the Holidays

Because 4K TVs are now becoming more commonplace and because 1080p TVs will likely be mostly a distant memory in even just 2 years, it’s hard to justify a $500 expense only to replace it in 6 months or a year. It’s not worth it. Additionally, you can buy a video game at any time after it’s released, but it doesn’t have to be on day one. You can just as easily play Halo 5 in spring of 2016 as you can in the fall of 2015. Yes, there are a lot of day-oners out there (must have it the moment it’s released), but because of the deluge of titles in the fall, it’s easy to pick and choose which ones to leave for later. This means you can delay that console purchase or buying that game until the 4K version arrives.

Yes, Halo 5 will push some consoles off the shelves, but those looking for a 4K version will likely wait. I’m definitely waiting for the console refresh from Sony and Microsoft. For whatever reason, both of these companies are taking their sweet time providing it. In fact, Sony should have pushed out this refresh as part of the fall game launch. Sony being at the forefront of the 4K revolution makes it all the more important for Sony to finally get this refresh out the door, and even more important to get it out for holiday purchases, even if we can’t take advantage of the 4K content yet. I know that Sony’s video-on-demand service for the Sony 4K UltraHD Media Player already offers a very large number of 4K movies. There’s no reason not to get this technology into the PS4 and widen that audience. Not only would it widen the audience for their movie services, it would also immediately widen their game-playing audience. In this case, were Sony to release a 4K refresh faster than Microsoft, Sony would have a tremendous advantage both in sales and in gaming.

Sales Advantage

It’s clear that whichever company gets its 4K refresh out faster will have a sales advantage. As I said, considering Sony’s involvement in 4K, it makes perfect sense for Sony to get this refresh out now.

I don’t believe even Halo 5 sales could compete with a Sony 4K hardware refresh. People would think twice about buying an Xbox One until Microsoft also provided a 4K Xbox One refresh of their own. Should Sony release first, it would push Sony’s PS4 much higher in sales numbers because many existing PS4 owners would immediately replace their existing PS4. I know I would. So, that means double sales: sales to everyone who already has a PS4 and to those who don’t. Of course, this would happen with the Xbox One as well once their 4K refresh is available.

Though, should the Xbox One and PS4 4K editions release together, I would still buy the PS4 version first unless Microsoft released the Xbox One 4K version with a 4K 60Hz playable version of Halo 5. There is currently no franchise title that Sony owns that is that compelling. But, were Black Ops III or Fallout 4 to support 4K, I’d be hard pressed not to consider a 4K PS4.

I personally believe that Sony is currently more likely to release a 4K refresh sooner than Microsoft. Microsoft doesn’t embrace new technologies quickly, especially when Sony is one of the primary proponents of that new technology.

Ultra HD 4K Content

Today, there’s not much 4K content. The drought of 4K content is about as severe as California’s rainfall levels. This can all change with a console refresh. Consoles are quickly becoming the ubiquitous media outlet for the home, especially for children. With a console refresh from Sony, that immediately picks up a relatively large number of 4K movies. With the addition of developers taking advantage of 4K gaming, that opens up a huge new door (literally pixel-wise). While that number of pixels is immense, it offers a brand new immersive level of gaming that hasn’t yet been achieved. Yes, it requires producing much bigger content, but the games will be spectacular, the environments breathtaking and the realism levels achieved would be astounding.

The problem today is that most developers can’t even grasp 1080p. So, I do not expect 4K gaming any time soon. Perhaps from the Call of Duty brand and possibly from Microsoft’s Halo (if 343 can figure it out). But smaller companies like Atlus, and even larger ones like Bethesda, struggle with high def gaming. If we can get one HD title out of a developer per year, I consider that a win. With Ultra HD 4K content, I’d expect it might even take 2 years per title. It would suck not having a new game every year, but 4K is where we’re going, and Sony, Microsoft, Bethesda, Ubisoft, EA, Square Enix and the rest would do best to take heed. Not only does gaming want 4K, we need it to move forward. In fact, it should have been included in the original PS4, as Sony already had a 4K TV available at the time the PS4 was released. If Sony had had the foresight to create the PS4 with 4K, I wouldn’t even be writing this article.

Ultra HD’s Time Has Come

Sony, release your 4K refresh with the Ultra HD Blu-ray spec. Microsoft, release your refresh with a 4K Halo 5. Because these two consoles are on the cusp of 4K, I’m anxiously awaiting their release. I won’t consider a new console purchase until these are out. Because they are so close, I would suggest you wait also. I would love to see any 4K console refresh for this holiday season. I’d love to see Halo 5 running in 4K. In fact, I’d love to play pretty much any of this holiday season’s games, including Fallout 4, Halo 5, Black Ops III, Just Cause 3 and Star Wars, in 60Hz 4K. That would be an amazing holiday gift this season.

Windows 8 PC: No Linux?

Posted in botch, computers, linux, microsoft, redmond, windows by commorancy on August 5, 2012

According to the rumor mill, Windows 8 PC systems will ship with a new BIOS replacement using UEFI (the extension of the EFI standard). This new boot system apparently includes a secure booting mechanism that, some say, will be locked to Windows 8 alone. On the other hand, the Linux distributions are not entirely sure how the secure boot systems will be implemented. Are Linux distributions being prematurely alarmist? Let’s explore.

What does this mean?

For Windows 8 users, probably not much. Purchasing a new PC will be business as usual. Assuming UEFI secure boot cannot be disabled or reset, it means you can’t load another operating system on the hardware. Think of locked and closed phones and you’ll get the idea. For Linux, that would mean the end of Linux on PCs (at least, not without Linux distributions jumping through some secure booting hoops). Ok, so that’s the grim view of this. However, for Linux users, there will likely be other options. That is, buying a PC that isn’t locked. Or, alternatively, resetting the PC back to its factory default state of being unlocked (which UEFI should support).

On the other hand, dual booting may no longer be an option with secure boot enabled. That means it may not be possible to install both Windows and Linux onto the system and choose to boot one or the other at boot time. Then again, we do not know if Windows 8 requires UEFI secure boot to boot or whether it can be disabled. So far it appears to be required, but if you buy a boxed retail edition of Windows 8 (which is not yet available), it may be possible to disable secure boot. It may be that some of the released to manufacturing (OEM) editions require secure boot. Some editions may not.

PC Manufacturers and Windows 8

The real question here, though, is what’s driving UEFI secure booting?  Is it Windows?  Is it the PC manufacturers?  Is it a consortium?  I’m not exactly sure.  Whatever the impetus is to move in this direction may lead Microsoft back down the antitrust path once again.  Excluding all other operating systems from PC hardware is a dangerous precedent as this has not been attempted on this hardware before.  Yes, with phones, iPads and other ‘closed’ devices, we accept this.  On PC hardware, we have not accepted this ‘closed’ nature because it has never been closed.  So, this is a dangerous game Microsoft is playing, once again.

Microsoft anti-trust suit renewed?

Microsoft should tread on this ground carefully.  Asking PC manufacturers to lock PCs to exclusively Windows 8 use is a lawsuit waiting to happen.  It’s just a matter of time before yet another class action lawsuit begins and, ultimately, turns into a DOJ antitrust suit.  You would think that Microsoft would have learned its lesson by its previous behaviors in the PC marketplace.  There is no reason that Windows needs to lock down the hardware in this way.

If every PC manufacturer begins producing PCs that preclude the loading of Linux or other UNIX distributions, this treads entirely too close to antitrust territory for Microsoft yet again. If Linux is excluded from running on the majority of PCs, this is definitely not wanted behavior. This rolls us back to the time when Microsoft used to lock the loading of Windows onto the hardware over every other operating system on the market. Except that last time, nothing stopped you from wiping the PC and loading Linux; you just had to pay the Microsoft tax to do it. At that time, you couldn’t even buy a PC without Windows. This time, according to reports, you cannot even load Linux with secure booting locked to Windows 8. In fact, you can’t even load Windows 7 or Windows XP, either. Using UEFI secure boot on Windows 8 PCs treads within millimeters of the same collusive behavior that Microsoft was called on many years back, and ultimately went to court over and lost much money on.

Microsoft needs to listen and tread carefully

Tread carefully, Microsoft.  Locking PCs to running only Windows 8 is as close as you can get to the antitrust suits you thought you were done with.  Unless PC manufacturers give ways of resetting and turning off the UEFI secure boot system to allow non-secure operating systems, Microsoft will once again be seen in collusion with PC manufacturers to exclude all other operating systems from UEFI secure boot PCs.  That is about as antitrust as you can get.

I’d fully expect to see Microsoft (and possibly some PC makers) in DOJ court over antitrust issues. It’s not a matter of if, it’s a matter of when. I predict that by early 2014 another antitrust suit will have materialized, assuming the predictions about how UEFI works come true. On the other hand, this issue is easily mitigated by UEFI PC makers allowing users to disable UEFI secure boot to allow a BIOS-style boot and Linux to be loaded. So, the antitrust suits will hinge entirely on how flexible the PC manufacturers make UEFI secure booting. If both Microsoft and the PC makers have been smart about this change, secure booting can be disabled. If not, we know the legal outcome.

Virtualization

For Windows 8, it’s likely that we’ll see more people moving to use Linux as their base OS with Windows 8 virtualized (except for gamers where direct hardware is required).  If Windows 8 is this locked down, then it’s better to lock down VirtualBox than the physical hardware.

Death Knell for Windows?

Note that should the UEFI secure boot system be as closed as predicted, this may be the final death knell for Windows and, ultimately, Microsoft. The danger is in the UEFI secure boot system itself. UEFI is new and untested in the mass market. This means that not only is Windows 8 new (and we know how that goes bugwise), we now have an entirely new, untested boot system in secure boot UEFI. This means that if anything goes wrong in this secure booting system, Windows 8 simply won’t boot. And believe me, I predict there will be many failures in the secure booting system itself. The reason: we are still relying on mechanical hard drives that are highly prone to partial failures. Even though solid state drives are better, they can also go bad. So, whatever data the secure boot system relies on (i.e. decryption keys) will likely be stored somewhere on the hard drive. If that sector of the hard drive fails, no more boot. Worse, if this secure booting system requires an encrypted hard drive, that means no access to the data on the hard drive after a failure, ever.

I’d predict there will be many failures related to this new UEFI secure boot that will lead to dead PCs. But not only dead PCs: PCs that offer no access to the data on their hard drives. So people will lose everything on their computers.

As people realize this aspect of local storage on an extremely closed system, they will move toward cloud service devices to prevent data loss. Once they realize the benefits of cloud storage, the appeal of storing things on local hard drives, and most of the reasons to use Windows 8, will be lost. Gamers may be able to keep the Windows market alive a bit longer. Indeed, this is why a gaming company like Valve Software is hedging its bets and releasing Linux versions of its games. For non-gamers, desktop and notebook PCs running Windows will be less and less needed and used. In fact, I contend this is already happening. Tablets and other cloud storage devices are already becoming the norm. Perhaps not so much in the corporate world as yet, but once cloud-based office suites get better, all bets are off. So, combined with the already trending move toward limited-storage cloud devices, closing down PC systems in this way is, at best, one more nail in Windows’ coffin. At worst, Redmond is playing Taps for Windows.

Closing down the PC market in this way is not the answer.  Microsoft has stated it wants to be more innovative, as Steve Ballmer recently proclaimed.  Yet moves like this prove that Microsoft has clearly not changed and has no innovation left.  Innovation doesn't have to, and shouldn't, lead to closed PC systems and antitrust lawsuits.

Clickity Click – The Microsoft Dilemma

Posted in computers, microsoft, windows by commorancy on April 30, 2010

Once upon a time, the mouse didn’t exist. So, the keyboard drove the interface. Later, Xerox came along and changed all of that (with the help of Steve Jobs and Apple). Of course, as it always does, Microsoft absconded with the mouse functionality and built that into Windows… not that there was really much choice with this decision.

We flash forward a decade or so and we're at Windows XP, a reasonably streamlined Windows operating system from Microsoft. In fact, this is arguably the most streamlined that Microsoft's Windows has ever been (and will likely ever be). Granted, security was a bit weak, but the user interface experience was about as good as it gets. With only a few clicks you could get to just about anything you needed.

Flash forward nearly another decade to the release of the dog that was Windows Vista. Actually, Windows Vista's look was not too bad. But that's pretty much where it ends. Microsoft must not have done much usability testing with Vista, because what used to take one or two clicks of the mouse now takes 1-3 extra clicks. The reason: Microsoft decided to open useless windows as launchpads to get to the underlying components. Added layers that are pointless and unnecessary. For example, you used to be able to right-click 'My Network Places', choose Properties and get right to the LAN adapters to set them up. No more. Now that same Properties panel opens a launchpad interface that requires clicking 'Change Adapter Settings' just to get to the adapters. Pointless. Why was this added layer necessary? And this is the best of the worst.

More than this, though, is that sometimes the labeling of the links to get to the underlying components is obscure or misleading. So, you’re not really sure what link to click to get to the thing you need. That means you end up clicking several things just to find the thing you need. Yes, you can use the help to find things, but that then means opening even more windows and clicking through even more time wasting events just to locate something that should have been one-click anyway.

Server Operating Systems

This issue is not limited to the desktop OS world. In the server world, such as Windows 2008 R2, these launchpads are now excessive and in-your-face. For example, when you first install Windows 2008 R2, two of these panels open as the very first thing after you log in. So now I'm starting out having to close two windows that I didn't even need to see at that point, just to get to the desktop. Likely, if you're installing a server operating system, you're planning on joining it to a domain. So, setting up anything on the local administrative user is pointless. That means I have to close out of these useless panels in order to get to the panel where I can join the machine to the domain. It would have been far more helpful for the first thing to open to be the join-the-domain panel. I don't need to set up anything else on that newly installed machine until it's in the domain.

Desktop Systems

Most people are much more familiar with the desktop operating systems than the server versions. But these added clicks are all throughout not only Vista, but now Windows 7. Because Windows 7 is effectively a refresh of Vista with added compatibility features, these extra clicks are still there and still annoying. Why Microsoft had to take a streamlined interface and make it less efficient for users, I'll never know. But these added clicks to get to standard operating system tools are a waste of time and productivity. They also impose a higher learning curve in teaching people the new method.

“If it ain’t broke, don’t fix it”

This motto needs to be ingrained into the engineering team at Microsoft because they clearly do not understand it. Adding extra layers of windows does not make the OS more efficient. It makes it bloated, cumbersome and extremely inefficient. That extra click might only take an extra second, but those seconds add up when you're doing a repetitive task that involves those windows as part of your job.
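To put a number on those wasted seconds, here's a quick back-of-the-envelope calculation. The figures (3 extra clicks at about a second each, 200 repetitions a day, a 250-day work year) are my own illustrative assumptions, not measurements:

```python
# Back-of-the-envelope cost of launchpad clicks, using made-up but
# plausible numbers for a heavy repetitive-task user.
extra_seconds_per_task = 3 * 1   # 3 extra clicks, ~1 second each
tasks_per_day = 200
work_days_per_year = 250

lost_hours = extra_seconds_per_task * tasks_per_day * work_days_per_year / 3600
print(f"{lost_hours:.0f} hours lost per user per year")  # about 42 hours
```

Roughly a full work week per user per year, under these assumptions, spent clicking through windows that didn't exist in XP.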

As another example, opening the default setup for control panel in XP shows the control panels themselves. In Vista / Windows 7, it now brings up a launch pad of abstract tasks. Tasks like ‘System and Security’ and ‘User accounts and family safety’. Clicking these leads to more sub concept tasks. So, instead of just showing the actual control panels, you have to click through a series of abstract task pages that ultimately lead you to a tool. No no no. Pointless and inefficient. Let’s go back to opening up the actual control panel view. I want to see the actual control panels. The abstract task idea is great for beginners. For advanced users, we want to turn this crap off. It’s pointless, slow and unnecessary. Power users do not need this.

Windows for beginners

Microsoft Bob is dead. Let's not go there again. So, why is it that Microsoft insists on spoon-feeding us an interface this dumbed down, with such excessively clicky features? This interface would have been a great first step in 1990, but not today. Taking this step today is a step backward in OS design. Anyone who has used Windows for more than 6 months doesn't need these added inconveniences and inefficiencies. In fact, most average computer users don't need this level of basics. Only the newest of beginners need this level of spoon-feeding.

Microsoft needs to create a new version (or, alternatively, a preference to turn 'newbie mode' off). Simply put, they need Windows for the Power User: a stripped-down design that gets back to basics. A design that eliminates these cumbersome beginner features and brings back the single-click features that let us navigate the operating system fast, using the mouse efficiently (read: as few clicks as possible). Obviously, if you're running ANY server edition, that should automatically imply the power-user interface. Let's get rid of these helper panels on startup, mkay?

Microsoft, can we make this a priority for Windows 8?

Update: iTunes 9 and Windows 7

Posted in Apple, itunes by commorancy on October 29, 2009

As an update to an earlier Randosity article, I have upgraded my system to Windows 7 and then installed iTunes 9. Since making this change, I am no longer having the registry issue documented in this previous Randosity article. So, it may be worthwhile to upgrade your system to Windows 7 to alleviate this issue. Of course, it could be a fluke, but iTunes installed and started up without any issues on Windows 7. Before you upgrade, though, you’ll want to remove iTunes from your system, then run the upgrade to Windows 7, then reinstall iTunes 9. If you still experience registry issues with Windows 7 and iTunes 9, refer to this previous article for tips on what to do.

What’s wrong with Vista / Windows?

Posted in microsoft, tanking, windows by commorancy on July 6, 2009

This post comes from a variety of issues that I’ve had with Vista (specifically Vista 64 Home Premium).  And, chances are, these problems will not be resolved in Windows 7.  Yet, here they are in all their glory.

Memory Leaks

Vista has huge and horrible memory leaks.  After using Vista for a period of time (a week or two without a reboot) and using a variety of memory-intensive 3D applications (Daz Studio, Carrara, The Gimp and Poser, just to name a few), the system's memory usage goes from 1.69GB to nearly 3GB.  To answer the burning question… yes, I have killed all apps completely and I am comparing empty system to empty system.  Worse, there is no way to recover this memory short of rebooting.  If you had ever wondered why you need to reboot Windows so often, this is the exact reason.  This alone is why Windows is not considered 'stable' by any stretch and why UNIX outperforms Windows.
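For anyone wanting to verify a leak like this rather than eyeball it, the pattern can be checked mechanically: sample idle memory usage over several days and see whether the baseline only climbs. The sketch below uses made-up readings and a threshold I chose arbitrarily; on a real Vista box you would collect the numbers from Task Manager or a perfmon counter instead:

```python
def leak_trend(samples, tolerance_mb=50):
    """Given idle-system memory readings in MB (each taken after closing
    all apps), report total growth and whether it exceeds the tolerance."""
    if len(samples) < 2:
        return 0.0, False
    growth = samples[-1] - samples[0]
    return growth, growth > tolerance_mb

# Hypothetical once-a-day idle readings over a week (MB in use):
readings = [1690, 1820, 2050, 2300, 2610, 2880, 2990]
growth, leaking = leak_trend(readings)
print(f"baseline grew {growth} MB; leak suspected: {leaking}")
# prints: baseline grew 1300 MB; leak suspected: True
```

If the baseline returns to roughly the same value after every workload, memory is being reclaimed; steady growth like the readings above is the signature of a leak.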

Startup and Shutdown

Microsoft plays games with both of these procedures.

On Startup, Microsoft’s engineers have tricked you into thinking the system is functional even when it isn’t.  Basically, once the desktop appears, you think you can begin working.  In reality, even once the desktop appears, you still cannot work.  The system is still in the process of starting up the Windowing interface on top of about 100 background services (on many of which the windowing interface relies).  This trick makes Windows appear snappier to start up than it really is.  In fact, I would prefer it to just ready the system fully, then present the Windowing interface when everything is 100% complete.  I don’t want these tricks.  When I see the windowing interface, I want to know I can begin using it immediately… not before.

On Shutdown, we have other issues.  With Vista, Microsoft Engineers have done something to this process to make it, at times, ridiculously slow.  I have seen 8-15 minute ‘Shutting Down’ screens where the hard drive grinds the entire time.  I’m sorry, but shutdown time is not housekeeping time.  That needs to be done when the system is running.  It should not be done during shutdown procedures.  A shutdown should take no more than about 1-2 minutes to complete flushing buffers to disk and killing all processes.  If it can’t be done in 1-2 minutes, shut the system down anyway as there is nothing that can be done to finish those tasks anyway.

Windows Updates

Microsoft was supposed to eliminate the need to shut down/reboot for most Windows updates.  For some updates, this is true.  For the majority of Windows updates, it is still not true.  In fact, Microsoft has, once again, made this process multistep and tediously slow.  Don't get me wrong, I'm grateful that they are now at least somewhat verbose about what's going on, but that doesn't negate the fact that it's horribly slow.  The steps now are as follows:

  1. The Windows installation process (downloading and installing through the Windows dialog box).  You think it's over when you…
  2. Restart the system, which finishes Step 2 of the process during shutdown… and then you think it's over again when…
  3. The system starts back up and goes through Step 3 of the update process.

Ok, I'm at a loss.  With Windows XP, we had two steps: the first during Windows Update and the second when the system starts back up.  Now with Vista, we have to introduce another step?

Windows Explorer

For whatever reason, Windows Explorer in Vista is horribly broken.  In Windows XP, you used to be able to configure your windows how you liked, then lock it in with Tools->Folder Options and then View->Apply to Folders.  This would lock in exactly how every window should appear (list or icon format, size of icons, etc.).  With Windows Vista, this is completely and utterly broken.  Basically, this functionality simply no longer works.  I've tried many, many times to lock in a format and Windows just randomly changes the folders back to whatever it feels like doing.

For example, I like my windows to look like this:

[Screenshot: Favorite Format]

Unfortunately, Windows has its own agenda.  If I open a file requester (the standard Vista requester… the one that looks like the above) and I change the view to ANY other style than the one above, this change randomly changes other folder views on the system permanently.  So, I might open the above folder and it will later look like any of these:

[Screenshot: Format Changed 1]

[Screenshot: Format Changed 2]

or even

[Screenshot: Format Changed 3]

All of which is highly frustrating.  So, I'll visit this folder later and see the entire headers have changed, or it's changed to icon format or some other random format.  Worse, though, is that I've specifically changed the folder to my favorite format with Tools->Folder Options.  In fact, I've gone through this 'permanent' change at least 3-4 times after random changes have happened, and inevitably it changes to some other format later.  Again, highly frustrating.

Access Denied / Enhanced Security

For whatever reason, Microsoft has made shortcuts to certain folders.  For example, in your profile directory they have renamed 'My Documents' to simply 'Documents'.  Yet, for whatever reason, Microsoft has created shortcuts that don't work.  For example, if I click on the 'My Documents' shortcut, I see 'Access Denied'.  I don't get why they would create a shortcut and then prevent it from working.

The only thing the enhanced security has done for Windows users is make it more of a problem to work.  Security goes both ways.  It helps protect you from malicious intent, but it can also get in the way of usability.  Security that ultimately gets in the way, like UAC, has failed to provide adequate security.  In fact, it has gone too far.  UAC is a complete and utter failure.  Combine this with tying nearly every security decision to the SYSTEM user (with practically zero privileges), and you get stupid, exasperating usability.

Filesystem

To date, Windows still relies heavily and ONLY on NTFS.  Linux has several filesystems to choose from (ReiserFS, XFS, JFS, Ext2, Ext3 and others).  This allows systems administrators to build an operating system that fits the application's needs.  For example, some filesystems perform better for database use than others.  On Windows, you're stuck with NTFS.  Not only is NTFS non-standard and proprietary, it also doesn't perform as well as it should under all conditions.  For database use, this filesystem is only barely acceptable.  It has hidden limits that Microsoft doesn't publish that will ultimately bite you.  Microsoft wants this to become a pre-eminent datacenter system, but that's a laugh.  You can't trust NTFS enough for that.  There are way too many hidden problems in NTFS.  For example, if you hit a random limit, it can easily and swiftly corrupt NTFS's MFT (master file table).  Once the MFT is corrupt, there's no easy way to repair it other than CHKDSK.  Note that CHKDSK is the ONLY tool that can truly and completely fix NTFS issues, and even CHKDSK doesn't always work.  Yes, there are third-party tools from Veritas and other companies, but these aren't necessarily any better than CHKDSK.  Basically, if CHKDSK can't fix your volume, you have to format and restore.

Note, however, that this isn't strictly a Vista issue.  This problem has persisted since the introduction of NTFS in Windows NT.  But Microsoft has made no strides toward offering better, more complete filesystems with better repair tools.  For example, ReiserFS and Ext3 both offer more complete repair tools than NTFS ever has.

Registry

The registry has got to be one of the most extensive hacks ever placed into any operating system.  This kludge of a database system is so completely botched from a design perspective that there's really nothing good to say.  Basically, this system needs to be tossed and redesigned.  In fact, Microsoft has a real database system in MSSQL.  There is no reason why the registry is not based on MSSQL rather than that stupid hack of a thing called a hive.  Whoever decided on this design… well, let's just hope they no longer work at Microsoft.
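To make the point concrete, here is a toy sketch of what registry storage might look like on top of a real relational engine. SQLite stands in for MSSQL here, and the schema and helper functions are entirely my own invention for illustration; this is not how Windows hives actually work:

```python
import sqlite3

# Toy relational model of a registry: one row per value, keyed by the
# full key path plus value name. A real engine would give you typed
# queries, transactions and integrity checks for free, unlike a hive.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE registry (
        key_path   TEXT NOT NULL,   -- e.g. HKCU\\Software\\Example
        value_name TEXT NOT NULL,
        value_type TEXT NOT NULL,   -- REG_SZ, REG_DWORD, ...
        value_data BLOB,
        PRIMARY KEY (key_path, value_name)
    )
""")

def set_value(key_path, name, vtype, data):
    conn.execute(
        "INSERT OR REPLACE INTO registry VALUES (?, ?, ?, ?)",
        (key_path, name, vtype, data),
    )

def get_value(key_path, name):
    row = conn.execute(
        "SELECT value_data FROM registry WHERE key_path = ? AND value_name = ?",
        (key_path, name),
    ).fetchone()
    return row[0] if row else None

set_value(r"HKCU\Software\Example", "InstallDir", "REG_SZ", r"C:\Example")
print(get_value(r"HKCU\Software\Example", "InstallDir"))  # C:\Example
```

With a design like this, a corrupt row is an isolated, repairable record rather than a damaged binary blob that takes the whole hive down with it.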

Failure

For the above reasons (and others), Microsoft has completely failed with Windows Vista.  This failure was already in the making, though, when Longhorn was announced ages ago.  In fact, Microsoft had planned even more draconian measures to enable heavy DRM on Windows.  Thankfully, that was removed from Vista.  But, what remains makes Vista so encumbered and exasperating to use, it’s no wonder users are frustrated using Vista.  Combining that with its incredibly large footprint (1.6GB of memory just to boot the OS), and you have a complete loser of an OS.

Windows 7 is a glimmer of hope, but it is still heavily tied to Vista.  If UAC and these stupid SYSTEM user security measures remain, then nothing will really change.  Microsoft needs to take Windows back to the drawing board and decide what is necessary and what isn’t.  Preventing the user from actually using the operating system is not and should not be a core value, let alone part of security.  Yet, here we are.

Microsoft, you need to take a look at the bigger picture.  This is your final chance to get Windows right.  There are plenty of other unencumbered operating systems out there that do not get in the way of desktop computing.  These operating systems are definitely a threat to Microsoft’s continued viability… especially with blundering mistakes like Vista.  Windows will never win any awards for Best Operating System with issues such as these.  Consider Microsoft’s stupid filesystem layout that allows operating system and application files to be thrown all over the hard drive and you’ll begin to understand why Windows continues to fail.

The single reason why Microsoft continues to exist is because users feel compelled to buy this antiquated dog of an operating system strictly due to application support.  If developers would finally and completely jump ship to other more thoughtfully designed operating systems, then Windows would finally wither and die… eventually, this will happen.

The Microsoft Botch — Part II

Posted in botch, microsoft, redmond, windows by commorancy on January 17, 2009

In a question to The Microsoft Botch blog article, jan_j on Twitter asks, “Do you think Microsoft is going down?”  In commentary to that question, I put forth this article.

I’ll start by saying, “No”.  I do not think that Microsoft is ‘going down’.  Microsoft is certainly in a bad way at this point in time, but they still have far too much market share with Windows XP, Windows 2000 and Windows 2003 server as well as Exchange and several other enterprise products.  So, the monies they are making off of these existing installations (and licenses) will carry them on for quite some time.  Combine that with Xbox Live and the licensing of the Xbox 360 games… Microsoft isn’t going anywhere for quite a while.  The real question to ask, though, is.. Is Microsoft’s userbase dwindling?  At this point, it’s unclear, but likely.  Since the Vista debacle, many users and IT managers have contemplated less expensive alternative installations including Linux.  The sheer fact that people are looking for alternatives doesn’t say good things about Microsoft.  

As far as alternatives go, MacOS X isn't necessarily less expensive than Windows, but it is being considered by some as one possible replacement for Windows.  Some people have already switched.  MacOS X may, however, be less expensive in the long term strictly due to maintenance and repair costs.  Linux can be less expensive than Windows (as far as installation software costs and continuing licenses), but it requires someone who's knowledgeable to maintain it.

In comparison…

To compare Microsoft to another company from the past, IBM comes to mind.  IBM was flying high with their PCs in the early days, but that quickly crumbled when IBM started botching things up.  That and PC clones took off.  To date, there has not been a Windows OS clone to compete head-to-head with Microsoft.  So, Microsoft has been safe from that issue.  But, Linux and MacOS X do represent alternative operating systems that do function quite well in their own environments.  Although, MacOS X and Linux interoperate poorly, in many specific cases, with Windows (primarily thanks to Microsoft).

Linux as a replacement

While it is possible to replace Windows with Linux and have a functional system, the Windows compatibility limitations become apparent rapidly.  Since most of the rest of the world uses Windows, Linux doesn't have fully compatible replacement software for the Windows world.  This is because of Microsoft's close-to-the-vest approach to software, combined with its tactic of releasing just enough information to allow only half-baked Windows compatibility.  Thus, Linux (and other non-Microsoft OSes) can't fully compete in a Windows world.  This is a 'glass half empty or half full' argument.  On its own, Linux interoperates well with other Linux systems.  But when you try to pair it with Windows, certain aspects just fall apart.

That doesn’t mean Linux is at fault.  What it usually means is that Microsoft has intentionally withheld enough information so as to prevent Linux from interoperating.  Note, there is no need to go into the gritty details of these issues in this article.  There are plenty of sites on the Internet that can explain it all in excruciating detail.

However, if your company or home system doesn’t need to interoperate with Windows, then Linux is a perfectly suitable solution for nearly every task (i.e., reading email, browsing, writing blogs, etc).  If, however, someone wants to pass you an Adobe Illustrator file or you receive a Winmail.dat file in your email, you’re kind of stuck.  That’s not to say you can’t find a workable solution with some DIY Linux tools, but you won’t find these out of the box.

This is not meant to berate Linux.  This is just a decision specifically by Microsoft to limit compatibility and interoperability of non-Microsoft products.  This decision by Microsoft is intentional and, thus, Windows is specifically and intentionally designed that way.

Microsoft’s days ahead

Looking at Microsoft’s coming days, it’s going to be a bit rough even when Windows 7 arrives.  If Windows 7 is based on Vista and also requires the same hardware requirements as Vista, Windows 7 won’t be any more of a winner than Vista.

Microsoft needs to do some serious rethinking.  They need to rethink not only how their products are perceived by the public, they need to rethink what they think is good for the public.  Clearly, Microsoft is not listening to their customers.  In Vista, Microsoft made a lot of changes without really consulting with their target userbase and, as a result, ended up with a mostly disliked operating system.

Apple, on the other hand, is able to introduce new, innovative tools that, instead of making life more of a hassle, simplify things.  Microsoft isn't doing this.

Rocky Road

While this flavor of ice cream might be appealing, Microsoft's road ahead won't be nearly so sweet.  They are heading into a few rocky years.  Combine their bad software design decisions with a bad economy and you've got a real problem.  Microsoft's problems, though, primarily stem from lack of vision.  The Windows roadmap is not clear.  Instead of laying out design goals for the next several revisions, Microsoft appears to be making it up as it goes along… all the while hoping that users will like it.  Their designers really do not have much in the way of vision.  The biggest change Microsoft made to Windows was the Start button.  That's probably the single most innovative thing Microsoft has done (and note that the Start button is not really that great a design anyway).

Microsoft forces everyone else to do it the Windows way

Microsoft’s main problem with Windows stems from its lack of interoperability between Windows and other operating systems.  While Windows always plays well with Windows (and other Microsoft products), it rarely plays well with other OSes.  In fact, Microsoft effectively forces the other OSes and devices to become compatible with Windows.  Apple has been the one exception to this with many of their products.  Apple has managed to keep their own proprietary devices mostly off of Windows (with the exception of the iPhone and iPods).   Even Apple has had to succumb to the pressures of Microsoft (with certain products) and compete in the Microsoft world even when Apple has its own successful operating system.  Note, however, that Apple’s softwares on Windows leave a lot to be desired as far as full compatibility goes.

Microsoft has an initiative to allow open source projects access to deeper Microsoft technologies, to allow for better compatibility between open source projects and Windows.  There are two sides to this 'access'.  On one side, it does help open source projects become more compatible.  On the other, the developer must sign certain legal agreements that could put the open source project in jeopardy if Microsoft were to press those agreements.  So, to get the interoperability, you accept a double-edged sword.

The tide is turning

Microsoft’s somewhat dwindling installations of Windows, lack of quality control and bungling of major products may lead more and more people away from Microsoft to more stable devices.  But, the market is fickle.  As long as people continue to generally like Microsoft products and solutions, Microsoft will never be gone.

Note, you can follow my Twitter ramblings here.
