Random Thoughts – Randocity!

What does Microsoft’s purchase of ZeniMax mean?

Posted in business, microsoft, Sony, video gaming by commorancy on October 28, 2020
Can the PS5 succeed?

I’ve had this question recently posed to me on a Twitch stream. Yes, I stream games on Twitch in addition to penning this blog. I haven’t cross-promoted my Twitch stream on this blog because blogging and gaming are mostly unrelated. However, if you’re interested in watching me game, please leave a comment below and I’ll post my Twitch channel. Let’s explore the answer to the above question.

Bethesda and Microsoft

Microsoft isn’t really a gaming company. It’s a software company that produces gaming products among all of its other hardware and software product lines. Sony is, likewise, not really a gaming company for a similar reason. Sony is mostly a content-producing company that also produces gaming hardware.

Anyway, Microsoft’s purchase of Bethesda’s parent company ZeniMax likely means eventual changes to all of Bethesda’s game franchises. In fact, I’m actually surprised that the FTC has allowed such a purchase considering the negative impact it will likely have on consumer choice.

Sony and Microsoft

Let’s start with the elephant in the room. Sony and Microsoft are rivals when it comes to gaming systems. Sony has the PlayStation and Microsoft has the Xbox. Because Microsoft owns the Xbox console, purchasing large gaming companies pushes this situation firmly into conflict-of-interest and reduced-consumer-choice territory. Additionally, buying ZeniMax before the PS5 has even really launched gives Microsoft an easy way to keep the PS5 from succeeding.

Why? Microsoft has designs on making the Xbox Series X more successful than the PS5. To do this, they want to lock Sony’s platform out of as much content as they can. How will this manifest with Bethesda’s games?

While the final outcome is entirely uncertain, the writing is on the wall. What I mean is that Microsoft may eventually make all of Bethesda’s newly released games exclusive to the Xbox. That means Bethesda’s game franchises (plural) may ultimately end up playable only on the PC and the Xbox console. Yes, that could mean both the Nintendo Switch and Sony’s PS5 are equally negatively impacted by this purchase.

Both Sony and Nintendo could find themselves without future Bethesda titles on their gaming platforms. That could mean no more Fallout, no more Elder Scrolls, no more Doom and no more Wolfenstein will make their way onto Sony or Nintendo’s platforms. It doesn’t stop there. Titles like Bethesda’s upcoming Starfield, which has yet to be released, could be pulled from release on both Sony and Nintendo’s platforms… leaving this game only available on PC and Xbox.

Sure, Microsoft may lose money by not releasing these games on non-Microsoft platforms, but it will more than make up for those lost game sales by pushing more Xboxes and PCs into the home. Eventually, games sold for those newly purchased Xboxes and PCs will more than make up for the lost sales on the other platforms. Basically, Microsoft has an easy way to do the dirty to both Sony and Nintendo as far as Bethesda games are concerned.

Microsoft is also well aware of the leverage they hold over the gaming industry by purchasing Bethesda. More than this, Microsoft can steer new consumers onto their Xbox line of consoles and away from Sony and Nintendo consoles strictly by enforcing Xbox Exclusives.

Exclusives

Bethesda isn’t the only studio on the planet. However, Bethesda is a large studio with many very cherished video game franchises… franchises that bring in a lot of cash and drive console purchases.

While Microsoft can make upcoming Bethesda games exclusive, it doesn’t necessarily have to take this step. However, knowing that Sony pretty much kicked Microsoft’s butt with the PS4’s sales, Microsoft isn’t eager to repeat that trend with the Xbox Series X. Purchasing ZeniMax gives Microsoft a definite edge. It also suggests Microsoft might be eyeing the purchase of Activision, EA, Rockstar and even Ubisoft. Don’t be surprised if Microsoft snaps up some of these additional game developers as well.

By purchasing large game studios like Bethesda, Microsoft can control which console becomes the dominant console this time around (i.e., theirs). This means even more exclusive Xbox games.

Exclusive games force consumers to buy specific hardware platforms to play these exclusive titles.

PS5

What does this news mean for a console like the PS5? It puts the PS5 at a severe sales disadvantage. Microsoft could simply direct Bethesda not to produce PS5 games. Without Bethesda’s support, the PS5 is left well behind in the upcoming next-gen gaming market.

This is part of the reason I am not purchasing a PS5 at this time. I’m waiting on how this plays out. Bethesda’s ownership by Microsoft means a very real possibility of future exclusive Xbox titles from Bethesda, with no releases on the PS5 or the Nintendo Switch.

This change would put Sony and Nintendo at a clear sales disadvantage. Sony would have to rely not on Bethesda games to drive the PS5’s sales, but on Sony Studios releases… games Sony has developed itself or through studios it owns (i.e., Sucker Punch).

That doesn’t mean the PS5 will be worthless, but it means that the future of Bethesda’s games being released on the PS5 has become very unclear. In fact, I’d use the word “muddy” to describe these waters.

Here are some questions that come out of the above:

  1. Should I buy an Xbox Series X or a PS5? The answer to this question depends entirely on what Microsoft has planned for Bethesda. If Microsoft intends to turn all future Bethesda releases into Xbox exclusives, then the answer is: buy an Xbox Series X. Even if it doesn’t, I’d still recommend the Xbox Series X, because there’s zero chance of losing Bethesda games on the Xbox, while there’s a high probability the PS5 will lose Bethesda’s future games. The even larger answer also depends on whether Microsoft plans to buy more large game studios.
  2. Will Bethesda lose money? No. Microsoft has deep, deep pockets. It can withstand any short-term losses from making Bethesda’s games exclusive to the Xbox, and it can recoup those losses over the long term by selling new Xbox consoles and exclusive Bethesda games. The more consoles Microsoft sells, the more games it can sell.
  3. Will Microsoft force Bethesda to make exclusives? Yes, they will. This is guaranteed. The question is, which games will be forced into this category? That’s still unclear. Will it be only some of Bethesda’s games, all of them, new games only or some combination of these? We don’t know. However, I can guarantee at least one of Bethesda’s games will be released as an Xbox exclusive. My guess is that most of Bethesda’s games will become exclusives.
  4. What about existing Bethesda games? What happens to these? Microsoft isn’t stupid. It will allow existing games to continue to be sold and to operate on the PS4 and other older non-Microsoft consoles. Microsoft won’t rock that boat. Instead, it will look at upcoming, unreleased games and turn those into the exclusives.

As a result of these questions and answers, it’s clear that if you love Bethesda’s games and wish to play upcoming Bethesda franchises, you may want to wait before investing in one of these new consoles. It would suck to spend a wad-o-cash to walk home with a PS5 only to find that the one Bethesda game you thought you could play is now an Xbox Series X exclusive. That means you’ll never see that game released on the PS5. Microsoft is very likely to make this situation a reality.

If Microsoft buys even more of these large developers, they could lock Sony’s PS5 out of the mainstream gaming market. That would push Sony’s PS5 into a situation like Nintendo (and the PS Vita), where the console maker is entirely responsible for creating compelling game franchises for their respective console on their own. Unfortunately, that’s just not enough to keep a platform like the PS5 alive.

In other words, with the purchase of Bethesda, there’s a very real possibility that, this time around, Sony’s PS5 will be the underdog.

Ramifications

The bigger ramification of this purchase is the reduction of consumer choice. This purchase can easily push Microsoft even deeper into monopoly status than it already holds. Locking the biggest game developers into Xbox exclusivity means the PS5 could ultimately fail, and for the same reason the PS Vita failed.

Personally, I believe this is Microsoft’s true agenda. The Xbox One’s sales paled in comparison to the PS4’s. Microsoft is not eager to repeat this situation with the Xbox Series X. By buying large developers like ZeniMax / Bethesda, Microsoft can all but assure the success of the Xbox Series X… and, at the same time, assure the failure of Sony’s PS5.

This purchase is honestly a one-two punch to Sony…. and for Sony, it’s gotta hurt.

Sony and Gaming

If Sony is smart, they’ll run out and buy Rockstar or Ubisoft right now. They shouldn’t wait. They should purchase one of these companies as fast as they possibly can. Rockstar would be the best choice for Sony.

Sony could then have this same bargaining chip in their back pocket just like Microsoft has with Bethesda. Should Microsoft dictate Xbox exclusivity for Bethesda’s upcoming games, Sony can do the same thing for Grand Theft Auto and Red Dead Redemption (once they own Rockstar). Ultimately, it will be a “tit for tat” situation.

In fact, Sony should buy both Ubisoft and Rockstar and have two bargaining chips. Even still, such a game exclusivity war would lead to fracturing the gaming market in half. Basically, the consumer would be forced to buy multiple consoles to play games that formerly landed on both consoles. It’s a loss for consumer choice… which is why I’m surprised the FTC hasn’t stepped in and blocked this one.

I’m guessing that because the final outcome has not yet manifested, the FTC can’t see the forest for the trees. However, once hindsight forces 20/20 vision, it will be too late for the FTC to block this purchase.

What does this mean for Fallout?

I know this is a very specific question about a very specific game. However, I was asked this very question on a Twitch stream. Let me answer it here.

If you’re a fan of the Fallout series and you’re unsure which of the upcoming consoles to buy, I’d recommend waiting to see what Microsoft has in store for upcoming Bethesda games.

With that said, and to reiterate what I’ve said above, there is zero chance that Microsoft will withhold Fallout from the Xbox Series X and newer Xbox consoles. However, Microsoft can easily block the release of future Fallout games from the PS5 and the Switch. This means a consumer’s investment of cash in a PS5 could see the console without any future Fallout, Elder Scrolls or Doom games.

What that means is that, should Bethesda take on the challenge of remastering Fallout 1, Fallout 2 and Fallout: New Vegas for the newer consoles, these remasters may only find their way onto the Xbox Series X as exclusives and may never be found on the PS5.

Basically, proceed with caution if you really, really want a PS5. You may find that like the PS Vita, without titles released from Bethesda, the PS5 may end up a dying console before it really gets the chance to take off, particularly if Microsoft buys even more of these large game studios. If the PS5 does fail due to Microsoft exclusives, it will be mostly thanks to Microsoft.


Rant Time: Xbox One and PS4 automatic downloads

Posted in botch, business, microsoft, Sony by commorancy on June 17, 2017

So, I have reasonably fast internet service. It’s not the top speed I can get, but it’s fast enough for most general purposes. I’ve clocked it on wireless at about 18-20 Mbps down and 6 Mbps up. If I connect a device wired, it will be somewhat faster. With wireless, it’s not the fastest, but it’s definitely sufficient. The wireless is obviously for convenience, but it works well the majority of the time. However, when the PS4 or Xbox One get going with their automatic downloads, it absolutely kills my network connectivity. And so starts my somewhat shorter than usual rant. Let’s explore.

Automatic Downloads

I always turn off automatic downloads whenever possible, no exception. When there is no way to shut off automatic updates, I unplug the device. There’s no need for devices to download automatically at the most inopportune times. In fact, several months back I explicitly disabled automatic update downloads on my Xbox One. Yet, just yesterday I found my Xbox One automatically downloading again. I’ve finally had enough of rogue network devices and, out of sheer frustration, I’ve unplugged it. I also unplugged my PS4 for the same reason. No more rogue network devices. If these systems can’t respect my wishes when I explicitly turn off automatic downloading, then they’re going to remain unplugged until I decide to use them. Worse, these devices would begin downloading updates at random times (usually in the middle of the night, but it could be any time).

The primary problem is that neither the Xbox One nor the PS4 limits its download speeds. In fact, both try to download as much as possible, as fast as possible. If both of them get going at the same time, it’s a disaster on my network. Even just one of them downloading is enough to cause problems. If I try to ask Siri or Alexa a question, I get no response, or I get the Echo’s dreaded red ring (no connectivity).
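
There’s no setting on either console to cap this, but if your router happens to run Linux (say, an OpenWrt-style firmware or a homebuilt router box), you can impose the limit the consoles won’t. Here’s a rough sketch using the standard ‘tc’ traffic-control tool. The interface name eth0, the 20 Mbps link rate and the console address 192.168.1.50 are all placeholders for your own LAN-facing interface, line speed and device IP:

# Root HTB qdisc on the LAN-facing interface; unmatched traffic uses class 1:20
$ tc qdisc add dev eth0 root handle 1: htb default 20

# Default class: everything else gets the full downstream rate
$ tc class add dev eth0 parent 1: classid 1:20 htb rate 20mbit

# Capped class: the console gets at most 5 Mbps
$ tc class add dev eth0 parent 1: classid 1:10 htb rate 5mbit ceil 5mbit

# Steer traffic destined for the console into the capped class
$ tc filter add dev eth0 protocol ip parent 1: prio 1 u32 \
    match ip dst 192.168.1.50/32 flowid 1:10

With something like that in place, a console can still grab its updates overnight without flattening Siri, Alexa and everything else on the network.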

Rant

At least Apple respects disabling automatic downloads on its devices. Those devices dutifully wait until you click update before beginning any downloads. Unfortunately, Microsoft does not honor its own no-auto-updates setting. Instead, the Xbox One overrides that setting and dutifully starts downloading whatever it wants, whenever it wants. I just can’t have rogue devices like that on my network. Rogue devices need to go away, and Microsoft needs to understand that making rogue devices needs to stop. If your software can’t respect the owner’s wish not to download automatic updates, then it really doesn’t deserve a place in the home.

I haven’t yet determined whether the PS4 overrides my no-download wishes, but I recall that, at times, the PS4 will also do this for system updates. Updates which, again, should not download automatically unless I explicitly ask for them.

Just say no to rogue network devices like the Xbox One. For now, the Xbox One and the PS4 will remain unplugged until I decide I need to use them. Though, in the last few months, there has been a substantial lack of game titles on both platforms. I’m really finding the spring and summer to be a dead season for new game titles. Instead of overloading us with too many fall titles that we can’t play that fast, why not spread releases throughout the year and give us adequate time to play each? That, however, is a whole separate rant topic in itself.

For whom does the bell toll? Microsoft’s Xbox.

Posted in botch, business, gaming, microsoft, xbox by commorancy on March 27, 2016

In case you haven’t been reading recent news, here’s an article that might wake you up… especially if you happen to be an Xbox platform fanboy. What is this alleged article? Microsoft has stated it will merge the PC and Xbox platforms into a single unified platform, ending the sale of dedicated console hardware. Let’s explore.

Xbox and Xbox 360

When the original Xbox arrived in 2001, it received lots of fanfare. The console market now had a competitor against the PlayStation 2, which had released only one year earlier in 2000. Though the Sega Dreamcast had promise, Sega pulled the plug on it in 2001, citing lots of reasons including bad sales, competition and poor platform reception. The Xbox’s controller, architecture and speed quickly made it a real competitor to the PlayStation 2.

A few years later, we went through the second iteration of this console war when Microsoft and Sony released the Xbox 360 and the PS3, in 2005 and 2006 respectively. Once again, we had our next generation consoles in our hands and we gamers were happily playing with better graphics and sound quality.

The Xbox 360 took the lead in the console market over Sony’s PS3, but only by slim margins. Though, the Xbox 360 managed to stay one step ahead throughout the lifespan of both consoles.

Xbox One and PS4

Unfortunately, Microsoft would not be able to maintain the fleeting lead it had won with the Xbox 360, thanks to its blundering Xbox One E3 announcement in 2013.

That announcement set the tone for all things to come, including the next iteration of the Xbox platform. Within a week of its E3 announcement, after facing Sony’s harsh rebuttal at E3, Microsoft reversed all of its DRM and privacy-invasion strategies; gamers had clearly spoken with their wallets as PS4 pre-orders surged and people cancelled their Xbox One orders in droves. Clearly, this blunder was the Xbox’s first death knell and it set in motion many future problems for the Xbox. Unfortunately, neither Microsoft nor the Xbox has been able to recover from this blunder.

Elite Console and Controller

Immediately prior to this Windows platform integration announcement, Microsoft had just released the Elite console and Elite controller, the controller being a much more costly update to its existing hardware ($150 vs $60). This console, and especially this controller, is Microsoft’s nod to the more professional gamer. That is, a nod to those gamers who want higher quality controllers, button remapping, changeable controller features, more inputs and faster consoles. I’ll tell you what, though. The Elite controller is actually quite nice, but very, very pricey. Yes, some of us do want these advanced features from our systems. However, it’s entirely disingenuous for Microsoft to release the Elite controller and console only to announce the death of future hardware systems just a few months later. Really, what does this say to would-be gamers about Microsoft’s commitment to the gaming market?

To me, this says that the right hand doesn’t know what the left hand is doing in Redmond. On the one hand, you have the Xbox engineering team trying to drum up new gaming interest by releasing high quality experiences for the gamer. On the other, Microsoft itself is trying to reduce costs by getting rid of costly hardware projects it deems a loss. Unfortunately, this doesn’t mean good things for Microsoft as a whole. This ultimately means that the whole company is fractured internally and doesn’t have a proper focus on its products or its markets. Instead, it is making rash decisions without thinking through the long term ramifications of those decisions. A death knell.

Microsoft’s confusion

With this announcement of the integration of Xbox with Windows, Microsoft has likewise announced that it also intends (see article) to stop making future hardware and will instead focus on the Xbox platform as a subcomponent of Windows. Just like Windows Media Center, it will become an add-on to Windows. You might think that this is a great idea, but it isn’t. Let’s understand why.

Windows itself already offers developers a solid gaming development environment to produce native games on Windows. Most AAA game titles are made not only for consoles, but also for Windows and sometimes even Mac. The question is, would that spell the death of the Xbox platform? Yes. The reason the Xbox platform exists is as a gaming hardware platform independent of Windows. It does not exist for Netflix, Amazon or for any other non-gaming entertainment. Sure, you can play movies and music on the Xbox, but that’s not the platform’s intended purpose. Microsoft is seriously confused over the reason the Xbox platform exists and why it continues to exist. This confusion spells yet another death knell. Basically, if Microsoft thinks that the non-gaming aspects of the Xbox will survive once in Windows, it won’t. You can already use native Windows apps to get access to all of the services like Hulu, Netflix and Amazon… and the native apps are usually better.

The Death of the Xbox

Because Windows is already a solid gaming platform in its own right (in addition to being an entertainment platform), integrating a second gaming environment into Windows means that only one of these gaming platforms will survive the transition. Game developers will also only choose one platform to develop for. Assuming status quo for the Xbox platform, the Xbox will be the clear loser. It’s simple to understand why: high-priced licensing fees. It costs developers substantial amounts of cash to license and sell games branded with the Xbox moniker. It costs far, far less to develop games under Windows directly. Unless Microsoft substantially changes its Xbox licensing model, this platform is entirely dead for gaming. Game developers won’t be willing to pay the excessive licensing fees on top of producing the game twice (Xbox and Windows) for the same hardware platform. Why would any game developer produce the same game twice for the same platform? They wouldn’t. A death knell.

So, what does this mean for gaming? PC gamers win a feather in their cap. Xbox gamers lose a platform entirely. Once games stop being produced for the Xbox platform, and they will stop, the only things left to use the Xbox platform for are Netflix, other media activities and already-purchased digital content. As I said above, you can already crack open Chrome or Firefox and do video streaming and music playing better. So, there will be nothing left to use the Xbox platform for except legacy digital content that you may have purchased on an Xbox One/360… assuming that content even remains compatible after the Windows PC migration. Another death knell.

Digital Content

So, what does this mean for already purchased digital content? It means that you better hold onto your working Xbox One and Xbox 360 if you want to continue to use this content. Though, Microsoft may eventually force users to move to the Windows integrated platform and sunset the use of Xbox hardware entirely (and cut it off from the Xbox Live service).

This means that, at some point, you may no longer be able to download your digital content to your Xbox One and you may be forced to buy a PC. Depending on how the Xbox One’s content activation system works, and on how far and deep Microsoft takes it, it may even prevent you from using the digital content you’ve already downloaded.

Of course, this is still years off yet. But, once that time arrives, your Xbox One and 360 may become paperweights. A death knell.

Why this change?

From Microsoft’s perspective, I can understand the value and cost savings that integration (and lack of hardware) brings. No longer does Microsoft have to design, build and sell hardware platforms, no longer do they have to compete with Sony, no longer do they have to support this finicky hardware (a highly expensive ongoing investment). This means they can reduce their costs for all of the above. Instead, they can push the hardware costs back onto PC manufacturers to support their new Xbox platform.

Unfortunately, expecting PC manufacturers to support the Xbox is a pipe dream. There are far too many PC manufacturers who don’t follow the rules 100%. Instead, they get about 90% of the way there and call the system done. This means that instead of a fully reliable Xbox platform, you’ll end up with a crashing behemoth of a system that, once again, barely works. The clear benefit of designing exclusive hardware is reliability by design. Leaving hardware support to third parties means some PC manufacturers will flat out not support the Xbox platform, and those that do will charge a hefty premium. This ultimately means that buying a PC that properly supports the Xbox platform will likely cost significantly more than the older, far less expensive dedicated console hardware. Not to mention the clunky, ugly tower and desktop cases of PC manufacturers, which can no longer be used as a set-top box.

This means that not only will the PC-based Xbox experience falter badly, you’re likely looking at 2x, 3x or more the price of today’s Xbox One to invest in a compatible PC-based Xbox platform. This puts this platform so far out of the price range of console gamers, this is yet another death knell for the Xbox. I won’t even get into the peripheral issues. Okay, I will a little. If Microsoft stops the hardware entirely, they’re likely to stop the controllers and leave that also up to third parties.

We all know how well PC controllers work with many games. Sometimes they work, sometimes they don’t. They are usually not wireless and when they are, they are chock full of wireless issues. The whole reason the Xbox One works well is because of the wireless controller and its close integration with the hardware.

Throwing the Baby out with the Bathwater

Ultimately, Microsoft is throwing away all of their hard earned gamer loyalty. They are effectively closing the Xbox and throwing away the key. What this ultimately says is that Microsoft has no long term commitment to the gaming market, the console market or the gamers. What was formerly the green glory will fade into Microsoft’s Windows obscurity.

Overall, this is the worst of all possible fates that could befall the Xbox. A console is not a console without hardware. We all know how well gaming platforms work when they offer dedicated hardware. We also know how well they don’t work when relying on third parties. Think Steam. Perhaps Microsoft is deluded enough to think that Steam is the model of the future? I can tell you that it isn’t. Steam works, but for limited purposes. Effectively, Steam is the app store for gaming. Since most app stores don’t focus on gaming, it was inevitable that someone would put one together. Hence, Steam. But the Xbox platform, regardless of its current strength in gaming, will die a quick death once there is no more console hardware to be had. Gamers aren’t likely to spend their efforts chasing down third party hardware platforms that might or might not work. The whole point of a console is that it “just works”. The Steam model simply won’t work for the Xbox unless you’re talking about games at a $2-5 price point that could run on Facebook. That’s not the class of gaming that the Xbox One is today.

We all need hardware to make our lives better, yes even in gaming. You can’t game without hardware. Relying on PC manufacturers to get you what you need isn’t the answer. Worse, Windows native games and developers will kick the Xbox platform to the curb. No developer in their right mind would consider spending extra money to develop on the Xbox platform when they already have Windows development efforts underway. Why would game developers choose to redundantly build their game twice for the same platform? That’s just stupid.

Sony, Nintendo and, yes, Apple

All of the above is actually very good news for the remaining console developers. Once the Xbox platform dies quietly inside of Windows (and it will), Sony need only worry about Nintendo for the foreseeable future. However, with Apple’s recent foray into gaming with the latest Apple TV, Apple now has an opening into the console market. What I will say about the current Apple TV for 3D gaming is that it’s still very rudimentary. The textures are low-res, the environments look like something out of the Nintendo 64 era and there’s not a speck of realism to be found… yet. However, Apple can up the ante a lot in the next Apple TV iteration. Assuming they wedge a much higher end GPU and a lot more RAM into the Apple TV, they could easily match the specs of the Nintendo Wii U, though perhaps not yet approach the PS4… it will take quite a bit more effort by Apple to match Sony. For Apple, the door to the console market is quite clearly open. For Microsoft, the door is quickly closing.

Yes folks, the Xbox is officially a dead platform. With this integration announcement, this is the Xbox’s final death knell.

If you are considering the purchase of a new gaming console, you should steer clear of the Xbox One unless you really enjoy buying into dead gaming platforms.

 

Xbox One is already dead before its launch?

Posted in entertainment, gaming, microsoft, redmond by commorancy on November 6, 2013

Wow… just wow. Infinity Ward, the developer of Call of Duty, has recently stated in this IGN article and this IGN article that Call of Duty: Ghosts can only run at 720p resolution and a 60 Hz refresh rate on the Xbox One. Let’s explore why this is yet another devastating blow to Microsoft.

Xbox One

Clearly, Microsoft is banking on the Xbox One lasting another 8 years like the Xbox 360. Unfortunately, that’s not gonna happen. The Xbox One is clearly underpowered for full next gen console needs. And, you would think the Microsoft hardware engineers would have thought of this issue long before even breaking ground on new hardware. You know, like actual planning.

With all of the new TVs supporting 120 Hz refresh rates and higher, TVs running 1080p resolutions (and 4K TVs not far off), it would be natural to assume that a next gen console should be capable of producing output at a full 1080p and 60 Hz (as its base resolution). In other words, the Xbox One should start at 1080p 60 Hz and be able to go up from there. According to Infinity Ward, this is not possible on the Xbox One. I’ll say that one more time. Infinity Ward has just said that 1080p at 60 Hz is not even possible on the Xbox One.

Next Gen Consoles

Because of this significant and avoidable Xbox One hardware deficiency, Infinity Ward has taken the step of producing Call of Duty: Ghosts at 720p with a 60 Hz refresh rate (upscaled to 1080p) on the Xbox One to keep the ‘experience’ similar on all platforms. Let’s compare. Every big game title produced on the Xbox 360 is already 720p at 60 Hz, upscaled to 1080p. What this ultimately says is that the Xbox One hardware is no better than the Xbox 360’s. This hardware is basically dead before it’s even hit the store shelves. A next gen console should not see its hardware limits reached until at least 2 years after its release. A new console should never see any limitations hit by a launch title.

If one of the very first launch titles is already taxing this console’s hardware, this platform is dead on arrival. This means the Xbox One has nowhere to go but down. It also means that you might as well stick with the Xbox 360, because that’s what you’re buying in the Xbox One. It also means that its games will never provide a high quality next generation experience, no matter which game it is. Seriously, getting high resolution at full speed is why you buy a next generation console.

Granted, I can’t vouch for Infinity Ward’s programming capabilities as I don’t know any of their developers. But, I know they have been producing this franchise for years. I would also expect their software engineers to have both the knowledge and expertise to properly produce any game for any platform they set their sights on.

In other words, I cannot see that this is some agenda on the part of Infinity Ward to try to discredit the Xbox One hardware.

Xbox One vs Xbox 360

The Xbox 360 hardware is already well capable of producing games at 720p and 60 Hz. It’s been doing this resolution and frame rate for years. Why buy another console that has this exact same limitation as the current hardware? You buy into a next generation console to get something new. Namely, higher resolution gaming experiences. If the Xbox One cannot provide this, there is no point to this platform and this platform is dead. DEAD.

Xbox One: Dead on Arrival?

Based on the above, the Xbox One’s lifespan has been substantially reduced to, at best, 1-2 years on the market before Microsoft must redesign it with a new processor and graphics card. This also means that early adopters will get the shaft with ultimately dead hardware and have to buy new hardware again very quickly to get the newest Xbox One experience.

If you’re considering the purchase of an Xbox One, you should seriously reconsider. I’d suggest cancelling your pre-order and waiting for the next console from Microsoft. Or, alternatively, buy a PS4 if you can’t wait that long. Why spend $499 for a console that gives you the same capabilities as the Xbox 360? It makes no sense, especially considering that there are no compelling launch titles on the Xbox One that aren’t also coming to the Xbox 360. It’s worth taking the extra time to make sure your $499 investment in this console is a sound choice.

Coding to the Weakest Hardware?

For the longest time, the Xbox 360 was the weakest hardware of all the consoles. Clearly, it still is. And for just as long, developers catered to the weakest hardware. That means lesser graphics quality, lesser texture quality, lesser everything quality. I’m hoping this is now a thing of the past.

It now appears that game developers are tired of developing for the weakest hardware. Call of Duty: Ghosts hopefully proves that. And rightly so. Instead of producing low-res, low quality gaming experiences on all platforms, they should produce the highest quality gaming on the best platforms. Then, take that and scale it back to fit the weaker hardware platforms.

So, this scenario has now flipped the development practices. I’m glad to see developers embracing the best hardware and delivering the highest quality gaming experience there, then reducing the quality to fit the weaker hardware. It makes perfect sense. It also explains why Infinity Ward reduced the resolution on the Xbox One. But being forced to reduce a game to a lower resolution doesn’t bode well for the longevity of the Xbox One hardware.

What about the PS4 and 4k gaming?

According to those same articles above, the PS4 apparently doesn’t have this limitation: Call of Duty: Ghosts will run on the PS4 at a full 1080p with a 60 Hz refresh rate. Whether the PS4 is capable of higher resolutions is as yet unknown. Consider this: one of the very first 4K TVs introduced was produced by Sony. I would expect the PS4 to have been built to possibly support 4K gaming experiences. That doesn’t mean it will right now, but it may in the future. The Xbox One? Not likely to provide 4K anytime soon. If Microsoft’s engineers weren’t even thinking of 1080p resolutions, then they most certainly weren’t thinking about 4K.

If you’re into future proofing your technology purchases, then the PS4 definitely seems the better choice.

Microsoft Surface: Why Windows is not ready for a tablet

Posted in botch, microsoft, redmond by commorancy on July 4, 2013

Microsoft always tries to outdo Apple, but each time it tries it ends up with a half-baked device that barely resembles what Apple offers. Worse, the device barely reflects an understanding of why Apple created its product in the first place or even what space it fills in the market. But, leave it to Microsoft to try. Let’s begin.

Microsoft Surface

I’ve recently come into contact with a Microsoft Surface tablet. Let’s dive right into the heart of the problems with this platform: Windows and a touch surface are simply not compatible, yet. Why? We have to understand Windows 8. For the release of Windows 8, Microsoft introduced Metro. This interface is a big tile-based interface that is, more or less, touch friendly. It’s the interface that was adopted for use on both the Xbox 360 and Windows phones. The difference between Windows Phone / Xbox 360 and Windows 8 is that you can’t get to the underlying Windows pieces on the Xbox 360 and Windows Phone (and that’s actually a good thing). With Windows 8 on a tablet, unfortunately, you can. In fact, it forces you to at times. And here’s exactly where the problems begin.

Windows 8 under the hood is basically Windows 7, slightly repackaged. What I mean is that Windows 8 is essentially Windows 7 when you’re not using Metro. So, the window close and resize buttons are the same size as in Windows 7, the icons are the same size and the tiny little triangle next to a folder hierarchy is the same size. Easily clickable with a mouse. Now, imagine trying to activate one of those tiny little icons with a tree trunk. You simply can’t target these tiny controls with your finger. It’s just not touch friendly. That’s exactly the experience you get when you’re using the Windows 8 desktop interface. When trying to press the close button on a window, you might have to press the screen two, three or four times just to hit the tiny little control and make it activate. It’s an exercise in futility and frustration.

Metro and Windows

Metro is supposed to be the primary interface driving Microsoft Surface. However, as soon as you press some of the tiles, it drops you right into the standard Windows desktop with icons, Start button and all. When you get dropped into this interface, this is exactly where the whole tablet’s usefulness breaks down. Just imagine trying to use a touch surface with Windows 7. No, it’s not pretty. That’s exactly what you’re doing when you’re at the Windows 8 desktop. It’s seriously frustrating, time consuming and you feel like a giant among Lilliputians.

No, this interface is just not ready for a touch surface. At least, not without completely redesigning the interface from the ground up… which, in fact, is what I thought Metro would become. But no, many of the activities on the Metro screen take you out of Metro. This is the breakdown in usability. For a tablet OS, Metro should be it. There should be no underlying Windows to drop down to. If you can’t do it in Metro, it cannot be done!

A Tablet is not a home computer, Microsoft!

Offering up multiple interfaces to the operating system is the fundamental design difference between iOS and Windows 8. Microsoft would have been smarter to take the Windows Phone OS and place that operating system straight onto the Surface. At least that operating system was designed to work solely with a touch screen using 100% Metro. That would have been more along the lines of what the Surface should have been. Instead, Microsoft decided to take the entire Windows 8 operating system and place it onto the tablet, touch-unfriendly parts and all. Is anyone actually thinking in Redmond?

In addition, putting full versions of Word, PowerPoint and Excel on the Surface might seem like a selling point, but it isn’t. The point of the iPad is to provide small, lightweight applications that supplement what you use on a full computer. Or, better, cloud versions of the apps. I understand the thinking that having a full computer as a tablet might be a good idea, but it really isn’t. Tablets are way too underpowered for that purpose. That’s why notebooks and desktops are still necessary. The processors in flat tablet devices just aren’t powerful enough for full-sized apps. That’s the reason the iPad is the way it is. Apple understands that an A6 processor is not in any way close to a full quad-core i7. So, the iPad doesn’t pretend to be a full computer, knowing that it can’t ever be that. Instead, it opts to provide smaller, lightweight apps for simple communication and entertainment… apps that an A6 is capable of handling within the constraints of its limited RAM and storage. That’s why iOS works on the iPad and why Windows 8 doesn’t work on the Microsoft Surface.

Herky Jerky Motion

One of the other problems I noticed is that when you’re dragging around Metro’s interface and transitioning between Windows 8 desktop apps and back into Metro, the screen makes an annoying, stuttering, jerky motion. It appears that this was an intentional design choice and not the graphics card going haywire. I’m not sure why this was let out of Redmond this way. From that problem alone, I first thought the Surface tablet was having a hardware problem. Then I realized it wasn’t. Indeed, the problem is inherent in Windows 8 and Metro. If you’re planning to offer a dragging, fading, transitioning experience, make it smooth. That means no jerky, shaky transitions. It makes the device seem underpowered (it probably is). At the same time, it makes Windows look antiquated and unpolished (it definitely is).

Multiple Revisions

Microsoft always takes two or three product iterations before it settles into a reasonably solid, if second-rate, product format. With the exception of the original Xbox, I don’t know of any single device that Microsoft has gotten right on the first try. It was inevitable that they would get the Surface tablet wrong. If you’re looking to get into Windows 8, I’d suggest just going for a notebook outright. You’ll get more bang for the buck and you’ll have a much more usable Windows 8 experience.

I really wanted to like the Surface, but these fundamental problems with Windows prevent this tablet from being anything more than a clunky toy. The iPad actually works because its icons and screen elements are always big enough to tap, no matter the size of the device. This is one of the things Apple fully understands about touch surfaces. Although, Apple could do with some nuanced improvements to touch usability. Unfortunately, when you get to the Windows 8 desktop interface, it’s a complete chore to control via touch. I just can’t see buying a first-version Surface tablet. It tries to be too many things, but fails to be any of them.

Microsoft, figure it out!

Windows 8 PC: No Linux?

Posted in botch, computers, linux, microsoft, redmond, windows by commorancy on August 5, 2012

According to the rumor mill, Windows 8 PCs will ship with a BIOS replacement: UEFI (the extension of the EFI standard). This new boot system apparently ships with a secure booting mechanism that, some say, will be locked to Windows 8 alone. On the other hand, the Linux distributions are not entirely sure how these secure boot systems will be implemented. Are Linux distributions being prematurely alarmist? Let’s explore.

What does this mean?

For Windows 8 users, probably not much. Purchasing a new PC will be business as usual. For Microsoft, assuming UEFI secure boot cannot be disabled or reset, it means you can’t load another operating system on the hardware. Think of locked and closed phones and you’ll get the idea. For Linux, that would mean the end of Linux on PCs (at least, not without Linux distributions jumping through some secure boot hoops). OK, so that’s the grim view. However, for Linux users, there will likely be other options. That is, buying a PC that isn’t locked. Or, alternatively, resetting the PC back to its factory default unlocked state (which UEFI should support).

On the other hand, dual booting may no longer be an option with secure boot enabled. That means it may not be possible to install both Windows and Linux onto the system and choose one or the other at boot time. Then again, we do not know whether Windows 8 requires UEFI secure boot to boot or whether secure boot can be disabled. So far it appears to be required, but if you buy a boxed retail edition of Windows 8 (which is not yet available), it may be possible to disable secure boot. It may be that some of the released-to-manufacturing (OEM) editions require secure boot and some do not.
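
Assuming a Linux distribution does make it onto one of these machines, you can at least ask the firmware what state secure boot is in before worrying about dual booting. A small sketch; mokutil isn’t guaranteed to be installed on every distribution, and the efivars path shown uses the standard EFI global-variable GUID:

# The easy way, where the mokutil tool is available
$ mokutil --sb-state

# The manual way: the last byte printed is 1 when Secure Boot is
# enforcing, 0 when it has been disabled in the firmware
$ od -An -tu1 /sys/firmware/efi/efivars/SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c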

PC Manufacturers and Windows 8

The real question here, though, is what’s driving UEFI secure booting? Is it Windows? Is it the PC manufacturers? Is it a consortium? I’m not exactly sure. Whatever the impetus to move in this direction, it may lead Microsoft back down the antitrust path once again. Excluding all other operating systems from PC hardware sets a dangerous precedent, as this has not been attempted on this hardware before. Yes, with phones, iPads and other ‘closed’ devices, we accept this. On PC hardware, we have not accepted this ‘closed’ nature because it has never been closed. So, this is a dangerous game Microsoft is playing, once again.

Microsoft anti-trust suit renewed?

Microsoft should tread on this ground carefully.  Asking PC manufacturers to lock PCs to exclusively Windows 8 use is a lawsuit waiting to happen.  It’s just a matter of time before yet another class action lawsuit begins and, ultimately, turns into a DOJ antitrust suit.  You would think that Microsoft would have learned its lesson by its previous behaviors in the PC marketplace.  There is no reason that Windows needs to lock down the hardware in this way.

If every PC manufacturer begins producing PCs that preclude loading Linux or other UNIX distributions, this treads entirely too close to antitrust territory for Microsoft yet again. If Linux is excluded from running on the majority of PCs, that is definitely unwanted behavior. This rolls us back to the time when Microsoft locked Windows to the hardware over every other operating system on the market. Except that last time, nothing stopped you from wiping the PC and loading Linux. You just had to pay the Microsoft tax to do it. At that time, you couldn’t even buy a PC without Windows. This time, according to reports, you cannot even load Linux with secure boot locked to Windows 8. In fact, you can’t even load Windows 7 or Windows XP. Using UEFI secure boot on Windows 8 PCs treads within millimeters of the same collusive behavior that Microsoft was called on many years back, ultimately went to court over and lost much money on.

Microsoft needs to listen and tread carefully

Tread carefully, Microsoft. Locking PCs to running only Windows 8 is as close as you can get to the antitrust suits you thought you were done with. Unless PC manufacturers provide ways to reset and turn off the UEFI secure boot system to allow non-secured operating systems, Microsoft will once again be seen as colluding with PC manufacturers to exclude all other operating systems from UEFI secure boot PCs. That is about as antitrust as you can get.

I’d fully expect to see Microsoft (and possibly some PC makers) in DOJ court over antitrust issues. It’s not a matter of if, it’s a matter of when. I predict that by early 2014 another antitrust suit will have materialized, assuming UEFI secure boot works as reported. On the other hand, this issue is easily mitigated by UEFI PC makers allowing users to disable secure boot to allow a BIOS-style boot and Linux to be loaded. So, the antitrust suits will hinge entirely on how flexibly the PC manufacturers set up UEFI secure booting. If both Microsoft and the PC makers have been smart about this change, UEFI secure boot can be disabled. If not, we know the legal outcome.

Virtualization

For Windows 8, it’s likely that we’ll see more people moving to Linux as their base OS with Windows 8 virtualized (except for gamers, where direct hardware access is required). If Windows 8 is this locked down, then it’s better to lock down VirtualBox than the physical hardware.
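
For anyone who hasn’t gone this route, here’s a rough sketch of standing up such a VM from the host’s command line with VirtualBox’s VBoxManage tool. The VM name, memory size and disk size here are arbitrary placeholder choices, not recommendations:

# Create and register a Windows 8 guest
$ VBoxManage createvm --name "win8" --ostype Windows8_64 --register

# Give it RAM and video memory
$ VBoxManage modifyvm "win8" --memory 4096 --vram 128

# Create a 60 GB virtual disk and attach it to a SATA controller
$ VBoxManage createhd --filename win8.vdi --size 61440
$ VBoxManage storagectl "win8" --name "SATA" --add sata
$ VBoxManage storageattach "win8" --storagectl "SATA" \
    --port 0 --device 0 --type hdd --medium win8.vdi

Attach the Windows 8 install media, boot the VM, and Windows then lives inside a file you control rather than controlling your hardware.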

Death Knell for Windows?

Note that should the UEFI secure boot system be as closed as predicted, this may be the final death knell for Windows and, ultimately, Microsoft. The danger is in the UEFI secure boot system itself. UEFI is new and untested in the mass market. This means that not only is Windows 8 new (and we know how that goes bug-wise), we now have an entirely new, untested boot path in UEFI secure boot. If anything goes wrong in this secure booting system, Windows 8 simply won’t boot. And believe me, I predict there will be many failures in the secure booting system itself. The reason: we are still relying on mechanical hard drives that are highly prone to partial failures. Even though solid state drives are better, they can also go bad. Whatever data the secure boot system relies on (i.e., decryption keys) will likely be stored somewhere on the hard drive. If that sector of the drive fails, no more boot. Worse, if this secure booting system requires an encrypted hard drive, that means no access to the data on the drive after a failure, ever.

I predict there will be many failures related to this new UEFI secure boot that will lead to dead PCs. And not only dead PCs, but PCs that offer no access to the data on their hard drives. People will lose everything on their computers.

As people realize this aspect of local storage on an extremely closed system, they will move toward cloud service devices to prevent data loss. Once they realize the benefits of cloud storage, the appeal of storing things on local hard drives, and most of the reasons to use Windows 8, will be lost. Gamers may keep the Windows market alive a bit longer; on the other hand, this is why a gaming company like Valve Software is hedging its bets and releasing Linux versions of its games. For non-gamers, desktop and notebook PCs running Windows will be less and less needed and used. In fact, I contend this is already happening. Tablets and other cloud storage devices are already becoming the norm. Perhaps not so much in the corporate world as yet, but once cloud-based office suites get better, all bets are off. So, combined with the already-trending move toward limited-storage cloud devices, closing down PC systems in this way is, at best, one more nail in Windows’ coffin. At worst, Redmond is playing Taps for Windows.

Closing down the PC market in this way is not the answer. Microsoft has stated it wants to be more innovative, as Steve Ballmer recently proclaimed. Yet, moves like this prove that Microsoft has clearly not changed and has no innovation left. Innovation doesn’t have to, and shouldn’t, lead to closed PC systems and antitrust lawsuits.

Bluetooth Mouse Pairing: Fix ‘Authentication Error’ in Windows 7

Posted in microsoft, redmond by commorancy on June 25, 2012

Every once in a while my Bluetooth dongle decides to go wacky on me and the mouse won’t work any longer. Sometimes the keyboard stops too. Usually, I can unplug the dongle and replug it, which generally recovers both the mouse and the keyboard. Sometimes it requires re-pairing one or both of the devices. Today was a re-pairing day (at least for the mouse). Except today didn’t go at all smoothly.

Note: Before proceeding with any pairing operation on battery-powered devices such as mice or keyboards, always make sure your batteries are fresh. Dead or dying batteries can cause pairing problems simply because the wireless transmitter in the device may not produce a stable enough signal for the receiver. Dead or dying batteries can also be the source of general device connectivity problems.

The Problem

Normally I just go into ‘Devices and Printers’, delete the device and pair it again. This usually works seamlessly. Today, not so much. I successfully deleted the Targus mouse from ‘Devices and Printers’. I then put the mouse into discovery mode and started the ‘Add a Bluetooth Device’ panel. The panel finds the mouse fine. I select the mouse and click ‘Next’. I then get an ‘Authentication Error’ panel.

So, this is a reasonably stupid error because it’s a mouse. Mice don’t have authentication errors because they don’t use pairing codes. I have no idea why Windows would even present this. It’s clear that something is completely borked in Windows. And, you know, this is one of the things about Windows I absolutely hate. It gives stupid errors like this without any hope of resolution. Note that clicking the little blue link at the bottom of the window is completely worthless. It won’t help you resolve this issue. It leads to some worthless help page that leaves more questions than answers and only serves to waste time. I digress.

So, now that I’ve received this error, I proceed to Google to find an answer. Well, I didn’t find one. After traversing several forums where people were asking the same question, still no answers. Then I searched the registry, thinking the previous pairing had left some garbage behind. Nope, that search was a waste. So now, I’m basically at the trial and error phase of resolution.

I finally get to Microsoft’s knowledgebase, which is probably where I should have visited first. Unfortunately, even that didn’t help, though I did learn that Windows Server doesn’t support Bluetooth devices (not that that’s very helpful for my issue, because I’m on Windows 7). What visiting this page at Microsoft did do is give me an idea of how to proceed, based on some images I saw. Not images of what I’m about to show you, though; just an image of something that triggered a thought about how silly Microsoft is, which led to another thought, and so on, leading to the fix below.

The Fix

So, I go back to trying to pair again. I put the mouse into pairing mode and then start ‘Add a Bluetooth Device’. This time, though, I decide to right-click the device about to be added:

You’ll need to do this pretty quickly as the device won’t stay in pairing mode for very long.  So, click ‘Properties’ and you’ll see the following window:

Now, check the box next to ‘Drivers for keyboard, mice, etc (HID)’ and click ‘OK’. This should immediately pair the device without the ‘Authentication Error’ panel appearing. At least, this fix worked perfectly in my situation. I can’t guarantee it will work with every Bluetooth mouse or every Bluetooth radio, so your results may vary. It’s definitely worth a try, though.

Note: The differences in Bluetooth drivers may prevent this fix from working across the board.  So, you will have to try this and relay your experience of whether or not it works for you.

Note: after I unpaired the mouse and re-paired it, having done the above, I now see the proper pairing panel instead of the authentication error panel. This is the correct panel for the mouse. Clicking ‘Pair without using a code’ now works perfectly for this device. I have no idea what caused the other panel to appear. Note that once Windows gets into that error state, it stays there. I’m not sure why Windows would cache an error, but apparently it does. I’m at a complete loss as to why Microsoft would cache anything to do with real-time device connection activities like this! However, the mouse now unpairs and pairs correctly again. Whatever causes this issue, the Windows development team needs to fix it.

My Rant

These are the stupid little things that make Windows such a hacky, time-wasting experience. It’s these quirky behaviors that give Microsoft a bad rap and keep Microsoft perceived as an inept operating system development company. It’s problems like this that make Windows a 1990s-level computing experience.

And I’m not just talking about the error itself. I’m talking about the overall experience surrounding the error, down to the lack of any help in finding an answer. It’s having to resort to Google when Microsoft’s knowledgebase has nothing and offers no answers. It’s having to guess, using trial and error, to find an answer. It’s the bad experience and bad taste this leaves. Microsoft, get your sh*t together. It’s long past time for Windows to be done with time-wasting experiences like this. If there are resolutions to a problem, then lead users who see errors like this one to an exact resolution page with step-by-step instructions that work. Clearly, there is a resolution to my issue, and I present it here. Why can’t your team do the same?

Seriously, I don’t understand why Microsoft relies on sites like mine to help users fix problems that Microsoft cannot be bothered to document properly. Yes, I realize I’m contributing to the problem by writing this article and ‘helping’ Microsoft out. Note, however, that it’s not so much about helping Microsoft as it is about helping users who run into this same stupid experience. The purpose of this article is to show just how stupid this experience is. It’s clear that Microsoft has no interest in giving the users who PAID for this product real support and documentation. So, why do we continue to use Windows?

How to format NTFS on MacOS X

Posted in Apple, computers, Mac OS X, microsoft by commorancy on June 2, 2012

This article is designed to show you how to mount and manage NTFS partitions in MacOS X.  Note the prerequisites below as it’s not quite as straightforward as one would hope.  That is, there is no native MacOS X tool to accomplish this, but it can be done.  First things first:

Disclaimer

This article discusses commands that will format, destroy or otherwise wipe data from hard drives.  If you are uncomfortable working with commands like these, you shouldn’t attempt to follow this article.  This information is provided as-is and all risk is incurred solely by the reader.  If you wipe your data accidentally by the use of the information contained in this article, you solely accept all risk.  This author accepts no liability for the use or misuse of the commands explored in this article.

Prerequisites

Right up front I’m going to say that to accomplish this task, you must have the following prerequisites set up:

  1. VirtualBox installed (free)
  2. Windows 7 (any flavor) installed in VirtualBox (you can probably use Windows XP, though the commands may differ; Windows itself is not free)

For reading / writing to NTFS-formatted partitions (optional), note the following:

  1. For writing to NTFS partitions on MacOS X: you will need a third-party driver such as Tuxera NTFS or ntfs-3g (discussed below).
  2. For reading from NTFS: MacOS X can natively mount and read NTFS partitions in read-only mode. This is built into Mac OS X.

If you plan on writing to NTFS partitions, I highly recommend Tuxera over ntfs-3g. Tuxera is stable, and I've had no trouble with it corrupting NTFS volumes in a way that would require a 'chkdsk' operation to fix. On the other hand, ntfs-3g regularly corrupts volumes and will periodically require chkdsk to clean up the volume. Do not override MacOS X's native NTFS mounter to make it write to volumes (even though that is possible). The MacOS X native NTFS mounter will corrupt disks in write mode. Use Tuxera or ntfs-3g instead.
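If you're ever unsure which driver is actually handling an NTFS volume, the mount table will tell you. A quick check from a Mac terminal (the volume name and output below are illustrative, not from my machine):

$ mount | grep -i ntfs
/dev/disk1s1 on /Volumes/Data (ntfs, local, read-only, noowners)

If the native read-only driver is in use, you'll see 'read-only' in the flags; Tuxera and ntfs-3g volumes typically show a FUSE-based filesystem type instead.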

Why NTFS on Mac OS X?

If you’re like me, I have a Mac at work and Windows at home.  Because Mac can mount NTFS, but Windows has no hope of mounting MacOS Journaled filesystems, I opted to use NTFS as my disk carry standard.  Note, I use large 1-2TB sized hard drives and NTFS is much more efficient with space allocation than FAT32 for these sized disks.  So, this is why I use NTFS as my carry around standard for both Windows and Mac.

How to format a new hard drive with NTFS on Mac OS X

Once you have Windows 7 installed in VirtualBox and working, shut it down for the moment.  Note, I will assume that you know how to install Windows 7 in VirtualBox.  If not, let me know and I can write a separate article on how to do this.

Now, go to Mac OS X and open a command terminal (/Applications/Utilities/Terminal.app).  Connect the disk to your Mac via USB or whatever method you wish the drive to connect.  Once you have it connected, you will need to determine which /dev/diskX device it is using.  There are several ways of doing this.  However, the easiest way is with the ‘diskutil’ command:

$ diskutil list
/dev/disk0
   #:                       TYPE NAME                 SIZE       IDENTIFIER
   0:      GUID_partition_scheme                     *500.1 GB   disk0
   1:                        EFI                      209.7 MB   disk0s1
   2:                  Apple_HFS Macintosh HD         499.8 GB   disk0s2
/dev/disk1
   #:                       TYPE NAME                 SIZE       IDENTIFIER
   0:      GUID_partition_scheme                     *2.0 TB     disk1
/dev/disk2
   #:                       TYPE NAME                 SIZE       IDENTIFIER
   0:     Apple_partition_scheme                     *119.6 MB   disk2
   1:        Apple_partition_map                      32.3 KB    disk2s1
   2:                  Apple_HFS VirtualBox           119.5 MB   disk2s2

Locate the drive that appears to be the size of your new hard drive. If the hard drive is blank (a brand new drive), it shouldn't show any additional partitions. In my case, I've identified that I want to use /dev/disk1. Remember this device file path because you will need it for creating the raw disk vmdk file. Note the nomenclature above: /dev/disk1 is the device that accesses the entire drive, from sector 0 to the very end, while the /dev/diskXsY files access individual partitions created on the device. Make sure you've noted the correct /dev/disk here or you could overwrite the wrong drive.
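If you want to be extra sure you've identified the right device before doing anything destructive, diskutil can confirm the details. A quick sanity check (disk1 is just my example device; substitute yours):

$ diskutil info disk1 | grep -E 'Device Node|Size'

The reported size should match your new drive.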

Don’t create any partitions with MacOS X in Disk Utility or in diskutil as these won’t be used (or useful) in Windows.  In fact, if you create any partitions with Disk Utility, you will need to ‘clean’ the drive in Windows.

Creating a raw disk vmdk for VirtualBox

This next part will create a raw connector between VirtualBox and your physical drive.  This will allow Windows to directly access the entire physical /dev/disk1 drive from within VirtualBox Windows.  Giving Windows access to the entire drive will let you manage the entire drive from within Windows including creating partitions and formatting them.

To create the connector, you will use the following command in Mac OS X from a terminal shell:

$ vboxmanage internalcommands createrawvmdk \
-filename "/path/to/VirtualBox VMs/Windows/disk1.vmdk" -rawdisk /dev/disk1

It’s a good idea to create the disk1.vmdk where your Windows VirtualBox VM lives. Note, if vboxmanage isn’t in your PATH, you will need to add it to your PATH to execute this command or, alternatively, specify the exact path to the vboxmanage command. In my case, this is located in /usr/bin/vboxmanage.  This command will create a file named disk1.vmdk that will be used inside your Windows VirtualBox machine to access the hard drive. Note that creating the vmdk doesn’t connect the drive to your VirtualBox Windows system. That’s the next step.  Make note of the path to disk1.vmdk as you will also need this for the next step.

Additional notes: if the drive already has any partitions on it (NTFS or MacOS), you will need to unmount any mounted partitions before Windows can access it and before you can createrawvmdk with vboxmanage. Check 'df' to see if any partitions on the drive are mounted. To unmount, either drag the partition(s) to the Trash, use umount /path/to/partition, or use diskutil unmount /path/to/partition. You will need to unmount all partitions on the drive in question before Windows or vboxmanage can access it. Even one mounted partition will prevent VirtualBox from gaining access to the disk.
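As a concrete example, here's the sequence I'd use to make sure nothing on the drive is mounted before handing it to VirtualBox (the volume name 'Data' is an assumption; substitute whatever 'df' shows for your drive):

$ df -h                            # look for any /Volumes/... entries living on disk1
$ diskutil unmount /Volumes/Data   # unmount a single partition, or...
$ diskutil unmountDisk /dev/disk1  # ...unmount every partition on the disk at once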

Note, if this is a brand new drive, it should be blank and it won’t attempt to mount anything.  MacOS may ask you to format it, but just click ‘ignore’.  Don’t have MacOS X format the drive.  However, if you are re-using a previously used drive and wanting to format over what’s on it, I would suggest you zero the drive (see ‘Zeroing a drive’ below) as the fastest way to clear the drive of partition information.

Hooking up the raw disk vmdk to VirtualBox

Open VirtualBox.  In VirtualBox, highlight your Windows virtual machine and click the ‘Settings’ cog at the top.

  • Click the Storage icon.
  • Click the ‘SATA Controller’
  • Click on the ‘Add Hard Disk’ icon (3 disks stacked).
  • When the dialog appears asking whether to create a new disk or use an existing one, click 'Choose existing disk'.
  • Navigate to the folder where you created ‘disk1.vmdk’, select it and click ‘Open’.
  • The disk1.vmdk connector will now appear under SATA Controller

You are ready to launch VirtualBox. Note, if /dev/disk1 isn't owned by your user account, VirtualBox may fail to open this drive and show an error panel. If you see any error panels, check to make sure no partitions are mounted, then check the permissions of /dev/disk1 with ls -l /dev/disk1 and, if necessary, chown $LOGNAME /dev/disk1. The drive must not have any partitions actively mounted, and /dev/disk1 must be owned by your user account on MacOS X. Also make sure that the vmdk file you created above is owned by your user account, as you may have needed to become root to run createrawvmdk.
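For example, checking and fixing the ownership looks something like this (your ls output will differ; the device node is re-created each time the drive is attached, so you may need to repeat the chown after reconnecting the drive):

$ ls -l /dev/disk1
brw-r-----  1 root  operator   14,   3 Jun  2 10:12 /dev/disk1
$ sudo chown $LOGNAME /dev/disk1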

Launching VirtualBox

Click the ‘Start’ button to start your Windows VirtualBox.  Once you’re at the Windows login panel, log into Windows as you normally would.  Note, if the hard drive goes to sleep, you may have to wait for it to wake up for Windows to finish loading.

Once inside Windows, do the following:

  • Start->All Programs->Accessories->Command Prompt
  • Type in ‘diskpart’
  • At the DISKPART> prompt, type ‘list disk’ and look for the drive (based on the size of the drive).
    • Note, if you have more than one drive that’s the same exact size, you’ll want to be extra careful when changing things as you could overwrite the wrong drive.  If this is the case, follow these next steps at your own risk!
DISKPART> list disk
  Disk ###  Status         Size     Free     Dyn  Gpt
  --------  -------------  -------  -------  ---  ---
  Disk 0    Online           40 GB      0 B
  Disk 1    Online         1863 GB      0 B        *
  • In my case, I am using Disk 1.  So, type in ‘select disk 1’.  It will say ‘Disk 1 is now the selected disk.’
    • From here on down, use these commands at your own risk.  They are destructive commands and will wipe the drive and data from the drive.  If you are uncertain about what’s on the drive or you need to keep a copy, you should stop here and backup the data before proceeding.  You have been warned.
    • Note, ‘Disk 1’ is coincidentally named the same as /dev/disk1 on the Mac.  It may not always follow the same naming scheme on all systems.
  • To ensure the drive is fully blank, type in 'clean' and press Enter.
    • The clean command will wipe all partitions and volumes from the drive and make the drive ‘blank’.
    • From here, you can repartition the drive as necessary.

Creating a partition, formatting and mounting the drive in Windows

  • Using diskpart, here are the commands to create one partition using the whole drive, format it NTFS and mount it as G: (see commands below):
DISKPART> select disk 1
Disk 1 is now the selected disk
DISKPART> clean
DiskPart succeeded in cleaning the disk.
DISKPART> create partition primary
DiskPart succeeded in creating the specified partition.
DISKPART> list partition
  Partition ###  Type              Size     Offset
  -------------  ----------------  -------  -------
* Partition 1    Primary           1863 GB  1024 KB
DISKPART> select partition 1
Partition 1 is now the selected partition.
DISKPART> format fs=ntfs label="Data" quick
100 percent completed
DiskPart successfully formatted the volume.
DISKPART> assign letter=g
DiskPart successfully assigned the drive letter or mount point.
DISKPART> exit
Leaving DiskPart...


  • The drive is now formatted as NTFS and mounted as G:.  You should see the drive in Windows Explorer.
    • Note, unless you want to spend hours formatting a 1-2TB sized drive, you should format it as QUICK.
    • If you want to validate the drive is good, then you may want to do a full format on the drive.  New drives are generally good already, so QUICK is a much better option to get the drive formatted faster.
  • If you want to review the drive in Disk Management Console, in the command shell type in diskmgmt.msc
  • When the window opens, you should find your Data drive listed as ‘Disk 1’

Note, the reason to use 'diskpart' over the Disk Management Console is that 'clean' isn't available in the Disk Management Console. The command is only available in the diskpart tool, and it's the only way to completely clean the drive of all partitions and make the drive blank again. This is especially handy if you previously formatted the drive with the MacOS X Journaled filesystem and there's an EFI partition on the drive. The only way to get rid of a Mac EFI partition is to 'clean' the drive as above.

Annoyances and Caveats

MacOS X always tries to mount recognizable removable (USB) partitions when they become available.  So, as soon as you have formatted the drive and have shut down Windows, Mac will likely mount the NTFS drive under /Volumes/Data.  You can check this with ‘df’ in Mac terminal or by opening Finder.  If you find that it is mounted in Mac, you must unmount it before you can start VirtualBox to use the drive in Windows.  If you try to start VirtualBox with a mounted partition in Mac OS X, you will see a red error panel in VirtualBox.  Mac and Windows will not share a physical volume.  So you must make sure MacOS X has unmounted the volume before you start VirtualBox with the disk1.vmdk physical drive.

Also, the raw vmdk drive is specific to that single hard drive.  You will need to go through the steps of creating a new raw vmdk for each new hard drive you want to format in Windows unless you know for certain that each hard drive is truly identical.  The reason is that vboxmanage discovers the geometry of the drive and writes it to the vmdk.  So, each raw vmdk is tailored to each drive’s size and geometry.  It is recommended that you not try to reuse an existing physical vmdk with another drive.  Always create a new raw vmdk for each drive you wish to manage in Windows.

Zeroing a drive

While the clean command clears off all partition information in Windows, you can also clean off the drive in MacOS X.  The way to do this is by using dd.  Again, this command is destructive, so be sure you know which drive you are operating on before you press enter.  Once you press enter, the drive will be wiped of data.  Use this section at your own risk.

To clean the drive use the following:

$ dd if=/dev/zero of=/dev/disk1 bs=4096 count=10000

This command writes 10,000 blocks of 4,096 bytes each, all zeros (about 41 MB in total). This should overwrite any partition information and clear the drive off. You may not need to do this, as the diskpart 'clean' command may be sufficient.
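If you want to verify the zeroing took, read the first block back; reading is non-destructive, so this check is safe. With the start of the drive wiped, hexdump should show nothing but zeros:

$ dd if=/dev/disk1 bs=4096 count=1 2>/dev/null | hexdump -C
00000000  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
00001000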

Using chkdsk

If the drive has become corrupted or is acting in a way you think may be a problem, you can always go back into Windows with the disk1.vmdk connector and run chkdsk on the volume. You can also use this on any NTFS or FAT32 volume you may have. You will just need to create a physical vmdk connector, attach it to your Windows SATA controller, and make sure MacOS X doesn't have it mounted. Then, launch VirtualBox and clean it up.
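For reference, the check itself from a Windows command prompt looks like this ('g:' assumes the drive letter assigned earlier; the /f flag tells chkdsk to fix any errors it finds):

C:\> chkdsk g: /f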

Tuxera

If you are using Tuxera to mount NTFS, once you exit out of Windows with your freshly formatted NTFS volume, Tuxera should immediately see the volume and mount it.  This will show you that NTFS has been formatted properly on the drive.  You can now read and write to this volume as necessary.
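A quick way to prove the volume really is mounted read-write under Tuxera (the /Volumes/Data mount point is an assumption based on the 'Data' label used earlier):

$ df -h /Volumes/Data
$ touch /Volumes/Data/write-test && rm /Volumes/Data/write-test

If the touch succeeds without an error, the driver is writing properly.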

Note that this method to format a drive with NTFS is the safest way on Mac OS X.  While there may be some native tools floating around out there, using Windows to format NTFS will ensure the volume is 100% compliant with NTFS and Windows.  Using third party tools not written by Microsoft could lead to data corruption or improperly formatted volumes.

Of course, you could always connect the drive directly to a Windows system and format it that way. ;)


Clickity Click – The Microsoft Dilemma

Posted in computers, microsoft, windows by commorancy on April 30, 2010

Once upon a time, the mouse didn’t exist. So, the keyboard drove the interface. Later, Xerox came along and changed all of that (with the help of Steve Jobs and Apple). Of course, as it always does, Microsoft absconded with the mouse functionality and built that into Windows… not that there was really much choice with this decision.

We flash a decade forward or so and we're at Windows XP. A reasonably streamlined Windows operating system from Microsoft. In fact, this is arguably the most streamlined that Microsoft's Windows has ever been (and likely will ever be). Granted, security was a bit weak, but the user interface experience was about as good as it gets. With only a few clicks you could get to just about anything you needed.

Flash forward nearly another decade to the release of the dog that was Windows Vista. Actually, Windows Vista's look was not too bad. But that's pretty much where it ends. Microsoft must not have done much usability testing with Vista, because what used to take one or two clicks of the mouse now takes one to three extra clicks. The reason: Microsoft decided to open useless windows as launchpads to get to the underlying components. Added layers that are pointless and unnecessary. For example, you used to be able to right-click 'My Network Places', release on Properties and get right to the LAN adapters to set them up. No more. Now that same Properties panel opens a launchpad interface that requires clicking 'Change Adapter Settings' just to get to the adapters. Pointless. Why was this added layer necessary? And this is the best of the worst.

More than this, though, the labeling of the links to the underlying components is sometimes obscure or misleading, so you're never really sure which link leads to the thing you need. That means you end up clicking several things just to find it. Yes, you can use the help to find things, but that means opening even more windows and clicking through even more time-wasting steps just to locate something that should have been one click away.

Server Operating Systems

This issue is not limited to the desktop OS world. In the server world, such as Windows Server 2008 R2, these launchpads are now excessive and in-your-face. For example, when you first install Windows Server 2008 R2, two of these panels open as the first thing after you log in. So now I'm already starting out having to close two windows that I didn't even need to see at that point, just to get to the desktop. Likely, if you're installing a server operating system, you're planning on hooking it into a domain controller, so setting up anything for the local administrative user is pointless. That means I have to close out of these useless panels just to get to the panel where I can join this machine to the domain. It would have been far more helpful for the first thing to open to be the join-the-domain panel. I don't need to set up anything else on a newly installed machine until it's in the domain.

Desktop Systems

Most people are much more familiar with the desktop operating systems than the server versions. But these added clicks appear throughout not only Vista but now Windows 7. Because Windows 7 is effectively a refresh of Vista with added compatibility features, the extra clicks are still there and still annoying. Why Microsoft had to take a streamlined interface and make it less efficient for users, I'll never know. But these added clicks to reach standard operating system tools waste time and productivity. They also steepen the learning curve, since people must be taught the new method.

“If it ain’t broke, don’t fix it”

This motto needs to be ingrained into the engineering team at Microsoft because they clearly do not understand it. Adding extra layers of windows does not make the OS more efficient. It makes it bloated, cumbersome and extremely inefficient. That extra click might only take an extra second, but those seconds add up when you're doing a repetitive task that involves those windows as part of your job.

As another example, the default view of Control Panel in XP shows the control panels themselves. In Vista / Windows 7, it now brings up a launchpad of abstract tasks, like 'System and Security' and 'User accounts and family safety'. Clicking these leads to more sub-task pages. So, instead of just seeing the actual control panels, you have to click through a series of abstract task pages that eventually lead you to a tool. No no no. Pointless and inefficient. Let's go back to opening the actual control panel view. I want to see the actual control panels. The abstract task idea is great for beginners. Advanced users want to turn this crap off. It's pointless, slow and unnecessary. Power users do not need it.

Windows for beginners

Microsoft Bob is dead. Let's not go there again. So why does Microsoft insist on spoon-feeding us an interface this dumbed down, with such excessively clicky features? This interface would have been a great first step in 1990. But not today. Taking this step today is a step backward in OS design. Anyone who has used Windows for more than six months doesn't need these added inconveniences and inefficiencies. In fact, most average computer users don't need this level of basics. Only the very newest beginners need this level of spoon-feeding.

Microsoft needs to create a new version (or, alternatively, a preference to turn 'Newbie mode' off). Simply put, they need Windows for the Power User: a stripped-down design that gets back to basics. A design that eliminates these cumbersome beginner features and brings back the single-click workflows that let us navigate the operating system fast, using the mouse efficiently (read: as few clicks as possible). Obviously, if you're running ANY server edition, that should automatically imply the power-user interface. Let's get rid of these helper panels on startup, mkay?

Microsoft, can we make this a priority for Windows 8?

What is it about tablets?

Posted in Apple, botch, business, california, computers, microsoft by commorancy on January 15, 2010

Ok, I’m stumped.  I’ve tried to understand this manufacturing trend, but I simply can’t.  We have to be heading towards the fourth or maybe fifth generation of tablet PCs, yet each time they bring tablets back to the the market, this technology fails miserably.  Perhaps it’s the timing, but I don’t think so.  I think the market has spoken time and time again.  So, what is it about this technology that make manufacturers try and try again to foist these lead balloons onto us about every 6 years?

Wayback machine

It was in the early 90's that Grid Computers arguably released the first tablet (or at least one of the very first tablets). Granted, it used a monochrome plasma screen and, as I recall, it ran DOS and Windows 3.1, but these things flopped badly for many different reasons. Ultimately, the market spoke and no one wanted them. It's no wonder why, too. The lack of a keyboard, combined with the size and weight of the unit, the need for a pen and the lack of a truly viable input method, doomed this device to the halls of flopdom. Into obscurity this device went, along with Grid Computers (the company).

In the early 2000s, Microsoft and the hardware manufacturers tried again to resurrect this computer format with XP Tablet edition. This time they made the devices more like notebooks, where the screen could detach from the keyboard and become a tablet. Attached, it looked and felt like a notebook; detached, it was a tablet. Again, there was no viable input method without the keyboard, even though the screens were touch-capable. The handwriting recognition was poor at best, and where voice input existed, it failed to work. XP Tablet edition was not enough to make the tablet succeed. Yet again, the tablet rolled into obscurity… mostly. You can still buy tablets, but they aren't easy to find, few manufacturers make them, and they ship with hefty price tags.

Origami

Then, in the mid 2000's, came Microsoft with Origami. Origami was supposed to be a compact version of Windows, like Windows CE (although CE would have worked just fine for this; I don't know why Origami really came about). A few tablets came out using Origami, but most computers that loaded this version of Windows used it in the microPC format. Since the Origami version of Windows was a full version (unlike CE), it was a lot more powerful than computers of that size really needed, and the price tag showed it. Sony and a few other manufacturers made these microPCs, but they sold at expensive prices (like $1999 or more) for a computer the size of a PDA. Again, no viable input method could suffice on the microPC tablets, and so these died yet another death… although the microPC hung around a bit longer than the tablet. You might even still be able to buy one in 2010, if you look hard enough.

Netbook

Then came the Netbook: the $199-299 scaled-down notebook using the Atom processor. This format took off dramatically and has been a resounding success. The reason? Price. Who wouldn't want a full-fledged portable computer for $199-299? You can barely buy an iPod or even a cell phone, let alone a desktop PC, for that price. The Netbook price point is the perfect price point for a low-end notebook computer. But what does a Netbook have to do with a tablet? Nothing, but it's here to illustrate why tablets will continue to fail.

Tablet resurrection

Once again, we are in the middle of yet another tablet resurrection attempt. Rumor has it that Apple will release a tablet. HP is also pushing yet another tablet loaded with Windows. Yet from past failures we already know this format is dead on arrival. What can Apple possibly bring to the tablet format that Microsoft and the PC makers haven't? Nothing. That's the problem. The only possible selling point for a tablet is price. Tablets have to get down to the $199-299 price tag to have any hope of gaining popularity. Yet Apple is not known for making budget computers, so we know that price point is out. Assuming Apple does release a tablet, it will likely be priced somewhere between $899 and $1599. Likely, they will offer three versions, with the lowest starting at $899. Worse, at the lowest price point it will be hobbled, lacking most bells and whistles.

Even if Apple loads up the tablet with all of the bells and whistles (i.e., Bluetooth, 3G, GSM, OLED display, iTunes app capability, handwriting recognition, voice recognition, WiFi, wireless USB, a sleek case design, etc.), the only thing those bells and whistles will do is raise the cost to produce the unit. The basic problems with a tablet are portability (too big), the lack of a viable input device, weight and fragility (not to mention battery life). Adding a hefty price tag on top ensures that people won't buy it. Of course, the Apple fan boys will buy anything branded with a half-bitten Apple logo. But for the general masses, no. This device cannot hope to succeed on Apple fan boy income alone.

Compelling Reasons

Apple has to provide some kind of paradigm-shifting technology to make such a failure of a device like the tablet become successful (whatever Apple cleverly names its tablet device). If the tablet is over 7 inches in size, it will be too large to be portable. Utilizing OLED technology ensures the cost is extremely high. Putting a thin case on it like the MacBook Air's ensures that it's overly fragile. We've yet to find out the battery life expectancy. So far, this is not a winning combination.

So, what kind of technology would make such a paradigm shift? The only such technology I can think of would have to be a new input method: a way to get commands into the device and drive the interface easily. Clearly, a multi-touch screen will help. The iPod is good in that regard (except that you can't use it with gloves). But if you want to write email, how do you do that on a tablet? Do you hand-peck the letters on that silly on-screen thing that Apple calls a keyboard? No. That's not enough. Apple needs a fully phonetic speech-input technology that's 100% flawless without any training. That means you speak the email in and it converts it perfectly to text. Also, you speak any conversational command and the computer figures out what you mean, flawlessly. This is the only technology that makes any sense on a tablet. Of course, it would need to support multiple languages (a tall order), it needs to be flawless and perfect (an extremely tall order), and it would also need to work in a noisy room (not likely).

Can Apple make such a shift? I don't know. The hardware is there to support such a system. The question is: is the software ready? Let's hope Apple thinks so. Otherwise, if Apple releases its rumored tablet without such a paradigm shift, it could be the worst stumble Apple has made since the Lisa.
