Favorite song of the week: Nuclear by Mike Oldfield
The song is slated to be featured in the game Metal Gear Solid V: The Phantom Pain.
Apple’s newest MacBook: Simply Unsatisfying
It’s not a MacBook Air. It’s not a MacBook Pro. It’s simply being called the MacBook. Clever name for a computer, eh? It’s not like we haven’t seen this brand before. What’s the real trouble with this system? A single USB-C connector. Let’s explore.
Simplifying Things
There’s an art to simplification, but it seems Apple has lost its grip on this fundamental concept. Jobs got it. Oh man, did Jobs get the concept of simplification in spades. Granted, not all of Jobs’s simplification experiments worked. Take the idea of a computer with only a mouse and no keyboard. Great concept, but you really don’t want to enter text through an on-screen keyboard. This is why the iPad is so problematic for anything longer than one-liners, at least without some kind of audio dictation system. At the time, the Macintosh didn’t have such a system. With Siri, however, we do. I’m not necessarily endorsing that Apple bring back the keyboard-less computer, but with a slight modification to Siri’s dictation capabilities, it would now be possible.
Instead, the new MacBook has taken things away from the case design. More specifically, it has replaced all of those, you know, clunky, annoying and confusing USB 3.0 and Thunderbolt connectors that mar the case experience. Apple’s engineers have taken this old and clunky experience and ‘simplified’ it down to exactly one USB-C port (excluding the headphone jack… and why do we even need that jack again?).
The big question: is this really simplification?
New Case Design
Instead of the full complement of ports we previously had, such as the clever MagSafe power connector, one or two Thunderbolt ports, two USB 3.0 ports and an SD card slot, we now have exactly one USB-C port. And it’s not even a well-known or widely used port style yet.
Smart. Adopt a port that practically no one is using, then center your entire computer’s universe around this untried technology. It’s a bold, if risky, maneuver for Apple. No one has ever said Apple isn’t up for risky business ideas. It’s just odd that they centered it on an open standard rather than something custom designed by Apple. Let’s hope Apple has massively tested plugging and unplugging this connector. If it breaks, you’d better hope your AppleCare service is active. And since plugging and unplugging falls under wear-and-tear, it might not even be covered. Expect to spend more time at the Genius Bar arguing over whether your computer is covered when this port breaks. On the other hand, we know the MagSafe connector is almost impossible to break. How about this unknown USB-C connector? Does it have the same functional lifespan? My guess is no.
I also understand that USB-C inherits the 10 Gbps bandwidth of the USB 3.1 standard and has a no-confusion, plug-in-either-way connector style. But it’s not as if Thunderbolt didn’t already offer the same transfer speed, albeit without the plug-in-either-way cable. So, I’m guessing this means Thunderbolt is officially dead?
What about the Lightning cable? Apple only recently designed and introduced the Lightning connector for charging and data transfer. Why not reuse the Lightning connector with a faster data transfer standard behind it? Apple spent all this time and effort on this cool new cable for charging and data transfer, but what the hell? Let’s just abandon that too and go with USB-C? Is it all about throwing out the baby with the bathwater over at Apple?
I guess the fundamental question is… really, how important is a plug-in-either-way connector? Is Apple insinuating that the general public is so dumb it can’t figure out how to plug in a cable? Yes, trying to get a microUSB connector inserted in the dark (because it only goes in one direction) can be a hassle. But the real problem isn’t that it’s a hassle; the real problem is that the connector itself was engineered all wrong. Trying to fit a microUSB cable into a port is a problem because it’s metal on metal. Even when you do manage to get it lined up in the right direction, it sometimes still won’t go in. That’s a fundamental flaw in the port connector’s design. It has nothing to do with directionality. I digress.
Fundamentally, a plug-in-either-way cable should be the lowest item on the agenda. The highest should be simplifying to give a better overall user experience, not hobbling the computer to the point of being unnecessarily problematic.
Simply Unsatisfying
Let’s get into the meat of this whole USB-C deal. While the case now looks sleek and minimal, it doesn’t really simplify the user experience. It merely changes it. It’s basically a shell game: it moves the ball from one cup to another, but fundamentally doesn’t change the ball itself. So, instead of carrying only a power adapter and the computer, you are now being forced to carry a computer, a power adapter and a dock. I fail to see exactly how this simplifies the user experience at all. I left docks behind when I walked away from Dell notebooks. Now we’re being asked to use a dock again by, of all companies, Apple?
The point of making changes in any hardware (or software) design is to improve usability and the user experience. Changing the case to offer a single USB-C port does neither. This is merely a cost-cutting measure by Apple. Apple no longer needs to pay for all of these arguably ‘extra’ (and costly) ports on the case. Removing all of those ‘extraneous’ ports means less cost for the motherboard and the die-cuts on the case, but at the expense of the user, who must now carry around more things to support the computer. That doesn’t simplify anything for the user. It also burdens the user by forcing them to pay more for things that were previously included in the system itself. Not to mention requiring the user to carry around yet more dongles. I’ve never known Apple to foist less of an experience on the user as a simultaneous cost-cutting and accessory money-making measure. This is most definitely a first for Apple, but not a first they want to become known for. Is Apple now taking pages from Dell’s playbook?
Instead of walking out of the store with a computer ready to use, now you have to immediately run to the accessory aisle and spend another $100-200 (or more) on these ‘extras’. Extras, I might add, that were previously included in the cost of the previous-generation computers. But now they cost extra. So, that formerly $999 computer that already had everything you needed will now cost you $1100-1200 or more (once you consider that you now need a bag to carry all of these extras).
Apple’s Backward Thinking?
I’m sure Apple is thinking that, eventually, that one port is all we’ll need. No more SD cards, no more Thunderbolt devices, no more USB 3 connectors. We’ll just do everything wirelessly. After all, you have the (ahem) Apple TV for a wireless remote display (which would be great if only that technology didn’t suck so badly on latency and suffer from horrible MPEG artifacting because the bit rate is too low).
Apple likes to think it’s thinking about the future. But by the time the future arrives, what Apple chose is already outdated, because it turns out no one else actually adopted that technology. So then Apple has to resort to a new connector design or a new industry standard, because no other computers have adopted what Apple is pushing.
For example, Thunderbolt is a tremendous idea. By today, this port should have been widely used and widely supported, yet it isn’t. There are few hard drives that use it. There are few peripherals that support it. Other than Apple’s use of this port to drive extra displays, that’s about the extent of its use. It’s effectively a dead port on the computer. Worse, just about the time Thunderbolt might actually be picking up steam, Apple dumps it in favor of USB-C, which offers the same transfer speed. At best, a lateral move, technologically speaking. If this port had offered 100 Gbps, I might not have even written this article.
Early Adopter Pain
What this all means is that users who buy into this new USB-C-only computer (I intentionally ignore the headphone jack because it’s still pointless) will suffer early adopter pains. Not only will you be almost immediately tied to buying Apple gear, Apple has likely set up the USB-C connector to require licensed and ID’d cables and peripherals. This means that if you buy a third-party unlicensed cable or device, Apple is likely to prevent it from working, just as it did with unlicensed Lightning cables on iOS.
This also means that, for at least 1-2 years, you’re at the mercy of Apple to provide you with the dongle you need. If you need VGA and there’s no dongle, you’re outta luck. If you need a 10/100 network adapter, outta luck. Until or unless a specific situational adapter becomes available, you’re stuck. Expect some level of pain when you buy into this computer.
Single Port
In addition to all of the above, let’s just fundamentally understand what a single port means. If you have your power brick plugged in, that’s it. You can’t plug anything else in. Oh, you need to run 2 monitors, read from an SD card, plug in an external hard drive and charge your computer? Good luck with that. That is, unless you buy a dock that offers all of these ports.
It’s a single port being used for everything. That means it has a single 10 Gbps path into the computer. So, if you plug in a hard drive that consumes 5 Gbps and a 4K monitor that consumes another 2 Gbps, you’re already consuming most of that connector’s entire bandwidth into the computer. Or, what if you need a 10 Gbps Ethernet adapter? That pretty much consumes the entire bandwidth of this single USB-C connector. Good luck trying to run a hard drive and a monitor with that setup.
Where an older MacBook Air or Pro had two 5 Gbps USB 3 ports and one or two 10 Gbps Thunderbolt ports (offering well over 10 Gbps of aggregate paths into the computer), the new MacBook supports a maximum of 10 Gbps over that single port. Not exactly the best trade-off for performance. Of course, the reality is that current Apple motherboards may not actually be capable of handling a 30 Gbps input rate, but the headroom was at least there to try. I would expect the motherboard to handle an input rate greater than 10 Gbps, though.
With the new MacBook, you are firmly stuck at a maximum input speed of 10 Gbps because there is only a single port. Again, an inconvenience for the user. Apple once again assumes that 10 Gbps is perfectly fine for all use cases. I’m guessing Apple hopes users simply won’t notice. Technologically, this is a step backward, not forward.
Overall
Among the early adopter problems and the relevancy problems USB-C has yet to overcome, this computer now offers a more convoluted user experience. Additionally, instead of offering something that would be truly more useful and enhance usability, such as a touch screen paired with an exclusive Spotlight mode, Apple opted to take this computer in a questionable direction.
Sure, the case colors are cool and the idea of a single port is intriguing, but it’s only when you delve into the usefulness of this single port that the design quickly unravels.
Apple needs a whole lot of help in this department. I’m quite sure that had Jobs been alive, while he might have introduced the simplified case design, it would have been overshadowed by the computer’s feature set (i.e., touch screen, better input device, better dictation, etc.). Instead of trying to wow people with a single USB-C port (which offers more befuddlement than wow), Apple should have fundamentally improved the actual usability of this computer by enhancing the integration between the OS and the hardware.
The case design doesn’t ultimately much matter; the usability of the computer itself does. Until Apple understands that we don’t really care what the case looks like as long as it provides what we need to compute without added hassles, weight and costs, Apple’s designers will continue running off on tangents, spending useless cycles redesigning minimalist cases that don’t benefit from it. At the very least, Apple needs to understand that there is a point of diminishing returns in rethinking minimalist designs… and with this MacBook, the designers have gone well beyond that point.
Make LuxRender render faster
In addition to writing blogs here at Randosity, I also like creating 3D art. You can see some of it off to the right side of the screen in the Flickr images. I point this out because I typically like to use Daz Studio to do my 3D work. I also prefer working with the human form over still life, but occasionally I’ll also do a still life, landscape or some other type of scene. Today, I’m going to talk about a rendering engine that I like to use: LuxRender. More specifically, how to get it to render faster. You can read more about it at www.luxrender.net. Let’s explore.
3Delight and Daz Studio
Daz Studio is what I use to compose my scenes. Built into Daz Studio is a rendering engine named 3Delight. It’s a very capable biased renderer; that is, it prefers to use lighting tricks and internal shortcuts to do its rendering work. While 3Delight does support global illumination (aka GI, or bounced lighting), it doesn’t do it as well or as fast as I would like. When GI is turned on, it takes forever for 3Delight to calculate the bounced light on surfaces. Unfortunately, I don’t have that long to wait for a render to complete. So, I turn to a more capable renderer: LuxRender. Keep in mind that I do still render in 3Delight and can get some very realistic scenes out of it, too. But those scenes have a completely different look than Lux, and they typically take a whole lot longer to set up (and a ton more lights).
LuxRender
What’s different about Lux? The developers consider it an unbiased renderer; that is, it’s physics-based. In fact, all renderers attempt to use physics, but Lux attempts to apply physics to all light sources. The end result? Better, more accurate, more realistic lighting… and lighting is the key to making a scene look its best. Without great lighting, the objects in a scene will look dull, flat and without volume. It would be like turning the lights off in a room and attempting to take a photograph without a flash: what you get is a grainy, low-light, washed-out and flat image. That’s not what you want. For the same reason you use a flash in photography, you want LuxRender producing your images.
Now, I’m not here to say that LuxRender is a perfect renderer. No no no. It is far from perfect. It has its share of flaws. But for lighting, it can produce some of the most realistically lit scenes I’ve found from a 3D renderer. Unfortunately, this renderer is also slow. Not as slow as 3Delight with GI enabled, but by no stretch fast. Though, the more light you add to a scene, the faster Lux renders.
However, even with sufficient lighting, there are still drawbacks to how fast it can render. Let’s understand why.
LuxRender UI
The developers who designed LuxRender also decided it needed a UI: a tool that allows you to control and tweak your renders (even while they’re rendering). I applaud what the LuxRender team has done with the UI’s image-tweaking functionality, but for all of the great things in the UI, there are not-so-smart things done on the rendering side. As cool as a tweakable render-in-progress is, it should never take away from how fast the renderer can render. Unfortunately, it does.
Let’s step back a minute. When you use Daz Studio, you need a bridge to operate Lux; it needs to be able to export the scene into a format that Lux can parse and render. There are two bridges out there. The first is Reality. The second is Luxus. I’ll leave it to you to find the bridge that works best for you. However, Reality has versions for both Daz Studio and Poser, so if you have both of these apps, you can get each version and have a similar experience across the two. If you’re solely in the Daz world, you can get Luxus and be fine. Once you have a bridge and you export a scene to LuxRender, that’s when you’ll notice a big, glaring, sore-thumb problem while rendering.
Render Speed and LuxRender UI
When I first began using LuxRender, one thing became very apparent: LuxRender has an annoying habit of stopping and starting. Because my computer’s fans speed up when the CPU is put under load and slow down when it isn’t, I can hear this behavior: the fans spin up and spin down at regular intervals. I decided to investigate why. Note that a renderer should be capable of running all of the CPU cores at full speed until the render has completed. 3Delight does this. Nearly every other rendering engine does this. Not LuxRender.
Here’s part of the answer. There are three automatic activities inside the LuxRender UI while rendering:
- Tonemapping
- Saving the image to disk from memory
- Writing the FLM resume file
Each of these activities outright halts the rendering process, sometimes for several minutes. This is insane. Now, let’s understand why. Most systems today offer 4 or more cores (8 or more with hyperthreading). Since you have more than one core, it makes no sense to stop all of the cores just to do one of the above tasks. Instead, the developers should have dedicated one core to these housekeeping tasks, leaving the rest of the cores to continue rendering all of the time. The developers didn’t do this. Instead, they stop all cores, use one core (or less) to write the file to disk or update the GUI display, and then wait and wait and wait. Finally, the cores start up again. Over a long render, this non-rendering time adds up to at least 5 minutes: 5 minutes where zero rendering is taking place. That’s way too long.
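To make the criticism concrete, here’s what the alternative design could look like. This is a minimal Python sketch of my own, not LuxRender’s actual code (Lux is written in C++, and none of these names come from its source): the render workers never stop; a single background thread grabs a quick copy of the film buffer and does the slow tonemapping and saving on its own time.

```python
import threading
import time

# Illustrative sketch only -- not LuxRender's code. Render workers never
# stop; one background thread snapshots the framebuffer and does the slow
# tonemapping/saving work off the render path.

framebuffer = [0.0] * (64 * 64)   # hypothetical shared film buffer
fb_lock = threading.Lock()        # held only long enough to copy or splat
stop = threading.Event()

def render_worker(worker_id: int) -> None:
    """Trace samples forever; pause only briefly to splat into the film."""
    while not stop.is_set():
        # ... trace a batch of samples here ...
        with fb_lock:
            framebuffer[worker_id] += 0.001  # stand-in for sample splatting

def tonemap_and_save(snapshot) -> None:
    """Stand-in for the slow work: tonemapping plus image/FLM writes."""
    time.sleep(2)

def periodic_writer(interval_seconds: float) -> None:
    """Each interval, copy the film under the lock, then work on the copy."""
    while not stop.wait(interval_seconds):
        with fb_lock:                 # workers pause only for this quick copy
            snapshot = framebuffer.copy()
        tonemap_and_save(snapshot)    # slow work; rendering continues meanwhile

workers = [threading.Thread(target=render_worker, args=(i,)) for i in range(8)]
writer = threading.Thread(target=periodic_writer, args=(10.0,), daemon=True)
for t in workers:
    t.start()
writer.start()
time.sleep(30)   # let it "render" for a while
stop.set()
for t in workers:
    t.join()
```

The only time the workers wait is for the brief snapshot copy; the minutes of tonemapping and disk I/O happen entirely off the render path.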
How do I get around this issue? Well, I don’t entirely. If you want to use LuxRender, the first thing to do is run over to luxrender.net and file a complaint asking for this to be fixed. The second thing to do is set the tonemapping interval to 3600 seconds, the image-write-to-disk interval to 3600 seconds and the FLM write interval to 3600 seconds. That means Lux will only save to disk every hour, only update the screen every hour and only save a resume file every hour, giving LuxRender one hour of solid render time without interruptions from these silly update processes. This is especially important when you’re not even using the LuxRender UI.
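Where do these intervals live? The bridges usually expose them in their export dialogs, but they ultimately land in the Film block of the exported .lxs scene file, which you can also edit by hand. Below is a sketch of what that block looks like; the parameter names match LuxRender’s fleximage film as I remember it, but they have shifted between Lux versions, so check your version’s documentation:

```
# Film block from an exported .lxs scene (sketch -- intervals in seconds)
Film "fleximage"
    "integer xresolution" [3000]
    "integer yresolution" [3000]
    "integer displayinterval" [3600]   # tonemap and refresh the UI hourly
    "integer writeinterval" [3600]     # write the output image hourly
    "integer flmwriteinterval" [3600]  # write the FLM resume file hourly
```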
Note that many applications set update intervals as short as a few seconds. Considering the above, that’s stupid. Yeah, we all want instant gratification, but I want my image to render its absolute fastest. I don’t need to see every single update in the UI. If I want an update, I can ask the UI for one when I bring it to the front. Automatically updating the UI at 10-second intervals (and stopping the rendering to do it) is just insane and a waste of time, especially when I can simply refresh the UI myself. In fact, there is absolutely no need for an automatic UI refresh, ever.
Network Rendering
The second way to speed up rendering is to use other systems you may have around the house. They don’t necessarily need to be the fastest things out there, but even adding one more machine to the rendering pool makes a big difference in how fast your image completes. This is especially important if you’re rendering at sizes of 3000 by 3000 pixels or higher.
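LuxRender supports this through its network rendering mode: you run a headless copy of Lux on each helper machine and point your main render at it. Roughly, using the stock luxconsole binary (the flags and the default port of 18018 are from the Lux builds I’ve used; verify against your version):

```sh
# On each helper machine: start luxconsole as a render slave
luxconsole --server

# On the main machine: render the exported scene using the helpers
luxconsole --useserver 192.168.1.20 --useserver 192.168.1.21 scene.lxs
```

The LuxRender GUI also has a Network tab for adding slaves, if you’d rather skip the command line.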
System Specs and Lux
Of course, buying a well-equipped system will make rendering faster. To render your absolute fastest in Lux, it’s a given that you need CPU power, CPU cache and large amounts of RAM. So, get what you can afford, but make sure it has a fair number of cores, a reasonable L1 and L2 cache set and at least 16GB of RAM (for 3k by 3k or larger images). If you add one or more GPUs to the mix, Lux will throw that processing power on top and render even faster. But this doesn’t solve the problem described above. Even with 32 cores, 128GB of RAM and the fastest L1 and L2 caches, you still have the stopping-and-starting problem.
If you want to dabble in LuxRender, run over to luxrender.net and file a complaint asking for this cycling problem to be fixed. In this day and age of multiple cores and multithreading, stopping the render process to save a file or update a UI is absolutely insane. To get your fastest renders, set the update intervals to 3600 seconds. Note, though, that if LuxRender crashes during one of those one-hour intervals, you will lose up to an hour of work (everything since the last FLM save). I haven’t had that happen while rendering, though.
So, that’s how you get your fastest render out of LuxRender.