ABC’s Lost: What really happened?
For 6 years, we tuned in to find out what the next episode would bring. For 6 years, we wondered as the premise got stranger and stranger. In the end, we finally saw all of the plane crash victims we had come to know together one last time, in death. So, what really happened?
Common Theories
A lot of people theorize that they were dead the whole time. Others believe everything from seasons 1-5 was real. Still other theories fall somewhere between these two. None of these scenarios fits exactly with what I believe happened. Keep in mind that the theories below are mine. If the writers choose to revisit this story and alter their vision of what really happened and how it happened, then that’s up to them. Any new stories they put forth could also negate the theories below. As the show sits today, here is my theory.
Were they dead?
Yes. They were dead before the plane crashed on the island. In fact, they probably died in a crash at sea. If they were dead, then where were they and what were we watching? Though they were dead from our Earthly plane of existence, they did seem very much alive. You’ll need to understand that the writers’ use of the jumbo jet archetype is a metaphor (and pun) for carrying these people to the next ‘plane’ of existence. Once you realize that the plane is merely a metaphor, then you’ll understand the entire show. Even the title ‘Lost’ is both a pun and a foreshadowing of the main characters’ ‘awakening’ when put into the context of the story.
That flight literally moved each of the victims to the next plane of existence, which allowed them to continue their lives right where they left off from their former reality (in excruciating detail), just as though the plane had really crashed. Let’s start by understanding that plane of existence. The next plane is supposedly the plane of imagination and creation (and a waypoint for the next step in our journey). If this territory seems unfamiliar, you should probably research the 7, 12 or 31 planes of existence theories. In the next plane from ours, you can create a realistic universe of your own choosing. So, the island represents this plane of existence. The island had rules because the person who imagined the island created those rules. It looked, smelled, felt and tasted like a real island because that plane of existence was just as real to those involved.
In the case of people new to that plane, they are not yet aware that they are dead (from the Earthly reality) and continue onward ‘living’ their lives as though they were still alive on the Earthly plane. The reason the physicality of the island mirrors our physical human reality so closely is that all people who have recently died end up there. Because each person’s essence was tied to the Earth plane for so long, it’s natural to bring that familiarity into the plane of imagination and creation and recreate the most familiar things exactly as they were (people and all). Hence, the Island.
The Glitch
In that plane of existence, things will be a little off kilter here and there (like the cat glitch in The Matrix). For example, the smoke monster, the island barrier, Jacob, people randomly appearing and disappearing on the island, items they need randomly appearing and disappearing, being cured of illness, time travel, magical events, etc. These are all manifestations of someone’s imagination and/or of being in that non-physical plane of reality. Because none of the people realized they were effectively in a dream reality, they never ‘woke up’ to it… all except Desmond. He didn’t wake up, but he could manipulate parts of that island reality. In fact, he may have been the ‘constant’ who unknowingly created the island from his imagination after having died sometime earlier. Assuming Desmond was the creator of the island, he couldn’t wake up before the rest of the characters or the Island might drastically change.
Note, the characters discount or disregard the glitching because that plane of existence is less rational than the Earthly plane. So, events that would seem way out of place here on Earth are more readily accepted in that plane. Acceptance of the glitching is part of the awakening process.
Why strand them there?
That’s a good question. Let’s understand that they would have ended up in that plane of existence simply by their physical bodies dying. However, the writers needed a place to put the plane crash victims to create this story, and placing them all into Desmond’s plane of existence was as good a place as any. If you have a bunch of dead people, it seemed to make sense to the writers, and it produced a good enough show.
But, they left the island!
Well, yes and no. Because that plane of existence can manifest anyone’s imagination, it’s easy to have characters end up back at home. That doesn’t mean they were really there. What the characters saw was merely a shadow world created by that character in the imagination plane. That’s why the real world always seemed just a little bit odd, somewhat unnatural and unreal. So, anyone they interacted with was simply a dream character. Because not one of the characters ever woke up, they never knew they could learn to manipulate their own world in any way they saw fit. But, if they had awakened, they would also know that they’re dead. So, for the writers, it would have revealed the ending too soon to have any one character actually ‘wake up’.
Some of the people died on the show
Yes, they did. But weren’t they already dead? Yes. Those characters who died on the island suddenly realized they were already dead and moved on from that plane to the next one earlier than the rest of the characters. Because ‘moving on to another plane’ is a different event from physically dying, all of the characters who thought they were still ‘alive’ perceived that person’s exit as a death. If they were to perceive another character’s death in any way other than by our plane’s means, they would wake up to the fact that they’re dead. It also makes perfect sense that some characters might figure it all out sooner than others. There’s no need to stay on the island once you know the truth of it.
What was the island?
Was the island a type of Purgatory? Not exactly. Purgatory assumes you believe in Christianity. Purgatory is defined as an intermediate state between death and Heaven. A place to purify before reaching Heaven. If the Island were Purgatory, that would assume all of the characters were destined for Heaven. In fact, there were plenty of characters there that didn’t seem to deserve entry to Heaven for the things they had done in life. But, who am I to judge that for them?
Instead, it’s better to adopt the wider view of planes of existence outside any single organized religion’s ideas. These views define planes as, yes, intermediate planes after death, but more than that. There are anywhere between 7 and 31 planes. I won’t get into further details about this topic as it’s well beyond the scope of this article. There are plenty of books describing these planes, what they are and why they exist.
Anyway, the Island is one of these planes and a type of ‘waiting room’ (if you subscribe to the Catholic view, it might be considered Purgatory) for people to make peace with their old life, allowing them to ‘wake up’ to their new existence slowly before moving on. It’s a place that lets you replay events from your physical life and unshackle yourself from the confines of a physical body to transition to the next plane. Think of The Matrix and waking someone up there. It’s kind of the same thing, but you get to wake up on your own rather than by taking a pill and finding yourself in a new reality immediately. The island is simply that stopover point that leads each of those people to the next step of their existence.
Note that during season 6, their existence was defined to be ‘Purgatory’, but by season 6 the characters were beginning to wake up. During seasons 1-5, the characters thought they were still physical. In their reality, that was all an illusion. The only thing real during seasons 1-5 was that they were in that waiting room that appeared to be an island. In fact, they were in an alternate plane of existence where imagination and creation make things appear real.
Why 6 years?
Understand that time in that plane of existence is meaningless. 6 minutes, 6 hours, 6 days or 600 years could all pass in the blink of an eye to us. Time doesn’t work the same in the next plane of existence. To us, we watched 6 years of episodes, but to the characters it may have seemed to happen in less than 30 days. Time is relative to where you are.
Why not all 250+ passengers?
Those specific few people were likely chosen by Desmond to live out their reality on his island, or simply found their way to that island because Desmond wanted it to happen. The rest of the 250+ passengers ended up in their own realities, perhaps living out their own lives as if the plane had crashed, while others may have made a world back at home with their families. The unseen victims of the crash made their own realities outside of the island reality, and we didn’t get to see their lives unfold. Some of those people might also have moved on faster than those we saw on the island.
They weren’t dead until the very end?
Yes and no. They were dead in our reality. But, they weren’t dead in their plane of existence. A plane that is outside of our existence (or at least a plane that we cannot get to in our current tangible form). Because their bodies had died, their essence moved on in what appeared to be a body that looked, acted and dressed just like the living counterpart. The theory is that when you die, you continue to see yourself as your last physical body even in the next plane of existence. That is, until you slowly wake up to your new non-physical existence.
At the very end, the characters finally awakened to their own Earthly death, a death that happened before the island. Once they awakened, they could realize the truth of it and return to the Earthly plane as ghosts. For whatever reason, they all awakened in unison; either that, or it was simply time. To them, the island was still just as real as any event on the Earthly plane. But to the inhabitants of the Earthly plane, where their physical bodies had died, they had died at sea in the plane, and that’s all their Earth families ever knew.
In essence, Lost was a show about ghosts living in an alternate plane of reality.
Personalized Search: Where is it?
For all of the innovation hubbub involving search technologies back in the early 2000s, one thing that has still not materialized is personalized search. What is personalized search? Let’s explore.
Generalized Searching
Today, when you go to Google or Bing and you type in search keywords, you’re likely to get the same search results that everyone else sees when typing in those same keywords. But this approach is asinine, antiquated and stupid. While it may have been okay back in the early 2000s when search was new and the database was smaller, with today’s much larger number of listings, personalized search is long overdue.
When Google introduced Gmail, I thought they might be onto something when they were discussing personalized ads in Gmail. Unfortunately, Gmail is pretty much where all of that innovation ended. Nothing different materialized in Google’s main search product. And worse, it’s now 2014 and we still don’t have anything different.
Personalized Search
Since nearly every search engine already offers a login and password, it’s no big leap to store search preferences right in each user’s profile. As you search for things, the system learns your likes, preferences and click habits. Even better, add thumbs up and thumbs down on listings to move them up and down in your own personal search rankings. If I don’t ever plan to use Reddit, then I can lower its rankings in my preferences. If I heavily use Twitter, I can raise rankings for Twitter results when they appear too low.
But, my preferences are my own. With the sites I like and the sites I dislike, I should be able to tailor my search results to fit my needs. If I decide to start using Reddit later, I can re-rank these listings higher again. These are all my choices and affect my own personalized search results.
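To make the idea concrete, here is a minimal sketch of how per-user re-ranking might work. Everything here is hypothetical: the domain weights, the thumbs up/down adjustments and the result format are illustrative assumptions, not how Google or Bing actually score results.

```python
# Hypothetical sketch of per-user search re-ranking.
# Assumes results arrive with a base (generalized) score; the user's
# stored domain preferences then nudge each result up or down.
from urllib.parse import urlparse

# Per-user preference store: domain -> weight (1.0 = neutral).
preferences = {"reddit.com": 0.2, "twitter.com": 1.8}

def thumbs(domain: str, up: bool) -> None:
    """Record a thumbs up/down by nudging the stored domain weight."""
    current = preferences.get(domain, 1.0)
    preferences[domain] = current * (1.25 if up else 0.8)

def personalize(results: list[dict]) -> list[dict]:
    """Re-sort generalized results using the user's domain weights.

    Results the user never expressed a preference for keep their
    generalized ranking (weight 1.0), so standard SEO still applies.
    """
    def score(result: dict) -> float:
        domain = urlparse(result["url"]).netloc.removeprefix("www.")
        return result["base_score"] * preferences.get(domain, 1.0)
    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    thumbs("reddit.com", up=False)   # one more thumbs down for Reddit
    results = [
        {"url": "https://www.reddit.com/r/deals", "base_score": 0.9},
        {"url": "https://twitter.com/somestore", "base_score": 0.7},
        {"url": "https://www.amazon.com/dp/B000", "base_score": 0.8},
    ]
    for r in personalize(results):
        print(r["url"])
```

The key design point is that unweighted domains keep their generalized score, which is exactly why standard SEO still applies until the user starts expressing preferences.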
As a side effect of personalized results, it also forces everyone to sign into Google or Bing to gain the benefits of personalized search. That’s definitely a benefit to these search engines.
Why personalized search?
Generalized searching, unfortunately, yields results based on someone else’s likes, dislikes, payola or other criteria. I want to tailor my own results to fit my search needs. So, if I’m searching for a specific product and I use Amazon frequently, Amazon’s listings should always show at the top. Why show me Newegg or J&R Music listings if I have no intention of going there to buy? It’s a waste of the search engine’s time and mine.
It’s quite clear that personalized search’s time has come, and it’s something Google needs to embrace. That is, rather than the next ‘WheatToast’ version of Android (or whatever clever food name they happen to use). Google has been ignoring search improvements, and the lack of innovation in this area shows how out of touch both Google and Bing are.
As the size of search databases grows, individuals need better, more innovative tools to tame and distill the millions of listings into smaller, more personal and useful result sets. Personalized search must become the next innovation in search.
What will this break?
Search Engine Optimization. I know, I know, I can hear a lot of SEO advocates groaning about how bad this will be for SEO. Note that SEO would only be impacted by each user who tweaks personal search rankings. For users who don’t do this, normal SEO rules apply. Though, I don’t personally care how high some company is ranked in my personal search list. What I care about is the quality of the listings. In fact, in a lot of cases, SEO won’t even be affected in my own results. If I have set no preferences involving some keywords, the generalized rules still apply. So, if none of the sites I ranked higher are in the listing, the generalized results will be shown to me and standard SEO won’t be impacted. It’s only after the first generalized results list that I can tweak the listings to my own preference.
After that, SEO may be impacted by my own personal preferences. But hey, that’s my choice. That’s the point of personalized search results. If I value one company over another, that’s my preference. I have the right to make that preference. That some third party wants their listing at the top of my search results is not my problem. You can use a paid listing for that. That’s the point of paying for a listing. The organic results are my own and I should be able to rearrange, tailor and shuffle them to my own personal likes. There is no other way to tame the mounds of links that get thrown at users during generalized search… results that will only grow larger and larger.
So, to those people relying on SEO, I say, “too bad”. Learn to pay for listings if you want to be at the top of my personalized search results or, alternatively, give me a reason to rank you higher. That is, whenever we finally get personalized search.
Huffpost: Facebook is not a verification system
The Huffington Post recently put up a ‘warning’ that in order to ensure ‘civil discussion’ in their own site discussion areas, they would need to verify my account. Then they proceeded to put up a ‘Connect to Facebook’ button. Note that they do not allow any ‘verification’ method other than Facebook. Let’s explore why this is not appropriate.
Facebook is a social media site
Facebook is not a verification system. It is probably the LEAST trusted site on the internet for privacy, accuracy of personal data or any other verification purpose. Nonetheless, the Huffington Post is now requiring a connection to Facebook to post comments on the Huffington Post site, a decision clearly made without rational thought. But it does serve an ulterior agenda.
Huffington Post is a news media site. It has nothing to do with Facebook and, more specifically, nothing to do with my Facebook account. Sorry Huffington Post / HuffPost, not gonna happen. You can go wallow in your own stupidity. This requirement is not only insane, but stupid decisions like this can easily lead to your own demise.
Verification Systems
If you own a site and are contemplating verifying a user, don’t tie it to Facebook. Email verification is the only level of verification you need for an account. Connecting to someone’s Facebook account in no way guarantees civil discussion. Connecting to Facebook is a privilege, not a right. The only thing connecting to Facebook guarantees is that Huffington Post can randomly place garbage onto that Facebook user’s wall.
HuffPost Agenda: Calling a rose, a rose
Though this situation is not rosy, it also has nothing to do with verification and everything to do with Huffington Post’s own propagandizing self-promotion agenda, that and gaining access to private pieces of your Facebook profile. It has nothing to do with verifying a user. If a large, respected journalism site is going to require something like this, then it should call it what it is. You plan on using these Facebook connections for your own advertising purposes. Don’t lie to us and hide it behind some fake verification process.
This is nothing more than a real-estate grab. Huffington Post is merely grabbing Facebook accounts to use for their own advertising purposes. It has nothing whatever to do with civil discussions or user accountability. No, let’s just call a rose, a rose. It is not anything other than that. Until Huffington Post does the ‘right thing’ and states the real reason why it needs all of those Facebook account connections, I can no longer trust the Huffington Post.
Bye HuffPost!
How not to run a business (Part 7): Communication
Internal business communication is a problem in any company, especially as a company grows. When you have a 10-person team, it’s easy for everyone to know what everyone else is doing. When you’re a 500-person team, that challenge becomes quite a bit harder. How do you solve this problem for a 500-person company? Let’s explore.
Don’t expect all team members to know everything that’s going on
Foster a company that values communication, knowledge, excellence and teamwork. One of the biggest problems facing any company is that people, in their zeal to get things done rapidly, gloss over explanations of critical points. In email, it’s really apparent when you get that miles-long email thread that effectively tells you, “Read the 50-reply thread below and figure out what’s going on.” You do this only to realize that no relevant customer information, times and/or dates of the ‘problem’ are even described in that thread. Worse, you’ve spent 15 minutes reading it twice. It’s no wonder things don’t get done quickly and that customers complain of slowness.
Team members should provide ALL necessary information to everyone properly, for expediency. It is the originating employee’s responsibility to describe all of the information that identifies the customer in your company’s system. Without this basic information, someone will eventually have to stop and track it down. To solve this problem effectively, require the use of a ticketing system to track problems, and make the ticketing system require input fields that force the employee reporting the problem to fill in all details properly. This completely avoids that 50-reply thread where not one person defines the most basic things needed.
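As a rough illustration, here is a minimal sketch of what ‘required fields’ enforcement might look like in a ticketing workflow. The field names (customer ID, incident time, summary, details) are hypothetical examples; real ticketing systems implement this with their own required-field configuration rather than custom code.

```python
# Hypothetical sketch: reject a new ticket unless the fields that
# identify the customer and the problem are actually filled in.
from dataclasses import dataclass, fields
from datetime import datetime

@dataclass
class Ticket:
    customer_id: str         # identifies the customer in your system
    incident_time: datetime  # when the problem occurred
    summary: str             # one-line description of the problem
    details: str             # full description, steps to reproduce, etc.

def validate(ticket: Ticket) -> list[str]:
    """Return a list of missing required fields (empty means valid)."""
    missing = []
    for f in fields(Ticket):
        value = getattr(ticket, f.name)
        if value is None or (isinstance(value, str) and not value.strip()):
            missing.append(f.name)
    return missing

if __name__ == "__main__":
    t = Ticket(customer_id="ACME-1042",
               incident_time=datetime(2014, 1, 15, 9, 30),
               summary="",   # forgot the summary -> rejected
               details="Customer reports checkout errors since 9:30am.")
    problems = validate(t)
    if problems:
        print("Ticket rejected, missing fields:", ", ".join(problems))
```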
Wasting time on deciphering a miles long email thread is pointless. It’s much more useful to get to the heart of the problem immediately. Use ticketing systems to manage these communication problems. Email is for quick questions and small discussions. Ticketing systems are for resolving problems. Use the right tool for the right reason.
Don’t let your employees post internal company information to internet sites
Internal information flow is for employee use only. Twitter, Reddit, Quora or even your own external blog or discussion forums are not the place for employee communication. Hire people to manage external facing customer information. Saying or doing the wrong thing on public facing media, especially when you become a public company, can hurt your company and can become a permanent part of Google’s search database for years.
Your corporate communications team (you do have one, right?) should strictly control public messaging. On the other hand, employees posting their own personal views on non-company-related matters should not be restricted when they’re off the clock and using personal assets and networks. As long as their posts have nothing whatever to do with company business, there should be no restrictions on employees’ after-hours use of social media.
Tweeting personal things while on the clock and using company equipment should be frowned upon, if for no other reason than it reduces work productivity. If the employee does personal things during their break or lunch hour, it should not be restricted, provided it’s done from personal devices, outside of company networks and not involving company business.
On the other hand, public communication involving the company or the company’s products made on company time should be handled by the corporate communications team or with their approval. Saying the wrong thing in the wrong venue can cause irreparable damage to your company’s credibility or cost you critical deals.
Don’t read every employee email or store them forever
While you can likely do this through auditing, it opens your business up to some legal issues. If the person reading another employee’s email becomes aware of something illicit that your team may or may not know about, it could lead to the company being treated as an accomplice in whatever the act may have been. Without that knowledge, it’s much easier to deny involvement and to argue that the employee was acting on their own. That may or may not help your case, but it may prevent other personal lawsuits from arising.
Additionally, if you are reading employee email, that means it’s stored someplace. Because it’s stored, it may fall under other problem areas like email retention. That doesn’t mean you shouldn’t archive emails sent by your employees, but keeping emails stored too long is probably just as bad as reading them, in terms of legal problems. You may need to know if an ex-employee made sales promises to a customer that weren’t documented anywhere else. However, when you have emails stored, they can be subpoenaed during discovery in a legal proceeding. If you purge them after the required legal retention periods, they’re not there to be discovered. At the same time, you may lose some historical information about your company. You have to make the call on where to draw the retention line.
If you intend to keep backups of email, you should really only keep them for as long as the law requires, then purge them irrevocably from disk and all backups. Not having the information around can save your company from legal issues if an employee did something not sanctioned by the company.
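Here is a minimal sketch of what an automated retention purge might look like. The retention period, archive layout and file naming are hypothetical assumptions; your actual retention period must come from legal counsel, and a real purge also has to cover every backup copy, not just the primary archive.

```python
# Hypothetical sketch: purge archived email files older than the
# legally required retention period. Assumes a simple directory of
# .eml files whose modification time reflects when they were archived.
from datetime import datetime, timedelta
from pathlib import Path

RETENTION = timedelta(days=7 * 365)      # example only; ask your lawyers
ARCHIVE_DIR = Path("/var/mail-archive")  # hypothetical archive location

def purge_expired(archive: Path, retention: timedelta, dry_run: bool = True) -> int:
    """Delete archived messages older than the retention window.

    Returns the number of messages that were (or would be) removed.
    """
    cutoff = datetime.now() - retention
    removed = 0
    for message in archive.glob("**/*.eml"):
        archived_at = datetime.fromtimestamp(message.stat().st_mtime)
        if archived_at < cutoff:
            if not dry_run:
                message.unlink()   # remember: backups need purging too
            removed += 1
    return removed

if __name__ == "__main__":
    count = purge_expired(ARCHIVE_DIR, RETENTION, dry_run=True)
    print(f"{count} messages past retention")
```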
Don’t use Google Apps, Postini, Appriver or other third party email servers without knowing how they work
If you outsource your company’s email system to a third party, you could open yourself up to lawsuits, loss of trade secrets or spying. You should always read that third party’s contract terms very carefully and ask for revisions for items with which you don’t agree. If that third party reserves the right to archive, store and possibly even read those emails, this could open your company up not only to lawsuit discovery but also to lost company secrets, lost contracts, lost revenue, hacked email or lost customers.
A third party does have the responsibility to maintain some level of privacy over contracted services, but you can’t control who that third party hires. If they happen to hire a person or contractor with malicious intent, you’re vulnerable. Simply by using a third party, you’re at risk. Worse, that third party could end up hiring your competitor to provide some fundamental service, creating a conflict of interest with your business. Also, email hosting providers selling services to large corporate entities are prime targets for attack. Beware of these risks involving third party providers. While using such a third party service may appear less expensive, you have to understand the hidden costs of running your business through any third party service. Only you can weigh those risks and benefits.
Even more, you’re also at the mercy of that third party’s security processes. If their process is not as stringent as yours, your company secrets could be at risk. If you don’t know the level of security that the third party provider offers, you could be in a world of hurt if their email server is compromised and a bunch of your private company email appears on internet forums or on CNN.
Don’t pass trade secrets or confidential company information in plain text email
While you can’t rule out a corporate mole within your own organization, it’s far, far easier to lose your trade secrets through email communication than through any other medium. If that communication uses a third party, you’re really at risk, since few companies require encrypted email. If you choose to use email communication through a third party cloud provider, you should require that each employee use encrypted communications when discussing trade secrets, large customer deals, financial information or even customer lists.
Setting up GPG, while not necessarily trivial, is one way to combat sending such easily viewed emails. Even the simple act of someone reading an email at home on their iPhone will transfer that email data across the internet in plain text that can be read by anyone along the way. Email encryption keeps the contents away from prying eyes other than the intended recipient. Not all email communication requires encryption, but for the messages that do, encryption can be the difference between losing a deal and winning it.
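As a quick sketch of what this looks like in practice, the snippet below shells out to the standard gpg command line tool to encrypt a file for a recipient before it’s attached to an email. It assumes GnuPG is installed and the recipient’s public key has already been imported; the recipient address and filenames are hypothetical.

```python
# Hypothetical sketch: encrypt a sensitive attachment with GnuPG
# before it ever leaves the sender's machine. Assumes `gpg` is on the
# PATH and the recipient's public key has been imported and trusted.
import subprocess

def encrypt_for(recipient: str, path: str) -> str:
    """Encrypt `path` for `recipient`, returning the .asc output file."""
    output = path + ".asc"
    subprocess.run(
        ["gpg", "--batch", "--yes",   # no interactive prompts
         "--armor",                   # ASCII output, mail-friendly
         "--recipient", recipient,    # encrypt to this public key
         "--output", output,
         "--encrypt", path],
        check=True,                   # raise if gpg fails
    )
    return output

if __name__ == "__main__":
    # Hypothetical recipient and file names.
    encrypted = encrypt_for("cfo@example.com", "q3-customer-list.csv")
    print("Attach this instead of the plain file:", encrypted)
```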
The bigger your company gets, the more targeted it will be for espionage.
Don’t rely on chat systems to take the place of email
Chat systems are fleeting. Their messages are easily lost. If you need records of your employees’ work to be stored, then you should require email or ticketing to manage this. Chat is not always productive, but it can be helpful for getting answers to questions rapidly. But don’t have your employees rely on chat to execute sensitive system procedures, especially if your company is using AOL, YMessenger or any other third party hosted chat system. Instead, for new procedures, use a local phone conference system. Voice chat is much more interactive, less error prone and, when combined with screen sharing, provides much better methods of disseminating information. Once the process has been nailed down, place it into an onsite Wiki that can be reviewed as a knowledge base. Use a chat system for what it’s best at: writing quick, small, fleeting messages.
Don’t rely on third party services to run your entire communication business
If you can afford it, you should build and operate your own corporate communication systems behind your own corporate network infrastructure. If you farm out any part of your corporate communications to a third party provider, your communication is at risk. Risk from theft, from espionage, from hacking and from data retention that’s all out of your control. Instead, to control all of your communications (both internal and external), you will want to own all communication systems including ticketing, email and chat services. While you can’t own mobile device networks, you can own when and how they are to be used for communication.
Don’t forget to encourage employees to communicate regularly
While meetings are great ways of getting a lot of people on the same page at once, those who aren’t in the room during that meeting won’t have any idea what was discussed. It’s also easy to forget who attended a meeting after the meeting concludes, so always make sure to encourage people who attended the meeting to communicate with those who didn’t attend and with whomever else needs to know.
Also, require someone to keep meeting notes at all meetings and post the notes to a common department page after the meeting concludes. Better, require recording of meetings and store the meeting recordings as mp3 files for easy access and download. This not only allows those not in attendance to catch up on what was said, it also keeps those who were in attendance from claiming something was or wasn’t said. Basically, recordings keep everyone honest and informed. Remember to apply data retention policies to all archived meeting recordings.
Don’t tolerate employees who claim ignorance on what they have previously said
For any manager, director, VP or regular employee, honesty is the best policy. Keeping your employees honest keeps the company functioning correctly. However, an employee who regularly uses the ‘I never said that’ defense usually did say it at some point. Employees should not be allowed to get away with that defense, especially when it is found via email (or through recordings) that they did say whatever they claim they didn’t.
Employees who use that defense more than twice, and who have been found to have said it, should be officially written up and placed on a performance plan. Any further transgressions should be met with swift removal from the position. Honest communication is the key. Anyone intentionally sabotaging that goal by using this defense should be swiftly stopped and/or removed. Few things make a company’s communication more dysfunctional and time wasting than having to deal with unnecessary diversions (e.g., having to prove someone else wrong).
Instead, employees should always focus on the business at hand, not on doing historical research projects to find out what someone may or may not have said.
Don’t encourage employees to keep other employees in the dark
Barring salary and compensation details and upcoming earnings information, there are very few business topics that cannot be communicated to any employee in the company. Granted, some information may not be necessary for a specific person’s job role. There is no reason, though, that a person who manages IT couldn’t know the DSO number of a collections associate in Finance. This is not secret information. It may not be necessary information for that IT person’s job, but it should not be in any way a secret. Not volunteering unnecessary information is fine, but if someone asks, it’s not a secret.
In the spirit of this section, all critical business information needs to be sent to everyone who needs to know. For example, when sales deals are closing, sales employees need to disclose all promises made to the customer, and that information needs to be disseminated to all employees potentially impacted by those promises. Passing the information along is not meant to prevent the deal from happening, but to allow anyone with relevant information to inform the sales person of the business constraints impacting those promises. In other words, sales people need a technical conscience. The only way to manage this is to involve a technical person to help rein in the sales person and set proper expectations. Barring the use of a technical person in every sales deal, the promises need to be disseminated to the technical teams to ensure the deal can be closed without problems.
If special provisions are needed for some promises, then the prospect needs to be informed of when those special provisions may become available. The last thing you want your sales person doing is making a promise without telling anyone. That’s the quickest way to not only lose the deal, but also to face refunds months later when the promises cannot be kept (and the sales person has spent their commission check after having left the company).
Communication Reality
Checks and balances can only work with proper communication to all teams and by not keeping employees in the dark. If you find a sales person making promises without informing people in a timely manner, that person should be reprimanded and written up. Further transgressions should be met by leading them to the exit door.
Communication is always a challenge, and keeping the communication flowing is the only way to ensure smooth business operations. It’s when communication stops, lags or is held back until it’s too late that it becomes a business continuity problem. As a company grows larger and larger, communication will suffer. When a company becomes divided by geographic boundaries, communication becomes not only worse, but compartmentalized. What one office may know, another won’t. That’s a recipe for problems all around. Unfortunately, that’s also the problem that most very large companies like AT&T and Verizon face today. With 10,000 or more employees, communication between all of these employees will greatly suffer, which is one of the reasons that ticketing and process flows become the single most important communication tools in a supersized company.
However, having only 50 employees doesn’t mean your communication can’t suffer. Every company can improve communication by using the right tools.
← Part 6 | ↓ Part 7.1 | Chapter Index | Part 8 →
Cinavia: Annoying? Yes. What is it?
If you’re into playing back movies on your PS3, you might have run into an annoying problem where your movie plays for about 20 minutes, then the audio suddenly drops out entirely with a warning message on the screen. This is Cinavia. Let’s explore.
What is Cinavia and how does it work?
Cinavia is an audio watermarking technology created by the company Verance. An audio subcode is embedded within digital audio soundtracks at humanly imperceptible levels, but at a level where a DSP or other included hardware chip can read and decode its presence. Don’t be fooled by the ad with smiling children on the Verance site; this has nothing to do with making audio better for the consumer. No, it is created solely for industry media protection.
This Cinavia watermark subcode seems to be embedded at a phase and frequency that can be easily isolated and extracted from an audio soundtrack, then processed to determine whether it’s valid for the movie title being played back. Likely, it’s also an analog audio-based digital carrier subcode (like a modem tone) that contains data about the title being played.
How is Cinavia used in the film industry?
There are two known uses of Cinavia watermarking. The first use is to protect theatrical releases from being pirated. Because the audio watermark sits in the audible frequency range (even though it’s imperceptible to humans), it will be picked up by microphones. Keep in mind that just because the subcode cannot be heard by human ears, it doesn’t mean it can’t be heard and decoded by a specialty hardware chip. So, if a theatrical release is CAMed (i.e., recorded from the screen), the Cinavia watermark will also be recorded in the audio. After all, what is a movie without audio?
The second use is to protect Blu-ray copies of films from being pirated. For the same reason as theatrical releases, Blu-ray films are also embedded with a subcode. But that subcode is different from the theatrical one. For this reason, a film destined for theatrical release will never play in a consumer Blu-ray player (including players such as the PS3, PS4 or Xbox One). Commercial Blu-ray disks play because the audio track uses AACS with a key likely embedded within the subcode watermark. If the AACS key matches the value from the watermark, the check passes and the audio continues to play.
I have also read there is a third use emerging… to protect DVD releases. But, I have yet to confirm any DVDs currently using this technology. If you have run into any such releases, please leave a comment.
How would I be affected by this?
All consumer Blu-ray players manufactured after 2012-2013 are required to support Cinavia. If the Cinavia subcode is present, the player will blank the audio track if the AACS key is mismatched. This means hardware Blu-ray players from pretty much any manufacturer will be affected by Cinavia protection if the title supports it. CAM copies of theatrical releases will never play because the audio subcode is entirely different for theatrical films and the Blu-ray player will recognize that theatrical subcode and stop audio playback.
Not all movie titles use Cinavia to protect their content. Not all players support the Cinavia protections for all media types. For example, some Blu-ray players can play media from a variety of sources besides BD disks (e.g., USB drives, network servers, etc.). These alternative sources are not always under Cinavia protection even if the specific movie has an embedded subcode.
Since Sony is the biggest proponent and user of this technology, all Sony players, including the PS3 and PS4 along with their standalone Blu-ray players will not play back Cinavia protected material if it doesn’t continue to pass the subcode tests. For example, if you rip a Blu-ray disk protected by Cinavia and then burn it to a BD-rom disk, the movie will stop playing audio at around the 20 minute mark and display a warning. If you attempt to stop and start the movie, it will play audio again for a few seconds and then stop playing with a warning.
How can you remove Cinavia protection?
In short, it’s not as easy as that may sound. Once the Cinavia protection is detected on the media, the hardware activates and continues to look for the information it needs to make sure the content is ‘legitimate’.
With that said, there are ways of getting around this on certain devices. As I explained, some players don’t check for Cinavia for certain types of media (i.e., USB or Network streaming). Sony, however, does check for all media types. The PS3, though, doesn’t seem to check for Cinavia if the playback is through the optical output port (i.e., when playing back through an optical receiver). That would make sense, though, as it would be left up to the receiver to blank the audio based on Cinavia. Since most receivers probably don’t support Cinavia, there should be no issue with playback.
Other technical methods include garbling the audio somewhat or using variable speed on the audio. Neither of these two methods is really acceptable to the ears when watching a movie. We all want our movies to both look and sound correct.
How can I avoid this problem?
You can easily avoid this issue by using a player that doesn’t support Cinavia protection. For example, Windows Media Player, VLC, etc. Most PC media players do not support Cinavia. Though, if you get a PC from Sony, expect the media player on any Sony product to support Cinavia (yes, even Windows Media Player might, as Sony may have loaded a system-wide Cinavia plugin). If you buy a PC from any manufacturer other than Sony, you likely won’t be affected by Cinavia.
This problem almost solely exists on Blu-ray standalone players. So, if you avoid playing movies on such consumer hardware players, you can usually avoid the Cinavia issue entirely. Though, there are some commercial PC media players that do support Cinavia.
A possible real solution?
Here is another method which I have not seen explored, so I’ll propose it. With a film protected by Cinavia, the Cinavia subcode should exist within silence as well as noisy portions, likely at the same level. First, extract a length of silence (that contains the Cinavia subcode). Now garble, stretch, warp and generally distort this subcode so that it cannot be recognized by a Cinavia decoder. Then duplicate the garbled ‘silence’ subcode to fill the length of the entire film. Extract the film’s audio soundtrack and mix the new garbled full-length subcode into the entire film. Note that remixing a 7.1 or 5.1 track is a bit tricky, but it can be done. I would suggest inserting it on the subwoofer track or the center track, though the subcode may be present on all of the tracks by design. After the audio track is remixed and remuxed into a resulting MP4 (or other format), the new garbled subcode should hopefully interfere just enough with the existing already-embedded subcode to prevent the Cinavia protection from getting a lock on the film’s original subcode.
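Below is a minimal, untested sketch of the jamming idea above, using numpy on audio that has already been demuxed to raw samples (e.g., with ffmpeg). It only illustrates the garble-tile-and-mix steps; whether distorted subcode actually interferes with a real Cinavia detector is exactly the unverified part of this theory, and all filenames and parameters are hypothetical.

```python
# Hypothetical sketch of the garble-and-remix idea. Assumes the film's
# audio and a short "silent" segment containing the subcode have already
# been demuxed into float32 numpy arrays (one channel shown for brevity).
import numpy as np

def garble(segment: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Distort a subcode-bearing segment: crude time-warp plus noise."""
    stretch = rng.uniform(0.93, 1.07)                      # random speed change
    warped_len = int(len(segment) * stretch)
    warped = np.interp(np.linspace(0, len(segment) - 1, warped_len),
                       np.arange(len(segment)), segment)   # resample (warp)
    return warped + rng.normal(0.0, 0.002, warped_len)     # add low-level noise

def build_jammer(segment: np.ndarray, total_len: int,
                 rng: np.random.Generator) -> np.ndarray:
    """Tile freshly garbled copies of the segment to cover the whole film."""
    pieces = []
    filled = 0
    while filled < total_len:
        piece = garble(segment, rng)
        pieces.append(piece)
        filled += len(piece)
    return np.concatenate(pieces)[:total_len]

def mix(track: np.ndarray, jammer: np.ndarray, level: float = 0.5) -> np.ndarray:
    """Mix the jamming signal into a channel (center/LFE suggested above)."""
    return np.clip(track + level * jammer, -1.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    silence = np.load("silence_with_subcode.npy")   # hypothetical input
    center = np.load("center_channel.npy")          # hypothetical input
    jammed = mix(center, build_jammer(silence, len(center), rng))
    np.save("center_channel_jammed.npy", jammed)    # remux externally
```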
The outcome of the garbled subcode could cause one of two things to happen: 1) the Cinavia detection is rendered useless and the Cinavia hardware ignores the subcode entirely, or 2) the Cinavia detection realizes such tampering and shuts down the audio track immediately. While erring on the side of failure is really a bad move in an industry already fraught with bad press around past failed media protection schemes, I would suspect scenario number 1 is more likely. But, it’s probably worth a test. No, I have not yet had time to test my theory.
While this doesn’t exactly remove Cinavia, it should hopefully render it useless. But, it won’t recover the lost audio portions being used by the Cinavia subcode.
How would I go about doing this?
I wouldn’t attempt doing the above suggestion manually on films as it takes a fair amount of time demuxing audio, creating the garbled audio subcode, remixing the new track and remuxing it into the video. But an application capable of ripping could easily handle this task during the rip and conversion process if provided with a length of garbled subcode.
[Updated: 2018-01-06]
Apparently, DVDFab seems to have a way to rip and disable Cinavia protections according to their literature. They have released the DVDFab DVD and Blu-ray Cinavia Removal tool. If you’re still having difficulties with Cinavia while watching your movies, it might be worth checking out this tool. Note, I have not personally used this tool, so I can’t vouch for its effectiveness. I am also not being sponsored by DVDFab in this article. I’m only pointing out this tool because I recently found it and because it seems to have a high rating. On the other hand, I do see some complaints that it doesn’t always recognize and remove Cinavia on some movies. So, caveat emptor. Even though it’s not an inexpensive product, it is on sale at the time of this update for whatever that’s worth.
It seems that someone finally may have implemented my idea above. Good on them if they did… it only took around 4 years.
How not to run a business (Part 8): Stock and Incentive edition
While it’s great that employers want to reward employees and give them incentives to stay, there is a right way and a wrong way. Let’s explore.
Don’t offer tiny stock grants with huge vesting schedules and cliffs
If you’re planning to offer stock grants as ‘stay’ incentives, make them sizable. Stock programs with vesting schedules are a good thing, but not grants with tiny numbers of shares. First, it’s a waste of paperwork to give out less than $10k in equivalent shares (vested over 4 years) in company stock, both for the HR team and for the receiving employee. You’ll have your team spending time managing all of these tiny grants with no benefit to anyone. Second, most employees won’t hesitate to walk away from such grants before the vesting period ends, which means even more paperwork to clean up after the employee has left. Employees won’t wait 12 months just to get another $1-2k when they can likely pick up a 5-10% raise (and possibly even a sign-on bonus) by changing jobs.
If you want to give employees an incentive to stay with your company, approve grant sizes that matter. More specifically, grant sizes that are larger than an equivalent raise. Make it worth the employee’s while. For example, grant a size equivalent to 1 year of salary (at the then-current stock price) with a 4 year vesting schedule. If an employee sees they’re going to get 1/4 of their salary each year for the next 4 years, that’s definitely an incentive to stay. If they don’t stay, you don’t pay. Assuming the employee is a high performer and highly valued, it is worth it when they do stay. That’s the entire point of the grant. However, issuing a grant that, at best, offers the employee $1-3k after taxes each year gives not even the best performer an incentive to stay. After all, you do want this employee to stay, right? Most great employees can easily make up such a tiny amount by moving to a new job with a new company. Most people would have no problem walking away from a tiny dollar amount for a new job offer. Again, this leaves your existing employees to clean up the mess left over from the tiny little unvested grants. Note that it’s the same amount of paperwork whether you grant 1 share or thousands.
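To put rough numbers on the difference, here is a small back-of-the-envelope sketch. The salary, tax rate and grant sizes are illustrative assumptions only, not advice.

```python
# Hypothetical back-of-the-envelope comparison of a pittance grant vs a
# salary-sized grant, both vesting evenly over 4 years.
def annual_after_tax(grant_value: float, years: int = 4, tax_rate: float = 0.35) -> float:
    """Rough after-tax value vested per year, ignoring stock price movement."""
    return grant_value / years * (1 - tax_rate)

if __name__ == "__main__":
    salary = 100_000            # illustrative salary
    tiny_grant = 10_000         # the "pittance" grant discussed above
    big_grant = salary          # 1 year of salary, as suggested above

    print(f"Tiny grant: ~${annual_after_tax(tiny_grant):,.0f} per year after tax")
    print(f"Big grant:  ~${annual_after_tax(big_grant):,.0f} per year after tax")
    print(f"A 5% raise: ~${salary * 0.05 * (1 - 0.35):,.0f} per year after tax")
```

With these example numbers, the tiny grant vests less per year than a modest raise would pay, while the salary-sized grant dwarfs it, which is the whole argument for grant sizes that matter.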
In other words, grant stock incentive sizes that make sense for all involved or choose a different incentive vehicle altogether. While you may think giving stock grants is a positive thing, employees generally don’t because of the downsides of vesting schedules and cliffs, the hassles of taxes (it will probably cost the employee more to hire a tax consultant than the bonus is worth) and when it’s too small it’s not worth the employee’s time. Be very careful when using this incentive vehicle.
Don’t send the wrong message to your employees by using the wrong incentives
In the case above with stock, you have to consider what such a small grant size says to the employee. If you give an employee a pittance grant, you’ve essentially just told them, “You’re worth an extra $1-2k a year” (once they do the math). That, in many cases (especially in California), is less than the average raise. It doesn’t in any way impart confidence that the employee is valued… and that’s exactly the message a pittance grant sends. It’s definitely not the right message. Yes, extra money is always a good thing, but not when it’s wrapped (er… trapped) in the wrong incentive vehicle or when it’s the wrong dollar amount.
Keep in mind that for the employee it’s all about when they actually see the money. Trapping the money behind vesting schedules and vesting cliffs is tantamount to dangling a carrot from a stick just out of reach (for a year) and then only giving them 1/4 of that carrot after chasing it for a year. If you expect the employee to wait a year to get 1/4 of a baby carrot, it better be a damned good tasting baby carrot (e.g., a substantial amount of money actually worth waiting for).
From a monetary perspective alone, $1-2k extra a year can be easily handed to the employee in many other ways. You can label the extra as a bonus, you can label it as a ‘great job’ thank you, you can hand them a live check with a personal thank you or you can buy them an iPad as a gift.
Each of these suggested alternative incentives sends the correct message. Handing someone an iPad is a whole lot more satisfying as a bonus than handing them a pittance of RSUs. With long-term vesting schedules, vesting cliffs, stock price uncertainties, waiting periods and tax disincentives, a small stock plan is a quagmire of a bonus system for the employee to navigate only to secure $1k. Don’t use stock grants to hand out $1-2k a year bonuses. Using this incentive vehicle sends the absolute wrong message to your employees, can damage employee self-worth and can ultimately damage your reputation as a respectable company. If the employee is left with nothing for a year and then has to wait 4 years to ultimately get maybe $10k gross, suffering the tax hassles in the process, that’s the wrong message to send.
So, always use the correct incentive vehicles to send a positive message to your employees to keep them on board. Using the wrong vehicle in the wrong way not only plants the seed of dissatisfaction, it can lead to the employee walking away entirely.
Don’t flaunt your sales team’s wins to your non-sales employees
Your sales team is important to the success of your company. It’s also great that your sales team members, or at least some of them, are doing well to bring in those great deals. On the other hand, many companies make the mistake of continually rewarding the most outstanding sales team members with trips, gifts, dinners and other niceties. Keep this information firmly within your sales team. Do not share this information with non-sales departments.
It’s very easy for the other departments to see the sales team as the team with all the special benefits. This can make the other teams feel as if they are being left out. Your operations team, for example, usually has staff working 24/7/365 to make sure things are working. Yet your sales team is being flown around the globe on sales kick-offs. This sends the wrong message to other teams. If you are going to give incentives to your sales teams, either keep it away from your other teams or figure out a way (e.g., via an internal lottery) to include other team members in these wins.
Again, it’s important to understand that the sales team, while important to new business and renewals, isn’t the only team keeping your business afloat. All teams need to be supported, given incentives and given the opportunity to participate in travel events when available.
Do allow employees to participate in company sponsored events
If your company is planning to do trade shows such as Dreamforce, or possibly even create your own annual company event, allow and encourage employees from all departments to participate. Doing the same job day after day, month after month is hard to do year in and year out. Breaking the monotony of the same ole same ole will help reinvigorate employees when they do get back to their jobs. Allowing employees to do something different for a couple of days helps re-energize people to do their best work. It also encourages employees to meet and work with people outside of their team that they otherwise would not. This makes for a much closer-knit company, especially when the employee does end up working with that person they met earlier.
Don’t be ambiguous or vague about your incentive programs and make sure they are fair to all teams
If you plan to offer such incentives as RSUs, stock options, bonus plans, merit-based trips, etc, document them. Document exactly how they work, who is eligible and how each employee can become eligible. If your programs only include certain departments, make certain that when other departments become aware (and they will) that you offer compensating alternatives to those other departments.
For example, if your sales team members receive an end-of-year trip to the Bahamas for the best sales numbers, then your finance team should, likewise, be offered some kind of off-site vehicle for the finance team members who consecutively kept their DSOs down that year. Offering something to one team and not others clearly smacks of favoritism. When it is not documented clearly, this causes more friction between teams than it solves. Better, if teams are offered grand incentives, then use a lottery to allow other departments to participate in it. So, for each sales team member who wins a trip, they can bring a member from another team along and that person is determined via a lottery. Again, this should all be documented fully so there is no question about either individual or team incentive programs.
← Part 7 | Chapter Index | Part 9 →
The Grammy Awards: What were they thinking?
So, I’m all for mutual-admiration societies. You know, where you’re recognized by your peers with a gaudy gold award for producing something that’s entirely your job. Though I suppose the point is to recognize that some creative works are better than others, no one goes around pinning awards on people in most professions. No, this is a phenomenon pretty much strictly involving the entertainment industry, and almost exclusively limited to Hollywood. I say ‘almost’ because the Tony awards recognize outstanding theater performers (which is pretty much exclusive to New York). And yes, there are the Saturn awards for novels, but again this is still considered entertainment.
Good Work or A** Kissing? You decide.
So, I’m all for recognizing good musical work. After all, that’s what the radio is for. Listeners vote by asking for music to be played and by purchasing it. Of course, we all know that’s not exactly true. Radio stations put music into heavy rotation mostly because of things other than popular requests. Sure, sometimes it is, but most times it’s because the producer wants it played and pays for that. And you might think that consumer music purchases are what drives the ‘Gold’ and ‘Platinum’ certifications. Nope. These certifications are assigned based solely on how many copies SHIPPED to retailers. Not how many were ultimately purchased. So, if 1 million copies are shipped to retailers, that’s considered ‘Platinum’. If 500,000 copies ship to retailers, that’s considered ‘Gold’. I’m not even sure how or if digital purchases factor into these certification programs.
The assumption is that the certification implies that there is a correlation between sales and shipments, but that doesn’t explain cut-outs. Let’s just say that this certification program is a bit of a scam. It doesn’t really say anything about the quality of the music or whether the music actually sold. The sales are merely implied. If someone has deep enough pockets to print 1 million copies of an album and get them shipped to retailers (whether or not a single copy sells), that would still be certified as a platinum album.
Music is subjective
Yes, it is. But music is also derivative of other works. Sometimes it’s outright copying. Sometimes it’s rehashing tired themes and genres that have already been well trodden. Let’s take the 2014 Grammy Album of the Year, Daft Punk’s Random Access Memories, as an example. What’s wrong with this album? Well, it’s good, but it’s not the best album I’ve ever heard. The music on RAM is mostly derivative, tired and somewhat cliche, not to mention retro. It’s not that it’s poorly performed, but it’s well below the level of skill I’ve heard from Daft Punk. The 2010 Tron Legacy Daft Punk soundtrack is a much stronger work musically than Random Access Memories by far. So what does that say?
It says, according to the Grammy voters, that of all of the albums released in 2013, Daft Punk’s was the best. In fact, I found a large number of tracks on Random Access Memories unlistenable. Not because the tracks weren’t produced or performed well, but because they are just musically weak. They just don’t hold up to repeated listens. Yet here we have the Grammy judges selecting it as the best album of 2013.
Personally, the best album of 2013 in my eyes would have to be OneRepublic’s Native. But this album wasn’t really even recognized, for the most part. Only a single OneRepublic track was nominated, ‘If I Lose Myself’, and it didn’t win. The album wasn’t even nominated for best album. Yet Daft Punk’s mediocre album was nominated and won… so…
What’s up with that?
What’s up with that is that it isn’t about the best music. It’s about the notoriety of the artist. Daft Punk has recently been riding a wave of publicity. The Grammy judges are simply riding that same wave along with the artists. Winning has little to do with the music and everything to do with trying to pull in as many viewers as possible. That’s crystal clear.
Daft Punk will drag in tons of viewers. OneRepublic won’t. But, OneRepublic’s Native is a completely outstanding and consistent album of mostly fresh tracks. I will state that they do sound a little like U2, but with a much needed sound update. However, the songs are mostly original, fresh and stand up to repeated listens especially when placed into a pop playlist of other tracks.
On the other hand, the Daft Punk RAM tracks are too long, sound too dated, are chock full of interruptions and weird intros, and just drone on far too long in a pop playlist. Basically, they’re not something that I want to listen to often in a playlist. But when I get into the mood for OneRepublic, I want to listen to the whole album over and over. The songs are melodic, have catchy hooks, are mixed solidly, have solid musical themes and just overall work well as pop tracks. And it’s not just individual tracks. It’s a whole album of them. They’re all consistent, catchy and fresh from the start of the album to the finish. There’s really not a bad track or performance on OneRepublic’s Native, and this, if for no other reason, is why this album is actually better than Daft Punk’s Random Access Memories. Of course, if you don’t like bands like U2 or The Script, you may not find the music to your taste, but that doesn’t make this album any less strong production-wise or musically.
The Grammy Snub
So, not seeing an artist like OneRepublic recognized for outstanding work on an album like Native is a fairly major snub. The Grammy awards simply snubbed this artist for no real reason. It also says the Grammys are in it for the viewers and the money, not for actually recognizing the best music released during the year. This is why I generally avoid watching award shows: I just don’t trust the judges to pick the best works for that year. I’d rather find the best entertainment myself. As for Bruno Mars’s win, I’m on the fence. Unorthodox Jukebox had some strengths, but his vocals were really not that strong. He’s a reasonably good vocalist, but not the best I’ve heard. Unfortunately, I found the songs on Unorthodox Jukebox less impressive than those on OneRepublic’s Native. I’m also not sure why Unorthodox Jukebox was even considered for the 2014 Grammy awards when the album was released in December of 2012. Mutual admiration societies are really not good at picking the most outstanding of their bunch.
Stung by the Target data breach? Here are some tips.
Unless you’ve been living in a cave, you’ve heard that Target recently disclosed it had potentially lost up to 40 million credit and debit card numbers when its point-of-sale systems became infected with malicious software. Let’s explore how to protect yourself from these situations.
Knee-jerk Reactions
A lot of people who are not very tech savvy immediately jump the gun and presume all credit card systems are vulnerable and that carrying and using cash is safer. Unfortunately, that’s an incorrect assumption. Cash, while useful, is not always safer to carry around. If you’re carrying, say, thousands of dollars on your person and you get robbed or mugged, your money is gone and not replaceable, on top of whatever injuries you may sustain in the robbery.
You’re probably thinking, “How is anyone going to know I’m carrying it?” You have to open your wallet to buy things, and people can easily peer in and see how many bills you have tucked in there. It’s that simple. They’re not going to mug you the moment they see the money. No, they’ll wait and do it at a more opportune time for them, when you are most vulnerable (alone in a garage or somewhere else similarly dark and isolated). So, carrying loads of cash is not the answer.
What happened in the breach, and when?
Target confirmed that cards swiped through its terminals between November 27th and December 15th were likely exposed in the breach. Target hasn’t been forthcoming about exactly how the breach was accomplished, but what has been said is that the point-of-sale terminals appear to have become infected with malicious software. This would likely include both the customer card-reader terminal and the register itself, since the two are connected. It has also been stated that the hackers only received data contained on the magnetic stripe, which suggests the malicious software only captured data from the card-swiping hardware.
However, if the entire register and card-reader terminal were infected with malicious code, it’s possible the attackers also captured all input from these terminals, which would include PIN codes and digital signature data. So, I’d suggest proceeding on the assumption that they did potentially obtain keyed-in data, including PIN codes.
To be as safe as possible in your response to any breach announcement involving credit or debit cards, always assume the worst and act accordingly.
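If you want to sanity-check your own statement against the disclosed window, a throwaway script like the one below will do it. This is just an illustrative sketch: the dates come from Target’s confirmation above, and the sample statement entries are made up.

# Flag card swipes that fall inside Target's disclosed exposure window
# (November 27 - December 15, 2013, per the breach confirmation above).
# The sample statement entries below are made up for illustration.

from datetime import date

BREACH_START = date(2013, 11, 27)
BREACH_END = date(2013, 12, 15)

def swiped_during_breach(swipe_date, merchant):
    """Return True if an in-store Target swipe falls in the exposure window."""
    return merchant.lower() == "target" and BREACH_START <= swipe_date <= BREACH_END

# Hypothetical statement entries: (date, merchant, amount)
statement = [
    (date(2013, 11, 30), "Target", 84.12),
    (date(2013, 12, 20), "Target", 23.50),
    (date(2013, 12, 2), "Grocery Store", 56.75),
]

for swipe_date, merchant, amount in statement:
    if swiped_during_breach(swipe_date, merchant):
        print(f"Potentially exposed: {merchant} ${amount:.2f} on {swipe_date}")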
Who is Most Vulnerable?
Holders of Mastercard, Visa and Amex credit cards, or of debit cards carrying Visa or Mastercard logos, are the most vulnerable in this breach. These card numbers can be used almost anywhere, especially at online sellers where no signature is required, which makes them easy to abuse all over the Internet.
The least vulnerable cards are Target RED cards without Visa logos. These cards actually offer some protection against fraudulent use. Since they are only usable at Target and must be presented at the register to be swiped, they can’t be used in a store without creating a counterfeit physical card, and because they don’t look or feel like regular credit cards, they would be a bit harder to duplicate. Not impossible, but harder. Because the non-Visa RED cards only work at Target, the perpetrators would likely go for the ‘low hanging fruit’ first: card numbers that can be used anywhere and can be used online without printing a card, or, more specifically, Visa, Mastercard or Amex branded cards. Cards without those logos, like Target’s RED cards, only work at Target, which limits where they can be used.
The RED card can, however, be used at Target.com, which means a thief could use your RED card number through a Target.com account.
What should I do?
If you have a credit or debit card bearing the Mastercard, Visa or Amex logos, you should flip the card over, call the number on the back and ask to have the card replaced. Don’t try to contact Target, don’t ask questions at Target, just have the card replaced immediately. Yes, I know this is the height of the holiday shopping season and this may be inconvenient, but consider how much more inconvenient it would be if the perpetrators maxed out your card and you had to clean up that mess on top of not being able to shop. It’s always better to err on the side of caution and replace your card.
If you have a RED debit card, log into Target’s RED card management site and change your PIN. You can get to it from the main Target.com web site. Go ahead right now and do it. I’ll wait. You can finish reading the article when you get back.
So, now that you’re all done changing the PIN on your RED card, that’s really all you need to do. Even if the perpetrators obtained your RED debit card number, it can’t be used without the PIN code. By changing your PIN, you have just protected your RED debit account from unauthorized use.
If you have a RED credit card without a Visa logo, and assuming this card only requires a signature to make a purchase, then you are also vulnerable to easy purchases online at Target.com. Even so, there’s much less that can be done with a non-logo Target credit card, since it only works at Target. Still, I suggest you also visit the RED card management portal and choose to replace your RED credit card; there’s a link in the management site to do this. I suggest doing this online rather than calling the number on the back and waiting on hold. Given the extremely high call volume Target is experiencing at the moment, it’s a whole lot faster to use their web management site. However, before you run off and request a replacement card, read the rest of this article first.
If you own a Target Visa card, you should replace it immediately just as you would any Visa branded card.
Should I cancel my RED card?
The answer to this question is not as simple. If you use no card other than the RED debit card to make purchases at Target, you are actually more protected than with any other card you could use. So, I wouldn’t recommend closing out your RED debit card if you want to continue shopping at Target. However, if you no longer wish to shop at Target after this breach, then I’d suggest closing out all of your RED cards, as you don’t want these cards hanging around unused.
If you own a Target credit card, and especially a Target Visa card, you might want to consider closing those cards and replacing them with a RED debit card instead. The debit card is protected by a PIN code; without the PIN, the card is useless. With a credit card, only a signature is required in-store, and for web purchases no real verification is required beyond the security code on the back (and not always even that). In other words, with the debit card your PIN protects you, while with a credit card very little protects you other than fraud liability coverage, and even then you can still be held liable.
The Best Card To Use
The RED debit card is the safest card to carry into Target to shop. It’s safer than a Visa, Mastercard or Amex branded card because it can only be used at Target. It’s safer than carrying loads of cash, and it also gives you a 5% discount off purchases, which you won’t get with cash. It requires a PIN code to use, and the authorized user can change that PIN easily on the Target management site while a hacker cannot. The one downside to the Target RED debit card is that it requires giving Target ACH access to your bank account. But if you set up a separate account strictly for shopping purposes, as suggested in Randosity’s Don’t Trust Paypal article, you can protect your main bank account from unauthorized ACH access as well.
How do I protect myself?
There are limits to what you can do to protect yourself against technology. We are all vulnerable to attacks every day when using our phones, our computers, at work and in our cars. Technology is everywhere, and malicious code is being developed as you read this article. There is no absolute protection against malicious code. Most technologies are written for the greater good: checking you out at the store, running your phone, running bank ATMs, and so on. However, there are people whose goal is to disrupt these technologies for their own pleasure, for political reasons, for terror reasons or simply to disrupt the flow of society.
Basically, sh*t happens. You can’t predict it, you can’t manage it, you can’t really do much about it. This is why your bank cards have limited liabilities and why they allow you to change PIN codes and ask for replacement cards. The banks are well aware problems happen and they have safeguards in place to help prevent these problems.
However, only you can protect you. If you want to be as safe as possible, monitor the transactions in your accounts closely and choose technologies and strategies that help you safeguard your accounts. Don’t expect the banks to do this for you; some banks do offer limited monitoring services and will contact you when suspicious activity appears, but it is up to you to make sure your account information is safe. Basically, if you don’t trust the current payment technologies, you’ll be left behind. If you do trust them, you have to take the good with the bad. Paper money won’t last forever; eventually it will be replaced with something else, and these new payment technologies will continue onward.
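For the ‘monitor your transactions closely’ part, even a tiny script run over an exported statement helps. The sketch below assumes your bank can export activity as a CSV with date, merchant and amount columns; those column names, the merchant list, the threshold and the file name are all hypothetical placeholders, not any bank’s actual format.

# A minimal self-monitoring sketch: scan a CSV export of your account
# activity and flag anything over a chosen threshold or from a merchant
# you don't recognize. Column names ("date", "merchant", "amount") are
# assumptions -- adjust them to whatever your bank's export actually uses.

import csv

KNOWN_MERCHANTS = {"target", "grocery store", "gas station"}  # your usual spots
ALERT_THRESHOLD = 200.00  # flag anything larger than this

def review_statement(csv_path):
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            amount = abs(float(row["amount"]))
            merchant = row["merchant"].strip().lower()
            if amount > ALERT_THRESHOLD or merchant not in KNOWN_MERCHANTS:
                print(f"Review: {row['date']} {row['merchant']} ${amount:.2f}")

# review_statement("checking_export.csv")  # hypothetical file name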
For now, cash is one way to handle the technology issue, but it is not the best way. Of course, you could go back to using paper checks, but even checks are vulnerable to electronic attacks; while the paper check is an older concept, checks are scanned by computers and from there they become digitally vulnerable. It can also be difficult to buy things with cash or checks at online retailers unless they accept Paypal. The bottom line: if you choose not to participate in the new payment technologies, you will find it difficult and inconvenient to buy things, especially online. If you choose to embrace the newest payment technologies, you will also need to embrace the new security paradigm that goes along with them. Target has just unwittingly become a poster child for these new paradigms.
The State of Gaming
I’ve been an ardent gamer since the Atari 2600 broke onto the scene. Before that, I was an avid pinball and arcade attendee. Suffice it to say, I’m a gamer. So, let’s explore what’s changed about gaming.
Early Days
In the earliest stages of gaming, experimentation was commonplace. This was less true of early pinball games, where the physics were pretty much set, but in video games the bounds were endless. Though, the pinball technologists would still surprise me with what they could do with a table and, later, with digital displays. I digress. In the beginning, games like Pong (1972) set the stage for what could be done. A simple table tennis game seemed a good first step. It was a game everyone already recognized, but now it was on a screen with no need to carry around real rackets. You just moved your finger and the paddle moved. No more physical exertion. What was born was couch entertainment.
However, you couldn’t take the arcade home with you. At least, not for a while yet. We wouldn’t see video games become true couch entertainment until after the Atari 2600 was born in 1977, five years after Pong’s release into the arcades.
Arcades
I loved visiting the arcades during the early 70s. The ambience, the music and the machines (oh so many to choose from) all beckoned for that quarter, the fuel that drove your gaming satisfaction. Of course, at the time I was too young to have a job, so I was at the mercy of my parents for money. When we visited the mall, my mother would always give us (my brother and me) a couple of bucks and off to the arcade we’d run. For her, that money bought a shopping experience without a couple of annoying kids constantly making trouble. For us, we got to explore the latest video games in the arcade, like Atari’s Pong or US Billiards’ Shark (where you play as the shark eating the swimmer), or some of those old-style pinball games with wheels for numbers. No digital numbers on these pinball games; digital displays would come later.
This particular arcade (my first) was always fun and had unique games. It sat right across from a five and dime store. Some of the games even had quirky behaviors born of carpet static. One of the pinball games would add a free credit just from rubbing your feet on the carpet and zapping the coin slot. Unfortunately, living in humid Texas meant you could only do this at certain times of the year; the way-too-humid rest of the time, you had to pay. That is, until the arcade owners figured out the trick.
Throughout the 70s and early 80s, I visited many different arcades in malls, strip malls, bowling alleys, batting cages, amusement parks, convenience marts, standalone arcades, mini-golf courses and Malibu racing tracks. They all had their own ambience and games that made each experience unique and left a lasting impression. I never tire of visiting a new arcade.
One of the arcades I would occasionally visit had a mammoth pinball machine that used what looked like a white cue ball as the pinball. This pinball game was ginormous. Though it was big, it really wasn’t one of the most exciting pinball games. Its uniqueness was in its size, not in its game board mechanics. I always thought that it played like everything was in slow motion. I always preferred the smaller pinball games. This particular arcade had a cave-like quality that made it seem like you were the only one in there.
Video Game Experimentation
During the early years of video games, many different companies experimented with video game ideas. There were even hybrid pinball and video games combined, though none of these really successfully married the two technologies.
The earliest video games were flat, single-color games displayed on black and white CRT screens. When color was needed, colored gel panels were applied on top of the black and white screen. It wouldn’t be until later that color CRTs were added to video games.
This was a great time to watch as video games progressed from being simple flat shapes on black and white screens to more complex pixel drawn characters in later games like Mortal Kombat and Gauntlet.
Arcade Video Games
As we moved into the era of video gaming, games became increasingly complex graphically and sonically, but the games themselves remained relatively simple. Games like Pong, Space Invaders, Asteroids and Shark gave way to games like Donkey Kong, Centipede, Venture, Burgertime, Dig-Dug, Mr. Do and Galaxian. All of these games had a simple level-based premise: do something to ‘win’ the level and move on to the next level. The win-the-level premise really had its roots in pinball and simply carried over into video games. However, with pinball it was less about winning the level and more about keeping the ball in play as long as possible. With pinball, you were typically given 5 turns or balls to play, and once you used them up, the game was over.
With video games, the premise changed from ‘playing as long as possible’ to ‘playing as short as possible’ so that arcades could maximize their profits. Arcade operators really didn’t want the same kid playing the same game on the same quarter for hours on end. That could easily happen with certain pinball games, but with video games it was not a goal. As we moved into video gaming, it became less about skill and more about defeating the ‘enemies’ (whatever they happened to be). Video game creators quickly learned that ‘enemies’ were the motivator for play, and at the same time the enemies got more and more complex, ingenious and harder to beat. In Centipede, the enemy happened to be a big segmented centipede squirming its way down the screen towards your ‘gun’. If you managed to destroy all of the centipede’s segments, the level was over.
Many games adopted the ‘Centipede’ approach to levels and began building more and more complex ‘waves’ of enemies, as in Galaga. So, from where did Galaga descend? From Galaxian, of course. And Galaxian descended from Space Invaders, an early, somewhat higher-res game depicting rows of ‘UFO invaders’ at the top of the screen that you had to shoot until you destroyed them all. From this game alone descended a bunch of other games, some direct clones like Galaxian, Galaga and Gorf, and some indirect descendants like Defender (a side scroller). From Defender came some sonically similar games like Joust. There are plenty of games from this time period I could reminisce over, but I’ll move on to get to my point.
Game Innovation
As we progressed, game designers continued to push the boundaries with newer and more interesting ideas, higher resolutions and more compelling gameplay in titles like Paperboy, Marble Madness and Pole Position. There were also a number of vector-based games, like Battlezone, Tempest and Star Wars, which pushed the boundaries using vector graphics, a technology that would ultimately die out. At the time, though, vector games were some of the first games to depict objects in 3D space (even though they were just wireframe drawings). For me, the vector technology offered more compelling gameplay due to the pseudo-3D experience. Unfortunately, vector drawing would only be a stop-gap technology on the way to the 3D shooters of today. Still, the games that utilized it were definitely one-of-a-kind, and the technology even produced a cartridge-driven home console, the Vectrex, in 1982. I always wanted one of these.
Among all of the flat 2D sprite-based games, I applaud Atari for pushing the vector boundaries at that time. Without these innovative arcade games to keep us interested in plopping more quarters into the machines, we wouldn’t have kept playing.
Moving on, innovation continued with games like Gauntlet, which took the arcades by storm. The Tron games didn’t do so badly either. Even Journey (the rock band) got in on the gaming action with the mostly horrible Journey arcade game set to music from the Frontiers album; an earlier Atari 2600 console game based on the Escape album was also released. We would even see video game innovation in the form of laserdisc-based games such as Don Bluth’s animated Dragon’s Lair and Space Ace titles. I have no idea how many quarters I plopped into those machines. There were even controversial video games based on movies, like Exidy’s Death Race (1976), inspired by the film Death Race 2000, where the figures you ran over turned into gravestones.
All during this period, game designers were pushing the envelope on game ideas without much thought to the idea of game genres. That would come later. Fighting games like Mortal Kombat and Street Fighter and racing games like Monaco GP and Pole Position would become staples at most arcades. There were also a few sports titles like Punch-Out!, which introduced the idea of sports games, but the Maddens and FIFAs of the world would have to wait until consoles improved. Racing also gave us the later linked racing games, where 4-8 sit-down driving cabinets were networked together so players could race in unison. Other than racing, no other arcade games braved linking their cabinets for multiuser play; that wouldn’t happen until the dawn of home networking and, later, Xbox Live.
Arcade Gaming End
So, while arcade gaming has never really ended, it has been greatly diminished as a result of the introduction of the Atari 2600 and, later, the Nintendo NES and the Sega Genesis. It’s funny: Atari, Nintendo and Sega were all huge builders of arcade games, yet they all introduced home gaming consoles that would ultimately more-or-less kill the arcade as the place to game. Looking back now, you might say it was inevitable, but it’s interesting to consider.
Keep in mind that all during the later home console period (the mid 90s), home gaming on the PC grew stronger and stronger with games like Wolfenstein 3D, Doom and Quake. Thanks to id Software, Doom would usher in the era of first and third person shooters and bring this genre front and center. It would be a bit later that consoles would steal the PC’s thunder with games like Halo.
Anyway, as home gaming consoles improved from the Atari 2600 through to the Atari 5200, from the Sega Genesis to the Sega Dreamcast, and from the Nintendo NES to the Nintendo GameCube, and on to the Sony PlayStation, home gaming continued to prosper and arcades continued to lose ground. Even up through the Sega Dreamcast, we continued to see innovative titles arriving at home, from Blue Stinger to Yu Suzuki’s Shenmue series. Shenmue was one of the first open-world, free-roaming games that let you interact with much of the world, including real-time season changes.
The Era of Home Gaming
With the introduction of the Xbox and PS2, the whole course of gaming changed. Once these consoles were introduced, the gaming landscape began to be shaped primarily by Microsoft and Sony, and we began losing a lot of innovative titles. Sure, we might see one every now and then, like Rez, but these were an anomaly and not the norm. With the Xbox and PS2, the genres solidified into basically a handful of names like ‘shooter’ or ‘racing’ or ‘fighting’ or ‘multiplayer’… you get the picture. With these genre labels, it was easy for developers to create games, drop them into the slots, and have people understand exactly what they meant.
Still, while the genres were pretty much set by the Xbox and PS2, there were a few developers willing to go outside them and produce something new and different, but only rarely.
As we move forward to the introduction of the Xbox 360 and the PS3, we see genre-defying titles diminish further and the standard genres become entrenched. Basically, if your game didn’t fall inside a genre, it likely wouldn’t be released, or it would be released as a low-priced digital download. The only real exception was Valve, who seemed to be able to get a game like Portal released onto consoles. Even then, Portal could be considered a first person shooter, even though shooting wasn’t the primary objective of the game.
With games like Halo 3 and Gears of War on the Xbox 360 and God of War on the PS3, this era saw primarily genre-based titles released. Few developers ventured outside these tried-and-true genres; they could if they chose to, and the occasional exception still appeared, but by the Xbox 360 and PS3 era there were effectively no major titles that fell outside the genre labels.
Era of the Home Console
With the 2013 introduction of the PS4 and the Xbox One, the era of home gaming is likely coming to an end. These consoles are only incremental updates to their predecessors (Moore’s law no longer applies). There was a much bigger leap in quality from the Xbox to the Xbox 360, moving from 480p 4:3 component video to 16:9 1080p HDMI output. Changing the video standard between the Xbox and Xbox 360, and between the PS2 and PS3, was a huge leap, not to mention the Cell multiprocessor system that Sony put into the PS3. The 2013 consoles, by contrast, have reached the point of diminishing returns.
Both the PS4 and the Xbox One are simply mid-priced PCs with off-the-shelf AMD x86 processors and AMD (formerly ATI) graphics. They’re effectively mid-grade PCs running proprietary operating systems. In fact, I’d say the Xbox One is likely running a modified form of Windows 8, with greatly reduced features compared to the Xbox 360. The PS4 is running Sony’s own proprietary operating system, similar in looks to what was on the PS3, but also with greatly reduced features. Though, the Ustream/Twitch live streaming features of the PS4 are a much-welcomed improvement.
Yet for the cost of the units, the games haven’t dramatically improved. Let’s look at the problems. With the new consoles, the genres are pretty well set in stone. At this point, no developer is willing to stray outside the standard defined genres: shooter, fighting, sports, real-time RPG (which is slowly being merged with shooter), turn-based RPG, puzzle, simulation, strategy, party (which encapsulates dance and other party games) and creative. There may be sub-genres such as ‘horror’ or ‘mystery’ or ‘period’ that can apply to each of these, but these are the top-level genres in use. Sports encapsulates all forms of sports including baseball, football, racing, skiing, skateboarding and so on.
In fact, most games fall into one of the following: shooter, fighting, sports or RPG. The rest of the genres are lesser used.
The End of the Console?
Now that the PS4 and the Xbox One are available, one thing is becoming more and more clear: it’s expensive to create a game title on these consoles. To create a game that looks like Ryse, you need to outlay a hefty sum of cash just to license the Crytek game engine, and that’s only the engine you need to drive the hardware. Once you’ve spent your wad obtaining a CryEngine license, you still need to hire a slew of programmers, artists and writers to develop a compelling story and then work to turn that into some kind of compelling play.
From concept to completion, you’re likely talking at least 3-5 years depending on the size of your staff. Of course, the more people you throw at the problem, the faster you can get it done. But speed isn’t your only enemy here. Take the example I mentioned earlier, Ryse: the game is absolutely gorgeous. The environments are amazing; the characters and armor are outstanding. So then what’s the problem?
The gameplay in Ryse is absolute trash. They could have taken the game mechanics straight from a 1990s Mortal Kombat game and plopped them into Ryse for all I know. The characters move in unrealistic ways, the game forces pauses at the most inopportune times, and the gameplay is just overall bad. This issue is firmly the enemy of the PS4 and the Xbox One: a developer spends years and loads of cash creating a title only to produce something that plays like Ryse. In fact, Ryse is a firm example of what NOT to do on a next generation console. It sets the low bar that you need to make sure your game clears. Sure, it’s pretty, but that’s where Ryse ends.
Limited Games, Longer Create Cycle
This will be the continual battle of the PS4 and the Xbox One throughout their console lifespan. Consider that the Xbox 360 and the PS3 have both been on the market for roughly 8 years now. That’s 8 years of back catalog. Now, go look at those titles: many of them took less than 2 years to produce, and, of course, some of them show it (e.g., Two Worlds).
With these new console generations, the bar has been raised again, specifically for the graphics. Producing graphics that look great at native 1080p is not a small amount of work. Not only does it require high-res textures, it requires high-res models, and producing such models and textures is not a quick process. Where a texture might have been half the resolution on the Xbox 360, it now needs to be twice the resolution on the Xbox One, and that simply takes longer to produce.
This means that instead of the roughly 2 years it took to produce a title for the Xbox 360, it might take 3-4 years to produce one for the PS4 or the Xbox One. In 8 years, then, we’re likely to see perhaps half the number of big-name titles we have on the Xbox 360. It also means there will be a lot of engine reuse with new graphics dropped under the hood; in fact, I expect a lot of texture reuse across many games.
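Some rough back-of-the-envelope math shows why. The numbers below are illustrative only (uncompressed RGBA texels, and hypothetical 1024x1024 vs 2048x2048 asset sizes), but they show how doubling a texture’s linear resolution quadruples the texels an artist has to author and the raw memory the asset occupies:

# Rough back-of-the-envelope math: doubling a texture's linear resolution
# quadruples the number of texels an artist has to author and the memory
# it occupies (assuming the same uncompressed 4-byte RGBA format).

def texture_size_mb(width, height, bytes_per_texel=4):
    """Return the raw, uncompressed size of a texture in megabytes."""
    return width * height * bytes_per_texel / (1024 * 1024)

# Hypothetical asset sizes for illustration only -- not actual game data.
last_gen = (1024, 1024)    # a common texture size in the Xbox 360 era
next_gen = (2048, 2048)    # the same asset re-authored at double resolution

for label, (w, h) in [("Last gen", last_gen), ("Next gen", next_gen)]:
    print(f"{label}: {w}x{h} = {w * h:,} texels, {texture_size_mb(w, h):.1f} MB raw")

# 4x the texels per asset, and roughly that much more authoring work.
print("Texel ratio:", (next_gen[0] * next_gen[1]) / (last_gen[0] * last_gen[1]))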
For the game studios that can afford the time it takes, production will continue. Those that can’t afford the time it takes to produce that level of title will likely fold, stop producing or move to a different market.
The State of Games
Unfortunately, today we are seeing a convergence of genres. No longer do we see new innovative titles, other than as small digital-download diversions. Occasionally a Japanese developer will produce a title geared toward the Asian market that crosses over to the US market, but that’s rare. Most titles produced today fall into one of the predetermined genres. It’s just too risky for game studios to gamble on an experiment; they want to know their title is a guaranteed success, and the only way that can happen is by staying within the trappings of the genres.
Where a game like Pong or Shark might have taken a few people several months to a year to produce, it now takes many years to produce something like Halo 4. It’s too risky and expensive to gamble on experimentation, so game studios won’t risk it. This is why we are seeing more and more repetitive, trite and cliché games. Basically, we are getting games that we’ve already played at least twice before. Game studios believe that level of familiarity with the subject matter makes a title more likely to succeed; if it’s similar to a game you’ve already played, they assume, that familiarity will keep the gamers happy.
Unfortunately, the only thing this does is make the game crappy and annoying. Game studios don’t want to see or know this, but it is most definitely true. If you make your game feel like some other game or a game that you’ve played before, then it is that other game. It’s then not new or innovative and becomes an exercise in futility.
Predictions and Mobile Devices
I expect we will continue to see the smaller game studios close or be bought out. The larger game studios may continue to weather the longer cycle, but not forever. They have to see a return on their investment or they will also stop producing.
Overall, I expect that we will see fewer and fewer studios producing games for consoles. I also see this as the likely end of the ‘epic’ game. Game developers will begin to move back to smaller, more easily built titles like ‘Farmville’ and away from epic titles like ‘Call of Duty’ and ‘Halo’. The only game studios producing such titles will be those subsidized by Sony, Microsoft and Nintendo.
Those game studios not being subsidized to produce such ambitious titles will move away from the consoles and begin developing titles for mobile devices. Since mobile computing is pretty much taking over, there’s really no need to own a living room console. It’s easier to play games on devices you are already carrying. Eventually, game studios will realize that it’s far more lucrative to produce games to play on what’s in your pocket than what’s in your living room. Especially considering how many devices are sitting in people’s pockets untapped.
Just a few compelling titles on iOS or Android, like Angry Birds, and you’re pretty well set. Angry Birds has already paved the way; it’s just a matter of time before studios wake up and realize what they’re missing.
Xbox One is already dead before its launch?
Wow… just wow. Infinity Ward, the developer of Call of Duty, has recently stated in a pair of IGN articles that Call of Duty: Ghosts can only run at 720p resolution and a 60hz refresh rate on the Xbox One. Let’s explore why this is yet another devastating blow to Microsoft.
Xbox One
Clearly, Microsoft is banking on Xbox One to last for another 8 years like the Xbox 360. Unfortunately, not gonna happen. The Xbox One is clearly under powered for full next gen console needs. And, you would think the Microsoft hardware engineers would have thought of this issue long before even breaking ground on new hardware. You know, like actual planning.
With new TVs supporting 120 Hz refresh rates and higher, 1080p TVs everywhere (and 4k TVs not far off), it would be natural to assume that a next gen console should be capable of producing output at a full 1080p, 60hz frame rate as its base resolution, with room to go up from there. According to Infinity Ward, this is not possible on the Xbox One. I’ll say that one more time: Infinity Ward has just said that 1080p at 60hz is not even possible on the Xbox One.
Next Gen Consoles
Because of this significant and avoidable Xbox One hardware deficiency, Infinity Ward has taken the step to produce Call of Duty: Ghosts in 720p at 60hz refresh rate (upscaled to 1080p) on the Xbox One to keep the ‘experience’ similar on all platforms. Let’s compare. Every big game title produced on the Xbox 360 is already 720p 60hz upscaled to 1080p. What this ultimately says is that the Xbox One hardware is no better than the Xbox 360. This hardware is basically dead before it’s even hit the store shelves. A next gen console should not see limitations in hardware until at least 2 years following its release. A new console should never see any limitations being hit by any launch titles.
If one of the very first launch titles is already taxing this console’s hardware, this platform is dead on arrival. It means the Xbox One has nowhere to go but down, and it means you might as well stick with the Xbox 360, because that’s effectively what you’re buying in the Xbox One. It also means the games will never provide a high quality next generation experience, no matter which game it is. Seriously, getting high resolution at full speed is why you buy a next generation console.
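To put some quick numbers behind that, here is the raw pixel arithmetic, a simple sketch using the 60 frames-per-second target discussed above: 1080p is 2.25 times the pixels per frame of 720p, and 4k is 9 times.

# Quick arithmetic behind the 720p-vs-1080p complaint: how many pixels per
# second a console has to render at a 60-frame-per-second target.

RESOLUTIONS = {
    "720p (1280x720)":   (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "4k (3840x2160)":    (3840, 2160),
}
FPS = 60

base = 1280 * 720
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame, "
          f"{pixels * FPS:,} pixels/sec, "
          f"{pixels / base:.2f}x the work of 720p")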
Granted, I can’t vouch for Infinity Ward’s programming capabilities as I don’t know any of their developers. But, I know they have been producing this franchise for years. I would also expect their software engineers to have both the knowledge and expertise to properly produce any game for any platform they set their sights on.
In other words, I cannot see that this is some agenda on the part of Infinity Ward to try to discredit the Xbox One hardware.
Xbox One vs Xbox 360
The Xbox 360 hardware is already well capable of producing games at 720p and 60hz; it’s been doing that resolution and frame rate for years. Why buy another console that has the same exact limitation as the current hardware? You buy into a next generation console to get something new, namely higher resolution gaming experiences. If the Xbox One cannot provide this, there is no point to this platform and this platform is dead. DEAD.
Xbox One: Dead on Arrival?
Based on the above, the Xbox One’s lifespan has been substantially reduced to, at best, 1-2 years on the market before Microsoft must redesign it with a new processor and graphics card. This also means that early adopters will get the shaft with ultimately dead hardware and have to buy new hardware again very quickly to get the newest Xbox One experience.
If you’re considering the purchase of an Xbox One, you should seriously reconsider. I’d suggest cancelling your pre-order and waiting for the next console revision from Microsoft, or, alternatively, buying a PS4 if you can’t wait that long. Why spend $499 for a console that gives you the same capabilities as the Xbox 360? It makes no sense, especially considering that there are no compelling launch titles on the Xbox One that aren’t also coming to the Xbox 360. It’s worth taking the extra time to make sure your $499 investment in this console is a sound choice.
Coding to the Weakest Hardware?
For the longest time, the Xbox 360 was the weakest hardware of all of the consoles, and clearly it still is. For just as long, developers catered to the weakest hardware when developing their games. That meant lesser graphics quality, lesser texture quality, lesser everything quality. I’m hoping this is now a thing of the past.
It now appears that game developers are tired of developing to the weakest hardware, and Call of Duty: Ghosts hopefully proves that. Rightly so. Instead of producing low-res, low quality gaming experiences on all platforms, they should provide the highest quality gaming on the best platforms, then take that and scale it back to fit the weaker hardware.
This scenario has flipped the old development practice. I’m glad to see developers embracing the best hardware and delivering the highest quality gaming experience there, then reducing the quality to fit the weaker hardware. It makes perfect sense, and it also explains why Infinity Ward reduced the resolution on the Xbox One. But being forced to reduce a launch title to a lower resolution doesn’t bode well for the longevity of the Xbox One hardware.
What about the PS4 and 4k gaming?
According to those same articles above, the PS4 apparently doesn’t have this 1080p limitation. Call of Duty: Ghosts will run on the PS4 in full 1080p with 60hz refresh. Whether the PS4 is capable of higher resolutions is as yet unknown. Consider this. One of the very first 4k TVs introduced was produced by Sony. I would expect the PS4 to have been built to possibly support 4k gaming experiences. That doesn’t mean it will right now, but it may in the future. The Xbox One? Not likely to provide 4k anytime soon. If Microsoft’s engineers weren’t even thinking of 1080p resolutions, then they most certainly weren’t thinking about 4k resolutions.
If you’re into future proofing your technology purchases, then the PS4 definitely seems the better choice.