What is Critical Thinking?

Critical thinking, when taught in a classroom setting, covers something that approximates critical thinking, but isn’t actually critical thinking. In fact, what is taught is closer to deductive or logical reasoning than to critical thinking. Let’s explore.
This article is 6647 words. At an average reading speed of 200 words per minute, it will take slightly more than 33 minutes to read. Grab your favorite beverage and let’s get started.
Critical Thinking Tests
Here’s a test example:
Which of the following reflects the intention of the writer well?
A. To call men intelligent who are not strikingly so must be to use the concept with undue imprecision
B. Every empirical concept has a degree of vagueness
C. Calling someone intelligent or not depends upon one’s whim
D. There is no need to be as indecisive as the writer of the above
While there is an answer to this question, I’m not going to go into what it is just now for a number of reasons which will become apparent shortly. Instead, let’s analyze this type of question for its appropriateness for critical thinking skills.
First, let me start by saying that the grammar of this question is absolutely atrocious. Without proper grammar, you can’t make heads or tails of what the question is actually asking. The wording forces you to trip over the question, which then distracts you from answering it. This alone leads to confusion and interpretation problems. Once we’re off track on the interpretation, we can’t easily arrive at a correct answer. Is this a test writer trick? I’ll leave that for you to decide.
Second, this question has multiple choice answers. I vehemently dislike multiple choice answers for a number of reasons. The first reason is that they offer a limited selection of choices. You aren’t free to think through the question critically… which is the whole point of this exercise. Instead, you must keep your thoughts constrained to only 1 of 4 answers. On the plus side, the question author didn’t include the absolutely horrid trick answers, “All of the above”, “None of the above”, “Answers 3 and 4” or any similar answer tricks.
The second half of the second reason to dislike multiple choice answers is that you must decipher what the question author is asking you to do and then keep your thoughts constrained to only those 4 answers… even though your own critical thoughts may lead you to conclusions not included among them. This means you have to put yourself into the shoes of the question author to try to determine how the question author expects you to answer. In fact, this makes answering the question less about performing actual critical thinking and more about getting into the head of the question author to determine the test author’s motives. That’s not a critical thinking exercise at all. No.
That’s test taking 101. Meaning, it actually becomes more important to understand the test author’s tricks than it is to actually utilize critical thinking skills to answer the question. This is an important distinction to understand about test taking. This is why multiple choice test taking is less about what you know and more about how best to decipher the test author’s motives for including the question… and more importantly, how they expect you to respond (correctly or incorrectly) to their biased notions. In other words, test authors leave you just enough threads of logic to lead you in multiple directions. Only one thread, if you follow it, leads to the correct answer. Other thought threads, if you are tricked by the question author’s lead, will lead you down the wrong answer path.
This means you may be betrayed by your very own thought processes. You may land on the wrong answer simply because the question author led you down a path toward the wrong conclusion. Again, this is test taking 101. You have to become a savvy enough test taker to recognize that the test author is intentionally leading you down the wrong answer path, and then rethink your conclusion to arrive at the correct answer. Again, this has nothing whatever to do with critical thinking skills and everything to do with avoiding test author traps.
Third, this snippet of text is too small to draw any real conclusions from. It’s like taking two sentences from a Star Wars novel and then expecting you to understand the author’s intention behind the story. You can’t do this with only two sentences. This question text lacks the bigger context of why it exists in a larger text. If the “author” behind this question had included this small statement in a romance novel, for example, describing a specific character, you could much more easily draw conclusions to the correct answer because you would have the wider context surrounding its reason to exist. However, pulling a small snippet out of a larger story, then expecting a test taker to rationalize conclusions without the necessary larger context, means jumping to conclusions mostly by guessing. Guessing isn’t the way to critical thinking. Guesswork is best left for situations where the outcome is more or less meaningless. Guesswork shouldn’t be part of or required as a strategy when taking any standardized multiple choice test of any kind. For test taking, you either know the answer or you don’t.
Free Form vs Multiple Choice
Free form answers range in difficulty, but at the same time, require more actual critical thinking. You have to be able to articulate into words the answer to the question. These word answers can then be read by the teacher to understand the student’s thought rationale. That’s the point in critical thinking. For some, writing a free form answer can be easier. For others, it can be more difficult. One thing is certain. Writing a free form answer means you’re not constrained to a limited set of answers… which takes trick answers by wily question authors off of the table. It also takes misinterpretation issues off the table. However, it won’t solve poor grammar problems, such as in the question above.
There, I Fixed It
The question above should have been correctly worded as follows:
Which of the following reflects the intention of the writer well?
A. To call men intelligent who are not strikingly so must be to use the concept with undue imprecision
B. Every empirical concept has a degree of vagueness
C. Calling someone intelligent or not depends upon one’s whim
D. There is no need to be as indecisive as the writer of the above
In fact, the text of this question, now that it has been grammatically corrected, is technically an alternative form of the classic “glass half-full” vs “glass half-empty” argument. Let’s examine.
Of intermediate men (meaning, men who fall halfway between intelligent and not intelligent), do we call them intelligent or not? Again, this situation illustrates another version of the “glass is half-full” versus “glass is half-empty” argument. Thus, such situations can be both rationalized and stated either way… and correctly, I might add. It’s particularly true when extenuating circumstances are present (e.g., how thirsty you are).
Recognizing that this is a case of “glass half-full vs glass half-empty” should be the critical thinking challenge. Once you recognize this fact, the answer should become obvious. Yet, it doesn’t. There’s no answer here that immediately rewards the critical thinker for recognizing this fact. Instead, we are still left with 4 bland answers… answers that don’t adequately or obviously sum up the question author’s reason for writing this question.
However, according to this test author, the answer is A… “To call men intelligent who are not strikingly so must be to use the concept with undue imprecision”. There is nothing in the snippet that describes a man as “strikingly so”. This “strikingly so” concept was added in the answer and was not part of the question. In fact, the correct answer should be C… “Calling someone intelligent or not depends upon one’s whim.” Why?
Why Indeed
Answer A works only from a utility perspective, and it breaks down under scrutiny. The “strikingly so” text, which was only present in the answer and not in the question, was added as a qualifier for the intermediate man. This qualifier didn’t exist in the original text and was incorrectly introduced as a new concept in the answer. This violates basic answer-writing protocol.
A man who is of intermediate intelligence can’t really be called unintelligent unless someone who is much more intelligent stands next to him. Intelligence is a matter of degree. This means that so long as the intermediate man is the most intelligent man in the room, then the glass is half full… or more specifically, the man is intelligent. However, if the intermediate man isn’t the most intelligent man in the room, then the glass is half-empty… or more specifically, the man is considered unintelligent. It’s all a matter of context.
The label is then applied based on the context (or whim) of the situation… which means answer C, “Calling someone intelligent or not depends upon one’s whim”. Even though neither C nor A correctly or adequately describes this situation, C is the most correct of all of the included weak answers, because C doesn’t introduce new information.
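To make that context-dependence concrete, here’s a minimal sketch in Python. The names and scores are entirely hypothetical; the point is that the same intermediate man earns either label depending purely on who else happens to be in the room:

```python
# A toy illustration of the "glass half-full vs. half-empty" labeling problem.
# Names and scores are hypothetical; only the relative comparison matters.

def label(person, room):
    """Call a person 'intelligent' only if no one in the room outscores them."""
    brightest = max(room, key=lambda p: p[1])
    return "intelligent" if person == brightest else "unintelligent"

intermediate_man = ("Mr. Intermediate", 50)  # halfway between the extremes

room_a = [intermediate_man, ("Mr. Dull", 20)]     # he's the brightest here
room_b = [intermediate_man, ("Ms. Prodigy", 95)]  # here, he isn't

print(label(intermediate_man, room_a))  # -> intelligent   (glass half-full)
print(label(intermediate_man, room_b))  # -> unintelligent (glass half-empty)
```

The man’s score never changes; only the room around him does. That’s the whim (or context) that answer C describes.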
Let’s also keep in mind that the definition of “undue” means “excessive”. Calling an intermediate man intelligent isn’t, in any way, excessive. Anyone who is not unintelligent must be, by their very nature, some amount of intelligent. We all understand that intelligence is a matter of degree. It is not an absolute. Calling someone intelligent doesn’t immediately conjure up images of Einstein and Mensa. Instead, calling someone intelligent means recognizing that they are not stupid. For this reason, this question is better served (as a critical thinking exercise) by recognizing that it is, in fact, a form of “glass half-full” vs “glass half-empty” and then rewarding the test taker accordingly with an appropriate answer. That is the reason for this question’s existence… an exercise that the test writer him/herself wasn’t intelligent enough to realize.
Critical Thinking
The above proves that this form of test taking isn’t sufficient to demonstrate whether a student really understands critical thinking. This form of test only tests whether the student can take tests, not whether they understand the concept of critical reasoning.
Critical thinking and reasoning are designed to compare ideas, learn the most you can about them, apply logic and determine whether what someone is saying is true, partly true, partly false or entirely false. Again, there are degrees of falsity and truth. Understanding and being able to critically find these half-truths or half-falsities (seem familiar?) is the art of critical thinking. It is a concept that the question author above failed to understand. It is also the reason critical thinking skills are so important.
To ferret out truth from fiction using logic and reasoning is, by its very nature, a skill that everyone needs to master. Sure, you may know when your kids are lying, but can you tell when your co-worker is lying? Your boss? Your doctor?
You can’t just blindly go around thinking that all of these people are telling you the absolute truth any more than thinking they are outright lying. You need to be able to determine the degrees of their truth and their deceptions. This is where critical thinking comes into play. Critical thinking isn’t just about reading text, either. It’s also about reading body language, reading into a person’s words and watching how people interact with one another. To become a critical thinker, you must also be able to read human body cues and nuances. Critical thinking skills are rarely ever just about one thing. They’re a combination of cues, text, conversation and rhetoric that combine to create a whole. Once the whole is created, it can be dissected and analyzed by your brain. The point is to work through not only the logic or irrationality of the situation, but also to combine all aspects to see the bigger picture. Only then can you really think critically about what you know.
Case in point, the above question. If I had attempted to guess what the question author wanted, I would have gotten the wrong answer… because my analysis was not what the question author was seeking. Instead, my thought processes led me down the wrong path because I saw something in the question that turned out to be ignored by the question author. Instead, the question author took the wrong path by introducing information in the answer which shouldn’t have been there. I would have ignored the A answer because of the introduction of information that wasn’t in the original question. In fact, the introduction of that new information was actually the author’s trick to lead you away from the correct answer… to have you select an answer that didn’t introduce new information.
That’s not the critical thinking that the author intended. Instead, you were forced to use critical thinking to deduce which answer is the best based entirely on what you guess the author expected. In fact, there is even less information to go on about the test author than there is in the question. However, taking a test as a whole, you might be able to, through critical thinking, ascertain patterns in the questions and answers. It is these patterns that might lead you back to the above question to later realize the obviousness of the answer.
Unfortunately, you would have to have answered many questions on the test to realize the test author’s scam behaviors. Once you can recognize the test author’s scams on the test as a whole, you can then go back and rework previous answers to fall in line with that new information gleaned from the full test. This is why there’s an art form in taking tests… and while it might utilize critical thinking, it is more dependent on second guessing the test author correctly. I digress.
Critical Information
Critically viewing the world is important. You don’t have to tell everyone your conclusions. You simply need to be able to reach reasoned conclusions based on all information you can obtain. Conclusions aren’t always correct, but critical thinking isn’t an exact science. Because data is always changing and being updated and more information can be found, conclusions may change based on new data. This is the reason to always remain open to new data with a willingness to update conclusions based on that new data.
Jumping to conclusions is easy. It’s just that people tend to jump to conclusions way before having enough critical information. In fact, many people jump to conclusions with only the barest of information. Snap conclusion jumping is the whole reason why TV sitcom programs like Three’s Company (and many other situation comedies) can even exist. With access to the Internet, everyone now has a treasure trove of information right at their fingertips. Don’t just search one thing and call it a day. Spend some quality time searching and digging and reviewing. Look at all sites… even if the site is primarily made up of kook conspiracy theorists. There are always grains of truth tucked everywhere. It’s the commonalities between the various sites that are likely to lead you to those grains of truth.
If you can call and ask questions of people, you can gain even more insight. Nothing is off limits when seeking information. The worst that someone can say to you is “No”, and then you’re no worse off than you were before. As long as you understand this aspect, you can dig for information and sometimes get the information you need. Most times you don’t even need to call and talk to someone. An email will typically suffice. Some people can even be more forthcoming in email because it doesn’t require speaking aloud, which can be overheard by bosses and other staff. Typing, on the other hand, isn’t a problem… which illustrates yet another critical thinking exercise.
Getting the information to aid in finding an answer is half the fun. The other half is analyzing the data in your brain for even more ideas. Not everyone has the aptitude or desire for this. I get that. But, critical thinking is still very much a useful life skill.
Testing vs Real Life
Understand that the test question above is the kind of question you might expect to see on an aptitude test, like the GMAT. To pass that test, you will need to study similar kinds of questions. You’ll then need to understand how to read and interpret these questions for the appropriate answer. However, know that that kind of “critical thinking” isn’t the same as what you would use in everyday life… and herein lies the rub.
Teaching critical thinking skills in a classroom will gear the student towards passing an aptitude test. Unfortunately, such tests don’t adequately prepare the student for using genuine real world critical thinking skills to solve actual problems, get to the bottom of a lie or resolve any other real life dilemma. We must rely on a completely different set of thinking skills than those taught in a classroom. For this reason, relying on academia alone to impart the necessary information tends to come up short in real world applications. It is for this reason that I write this article.
Academia
Don’t get me wrong. Academia is great for learning new information. The teachers are excellent at getting students up to speed on various topics that they may know nothing about. Unfortunately, as great as they are at doing this, you also must recognize the limitations of academia. The biggest limitation is that university and college classes aren’t great at teaching information that’s real world applicable… or more specifically, how you can apply that knowledge to real world, real life situations. Instead, the student is left to his or her own devices for how to tie course materials to everyday life.
Some course materials lend themselves to real world application much more readily than others. Take accounting classes, for example. It’s fairly obvious that the information learned in an accounting class can be used at an accounting firm. Classes like Sociology, Psychology, Art History, History and even Geology don’t always offer up real world applicable information. They’re “great to know” classes, but can’t often be used in real life. Even mathematics classes don’t always have real world applicable uses, unless you’re a video game programmer. Even then, there are limited use cases for that information. Knowing Calculus, for example, may not be helpful in programming a video game unless you’re designing a new and better physics engine.
In most cases, however, a game developer will grab an existing game engine off the shelf, which removes the need to know that level of calculus… because you’re using an existing pre-programmed engine, not designing a brand new one. You will need to know how to use the engine to its fullest, but that won’t require that level of mathematical understanding.
Academia does have its uses. Specifically, it helps you to get a degree. Having a degree is exceedingly helpful in obtaining a job in certain fields. Unfortunately, much of the required academic information learned at a university is lost in time… which means the money wasn’t well spent. Certainly, you got the degree out of the deal, which is the primary reason to spend the money. However, not retaining the learned information is a loss, not only of what was learned, but to a lesser degree of the money spent in the attempt to learn it. That doesn’t mean all will be lost after earning a degree.
Academia isn’t totally a waste for learned information, however. Some of the information learned can be useful in real everyday life… if you can manage to retain it. If you use some of that information on a regular basis as part of a job, then you will at least retain that information. However, keep in mind that information learned during the pursuit of your degree can become outdated years later. Even such topics as history, physics and mathematics can change as new assumptions are made, as new information is uncovered and as new technological achievements arrive. In other words, some information learned in 1995 might be outdated by 2005, just 10 years later. Computer knowledge, especially, becomes outdated far faster. Learning to use, for example, the WordStar word processor was entirely outdated by the release of Windows 98 and packages like Microsoft Word. As another example, learning to use Windows 95 had become entirely outdated by 2020, with Windows 10 being the most current edition.
Even the introduction of the iPhone and the iPad have changed academia from 1995. For this reason, many professions require refresher courses every year to keep each professional informed of the latest changes in the industry. Unfortunately, too many industries don’t require such refresher courses.
Learning Everyday
The point to all of this is that critical thinking is left to the individual, both to address on their own and to continue learning, growing and expanding their own knowledge and contemplation skills. Critical thinking isn’t something that you put down or use occasionally. You must use this skill every time you interact with anyone. That includes watching the news, reading a book, talking to your friends and, indeed, even interacting with your boss.
What you’ll soon learn is that everyone has an agenda. It may be as small and innocuous as attempting to sway your point of view, but it might be as big as attempting to manipulate you into doing something for them. Critical thinking is an important life skill. This can’t be emphasized enough.
You must be both willing and able to see through to a person’s real agenda. Not everyone wants something from you… at least, not something that’s tangible. Television news programs want your attention and they want to sway you to a specific point of view… a point of view that is dictated by only the information presented.
A real world example
COVID-19 comes to mind. Vaccines do have benefit when designed correctly. However, the agenda now is to push the vaccines at all costs. News programs have been pressing this point almost relentlessly… to the point of ignoring the pandemic itself. We now get 5 minute snippets of the death numbers and we get 30 or 60 minute long segments with “medical professionals” espousing how well the vaccines work… yet, how scarce they are.
We know that. We knew that when the vaccine rollout began. It’s as if the news shows each want to insult our intelligence by assuming we didn’t know that the vaccines would be scarce for months on end. Yet, instead of covering the pandemic and showing us the carnage, the news producers instead choose to show us a whole lot of nothin’ about how poorly and slowly the vaccine rollout is progressing. In fact, news programs have chosen to politicize this whole issue by blaming it on the politicians. I won’t go down into the politicization quagmire that literally has no end. Instead, let’s move on.
These news shows have chosen a one-sided approach to pandemic reporting. Instead of reporting on the actual pandemic, they are reporting on the vaccine rollout and pretending that covering the vaccine rollout counts as reporting on the pandemic. Hint: it doesn’t. The vaccine is but one small subset of the entire pandemic. The pandemic is about how the virus is spreading, not how well the vaccine rollout is going.
Let’s understand more. The vaccine brings hope. The pandemic brings despair. As a producer, which would you rather report on? Here’s where biased reporting comes into play. The pandemic is not just about the vaccine, it’s about how, when and why people are contracting the virus. It’s about contact tracing. It’s about timely testing. It’s about hospitals under siege. It’s about the resulting deaths. It’s about running out of medical equipment. It’s about all of these things and more… and yes, it is also about the vaccine rollout.
When a news program chooses to ignore all else to bring the vaccine rollout front and center, that’s disingenuous reporting and it’s the very definition of biased reporting. One might even consider this kind of repetitive reporting as a kind of reporting designed to convince the viewer the vaccine is a “good thing”. This aspect requires critical thinking skills to both realize and understand. If you don’t use critical thinking skills here, you can’t know to visit other news sites to get information about the pandemic itself sans the vaccine rhetoric. Critical thinking allows you to bring all aspects into perspective.
In this last example, trying to convince someone of something by repeating it often is a resurgent, but definitely not new, tactic. As a critical thinker, you must recognize this strategy to understand just how misleading it is. Donald Trump utilized this “repeat often” strategy in an attempt to convince people that the election was rigged. Here we have news programs using this same exact strategy to sway people to the news producer’s agenda of “pandemic bad, vaccine good”.
Let me just stop here to point out a prior Randocity article about the vaccine. Again, this is another critical thinking article. I’m not attempting to convince you of my point. Instead, I’m offering up various sides and I leave it for you to decide your own point of view. I also don’t use repetitive reporting techniques to barrage you with the same point over and over and over as a technique of persuasion. I could most certainly use this technique, but then this blog would be no better than Donald Trump or various major news networks.
With this article, I want you to rise above these petty persuasion techniques and see these things for what they are… by using critical thinking and reasoning. However, as the saying goes, “You can lead a horse to water, but you can’t make it drink.” I can lead you, but you must choose to understand. I’m not here to convince you. You must make the leap to understand for yourself.
One Last Exercise — COVID-19
Let’s critically discuss the vaccine rollout. The vaccine rollout team has chosen a very specific rollout methodology, a methodology that I have begun to question. There’s no argument that choosing to inoculate those most at risk first seems like the best strategy, but is it? Clearly, those at high risk stand to lose their lives if they become infected. However, how do those at risk become infected? Most likely, by way of those who are much younger and healthier who bring it to them.
Reasoning this out, it seems that rolling the vaccines out to those at highest risk of carrying the virus around would make the most logical sense, regardless of age. Yes, it’s been stated the vaccines won’t necessarily prevent carrying the virus asymptomatically. Let’s examine who I propose here: children in school. Because children are dependent on adults for their well-being, because children must return to schools and daycare centers which congregate them into close social groups, and because children are not yet capable of understanding the ramifications and risks of carrying around COVID-19, child carriers are the most likely route by which those at risk become infected.
Children congregate socially to play and learn. Because of that, they then pass COVID-19 around and bring it home to their parents. If it’s a multi-generational household with grandparents at home, then those most at risk can easily become infected. The parents can then become unknowingly infected and, for a short asymptomatic time period, carry and spread COVID-19 to work, retail businesses when shopping and others they encounter… even to social events like the year end holiday season.
Many people have presumed this next false logic about children and COVID-19: “Because children are less prone to the effects of COVID-19, this means they are less likely to spread it.” This is patently false. There is no causal link between these two separate concepts. Children and adults are both human. Human to human transmission is just as likely from a child as from an adult. In fact, because children are less likely to wash their hands often and less likely to cover their mouths when they sneeze or cough, the risk of transmission of COVID-19 from a child is extremely high. While the child may never get severe symptoms, that may not be true of those to whom the child has transmitted the virus. However, a child doesn’t have the life experience to understand why handwashing is important… why covering their nose and mouth to sneeze or cough is important… why it’s important to take regular baths and to wash clothing. That leaves adults at greater risk from their own child, particularly if the child has been at school around other children.
I’ve even seen doctors on news programs imply that children can’t as easily transmit the virus to adults, as justification for getting children back into school. I get this want. I truly do. Parents can’t have their children at home 100% of the time. They need their child back in school. After all, school is really treated primarily as a form of free daycare… with the added benefit that the child might learn something. However, the misguided logic of children being unable to spread COVID-19 is patently false and will bite us all in the ass.
Children can pass COVID-19 to an adult just as easily as an adult can pass COVID-19 to a child. There is no transmissibility decrease from child to adult or adult to child for any virus, including colds, flu and, yes, COVID-19. You are just as likely to catch a virus from a child as from an adult or from anyone of any age. Anyone claiming otherwise is flat out lying to you. Human transmissibility of a virus doesn’t change simply because of the age of the human. Believing that lie could get you or your family killed.
For this reason, using this lie to justify school reopening is ripe for a resurgence of the virus… not to mention the unnecessary loss of teaching staff who, at this time, can’t easily be replaced. If school districts want to believe this patently false claim and reopen the schools simply to get the kids back at their desks, then don’t say you haven’t been warned.
Vaccination
If schools wish to reopen, children and teachers must be vaccinated for COVID-19. Why? Not because the vaccine will stop children from being carriers (it won’t necessarily), but because it will reduce the amount of time they can carry COVID-19 when they get it. In fact, a child taking the vaccine may actually reduce and limit the child’s ability to transmit the virus to others. If their immune system fights off the virus quickly (in a day or two), a child’s ability to transmit is limited to a day or two at most. Because the vaccine kickstarts the immune system into action fairly quickly, a child should be able to recover far, far faster from even an asymptomatic infection than an adult. Vaccination can then drastically reduce transmission from child to child in a school setting. It also drastically reduces the chances a child may transmit it to a teacher (particularly if the teachers are also inoculated) or to their parents.
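Here’s a rough back-of-the-envelope sketch of that reasoning. Every number in it is a purely illustrative assumption, not epidemiological data; the point is only that shrinking the contagious window shrinks the opportunities to transmit proportionally:

```python
# Toy model: transmission opportunities ~ daily contacts x days contagious.
# All numbers are illustrative assumptions, not real epidemiological data.

def transmission_opportunities(contacts_per_day, days_contagious):
    return contacts_per_day * days_contagious

school_contacts = 30  # assumed daily close contacts for a child at school

unvaccinated = transmission_opportunities(school_contacts, days_contagious=10)
vaccinated = transmission_opportunities(school_contacts, days_contagious=2)

print(unvaccinated)  # 300 chances to pass the virus along
print(vaccinated)    # 60: an 80% reduction, purely from the shorter window
```

The real figures depend on the virus and the setting, but the structure of the argument holds either way.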
For this reason, the current, flawed strategy of inoculating the eldest groups first and working downward means leaving school age children as the very last group to receive the vaccine. Again, flawed logic. Yet, parents want schools reopened now. If schools want to reopen, then everyone working in schools, including the children, must be vaccinated. There is no other choice. This means modifying the present rollout strategy by making schools the next group in line to receive the vaccine. Attempting to open schools without a vaccine strategy will lead to the unnecessary deaths of teachers and staff operating the schools… at which point the schools will be forced to close, not because of the virus threat, but because there’s simply no longer enough staff to operate the school system. Then, the choice to reopen schools will have been firmly shut down until such time as staff can be replaced.
Of course, no new teachers will want to hire on with school districts whose leadership so callously let their own teachers and staff die by becoming infected with COVID-19 via the children, particularly when this situation could have been entirely avoided by choosing a safer distance learning approach. In other words, the schools and school districts will have a logistical and public relations nightmare on their hands should such a situation unfold. Not to mention many, many lawsuits from teacher and schoolchildren’s families alike. Opening schools to 100% capacity without any mitigation strategies, such as the vaccine, is ripe for many, many more COVID-19 deaths, not just in schools. Think about the holiday season surge, but then realize it won’t end until schools are forced closed because of the loss of a critical amount of staff. It’s ultimately a no-win scenario. Believing the lie that schools are “safe environments” without offering a vaccine strategy is likely to end with the same outcome as the year end holiday season COVID-19 death surge. Here you should use critical thinking to think through this assertion.
Reducing the Spread?
The bigger question… Is anything that we’re doing, including the vaccine rollout, actually making a dent in COVID-19’s spread? As of now, probably not. The vaccine rollout might eventually begin to take effect, but that probably won’t happen for at least a year or longer. Even if the vaccine reduces the symptoms of COVID-19 to a manageable and survivable level, COVID-19 still has the potential to be fatal in some risk groups where the vaccine doesn’t work properly or can’t be administered. The vaccine may eventually reduce the mortality rate, but we don’t yet know how far the mortality rate may be reduced.
On the other side, we are also running against the variant clock. I really dislike the term variant and, instead, prefer the term strain. I don’t know why the news programs are using the term variant instead of strain, but here we are. The primary defined difference between a variant and a strain is functional. For example, scientists believe that if a mutated virus is capable of getting past a vaccine, then it is considered a new strain. If a virus has mutated, but functionally hasn’t changed and vaccines are still equally effective, then it is a new variant. However, I’d argue that if the virus hasn’t functionally changed, it isn’t even a variant… regardless of whether its genome has mutated. In other words, variants aren’t important until they are able to get around a vaccine, at which point it’s no longer a variant, but a new strain.
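To lay that distinction out plainly, here’s the decision rule I’m proposing, sketched in Python. This reflects my taxonomy as argued above, not any official virological definition:

```python
# The variant-vs-strain decision rule proposed above, in code form.
# This reflects my argument, not an official virological definition.

def classify(genome_mutated: bool, evades_vaccine: bool) -> str:
    if not genome_mutated:
        return "same virus"
    if evades_vaccine:
        return "new strain"  # functionally different: vaccines need reworking
    # Scientists would call this a "variant"; I'd argue the mutation is moot.
    return "functionally unchanged (not even a variant)"

print(classify(genome_mutated=True, evades_vaccine=False))
print(classify(genome_mutated=True, evades_vaccine=True))
```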
The clock, however, is still ticking. That means that eventually a new strain (not variant) of COVID-19 will emerge that (almost) completely evades the current vaccines. That means vaccine manufacturers will need to rework the vaccine to include the new strain(s) or provide a booster shot that boosts the antibodies to now include the new strain(s). Though, I’d logically argue that a booster shot that intends to combat a new strain is not a booster and is instead a new vaccine unto itself. Additionally, when a new strain emerges, it likely won’t be a single strain. It will be multiple strains. Once this happens, tracking them all down to modify the vaccines can be a challenge. In other words, the vaccine efficacy is entirely dependent on how long the current strains remain unchallenged. As soon as new strains emerge, this whole situation starts all over again.
That’s an example of critical thinking, not the test that began this article. The test example above doesn’t actually detect your ability to reason. It tests your ability to take tests. It’s one of the fundamental problems with academia. Until universities wake up to this fundamental disparity, they remain status quo, offering an alternative universe detached from reality. Universities need to wake up to the realities of the world and learn to teach real world experience. Right now, the best universities offer is book knowledge which, unfortunately, may offer less than 20% usability in the real world. For this reason, corporations shy away from hiring recent graduates for critical business roles… which makes recent graduate employment all the more difficult. Graduates may wonder why. Well, now you have your answer. Only real world business experience offers businesses the safety net they need to know the individual understands how to operate in a corporate culture and do the assigned job to the satisfaction of the corporation’s leadership team.
A Final Word to College Graduates
A recent college graduate has little to no corporate experience and, thus, no way of knowing how to manage their time or their job efforts. Time management is never taught in college. The recent grad will eventually learn this, but many businesses want new hires to hit the ground running on day one. Managers don’t want to spend hours and hours training a recent college graduate only to watch them walk away from the job a year later for significantly higher pay. For training reasons, hiring managers typically hire recent graduates for significantly less pay than someone seasoned. Training is costly in both time and money, which is a significant part of the reason for the lower pay. To invest that time and money into a recent college graduate only to have them walk puts managers on edge… and makes them gun-shy to try it again. That doesn’t mean a raise won’t be forthcoming to get you up to market levels. Don’t assume you’re stuck at the pay rate where you are. However, many graduates are too impatient to wait.
I realize college graduates want higher pay on their first job, but that isn’t the norm. Worse, using a new employer simply to put the company name on your resume for a year is callous and manipulative. It may also hurt your future job prospects. If, as a new graduate, you commit to a job, stick with it for at least a couple of years. Don’t use the company as a stepping stone for resume experience and discard it like an empty bottle. Sticking with the job increases your marketability for new jobs and increases your chances for much better pay opportunities. Walking away too soon will be frowned upon by hiring managers. Hiring managers will see your itinerant nature as a problem… particularly if you’ve left the job in a year or less after graduation. They may even suspect there’s a deeper problem with hiring you. Be careful with your first job as it sets the tone for all future jobs.
Again, this is critical thinking and reasoning skills at work. You must think through all aspects of the hiring processes to understand how and why what you do and how you treat your employer can help or hurt your future career. Learn to use all of your critical thinking skills to think through every situation. Critical thinking is a skill that’s difficult to master, but it is a life skill that will greatly aid you in many different ways throughout your life.
Apple’s newest MacBook: Simply Unsatisfying
It’s not a MacBook Air. It’s not a MacBook Pro. It’s simply being called the MacBook. Clever name for a computer, eh? It’s not like we haven’t seen this brand before. What’s the real trouble with this system? A single USB-C connector. Let’s explore.
Simplifying Things
There’s an art to simplification, but it seems Apple has lost its ability to rationally understand this fundamental concept. Jobs got it. Oh man, did Jobs get the concept of simplification in spades. Granted, not all of Jobs’s meddling in simplification worked. Take, for example, a computer with only a mouse and no keyboard. Great concept, but you really don’t want to enter text through an on-screen keyboard. This is the reason the iPad is so problematic for anything other than one-liners… at least, unless there’s some kind of audio dictation system. At the time, the Macintosh didn’t have such a system. With Siri, however, we do. That said, I’m not necessarily endorsing that Apple bring back the concept of a keyboard-less computer, although, with a slight modification to Siri’s dictation capabilities, it would be possible.
Instead, the new MacBook has taken things away from the case design. More specifically, it has replaced all of those, you know, clunky, annoying and confusing USB 3.0 and Thunderbolt port connectors that mar the case experience. Apple’s engineers have now taken this old and clunky experience and ‘simplified’ it down to exactly one USB-C port (excluding the headphone jack… and why do we even need this jack again?).
The big question, “Is this really simplification?”
New Case Design
Instead of the full complement of ports we previously had, such as the clever magsafe power port, one or two Thunderbolt ports, two USB 3.0 ports and an SD card slot, now we have exactly one USB-C port. And, it’s not even a well known or widely used port style yet.
Smart. Adopt a port that literally no one is using and then center your entire computer’s universe around this untried technology. It’s a bold if not risky maneuver for Apple. No one has ever said Apple isn’t up for risky business ideas. It’s just odd that they centered it on an open standard rather than something custom designed by Apple. Let’s hope that Apple has massively tested plugging and unplugging this connector. If it breaks, you better hope your AppleCare service is active. And since the unplugging and plugging activity falls under wear-and-tear, it might not even be covered. Expect to spend more time at the Genius bar arguing over whether your computer is covered when this port breaks. On the other hand, we know the magsafe connector is almost impossible to break. How about this unknown USB-C connector? Does it also have the same functional lifespan? My guess is no.
I also understand that the USB-C technology automatically inherits the 10 Gbps bandwidth standard and has a no-confusion-plug-in-either-way connector style. But, it’s not as if Thunderbolt didn’t already offer the same transfer speed, though not the plug-in-either-way cable. So, I’m guessing that this means Thunderbolt is officially dead?
What about the Lightning cable? Apple recently designed and introduced the Lightning connector for charging and data transfer. Why not use the Lightning connector by adding on a faster data transfer standard? Apple spent all this time and effort on this cool new cable for charging and data transfer, but what the hell? Let’s just abandon that too and go with USB-C? Is it all about throwing out the baby with the bathwater over at Apple?
I guess the fundamental question is… really, how important is this plug-in-either-way connector? Is Apple insinuating that the general public is so dumb that it can’t figure out how to plug in a cable? Yes, trying to get microUSB connectors inserted in the dark (because they only go in one direction) can be a hassle. But the real problem isn’t that it’s a hassle, the real problem is that the connector itself was engineered all wrong. Trying to fit a microUSB cable into a port is only a problem because it’s metal on metal. Even when you do manage to get it lined up in the right direction, it sometimes still won’t go in. That’s just a fundamental flaw in the port connector design. It has nothing to do with its directionality. I digress.
Fundamentally, the importance of a plug-in-either-way cable should be the lowest item on the agenda. The highest should be simplifying to give a better overall user experience, not hobbling the computer to the point of being unnecessarily problematic.
Simply Unsatisfying
Let’s get into the meat of this whole USB-C deal. While the case now looks sleek and minimal, it doesn’t really simplify the user experience. It merely changes it. It’s basically a shell game. It moves the ball from one cup to another, but fundamentally doesn’t change the ball itself. So, instead of carrying only a power adapter and the computer, you are now being forced to carry a computer, a power adapter and a dock. I fail to see how this simplifies the user experience at all. I left docks behind when I walked away from using Dell notebooks. Now we’re being asked to use a dock again by, of all companies, Apple?
The point of making changes in any hardware (or software) design is to improve the usability and user experience. Changing the case to offer a single USB-C port doesn’t enhance the usability or user experience. This is merely a cost cutting measure by Apple. Apple no longer needs to pay for all of these arguably ‘extra’ (and costly) ports on the case. Removing all of those ‘extraneous’ ports now means less cost for the motherboard and die-cuts on the case, but at the expense of the user, who must carry around more things to support the computer. That doesn’t simplify anything for the user. It also burdens the user by forcing them to pay more money for things that were previously included in the system itself. Not to mention requiring the user to carry around yet more dongles. I’ve never known Apple to foist less of an experience on the user as a simultaneous cost cutting and accessory money making measure. This is most definitely a first for Apple, but not a first for which they want to become known. Is Apple now taking pages from Dell’s playbook?
Instead of walking out of the store with a computer ready in hand, now you have to immediately run to the accessory aisle and spend another $100-200 (or more) on these ‘extras’. Extras, I might add, that were previously included in the cost of the previous gen computers. But now, they cost extra. So, that formerly $999 computer that already had everything you needed will now cost you $1100-1200 or more (once you consider you now need a bag to carry all of these extras).
Apple’s Backward Thinking?
I’m sure Apple is thinking that eventually that’s all we’ll need. No more SD cards, no more Thunderbolt devices, no more USB 3 connectors. We just do everything wirelessly. After all, you have the (ahem) Apple TV for a wireless remote display (which would be great if only that technology didn’t suck so badly on latency and suffer from horrible MPEG artifacting because the bit rate is too low).
Apple likes to think they are thinking about the future. But, by the time the future arrives, what they have chosen is already outdated because, as it turns out, no one is actually using that technology other than them. So, they have to resort to a new connector design or a new industry standard because no other computers have adopted what Apple is pushing.
For example, Thunderbolt is a tremendous idea. By today, this port should have been widely used and widely supported, yet it isn’t. There are few hard drives that use it. There are few peripherals that support it. Other than Apple’s use of this port to drive extra displays, that’s about the extent of how this port is used. It’s effectively a dead port on the computer. Worse, just about the time when Thunderbolt might actually be picking up steam, Apple dumps it in favor of USB-C, which offers the same transfer speeds. At best, a lateral move, technologically speaking. If this port had offered 100 Gbps, I might not have even written this article.
Early Adopter Pain
What this all means is that those users who buy into this new USB-C only computer (I intentionally ignore the headphone jack because it’s still pointless) will suffer early adopter pains. Not only will you be almost immediately tied to buying Apple gear, Apple has likely set up the USB-C connector to require licensed and ID’d cables and peripherals. This means that if you buy a third party unlicensed cable or device, Apple is likely to prevent it from working, just as they did with unlicensed Lightning cables on iOS.
This also means that, for at least 1-2 years, you’re at the mercy of Apple to provide you with that dongle. If you need VGA and there’s no dongle, you’re outta luck. If you need a 10/100 network adapter, outta luck. This means that until or unless a specific situational adapter becomes available, you’re stuck. Expect some level of pain when you buy into this computer.
Single Port
In addition to all of the above, let’s just fundamentally understand what a single port means. If you have your power brick plugged in, that’s it. You can’t plug anything else in. Oh, you need to run 2 monitors, read from an SD card, plug in an external hard drive and charge your computer? Good luck with that. That is, unless you buy a dock that offers all of these ports.
It’s a single port being used for everything. That means it has a single 10 Gbps path into the computer. So, if you plug in a hard drive that consumes 5 Gbps and a 4k monitor that consumes 2 Gbps, you’re already topping out that connector’s entire bandwidth into the computer. Or, what if you need a 10 Gbps Ethernet cable? Well, that pretty much consumes the entire bandwidth on this single USB-C connector. Good luck with trying to run a hard drive and monitor with that setup.
Where an older MacBook Air or Pro had two 5 Gbps USB3 ports and one or two 10 Gbps Thunderbolt ports (offering greater than 10 Gbps paths into the computer), the new MacBook only supports a max of 10 Gbps input rate over that single port. Not exactly the best trade off for performance. Of course, the reality is that the current Apple motherboards may not actually be capable of handling 30 Gbps input rate, but it was at least there to try. Though, I would expect that motherboard to handle an input rate greater than 10.
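To put rough numbers on it, here’s a quick back-of-the-envelope comparison using the figures from the examples above. The per-device loads are the same illustrative ones already mentioned; real-world throughput will vary:

```python
# Back-of-the-envelope bandwidth budgets using the figures discussed above.
# Per-device loads are illustrative; real-world throughput will vary.

old_macbook_gbps = 2 * 5 + 2 * 10  # two USB 3 ports + two Thunderbolt ports = 30
new_macbook_gbps = 1 * 10          # one USB-C port = 10

workload_gbps = {
    "external hard drive": 5,
    "4K monitor": 2,
    "10 Gbps Ethernet": 10,
}

demand = sum(workload_gbps.values())  # 17 Gbps of combined demand
print(f"demand={demand} Gbps, old={old_macbook_gbps} Gbps, new={new_macbook_gbps} Gbps")
# The old layout had headroom to spare; the new single port can't even carry
# the Ethernet link by itself once anything else is plugged in.
```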
With the new MacBook, you are firmly stuck to a maximum input speed of 10 Gbps because it is a single port. Again, an inconvenience to the user. Apple once again makes the assumption that 10 Gbps is perfectly fine for all use cases. I’m guessing that Apple hopes the users simply won’t notice. Technologically, this is a step backward, not forward.
Overall
Among the early adopter problems and the relevancy problems that USB-C has to overcome, this computer now offers a more convoluted user experience. Additionally, instead of offering something that would be truly more useful and enhance the usability, such as a touch screen paired with an exclusive Spotlight mode, Apple opted to take this computer in a questionable direction.
Sure, the case colors are cool and the idea of a single port is intriguing, but it’s only when you delve deep into the usefulness of this single port that the design quickly unravels.
Apple needs a whole lot of help in this department. I’m quite sure that, had Jobs been alive, while he might have introduced the simplified case design, it would have been overshadowed by the computer’s feature set (i.e., touch screen, better input device, better dictation, etc). Instead of trying to wow people with a single USB-C port (which offers more befuddlement than wow), Apple should have fundamentally improved the actual usability of this computer by enhancing the integration between the OS and the hardware.
The case design doesn’t ultimately matter much; the usability of the computer itself matters. Until Apple understands that we don’t really care what the case looks like as long as it provides what we need to compute without added hassles, weight and costs, Apple’s designers will continue running off on these tangents, spending useless cycles attempting to redesign minimalist cases that really don’t benefit from it. At the least, Apple needs to understand that there is a point of diminishing returns when trying to rethink minimalist designs… and with this MacBook design, the Apple designers have gone well beyond that point.