Random Thoughts – Randocity!

Why Rotten Tomatoes is rotten

Posted in botch, business, california by commorancy on December 31, 2019

When you visit a site like Rotten Tomatoes to get information about a film, you need to ask yourself one very important question, “Is Rotten Tomatoes trustworthy?”

Rotten Tomatoes as a movie review service has come under fire many times for review bombing and manipulation. That is, Rotten Tomatoes seems to allow shills to join the service to review bomb a movie, raising or lowering its various scores by manipulating the Rotten Tomatoes review system. In the past, these claims couldn’t be verified. Today, they can.

With a change made in May 2019, Rotten Tomatoes has made it exceedingly easy for both movie studios and Rotten Tomatoes itself to game and manipulate the “Audience Score” ratings. Let’s explore.

Rotten Tomatoes as a Service

Originally, Rotten Tomatoes began its life as an independent movie review service in which both critics and audience members could have a voice in what they think of a film. So long as Rotten Tomatoes remained independent and separate from movie studio influence and corruption, it could make that claim. Its reviews were fair and, for the most part, accurate.

Unfortunately, all good things must come to an end. In February of 2016, Fandango purchased Rotten Tomatoes. Let’s understand the ramifications of this purchase. Because Fandango is majority owned by Comcast, with Warner Brothers also holding an ownership stake, this firmly plants Rotten Tomatoes well outside the possibility of remaining neutral in film reviews. Keep in mind that Comcast also owns NBC as well as Universal Studios.

Fandango doesn’t own a stake in Disney as far as I can tell, but that won’t matter based on what I describe next about the Rotten Tomatoes review system.

Review Bombing

As stated in the opening, Rotten Tomatoes has come under fire over several notable recent movies whose scores appear to have been manipulated. Rotten Tomatoes has later debunked those claims by stating that its system was not manipulated, while offering no real proof of that fact. We simply have to take them at their word. One of these allegedly review bombed films was Star Wars: The Last Jedi… where the scores inexplicably dropped dramatically over about a one month period. Rotten Tomatoes apparently validated the drop as “legitimate”.

Unfortunately, Rotten Tomatoes has become a bit more untrustworthy as of late. Let’s understand why.

In May of 2019, Rotten Tomatoes introduced a new feature known as “verified reviews”. For a review’s score to be counted towards the “Audience Score”, the reviewer must have purchased a ticket from a verifiable source. Unfortunately, the only source from which Rotten Tomatoes can verify ticket purchases is its parent company, Fandango. All other ticket purchases don’t count… thus, if you choose to review a film after purchasing your ticket from the theater’s box office, from MovieTickets.com or via any other means, your review won’t count as “verified”. Only Fandango ticket purchases count towards “verified” reviews, thus altering the audience score. This change is BAD. Very, very bad.

Here’s what Rotten Tomatoes has to say from the linked article just above:

Rotten Tomatoes now features an Audience Score made up of ratings from users we’ve confirmed bought tickets to the movie – we’re calling them “Verified Ratings.” We’re also tagging written reviews from users we can confirm purchased tickets to a movie as “Verified” reviews.

While this might sound like a great idea in theory, it’s ripe for manipulation problems. Fandango also states that “IF” it can determine that “other” reviews come from confirmed ticket purchases, it will mark them as “verified”. Yeah, but that’s a manual process and impossibly difficult to determine. We can pretty much forget that this option even exists. Let’s list the problems coming out of this change:

  1. Fandango only sells a small percentage of overall tickets for a film. If the “Audience Score” is calculated solely from Fandango ticket sales, then it is a horribly inaccurate metric to rely on.
  2. Fandango CAN handpick “other” non-Fandango ticket purchased reviews to be included. Not likely to happen often, but this also means it can pick its favorite (and favorable) reviews to include. This opens Rotten Tomatoes up to Payola or “pay for inclusion”.
  3. By specifying exactly how this process works, this change opens the Rotten Tomatoes system to being gamed and manipulated, even by Rotten Tomatoes staff themselves. Movie studios can also ask their employees, families and friends to purchase their tickets exclusively from Fandango and request these same people to write “glowing, positive reviews” or submit “high ratings” or face job consequences. Studios might even be willing to pay for these positive reviews.
  4. Studios can even hire outside people (sometimes known as shills) to go see a movie by buying tickets from Fandango and then rate their films highly… because they were paid to do so. As I said, manipulation.

Trust in Reviews

It’s clear that while Rotten Tomatoes is trying to fix its ills, it is incredibly naive at it. It gets worse. Not only is Rotten Tomatoes incredibly naive, this company is also not at all tech savvy. Its system is so ripe for being gamed that the “Audience Score” is a nearly pointless metric. For example, 38,000 verified ratings out of the millions of people who watched the film? Yeah, if that “Audience Score” number isn’t now skewed, I don’t know what is.

Case in point: the “Audience Score” for The Rise of Skywalker is 86%. The difficulty with this number is that the vast majority of the reviews I’ve seen from people on chat forums don’t rate the film anywhere close to 86%. What that means is that the new way Rotten Tomatoes calculates scores is effectively a form of manipulation BY Rotten Tomatoes itself.

To have the most fair and accurate metric, ALL reviews must be counted and included in all ratings. You can’t just toss out the vast majority of reviews simply because you can’t verify them as holding a ticket. Even still, holding a ticket doesn’t mean someone has actually watched the film. Buying a ticket and actually attending a showing of the film are two entirely separate things.

While you may have verified a ticket purchase, did you verify that the person actually watched the film? Are you excluding reviews from brand new Rotten Tomatoes accounts from the audience score? How trustworthy can someone be if this is their first and only review on Rotten Tomatoes? What about people who downloaded the app just to buy a ticket for that film? Simply buying a ticket from Fandango doesn’t make the rating or reviewer trustworthy.

Rethinking Rotten Tomatoes

Someone at Rotten Tomatoes needs to drastically reconsider this change and they need to do it fast. If Rotten Tomatoes wasn’t guilty of manipulation of review scores before this late spring change in 2019, they are now. Rotten Tomatoes is definitely guilty of manipulating the “Audience Score” by the sheer lack of reviews covered under this “verified review” change. Nothing can be considered valid when the sampling size is so small as to be useless. Verifying a ticket holder also doesn’t validate a review author’s sincerity, intent or, indeed, legitimacy. It also severely limits who can be counted under their ratings, thus reducing the trustworthiness of “Audience Score”.

In fact, only by looking at past reviews can someone determine if a review author has trustworthy opinions.

Worse, Fandango holds a very small portion of all ticket sales made for theaters (see below). By tabulating the score only (or primarily) from people who bought tickets from Fandango, this change eliminates well over half of the written reviews on Rotten Tomatoes from counting as valid. Worse, because of the way the metric is calculated, nefarious entities can game the system to their own benefit and manipulate the score quickly.

This has a chilling effect on Rotten Tomatoes. The staff at Rotten Tomatoes needs to roll back this change pronto. For Rotten Tomatoes to return to being a trustworthy, neutral entity in the art of movie reviews, it needs a far better way to determine the trustworthiness of its reviews and of its reviewers. Trust comes from well written, consistent reviews. Ratings come from trusted sources. Trust is earned. The sole act of buying a ticket from Fandango doesn’t earn trust. It earns bankroll.

Why then are ticket buyers from Fandango any more trustworthy than people purchasing tickets elsewhere? They aren’t… and here’s where Rotten Tomatoes has failed. Rotten Tomatoes incorrectly assumes that “verifying” a sale of a ticket via Fandango alone somehow makes a review or rating more trustworthy. It doesn’t.

It gets worse because while Fandango represents at least 70% of online ticket sales, online ticketing itself still represents only a tiny fraction of overall ticket sales, at just 5-6% (as of 2012).

“Online ticketing still just represents five to six percent of the box office, so there’s tremendous potential for growth right here.” –TheWrap in 2012

Granted, this TheWrap article is from 2012. Even if Fandango had managed to grab 50% of overall ticket sales in the subsequent 7 years since that article, that would still leave out the remaining 50% of ticket holders’ voices, which would not be tallied into Rotten Tomatoes’s current “Audience Score” metric. I seriously doubt that Fandango has managed to achieve anywhere close to 50% of total movie ticket sales. If it held 5-6% of overall sales in 2012, Fandango might have grown to account for 10-15% at most by 2019. That’s still 85% of all reviews excluded from Rotten Tomatoes’s “Audience Score” metric. In fact, it behooves Fandango to keep this overall ticket sales number as low as possible so as to influence its “Audience Score” number with more ease and precision.

To put this in a little more perspective, a movie theater might have 200 seats. 10% of that is 20. That means that for every 200 people who might fill a theater, roughly 20 people have bought their ticket from Fandango and are eligible for their review to count towards “Audience Score”. Considering that only a small percentage of those 20 will actually take the time to write a review, that could mean that out of every 200 people who’ve seen the film legitimately, between 1 and 5 people might be counted towards the Audience Score. Scaling that up, for every 1 million people who see a blockbuster film, somewhere between 5,000 and 25,000 reviews may contribute to the Rotten Tomatoes “Audience Score”… even if there are hundreds of thousands of reviews on the site.
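
For anyone who wants to run the numbers themselves, here’s a minimal sketch of that back-of-envelope estimate in Python. The 10% Fandango share and the 1-in-200 to 5-in-200 review rates are the rough assumptions from the paragraph above, not measured figures, and the function name is purely illustrative:

```python
# Rough estimate of how many viewers can contribute to the "Audience Score"
# under the verified-review rule. The rates are assumptions taken from the
# paragraph above (about 10% of tickets via Fandango, of which only a small
# fraction ever write a review), not measured figures.

def countable_reviews(viewers, low_rate=1/200, high_rate=5/200):
    """Return the low and high estimates of reviews eligible to count."""
    return int(viewers * low_rate), int(viewers * high_rate)

low, high = countable_reviews(1_000_000)
print(low, high)  # 5000 25000 -- out of a million viewers
```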

The fewer the reviews contributing to that score, the easier it is to manipulate that score by adding just a handful of reviews to the mix… and that’s where Rotten Tomatoes’ “handpicked reviews” come into play (and with them, the potential for Payola). Rotten Tomatoes can then handpick positive reviews for inclusion. The problem is that while Rotten Tomatoes understands all of this, so do the studios. Which means that studios can, like I said above, “invite” employees to buy tickets via Fandango before writing a review on Rotten Tomatoes. They can even contact Rotten Tomatoes and pay for “special treatment”. This situation can allow movie studios to unduly influence the “Audience Score” for a current release… and it is compounded because so few reviews count towards the “Audience Score”.

Where Rotten Tomatoes likely counted every review towards this score before, the new “verified review” methodology greatly reduces the number of reviews which contribute to the tally. This lower number of reviews means that it is now much easier to manipulate the Audience Score number, either by gaming the system or by Rotten Tomatoes handpicking reviews to include.

Fading Trust

While Rotten Tomatoes was once a trustworthy site for movie reviews, it has greatly reduced its trust levels by instituting such backwards and easily manipulable systems.

Whenever you visit a site like Rotten Tomatoes, you must always question everything you see. When you see something like an “Audience Score”, you must question how that number is calculated and what is included in that number. Rotten Tomatoes isn’t forthcoming.

In the case of Rotten Tomatoes, they have drastically reduced the number of reviews included in that metric because of their “verified purchase” mechanism. Unfortunately, the introduction of that mechanism at once destroys Rotten Tomatoes’ trust and trashes the concept of the site.

It Gets Worse

What’s even more of a problem is the following two images:

[Two screenshots of Rotten Tomatoes’ Audience Score panel, captured December 23, 2019]

From the above two images, Rotten Tomatoes claims 37,956 “Verified Ratings”, yet only 3,342 “Verified Audience” reviews. That’s a huge discrepancy. Where are the written reviews for those other 34,614 “Verified” ratings? The Audience Score shouldn’t be calculated solely from a simplistic “rate this movie” tap on a phone. It should be calculated in combination with an author writing a review. Of course, there are also 5,240 audience reviews that didn’t contribute to any score at all on Rotten Tomatoes. Those reviews are just “there”, taking up space.

Single number ratings are pointless without at least some accompanying text to validate them. Worse, we know that these “Verified Ratings” likely have nothing to do with the “Verified Audience” reviews shown in the images above. Even if those 3,342 audience reviews are actually calculated into the “Verified Ratings” (they probably aren’t), that’s still such a limited number compared with the rest of the “Verified Ratings” that the score can be skewed by people who may not have even attended the film.

You can only determine whether someone has actually attended a film by asking them to WRITE even the smallest of reviews. Simply pressing “five stars” in the app without even caring is pointless. It’s possible the ratings weren’t even tabulated correctly via the App. The App itself may even submit star data after a period of time without the owner’s knowledge or consent. The App can even word its rating question in such a way as to manipulate the response in a positive direction. Can we say, “Skewed”?

None of this leads to trust. Without knowing exactly how that data was collected, the method(s) used and how it is presented on the site and in the app, how can you trust any of it? It’s easy to check professional critic reviews because Rotten Tomatoes must cite back to the source of the review. However, with audience metrics, it’s all nebulous and easily falsified… particularly when Rotten Tomatoes is intentionally obtuse and opaque about exactly how it collects this data and how it presents it.

Even still, with over one million people having attended and viewed The Rise of Skywalker, yet Rotten Tomatoes counting just under 38,000 verified ratings, something doesn’t add up. Yeah, Rotten Tomatoes is so very trustworthy (yeah right), particularly after this “verified” change. Maybe it’s time for those Rotten Tomatoes to finally be tossed into the garbage?


What’s wrong with Quora?

Posted in botch, business, california, rant by commorancy on July 28, 2019

You might be asking, “What is Quora?” We’ll get into that soon enough. Let’s explore the problems with Quora.

Questions and Answers

Before we get into Quora, let’s start by talking about Google. Many people seek answers from Google for many different questions. In fact, questions are the number one use for Google. You don’t go to Google to seek answers you already know. You go there to search (or question) things you don’t know. Such questions might include:

  • Where can I buy a toaster?
  • How long do I bake a chicken?
  • How do I make Quesadillas?
  • What’s the value of my 1974 Pontiac T-Bird?

These are full text questions. And yes, Google does support asking questions in long form such as these above. You can also search Google by using short key words, such as “toastmaster toaster” or “pontiac t-bird” (no, you don’t even need to use the proper case).

These short form queries are solely for use at search engines. When seeking answers to long form questions, both Google and other sites can offer responses to your questions. One such site is Quora. Another is Yahoo Answers (a much older platform). Even Google got in on this action with Google Questions and Answers.

Quora

Quora is a recent incarnation of the older Yahoo Answers platform. Even before Yahoo Answers, there was Ask Jeeves. Even Epinions, a product review site (defunct as of 2018), had many answers to many questions. Epinions, in fact, opens a bigger discussion around site closures and content… but that’s a discussion for another article.

The real question (ahem) is whether sites like Yahoo Answers and Quora provide valuable answers or whether they simply usurp Google’s ability to answer questions in more trusted ways. I’m on the fence as to this question’s answer. Let me explain more about Quora to understand why I feel this way.

Quora is a crowdsourced product. By that I mean that both questions and answers are driven by crowds of subscribers, not by Quora staff or, indeed, Quora at all. Unlike Wikipedia, which has many volunteers who constantly proof, correct and improve articles to make Wikipedia a trustworthy information source, Quora offers nothing but the weakest of moderation. In fact, the only moderation Quora offers is removal of answers and banning of accounts.

Quora has no live people reviewing questions and answers for grammar and mechanics, or for trustworthiness. No one questions whether an answer is valid, useful or indeed even correct. Quora doesn’t even require its answer authors to cite sources or in any way validate what they have written. In fact, Quora’s moderation system is so broken that when answer authors do cite sources, their answer might be flagged and removed as ‘spam’. Yes, the very inclusion of web site links can and will cause answers to be marked as spam and removed from the site. Quora’s insane rationale is that if there’s a web link, it must be pointing to a site owned by the answer author and the answer author must be attempting to advertise. This stupid and undermining rationale is applied by bots who neither read the content they review nor understand that the answer author can’t possibly own Wikipedia.com, Amazon.com or eBay.com.

Indeed, Quora’s moderation is so bare bones basic and broken, it undermines Quora’s own trustworthiness so much so that when you read an answer on Quora, you must always question the answer author’s reputation. Even then, because Quora’s verification and reputation system is non-existent, you can never know if the person is who they say they are. But, this is just the tip of the troubles at Quora.

Quora’s Real Problems

Trustworthiness is something every information site must address. It must address it in concrete and useful ways, ways that subscribers can easily grasp. Wikipedia has addressed its trust issues with a fleet of moderators who constantly comb Wikipedia and who question every article and every statement in each article. Even with a fleet of moderators, incorrect information can creep in. Within a day or two, that information will either be corrected or removed. Wikipedia has very stringent rules around the addition and verification of information.

Twitter offers a verification system so that celebrities and people of note can send information to Twitter staff to verify who they say they are. You’ll notice these as little blue check marks by the Twitter subscriber’s name. These check marks validate the person as legitimate and not a fake.

Quora, on the other hand, has no such rules or validation systems at all. In fact, Quora’s terms of service are primarily designed around “behaving nicely”, with no rules around validation of content or of authors. Indeed, Quora offers no terms that address trust or truth of the information provided. Far too many times, authors use Quora as a way of writing fanciful fiction. Worse, Quora does nothing to address this problem. It’s more worried about “spam” links than about whether an answer to a question is valid or trustworthy.

Yet, Quora continually usurps Google’s search by placing its questions (and answers implicitly) at the top of the search results. I question the value in Quora for this. It’s fine if Quora’s answers appear in search towards the bottom of the page, but they should NEVER appear at the number 1 position. This is primarily a Google problem. That Google chooses to promote untrustworthy sites at the top of its search results is something that Google most definitely needs to address. Sure, it is a problem for Quora, but it’s likewise a problem for Google.

Google purports to want to maintain “safety” and “trustworthiness” in its search by not leading you to malicious sites and by, instead, leading you to trustworthy sites. Yet, it plops Quora’s sometimes malicious answers at the top of its search results. Google needs to begin rating sites for trustworthiness and should then push search results to appropriate levels based on that level of trust. Google needs to insist that sites like Quora, which provide consumers with actionable information, must maintain a certain level of trust to maintain high search rankings. Quora’s question results appearing in the top 3 positions of the first page of Google search despite such weak trustworthiness is completely problematic.

Wikipedia strives to make its site trustworthy… that what you read is, indeed, valuable, valid and truthful information. Quora, on the other hand, makes absolutely no effort to ensure its answers are valid, trustworthy or, indeed, even truthful. You could ask Google for the answer to a question. You might see Quora’s results at the top of Google’s results and click it. Google placing such sites in the top 3 positions implies an automatic level of trust. That the sites that appear in the first 3 results are there because they ARE trustworthy. This implicit trust is entirely misplaced. Google doesn’t, in fact, place sites in the top of its search because they are trustworthy. It places them there because of “popularity”.

You simply can’t jump to this “trustworthiness” conclusion when viewing Google search results. The only thing you can glean from a site appearing in Google results is that it is not going to infect your computer with a virus. Otherwise, Google places a site at the top of its rankings whenever its algorithm decides to rank it there. As I said, you should never read any implicit level of trust into sites which appear in the first 3 positions of Google search. Quora proves this out. Quora’s complete lack of trustworthy information means that Google is not, in any way, looking out for your best interests. They are looking out for Quora, not you. Quora’s questions sometimes even rank higher than Wikipedia.

Quora’s Answers

With that said, let’s delve deeper into the problem with Quora’s answers. If you’ve ever written an answer on Quora, then you’ll fully understand what I’m about to say. Quora’s terms of service are, in fact, counter to producing trustworthy answers. Unlike news sites like CNN, The Washington Post and the L.A. Times, where journalistic integrity is the key driving force, Quora ensures none of this. Sure, Quora’s answer editor does offer the ability to insert quotes and references, but doing so can easily get your answer marked as ‘spam’.

In fact, I’ve had 2 or 3 year old Quora answers marked as ‘spam’ and removed from view because of the inclusion of a link to an external and reputable web site. Quora cites violation of terms for this when, in fact, no such violation exists. The author is then required to spend time appealing this “decision”.

Instead, its bots will remove answers from the site based entirely upon reports by users. If a user doesn’t like an answer, they can report it and a Quora review bot will then take the answer down and place it under moderation appeal. There is no manual review by actual Quora staff to check the bot’s work. This work is all done by robots. Robots that can be gamed and sabotaged by irate, irrational, upset users who have a vendetta against other Quorans.

The answer takedowns are never in the interest of trust or of making Quora more trustworthy, but are always in the interest of siding with the reporting user who has a vendetta or is simply insane. Users have even learned that they can game Quora’s robots to have answers removed without valid reasons or, indeed, any reason at all. There are no checks and balances on the moderation robots or takedown requests. Quora receives a report, and the answer is summarily removed.

Unfortunately, this is the tip of a much larger Quora iceberg. Let’s continue.

Which is more important, the question or the answer?

All of the above leads to an even bigger problem. Instead of spending its development time attempting to shore up its level of site trust, Quora spends its time creating questionable programs like the Partner Program. A program that, in one idea, sums up everything wrong with Quora.

What is the Partner Program? I’ll get to that in a moment. What the Partner Program ultimately is to Quora is an albatross. Or, more specifically, it will likely become Quora’s downfall. This program solidifies everything I’ve said above and, simultaneously, illustrates Quora’s lack of understanding of its very own platform. Quora doesn’t “get” why a question and answer platform is important.

Which is more important to Quora? They answered this question (ha, see what I did there?) by making the question more important than the answer.

That’s right. The Partner Program monetarily rewards the people who ask questions, NOT the people who spend the lion’s share of their time writing thoughtful, truthful, trustworthy answers. In effect, Quora has told answer authors that their answers don’t matter. You could write a two sentence answer and it would make no difference. Yes, let’s reward the people who spend 5 minutes writing a 5-10 word sentence… not the people who spend an hour or two crafting trustworthy answers. And this is Quora’s problem in a nutshell.

Worse, it’s not the questions that draw people in to Quora. Yes, the question may be the ‘search terms’, but it’s not why people end up on Quora. The question leads people in, it’s the ANSWER that keeps them there. It’s the answers that people spend their time reading, not the questions.

This is the iceberg that Quora doesn’t get, nor even understand. The questions are stubs. The questions are merely the arrow pointing the way. They are not the end; they are the beginning. The questions are not the reason people visit Quora.

By producing the Partner Program, Quora has flipped the answer authors the proverbial middle finger. If you’re a Quora answer author, you should definitely consider the Partner Program insulting. Quora has effectively told the answer authors, “Your answers are worthless. Only questions have monetary value.” Yes, let’s reward the question writers who’ve spent perhaps less than 5 minutes devising a sentence. Let’s completely ignore the answer authors who have spent sometimes hours or days crafting their words, researching those words for clarity and truthfulness and ensuring trust in each detailed answer.

It’s not the questions that draw people in, Quora staff. People visit Quora for the answers. Without thoughtful answers, there is absolutely no reason to visit Quora.

Indeed, Quora’s thinking is completely backasswards, foolish and clownish. It shows just how much of a clown outfit Quora really is. Seriously, placing value on the questions at the expense of answer authors who spend hours crafting detailed answers is the very definition of clownish. That situation would be analogous to The Washington Post or The New York Times valuing and paying readers to leave comments while asking their journalists to spend their own time and money writing and researching articles, only to give the articles to the newspaper for free. How many journalists would have ever become journalists knowing this business model?

Qlowns

Whoever at Quora dreamed up this clownish idea should be summarily walked to the door. Dissing and dismissing the very lifeblood of your site, the actual answer authors, is easily one of the most stupid and insane things I’ve seen a site do.

Not only is the very concept of the Partner Program qlownish, not only does it completely dissuade authors from participating in Quora, not only is it completely backwards thinking, not only does it reward question authors (which honestly makes no sense at all), this program also does nothing to establish trust or, indeed, put forth any journalistic integrity.

Instead, Quora needs to ditch the question Partner Program and fast. It needs to quickly establish a system that not only rewards the best answer authors, but also enforces journalistic integrity on EVERY ANSWER. It needs to implement a validation system to ensure that authors are who they say they are. It needs to make certain that every answer author understands that they are in every real sense a ‘journalist’. And, as a journalist, they should uphold journalistic integrity. That integrity means properly researching sources and properly citing those sources. Yes, it’s a hassle, but it means that Quora’s answers will become trustworthy sources of information.

Right now, the answer authors are mostly random and their answers low quality. In fact, most answers are of such low quality that you simply can’t trust anything found on Quora. Since Quora does not enforce any level of journalistic standards on the answers, there is no way anyone reading Quora should trust what any answer author writes. An answer may seem detailed, but in some cases it is pure fiction. No one at Quora ensures that answers in any way uphold any level of journalistic integrity (there’s that phrase again). It’s an important phrase when you’re writing something that people rely on.

A statement of fact about something that seems questionable needs to be backed by a cited source of reference. Show that at least one other reputable source agrees with your “facts”. That still doesn’t mean the “fact” is true; it’s easy for other reputable sites to be fooled by tricksters. This is why it’s important to cite several reputable sources which agree with your facts. I don’t want to dive deep into the topic of journalistic integrity or what it takes to validate sources, so I’ll leave this one here. This article is about Quora’s inability to uphold journalistic integrity.

Quora’s Backward Thinking

Indeed, the Partner Program’s existence confirms that Quora’s priorities are the opposite of journalistic integrity. Quora’s team values only the questions and the question writers. They do not, in any way, value the journalistic integrity required to write a solid, trustworthy answer. Questions are mere tools. They do not at all imply any level of trust. Here’s another analogy that might make more sense.

A question is simply the key that opens a lock. A key is a tool and nothing more. You pay for the lock and key together. You don’t pay only for a key. Paying for a key without a lock means you don’t value (or indeed even need) a lock. You can’t lock anything with only a key. The two are a pair and they go hand-in-hand. If you lose the key, you can’t open the lock. If you lose the lock, the key has no value. However, it’s easier and cheaper to replace a key than it is to replace the lock. This shows you the value of a ‘key’ alone.

Because Quora chooses to place value only on the key and not on the lock, it has entirely lost the ability to protect Quora’s reputation and credibility. Indeed, Quora’s credibility was already in jeopardy before the Partner Program was even a twinkle in someone’s eye. With the Partner Program, Quora has solidified its lack of credibility. Quora has officially demonstrated that it is committed to valuing and paying only for keys and never paying for locks to go with those keys. That means the locks will be the weakest, flimsiest pieces of junk to ever exist… indeed, the locks won’t even exist.

When you’re trying to secure something, you want the strongest, most durable, most rugged, most secure lock you can afford. You don’t care about the key other than as the means of opening and securing a lock. Sure, you want the key to be durable and rugged, but a key is a key. There’s nothing so magical about a key that you’d be willing to shell out big bucks solely for a key. You always expect a lock and key to go together. You expect to buy both and you expect them both to work as a cohesive whole. If the key fails, the lock is worthless. If the lock is breakable, then the key is worthless. A lock and key are the very definition of a synergistic relationship; both have equal importance to the relationship. However, the lock itself is viewed by most people as the more important piece. Locks, in turn, become unimportant if they can’t secure the belongings they are entrusted to protect. Yes, you need both the key and the lock for the system to function as a whole.

Likewise, Quora needs both the question and answer to function as a cohesive whole. In the synergistic relationship between the question and an answer, neither is more important in this synergy. Of the two, however, like the lock mechanism, the answer is the most important to the end user because it is what imparts the most information to the reader. It is what must be trustworthy. It is what must contain the information needed to answer the question. The question then holds the same functionality as a key. In fact, it is very much considered a key to Google. That’s why they’re called ‘keywords’ or ‘key phrases’. Using the word ‘key’ when in relation to a search engine is intended to be very much synonymous with a real life key you attach to a key ring. A keyword unlocks the data you need.

Valuing both the Lock and Key

Quora needs a rethink. If there’s any value to be placed on the data, both the key and the lock, or more specifically the question and the answer, need to be valued as a cohesive whole. If you value the question, then you must also value the answer(s). This means revenue sharing. The question author would receive a percentage of revenue commensurate with the work involved. Since a question is a sentence that might take 5 minutes to write and requires no trustworthiness at all, the maximum a question author should receive would be no more than 10%. The remaining 90% of the revenue would be issued to the answer authors based on the traffic they drive to the site.

Let’s say that $100 in revenue is driven to that Q&A for the first month. $10 is given to the question asker… always 10% of total revenue. That’s probably a little on the high side, but the question asker did kick the whole process off.

Now, let’s say 3 answers are submitted for the question. Let’s assume all 3 answer authors are participating in the revenue program. The remaining $90 is then spread among the 3 answer authors based on total views. Likes might pump a share up by a small amount. If one answer is fully detailed and receives 2.5k views in 30 days and the remaining two answers receive 500 views each, then the 2.5k-view answer author would receive roughly 72% of the remaining revenue (2.5k + 1k = 3.5k total views, and 2.5k is ~72% of 3.5k). This means this author would receive about 72% of the remaining $90, or roughly $65. The remaining $25 would be split between the other two authors. The more participating authors, the less money to go around per answer. Questions that receive perhaps 200 answers might see only a few dollars of revenue per author.
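
Here’s a minimal sketch of that proposed split in Python, just to make the arithmetic concrete. The 10% question share and the view-proportional answer shares are the assumptions described above, not anything Quora actually implements, and the function name is purely illustrative:

```python
def split_revenue(total, answer_views, question_share=0.10):
    """Proposed split: a fixed cut for the question asker, the remainder
    divided among answer authors in proportion to each answer's views."""
    question_cut = total * question_share
    answer_pool = total - question_cut
    total_views = sum(answer_views)
    answer_cuts = [answer_pool * views / total_views for views in answer_views]
    return question_cut, answer_cuts

# The worked example above: $100 in revenue, one 2,500-view answer
# and two 500-view answers.
question_cut, answer_cuts = split_revenue(100, [2500, 500, 500])
print(question_cut)  # 10.0
print(answer_cuts)   # roughly [64.3, 12.9, 12.9] -- about $65 to the top answer
```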

There must also be some guidelines around answers for this to work. Answer authors must be invited to participate in the program. If the answer author isn’t invited and hasn’t agreed to terms, no revenue is shared. Also, one word, one sentence and off-topic answers disqualify the answer from sharing in revenue. Additionally, to remain in the revenue program, the answer author must agree to write solid, on-topic, properly structured, fully researched and cited answers. If an invited author attempts to game the system by producing inappropriate answers to gain revenue, that author will be disqualified from the program without any further ability to participate. Basically, you risk your involvement in the revenue sharing by attempting to game it.

This math incentivizes not only quality questions, but also quality answers. The better an answer is, the more views it is likely to receive. More views means more revenue. The better and clearer the answer, the more likely the author is not only to be asked to participate in the revenue sharing program, but also to receive a higher share of that revenue. The best answers should always be awarded the highest amounts of revenue possible.

Google vs Quora

As I postulated early in the article, does Quora actually hold any value as a site or does it merely usurp Google’s search results? This is a very good question, one that doesn’t have a definitive answer. For me, Quora’s current answers range from the occasional, rare, very high quality piece to, mostly, junky, worthless answers. This junky aspect of Quora leads me towards Quora being a Google usurper. In other words, most of Quora’s results in Google are trash clogging up the search results. They shouldn’t be there.

Unfortunately, Google returns all results in a search whether high or low quality. Google does offer some limited protection mechanisms to prevent malicious sites from appearing in results. But Google’s definition of the word ‘malicious’ can be different from mine in many cases. Simply because someone can put up a web site with random information doesn’t automatically make that site valuable. Value comes from continually providing high quality information on an ongoing basis… the very definition of professional journalism. Now we’re back to journalistic integrity. We’ve come full circle.

Unfortunately, because of Quora’s lack of insistence on journalistic integrity, I find Quora to be nothing more than a mere novelty… no better than TMZ or the National Enquirer. I’m not saying TMZ doesn’t have journalists. They do. But, a rag is always a rag. Any newspaper dishing dirt on people I always consider the bottom feeders of journalism… the very dreckiest of tabloid journalism. This type of journalism is the kind of trash that has kept the National Enquirer and other tabloids in business for many, many years. It’s sensational journalism at its finest (or worst). Sure, these writers might aspire to be true journalists some day, but they’ll never find reputable journalistic employment dishing dirt on celebrities or fabricating fiction (unless they begin writing fiction novels).

Unfortunately, many of Quora’s answers fall well below even the standards established by the dreckiest of tabloids. The one and only thing tabloids and Quora have in common is fiction. Unfortunately, the fiction on Quora isn’t even that entertaining. It’s occasionally amusing, but most of it is tedious and clichéd at best. Think of the worst movie you’ve watched, then realize that most of these Quora fiction “stories” are even less entertaining than that. There may be a few gems here and there (probably written by professional writers simply exercising their chops on Quora), but most of it is not worth reading.

Worse, the trust level of what’s written is so low (regardless of purported “credentials”) that there’s nothing on Quora worth extending any level of trust to. Reading Quora for sheer entertainment value, perhaps that can be justified a little. Even then, most answers fall way short of having even entertainment value. Even the worst YouTube videos have more entertainment value. Full levels of trust? No way. Quora has in no way earned that.

Seeking Answers

Yes, we all need questions answered, occasionally. We all need to seek advice, occasionally. Yes, I’m even seeking to answer the question, “What’s wrong with Quora?” Of course, don’t expect to read any answers like THIS on Quora. Oh, no no no. Quora is very, very diligent at removing anything it deems to be anti-Quora in sentiment, such as this article. Anyway, if you choose to seek out Quora for this kind of information, Quora’s immediate problems now become your problems. Considering all of the above, Quora is probably one of the worst ways of getting information. Not only can you be easily deceived by an answer author, you can be taken for a ride down Scam Lane. Treat advice from Quora with the same level of skepticism as you would advice from a 6 year old child. I’m not saying there are 6 year old children on Quora, but Quora certainly acts like one. Seeking advice from Quora means you could, in fact, be taking advice from a 13 year old via a Barbie encrusted iPad.

Should I write for Quora?

I’m sure this is the question you are now contemplating after having read this article. This is a question that only you can answer. However, let me leave you with these thoughts. When you write answers for Quora under the current Partner Program, you are doing so for free. Yet, question authors are being paid for YOUR effort, answer and research. You spend the time, THEY get the dime. It’s an entirely unfair arrangement.

To answer this question more definitively… I personally won’t write any future answers for Quora. Quora currently relies on each answer author’s thoughtful, researched answers to make the site a success (and bring in ad dollars). If you do not like this turn of events with the Partner Program, say “NO” and do not write for Quora.

If enough answer authors stop 🛑 writing for Quora, the question writers can’t and won’t be paid. This will have Quora scrambling for a new, fairer equity system. If you are just as disgusted by Quora’s Partner Program as I am, then walk away from Quora and no longer write answers. I have stopped writing answers and will write no further answers for the site until they come to their senses and compensate both question writers and answer authors fairly in a profit sharing arrangement.

