Those of us who attended college in the late ’80s or early ’90s may still remember Gopher, Archie, and Veronica. You may also have had more than one paper returned by a professor who did not accept the use of online sources – I know I did.
In the early days of the World Wide Web, online information sources were a great place to visit, but they were rarely trusted. Nowadays, with everyone from the Wall Street Journal to the BBC actively publishing, the cloud of skepticism that used to surround the internet has greatly dissipated. Instead, particularly for Millennials who have never known a world without Wikipedia, it has largely been replaced by unwavering trust. The Internet has become our first (and often only) stop for information. We rely on Google and Twitter to give us answers to everything!
Yet despite that trust, there is still no guarantee that the top three results of your Google search will provide you with honest and accurate information. It seems that Google is attempting to change that.
According to a February 28th article in New Scientist, a Google research team is testing a new way to establish PageRank. Rather than depending heavily on the number of inbound links (i.e., online reputation), the new algorithm would count how many ‘incorrect facts’ appear on each of your site’s pages.
The lower the count, the more trustworthy the page – and thereby the higher your PageRank.
How does Google determine if a ‘fact’ is accurate or not?
Simply put, Google is able to use its enormous digital muscles to compare material across the web.
If the vast majority of sites agree on something, it would then be considered “…a reasonable proxy for truth…” On the other hand, if your “fact” appears to contradict the majority, it will hurt your ranking.
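The article does not publish Google’s actual method, but the majority-vote idea described above can be sketched in a few lines. This is a toy illustration only – the data, function names, and scoring scheme are all invented for the example, not drawn from Google’s research:

```python
from collections import Counter

def consensus_value(claims):
    """Return the value most sites agree on for a fact (the majority as a proxy for truth)."""
    return Counter(claims).most_common(1)[0][0]

def trust_score(page_facts, web_claims):
    """Fraction of a page's stated facts that match the web-wide consensus.

    page_facts: {fact_name: value stated on this page}
    web_claims: {fact_name: [values stated across many sites]}
    """
    if not page_facts:
        return 0.0
    agree = sum(
        1 for fact, value in page_facts.items()
        if value == consensus_value(web_claims.get(fact, [value]))
    )
    return agree / len(page_facts)

# Toy data: most sites say the pizza joint is on Main St; one page still lists Oak Ave.
web_claims = {"address": ["Main St", "Main St", "Main St", "Oak Ave"]}
print(trust_score({"address": "Main St"}, web_claims))  # 1.0 - agrees with the majority
print(trust_score({"address": "Oak Ave"}, web_claims))  # 0.0 - contradicts the majority
```

A page whose facts line up with the consensus scores high; a page that contradicts the majority scores low – which is exactly why the edge cases discussed below matter.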
So what does this mean for most businesses? For now… Nothing.
A launch date for this paradigm shift in search has not been released, so I do not recommend abandoning your current SEO strategy. In fact, Google has yet to confirm whether it will ever implement the change. And if it does, the change will likely come as a tweak or an additional factor in the algorithm, not a complete replacement.
How can we argue that?
Because of the nature of the Internet, a purely “Trust”-based ranking criterion would not be practical. Take the following two examples:
Let’s say you run a thriving pizza joint with hundreds of reviews on Yelp. What happens when you move your business to a new address or open a new location? If the search engine were strictly based on “Trust,” every review that included your old location would lower your PageRank. This could be devastating to the business.
Moving beyond the simple, imagine the impact on sites that cover controversial topics such as politics, religion, etc. When was the last time you saw two political sites actually agree on a single “fact” across the political aisle?
What do you think about the new “trust” factor for search ranking? Share your comments below. Also, if you are looking to build your digital media presence, contact us for our affordable social media solutions.