The never-ending cat-and-mouse game of gaining ground in Google's search results is about to take another abrupt turn. According to Google's Matt Cutts, the company is working on a new set of tweaks to the fabled "GoogleBot" that will penalize sites that over-optimize for prime Google results.
Search Engine Land's Barry Schwartz reports that Cutts let the impending tweaks slip while speaking on a panel at this year's South by Southwest conference in Austin, Texas. The goal, said Cutts, is to "level the playing field" between sites that chase strong Google results through excessive optimization and sites that rank naturally on the strength of relevant content.
"We try to make the GoogleBot smarter, try to make our relevance more adaptive, so that if people don't so SEO we handle that. And we are also looking at the people who abuse it, who put too many keywords on a page, exchange way too many links, or whatever else they are doing to go beyond what you normally expect. We have several engineers on my team working on this right now," Cutts said.
Cutts added that the changes could land anywhere from a few weeks to a month from now. Google hasn't gone on record with any additional details as to what the tweaks might include – fair enough, since details about how the GoogleBot will rank sites could assist those looking to re-game the rankings for their benefit. It's also unclear how Google plans to "penalize" sites that over-optimize, or whether website operators will have any way to determine if they fall on the wrong side of the threshold.
According to a recent report by the Wall Street Journal, Google is working on integrating "semantic search" technology into its primary search system. This would give Google the ability to better answer users' questions and search queries immediately, rather than merely point them to a list of websites that might contain the answers they seek.
In the Journal's example, a user could search for the phrase "Lake Tahoe" and receive a list of facts Google has already parsed about the lake, such as its location, altitude, and average temperature, instead of just a list of sites that could provide this information were the user to dig deeper. Google's semantic search will be powered by a database that currently contains more than 200 million "interconnected entities and attributes," said Google senior vice president Amit Singhal.
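A database of "interconnected entities and attributes" is, in effect, a knowledge graph mapping entities to facts about them and to related entities. Google hasn't published its schema, so the minimal Python sketch below only illustrates the lookup pattern the Journal describes; the entity, attributes, and values are stand-ins.

# Minimal sketch of an entity-attribute store; the schema and values
# here are illustrative assumptions, not Google's actual data.
knowledge_graph = {
    "Lake Tahoe": {
        "type": "lake",
        "location": "Sierra Nevada, California/Nevada, USA",
        "surface_elevation_m": 1897,
        "related": ["Sierra Nevada", "Truckee River"],  # the "interconnected" part
    },
}

def answer(query: str):
    """Return the parsed facts for an entity instead of a list of links."""
    return knowledge_graph.get(query)

print(answer("Lake Tahoe"))  # facts directly, rather than ten blue links

The design difference is the one the Journal highlights: the answer is assembled from structured facts at query time, rather than deferred to whichever web pages happen to contain them.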