This is the meat and potatoes of Google search, a question search engine optimizers have been trying to answer for years. While it is generally understood how Google’s patented PageRank algorithm generates an SERP, the specifics remain as secret as Coke’s magic formula or the Colonel’s Kentucky Fried recipe. To hear Google tell it, this is to prevent both white and black hats from gaming the system. But though the finer points of PageRank remain mysterious, there is enough information available to construct a reliable guide to what attracts the algorithm’s fancy.
As previously stated, Google does not pull its results from the internet proper but instead from its own vast index. It compiles this index by crawling the web with software programs called “crawlers” or, more whimsically, “spiders.” These programs build a map of the internet based on the web of links that exists between internet pages. The spiders follow links from page to page, multiplying as the links and pages multiply. Years ago, when this process became general knowledge, black hat SEOs and spammers would create giant link farms that looped a few websites back and forth to each other. Google’s algorithm is now much more sophisticated and will actively demote link farms and link wheels to the bottom of its SERPs, or ban them outright (we’ll get more into bad SEO practices and spam in the next section).
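The crawl-and-index process described above can be sketched in miniature. This is a toy illustration only: the `web` dictionary stands in for real pages, and the breadth-first traversal stands in for a spider following links (none of the names come from Google's actual systems).

```python
from collections import deque

# Toy "web": each page maps to the pages it links to (hypothetical names).
web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "post1"],
    "post1": ["blog", "external"],
    "external": [],
}

def crawl(seed, web):
    """Follow links breadth-first from a seed page, recording who links to whom."""
    seen = set()
    inbound = {}  # page -> set of pages that link to it
    queue = deque([seed])
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        for target in web.get(page, []):
            inbound.setdefault(target, set()).add(page)
            queue.append(target)
    return seen, inbound

seen, inbound = crawl("home", web)
```

The `inbound` map is the useful by-product: once a crawler knows who links to whom, those links become the raw material for ranking, which is exactly why link farms tried to manufacture them.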
As links are the essential component to how Google compiles its index, they have a lot of bearing on how it pulls its search results. It should be noted however that Google’s constant updates have given more and less weight to many other factors, a source of frustration and oftentimes panic in the SEO world. But as we’ll learn shortly, the major tenets of SEO have remained virtually unchanged.
So how does Google decide which pages are the best to display in response to a user query? According to Matt Cutts, currently head of Google’s Webspam team, Google asks some questions of its own. These questions are based on the keywords in a given query and include:
- Do these keywords appear in the title of a webpage?
- Do they appear in the URL?
- Are the keywords close to each other?
- Does a page contain known synonyms for the keywords?
- Most importantly, is the page containing the relevant terms from a quality website, or is it from a spammy, or untrusted, site?
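The questions above can be turned into a toy relevance score. Everything here is invented for illustration (the weights, the ten-word proximity window, the page fields); the real algorithm weighs hundreds of signals in ways Google does not disclose.

```python
def score_page(page, keywords):
    """Toy relevance score loosely following the questions above."""
    score = 0
    title = page["title"].lower()
    url = page["url"].lower()
    body = page["body"].lower().split()
    kws = {k.lower() for k in keywords}
    for kw in kws:
        if kw in title:
            score += 3            # keyword appears in the title
        if kw in url:
            score += 2            # keyword appears in the URL
        score += body.count(kw)   # occurrences in the body text
    # Crude proximity bonus: keyword hits clustered within ten words.
    hits = [i for i, w in enumerate(body) if w in kws]
    if len(hits) >= 2 and max(hits) - min(hits) < 10:
        score += 2
    return score
```

A page whose title and URL contain the query terms, and which uses them near each other in the text, outscores a page that barely mentions them, which is the intuition behind the title, URL, and proximity questions.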
Links to and from a page are counted, as is a page’s popularity (how often it is visited and clicked on). Other factors influencing the SERP are the SafeSearch filter, a user’s preferences, and the “freshness” of pulled content. Content is considered fresh if it is new and original to the site being searched.
As you’ve probably guessed, keywords are the magic ingredient here. Indeed, trustworthy links and matching keywords are the two most important factors in building a quality site that Google trusts. But as should come as no surprise, spammers took this idea and ran with it. “Keyword stuffing,” the act of cramming keywords into a webpage as many times as possible, became a common black hat SEO tactic. Keywords were hidden in a site’s pages, rendered in the same color as the background, or inserted into the code, and these tricks generated plenty of false positives in the algorithm’s early days. But as it did with link farms, PageRank has since evolved to distinguish and discard these spam sites.
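Keyword stuffing is detectable precisely because it skews a page’s word distribution. Here is a crude sketch of that idea; the 5% threshold is an arbitrary illustrative value, not anything Google has published, and a real detector would also handle stopwords, stemming, and hidden text.

```python
from collections import Counter

def keyword_density(text):
    """Share of the page's words taken up by each distinct word."""
    words = text.lower().split()
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.items()}

def looks_stuffed(text, threshold=0.05):
    # Toy heuristic: flag pages where any single word dominates the text.
    # Ignores stopwords and hidden markup; illustration only.
    return any(share > threshold for share in keyword_density(text).values())
```

A page that repeats “cheap shoes” every few words trips the check immediately, while naturally varied writing stays under the threshold.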
Today links and keywords still matter, but how they are used is of just as much if not greater importance. For instance, a link from a high-ranking web page (such as a trusted news source or business entity) carries more weight than one from a low-ranking page. Creating a bunch of shell sites just for the purposes of multiplying one’s links will earn a site nothing more than a spam warning. In the same way, it is not just keywords that matter now. How keywords are used on a page, where they appear and how they are dispersed also contributes to a site’s ranking.
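The principle that a link from a high-ranking page counts for more is the core of PageRank as originally published. Below is a minimal power-iteration sketch on a toy link graph; the 0.85 damping factor is the value from the original paper, and this is nothing like Google’s production system.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively pass each page's rank along its outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                # A page's rank is split evenly among the pages it links to.
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# Toy graph: two pages link to "b", so "b" ends up ranked highest.
ranks = pagerank({"a": ["b"], "b": ["a"], "c": ["b"]})
```

Note that “b” outranks “a” even though each receives links, because “b” is endorsed by two pages while “a” is endorsed by one; this is why a single link from a trusted, well-linked site beats dozens from shell sites.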
Understandably then, there is a right and a wrong way to use links and keywords. Not always so understandable is what else puts a site at the top of the SERP. There are a few things:
1) How closely a site’s content matches a long-tail keyphrase.
If a keyword is something small like “panda” or “Jiminy Cricket,” it’s easier to search. Long-tail phrases like “don’t take your guns to town” are often broken up into their component parts (“guns,” “town,” etc.). But since those are the lyrics to a Johnny Cash song, the whole phrase will generate results. Google Instant and the 2013 Hummingbird update are meant to improve Google’s ability to match long-tail keyphrases.
2) How fresh a site’s content is and how frequently it is updated.
This has been true since the beginning: Original, quality content is great for boosting your PageRank. The more targeted content that exists on your site, the more varied the keywords and content Google can search through. Even if you write for a niche audience, regularly adding to this content will ensure repeat visitors and readers. The more visitors to your site, the higher Google will rank your site. The higher Google ranks your site, the more readers will be able to find your site. Be aware however that the emphasis here is on original material. Sites that scrape content from others or stockpile a bunch of low quality bushwa will be penalized as spam.
3) How popular you are.
No, things haven’t changed much since your high school prom. Simply put, the more people visit your site, the higher your rankings will rise. For a plucky little website just starting in the business this may seem unfair, but remember that Google’s job is to select the most trusted and relevant pages in answer to users’ queries. Established brands have much more recognition and, since Google’s “Vince” update, are actually favored by the search engine. However, even a small business can achieve top rankings in Google by following appropriate SEO tactics, homing in on keyphrases, adding quality content and being recognized for its service.
What about the Ads at the Top of the SERPs?
The links that appear at the top of a Google results page are paid advertisements, which Google highlights in yellow to distinguish them from its organic results. The benefit to these ads is that a business will appear at the top of an SERP whenever a user searches a keyword that business is paying for.
The negative aspect to these ads is twofold: First, users tend to put more stock in links that appear naturally. GroupM UK and Nielsen conducted a study in June 2011 that encompassed 28 million UK users and 1.4 billion searches. The results were overwhelmingly in favor of organic results, with 94% clicking on organic rather than paid links. The second negative aspect to paid ads is their impermanence. Once the payments stop, the link disappears. Organic search engine optimization that is done professionally and appropriately lasts as long as the SEO is maintained.