Over the past few months the SEO blogging community has been full of commentary on Google’s implementation of Knowledge Graph data within the SERPs. The aim is to provide not just links but answers to search queries; to create a more ‘human’ search experience, surfacing data that connects different elements relating to the search even when they were not explicitly specified in the query.
Here’s a screenshot of one I found. However, like most of Google’s big rollouts, it is currently only appearing in the US SERPs.
Google has also recently introduced lists and collections, which display related topics at the top of the SERPs as well as at the side.
What this does is offer more information beyond the keywords I queried. I am given the opportunity to explore other areas around the topic and find out more, whilst still on the search page.
This is both an exciting and significant change within the industry and is set to revolutionise the way in which people search, browse, learn and indeed use the web as a whole.
Such a shift is neither new nor unexpected, however; the semantic web has been part of search engine strategy for a long time. Looking back at the growth and cultural shift that has allowed Google to return data in this way is a valuable indication of where it plans to take these changes in the future.
In the beginning, there was HTML
HTML has always allowed for grammatical, contextual structure within code, offering different tags for different meanings. As the web at large evolved, however, much of HTML’s semantic power was lost: early design restrictions meant that HTML was broken up and bent to presentational purposes rather than used to create semantically structured, meaningful code (for a really thorough and digestible explanation, see What’s the big deal about Semantic HTML).
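To make the distinction concrete, here is a minimal, illustrative sketch (the content is invented for the example) contrasting presentational markup with semantically structured markup:

```html
<!-- Presentational markup: the tags say nothing about what the content is -->
<font size="5"><b>Product Review</b></font>
<br>
<i>Great battery life and a bright screen.</i>

<!-- Semantic markup: the tags themselves describe the content's meaning -->
<article>
  <h1>Product Review</h1>
  <p><em>Great battery life and a bright screen.</em></p>
</article>
```

Both versions can be styled to look identical, but only the second tells a machine (or a search engine) that this is an article with a heading and a paragraph of body text.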
Things improved, however, and over the past decade design and development have evolved, allowing code to be more cleanly separated from presentation. Yet this fast evolution meant that writing semantically structured code never really caught on, and in order to bring semantic markup into the mainstream, overarching development trends and culture had to change.
Enter RDFa, Microformats and Rich Snippets
RDFa, launched in 2004, and Microformats.org, launched in 2005, offered a cohesive new way of thinking about data and a universal way of marking up HTML; they were “an attempt to get everyone to change their behaviour and rewrite their tools”.
However, it was not until the implementation of structured markup was given a clearly visible, tangible commercial value that technologists as a whole began to really pay attention.
The introduction of Rich Snippets, a way of displaying relevant data within the SERPs and providing more information before the user clicks through, meant that all of a sudden the almost purist idea of a structured, semantic web became a reality. Rich Snippets were regularly reported to improve CTR and traffic levels, so suddenly the implementation of semantic HTML became all the more attractive to SEOs, business owners, marketers and people interested in web commerce the world over. Essentially, Google’s global presence made semantics saleable.
In June 2011, as a means to simplify structured markup and encourage webmasters to add it to their sites, Bing, Yahoo! and Google came together to launch schema.org, an extensive shared markup vocabulary.
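As an illustration of what that shared vocabulary looks like in practice, here is a small, hypothetical review marked up with schema.org microdata (the product name, author and rating are invented for the example):

```html
<div itemscope itemtype="http://schema.org/Review">
  <!-- The review's title and author, labelled with schema.org properties -->
  <span itemprop="name">Super-Bright Torch Review</span>
  by <span itemprop="author">Jane Smith</span>
  <!-- A nested Rating item, which search engines can render as review stars -->
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```

The `itemscope`, `itemtype` and `itemprop` attributes are invisible to visitors, but they let any search engine that supports the shared vocabulary understand exactly what each piece of text means.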
“At Google, we’ve supported structured markup for a couple years now. We introduced rich snippets in 2009 to better represent search results describing people or containing reviews. Adoption by the webmaster community has grown rapidly, [and] we want to continue making the open web richer and more useful […] That’s why we’ve come together with other search engines to support a common set of schemas.” - Google Webmaster Blog.
Structured data is not only pervading onsite campaigns, but is set to change offsite activity as well. In 2011 Matt Cutts announced that Google had begun an initiative to attribute content to its authors using the rel=author tag, allowing bloggers’ beaming faces to peer out at you directly from the SERPs.
Connected to a Google+ account, rel=author allows ‘authors’ to build authority within a certain sphere, so rather than solely guest posting on particular sites, it became equally authoritative to have an influential author blog on your site. (This topsy-turvy way of generating social metrics and links is an interesting topic in itself, which you can read an excellent article about on SEOmoz here.)
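The markup itself is a single link. A sketch of the pattern, with a placeholder profile ID standing in for a real Google+ profile URL:

```html
<!-- On the article page: a byline link to the author's Google+ profile.
     The profile ID here is a placeholder, not a real account. -->
<a rel="author" href="https://plus.google.com/112233445566778899">Jane Smith</a>
```

For the attribution to appear, the Google+ profile also needs to link back to the site from its “Contributor to” section, confirming the relationship in both directions.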
On July 31st this year, Google announced the Structured Data Dashboard, a useful and exciting Webmaster Tools addition which displays the structured data found on a website at site, item-type and page level.
What Happens Next?
The evolution of the semantic web to where it stands today indicates one thing: that search engines have been the primary driving force behind mass implementation, and the intention is to push it far further still. Last year Google’s search lead, Amit Singhal, gave an indication of where this semantic makeover is likely to take Google in the future.
“I have a ‘to-do’ list on my phone which contains tasks like ‘pick up a gift for my father’. The phone has a GPS system and knows where I am. It also contains my calendar, so it knows when I am free. Why shouldn’t a search engine be able to sync up all that information and tell me when I am near a shop which has a gift in it that my dad would like, at a time when I am free?”
It is much harder to optimise for a user’s intent, or to predict what they might be looking for, than for the keywords they actually searched.
To prepare your site for this shift as it stands, I would recommend: implementing a robust content strategy which identifies all potential customers and answers all potential questions; reviewing your site and carefully considering which areas may benefit from semantic markup (either within search or for users); and ramping up social activity to increase potential visibility within personalised search.
A common apprehension of this shift is that a lot of the information a user wants is provided within the SERPs, and this therefore might have a negative impact on website traffic. After all, sites are now not only competing with their competitors for traffic, but with search engines, too. Alongside all of this, the change is likely to increase advertising spend on biddable media, as organic results become more personalised and take up less of the above-the-fold search space.
Blending your campaigns is becoming ever more crucial, and an integrated campaign across PPC, SEO, Display, Social, Affiliate and Content Creation is always likely to deliver the best results, whatever the search landscape may be.