Data. Big data. SEO professionals are surrounded by ever-increasing volumes of information, fuelled by our hunger to consume vast quantities of it and by the technical ability to process and store that data.
But with so much information at our disposal and a bewildering choice of tools, platforms and services required to access the data, making sense of it all can be a struggle.
The more data you have, the better informed your decisions can be. Marketing managers should rely on data rather than gut instinct. Yet when you are drowning in data, its potential all too often goes unfulfilled because the scale simply becomes overwhelming.
In this blog, Sam Osborne and I take a look at practical methods to apply a big data mentality to your SEO strategy. It’s one thing to have all the information; it’s another to interpret it and use it to your commercial advantage.
Millions of keywords: finding your search terms
Historically, SEO managers have focused on a small set of keywords, measuring activities against the rankings of that handful of search queries. Some SEO companies still do this.
The problem with this approach is that your target customers use thousands of search queries. You need to look at all the terms relevant to your target audience; a bigger set of keyword data gives you the broader picture.
Google is the original big data source, providing invaluable information about the phrases relevant to your business and their potential.
Tools such as Google AdWords Keyword Planner and SEMRush provide a wealth of query data, allowing you to filter by particular themes, estimated search volumes and the level of competition for phrases.
The big keywords bubble to the top of the filters, but you need to apply smart analysis to these results. You’re not only looking for the queries with high volume, you also need to choose queries that offer a realistic opportunity for gaining visibility in search and traffic to your site.
Understanding the intent of a user’s search query helps interpret large volumes of data and deliver meaningful insight which can be applied to your website, whether it’s the technical optimisation or the content you write for a particular page.
For example, you can refine the data for solution-led phrases (how / what / why, etc.) and practical ‘tips’, ‘guides’ or ‘advice’ queries. This is an effective way of identifying opportunities to educate your audience and build trust and industry authority.
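This kind of intent refinement can be sketched in a few lines of code. The keyword list and intent markers below are purely illustrative, not real client data:

```python
# Flag queries whose wording suggests the searcher wants to learn rather than buy.
# Marker words follow the 'how / what / why / tips / guide / advice' pattern above.
INFORMATIONAL_MARKERS = ("how", "what", "why", "tips", "guide", "advice")

def is_informational(query: str) -> bool:
    """Return True if the query contains an informational marker word."""
    words = query.lower().split()
    return any(marker in words for marker in INFORMATIONAL_MARKERS)

# Illustrative keyword export:
keywords = [
    "buy running shoes",
    "how to choose running shoes",
    "running shoes sale",
    "marathon training tips",
]

informational = [q for q in keywords if is_informational(q)]
print(informational)  # the 'how' and 'tips' queries only
```

In practice you would run this kind of filter over a full keyword export and pair each informational query with its search volume before deciding what content to create.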
Boost performance of existing phrases
SEO is not just about the fresh, untapped areas of opportunity. Deep data analysis also provides valuable intelligence on current, established themes that you have been targeting.
A better understanding of existing search query data makes it possible to identify practical ways of improving the performance of what you already have, such as identifying and improving thin content.
Formerly Webmaster Tools, Google Search Console shows you the phrases you have visibility for and the performance in terms of clicks, impressions (when they appear in search), click-through rate (CTR) and average position.
The extra layers of data accessible via the Search Analytics report make it easy to interpret lots of information and eliminate any need for marketing decisions based on assumption.
This detailed analysis is crucial, not only for the phrases you choose to feature in a page title or site content, but also to ensure that your entire site architecture is logical and incorporates the most valuable keywords.
At Vertical Leap, we developed Apollo Insights to combine a variety of search query data sources to help our clients analyse all of this information in one place.
When it comes to harvesting and collating the data Apollo Insights does the legwork, thus enabling our SEO specialists to focus on the expert analysis and decision-making to improve search performance.
The Words screen in Apollo Insights displays impressions, views and other information on one grid, from multiple sources.
For example, we can identify high impression phrases with comparatively low clicks and make recommendations to improve their appearance in search to increase the number of clicks and CTR.
Or perhaps we have a group of phrases which – despite a poor average position – are attracting comparatively large volumes of impressions. We can find practical ways to boost site visibility for this theme and better meet the demand.
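A simple way to surface these opportunities is to flag phrases with many impressions but a weak click-through rate. The rows below mimic a Search Console-style export, and the thresholds are assumptions you would tune for your own site:

```python
# Illustrative query data: impressions and clicks per phrase.
rows = [
    {"query": "apple crumble recipe", "impressions": 12000, "clicks": 90},
    {"query": "buy apples online", "impressions": 800, "clicks": 120},
    {"query": "pear varieties", "impressions": 5000, "clicks": 30},
]

def underperformers(rows, min_impressions=1000, max_ctr=0.02):
    """Return (query, CTR) pairs with high impressions but low click-through."""
    flagged = []
    for row in rows:
        ctr = row["clicks"] / row["impressions"]
        if row["impressions"] >= min_impressions and ctr < max_ctr:
            flagged.append((row["query"], round(ctr, 4)))
    return flagged

print(underperformers(rows))
```

Phrases that this flags are candidates for better titles and meta descriptions, which can lift clicks without any change in ranking.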
Thousands of pages: digging into URLs
SEO is not just about analysing search queries. We also look at pages – within a site, and within other sites that link in. While your website may only have a few pages, Google may have indexed hundreds of thousands of URL variants, and then there are the millions of pages on the internet from other websites that may be relevant to your universe.
The more information you have, the better, but how do you dig into that mass of data to properly evaluate how your website is performing and why?
Tools such as Google Search Console or Google Analytics can give us information about specific pages or groups of pages within a website. This information allows specialists to identify improvement areas within a site at a granular level to ensure the pages are working hard for your business. For example, having visibility on page one of Google is great, but if no-one is clicking on those results, a large percentage of qualified traffic is being lost.
Google Search Console
Want to know how your site performs before a user has visited your website? Google Search Console lets us see a wealth of information about how many times a page appeared in the search results, whether it was clicked and what the average position of that search result is. As a deep data set for pages on a site, it’s an incredibly useful tool, and it is free.
One of the most important ways that people use Google Search Console is to find out how much visibility a theme has within a site. For example, if a site sold apples and pears, each of those themes could be analysed by segmenting the information.
The graph below shows the segment of impressions data for pages that contain ‘apples’:
Further to this we can see which pages were performing and what the click through rate (CTR) and average positions were for those pages, as well as the search queries that provided those impressions.
Using this type of information we can identify key insights into how the site performs in results – before a user has visited – and make changes to influence that behaviour.
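The segmentation itself can be as simple as totalling impressions for URLs that match a theme keyword. The page data below is made up for illustration:

```python
# Illustrative per-page impressions data, as you might export from Search Console.
pages = [
    {"url": "/fruit/apples/bramley", "impressions": 4200},
    {"url": "/fruit/apples/gala", "impressions": 3100},
    {"url": "/fruit/pears/conference", "impressions": 2700},
]

def theme_impressions(pages, theme):
    """Total impressions across pages whose URL contains the theme keyword."""
    return sum(p["impressions"] for p in pages if theme in p["url"])

print(theme_impressions(pages, "apples"))  # 7300
print(theme_impressions(pages, "pears"))   # 2700
```

Comparing these totals over time shows whether visibility for a theme is growing or shrinking, before any visitor arrives on the site.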
Taking page analysis to the next level
Apollo Insights is a big data platform that aggregates data from multiple sources. It allows us to do deep analysis on the pages of a site and its competitors. We can identify information from Google Search Console in tandem with data from Google Analytics, SEMRush and other sources.
When looking at pages, Apollo Insights shows us:
- Visibility data
- Organic search engine visits and user engagement metrics
- Page speed and mobile friendliness data
- Page quality analysis data such as content length, search queries and duplicate information
We also know which pages are performing and which are not receiving impressions in the search results. This type of analysis allows us to quickly identify and improve dead pages, breathing life back into them and effectively creating new visibility out of existing content!
Using this information allows us to make more effective decisions in a more efficient way.
Digging into the data about your website
A website can be analysed in isolation but it also needs to be reviewed in the context of the competitor landscape. There are tools that provide site audits for you, returning distilled information such as:
- Do you have a robots.txt?
- Is there an XML sitemap?
- What’s the page load speed?
- Does every page have a title tag?
Although useful, these are based on a single site and rarely have all the information you need; this means going to other websites, searching for the information and patching the pieces together.
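Checks like the title-tag question above can be automated with a short script. The helper below is a hypothetical sketch using Python's standard-library HTML parser, not part of any particular audit tool:

```python
from html.parser import HTMLParser

class TitleChecker(HTMLParser):
    """Collect the text inside a page's <title> tag, if any."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def has_title(html: str) -> bool:
    """Return True if the page contains a non-empty <title> tag."""
    checker = TitleChecker()
    checker.feed(html)
    return bool(checker.title.strip())

print(has_title("<html><head><title>Apples</title></head></html>"))  # True
print(has_title("<html><head></head><body>No title</body></html>"))  # False
```

Run against every crawled page, a check like this turns a tedious manual task into a report you can regenerate on demand.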
A manual technical audit of your website, or anyone else’s, is time-consuming. There are tools that let you run a URL through a script and receive a robotic report, but each check still has to be triggered by hand. For website owners, and especially those looking after multiple websites, you need to be using some level of automation.
A ping checker, which makes sure your site is live, or a change notifier can help you to spot problems as and when they arise.
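A basic ping checker fits in a few lines. This is a minimal sketch: the URLs are placeholders, and the fetch function is injectable so it can be stubbed out without real network calls.

```python
from urllib.request import urlopen
from urllib.error import URLError

def default_fetch(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL responds with a successful HTTP status."""
    try:
        with urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (URLError, OSError):
        return False

def check_sites(urls, fetch=default_fetch):
    """Return the subset of URLs that appear to be down."""
    return [url for url in urls if not fetch(url)]

# Example with a stubbed fetcher (placeholder URLs, no real network calls):
statuses = {"https://example.com": True, "https://example.org": False}
down = check_sites(statuses, fetch=lambda url: statuses[url])
print(down)  # ['https://example.org']
```

Scheduled to run every few minutes, a loop like this is enough to alert you when a site stops responding.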
But think about how much human time is invested in manually auditing websites in detail. One person can only concentrate on one thing at a time. A computer, however, can do many things simultaneously.
Automation multiplies your capability
We’re building Apollo Insights with several layers of automation, so that it can take care of hundreds of site checks on a continuous basis.
The end result is an audit score and report that allows us to see how the site is performing overall, with everything taken into account.
The report itself, shown below, gives specific, actionable feedback, with examples of the improvement areas and what can be done to address them.
The above image is only a snapshot of the information available; we can then break down into the specifics by exploring the result. This also allows us to see visibility and engagement metrics to prioritise optimisation.
Apollo allows us to do much more than search query or page analysis. The site analysis data sets allow an in-depth review of backlinks and let us monitor competitor visibility, ensuring that we have all the data and can take action upon it!
Finding people who matter in a sea of data
There are billions of people in the world. That won’t surprise you. The majority of these people are using the internet and any of them could come across your brand.
If you were able to capture the name and address of every person who interacts with your brand in some way, not only would you have a database bigger than your house, but you would probably not even know where to begin analysing it all.
Those people will already be segmented – some will be email subscribers, some follow you on Twitter, some subscribe to your company channel on LinkedIn …
For SEO purposes, here are the types of people you should care about:
- People who link to your articles (on their blogs or on social media)
- People who have an audience that you want to reach
- People who read your content or buy from you
You can use Google Analytics to examine how people read your site, and to understand their user journeys. You can also use your own customer relationship management (CRM) system to keep track of their activities and develop their loyalty.
There are several tools available to target your audience on social media. Identifying your most responsive followers and influencers is a core requirement of social media and content marketing. Services like BuzzSumo are great for identifying the second category – people who have the right audience.
Based on a query search, BuzzSumo will show you people who have posted and promoted articles about that topic. We’re pulling information from BuzzSumo and other sources into our platform so that we can correlate and cross reference it with the sites, pages and words data.
Decision automation is the future for big data
At Vertical Leap, we’ve designed our data platform around the four core elements we’ve described in this article:
- Sites: Technical information about a site.
- Pages: Impressions, views and technical information about URLs within a domain, as well as in-bound links.
- Words: All the search queries related to a site, including search queries that drive visibility and content within the pages of a site.
- People: The people who share your content, and the people who interact with you socially.
The future for digital marketing is one where data gets bigger and more diverse, and where the requirement for software to analyse and use that data becomes greater.
Want to benefit more from big data?
If you’re keen to start embracing big data in your SEO and wider digital marketing strategies but aren’t quite sure where to start, contact one of our specialists today on 023 9283 0281. In the meantime, check out some of our other big data articles: