5 Questions to Ask an SEO Agency in 2022

Whether you’re already working with an SEO agency or freelancer, or being prospected by one, here are a few questions worth asking:

1. What are your thoughts on MUM?

Google is rolling out the Multitask Unified Model (MUM), a multimodal algorithm that has the potential to revolutionise how we search. 

2. How do we prioritise tasks?

A clear prioritisation matrix is needed to make sure low effort/high value tasks come first for a quicker ROI. 

3. How do we know X activity works?

Being able to test hypotheses before scaling is essential to ensure resources are used optimally.

4. How are our rankings doing?

SEOs need to be able to give an executive summary of how well your site is appearing in search. This means they need the tools available to quantify all the data. 

5. How much traffic/revenue will SEO activity drive this year?

Being able to forecast and set KPIs is crucial for benchmarking success. 

Other considerations

If you’re interested in what SEO in 2022 will look like, here are a few interesting resources:

Feel free to add any questions you might ask in the comments.

What’s the Point of Keyword Research?

Keyword research for the sake of keyword research is not useful. What you get is a spreadsheet with thousands of rows of words.

What keyword research needs to do is tell a story and give a narrative. You need to be able to understand the search landscape for a vertical in its totality, with actionable insights that are easily digestible for humans.

To do this, you need to be able to group all those queries into digestible clusters or categories.

This is also useful for reporting for when key stakeholders ask how rankings are going. You can pull out keyword categories which create great anecdotes for the business.

You do not have to tag each keyword manually; Python, machine learning or software can do this automatically.
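As a minimal sketch of automated tagging (the categories and patterns below are hypothetical, not from any particular tool), a rule-based tagger in Python might look like this; machine-learning approaches such as clustering keyword embeddings scale the same idea further:

```python
import re

# Hypothetical category patterns -- replace with your own vertical's terms.
CATEGORIES = {
    "Trainers": r"\b(trainers?|sneakers?)\b",
    "Boots": r"\bboots?\b",
    "Sale": r"\b(sale|cheap|discount)\b",
}

def tag_keyword(keyword: str) -> str:
    """Return the first category whose pattern matches, else 'Other'."""
    for category, pattern in CATEGORIES.items():
        if re.search(pattern, keyword.lower()):
            return category
    return "Other"

keywords = ["mens running trainers", "cheap chelsea boots", "shoe care tips"]
tagged = {kw: tag_keyword(kw) for kw in keywords}
print(tagged)
# {'mens running trainers': 'Trainers', 'cheap chelsea boots': 'Boots',
#  'shoe care tips': 'Other'}
```

Feeding the tagged output into a pivot gives you the cluster-level view that stakeholders actually digest.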

Here are various resources on how to do this:

Let me know how you approach keyword research in the comments 👇🏻

Getting SEO tasks prioritised with developers

When using any type of resource, whether in-house or client-side, it’s key that you give a clear strategic narrative and give precedence to tasks that will see results quicker. 

This helps build key relationships with developers.

Using a prioritisation matrix, we can build custom logic for how we formulate tasks in a sprint or throughout the year that goes beyond tech. The foundations are built on these initial pillars:

– Business Value: Est. impact it will drive based on forecasting potential revenue/user increase.
– Resource: How laborious the task may be.
– Strategic Value (Optional): Business objectives beyond revenue and traffic. Essentially a vanity metric.
– Cost (Optional): Potential to create a cost model for full ROI.

Scoring each activity against these pillars will give you a clear priority order, with low-cost, high-value activity at the top. 
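As a rough sketch of how such a scoring function might work (the tasks, weights and 1–10 scores below are purely illustrative, not a standard):

```python
# A minimal sketch of scoring tasks with the prioritisation matrix above.
# The tasks and the scores are purely illustrative.

def priority_score(business_value, resource, strategic_value=0, cost=0):
    """Higher score = do sooner: value adds, effort and cost subtract."""
    return (business_value + strategic_value) - (resource + cost)

tasks = {
    "Fix broken canonicals": priority_score(business_value=8, resource=2),
    "Full site migration": priority_score(business_value=9, resource=9, cost=3),
    "Update footer links": priority_score(business_value=2, resource=1),
}

# Sort so low-effort / high-value activity comes first
for name, score in sorted(tasks.items(), key=lambda t: t[1], reverse=True):
    print(f"{score:>3}  {name}")
```

However you weight the pillars, the point is that the ordering is explicit and defensible when developers ask why one ticket jumped the queue.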

Here are some more resources on this topic:

If you have any other tips for working with developers feel free to drop a comment.

Can AI Create Content For SEO?

There is no one-size-fits-all answer to this question, as the use of artificial intelligence (AI) for content creation will vary depending on the specific needs and goals of each organization. However, AI can certainly be used for SEO purposes, as it can help to automate content generation and optimization processes.

^That chunk of text there was created using OpenAI.

AI generated text looks pretty sophisticated but it all depends on the model and what it is trained on. 

GPT-3’s pre-training data set includes text crawled from the web, but for potentially more impact you could look at using a data set from a specific vertical only.

What I find it is tremendous at right now is scaling, whether that’s generating ideas for content briefs or filling in missing meta descriptions.  

The key is text summarisation with Hugging Face Transformers in Python. In the example image (from my talk about NLP with Semrush last year) we scraped the text from a group of URLs and used BERT to summarise it within a certain number of characters.

That character limit was the optimal length for things like meta descriptions & intro text. It still needs a human to QA, but it puts you in a better place than starting from scratch.  
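The real workflow uses Hugging Face’s summarisation pipeline as described above; as a lightweight, dependency-free stand-in, here is a naive extractive version that simply trims the opening text to meta-description length at a word boundary (the page text is made up):

```python
# Naive extractive stand-in for a transformer summarisation pipeline:
# take the opening text and cut it to meta-description length.

def summarise(text: str, max_chars: int = 155) -> str:
    """Greedy opening-text summary, cut at a word boundary."""
    text = " ".join(text.split())          # normalise whitespace
    if len(text) <= max_chars:
        return text
    cut = text.rfind(" ", 0, max_chars)    # last space before the limit
    return text[: cut if cut > 0 else max_chars].rstrip(".,;") + "…"

page_text = (
    "Our handmade leather boots are built to last. Each pair is stitched "
    "in our workshop and comes with a two year guarantee. Free UK delivery."
)
print(summarise(page_text))  # under 155 chars, so returned whole
```

A transformer model produces an abstractive summary rather than a cut-down opening, but the QA step and the character budget are the same either way.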

If this is something you’re interested in, here are a few more resources to explore for SEO:

Feel free to comment any other resources or ideas you have.

How to Test Pages for SEO

A/B testing for Google’s bots or users is quite difficult, mainly because it could be considered cloaking if you’re not doing it right, and because SEO is one of the slowest channels, dependent on things like your crawlability.

One of the things you can do is something I mentioned in a recent article for The Drum:

“Isolate each activity for reporting so you see how they impact each other. Proving ROI in the long run will relieve internal tension on activity and easily create case studies.”

An easy way to do this is in Google Data Studio by creating a custom field that segments URLs by activity.

The custom field formula looks something like this for a Google Analytics connector:

CASE
  WHEN REGEXP_MATCH(Landing Page, '/product/thing') THEN "Added Hreflang"
  WHEN REGEXP_MATCH(Landing Page, '/about/') THEN "Self-referencing Canonical"
  WHEN REGEXP_CONTAINS(Landing Page, '/blog/') THEN "Evergreen Content Creation"
  WHEN REGEXP_CONTAINS(Landing Page, '/Services/') THEN "Metadata Changes"
  ELSE "Other"
END

Flip between REGEXP_CONTAINS and REGEXP_MATCH depending on the set of pages the activity was carried out on. Add a | between page paths to cover more than one URL per test.
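If you want to sanity-check the regex logic outside Data Studio, here is a rough Python equivalent using the illustrative paths from the formula above: `re.fullmatch` approximates REGEXP_MATCH, which must match the whole field, while `re.search` approximates REGEXP_CONTAINS:

```python
import re

# Mirrors the Data Studio field: fullmatch ≈ REGEXP_MATCH (whole field),
# search ≈ REGEXP_CONTAINS (substring). Paths are illustrative.
RULES = [
    ("fullmatch", r"/product/thing", "Added Hreflang"),
    ("fullmatch", r"/about/", "Self-referencing Canonical"),
    ("search",    r"/blog/", "Evergreen Content Creation"),
    ("search",    r"/Services/", "Metadata Changes"),
]

def segment(landing_page: str) -> str:
    for mode, pattern, label in RULES:
        matcher = re.fullmatch if mode == "fullmatch" else re.search
        if matcher(pattern, landing_page):
            return label
    return "Other"

print(segment("/blog/my-post"))   # Evergreen Content Creation
print(segment("/about/"))         # Self-referencing Canonical
```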

You’ll need a thorough experiment methodology before you start. The key to testing well is to only isolate URLs when no other influence has happened on the page. So if it’s metadata changes, make sure that’s the only thing that happened to that page. 

Here are some other approaches to testing pages for SEO:

If you have any other ways you do SEO testing feel free to comment.

How to Optimise Copy for Knowledge Panels

Knowledge graphs (and other SERP features) are a way for Google to directly answer a user query in search without having to click on a website. 

Research we did a couple of years ago, scraping hundreds of knowledge graphs in search results and analysing them with Google’s Natural Language API, gave us some key insights into the structure of text that Google & users favour.

Key points are:

– Use assertive language.
– Content that appeared in the knowledge graph was usually in the first paragraph or directly below the heading.
– The text needs to be straight to the point with no fluff. 

Based on this and looking at the syntax of how words fit together we came up with a blueprint of how to structure content that would be inline with how knowledge graphs are displayed:

[Entity] > [Literal Explainer] > [Relevancy] > [Practicality] > [Resolve]

  • Entity – A thing, name, object or product. 
  • Literal Explainer – A vivid, text-form description of the entity. 
  • Relevancy – Origins and context. 
  • Practicality – Its use.
  • Resolve – Interest.

Obviously there are other criteria beyond text that determine whether your content will appear, such as the authority of your website, but we’ve had some great success using this template purely from a ranking perspective. It can work on anything from product pages to long-form pieces of content. 

If you’re interested in Google’s Knowledge Graphs and how they work, here are some key resources:

If you have any other ways of tackling knowledge graphs feel free to drop a comment as it would be interesting to hear!

How to Automate Branded Search Reporting in Data Studio

With Google Data Studio and the Google Search Console connector you can segment queries between brand vs. non-brand.

It’s great for general reporting, opportunity spotting & PPC/SEO testing if bidding on branded terms.

To do this just create a custom field and add this formula:

CASE
  WHEN REGEXP_CONTAINS(Query, "Brand") THEN "Branded"
  ELSE "Non-branded"
END

You’ll want to test at first, then refine to make sure you’re capturing all relevant queries, e.g. spelling variations.

Using this logic and basic regex you can further segment for elements like intent, categorisation and more. 
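A rough Python sketch of the same idea (the brand term and the intent patterns are placeholders to adapt to your own queries):

```python
import re

# Placeholder brand pattern -- swap in your brand name and common misspellings.
BRAND = re.compile(r"(?i)\bbrand\b")

def classify(query: str) -> str:
    return "Branded" if BRAND.search(query) else "Non-branded"

# The same logic extends to intent segmentation:
INTENT = [
    (re.compile(r"(?i)\b(buy|price|cheap)\b"), "Transactional"),
    (re.compile(r"(?i)\b(how|what|why|guide)\b"), "Informational"),
]

def intent(query: str) -> str:
    for pattern, label in INTENT:
        if pattern.search(query):
            return label
    return "Navigational/Other"

print(classify("brand running shoes"))  # Branded
print(intent("how to clean trainers"))  # Informational
```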

You can also supercharge this, making your dashboards quicker & more dynamic, by linking it up with BigQuery. 

Here are some similar resources on how to segment queries and use Google Data Studio:

Feel free to share in the comments how you report on branded search.

Reporting on Top Level Directories and Subfolders in Data Studio

A simple custom field in Google Data Studio will allow you to create separate dimensions for each subfolder which works for Google Analytics, Google Search Console and really any connector where the dimension is a URL.

To do this follow these steps:

1. Edit your data connection in Data Studio
2. Click on add field 
3. Paste in the formula – REGEXP_EXTRACT(URL, '^https://[^/]+/([^/]+)')
4. Remember to give the field a unique name like "Top-level directory"
Bonus – If you want 2nd-level directories, paste in – REGEXP_EXTRACT(URL, '^https://[^/]+/[^/]+/([^/]+)')

This is great for showing which sections of your site are winning, and great for international SEO for spotting misaligned traffic if you add the Country dimension on top of it.
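For reference, the same extraction can be sketched in Python, mirroring the two REGEXP_EXTRACT formulas above (the example URLs are hypothetical):

```python
import re

def directory(url: str, level: int = 1) -> str:
    """Extract the nth path segment, mirroring the REGEXP_EXTRACT formulas."""
    pattern = r"^https://[^/]+/" + r"[^/]+/" * (level - 1) + r"([^/]+)"
    m = re.search(pattern, url)
    return m.group(1) if m else "(none)"

print(directory("https://example.com/blog/post-1"))      # blog
print(directory("https://example.com/uk/blog/post", 2))  # blog
```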

Here are some other great resources on segmentation in Google Data Studio: 

If you have any other use cases of segmentation or found this useful feel free to comment!

How do you Forecast for SEO?

Automated forecasting is something that has historically been difficult for SEOs because of the level of complexity + potentially disruptive factors like unpredictable algorithmic shifts & real world factors like Covid-19.

You also want to be able to be transparent about how much your SEO activities will contribute towards any growth outside of the current trajectory.

However, that doesn’t mean it’s not worth trying and reviewing on a quarterly basis. There are key technologies we can utilise that use time series data based on additive models, like Facebook’s Prophet https://lnkd.in/dZnJqn_C
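To illustrate the additive idea without the Prophet dependency, here is a toy pure-Python version that fits a linear trend plus a day-of-week seasonal component (the session counts are made up; Prophet does this far more robustly, with uncertainty intervals and holiday effects):

```python
# A toy additive forecast (linear trend + weekly seasonality) in pure Python,
# as a stand-in for Prophet. The traffic numbers are made up.

def forecast(history, periods):
    """Fit trend + day-of-week seasonality, project `periods` days ahead."""
    n = len(history)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, sum(history) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    intercept = mean_y - slope * mean_x
    # Seasonal component: average residual for each day of the week
    totals, counts = [0.0] * 7, [0] * 7
    for x, y in zip(xs, history):
        totals[x % 7] += y - (intercept + slope * x)
        counts[x % 7] += 1
    seasonal = [t / c if c else 0.0 for t, c in zip(totals, counts)]
    return [intercept + slope * x + seasonal[x % 7]
            for x in range(n, n + periods)]

daily_sessions = [100, 110, 120, 90, 95, 60, 55] * 8  # 8 weeks of history
print([round(v) for v in forecast(daily_sessions, 7)])  # next week's estimate
```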

It will require a little bit of Python knowledge, but here are a few resources and templates on how to automate everything from rankings to traffic and revenue: 

If anyone else has any other innovative ways of forecasting drop a comment 🕺🏻

Google’s MUM is on the Way

MUM (Multitask Unified Model) is a multimodal algorithm that Google claims is 1,000 times more powerful than BERT, and it is being rolled out into the search engine in the near future. 

The most important thing to note is that it’s able to understand elements beyond text, such as images and video, when matching user queries. 

Here’s some important resources to read and practical examples so you can stay ahead of the curve:

Rowan Zellers MERLOT: Multimodal Neural Script Knowledge Models https://lnkd.in/eJQXGufi