Organic Marketing In 2018


This post was also made to the 4Ps Marketing blog.

Is SEO dead? No (again), but it’s evolving at an ever-more rapid pace. Here’s a look at some of the biggest current and upcoming organic marketing trends to pay attention to this year.

SEO “Aten’t Dead”, to quote a wise old woman, but it sure as sugar is changing and evolving in all kinds of new and exciting directions. Here are some of the most significant pieces of advice I’d give to someone who wants to use organic channels to grow their business in 2018.

Technical SEO Doesn’t Work In A Silo

Long gone (alas) are the days when technical SEO fixes could be business-transformative for organic performance. Now and then you may get lucky with a big win like sorting out a mass duplication problem, but the vast majority of tech SEO these days is what I’d class as a ‘hygiene discipline’ – you do it to make the most of the organic equity that your brand and website already have, or to avoid losing ground through poor-practice penalties and missed opportunities, not to generate “more”. Should you still be investing in it? Absolutely – it is still a very necessary part of ensuring growth – but be wary of expecting technical measures alone to explode your traffic overnight.

It’s Not Paid OR Organic Any More

Aside from the fact that in today’s ad-laden Google SERP you’d have to be rather clinically insane not to want to do any Adwords activity at all, there’s a broader implication for what, back in around 2005, used to be the hotly contested question of “PPC or SEO”. Simply put, go multichannel or go home.

Yes, there are always efficiencies to be had, but try to think more in terms of balancing your overall business performance than in the old-fashioned “but if I get more organic traffic then I can dial back PPC” (or indeed vice versa). Consider attribution modelling, or at the very least get to grips with your assisted conversion channels, and be highly aware that your users will likely have multiple touchpoints with your brand before they even think about becoming a customer.

UX Is King

It’s a very valid point that a lot of search marketing is a form of user experience optimisation for search engines, but this has never been truer than now. Google is in the driving seat of the “UX as SEO” train, with everything from site speed and mobile experience to intrusive ads and non-secure sites. Make use of all the data from your site, apps and other points of contact (yes, that includes good old word of mouth!) to ensure that everything from content engagement and button functionality to speed and phone service is as frictionless as possible.

Bear in mind that UX goes beyond online too – it’s no good having the world’s most efficient website or great lead generation if your customer service is poor, your logistics chain is nonsensical and deliveries are being lost or damaged; you’ll forfeit conversions and profit. That’s why paying attention to the full lifecycle of your users – and I am most definitely including your existing customers in this – yields almost endless rewards in the form of improved engagement, better brand reputation and, above all, a higher lifetime value.

Think Beyond The Top Of Google

I’ve been saying this for years, but it has never been truer. You need to think about where your users are starting (or resuming) their search for information, products and services, rather than just aiming for the top of the first results page.

Retailers, do you have an Amazon or eBay store? A lot of shoppers begin their journey here rather than on what we might conventionally think of as a search engine. Professional services firms, are you getting press coverage in the right industry magazines, joining the right LinkedIn group discussions, and attending the right events to be seen?

Stop just asking “how can I get to the top of Google” and instead ask “where are my users’ eyeballs” (or indeed ears, given the rise of voice-driven devices)! Those are the places where you need to be at the right time with the right content (be that an ad, creative, PR piece, boots on the ground or a good old-fashioned SERP) to guide users down the path of interest and trust in your brand.

Brand Drives Performance; SEO Makes The Most Of It

Whenever a client with a semi-decent website asks for a strategy to help boost organic growth, I often get funny looks when I put together something that has little to no tech SEO time but heavily invests in PR, outreach and other brand-related content activity.

Think of it like selling a house; of course, you’ll need to make sure the pipes aren’t leaking, and the building is structurally sound (i.e. the technical SEO) otherwise nobody is going to be mad enough to buy it for a decent price. But what do you do if you want to really wow people and boost the value of the house? You add a conservatory, refit the kitchen and overhaul the driveway so that it looks downright gorgeous.

So, if you’re looking at your organic performance as a series of silos, treating it as a channel that sits off on its own, or seeing traffic shrink because of increased ad real estate on the SERPs, then please drop me a line and let’s go for a cuppa. The reality is much, much cooler, and ample opportunities exist for real, business-transformative growth that will blow your mind.

Photo Credit: Silvestri Matteo

Optimisation Around User Intent for Voice Search


This post was originally made on the 4Ps Marketing blog.

Prior to Christmas, Google announced that it has published a specific set of search quality evaluation guidelines for Google Assistant and Voice Search, in order to give more insight into how its quality process works. The resulting PDF is rather short but extremely useful for anyone interested in this type of technology, from developers and academics to marketers seeking a better figurative footprint in the eyes-free search results.

It’s All About Intent

The document references the existing Actions & Answers criteria for the Needs Met scale in the general search quality evaluation guidelines, so much of it isn’t particularly surprising. Google has always worked closely in line with user intent, but it’s particularly interesting to see them produce guidelines for voice action apps.

Intent Understanding Determines Satisfaction

The examples for playing music rely on a human level of common sense to determine how successful a response is. For example, when a user requests “play jazz”, the app should play a playlist rather than a single song.

Even more conventional queries get this sort of treatment: when someone asks for the name of the US president, they want the current head of state, and if someone asks about the weather for the coming weekend, they’ll want a daily forecast for the whole weekend rather than just a general commentary.

Here Comes the Knowledge Vault

“Play Mumford & Sons reminder” was a query that particularly caught my attention as a testament to the sheer data parsing capabilities of the famous Google Knowledge Vault.

Not being a huge Mumford & Sons fan, my initial thought around the intent was that someone wanted to set a song from the band as a reminder.

People with better (or worse) taste in music will, of course, understand that this query is asking for a specific track to play – so much for my human perspective on the query – while the example response given just opens a calendar app to set a reminder.

The reason I like this example is twofold. Firstly, it turns on its head the very old-fashioned idea of “latch onto the most obvious keyword”: here “Reminder” is the name of a song, not the calendar-style activity. Secondly, it relies on the kind of advanced semantic parsing capability that is generally native to Google itself, thanks to its Knowledge Vault technology.

What Does This Mean for Me?

Unless you’re a developer, probably not an awful lot right now – but quite a few brands are now starting to look seriously at integrating themselves into technologies such as Google Home and Alexa with custom skills. In my opinion, all brands should consider having this on their roadmap within the next twelve to eighteen months, as the adoption rates of this tech aren’t showing any signs of slowing down. Brands that are too slow to embrace this growth risk being left behind.

There are, however, some good implications in here for preparing your brand’s content for Voice Search in general. There isn’t a single easy or technical way to “optimise” for Voice Search (if I had a nickel for every time someone asked me about “voice SEO” I’d be off blogging about Stargate Atlantis somewhere rather than writing this!), but there are some general things to bear in mind.

First and foremost, rich answer results on Google commonly surface as responses to voice queries. While there doesn’t seem to be a direct equivalence (experiments at 4Ps have shown that a voice answer doesn’t always match the rich snippet returned in “position zero” when sampling a normal SERP), it is still worth keeping an eye on your footprint here. Make sure all your reference touchpoints like maps and Wikipedia are up to date and on-brand, and see where else your content could be showing up for Q&A-type queries. Not everything gets a spotlight like Rodney McKay, of course (and nor should it), but that doesn’t mean you shouldn’t take steps to keep an eye on your SERP presence!

Google Rich Answer Example

When it comes to trying to feature in more snippets, extensive experimentation and careful monitoring are strongly recommended for the best results. Content formatting and wording are a much stronger influence than technical markup like schema.org. So, create stuff that is fantastic for your users (as usual – yawn) and focus on a “meeting intent” angle if you want to see progress. Always, but always, know what the purpose of your content is from a user’s perspective – are you answering a question (or questions), providing specific guidance on how to do something, explaining how something works…?

So, dig into your data, speak to your users (and your customer service team), produce awesome stuff that meets the needs a real-world person might have in relation to your brand, and you’re already as well on the way as you can be to “optimising” for voice search in 2018.

Tools like AnswerThePublic.com are invaluable for generating ideas based on real searches, and of course, I’d be horribly remiss if I didn’t also mention the outstanding rich snippet research done by the SERP sampling superstars at STAT. So go forth and develop your snippet presence, and don’t forget to have a good look at things like Actions On Google and Alexa skills development as well while you’re at it!

Photo Credit: Kevin Bhagat

Entering 2018 With Mindfulness


Wow, I did not blog much last year.

Of course that’s at least partially because I moved my fandom/hobby shenanigans over onto Tumblr, which is not a decision I regret in the slightest, but also down to that old chestnut of not really having time to stop much and reflect enough to write about it, let alone put together a meaningful post with actual sentences. Here are my top four key takeaways from 2017.

1. Work To Live, Don’t Live To Work

Since my breakdown (wow, nearly three years ago now) I’ve spent a lot of time drifting through work in a mild sea of resigned apathy, not really caring much one way or the other. Perversely, this has made me both a happier person overall and (bizarrely) a better worker, because by not living and breathing my work/career non-stop to the near exclusion of anything else, I’ve actually started enjoying it again. This means I’m more laid back and productive all round, and because I’m spending a lot less time panicking as if every tiny little thing that has or could possibly go wrong were some sort of extinction-level event, I’m fairly sure I’m a bit nicer to be around as well.

I’d like to be able to say I haven’t had any further emotional meltdowns, but I can’t say that without lying through my teeth so I won’t. The agency still sometimes does things that make me want to scream and tear out my hair (or someone else’s) and clients still often make me weep for the current and future state of humanity, but I’ve found that the phrase not my circus, not my monkeys is a very useful adage (along with remembering that it is the agency’s own money that is being spent on what seem to me to be stupid decisions, not my own personal money).

My agency’s CEO reminded me during one of my recent semi-meltdowns (in the nicest possible way) that he’s not paying me to worry about his job and the MD’s job and everyone else’s, so I should stop doing extra worrying for free, which is a concept I rather like: don’t worry for free.

A good old-fashioned oh well, fuck it often does the trick quite nicely as well. I’ll care about clients and their insanity during work hours – and will do so frankly quite passionately, because I quite enjoy what I do and like to get good results for people – until my time for the week is done, and then the lot of it can frankly go hang because I’m off to go swimming or play Mage The Awakening or whatever.

2. Creative Laziness Makes For Amazing Productivity

People often comment on my speed at work – it’s one of the things I see clients, colleagues and third parties alike gape at. I can (apparently) turn around documentation, technical testing and research in the time it takes most people to get their morning coffee, and more than once I’ve had someone’s jaw drop in amazement at my typing speed. Now the latter is easy enough to explain: not only am I an avid trashy fanfiction writer, but I spent nine months fresh out of university working as a medical secretary for a psychiatrist whose written English was not great, so she preferred to dictate everything.

The former is what I like to think of as creative laziness. I get tasks done as rapidly and efficiently as possible because I’d rather get something finished and ticked off in order to move onto the next thing. If I’m stuck doing the same thing for more than a couple of hours I get hopelessly bored – so the best way to avoid this is to actually work more efficiently, not less.

3. It’s Okay To Not Be At The Top

This is a major issue I have largely due to my upbringing and lots of other shenanigans I get to bore psychiatric professionals with – if you’re not in position numero uno, especially at work, then you might as well be at the bottom.

I didn’t get a clean sweep at either GCSEs or A levels (all A*s but one at GCSE, AAB at A levels) and was devastated to the point of tears when I only got a 2:1 honours on my degree rather than a first. That probably sounds utterly absurd to someone considerably less neurotic but it is a serious problem when anything less than perfection/top of the class/top earner (with all the career bells and whistles and job titles to go with it) is viewed in one’s own mind as equal to complete failure.

Partially as a result of therapy and partially as a result of the much stronger ah, fuck it attitude described above, I’ve made significant strides towards getting over this. I doubt I’ll ever be turning cartwheels at being second best (or third, fourth, fifth etc) and I’ll never stop craving praise like a voracious little compliment weasel, but I stepped down from a significant Head Of title and role at work because I hated doing it. The agency only recently properly promoted someone else (a very deserving someone else, I hasten to add) to fill the gap, but not being “top dog” in that regard no longer bothers me. Fewer headaches. One less circus and a lot fewer monkeys.

4. Remember Your Own Needs Hierarchy

Anyone heard of Maslow’s Hierarchy Of Needs? It’s a theory in human psychology that says people’s motivations are determined by certain needs, and those motivations and needs progress as they are fulfilled. This is a nice diagram from Simply Psychology:

Maslow's Hierarchy Of Needs

Now, especially when suffering from a long-term mental illness, your basic needs and psychological needs can overlap quite significantly, so this model sometimes stops applying. You can be entirely unmotivated by hunger, for example, because you feel so hopeless that basic bodily needs become just a background nag rather than an imperative. So what I actually find helpful is to create my own more personal “hierarchy of needs” based on the things I know I need to prioritise in order to stay functional and happy. This lets me evaluate situations as they come along and determine how to allocate my energy (or “where to spend my spoons,” to use a popular analogy).

Here’s roughly what my personalised hierarchy of needs looks like.

Personal Needs Hierarchy

 

These are a bit more abstract and first-world than the Maslow version, of course, but I find this kind of “ranking” really and truly invaluable for allocating my energy; things lower down are more important and therefore take priority. Time to either do some writing or cuddle with the pugs on the sofa? The pugs take priority. Meeting a friend for coffee or writing some fanfic? The fanfic wins (I’m really not that social a person, as further evidenced by the fact that animals are a whole two rungs more fundamentally prioritised than people, who are in my book a “nice to have” – however I know several folks for whom “time to socialise with friends and family” would be a much higher priority bloc).

This pyramid is also really useful for the things that AREN’T on it. For example: career. If you’d asked me three years ago I would have told you my career was basically my life, and everything else was just secondary stuff orbiting around it. Now it isn’t even on there – as long as I’m earning enough to meet the “subsistence” requirement and the work I’m doing is interesting enough to fulfil the “stress-boredom balance” need then everything is good.

It’s amazing how useful this sort of thinking is.

So while doing the usual mindfulness and um-aah and spiritual critiquing that is popular at this time of year, try building yourself a personalised Maslow hierarchy. It’s a good exercise from a self-review perspective anyway, and it can prove really very useful in terms of helping to prioritise how you spend your mental and emotional energy.

Happy 2018!

Photo Credit: Cristian Escobar

Display Ads: Stop & Think!


So, in random news, I recently decided to become a supporter of the Guardian. The BBC’s continuous pandering to the royals has been pissing me off more than usual lately (newsflash: I’m anti royal family) and I’ve found that the Guardian has actually replaced the BBC as my go-to “let’s check in on how screwed up the world is today” destination.

I noticed the message on the site today asking me to sign up (not that I’ve not seen it before, but still) and after discovering that it now costs less than an annual PlayStation Plus subscription I thought hey, why the hell not. They accept Paypal as well, which is always nice as it saves me having to hunt down my card from wherever the pugs have randomly dragged my purse today, so that’s one conversion barrier neatly bypassed. In fact the whole signup was pretty slick and smooth, on par with an Amazon checkout but without the hassle.

Of course after something so profoundly adult-ish on a Saturday afternoon I went back to writing fanfiction like a normal lunatic, which for various reasons led me to the always-invaluable www.fantasynamegenerators.com. This site is on my adblocking whitelist because a) it is genuinely useful so I don’t mind the creator earning from it and b) its advertising is discreet and doesn’t make the site entirely unusable in the process.

Now given that I had, no more than one quarter of an hour earlier, literally just counted myself as an onsite conversion (and from organic, no less) for The Guardian, I was rather surprised to be greeted by this (on the Viking Name Generator page, if you must know, but that’s not really the point):

Guardian Display Advertising Fail

Um. Open another tab, check email… nope, there’s the subscription confirmation, right in my inbox.

I wish I could say this is the first time I’ve seen this sort of thing, or even that it was the first time I’ve had it happen so painfully obviously to me. I can’t, of course. Not even close.

All together now, boys and girls: don’t target your converters just after they convert.

Aside from the fact that this is a wasted bid effort, not to mention inventory that could have far more usefully gone to someone else, this kind of “haunting” effect is the sort of thing that can really alienate today’s web-savvy consumers to a brand. Now I work in digital marketing so I can just sigh, roll my eyes, rant on my blog and move on with my life without any real change in my sentiments to the Guardian as a journalistic entity, but the average web user does not (gasp) work in this field.

On the whole, consumers don’t like ads. They tolerate them. It’s just a fact of marketing life. But tolerance levels go sharply down and irritation goes sharply up when ads are irrelevant, too persistent, interfere with whatever the user was trying to do in the first place (I’m looking at you, YouTube pre-roll) or are otherwise just poorly targeted.

Say, for example, by trying to sell a user something they bought less than fifteen minutes ago.

Now remarketing, properly handled, is an immensely powerful tool. So, frankly, is display prospecting when done correctly. I would never in a million years suggest that a brand abandon its display/RTB efforts provided the appropriate KPIs look good and brand safety is assured (check your whitelists and blacklists, folks – is NakedFurriesRUs.com really where you want your high-end luxury name appearing?), so no need to start setting fires just yet.

But please, please turn your brain on and think when setting up your targeting options. I’m by no means claiming this is always as simple as finding a decent Viking name generator in the riches of the internet – Mona Elesseily’s excellent article about GDN targeting and layering is a great read, if nothing else to show how complex just this single network can be to sort out properly – but since when was something only worth doing if it was easy?

Rakuten Marketing recently ran some surveys around online advertising and amongst the other interesting (and worrying) things they discovered was that a whole bunch of people around the world associate ads with “other negative online experiences like fake news.”

Awkward. Especially for this particular example!

While the immediate “last click” ramifications of such an advertising faux pas may seem insignificant to the point of ridiculousness, in today’s ever-more-crowded online marketplace it is critical to realise that brand preference and good, old-fashioned, squishy feelings play more of a factor than ever before. Yes, you got the conversion right now, but if you continue to stalk your new customer with crappily-targeted display creatives you’re not only going to piss them off but alienate them to your brand and very potentially lose them as a customer in the near – if not immediate – future.

Given how much it probably cost you, considering all touchpoints and appropriate channel attribution, to acquire that customer in the first place, doing something that actively drives down their lifetime value to your business is so far beyond stupid that it can barely see stupid in the distance.

I may forgive the Guardian its little whoops moment because I know the struggle (and really enjoyed their recent article about tardigrades), but the vast majority of users will be far less generous!

Besides, every time you advertise unnecessarily to a recent converter a kitten gets brutally shot by a floating hand.

Don't Kill Kittens

Isn’t that reason enough to sort out your targeting strategy?

Edit 16th August 2017 – someone just sent me the below message on Facebook in response to my sharing of this post. Case in point, much? 🙁

Bad Retargeting

Aaand someone else sent me this over Slack as well (the booking he made was for next week, and they’re already trying to get him to book again). Shocking.

More Bad Retargeting

Anyone else have any classic retargeting failures they’d like to share – especially ones that alienated them from the brand in question?

Photo Credit: Simon Launay

How To Future Proof RIGHT NOW For the IoT


This post was originally made on the 4Ps Marketing blog.

This big malarkey about “the Internet of Things” (IoT) just won’t go away, will it? From radiators you control with your phone to fridges with internal cameras so you can check the contents on the move, it seems like everything is connected these days. Alexa and Google Home are all set to start battling it out on the home voice search stage, and it seems like only a matter of time before you’ll be shouting at your loo to get more toilet paper or hollering at the oven to order pizza.

One of the biggest challenges marketers currently face is providing some kind of actionable answer to the big questions everyone is asking – what does this mean for my brand, how can we leverage it, and how can we start future proofing our digital assets?

I could spend the blog equivalent of War & Peace giving my best shot at answering all of those questions (drop me a line for a cuppa and we can talk about it if you’re interested) but what I’m going to focus on right now is the third one – specifically, what you can do right now on your website that will form the first steps of future proofing it against the rise of the voice-driven Internet of Things.

You’ll need a friendly developer (or, failing that, an unfriendly developer and something to bribe them with) and ideally a tech-fluent SEO on hand to get this done. It is worth it though, as an immediate and sometimes surprisingly simple-to-implement form of future-proofing that doesn’t require a multi-million pound technology investment.

That’s right, I’m going to tell you to mark up your website with schema.org again.

Schema? Again?

Right, now that the groaning noises have stopped, let me tell you why you should do this – and specifically do it with JSON-LD script injections rather than microdata. Well, other than the reasons previously covered when I updated my recommendation last year.

Google Home & Alexa Skills Use JSON

The first image is a screencap of an Action for Google Home (from here) and the second is a Skill for Alexa (from here). Notice anything about both of these?

That’s right, they’re both powered by JSON.

We’ve already seen the early beginnings of JSON driving actions on pages as well as simply structuring data to be machine readable – the most obvious example is the sitelink search box markup which allows users to directly interface with your website’s search bar from the Google results page, saving a click. In a future without a conventional results “page” – say, the Internet of Things or a voice search heavy technology ecosystem – it’s easy to see how these sorts of interactions can evolve. What precisely this looks like is still to be determined, but all the signs point to it being written in JSON.
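For reference, this is roughly the shape of that sitelink search box markup when expressed in JSON-LD (the domain and search URL pattern are placeholders – swap in your own site’s details):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "WebSite",
      "url": "http://www.example.com/",
      "potentialAction": {
        "@type": "SearchAction",
        "target": "http://www.example.com/search?q={search_term_string}",
        "query-input": "required name=search_term_string"
      }
    }
    </script>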

Schema.org already has a whole mess of options available for Actions as well as Objects. A lot of them, especially as given in the examples, are rather pedantic and not necessarily of immediate use from a marketer’s perspective, but the point is that the vocabulary is there. It’s quite accessible as well – even I can write JSON scripts, and I haven’t done any formal coding since the FORTRAN 90 module in my undergraduate degree.

So if you’re a brand, get marked up with JSON rather than microdata, and start using this to signpost key actions on your site – from Order Brochure to Add To Basket, or whatever else you can implement. I recommend inline markup where you can; while it is perfectly possible to deploy schema.org using Google Tag Manager and similar systems, there seems to be a marked delay in pickup by crawlers and there’s every possibility that non-Google entities won’t even realise the stuff is there.
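To give a rough idea of what I mean by signposting actions – and this is an illustrative sketch using made-up URLs and product names, not a formally supported Google feature – an “Order Brochure” action could be expressed along these lines:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Kitchen Range",
      "url": "http://www.example.com/kitchens/example-range",
      "potentialAction": {
        "@type": "OrderAction",
        "target": "http://www.example.com/kitchens/example-range/order-brochure"
      }
    }
    </script>

The exact types matter less than the principle: the actions available on the page are spelled out in a way a machine can read.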

If you’re a marketer, and doing anything in the vague region of technical SEO or web development, try and get at least basic reading fluency in JSON scripting. W3Schools is a good starting point, and I would personally recommend Codecademy if you want something more structured towards progressive learning.

Photo Credit: Gian Prosdocimo

Guide To An SEO-Safe Domain Migration


This post was originally made on the 4Ps Marketing Knowledge Base.

There are lots of reasons why a business may need to change website domains during its lifetime. A rebrand is the most common, although there are others – for example going from a .co.uk to a .com to facilitate a single-domain international expansion. At any rate, a domain change is an enormously risky event for a website – the mass change of URLs causes a great deal of upheaval and re-indexing, which can send organic visibility haywire.

It certainly isn’t something to be taken lightly. A mishandled domain migration can destroy a website’s visibility, and organic traffic can take months or even years to recover after a botched domain change. Fortunately we’ve got a checklist at 4Ps that can help your domain switch go smoothly.

PREP YOUR REDIRECTIONS

Before you do anything else, make sure you set up your 301 redirects to go from each old URL to its new equivalent. Just pushing everything on the old domain to the homepage of the new domain will be devastating to visibility – authority needs to be passed on a page-by-page basis in order to preserve it and avoid losses.
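As a minimal sketch – assuming an Apache server with mod_rewrite, and that URL paths stay identical on the new domain (the domain names are placeholders) – a blanket host swap that still maps page to page can look like this:

    RewriteEngine On
    # Send every URL on the old domain to the same path on the new domain
    RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.co\.uk$ [NC]
    RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]

If paths are changing as well as the domain, you’ll need an explicit old-to-new mapping (a redirect map or individual rules) rather than a single pattern like this.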

Ensure that these redirects are tested thoroughly in a development environment – the last thing you want when the big day comes is a pile of redirect chains slamming your site’s speed, or a misconfiguration that means a URL is missed.

UPDATE ALL INTERNAL LINKING

Again in your development environment, ensure that all links and associated elements are properly updated to reflect the new domain. If you’re using relative rather than absolute internal links that saves a job, but it is a good idea to run a crawl to be sure just in case as some sites can end up with a mix.

As well as general hyperlinks, ensure that all linking elements are properly updated to reflect the new domain. XML sitemaps, rel="alternate" hreflang annotations, rel="canonical" tags, rel="amphtml" links, Open Graph, Twitter Cards, structured data – all these sorts of elements must be checked and updated to reflect the final absolute URLs that will be on the site after the domain changes. Not doing so will result in broken links and can cause other difficulties with organic visibility if any markup ends up invalid.
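As an illustration of the sort of elements to check (all URLs here are placeholders), a page on the new domain should end up with fully updated absolute references along these lines:

    <link rel="canonical" href="https://www.new-domain.com/category/product-page" />
    <link rel="alternate" hreflang="en-gb" href="https://www.new-domain.com/category/product-page" />
    <link rel="alternate" hreflang="en-us" href="https://www.new-domain.com/us/category/product-page" />
    <link rel="amphtml" href="https://www.new-domain.com/category/product-page/amp" />
    <meta property="og:url" content="https://www.new-domain.com/category/product-page" />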

PREP MEASUREMENT TOOLS

Setting up query visibility measurement well before the domain switch takes place is always a good idea, so you have some kind of benchmark for the old domain before the change. Ensure your analytics is set up appropriately so that all traffic can be tracked and attributed correctly.

Another excellent measurement tool is Google Search Console (and Bing Webmaster Tools, plus any international variants). The best way to handle this setup is to have both the old and new domains validated at the same time, ready for the switch. You can do this relatively easily in a variety of ways, and having both domains verified will allow you to monitor impression levels for both.

Naturally the optimal result will be organic impressions for the old domain decreasing at about the same rate that they increase for the new one, indicating a smooth transition. Ensuring these tools are enabled and verified in advance will ensure that the right data is available to diagnose any issues as they arise – missed redirects, dropped URLs and other such problems are all much more easily corrected if they can be clearly identified.

THE BIG DAY

Update all analytics and PPC (and any other external tools or platforms) as soon as possible after the domain switch, if it is not appropriate to do so beforehand. When making the DNS switch, it is worth reducing your TTL (Time To Live) to speed up the migration and help propagate the new domain as rapidly as possible.
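As a quick illustration of the TTL point (a BIND-style zone file sketch with illustrative values only), dropping the TTL on the relevant records a day or two before the switch means cached DNS answers expire quickly:

    ; lower the TTL (here 300 seconds) ahead of the move so changes propagate fast
    www    300    IN    A    203.0.113.10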

Once the new domain is live, run a full redirect check and then an internal crawl to check internal links. Ensure any problems are corrected as soon as possible – there’s no such thing as “too soon” for fixing issues. The next thing you should do is register the change of address in Google Search Console (and all other webmaster tools accounts). Once this is done, resubmit your XML sitemap (or sitemaps, or sitemap index file). I also like to manually submit the homepage to the index, just as an extra nudge to get the crawlers going.

MONITOR AND WAIT

Expect at least six weeks of ranking and visibility fluctuations while the new domain settles in. Sometimes this can last up to two months or even longer, depending on how well the migration is handled and how smoothly the redirects go in.

It is also worth noting that in some circumstances, when migrations go well, it is entirely possible for a site to experience few to no fluctuations at all – but for the purposes of managing expectations internally it is generally recommended to prepare for the worst, and to be pleasantly surprised if problems do not arise!

FURTHER ADVICE

If you’ve got a domain change or similar high-SEO-risk event on the horizon for your brand, I’ve got tons of experience in website migrations of all shapes and sizes. Give me a shout, or take a look at the related guides on my agency’s website.

Recommendation Update: From Microdata To JSON


This post was originally made on the 4Ps Marketing blog.

I’m all about the structured data markup over here. In a world increasingly driven not just by users searching for immediate answers, but by artificial intelligences like Siri and Cortana performing searches on behalf of their users, the ability to have rich information easily machine readable and machine processable is of increasingly vital importance.

Although the official line from Google (and others) continues to be that structured markup has no direct “ranking impact” and is not used as a “ranking factor” (although rumours continue to circulate that it will be part of the algorithm one day), evidence continues to pile up in favour of its implementation as an SEO consideration. Studies like this 2014 one from SearchMetrics consistently show a correlation between the use of structured data and strong organic website performance, and Google themselves say that use of the markup helps increase the chance that you’ll appear favourably in rich snippets and enhanced listing types in organic SERPs – things like Knowledge Graph cards and rich answers. As these result types often pip conventional results to the top of the SERP, that’s a pretty powerful message for the potential exposure of your brand.

What Is Structured Markup?

In its simplest terms, structured or semantic markup turns a webpage from a bunch of text and images into a set of things that each have their own properties. Rather than relying on search robots and algorithms understanding the concept of a pair of shoes as a product for sale (rather than a webpage that contains lots of text about shoes and a £ symbol), for example, structured data lets us explicitly state this in the form of additional markup in the source code of the page. This means that machines reading the page (including search robots and other “parsing” entities like Siri, Cortana et al) don’t need to work as hard to understand what it is about – they can see all the information and attributes laid out in a way they can nicely understand.

As we all (should) know, a search bot that doesn’t need to work as hard is a happy bot. Happy bots generally mean visibility boosts, one way or another! So think of structured markup as the bot equivalent of tasty chocolate…

Happy Robot

Implementing Structured Markup

In the past, the generally accepted and most widely used way of implementing structured markup has been microdata. This is generally simple to implement into existing HTML templates, as it essentially just adds a bunch more attributes (in the form of itemscopes and itemprops) to your pages.

Microdata Markup Example
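By way of a simplified illustration (the product and price are made up), a microdata implementation weaves itemscope and itemprop attributes through the existing HTML like this:

    <div itemscope itemtype="http://schema.org/Product">
      <h1 itemprop="name">Example Running Shoe</h1>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <meta itemprop="priceCurrency" content="GBP" />
        <span itemprop="price" content="59.99">£59.99</span>
      </div>
    </div>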

This approach can be tricky depending on how complex your templates are: a bit of jiggery pokery is often required to get things to nest correctly, and you frequently need to include a bunch of meta tags to make sure all the needed attributes are present and in the correct format. It works fine most of the time though, and there are often plugins available for the big, mainstream content management systems which make life easier.

Switching To JSON-LD

JSON-LD, or JavaScript Object Notation for Linked Data (which is what its mother calls it when she’s angry with it), is essentially a way of encoding structured data – including schema.org markup – using the JSON data interchange format. That’s a posh way of saying, in very simple terms, that it lets you put your structured data bits and bobs into a script element that sits independently of the existing HTML template of your website.

JSON Markup Example
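Here’s the same simplified product example expressed in JSON-LD – note that it all sits in its own script element rather than being threaded through the page template:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Running Shoe",
      "offers": {
        "@type": "Offer",
        "price": "59.99",
        "priceCurrency": "GBP"
      }
    }
    </script>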

There are a few advantages to implementing structured data with JSON-LD rather than microdata:

  • It keeps the structured markup independent of your template layout, so if your site’s page templates change you won’t need to redo the markup each time because your nesting breaks or the entire thing starts producing errors
  • It is much easier and more efficient to mark up complex nested objects and concepts in JSON than it is microdata, so you can implement more comprehensive markup to take advantage of more potential opportunities
  • You can pull fields and properties directly from your content management system without needing to play about with meta itemprops and other formatting pain
  • You can even (with the right analytics consultant to help) link up your JSON deployment with your data layer to start tracking properties as custom dimensions in one neat package

Most importantly though – and the number one reason I’ve formally switched to recommending that all clients and prospects use the JSON-LD markup method from now on – Google itself has moved from an ambiguous “we support any schema.org format” position to specifically recommending the use of JSON-LD above the other markup techniques.

Google JSON Recommendation

We’ve already seen evidence in the past that Google has been more inclined to properly pick up and parse JSON implementations of things like the sitelink search markup, so this isn’t particularly shocking, but with the new advice on things like Rich Cards and the change to Google’s developer documentation on the subject – not to mention the pain of having to rebuild the microdata spec every time a template is tweaked slightly – I decided it was time for a formal change of recommendation.

There’s no evidence (thus far) that Google is going to stop supporting microdata implementations of structured markup, but this is one of the rare cases where the search giant has provided not only general guidelines but a clear preference for a particular implementation. So in the name of future proofing, not to mention an easier life, I’d suggest the following:

  1. If you haven’t implemented any structured data on your site yet (and why not?) then make sure you get this into your technical roadmap soon, and use JSON-LD when you do.
  2. If you’ve got structured data implemented already using JSON-LD, run it through the revamped structured data testing tool to be safe – Google has tweaked and expanded some of its own specifications, such as for Articles and Recipes, in ways that diverge slightly from the core specs on schema.org, so it is worth checking.
  3. If you’ve got structured data implemented on your site using microdata or RDFa, don’t panic! Google does still support these, but you’ll probably want to look at getting a revamp in JSON-LD into your technical roadmap within the next 12-18 months to be safe (and you can use this as a good time to overhaul the markup to make sure you’re taking advantage of all your opportunities, too).
  4. If you’ve got structured data implemented on your site using some other vocabulary than schema.org, such as microformats, you’re running a risk of both losing any benefits you still have (as Google has firmly thrown itself into the schema.org camp) and are probably missing opportunities too, so make the switch to schema.org (with JSON-LD implementation) as soon as you can.

To find out more about the benefits of schema.org markup for your search marketing and brand promotion efforts, take a look at the newly refreshed Google documentation which is a good read for both developers and less technical marketing types.

SEO In 2017: The Crystal Ball Predictions


This post was originally made on the 4Ps Marketing blog.

After the ever-saintly Ben Davis featured my thoughts on AMP in the usual seasonal Econsultancy roundup it seemed like a good idea to get the rest of my thoughts down on e-paper before getting swamped in keeping the pugs from eating all the tinsel and getting the cat away from the wrapping paper. As is normal, of course, I’ve only just had a chance to sit down and actually start thinking about the next (and last) twelve months with an appropriate level of depth, so here we go…

Real Time Penguin

Google finally launching real time Penguin was a huge highlight of 2016 for me – better late than never, and so on. I’m also very pleased about Google’s refreshed approach to this – ignoring rather than immediately penalising bad links – as this very much diminishes the nastiness potential of “negative SEO” and seems like an adoption of a “more carrot than stick” kind of attitude towards webmasters on Google’s side.

Voice Searching

The mainstream launches of the “home PA” systems like Amazon Alexa and Google Home are very interesting and seem like they might finally propel voice search into the minds of mainstream consumers and brands. This idea of bringing search into an always-on state is a natural evolution of device proliferation, but we’ll be watching very curiously to see how it starts to shake up user interaction with digital, especially buying patterns. It’ll also be interesting to see how paid advertising starts to rear its head on voice platforms without fundamentally damaging the user experience on them.

The implications for longer and more specific searches, which tend to occur more naturally in voice queries, are also going to have knock-on effects on everything from content structure to search market research. Alex Smith, on the Food & Leisure team at my agency, commented:

“What I’d like to see is the evolution of a keyword planner-esque tool that can work on phrase match or uses machine learning to handle the longer tail, more semantic and context-driven queries.”

Whoever can start providing these sorts of datasets, given the increasingly vicious throttle that Google is putting on its own search data and the lack of granularity in Adwords tools for things like device split and media search method (voice vs type vs image etc), is likely to make a lot of forward-thinking marketers very happy people. Platform providers of the world, take note!

UX Integration

Something I’ve started to notice with quite a few clients this year is the (very welcome!) development of marketing teams starting to take a real interest in their site’s performance in terms of user experience rather than just the bottom line. Site speed, especially, has far too often in the past been written off as “a problem for the IT/web guys”, so I’m bloody glad to start seeing some decline in this silo-ised thinking. Analytics, data, marketing and customer journeys finally seem to be getting joined up in brand thinking, so although there’s still a long way to go for most businesses this is a great step forward – here’s hoping it sees deeper adoption in 2017 and beyond.

VR, Bots And New Search Touchpoints

There’s been a lot of buzz around the tech and digital industries about virtual reality and the rise of chatbots as interaction tools for brands. VR is a big unknown for search at the moment as it is (of course) more experientially focused but Nick Shread, my fellow Kent resident and colleague at 4Ps who heads up the third sector team, notes

“I’m wondering how VR will impact search. Searches performed from within games or experience playbacks perhaps? Maybe a “find me something related to this” sort of prompted discovery angle that ties into results across other devices?”

There’s a wide open field for experimentation here of course – watch this space! Chatbots and similar machine learning or AI-driven tools are already starting to make waves in the search space though, especially on mobile. Google set the precedent with their own machine learning driven RankBrain, of course, but there’s an increasing trend of AI-type entities making searches on behalf of human users, rather than the human user undertaking the search themselves.

That’s all the average chatbot does if you dig deep enough under the bonnet of the technology (at the risk of oversimplifying a very cool and complex field), and you only need to look at the typical behaviour of digital assistants like Cortana and Alexa to get some cool ideas of where this could be going. Matt Stannard, my usual partner in predictive crime, and I predicted this back in 2015, so we’re applying for our licensed digital sector psychic badges this year. Matt (who serves as the Innovation Director at 4Ps when he isn’t building mad-scientist type analytics gadgets) also comments that he thinks search methods other than “words” are going to start rising soon too, as things like image and sound recognition keep developing.

“What about searching by proxy? Cortana, find me something that looks like this, with “this” being an image, or phrase, or sound, or smell…or even a feel. Haptic interfaces are going to start showing up sooner or later!”

On the Google front, for 2017 the only thing I’m reasonably confident of myself is that AMP is going to get bigger before it goes away, despite some signs of rising controversy over its potential user benefits and current implementation form. Google is pushing it immensely hard and it seems to be only a matter of time before it extends to full capability deployment in new verticals like eCommerce. This will be particularly interesting as and when the mobile-first organic index gets rolled out, as despite Google’s claims that they’re aiming for a “low delta” I suspect that non-responsive sites are going to see some big shifts in visibility if they don’t get their content and markup synced up.

How do you see the organic search landscape shifting in 2017 and beyond? How will new technology and potential touchpoints start disrupting the way brands need to present their content to users? How will measurement, analytics and data struggle or stride ahead to keep up? I’m always up for a coffee and a geek out so drop a line to 4Ps for a chat or hit me up direct and let’s talk.

A Robots.txt Guide For SEOs


This post was originally made on the 4Ps Marketing Knowledge Base.

Every SEO should know their way around the core principles of a robots.txt file. It is the first thing a crawler looks for when it hits a subdomain so getting the basics (and the not-so-basics) spot on is important to ensure you don’t end up with pages showing ineffectually in search results or just dropping out of them altogether.

Robots.txt Location

Your robots.txt file must sit at the root of your subdomain. No negotiation here. What actually happens is that the crawler strips out the path from the URL (everything after the first forward slash), but in practical terms this means your robots.txt file should sit at the root, for example:

  • http://www.website1.com/robots.txt
  • http://website2.com/robots.txt
  • http://place.website3.com/robots.txt

Put it anywhere else, and crawlers won’t find it, which means you effectively have no robots.txt file on your site. That means, incidentally, that bots will assume they can access everything and so will just go berserk and crawl every inch of the site they can get to. This might be perfectly fine if you have a smaller website – but it can be very risky SEO-wise on a large catalogue or enterprise site where you want to more carefully control crawler behaviour to make sure things are indexed to best effect.

The Basics

You can create a robots.txt file in any basic text editor, up to and including Notepad. A very basic robots.txt file will look something like this:
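(The domain here is just a placeholder – point the sitemap line at your own XML sitemap or sitemap index file.)

    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml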

The first line uses a wildcard * to mean “any user agent” (so “any robot”), the disallow being blank means nothing on the site is disallowed from crawling, and the sitemap line specifies the location of the XML sitemap (or sitemap index file) for the website so the bot can hop onto it and start indexing from that list. Keeps things nice and efficient!

If you want to stop all bots from indexing content within certain folders – say, an area only accessible to logged-in users, that’s pretty simple to do.
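For example, something along these lines (the folder names here are hypothetical):

    User-agent: *
    Disallow: /members/
    Disallow: /my-account/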

You can also keep robots out from a single page or file if you want.
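Again with a made-up path, that just means disallowing the specific URL:

    User-agent: *
    Disallow: /downloads/internal-price-list.pdf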

Important Notes On Robots.txt Blocking

It is important to note that blocking things in robots.txt does not prevent them from appearing in search engine results pages altogether. What you may end up seeing in a SERP might well be something like this:

Robots Blocked SERP

Now for most things this may actually be fine. User areas or invoice templates and so forth – you’re probably not too worried about outlier cases where they show up like this, as long as their full content isn’t being indexed and ranked organically.

In some cases, however, brands may be more sensitive to certain URLs or files and want to ensure they will never show up in a search engine in any shape or form. If this is the case, it is vitally important to ensure that these files are not blocked in robots.txt – the bot will need to crawl the asset thoroughly, not just “ping the URL,” so it can see the robots meta noindex tag or x-robots noindex HTTP header.
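By way of illustration, the meta tag version sits in the page’s head:

    <meta name="robots" content="noindex" />

while the HTTP header version can be set at server level – handy for non-HTML assets like PDFs (this sketch assumes Apache with mod_headers enabled, and the file name is made up):

    <Files "confidential-report.pdf">
      Header set X-Robots-Tag "noindex"
    </Files>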

It is also critical not to block assets in robots.txt that are needed to render pages in a browser. In the past many developers would mass block things like scripts or CSS folders, but doing this now will result in a grumpy message from Google in Search Console and can have a direct negative impact on your organic visibility levels (Google announced this change in 2014).

Other Important Notes

There are plenty of other elements you might need to know about a robots.txt file. Keep an eye out for some of the following:

  • Crawl delays. These were used back in the day to throttle robot access. There’s no reason to have them in a modern setup – and Google ignores crawl delay rules anyway.
  • Pattern matching. Both Google and Bing robots will honour rules that make use of * (a wildcard, meaning “any sequence of characters”) and/or $ (which matches the end of a URL).
  • The robots.txt file is case sensitive in all senses – don’t call it robots.TXT, for example, and make sure any rules you put in are case matched to the URLs required.
  • Only one URL rule can go per line. Three file or folder disallows, for example, must go on three lines.
  • Processing order for rules is important! Google and Bing robots both make use of the “most specific rule first” principle, while standard processing order is top to bottom. If in doubt, put any Allows above any Disallows (for example, Allow a file in a directory before you Disallow the entire directory in order to achieve a “disallow everything in this directory except this file” effect – see the sketch after this list).
  • Avoid blocking files in robots.txt when you should be using other techniques. Some of the most common problems we see include blocking mobile websites from non-mobile bots or using robots.txt to block duplication caused by internal architecture problems. Make sure you address situations like this with search engine recommended solutions, not just by throwing robots.txt rules in!
  • You can add comments (human but not machine-readable notes) to robots.txt files by using # at the beginning of a line.
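To illustrate the processing order and pattern matching points together (all the paths here are made up):

    User-agent: *
    # Most specific rule first: allow a single file, then disallow the rest of its directory
    Allow: /downloads/catalogue.pdf
    Disallow: /downloads/
    # Pattern matching: * matches any run of characters, $ matches the end of the URL
    Disallow: /*?print=true
    Disallow: /*.xls$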

Remember that while the robots.txt standard is a directive, it is not enforceable. Naughty and malicious bots and crawlers will generally ignore it altogether in favour of whatever they want from your site. Be aware too that the robots.txt file is always public – anyone can see it by going to the /robots.txt URL on your site! Definitely don’t rely on the robots.txt file to keep secure areas of your site “hidden” or “safe” – use the appropriate encryption and login protocols.

Now Is The Time To Migrate To HTTPS


This post was originally made on the 4Ps Marketing blog.

I’m really not a fan of jumping on bandwagons at the drop of a hat. In the fast moving digital world, flavours of the month come and go far more rapidly than they can be realistically evaluated for genuine business impacts. While early adoption can have its merits, diving feet-first into potentially high risk scenarios without a clear picture of the payoffs is not advisable for brands operating in a massively competitive space.

That’s why I was cautious about the HTTPS “SEO signal” announced back in 2014, which sent parts of the search industry into uproar. It was a very minor SEO positive, but the transition was immensely risky as it (then) involved a full change-of-address, site-move-style migration. Sites that handled the migration wrong ran horrible SEO risks, and more than one ended up severely damaging their organic visibility in a way that far outweighed the minor benefit of having all their URLs on HTTPS.

I was therefore pretty “meh” about the whole thing.

Over the last two years, this has gradually refined into a more judicious “if you’re doing a big migration or relaunch anyway, let’s get onto HTTPS at the same time” with caveats around implementation and site speed implications. The SEO signal aspect of HTTPS remains minor at best, but Google is pushing hard to get the whole web secured and other factors started to come into serious consideration.

This is why, as of now, I’m officially revising our recommendation. It has been two years since the signal was introduced and Google shows no signs of letting up on HTTPS. There’s been confirmation that as of January 2017, the Chrome browser will start flagging ordinary HTTP pages as “not secure” if they collect any passwords or card details. Google actually say that this is part of “a long-term plan to mark all HTTP sites as non-secure” in a way that is very obvious to users.

Eventual HTTP Treatment In Chrome

Next steps could be even more drastic – some current (unsubstantiated) rumours include Chrome not loading pages with mixed protocols, or even that the browser will stop rendering non-secure sites altogether. With Chrome usage accounting for comfortably over 55% of monthly browser market share, this is no longer something that brands can sit back on.

To help, I’ve pulled together a checklist for migrating to HTTPS with minimal organic visibility impact. If you’ve got concerns or want help managing this process, give me a shout here or via 4Ps.