Growing Your Social Audience Organically (Part 1: Facebook & Twitter)


This post was also made on Rebelhack.com.

Wouldn’t it be lovely if we still lived in an age where all you needed to reach excited audiences on social channels was a barrel of enthusiasm and a half-decent wit manning Twitter? Alas those days are long past, but it is still very possible – and indeed very encouraged – to grow your social audience in an organic way before you start pouring your hard-won funding into community building and post amplification.

First things first – don’t get stuck on the vanity metrics. If you’re boosting engagement and visibility then natural followers will naturally – er – follow. So whistling innocently while wandering over to BuyLotsOfFollowersTotallyLegitHonest.com (or whatever) is a) pointless and b) likely to get you penalised by the social network in question, meaning you have to start all over again and do it properly. Focus on posting stuff that is relevant to your brand and that people like, and they’ll start following you. But remember that engagement is what you’re after, especially very early in the game.

Social Following Size Isn't Everything

Of course this post could go on for a year and a day if we let it, so for now let’s focus on the big grand-daddies of the social world, the original boys that got this “social media” thing moving. Well, unless you’re a MySpace hipster, presumably.

Tips For Organic Growth On Twitter

First and foremost, find the sweet spot for number of tweets. I’d recommend a minimum of five times a day and potentially more, depending on competitor activity and engagement. Watch for timing! Every audience reaches peak engagement at different times of day, from the morning commute to the afternoon boredom lull. Start off with evenly spaced tweets and then scrutinise your engagement analytics to see which hours to lean into more (sadly Twitter took the time of day analytics off their in-platform package but if you drill down and poke around you can still get the necessary insights).
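That analytics digging can be as simple as a small script over your exported tweet data. Here is a minimal sketch – the column names and timestamp format are assumptions on my part, so match them to whatever your export actually contains:

```python
# Sketch: find your best posting hours from exported tweet activity data.
# Assumed columns: "time" (ISO timestamp), "impressions", "engagements" --
# a real run would feed rows from csv.DictReader over the exported file.
from collections import defaultdict
from datetime import datetime

def engagement_by_hour(rows):
    """Return {hour: engagement rate} from an iterable of dict rows."""
    impressions = defaultdict(int)
    engagements = defaultdict(int)
    for row in rows:
        hour = datetime.fromisoformat(row["time"]).hour
        impressions[hour] += int(row["impressions"])
        engagements[hour] += int(row["engagements"])
    return {h: engagements[h] / impressions[h] for h in impressions if impressions[h]}

def best_hours(rates, top=3):
    """Hours sorted by engagement rate, best first."""
    return sorted(rates, key=rates.get, reverse=True)[:top]

if __name__ == "__main__":
    sample = [
        {"time": "2018-03-05T08:15:00", "impressions": "1200", "engagements": "96"},
        {"time": "2018-03-05T13:02:00", "impressions": "900", "engagements": "18"},
        {"time": "2018-03-06T08:40:00", "impressions": "800", "engagements": "72"},
    ]
    rates = engagement_by_hour(sample)
    for h in best_hours(rates):
        print(f"{h:02d}:00 - engagement rate {rates[h]:.2%}")
```

Aggregating by hour across a few weeks of tweets smooths out one-off flukes; once a couple of hours clearly lead, lean your schedule into them.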

Twitter Analytics

Appear active. This means don’t just post some press article about yourself once a day – follow those in your industry, retweet their content, tweet out a cool stat you just heard…the possibilities are more or less endless in the very short-form space.

There are tons of social post management tools out there, including free versions – or just use TweetDeck – so you don’t have to set an alarm to post badger gifs at 3am by hand, if it turns out that your core audience has a real late-night Mustelidae fetish.

3am Badger Gifs

Be visual! Gifs, memes, silly visuals, mini-infographics, videos, whatever – it’s all good. This type of content is nearly always more highly engaged with, plus it makes your timeline look all sexy and exciting which encourages new viewers to sign up and turn into followers.

Embrace the almighty hashtag. Find the daily tweets like #MondayMotivation or #ThrowbackThursday (or any of literally hundreds more) and get involved with them. Don’t make use of branded hashtags too early; you’ll look egocentric and a teensy bit delusional. Focus on what is trending, and use the free versions of tools like Hashtagify or RiteTag to help you decide on which are the best tags to use. Stick to 2-3 hashtags per tweet maximum, though; overloading is spammy and will put users off.

The Almighty Hashtag

Tips To Grow Facebook Organically

Don’t think of your page as a broadcast channel – Facebook works much better as an interactive platform, and it gives people a good reason to want to hit that Like button. Run polls, surveys, post open discussions, make it a destination worth coming back to. They do call it social media, after all.

Actual social skills. Get them.

Watch for the sweet spot (again) with both post time and post numbers. A lot of users dislike brands that are too active on Facebook and clutter up their feeds, so as little as once a day can often be just right. Study the in-platform analytics to get an idea of when your posts get the most engagements, but as a starting rule of thumb try your local lunchtime. However, be consistent; while less can be more on Facebook, that doesn’t mean you should leave your page to gather dust.

Multimedia! Again! One worthy addendum: don’t embed from other video systems, as Facebook gives a little wee algorithmic boost to videos that are uploaded to its own platform. So that’s a no-no to just sharing YouTube links, if you want to make the most of your reach. You can sling hashtags onto Facebook too – but stick to one a post and only if the tag is already part of the text you’re sharing, otherwise it just looks like you accidentally C&Ped something from Twitter, leading to the heartbreak of sideways giggles behind the bleachers from your peers in online community management.

Social media mockery

Once you’ve got a fair handle on what does and doesn’t work organically, Facebook really rewards the idea of content promotion using boosted posts. Don’t do this for everything though – take your top organic performers and beef them up with a little extra budget to see them really take off. Advertising on Facebook can eat up your pennies (and pounds) but when done with a gentle and judicious hand it can also act to extend the overall reach of your best content.
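Shortlisting those top performers is easy to make systematic. A hedged sketch, assuming hypothetical post data with reach and engagement counts (the field names are mine, not Facebook’s):

```python
# Sketch: shortlist top organic performers as boost candidates.
# The post data shape is hypothetical -- adapt it to your page export.

def boost_candidates(posts, top=3, min_engagement_rate=0.05):
    """Return up to `top` posts whose organic engagement rate clears the bar.

    posts: list of dicts with "name", "reach" and "engagements" keys.
    """
    # Compute each post's engagement rate, skipping zero-reach posts.
    rated = [
        {**p, "rate": p["engagements"] / p["reach"]}
        for p in posts if p["reach"]
    ]
    # Only boost posts that already proved themselves organically.
    rated = [p for p in rated if p["rate"] >= min_engagement_rate]
    return sorted(rated, key=lambda p: p["rate"], reverse=True)[:top]
```

The threshold is the important part: a post that never earned organic engagement rarely becomes a winner just because you put money behind it.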

Right, that’s the Old Two down. Next time we’ll go a little more media focused to talk Instagram and Pinterest. Cameras at the ready.

If you can’t wait until the next time I have a chance to scribble down a blog, get in touch and let’s chat organic growth for your whole business, not just those little ego-boosting numbers on the social profiles. Nice to have, and undeniably valuable in the long term, but those fellas won’t pay the bills, right?

 

Pay the bills

Getting Your Organic Growth Into The Fast Lane


This post was also made on Rebelhack.com.

At least once a week someone asks me if SEO is dead yet, like it is some kind of terminal patient on life support. The same person normally then ends up running away as I go into my Number One Rant™ on the subject, which is (in a nutshell) to please stop thinking of SEO as some single-aspect doohickey or flavour-of-the-week tactic you can deploy to magically increase the revenue on your website. If you’re asking this question, you don’t know what SEO is. But that’s okay, because I do, and that’s what I’m here for.

SEO advice, that is, not ranting. The ranting is a free bonus.

To quote a certain wise old woman, SEO most definitely aten’t dead, but like most things that aren’t six feet under it is developing and evolving. Where it used to be the hip, edgy kid hanging out in its mum’s basement and smoking ciggies while listening to weird experimental music, SEO has now grown up, moved out to the suburbs and got itself a real job.

SEO Doesn’t Work By Itself

Now here when I say “SEO” I’m referring to the more “traditional” workflows – technical fixes and tweaks, basic content optimisation and the like. Long gone are the days when this sort of thing alone could turn your organic performance into some kind of 80s rock-pumping powerhouse a la Rocky, storming up hills in one glorious montage. Now, occasionally you may get lucky with a particular win such as streamlining your site’s information architecture or sorting out some mass duplication issue, but that’s rare in the modern web.

This “technical SEO” stuff is still essential but it needs to be viewed more as a hygiene discipline, like browser compatibility testing. Will it necessarily kill you not to do it properly? Well, no, unless you really cock everything up, but you’re going to be at an immense disadvantage and essentially starting the race hobbled if you don’t get the house in order pretty darn quick.

TL;DR: should I still do core/technical SEO? Hecking yes, but don’t expect it to explode your growth overnight like pulling the stopper on some self-inflating raft.

Organic Traffic Ain’t Free

If I had a nickel for every time someone has stormed up and said they want to “do SEO instead of PPC to save money” then I most definitely would be off writing more smutty fanfiction rather than this blog. SEO is not cheap, not easy and most of all definitely not free. It offers a different pattern of ROI is all – organic marketing tends to pay off longer term, and (provided you do it right) builds on its own success over time.

Does that mean SEO is better than PPC? After all, why in god’s name should you pay for x clicks a month, every month, when if you invest in SEO instead you could be paying much less per click?

Well firstly (you sausage), most search/digital ecosystems these days run on ads – not just Google but also Amazon, eBay, Bing and any social network you care to name – so if you’re not looking into all possible channels regardless of click payment status then you’re potentially missing a ginormous segment of your audience.

Secondly, organic acquisition is just as competitive (if not more so) than paid, so here’s a mad idea – how about using both to get the best of each? AdWords, for example, is fantastic for testing new query areas rapidly. A few hundred or so quid for a week’s limited PPC test on a new query bloc to see if it gets any traction is a heck of a lot cheaper than investing thousands over many months into organic positioning for the same bloc and then finding out it just doesn’t work for your audience for whatever reason.

Can you get efficiencies by marrying up paid media and organic marketing? Yes, and of course that’ll naturally lead to you saving a few bob, but my advice is to try them both together, test with dialling one back or the other up, assume nothing, and for the love of little pug-dogs do remember that your users are going to have multiple touchpoints with your brand – organic and paid, more often than not – before they turn from a curious click into a converting customer.

UX Is Non-Denominational Royalty

It’s a very valid point that rather a lot of search marketing ends up being a form of “user experience optimisation for search engines.” You don’t optimise your metadata or ad copy for giggles, you do it so it is more helpful/attractive to users so that more of them will click on it. This has never been more true than now as we edge towards the 2020s, with Google firmly steering the “UX as SEO” train out of Hypothetical Station towards Bloody Well Do It-Ville.

Everything from site speed and non-intrusive interstitials is on the organic performance radar now, not to mention site security, overly intrusive apps, sensibly-sized tap targets and all other kinds of shenanigans. Pay attention to your data and pounce on any friction points you spot as soon as possible so the whole brand experience for users is as slick as can possibly be.

That’s right kids, I said whole brand experience. No good having the universe’s best website and/or app if your customer service is awful, your logistics chain is broken or the actual doohickey you’re offering is crap on toast. Word of mouth is still a thing, digitally as well as literally, so ensure you pay attention to the full lifecycle of all your users, existing as well as new customers, to keep those punters loving your brand and coming back for more. There’s more to LTV than the colour of the upsell buttons, amirite?

Escape That Google Box

I’ve been saying this for years at my old agency but I’m saying it again at Rebelhack, as loud if not louder. Where do your users go for stuff? Not just the physical “click to buy” or “convert to lead” stuff, but the research, the informational “wat dis,” the “x vs y” or whatever – the whole of their journey which might start with a minor or idle need but (ideally) winds up with them as a happy and loyal customer of your brand.

Selling some sort of physical doodad? Got an Amazon or eBay store? Increasingly vast numbers of shoppers start their search for products on those platforms rather than from Google. Offering the best alternative to Service Q in the market? Are you getting press coverage and mentions in the right professional magazines and blogs, chatting people up in the right LinkedIn discussions, schmoozing it at the right offline events?

Please, please, please stop asking about “getting to the top of Google” and start asking where the various sensory organs of your users are engaged. Yes, I said various – remember voice search is still on the rise so it’s ears now, not just eyes. These are the places you need to be at the right time with the right content, be that an ad, creative, PR piece, brand name-drop, sales dude on the ground or a good old-fashioned search engine results page.

Brand Drives Performance; SEO Makes The Most Of It

If you’re asking for organic growth marketing ideas, try not to gasp in astonishment when you get a ton of suggestions back that talk about PR (digital or conventional), outreach, social, content and other brand related activity with maybe a teeny footnote on the tech SEO side.

We’re here trying to make the car that is your business go faster, right? So think of technical SEO like tuning the engine – it gives efficiencies, improves power output, improves durability, but at the end of the day if you’ve got the best darn tuned engine in your car the damned thing still isn’t going anywhere unless you put some fuel in it. That’s your brand equity, from your reviews on Facebook to your press coverage, allied influencers, social buzz, cool content, funny webcomics and (of course) that oh-so-shareable video of your CEO dressed as a giant carrot while handing out free popcorn outside Twickenham. That’s what is going to get this motor really running.

Organic Marketing @ Rebelhack

Hey Ruth, didn’t you used to regularly rant on an agency site as well as here? Why yes, kind and gentle reader (with slightly damaged metaphorical eardrums, if you’ve been following me more than ten minutes), indeed I did. I was on a three week mini sabbatical at the tail end of March but this month I started my shiny new jobsicle as Head of Organic at Rebelhack. The job was going to be called Head of SEO. Who was paying attention to what they just read and now understands why (hands in the air, please)?

Am I sad to have left the unstoppable rocket ship/freight train/other large object with a lot of momentum that is 4Ps Marketing, part of Artefact? Of course I am. If you’re a big household name brand (or watercooler-name brand, if you’re in the B2B space) looking to get to grips with the future of this digital malarkey, for god’s sake stop reading this and plan a coffee with some of the dudes over there. Tell Matt I sent you. And remind him of the differences between AI and machine learning, because sometimes he gets over-excited and mixes them up. You can tell him I told you to say that, too.

But I am also super-stoked to be getting under the skin of the awesome businesses working with Rebelhack, learning the intricacies of early-stage growth modelling with Logan (who is also quite often over-excitable – clearly when it comes to CEOs I have a type) and helping a whole shedload of cool, new and exciting firms disrupt their industries with fresh ideas and the kind of bolshie enthusiasm you normally only find in a herd of puppies.

We’re already making changes to super-turbo-charge organic marketing at Rebelhack. We’re hiring our first dedicated in-house PR and outreach person (hit me up if you think this could be you, incidentally) as well as bringing in a new, sexified version of our in-house tech platform and overhauling our reporting to make it even easier for our partner clients to get to the bones of their growth marketing. We recently moved into our shiny new office in the middle of Hatton Garden; it has our name on the door and everything. We’ve even figured out how to work the coffee machine.

So strap in and hang on, folks, because it’s going to be one heck of a ride. If you fancy a cuppa before boarding, hit me up and let’s talk about how we can turn your fledgling business into a growth superstar.

Optimisation Around User Intent for Voice Search


This post was originally made on the 4Ps Marketing blog.

Prior to Christmas, Google announced that they’ve published a specific set of search quality evaluation guidelines for Google Assistant and Voice Search in order to give more insight into how its quality process works. The resulting PDF is rather short but extremely useful for anyone interested in this type of technology, from developers and academics to marketers seeking a better figurative footprint in the eyes-free search results.

It’s All About Intent

The document references the existing Actions & Answers criteria for the Needs Met scale in the general search quality evaluation guidelines, so this isn’t particularly surprising. Google has always worked closely in line with user intent, but it’s particularly interesting to see them work on guidelines for voice action apps.

Intent Understanding Determines Satisfaction

The examples for playing music rely on a human-level of common sense to determine how successful a response is. For example, when a user requests “play jazz” the app should play a playlist rather than a single song.

Even more conventional queries get this sort of treatment, for example when someone asks the name of the US president, they will want the current head of state. Or, if someone asks about the weather for the coming weekend, they’ll want to see a daily forecast for the whole weekend rather than just a general commentary.

Here Comes the Knowledge Vault

“Play Mumford & Sons reminder” was a query that particularly caught my attention as a testament to the sheer data parsing capabilities of the famous Google Knowledge Vault.

Not being a huge Mumford & Sons fan, my initial thought around the intent was that someone wanted to set a song from the band as a reminder.

People with better (or worse) taste in music will, of course, understand that this query is asking for a specific track to play – so much for my human perspective on the query – while the example response given just opens a calendar app to set a reminder.

The reason I like this example is twofold. Firstly, it turns on its head the very old-fashioned idea of “latch onto the most obvious keyword” – here “Reminder” is the name of a song, not the calendar-style activity. Secondly, it relies on the kind of advanced semantic parsing capability that is generally native to Google itself, via its Knowledge Vault technology.

What Does This Mean for Me?

Unless you’re a developer, probably not an awful lot right now – but quite a few brands are now starting to look seriously at the idea of integrating themselves into technologies such as Google Home and Alexa with custom skills. In my opinion, all brands should consider having it on their roadmap within the next twelve to eighteen months, as the adoption rates of this tech aren’t showing any signs of slowing down. Brands that are too slow to embrace this growth risk being left behind.

Currently, there are some good implications in here that relate to preparing your brand’s content for Voice Search in general. There isn’t a single easy or technical way to “optimise” for Voice Search (if I had a nickel every time someone asked me about “voice SEO” I’d be off blogging about Stargate Atlantis somewhere rather than writing this!) but there are some general things to bear in mind.

First and foremost, rich answer results on Google commonly surface as responses to voice queries. While there doesn’t seem to be a direct equivalence (experiments at 4Ps have shown that a voice answer doesn’t always match the rich snippet returned in “position zero” when sampling on a normal SERP) it is still worth keeping an eye on your footprint here. Make sure all your reference touchpoints like maps and Wikipedia are up to date and on-brand and see where else your content could be showing up for Q&A-type queries. Not everything gets a spotlight like Rodney McKay, of course (and nor should it) but that doesn’t mean you shouldn’t keep an eye on your SERP presence!

Google Rich Answer Example

When it comes to trying to feature in more snippets, extensive experimentation and careful monitoring are strongly recommended for the best results. Content formatting and wording are a much stronger influence than technical mark-ups like schema.org. So, create stuff that is fantastic for your users (as usual – yawn) and focus on a “meeting intent” angle if you want to see progress. Always but always know what the purpose of your content is from a user’s perspective – are you answering a question (or questions), providing specific guidance on how to do something, explaining how something works…?

So, dig into your data, speak to your users (and customer service team), produce awesome stuff that meets the needs a real-world person might have in relation to your brand, and you’re already as well on the way as you can be to “optimising” for voice search in 2018.

Tools like AnswerThePublic.com are invaluable for generating ideas based on real searches, and of course, I’d be horribly remiss if I didn’t also mention the outstanding rich snippet research done by the SERP sampling superstars at STAT. So go forth and develop your snippet presence, and don’t forget to have a good look at things like Actions On Google and Alexa skills development as well while you’re at it!

Photo Credit: Kevin Bhagat

Entering 2018 With Mindfulness


Wow, I did not blog much last year.

Of course that’s at least partially because I moved my fandom/hobby shenanigans over onto Tumblr, which is not a decision I regret in the slightest, but also down to that old chestnut of not really having time to stop much and reflect enough to write about it, let alone put together a meaningful post with actual sentences. Here are my top four key takeaways from 2017.

1. Work To Live, Don’t Live To Work

Since my breakdown (wow, nearly three years ago now) I’ve spent a lot of time drifting through work in a mild sea of resigned apathy, not really caring much one way or the other. Perversely, this has made me both a happier person overall and (bizarrely) a better worker, because by not living and breathing my work/career non-stop to the large exclusion of anything else I’ve actually started enjoying it again. This means I’m more laid back and productive all round, and because I’m spending a lot less time panicking like every tiny little thing that has or could possibly go wrong is some sort of extinction-level event, I’m fairly sure I’m a bit nicer to be around as well.

I’d like to be able to say I haven’t had any further emotional meltdowns, but I can’t say that without lying through my teeth so I won’t. The agency still sometimes does things that make me want to scream and tear out my hair (or someone else’s) and clients still often make me weep for the current and future state of humanity, but I’ve found that the phrase not my circus, not my monkeys is a very useful adage (along with remembering that it is the agency’s own money that is being spent on what seem to me to be stupid decisions, not my own personal money).

My agency’s CEO reminded me during one of my recent semi-meltdowns (in the nicest possible way) that he’s not paying me to worry about his job and the MD’s job and everyone else’s, so I should stop doing extra worrying for free, which is a concept I rather like: don’t worry for free.

A good old-fashioned oh well, fuck it often does the trick quite nicely as well. I’ll care about clients and their insanity during work hours – and will do so frankly quite passionately, because I quite enjoy what I do and like to get good results for people – until my time for the week is done, and then the lot of it can frankly go hang because I’m off to go swimming or play Mage The Awakening or whatever.

2. Creative Laziness Makes For Amazing Productivity

People often comment on my speed at work – it’s one of the things I see clients, colleagues and third parties alike all gape at. I can (apparently) turn around documentation, technical testing and research in the time it takes most people to get their morning coffee, and more than once I’ve had someone’s jaw drop in amazement at my typing speed. Now the latter is easy enough to explain: not only am I an avid trashy fanfiction writer, but I spent nine months fresh out of university working as a medical secretary for a psychiatrist whose written English was not great, so she preferred to dictate everything.

The former is what I like to think of as creative laziness. I get tasks done as rapidly and efficiently as possible because I’d rather get something finished and ticked off in order to move onto the next thing. If I’m stuck doing the same thing for more than a couple of hours I get hopelessly bored – so the best way to avoid this is to actually work more efficiently, not less.

3. It’s Okay To Not Be At The Top

This is a major issue I have largely due to my upbringing and lots of other shenanigans I get to bore psychiatric professionals with – if you’re not in position numero uno, especially at work, then you might as well be at the bottom.

I didn’t get a clean sweep at either GCSEs or A levels (all A*s but one at GCSE, AAB at A levels) and was devastated to the point of tears when I only got a 2:1 honours on my degree rather than a first. That probably sounds utterly absurd to someone considerably less neurotic but it is a serious problem when anything less than perfection/top of the class/top earner (with all the career bells and whistles and job titles to go with it) is viewed in one’s own mind as equal to complete failure.

Partially a result of therapy and partially as a result of the much stronger ah, fuck it attitude described above, I’ve made significant strides towards getting over this. I doubt I’ll ever be turning cartwheels at being second best (or third, fourth, fifth etc) and I’ll never stop craving praise like a voracious little compliment weasel, but I stepped down from a significant Head Of title and role at work because I hated doing it. The agency only recently properly promoted someone else (a very deserving someone else, I hasten to add) to fill the gap, but no longer being “top dog” in that regard no longer bothers me. Fewer headaches. One less circus and a lot fewer monkeys.

4. Remember Your Own Needs Hierarchy

Anyone heard of Maslow’s Hierarchy Of Needs? It’s a theory in human psychology that says people’s motivations are determined by certain needs, and those motivations and needs progress as they are fulfilled. This is a nice diagram from Simply Psychology:

Maslow's Hierarchy Of Needs

Now, especially when suffering from a long-term mental illness your basic needs and psychological needs can overlap quite significantly, so this model sometimes stops applying. You can be entirely unmotivated by hunger, for example, because you feel so hopeless that basic bodily needs become just a background nag rather than an imperative. So actually what I find helpful is to create my own more personal “hierarchy of needs” based on the things I know I need to prioritise in order to stay functional and happy. This lets me evaluate situations as they come along and determine how to allocate my energy (or “where to spend my spoons,” to use a popular analogy).

Here’s roughly what my personalised hierarchy of needs looks like.

Personal Needs Hierarchy

 

These are a bit more abstract and first-world than the Maslow version of course, but I find this kind of “ranking” of things really and truly invaluable for allocating my energy; things lower down are more important and therefore take priority. Time to either do some writing or cuddle with the pugs on the sofa? The pugs take priority. Meeting a friend for coffee or writing some fanfic? The fanfic wins (I’m really not that social a person, as further evidenced by the fact that animals are a whole two rungs more fundamentally prioritised than people, who are in my book a “nice to have” – however, I know several folks for whom “time to socialise with friends and family” would be a much higher priority bloc).

This pyramid is also really useful for the things that AREN’T on it. For example: career. If you’d asked me three years ago I would have told you my career was basically my life, and everything else was just secondary stuff orbiting around it. Now it isn’t even on there – as long as I’m earning enough to meet the “subsistence” requirement and the work I’m doing is interesting enough to fulfil the “stress-boredom balance” need then everything is good.

It’s amazing how useful this sort of thinking is.

So while doing the usual mindfulness and um-aah and spiritual critiquing that is popular at this time of year, try building yourself a personalised Maslow hierarchy. It’s a good exercise from a self-review perspective anyway, and it can prove really very useful in terms of helping to prioritise how you spend your mental and emotional energy.

Happy 2018!

Photo Credit: Cristian Escobar

Display Ads: Stop & Think!


So, today in random news I recently decided to become a supporter of the Guardian. The BBC’s continuous pandering to the royals has been pissing me off more than usual lately (newsflash: I’m anti royal family) and I’ve found that the Guardian has actually replaced the BBC as my go-to “let’s check in on how screwed up the world is today” destination.

I noticed the message on the site today asking me to sign up (not that I’ve not seen it before, but still) and after discovering that it now costs less than an annual PlayStation Plus subscription I thought hey, why the hell not. They accept PayPal as well, which is always nice as it saves me having to hunt down my card from wherever the pugs have randomly dragged my purse today, so that’s one conversion barrier neatly bypassed. In fact the whole signup was pretty slick and smooth, on par with an Amazon checkout but without the hassle.

Of course after something so profoundly adult-ish on a Saturday afternoon I went back to writing fanfiction like a normal lunatic, which for various reasons led me to the always-invaluable www.fantasynamegenerators.com. This site is on my adblocking whitelist because a) it is genuinely useful so I don’t mind the creator earning from it and b) its advertising is discreet and doesn’t make the site entirely unusable in the process.

Now given that I had, no more than one quarter of an hour earlier, literally just counted myself as an onsite conversion (and from organic, no less) for The Guardian, I was rather surprised to be greeted by this (on the Viking Name Generator page, if you must know, but that’s not really the point):

Guardian Display Advertising Fail

Um. Open another tab, check email…nope, there’s the subscription confirmation, right in my inbox.

I wish I could say this is the first time I’ve seen this sort of thing, or even that it was the first time I’ve had it happen so painfully obviously to me. I can’t, of course. Not even close.

All together now, boys and girls: don’t target your converters just after they convert.

Aside from the fact that this is a wasted bid effort, not to mention inventory that could have far more usefully gone to someone else, this kind of “haunting” effect is the sort of thing that can really alienate today’s web-savvy consumers from a brand. Now I work in digital marketing so I can just sigh, roll my eyes, rant on my blog and move on with my life without any real change in my sentiments to the Guardian as a journalistic entity, but the average web user does not (gasp) work in this field.

On the whole, consumers don’t like ads. They tolerate them. It’s just a fact of marketing life. But tolerance levels go sharply down and irritation goes sharply up when ads are irrelevant, too persistent, interfere with whatever the user was trying to do in the first place (I’m looking at you, YouTube pre-roll) or are otherwise just poorly targeted.

Say, for example, by trying to sell a user something they bought less than fifteen minutes ago.

Now remarketing, properly handled, is an immensely powerful tool. So, frankly, is display prospecting when done correctly. I would never in a million years suggest that a brand abandon its display/RTB efforts provided the appropriate KPIs look good and brand safety is assured (check your whitelists and blacklists, folks – is NakedFurriesRUs.com really where you want your high-end luxury name appearing?) so no need to start setting fires just yet.

But please, please turn your brain on and think when setting up your targeting options. I’m by no means claiming this is always as simple as finding a decent Viking name generator in the riches of the internet – Mona Elesseily’s excellent article about GDN targeting and layering is a great read, if nothing else to show how complex just this single network can be to sort out properly – but since when was something only worth doing if it was easy?

Rakuten Marketing recently ran some surveys around online advertising and amongst the other interesting (and worrying) things they discovered was that a whole bunch of people around the world associate ads with “other negative online experiences like fake news.”

Awkward. Especially for this particular example!

While the immediate “last click” ramifications of such an advertising faux pas may seem insignificant to the point of ridiculousness, in today’s ever-more-crowded online marketplace it is critical to realise that brand preference and good, old-fashioned, squishy feelings play more of a role than ever before. Yes, you got the conversion right now, but if you continue to stalk your new customer with crappily-targeted display creatives you’re not only going to piss them off but alienate them from your brand and very potentially lose them as a customer in the near – if not immediate – future.

Given how much it probably cost you, considering all touchpoints and appropriate channel attribution, to acquire that customer in the first place, doing something that actively drives down their lifetime value to your business is so far beyond stupid that it can barely see stupid in the distance.

I may forgive the Guardian its little whoops moment because I know the struggle (and really enjoyed their recent article about tardigrades), but the vast majority of users will be far less generous!

Besides, every time you advertise unnecessarily to a recent converter a kitten gets brutally shot by a floating hand.

Don't Kill Kittens

Isn’t that reason enough to sort out your targeting strategy?

Edit 16th August 2017 – someone just sent me the below message on Facebook in response to my sharing of this post. Case in point, much? 🙁

Bad Retargeting

Aaand someone else sent me this over Slack as well (the booking he made was for next week, and they’re already trying to get him to book again). Shocking.

More Bad Retargeting

Anyone else have any classic retargeting failures they’d like to share – especially ones that alienated them from the brand in question?

Photo Credit: Simon Launay

How To Future Proof RIGHT NOW For the IoT

IoT Future Proofing
Standard

This post was originally made on the 4Ps Marketing blog.

This big malarkey about “the Internet of Things” (IoT) just won’t go away, will it? From radiators you control with your phone to fridges with internal cameras so you can check the contents on the move, it seems like everything is connected these days. Alexa and Google Home are all set to start battling it out on the home voice search stage, and it seems like only a matter of time before you’ll be shouting at your loo to get more toilet paper or hollering at the oven to order pizza.

One of the biggest challenges marketers currently face is providing some kind of actionable answer to the big questions everyone is asking – what does this mean for my brand, how can we leverage it, and how can we start future proofing our digital assets?

I could spend the blog equivalent of War & Peace giving my best shot at answering all of those questions (drop me a line for a cuppa and we can talk about it if you’re interested) but what I’m going to focus on right now is the third one – specifically, what you can do right now on your website that will form the first steps of future proofing it against the rise of the voice-driven Internet of Things.

You’ll need a friendly developer (or, failing that, an unfriendly developer and something to bribe them with) and ideally a tech-fluent SEO on hand to get this done. It is worth it though, as an immediate and sometimes surprisingly simple-to-implement form of future-proofing that doesn’t require a multi-million pound technology investment.

That’s right, I’m going to tell you to mark up your website with schema.org again.

Schema? Again?

Right, now the groaning noises have stopped let me tell you why you should do this – and specifically do it with JSON-LD script injections rather than microdata. Well, other than the previously covered reasons when I updated my recommendation last year.

Google Home & Alexa Skills Use JSON

The first image is a screencap of an Action for Google Home (from here) and the second is a Skill for Alexa (from here). Notice anything about both of these?

That’s right, they’re both powered by JSON.

We’ve already seen the early beginnings of JSON driving actions on pages as well as simply structuring data to be machine readable – the most obvious example is the sitelink search box markup which allows users to directly interface with your website’s search bar from the Google results page, saving a click. In a future without a conventional results “page” – say, the Internet of Things or a voice search heavy technology ecosystem – it’s easy to see how these sorts of interactions can evolve. What precisely this looks like is still to be determined, but all the signs point to it being written in JSON.
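For reference, the sitelink search box markup mentioned above looks like this in JSON-LD (with an example domain standing in for your own):

```json
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
```

The {search_term_string} placeholder marks the slot where the user’s query goes, which is what lets the results page deep-link straight into your site search.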

Schema.org already has a whole mess of options available for Actions as well as Objects. A lot of them, especially as given in the examples, are rather pedantic and not necessarily of immediate use from a marketer’s perspective, but the point is that the vocabulary is there. It’s quite accessible as well – even I can write JSON scripts, and I haven’t done any formal coding since the FORTRAN 90 module in my undergraduate degree.

So if you’re a brand, get marked up with JSON rather than microdata, and start using this to signpost key actions on your site – from Order Brochure to Add To Basket, or whatever else you can implement. I recommend inline markup where you can; while it is perfectly possible to deploy schema.org using Google Tag Manager and similar systems, there seems to be a marked delay in pickup by crawlers and there’s every possibility that non-Google entities won’t even realise the stuff is there.
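To illustrate the idea, here’s a hypothetical sketch of signposting an “Order Brochure” action using the schema.org Actions vocabulary – Product, potentialAction and OrderAction are real schema.org terms, but the product name and URL are illustrative assumptions rather than a finished spec:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Luxury Kitchen Range",
  "potentialAction": {
    "@type": "OrderAction",
    "target": "https://www.example.com/order-brochure"
  }
}
```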

If you’re a marketer, and doing anything in the vague region of technical SEO or web development, try and get at least basic reading fluency in JSON scripting. W3Schools is a good starting point, and I would personally recommend Codecademy if you want something more structured towards progressive learning.

Photo Credit: Gian Prosdocimo

Guide To An SEO-Safe Domain Migration

Standard

This post was originally made on the 4Ps Marketing Knowledge Base.

There are lots of reasons why a business may need to change website domains during its lifetime. A rebrand is the most common, although there are others – for example, moving from a .co.uk to a .com to facilitate single-domain international expansion. At any rate, a domain change is an enormously risky event for a website – the wholesale change of URLs forces a period of re-crawling and re-indexing that can send organic visibility haywire.

It certainly isn’t something to be taken lightly. A mishandled domain migration can destroy a website’s visibility, and organic traffic can take months or even years to recover after a botched domain change. Fortunately we’ve got a checklist at 4Ps that can help your domain switch go smoothly.

PREP YOUR REDIRECTIONS

Before you do anything else, make sure you set up your 301 redirects to go from old URL to new URL. Just pushing everything on the old domain to the homepage of the new domain will be devastating to visibility – authority needs to be passed on a page by page basis in order to preserve it in the right manner to avoid losses.
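As a sketch of what page-by-page redirection looks like (assuming an Apache server here – nginx and other stacks have their own equivalents, and the domains/paths are illustrative), configured on the old domain:

```apache
# One explicit 301 per URL, old path to new absolute URL (mod_alias)
Redirect 301 /about-us/ https://www.new-domain.com/about-us/
Redirect 301 /products/widgets/ https://www.new-domain.com/products/widgets/

# Or, if the path structure is identical on the new domain, a single
# mod_rewrite pattern covers every URL one-to-one:
RewriteEngine On
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```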

Ensure that these redirects are tested thoroughly in a development environment – the last thing you want is lots of redirect chain slamming your site’s speed, or something misconfigured so a URL is missed, when the big day comes.
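One sanity check worth scripting before launch: given your old-to-new URL mapping, flag any redirect whose target is itself redirected (a chain). A minimal sketch in Python, with illustrative paths:

```python
def find_chains(redirects):
    """Return the subset of {old_path: new_path} 301 mappings whose
    target is itself a key in the mapping, i.e. a redirect chain."""
    return {old: new for old, new in redirects.items() if new in redirects}

mapping = {
    "/old-about/": "/about/",
    "/about-us/": "/old-about/",  # chain: /about-us/ -> /old-about/ -> /about/
    "/shop/": "/products/",
}
print(find_chains(mapping))  # {'/about-us/': '/old-about/'}
```

Any chain it finds should be flattened so the old URL points directly at its final destination.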

UPDATE ALL INTERNAL LINKING

Again in your development environment, ensure that all links and associated elements are properly updated to reflect the new domain. If you’re using relative rather than absolute internal links that saves a job, but it is a good idea to run a crawl to be sure, as some sites end up with a mix.

As well as general hyperlinks, ensure that all linking elements are properly updated to reflect the new domain. XML sitemaps, rel="alternate" hreflang annotations, rel="canonical" tags, rel="amphtml" links, Open Graph tags, Twitter Cards, structured data – all these sorts of elements must be checked and updated to reflect the final absolute URLs that will be on the site after the domain changes. Not doing so will result in broken links and can cause other difficulties with organic visibility if any markup ends up invalid.

PREP MEASUREMENT TOOLS

Setting up query visibility measurement well before the domain switch is due to take place is always a good idea, as it establishes a benchmark for the old domain before the change. Ensure your analytics is set up appropriately so that all traffic can be tracked and attributed correctly.

Another excellent measurement tool is Google Search Console (and Bing Webmaster Tools, plus any international variants). The best way to handle this setup is to have both the old and new domains validated at the same time, ready for the switch. You can do this relatively easily in a variety of ways, and having both domains verified will allow you to monitor impression levels for both.

Naturally the optimal result will be organic impressions for the old domain decreasing at about the same rate that they increase for the new one, indicating a smooth transition. Ensuring these tools are enabled and verified in advance will ensure that the right data is available to diagnose any issues as they arise – missed redirects, dropped URLs and other such problems are all much more easily corrected if they can be clearly identified.

THE BIG DAY

Update all analytics and PPC (and any other external tools or platforms) as soon as possible after the domain switch, if it is not appropriate to do so beforehand. When making the DNS switch, it is worth reducing your TTL (Time To Live) to speed up the migration and help propagate the new domain as rapidly as possible.

Once the new domain is live, run a full redirect check and then an internal crawl to check internal links. Ensure any problems are corrected as soon as possible – there’s no such thing as “too soon” for fixing issues. The next thing you should do is register the change of address in Google Search Console (and all other webmaster tools accounts). Once this is done, resubmit your XML sitemap (or sitemaps, or sitemap index file). I also like to manually submit the homepage to the index, just as an extra nudge to get the crawlers going.

MONITOR AND WAIT

Expect at least six weeks of ranking and visibility fluctuations while the new domain settles in. Sometimes this can last up to two months or even longer, depending on how well the migration is handled and how smoothly the redirects go in.

It is also worth noting that in some circumstances, when migrations go well, it is entirely possible for a site to experience few to no fluctuations at all – but for purposes of managing expectations internally it is generally recommended to prepare for the worst, and to be pleasantly surprised if problems do not arise!

FURTHER ADVICE

If you’ve got a domain change or similar high SEO risk event on the horizon for your brand, I’ve got tons of experience in website migrations of all shapes and sizes. Give me a shout or take a look at some of the related guides on my agency’s website:

Recommendation Update: From Microdata To JSON

Standard

This post was originally made on the 4Ps Marketing blog.

I’m all about the structured data markup over here. In a world increasingly driven not just by users searching for immediate answers, but by artificial intelligences like Siri and Cortana performing searches on behalf of their users, the ability to have rich information easily machine readable and machine processable is of increasingly vital importance.

Although the official line from Google (and others) continues to be that structured markup has no direct “ranking impact” or is not used as a “ranking factor” (although rumours continue to circulate that it will be part of the algorithm one day), evidence continues to pile up in favour of its implementation as an SEO consideration. Studies like this 2014 one from SearchMetrics consistently show a correlation between the use of structured data and high organic website performance, and Google themselves say that use of the markup helps increase the chance that you’ll appear favourably in rich snippets and enhanced listing types in organic SERPs – things like Knowledge Graph cards and rich answers. As these result types often pip conventional results to the top of the SERP, that’s a pretty powerful message for potential exposure of your brand.

What Is Structured Markup?

In its simplest terms, structured or semantic markup turns a webpage from a bunch of text and images into a set of things, each with their own properties. Rather than relying on search robots and algorithms understanding the concept of a pair of shoes as a product for sale (rather than a webpage that contains lots of text about shoes and a £ symbol), for example, structured data lets us explicitly state this in the form of additional markup in the source code of the page. This means that machines reading the page (including things like search robots and other “parsing” things like Siri/Cortana et al) don’t need to work as hard to understand what it is about – they can see all the information and attributes laid out in a way they can nicely understand.
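Sticking with the shoes, here’s a minimal sketch of what that explicit statement looks like in schema.org markup (JSON-LD flavour, with made-up product details):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Classic Leather Brogues",
  "image": "https://www.example.com/images/brogues.jpg",
  "offers": {
    "@type": "Offer",
    "price": "89.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
```

A bot reading this doesn’t have to infer “product for sale” from prose and a £ sign – the type, price and stock status are stated outright.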

As we all (should) know, a search bot that doesn’t need to work as hard is a happy bot. Happy bots generally mean visibility boosts, one way or another! So think of structured markup as the bot equivalent of tasty chocolate…

Happy Robot

Implementing Structured Markup

In the past the generally accepted and most widely-used manner of implementing structured markup has been using microdata. This is generally simple to implement into existing HTML templates as it essentially just adds a bunch more attributes (in the form of itemscopes and itemprops) to your pages.

Microdata Markup Example

This approach can be tricky depending on how complex your templates are: a bit of jiggery pokery is often required to get things to nest correctly, and you frequently need to include a bunch of meta tags to make sure all the needed attributes are present and in the correct format. It works fine most of the time though, and there are often plugins available for the big, mainstream content management systems which make life easier.

Switching To JSON-LD

JSON-LD, or JavaScript Object Notation for Linked Data (which is what its mother calls it when she’s angry with it), is essentially a way of encoding structured data – including schema.org markup – using the JSON data interchange format. That’s a posh way of saying, in very simple terms, that it lets you put your structured data bits and bobs into a script element that sits independently to the existing HTML template of your website.

JSON Markup Example

There are a few advantages to implementing structured data with JSON-LD rather than microdata:

  • It keeps the structured markup independent of your template layout, so if your site’s page templates change you won’t need to redo the markup each time because your nesting breaks or the entire thing starts producing errors
  • It is much easier and more efficient to mark up complex nested objects and concepts in JSON than it is microdata, so you can implement more comprehensive markup to take advantage of more potential opportunities
  • You can pull fields and properties directly from your content management system without needing to play about with meta itemprops and other formatting pain
  • You can even (with the right analytics consultant to help) link up your JSON deployment with your data layer to start tracking properties as custom dimensions in one neat package

Most importantly though – and the number one reason I’ve made the switch to formally recommend to all clients and prospects going forward that they utilise the JSON-LD markup method from now on – is that Google itself has switched from an ambiguous level of “we support any schema.org formats” to specifically recommending the use of JSON-LD above the other markup techniques.

Google JSON Recommendation

We’ve already seen evidence in the past that Google has been more inclined to properly pick up and parse JSON implementations of things like the sitelink search markup, so this isn’t particularly shocking, but with the new advice on things like Rich Cards and the change to Google’s developer documentation on the subject – not to mention the pain of having to rebuild the microdata spec every time a template is tweaked slightly – I decided it was time for a formal change of recommendation.

There’s no evidence (thus far) that Google is going to stop supporting microdata implementations of structured markup, but this is one of the rare cases where the search giant has provided not only general guidelines but a clear preference for a particular implementation. So in the name of future proofing, not to mention an easier life, I’d suggest the following:

  1. If you haven’t implemented any structured data on your site yet (and why not?) then make sure you get this into your technical roadmap soon, and use JSON-LD when you do.
  2. If you’ve got structured data implemented already using JSON-LD, run it through the revamped structured data testing tool to be safe – Google has tweaked and expanded some of its own specifications, such as for Articles and Recipes, in ways that diverge slightly from the core specs on schema.org, so it is worth checking.
  3. If you’ve got structured data implemented on your site using microdata or RDFa, don’t panic! Google does still support these, but you’ll probably want to look at getting a revamp in JSON-LD into your technical roadmap within the next 12-18 months to be safe (and you can use this as a good time to overhaul the markup to make sure you’re taking advantage of all your opportunities, too).
  4. If you’ve got structured data implemented on your site using some other vocabulary than schema.org, such as microformats, you’re running a risk of both losing any benefits you still have (as Google has firmly thrown itself into the schema.org camp) and are probably missing opportunities too, so make the switch to schema.org (with JSON-LD implementation) as soon as you can.

To find out more about the benefits of schema.org markup for your search marketing and brand promotion efforts, take a look at the newly refreshed Google documentation which is a good read for both developers and less technical marketing types.

SEO In 2017: The Crystal Ball Predictions

Standard

This post was originally made on the 4Ps Marketing blog.

After the ever-saintly Ben Davis featured my thoughts on AMP in the usual seasonal Econsultancy roundup it seemed like a good idea to get the rest of my thoughts down on e-paper before getting swamped in keeping the pugs from eating all the tinsel and getting the cat away from the wrapping paper. As is normal, of course, I’ve only just had a chance to sit down and actually start thinking about the next (and last) twelve months with an appropriate level of depth, so here we go…

Real Time Penguin

Google finally launching real time Penguin was a huge highlight of 2016 for me – better late than never, and so on. I’m also very pleased about Google’s refreshed approach to this – ignoring rather than immediately penalising bad links – as this very much diminishes the nastiness potential of “negative SEO” and seems like an adoption of a “more carrot than stick” kind of attitude towards webmasters on Google’s side.

Voice Searching

The mainstream launches of the “home PA” systems like Amazon Alexa and Google Home are very interesting and seem like they might finally propel voice search into the mind of mainstream consumers and brands. This idea of bringing search into an always-on state is a natural evolution of device proliferation but we’ll be watching very curiously to see how it starts to shake up user interaction with digital, especially buying patterns. It’ll also be interesting to see how paid advertising starts to rear its head on voice platforms without fundamentally damaging the user experience on them.

The implications for longer and more specific searches, which tend to occur more naturally in voice queries, are also going to have knock-on effects on everything from content structure to search market research. Alex Smith, on the Food & Leisure team at my agency, commented:

“What I’d like to see is the evolution of a keyword planner-esque tool that can work on phrase match or uses machine learning to handle the longer tail, more semantic and context-driven queries.”

Whoever can start providing these sorts of datasets, given the increasingly vicious throttle that Google is putting on its own search data and the lack of granularity in Adwords tools for things like device split and media search method (voice vs type vs image etc), is likely to make a lot of forward-thinking marketers very happy people. Platform providers of the world, take note!

UX Integration

Something I’ve started to notice with quite a few clients this year is the (very welcome!) development of marketing teams starting to take a real interest in their site’s performance in terms of user experience rather than just bottom line. Site speed, especially, has far too often in the past been written off as “a problem for the IT/web guys” so I’m bloody glad to start seeing some decline in this silo-ised thinking. Analytics, data, marketing and customer journeys seem to be finally getting joined up in brand thinking, so although there’s still a long way to go for most businesses this is a great step forward, and here’s hoping it continues to see deeper adoption in 2017 and beyond.

VR, Bots And New Search Touchpoints

There’s been a lot of buzz around the tech and digital industries about virtual reality and the rise of chatbots as interaction tools for brands. VR is a big unknown for search at the moment as it is (of course) more experientially focused but Nick Shread, my fellow Kent resident and colleague at 4Ps who heads up the third sector team, notes

“I’m wondering how VR will impact search. Searches performed from within games or experience playbacks perhaps? Maybe a “find me something related to this” sort of prompted discovery angle that ties into results across other devices?”

There’s a wide open field for experimentation here of course – watch this space! Chatbots and similar machine learning or AI-driven tools are already starting to make waves in the search space though, especially on mobile. Google set the precedent with their own machine learning driven RankBrain, of course, but there’s an increasing trend of AI-type entities making searches on behalf of human users, rather than the human user undertaking the search themselves.

That’s all the average chatbot does if you dig deep enough under the bonnet of the technology (at the risk of oversimplifying a very cool and complex field), and you only need to look at the typical behaviour of digital assistants like Cortana and Alexa to get some cool ideas of where this could be going. Matt Stannard, my usual partner in predictive crime, and I predicted this back in 2015, so we’re applying for our licensed digital sector psychic badges this year. Matt (who serves as Innovation Director at 4Ps when he isn’t building mad-scientist type analytics gadgets) also comments that he thinks search methods other than “words” are going to start rising soon too as things like image and sound recognition keep developing.

“What about searching by proxy? Cortana, find me something that looks like this, with “this” being an image, or phrase, or sound, or smell…or even a feel. Haptic interfaces are going to start showing up sooner or later!”

On the Google front, for 2017 the only thing I’m reasonably confident of myself is that AMP is going to get bigger before it goes away, despite some signs of rising controversy in its potential user benefits and current implementation form. Google is pushing it immensely hard and it seems to be only a matter of time before it extends to full capability deployment in new verticals like eCommerce. This will be particularly interesting as and when the mobile-first organic index gets rolled out, as despite Google’s claims that they’re aiming for a “low delta” I suspect that non-responsive sites are going to see some big shifts in visibility if they don’t get their content and markup synced up.

How do you see the organic search landscape shifting in 2017 and beyond? How will new technology and potential touchpoints start disrupting the way brands need to present their content to users? How will measurement, analytics and data struggle or stride ahead to keep up? I’m always up for a coffee and a geek out so drop a line to 4Ps for a chat or hit me up direct and let’s talk.

A Robots.txt Guide For SEOs

Standard

This post was originally made on the 4Ps Marketing Knowledge Base.

Every SEO should know their way around the core principles of a robots.txt file. It is the first thing a crawler looks for when it hits a subdomain so getting the basics (and the not-so-basics) spot on is important to ensure you don’t end up with pages showing ineffectually in search results or just dropping out of them altogether.

Robots.txt Location

Your robots.txt file must sit at the root of your subdomain. No negotiation here. What actually happens is that the crawler strips the path from the URL (everything after the first forward slash) and looks for /robots.txt there, which in practical terms means your robots.txt must sit at the root.

  • http://www.website1.com/robots.txt
  • http://website2.com/robots.txt
  • http://place.website3.com/robots.txt

Put it anywhere else, and crawlers won’t find it, which means you effectively have no robots.txt file on your site. That means, incidentally, that bots will assume they can access everything and so will just go berserk and crawl every inch of the site they can get to. This might be perfectly fine if you have a smaller website – but it can be very risky SEO-wise on a large catalogue or enterprise site where you want to more carefully control crawler behaviour to make sure things are indexed to best effect.

The Basics

You can create a robots.txt file in any basic text editor, up to and including Notepad. A very basic robots.txt file will look something like this:
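In plain text, with an example sitemap URL:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```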

The first line uses a wildcard * to mean “any user agent” (so “any robot”), the disallow being blank means nothing on the site is disallowed from crawling, and the sitemap line specifies the location of the XML sitemap (or sitemap index file) for the website so the bot can hop onto it and start indexing from that list. Keeps things nice and efficient!

If you want to stop all bots from indexing content within certain folders – say, an area only accessible to logged-in users, that’s pretty simple to do.
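For example, to keep every bot out of a members-only area (folder names illustrative):

```
User-agent: *
Disallow: /members/
Disallow: /account/
```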

You can also keep robots out from a single page or file if you want.
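The same pattern works for an individual asset (again, an illustrative path):

```
User-agent: *
Disallow: /downloads/price-list.pdf
```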

Important Notes On Robots.txt Blocking

It is important to note that blocking things in robots.txt does not prevent them from appearing in search engine results pages altogether. What you may end up seeing in a SERP might well be something like this:

Robots Blocked SERP

Now for most things this may actually be fine. User areas or invoice templates and so forth – you’re probably not too worried about outlier cases where they show up like this, as long as their full content isn’t being indexed and ranked organically.

In some cases, however, brands may be more sensitive to certain URLs or files and want to ensure they will never show up in a search engine in any shape or form. If this is the case, it is vitally important to ensure that these files are not blocked in robots.txt – the bot will need to crawl the asset thoroughly, not just “ping the URL,” so it can see the robots meta noindex tag or x-robots noindex HTTP header.
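For the avoidance of doubt, the tag in question goes in the page’s head and looks like this:

```html
<!-- Allows the page to be crawled, but tells compliant bots not to index it -->
<meta name="robots" content="noindex">
```

The HTTP header equivalent, handy for non-HTML assets like PDFs, is `X-Robots-Tag: noindex` in the response.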

It is also critical not to block assets in robots.txt that are needed to render pages in a browser. In the past many developers would mass block things like scripts or CSS folders, but doing this now will result in a grumpy message from Google in Search Console and can have a direct negative impact on your organic visibility levels (Google announced this change in 2014).

Other Important Notes

There are plenty of other elements you might need to know about a robots.txt file. Keep an eye out for some of the following:

  • Crawl delays. These were used back in the day to throttle robot access. There’s no reason to have them in a modern setup – and Google ignores crawl delay rules anyway.
  • Pattern matching. Both Google and Bing robots will honour rules that make use of * (a wildcard, meaning “any sequence of characters”) and/or $ (which matches the end of a URL).
  • The robots.txt file is case sensitive in all senses – don’t call it robots.TXT, for example, and make sure any rules you put in are case matched to the URLs required.
  • Only one URL rule can go per line. Three file or folder disallows, for example, must go on three lines.
  • Processing order for rules is important! Google and Bing robots both make use of the “most specific rule first” principle, while standard processing order is top to bottom. If in doubt, put any Allows above any Disallows (for example, Allow a file in a directory before you Disallow the entire directory in order to achieve a “disallow everything in this directory except this file” effect).
  • Avoid blocking files in robots.txt when you should be using other techniques. Some of the most common problems we see include blocking mobile websites from non-mobile bots or using robots.txt to block duplication caused by internal architecture problems. Make sure you address situations like this with search engine recommended solutions, not just by throwing robots.txt rules in!
  • You can add comments (human but not machine-readable notes) to robots.txt files by using # at the beginning of a line.
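If you want to sanity-check how your Allow/Disallow ordering behaves, Python’s standard library ships a robots.txt parser that applies rules top to bottom, first match wins (note it does not implement Google’s wildcard or most-specific-rule extensions, so treat it as a check of classic processing order only; URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Allow placed above Disallow, per the ordering advice above
rules = """\
User-agent: *
Allow: /private/landing-page.html
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://www.example.com/private/landing-page.html"))  # True
print(rp.can_fetch("*", "https://www.example.com/private/invoices.html"))      # False
print(rp.can_fetch("*", "https://www.example.com/about/"))                     # True
```

The Allow line wins for the landing page because it is hit first; everything else under /private/ falls through to the Disallow.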

Remember that while the robots.txt standard is a directive, it is not enforceable. Naughty and malicious bots and crawlers will generally ignore it altogether in favour of whatever they want from your site. Be aware too that the robots.txt file is always public – anyone can see it by going to the /robots.txt URL on your site! Definitely don’t rely on the robots.txt file to keep secure areas of your site “hidden” or “safe” – use the appropriate encryption and login protocols.