During some research (yes, even I have to admit I don’t know everything), I came across this old (2011) video about ‘what is SEO’. I watched it and to be honest I liked it. The simplicity of the explanation and the wholesome nature was pleasing.
I thought that this presentation was worth a share. It was delivered by Marcus Tober, the founder of SearchMetrics, at SMX in June 2014.
It is always interesting to see these aggregate-level mass studies observing, over time, the variables that may affect ranking.
Take a read.
I was explaining Authorship the other day, and how authors can use it to own their original content. A friend serendipitously forwarded this graphic to me yesterday. It is very good and worthy of a share.
I was going to write up some notes about the evolution of the SERPs as I needed to explain why CTRs were very unpredictable depending on the type of keyword, user intent, vertical and personalisation. While thinking about all the possible combinations and scooting around the web, I saw that Dr Pete has saved me the effort. Cheers fella. Well, here is his deck via SlideShare.
I couldn’t have done better myself 😉
Just looking into a traffic drop that seems to have only affected events-style content, and on some very precise informational terms, from Google web search. So, with my ‘tinfoil hat’ on, and assuming that we are the victim here of the behemoth that is the omnipotent and omnipresent Google, I am referring to the knowledge graph and structured mark-up. And specifically, how much information do we give away?
The modern dilemma
Should we give, and be ‘needed’; or hold out and be lonely?
If we give too much data to Google, we [website owners who monetise sites] will lose visits, impressions and the chance to build brand/loyalty and monetise, with no directly obvious return for our time, effort or investment. If you run your site for ad revenues, or for clients who expect web analytics to be the only measure of success, then this is important.
Schema mark-up, microformats etcetera present our information in a way that can be used by the engines in a number of different ways. It can be used to provide site links in the SERPs to a page, or to provide site links to other related pages on your site. However, [tinfoil hat moment] it is also giving data/facts and information nuggets to the engines, so they can present the answer via knowledge graph/one-box answers, or mash it up in another clever way for their own gain.
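To make the trade-off concrete, here is a minimal sketch of the kind of mark-up I mean, using schema.org’s Event type in JSON-LD (the event name, date, venue and URL are all invented for illustration). Mark up an events page like this and an engine can lift the date and venue straight into its own results, with or without sending the user to your page.

```html
<!-- Hypothetical events page: all names, dates and places below are invented -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Example Summer Festival",
  "startDate": "2013-07-20",
  "location": {
    "@type": "Place",
    "name": "Example Park",
    "address": "London, UK"
  }
}
</script>
```

That is the dilemma in twelve lines: the same block that earns you a rich result also hands the engine everything it needs to answer the query itself.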
So we, as media owners/publishers, are between a rock and a [very] hard place. We are all moving to separate data from the presentation layer: moving to HTML5, modular design, focusing on site speed and mark-up to make quality, scalable sites. We have, as an industry, been pushing for the latest standards and code to enable anywhere/any-format viewing of our brands. We progressive SEOs have, to an extent, always understood that our presence is not just our own sites, but this is making it very real indeed.
But we can’t stop supplying this data, as we want to be found, to be seen as experts, and to add value at the entry point to the web. And like all imbalanced relationships, if I don’t give Google what it wants, it will leave us for our competitors. Without even a kiss goodbye!
Google wants to be the repository of all knowledge and [I believe] is starting to display portal-like tendencies by keeping users on its own properties. So, what to do?
Right now – embrace it, see how it goes. Maybe in the future, we will be forced to change our thinking and divide our sites into two types: 1) pages that may not get visits, but do give information to users [maybe just not on our site], and 2) landing pages, entries onto our own web properties, as we do right now.
The updated ‘Periodic Table of SEO Success Factors & Guide to SEO’ by SearchEngineLand. Simply…nice!
Today I delivered this presentation at eTail Europe conference here in London, UK.
The title is “The future of SEO. Moving to a holistic inbound marketing strategy” with an earned media case-study thrown in for good measure.
I genuinely believe that in the future, earning people’s attention will be the norm, rather than just pushing your way in front of them. By being innovative, creative and “remarkable”, you will get more attention and loyalty than you could afford to buy.
In this presentation I try to lay out some history and context to the evolution of SEO and the changing search-scape. I introduce the new world of “inbound marketing” aka “earned media” or “content based marketing”.
Delivering this kind of campaign needs a lot of different skills, so formulating a plan and organising “all of your brains” may need an organisational shift. This stuff can be so much fun, and can hit multiple teams’ traditional objectives: for SEO, links and social citations; for social media, Likes, RTs, shares, follows and subscribes; for branding and PR, building loyalty and affinity with your brand. And many others.
The case study refers to My Destination’s Biggest Baddest Bucket List. This is a campaign that SeSoMe delivered as an internal agency, working with so many throughout the wider business and the franchise network. The second half of the campaign starts in July, when the winners travel for a full six months, living like locals and ticking off their bucket lists.
If you want to talk about this presentation or SeSoMe, just let me know.
Thanks to Joao for taking this picture from the audience.
Enjoyed this: some straightforward honesty with some real experimentation. A lot of us go through some of these things for personal and professional sites, and most SEOs have a number of personal play sites. Some good, some bad, some deliberately bad to act as a barometer. So, even a case study without domain names can confirm what you think you see in your own bubble. Thank you for sharing, Michael.
Just seen in the SERPs that on the drop-down menu which geeks [like me] would use to see the cached version of a page, there is now the option to “Share”.
Which takes you to a slim version of the share/post box.
If you try to share the same site/page again, it will take you to the post you made in your G+ profile posts stream.
Just something new I have seen.
So, the clear long-term direction is to keep making a quality, well-written, engaging, fast site that people would choose to recommend to their friends and family, and mention on social. So, no change there.
If you are doing things to game Google – and everyone has to [to some degree] – it is because of the addictive dependency that has been created: a vicious circle of traffic need/expectation and the sheer dominance of one traffic provider.
In an article earlier this week, Search Engine Land wrote about the next significant algo update. The interesting thing in this article is that the aftershocks in the last year or so have been just incremental updates, and a full 2.0 is coming.
This latest video, featuring Matt Cutts of Google, talks about what changes are coming. Even if you take his commentary as controlling spam through PR efforts, there are normally some directions/themes that you can take out of it.
Here is that video.
It seems that more changes are coming. Enhancements, some might say. The tone of the video suggests that this is to be more comprehensive than before.
- There is a Penguin 2.0, a web-spam change, to continue to target [I hate myself for using this term] ‘black hat’ SEO. So, ‘more comprehensive’ would mean more identification, going deeper with bigger impact for sites that are caught
- Advertorials: to prevent these from passing any PageRank. If they are done just for SEO, then they are in violation of Google’s “quality guidelines”. This update makes detection a bit stronger, along with the decision making on what value to pass. There is no issue with advertorials – unless they are done just to pass PageRank for cash
- If you are running loans, porn, pills or other hard SEO areas, your link practices are being targeted by name! This included a mention of finding link networks and all sites associated with them
There were some positive things
- Better communication and information if your site is hacked or has malware inserted
- An attempt at giving sites the authority they deserve, and the visibility to match
- Great news for many who were caught up partially or unfairly by the Panda update. Matt talks about looking for more positive signals that may move some sites out from the borderline
- And some presentational things in the SERPs around clustering domains for a user: if you do see a domain once, you would be less likely to see it again on subsequent pages
So, to be honest, if this is true, then the coming changes would be genuinely quite good for most hard-working, decent folk! Let’s hope it treats my sites well!