Laying the tracks for search engines

November 7, 2014

Imagine you’re a librarian in the world’s largest library.

Your only job is to take the material (books, magazines, encyclopedias, etc.) coming into the library and index all of it. Only you're not indexing alphabetically or chronologically; you're indexing according to topics, keywords, and queries. A patron comes in and asks you for the material most relevant to his query. Let's say he asks, "I'd like everything you have on ants," so you start thinking and analyzing everything you've indexed on ants. What would be the most relevant? You'd want to give your patron something highly relevant, so that he'll keep coming back. You'll also want to find more than one piece of material, so your patron has a selection to choose from. Give him options. And this is pretty much what a search engine does, only it does it very quickly.

Search engines crawl web pages using programs called crawlers (sometimes they're called spiders, ewww!).

They do this through links. Links are the easiest way to get from page to page, so it's important to do internal and external link building. Good, relevant internal link building will help your SEO. So will editorially earned link building (I'll talk about this in one of my next posts).

Algorithm updates like Panda and Penguin (and what's the new one, Pigeon or something like that?) have made it much easier for users to find what they're looking for. So while you're working on your content, you also have to think about your codebase and all your non-text content.

Codebase – use semantic markup. In the HTML5 specs, there are tags specifically designed to let a search engine know what type of content it's looking at – <aside>, <article>, <nav>, <header>, <footer>, etc. Make sure you use the alt attribute for images – this is W3C compliant. But when it comes to the actual page, we're not only going to have text. We want rich media – images, videos, PDFs. So you're going to want descriptive text for all of these elements on your page. Transcribe your videos. Put a description block underneath an image. Write descriptions for carousels and sliders. Search engines, or machines, read pages much differently than humans do. Get the help of a coder and put your meta description tags in there (they don't really help with rankings anymore, but they're best practice and will help with your click-through rate), have them help you put in an XML sitemap (this tells the machine what pages are on your site), and put a robots.txt file at the root of your site, so crawlers skip things like your login page.
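To make that concrete, here's a minimal sketch of what a semantically marked-up page might look like. The page topic, file names, and alt text are invented for illustration; the point is that each tag tells the crawler what kind of content it wraps:

```html
<!-- Semantic HTML5 layout: each tag signals the role of its content -->
<header>
  <h1>All About Ants</h1>
  <nav><!-- site navigation links go here --></nav>
</header>
<article>
  <h2>Ant Colonies 101</h2>
  <p>Ants live in colonies that can number in the millions...</p>
  <!-- alt text gives crawlers (and screen readers) a description of the image -->
  <img src="ant-colony.jpg"
       alt="Cross-section of an ant colony showing tunnels and chambers">
</article>
<aside><!-- related links and sidebar content --></aside>
<footer><!-- copyright, contact info --></footer>
```

Compare that to a page built entirely out of generic <div> tags: a human sees the same layout, but the machine gets far fewer hints about what's what.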
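And the robots.txt file is just a plain-text file at the root of your site. A minimal sketch (the paths here are made up for illustration; note that it politely asks crawlers not to crawl those paths – well-behaved bots honor it, but it is not a security mechanism):

```
# robots.txt — lives at https://example.com/robots.txt
User-agent: *        # applies to all crawlers
Disallow: /login/    # ask crawlers to skip the login page
Disallow: /admin/    # and the admin area

# point crawlers at the XML sitemap listing your site's pages
Sitemap: https://example.com/sitemap.xml
```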

There is so much to consider when laying the tracks for search engines, but it has to be done. Now search engines can rank for intent, which I find fascinating. I saw Rand Fishkin (one of the great SEO experts) at the last INBOUND, and he opened my eyes to something. If you type a query like "I want to know the movie where the guy's called the dude," the search engine will come back with The Big Lebowski (great movie!!), and that amazes me. These algorithms are getting so good that they know our intent. I don't know if I'm impressed or totally freaked out!! The bottom line is we have to keep up with the ever-changing ecosystem we call the web!


Stay tuned for my next post about local SEO.
