Basic SEO for WordPress

Last Modified: November 11, 2015

In 2014, my article “WordPress SEO, basic optimization” was published in WP Magazine.

Since then, WordPress has been updated, and I decided to create an article on my blog whose content will be updated over time, following SEO trends and WordPress updates.

This “Basic SEO for WordPress” article describes settings of the CMS itself, plugins, and other hints that will give the site owner a tangible SEO effect regardless of the promotion strategy chosen later. I would say these are the basics, which you need to implement in any case.

Let’s start!

Links

Permalinks

By default (Settings → Permalinks), WordPress creates URLs using a numeric ID, for example: yourdomain.com/?p=123. Concise and compact, but unfortunately not informative.

Do not scoff at potential visitors from search engines, nor at the search engines themselves – spell links out in understandable words and increase the weight of those words on the page. I personally prefer to use the following combination:

[Screenshot: a variant of the permalink structure]
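(As an illustration only – the exact structure from the screenshot is not reproduced here – a custom structure consistent with the description below could be /%category%/%postname%/, where %category% and %postname% are standard WordPress structure tags.)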

This dictates a clear logical structure: it is easier for search engines to identify meaningful sections of the site, and a user can optionally step up into a section by deleting characters back to the nearest slash.

If you need additional customization, check this list of possible structure tags.

Domain name

Do not forget to define the URL of your website (Settings → General) – with or without www, and with the http or https protocol:

[Screenshot: configuring the domain in WordPress]

After you save the changes, check the redirects by typing the homepage address in different variations. If redirection does not work, search engines may treat your website as a pair of duplicates, and the combined weight of inbound links can be dissipated. For example, instead of 100 links leading to the page http://www.yoursite.com/page/, you get only 60, while the other 40 lead to the duplicate at http://yoursite.com/page/.

Canonical links

Canonical links are links that are not visible to users but work for search engines similarly to a 301 redirect. With their help, you can point search engines to the source page when the same content has to be used in several different locations.

Several possible cases where WordPress can create multiple addresses for one page:

  • Source page: http://somesite.com/page/
  • The same page, but with an anchor for replying to a comment: http://somesite.com/page/?replytocom=593#respond
  • The same page, but accessed via its ID number: http://somesite.com/?p=425623

For users these pages will look exactly the same, but for search engines, three addresses mean three different pages. As in the case of duplicate domains, link mass may be scattered; besides, the pages will be judged as duplicates on the same domain, which search engines are not too fond of.

In this case, simply adding the code <link rel="canonical" href="http://somesite.com/page/"> into the <head></head> section will solve the problem of duplicate records, since the canonical link will point search engine robots to the source page.

You can either implement such links manually in your theme, or use one of the SEO plugins that resolve the issue automatically. I personally prefer Yoast SEO.
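For reference, a manual implementation could look like this minimal sketch for a theme's functions.php (the function name my_canonical_link is my own; note that recent WordPress versions already output rel="canonical" on singular pages by themselves, so check your page source before adding a second one):

    <?php
    // Minimal sketch: print a canonical link for single posts and pages.
    // Assumes the theme calls wp_head() in its <head> section.
    function my_canonical_link() {
        if ( is_singular() ) {
            echo '<link rel="canonical" href="' . esc_url( get_permalink() ) . '">' . "\n";
        }
    }
    add_action( 'wp_head', 'my_canonical_link' );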

Taxonomy in the context of WordPress SEO

Taxonomy is, in general terms, the study of the principles and practice of classification and systematization.

WordPress by default uses two built-in taxonomies: categories (vertical hierarchy) and tags (horizontal hierarchy).

How to use them? It is simple. The vertical hierarchy serves the clear systematization of the structure, so that search engines can easily understand the site layout (where the sections are, where the single pages are, etc.) and users can easily and logically figure out how to move through menus and “breadcrumbs”.

The horizontal hierarchy serves primarily to speed up retrieval of information, both for humans and for robots. A “tag cloud” lets the visitor jump to the needed topic in a couple of clicks, without plunging into the tree of the vertical hierarchy. Apply tags to every post and page where it seems justified, but don't overdo it: up to 10 still looks normal, while more than 30 already looks spammy.
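If your theme does not already include a tag cloud, the built-in template tag can print one; a minimal sketch (the arguments are standard wp_tag_cloud() parameters, tune them to your theme):

    <?php
    // Minimal sketch: print a tag cloud capped at the 30 most-used tags,
    // so the cloud stays useful without looking spammy.
    wp_tag_cloud( array(
        'number'  => 30,
        'orderby' => 'count',
        'order'   => 'DESC',
    ) );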

Robots.txt

robots.txt is a text file, located by default in the root of the website, that recommends to search engine robots which parts of the site to index and which not to index.

WordPress by default does not keep this file on the server. When the address http://yoursite.com/robots.txt is requested, the file is created virtually, based on the CMS settings, which offer little beyond a choice between “allow bots to index your website” and “close the website from indexing entirely”.

In fairness, it should be noted that now, when search engines have learned to render CSS and JavaScript (even if not in full), there is no need to fine-tune robots.txt on most simple sites.

However, if you need a real file in the root of the website, in which for some reason you want to exclude certain files or pages from search, create it manually (the virtual file will automatically disappear), or use a plugin that changes the virtual file, such as DL Robots.txt (RU).
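As an alternative to a plugin, the virtual file can also be extended from a theme's functions.php through the core robots_txt filter; a minimal sketch (the function name and the excluded path are my own examples):

    <?php
    // Minimal sketch: append rules to the virtual robots.txt generated by
    // WordPress. 'robots_txt' is a core WordPress filter.
    function my_robots_txt( $output, $public ) {
        if ( $public ) {                    // only when the site allows indexing
            $output .= "Disallow: /?s=\n";  // internal search results
        }
        return $output;
    }
    add_filter( 'robots_txt', 'my_robots_txt', 10, 2 );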

Sitemap.xml

The sitemap.xml file is a list storing the addresses of the pages of your website that you would like to pass to search engines for indexing. The format, the maximum number of pages, and the maximum size of this file vary slightly from search engine to search engine, but generally conform to the standard Sitemap Protocol.
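For reference, a minimal sitemap.xml following the Sitemap Protocol looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://yoursite.com/page/</loc>
        <lastmod>2015-11-11</lastmod>
      </url>
    </urlset>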

WordPress by default does not include a sitemap.xml file. You can solve this problem with a plugin – the same Yoast SEO, for example, can generate a sitemap for you.

Before submitting the sitemap.xml file to search engines, do not forget to verify that it is compiled correctly, using the search engines' own services: Google, Yandex (RU).

Snippet in the SERP

Of all the well-known fields that directly affect the snippet in the SERP, WordPress by default supports only Title (the wp_title() function), which automatically displays the name of the post or page.

If you want more (to add the Description meta tag, customize the Title tag, add a Google+ Publisher link, Twitter Cards, and Open Graph), you have to use plugins (the same Yoast SEO) or custom fields (a wonderful plugin for working with them is Advanced Custom Fields) and output them directly in the code of the website's theme.
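For example, if the description text is stored in a custom field, a minimal sketch for functions.php could look like this (the field name seo_description is my own example; Advanced Custom Fields users would read the field with get_field() instead of get_post_meta()):

    <?php
    // Minimal sketch: print a Description meta tag taken from a custom field.
    function my_meta_description() {
        if ( is_singular() ) {
            $desc = get_post_meta( get_the_ID(), 'seo_description', true );
            if ( $desc ) {
                echo '<meta name="description" content="' . esc_attr( $desc ) . '">' . "\n";
            }
        }
    }
    add_action( 'wp_head', 'my_meta_description' );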

The site icon, or favicon, deserves a separate mention.

Starting with version 4.3, WordPress includes a built-in function for adding a favicon (Appearance → Customize → Site Identity). However, if your logo sits on a transparent background, I advise using a specialized plugin (like Favicon by RealFaviconGenerator) or writing the code manually, because different systems and devices may render icon transparency in different ways.
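If you write the code manually, the links in the <head> section could look like this (the file names and sizes are placeholders; generators such as RealFaviconGenerator produce a complete set):

    <link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
    <link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
    <link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png">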

Semantic markup

Semantic markup is page markup that uses additional tags and attributes to help search engine robots process the information contained in the content.

I have already mentioned, for example, the Open Graph protocol – it is one of the types of semantic markup, developed by Facebook. You can see its result every time you publish a link in a Facebook post: a picture from the page appears under the post, along with a title, a description, maybe a video. This means that the Facebook robot followed the link you inserted into your publication, read the Open Graph tags if they were there (which picture to take, what title to add, and so forth), and loaded all of that into your publication.

Open Graph is used not only by Facebook but also by other social networks, so do not ignore it.
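For reference, the basic Open Graph tags in a page's <head> look like this (all values are placeholders):

    <meta property="og:title" content="Page title">
    <meta property="og:type" content="article">
    <meta property="og:url" content="http://somesite.com/page/">
    <meta property="og:image" content="http://somesite.com/cover.jpg">
    <meta property="og:description" content="A short description of the page">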

It can be implemented simply enough, using the same Yoast SEO plugin, but the same cannot be said about a no less important variety of semantic markup: Schema.org.

Schema.org stands out in that it is officially supported by the leading search engines on the planet; accordingly, the more you use it to describe your content, the more benefit it will bring you in the SERP. Therein, however, lies the problem with Schema.org: it is very extensive, and only a very small part of it can be added to a site automatically. Accordingly, WordPress does not support it by default.
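To give a taste of it, a small piece of Schema.org microdata for an article could look like this (the properties come from the schema.org/Article vocabulary; the values are placeholders):

    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">Basic SEO for WordPress</h1>
      <span itemprop="author">Author name</span>
      <div itemprop="articleBody">Article text…</div>
    </article>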

But less is better than nothing, and therefore I strongly advise taking a look at the Add Meta Tags plugin. Important: after installation, do not forget to disable in the plugin's fields the Open Graph tags and any other tags that you already output on the website, for example through Yoast SEO; otherwise tags might be duplicated, and not always with the same content.

HTML code

Firstly, remember that WordPress is not fully responsible for your website's code. Mostly it depends on the theme you are using.

Forewarned is forearmed:

  • Check how the H1 is generated in your page template. If it is already displayed by default with a copy of the title inside, do not include another H1 in the content itself: this heading (H1) must be unique on the page;
  • If a category page displays post entries each with its own H1 heading, check that they are separated by <section></section> tags and that the page starts with <!DOCTYPE html>, i.e. the browser will treat it as HTML5, which allows multiple H1 headings in such cases (see the sketch after this list);
  • Check for hidden code in the footer of the website or in the meta tags. You will not like what you find, especially at first, if your website turns out to be littered with external links to irrelevant content.
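For the second point, a minimal sketch of a valid HTML5 structure for such a category page (the titles are placeholders):

    <!DOCTYPE html>
    <!-- HTML5: each <section> may carry its own H1 heading -->
    <section>
      <h1>First post title</h1>
      …
    </section>
    <section>
      <h1>Second post title</h1>
      …
    </section>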

Beyond that, the standard recommendations for all sites apply: check code validity, fill in the alt attributes of all images, and so on.

Speed up indexation

WordPress by default provides the ability to send pings (messages) to various services, notifying them that you have published a new article (Settings → Writing → Update Services).

I would not recommend spamming everyone and everything, simply because sooner or later you will start landing on blacklists, and besides, each ping creates some load on the server.

However, in addition to the default WordPress Ping-O-Matic service (http://rpc.pingomatic.com/), I would recommend adding at least a few other resources:

  • http://blogsearch.google.com/ping/RPC2
  • http://blogsearch.google.us/ping/RPC2
  • http://blogsearch.google.co.uk/ping/RPC2

As well as a few more by analogy, if your site is multilingual (for example, with Russian):

  • http://ping.blogs.yandex.ru/RPC2
  • http://blogsearch.google.ru/ping/RPC2
  • and so on.

In conclusion

A few tips in addition to the above:

  • Do not use the full content of posts in the category loop. Use the function the_excerpt() instead of the_content(). This way you will reduce the amount of duplicated content on your website and make it easier for users to find the necessary material, because scrolling will be faster (see the sketch after this list);
  • Use the Excerpt field. This allows you to display some interesting text about the article, not just the first few lines of the entry with an ellipsis at the end and a few words about “read more”. It is also a good tool for saving you from additional duplicate content;
  • Exclude archives by date from search engine indexing. Such pages carry no particular thematic load, but completely duplicate existing categories;
  • Exclude the site search results page from search engine indexing. There is nothing useful there, except duplication of existing entries and the SERPs;
  • Do not exclude tag pages, like http://yoursite.com/tag/something/, from search engine indexing. In contrast to search results pages and archives by date, this type of page carries a useful thematic load that you defined yourself. As for duplicate content, Google has promised not to punish such things as long as everything is done in a “white” way, without malice (tags fall squarely into this category).
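For the first tip, a minimal sketch of a category loop built on excerpts (all template tags are standard WordPress functions):

    <?php if ( have_posts() ) : while ( have_posts() ) : the_post(); ?>
        <article>
            <h2><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
            <?php the_excerpt(); // prints the Excerpt field if filled, otherwise a trimmed auto-excerpt ?>
        </article>
    <?php endwhile; endif; ?>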

4 Responses

  1. Igor
    On 2015/11/11 at 19:03

    Hi, Pavel, my thoughts:
    1. Robots.txt needs to be written by hand, not by plugins
    2. Sitemap: a cool tool is xml-sitemaps.com, not automatic of course, but good
    3. Caching! You know best that Google looks at page response time, so use the W3 Total Cache plugin!

    • Pavel Karpov
      On 2015/11/11 at 21:06

      Hi, Igor 🙂 Thanks for your comment!

      1. DL Robots.txt, if you mean that one, only allows changing the content of the virtual robots.txt file generated by WP; how it gets changed is entirely your decision. Some people simply think it's better to store all the data in the database and the default folders (/plugins/) than to add extra files to the root folder (easier to create some types of backups, etc.).

      2. Agree – it is a good service 🙂 though it has a restriction of 500 pages maximum, so it's for small/medium sites only… Although they do have a PHP script that can be installed directly on the server, but in that case, what would be the difference between this script and some WP plugin? 🙂

      3. Yes, I agree 🙂 In many cases this plugin is a saviour! I just was not sure that caching should be counted among the “basic” features, because some websites are lightweight by default, some large ones have their own caching servers, etc… The same problem as with responsive layout: what if a person just doesn't need mobile users? 🙂 But nevertheless, you are right, caching is an important feature. I believe I'll write another article and add a link here as additional “not basic” stuff.

      • Igor
        On 2015/11/12 at 12:58

        1. Sorry, but I fully disagree about robots.txt via plugins – only by hand, if you're a real SEO man and not lazy =) You need to know what you need to close. Some plugins generate meta descriptions, so by that logic, is that good? Computers are not humans
        2. Agree about the limit, for small/medium websites. I just don't like plugins when you can build the functionality by hand, so I hate plugins like Visual Composer!!!!
        3. Most WP sites are written by dilettantes/non-professionals, with 100500 bad plugins installed and hosted on “$1/yr hosting”, so these sites load like sh*t. Of course, if you have a good server and good site code, you don't need a cache plugin, but that's 0.001% =)

        • Pavel Karpov
          On 2015/11/12 at 13:42

          1. There is some misunderstanding again 🙂 The text inside the virtual robots.txt file will be created manually, by you. The DL Robots.txt plugin only allows storing this content not in a real file but in a virtual one – that's the only difference 🙂 It doesn't generate any content itself.

          2. Things like Visual Composer (I don't like them either, BTW) allow people without coding knowledge to create things at a pretty good level (better than they could do on their own). And yes, a product from a professional will always be better optimized for search engines and users; that's why we should say “thank you” to such tools – they let newcomers start, let us compare on real examples, and let us convert these newcomers into our clients some time later 🙂

          3. Agree 🙂 But again, it's too broad a topic (caching in WP), so there will be an additional article 🙂

