A Simple Step by Step Guide to SEO

Search results page – where the magic begins

The mythical first rank is like the Holy Grail, the City of Gold or Atlantis (it did exist!) – everyone pursues it, but only a few are blessed with reaching it. Why are high ranks important? Because they bring traffic, and traffic brings users who become clients. This is what Google's first search results page looked like in 1998:

[Google's first search results page, 1998]

And this is how the Google homepage has changed over the past 20 years.

What can be found on the search engine results page?

Nowadays, the results page contains several times more information.

A typical Google search results page includes:

  1. Google Ads – product advertisements
  2. Google Ads – text advertisements
  3. Videos
  4. Organic search results
  5. Google Maps
  6. Organic search results
  7. Google Ads
  8. Similar searches – a valuable source of information

What interests us most in Search Engine Optimization is presence in Google's organic search results (text links, maps, images). Of course, the appearance of Google's first results page changes from year to year, and it has evolved a lot over the past 20 years, adopting new features such as:

  • advertisements (2000)
  • images (2001)
  • spell check of the query (2002)
  • products (2002)
  • news (2002)
  • suggestions (2005)
  • links to subpages of the website from the first rank (2005)
  • video (2005)
  • weather forecasts, game results, video, biographies of famous people with their family relations (since 2005)
  • voice search (2008)
  • image search (2011)
  • Knowledge Graph (2012)

The better we can use the functionalities which Google gives us, the more traffic we can attract to our website.

What does the order of search results depend on?

There is no short answer to this question. We can point to hundreds of factors influencing a website's ranking (its evaluation by Google), which is reflected in how the website appears in search results. The most important factors, in my view, are:

  • the website's construction and structure, and the quality and optimization of its code
  • content quality and the content's optimization for the search engine
  • domains linking to the website, their thematic relation to it, and the number of references
  • being mobile-friendly
  • page loading speed

To cut a long story short: if we want our website to be rated highly by Google, it needs clean code, quality content and authority (other websites must link to it).

The number of visits to the website and its rank in search results

Click-through rate (CTR) is a ratio showing how often people who see a link click it. The lower your website ranks in the search results for a given query, the fewer people will click the link to your website, which in effect drives less traffic.
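As a minimal sketch, CTR is simply clicks divided by impressions; the numbers below are hypothetical, not from any real search console export:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of impressions that led to a click."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Hypothetical query stats: 1,000 impressions, 85 clicks
print(f"{ctr(85, 1000):.1%}")  # -> 8.5%
```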

What is CTR in organic search results like?

Let's analyse a chart of CTR in relation to the domain's rank in search results:

[Chart: CTR by position. Data collected by advancedwebranking.com based on 1,599,260 phrases from 30,720 domains.]

By analysing this data, we can easily arrive at the conclusion that:

The best place to hide a corpse is the second page of search results

What influences CTR?

Without doubt, the most important factor is the rank at which the website is displayed. But in 2018, when search results depend heavily on the user's location, search history and information from social networks, a website's rank can vary according to these factors. Once the domain is displayed on the first page of search results, CTR is influenced by:

  • adjusting the title and meta description to the query
  • rich snippets – additional descriptions generated from structured data markup in the page's code (e.g. product prices, reviews)
  • emoji and other nonstandard ways of attracting the user's attention
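The first two points can be sketched in markup. The title, description and structured data below are purely illustrative (the company name, text and rating values are invented), using the schema.org vocabulary that Google reads for rich snippets:

```html
<title>Warehouse Construction – Example Construction Co.</title>
<meta name="description"
      content="Turnkey warehouse construction – free quote within 24 hours.">

<!-- Structured data for rich snippets (schema.org); values are illustrative -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Warehouse construction service",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>
```

With such markup in place, the result in Google can show star ratings under the title, which tends to raise CTR at the same position.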

Search Engine Optimization – what to start with

SEO is a long-term process. I often say that once you start SEO, you need to keep doing it. Organic traffic is still the cheapest (per conversion) and most effective kind of traffic.

Strategy and competitors’ analysis

This is what all marketing campaigns should start with. In every industry there are dominant companies. If you look at them closely, you will surely be able to redirect part of their traffic to your website. If you invest enough time and money in this stage, you will save both in the future by avoiding mistakes you might commit without having analysed your competitors' activities. Still, never give up on your experience and what your instincts tell you.

Use tools for keyword analysis

Use the free keyword planner from Google and you will immediately see hundreds of phrases, with their traffic statistics, that can bring more traffic to your website. Additionally, check out external tools such as Ahrefs, Majestic, or the Polish Senuto.

Check what brings your competitors traffic

By analysing the meta titles and descriptions of subpages on your competitors' websites, you will find out which queries they want to rank for, and thus what brings them conversions in the form of purchases or inquiries. There are also services such as similarweb.com, where you can see a domain's estimated traffic and its sources. The data is not very detailed, but it gives a general overview of each domain. With this information you will be able to prepare a better strategy for attracting traffic.

Select key phrases based on your experience

Nobody knows better than you what services you offer and what products you sell. Use your knowledge and experience to select the keywords – think like your client. Think about what the clients ask about when they call or email. If you are still unsure of how to select keywords for your SEO, have a look at our SEO knowledge base.


The architecture of information – the website's structure

This part is often neglected by webmasters and, later, by programmers. A website's structure should be like a newspaper's – with its columns, categories, articles and sections. On a website, the user should move from general information to details, while being able to quickly reach other sections from the homepage.

While designing the structure of a website, one of our goals should be to design its navigation in such a way that the user can reach any section of the website within three clicks.
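The three-click rule can be checked mechanically: treat the site as a graph of internal links and measure the shortest path from the homepage to every page. A minimal sketch, with a hypothetical site graph:

```python
from collections import deque

def click_depth(links: dict[str, list[str]], start: str = "/") -> dict[str, int]:
    """Breadth-first search over internal links: the minimum number of
    clicks needed to reach each page from the homepage."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site graph: homepage -> categories -> subpages
site = {
    "/": ["/construction", "/renovation", "/blog"],
    "/construction": ["/construction/warehouses"],
    "/construction/warehouses": ["/construction/warehouses/insulation"],
}
depths = click_depth(site)
too_deep = [page for page, d in depths.items() if d > 3]
print(too_deep)  # pages violating the three-click rule -> []
```

Any page that never appears in the result is unreachable from the homepage, which is an even more serious navigation problem than excessive depth.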

Design the structure thinking like a user but seeing like a search engine.

From your point of view, the most interesting sections of a website are the ones through which the user can contact you, read about your company or see the photo gallery. From the point of view of SEO, they are worthless. If an analysis of the website shows that they (or, as is often the case, the privacy policy) are its most important sections, the website has been badly designed and part of its potential is being wasted.

Imagine that you have 100 points to give away. These points describe your website's potential and influence its rank in Google search results. If some of these points are lost because of a faulty structure, instead of taking advantage of the website's entire potential, you use only 80% of it. During an SEO audit we find such errors and correct them.

Draw your website structure based on your keyword selection, grouping keywords by their level of generality and search volume. For example, for a construction business which builds its own projects, realizes clients' projects and does renovations, the structure could look like this:

- construction services
   - detached house construction
   - warehouse construction
      - warehouse design
      - warehouse thermal insulation
   - construction of blocks of flats
- renovation services
   - wall painting
   - roof renovations
   - elevation renovations
- construction supervision
- real estate sales and rental
   - houses on sale
   - apartments on sale
   - outhouse rental
   - office rental
- blog
- knowledge base
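A structure like this maps naturally onto URL paths. As a sketch, the helper below flattens a nested category tree into one URL per section; the slugs are illustrative, not prescribed:

```python
def to_urls(tree: dict, prefix: str = "") -> list[str]:
    """Flatten a nested category tree into URL paths, one per section."""
    urls = []
    for name, children in tree.items():
        slug = prefix + "/" + name.replace(" ", "-")
        urls.append(slug)
        urls.extend(to_urls(children, slug))  # recurse into subsections
    return urls

# A fragment of the structure above, already slugified
structure = {
    "construction-services": {
        "warehouse-construction": {
            "warehouse-design": {},
            "warehouse-thermal-insulation": {},
        },
    },
    "renovation-services": {},
}
for url in to_urls(structure):
    print(url)
# /construction-services
# /construction-services/warehouse-construction
# /construction-services/warehouse-construction/warehouse-design
# /construction-services/warehouse-construction/warehouse-thermal-insulation
# /renovation-services
```

Keeping URLs aligned with the category tree means each page's address already tells both the user and the search engine where it sits in the hierarchy.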

Such a structure reflects, within the website, our SEO strategy – that is, how the search engine thinks. To this structure we should add pages that are important for the user but insignificant from the search engine's point of view, such as the Contact, Privacy Policy and About pages, making sure these subpages remain reachable.

User Experience and SEO

User experience (UX) is, briefly, the idea of creating a product in such a way that the client can use it intuitively, in line with both the designer's intentions and the user's own expectations. In the case of websites, good UX consists of:

  • transparent and easy navigation
  • content adjusted to user’s intents
  • correctly selected functional elements (newsletter sign-up, quick contact form, progress bar)

Designing websites with UX in mind influences SEO both directly and indirectly. A good user flow on the website means a lower bounce rate, which translates into a higher probability of conversion.

Internal Links

Some of the subpages on the website will be linked automatically thanks to correctly designed navigation. Nevertheless, it is still worth helping the robots reach particular subpages, e.g. from blog posts. From the article “5 Things to Avoid When Buying a House” we can create a link to the previously mentioned subpage, houses on sale, linking them with the phrase “cheap houses on sale”. This enriches the collection of keywords and phrases in the website's internal links.

You need to remember the First Link Counts rule. It means that if, on one page, you use several links to the same subpage, Google will take only the first anchor phrase into account and ignore the other links.
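In markup, the rule looks like this; the page, URL and anchor texts below are hypothetical:

```html
<!-- Two links on the same page pointing at the same subpage.
     Under the First Link Counts rule, only the first anchor
     text ("cheap houses on sale") is credited as a keyword. -->
<a href="/houses-on-sale">cheap houses on sale</a>

<p>Read our guide before you decide.</p>

<a href="/houses-on-sale">see our offer</a>  <!-- anchor text ignored -->
```

This is why the first link to each subpage should carry the keyword phrase you care about, while later, navigational links can use neutral text.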

There are ways to work around this rule, but I will write more about them in the next article.

Information access for bots

The robots.txt file is a border crossing for search engines and other bots. Before a bot visits a subpage, it checks this file to see whether it may visit the page (whether it is blocked). In robots.txt you can block access for chosen bots, as well as to chosen subfolders or URLs matching a given pattern.
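A minimal robots.txt might look like this; the domain, paths and blocked bot are purely illustrative:

```
# Applies to all bots: keep internal search and the admin panel out of the index
User-agent: *
Disallow: /admin/
Disallow: /search

# Block one specific crawler entirely (example of a per-bot rule)
User-agent: BadBot
Disallow: /

# Point bots at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow only controls crawling, not ranking: a page blocked here can still appear in results if other sites link to it, so robots.txt is not a substitute for proper noindex handling.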


Optimize the website's code

So, you have the strategy and the structure; now you need to take good care of the website's code, so that the website loads quickly. Like any SEO expert, I have my theories on optimization. The one I repeat most often when explaining the essence of SEO translates a domain's rank into energy costs: the less energy Google's servers spend indexing a website, the better it is assessed. In other words, the less Google pays for the electricity used to move bots around your website, the better for you. In a nutshell, this can be achieved by:

  • leaving in the Google index only those websites that are valuable for the user
  • limiting Google bots’ movements only to the indexable pages
  • making sure that the pages load quickly and do not include a lot of unnecessary code
  • compressing website’s code and images which will speed up page loading time and cut down on the amount of space that our data takes up on Google servers

Improve page load time

Did you know that, according to Google, over 53% of mobile users leave a website if it does not load within the first 3 seconds? It is definitely worth taking good care of the speed at which the entire website displays and loads. Google helps us do it with the PageSpeed Insights tool, thanks to which we can analyse a website's critical points and correct them. At Cyrek Digital we additionally use GTmetrix, YSlow and the developer console in the Google Chrome browser to identify the elements slowing down loading.

I used to think that Google provided so many free tools for social objectives – just to make the world a better place. I could not have been more wrong: if a user leaves a website before it loads, they will not see the AdSense advertisements installed on it, so Google will not make any money on them.

Redirecting vs optimization

In some situations redirects are necessary; in others they are inadvisable. Why? One of the worst scenarios is when we get a visit from the search engine (what a surprise – SEO works!) but the user lands on the default error 404 page (page not found) and leaves our website. This situation most commonly occurs after a site migrates to another engine, after the introduction of clean URLs, or after a change of address structure. In such cases you need to make sure that the user reaches another subpage corresponding to their search intent. This is achieved with a 301 redirect (permanent redirect). You should remember to:

  • redirect to the subpage most closely related in content. If you remove a product from the store, redirect the user to a similar one; if you do not have a similar product, redirect to the category page
  • avoid redirecting to the homepage. If the old page was generating traffic from organic search results, after redirecting it to the homepage you can be sure you will lose that traffic within a few days (unless the entire content and optimization of the subpage is moved to the homepage, but this hardly ever happens)
  • make sure the redirect is permanent (301) and not temporary (302)
  • confirm that the redirect does not create a redirect chain, so that the address the user reaches after requesting the nonexistent page is the final one
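The last point can be automated. Given a redirect map (old URL to new URL), the sketch below resolves each entry to its final destination so that no chain survives; the URLs are hypothetical:

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Resolve each redirect to its final destination, so every old
    URL points directly at the final page instead of forming a chain."""
    def final(url: str) -> str:
        seen = set()  # guard against redirect loops
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        return url
    return {src: final(dst) for src, dst in redirects.items()}

# Hypothetical chain left over after two site migrations:
chain = {
    "/old-product": "/products/old-product",
    "/products/old-product": "/products/new-product",
}
print(flatten_redirects(chain))
# -> {'/old-product': '/products/new-product',
#     '/products/old-product': '/products/new-product'}
```

The flattened map can then be written out as individual 301 rules in the server configuration, so both the user and the bot make exactly one hop.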

There are cases, however, when redirecting is inadvisable. If the website contains internal links to pages that are redirected to other addresses, take care to update those links to their final addresses.

Old-school SEO had it that:

  • 301 redirects cause the loss of about 15% of a subpage's strength, as confirmed in a video 5 years ago by Matt Cutts, then head of Google's webspam (quality) team
  • 302 redirects cause a 100% loss of the redirect's strength (by definition they are temporary)
  • migration to https harms a website's ranking because it redirects every URL

Current SEO teaching says that:

  • redirecting a website to its https version does not cause a loss of strength, according to John Mueller
  • long-lasting 302 redirects are sometimes interpreted by Google as permanent 301 redirects
  • redirects do not cause a loss of strength at all, as Gary Illyes assures

However, from our experience we can tell that in 2018 changing redirects from 302 to 301 still brings a huge leap in a website's ranking. The same occurs after updating internal links that point to redirects.


Duplicate internal content

Once you know what your prospective clients are looking for, make sure they find it on your website. It is a long way from the emergence of a need to the purchase. The more often a user comes across your website along that way, the better the chances that they will become your client.

This is why content is so important on every website – in sections such as articles, blogs, news and a knowledge base. This is where you can answer the user's “How?” and “Why?” questions, after which you can direct them to the sales-related part of the website – the product, service or contact page. Once you have content on the website, you must make it unique both internally (within the website) and externally (not copied from other domains).

This presents a significant problem especially for online stores that copy product descriptions from manufacturers and suppliers. Of course, we have trusted ways to ensure better uniqueness within a product page than the competition has.


Link building – how to win links to the website

For me personally, this is the most arduous part of the optimisation process, but at the same time it is the one that brings the best results. What is a good link? Let me give an example. Let's say that the best websites are the ones that have been manually selected as the best sources of information: government websites, university websites and websites of scientific publishers.

Acquiring a link to a commercial project from such a website is very difficult (almost impossible), so we assume that a link from such a place must be very valuable. If we cannot have it on a university's homepage, maybe on a department's subpage? This is more likely, though still not easy (we have had little success in this field). The next option would be the subpage of a professor teaching in a given department. Its content depends entirely on them and is usually not controlled by anyone else. Acquiring a link from such a subpage may be easier, but at the same time it will have less value.

We could describe the same hierarchy for an authoritative medium, such as a national news portal. A link acquired in an article published on the front page will have more value than one in an article on a subdomain hosting influencers' blogs. Such a service may also let anyone blog, giving each author their own subdomain. A link from such a blog will have the least value.

Simplifying, it would seem that the web services with the highest number of incoming links are the most valuable (authority sites). This would probably be the case were it not for link spam, link exchange systems and automatic link-acquisition tools. Google's algorithms efficiently filter out spam and do not count such links when calculating a website's rank.

Additionally, Google employs over 10,000 people who manually evaluate website quality. Last month the company updated its Search Quality Evaluator Guidelines – a document of over 200 pages describing in detail the positive and negative factors influencing a website's rank.

To sum up: quality, quality, quality – although diversification is also necessary. Every day we acquire links from various sources, including (in order from most to least effective):

  • links from internet fora (buzz marketing)
  • links from microblogs
  • links from our own blogs
  • links from external blogs (guest posting)
  • links from articles on our own topical portals
  • links from articles on external portals
  • links from undisclosed sources, pointing to valuable content within the website (content marketing)

A good article on a given industry, concerning topics of interest to the target group, will almost automatically generate traffic to the website. If it covers a trending topic and contains, e.g., research results, infographics or statistics, then other, more popular media (higher in the hierarchy) will be eager to reference the information presented in it, quoting it and publishing a link to the article in the footnotes.

Search engines account for almost half of global Internet traffic, so if you think you do not need SEO, think again.
