The mythical first rank is like the Holy Grail, the City of Gold or Atlantis (it did exist!) – everyone pursues it, but only a few are blessed with reaching it. Why are high ranks important? Because they provide traffic, and traffic brings users who become clients. This is what Google's first search results page looked like in 1998:
And this is how Google's homepage has changed over the past 20 years:
Nowadays, there is several times more information.
What interests us most in terms of Search Engine Optimization is presence in Google's organic search results (text links, maps, images). Of course, the appearance of Google's first search results page changes from year to year; it has evolved a lot over the past 20 years, adopting new features such as:
The better we can use the functionalities which Google gives us, the more traffic we can attract to our website.
There is no short answer to this question. We could point to hundreds of factors influencing a website's ranking (its evaluation by Google), which is reflected in where the website appears in search results. The most important factors, in my view, are:
To cut a long story short: if we want our website to be evaluated highly by Google, it needs clean code, quality content and authority (other websites must link to it).
Click-through rate (CTR) is a ratio showing how often the people who see a link click it. The lower your website ranks in search results for a given query, the fewer people will click the link to your website, which, in effect, drives less traffic.
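CTR itself is simple arithmetic; here is a minimal sketch (the function name and the figures are made up for illustration):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of impressions that resulted in a click."""
    if impressions == 0:
        return 0.0  # avoid division by zero when a listing was never shown
    return clicks / impressions

# Hypothetical example: a result shown 1,000 times and clicked 87 times.
print(f"{ctr(87, 1000):.1%}")  # prints 8.7%
```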
Let's analyse a chart of the relationship between CTR and a domain's rank in search results:
Data collected by advancedwebranking.com based on 1 599 260 phrases from 30 720 domains.
By analysing this data, we can easily arrive at the conclusion that:
The best place to hide a corpse is the second page of search results
Without doubt, the most important factor is the rank at which the website is displayed. But in 2018, when search results depend heavily on the user's location, search history and information from social networks, a website's rank can vary according to these factors. Once the domain is displayed on the first page of search results, the CTR is influenced by:
SEO is a long-term process. I often say that once you start SEO, you need to keep doing it. Organic traffic is still the cheapest (per conversion) and most effective kind of traffic.
This is what every marketing campaign should start with. In every industry there are dominant companies. If you look at them closely, you will surely be able to redirect part of their traffic to your website. If you invest enough time and money in this stage, you will save both in the future by avoiding mistakes you would otherwise make without having analysed your competitors' activities. Still, never give up on your own experience and on what your instincts tell you.
Use Google's free Keyword Planner and you will immediately see hundreds of phrases, with their traffic statistics, that can bring more traffic to your website. Additionally, check out external tools such as Ahrefs, Majestic, or the Polish Senuto.
By analysing the meta titles of subpages on your competitors' websites, you will find out which queries they want to rank for, i.e. which ones bring them conversions in the form of purchases or inquiries. There are also services such as similarweb.com, where you can see a domain's estimated traffic and its sources. The data is not very detailed, but it gives a general overview of each domain. With this information you will be able to prepare a better strategy for attracting traffic.
Nobody knows better than you what services you offer and what products you sell. Use your knowledge and experience to select the keywords – think like your client. Think about what the clients ask about when they call or email. If you are still unsure of how to select keywords for your SEO, have a look at our SEO knowledge base.
This part is often neglected by webmasters and, later, by programmers. A website's structure should be like a newspaper's – with its columns, categories, articles and sections. On a website, the user should move from general information to details, while being able to jump quickly to other sections from the homepage.
While designing a website's structure, one of our goals should be navigation that allows the user to reach any section of the website within three clicks.
Imagine that you have 100 points to give away. These points describe your website's potential and influence its rank in Google search results. If some of these points are lost because of a faulty structure, then instead of taking advantage of the website's entire potential, you use only, say, 80% of it. During an SEO audit we find such errors and correct them.
Draw up your website structure based on your keyword selection, grouping keywords according to their level of generality and search volume. For example, for a construction business that builds its own projects, realizes clients' projects and does renovations, the structure could look like this:
- construction services
- detached house construction
- warehouse construction
- warehouse design
- warehouse thermal insulation
- construction of blocks of flats
- renovation services
- wall painting
- roof renovations
- elevation renovations
- construction supervision
- real estate sales and rental
- houses on sale
- apartments on sale
- outhouse rental
- office rental
- knowledge base
Such a structure reflects our SEO strategy within the website – that is, the way the search engine thinks. To this structure we should add pages that are important for the user but insignificant from the search engine's point of view, such as the Contact, Privacy Policy and About pages, making sure they remain reachable.
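To make the idea concrete, here is a small sketch in Python (the slugs are illustrative assumptions derived from the section names above) that expresses the section tree as nested dictionaries and flattens it into URL paths; the nesting depth mirrors how many clicks it takes to reach a section:

```python
# Hypothetical sketch: the section tree expressed as nested dicts.
structure = {
    "construction-services": {
        "detached-house-construction": {},
        "warehouse-construction": {
            "warehouse-design": {},
            "warehouse-thermal-insulation": {},
        },
        "construction-of-blocks-of-flats": {},
    },
    "renovation-services": {
        "wall-painting": {},
        "roof-renovations": {},
        "elevation-renovations": {},
    },
    "construction-supervision": {},
    "real-estate-sales-and-rental": {
        "houses-on-sale": {},
        "apartments-on-sale": {},
        "outhouse-rental": {},
        "office-rental": {},
    },
    "knowledge-base": {},
}

def paths(tree, prefix=""):
    """Yield every URL path in the tree; depth mirrors the site's hierarchy."""
    for slug, children in tree.items():
        path = f"{prefix}/{slug}"
        yield path
        yield from paths(children, path)

for p in paths(structure):
    print(p)
```

Every generated path sits at most three levels deep, matching the three-click guideline mentioned earlier.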
User experience (UX) is, briefly, the idea of creating a product in such a way that the client can use it intuitively, in line with the designer's intentions and expectations as well as the user's own. In the case of websites, good UX consists of:
Designing websites with UX in mind influences SEO both directly and indirectly. A good user flow on the website means a lower bounce rate, which translates into a higher probability of conversion.
Some subpages will be linked automatically thanks to correctly designed navigation. Nevertheless, it is still worth helping the robots reach particular subpages, e.g. from blog posts. From the article "5 Things to Avoid When Buying a House" we can link to the previously mentioned subpage houses on sale, using the anchor phrase "cheap houses on sale". This way, we enrich the collection of keywords and phrases in the website's internal links.
You need to remember the First Link Counts rule: if on one page you place several links to the same subpage, Google will take into account only the anchor text of the first link; the others will be ignored.
There are ways around this rule, but I will write more about them in the next article.
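A minimal HTML sketch of the rule (the URL and anchor texts are made-up examples):

```html
<!-- Two links on one page pointing to the same subpage. Under the
     First Link Counts rule, only the first anchor text
     ("cheap houses on sale") is attributed to /houses-on-sale;
     the anchor text of the second link is ignored. -->
<a href="/houses-on-sale">cheap houses on sale</a>
<!-- ...other content... -->
<a href="/houses-on-sale">see our full offer</a>
```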
The robots.txt file is a border checkpoint for search engines and other bots. Before a bot visits a subpage, it checks in this file whether it is allowed to (whether it has not been blocked). In robots.txt you can also block access for chosen bots, as well as to chosen subfolders or URLs matching a given pattern.
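A minimal robots.txt sketch (the bot name and paths are made-up examples):

```txt
# Block every bot from the /admin/ folder and from URLs matching a pattern
User-agent: *
Disallow: /admin/
Disallow: /*?sort=

# Block one particular bot from the entire site
User-agent: BadBot
Disallow: /
```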
So, you have the strategy and the structure; now you need to take good care of the website's code, so that the website loads quickly. Like any SEO expert, I have my own theories on optimization. The one I repeat most often when explaining the essence of SEO translates a domain's rank into energy costs: the less energy Google's servers spend to index a website, the better it is assessed. In other words, the less Google pays for the electricity used to move its bots around your website, the better for you. In a nutshell, this can be achieved by:
Did you know that, according to Google, over 53% of mobile users leave a website if it does not load within the first 3 seconds? That means it is definitely worth taking good care of how fast the entire website renders and loads. Google helps us do this with its PageSpeed Insights tool, which lets us analyse a website's critical points and correct them. At Cyrek Digital we additionally use GTmetrix, YSlow and the developer console in the Google Chrome browser to identify the elements slowing down loading.
I used to think that Google provided so many free tools for social reasons – just to make the world a better place. I could not have been more wrong: if a user leaves a website before it loads, they will not see the AdSense advertisements installed on it, so Google will not make any money on them.
In some situations redirects are necessary, and in others they are inadvisable. Why? One of the worst scenarios is when we get a visit from the search engine (what a surprise – SEO works!) but the user lands on the default 404 error page (page not found) and leaves our website. This most commonly happens after a site migrates to another engine, after the introduction of clean URLs, or after a change in URL structure. In such cases you need to make sure that the user reaches another subpage corresponding to their search intent. This is achieved with a 301 redirect (permanent redirect). You should remember to:
There are cases, however, when redirecting is inadvisable. If the website contains internal links to pages that are redirected to other addresses, take care to update those links to their final addresses.
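As an example, on an Apache server a permanent redirect can be declared in the .htaccess file (the addresses below are made up for illustration):

```apache
# Send visitors (and bots) from an old address to its new equivalent
Redirect 301 /old-offer.php /houses-on-sale

# Or map a whole family of old URLs onto the new clean-URL structure
RewriteEngine On
RewriteRule ^offer/(.*)$ /houses-on-sale/$1 [R=301,L]
```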
Old-school SEO held that:
Current SEO teaching says that:
30x redirects don't lose PageRank anymore.— Gary "鯨理" Illyes (@methode) July 26, 2016
However, from our experience we can tell that in 2018 changing redirects from 302 to 301 still brings a huge leap in a website's ranking. The same occurs after eliminating redirects from internal links.
Once you know what your prospective clients are looking for, make sure they find it on your website. It is a long way from the emergence of a need to a purchase. The more times a user comes across your website along that way, the better the chance that they will become your client.
This is why content is so important on every website – in sections such as articles, blogs, news and a knowledge base. This is where you can answer the user's "How?" and "Why?" questions, after which you can direct them to the sales part of the website, i.e. the product, service or contact page. Once you have content on the website, you must make sure it is unique both internally (within the website) and externally (not copied from other domains).
This is a significant problem, especially for online stores that copy product descriptions from manufacturers and suppliers. Of course, we have trusted ways to ensure better uniqueness within a product page than the competition has.
For me personally, this is the most arduous part of the optimisation process, but at the same time it is the one that brings the best results. What is a good link? Let me give you an example. Let's say that the best websites are the ones that have been manually selected as the best sources of information: government websites, university websites and the websites of scientific publishers.
Acquiring a link to a commercial project from such a website is very difficult (almost impossible), so we assume that a link from such a place must be very valuable. If we cannot get one on a university's homepage, maybe on a department's subpage? This is more likely, though still not easy (we have had little success in this field). The next option would be the subpage of a professor teaching in a given department. Its content depends entirely on them and is usually not controlled by anyone else. Acquiring a link from such a subpage may be easier, but at the same time it will be worth less.
We could describe the same hierarchy for an authoritative medium such as a national news portal. A link acquired in an article published on the front page will be worth more than one in an article on a subdomain hosting influencers' blogs. A service may also let anyone blog and receive their own domain. A link from such a blog will have the least value.
Simplifying, it seems that the services with the highest number of incoming links are the most valuable (authority sites). This would probably be the case were it not for link spam, link exchange systems and automated tools for acquiring links to websites. Google's algorithms efficiently filter out spam and do not consider such links when calculating a website's rank.
Additionally, Google employs over 10,000 people who manually evaluate websites' quality. Last month the company updated its Search Quality Evaluator Guidelines – a document of over 200 pages describing in detail the positive and negative factors influencing a website's rank.
To sum up: quality, quality, quality – although diversification is also necessary. Every day we acquire links from various sources, which include (in order from the most to the least effective):
A good article on an industry topic that interests the target group will almost automatically generate traffic to the website. If it covers a trending topic and contains, e.g., research results, infographics or statistics, then other, more popular media (higher in the hierarchy) will be eager to reference the information it presents, quoting it and publishing a link to the article in their footnotes.
Search engines account for almost half of global Internet traffic, so if you think you do not need SEO, think again.