Introduction to Search Engine Optimization

Karolina 14.08.2019

There are many reasons to boost a website’s search engine rankings. To name a few: to gain a broader audience, build brand awareness and cultivate customer trust.

A vague undertaking

The main problem with working on broadly defined SEO is its ambiguous nature – it is never completely clear how a specific adjustment will affect a website’s search engine performance. Nevertheless, combining varied techniques with reliable monitoring tools can bring noticeable improvement in a short period of time and steady growth in the long run.

Because of the breadth of the subject, only selected techniques will be presented.

Become worthwhile

Nowadays, readability, user experience and accurate, high-quality content may be more valuable than ever. RankBrain, a relatively new search engine algorithm based on machine learning, is one of the most important factors in Google’s ranking process. It learns which pages seem to be the most accurate results for certain search queries by analyzing many relevance indicators, one of which is dwell time – the time a user spends browsing a search result before leaving it. With the new algorithms in use, one may expect the significance of useful and accessible content to keep growing, overshadowing the SEO gimmicks of the past.

Decreasing the bounce rate might also be beneficial to your SERP (Search Engine Results Page) position. A bounce is a user visit that ends at the first viewed page – it’s completely normal for some types of sites, like blogs and single-page apps, but for most it indicates a failure to attract the user’s attention. The most basic and effective way to encourage a user to delve into your website is to maintain a useful internal linking structure that is not overwhelming.

Tag it up

Proper meta tagging remains a core concept of a search engine optimized website.

  • Title tags are concise descriptions of a website’s content. They are displayed as the headers of results provided by search engines and should typically be no longer than 70 characters.
    <title>A concise page title</title>
  • Description tags should summarize the content of a particular page and provide a natural way to include important keywords. The tags are shown just below the page titles on SERPs unless the search engine decides to swap your description with one of their own. To avoid truncation, keeping the length under roughly 160 characters is commonly recommended.
    <meta name="description" content="A description of a website.">
  • Alternative text should be provided for all important images, except for those that are merely decorative. It is used by visually impaired users and by web crawlers to better understand the content of a website.
    <img src="image.png" alt="an alternative text">

Title and description tags affect search engine rankings indirectly – they catch users’ attention, increasing the CTR (Click Through Rate) and thus pushing your website higher.

Moreover, one should not forget to use HTML5 semantic markup to better describe your content to crawlers. The semantic tags should be used according to their standard definitions; used that way, they help search engines correctly identify and categorize the sections of your pages.
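As a sketch of the idea, a page laid out with semantic tags (all element names below are standard HTML5; the content is made up) might look like this:

```html
<!-- Each tag tells crawlers the role of the section it wraps -->
<header>
  <nav>
    <a href="/articles">Articles</a>
  </nav>
</header>
<main>
  <article>
    <h1>Introduction to Search Engine Optimization</h1>
    <p>Article content…</p>
  </article>
  <aside>Related links</aside>
</main>
<footer>Copyright notice</footer>
```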

Don’t overdo it

Avoid stuffing the tags with unneeded keywords – search engines have become better at recognizing this and might actually decrease the site’s rankings, not to mention that lower content quality might discourage users from visiting again.

Avoid creating duplicate content on different subpages and never use somebody else’s work, as it might actually backfire on your ranking.

Gather the power

If a website contains many duplicate pages or has dynamic paths leading to the same results, canonicalization and/or 301 redirects might be necessary. Both, to some extent, transfer the ranking “power” of a page to another one, creating a single source of information with a higher ranking position. Canonicalization is achieved by means of special tags in a website’s source, for instance:

<link rel="canonical" href="http://canonical-address.com/" />

while 301 redirects are made by setting the proper HTTP response code. SEO-wise, both solutions have a similar effect – the main difference between them is user experience.

Make it faster

Web performance optimization is key to creating a site that is both user and search engine friendly. Long loading times will almost certainly increase your bounce rate, while also decreasing your ranking position more directly, as site speed is one of the factors taken into consideration by Google’s ranking algorithm.

To achieve a higher level of website performance:

  1. Minimize or remove render-blocking JavaScript by loading scripts asynchronously, inlining critical scripts or deferring the loading of JavaScript until the page has rendered.
  2. Compress and minify source files.
  3. Improve the server response time, especially for users coming from the targeted locations.
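Point 1 can be addressed with the standard async and defer attributes (the script names below are placeholders):

```html
<!-- async: fetched in parallel, executed as soon as it arrives -->
<script async src="analytics.js"></script>
<!-- defer: fetched in parallel, executed only after the document is parsed -->
<script defer src="app.js"></script>
```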

Be reliable

Getting rid of most HTTP 400 and 500 error codes is a must for an SEO-oriented website. These error codes make your page look unreliable and lose you valuable visitors. They also hurt your search engine position. The easiest way to deal with them is to monitor your website with Google Search Console and fix the errors one by one. You can then mark them as fixed to let Google know about your changes.

Draw a map

As your site grows bigger, it might be useful to consider creating your site’s XML sitemap – as Google points out, it might help them crawl more of your pages. A sitemap is a special file, accessed by search engines, that describes your site’s structure. It is of special importance for large sites with little internal linking and for new sites that have few external links leading to them. After preparing the file, add it using Google Search Console.
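A minimal sitemap, following the sitemaps.org XML format (the URL and date below are placeholders), could look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2019-08-14</lastmod>
  </url>
</urlset>
```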

Gain their trust

Due to the increasing amount of spam on the web, assessing site trustworthiness has become a necessity. One of the most important ways of doing that is counting the number of trusted, popular pages pointing to a site – so make sure your webpage earns such links and doesn’t itself link to sources that look irrelevant or seem to contain spam or inappropriate content. Social media are one of the more accessible sources of external links, so don’t hesitate to take advantage of them to generate at least a part of your external linking.

Target local audience

When possible, it is of vital importance to indicate your business’ local activity. Through a clever usage of location-related keywords, management of your Google My Business profile, social media engagement and structured metadata about your premises, you might be able to gain additional Internet traffic. Because it is more narrowly targeted, this traffic might have a better than usual conversion rate.
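Structured metadata about premises is usually expressed with the schema.org vocabulary; a minimal JSON-LD sketch (all business details below are made up) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Warsaw",
    "postalCode": "00-001"
  },
  "telephone": "+48 123 456 789"
}
</script>
```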


Go mobile

Mobilegeddon was a term coined by the press in 2015 to warn site owners about Google updating its algorithm to rank websites that display well on mobile devices higher. Although the change wasn’t that drastic, small-screen compatibility has become an important SEO factor and a further shift towards algorithms focused on mobile experience can be expected. Therefore, the mobile-first approach is becoming an industry standard and might be worth switching to.
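A prerequisite for mobile-friendly rendering is the standard viewport declaration:

```html
<!-- Without this tag, mobile browsers render the page at desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```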

Make peace with the enemy

AMP, or Accelerated Mobile Pages, is an open-source initiative to enhance user experience when browsing web pages on mobile devices. Many publishers treat AMP as a sort of enemy which makes their advertisements less profitable. Like it or not, however, adjusting your content to AMP will most likely improve your mobile performance tremendously and you might also get a better place in the Google ranking, which can lead to increased traffic and better conversion rates in the future. Also, as it’s a technology developed and supported by the biggest search engine company, we might expect it to become a very important factor in the coming years.
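When an AMP version of a page exists, the two documents point at each other (the paths below are placeholders): the regular page advertises its AMP counterpart, and the AMP page links back to the canonical one.

```html
<!-- On the regular page: -->
<link rel="amphtml" href="http://example.com/article.amp.html">

<!-- On the AMP page: -->
<link rel="canonical" href="http://example.com/article.html">
```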

Interest them at first glance

Finally, to gain a better click-through rate, it might be worth spending some of your time adding structured data according to Google’s guidelines. Although there are no guarantees, Google might take your metadata into consideration and start displaying rich results for your website, which draw more user attention than regular, plain-text results.
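As an illustration, marking a post up as an Article in JSON-LD (the values below are placeholders) is one of the formats Google can build rich results from:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Introduction to Search Engine Optimization",
  "datePublished": "2019-08-14",
  "author": {
    "@type": "Person",
    "name": "Karolina"
  }
}
</script>
```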

Shield from the attacks

Negative SEO is a technique of decreasing another site’s ranking through not-so-ethical actions. Although rarely used, it might seriously worsen your search engine performance. To protect yourself from getting harassed by your competitors, it’s important to:

  • Monitor your site’s speed and availability—you might become a victim of a denial-of-service or crawling attack.
  • Keep an eye on your external linking—a sudden, unnatural rise might suggest you have become a victim of link farm spamming, i.e. of attempts to make search engines believe you’re trying to outsmart them by using automatically generated websites that link to you. To stop these attacks, disavow the hostile domains pointing at you through Google’s Disavow Links Tool.
  • Find pages using your content as their own (e.g. using some professional antiplagiarism tools) and inform Google about copyright infringement.
  • Revise your site’s security to make sure you’re safe from more direct forms of attacks.


Track your progress

Being able to track the effectiveness of your actions is essential to improving your search engine visibility. Google Search Console is a feature-rich tool – among other things, it allows you to:

  • Generate reports on your site’s performance, including the CTR for different search queries.
  • Report on both internal and external linking.
  • Display warnings about incorrect metadata.
  • Inspect HTTP error response codes received by Google crawlers.
  • Inform about your rich results and their performance.
  • Get an overview of AMP-related issues.

The tool is completely free and there is absolutely no reason not to use it. However, you might consider pairing it with additional, paid SEO services. These generally hint at what adjustments your site could benefit from: tracking keyword rankings in bulk, reporting on your linking, comparing you with your competitors (thus letting you address their actions and keep up with them), flagging your site’s performance problems and finding unused keywords.

For more advanced analytics and monitoring, you might also think about using some professional SERP scraping services to collect important data and use it to tailor your own solution.
