The common feature of the SEO methods and strategies covered in this article is that anyone can implement them easily. While publishing quality content is the main factor in increasing organic traffic, almost every SEO effort depends on the others.

Applying as many of these 100 SEO suggestions as possible can produce results in a short time, especially for new websites and for small businesses that want to rank in local searches and attract traffic.

1- Registering the Site with Google Search Console

First, the site should be registered with Google Search Console to check whether it is technically configured properly and to keep it under constant monitoring.

Search Console helps you track, manage, and monitor a site's status in Google Search results, and identify issues such as mobile compatibility problems.

2- Setting the canonical domain version, duplicate URL merge

http and https are protocol prefixes written in front of the domain name, and www is an optional subdomain prefix. If a website can be opened via both http and https, and/or both with and without the www prefix, Google sees these versions as duplicate URLs with the same content. One of these versions should be chosen as the preferred version, and the other versions should be redirected to it. By clearly telling Google which URL is the standard, we prevent the duplicates from becoming an SEO disadvantage. See also: Duplicate URL Merge, Htaccess URL Redirection
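As a sketch (assuming an Apache server with mod_rewrite enabled, and using example.com as a placeholder domain), all non-preferred versions can be redirected to the preferred https://www version in the .htaccess file:

```apache
# Redirect http:// and non-www requests to the preferred https://www. version
# (assumes Apache with mod_rewrite; replace example.com with your own domain)
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The 301 status code tells search engines the move is permanent, so the preferred version inherits the signals of the others.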

3 - Keeping the website indexable

For search engines to be able to crawl and index a website, there should be no "noindex" directive in the HTML meta tags and no blocking rules in the robots.txt file. These directives are not enabled by default in content management systems such as WordPress; however, the "Discourage search engines from indexing this site" option under the Settings > Reading tab may be activated while the site is under construction and then forgotten.

When crawling problems occur, the first things to check are the WordPress settings and any plugins that may have modified the robots.txt file or added noindex meta tags, so that your SEO is not affected.
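For reference, the directive in question looks like this when present in a page's HTML head (a sketch; this tag should be removed from any page you want to rank):

```html
<!-- A robots meta tag that blocks indexing; remove it from pages that should appear in search results -->
<meta name="robots" content="noindex, nofollow">
```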


4- Creating XML Sitemap

A sitemap is an XML file that lists the URLs of a site. Many content management systems create XML sitemaps automatically. However, there are ways to create and manage more efficient XML sitemaps.
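A minimal sketch of a sitemap file (the domain and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-27</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```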

5- Submitting XML Sitemap

An SEO plugin or any sitemap generator plugin used on WordPress can create a custom sitemap under its own name. Specifying and submitting the sitemap path in Search Console lets Google see information such as when each URL was last updated, how often it changes, and how important it is relative to the other URLs on the site.

6- Creating a robots.txt file

Robots.txt is a text file containing instructions for search engine crawlers. It defines which URLs search engine spiders may crawl. Not every URL is named explicitly in the robots.txt file; instead, crawling of certain areas is disallowed. With this simple text file, entire domains, subdirectories, or a specific file can easily be excluded from search engine spiders. However, this file does not protect against unauthorized access. Robots.txt is located in the root directory and is the first document opened by the bots that visit the site.
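A minimal robots.txt sketch for a WordPress-style site (the paths and sitemap URL are illustrative):

```
# Applies to all crawlers
User-agent: *
# Keep bots out of the admin area, but allow the AJAX endpoint the front end uses
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```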

7- URL Inspection (Search Console)

The Google Search Console URL Inspection tool provides useful insights about the site. After entering any of the site's URLs into the inspection bar, you can check:

  • when the URL was last crawled,
  • the status of the last crawl,
  • whether there are crawl errors,
  • whether the page is properly indexed,
  • whether the URL is mobile-friendly,
  • whether there is a linked AMP page.

8- Site Load Time, Acceleration

According to most analyses, if a site takes more than 3 seconds to load, the bounce rate can rise, on average, from 50% to 90%. After the mobile-first indexing update, the mobile user experience became even more important for SEO. Mobile users on the go do not like to wait; it is only a matter of time before they press the back button (pogo-sticking), so mobile speed is very important.

9- Customized Error (404) Page

The 404 error, returned when a removed or nonexistent URL is requested, is the most common error. On this page, instead of showing only the 404 error code, you can offer users suggestions for pages they can visit.

10- 301 Redirects

301 redirects are done to protect domain authority and search rankings when a page's URL is changed for any reason.

The goal is a smooth transition on the SEO side when content is moved to a different URL.

A 301 redirect is a permanent redirect from one URL to another. 301 redirects send site visitors and search engines to a different URL than the clicked or typed URL.
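As a sketch (assuming an Apache server; both paths are illustrative), a single permanent redirect can be defined in the .htaccess file:

```apache
# Permanently (301) redirect the old address to the new one
Redirect 301 /old-page/ https://www.example.com/new-page/
```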

11- Finding and Fixing Broken Links

As a result of internal linking, links to pages that have been removed or whose URLs have permanently changed may remain in place and return a 404 error. Broken links can be found with Screaming Frog or with browser plugins that detect broken links. Up to 500 URLs can be analyzed for free with the Screaming Frog crawler.

12- Detection and control of duplicate content

Duplicate content occurs when a particular piece of content appears on different pages within the website. Publishing part or all of the same content on multiple pages can have a negative impact on SEO. Such pages should be identified and removed if possible; if removal is not possible, they should point to the original (preferred) version with canonical tags.
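A sketch of the canonical tag, placed in the head of the duplicate page (the URL is illustrative):

```html
<!-- Tells search engines which URL is the preferred version of this content -->
<link rel="canonical" href="https://www.example.com/original-page/">
```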

How does Google detect the first version of Duplicate Content?

13- Secure connection protocol, using SSL

Google recommends that websites communicate over a secure connection, that is, the "https" protocol, and states that it carries a small weight as a ranking signal. Setting up SSL is now free and easy thanks to tools such as Let's Encrypt (Certbot) and OpenSSL.


SSL (Secure Sockets Layer) is a standard security technology used to establish an encrypted connection between the server and the user's browser.

14- Structured Data Marking (Rich Snippets)

Structured data markup provides search engines with information about page content in predetermined formats and produces a richer appearance in search results. For example, a page marked up with FAQ structured data can appear in the results with its questions and answers displayed.
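As a sketch, FAQ structured data can be added to a page as a JSON-LD block in the HTML (the question and answer text are illustrative):

```html
<!-- FAQ structured data in JSON-LD format, following the schema.org FAQPage type -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is structured data?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Machine-readable markup that describes page content to search engines."
    }
  }]
}
</script>
```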


Review Structured Data Types

15- "hreflang" application for multiple language websites

A multilingual website publishes the same content in different languages. Multilingual websites can cause duplicate content issues; therefore, the hreflang tag should be applied to each language version of a page.

Example:

rel="alternate" hreflang="tr-TR"
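A fuller sketch, with illustrative URLs, placed in the head of every language version of the page:

```html
<!-- Each language version lists all alternates, including itself (URLs are illustrative) -->
<link rel="alternate" hreflang="tr-TR" href="https://www.example.com/tr/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```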

16- Breadcrumbs

Breadcrumbs are links, usually in the upper left corner of the page, that show users which page they are on and the path from the homepage to that page. Breadcrumbs serve as a kind of navigation for users while helping search engines understand the page hierarchy.
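A minimal breadcrumb sketch in HTML (the page names and URLs are illustrative):

```html
<!-- A simple breadcrumb trail from the homepage down to the current page -->
<nav aria-label="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/blog/">Blog</a> &gt;
  <span>Technical SEO</span>
</nav>
```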

17- Mobile compatibility

Mobile compatibility is one of the indispensable technical features of websites. No matter how small the screen, the site's design and styles should be configured to adapt to it.

18- Responsive web design (responsive)

Responsive web design is based on the idea that the website can be displayed on all device screens without any problems. The user should be able to view the website from a mobile phone or giant screen smart TV without any problems.
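A minimal sketch of the responsive basics (the class name and breakpoint are illustrative):

```html
<!-- The viewport meta tag makes the layout scale to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Example breakpoint: stack the sidebar below the content on narrow screens */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>
```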

19- AMP “Accelerated Mobile Pages”

Accelerated Mobile Pages are mobile-compatible pages that load faster than traditional mobile pages. An AMP page is served when clicked from the search results and is essentially a simplified HTML version of the original page. Google, Pinterest, LinkedIn, Twitter, and many other applications support AMP. Since AMP improves the mobile user experience, it is indirectly a Google ranking factor.

20- Pay attention to advertisement placement

Websites displaying online sponsored ads, such as AdSense, should pay attention to the selection and placement of those ads. For example, a full-page ad that pops up when switching from one page to another is a common cause of bouncing. Users do not come to websites to watch advertisements, so ads should not interfere with the purpose of the visit.

21- Making the content the best possible for mobile users

Large blocks of text can put off mobile users and cause them to leave the site. Content is much more readable with an appropriate font type, size, and coloring, and with paragraphs of two or three sentences.

22- Using bullets to increase readability

Item lists consisting of numbers or symbols significantly increase readability. Using bullets when appropriate allows the content to be read faster.

23- Facilitating Sharing on Social Media

It is important to add the necessary tools to the website so that mobile or desktop users can easily share the content. Social sharing buttons can be created with WordPress plugins.

25- Mobile and Tablet Compatibility Check with the Google Chrome Emulator

The Google Chrome device emulator (in the browser's developer tools) is ideal for viewing the website on different device screens and checking for script, secure-connection, and many other errors. Any page viewed on a desktop screen can be displayed in various smartphone simulations with this tool.

26- Highlighting keywords to attract the attention of the user

Highlighting keywords in bold does not mean better rankings. This method makes it easier for the user to find the important parts of the text and can increase time on page.

27- Reducing Pogo-sticking

Mobile users are often on the move and in a hurry to access information immediately. By stating the main idea of the text in the first lines, the pogo-sticking effect, meaning the action of pressing the back button as soon as the page loads and returning to the search engine results, can be reduced.

28- Using long-tail keywords to get more relevant traffic

Users who know what they want are more likely to complete targeted actions such as a sale, and they type what they are looking for into the search engine in more detail. For example, the query "vegan recipes" is very general, and the user is probably not sure exactly what they are looking for. A user who searches with a long-tail query such as "vegan dinner recipes" knows what they want, stays longer on the page they click, and therefore has a good chance of completing the conversion goal.

29- Finding long-tailed keywords

Keyword research tools rank keywords that have already been searched, so there is little chance of discovering new keywords in them. Question-and-answer platforms such as Quora are great, especially for finding new long-tail keywords. These platforms may contain keywords that have rarely been used on other websites before.

30- Finding keywords from Wikipedia

Wikipedia is a great platform for finding keywords on any subject. All the links you find on a Wikipedia topic page are potentially relevant keywords that you can use in your content or build new content around.

To be continued…