SEO Terms and Definitions

A Complete Glossary of Essential SEO Terms and Definitions.



Above the Fold

A term used to describe the area of a webpage visible before scrolling. The phrase was originally coined to describe the top portion of a folded newspaper. There's an established perception in the digital marketing industry that ads above the fold perform better than those below the fold. To improve user experience, SEO professionals recommend placing the most important part of your content above the fold in order to catch your visitors' attention.

AdSense

AdSense is Google's contextual advertising network, which allows publishers to rent out the real estate of their websites/blogs to display ads from Google. When visitors click on ads shown by the Google AdSense network, the revenue generated is shared between the publisher and Google in a predefined ratio.

AdWords

AdWords (now Google Ads) is Google's paid advertising program wherein online business owners can participate and show their ads to Google users. Under the AdWords advertising program, advertisers pay for every click on their ads shown across Google's advertising network.

Affiliate Marketing

Affiliate Marketing is the process of marketing products/services in return for a commission paid per sale. Many bloggers sign up for affiliate programs and use special affiliate links in posts that explain the benefits of the products they are affiliated with. Contrary to AdSense, where Google pays you per click, affiliate programs pay a predefined commission only when a certain goal is completed, e.g. a product sale, sign-up or subscription.

Analytics

Analytics refers to user tracking software designed to track the behaviour of website visitors in order to improve digital marketing campaigns. Page views, unique visitors and bounce rate are some of the key metrics in analytics software. Google Analytics is a free and popular analytics tool from Google.

Anchor Text

The clickable text of a hyperlink on a webpage. Anchor text leads the user to a destination URL in order to perform a particular action, such as getting more information, placing an order or reviewing the source of information. As per SEO best practices, choosing contextually relevant words rather than repeating your target keywords is a good idea for anchor text. Google has become particularly intolerant of anchor text manipulation.

Alt Tags

The text that describes an image in a webpage's HTML. Browsers display it when the image fails to load, and screen readers read it aloud. Alt tags help search engine bots better understand images and index them properly. They are also useful in making your website accessible to visually challenged visitors.
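
A minimal example of alt text in HTML (the image path and description are hypothetical):

<img src="/images/red-armchair.jpg" alt="Red fabric armchair with wooden legs">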


Backlinks

Backlinks are the links that a website gets from another website. For example, if yourwebsite.com links to mywebsite.com, then mywebsite.com earns a backlink.

Also known as an inbound link, a backlink is essentially a clickable word or phrase. On a webpage, a word or phrase that is linked to a page on another website is known as a hyperlink.

When you hover your mouse on a hyperlink, it shows you the destination page URL. If you click on the hyperlink, then you land on the destination page URL.

Webmasters use hyperlinks as references and to help their readers better understand a topic. Therefore, getting a backlink from another website is a natural process. It's like earning a vote of trust or acknowledgement from a stranger.

The best way to earn backlinks from other websites is to focus on creating useful content. When your website has lots of useful information, other websites link to your website pages in order to help their own readers.

Black Hat SEO

Black Hat SEO is an approach to optimizing a website with search engine bots in mind rather than human visitors. Typically, Black Hat SEO involves manipulating search engine algorithms and blatantly violating search engine guidelines in order to improve search engine rankings.

The hazards of following Black Hat SEO are many, including getting the whole website banned for violating search engine guidelines and losing existing rankings altogether.

Bots/Spiders

Bots or Spiders (aka Search Engine Bots/Spiders) are software programs designed to crawl webpages on the Internet and index them in a predefined format.

Bounce Rate

The rate at which visitors land on a page of your site and then bounce off or leave altogether rather than continuing to view more pages.

Bounce rate is typically based on a particular session (browsing period) and calculated in the following way:

Bounce Rate = (Number of Bounces / Number of Entrances) × 100

For example,

Number of Entrances/Arrivals: 900 visits within 30 mins

Number of Exits/Bounces: 450 bounces within 30 mins

Considering the scenario above, the bounce rate is: 450/900 × 100 = 50%

There is no ideal bounce rate as such; it can be different depending on niches, sources of traffic and other factors.

As per Avinash Kaushik,

  1. It's usually hard to get a bounce rate under 20%
  2. A bounce rate over 35% is a cause for concern
  3. When the bounce rate is over 50%, you should start worrying

Bounce rate can be deceptive and therefore needs meticulous analysis. Here's a guide on how to interpret bounce rate and reduce it further.

Broken Link

A hyperlink on a webpage that no longer works or fails to take you to the destination webpage is called a Broken Link.

Below are some of the most prominent reasons for broken links.

  1. Incorrect URL given by the website owner
  2. Destination webpage has been removed (resulting in a 404 error) or its location has changed
  3. Destination website has gone offline
  4. The user's firewall or security software blocks access to the page

Broken links affect user experience since website visitors can't access the information they are looking for. When a visitor clicks on a broken link, they usually see a 404 or Not Found error page. Read this post to learn more about the 404 error.

Breadcrumbs

Breadcrumbs show a visitor's exact location on a website. They are considered a secondary navigation scheme that helps users keep track of their location while browsing a large website (typically an e-commerce store). Since breadcrumbs improve user experience, they are considered SEO-friendly.

Here’s a great post to learn more about breadcrumbs.
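
As an illustration, a breadcrumb trail is often just plain HTML links reflecting the site hierarchy; a minimal sketch (the page names and URLs are hypothetical):

<nav class="breadcrumbs">
  <a href="/">Home</a> &gt; <a href="/furniture/">Furniture</a> &gt; Armchairs
</nav>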


Crawler (Search Engine Crawler)

A search engine crawler is a software program designed to crawl websites across the World Wide Web to provide search engines with updated information from time to time.

Also known as search bots or search spiders, search engine crawlers crawl webpages and hyperlinks on a periodic basis in order to keep search engines up-to-date with the latest information.

Cached Copy

Every time search engine bots crawl your website or webpages, they take a snapshot or copy of each page as it appeared on that date and store that version. This stored version of your webpage is called a Cached Copy.

Search engines such as Google and Bing keep cached versions of webpages in their index after crawling various websites. In other words, a cached version tells you when a page was last crawled.

You can see both the cached version and the current webpage from the SERPs. In fact, most results on the SERPs include a cached link; clicking on it takes you to the cached version of the webpage.

In order to see the cached copy of a webpage, you can use the cache: operator in the search box followed by the webpage URL.

For example, to see when Google last cached this page, you would use the following operator.

cache:topleaguetech.com/seo-glossary-terms

Cached pages are useful when an important webpage is down. They are also handy when internet connectivity is slow.

Cross Linking

Cross linking refers to the process of linking between two websites, whether or not they belong to the same webmaster. Cross linking is done to offer users useful context on a webpage, and it makes the Internet a better ecosystem of knowledge.

In the early days of SEO, search engines placed a great deal of emphasis on cross linking in order to determine the link popularity of a particular website. With the growing popularity of SEO, cross linking was heavily manipulated, prompting search engines to revamp their website ranking strategies.

Canonical Link

A canonical link (also known as the preferred version) is a link that tells search engines to prioritize and index only the primary version of a webpage, especially when your website has two or more pages with exactly the same content. In other words, a canonical link can help you avoid duplicate content issues. Read Google's official guide on how to use canonical URLs.
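
A minimal sketch of a canonical link, placed in the <head> of the duplicate pages (the URL is hypothetical):

<link rel="canonical" href="https://www.example.com/seo-glossary-terms/">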

Conversion Rate

Conversion rate is the rate at which visitors to your website convert, meaning complete a desired action. For example, if the goal of your website is to increase sales, and your website makes 4 sales for every 100 daily visitors, then your daily conversion rate is 4%. Conversion rate is a complex subject, and there are techniques to optimize a website's conversion rate.

Read our posts on Conversion Rate:

  1. Copywriting Clichés that Kill Your Conversion Rate
  2. 9 Conversion-Critical Elements of Landing Pages

Click Through Rate

Click Through Rate (CTR) is the percentage of searchers who click through to your website from the SERPs. In SEO, click through rate is based on the number of impressions a particular webpage gets on the SERPs.

For example, if a webpage appears 100 times on Google Search and 30 users click on it to enter your website, then the page’s CTR is 30%.

Many SEO professionals believe Google factors in CTR data to determine the relevance of a webpage for ranking purposes.

CMS

CMS stands for Content Management System, a software technology that allows users with little knowledge of computer programming to build and manage their blog/website without depending on a developer.

WordPress is considered to be the most popular CMS software which powers millions of websites today.

Crawl Budget

Crawl Budget is the specific amount of time Googlebot has at its disposal to crawl a website. The more pages it can crawl within the given budget, the better a site's visibility on Google Search.

Typically, more popular and authoritative sites with better PageRank tend to have a higher crawl budget than others.

Creating an SEO-friendly site structure and internal linking strategy, among other on-page optimization practices, is key to making the best use of your crawl budget and, more importantly, improving it.

Here’s a great post on crawl budget optimization.


Do-follow/No-follow Links

It's a common practice for websites to link to each other to improve context for their users. For search engines, it is important to understand the nature of these links in order to gauge the popularity of a website.

Technically, a Do-follow link is a regular link that carries no special HTML attribute; by default, it tells search engines to trust the target website and share link equity with it. A Do-follow link allows a search engine to trust the link and count it as a vote.

Traditionally, search engines would treat links from external sources as votes in order to determine the relevance of a website and its position on the SERPs.

For example, if website A has 100 inbound links and website B has 150, then, all else being equal, website B would rank higher than website A.

In a perfect search engine ecosystem this would work great. However, in a world replete with intense competition, this led to massive manipulative practices among the webmasters who tried to acquire illegitimate links to outrank their fair competitors.

Therefore, Google and other search engines decided to support the No-follow HTML attribute, which devalues any external link that carries it.

Here’s what a link with No-follow attribute looks like:

<a href="http://www.google.com/" rel="nofollow">Google</a>

In SEO, links with the No-follow HTML attribute:

  1. Don't share link equity (aka link juice) with the target webpage
  2. Don't help the target webpage rank higher

It’s a good practice to use No-follow HTML attribute when you link out to:

  1. Affiliate websites
  2. Websites with lower Domain Authority
  3. Links within Sponsored Content

Here are some more tips about using No-follow tag.

Domain Age

The number of years a domain name has been registered. For example, if abc.com has been registered for 2 years, then its domain age is 2. Google tends to consider websites with a longer domain age more trustworthy; some believe it's even a ranking factor.

Duplicate Content

Content on a website is said to be duplicate when it matches content on other pages of the same website or on another website.
Duplicate content occurs in the following scenarios:

  1. When the same version of the copy has been used across several pages
  2. When, due to wrong SEO settings, the same content is visible under several URLs within the site

Duplicate content can adversely affect the search engine rankings of a website.
Note: Duplicate content is different from plagiarized content. While plagiarized content is illegitimate and can attract search engine penalties, duplicate content is perfectly legitimate but detrimental to your site's search engine visibility.

Deindex

Deindexing is the process of removing some webpages or a whole website from a search engine's index, usually because of violations of its terms of use.
When a website or its webpages are deindexed by a search engine (e.g. Google), they no longer appear within that search engine's SERPs.
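
Site owners can also deliberately keep a page out of the index with a robots meta tag in the page's <head>; a minimal sketch:

<meta name="robots" content="noindex">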

Deep Linking

Deep linking is the process of linking to a website's inner pages rather than its home page. A deep link therefore takes the user to a specific landing page rather than the home page of a website.

Many SEO professionals believe deep linking can be a great SEO practice to boost your site's overall authority, relevance and visibility in search engines. A simple backlink analysis of your website on Open Site Explorer can tell you whether you have built most of your links to your home page.

Learn more about Deep Linking from Neil Patel here.

Disavow Tool

A tool developed by Google that lets site owners request (via Search Console) that Google disregard a set of poor quality backlinks pointing to their site. Read more about this tool here.

Doorway Pages

Doorway pages are webpages typically designed to manipulate search engine bots and rank unfairly high on the SERPs. Also known as bridge pages, they are built only for search engines and have no utility for human visitors. Multiple similar sites funnelling users to a single destination page are a common example. However, there are a number of reasons why Google might consider your website to be harbouring doorway pages.


External Links

External links are hyperlinks that point to a webpage outside one's own website. For example, if website A links to a page located on website B, then website A is said to have one external link.

According to SEO best practices, webmasters should consider the quality and trustworthiness of a website before linking out to it. Read more about external links here.

Editorial Link

An editorial link is a link earned through a site's quality content rather than paid for or requested from another website. In other words, editorial links appear more natural to search engines than non-editorial or paid links, and they are typically reviewed and approved by human editors. Building great content and creating high quality digital assets are some of the best ways of attracting natural, editorial links from a variety of sources with diversified anchor texts.

Entry Page

An entry page (aka landing page) is the first page a visitor sees when they land on your website, typically from a referrer page. SEOs should maintain their entry pages in order to improve bounce rate and user experience.

Ethical SEO

Ethical SEO is an established practice entailing a number of SEO activities designed to improve the user experience and popularity of a website, thereby earning editorial links and boosting search engine authority.


Favicon

A favicon (aka shortcut icon, website icon, tab icon or bookmark icon) is an icon, typically the website logo, associated with a particular website. It appears in the browser tab and address bar next to the site's name.
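
A favicon is typically declared in the page's <head>; a minimal sketch (the file path is hypothetical):

<link rel="icon" href="/favicon.ico">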

Feed

A feed (aka RSS feed) is a useful tool that lets visitors subscribe to your blog content. Some SEOs believe offering an RSS feed helps a blog widen its reader base.
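
A feed is usually advertised in the page's <head> so browsers and feed readers can discover it; a minimal sketch (the title and path are hypothetical):

<link rel="alternate" type="application/rss+xml" title="Blog RSS Feed" href="/feed/">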

Fresh Content

The content that is up-to-date and therefore, more likely to be useful to search engine users. Websites and blogs that publish fresh content at regular intervals tend to get crawled and indexed by search engines more frequently.


Google Caffeine

Google Caffeine is an indexing system update launched by Google in 2010 to deliver fresher, more up-to-date information in its search results. Many SEOs believed this was in response to the rapid growth of social media usage.

Google Panda

Google Panda is an algorithm update launched by Google in 2011 to lower the ranks of sites replete with low quality content (aka thin content) and promote sites with high quality content up the SERPs. The Panda algorithm brought a massive change in the way SEOs perceived web content.

Google Penguin

Google Penguin is an algorithm update launched by Google in 2012 to lower the ranks of sites that violated Google's Webmaster Guidelines and were involved in link schemes, a black hat SEO tactic used to manipulate rankings. Quite a few websites were severely affected by the update.

Google Hummingbird

Google Hummingbird is a revamp of Google’s Search algorithm to keep up with conversational search, intent and context of search queries. It was launched in 2013 in order to help smartphone users with voice search features.

Google Search Console

Google Search Console (earlier known as Google Webmaster Tools) is a free service by Google for webmasters. Site owners use Google Search Console to check indexing status and other metrics of their website in order to improve visibility on search engines.

Index (Search Engine Index)

Search engine index refers to the database where the information is stored after search bots have crawled the webpages.

Here’s how search engines provide you with the information:

  1. Users enter their keyword input into the search box
  2. Search engines look for relevant webpages within their index
  3. Search engines retrieve matching results based on user queries
  4. Search engines rank and display results based on their merits

Internal Linking

Internal Linking is an SEO practice wherein a webpage links to another webpage typically within the same website in order to improve context, usability and user experience.

Internal linking is one of the most common SEO best practices and can help improve a site's crawlability and performance on Google Search.

Learn how to choose phrases for internal linking optimization.
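
A minimal sketch of an internal link with descriptive anchor text (the path and wording are hypothetical):

<a href="/seo-glossary-terms/">glossary of SEO terms</a>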

Impressions

Impressions are the number of times a webpage has appeared on Google Search for user queries within a given period of time. A high number of impressions indicates that a webpage is very visible on Google Search. However, impressions data doesn't mean much as a standalone metric; it's meaningful only when compared with click data.

For example, if a webpage has 1000 impressions but no clicks at all, the high number of impressions is worthless. However, if it has 50 clicks, then the page's CTR is 5%. You can find impressions data in Google Search Console.

Indexed Pages

Indexed pages are webpages that have been crawled and indexed by search engine bots. Once crawled and indexed, a webpage can appear on the SERPs for one or more keywords. Here are simple tips to improve the number of indexed pages.
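
You can get a rough view of the pages a search engine has indexed for a site with the site: operator, e.g.:

site:yourwebsite.com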


Keywords

Keywords are essentially words or combinations of words that users enter into the search box of a search engine in order to seek answers or information.

Keywords are also known as search queries or index terms.

It’s important to note that users can only enter one keyword or query into the search box at a time.

For the purpose of clarity, keywords can be classified into two categories: short-tail keywords and long-tail keywords.

Short-tail keywords (also known as root keywords) consist of one to three words, while long-tail keywords are longer than three words.

Furniture is a short-tail keyword, while furniture stores in San Jose is a long-tail keyword.

Short-tail keywords are less specific, so search engines tend to return a wide range of results for them.

On the other hand, long-tail keywords are more specific to search engines. Therefore, search engines tend to show specific results to the users, the kind of information users really need.

Please note that normal users are themselves not aware of the type of keywords they enter while performing a search. They simply use words that they think will help search engines understand their needs and return results close to their expectations.

As a matter of fact, by their very nature, long-tail keywords tend to help search engines understand users’ intent better and provide them with more specific results that better match the users’ intent.

Therefore, webpages ranking at the top of the SERPs for long-tail keywords have been found to be more useful to searchers. In other words, those webpages have a higher engagement or conversion rate.

Keyword Density

Keyword Density is the number of times a particular keyword is used in a content piece relative to its total word count. It is typically expressed as a percentage, e.g. 1 percent, 1.6 percent or 2 percent.

For example, if the word count of a content piece is 1000 and a particular keyword has been used 10 times across the piece, then the piece is said to have a keyword density of 1 percent.

Traditionally, search engines used to rank webpages based on their keyword density, meaning pages with higher density would rank higher than others. However, with keyword manipulation practices becoming rampant among many SEO’s, search engines decided to disregard the keyword density for ranking purposes.

As a matter of fact, Google now considers it as a manipulative approach, and devalues pages that have an unnatural keyword density.

Although there’s no ideal keyword density as such, keeping it between 1-2 percent is considered a safe approach by many SEO professionals.

Moreover, use keywords naturally rather than stuffing them into your content piece; stuffing is a bad idea since Google has become increasingly intolerant of Keyword Stuffing.

Here are some golden rules about using keywords in your content piece:

  1. Use them only when they sound natural and fit the context of your topic.
  2. Don’t overuse them – use their derivative forms or fragmented forms if the topic demands it.
  3. As far and naturally as possible, use keywords in your Heading tags, Title Tags and Meta Descriptions.
  4. Keywords should be evenly distributed across your content pieces rather than concentrated only in some parts.
  5. Avoid using keywords that sound irrelevant and out of place within a given context.

Keyword Stuffing

Keyword Stuffing is the practice of padding a content piece with keywords unnaturally, often with irrelevant keywords, in order to manipulate search engine algorithms.

Considered an unethical SEO practice, Keyword Stuffing was more prevalent until Google tweaked its search algorithms a few years ago to catch and penalize the unscrupulous behaviour.

Even though there are no established mandates about complying with a specific Keyword Density, White Hat SEO best practices recommend against using keywords indiscriminately in your copy.

In other words, follow the Keyword Density best practices to remain safe.


Link Spamming

Link Spamming is a process of manipulating the search algorithm mechanism by planting unnatural links across multiple websites, in an effort to boost the search engine ranking of a website.

Link Spamming was particularly prevalent in the SEO industry until 2012, post which Google released powerful algorithmic updates to counter such practices.

Some examples of link spamming are building links across web directories without any editorial discretion, creating several blogs as sub-domains on free blogging sites to plant links, and investing in link farms. More examples of link spamming here.

Longtail Keywords

Keywords that contain long and highly specific search phrases are known as Long-tail keywords. These keywords usually contain more than three words and are often very specific and relatively easier to rank for.

For example, “affordable content writing services” is a long-tail keyword since it suggests the user is looking for companies that offer content writing services at cheap rates.

It is believed that searchers are more likely to use long-tail keywords when they are close to making a purchase.

Here's a great source to read more about long-tail keywords.

Link Building

Link building is the process of acquiring links from external sites in order to improve a site's authority as well as its search engine rankings. Link building is one of the many tactics webmasters deploy in order to promote their website to their target audience. Read this guide to know more about link building.


Meta Tags

Meta Tags are HTML tags that are used to describe the information on a webpage. Search engines use the information within these tags to determine the meaning of the pages.

When search engines return results for your queries, they usually show you the Meta data (meta tags) of web pages.

Typically, Meta data would include Title Tags, Meta Descriptions and Keyword Tags.

Traditionally, search engines used to rank webpages based on the information provided in the meta tags. However, as webmasters and SEOs started padding their web pages with false and exaggerated meta data to manipulate search engine algorithms, search engines decided to drop much of the meta data from ranking considerations.

For example, in 2009, Google declared that it does not use the meta keywords tag or meta descriptions in its search algorithm for ranking purposes.

Even though search engines no longer use this meta data for ranking websites, it is still crucial for improving your click through rate on the SERPs. Your prospective customers will still click on search results that have informative and compelling meta titles and descriptions.

Meta Title tags

Title tags are the clickable headlines that appear in search results for specific queries. The content that describes the essence of your webpage in a headline is known as the Title Tag or Meta Title Tag. Ideally, it should include the keywords of the webpage and carry a compelling message, so that when it appears on the SERPs, searchers find it relevant and click on it, boosting your Click Through Rate (CTR).
From an SEO point of view, the title tag should be:

  1. No more than 70 characters in length
  2. Short and persuasive, with focus keywords as close to the beginning as possible
  3. Descriptive enough to convey the idea your webpage is based on
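
A minimal sketch of a title tag in a page's <head> (the wording is hypothetical):

<title>Affordable Content Writing Services | Example Agency</title>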

Meta Description

Meta description is an HTML tag used to offer a concise description of a webpage. It is an essential part of a page's meta data, which also includes the meta title and meta keywords.
It usually appears below the meta title in SERPs and contains a summary/snippet of the webpage. Although Google no longer considers the meta description for ranking purposes, it is useful in improving the Click Through Rate (CTR) of the webpage, especially when crafted for your users. Ideally, the meta description should be:

  1. Brief, well written and unique for all pages
  2. Within 155 characters including spaces
  3. Relevant to the webpage
  4. Optimized with keywords
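
A minimal sketch of a meta description tag (the copy is hypothetical):

<meta name="description" content="Affordable content writing services for small businesses. Get well-researched blog posts and web copy from experienced writers.">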


Optimization

In the SEO context, Optimization refers to the process of improving the properties of a website in order to help boost its performance on Search Engine Result Pages (SERPs).

For example, if a webmaster improves the existing content of his website to make the information more credible and useful for the readers, then it’s part of the website optimization.

A webmaster can optimize his website in a number of ways. In fact, website optimization is a lifelong process.

Outbound Links

Links that take you from one website to another are called Outbound Links. There are several reasons why an outbound link is used in a website:

  1. Complete an action (e.g. affiliate sites)
  2. Complete a transaction (e.g. a payment gateway)
  3. Cite a source of information for readers (e.g. a blog)

In SEO, outbound links are said to improve on-page value, especially when used judiciously. Many believe that if a website cites authority sources to help users, it establishes the site's credibility and hence improves on-page value.

However, webmasters should link only to relevant and useful sources in order to help users. In turn, this helps search engines understand the context of your page and improves your page authority.

There are two types of outbound links.

Do-follow Link: This type of outbound link shares your Link Juice/Link Equity with the target website.

No-follow Link: This type of outbound link doesn’t share any Link Juice/Link Equity with the target website.


Ranking Factors

Search engines use algorithms (software programs) to determine the usefulness of the indexed webpages stored in their information database (search index). These algorithms are designed around several factors that often provide the closest clues to what the user's keyword input or search query means.

For example, Google’s search engine algorithms consider over 200 factors or signals including Page Rank, Links, Content Freshness and Location.

The ranking factors evolve with advancements in technology as well as user behavioural patterns. Google's Hummingbird and RankBrain are results of changing search patterns and computing technology.

Ranking Signals

Search engines employ a set of ranking signals or clues to determine the relevance of a webpage with regard to a user query. For example, Google currently factors in over 200 signals to determine the relevance of a webpage. The more of these signals a webpage satisfies, the higher it ranks on Google's SERPs.


SEO

SEO stands for Search Engine Optimization, the process of improving the quality of a website so that it performs better and ranks higher on the Search Engine Results Pages for a given keyword.

Search Engine

A search engine is basically an online information database that is constantly updated by software programs such as crawlers and ranking algorithms. Search engines are designed to help you find any information that exists on the Internet or World Wide Web.

Search engines usually have a home page showing a search box where users can enter their keywords or queries and hit enter to seek their desired information.

Search engines usually show results page by page, typically ten results per page. These pages are called Search Engine Result Pages or SERPs.

Some popular search engines include Google Search, Bing Search and Yahoo Search.

Search Engines rank websites or web pages based on their merits.

SERPs

SERPs stands for Search Engine Result Pages. When users enter a keyword or query into the search box, search engines return results in a page by page manner. Those pages are called Search Engine Result Pages.

Each result consists of a link to a particular webpage, combined with Page Title and Meta Description.

Search Engines sort or rank results based on their individual merits. The webpage that is ranked at the topmost position is considered the best result based on a particular keyword or query.

Search engines always strive to return the most useful and relevant results to their users. Therefore, they constantly improve their algorithms (ranking processes) to ensure their rankings appear flawless to users.

Search Algorithms

When you enter a search query or keyword into a search engine's search box, the engine looks for relevant information within its search index or information database and returns the results (webpages) most relevant to your query.

To perform this operation automatically and quickly, search engines use computer programs known as Search Algorithms.

Google defines algorithms as computer programs that look for clues to offer you exactly what you want when you enter a keyword into its search box.

Search algorithms, being computer programs, are built differently by different search engines, each based on a set of rules for establishing the relevance of the indexed webpages in its database with regard to the search queries entered by the user.

As a matter of fact, search algorithms are designed to establish the context of keyword inputs and then consider several criteria or look for clues that can help them determine the relevance of indexed web pages before providing the user with the most relevant results.

According to Google, their search algorithms rely on more than 200 unique signals or clues to guess what a user might be looking for. Typically, these signals include RankBrain, the presence of keywords on a web page, links, freshness of the content present on the web page, the location where you’re performing the search and many other factors.

Google’s primary Search Algorithm is known as Google Hummingbird.

Sitemap (HTML/XML)

Sitemaps are an important part of a website. As the name suggests, a sitemap helps visitors understand the overall structure of the site and navigate across the website easily.

From a technical standpoint, there are two types of sitemaps: the HTML Sitemap and the XML Sitemap.

An HTML Sitemap is designed for site visitors in order to help them navigate the different sections of a website easily. The webmaster can choose to show only those URLs that are important for site visitors. A sitemap improves the usability of your website and therefore boosts user experience. Major search engines, including Google, consider user experience a vital signal for ranking purposes.

An XML Sitemap is designed purely for search engine bots; it helps them stay updated about changes on your website. When generated by your CMS or a plugin, an XML sitemap updates itself automatically when you make a change to your website, e.g. add a new page, publish a new post, or add a new tag or category.

While an HTML Sitemap is optional for the webmaster, an XML sitemap is highly recommended by Google. If your website runs on WordPress, you can either create an XML sitemap manually, generate one using a sitemap generator, or simply use any popular XML sitemap plugin.
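
A minimal sketch of an XML sitemap with a single URL, following the sitemaps.org protocol (the location and date are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/seo-glossary-terms/</loc>
    <lastmod>2017-06-30</lastmod>
  </url>
</urlset>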

Search Traffic

As the name suggests, Search Traffic refers to the visitors who find your website through search engines. If your website appears on the SERPs for the search queries or keywords your prospective visitors use, they click on the search results to reach your website. The traffic generated from actual searchers, rather than from sponsored campaigns, is known as search traffic; it's therefore also called organic traffic. Web analytics can help you find more details about your sources of organic traffic.

Spun Content

Spun content refers to content that has been rewritten, typically with the help of an article spinner, in order to produce a textually unique content piece. Many webmasters use spun content to save the time human writers would need to write original content. However, Google has devalued spun content to the extent that websites using it don't really perform well on Google Search. Here's a great post on content spinning.


Unique Content

Content which is unique in nature. Many webmasters tend to copy content from other websites rather than write their own, which can lead to duplicate content issues. Unique content refers to written content that:

  1. Is not plagiarized
  2. Is not curated from other sources
  3. Passes Copyscape test
  4. Doesn’t violate copyright

In a broader sense, some SEO professionals believe that unique content should also be based on unique ideas, in addition to meeting the above criteria. This view stems from the fact that many SEOs tend to spin content off existing articles found on other websites. Having unique content is one of the strong ranking signals used by Google. Read why unique content really matters.


Viral Marketing

A technique to spread awareness about a brand, products or services in order to create buzz around them, prompting people to take sudden interest in the trend. Viral marketing has become more successful since the emergence of the social web, which primarily includes Facebook, Twitter and Reddit. In viral marketing, the goal is to create a compelling message aligned with the theme of your brand, products or services rather than hard-selling their features and benefits.


Webmasters

A webmaster is a person who owns and manages a website. In reality, however, the person who owns a website may not necessarily be managing it. A webmaster can make any changes to his website, since he has access to its backend, and can play a number of roles including administrator, developer and publisher.

White Hat SEO

White Hat SEO is an approach to optimizing a website with the human audience in mind more than search engine crawlers. It also means optimizing a website in compliance with search engine guidelines rather than trying to manipulate them in a bid to improve rankings.

The benefits of following White Hat SEO are many, including keeping a website safe from algorithmic updates that are designed to penalize webmasters deploying manipulative optimization tactics.

Web Directories

Web directories feature listings of online businesses by categories and sub-categories, which can be filtered by location and nature of business. Most web directories accept entries from new online businesses and are managed by human editors who approve listings. DMOZ and Yahoo! Directory are some of the more popular web directories.

Online directories typically allow listings to include a website URL. Traditionally, search engines would weigh incoming links from directories very highly. However, after the mushrooming of low quality web directories, Google decided to devalue links acquired from directories.

While directory submission is still used by SEOs as a link building tactic, they are supposed to be extremely picky about which directories they choose, since links from low quality directories are considered unsafe from an SEO standpoint.
