If you are a newcomer to websites and blogs, or if you work with anything related to the Internet, you will definitely need to know a little bit about search engine optimization (SEO). A good way to start is to familiarize yourself with the most common SEO terms, listed below:
1. SEM:
Stands for Search Engine Marketing, an umbrella term for marketing done through search engines and other online channels, such as SEO, SMO, email marketing and PPC. SEM is divided into two main parts: SEO and PPC. SEO stands for Search Engine Optimization, the process of optimizing websites to make their pages appear in the organic search results. PPC stands for Pay-Per-Click, the practice of purchasing clicks from search engines; these clicks come from sponsored listings in the search results.
2. Back-link:
Also called an inbound link or simply a link, a back-link is a hyperlink on another website pointing back to your own website. Back-links are very important for SEO because they directly affect the PageRank of a web page, influencing its search rankings.
3. PageRank:
PageRank is an algorithm that Google uses to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from page A in page B. The higher the number of links to a page, therefore, the higher the probability that the page is important.
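The idea can be sketched as a small iterative computation. Everything below is illustrative: the three-page link graph is made up, and 0.85 is the damping factor used in the original PageRank paper.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with equal rank
    for _ in range(iterations):
        # Every page keeps a small base rank...
        new_rank = {page: (1 - damping) / n for page in pages}
        # ...and each outgoing link passes on a share of its
        # source page's rank -- the "vote of trust" described above.
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Page C is linked to by both A and B, so it ends up with the highest rank.
ranks = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
```

Real search engines compute this over billions of pages with many refinements, but the voting intuition is the same.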
4. Link-bait:
A link-bait is a piece of web content published on a website or blog with the
goal of attracting as many back-links as possible (in order to improve one’s
search rankings). Usually it’s a written piece, but it can also be a video, a
picture, a quiz or anything else. Classic examples of link-bait are the "Top 10" lists that tend to become popular on social bookmarking sites.
5. Link Farm:
A link farm is a group of websites where every website links to every other website in the group, with the purpose of increasing the PageRank of all the sites linking back to each other. This practice worked well in the early days of search engines, but today it is considered a spamming technique.
6. Anchor Text:
The clickable text of a link on a web page is called its anchor text. Keyword-rich anchor text helps with SEO because Google will associate those keywords with the content of your website.
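For example, in the hypothetical link below the anchor text is the visible phrase between the tags (the URL and wording are placeholders):

```html
<!-- "best SEO tips" is the anchor text that Google reads. -->
<a href="http://example.com/seo-tips">best SEO tips</a>
```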
7. NoFollow:
Nofollow is an HTML link attribute that webmasters use to signal to search engines that they do not vouch for, and do not want to pass ranking credit to, the page they are linking to. When Google sees the nofollow attribute, it basically does not count that link in its PageRank and search algorithms.
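In markup, the attribute is added to an individual link like this (the URL is a placeholder):

```html
<a href="http://example.com/" rel="nofollow">A link that passes no ranking credit</a>
```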
8. Link Sculpting:
By using the nofollow attribute strategically, webmasters were able to channel the flow of PageRank within their websites, thus increasing the search rankings of desired pages. This practice is no longer effective, as Google has since changed how it handles the nofollow attribute.
9. Meta Tags:
Meta tags are used to give search engines more information about the content of your pages. They are placed inside the HEAD section of your HTML code, and thus are not visible to human visitors.
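As a hypothetical illustration of where meta tags live, a minimal HEAD section might look like this (the values are placeholders):

```html
<head>
  <title>Example Page Title</title>
  <!-- Read by search engines, never shown on the page itself. -->
  <meta name="description" content="A short summary of the page.">
  <meta name="robots" content="noindex">
</head>
```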
10. Title Tag:
The title tag is literally the title of a web page, and it is one of the most important factors in Google's search algorithm. Ideally your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page in the browser's title bar (or tab) while navigating it. A commonly recommended length for the title tag is 60 to 70 characters, so that it is not truncated in the search results.
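That length guideline can be checked with a trivial sketch; the 70-character limit here is the commonly cited rule of thumb, not an official value.

```python
def title_fits(title, max_chars=70):
    """Return True if the title is within the illustrative length limit."""
    return len(title) <= max_chars

short_ok = title_fits("19 Common SEO Terms Explained")  # 29 characters
long_ok = title_fits("A" * 80)  # 80 characters, would be truncated
```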
11. Search Algorithm:
Google's search algorithm is used to find the most relevant
web pages for any search query. The algorithm considers over 200 factors
(according to Google itself), including the PageRank value, the title tag, the
meta tags, the content of the website, the age of the domain and so on.
12. SERP:
Stands for Search Engine Results Page. It is the page that Google or another search engine displays when you search for a specific keyword. The amount of search traffic your website receives depends on the rankings it achieves inside the SERPs.
13. Sandbox:
Google essentially keeps newly discovered websites in a separate index, the sandbox. While websites are in the sandbox, they will not appear in the search results for normal search queries. Once Google verifies that a website is genuine, it moves it out of the sandbox and into the main search index.
14. Keyword Density:
To find the keyword density of a page, divide the number of times a keyword is used by the total number of words on the page. Keyword density used to be an important SEO factor, as the early algorithms placed a heavy emphasis on it. This is not the case anymore.
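The division described above is a one-liner; the sample sentence below is made up for illustration.

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword divided by the total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# "SEO" appears 2 times out of 8 words, so the density is 0.25 (25%).
density = keyword_density("SEO tips: good SEO starts with good content.", "SEO")
```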
15. Keyword Stuffing:
Since keyword density was an important factor in the early search algorithms, webmasters started to game the system by artificially inflating the keyword density of their pages. This is called keyword stuffing. These days the practice will not help you, and it can even get you penalized.
16. Cloaking:
This technique is used to make the same web page show different content to search engines and to human visitors. The purpose is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. Cloaking is considered a spamming technique and can get you penalized by most search engines.
17. Web Crawler:
Also called a search bot or spider, a web crawler is a program that browses the web on behalf of search engines, trying to discover new links, new pages and new content. This is the first step in the indexing process.
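The link-discovery step can be sketched with Python's standard-library HTML parser; the page snippet below is made up, and a real crawler would fetch it over the network and queue the discovered URLs for further crawling.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href targets of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="/about">About</a> and <a href="http://example.com/">home</a></p>'
collector = LinkCollector()
collector.feed(page)
# collector.links now holds the URLs discovered on this page.
```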
18. Canonical URL:
Canonicalization is the process of converting data that has more than one possible representation into a "standard" canonical representation. A
canonical URL, therefore, is the standard URL for accessing a specific page
within your website. For instance, the canonical version of your domain might
be http://www.yoursite.com instead of http://yoursite.com.
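The preferred version of a page can be declared inside its HEAD section with a link element (the URL below is a placeholder):

```html
<link rel="canonical" href="http://www.yoursite.com/page.html">
```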
19. Robots.txt:
This is a text file, placed in the root of the domain, that is used to inform search bots about the structure of the website. For instance, via the robots.txt file it is possible to block specific search robots and to restrict access to specific folders or sections inside the website.
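A minimal robots.txt along those lines might look like this (the bot name and folder are placeholders):

```
# Block one specific crawler from the whole site.
User-agent: BadBot
Disallow: /

# Keep all crawlers out of one folder.
User-agent: *
Disallow: /private/
```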