Tuesday, 24 December 2013

Hidden/Invisible Text - One of the Black Hat Techniques

Invisible Text:

Hidden text is one of the challenges faced by webmasters and search engines. Spammers continue to use hidden text to stuff keywords into their pages in order to artificially boost their rankings. Search engines try to detect when spammers are doing this, and then take appropriate action.
To start our look at hidden text, let's examine Google's Webmaster Guidelines for hidden text, to see the bottom line:
"If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages."

A Few Ways to Create Hidden Text
·         Make your text and background colors identical (or virtually identical) - this is the original method used by spammers for creating hidden text. It's easy to detect, and I am not aware of any legitimate use for this technique.

·         Set the font size for text to 0, or to a negative number. This is also easy to detect, and I can't think of any legit use for it either.
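For illustration only, here is roughly what the two methods above look like in HTML/CSS (the keywords are made up; this is shown so you can recognize the pattern, not use it):

```html
<!-- Method 1: text color identical to the background color -->
<div style="background-color: #ffffff; color: #ffffff;">
  cheap flights cheap hotels cheap flights
</div>

<!-- Method 2: font size set to zero -->
<p style="font-size: 0;">keyword keyword keyword</p>
```

Both patterns are trivial for a search engine to detect by comparing computed styles against the rendered background.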

Saturday, 21 December 2013

Content Creation & Content Optimization:

The content of your website is important for the search engines as they consider your website's relevancy and importance based on both meta tags and content in relation to the important keywords. There are several guidelines you should follow:

Include keywords in text
You should make sure that the keywords you have included in your meta tags, and for which you wish to rank in searches, also appear in the text of your site.

Consider density of the keywords in text
Be careful not to overcrowd your text with keywords, because you might get penalized. Note that you should place the most important keyword(s) at the beginning of the page, as they are given more weight there. Refer to my previous post http://seo-news-master.blogspot.in/2013/09/how-to-place-important-keywords-and.html to know what the optimum density of the keywords should be.
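As a rough sketch, keyword density is just the number of keyword occurrences divided by the total word count. Here is an illustrative Python calculation (a simplification - real tools also normalize punctuation and handle multi-word phrases):

```python
def keyword_density(text, keyword):
    """Return keyword occurrences as a percentage of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    count = sum(1 for word in words if word == keyword.lower())
    return 100.0 * count / len(words)

page_text = "SEO tips: good SEO content beats keyword stuffing"
print(round(keyword_density(page_text, "seo"), 1))  # 25.0 (2 of 8 words)
```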

Mind the formatting of the text
Formatting (bolding, titles, etc.) also signals to the search engines that certain words are more important than others. If you bold your keywords in your text, they are perceived as more important than other words, and your text becomes more relevant to those keywords.

Regularly add relevant content to your site
You should make sure that you regularly add relevant content to your website. Fresh content invites Google to visit your site more often and increases its relevance and importance.

Wednesday, 18 December 2013

What is a Sitemap and how does it affect your SEO?

What is a Sitemap?
Sitemaps, as the name implies, are just a map of your site - i.e. on one single page you show the structure of your site, its sections, the links between them, etc. Sitemaps make navigating your site easier, and keeping an updated sitemap on your site is good both for your users and for search engines. Sitemaps are an important means of communication with search engines.

In simple terms, a Sitemap is an XML file that lists the URLs of your individual web pages - an archive of every page on your website. This file should be easily discoverable on your site so that search engine crawlers can find it.
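A minimal XML Sitemap looks roughly like this (the domain and values here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-12-18</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry names one page; `lastmod`, `changefreq`, and `priority` are optional hints to the crawler.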

What is a Sitemap for?
A Sitemap is usually used to let search engine crawlers follow links to all of your individual web pages, so that they don't miss anything.

Sometimes we leave URLs out of, or hide them from, all visible pages because we don't want every visitor to go there. As a result, some of these URLs become unreachable for search engine spiders through normal crawling; a Sitemap is how you tell the engines about any of them that you still want indexed.

How does your Sitemap affect your SEO?
Search engines should see all the pages that you want them to see. The more of your pages they index, the more of your content can appear in search results - it simply means your website has more information to offer.
Making sure the search engine spiders get to crawl everything they need to crawl from your website is the exact purpose of a Sitemap.

Monday, 16 December 2013

Black Hat SEO

Black hat methods are unacceptable and highly discouraged ways of bringing traffic to your website. These methods bring short-term results, but once a search engine notices that you are cheating the system, your page could be penalized or banned.

Black hat methods include:
  • Hidden Content
  • Keyword Stuffing
  • Doorway or Gateway pages
  • Link Farming
  • Cloaking
Each term will be explained in detail in the next post.

Saturday, 7 December 2013

White Hat SEO- A pure Organic SEO without spamming

Just as "white" suggests purity - nothing impure or bad - white hat SEO means no illegal or deceptive means of achieving top rankings. It uses pure, totally legal methods, without spamming or keyword stuffing.
White Hat SEO refers to the use of SEO strategies, techniques and tactics that focus on human audiences, as opposed to search engines, and that completely follow search engine rules and policies.

Any SEO tactic that maintains the integrity of your website and the SERPs (search engine results pages) is considered a "white-hat" search engine optimization tactic. These are the only tactics we should use whenever applicable, and they enhance rather than detract from your website and its rankings.

Thursday, 5 December 2013

What is the difference between crawling, indexing and caching?

Crawling is where search engine spiders/bots move from web page to web page by following the links on the pages. Crawling is how search engines view your website, explore its depth and understand its hierarchy. The pages "found" are then ranked using an algorithm and indexed into the search engine database.
Indexing is where the search engine takes the URLs found during crawling, ranks them using various criteria, and places them in its database, or index. It's like a library: when a new book arrives, before the librarian puts it on the right shelf, they create an index entry for it first.

Caching is where copies of web pages are stored locally on an Internet user's hard drive or within a search engine's database. A cache is a more techy term, as it is a temporary store of your website. Google updates its cache on a timely basis, depending on what you have indicated in your sitemap.xml.
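To make the three steps concrete, here is a toy sketch in plain Python, using a hypothetical in-memory "web" instead of real HTTP requests: the crawler follows links breadth-first, the index records which words appear on which pages, and the cache keeps a local copy of each page.

```python
from collections import deque

# Hypothetical "web": URL -> (page text, outgoing links)
web = {
    "/home":  ("seo tips and tricks", ["/blog", "/about"]),
    "/blog":  ("fresh seo content", ["/home"]),
    "/about": ("about this site", []),
}

def crawl(start):
    """Follow links breadth-first; build an index and a cache."""
    index, cache = {}, {}
    queue, seen = deque([start]), {start}
    while queue:
        url = queue.popleft()
        text, links = web[url]
        cache[url] = text                       # caching: store a local copy
        for word in text.split():               # indexing: word -> set of pages
            index.setdefault(word, set()).add(url)
        for link in links:                      # crawling: follow links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index, cache

index, cache = crawl("/home")
print(sorted(index["seo"]))  # pages containing the word "seo"
```

Real crawlers add politeness delays, robots.txt checks and URL normalization, but the crawl/index/cache division is the same.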

Wednesday, 4 December 2013

How Search Engines Work?

The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven.

Search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software, called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way.

After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved.

When a search request comes, the search engine processes it – i.e. it compares the search string in the search request with the indexed pages in the database.

Since it is likely that more than one page (in practice, millions of pages) contains the search string, the search engine calculates the relevancy of each of the pages in its index to the search string.
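A drastically simplified version of relevancy calculation is term-frequency scoring: count how often the query terms appear on each indexed page, then sort pages by that score. Real engines combine hundreds of signals; this illustrative Python only shows the basic idea.

```python
def rank_by_relevancy(query, pages):
    """Rank pages (URL -> text) by how often the query terms appear."""
    terms = query.lower().split()
    scores = {}
    for url, text in pages.items():
        words = text.lower().split()
        scores[url] = sum(words.count(term) for term in terms)
    # most relevant first
    return sorted(scores, key=scores.get, reverse=True)

pages = {
    "/a": "seo guide seo basics",
    "/b": "cooking recipes",
    "/c": "seo news",
}
print(rank_by_relevancy("seo", pages))  # ['/a', '/c', '/b']
```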

The last step in the search engines' activity is retrieving the results - basically, displaying them in the browser: the endless pages of search results, sorted from the most relevant to the least relevant sites.

Google Technical Guidelines

· Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

· Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

· Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves your bandwidth and overhead.
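The server-side logic behind If-Modified-Since can be sketched like this (illustrative Python, with made-up dates): if the page has not changed since the date the crawler sends, the server can answer 304 Not Modified instead of resending the whole page.

```python
from email.utils import parsedate_to_datetime

def respond(if_modified_since, last_modified):
    """Return '304 Not Modified' or '200 OK' for a conditional request."""
    requested = parsedate_to_datetime(if_modified_since)  # crawler's cached date
    modified = parsedate_to_datetime(last_modified)       # page's actual change date
    return "304 Not Modified" if modified <= requested else "200 OK"

print(respond("Sat, 07 Dec 2013 00:00:00 GMT",
              "Wed, 04 Dec 2013 00:00:00 GMT"))  # 304 Not Modified
```

Every 304 answer saves the bandwidth of a full page transfer, which is the saving the guideline refers to.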

· Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
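Here is a small robots.txt example, checked with Python's standard urllib.robotparser; the rules and paths are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block /private/ for all crawlers
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))         # True
```

This is also a convenient way to sanity-check a robots.txt file before deploying it.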

· Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

· If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.

· Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

· Test your site to make sure that it appears correctly in different browsers.

· Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.
Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest or other tools. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.