Tuesday 24 December 2013

Hidden/Invisible Text - One of the Black Hat Techniques

Invisible Text:

Hidden Text is one of the challenges faced by webmasters and search engines. Spammers continue to use hidden text to stuff keywords into their pages for the purpose of artificially boosting their rankings. Search engines seek to figure out when spammers are doing this, and then take appropriate action.
To start our look at hidden text, let's examine Google's Webmaster Guidelines for hidden text, to see the bottom line:
If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages.

A Few Ways to Create Hidden Text
·         Make your text and background colors identical (or virtually identical) - this is the original method used by spammers for creating hidden text. It's easy to detect, and I am not aware of any legitimate use for this technique.

·         Set the font size for text to 0, or to a negative number. This is also easy to detect, and I can't think of any legit use for it either.
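
To make the first technique concrete, here is a minimal Python sketch - my own illustration, not an official detection tool; the sample HTML and class name are made up - that flags elements whose inline style gives the text the same colour as its background:

```python
# Minimal sketch: flag elements whose inline style sets the text colour equal
# to the background colour - one crude signal of hidden text. Illustrative
# only; real detection (external CSS, off-screen positioning, etc.) is far
# more involved.
from html.parser import HTMLParser
import re

class HiddenTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        color = re.search(r"(?<!background-)color\s*:\s*([^;]+)", style)
        background = re.search(r"background(?:-color)?\s*:\s*([^;]+)", style)
        if color and background and color.group(1).strip() == background.group(1).strip():
            self.suspicious.append((tag, style))

page = '<div style="color:#ffffff; background-color:#ffffff">stuffed keywords here</div>'
checker = HiddenTextChecker()
checker.feed(page)
print(checker.suspicious)  # [('div', 'color:#ffffff; background-color:#ffffff')]
```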

Saturday 21 December 2013

Content Creation & Content Optimization:

The content of your website matters to the search engines, as they judge your website's relevancy and importance based on both its meta tags and its content in relation to your important keywords. There are several guidelines you should follow:

Include keywords in text
You should make sure that all the keywords you have included in your meta tags, and for which you want to come up in searches, also appear in the text of your site.
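
As a quick illustration, here is a small Python sketch - the sample text and keyword list are made up - that reports any targeted keywords that do not actually appear in the page text:

```python
# Minimal sketch: list the targeted keywords (e.g. from your meta tags) that
# do not appear in the visible text of the page. Sample data is made up.
def missing_keywords(page_text, keywords):
    text = page_text.lower()
    return [kw for kw in keywords if kw.lower() not in text]

page_text = "We offer professional SEO services and search engine optimization audits."
keywords = ["seo services", "search engine optimization", "link building"]
print(missing_keywords(page_text, keywords))  # ['link building']
```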

Consider density of the keywords in text
Be careful not to overcrowd your text with keywords, because you might get penalized. Note that you should place the most important keyword(s) near the beginning of the page, as they are given more weight there. Refer to my previous post http://seo-news-master.blogspot.in/2013/09/how-to-place-important-keywords-and.html to know what the optimum keyword density should be.

Mind the formatting of the text
Formatting (bolding, titles, etc.) is also important to show the search engines that certain words matter more than others. If you bold your keywords in your text, they are perceived as more important than the surrounding words, and your text becomes more relevant to those keywords.

Regularly add relevant content to your site
Make sure you add relevant content to your website on a regular basis. Fresh content invites Google to visit your site more often and increases its relevance and importance.

Wednesday 18 December 2013

What is a Sitemap and how does it affect your SEO?

What is a Sitemap?
Sitemaps, as the name implies, are simply a map of your site - i.e. on one single page you show the structure of your site, its sections, the links between them, and so on. Sitemaps make navigating your site easier, and keeping an up-to-date sitemap on your site is good both for your users and for search engines. Sitemaps are also an important channel of communication with search engines.

In simple terms, a Sitemap is an XML file that lists your individual webpages' URLs. It is like an archive of every webpage on your website. This file should be easy to discover on your site so that search engine crawlers can find it.
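
As an illustration, here is a minimal Python sketch that generates such a file with the standard library - the URLs are placeholders for your own pages:

```python
# Minimal sketch: write a basic sitemap.xml using only the standard library.
# The URLs below are placeholders - substitute the pages of your own site.
import xml.etree.ElementTree as ET

pages = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/contact.html",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```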

What is a Sitemap for?
A Sitemap is usually used to let search engine crawlers follow the links to all your individual web pages so that they don't miss out on anything.

Sometimes we leave URLs out, or hide them from all visible pages, because we don't want every visitor to go there. As a result, some of these URLs are uncrawlable for search engine spiders.

How does your Sitemap affect your SEO?
Search engines should see all the pages that you want them to see. The more of your pages they index, the more trust your site gains - it simply means your website has more information to offer.
Making sure the search engine spiders get to crawl everything they need to crawl from your website is the exact purpose of a Sitemap.


Monday 16 December 2013

Black Hat SEO

Black hat methods are unacceptable and highly discouraged ways of bringing traffic to your website. These methods bring short-term results, but once a search engine notices that you are cheating the system, your page could be penalized or banned.



Black hat methods include:
  • Hidden Content
  • Keyword Stuffing
  • Doorway or Gateway pages
  • Link Farming
  • Cloaking
Each term will be explained in detail in the next post.

Saturday 7 December 2013

White Hat SEO - A pure Organic SEO without spamming

When we say "white", it refers to purity - nothing impure or bad. Similarly, white hat SEO means no illegal or deceptive means of achieving top rankings: only pure, completely legal methods, without spamming or keyword stuffing.
White Hat SEO refers to the use of SEO strategies, techniques and tactics that focus on human audiences, as opposed to search engines, and that completely follow search engine rules and policies.

Any SEO tactic that maintains the integrity of your website and of the SERPs (search engine results pages) is considered a "white-hat" search engine optimization tactic. These are the only tactics you should use; they enhance, rather than detract from, your website and its rankings.

Thursday 5 December 2013

What is the difference between crawling, indexing and caching?


Crawling is where search engine spiders/bots move from web page to web page by following the links on the pages. The pages "found" are then ranked using an algorithm and indexed into the search engine's database. In other words, crawling is how search engines view your website, explore its depth and understand its hierarchy.
Indexing is where the search engine, having crawled the web, ranks the URLs it found using various criteria and places them in its database, or index. It's like a library: when a new book arrives, the librarian creates an index entry for it before putting it on the right shelf.

Caching is where copies of web pages are stored locally on an Internet user's hard drive or within a search engine's database. "Cache" is the more technical term for this temporary store of your website. Google updates its cache on a regular basis, depending partly on what you have indicated in your sitemap.xml.

Wednesday 4 December 2013

How Search Engines Work?

The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious for everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven.

Search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software, called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way.

After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved.

When a search request comes, the search engine processes it – i.e. it compares the search string in the search request with the indexed pages in the database.

Since more than one page - in practice, often millions of pages - contains the search string, the search engine calculates the relevancy of each of those pages to the search string.

The last step in a search engine's activity is retrieving the results. Basically, this is nothing more than displaying them in the browser - i.e. the endless pages of search results, sorted from the most relevant to the least relevant sites.
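
To make the indexing and relevancy steps concrete, here is a toy Python sketch - nothing like a real search engine, and the page texts are invented - that builds an inverted index and ranks pages by how many of the query words they contain:

```python
# Toy illustration of indexing, processing and relevancy calculation.
pages = {
    "page1.html": "white hat seo focuses on users and follows search engine rules",
    "page2.html": "black hat seo tries to trick the search engine with hidden text",
    "page3.html": "a sitemap helps the search engine crawl every page of your site",
}

# Indexing: record which pages contain each word (an inverted index).
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Processing + relevancy: score each candidate page by how many query words it
# contains, then retrieve the results sorted from most to least relevant.
def search(query):
    words = query.lower().split()
    candidates = set().union(*(index.get(w, set()) for w in words))
    scores = {url: sum(w in pages[url].split() for w in words) for url in candidates}
    return sorted(scores, key=scores.get, reverse=True)

print(search("hidden text seo"))  # ['page2.html', 'page1.html']
```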

Google Technical guidelines


· Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.


· Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.


· Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves your bandwidth and overhead.
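
If you want to check this yourself, here is a minimal Python sketch - the URL and date are placeholders - that sends a conditional GET with an If-Modified-Since header; a 304 response means the server reports the content unchanged:

```python
# Minimal sketch: send a conditional GET to see whether a server honours
# If-Modified-Since. The URL and date are placeholders.
import urllib.request
import urllib.error

req = urllib.request.Request(
    "http://www.example.com/",
    headers={"If-Modified-Since": "Tue, 01 Jan 2013 00:00:00 GMT"},
)
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status)   # 200: full page returned
except urllib.error.HTTPError as e:
    print(e.code)            # 304: content unchanged, no body sent
```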


· Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
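
As a quick illustration, Python's standard library includes a robots.txt parser; the sketch below - the domain and paths are placeholders - checks whether a given user-agent is allowed to fetch a URL:

```python
# Minimal sketch: check what a site's robots.txt allows for a given user-agent.
# The domain and paths are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the file

print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))
```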


· Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.


· If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.


· Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.


· Test your site to make sure that it appears correctly in different browsers.


· Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.
Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest or other tools. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
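
If you just want a rough number without any of those tools, a quick sketch like the one below - the URL is a placeholder - times how long a page takes to download; note that this measures only server response and transfer time, not the full rendering time a real user experiences:

```python
# Rough sketch: time a page download with the standard library.
# The URL is a placeholder - point it at your own pages.
import time
import urllib.request

start = time.time()
with urllib.request.urlopen("http://www.example.com/") as resp:
    body = resp.read()
elapsed = time.time() - start

print(f"{len(body)} bytes downloaded in {elapsed:.2f} seconds")
```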

Tuesday 26 November 2013

Why does SEO play an important role in website ranking on any search engine?


Why is SEO so important in website ranking on any search engine?

SEO (Search Engine Optimization) improves the visibility of a website on any search engine. It makes your website easy for both users and search engine robots to understand. Search engines can't see and understand a web page the way a human does. SEO allows webmasters to provide hints which the search engine can use to understand the content and, in turn, figure out what each page is about and how it may be useful for users. SEO adds proper structure to your content, without which many websites remain invisible on search engines.

Because of the high degree of competition amongst the various web-based businesses, it has become very difficult to make your business visible to the targeted audience. To increase the visibility of your website, get it search engine optimized.

Advantages of SEO:
·         SEO helps search engines to crawl your website deeply and more frequently and thereby helps in quick indexing.
·         Drives more organic traffic towards your website with a natural and smooth flow.
·         Provides a clear picture about your website to the Search Engines in terms of your services as well as products so as to enhance the authenticity of your business
·         Provides you a winning edge by making your presence prominent among the listings for relevant search terms
·         Quality SEO keeps you safe from getting penalized by Search Engines for any sort of unethical behavior (Black Hat techniques)

Google Quality Guidelines for websites

Google Quality guidelines

Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.
Quality guidelines - basic principles
  • Make pages primarily for users, not for search engines.
  •  Don't deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.
Avoid the following techniques:
  • Automatically generated content
  • Cloaking
  • Participating in link schemes
  • Hidden text or links
  • Doorway pages
  • Sneaky redirects
  • Scraped content
  • Participating in affiliate programs without adding sufficient value
  • Loading pages with irrelevant keywords
  • Creating pages with malicious behavior such as phishing or installing viruses
  • Sending automated queries to Google
  • Abusing rich snippets markup 

Engage in good practices like the following:
  • Monitoring your site for hacking and removing hacked content as soon as it appears
  •  Preventing and removing user-generated spam on your site


If your site violates one or more of these guidelines, then Google may take manual action against it. Once you have remedied the problem, you can submit your site for reconsideration.

Tuesday 10 September 2013

Google Keyword Planner (Pros and cons)

Drawbacks of Google Keyword Planner:

  1. In contrast to the keyword tool, users are required to log in to an AdWords account to access the planner.
  2. The planner doesn't have match type data for search volume.
  3. The filtering option of “closely related” search terms has completely vanished.
  4. No Global vs. Local monthly searches.
Some positive traits of this tool:
  1. The keyword planner allows local SEM and SEO professionals to get more geographic segmentation and the ability to bundle geographic regions.
  2. Users can now upload up to 10,000 keywords from their own list to get performance data.
  3. Keyword Planner will show search volumes by landing page, ad group or any other category set by the user. However, there is still a dispute over the difference in search volume data between the two tools, following the elimination of two key dimensions: match types and device types. Google explains this as follows: “In general, you’ll notice that the average search volume data is higher in Keyword Planner as compared to the exact match search volume data you got with Keyword Tool. That’s because we’ll show you the average number of searches for a keyword idea on all devices (desktop and laptop computers, tablets, and mobile phones). With Keyword Tool, we showed you average search volume for desktop and laptop computers by default.”


Friday 6 September 2013

How to place important keywords and maintain keyword density

Keyword density plays a very important role when it comes to ranking. For a website to rank well on Google, you need to make sure that the keyword density of a page stays between 1% and 2%.
The selection and placement of the primary keywords or keyword phrases is also important.
Placing important keywords at, or near, the start of a web page, sentence, TITLE tag, URL or description META tag gives them a greater probability of coming up in searches.
You should also monitor keyword proximity, i.e. the closeness between two or more keywords. In general, the closer together you can keep the keywords, the better.
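
Here is a small Python sketch - the sample text and keyword are made up - showing how keyword density can be computed and checked against the 1%-2% range mentioned above:

```python
# Minimal sketch: compute keyword density for a page and compare it with the
# 1%-2% target mentioned above. Sample text and keyword are made up.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9]+", text.lower())
    kw_words = keyword.lower().split()
    if not words:
        return 0.0
    # Count occurrences of the whole keyword phrase.
    hits = sum(
        words[i:i + len(kw_words)] == kw_words
        for i in range(len(words) - len(kw_words) + 1)
    )
    return 100.0 * hits * len(kw_words) / len(words)

text = ("Our SEO services improve rankings. We provide SEO services, "
        "site audits and content advice for small businesses.")
density = keyword_density(text, "SEO services")
print(f"{density:.1f}%")       # 23.5% - far above the target range
print(1.0 <= density <= 2.0)   # False: this page is over-optimized
```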