Dos and Don'ts of the Googlebot

  • Diddy1
  • Newbie
  • Newbie
  • Diddy1
  • Posts: 12

Post 3+ Months Ago

Google Adsense relies on the AdSense spider to deliver relevant ads, and that spider is closely related to (or the same as) the Googlebot. So let's explore this spider and find out how to improve our sites through it. The thing most webmasters don't realize is that the Googlebot is a lot smarter than people think; it has been built to cope with the rising number of spam websites out there. I've compiled a list of things it likes and things that may get your site dropped by Google, which, needless to say, will leave your Google Adsense profits out of the loop. Let's start with the negatives, the things you shouldn't do:

First, duplicate content is an immediate turn-off. They don't like it when 1000 websites all say the same thing. So think twice before using those articles from article websites.

Second, nested tables: these are tables hidden inside other tables, which means you see nicely ordered content but behind it there's a lot of code. My advice: stay away from nested tables and stick with normal tables.

Third, the so-called doorway pages, which basically involve loading your webpage with keywords and then diverting visitors to another webpage. Again, a major no-no here.

Fourth, dynamic URLs, which change depending on the information passed to them. Too many of these will have the spider spend more time than necessary on your website, bringing your ranking down.

Fifth, the HTML (or PHP, or whatever code it is) needs to be kept to a minimum. More code than actual content is bad for your website, since the Googlebot just sees text.

Sixth, a trick many use to get a higher ranking, which only works for a while and then gets you completely dropped from Google: keyword overloading. Putting nothing but keywords on a page will rank you high the first time the bot crawls your site, but as time progresses it learns, and you pay.

Seventh, of course, is self-promotion, which is basically putting too many links to your homepage from your sub-pages. This sounds like it'll work for you, but it doesn't, as the Googlebot knows enough to tell which links come from which domain.

Eighth is having the same content on a page forever; one of the fundamental rules of a good webmaster is to keep your website fresh at all times.

Ninth is iframes. These frames do nothing short of confusing the Googlebot, as it indexes each frame as a different page, leaving your content in disarray. That is also why Google ads only respond to the iframe they are placed in.

That's about it for the negatives. Check that you don't use any of these methods; that will keep you in the neutral zone. But we want more: we want to go positive, to make sure the Googlebot leaves our website with enough info to satisfy Google Adsense and earn you a good ranking. So here is a list of things to do in order to get ahead in the spider game:

First things first: updated content is a must. If you can, update your website daily. Ever wonder why blogs and forums do so well in search engines? Because their content is always changing, not to mention original.

Second, use a useful tool from Google called Google Sitemaps. It's a complete layout of your website that you give to Google so it already knows where to spider, which makes the Googlebot's job so much easier. If you haven't already, sign up for Google Sitemaps at http://www.google.com/webmasters/sitemaps/login. It's free and easy.
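
To give a rough idea of the format, here is a minimal sketch in PHP that writes a basic sitemap.xml by hand (the example.com URLs are just placeholders; in practice you'd pull the list of pages from your own site or database):

Code:
<?php
// Pages to list in the sitemap -- placeholders, swap in your own URLs.
$pages = array(
    'http://www.example.com/',
    'http://www.example.com/about.html',
    'http://www.example.com/articles/adsense-tips.html',
);

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pages as $url) {
    $xml .= "  <url>\n";
    $xml .= '    <loc>' . htmlspecialchars($url) . "</loc>\n";
    $xml .= '    <lastmod>' . date('Y-m-d') . "</lastmod>\n";
    $xml .= "    <changefreq>weekly</changefreq>\n";
    $xml .= "  </url>\n";
}
$xml .= "</urlset>\n";

// Write the file to the site root so the Googlebot can fetch it.
file_put_contents('sitemap.xml', $xml);
?>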

Third, keep your code free of errors and keep it to a minimum. There are a lot of tools out there that check your website's code to make sure there are no errors. Even if your content is generated automatically, check it just to make sure; machines aren't perfect.

Fourth, make sure you have a lot of relevant backlinks with different titles. Your website shouldn't be linked as "All About Google Adsense" all the time; someone else can link to it under a different title, like "Google Adsense Info Blog". This makes sure the Google spider doesn't mistake your backlinks for link-farm backlinks, which do more harm than good.

Fifth, keep your keyword density normal; about 3-7% should be good enough. Remember moderation in all things: you don't want too much or too little.
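
If you want to actually measure it, here is a rough sketch of a keyword density check in PHP (the file name and the keyword are made-up examples; point it at your own pages):

Code:
<?php
// Returns the keyword's share of the visible words, as a percentage.
function keyword_density($html, $keyword)
{
    // Strip the markup so only the visible text is counted.
    $words = str_word_count(strtolower(strip_tags($html)), 1);
    if (count($words) == 0) {
        return 0;
    }
    $hits = 0;
    foreach ($words as $word) {
        if ($word == strtolower($keyword)) {
            $hits++;
        }
    }
    return ($hits / count($words)) * 100;
}

$page = file_get_contents('index.html');
printf("Keyword density: %.1f%%\n", keyword_density($page, 'adsense'));
?>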

Sixth, make sure each of your images has an ALT attribute; this makes sure the Googlebot understands what the image represents, since it can only read text.
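
For example, one way to make sure no image ever goes out without alt text is to build the tag through one small helper (a tiny sketch; the helper name, file name, and description are invented for illustration):

Code:
<?php
// Builds an <img> tag that always carries an alt attribute.
function img_tag($src, $alt)
{
    return '<img src="' . htmlspecialchars($src) . '"'
         . ' alt="' . htmlspecialchars($alt) . '" />';
}

echo img_tag('images/earnings-chart.png', 'Chart of monthly AdSense earnings');
?>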

Seventh, always keep your webpages static unless it's unavoidable. Too much generated content is bad for spiders.

If you follow all of these, your website should be one of the most Googlebot-friendly websites out there, keeping your Google Adsense ads relevant and giving you a higher Google PR. But make sure you keep up to date on new developments in the Google spider field.

Thank You
  • meman
  • Web Master
  • Web Master
  • User avatar
  • Posts: 3432
  • Loc: London Town , Apples and pears and all that crap

Post 3+ Months Ago

Some good info, but you are confusing the Googlebot and the AdSense bot. The AdSense bot doesn't care whether you have fresh, unique content or a lot of backlinks; it just turns up, reads the page, and gives you adverts. It has no effect on the Googlebot, which spiders the internet in general and decides where you appear in the SERPs.
  • Diddy1
  • Newbie
  • Newbie
  • Diddy1
  • Posts: 12

Post 3+ Months Ago

What you don't know is that they both work as one now. Check out:

http://blog.searchenginewatch.com/blog/060419-094701

But so it won't confuse some of the newer people, I'll edit my post.

Thank You
  • meman
  • Web Master
  • Web Master
  • User avatar
  • Posts: 3432
  • Loc: London Town , Apples and pears and all that crap

Post 3+ Months Ago

Interesting news; I hadn't heard anything about the AdSense bot helping the Googlebot. Thanks for the link.
  • funlounge
  • Beginner
  • Beginner
  • funlounge
  • Posts: 40

Post 3+ Months Ago

Sorry, but I have to disagree with your fifth "don't" point: PHP is executed server-side, so your PHP code does not make it to the browsers or the web crawlers. As for HTML, these bots know how to interpret the tags and can tell what is content and what is code.

Quote:
Fifth, the HTML (or PHP, or whatever code it is) needs to be kept to a minimum. More code than actual content is bad for your website, since the Googlebot just sees text.
  • Archipel
  • Novice
  • Novice
  • Archipel
  • Posts: 15
  • Loc: Belgium

Post 3+ Months Ago

Absolutely, the last one isn't very accurate either.

Quote:
Seventh, always keep your webpages static unless it's unavoidable. Too much generated content is bad for spiders.


Why would dynamically generated pages be bad? The bot doesn't know; it just sees HTML (as long as the pages are generated on the server, that is).
  • joebert
  • Fart Bubbles
  • Genius
  • User avatar
  • Posts: 13502
  • Loc: Florida

Post 3+ Months Ago

I hope you like questions. :D

Quote:
First, duplicate content is an immediate turn-off. They don't like it when 1000 websites all say the same thing. So think twice before using those articles from article websites.

But they like it when 1000 websites say the same thing about another website?

Quote:
Second, nested tables: these are tables hidden inside other tables, which means you see nicely ordered content but behind it there's a lot of code. My advice: stay away from nested tables and stick with normal tables.

Is using tables strictly for spreadsheet data a good thing?

Quote:
Third, the so-called doorway pages, which basically involve loading your webpage with keywords and then diverting visitors to another webpage. Again, a major no-no here.

For instance, say I have a museum and I would like to cater both to people who can walk and to people in wheelchairs. I can't expect those in wheelchairs to get up the stairs, I can't use a straight ramp because it could be hazardous to those in wheelchairs, and it's not fair to expect those who are not in wheelchairs to use a switchback wheelchair ramp.

Is there a way to handle special cases with SEO that doesn't hurt me?

Quote:
Fourth, dynamic URLs, which change depending on the information passed to them. Too many of these will have the spider spend more time than necessary on your website, bringing your ranking down.

Isn't the information search engines have no need to know going to be personal or session-based information anyway?

Quote:
Fifth, the HTML (or PHP, or whatever code it is) needs to be kept to a minimum. More code than actual content is bad for your website, since the Googlebot just sees text.

Like a sandwich with half a jar of mayonnaise on it?

Quote:
Sixth, a trick many use to get a higher ranking, which only works for a while and then gets you completely dropped from Google: keyword overloading. Putting nothing but keywords on a page will rank you high the first time the bot crawls your site, but as time progresses it learns, and you pay.

How long do you anticipate it will be before Google spiders use language patterns while indexing to tip them off about this?

Quote:
Seventh, of course, is self-promotion, which is basically putting too many links to your homepage from your sub-pages. This sounds like it'll work for you, but it doesn't, as the Googlebot knows enough to tell which links come from which domain.

How many would be too many?
I like to have a link to the index at the top and bottom of pages so my "Home" and "End" keyboard keys are good for something.

Quote:
Eighth is having the same content on a page forever; one of the fundamental rules of a good webmaster is to keep your website fresh at all times.

Is there something wrong with archiving these pages and publishing new ones?

Quote:
Ninth is iframes. These frames do nothing short of confusing the Googlebot, as it indexes each frame as a different page, leaving your content in disarray. That is also why Google ads only respond to the iframe they are placed in.

I don't see iframes being of much use at all since AJAX, so I'll skip this one.

Quote:
First things first: updated content is a must. If you can, update your website daily. Ever wonder why blogs and forums do so well in search engines? Because their content is always changing, not to mention original.

Can we call these one-hit wonders?

Quote:
Second, use a useful tool from Google called Google Sitemaps. It's a complete layout of your website that you give to Google so it already knows where to spider, which makes the Googlebot's job so much easier. If you haven't already, sign up for Google Sitemaps. It's free and easy.

What's more important to Google: other sites telling them about something, or my sitemap telling them?

Quote:
Third, keep your code free of errors and keep it to a minimum. There are a lot of tools out there that check your website's code to make sure there are no errors. Even if your content is generated automatically, check it just to make sure; machines aren't perfect.

People aren't perfect either; who checks on the people who check on the machines?

Quote:
Fourth, make sure you have a lot of relevant backlinks with different titles. Your website shouldn't be linked as "All About Google Adsense" all the time; someone else can link to it under a different title, like "Google Adsense Info Blog". This makes sure the Google spider doesn't mistake your backlinks for link-farm backlinks, which do more harm than good.

Is this anything like staying with the current trends in fashion?
Kinda like the term "today's black"?

Quote:
Fifth, keep your keyword density normal; about 3-7% should be good enough. Remember moderation in all things: you don't want too much or too little.

Is going through my content and replacing words like "it" with whatever object their context refers to a good idea?

Quote:
Sixth, make sure each of your images has an ALT attribute; this makes sure the Googlebot understands what the image represents, since it can only read text.

Isn't the ALT attribute a required attribute for images in most DOCTYPEs?

Quote:
Seventh, always keep your webpages static unless it's unavoidable. Too much generated content is bad for spiders.

What if I have a dozen thinkers who publish their thoughts many times randomly throughout the day? Is it OK to keep changing my index page, or should I only update it once a day?
  • meman
  • Web Master
  • Web Master
  • User avatar
  • Posts: 3432
  • Loc: London Town , Apples and pears and all that crap

Post 3+ Months Ago

I know you were just being pedantic, lol, but this question was interesting and sometimes confuses people, so I'll answer it.
Quote:
For instance, say I have a museum and I would like to cater both to people who can walk and to people in wheelchairs. I can't expect those in wheelchairs to get up the stairs, I can't use a straight ramp because it could be hazardous to those in wheelchairs, and it's not fair to expect those who are not in wheelchairs to use a switchback wheelchair ramp.

Is there a way to handle special cases with SEO that doesn't hurt me?

I guess the question comes down to: what is cloaking, and is IP delivery ever OK?
Google would only have a problem if you were to use cloaking and IP delivery to treat the Googlebot differently from a normal visitor. If you use IP delivery in a good way, the Googlebot gets sent to your American version, like the rest of the Americans, and not to a special "Googlebot version".
You can see that Google has no problem with IP delivery, because even they use it: they send American people to google.com, the French to google.fr, and Brits to google.co.uk.
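
As a very rough sketch of that "good" kind of IP delivery in PHP, every visitor, Googlebot included, simply gets the version for their location. (The country_from_ip() helper is only a stand-in for whatever real GeoIP lookup you use, and the example.com URLs are invented.)

Code:
<?php
// Placeholder for a real GeoIP lookup (database, web service, etc.).
// This is NOT a built-in PHP function.
function country_from_ip($ip)
{
    // ... real lookup goes here ...
    return 'US';
}

$sites = array(
    'US' => 'http://www.example.com/',
    'FR' => 'http://fr.example.com/',
    'GB' => 'http://uk.example.com/',
);

$country = country_from_ip($_SERVER['REMOTE_ADDR']);
$destination = isset($sites[$country]) ? $sites[$country] : $sites['US'];

// Send everyone, bots and humans alike, to the same regional version.
header('Location: ' . $destination);
exit;
?>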
  • Diddy1
  • Newbie
  • Newbie
  • Diddy1
  • Posts: 12

Post 3+ Months Ago

Archipel wrote:
Absolutely, the last one isn't very accurate either.

Why would dynamically generated pages be bad? The bot doesn't know; it just sees HTML (as long as the pages are generated on the server, that is).


Because dynamic content changes based on variables, which means a user has to input something before it's generated; the date, for example, may be a factor. Also, the description of the page isn't in its URL, which takes away from your SEO points. A site about web hosting, for example, should have a URL like "web-hosting.html", not "index.php?=blah blah".
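
One way to get descriptive URLs out of a dynamic site, sketched in PHP: this assumes the web server is already set up (e.g. with mod_rewrite) to hand requests like /web-hosting.html to this script, and the slugs and content files are invented for the example.

Code:
<?php
// Map keyword-friendly URLs to the scripts that actually build the pages.
$pages = array(
    'web-hosting.html'  => 'content/web-hosting.php',
    'domain-names.html' => 'content/domain-names.php',
);

$slug = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));

if (isset($pages[$slug])) {
    include $pages[$slug];
} else {
    header('HTTP/1.0 404 Not Found');
    echo 'Page not found';
}
?>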

funlounge wrote:
Sorry, but I have to disagree with your fifth "don't" point: PHP is executed server-side, so your PHP code does not make it to the browsers or the web crawlers. As for HTML, these bots know how to interpret the tags and can tell what is content and what is code.

Quote:
Fifth, the HTML (or PHP, or whatever code it is) needs to be kept to a minimum. More code than actual content is bad for your website, since the Googlebot just sees text.


PHP just makes your page load times longer with little actual content, making the Googlebot spend more time than necessary. Those are all factors in your ranking.


joebert, here I'll try to answer your questions as best I can:

1: Yes, they like it when you are the site everyone is talking about. But the sites that all carry, say, the same 500-word article will go down SEO-wise, which will eventually bring you down.

2: Yes, I believe so; it depends on how much data you have.

3: I think meman answered that. What I was talking about was the way some people set up automatic redirects to Google so they can get a PageRank of ten, which after a while backfires and you gain nothing from it, as it goes away when you put up your own site.

4: Exactly, which is why there are so many passwords on Google if you know how to search right. But those websites are usually buried. So keep those to a minimum; don't make your whole site generated like some sites are.

5: Yes, except in this case Google can't taste the mayonnaise.

6: I'd say every three months or so, when the major PR update comes.

7: Yeah, about five would be too much, but a couple is expected.

8: No, nothing wrong with that, I figure.

9: No, any forum or blog can be successful, since it already fulfills the fresh-content requirement; you just have to keep it updated.

10: Both work together: the sitemap will point to EVERY one of your webpages, but other sites will increase the PR of a particular page.

11: I don't get this one.

12: No, it's more like diversifying your backlinks, or making sure your website isn't a one-topic website and can be used for other things.

13: Yes, yes, and yes! I do this too, as "it" and all those little substitution words can be replaced to increase your site's relevancy.

14: True, but many people don't follow the rules.

15: That's okay; they can update it anytime they want, as long as it stays on the page for a while.

Phew! There, all done. Hope this helps in any way, as that is my goal.

Thank You
