Sitemaps

  • Minne
  • Student
  • Posts: 94
  • Loc: Small Sports

Post 3+ Months Ago

I read that having a site map helps your pages rank. I've copied and pasted the relevant section below.

Site Maps

Site maps are useful in at least two ways:

If a user types in a bad URL most websites return a really unhelpful “404 – page not found” error page. This can be discouraging. Why not configure your server to return a page that shows an error has been made, but also gives the site map? This can help the user enormously.
Linking to a site map on each page increases the number of internal links in the site, spreading the PR out and protecting you against your vote “donations”.


How can I do this? It would be a great help to some people's sites.
  • Bigwebmaster
  • Site Admin
  • Posts: 9185
  • Loc: Seattle, WA & Phoenix, AZ

Post 3+ Months Ago

Well this is exactly what I do on my other site:

http://www.bigwebmaster.com/Tree/

That is a complete site map of the entire site. I don't show the full map on every page, but if you look at my menu you'll see it links to the majority of the main pages throughout the site, which does help spread the PR and increase internal links. I also link directly to the site map from every page of the site.
  • Minne
  • Student
  • Posts: 94
  • Loc: Small Sports

Post 3+ Months Ago

I am just about done with my site map. I am keeping it simple. All you do is go into your cPanel, go to Error Pages, choose 404, and put in the site map; it then shows up when someone types in a page that doesn't exist on your site.
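The cPanel steps above just write a custom 404 rule for you. If you'd rather set it by hand on an Apache server, a single `.htaccess` line does the same thing (the `/sitemap.html` path here is only an example; point it at wherever your site map lives):

```apache
# Serve the site map page whenever a requested URL is not found
ErrorDocument 404 /sitemap.html
```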
  • b_heyer
  • Web Master
  • Posts: 4580
  • Loc: Maryland

Post 3+ Months Ago

This makes sense because it would be creating a page with A LOT of keywords for Google, and A LOT of links to other pages.
  • Johan007
  • Guru
  • Posts: 1079
  • Loc: Aldershot, UK

Post 3+ Months Ago

A site map is very good for dynamic sites that have a ? in the URL.

For example, my movie reviews do not get spidered because they are only linked from a dynamic page, e.g.:

/cinema.asp?page=3

Instead, have a static page that lists all the URLs, like

/sitemap.asp

Then output all the dynamic links:

<a href=".../movie.asp?no=2">Blade Runner</a>
<a href=".../movie.asp?no=3">Memento</a>
<a href=".../movie.asp?no=4">Lord of the Rings</a>
<a href=".../movie.asp?no=5">American Pie</a>
  • phaugh
  • Professor
  • Posts: 794

Post 3+ Months Ago

Be a good host/hostess...

After a fun time with your friends you will say..."Hey lets do this again"

Why not do the same for the spiders....use this tag
<META name="revisit-after" content="10 days">
  • emitind.
  • Student
  • Posts: 92
  • Loc: england

Post 3+ Months Ago

Having a page with links to every page on your site is apparently supposed to boost your overall ranking. However, I've seen little improvement, so is this true?
  • Solutions
  • Graduate
  • Posts: 108
  • Loc: Denmark

Post 3+ Months Ago

I don't think so. I actually believe in internal PageRank leak.

However, a sitemap is the only right thing to do if your links are not spiderable.

I myself don't use a sitemap. Instead my pages are reachable from various points on my site.
  • phaugh
  • Professor
  • Posts: 794

Post 3+ Months Ago

I have noticed that internal linking passes PageRank to the other pages in the site, and if those pages link back to the main entrance they pass PR back. If you pass enough PR to your internal pages, you get backlink credit for all the links pointing back to your site entrance and to other internal pages. This works best when you link all the site's pages to each other.
  • Solutions
  • Graduate
  • Posts: 108
  • Loc: Denmark

Post 3+ Months Ago

I am not so sure. That would mean that a HUGE site could actually sort of generate its own PageRank :?: just by adding filler page after filler page...
  • phaugh
  • Professor
  • Posts: 794

Post 3+ Months Ago

Correct...but the amount of PR that is passed decreases or spreads out over the entire site.
  • Solutions
  • Graduate
  • Posts: 108
  • Loc: Denmark

Post 3+ Months Ago

Quote:
Correct...


How can it be correct and then there is a "but..." ?

I agree that the PageRank spreads to other internal pages, but the total amount of PageRank equals the PageRank that comes in from external pages plus the amount that Google gives for the code etc.
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

With all my sites, I usually have links to all the top pages from my index. I wouldn't ever have a link to every single page from my index. It may give different PR results, but this way has always worked well for me.
  • disgust
  • Graduate
  • Posts: 154

Post 3+ Months Ago

I link to my home page from every page. It may not be a huge jump; it's not going to boost you from a 5 to a 6, or probably even a 3 to a 4, but it doesn't hurt.

Plus, it helps with anchor text.

However, if you had some freakish situation where your homepage is a 4 and an internal page is getting a lot of links and is a 5, 6 or even 7, then obviously you'd want to make that page link back to the main page.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

If I make a site map, where on the index page should I put the link to it?

The top, or anywhere on the page?
  • emitind.
  • Student
  • Posts: 92
  • Loc: england

Post 3+ Months Ago

I would've thought the closer to the top it is, the quicker it will be followed... I don't know if it matters too much.
  • disgust
  • Graduate
  • Posts: 154

Post 3+ Months Ago

It doesn't matter at all unless the page is over 100kb, in which case it'd need to be in the first 100kb.
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

I may not have said it right. I do have a link on every page to my index, always very important. But, I do not believe you need a link to every page from your index.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

Hi vetofunk,
What effects will there be if you have a link to every page from the index?
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

Like I said, I have never tried it, just due to most of my sites having dozens and dozens of pages, which clutters up the index page.

I'd also rather have my index page give a bigger percentage of PR to my main category pages instead of a smaller percentage to all the pages in my site. But this is just my opinion, I am sure others have done it the opposite way with great results.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

I see your point, vetofunk.

Also, for my travel website there are no high-PR sites to get backlinks from. Definitely not Lonely Planet, haha; they wouldn't want my link.

Suggestions?
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

If Lonely Planet is your biggest competitor, I would go through the directories they are listed in and see if any of those sites are willing to exchange links with you. I would also take a look at their backlinks to see where they are listed and where it might be possible for you to be listed. Do all the usual searching (travel "exchange links", travel "add site", and so on). Make sure to get directory listings in Yahoo and DMOZ, and try to add your site to the categories all your biggest competitors are in. Also, think about business.com and Microsoft's directory if you can get into a nicely PR'd category.

Travel is going to be a tough job, as I am pretty sure it is one of most searched terms online. I travel a lot so I will keep your site in mind next time I go on a trip.

Good Luck!
  • phaugh
  • Professor
  • Posts: 794

Post 3+ Months Ago

just catching back up with this thread....

I think I may have been misunderstood. I didn't mean to put a link to every page from every other page, unless your site is less than 10 pages total. Otherwise, that's what the site map is for.

Here's what I try to do as far as structuring the whole site:

1. Site map: a map of the entire site with headings for your main sections, preferably enclosed in Hx tags and linked to each section. Below each heading is a subset of links to all of that category's internal pages, and the map also links back to the main page.

2. Index page: linked to the site map and each of the categories' main pages.

3. Main category page: optimized for that category's keywords, linked to all the other categories, and linked to all of that category's internal pages.

If I use graphics or JavaScript for any of the navigation, I also include a text link somewhere on the page, as well as alt tags on the graphics.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

Yes, thanks for all those tips. I have done a check on Lonely Planet already; it took me an entire day and was really tiring, heh.
DMOZ is still making me wait.

I will look into business.com and the MS directory too. Good idea. :-)

A travel website's not easy, haha. Too late for regrets; I chose travel because I travel lots too. Another big problem is that the keywords are too competitive.
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

Anything like this:
http://www.discount-home-office-furnitu ... cfm?SID=1&
  • phaugh
  • Professor
  • Posts: 794

Post 3+ Months Ago

Exactly!

Your business site uses a similar structure as well, with better use of anchor text linking to the additional categories and a little text blurb about each.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

Thanks for the example. I am also going to cut down my keywords
and add more links to the index. :-)
  • jlknauff
  • Expert
  • Posts: 501
  • Loc: Florida

Post 3+ Months Ago

How does a site map help your placement?
  • beefcakejcc
  • Graduate
  • Posts: 103
  • Loc: Atlanta,GA

Post 3+ Months Ago

It increases the likelihood of all your pages getting indexed.
  • CazpianXI
  • Proficient
  • Posts: 285

Post 3+ Months Ago

Yes, that's right. It also makes the search engine spider's job easier, meaning that:

1. All your pages have a better chance of becoming indexed
2. Your site may get indexed faster
  • ndvakil
  • Born
  • Posts: 1
  • Loc: India

Post 3+ Months Ago

If our web pages contain links that are not robot-friendly, then we definitely need a site map so search engine robots can easily crawl all the pages.

A site map proves beneficial in the general case too.
  • webinv
  • Graduate
  • Posts: 110

Post 3+ Months Ago

Greetings,

Sitemaps are also good for sites that do not have a site search engine.
Some visitors may be looking for a particular thing on your site and not be able to find it; a sitemap may help with that.

Also, I've seen sites that will create a variety of the "same pages" with different content. Those pages are then listed in the sitemap to try to help search engines pick up more pages with different content.
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

It is also recommended in Google's guidelines.

As everybody else has pointed out, having one makes your pages easier to index.
  • dprichard
  • Beginner
  • Posts: 61
  • Loc: Clearwater Florida

Post 3+ Months Ago

Do you feel like it is beneficial to have a little blurb (Description) with each page name or do you feel like just having the page names is best? Which have you all found works better for Search Engine Optimization?
  • CazpianXI
  • Proficient
  • Posts: 285

Post 3+ Months Ago

I don't think the spider will really care what you write about each page; all it will care about is what the link text contains. The only thing that having a blurb will do for you is optimize the site map's own PR. (Who would want to do that?)
  • discountdomains
  • Graduate
  • Posts: 170
  • Loc: Telford UK

Post 3+ Months Ago

I had problems with my sitemap when it had over 100 links on it; since breaking it down into 4 subsections, it works very well.

Clare
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

Having a description for each page is best, I guess.
Each page is a potential entry point for visitors, so it's good to have a good description for every one.
  • CazpianXI
  • Proficient
  • Posts: 285

Post 3+ Months Ago

Remember that site maps have more purpose than just SEO. Think about convenience for your visitors, too.
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

This is a site map that the SEs love, and we have even gotten positive comments from customers:

http://www.discount-home-office-furnitu ... cfm?SID=1&
  • CazpianXI
  • Proficient
  • Posts: 285

Post 3+ Months Ago

Very well thought-through site map, vetofunk!

(I assume that this is your website)
  • madmonk
  • Mastermind
  • Posts: 2115
  • Loc: australia

Post 3+ Months Ago

I don't think he needed to plan much for that site map;
it may have been generated from the product catalog.

Right, vetofunk? :wink:
  • vetofunk
  • A SEO GUY
  • Mastermind
  • Posts: 2243
  • Loc: Chicago

Post 3+ Months Ago

;-)
  • john5269
  • Graduate
  • Posts: 198

Post 3+ Months Ago

OK, so a site map is good!

But what about the size? Would a site map with about 1,000 links on it get penalized by the major search engines, since it will just look like a link farm?
  • Bigwebmaster
  • Site Admin
  • Posts: 9185
  • Loc: Seattle, WA & Phoenix, AZ

Post 3+ Months Ago

Google has just recently launched what is known as Google Sitemaps. It is still in BETA, but it lets webmasters define a map of their entire site in an XML format.

Quote:
What is Google Sitemaps?

Google Sitemaps is an experiment in web crawling. Using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and improve the time to inclusion in our index. By placing a Sitemap-formatted file on your webserver, you enable our crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.

Basically, the two steps to participating in Google Sitemaps are:

Generate a Sitemap in the correct format using Sitemap Generator.
Update your Sitemap when you make changes to your site.


This also seems like something useful for people who have problems with Google crawling/finding webpages on their website.

Quote:
Who can use Google Sitemaps?

Google Sitemaps is intended for all web site owners, from those with a single web page to companies with millions of ever-changing pages. If any of the following are true, then you may be especially interested in Google Sitemaps:

You want Google to crawl more of your web pages.
You want to be able to tell Google when content on your site changes.


Apparently once you have made a sitemap there is a special URL that you need to submit it to. You can also have Google help you automatically build your sitemap and have it submitted. You can see full information about Google Sitemaps here:

https://www.google.com/webmasters/sitemaps/stats
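For reference, a Sitemap file in the format Google describes is quite small. A minimal hand-written example might look like this (the URL and values are invented for illustration; the 0.84 namespace is the schema version from that era):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-20</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the loc tag is required for each URL; lastmod, changefreq and priority are optional hints.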
  • Fitness4Living
  • Proficient
  • Posts: 330
  • Loc: Belfast, Ireland

Post 3+ Months Ago

Interesting. I might try it when I get more skilled at SEO.
  • rtchar
  • Expert
  • Posts: 606
  • Loc: Canada

Post 3+ Months Ago

I still don't fully understand the reason for this new feature ... but I still applaud any efforts to increase communication with webmasters!

I don't believe this will replace the current Internet spidering done by Googlebot. Not every site will have the knowledge and ability to generate XML site maps. Most sites are barely able to handle link exchanges; they are run by business people, NOT programmers. :)

I suppose it will allow the other 10% of webmasters to refine crawls through their sites by Google.

If you have a lot of static pages, flash menus, or daily updates (news or forums), I can see where this will be helpful in directing robots where to crawl. It may even reduce the bandwidth used by search engines.

If your server does not support Last-Modified headers, this might allow webmasters to signal changes to their sites.

Of course the darker elements could abuse it by crawling entire sites daily or even hourly. :evil:
  • Jess
  • Guru
  • Posts: 1153
  • Loc: USA

Post 3+ Months Ago

Ouch, that's a pain in the ass to get right. I've just about got it implemented on one site; we shall see what (if any) difference it makes.
  • Axe
  • Genius
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

rtchar wrote:
I don't believe this will replace the current Internet spidering done by Googlebot.

I don't believe so either, but I wonder if any extra weight in the results will be given to those webmasters who have adopted Google's new technology/feature?

I don't know if I like the look of their script. It'd probably just be easier for me to write my own PHP script that creates the .xml file in the format they want. With database driven dynamic websites, where new pages are created daily, rolling my own seems to be the best way to keep things up to date on the sitemap.
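Rolling your own, as described above, is mostly string assembly. A rough sketch of the idea in Python rather than PHP (the `build_sitemap` function, the example URLs, and the `(url, lastmod)` page list are all invented for illustration; in practice the list would come from your database query):

```python
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """Build a Google Sitemaps XML document from (url, lastmod) pairs."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for loc, lastmod in pages:
        out.append('  <url>')
        # & in dynamic URLs must become &amp; to be valid XML
        out.append('    <loc>%s</loc>' % escape(loc))
        if lastmod:
            out.append('    <lastmod>%s</lastmod>' % lastmod)
        out.append('  </url>')
    out.append('</urlset>')
    return '\n'.join(out)

# In practice this list would be pulled from the site's database
pages = [('http://www.example.com/', '2005-06-20'),
         ('http://www.example.com/article.php?id=2&page=1', None)]
print(build_sitemap(pages))
```

The escaping step matters for database-driven sites: a raw & in a query-string URL is invalid XML and will make the file fail to parse.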
  • rtchar
  • Expert
  • Posts: 606
  • Loc: Canada

Post 3+ Months Ago

I have been giving this a little more thought ... OK, I was looking at building a web-based robot that would generate the XML automatically. As it turns out, if a robot can easily crawl your site, you DO NOT need XML sitemaps.

The beauty of this system is that you can now tell Google how to access those parts of your site that are NOT spider friendly.

Flash Menus, Javascript, Form Navigation, Drop Down List Boxes, Databases, Dynamic Catalogs? No problem ... include an XML sitemap.

This may actually create a whole new industry. (SEO Sitemap Specialist?) :lol:

Quote:
The REALLY interesting part is how Google will handle page rank. If the only readable link is a sitemap reference, how can they assign PR?

For example --- if every page on my site had a flash menu pointing to my link pages --- but only one reference appears in the site map --- how can PR be passed?
  • SSH-Raj
  • Expert
  • Posts: 588

Post 3+ Months Ago

Axe wrote:
I don't know if I like the look of their script. It'd probably just be easier for me to write my own PHP script that creates the .xml file in the format they want. With database driven dynamic websites, where new pages are created daily, rolling my own seems to be the best way to keep things up to date on the sitemap.

I'm going to have to do that also... mine will be for osCommerce.
  • Jibran
  • Beginner
  • Posts: 61
  • Loc: Lucknow, India

Post 3+ Months Ago

I already generate a static sitemap from my dynamic site. Will this work, or do I need to get in touch with my host about Python and do it the Google way?

http://www.pottersrealm.com/sitemap.html
  • Axe
  • Genius
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

As long as Google can read a file in the appropriate format, I don't think it matters WHAT you use to generate it.

I'm going to be using PHP to output the format Google wants on PHPSector.com sometime when I get the time spare.
  • SEO_Pro.
  • Student
  • Posts: 88
  • Loc: SLC, Ut

Post 3+ Months Ago

I tell you what, if this thing takes off I will make a program that generates the XML for you. Then everyone will be able to use the XML site map for Google :lol:

Jacob
  • SEO_Pro.
  • Student
  • Posts: 88
  • Loc: SLC, Ut

Post 3+ Months Ago

Hey,

I have heard that Google is not fond of PHP pages... is this true?
  • Axe
  • Genius
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

Google doesn't mind PHP pages. I've seen MANY MANY PHP pages indexed.

The problem lies in parameters.

What Google doesn't like is:

.php?something=this&somethingelse=that&this=nothing&that=something

It also doesn't like session IDs on the URLs (which are easily removed if they exist).

Other than those two things, Google likes PHP as much as HTML.

So, it's not the content, but the way the information is presented (in a technical specification kinda sense).
  • SEO_Pro.
  • Student
  • Posts: 88
  • Loc: SLC, Ut

Post 3+ Months Ago

Thank you for the reply. The SEO firm I work for tries not to use PHP pages because it is not in the list of languages that Google accepts... Why is this? We found a list of languages on Google's website, within the webmaster material, and PHP was not in the list....

Note: I looked a little for the list but could not find it (my lunch break is up, got to get back to making some more annoying marketing pages)...

Jacob
  • Axe
  • Genius
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

Well, PHP isn't a language that Google can even see. PHP is all server-side. It still OUTPUTs HTML in a regular web page.

That's why many PHP based sites (including this one), use mod_rewrite to fake out the URL to make it more search-engine friendly.

The fact that the fake URLs end in .html and not .php just simplifies working with the process. If you see .php in a URL, you know it's real; if you see .html, you know it's a rewrite. If you named your real and fake URLs both .php, it could get confusing to the developer and you'd end up not knowing which files you need to edit, etc.

But, as far as Google's concerned, the .php extension is treated exactly the same as .html (assuming there's no parameters on the command line, or cookies/sessionIDs to deal with).

The cookies Google doesn't mind, as long as you're not putting the session ID on the URL.
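A minimal sketch of the mod_rewrite trick described above, for an Apache `.htaccess` file (the URL pattern and file names are made up for illustration):

```apache
# Map search-engine-friendly URLs onto the real PHP script
RewriteEngine On
RewriteRule ^article-([0-9]+)\.html$ /article.php?id=$1 [L]
```

A request for /article-42.html is internally served by /article.php?id=42, so spiders never see the query string.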
  • SEO_Pro.
  • Student
  • Posts: 88
  • Loc: SLC, Ut

Post 3+ Months Ago

Quote:
Well, PHP isn't a language that Google can even see. PHP is all server-side. It still OUTPUTs HTML in a regular web page.


Well, ASP is in the list and that is a server-side language also.....
  • Axe
  • Genius
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

Yeah, but I actually see PHP pages coming up in SERPs. I see ASP pages come up occasionally too, but not too often; that's probably just due to the fact that PHP's MUCH more popular than ASP.
  • mighty b
  • Beginner
  • Posts: 39

Post 3+ Months Ago

I had problems running the script, so I just generated a list of URLs in txt format and submitted it. It was downloaded fine and the status is now OK.

Does this mean it worked?
  • rtchar
  • Expert
  • Posts: 606
  • Loc: Canada

Post 3+ Months Ago

Google recommends you start simple ... a text file or simple XML file with just a few pages listed is OK.

Don't worry if not every page on your site is listed in your first attempts; this program will NOT replace the crawlers. Besides, you can submit a new file with additions every day if you like.

The program is still in Beta, the point is to test the system before it goes live in unattended mode.

mighty b
Quote:
It was downloaded fine and the status is now OK.


According to Google OK means that the uploaded file processed correctly and is now queued. Of course there is no guarantee the "suggestions" will be included in the index. :lol:
  • allgoodpeople
  • Proficient
  • Posts: 379
  • Loc: here

Post 3+ Months Ago

just a thought . . .

if this takes off and gets to be standard practice with websites, will other search engines be able (or even be allowed) to access this config.xml file for their own indices?
  • rtchar
  • Expert
  • Posts: 606
  • Loc: Canada

Post 3+ Months Ago

Google is hoping other Search Engines also make use of this protocol.

Technically the sitemap belongs to YOU. Share it with anyone you like.

I am hoping the protocol expands to include TITLE, DESCRIPTION, and KEYWORD meta tags. Think of how convenient it would be to have this info summarized.

Then with any luck major directories (DMOZ) could automate their submit process as well. :lol:
  • mighty b
  • Beginner
  • Posts: 39

Post 3+ Months Ago

Quote:

According to Google OK means that the uploaded file processed correctly and is now queued. Of course there is no guarantee the "suggestions" will be included in the index. :lol:


I'll have to wait for a few weeks then, lol.
  • pine_things
  • Novice
  • Posts: 20
  • Loc: Warwickshire, UK

Post 3+ Months Ago

Has anybody experienced better indexing of their webpages by using the Google site map?

Does it see all the pages quicker?
  • mighty b
  • Beginner
  • Posts: 39

Post 3+ Months Ago

Definitely. Googlebot lives on my forum now. I had quite a few pages indexed the other day that Googlebot would not even go to before.
  • 993ti
  • Newbie
  • Posts: 12

Post 3+ Months Ago

Submitted it and it got downloaded within the hour; that's pretty fast.
I'm curious how it goes :)
  • Jibran
  • Beginner
  • Posts: 61
  • Loc: Lucknow, India

Post 3+ Months Ago

I just noticed that the number of indexed pages of my site have gone down from 5000 to 4870! 8O This happened after using the Google Sitemaps.
  • mighty b
  • Beginner
  • Posts: 39

Post 3+ Months Ago

:bouncingsmile: result!!!

Just got 4000+ pages indexed. Before the site map it took me 5 months just to get 300
  • joebert
  • Genius
  • Posts: 13511
  • Loc: Florida

Post 3+ Months Ago

Something those of you with phpBB boards and trouble getting them indexed may want to keep an eye on: http://www.phpbb.com/phpBB/viewtopic.php?t=296051
  • Johan007
  • Guru
  • Posts: 1079
  • Loc: Aldershot, UK

Post 3+ Months Ago

This really should be good news for webmasters with content and news sites. Imagine your articles going into Google within hours instead of days! My entire site map for Future Movies is only 50kb; sure, it has its limitations, but those are insignificant.

You can just provide a list of URLs in XML if you want to be safe. However, I also strongly recommend that anyone using a database avoid the last-update date, because that date is taken from the database, and if you make changes to the HTML your page may be ignored.

Feel free to use my simple classic ASP code (easy to convert to PHP). To save on file size, do not convert this code to inline code (it's worth the slight server hit of multiple "response.write" calls, because only Google will be using it):

Code: [ Select ]
<!-- #Include virtual="/database-connection.asp" -->
<%

Response.Buffer = true
response.ContentType = "text/xml"
response.write "<?xml version='1.0' encoding='UTF-8'?>"
response.write "<urlset xmlns='http://www.google.com/schemas/sitemap/0.84'>"

' List your static URL's

response.write "<url>"
response.write "<loc>http://www.domain.co.uk/</loc>"
'response.write "<lastmod>" & Danger & "</lastmod>"
response.write "<priority>0.5</priority>"
response.write "<changefreq>daily</changefreq>"
response.write "</url>"

response.write "<url>"
response.write "<loc>http://www.domain.co.uk/sub-home</loc>"
'response.write "<lastmod>" & Danger & "</lastmod>"
response.write "<priority>0.5</priority>"
response.write "<changefreq>daily</changefreq>"
response.write "</url>"

' List your dynamic URL's

Dim n

n = 0

strSql = "SELECT Table.ID AS [pageID] FROM Table WHERE Table.Delete<>'Y' ORDER BY Created DESC"
Set db = Server.CreateObject("ADODB.Connection")
Set Rs = Server.CreateObject("ADODB.Recordset")
db.Open strDBConnection
Rs.Open strSql, db

Do While Not Rs.EOF
    intID = Rs.Fields("pageID").Value
    response.write "<url>"
    response.write "<loc>http://www.domain.co.uk/article.asp?ID=" & intID & "</loc>"
    'response.write "<lastmod>" & Danger & "</lastmod>"
    
    If n < 10 Then
        response.write "<priority>1.0</priority>"
        response.write "<changefreq>daily</changefreq>"
    ElseIf n < 20 Then
        response.write "<priority>0.7</priority>"
        response.write "<changefreq>monthly</changefreq>"
    Else
        response.write "<priority>0.2</priority>"
        response.write "<changefreq>yearly</changefreq>"
    End If
    
    n = n + 1
    
    response.write "</url>"
    Rs.MoveNext
Loop

Rs.Close
db.Close
Set Rs = Nothing
Set db = Nothing

' End Dynamic URL (maybe have another table?)

response.write "</urlset>"
%>


lastmod, priority and changefreq are all optional tags! It's unlikely they will be used, but if they are, don't forget it's all relative, so do try to have low values for all your old pages. I suggest a priority of 0.5 for the homepage, 0.2 for old pages and 1.0 for new pages, which you can set dynamically. The same goes for changefreq.

Limitations:
This code obviously does not limit the URL count to 50,000, so keep an eye on that if you have a mega site; maybe add another counter to show the number of URLs in admin mode. The way it's coded (non-inline), the file size for 50,000 URLs would never get to 10MB; more like 5MB.
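If you do outgrow those per-file limits, the sitemap protocol lets you split the URLs across several files and list them in a sitemap index. A minimal Python sketch (the domain and file names are placeholders):

```python
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(urls, base="http://www.domain.co.uk", per_file=50000):
    """Split a URL list into chunks of at most `per_file` entries and
    return (index_xml, [chunk_xml, ...])."""
    chunks = [urls[i:i + per_file] for i in range(0, len(urls), per_file)]
    sitemaps = []
    for chunk in chunks:
        body = "".join("<url><loc>%s</loc></url>" % escape(u) for u in chunk)
        sitemaps.append('<?xml version="1.0" encoding="UTF-8"?>'
                        '<urlset xmlns="%s">%s</urlset>' % (NS, body))
    # The index file just points at each chunk file by URL
    index_body = "".join(
        "<sitemap><loc>%s/sitemap%d.xml</loc></sitemap>" % (base, i + 1)
        for i in range(len(chunks)))
    index = ('<?xml version="1.0" encoding="UTF-8"?>'
             '<sitemapindex xmlns="%s">%s</sitemapindex>' % (NS, index_body))
    return index, sitemaps
```

You would write each chunk out as sitemap1.xml, sitemap2.xml, etc., and submit the index file instead of a single sitemap.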
  • pine_things
  • Novice
  • Novice
  • pine_things
  • Posts: 20
  • Loc: Warwickshire, UK

Post 3+ Months Ago

Jibran wrote:
Follow my advice, dudes! Don't jump into Sitemaps! My indexed pages had reached 5000; now only 724 are left, in four days' time!!! The decrease started occurring after I started using Sitemaps!!! :angry:


The header of all your pages looks pretty similar, so perhaps Google thinks the pages themselves are similar. Before, it could not go deep down and look at the content; now that you have created the sitemap, it looks at all your pages. Also, the priority in your sitemap.xml for the majority of the URLs is 1.0.
  • Johan007
  • Guru
  • Guru
  • User avatar
  • Posts: 1079
  • Loc: Aldershot, UK

Post 3+ Months Ago

How many pages are listed in your sitemap?
  • Jibran
  • Beginner
  • Beginner
  • User avatar
  • Posts: 61
  • Loc: Lucknow, India

Post 3+ Months Ago

1200+ pages are listed. The priority has been set automatically; I'll change that too. That is, 8 pages and 4 urls.txt lists are given in the config.xml, which, when run through sitemap_gen.py, yields 1210 HTML pages and 1 PHP page.

Also, the titles for almost every page are different. I have set up dynamic titles. The titles and descriptions for every news article are different, etc.
  • Johan007
  • Guru
  • Guru
  • User avatar
  • Posts: 1079
  • Loc: Aldershot, UK

Post 3+ Months Ago

Jibran, if you're talking about http://www.pottersrealm.com, sadly I suspect you haven't got enough PR to support that many pages. PR3 is in no way enough! Make a links page and start swapping with other Potter sites. You need to aim for PR5, maybe 6.
  • Jibran
  • Beginner
  • Beginner
  • User avatar
  • Posts: 61
  • Loc: Lucknow, India

Post 3+ Months Ago

Does PR matter to the number of pages being indexed? The following: http://www.searchengineengine.com/
shows my real PR at 4. I have added my site to some directories too. I am targeting a higher PR. And I do have a links system:

http://www.pottersrealm.com/links.html

Does it matter to whom I link, or only who links to me?

Edit: just a question: could it be that, since other high-PR sites have content similar to my site's, my site is getting penalised? I did a search via Copyscape.com and found quite an overwhelming number of matches!
  • Johan007
  • Guru
  • Guru
  • User avatar
  • Posts: 1079
  • Loc: Aldershot, UK

Post 3+ Months Ago

Copyscape does not just show up similar content but more like stolen content, so that's a possibility; but the effect of this latest Bourbon update by Google is unknown. If you search for "Potters Realm Harry Potter!" you will not be number one, yet if you remove Google's penalisation filter from the search, it then shows you at number one. Have your stats fallen this month (or from the 24th of last month)?

If yes to both then:

1. Remove dupe content
2. Get a few more inbound links from other potter sites.

We are going way off-topic here... start a new post or PM me if needed.
  • Axe
  • Genius
  • Genius
  • User avatar
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

Of course they're going to be good. They allow you to easily say to Google "Hey, these pages exist on my site".

Even if they're several links deep into your content, you can tell Google that they are around on your site and available for public viewing.

You'll be able to get a lot more of your pages into Google's indexing queue much quicker than natural crawl (at least, that's the theory).

It doesn't replace natural crawl; it's a complement to it.

I use it on some of my sites.
  • boohlick
  • Beginner
  • Beginner
  • User avatar
  • Posts: 61

Post 3+ Months Ago

That's right... but you have to make it in XML form to be able to add it to your Google Sitemaps account. It's good, as he says: Google can easily crawl all your pages.
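For reference, a bare-bones sitemap in the XML form Google expects looks roughly like this (the URL and values are placeholders; only the loc tag is required for each URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each additional page gets its own url element inside the same urlset.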
  • SplitMedia
  • Newbie
  • Newbie
  • SplitMedia
  • Posts: 13
  • Loc: Split-Media.com

Post 3+ Months Ago

For http://www.split-media.com

How would I get this "xml"? What information needs to be in it? I didn't understand it very well, and I signed up with this Google Sitemaps thing without really understanding what was involved. Any comments/suggestions?
  • malprave
  • Student
  • Student
  • malprave
  • Posts: 84

Post 3+ Months Ago

I have just completed my XML sitemap and submitted it to Google (well, not really; they are so busy I'll have to try again in a few moments).

I did not, however, do it the way they prescribed (using Python and all that). I just googled a bit and found this website:

http://www.auditmypc.com/free-sitemap-generator.asp#sitemap-generator-updates

which does exactly the same thing, except you don't need Python; all you need is a Java-enabled browser.

It can take a while to change all the settings if you want to customise each page, i.e. priority, modification occurrence...

I just wanted to know how many others have done this and whether we get any additional benefit!!!???
  • Axe
  • Genius
  • Genius
  • User avatar
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

I wrote my own Google Sitemaps generators for my sites, and for several popular PHP based scripts.

Submitted, and traffic has increased to those sites beyond their normal expected growth.
  • ATNO/TW
  • Super Moderator
  • Super Moderator
  • User avatar
  • Posts: 23473
  • Loc: Woodbridge VA

Post 3+ Months Ago

Axe provided me with his script for phpBB forums, and it is working for me as well. His version is available on various download sites.

Here's one of them
http://www.hotscripts.com/Detailed/50901.html
  • reaper
  • Proficient
  • Proficient
  • User avatar
  • Posts: 435
  • Loc: europe

Post 3+ Months Ago

I came accros this site and it has a nice free tool for generating a google sitemap that you can submit.

You should check it out

http://www.vigos.com/products/gsitemap/

Note:
Microsoft .NET Framework is required to use Gsitemap!
  • Axe
  • Genius
  • Genius
  • User avatar
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

No manual, no FAQ, no other documentation (I don't know how that helps beginners, heh).

I'll stick to writing my own. If you've got enough pages that you can really justify having a sitemap, I think a remote piece of software, or a remote script like that, would just put too much load on the server by loading all the URLs on the site.
  • Alan Lastufka
  • Proficient
  • Proficient
  • User avatar
  • Posts: 318
  • Loc: ChicagoLand, IL, USA

Post 3+ Months Ago

http://www.sitemapbuilder.net/default.aspx

That's gotta be the easiest for beginners, or anyone really. You can tell it to analyze your server, or just copy and paste a link list from a Word doc (I keep a text file I add every new link to myself), and it spits out an .xml file.

Super easy.
  • funlounge
  • Beginner
  • Beginner
  • funlounge
  • Posts: 40

Post 3+ Months Ago

Hi
I've been using a sitemap.xml file (submitted to Google Sitemaps)
for my new site www.celebritiescentral.net
for a week now, but my pages still do not appear in its index.

Any experiences with sitemaps ?

Thanks
  • Axe
  • Genius
  • Genius
  • User avatar
  • Posts: 5735
  • Loc: Sub-level 28

Post 3+ Months Ago

The sitemaps file just gets your URLs in Google's database faster, it doesn't make them take any less time to actually view your pages. They'll put your URLs in their spider queues, and if they come across backlinks, etc. they'll bump URLs up the queue and go from there...

So, it still takes as long as conventional ways to START getting indexed, although with Sitemaps, once it does start getting indexed, it'll ALL start getting indexed, so it's much faster overall, especially once Google's caught up and you're just adding new content.
  • mac5150
  • Novice
  • Novice
  • mac5150
  • Posts: 15
  • Loc: Colorado

Post 3+ Months Ago

Google gives some good links to both downloadable and online sitemap generators.
http://code.google.com/sm_thirdparty.html

I'm using GsiteCrawler right now. It does what I need and it's pretty stable for a beta.
  • LittleEarner
  • Novice
  • Novice
  • LittleEarner
  • Posts: 18
  • Loc: England, UK

Post 3+ Months Ago

I've just tried GsiteCrawler based on your recommendation, and I must say, I'm very impressed with it. I'd certainly recommend it.

Dan
  • kofoid
  • Novice
  • Novice
  • kofoid
  • Posts: 27

Post 3+ Months Ago

Will doing regular sitemap submissions to Yahoo and Google help my placement, or is it just a waste of my time?
  • Ganceann
  • Graduate
  • Graduate
  • Ganceann
  • Posts: 152

Post 3+ Months Ago

In theory they would help.

In reality there is no guarantee.


Overall, you should balance the submissions; don't overdo them, or they may be flagged as spam submissions and therefore ignored.

If any changes have been made to the sitemap, it is a valid submission; if nothing has changed since the previous submission, it will be seen as a duplicate and most likely ignored, as the original submission would still be queued.

On large sites, and forum-based sites like yours, it can take a long time to get fully indexed: each forum has to allow spiders access to index it, and each page needs to appear unique to merit indexing.

The only cure is time: if the spiders have access, time should see them indexing everything that contains valid content.
  • kofoid
  • Novice
  • Novice
  • kofoid
  • Posts: 27

Post 3+ Months Ago

How often is too often? I have it submitted nightly.
  • Ganceann
  • Graduate
  • Graduate
  • Ganceann
  • Posts: 152

Post 3+ Months Ago

Nightly is likely too often, in my opinion.

The reason is that it will not show up directly after submission; the listing in the SERPs can be up to 4-6 weeks old, though it can also be quite up to date. It really just depends on Google's handling.

If there is a 4-6 week delay, or any delay, between when your site is crawled and when the info is displayed, then a nightly submission is too frequent.

If there were no delay, then a nightly submission would seem fine, provided there were daily changes.

I would cut the submissions down from nightly to weekly (or maybe monthly) to see if it has any effect, and only submit more frequently when there were changes on a given day.
  • kofoid
  • Novice
  • Novice
  • kofoid
  • Posts: 27

Post 3+ Months Ago

sooooo, if I have a forum where the content changes daily.... what do you think?
  • Ganceann
  • Graduate
  • Graduate
  • Ganceann
  • Posts: 152

Post 3+ Months Ago

The content may change daily; the sitemap wouldn't necessarily change daily.

My advice is to analyse the search keywords or search strings most people find your forum with.

Check the major search engines for your rank for those terms.

Analyse whether you can bring your rank up with those terms, checking on the other things I mentioned in a message or another SEO thread.

If your site is indexed and is being crawled regularly, submitting the sitemap nightly will not help your cause of ranking for more keyword searches. It may well hinder your site and tag it as a spam submission.
  • roma
  • Graduate
  • Graduate
  • roma
  • Posts: 142

Post 3+ Months Ago

I wondered if anyone knows about Google Sitemaps. If so, is it best to put daily on the pages, or monthly, or what? Or does it matter? What are the best settings for Google Sitemaps?

sk
  • Ganceann
  • Graduate
  • Graduate
  • Ganceann
  • Posts: 152

Post 3+ Months Ago

I am no expert on it, but I have started reading the Google Sitemaps group information.

From that I have gathered that the sitemap can be used to help determine priority (in conjunction with PR) for how often Googlebot will crawl the site. The problem is that the sitemap itself may get linked to a bot queue, and frequent submissions may not ensure that the old sitemap is removed from that queue; problems then arise when two sitemaps show different page structures. I have been reading about some people still getting Googlebot errors 6 months after they had updated to a new sitemap.

It would be great if Google could actually clarify things, but it appears they do not have an exact answer on the behaviour of a sitemap in a bot queue and whether a new sitemap will always replace an old one. In theory the new sitemap would replace all previous versions, but in reality there are some glitches, and a lot of people report errors from Googlebot crawling pages that have not existed for a long time.
  • roma
  • Graduate
  • Graduate
  • roma
  • Posts: 142

Post 3+ Months Ago

Well, Googlebot hits my sites (two of them out of three) several times a day. I put up a new map as needed. Right now they have a really old map that shows very old pages that have not existed for years.

I noticed my competitors who rank higher than I do don't use a sitemap at all, as far as I can tell. At least nothing called sitemap.xml or sitemap.xml.gz, and I assume those are the only names one can use.

Frankly, I don't think they matter.

sk
  • WebFreedom
  • Graduate
  • Graduate
  • WebFreedom
  • Posts: 138

Post 3+ Months Ago

For what it's worth, I've read that it's best to update your Google sitemap once per week, provided that you have new content.

HTH,
Sam
  • Byzantium
  • Student
  • Student
  • Byzantium
  • Posts: 86
  • Loc: London

Post 3+ Months Ago

The best bet is rather obvious: just put a realistic assessment of how frequently each page changes.

Some pages of your site will change more frequently than others, so if Google knows about that, it will help it make the best use of its time when spidering your site.

Setting the frequency to daily is not going to make it visit each page daily. It's going to make its own decisions about how to use its time; but if you let it know which pages change most frequently, it can take that into account.
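A realistic mapping, as suggested above, might be driven by how the page is actually generated rather than by wishful thinking. A sketch in Python (the page types and values are purely illustrative):

```python
def realistic_changefreq(page_type):
    """Illustrative mapping from page type to an honest changefreq value."""
    mapping = {
        "homepage": "daily",      # fresh links/news on most visits
        "forum_topic": "weekly",  # active discussion, but not every day
        "article": "monthly",     # occasionally edited after publishing
        "archive": "yearly",      # effectively frozen
    }
    return mapping.get(page_type, "monthly")
```

The point is simply that the value should describe observed behaviour, not the crawl rate you wish for.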
  • Microsys
  • Novice
  • Novice
  • User avatar
  • Posts: 28

Post 3+ Months Ago

I believe the sitemap priority tag can be useful, so much so that I even "blogged" about it once :-)

If bandwidth becomes a problem, be sure to gzip your sitemap (many sitemap generators support this as well).
  • remaxactionfirst
  • Graduate
  • Graduate
  • remaxactionfirst
  • Posts: 103

Post 3+ Months Ago

DEBARATI, first create an XML sitemap for your site.
Then
(https://www.google.com/accounts/Service ... 3Den&hl=en)
Create an account at the link given, then add your site, and it shows you step by step how to add a sitemap for Google.

Then, for verification, there is a code file you have to put in your root directory; or, if you select meta-tag verification, put that meta tag in the head section of your page.
Save, then go back to your Google account and click the verify button; it will automatically find that tag on your site, verify your site, and Google starts crawling your site automatically day by day.
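For the meta-tag option described above, the tag goes in the head of your homepage and looks roughly like this (the content token here is made up for illustration; use exactly the value Google generates for your account):

```html
<head>
  <title>My Site</title>
  <!-- hypothetical verification token; Google generates the real value -->
  <meta name="verify-v1" content="EXAMPLE-TOKEN-FROM-GOOGLE=" />
</head>
```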
  • shandaman
  • Beginner
  • Beginner
  • shandaman
  • Posts: 35

Post 3+ Months Ago

Site maps are a great way to get every page in your website indexed! They also help to exclude pages you do not want indexed! Great Fun!
  • Americantruckbuyer
  • Student
  • Student
  • Americantruckbuyer
  • Posts: 83

Post 3+ Months Ago

Yeah, I too think that sitemaps are really helpful, and this code is also good and helpful.

Use this tag:

<META name="revisit-after" content="10 days"> or you can also keep revisit-after at 7 days.


Regards
American Truck Buyer
  • tiekie
  • Novice
  • Novice
  • tiekie
  • Posts: 30

Post 3+ Months Ago

Hi

I read on another forum that when people removed their
sitemap they got better rankings instantly... so how can
a sitemap help? A lot of people agreed with this.

Here is the thread:
http://forums.digitalpoint.com/showthread.php?t=100844

Please let me know what you think about this.

Post Information

  • Total Posts in this topic: 106 posts

© 1998-2016. Ozzu® is a registered trademark of Unmelted, LLC.