OK, so what you are saying is to randomize the string of keywords for each page the list appears on.
All you want to do is shift the order around inside the meta tag, so words at the end move to the beginning, and so on.
I see exactly what you are saying now, but again, that comes back to the question: is it worth indexing every page of a CGI-generated site? Every time you rebuild, the information changes and links move around from page to page.
You actually have a better chance of getting properly indexed by using the right keywords and wording on your front pages and in your meta description tag, and by using a robots.txt file to direct the search engines to the pages you want indexed.
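For example, a robots.txt at the site root can keep crawlers out of the CGI-generated pages so they spend their page budget on your front pages instead. The paths here are hypothetical; substitute your own script and output directories:

```
# robots.txt -- keep crawlers out of generated pages (example paths only)
User-agent: *
Disallow: /cgi-bin/
Disallow: /generated/
```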
Most engines stop at a certain depth or number of pages; they won't get to all of your pages. Some won't index pages that contain a refresh, a redirect, or other non-static HTML.
Randomizing the keywords may not hurt, but many of the engines score hits partly by the order words are found. So randomizing the keywords, rather than carefully picking the order, will actually get you LOWER scores on hits from people searching for "word1 word2" when your randomization has turned it into "word2 word1".
It shouldn't be hard to randomize the meta tags: split the keyword string on whitespace, stuff the words into a hash using a key=>value of random(64000) => word, then read the hash back sorted on the key. Every time you do that, the words come out in a different order.
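A minimal sketch of that random-key trick, in Python here rather than LinkSQL's Perl; the function name is mine. Note I sort a list of (key, word) pairs instead of a true hash, so that two words drawing the same random key don't collide and get dropped:

```python
import random

def shuffle_keywords(keywords: str) -> str:
    """Return the same keywords in a random order.

    Splits on whitespace, tags each word with a random key
    (the random(64000) => word idea), then reads the words
    back sorted on that key.
    """
    words = keywords.split()
    keyed = sorted((random.randrange(64000), w) for w in words)
    return " ".join(w for _, w in keyed)
```

Call it once per page at rebuild time, e.g. shuffle_keywords("postcards free greeting cards"), and each page gets the same words in a different order.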
You can find code fragments for all of this in the existing LinkSQL code, and probably hack it together in a short time.
------------------
POSTCARDS.COM -- Everything Postcards on the Internet
www.postcards.com LinkSQL FAQ: www.postcards.com/FAQ/LinkSQL/