Dear mailing-list managers,
I have the following problem: I run a medium-sized mailing list and
keep a backlog of digests and an HTML version of the list on a website.
This works fine for reading through discussions and so on.
But I get requests from people who want to search the archives. When I
set up the archive, I created a robots.txt file to prevent the archive
from being indexed by web crawlers like AltaVista or HotBot. Why? Because I
don't want those discussions indexed, with the e-mail addresses of every
subscriber who posted something floating around on the Net where they can be
harvested for spam, just for posting to my mailing list.
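For reference, a robots.txt along the lines described might look like this (the /archive/ path is just an assumption about the site layout; well-behaved crawlers honor the file, but it is only a request, not access control):

```
# Ask all compliant robots to stay out of the mailing-list archive
User-agent: *
Disallow: /archive/
```

The file has to sit at the root of the web server (i.e. /robots.txt) to be picked up.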
I hate spam. I get about 20 spam messages over a single weekend, and I hate it.
My questions to you: has anyone else faced this situation? What did you do?
There are several options:
- Don't make it searchable
- Create a local search engine using some freeware tool (suggestions?). I
  don't really like the idea of huge indexes, however.
- Allow only one site to index the pages. But which one? Is there a search
  engine on the web with a special policy against spam?
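On the "local search engine" option: if the digests are plain text, even a simple grep-based script avoids building any index at all. A minimal sketch, where the script name, paths, and digest filename pattern are all assumptions:

```shell
#!/bin/sh
# searchdigests.sh -- case-insensitive search over plain-text digest files
# (no index to build or maintain; fine for a medium-sized archive)
# usage: searchdigests.sh KEYWORD /path/to/archive
grep -in "$1" "$2"/digest-*.txt
```

It prints each matching line with its line number; wrapping it in a small CGI script would make it usable from the website itself.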
I'm very happy that the amount of spam on my mailing list is very low, and I
want to keep it that way.
:: Jeroen .
:: Utrecht University, Faculty of Arts, The Netherlands
:: Office: KNG80, k2.07 / Phone/fax: +31(30) 253 6031 / 9191