
Re: [sympa-users] Sympa + robots.txt


  • From: Robert McNicholas <address@concealed>
  • To: Mateusz Krawczyk <address@concealed>
  • Cc: address@concealed
  • Subject: Re: [sympa-users] Sympa + robots.txt
  • Date: Thu, 09 Aug 2012 12:10:23 -0700

Hi Mateusz,

I am running Sympa 5.4.5. At our site a few lists wanted their archives indexed by search engines, but most don't, so we have the following in our robots.txt; it seems to work as expected.

-bash-3.2$ cat robots.txt
User-agent: *
Disallow: /
User-agent: *
Allow: /sympa/arc/ptolemy-hackers
Allow: /sympa/arc/ptolemy-interest

Search engines do index the archives of these two lists but none of our other lists, not even those that are set to be public.
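For what it's worth, the same rules can be written as a single record, which avoids any ambiguity with crawlers that don't merge duplicate "User-agent: *" groups (same two list paths as above; just a sketch):

User-agent: *
Allow: /sympa/arc/ptolemy-hackers
Allow: /sympa/arc/ptolemy-interest
Disallow: /

Crawlers that honor Allow (e.g. Googlebot) apply the most specific matching rule, so the Allow lines win for those two archives; older crawlers that ignore Allow will simply skip the whole site.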

I hope this helps,

-Rob McNicholas
Dept. of Electrical Engineering & Computer Sciences
University of California at Berkeley

On 8/9/2012 11:58 AM, Mateusz Krawczyk <address@concealed> wrote:
Hello,

I am not a very experienced Sympa administrator. One of the list owners asked me
to create a robots.txt file and disallow search engines from indexing the
mailing list's web archives. I thought this was not possible because of Sympa's
architecture: an FCGI program processes all requests and generates the pages
dynamically. I think the best solution is to set web_archive_spam_protection to
"cookie" for the specific mailing list. Am I right? I would be very grateful
for any response.
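(The setting I mean would be a single line in the list's config file, something like the following; I am guessing at the exact syntax:)

web_archive_spam_protection cookie

As I understand it, this makes the archive pages require a cookie, which would also keep crawlers out as a side effect.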

Best regards,
Mateusz Krawczyk




Archive powered by MHonArc 2.6.19+.
