Re: [en@sympa] Hardware Recommendations for Sympa
  • From: IKEDA Soji <address@concealed>
  • To: Rob Mitchell <address@concealed>
  • Cc: "address@concealed" <address@concealed>
  • Subject: Re: [en@sympa] Hardware Recommendations for Sympa
  • Date: Wed, 26 Apr 2023 12:22:58 +0900

Hi Rob,

On 2023/04/25 21:15, Rob Mitchell (via en Mailing List) wrote:
> Hello,
>
> We recently upgraded from Sympa 6.1.x to 6.2.68. We are seeing
> performance issues when emails are sent to large lists. Currently we
> are running this server on a VM with Red Hat 7 and the following
> hardware:
>
> 4 Intel Xeon Platinum 8270 CPUs @ 2.7 GHz
> 32 GB RAM
>
> Our Sympa installation has 7k lists and sends roughly 600k messages
> daily. During our delivery delays (30+ minutes) we see all 4 CPUs
> maxed out, but memory usage is OK. We have a change request in to
> double our core count to 8 CPUs to help mitigate this, but I am
> wondering what others are using for similarly sized servers.

We built the following mailing-list server with Sympa 6.2 beta about
10 years ago.

Intel Xeon E5-2620 2.1 GHz 2p/24c
48 GB RAM

7 000 lists, 10 000 users.

In load tests, by tuning the incoming_max_count and bulk_max_count
parameters and adjusting the number of workers, we confirmed that
accepting even 100 incoming messages per second, each with approx.
10 000 recipients, did not cause stalls in Sympa's spools. (N.B. this
measures the performance of Sympa alone: on the message delivery side
we sent to /dev/null instead of sendmail.)
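For reference, the throttling knobs mentioned above are set in sympa.conf. A minimal sketch with illustrative values (not the settings used in the test above, and not a recommendation):

```
## sympa.conf fragment (illustrative values only)

# Maximum number of sympa_msg.pl workers processing the incoming spool
incoming_max_count 8

# Maximum number of bulk.pl workers expanding recipient lists and
# handing messages to the MTA
bulk_max_count 8
```

Raising these increases parallelism at the cost of CPU and disk contention, so the right values depend on core count and spool storage.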

One thing to consider when running a high-traffic server with Sympa is
that hard disk performance has a considerable impact, because Sympa's
file-locking mechanism reduces the efficiency of the operating
system's disk caching. On the server above, Sympa's spools were placed
on an SSD.


Regards,
-- Soji





--
Conversion Co., Ltd.
IT Solutions Department, System Solutions Group 1, IKEDA Soji
e-mail address@concealed
https://www.conversion.co.jp/


