Here in Australia, we are connected to the Internet by rusty tin cans that occasionally get packet loss due to stray koalas playing with the line.
At least, that's what it feels like most of the time.
I work for a medium-sized business (100+ employees) where we frequently need to do mailouts for various sections of the business.
Each section has its own newsletters, updates, etc.
They've got pretty large mailing lists, as well as custom lists, all sent via SMTP directly from databases to an Exchange server, and they typically send attachments (I'm working on an easy way for those to be hosted instead). But even when there are no attachments, a mailout can tie up our mail server for at least an hour.
This delays email, which puts strain on unrelated systems and time-critical tasks, which increases helpdesk jobs, which overall raises the already high stress level of a small IT team (2 of us).
The current way we manage it is by delaying the mailouts until near the end of business hours. I don't believe this is a good long-term solution, and it isn't exactly a policy that can be heartily enforced; so every now and again there's a straggler who either doesn't know, or whose message is 'very important' but typically not that time-critical.
Now to the question:
I'm pretty much a Linux newbie, but I've got an idea for a solution.
Given the problem, I would like to know whether there is a way I could set up a 'mail queuing' server. I'm thinking of a Linux VM with some kind of 'quality of service' control, so I could limit the amount of bandwidth our mail server uses; that way it won't be flooded, the other services keep working, and our stress levels go down ;).
e.g.:
- 4000 emails go from the database to the 'mail staging' Linux VM
- the mail staging server forwards those emails on to our Exchange server at, say, 15 emails a minute, or enforces a "max outgoing bandwidth (KB/s)" for the server.
Ultimately I understand "we're gonna need a bigger pipe", but basically the budget can't stand it at the moment.
ServerFault, is this possible?
Edit: Zoredache has asked why we don't send the email directly from the distribution server.
Unfortunately, it's not that simple. The 'distribution server' is really a FileMaker Pro database hosted on a FileMaker server, with a client plugin that allows it to send email, essentially acting like a cut-down mail client.
Yes, I know it's not optimal.
2nd edit: can somebody please tag this 'filemaker'? It's a new tag, so I can't create it :P
Answer
There is one quick and simple way to do this. I'll show you the Postfix way:
Match the bulk mails (by header, regular expression, whatever...) and have them put on HOLD, no matter when they are sent.
/etc/postfix/main.cf:
header_checks = regexp:/etc/postfix/header_checks
/etc/postfix/header_checks:
/^Custom-Mail-Header: true/ HOLD Delayed until out of hours
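The Custom-Mail-Header above is just a placeholder; whatever sends the bulk mail (the FileMaker plugin, in this case) would need to add it to each message. If that isn't possible, one option is to match on the From: address the mailouts are sent from instead, e.g. (hypothetical address):
/etc/postfix/header_checks:
/^From:.*newsletter@example\.com/ HOLD Delayed until out of hours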
Then, out of hours (you can schedule this in a crontab however you like), you can use
15 20-23 * * * /usr/sbin/postsuper -H ALL
That'll release mail on hold at 20:15, 21:15, 22:15 and 23:15 every day.
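Bear in mind that postsuper -H ALL releases everything currently in the hold queue, not just the bulk mail, so avoid using HOLD for anything else on this box. To see what's sitting on hold at any point, run
mailq
and look for messages marked with a ! after the queue ID.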
Adjust paths and crontab times where appropriate.
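If you'd rather trickle the mail out during the day instead (closer to the "15 emails a minute" idea in the question), Postfix can also rate-limit delivery to the Exchange relay. A rough sketch, assuming a made-up Exchange hostname:
/etc/postfix/main.cf:
relayhost = [exchange.example.internal]
smtp_destination_rate_delay = 4s
A 4-second pause between deliveries to the same destination works out to roughly 15 messages a minute through the relay.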