mail server and mass batch processing

progman1010 (Programmer), Jan 2, 2008
I wrote a mail script which sends emails via a cron job once every 10 minutes. This way, I can queue up large volumes of mail in a db table and not worry about my individual scripts stalling.

The problem is that with any volume over approximately 1000, the system sends 4 copies of the same email (very bad). The only thing I can think is happening is that the script runs and sends the mail, but never flags it as 'sent.'

You'll see in the code attached that the sending and the flagging as 'sent' are done separately. I thought it would be a good idea to do this for speed, but I do see the hole in my theory.

I am interested in hearing your thoughts on how to make it more efficient and reliable.
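The attachment isn't included in the thread, but the pattern described is: select everything queued, send it all, then flag the whole batch at the end. A rough sketch of that pattern (the mail_queue table and its column names are placeholders, not the real attached code):

Code:
<?php
// Sketch of the described pattern; mail_queue and its columns (id, to_addr,
// subject, body, status) are assumed names, not the real schema.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Pull everything still queued.
$rows = $db->query("SELECT id, to_addr, subject, body
                    FROM mail_queue
                    WHERE status = 'queued'")->fetchAll(PDO::FETCH_ASSOC);

$ids = array();
foreach ($rows as $row) {
    mail($row['to_addr'], $row['subject'], $row['body']);
    $ids[] = (int) $row['id'];
}

// The flag is only set after the whole batch has gone out. If the run takes
// longer than 10 minutes, the next cron tick still sees these rows as
// 'queued' and sends them again.
if ($ids) {
    $db->exec("UPDATE mail_queue SET status = 'sent'
               WHERE id IN (" . implode(',', $ids) . ")");
}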

Thanks!
 
On batches of 1000, I don't think it would matter much if you push the update on a per-email basis (see the sketch below).
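A minimal version of that, reusing the same assumed mail_queue schema from the sketch in the first post, so each row is flagged the moment its message is accepted:

Code:
<?php
// Per-email flagging: one small indexed UPDATE per message instead of one
// big UPDATE at the end of the batch. Schema names are assumptions.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$rows = $db->query("SELECT id, to_addr, subject, body
                    FROM mail_queue
                    WHERE status = 'queued'")->fetchAll(PDO::FETCH_ASSOC);

$flag = $db->prepare("UPDATE mail_queue SET status = 'sent' WHERE id = ?");

foreach ($rows as $row) {
    if (mail($row['to_addr'], $row['subject'], $row['body'])) {
        // Flag immediately, so a crash or an overlapping run cannot
        // re-send a message that has already gone out.
        $flag->execute(array($row['id']));
    }
}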

Alternatively, you can always fork a process for the update on each iteration; that way it takes place in parallel.
 
I'm thinking that it just takes so long (over 10 minutes) to send the big batches that another batch starts before the previous one finishes and marks the mail as "sent."

Maybe you need to set that flag to "sending" plus the time at the beginning of the run, then set it to "sent" at the end. The reason for the timestamp is so that the batch runs can tell which items are being sent by themselves and which by other runs.
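One way to do that claim step (the column names here are assumptions, not the poster's schema) is to stamp the rows under a per-run token before anything is sent, so a run that starts ten minutes later only picks up rows that are still unclaimed:

Code:
<?php
// Claim a batch under a unique token so this run can tell its own rows apart
// from rows claimed by any other run. batch_token and claimed_at are assumed
// columns added for this sketch.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$token = uniqid('batch_', true);

$claim = $db->prepare("UPDATE mail_queue
                       SET status = 'sending', batch_token = ?, claimed_at = NOW()
                       WHERE status = 'queued'
                       LIMIT 500");
$claim->execute(array($token));

// Only the rows this run just claimed come back, even if another cron tick
// fired in the meantime.
$stmt = $db->prepare("SELECT id, to_addr, subject, body
                      FROM mail_queue
                      WHERE batch_token = ? AND status = 'sending'");
$stmt->execute(array($token));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);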
 
How would you fork a process? I've never gotten that in-depth with MySQL before...
 
Nothing to do with MySQL; the forking happens in the PHP script itself.
Have a look at the process control functions like pcntl_fork().
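A bare-bones pcntl_fork() example (CLI only, and it needs the pcntl extension; this is illustrative, not the poster's code):

Code:
<?php
// Fork a child to do the bookkeeping while the parent keeps sending.
$pid = pcntl_fork();

if ($pid == -1) {
    die("could not fork\n");
} elseif ($pid == 0) {
    // Child process: do the slow work here, e.g. the 'sent' UPDATE.
    // A forked child should open its own DB connection; sharing the
    // parent's handle across the fork will break it.
    exit(0);
} else {
    // Parent process: carries on immediately. The non-blocking waitpid
    // reaps any child that has already finished so it does not linger
    // as a zombie.
    pcntl_waitpid($pid, $status, WNOHANG);
}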

But I suspect that Miros has hit the nail on the head, particularly if you have a bottleneck somewhere. Try reducing the batch size and/or increasing the wait time. If 1000 messages are taking more than 10 minutes to send, then adding 1000 SQL updates is really not going to make much difference overall. Try also adding a timer to the batch to see how long it's really taking.
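The timer can be as simple as a pair of microtime() calls around the batch:

Code:
<?php
// Log how long one cron pass actually takes, to see whether it is
// overrunning the 10-minute window.
$start = microtime(true);

// ... send the batch here ...

error_log(sprintf('mail batch finished in %.1f seconds', microtime(true) - $start));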

Lastly, why not also consider a four-stage process:

Stage 1: get all the emails that need to be sent from the db.

Stage 2: set a timestamp on each of these records and add a status of 'sending'.

Stage 3: send them.

Stage 4: clear the timestamp (or refresh it) and set the status to 'sent'.

Separately, run a cron job that resets the 'sending' status to 'failed' (or similar) if the timestamp is more than 15 minutes old.
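The recovery job from that last step can be a one-statement script on its own cron schedule (again using the assumed status/claimed_at columns from the sketches above):

Code:
<?php
// Separate cron job: anything stuck in 'sending' for more than 15 minutes
// is assumed to have died mid-run and gets flagged for retry or inspection.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$db->exec("UPDATE mail_queue
           SET status = 'failed'
           WHERE status = 'sending'
             AND claimed_at < NOW() - INTERVAL 15 MINUTE");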
 