
Improving email response system? Any ideas?

parkers (Vendor) - Oct 21, 2002
Hi,

platform: Linux / perl


For some time now my web scripts have been set up to flow as follows...

Submit Web Form
  |
Handle Form
  If OK:
    Populate Database
    For each individual selected (determined by the web form; could be > 10 individuals):
      Send email
  Display Success Message once all emails have been sent

When our network is running normally this flow works fine and is reasonably fast. Lately, however, the email stage has been taking longer and longer to complete, and users sometimes move to another page before all of the emails have been sent.
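For what it's worth, the handler boils down to something like this (a rough sketch only; the form fields, table name and sendmail path are simplified placeholders, not our real code):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use DBI;

my $q = CGI->new;
my @recipients = $q->param('recipient');   # addresses ticked on the form, could be > 10

# populate the database first
my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass') or die $DBI::errstr;
$dbh->do('INSERT INTO requests (name, detail) VALUES (?, ?)',
         undef, scalar $q->param('name'), scalar $q->param('detail'));

# this loop is what holds the page up: every message is sent
# before the success page goes back to the browser
for my $addr (@recipients) {
    open my $mail, '|-', '/usr/sbin/sendmail -t' or die "sendmail: $!";
    print {$mail} "To: $addr\nSubject: Notification\n\nYour request has been logged.\n";
    close $mail;
}

print $q->header, "<p>Success - all emails sent.</p>\n";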

Does anyone know of a more efficient way of doing this, so that the success message is displayed independently of email completion?

Would forking another process which exclusively handles email resolve this problem? (And if so, do you have an example of how to do this, or any good pointers?)

Thanks and any help appreciated.
SP
 
I think setting the email process up to run separately in the background sounds like a good idea.

You could create a separate email script (e.g. mail.pl) and then launch it with the system command:

system("mail.pl &");

The trailing & makes the shell run it in the background.

You could either pass the variables the script needs on the command line, or write them to a temp file that mail.pl can read at its leisure.
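Something along these lines might do it (just a sketch; mail.pl, the temp file layout and the paths are only examples):

use strict;
use warnings;
use File::Temp qw(tempfile);

my @recipients = ('someone@example.com');      # gathered from the form in practice

# 1. dump everything the mailer needs into a temp file
my ($fh, $tmpfile) = tempfile('mailjob_XXXXXX', DIR => '/tmp', UNLINK => 0);
print {$fh} "$_\n" for @recipients;            # one address per line
close $fh;

# 2. kick the mailer off in the background and return straight away;
#    redirecting its output keeps the web server from waiting on the child
system("./mail.pl $tmpfile > /dev/null 2>&1 &");

print "Content-type: text/html\n\n<p>Request accepted - emails are on their way.</p>\n";

mail.pl would then read the file named in $ARGV[0], loop over the addresses at its own pace and unlink the file when it has finished.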

Hope some of this helps.

Sean.
 
We use an 'events system'. It works like this:

A request is made and a record is stuck in the event database. The event database holds things like the event name and time, plus a table structure for the varied data that gets parsed later.

Every 10 minutes the event cron job runs, retrieves all outstanding events and processes each one: it opens the email templates specific to that event, parses in the variables and sends the emails. When an event completes successfully it is deleted or disabled.

This gives us a lot of control over events, how they present and how they operate. It also removes the 'timed run' restriction of the web server and gives the user a speedy response.
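The cron side boils down to something like this (table and column names are invented for the example; the real schema is more involved):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:events_db', 'user', 'pass',
                       { RaiseError => 1 });

# grab every event that has not been handled yet
my $events = $dbh->selectall_arrayref(
    'SELECT id, event_name, recipient FROM events WHERE status = ?',
    { Slice => {} }, 'pending');

for my $ev (@$events) {
    # open the template for this event type, fill in the variables, send
    open my $mail, '|-', '/usr/sbin/sendmail -t' or die "sendmail: $!";
    print {$mail} "To: $ev->{recipient}\n",
                  "Subject: $ev->{event_name}\n\n",
                  "Your request has been processed.\n";
    close $mail;

    # mark the event done so the next run skips it
    $dbh->do('UPDATE events SET status = ? WHERE id = ?', undef, 'done', $ev->{id});
}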

I never do 'forked processes' from a CGI. They are inherently unreliable IMHO and can cause problems, since they essentially become 'rogue agents' in the system. I've seen a forked process from a CGI eat CPU in the past, and I never went that route again.

This is just my opinion, but it has worked well for us over a number of years and gives a centralized 'events architecture' that can be used in a number of ways. We use ours for sending email, connecting to web services, parsing data and updating tables when certain actions happen. It isn't only used for form submissions from users; our own systems use it for delayed messaging to other components.
 