Guest_imported
New member
I need to query a set of around 3,000 records and run around 15 to 30 queries on each record (done by calling subroutines). I'm facing a performance problem when many users access the file at the same time, because the connection is opened and closed many times.
How can I optimize my operation, keeping in mind that the functions/subroutines are in a separate file that is included in my processing file?
Is there a way to track the number of times a connection is opened?
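One common fix for this pattern is to open the connection once, pass it into every subroutine, and close it only after all records are processed, instead of letting each subroutine open its own connection. The sketch below (in Python with SQLite, since the original post does not name a language or database; the table and helper names are hypothetical) also shows a simple counter for tracking how many times a connection is opened:

```python
import sqlite3

OPEN_COUNT = 0  # incremented on every connection open, for diagnostics

def get_connection(db_path=":memory:"):
    """Open a database connection and count each open."""
    global OPEN_COUNT
    OPEN_COUNT += 1
    return sqlite3.connect(db_path)

def lookup_detail(conn, record_id):
    # Hypothetical subroutine from the included file: it REUSES the
    # caller's connection rather than opening its own.
    cur = conn.execute("SELECT value FROM details WHERE id = ?", (record_id,))
    row = cur.fetchone()
    return row[0] if row else None

def process_records(conn, n):
    # One connection serves every record and every subroutine call.
    conn.execute("CREATE TABLE details (id INTEGER PRIMARY KEY, value TEXT)")
    conn.executemany("INSERT INTO details VALUES (?, ?)",
                     [(i, f"v{i}") for i in range(n)])
    return [lookup_detail(conn, i) for i in range(n)]

conn = get_connection()        # opened exactly once
results = process_records(conn, 5)
conn.close()
print(OPEN_COUNT)              # 1, not one open per query
print(results)
```

With 3,000 records and 15 to 30 sub-queries each, this turns tens of thousands of open/close cycles into a single one per request; the counter makes the difference measurable before and after the change.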