I'm writing a program that stores about 30,000 new records in a MySQL database every day. These are call records from a phone system. I'm writing it in Python, but I imagine the programming language is not relevant to my question (correct me if I'm wrong).
While the phone system reports certain data (extension that made the call, date and time of the call, etc.), some other information needs to be calculated (the price of the call) or determined (the destination of the call: city, state, or country) before it can be stored away. To determine the country or domestic destination of a call, the program needs to look up the destination in the MySQL database, based on the number dialed. This destination table has some 130,000 entries all by itself. In other words, about 30,000 times a day, I need to do a search in the destination table that has 130,000 entries, and then store a new record in another table in the database.
What's the best way of doing this, if I want to avoid data corruption in case the computer were to crash, but also want to keep it efficient?
Should I just read the destination table into memory when the program starts up, and do the search from there 30,000 times a day? Or will that be too much even for a server-grade computer, and should I just look the data up from the mysql database 30,000 times a day?
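For scale, the in-memory option could look something like the sketch below. The table and column layout here is made up for illustration; I'm assuming the destination table maps dialing prefixes to destination names, so each lookup is a longest-prefix match against a plain dict. 130,000 short string pairs is only a few megabytes, well within reach of any server.

```python
# Sketch: load the destination table into memory once at startup, then
# resolve each dialed number with a longest-prefix match.
# The (prefix, destination) schema is an assumption for illustration.

def load_destinations(rows):
    """rows: iterable of (prefix, destination) pairs, e.g. fetched once
    with 'SELECT prefix, destination FROM destinations'."""
    return dict(rows)

def lookup_destination(destinations, dialed):
    """Try progressively shorter prefixes of the dialed number."""
    for length in range(len(dialed), 0, -1):
        dest = destinations.get(dialed[:length])
        if dest is not None:
            return dest
    return None

destinations = load_destinations([("1212", "New York, NY"),
                                  ("1", "USA"),
                                  ("44", "United Kingdom")])
print(lookup_destination(destinations, "12125551234"))   # New York, NY
print(lookup_destination(destinations, "442071234567"))  # United Kingdom
```

Each lookup is a handful of dict probes, so 30,000 per day is negligible either way; the real trade-off is whether you want to reload the dict when the destination table changes.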
And about storing 30,000 records per day... Should I just open a MySQL connection at the startup of the program and almost never close it (since the program runs 24/7), or should I open and close the MySQL connection 30,000 times a day? If I leave the connection open all the time, will I lose data in case the computer crashes?
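My understanding is that with any DB-API driver, durability comes from committing each transaction, not from closing the connection, so the keep-one-connection-open pattern would look roughly like this. I'm using the stdlib sqlite3 module here as a stand-in for a MySQL driver such as MySQLdb (the connect call and the `?` vs `%s` parameter style would change, but the connect/execute/commit shape is the same), and the table schema is made up:

```python
import sqlite3

# Open one connection at startup and reuse it for the whole 24/7 run.
# sqlite3 stands in for a MySQL driver; MySQL drivers use %s placeholders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (extension TEXT, dialed TEXT, price REAL)")

def store_call(conn, extension, dialed, price):
    """Insert one call record and commit immediately, so a crash loses
    at most the record currently in flight."""
    cur = conn.cursor()
    cur.execute("INSERT INTO calls (extension, dialed, price) VALUES (?, ?, ?)",
                (extension, dialed, price))
    conn.commit()  # once this returns, the record is durable

store_call(conn, "101", "12125551234", 0.05)
print(conn.execute("SELECT COUNT(*) FROM calls").fetchone()[0])  # 1
```

For a long-lived connection you'd also want to handle the driver's "server has gone away" error by reconnecting, since MySQL drops idle connections after a timeout.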