I'm designing a site for a friend. His company has monitoring equipment which sends a CSV file containing temperature readings once a day to his home computer. Each of these CSV files contains 144 readings (one reading every 10 minutes for 24 hours).
Each reading consists of a number correct to 6 decimal places (e.g. 3.245354).
I need to FTP this data to a remote server and get it into a MySQL database. The data will then be read from the database and displayed as PHP-generated graphs using the GD library.
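To make the ingestion step concrete, here is a rough sketch (in Python rather than PHP, purely for illustration) of parsing one day's file into rows ready for insertion. It assumes a hypothetical layout of one `HH:MM,value` line per reading; the real CSV format may well differ, so the field handling would need adjusting.

```python
import csv
import io

def parse_daily_csv(text, date):
    """Parse one day's CSV of 10-minute readings into (date, hour, minute, value) rows.

    Assumes a hypothetical "HH:MM,value" line layout; adjust the field
    handling to match the real file sent by the monitoring equipment.
    """
    rows = []
    for time_str, value_str in csv.reader(io.StringIO(text)):
        hh, mm = time_str.split(":")
        rows.append((date, int(hh), int(mm), round(float(value_str), 6)))
    return rows

# Example: build a fake day of 144 readings (one every 10 minutes).
sample = "\n".join(f"{h:02d}:{m:02d},3.245354"
                   for h in range(24) for m in range(0, 60, 10))
rows = parse_daily_csv(sample, "2006-05-01")
print(len(rows))  # 144
```

The parsed rows could then be inserted in a single batched INSERT once a day, so the import cost is trivial even on shared hosting.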
I'm trying to work out the best way to construct the database table to store the data. I was considering this:
Code:
CREATE TABLE readings (
    year    INT NOT NULL,
    month   INT NOT NULL,
    day     INT NOT NULL,
    hour    INT NOT NULL,
    minute  INT NOT NULL,
    reading DECIMAL(10,6) NOT NULL,
    PRIMARY KEY (year, month, day, hour, minute)
);
Thereby creating a new row in the table for each reading. The graphs will enable him to view data for a specific hour, day, week, month, or year, so lots of sorting and calculations will be needed before passing the data to the graph-generating scripts.
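For what it's worth, much of that sorting and averaging could be pushed into the database rather than done in the scripts. As an illustrative sketch (Python with an in-memory SQLite stand-in for MySQL, and a hypothetical `readings` table matching the columns above), collapsing one day's 144 readings into 24 hourly averages for a graph might look like:

```python
import sqlite3

# In-memory SQLite stands in for MySQL here; the GROUP BY works the same way.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings (
    year INT, month INT, day INT, hour INT, minute INT,
    reading REAL)""")

# Load one fake day: 144 readings, value = hour + minute/100 for demonstration.
db.executemany(
    "INSERT INTO readings VALUES (2006, 5, 1, ?, ?, ?)",
    [(h, m, h + m / 100) for h in range(24) for m in range(0, 60, 10)],
)

# One averaged value per hour: 24 rows instead of 144 to feed the graph script.
hourly = db.execute(
    """SELECT hour, AVG(reading)
       FROM readings
       WHERE year = 2006 AND month = 5 AND day = 1
       GROUP BY hour ORDER BY hour"""
).fetchall()
print(len(hourly))  # 24
```

The same GROUP BY pattern extends to daily, weekly, monthly, and yearly views, so the graph scripts only ever receive a few dozen pre-aggregated rows.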
Do you think that the table structure above is feasible, considering that a single year's data will make the table over 52,000 rows long (144 readings × 365 days = 52,560 rows)?
Also, since this will be hosted on a shared remote server, would you expect any problems storing, retrieving, and running calculations on that much data?