I have a database that monitors two factory production processes. For each of the two processes it logs a time, a smallint, 10 int, and 6 bit columns in a table called data. A VB app on another workstation does the INSERTs via ADO every 5 seconds while the factory process is running, which creates about 20,000 rows a day. I limit the data to 90 days, so the table holds about 1.5 million rows and roughly 200 MB. Timed stored procedures run every hour to summarize the data and write it to another table for general consumption via the company's intranet.

However, engineers occasionally need the raw data from the data table, selected for a specific time period. This is usually done via Access or Excel and limited to a specific time range. The problem is that these queries are incredibly slow because the data table is not indexed, and they have been known to return 30,000+ records.

A DB consultant once told me NOT to index the table because of the large number of inserts. The theory was that the indexes would constantly be rebuilding, killing the VB app and database performance. Is this a truism? Any suggestions on a better design?
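
For reference, here is roughly the shape of the table and the index I've been wondering about. The column names below are simplified stand-ins, not the real ones, and the index is just the idea I'd like opinions on, not something I've deployed:

[code]
-- Simplified sketch of the logging table (real column names/layout differ):
CREATE TABLE data (
    log_time   datetime NOT NULL,  -- written every 5 seconds by the VB app
    process_id smallint NOT NULL,  -- which of the two processes
    val1       int      NOT NULL,  -- ...the real table has 10 int columns
    flag1      bit      NOT NULL   -- ...and 6 bit columns
);

-- The index I'm considering to speed up the engineers' time-range queries:
CREATE NONCLUSTERED INDEX IX_data_log_time ON data (log_time);
[/code]

Would an index like this really hurt the insert side badly enough to avoid it?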