
Working with large databases 2


th201 · Technical User · Dec 15, 2003
I currently have a database that contains around 11 million records. If there is a power failure, or if FileMaker is shut down manually without closing the files, the large database is always damaged when I try to reopen it. Is there any way for me to avoid this aside from creating backup copies? Also, when a copy is made of the large database, is it normal for it to take 30-40 minutes to back the file up? And indexing fields can take anywhere from an hour to 8 hours depending on the field. The PC it is running on is new, with a pretty good processor and 2 GB of memory. Would switching to FileMaker Server help in any way? Or increasing memory/processor speed? Any advice would be great.
 
If you're running a database of that size, you should really be using FileMaker Server. You can schedule it to back up the database systematically at night.
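Until you move to Server, a scripted copy of the closed file is a stopgap. Here's a minimal sketch in Python, assuming the database file is closed while the copy runs (copying an open file just produces another damaged copy); the paths, file name, and retention count are placeholders, not anything specific to your setup:

# Minimal nightly backup sketch. Only safe if the database file is
# CLOSED in FileMaker while the copy runs.
import shutil
import time
from pathlib import Path

DB_FILE = Path(r"C:\Databases\bigfile.fp5")   # hypothetical path
BACKUP_DIR = Path(r"D:\Backups")              # separate disk if possible
KEEP = 7                                      # keep one week of copies

def backup():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = BACKUP_DIR / f"{DB_FILE.stem}-{stamp}{DB_FILE.suffix}"
    # copy2 preserves timestamps; a multi-GB file can easily take
    # 30-40 minutes to copy, which matches what you are seeing.
    shutil.copy2(DB_FILE, dest)
    # prune old copies, oldest first (timestamped names sort correctly)
    copies = sorted(BACKUP_DIR.glob(f"{DB_FILE.stem}-*{DB_FILE.suffix}"))
    for old in copies[:-KEEP]:
        old.unlink()

if __name__ == "__main__":
    backup()

Note that the copy time is bounded by how fast the disk can read and write the file, so 30-40 minutes for a file that size is plausible no matter what tool does the copying.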

Check your fields and make sure the ones that are indexed really need to be indexed; but yes, indexing will take that long if the fields store a lot of data.
 
I agree with Cryogen. All good suggestions. You should also consider a UPS on the database hosting machine. 11 million records would be a lot to lose.

-Striker
 
Thanks for the advice. What do you mean by "UPS"?
 
Uninterruptible Power Supply. Basically, a big battery that automatically takes over if there is a power outage. It plugs into the wall and your server plugs into it.

American Power Conversion Corporation is one of the leading manufacturers.
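Most UPS vendors also bundle monitoring software (APC's PowerChute, for instance) that can run a command when the machine switches to battery power; that is your chance to close the files cleanly before the battery drains. A minimal sketch, assuming FileMaker Server and its fmsadmin command-line tool (shipped with later Server versions, not the one current when this thread started); the credentials are placeholders:

# On-battery handler sketch. Register this script with your UPS
# monitoring software so it runs when the machine goes onto battery.
# Assumes a later FileMaker Server with fmsadmin on the PATH.
import subprocess

FMSADMIN = "fmsadmin"                 # FileMaker Server admin CLI
USER, PASSWORD = "admin", "secret"    # placeholder credentials

def close_databases():
    # CLOSE flushes and closes all hosted files so they are not
    # damaged when the battery finally gives out; -y skips the prompt.
    subprocess.run(
        [FMSADMIN, "close", "-y", "-u", USER, "-p", PASSWORD],
        check=True,
    )

if __name__ == "__main__":
    close_databases()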

-Striker
 