
Large number of records crashes

Status: Not open for further replies.

baggins2000 · Technical User · Dec 23, 2000
I have an app running on MySQL, and I keep getting an error whenever I get over 12 million records. This has happened roughly three times in a row, and I've never gotten past 15 million records. Is there some kind of limit in MySQL that I should be looking for, or is the size of the table becoming a problem? The primary key is an auto-increment BIGINT. When I try to do a SELECT, I get:
ERROR 1030: Got error 127 from table handler

When I run myisamchk --safe-recover, I get:
Found block that point outside data file at 2147483644

When I try to run an INSERT, I'm told I have a duplicate entry '13567721' for key 1.

The last time this happened, I was able to get the table working again by running myisamchk --safe-recover, but it did the same thing after about 100k more records. I finally deleted all the records and started over. The .MYD file is currently 2.1 GB.
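That offset from myisamchk (2147483644, just shy of 2^31) sits right at the 2 GB mark, so I'm wondering if I'm hitting MyISAM's maximum data-file size for this table. If that's the cause, something like the following might show and raise the limit (the table name `mytable` and the sizing values are placeholders for my actual table):

```sql
-- Max_data_length in the output shows the table's current data-file ceiling.
SHOW TABLE STATUS LIKE 'mytable';

-- Rebuild the table with a larger row pointer; MySQL sizes the pointer
-- from the product MAX_ROWS * AVG_ROW_LENGTH.
ALTER TABLE mytable MAX_ROWS = 1000000000 AVG_ROW_LENGTH = 200;
```

Can anyone confirm whether that's the right way to get past the 2 GB boundary, or whether it's an OS file-size limit instead?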

Any ideas?


 
