Has anyone had problems after running Table Repair when it finds ErrorCode 53 Level 2 - BTree mismatch? If I rebuild, it corrupts the tables. Is there any connection with:
1. Size of table - 128MB
2. Password protected
3. Large index file
I used the default settings when installing both versions (16-bit and 32-bit) of Paradox; I believe Local Share = False.
If your data are on the local drive, run ScanDisk and see if you have bad sectors on your hard drive.
This is a wild guess, but you could have bad pointers in your table.
I don't think size is a factor if you are running 32-bit Paradox.
I have this problem on two computers, so it isn't the hard drives.
When you refer to bad pointers, are you referring to the indexes? I have suspected for a while that they have been behaving badly.
If you are referring to the indexes, do you have any suggestions? I am really worried about this problem: as the database gets larger, the problem will only increase, to the point that I fear I won't have a readable table.
I used Windows Explorer to copy from the original table (which has the problem) to the other computer (which obviously also has the problem).
Regarding the version, I use both Paradox 7.0 and Paradox 8.0. For general querying of the database I prefer 7.0, but my other computer is only licensed for Paradox 8.0. Are you suggesting there is a relationship between the problem and the version? If so, I have both versions on the same computer (the main one). Is this where the problem is?
This is what I would do:
Use only one version of Pdox. If your Pdox 7 is 16-bit, use Pdox 8.
I ran into a problem with 16-bit Pdox 7 early this year with a huge table. It just could not handle it.
Create the same table from scratch. Do not borrow anything from the existing table.
Create only the table; leave all the keys and indexes for later.
Insert all records from the existing table into the new table.
Make sure you have all the records.
Now create your keys, indexes, etc.
From now on, sticking with one version of Pdox will avoid a lot of heartache later on.
When you copy a table, use Paradox to do the copy.
Use Explorer with care; make sure you copy all family objects associated with the table.
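The rebuild procedure above can be sketched in outline. Paradox itself is driven interactively rather than scripted here, so as an illustration only, this shows the same create, copy, verify, then index sequence using Python's sqlite3 as a stand-in; the table and field names are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Stand-in for the existing (possibly corrupt) table, with sample rows.
con.execute("CREATE TABLE old_invoices (inv_no INTEGER, customer TEXT)")
con.executemany("INSERT INTO old_invoices VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex"), (3, "Initech")])

# Step 1: create the new table from scratch -- no keys or indexes yet.
con.execute("CREATE TABLE new_invoices (inv_no INTEGER, customer TEXT)")

# Step 2: insert all records from the existing table into the new one.
con.execute("INSERT INTO new_invoices SELECT * FROM old_invoices")

# Step 3: make sure you have all the records before going any further.
old_count = con.execute("SELECT COUNT(*) FROM old_invoices").fetchone()[0]
new_count = con.execute("SELECT COUNT(*) FROM new_invoices").fetchone()[0]
assert old_count == new_count, "record counts differ -- stop and investigate"

# Step 4: only now create the keys and indexes.
con.execute("CREATE UNIQUE INDEX idx_inv_no ON new_invoices (inv_no)")
print(f"copied {new_count} records, index built")
```

The point of deferring the index is that a clean copy of the data exists and is verified before any index structure, the usual corruption suspect here, is created on top of it.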
The version of Paradox 7.0 is 32-bit because we have another app that uses the same BDE.
I did think of one thing that might be worth mentioning: the tables are still in Paradox 5.0 structure. I have tried to upsize them to Paradox 7.0 structure, but one of the other tables, which isn't big at all, corrupts and produces ASCII codes instead of the original data.
I will give your suggestion a try.
One quick question: if I could successfully upsize all the tables to Paradox 7.0, including the troublesome one, would receiving export data from other sites that haven't upsized to the Paradox 7.0 structure cause problems?
I am troubled by your having problems when converting a small Pdox 5 table to Pdox 7. I have done this all the time and have never run into problems like yours. I think you have some problems with your table structure and/or data.
What is the export data format?
A text file? A Paradox table?
I would use only one version of Paradox and make all tables the same format: Pdox 5, 7, 8, or 9.
Copy table using Pdox instead of Explorer.
Create the problem tables from scratch and take care to make sure you can see all the data in the new table.
Table Repair is not a fix-all program; there are table corruptions that it cannot detect or fix.
If none of the above solves your problem, you might consider hiring someone to help you.
Let me help you.
If you want to repair the table, open Windows Explorer and find your table, for example Customer.db. Find Customer.px, delete it, and restructure the table again. You can try it.
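A word of caution with the delete-the-.px approach: a Paradox table travels with a family of files (.db data, .px primary index, and possibly .mb memo blobs, .val validity checks, and secondary index files), and only the index is safe to regenerate. As a rough illustration, with a hypothetical Customer table in a throwaway directory, here is a small Python sketch that lists the family and removes just the primary index:

```python
from pathlib import Path
import tempfile

def drop_primary_index(table_dir: Path, table_name: str) -> list:
    """List the table's family objects, then delete only the .PX index."""
    family = sorted(p.name for p in table_dir.glob(f"{table_name}.*"))
    px = table_dir / f"{table_name}.PX"
    if px.exists():
        px.unlink()   # Paradox rebuilds the primary index on restructure
    return family

# Demo with a throwaway directory standing in for the data folder.
with tempfile.TemporaryDirectory() as d:
    folder = Path(d)
    for ext in (".DB", ".PX", ".MB"):          # a minimal family
        (folder / f"Customer{ext}").touch()
    family = drop_primary_index(folder, "Customer")
    print("family was:", family)
    print(".PX still present?", (folder / "Customer.PX").exists())
```

Listing the family first is the same discipline mentioned earlier for Explorer copies: if any sibling file is orphaned or missing, the table misbehaves.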
If only it were that simple. I have tried that several times; it is now part of my regular regimen.
Once I reach a dataset of 50,000 records or more I am unable to query the database without getting irregular, inconsistent query results. This has forced me to partition each financial year into small tables.
The structure of the table(s) isn't good! There is a referenced memo field right in the middle of the field list, which is prone to corruption.
It is necessary to join up to four tables, each also containing more than 50,000 records, and this only increases the probability of inconsistent query results.
Maybe you should consider re-designing the tables.
You probably will have to do it sooner or later.
I know it's going to be a pain, but the longer you wait, the bigger the tables, the higher the pain level.
For whatever it's worth, I am in the same situation with one of my apps. Yup, starting from scratch. Oh, fun!
I think we have discussed this matter before, and I did suggest to management that we should get someone to help resolve this problem. The dilemma is that, as you said, the longer I wait, the worse the problem will be; but if management lets me re-design now, it will force the redevelopment issue, and that cannot happen for at least two years.
One bright person suggested breaking up the tables and then, when I needed to query them, querying them back together again. But that doesn't resolve the underlying issues, does it?
Anyone got a magic wand out there?
I am still working through recreating each table at a smaller size, but as I said in my previous post, I eventually have to put the large tables together for a particularly big query.
If you have any suggestions, I am open to ideas, so long as they don't involve two things - redesign or money.
I have Paradox handling some pretty big tables without much bother (the odd index becomes corrupted). It might be worthwhile to make sure you have the latest BDE - check the datestamp on your idapi32.dll file; it should be 12th Nov 1999, version 5.11.
HTH
This relates to Paradox for DOS (and possibly also applies to v5.0 tables in Windows). There was a maximum table size limit of 128MB. One way of testing this theory is to create a simple table (no index) and give it a few A255 fields for a long record length. Repeatedly add it to itself and see what happens when it reaches 128MB. If it hits the same problems, it looks like a size limitation rather than something wrong with your system.
This limit was reached by a client of mine a few years ago, and the only solution was to split the data between tables. I have also worked for a client where we almost reached this limit; in that case the constraint was the amount of free space on the file server when creating temporary files during queries.
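The doubling test above can be sketched in miniature. This is only a simulation of the growth pattern, not real Paradox file handling: it doubles a file of fixed-length records (255 bytes each, mimicking a single A255 field) until it crosses a cap, scaled down here from 128MB so it runs instantly.

```python
import os
import tempfile

RECORD_LEN = 255          # mimics a single A255 field
CAP = 64 * 1024           # demo cap; the real Paradox limit was 128MB

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "bigtable.dat")
    with open(path, "wb") as f:
        f.write(b"x" * RECORD_LEN)            # seed with one record

    # Repeatedly "add the table to itself" until it reaches the cap.
    while os.path.getsize(path) < CAP:
        with open(path, "rb") as f:
            data = f.read()
        with open(path, "ab") as f:
            f.write(data)                     # size doubles each pass

    size = os.path.getsize(path)
    records = size // RECORD_LEN
    print(f"stopped at {size} bytes ({records} records)")
```

Because the size doubles each pass, only about twenty passes separate one record from the 128MB ceiling, which is why the real test reaches the limit so quickly.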
No redesign and no money. Talk about tying your hands behind your back.
I don't know whether replacing idapi32.dll would work. However, I am using Pdox 7 (16-bit), Pdox 8, and Pdox 9.
There are queries I cannot run in Pdox 7 that will run in Pdox 8 or 9.
Have you tried to run the same queries on Pdox 8?
I have been running queries on tables with millions of records, in Pdox 8 and 9 without any problems.
Tell you what: send me a copy of your tables with, say, 20 records. You can use dummy data if the data are sensitive, and I will take a look at them for you. Make sure you include all related files, such as *.px, *.val, etc.; WinZip or PKZip format would be nice. I don't think we are going anywhere here; we could hypothesize until we all turn blue.