Tek-Tips is the largest IT community on the Internet today!


PACK table error 1

Status
Not open for further replies.

ezpzjohn

Programmer
Jun 29, 2001
50
GB
Over the last few weeks I have started receiving reports of errors from users of our software.

I normally write code that follows the deletion of a record from a table with a PACK command to permanently remove the deleted record. Some users are now reporting that following deletion of a record they get an error, from the program code that follows the PACK command and re-fetches the updated table, saying that the table does not exist.

On reading the VFP help file I understand that a pack is carried out by first copying all the records in the table that are not marked for deletion to a new, temporary table. The original table is then deleted and the temporary table renamed with the original table name. It seems to me that the second step is failing and the temporary table is not being renamed or the table link to the database is not being recreated.

Has anyone else encountered this problem?

Could this be as a result of interference from anti-virus software as regards the file renaming?

John B
 
First, why are you going through all that trouble of PACKing the table instead of just using SET DELETED ON? Using PACK is really not necessary. If for some reason you need to reclaim space, you can always recycle records using BLANK, DELETE, GATHER, REPLACE, RECALL, ...

To answer your question, though: it could be the AV thing. A simple test would be to temporarily disable it.
Another option is maybe the rights aren't set up properly. Maybe they only have 'Write' access and not 'Full Control' access to the folder where the table is.
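To illustrate the SET DELETED suggestion, here is a minimal sketch (the table and field names are just placeholders):

Code:
* With SET DELETED ON there is no need to PACK after every deletion:
* records marked with DELETE simply stop showing up.
Set Deleted On
Use customers Shared
Delete For custid = "ABC123"   && mark for deletion; no PACK required
Count To lnLive                && deleted records are not counted

The deleted records still take up space in the DBF, but they are invisible to queries and can be removed later in a single administrative PACK.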


-Dave Summers-
[cheers]
Even more Fox stuff at:
 
Hi John,

First point: it is not usually necessary to perform a PACK every time you delete one or more records. In most applications, PACK is only run on demand, as an administrative procedure. In other words, it's not part of the normal day-to-day operation. In a typical situation, it does no harm to let deleted records accumulate for a while.

However, let's accept that you have a reason for doing it the way you do.

Your understanding of the procedure is correct. VFP copies the table, deletes the original, and renames the copy. If that is failing after the first or second step, it could be because the original table cannot be deleted for some reason (it might be marked as read-only, for example); or the new table cannot be renamed. Can you establish which of those two cases applies?

You mention anti-virus software. I've never heard of AV interfering with a PACK, but I suppose it's always possible. Can you experiment by disabling the AV while the procedure is running?

If you can provide any more information about what exactly is failing (the deletion, the renaming, or something else) that would be helpful.

Mike

__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro articles, tips and downloads
 
>I've never heard of AV interfering with a PACK

There are at least two known cases over at Experts-Exchange. The file operations PACK performs obviously look suspicious, like a virus infecting a file. It helps to add the dbf/fpt/cdx file extensions, or the data folders, to an exception list in the AV software.

Mike has other valid points. E.g. you can't PACK a file that is made non-deletable by the read-only attribute, or a file that is in shared use. That's another reason to only PACK as an administrative task during off hours.

One problem you might have is with indexes only allowing unique values. You can set the index filtered for NOT DELETED() to solve that problem.
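For an existing table, such a filtered index could be created like this (a sketch; the field and tag names are placeholders):

Code:
* A candidate index filtered on NOT DELETED() enforces uniqueness
* among live records only, so a deleted record's key value can be reused.
Use mytable Exclusive
Index On keyfield Tag keyfield For Not Deleted() Candidate
Use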

Or, as Dave mentioned, work with recycling records: before inserting data, first check for a deleted record. If you find one, RECALL it. Afterwards, BLANK DEFAULT AUTOINC will set default values and also update autoinc fields, as if an INSERT statement had been done. You can then GATHER new values into all the fields you want to insert into, or execute a REPLACE or UPDATE. The problem with the unique index can also be solved this way: first seek a deleted record with the unique value you want to (re)use, and RECALL it if you find one; there will be at most one such deleted record. If you don't find that, you can RECALL any other deleted record, and if you don't find any deleted record at all, you insert a new one. It sounds more complicated and slower than it is: it's mainly at most two index-optimized seeks or locates, and you only work on one record at a time anyway. Lots of code, but it'll be fast.
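That recycling sequence might be sketched roughly like this (untested; the alias, field and variable names are placeholders):

Code:
* Reuse a deleted record instead of appending a new one (sketch).
* lcKey holds the key value we want to (re)use.
Set Deleted Off          && so LOCATE can see deleted records
Select mytable
Locate For Deleted() And keyfield == lcKey   && same key deleted before?
If !Found()
   Locate For Deleted()                      && else any deleted record
Endif
If Found()
   Recall
   Blank Default Autoinc  && reset fields to defaults, update autoinc
   Replace keyfield With lcKey, info With lcInfo
Else
   Insert Into mytable (keyfield, info) Values (lcKey, lcInfo)
Endif
Set Deleted On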

Bye, Olaf.
 
Thanks Dave, Mike and Olaf.

Many thanks for your advice and suggestions.

I had already come to realise, over the last few weeks, that a regular PACK was not best practice, and that Dave's and Olaf's approach when adding a record - using SET DELETED OFF, searching for the primary key value if applicable, and RECALLing the record if it's deleted - is a much better use of resources.

Problem is I've used PACK for years and it has worked fine for years. Why should it stop working over the past few weeks?

I have never given much thought to improving my code until now; I am already in the process of rewriting all the offending code so that I can issue updates to clients.

I would still like to know what the issue with the PACK command is as I am getting reports of this issue from different clients almost daily now. The frustrating thing is that I have been unable to reproduce the error myself as it works fine on a couple of machines that I have tested the compiled program on. The default location for the database and tables is in a sub-folder of Users\Public so access should not be an issue. The other problem I have is that this is an off-the-shelf application that I produced and have sold several thousand copies of and a lot of the customers for the program are in their 60s, 70s and even 80+ years old. Many of them are okay using the program but are not computer literate enough to search through files and folders or even to disable AV software. When I find one that is computer literate I may be able to find out more and will post an update here.

Thanks again for the advice guys.

John B
 
John,

Clearly, you need to find what it is that's different between the systems that give the error and those that don't. Given that it has worked for many years without a problem, and has only recently started to go wrong, that would suggest an upgrade to a recent version of Windows might be the culprit. But I can't think why that should be the case.

Regarding the recycling of records: I agree with Dave and Olaf that this could be a good approach. But I'd advise against doing it in the short term - and certainly not as a workaround for your PACK problem. It would require a fair amount of code changes, with all the testing and debugging that involves. On the other hand, if you simply removed the PACK, everything would work as before, at the cost of having to keep the deleted records hanging around for a while. But that cost is likely to be very low, unless your users are deleting data at a vast rate.

Anyhow, I hope you are able to get some more information. With luck, we'll be able to get it sorted.

Mike



__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro articles, tips and downloads
 
Dave is right. Also, a virus definition update of the AV software could be the cause, especially with Microsoft Security Essentials and/or Windows Defender, the default virus scanners from MS.

And I agree with Mike: the fastest change to your code would simply be to skip any PACK. Instead, you could put in a routine that PACKs the data at app start or exit, or both. Consider the time PACK needs as the data grows larger. You could also PACK at idle times.

Most important: do the reports come from users running a new version? In other words, did something change in your app? Or do you get reports from users who haven't upgraded for a long time?

Bye, Olaf.
 
I have experienced this very problem.

Sometimes it is convenient to pack a locally held temporary file that has been opened exclusively during the processing of
a large number of records to produce a smaller recordset that can then be used very quickly indeed.

I have found a few, what seems to me to be machine specific, instances where a pack leaves the table unopened for a period
and have put this down to an antivirus/antimalware issue or similar.

There is a simple solution - write your own 'pack' function that copies the file out (excluding deleted records), zaps the
original and appends the copied data back in. Where I have employed this approach to avoid the error described (which is
quite repeatable - once you have a PC that exhibits the symptoms) it completely negates the issue for me.
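That home-made 'pack' might look something like this (a sketch with placeholder names; it assumes the table is already open exclusively and SET SAFETY is OFF so ZAP doesn't prompt):

Code:
* Emulate PACK without VFP's delete-and-rename of the DBF itself:
* copy the live records out, ZAP the original, append them back.
Procedure MyPack
   Lparameters tcAlias
   Local lcTemp
   lcTemp = Addbs(Sys(2023)) + Sys(2015) + ".dbf"  && unique temp file
   Select (tcAlias)
   Set Deleted On
   Copy To (lcTemp)      && deleted records are excluded
   Zap                   && empty the original; requires EXCLUSIVE use
   Append From (lcTemp)
   Erase (lcTemp)        && if the table has a memo, erase the .fpt too
Endproc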



Regards

Griff
Keep [Smile]ing

There are 10 kinds of people in the world, those who understand binary and those who don't.

I'm trying to cut down on the use of shrieks (exclamation marks), I'm told they are not good for you.
 
> There is a simple solution - write your own 'pack' function

That's an excellent idea.

However, it doesn't negate the original point: Don't pack the table on the fly every time you delete a few records. Whether you write your own, or use the built-in command, you should normally only do the packing as an administrative function, outside the main work flow. (Griff, I accept that, in your case, you had a good reason for packing on the fly, but we haven't established that's the case for John.)

Mike



__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro articles, tips and downloads
 
Mike, you might be right:

> or the table link to the database is not being recreated

This might imply that this is a database rather than a free table.

I agree about the admin issue, but I was just suggesting a solution.

The thing is, VFP can be too fast for its own good in these days of every other bit of s/w wanting to look over our shoulders at our disk work, seeking malicious activities.

Regards

Griff
Keep [Smile]ing

There are 10 kinds of people in the world, those who understand binary and those who don't.

I'm trying to cut down on the use of shrieks (exclamation marks), I'm told they are not good for you.
 
We used to see errors like this all the time on Novell networks when the original owner of the file was deleted as a user on the Network. Re-creating the file (which VFP does) using a different user account would fail.

Something similar could be happening here since the advent of increased Windows security.
 
Hi Guys

I had a call from a user experiencing this error yesterday. He is still using an earlier program of ours that was superseded three years ago. That program hasn't been updated by us for three years and he confirmed he has been able to delete records, which would include the PACK command, previously without any problem. Yesterday he did a deletion and got the error. I was able to talk him through creating a new database with the software and restoring from a previous backup file and he is up and running again.

He is an elderly gentleman, not very proficient with computers, but was able to tell me his son had recently changed the anti-virus software on the computer from McAfee to something called SUPERAntiSpyware. I'm not suggesting this was the cause, just collecting information.

Mike, the only problem with just taking out the PACK command is that it will then cause some problems with creating new records. Some of the tables allow users to add records using their own made-up codes - a client account code might be "SMITHJOHN", for example - and these are primary key values on the table, so adding a record with the same code as a deleted record will cause a problem. Again, I know now that this is not best practice! I am self-taught, but that's no excuse - you would think that after using FoxPro since the days when it was FoxBASE+ I would have learnt my lesson by now!

I am just going to have to bite the bullet and work through all the addition and deletion routines, unit-testing as I go.

I am considering adding a "Pack tables" command on a Utilities menu, which would travel through all the tables in the database looking for deleted records and packing any table where they are found. But that might still cause the same issue; and considering that there are likely to be few deleted records anyway if I employ a recycle-deleted-records approach as suggested by Olaf, I don't know whether it would be worth the possible trouble it could cause.

Griff, yes these are all tables belonging to a database. I can't reproduce the error to confirm what's happening but I believe the table reference in the database is left intact as INDBC() for the table returns true but then the SELECT statement on the table returns an error such as "Error #1, File: C:\...\trandata.dbf does not exist", where "trandata.dbf" is the table on disk that has just been packed.

John B
 
I think your explanation matches what I was saying exactly.

I would definitely write a simple function to do the pack the way I suggested - perhaps even releasing the underlying table, running a loop to check for its existence, and then opening it again.

That might not be pretty, but it would get you running while you looked at the other approach.
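Such a release-check-reopen loop might be sketched like this (the file name comes from John's error message; the retry count and timing are guesses):

Code:
* After PACK, wait for the file to reappear before reopening it,
* in case a scanner is still holding or restoring it.
Local lnTry
Use In Select("trandata")            && release the table if open
lnTry = 0
Do While !File("trandata.dbf") And lnTry < 50
   lnTry = lnTry + 1
   = Inkey(0.1)                      && pause roughly 100 ms
Enddo
If File("trandata.dbf")
   Use trandata Shared In 0
Else
   Messagebox("trandata.dbf still not available", 16, "Pack error")
Endif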



Regards

Griff
Keep [Smile]ing

There are 10 kinds of people in the world, those who understand binary and those who don't.

I'm trying to cut down on the use of shrieks (exclamation marks), I'm told they are not good for you.
 
>these are primary key values on the table, so adding a record with the same code as a deleted record will cause a problem.

that is the reason I suspected earlier:

OlafDoschke said:
One problem you might have is with indexes only allowing unique values. You can set the index filtered for NOT DELETED() to solve that problem.
You can also make a primary key index filtered. In the table designer simply add an expression to the filter column: NOT DELETED()

That solves the uniqueness in the index, as deleting a record removes it from the index, and so the value for the primary key can be reused.

Another thing: surrogate keys wouldn't have that problem at all. Surrogate keys are primary keys the user won't specify or see at all, e.g. an autoincrementing number. It doesn't matter that you can't reuse such a key value, as the values have no importance of their own.

That's halfway a religion; indeed, using an autoinc does not solve the problem of needing unique values in the rest of the table. For example, a list of product types shouldn't have a duplicate name in it. For these cases you would add a candidate key and set it filtered for NOT DELETED().

So technically, in both scenarios you'd add such a filtered index to be able to delete without PACKing. The recycle solution is cleaner, but such an index is much easier to implement, as you won't need to alter any of the inserts or appends to a table.

One downside of filtered indexes is that they are not used by Rushmore, but you can add a simple unfiltered index for that purpose. Let's see what SYS(3054) reports when we use a filtered index in VFP9:

Code:
Set Deleted On

Cd D:\temp\ && somewhere we can write and delete later
cleanup()

Create Database filtertest.dbc
Create Table filtertest.dbf (cid C(10) NOT NULL, primary key cid FOR NOT Deleted() TAG cid)
Set Order To

*first of all: test reusing the same key:
Insert into filtertest values ("test")
Delete
Insert into filtertest values ("test")
* now what does rushmore think it can do

Sys(3054,1)
Select * from filtertest where cid="test" into cursor curfiltertest
cleanup()
Sys(3054,0)

Procedure cleanup()
   Close Tables All
   Close Database

If you comment out or remove the last cleanup() call, you can see the filtertest DBF contains the "test" key twice. Rushmore also says it can indeed fully optimize the query with the index cid, even though it's filtered.

Conclusion: introducing a filter on your primary key solves the problem of duplicate values among deleted records, as they are no longer taken into account. With such indexes in place you can remove the PACKs from your application, unless you want a maintenance routine that makes room for further data and optimizes performance a bit. That only matters for tables with large numbers of records, anyway.

Bye, Olaf.
 
One line is missing to make cleanup() really clean up: ERASE filtertest.*
It'll work anyway in the first run.

Bye, Olaf.
 
Thanks Griff.

I see your solution emulates the PACK command but means that there is no new file creation and renaming to worry about. I think this would be a good solution and, thinking about it, it is basically what I do when I restore from a backup file. I will certainly look at this as an alternative to PACK in a utilities program to remove deletions from all the tables in the database.

Thanks Olaf.

Your comments regarding surrogate keys are correct, and I use this approach in about half the tables, where it is not important for the user to have control over the record identifier.

A good example of where the user needs control over selecting the primary key value for a new record is the following. The software I write is for accounting purposes and uses a nominal ledger represented by a table "NOMCODES" with a C(4) NCODE field and a C(50) NACCOUNT field holding the account name (there are many other fields in this table, but we are not concerned with them here). For example, a bank account might have NCODE = "1200" and NACCOUNT = "Barclays current account". To select a nominal account, from a dropdown say, the user selects based on the NCODE field and, after a while, remembers these codes much more easily than what the account is called. The values in the NCODE field must be unique, so I chose to use it as the primary key; it is also a foreign key on several other tables, such as the financial transactions table.

Thank you so much for your very useful example on the use of a filter on a primary key; I wasn't aware of this and it could certainly be the answer in the short term.

John B
 
>The values in the NCODE field must be unique so I chose to use it as the primary key.

That's what I often hear as reasoning, but the fact that a field must be unique is not, by itself, a reason to use it as the primary key. You can use it as a secondary or candidate key; in FoxPro that means a candidate index. Think about my example again - not a code, but the name of a product type, e.g. food or clothing. As primary keys need to be used as foreign keys too, you actually could use the product type name as a foreign key, but then you have not normalized your database, and you have a table of product types that is obsolete if it only contains the product type name. Aside from that, if there is a typo, it is copied everywhere the key is used as a foreign key. The beauty of surrogate keys is that they don't carry any meaning. That doesn't mean you can't have an NCODE with an easier-to-memorize code for an account; but that's secondary.

No question, the same problem will exist whether NCODE is a primary key or a secondary/candidate key field. But if you use NCODE as a foreign key, you will need to take double care, when reusing an NCODE, not to reconnect to child records of the earlier-deleted main account record. You don't have that problem with surrogate keys, as you never need to reuse them.

I already said it and will repeat it: the usage of surrogate primary keys does not remove the need for uniqueness of other fields and values, so the technical problems of deleted records within indexes are the same whether you use surrogate or natural primary keys, and the solution to that is a filtered index. Whether or not you restructure your database with surrogate keys, you can solve the uniqueness problem without PACK by using a filtered index, allowing you to reuse a primary or secondary key value.

Bye, Olaf.
 
Thanks for the detailed explanation of keys Olaf.

I understand that I would still need to check for uniqueness of the NCODE value if I used this field as a candidate key and had a separate computer-generated value as the primary key field on this table, so I don't see any real benefit in restructuring this table.

I am currently working through all the program code, removing the PACK commands wherever they are and applying a recycling deleted records technique when adding new records. It will take a while but I think it is the right thing to do to avoid problems in the future.

Thanks again for all your help.

John B.

 
Candidate vs primary keys aside, you still haven't understood the nature of FILTERED indexes. You don't need to implement the harder-to-do recycling strategy if you change to filtered indexes. You will still need to remove your PACKs, but you don't need to add anything instead. You won't need to check for duplicate values among deleted records; you only need to care about the undeleted records, as you have to anyway.

Bye, Olaf.
 