How to delete tables to avoid using ZAP (preferably with a hardcoded system path)


A_Radiolog_inFoxPro

Dear All,
Merry Christmas and happy holidays to all :D

I have one last question following on from thread184-1828035, regarding tables. I managed to get out of the mess I got myself into, but:

At the moment I have two temporary tables that I create at the beginning of a routine and whose contents I want to discard after the routine has run.

I want to delete said tables, but after I run the routine a second time I can't delete the files (at the low level, because the file is already in use).

For the time being, on my test setup, I am using ZAP after selecting the table.

As I imagine, ZAP is dangerous by itself, since it can wipe everything, and SET SAFETY is off as well.

So I would like to know if there is a way to delete the tables and their associated files via a hardcoded path, to make sure they are always clear of data.

Thank you in advance,
and happy holidays and a Merry Xmas to all.

The Doc.
 
From your description, I suggest that instead of using tables, you use cursors.
 
Thanks for making this a new thread.

I can't see from the previous thread what you finally ended up using.

But there are two things that come to mind when you want a table that only exists temporarily:

You can use a cursor for that: it only exists until it is closed, and normally there won't even be a file that you need to erase.
A cursor can be created with CREATE CURSOR just like CREATE TABLE, so you just swap the keyword TABLE for CURSOR and you're already done. You don't specify a name that becomes a file name; the name is just the alias name of the workarea in which the cursor exists.
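
If it helps, here is a minimal sketch of that swap; the alias and field names are invented for illustration:
Code:
* The same field list works for CREATE TABLE and CREATE CURSOR.
CREATE CURSOR crsWork (iPatientId I, dExamDate D, mFindings M)
INSERT INTO crsWork VALUES (1, DATE(), "first test row")
* No file to clean up afterwards: closing the cursor discards it.
USE IN crsWork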

One downside of that is that the cursor really only ever exists in one datasession and one workarea. You can trick VFP into using it twice with USE DBF('cursorname') AGAIN ALIAS anothername, but that's surely not the way it is intended, and even then you can never bring it into a workarea of another datasession, for example the private datasession of a secondary form.
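
For completeness, the trick mentioned above might be sketched like this; the names are invented, and whether the cursor's backing file physically exists can depend on its size, so treat it as an illustration of the idea rather than a recommended pattern:
Code:
CREATE CURSOR crsWork (cName C(20))
INSERT INTO crsWork VALUES ("example")
* Open the cursor's backing file a second time under another alias.
USE (DBF("crsWork")) AGAIN ALIAS crsWorkTwin IN 0
SELECT crsWorkTwin
? cName              && shows "example"
USE IN crsWorkTwin
USE IN crsWork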

So DBFs still have their use case for temporary data, too.

Well, in all of this I have already pointed out one helpful function. No matter whether you know the file location or not, if you want to delete a table you only need to know the alias name it currently has: DBF('aliasname') gives you the full file name, and the ERASE command can erase files. And yes, as long as you still have a table in use you can't delete it. That's not only true for VFP DBFs; the Windows OS prevents files that are open in any way, whether in a workarea of a VFP datasession or anywhere else, from being deleted. So before ERASEing a file from the hard drive, you have to close it. When it's the currently selected workarea, a simple USE without any further clauses closes the table, and the ERASE can be done afterwards.

So, in short, to delete the current workarea's DBF file, you could do:
Code:
lcFile = DBF()     && remember the full file name of the current workarea's DBF
USE                && close the table in the current workarea
ERASE (lcFile)     && now the file can be deleted

Notice, though, that this code is just as explosive and dangerous as ZAP can be if the current workarea is not the DBF you want to erase. And something that is not a safety issue but would still prevent this from working: if a DBF is in use by multiple forms or under multiple alias names, the ERASE will fail and not delete the DBF; a file can only be deleted once nothing has it open.
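
A sketch of a slightly more careful variant, which addresses one known alias instead of whatever is currently selected; the alias name here is a placeholder:
Code:
lcAlias = "mytemptable"          && placeholder; adjust to your own temp table
IF USED(lcAlias)
   lcFile = DBF(lcAlias)         && remember the full path first
   USE IN (SELECT(lcAlias))      && close that one workarea
   TRY
      ERASE (lcFile)
      * A memo or index file would need its own ERASE, e.g.
      * ERASE (FORCEEXT(lcFile, "fpt")).
   CATCH
      * Still open under another alias or in another datasession:
      * the file simply stays, exactly as described above.
   ENDTRY
ENDIF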

Unfortunately, therefore, I have to end this on the note that you are still responsible yourself for what you do with which files; there is nothing that fail-safely deletes only what you intend to delete once the context becomes complex enough, and that always happens as code evolves. So one piece of advice for making such deletions without any doubt of erroring is to be very explicit, as you had the idea yourself: use fixed names for things you only want temporarily. For example, always using a prefix "temp" for such tables would mean you could get rid of them, and only them, with

Code:
ERASE temp*.*
Well, and in case you have tables whose names accidentally also start with temp, like a temperature.dbf or a tempolimit.dbf, then, well, bad luck: think again, or use a more distinctive prefix like _temp_ or please_delete_me_.
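
As a sketch of what such a convention could look like in practice (the prefix and field here are invented):
Code:
* Create temp tables with an unmistakable prefix...
lcTable = "_xdel_" + SYS(2015) + ".dbf"   && SYS(2015) makes the rest of the name unique
CREATE TABLE (lcTable) (iValue I)
USE                                       && close it again when done with it
* ...and sweep them all up later in one go:
ERASE _xdel_*.*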

You'll never get 100% safety in anything, but it can be made quite safe if you just make it plainly obvious enough.

Chriss
 
One more simple piece of advice:

The Windows system has a dedicated place for files that are only temporary: the TEMP directory, whose location can be determined with GETENV('TEMP').
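
A minimal sketch, assuming a throwaway DBF is wanted there; the field is invented and the file name is generated rather than hardcoded:
Code:
lcTempDir = ADDBS(GETENV("TEMP"))           && e.g. C:\Users\<name>\AppData\Local\Temp\
lcFile    = lcTempDir + SYS(2015) + ".dbf"  && SYS(2015) provides a unique name part
CREATE TABLE (lcFile) (cData C(30))
USE                 && close it again
ERASE (lcFile)      && and it can be removed without touching anything else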

Windows itself does nothing special with this folder; it just provides it. People may be used to it getting emptied after some time, but that needs an administrator writing a login script that deletes the files. It's obviously easiest to delete things at the start of a Windows session, because once processes start up they will populate the TEMP directory, and Windows will prevent the deletion of files that are open. Other things that empty this directory are applications like CCleaner and similar tools. Windows itself only provides the directory and the environment variable, which can be changed. No more, no less.

Anyway, whatever you do, do it with care. Thinking a task through from start to end also includes thinking about creating and finally erasing files. Where do you know the location of the files when you finally need that information for deleting them? At the place where you create them. So that's where you can first CD into TEMP, or where you can store the full file names somewhere you can look them up later, be it a DBF listing the files that are dispensable. You just need to invent what you need yourself and make a habit of it. If you want to know what to delete at the end, make it a habit not only to create a DBF but also to add its name to a table of dispensable files that you process at some later stage. You could still err by adding a file to that list that you actually need to keep, but that's less likely, isn't it?
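
A sketch of that habit, using an invented bookkeeping table called cleanup.dbf to hold the names of dispensable files:
Code:
* cleanup.dbf is an invented bookkeeping table, one full file name per record.
IF !FILE("cleanup.dbf")
   CREATE TABLE cleanup (cFile C(240))
   USE
ENDIF
* Whenever a temporary DBF is created, note its full name right away:
lcTemp = ADDBS(GETENV("TEMP")) + SYS(2015) + ".dbf"
CREATE TABLE (lcTemp) (iValue I)
USE
INSERT INTO cleanup VALUES (lcTemp)
* At some later stage, work through the list and erase what it names:
IF !USED("cleanup")
   USE cleanup IN 0
ENDIF
SELECT cleanup
SCAN
   lcGone = ALLTRIM(cleanup.cFile)
   IF FILE(lcGone)
      ERASE (lcGone)
   ENDIF
ENDSCAN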

Chriss
 
Hi,

I would try using cursors too. If needed, for whatever reason, you can still use a COPY TO command and save a cursor to a (temporary) file on disk for later use.
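
If that route fits, a sketch of it could look like this; the cursor, fields and file name are invented:
Code:
* Work in a cursor, keep a copy on disk only when needed.
CREATE CURSOR crsResult (cName C(30), nAmount N(10,2))
INSERT INTO crsResult VALUES ("Example", 123.45)
* ... work with crsResult ...
* If the data is needed later, persist it as a (temporary) DBF:
lcSave = ADDBS(GETENV("TEMP")) + "result_backup.dbf"
SELECT crsResult
COPY TO (lcSave)
USE IN crsResult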

Regards, Gerrit
 
Regarding your question about deleting the temp tables: I have also had issues with VFP not releasing the handles to tables even though I issued a USE IN SELECT('alias_name') command to close the table. I have had to shut down VFP completely in order to release the files.

Given this, I also agree with the others who suggested using a cursor. The cursor does not actually reside in memory but is created as a file in the Windows temporary folder. These files are deleted by VFP when the cursor is closed. The file name is generated by VFP and is unique each time a cursor is created. So you should not have any issues using this approach for your scenario.
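
You can ask VFP where a cursor's backing file would live with DBF(); as a caveat, for very small cursors VFP may keep the data in memory, so the name it returns does not always correspond to a file that is physically on disk yet:
Code:
CREATE CURSOR crsDemo (cText C(10))
? DBF("crsDemo")     && typically a generated *.tmp name inside the TEMP folder
? GETENV("TEMP")     && the folder that name normally points into
USE IN crsDemo       && closing the cursor removes any backing file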

Greg
 
Greg,

Yes, FoxPro has always had issues with file handles only actually closing when you unload VFP itself (or the runtime DLL in the case of an EXE). Which adds to all the other reasons a file deletion could fail.

I think that's the case even in the last version, 9.00.0000.7423, and we have to live with it. Well, just as erasing files from the %TEMP% directory of Windows is easiest to do at startup, it is also easiest to delete DBFs created in the previous session of an application at its start.
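
A sketch of such a startup cleanup, assuming the application marks its own temp DBFs with an invented prefix like "_xdel_" and puts them into the TEMP directory:
Code:
* At application start, remove leftovers from the last run.
LOCAL lnCount, lnI, lcDir
LOCAL ARRAY laLeftovers[1]
lcDir   = ADDBS(GETENV("TEMP"))
lnCount = ADIR(laLeftovers, lcDir + "_xdel_*.dbf")
FOR lnI = 1 TO lnCount
   * Nothing has these files open yet this early, so ERASE can succeed.
   ERASE (lcDir + laLeftovers[lnI, 1])
ENDFOR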

And addressed back to you, A_Radiolog_inFoxPro:

The good thing about cursors is that you don't need to take care of the files related to them; that is handled well and without exceptions by VFP. As Greg said, cursor-related files are created in %TEMP%, so VFP complies with what the Windows OS provides for temporary files.

You can't have a conflict of identical file names, as the names are not simply the alias names in the case of cursors. And even if you have multiple sessions, meaning multiple users, running the same EXE on the same computer (a terminal server), each user has his own TEMP folder, so they can't get into conflict. Unless you use your own TEMP folder and don't rely on the Windows OS to provide a sane scenario.

I once had to fix the irony of a previous developer thinking that taking the TEMP directory into his own hands, instead of leaving it to the Windows OS, was superior. The problem appeared when his application was put onto a terminal server: he created temp files with fixed names, so the second user got into conflict. A user session has some separated parts, like its own desktop, its own share of the RAM, its own process, its own DLLs, and so on, but if a program generates c:\temp\xyz.dbf, that name does not differ in a second session of the same application run by another user, so it crashed, until we removed all that code and moved the temp files to what Windows suggests. On a terminal server the %TEMP% environment variable is used as a basis, but the user's unique session ID is added as a subdirectory, so every user has a separate temp directory even in that scenario.

I tell that anecdote because there is a bad idea in your thread title in the term "hardcoded"; the good part is "system directory". Yes, the GETENV('TEMP') directory is something you should rely on, because it is cared for well even in scenarios you may never have thought of, like terminal server sessions. There is only one catch, but that is to be solved by an actual system administrator: to work ideally, the temp directory should be put on a fast drive. In the case of a terminal server that could differ from the C:\ drive, where it usually resides; you may want to use a dedicated, faster drive for temp files and configure it that way. Nevertheless, it's not the job of the programmer to think about that; that's a decision to be made by a system administrator who knows very well what the consequences of the TEMP environment variable configuration are.

If you reconfigure %TEMP% to, say, simply D:\TEMP\, then in the case of terminal server sessions of users with IDs 3 and 5, for example, GETENV('TEMP') still does not give you D:\TEMP as configured, but D:\TEMP\3\ and D:\TEMP\5\. That's all automatic, because when a user session is created at login to a terminal server, the base %TEMP% value is adjusted for each user.

So, all in all, GETENV('TEMP') is the only thing you need to know as a programmer about the decision of where to put temp files. You could have a good reason to want the scenario I described, with a better drive perhaps dedicated only to temp files. But then, in a company with people responsible for server hardware architecture and configuration, you had better involve them and not take that route without informing them.

VFP miseducated us in that respect, as it has its own set of configuration paths you can specify in a config.fpw file: the EDITWORK, SORTWORK, PROGWORK, and TMPFILES paths. Those configuration settings allow you to override the system folders and thereby shoot yourself in the foot, for example in the case of separate user sessions in a terminal server environment. "Handle with care" is the standard phrase here.
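
For reference, and only as a sketch, these settings would sit in config.fpw roughly like this (the drive and folder are placeholders, and as said above, overriding them should be the exception rather than the rule):
Code:
TMPFILES = D:\VFPTEMP
SORTWORK = D:\VFPTEMP
EDITWORK = D:\VFPTEMP
PROGWORK = D:\VFPTEMP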

With that said about the Windows %TEMP% directory: you can't go wrong with the cursors that VFP generates there, and any files you generate for temporary use should also just go there. Never hardcode that directory; always use the GETENV() function to ask Windows for the current setting of %TEMP% with GETENV('TEMP').

And even if you know better places on the local drives, make that a discussion point with the Windows administrators.

Chriss
 
Sorry, I feel like Columbo, coming back with some more pesky details and questions.

If you create temporary files that should be accessible by all users, not just one, you can't get around creating them in a LAN network share or another shared directory, and then you can forget all the talk about %TEMP% and cursors. But usually temporary files are at the same time files that only the one current user needs, and code that generates, processes and finally erases them points in the direction of not needing them beyond the current user and the execution of that code.

Chriss
 