
Replacing FILTER with a CURSOR

Status
Not open for further replies.

mmerlinn

Programmer
May 20, 2005
748
US
I am currently using FILTER to query subsets of tables. But it is too slow. And after two years of no problems with the FILTERs in the program, suddenly I have one which no longer works properly for some unknown reason. After two hours I finally isolated the source of the problem. Next I have to find out why and fix it.

But since using FILTER is so slow, I am seriously thinking of using a CURSOR instead of FILTER rather than fixing the above problem. (I have an older version of the program that can be used in the meantime.)

What I need to know is the best way to create a CURSOR on the fly using the parent table as the ONLY input. I need to write a function with the following characteristics:

1) Needs to work with FP 2.6
2) Creates a cursor containing:
2a) all fields in parent, child, grandchild, g-grandchild, etc tables in the database tree
2b) an additional two fields for every table in the tree, one for the table name and one for the record number
3) Has as its only input parameter the name of the parent table
4) Can find all child tables of any depth
5) Can cope with both one-to-one and one-to-many relationships
6) Can cope with fields of any data type
7) Populates the cursor with all possible record combinations based on relationships and skips
8) Populates the cursor with the table names and record numbers corresponding to the underlying records in the database tree

Misused, a cursor like this can easily cause problems with available memory. However, for my purposes the upper limit is currently about 200 records in a cursor with about 50 fields, with a realistic maximum of 1,000 records in the foreseeable future.
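To show the kind of thing I have in mind, here is a minimal sketch for a fixed two-level tree. All of the table and field names (CUST, ORDERS, CUSTID, and so on) are invented for illustration, and it assumes CREATE CURSOR in 2.6 behaves the way I expect; the real function would have to walk the tree and build the structure dynamically:

[tt]
* Sketch only - CUST -> ORDERS is a made-up two-level tree.
* One output row per parent/child combination, plus a table-name
* and record-number pair for each level of the tree.
CREATE CURSOR flat ( ;
   c_name C(30), ;
   c_tbl  C(10), ;
   c_rec  N(10), ;
   o_date D, ;
   o_tbl  C(10), ;
   o_rec  N(10))

SELECT cust
SCAN
   * Assumes ORDERS has a CUSTID tag set as its master order
   SELECT orders
   SEEK cust.custid
   DO WHILE NOT EOF() AND orders.custid = cust.custid
      INSERT INTO flat VALUES ( ;
         cust.name, 'CUST', RECNO('CUST'), ;
         orders.odate, 'ORDERS', RECNO('ORDERS'))
      SELECT orders
      SKIP
   ENDDO
   SELECT cust
ENDSCAN
[/tt]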

I did a small-scale search for some way to approach this, but didn't find anything that helped.

Does anyone know where I can find any information that would help solve this problem?


mmerlinn

"Political correctness is the BADGE of a COWARD!"

 
Mike:

I don't know why. When I bring up a browse window when searching for a record, I sometimes have to wait up to a minute before the window populates. And sometimes when moving to a different record in the browse window it takes just as long to refresh.

I am assuming that the SET FILTER is the culprit causing the slowness as I can get almost instantaneous results by not using it. Of course, then I have lots of garbage to sort through that I don't need at that time.

The time involved seems to be a function of the location of the records in the underlying table, with those records near the end of the ordered table taking longest.

The table is about 10,000 records long and will eventually approach the 100,000 mark. The largest record count currently accessed is somewhere around 150 records and is not likely to grow at the same rate as the table.

I figure if I create a flat-file-style cursor at the beginning of the program, with everything needed in it in the order needed, I would only have to search the cursor for what I need rather than the whole table or tables. Overall speed should increase because of the smaller data set. Of course, there would be some time overhead at every restart of the program.
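If 2.6's SELECT-SQL will handle the joins, the startup step might be as simple as something like this (the names are again invented, and I have not timed it):

[tt]
* Sketch only - build the lookup cursor once at program start.
* CUST, ORDERS and CUSTID are placeholder names.
SELECT cust.custid, cust.name, orders.odate ;
   FROM cust, orders ;
   WHERE orders.custid = cust.custid ;
   ORDER BY cust.custid ;
   INTO CURSOR flat

* Later lookups hit the small cursor instead of the base tables.
SELECT flat
LOCATE FOR custid = m.target
[/tt]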

If SET FILTER is not the culprit, I have no idea what could be causing the slowness. If something else is, then I would like to find out what to do about it and fix it.


mmerlinn


 
myearwood,
You've mentioned Rushmore a couple times now, but I don't see any mention of indexes by you or mmerlin.
Rushmore doesn't even come into play unless there are indexes on the tables.
FILTER, SQL statements, BROWSE FOR and so on could utilize them if they're there. Otherwise, FILTER - especially on multiple tables - could be uselessly slow.
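For example (the tag and field names here are just placeholders), a one-time index gives the FILTER something Rushmore can actually use:

[tt]
* One-time setup - the tag lives in the structural .CDX.
SELECT detail
INDEX ON accountno TAG accountno

* Because the FILTER expression now matches an index expression
* exactly, Rushmore can use the tag instead of scanning every row.
SET FILTER TO accountno = m.target
[/tt]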


-Dave Summers-
[cheers]
Even more Fox stuff at:
 
[smile]

ALL:

Here is one of the functions that I use for browsing. This is the most used one and the slowest since it accesses the largest data chunks.

[tt]
**********
*
* Find
*
* Works with current table
* Defaults to current order
* Filter(s) are set before entry
*
**********

PARAMETER qKeyData, qOrder, qCols, qRows
PRIVATE ALL LIKE z*

zorder = ORDER()
IF NOT EMPTY(qOrder)
   SET ORDER TO qOrder
ENDIF

IF TYPE('qCols') <> 'N'
   qCols = 80
ENDIF

IF TYPE('qRows') <> 'N'
   qRows = 30
ENDIF

znear = SET('NEAR')
SET NEAR ON
SEEK qKeyData

zfields = SET('FIELDS', 2)   && 'GLOBAL' or 'LOCAL'

DEFINE WINDOW wfind FROM 1, 1 TO qRows, qCols ;
   SYSTEM GROW CLOSE ZOOM FLOAT FONT "MS Sans Serif", 8
MOVE WINDOW wfind CENTER

IF NOT EMPTY(RELATION(1))
   SET FIELDS ON
   IF zfields <> 'GLOBAL'
      SET FIELDS GLOBAL
   ENDIF
   IF EMPTY(FLDLIST())
      z = 1
      DO WHILE NOT EMPTY(OBJVAR(z))
         IF ATC('_', OBJVAR(z)) = 0
            SET FIELDS TO (SUBSTR(OBJVAR(z), ;
               AT('.', OBJVAR(z)) + 1))
         ENDIF
         z = z + 1
      ENDDO
      * ENDDO generates error 11 when z exceeds the number of
      * OBJVAR() entries; the error handler (Err_screen) bypasses
      * the message.  A stupid way of testing the length, but I
      * know of no other way to test the length of OBJVAR().
   ENDIF
ENDIF

BROWSE WINDOW wfind NOEDIT NODELETE ;
   NOMENU TITLE 'Find record.'
RELEASE WINDOW wfind

SET FIELDS &zfields
SET FIELDS OFF
SET NEAR &znear
SET ORDER TO &zorder

RETURN
[/tt]

This typically accesses four related ordered tables with one having a one-to-many relationship. This is the default browse window I use for most situations, although I have several other specialized ones. The filter(s) have already been set before calling this function.

Note that this function was copied from a generated .SPR and modified to use wherever needed.

Populating this browse, whether initially or later, typically proceeds at a snail's idea of a snail's pace. Other than the FILTER conditions, I don't see anything in this function that would cause it to work so slowly.

Except in my TOP and BOTTOM buttons in my navigation bar, I seldom use GO TOP or GO BOTTOM. I usually either LOCATE or SEEK instead depending upon circumstances.

What I would like to do generically is to create a flat file cursor with all possibilities at the beginning of the program. Then, when needed, use this function to display a subset of the cursor. With the cursor being a relatively small subset of the underlying table(s), I think that a cursor would speed things up considerably at the user console.
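Roughly what I picture is seeding the cursor once at startup, indexing it, and pointing the Find function above at it. FLAT, ACCTKEY, and the filter field are placeholder names, and this assumes a cursor can carry its own index tags:

[tt]
* Sketch only - FLAT is the startup cursor, ACCTKEY an index
* tag built on it after it is filled.
SELECT flat
SET ORDER TO acctkey
SET FILTER TO custid = m.target   && cheap: only FLAT's rows are scanned
DO find WITH m.seekval, '', 60, 25
SET FILTER TO
[/tt]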

[smile]



mmerlinn


 
MMERLIN,
I think you have a far better solution staring you in the face, one that uses neither FILTERs nor SELECTs... I use this almost exclusively when I'm using 2.x with large tables (tables with more than 100,000 records in them) and I need to "look" at small record sets only...
Because you have an index on the table, why not utilize it, and then limit the visible record set with a DO WHILE loop? For example:

[tt]
SELECT DETAIL
* Assume the ACCOUNTNO index order is already set
SEEK m.accountno
IF FOUND()
   DO WHILE NOT EOF() AND DETAIL.accountno = m.accountno
      * <do what you want/need>
      SKIP
   ENDDO
ENDIF
[/tt]

This is REALLY helpful on screens that show "one record" and use navigation buttons. You can skip forward or backward, and this is EXTREMELY fast. I have used this on 486 machines with 6,000,000 records in the table, and it is instantaneous. Even if you have to retrieve a record set, you can quickly collect that set into either a temp table with the above DO loop, or into an array if you'd rather deal with it that way. Temp tables are easy to create and destroy using things like SYS(3) to make a name; you just need a tiny bit of cleanup code to ensure the file is deleted when you are finished with it, especially if there is to be no change to the data.
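A rough sketch of the temp-table variant (ACCOUNTNO and the aliases are placeholders):

[tt]
* SYS(3) returns a unique legal file name; keep it so the
* file can be erased during cleanup.
ztemp = SYS(3)

SELECT detail
* Assume the ACCOUNTNO order is set
SEEK m.accountno
COPY TO (ztemp) REST WHILE accountno = m.accountno

SELECT 0
USE (ztemp) ALIAS tempset
* ... work with TEMPSET ...
USE
ERASE (ztemp + '.DBF')
[/tt]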

Just a thought...


When it comes to filters, my general rule is to filter only tables whose values will not change and that are relatively static, with fewer than 10,000 records, ever. Otherwise, you're destined to run into a speed issue eventually.


Best Regards,
Scott

"Everything should be made as simple as possible, and no simpler."[hammer]
 
Actually, that's a really good point... can you post the index expression? Is it a compound expression, and are you doing any manipulation of the data in the expression?


Best Regards,
Scott

 