One other thing is the scope of the variable MYROAD. A filter expression is not evaluated once when you set it; it is a setting that gets evaluated again and again for every record pointer movement you make. If a variable used in it goes out of scope, you get no result at best, and normally you even get errors. So it's best to use macro substitution, after preventing anyone from entering a string delimiter in MYROAD.
So this will end up as
Code:
Local lcRoadFilter
lcRoadFilter = AllTrim(ChrTran(MYROAD,["],[]))  && strip double quotes, they would break the string literal in the filter expression
Set Filter To AtC("&lcRoadFilter",ADDRESS)>0    && macro substitution puts the literal road text into the filter expression
Locate                                          && positions on the first record the filter lets through
Thisform.grid1.Refresh
Thisform.grid1.Column1.SetFocus
The variable lcRoadFilter is not needed after the filter is set; its value was used, not the variable name. Your filter condition now would be something like AtC("Wall St",ADDRESS)>0. AtC() is case insensitive, so there is no need to UPPER() the ADDRESS memo. Go Top is not the worst idea to let a filter take effect, but Locate is better for that, as it positions on the first record the filter condition is true for.
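Just to illustrate the case insensitivity (the sample string is made up, of course):
Code:
? AtC("wall st", "12 Wall St, Suite 5")  && returns 4, AtC() ignores case
? At("wall st", "12 Wall St, Suite 5")   && returns 0, At() is case sensitive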
The filter expression should only contain fields of its work area, no variables, which most often go out of scope. A filter is a setting, not a command and not a query. Like any setting it stays active from the moment you set it until it is unset, set differently or, in the case of this specific setting, until the work area closes.
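Since it's a setting, you can also inspect and clear it like any other setting; a minimal sketch:
Code:
? Set("Filter")   && returns the currently set filter expression of the work area
Set Filter To     && SET FILTER TO without an expression removes the filter again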
If you edit an address to another road and skip forward, the record moves out of the filter; if you change some record's address so it contains the road, it appears in the set. I use "the filter" and "the set" just to illustrate the idea of a result set; there is no such object. A filter is a behaviour you set, it doesn't create an in-memory buffer of records the way a query does. That is both an advantage and a disadvantage.

It repeatedly takes time for the grid to show the records fulfilling the filter. Especially with many records in the table and only few passing the filter, the table is often scanned fully again and again, because the grid only stops searching when its last row is filled; the empty grid rows are what cost the time here. That can be macroscopic time, i.e. seconds just for skipping forward, if your table has tens of thousands of records. So this also only works well with not too much data, unless you combine it with SET KEY or indexes. Locating the next row fulfilling the filter condition will make use of indexes, but neither $ nor AtC() is an optimizable operator or function. SET KEY could be used to shrink the range of records scanned, if there is such a range, like a time span or a customer id (see the sketch below). It also helps with the scroll bar of the grid, which orients itself on the full record count of the table, even though a filter only lets through a few rows.
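A hedged sketch of the SET KEY idea, assuming your table had an index tag on a customerid field (tag name and key value are just made-up assumptions here):
Code:
Set Order To Tag customerid   && assumed tag, not from your real table
Set Key To 4711               && only records of this key value are visited at all
Set Filter To AtC("Wall St",ADDRESS)>0
Locate                        && the filter now only has to scan within the keyed range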
The whole world rather works with SQL queries than with "live" filtering. You take the performance hit once, at the query, and you "only" get a snapshot of the queried data, but it's better in many aspects. These are reasons not to use filters at all. With a filter, Rushmore optimization only helps locating the next record; it doesn't build up a result set the way the Rushmore optimization of a query does, and query optimization can use several indexes, just to name one advantage.
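For comparison, a rough sketch of the query route (table, cursor and variable names are assumptions, not from your code). A nice side effect: in a query the variable is evaluated once at query time, so the scope problem of the filter expression doesn't exist:
Code:
Thisform.grid1.RecordSource = ""   && detach the grid before its cursor is recreated
Select * From myaddresses ;
   Where AtC(m.lcRoad, ADDRESS) > 0 ;
   Into Cursor crsRoads
Thisform.grid1.RecordSource = "crsRoads"
Thisform.grid1.Refresh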
The only way to prevent the repeated full table scan is to create a (temporary) index on the filter condition, e.g. into an IDX file or a secondary local CDX. Indexing on e.g. AtC("Wall St",ADDRESS)>0 will really only be usable to filter Wall Street addresses, but it accelerates locating these records far more than an index on ADDRESS would, which doesn't help at all for partial matches and besides won't work for a memo with its variable length. Generating the index has an initial cost, and with the next MYROAD entered you need another index, but that takes about the time a query would take. And some disadvantages of the filter still remain, e.g. the scroll bar behaviour.
Bye, Olaf.