
Filtering a record set from an SP


1DMF

Programmer
Jan 18, 2005
8,795
GB
Is there a reason why my SP runs smoothly and quickly even for remote users, but as soon as you try to apply a filter to the recordset it dies?

The SP only returns two columns, and I have added an index to the table they come from.

Is this a known issue with MS Access, filtering, and stored procedures?

I guess I'm going to have to use a local table and rewrite it to run in a disconnected model to resolve this, unless you guys can point me to how to resolve this filtering issue.

Many thanks,

1DMF



"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!"

MIME::Lite TLS Email Encryption - Perl v0.02 beta
 
Well, the front end is Access and the back end is SQL Server.

So it is MS Access running a pass-through query to a stored procedure to obtain the recordset.

The returned recordset is lightning fast; it's the filtering that grinds to a halt (filtering via Access using the Filter property and FilterOn = True).

Basically, my form has a search facility that filters on input: as you type, the text value of the unbound field is used to perform a filter operation to narrow down the records via the Change event of the unbound text box.

For remote users it is sooooo slooooow.

Is this just the way it is?

Or am I doing something fundamentally wrong?

 
Just a shot in the dark here because I never use this technique, but....


If you are using ADO to store the results of the SQL Query, you may want to sort the data prior to the user filtering it.

For example, if the data coming from the SP is sorted by shoe size, and you are allowing the user to filter on name, it will need to look at each row. If the data is sorted on name, it can quickly find the first and last rows that match and then return the rows in between.

Ex:

adoRecordsetObject.Sort = "Name"

It's only 1 line of code so I suppose it's worth a shot, right?
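To see why the sort helps, here is a small Python sketch of the principle only (not the actual ADO mechanics): once the rows are sorted on the filtered column, a prefix filter becomes two binary searches plus a slice, instead of a scan of every row.

```python
import bisect

# Rows sorted by name -- the column we intend to filter on.
names = sorted(["Adams", "Baker", "Nash", "Nelson", "Newton", "Young"])

# Find every name starting with "N" without scanning the whole list:
# binary-search for the first "N" and the first name past the "N" range.
lo = bisect.bisect_left(names, "N")
hi = bisect.bisect_left(names, "O")
matches = names[lo:hi]
print(matches)  # the contiguous block of names beginning with "N"
```

ADO's internals differ, but this is the reason a sorted recordset can locate a matching range quickly instead of touching every row.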

-George
Microsoft SQL Server MVP
My Blogs
SQLCop
twitter
"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
So you agree that the disconnected model is the right approach: using the SP to obtain the record, inserting it into a local table, binding the form's data source to the local table, and then performing the filter?

 
That should say "using the SP to obtain the record set".

 
One thing to look at is how the filtering is being done.

For example, a column named lname is indexed and holds customers' last names. If you do a search for last names beginning with N, it can go quickly. But if for some reason your filter uses a wildcard as the first character, then the index won't be used.

WHERE lname LIKE 'N%' --will be fast
WHERE lname LIKE '%N%' --will be slow
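You can reproduce this behaviour with a quick experiment. The sketch below uses SQLite via Python's sqlite3 module as a stand-in for SQL Server (the table and index names are made up), but the rule is the same: a prefix pattern can seek on the index, while a leading wildcard forces a scan of every row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA case_sensitive_like = ON")  # lets SQLite apply its LIKE optimization
conn.execute("CREATE TABLE customers (lname TEXT)")
conn.execute("CREATE INDEX ix_lname ON customers (lname)")
conn.executemany("INSERT INTO customers VALUES (?)",
                 [("Nash",), ("Nelson",), ("Smith",), ("Newton",)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the optimizer seeks or scans.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

fast = plan("SELECT lname FROM customers WHERE lname LIKE 'N%'")   # index seek (SEARCH)
slow = plan("SELECT lname FROM customers WHERE lname LIKE '%N%'")  # full scan (SCAN)
print(fast)
print(slow)
```

SQL Server's optimizer makes the same distinction: only the sargable prefix form can use an index seek.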

-SQLBill

The following is part of my signature block and is only intended to be informational.
Posting advice: FAQ481-4875
 
Dang, I use '%xxx%' to meet the user-experience expectation, i.e. the widest match possible (contains).

How do I get round this?

Or should I say, work with it?

 
Sorry for the late response... I'm not aware of any workaround. Any time you have a wildcard as the first character of the search, it forces SQL Server to look at every single row. It's really bad using % since that means 'any number of characters'. So it has to look at every entry and parse each character. For example:

axxx
bdcxxx

Since SQL Server can't know where in the string xxx is, it has to look at every single character...it will be quicker for axxx than for bdcxxx. Now imagine it doing that with every value you have. And if the values are long ones, it will increase the time.
jad;fjdsfjsdlkfjdujaouxxx will take much longer to parse through to find the xxx.

If you can do it, this may give better performance...

LIKE '_xxx%'.

The underscore character means 'one character', then the string. So it only has to look at two characters to see if the criterion is met.

axxx - it looks at the a and sees the x next, so now it looks to see if three x's are there.

bdcxxx - looks at the b then the d. d isn't x so it goes on to the next row.

You are still doing row-by-row searches, but it should be a bit quicker. One example of how to use it is:

Let's say you have a column that is First Initial, Middle Initial, and Last Name. You want to find all Smith entries.

LIKE '%Smith' will be really slow.
LIKE '__Smith' (two underscores) will be quicker and is the better choice, since you know what you are looking for starts with the third character. If SQL Server doesn't find an S in the third position, it moves on.
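A quick check of the `_` semantics (SQLite via Python again, with made-up names): '__Smith' pins 'Smith' to start at exactly the third character, so strings where it starts anywhere else don't match at all.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (fullname TEXT)")
conn.executemany("INSERT INTO people VALUES (?)",
                 [("JQSmith",), ("Smith",), ("AJQSmith",), ("JQJones",)])

# '__Smith' = exactly two characters, then 'Smith'. Unlike '%Smith',
# the match position is fixed, so the engine can give up early per row.
rows = [r[0] for r in conn.execute(
    "SELECT fullname FROM people WHERE fullname LIKE '__Smith'")]
print(rows)  # only the First-Initial + Middle-Initial + 'Smith' entry
```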

-SQLBill

 
Thanks Bill, at least now I know why it is so slow on filtering.

I can't change the way the filter works; the user expects such a wide search because we have so many member companies with similar parts to their company name...

eg.

The Mortgage xxxxxx

The Insurance xxxxxx

and a whole bunch of other similar company names where only one word might be different.

So I guess: SP to retrieve the record set, then use the disconnected model to load a local table which the form is bound to, so the filtering runs fast for remote users because it is against a local table, not the SQL server.

One thing I'm curious about, I always thought that when you retrieved a record set, it was stored in memory, I guess this isn't the case?

Is there a way of retrieving a data source (record set) via an SP and storing it in memory so the filtering / processing is done in memory and not via a link to the SQL server?



 
One thing I'm curious about, I always thought that when you retrieved a record set, it was stored in memory, I guess this isn't the case?

When you run a query, SQL Server will first check to see if it has the data stored in RAM. If so, it will use that because RAM is many times faster. When the data is not in RAM, it must load it from disk before using it. This adds some overhead.

Is there a way of retrieving a data source (record set) via an SP and storing it in memory so the filtering / processing is done in memory and not via a link to the SQL server?

Since you are using ADO, the simple answer is yes. Your application can get a recordset and save it in memory for later use. When you do this, you need to be careful about a couple of things. First, you could end up with a ton of memory used by your application. Second, ADO has a Filter function that you can use to find your data, but the problem with this approach is that you would need to return ALL of the data from the server and then filter on the client. This will cause a lot more network traffic and will likely be slower than running the query.
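The client-side half of that trade-off looks something like this in outline (plain Python with made-up company names, standing in for an ADO recordset cached by the application): fetch once up front, then every keystroke filters the in-memory copy with no round trip to the server.

```python
# Hypothetical cached recordset: (recid, companyname) pairs fetched once.
cached_rows = [
    (1, "Alpha Mortgage Services"),
    (2, "Alpha Insurance Services"),
    (3, "Beta Widgets"),
]

def local_filter(rows, text):
    # Case-insensitive 'contains' match done entirely on the client,
    # mimicking a LIKE '%text%' filter without touching the server.
    needle = text.lower()
    return [r for r in rows if needle in r[1].lower()]

hits = local_filter(cached_rows, "alpha")
print(hits)
```

The cost, as noted above, is pulling the whole result set over the wire once and holding it in memory for the life of the form.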

I suspect you are experiencing a performance problem with your LIKE search. There is a method that *may* help you. When you create an index, the data is stored separately from the table data (this applies to non-clustered indexes). When SQL Server reads data from disk, it does so in 8K chunks. So, if you had an index with the column you are searching through, it will be more efficient to search through the index, because when it needs to load data from disk, each 8K read will return more rows, which means fewer reads and therefore better performance.

The trick to this approach is to make sure you have an index that includes multiple columns. You should have the primary key column(s) as the first columns in the index and the column you are searching through as the last column.
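A rough illustration of the narrow-index idea, again with SQLite standing in for SQL Server and made-up names: the query only touches recid and companyname, so an index holding just those values lets the engine read far fewer bytes than the full (wide) table rows, even when a leading wildcard forces it to look at every entry.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A 'wide' table: the big notes column is dead weight for this search.
conn.execute("""CREATE TABLE companies (
    recid INTEGER PRIMARY KEY, companyname TEXT, notes TEXT)""")
conn.executemany("INSERT INTO companies VALUES (?, ?, ?)",
                 [(1, "Alpha Mortgage", "x" * 500),
                  (2, "Alpha Insurance", "x" * 500),
                  (3, "Beta Widgets", "x" * 500)])
# Narrow index: companyname plus the rowid (recid) it implicitly carries,
# which is everything the query below needs -- the wide rows stay untouched.
conn.execute("CREATE INDEX ix_name ON companies (companyname)")

rows = conn.execute("""SELECT recid, companyname FROM companies
                       WHERE companyname LIKE '%Alpha%'""").fetchall()
print(rows)
```

In SQL Server terms this is a narrow non-clustered index the engine can scan in place of the table; as noted below, check whether a suitable index already exists before adding one.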

Testing this on one of my tables, which only has 10,000 rows, gave me consistently better performance. It did not make too much difference, but it was measurable. With a larger table, you will likely notice a more significant difference in performance.

It's also possible that you already have an index like I described. Do not add another index to your table unless you check to see if the index already exists.

-George
 
When you say "SQL Server will first check to see if it has the data stored in RAM", I meant stored in memory on the client.

I have created an unclustered index on the main columns used to search the table.

But it covers more than just the columns selected for this particular query, and the main 'filter' column (CompanyName) is not the last one in the index.

should I create another unclustered index that is just 'PK' and 'CompanyName'?





 
I meant stored in memory on the client.

The data will only be stored in the client for as long as the ADO recordset exists. When you close it, set it to nothing, or it goes out of scope, the data is no longer stored in memory on the client.

should I create another unclustered index that is just 'PK' and 'CompanyName'?

The terminology is non-clustered. Is CompanyName the only column that you are allowing a wildcard search on? You could try adding another index to see if it improves your performance. If it does not, or the improvement is not noticeable, then you should drop the index. But... it's probably worth a try.

Just out of curiosity, how many rows are in the table and how long does a "typical" search take?

-George
 
There are 15k records; only recid & companyname are being selected.

The retrieval is fast enough if I bind the SP to the form, but with each keystroke the filtering takes about 5 seconds.

Perhaps I'm doing this wrong; I'm using binding to link the form to the data source.

Should I be using VBA to create an ADO recordset in memory and then attach that to the form?

Would that mean the filtering is done against the memory resident record set and not against SQL server which I'm getting the feeling is what is currently happening?

 
I'm completely stumped on this. (Creating another index didn't help!)

Method 1

If I use the SP as the datasource bound to the form, getting the records is lightning fast, but as soon as an attempt to filter them is made it grinds to a halt.

Method 2

If I use the SP to perform an append query against a local table, it takes ages to get and insert the records, but then the filtering is lightning fast.

Method 3

If I use the SP to set a DAO.Recordset variable and then bind that with VBA to the form, it's lightning fast. However, as soon as I try to apply a filter, the DB crashes!

---------------------------------------------------------
Why can't I have my cake and eat it?

It seems I can either have load fast, filter slow or filter fast but load slow.

What's the fastest way of creating a populated table from a recordset retrieved via an SP?

Having a local empty table and using an append query seems very slow indeed, so if I can speed this process up, I might be able to work with the disconnected model.
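For the load-slow half, one common fix is to fetch the recordset once and push it into the local table as a single bulk operation inside one transaction, rather than an append query that pays per-row overhead. A rough Python/SQLite sketch of the shape (table and data are made up):

```python
import sqlite3

# Hypothetical recordset already fetched from the SP: (recid, companyname).
fetched = [(1, "Alpha Mortgage"), (2, "Alpha Insurance"), (3, "Beta Widgets")]

local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE local_companies (recid INTEGER, companyname TEXT)")

# One transaction + executemany: a single batch of work instead of
# committing each row separately, which is where per-row appends lose time.
with local:
    local.executemany("INSERT INTO local_companies VALUES (?, ?)", fetched)

count = local.execute("SELECT COUNT(*) FROM local_companies").fetchone()[0]
print(count)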

Unless you know why filtering a DAO.Recordset object bound to a form is crashing the DB.

Anything to do with this SP1 cock-up, perhaps?

 
I'm sorry. You lost me at binding. I had a bad experience with it about a hundred years ago and haven't used it since.

-George
 
Well we all have to bind one way or another to show the data to the user via a form!

I've given up with this SP and filtering.

I found it's the removal of duplicates I'm trying to achieve via the SELECT that is causing the problem.

I've changed the UI and now remove duplicates via filtering once the main recordset has been retrieved; it's running fine now at acceptable speeds for remote users.

It's changed the way the UI looks and feels, but it solves the problem, with quite dramatic results.

It's just a shame the DB has such a bad design to start with; if things had been done differently, we wouldn't have this duplicate-company issue causing all the lag!

 