
Storing temporary and large amounts of data


tanvis

Programmer
Jun 9, 2010
Hi,

I have a scenario for which I would like the best solution in ASP.NET C#. My application allows users to retrieve search results from the database; this data becomes the user's own temporary list of records, which they can search, delete from, or select. All of these actions apply only to the user's temporary list, not to the database itself. I currently implement this with a DataTable stored in session, but this approach causes problems when a large amount of data is held in session (e.g. 7000 records). What is the best possible solution here? Is creating temporary tables an option, and can a temporary table be tied to a particular user session? Please point me in the right direction and toward the right articles.

Thanks in advance,
Tan
 
yes, 7000 records in a DataTable will wreak havoc on memory consumption. create a table in the database and allow the user to scrub the data there. when they are all done, transfer the data from the intermediate table to the permanent table(s).

example
Code:
create table [ForDataCleansing] (
   [Id] bigint not null,
   [UserId] uniqueidentifier not null,
   -- remaining columns for the data...
   constraint ForDataCleansing_PK primary key ([Id])
)
import the temporary data into ForDataCleansing, then select all the records which match the user's id. by having a simple PK on the table you can easily perform update and delete commands like these
Code:
update [ForDataCleansing] set ... where id = @Id and UserId = @user
delete from [ForDataCleansing] where id = @Id and UserId = @user
once the data is scrubbed you can transfer the data from ForDataCleansing to the permanent RDBMS tables.
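
as a rough sketch of that final transfer (the permanent table name and column list below are placeholders for your actual schema, not part of the original design; in practice you would wrap the two statements in a transaction):
Code:
-- move the user's cleansed rows into the permanent table, then clear them out.
-- [PermanentTable] and the column lists are placeholders for your real schema.
insert into [PermanentTable] ([Id] /* , remaining data columns... */)
    select [Id] /* , remaining data columns... */
    from [ForDataCleansing]
    where [UserId] = @user;

delete from [ForDataCleansing] where [UserId] = @user;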

an RDBMS is what most devs are comfortable with; however, there are other persistence options:
document database (couchdb, ravendb, mongo)
persistent key value pairs (rhino PHT)
text file (csv)

no matter which solution you choose, performance will be gained through efficient queries.
1. page the data (don't load all 7000 rows and then display 20); query 20 records and display 20 records. see the sketch below.
2. batch statements when applicable. you can send multiple read/write commands in a single query, and it may make sense to use this feature in some scenarios.
3. indexing. with the above solution you get a clustered index on Id; I would also apply a non-clustered index on UserId.
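
to illustrate point 1, a paged query against the intermediate table could look something like this (a minimal sketch using ROW_NUMBER; @start is assumed to be the row number of the first row on the requested page, e.g. 1, 21, 41, ...):
Code:
-- return one page of 20 rows for the current user (sketch)
select *
from (
    select *, row_number() over (order by [Id]) as RowNum
    from [ForDataCleansing]
    where [UserId] = @user
) as paged
where paged.RowNum between @start and @start + 19
order by paged.RowNum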

Jason Meckley
Programmer
Specialty Bakers, Inc.

faq855-7190
faq732-7259
 
Thanks for your answer. I do not understand, though, how one table can store data for multiple sessions, considering that several users could be querying at the same time.

Do you have any articles you can refer me to on this?


Thanks for your time.
 
Maybe another idea is to use temporary tables within SQL Server (assuming that is your database).

The idea is this: each user has a unique ID which can be used as the table name. For instance, joebloggs runs a query on your database, so in code you would have:

Code:
SELECT * INTO #joebloggs FROM YourSourceTable WHERE ... -- plus the rest of your query

The SELECT * INTO will create the temp table named after the user. In any subsequent SQL statements, use the username as the name of the table.
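
As an illustration, a subsequent statement against that per-user temp table might look like the line below (the [Id] column is just a placeholder; note that a #temp table is only visible to the connection that created it, so these statements need to run on the same open connection):

Code:
-- placeholder column name; must run on the same open connection
-- that executed the SELECT ... INTO above
DELETE FROM #joebloggs WHERE [Id] = @Id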

When complete, execute another sql statement to drop the temp table

Code:
DROP TABLE #joebloggs

Hope this helps and isn't too confusing,

Patrick
 
Thanks Patrick. That is really helpful. I was thinking along similar lines, but didn't find an appropriate or explicit time to drop the table.

Thank you
 
Thanks for your answer. I do not understand, though, how one table can store data for multiple sessions, considering that several users could be querying at the same time.
what doesn't make sense? if you want to store data for multiple sessions, then apply the session id to the table instead of the user id. you can then filter by session. the problem is, if a session is abandoned, you will have orphaned records in the table. you could clear them out on Session_End, but I don't think that is 100% foolproof.

you also need to consider how the end user expects to interact with the data. if they do some work, stop, and return minutes or hours later, then segmenting bulk operations by session doesn't make much sense.

this solution is not that much different than a temporary table. the main difference is the table always exists, and you will need a column to differentiate who can access the row.
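
for illustration, the Session_End cleanup could be as simple as the statement below, executed from Global.asax's Session_End handler with the session's id as the parameter (a sketch, assuming a [SessionId] column was added to [ForDataCleansing] in place of, or alongside, [UserId]):
Code:
-- sketch: remove a session's orphaned rows; assumes a [SessionId] column
-- run from Global.asax Session_End, passing Session.SessionID as @SessionId
delete from [ForDataCleansing] where [SessionId] = @SessionId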

Jason Meckley
Programmer
Specialty Bakers, Inc.

faq855-7190
faq732-7259
 