
Creating Multiple TADOQuery @ RUntime


BrianDHuff

Programmer
Jun 23, 2003
I have a very fun and interesting little challenge.
Delphi 6.0 and MS SQL Server, using ADO for the connection.

I have one table called Import_Raw. Within this table there are about 250,000 records that are dumped in from an import of Excel spreadsheets.

I perform several operations on the dataset, then have to dump back
out to Excel. That's all pretty easy and no big deal.

However, at one point during my operations, I have to create and use
many ADOQueries at runtime. There is no set number, so defining them
at design time is not easily done. It could be anywhere from 10 to
200 queries that I need to do.

So, what I am looking for help on is this:

What is the best way to create these queries at runtime, and give
each one a different name?

With the following I'm close, but without names:

var
  Qbe1: TADOQuery;
begin
  Qbe1 := TADOQuery.Create(Application);
  with Qbe1 do
  begin
    {assign params and such here}
    {do work}
  end;
end;

That's no problem; the only problem is that I would have to define 200 or
so TADOQuery objects at design time, then come up with some way to
iterate through the names...

I'm pretty comfortable in Delphi for the most part; this is just
getting a little deeper than I have gone before.

Any help is appreciated.

Brian

 
It's probably easier than you think. You don't need to give them names. Just create a TADOQuery, set the connection property, and use it. Then free it when no longer needed. All references are by the object reference, not the name.

The only time I drop a TADOQuery on the form is during initial development if I want to have it active at design time. Once debugged, I get rid of it and use a run-time creation. This is done by a function call which allows me to be sure they all have the initial property settings I want. It also provides a central "clearing house" for my SQL that I can log to a file and/or copy to the clipboard for debugging.

Something like this (using your code snippet):
Code:
var
  Qbe1: TADOQuery;
begin
  Qbe1 := MainDataModule.CreateQuery;
  try
    with Qbe1 do
    begin
      {assign params and such here}
      {do work}
    end;
  finally
    MainDataModule.FreeQuery(Qbe1);
  end;
end;
And in the main data module (simplified):
Code:
type
  TMainDataModule = class(TDataModule)
:
:
function TMainDataModule.CreateQuery: TADOQuery;
begin
  Result := TADOQuery.Create(Self);
  with Result do
  begin
    BeforeOpen := AnyQueryBeforeOpen;
    EnableBCD := False;
  end;
end;
"AnyQueryBeforeOpen" is a procedure that can do the logging, etc. described above.

However, I don't understand why you would need hundreds of them. Why in the world would you need 200 open queries at the same time in one application? Have you considered the load that would represent on your DBMS?

 
I have thought about the load, and it's not pretty.

Here are the basics of the app itself, and then you can see why I need them all open at the same time.

I import about 135 or so Excel spreadsheets into MS SQL.
I then scrub, normalize, and perform some rules on the data.
Finally, I have to export to a CSV file and a master Excel spreadsheet.

Inside the data, there is a column of STORE_CODE, which represents the different stores the data comes from. On the master Excel spreadsheet, each store code has to be on its own "Tab". I'm using EMS's QuickExport to make the exports happen. With their VCL, it's easy to assign a new "Sheet" or "Tab", and you can assign each one a dataset. The problem is that when you execute the export command, it creates every tab at the same time.

At first, I used one query, Qbe1 for our purposes here, and would simply iterate through the different store codes as needed. So Qbe1 would have all the data from store_code "001" and be assigned to the tab for export, then I would change Qbe1.SQL to store_code "002" and assign it to tab2 for export, etc. The problem, of course, is that when you execute the export, all tabs contain the data from the last run of Qbe1.

So, the only way I can see through it is to create multiple queries and assign each "tab" of the export its own unique dataset.

I have to be able to do this at runtime and in a variable manner, because this month there might be 100 stores, next month 200, etc.
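Roughly what I have in mind is something like the sketch below. The StoreCodes list and the ADO connection are assumed to already exist, the table and column names are the ones from above, and the EMS QuickExport "assign this dataset to a tab" call is left as a comment because that part depends on their VCL:
Code:
var
  i: Integer;
  Qry: TADOQuery;
begin
  // StoreCodes: TStringList already filled from a SELECT DISTINCT STORE_CODE
  // Queries: TList that keeps every reference so the queries can be freed later
  for i := 0 to StoreCodes.Count - 1 do
  begin
    Qry := TADOQuery.Create(nil);
    Qry.Connection := MainDataModule.ADOConnection1;  // assumed connection component
    Qry.SQL.Text := 'SELECT * FROM Import_Raw WHERE STORE_CODE = :code';
    Qry.Parameters.ParamByName('code').Value := StoreCodes[i];
    Qry.Open;
    Queries.Add(Qry);
    // ...assign Qry as the dataset for this store's export tab here...
  end;
end;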

Now, if there is a better way to do this, I'm all ears, but I'm not having anything come to mind that might work well.

We have a SQL server sitting just for this purpose, and right now the person doing this job manually takes 7 days to do it. Current tests of the program produce results in about 2 hours... so we can justify the expense of another server in order to save 6.5 days of man-hours... er, or so I hope heh.

Thanks for bending your brain-power to my issue here.
 

...and it's not pretty... is an understatement!

I'm not familiar with EMS or QuickExport, so I can't help you there. But it sounds like what it is creating is worksheets with links to a database instead of just putting data into a spreadsheet. Is that what you want?

EMS may not be the right tool for the job. If I had to do something like what I think you are describing, I would probably just create .csv files with the data (or tab-delimited, which may be easier in this case) plus one more file to use as a driver. Then I would use Excel VBA to create and populate the worksheets with the data from the .csv files (using the driver file to indicate which .csv files to process). In other words, I would use Delphi for the SQL Server stuff and VBA for the spreadsheet stuff.
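On the Delphi side, the .csv or tab-delimited part is just a loop over the dataset. Something like this rough sketch would do it (field quoting and embedded delimiters are deliberately ignored, and the routine name is made up; it needs DB in the uses clause):
Code:
procedure DataSetToTabFile(DS: TDataSet; const FileName: string);
var
  F: TextFile;
  i: Integer;
  Line: string;
begin
  AssignFile(F, FileName);
  Rewrite(F);
  try
    DS.First;
    while not DS.Eof do
    begin
      // build one tab-delimited line per record
      Line := '';
      for i := 0 to DS.FieldCount - 1 do
      begin
        if i > 0 then
          Line := Line + #9;
        Line := Line + DS.Fields[i].AsString;
      end;
      WriteLn(F, Line);
      DS.Next;
    end;
  finally
    CloseFile(F);
  end;
end;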

So unless you are working with gigantic amounts of data, you should be able to work with your existing servers. (If you have gigantic amounts of data, your final workbooks will probably be too large to use comfortably, anyway.)

If you don't grok VBA, don't worry, it's not that hard, and there are plenty of people here to assist you (See forum707.)

 
It's actually putting the data into the worksheet itself.

I might take a look at VB to do some of it. I'm not 100% comfortable in VB; however, I'm comfortable in VS .NET using ASP.NET or C#, so I would imagine VB could be picked up easily.

I'm thinking that I might want to look at creating String Grids or something to hold the data temporarily, then assign the grid as the dataset (which you can) in EMS Quickexport.

As usual, the question of "Can you make a quick little util that can do this, and can you do it for around $1,500?" has turned into a nightmare.

Thanks for the help! It's an odd problem, and all thoughts are appreciated.

Brian
 
Does anyone think using an Array in this situation might work?
 
If you mean to keep track of all of the queries you are creating, you could use a TObjectList (D5 or later). A side benefit is that you can have the TObjectList automatically free all of the queries when you free the list.

I would still try to find a way to do the job without creating hundreds of query objects. Once a worksheet is populated with data, I just don't understand why you can't close the query (and free it if necessary) and then go on to the next worksheet.
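For reference, a minimal sketch of the TObjectList approach (TObjectList lives in the Contnrs unit; with OwnsObjects set to True, freeing the list frees every query it holds):
Code:
// Contnrs and ADODB are needed in the uses clause
var
  QueryList: TObjectList;
  Qry: TADOQuery;
begin
  QueryList := TObjectList.Create(True);  // True = the list owns its objects
  try
    Qry := TADOQuery.Create(nil);
    QueryList.Add(Qry);
    // ...create and add as many queries as you need...
  finally
    QueryList.Free;  // frees the list and every query it holds
  end;
end;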

 
I would love to do it one worksheet at a time, but I can't seem to do that; all worksheets are created at the time the file is created.
 
Could you make the queries Stored Procedures on the SQL server and just call those?
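Something along these lines, for instance (the procedure name and parameter are made up for illustration, and the connection component is assumed):
Code:
var
  SP: TADOStoredProc;
begin
  SP := TADOStoredProc.Create(nil);
  try
    SP.Connection := MainDataModule.ADOConnection1;  // assumed connection
    SP.ProcedureName := 'usp_GetStoreData';          // hypothetical stored procedure
    SP.Parameters.Refresh;                           // pull parameter definitions from the server
    SP.Parameters.ParamByName('@StoreCode').Value := '001';
    SP.Open;                                         // opens the result set the procedure returns
    // ...use the result set...
  finally
    SP.Free;
  end;
end;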
 
I have data that I put into an array in Delphi, then transfer the entire array to Excel in one shot. Here are some code snippets:

Code:
//declare the array (CreateOleObject needs ComObj in the uses clause;
//in Delphi 6 the variant routines also need Variants)
JurorArray, excelapp, excelsht : variant;
//create array as varOleStr
JurorArray := VarArrayCreate([0, (JurorCount + 1), 0, 2], varOleStr);
//fill array
JurorArray[0,0] := 'Jurnum';
JurorArray[0,1] := 'Jurname';
for i := 1 to JurorCount do
begin
  JurorArray[i, 0] := JMSData.qryTemp.FieldByName('JURNUM').AsString;
  JurorArray[i, 1] := JMSData.qryTemp.FieldByName('FIRSTNAME').AsString + ' ' +
    JMSData.qryTemp.FieldByName('LASTNAME').AsString;
  JMSData.qryTemp.Next;
end;
//create excel application
excelapp := CreateOleObject('Excel.Application');
excelapp.Visible := False;
excelapp.Workbooks.Add;
excelsht := excelapp.WorkSheets.Item['Sheet1'];
excelsht.Activate;
//set the range to the size of the array and transfer in one assignment
excelsht.Range[excelsht.Cells.Item[1, 1], excelsht.Cells.Item[(JurorCount + 1), 2]].Value := JurorArray;

Just another suggestion!

Leslie
 
That's pretty much what I ended up doing with this project. The software passed the beta test last month and is now in production. We got the entire thing streamlined down to just 4 open queries at one time, and the entire processing takes only 3 minutes.

Thanks for the tips everyone!

Brian
 