
Storing LARGE Blobs in a SQL-capable database (including Access)

OrionXIII (Programmer)
Aug 29, 2011
Okay, I've got a weird one for you - I've done a lot of searching and haven't run into any solutions. I'm hoping I'm merely being brain damaged and you folks can clear up my insanity.

I've run into issues storing Blobs larger than 32K in both Access and SQL Server, no matter WHAT I set the BDE parameters to, so I'm breaking up large BLOBs (e.g. an image) into 32K chunks that I store in the database and later retrieve and reassemble.
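For reference, the read-back side (which works) looks roughly like this - a simplified sketch of what I'm doing, and it assumes the chunks come back in the order they were inserted:

Code:
{Simplified read-back: append each stored chunk onto one destination stream.
 Assumes the rows come back in insert order.}
procedure TForm1.LoadImageChunks(Dest: TStream);
var
  Blob: TStream;
begin
  Query1.SQL.Text := 'SELECT TheImage FROM Images WHERE WorldIndex=2 AND ImageType=3';
  Query1.Open;
  try
    while not Query1.Eof do
    begin
      {CreateBlobStream reads the blob field directly - no parameter involved}
      Blob := Query1.CreateBlobStream(Query1.FieldByName('TheImage'), bmRead);
      try
        Dest.CopyFrom(Blob, Blob.Size);
      finally
        Blob.Free;
      end;
      Query1.Next;
    end;
  finally
    Query1.Close;
  end;
end;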

The problem I'm running into is very odd - although the retrieval works just fine and correctly pulls and reassembles the image file, the STORE routine is not working.

With the first chunk, the memory stream reports a 32K size (and indeed, if I save the stream out to a file, I get a correct image file), yet only 3 BYTES actually get STORED.

The second chunk has the same problem, but this one stores 16,380 bytes.

The third chunk stores 6,179 bytes - definitely bizarre.

Any ideas?

Here's the relevant code:

procedure TForm1.btnImgSaveClick(Sender: TObject);
var
  WorkingFile : File of Byte;
  WorkingSize, TempI : LongInt;
  WorkingFileName : String;
  FS : TFileStream;
  MS : TMemoryStream;
  ChunkSize, FilePos : LongInt;
begin
  If (OpenPictureDialog1.Execute) Then
  begin
    {Bump the BDE 'BLOB SIZE' driver parameter}
    FrmDBAccess.DBShadow.Params[1] := 'BLOB SIZE=64';
    {Clear out any previously stored chunks for this image}
    Query1.SQL.Clear;
    Query1.SQL.Add('DELETE FROM Images WHERE WorldIndex=2 AND ImageType=3');
    Query1.ExecSQL;
    {Open up the existing file just to get its size}
    AssignFile(WorkingFile, OpenPictureDialog1.FileName);
    Reset(WorkingFile);
    WorkingSize := FileSize(WorkingFile);
    CloseFile(WorkingFile);
    {32760 is the largest BLOB chunk I can get Access to accept. Even SQL Server has unhappy times with larger.}
    If (WorkingSize <= 32760) Then
      ChunkSize := WorkingSize
    else
      ChunkSize := 32760;
    {Create the streams - don't cross the streams!}
    FS := TFileStream.Create(OpenPictureDialog1.FileName, fmOpenRead);
    MS := TMemoryStream.Create;
    try
      FilePos := 0;
      FS.Seek(0, soFromBeginning);
      WorkingFileName := ExtractFileName(OpenPictureDialog1.FileName);
      {Step through the file to be imported in 32K chunks}
      While FilePos < WorkingSize Do
      begin
        MS.Clear;
        MS.CopyFrom(FS, ChunkSize);
        {Rewind the memory stream before handing it to the parameter}
        MS.Position := 0;
        {Set up the Insert query}
        Query1.SQL.Clear;
        Query1.SQL.Add('INSERT INTO Images (WorldIndex, ImageFileName, ImageType, TheImage) VALUES (:ThisWorld, :ThisFileName, :ThisType, :Item)');
        Query1.ParamByName('ThisFileName').AsString := WorkingFileName;
        Query1.ParamByName('ThisType').AsInteger := 3;
        Query1.ParamByName('ThisWorld').AsInteger := 2;
        Query1.ParamByName('Item').LoadFromStream(MS, ftBlob);
        Query1.ExecSQL;
        {Error trap}
        If (Query1.RowsAffected < 1) Then
          MessageDlg('IMPORT FAILED. NO ROWS IMPORTED FOR ' + IntToStr(MS.Size) + ' Bytes.', mtError, [mbOK], 0);
        {Move on to the next chunk; the last chunk shrinks to whatever is left}
        FilePos := FilePos + ChunkSize;
        TempI := FilePos + ChunkSize;
        If (WorkingSize < TempI) Then
          ChunkSize := WorkingSize - FilePos;
        FS.Seek(FilePos, soFromBeginning);
        Close_Data;
      end;
    finally
      {Clean up - free BOTH streams (MS was leaking before)}
      MS.Free;
      FS.Free;
    end;
    MessageDlg('FILE IMPORTED.', mtInformation, [mbOK], 0);
  end;
end;
 
Forgive me if this does not help. I use purchased libraries to work with a MySQL database (yes, I know, not one of the databases you mentioned above - please read on...).

One activity for which a great deal of my code is responsible is streaming screen-shot captures from oscilloscopes into our MySQL database. I used to get the most annoying error messages claiming the quantity of BLOB data was too great to store in the database. To store the images successfully, I wrote tons of unnecessary code to make sure each image was less than 35 kBytes in size (sound familiar?). I knew for a fact that MySQL places no reasonable limits on the size of BLOB data, so I decided the purchased libraries were to blame.

I lived with and worked around this stupid constraint for years until, a few months ago, I discovered that MySQL places a physical limit on the size of the "BLOB" data type (~35 kBytes in practice, in my case) AND that if I wanted more data I would need to adopt "MEDIUMBLOB" and above. Ever since then I haven't had to worry about the size of the BLOB data.
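For what it's worth, the cure on the MySQL side was a one-time schema change, something like the following (table and column names invented for illustration; MySQL documents plain BLOB at 65,535 bytes, MEDIUMBLOB at 16 MB, and LONGBLOB at 4 GB):

Code:
{One-time schema change, issued through whatever query component is handy.
 'scope_captures' and 'screen_image' are made-up names - substitute your own.}
qryMaint.SQL.Text := 'ALTER TABLE scope_captures MODIFY screen_image MEDIUMBLOB';
qryMaint.ExecSQL;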

Do either SQL Server or Access have *different* BLOB data types? If so, is there a size restriction associated with the different BLOB data types?

Steve.
 
What about not using a parameter to load the Blob value?
 
Thank you BOTH!

Smays,
Both SQL Server and Access are limited to a single Blob type, unfortunately - but I'm glad to see I'm not the only one who's run into this issue!

DjangMan,
I'm not tracking on that - is there a different way to load the Blob value? Can you educate me? I'd love to learn a new technique, whether it solves this particular problem or not!

And thank you both for taking the time to read through the code and provide some insight!

Orion

 
Instead of using a parameterized query to load the table, you could try an ADO recordset: insert the record with a query, then open a recordset on that record and update the BLOB field directly.

Here's some similar code that I was able to find. My example doesn't run against a SQL database, however - see the ADO-flavoured sketch after it.

Code:
procedure TdmDB.LoadFile(sFilename: string; PayPeriodEndDate: TDateTime);
var
  MemStream: TMemoryStream;
begin
  MemStream := TMemoryStream.Create;
  try
    {Pull the whole file into memory in one go}
    MemStream.LoadFromFile(sFilename);
    with tbInputFile do
    begin
      Append;
      tbInputFileFILE_NAME.Value := ExtractFileName(sFilename);
      {The key bit: the BLOB is written through the dataset field, not a parameter}
      tbInputFileINPUT_FILE.LoadFromStream(MemStream);
      tbInputFileIMPORT_DATE.Value := Now;
      tbInputFileIMPORT_USER.Value := GetLocalUserName;
      tbInputFileIS_PROCESSED.Value := False;
      tbInputFilePAY_PERIOD_END_DATE.Value := PayPeriodEndDate;
      Post;
    end;
  finally
    MemStream.Free;
  end;
end;
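Against SQL Server through ADO, the same idea would look something like the sketch below. Sketch only: qryImages stands in for an updatable TADOQuery (keyset or optimistic cursor) pointed at your Images table, and the point is that the BLOB goes through the dataset field rather than a parameter.

Code:
{Hypothetical ADO variant - component and field names match the thread's
 Images table, but qryImages is a placeholder for your own TADOQuery.}
procedure SaveImageBlob(const sFileName: string);
var
  MS: TMemoryStream;
begin
  MS := TMemoryStream.Create;
  try
    MS.LoadFromFile(sFileName);
    {Open a live recordset on the table...}
    qryImages.SQL.Text := 'SELECT * FROM Images WHERE WorldIndex = 2 AND ImageType = 3';
    qryImages.Open;
    qryImages.Append;
    qryImages.FieldByName('WorldIndex').AsInteger := 2;
    qryImages.FieldByName('ImageType').AsInteger := 3;
    qryImages.FieldByName('ImageFileName').AsString := ExtractFileName(sFileName);
    {...and write the BLOB through the field, not a parameter}
    TBlobField(qryImages.FieldByName('TheImage')).LoadFromStream(MS);
    qryImages.Post;
    qryImages.Close;
  finally
    MS.Free;
  end;
end;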
 
Thank you - I'm gonna give this a try! I'll let you know if it works...

Thanks again for taking the time to respond and to research! I'll post any working code I manage to cobble together...:D

Jeff
 