Okay, I've got a weird one for you - I've done a lot of searching and haven't run into any solutions. I'm hoping I'm merely being brain damaged and you folks can clear up my insanity.
I've run into issues with storing BLOBs larger than 32K in both Access and SQL Server no matter WHAT I set the BDE parameters to, so I'm breaking up large BLOBs (e.g. an image) into 32K chunks that I store in the database and later retrieve and reassemble.
The problem I'm running into is very odd - although the retrieval works just fine and correctly pulls and reassembles the image file, the STORE routine is not working.
For the first chunk, although the memory stream reports a 32K size (and indeed, if I save the stream to a file, the result is a correct image file), only 3 BYTES actually get stored.
The second chunk has the same problem, but stores 16,380 bytes.
The third chunk stores 6179 bytes - definitely bizarre.
Any ideas?
Here's the relevant code:
procedure TForm1.btnImgSaveClick(Sender: TObject);
var
  WorkingFile : File of Byte;
  WorkingSize, TempI : LongInt;
  WorkingFileName : String;
  FS : TFileStream;
  MS : TMemoryStream;
  ChunkSize, FilePos : LongInt;
begin
  If (OpenPictureDialog1.Execute) Then
  begin
    FrmDBAccess.DBShadow.Params[1] := 'BLOB SIZE=64';
    Query1.SQL.Clear;
    Query1.SQL.Add('DELETE FROM Images WHERE WorldIndex=2 AND ImageType=3');
    Query1.ExecSQL;
    {Open up the existing file to get its size}
    AssignFile(WorkingFile, OpenPictureDialog1.FileName);
    Reset(WorkingFile);
    WorkingSize := FileSize(WorkingFile);
    CloseFile(WorkingFile);
    {32760 is the largest BLOB chunk I can get Access to accept. Even SQL Server has unhappy times with larger.}
    If (WorkingSize <= 32760) Then
      ChunkSize := WorkingSize
    else
      ChunkSize := 32760;
    {Create the streams - don't cross the streams!}
    FS := TFileStream.Create(OpenPictureDialog1.FileName, fmOpenRead);
    MS := TMemoryStream.Create;
    FilePos := 0;
    FS.Seek(0, soFromBeginning);
    WorkingFileName := ExtractFileName(OpenPictureDialog1.FileName);
    {Step through the file to be imported in 32K chunks}
    While FilePos < WorkingSize Do
    begin
      MS.Clear;
      MS.CopyFrom(FS, ChunkSize);
      {Set up the Insert query}
      Query1.SQL.Clear;
      Query1.SQL.Add('INSERT INTO Images (WorldIndex, ImageFileName, ImageType, TheImage) ' +
        'VALUES (:ThisWorld, :ThisFileName, :ThisType, :Item)');
      Query1.ParamByName('ThisFileName').AsString := WorkingFileName;
      Query1.ParamByName('ThisType').AsInteger := 3;
      Query1.ParamByName('ThisWorld').AsInteger := 2;
      Query1.ParamByName('Item').LoadFromStream(MS, ftBlob);
      Query1.ExecSQL;
      {Error trap}
      If (Query1.RowsAffected < 1) Then
        MessageDlg('IMPORT FAILED. NO ROWS IMPORTED FOR ' + IntToStr(ChunkSize) + ' BYTES.', mtError, [mbOK], 0);
      {Move on to the next chunk; the final chunk shrinks to the bytes that remain}
      FilePos := FilePos + ChunkSize;
      TempI := FilePos + ChunkSize;
      If (WorkingSize < TempI) Then
        ChunkSize := WorkingSize - FilePos;
      FS.Seek(FilePos, soFromBeginning);
      Close_Data;
    end;
    {Clean up}
    MS.Free;
    FS.Free;
    MessageDlg('FILE IMPORTED.', mtInformation, [mbOK], 0);
  end;
end;
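For what it's worth, I've sanity-checked the chunk bookkeeping in isolation - here's the same loop arithmetic restated as a quick Python sketch (identifiers are mine, not from the Delphi code). It covers the file exactly once with nothing skipped or doubled, which makes me think the truncation happens in the parameter binding rather than in my loop:

```python
def chunk_sizes(file_size, max_chunk=32760):
    """Replay the Delphi loop's chunk arithmetic and return each chunk's size."""
    sizes = []
    pos = 0
    chunk = min(file_size, max_chunk)
    while pos < file_size:
        sizes.append(chunk)
        pos += chunk
        # Mirror of the TempI check: shrink the final chunk to what remains.
        if file_size < pos + chunk:
            chunk = file_size - pos
    return sizes

# A ~70K file splits into two full chunks plus a remainder, totalling the file size.
print(chunk_sizes(70000))       # [32760, 32760, 4480]
print(sum(chunk_sizes(70000)))  # 70000
```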