MoonchildHK
Programmer
Hi All,
I am having serious problems with the FTP part of a program I am writing. I have several TClientSockets: the first connects to the control port (21) of the FTP server, and the other is used to actually transfer the data (on a port assigned using PASV).
My problem is that when I am transferring big files (say 100 MB), ALL my memory disappears and Windows crashes!
Here is the code that is run from the OnRead handler for the TClientSocket. I open a file when the socket connects, save the data to the file on each read, then close the file on disconnect. I have the following code in a TThread so that the program does not freeze with large files.
Code:
void __fastcall receiveFile::Execute()
{
    // Read data from the server
    int length;
    char *buffer;

    length = DataSocket->Socket->ReceiveLength();
    buffer = new char[10240];
    DataSocket->Socket->ReceiveBuf(buffer, length);
    fwrite(buffer, length, 1, fhCurrentFile);
    delete[] buffer;
}
The files turn up fine on the disk (they are video files and I can view them OK), but if I am transferring a large file, the memory just goes very fast! I am deallocating the small amount of memory I use for each read. I have also tried reusing the same memory, allocated once at the start of the program, and got the same result: more and more memory being used until none is left!
Could it be that the ClientSocket is using memory to store the data before I read it, and then not freeing it again? How would I solve this?
If anyone has ANY ideas I would be most grateful. I have been trying to solve this for days now with no success :-(
Thanks,
Steph