
I Give Up: Data too long for column 'Image' at row 1

Status
Not open for further replies.

smays

Programmer
Aug 18, 2009
125
US
Okay, I give up. How does one programmatically store more than 50 kBytes of image data in a modern MySQL database (5.1.48-community)?

I have tried increasing the max_allowed_packet to 16M:
Code:
[mysqld]

# The TCP/IP Port the MySQL Server will listen on
port=3306
max_allowed_packet = 16M

Here is how the table is created:
Code:
CREATE TABLE `image_table` (
  `ImageID` int(10) unsigned NOT NULL AUTO_INCREMENT,
  `ImagePathNFilename` varchar(256) NOT NULL,
  `Image` blob NOT NULL,
  PRIMARY KEY (`ImageID`)
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=latin1

Here is the query I am attempting to use:
Code:
const char *cQueryStatement =
    "INSERT INTO image_table(ImagePathNFilename, Image) VALUES('%s', '%s')";

I don't get it. What is the REAL size limit of BLOB fields? I get the same error whether I connect over the network or through localhost, and I get it with commercial libraries as well. I am confident the problem is with MySQL itself, more specifically with some setting I would never in a million years have guessed.

If you have any clue as to what the problem is, please, let me know.

Thanks,
Steve.
 