

RonRepp (Technical User)
Feb 25, 2005
Hi:

I'm getting a warning when I create a table that I've never seen before. Can someone explain it?

Code:
Warning: The table 'OrderDetail' has been created but its maximum row size (26503) exceeds the maximum number of bytes per row (8060). INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.

This was originally created from a delimited file. There are 165 fields I have to allow for, even though the biggest file we've received so far has only used about 80 of them.

Thanks,


Ron Repp

If gray hair is a sign of wisdom, then I'm a genius.

My newest novel: Wooden Warriors
 
The actual limit on the row size of a table is 8060 bytes. SQL Server will let you create a table whose maximum possible row size exceeds that, but it will not insert a row containing more than that many bytes. The warning is telling you that you could potentially lose data. I would suggest a redesign into two or more related tables.
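To see the arithmetic behind the warning, a made-up table like this (column names and sizes are purely for illustration, not Ron's actual schema) would trip it:

Code:
-- Hypothetical wide table: maximum row size is roughly
-- 4 (int) + 3 * 8000 + 2400 = 26404 bytes plus row overhead,
-- well past the 8060-byte limit.
CREATE TABLE OrderDetailDemo (
    POID   int NOT NULL,
    Field1 varchar(8000),
    Field2 varchar(8000),
    Field3 varchar(8000),
    Field4 varchar(2400)
);
-- CREATE TABLE succeeds with the warning; an INSERT only fails
-- if the actual data in one row adds up to more than 8060 bytes.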

"NOTHING is more important in a database than integrity." ESquared
 
It is simple: a data page can only hold 8K, so the insert will fail if the row doesn't fit on a page.

-----------------------------
Does that make sense to you?
 

The insert will only fail if all of the fields coming in add up to more than 8060 bytes. You just have to be careful that they don't.

The best solution is to normalize (and/or de-normalize) your table so that the row size is smaller.
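As a sketch of what that split might look like (table and column names here are hypothetical), you move the rarely used columns into a sibling table keyed back to the parent:

Code:
-- Parent table keeps the key and the commonly used columns.
CREATE TABLE OrderDetail (
    POID      int NOT NULL PRIMARY KEY,
    OrderDate datetime,
    Qty       int
);

-- Wide, rarely used columns go in a related table, one row per POID.
CREATE TABLE OrderDetailExtra (
    POID  int NOT NULL PRIMARY KEY
          REFERENCES OrderDetail (POID),
    Notes varchar(8000),
    Misc  varchar(8000)
);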

However, one cannot always re-design into two or more tables. I inherited a POS database that had this same problem. The main table was (is) GINORMOUS. Problem is, the application that hits the database has all of the query functionality built into the web pages. "It is not a business priority" to re-build the application, so I am stuck with this hunk of junk.

You could test inserts and updates to make sure they don't exceed the size, then dump the data to a file or table if they do. I also got stuck with an ingest function into said database, where I get a plus-delimited --(I know, 1972, right?)-- file on an irregular basis that has to be inserted [IMMEDIATELY!] into this POS database. Sometimes you're stuck with what you've got, and the best you can do is CYA and move on. Good luck!
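One way to sketch that size check (assuming a staging table with the same layout as the target -- all names here are placeholders) is to measure each incoming row with DATALENGTH before moving it:

Code:
-- Rows that fit go to the real table...
INSERT INTO OrderDetail (POID, Field1, Field2)
SELECT POID, Field1, Field2
FROM   Staging
WHERE  DATALENGTH(ISNULL(Field1, '')) +
       DATALENGTH(ISNULL(Field2, '')) + 4 <= 8060;  -- +4 for the int

-- ...and oversized rows get dumped to a reject table for review.
INSERT INTO OrderDetailReject (POID, Field1, Field2)
SELECT POID, Field1, Field2
FROM   Staging
WHERE  DATALENGTH(ISNULL(Field1, '')) +
       DATALENGTH(ISNULL(Field2, '')) + 4 > 8060;

The ISNULL wrap matters because DATALENGTH(NULL) returns NULL, which would silently drop the row from both queries.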


v/r

Gooser

Why do today
that which may not need to be done tomorrow [ponder] --me

The Universal Model
 
Everyone:

I have followed the advice and built several tables that will do the trick. I need all the available fields, as this is coming through as delimited EDI, and it would have been nice to keep everything in one table, but not a necessity. Also, I doubt that all the fields would have been used (even the test files didn't use them all), but I have to program for them in case they do.

So, I basically have 10 tables using POID as a relation for all.
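With POID as the common key, pulling a full record back together is just a chain of joins (table and column names below are placeholders, not Ron's actual schema):

Code:
SELECT h.POID, h.Field1, d2.Field40, d3.Field80
FROM   OrderDetail1 h
LEFT JOIN OrderDetail2 d2 ON d2.POID = h.POID
LEFT JOIN OrderDetail3 d3 ON d3.POID = h.POID;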

Thanks again,

Ron Repp

If gray hair is a sign of wisdom, then I'm a genius.

My newest novel: Wooden Warriors
 
That's good. Even if the tests don't use them all now, you never know when you might exceed that record limit. Plus, you may not need to query all the related tables every time, and having smaller rows in the table will improve performance and disk access. For when you do need access to all of them, I suggest an indexed view with all the joins. That will make future queries easier to write, because you won't have to specify all the joins each time.
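A sketch of such an indexed view, using two placeholder tables to stand in for Ron's ten (the view needs SCHEMABINDING and two-part names, and it is the unique clustered index that actually materializes it):

Code:
CREATE VIEW dbo.OrderDetailFull
WITH SCHEMABINDING
AS
SELECT a.POID, a.Field1, b.Field2
FROM   dbo.OrderDetailA a
JOIN   dbo.OrderDetailB b ON b.POID = a.POID;
GO

-- The unique clustered index turns the view into a stored result set.
CREATE UNIQUE CLUSTERED INDEX IX_OrderDetailFull
    ON dbo.OrderDetailFull (POID);

After that, queries can hit dbo.OrderDetailFull without repeating the joins.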

"NOTHING is more important in a database than integrity." ESquared
 