
W4104 Unable to insert a record.

Gery (MIS)
Apr 9, 2002
I'm using an MSSQL 2000 centralized database in a SAN environment with ARCserve 2000 Build 1100, and I got the following error in the centralized DB server's Activity Log:

W4104 Unable to insert a record. (EC=SQL Error State:22003, Native Error Code: 1FB3, ODBC Error: [Microsoft][ODBC SQL Server Driver][SQL Server]Arithmetic overflow error converting expression to data type int.SQL Error State:01000, Native Error Code: E25, ODBC Error: [Microsoft][ODBC SQL Server Driver][SQL Server]The statement has been terminated.)
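
For context, a SQL Server int tops out at 2147483647; once the running total in that column passes it, the write fails with exactly this overflow error. You can reproduce the message on its own with a throwaway query (demonstration only, the literal value is just an illustration):

SELECT CAST(2147483648 AS int)
-- Raises: Arithmetic overflow error converting expression to data type int.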

The customer had to merge tapes to restore the files. The following may cure the problem.

In the MSSQL database, go to
ASDB
-> TABLES
-> ASTPDRV (right-click and choose "Design Table")
-> TDTTLKBWRITTEN -> change the data type from "int" to "bigint"

This did the trick, but it only works on MSSQL 2000.
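
For anyone who prefers to script the change rather than use Enterprise Manager, the equivalent T-SQL is roughly the following. This is only a sketch assuming the default ASDB database name; stop the ARCserve database engine and back up ASDB first, and add NOT NULL to the ALTER COLUMN if the column is currently defined that way, so its nullability is preserved.

USE ASDB
GO
-- Widen the per-drive "total KB written" counter beyond the int ceiling.
ALTER TABLE ASTPDRV ALTER COLUMN TDTTLKBWRITTEN bigint
GO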

By the way, this was the quickest support I ever had from the CA support people.

Kind regards
Gery

If it ain't broke, don't fix it
 
Great tip, but we have Microsoft SQL Server 7 and there is no bigint... Does anybody know, by chance, if there's another way to fix it?
It would be great if someone could help. CA doesn't get my point...

Tobias
 
Hi tabbit
The guy from CA told me that there is no way to change this setting on SQL 7. I have the same problem in another customer's environment. I haven't found a solution so far.
Kind regards
Gery
 
Has anybody tried setting it to "float" on MSSQL 7?
 
I didn't try the "float" type, but I did try the "decimal" type; ArcServe didn't update the value and the "Reporter" option no longer worked.
This is not a suitable solution. It seems the type must stay "integer".
 
What about increasing the length of that column? Has anyone tried this? Or would it be risky? I'm running SQL 7, and trying to find an answer.
 
This is from the CA supportbase:

This is usually related to the 'TDTTLKBWRITTEN' field in the 'ASTPDRV' table. It holds the total bytes written by a drive. Since it is an INT field, the highest the value can go is 2147483647.

Resetting the number back to 0 should resolve the issue.

This can be done through enterprise manager.

This value is just for tape usage history (hours of usage).
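
If you would rather do the reset from Query Analyzer than Enterprise Manager, it can be scripted like this (a sketch assuming the default ASDB database; back up the ARCserve database first):

USE ASDB
GO
-- See which drives are at or near the int ceiling of 2147483647.
SELECT TDTTLKBWRITTEN FROM ASTPDRV
GO
-- Zero out counters that are close to overflowing; only tape usage history is lost.
UPDATE ASTPDRV SET TDTTLKBWRITTEN = 0 WHERE TDTTLKBWRITTEN > 2000000000
GO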
 
Resetting which number?

In Table Design I have...

TDTTLKBWRITTEN | int | 4 | 10 | 0  (column name | data type | length | precision | scale)
 
Zero out the value, not the column properties.

An easy way to take care of this is to go into the ARCserve Manager Database, select the drive and delete it from within the database.
 
Thanks....

Oh lord, I guess I should have looked there sooner. There's a bunch of garbage devices in there. I delete them, but they all keep coming back. I stopped and started the DB engine and tried again, with no luck. Any thoughts?
 
What are the details on these garbage devices?
Do they show up in the ARCserve Device Manager window?
 
That is just too strange; with valid and invalid data mixed, perhaps it is due to corruption.

I use the VLDB, so I do not have many details for when SQL is used, but I think there is a file called setupsql in the home directory that will let you initialize just the device database.
 