
Need performance in bulk updates


7591838 (Programmer) | Nov 18, 2003 | 4 | AU
Hi,

I'm using MySQL on a Sun system. I have a standalone Java application running on the same machine, which reads a text file and updates the database. I need to update 100 rows/sec, but MySQL is taking too much time. I have used JDBC batch updates with 1000 rows at a time; the application reads the rows in 10 seconds, but MySQL takes 40 seconds to update the table. I have indexes on the required fields.
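For reference, here is a minimal sketch of that kind of JDBC batch update. The connection details, the table name (mytable) and the column names are placeholders, not taken from the post; the point is that addBatch/executeBatch plus a single commit per chunk keeps the per-row overhead down.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchUpdateSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb", "user", "password");
        con.setAutoCommit(false);      // commit once per chunk, not once per row

        // Table and column names are assumed for illustration.
        PreparedStatement ps = con.prepareStatement(
                "UPDATE mytable SET data1 = ?, data2 = ? WHERE key1 = ?");

        // In the real application these values would come from the parsed text file.
        String[][] chunk = { { "a2", "b", "1" }, { "c2", "d", "1" } };
        for (String[] row : chunk) {
            ps.setString(1, row[0]);
            ps.setString(2, row[1]);
            ps.setString(3, row[2]);
            ps.addBatch();             // queue the statement instead of executing it now
        }
        ps.executeBatch();             // execute the queued statements together
        con.commit();                  // one commit for the whole chunk

        ps.close();
        con.close();
    }
}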

the data is of the form:

columns: key1, data1, data2, data3, data4, data5

row1: 1,a,b,c,,
row2: 1,,,c2,d,e
row3: 1,a2,,,d2,e2
row4: 1,,,,,e3

Of course, in the table I should have the latest information:

1,a2,b,c2,d2,e3
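One way to get that "latest non-empty value wins" behaviour in a single UPDATE per row (this is an assumption on my part, not something stated in the thread) is to let the SQL fall back to the stored value whenever the incoming field is empty. The statement text below could be used in place of the one in the sketch above; the table name is still a placeholder.

// NULLIF(?, '') turns an empty incoming field into NULL, and COALESCE then
// keeps the value already stored in the column.
String sql =
    "UPDATE mytable SET " +
    "  data1 = COALESCE(NULLIF(?, ''), data1), " +
    "  data2 = COALESCE(NULLIF(?, ''), data2), " +
    "  data3 = COALESCE(NULLIF(?, ''), data3), " +
    "  data4 = COALESCE(NULLIF(?, ''), data4), " +
    "  data5 = COALESCE(NULLIF(?, ''), data5) " +
    "WHERE key1 = ?";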

Thanks for your time and response.

S

 
Hi,

Thanks for the reply.

I need the performance enhancement mentioned in the subject of the message. I'm using CHAR columns, fixed-length rows, and an index on the key field.

 
Then what does this mean?

Of course, in the table I should have the latest information:

1,a2,b,c2,d2,e3




If you are looking to get a file loaded into MySQL quickly, I recommend that you write your data file to the filesystem of your server, then use MySQL's LOAD DATA INFILE query.
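For example (the file path, table name, and column list below are placeholders, not taken from the thread), the statement could be issued from the Java application along these lines:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LoadDataSketch {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb", "user", "password");
        Statement st = con.createStatement();
        // The file must be readable by the MySQL server process
        // (or add LOCAL to read it from the client side).
        st.executeUpdate(
                "LOAD DATA INFILE '/tmp/chunk.txt' " +
                "INTO TABLE mytable " +
                "FIELDS TERMINATED BY ',' " +
                "LINES TERMINATED BY '\\n' " +
                "(key1, data1, data2, data3, data4, data5)");
        st.close();
        con.close();
    }
}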
Want the best answers? Ask the best questions: TANSTAAFL!!
 
Sorry for making my question a bit complex.

In short, I need a performance boost in the UPDATE query. I want to update 100 rows/sec in a table. I'm getting the data in chunks, so I have to update the table in chunks. I can't use LOAD DATA INFILE because the file keeps growing over time.

Regarding the latest information: other applications are reading data from this table, so I have to make the latest data available.


The scenario is as follows:

I have a text file which is fed with messages at 100/sec. I have a Java application which reads these messages, parses them, and updates the table one row at a time. That was going very slowly and couldn't keep up with the speed of the messages, so I tried a bulk update in which I read 1000 messages from the file, parse them, and try to update them in one go, but it is still taking too long. I have two indexes, and the WHERE clause of the UPDATE contains two conditions: one compares the date and the other compares the name, and both of these fields are indexed.


 
You should be able to append 100 rows/sec quite easily to a normal-sized table with LOAD DATA INFILE, as sleipnir suggests.

What about getting your application to write to a text file until it has written 1000 records, then start writing another file, and while it is doing that, call the LOAD DATA INFILE statement?
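A rough sketch of that rotation idea, alternating between two spool files (every file, table, and method name below is invented for illustration); in practice the writing and the LOAD DATA INFILE call could run on separate threads:

import java.io.FileWriter;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SpoolAndLoad {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb", "user", "password");
        Statement st = con.createStatement();

        String[] spools = { "/tmp/spool_a.txt", "/tmp/spool_b.txt" };
        int current = 0;

        while (true) {                           // one pass per 1000-message chunk
            PrintWriter out = new PrintWriter(new FileWriter(spools[current]));
            for (int i = 0; i < 1000; i++) {
                String line = readNextMessage(); // hypothetical: next parsed CSV line
                if (line == null) break;
                out.println(line);
            }
            out.close();

            // Load the finished spool file, then switch to the other one
            // for the next chunk.
            st.executeUpdate("LOAD DATA INFILE '" + spools[current] +
                             "' INTO TABLE mytable FIELDS TERMINATED BY ','");
            current = 1 - current;
        }
    }

    // Stand-in for whatever delivers the 100 messages/sec feed.
    static String readNextMessage() { return null; }
}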
 
How often do you COMMIT?

I have often used threads or just simple message queues. The program processing the messages writes to a message queue. You can have many instances of the same database-update program, each one looking at its own unique queue.

Look at msgsnd and msgrcv. Your message program just goes through a message list and alternately sends each update message to the receiving programs in turn.
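msgsnd and msgrcv are the System V message-queue calls, which are a C-level interface. Purely as an illustration of the same round-robin fan-out idea inside the poster's Java application (a swapped-in analogue using java.util.concurrent queues, not the System V calls themselves, and none of it from the thread), the pattern could look roughly like this:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueFanOut {
    public static void main(String[] args) throws Exception {
        int writers = 4;   // number of database-writer threads, chosen arbitrarily
        List<BlockingQueue<String>> queues = new ArrayList<>();

        for (int i = 0; i < writers; i++) {
            BlockingQueue<String> q = new ArrayBlockingQueue<>(10000);
            queues.add(q);
            Thread t = new Thread(() -> {
                try {
                    while (true) {
                        String msg = q.take();   // block until a message arrives
                        // Parse msg and add it to a JDBC batch here,
                        // committing every N rows.
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            t.setDaemon(true);
            t.start();
        }

        // The reader hands messages to the writers in turn (round robin),
        // mirroring the "alternately sends the update message ... in turn" idea above.
        String[] sample = { "1,a,b,c,,", "1,,,c2,d,e" };   // sample messages only
        int next = 0;
        for (String msg : sample) {
            queues.get(next).put(msg);
            next = (next + 1) % writers;
        }
    }
}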
 