
Retrieve and update records in bulk 2

Status
Not open for further replies.

Zoom1234

Programmer
Oct 30, 2003
116
BE
Hi,
I have to fetch records from one MySQL server and update them in another server.
Code:
connect to database1
retrieve records from a table (ID is primary key)
connect to database2
update the record in the table using ID

Since there is a large number of records (about 500,000), I am wondering what the best way to do this is (e.g. cursors...).

MySQL Version is 5.0.37.

Can someone throw some light on this and point me in the right direction, possibly with some pseudo code?

Thanks in advance.


 
1. dump the table from database1

2. upload the table to database2

3. update the table in database2 with the data from the uploaded table using a single multi-table (joined) update statement
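The three steps above can be sketched end-to-end. This is a minimal stand-in using Python's built-in sqlite3 (the table and column names `items`, `staging`, `name` are made up for illustration); MySQL would spell step 3 as a multi-table join, e.g. `UPDATE items INNER JOIN staging ON items.id = staging.id SET items.name = staging.name`:

```python
import sqlite3

# Two in-memory tables stand in for the target table and the uploaded copy.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE staging (id INTEGER PRIMARY KEY, name TEXT)")
con.executemany("INSERT INTO items VALUES (?, ?)", [(1, "old"), (2, "old")])
con.executemany("INSERT INTO staging VALUES (?, ?)", [(1, "new-1"), (2, "new-2")])

# One set-based update from the staged copy -- no row-at-a-time cursor.
# SQLite lacks MySQL's UPDATE ... JOIN, so a correlated subquery does the join here.
con.execute("""
    UPDATE items
       SET name = (SELECT s.name FROM staging s WHERE s.id = items.id)
     WHERE id IN (SELECT id FROM staging)
""")
con.commit()
print(con.execute("SELECT id, name FROM items ORDER BY id").fetchall())
```

The point is that the server updates all matching rows in a single statement, instead of the client round-tripping once per row.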

r937.com | rudy.ca
 
Thanks for the response r937.

Can you please throw some light on multi-table (joined) update statement. I have never used it before.


 
I know of no multi-row UPDATE statement, but you can "abuse" the INSERT ... ON DUPLICATE KEY UPDATE statement, like:

Code:
INSERT INTO YourTable(ID, OtherField, YetOtherField)
       VALUES(1, 'One', '...'),
             (2, 'Two', '...'),
             (3, 'Three', '...')
       ON DUPLICATE KEY UPDATE OtherField=VALUES(OtherField), YetOtherField=VALUES(YetOtherField);

I use this technique to make SQL scripts repeatable. In your case, I do not know what the speed consequences are. In any case, I would split the rows up into batches and not issue a single statement for 500,000 rows. If you use InnoDB, doing each batch in a transaction may really help the speed.
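The batched-upsert idea can be sketched in Python with sqlite3 (hypothetical table `t`; batch size shrunk for illustration). Note the hedge: sqlite's old `INSERT OR REPLACE` rewrites the whole row, whereas MySQL's `INSERT ... ON DUPLICATE KEY UPDATE` updates only the listed columns, so this is an approximation of the semantics:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, other TEXT)")
con.execute("INSERT INTO t VALUES (1, 'stale')")  # a pre-existing row to be overwritten

rows = [(1, "One"), (2, "Two"), (3, "Three")]  # imagine ~500,000 of these
BATCH = 2  # in practice something like 10,000-50,000

# sqlite's upsert-by-replacement; MySQL 5.0 would use
# INSERT INTO t (id, other) VALUES ... ON DUPLICATE KEY UPDATE other = VALUES(other)
sql = "INSERT OR REPLACE INTO t (id, other) VALUES (?, ?)"
for i in range(0, len(rows), BATCH):
    with con:  # one transaction per batch, committed on success
        con.executemany(sql, rows[i:i + BATCH])

print(con.execute("SELECT id, other FROM t ORDER BY id").fetchall())
```

Each batch is its own transaction, so a failure mid-way loses at most one batch, not the whole run.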



+++ Despite being wrong in every important aspect, that is a very good analogy +++
Hex (in Darwin's Watch)
 
Thanks for the replies r937 and DonQuichote.

What I want to do is take data from database 1, append and prepend some predefined text to the data, and store it in database 2. Then run this query from PHP.

Since there are 5000000 records, I am thinking of using cursors and calling this from PHP (that is another part).
Does this approach sound sensible to you?

Thanks.
 
5 bazillion records via a cursor? whoa!!

what's wrong with trying a method that works on sets of rows, rather than one row at a time?

 
Are both databases on the same server or can the destination reach the source?

If both are on the same server, you can do a lot with just an update query. Just give the field names as database.table.field, like:
Code:
UPDATE database1.YourTable AS SourceTable INNER JOIN database2.YourTable AS DestinationTable ON SourceTable.ID=DestinationTable.ID
      SET DestinationTable.SomeField = CONCAT('Prefix', SourceTable.SomeField, 'PostFix');

In case they can reach each other but are on different server instances, you can make a federated table pointing to the source table.

I only know cursors within stored procedures. If you are going to do this with stored procedures, you may as well call them directly or with a trigger. You don't need PHP for that, although the action may be part of something larger in PHP.
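The same-server case above can be sketched with sqlite3, which attaches a second database file under a schema name much like MySQL's `database1.YourTable` qualification (file and table names here are made up; sqlite concatenates with `||` where MySQL uses `CONCAT(...)`):

```python
import sqlite3, tempfile, os

# A database file stands in for "database1" on the same server.
src_path = os.path.join(tempfile.mkdtemp(), "database1.db")
src = sqlite3.connect(src_path)
src.execute("CREATE TABLE YourTable (ID INTEGER PRIMARY KEY, SomeField TEXT)")
src.execute("INSERT INTO YourTable VALUES (1, 'value')")
src.commit()
src.close()

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE YourTable (ID INTEGER PRIMARY KEY, SomeField TEXT)")
dst.execute("INSERT INTO YourTable VALUES (1, 'old')")

# Qualify names with the attached schema, like database1.YourTable in MySQL.
dst.execute("ATTACH DATABASE ? AS database1", (src_path,))
dst.execute("""
    UPDATE YourTable
       SET SomeField = (SELECT 'Prefix' || s.SomeField || 'PostFix'
                          FROM database1.YourTable s
                         WHERE s.ID = YourTable.ID)
     WHERE ID IN (SELECT ID FROM database1.YourTable)
""")
print(dst.execute("SELECT SomeField FROM YourTable").fetchall())
```

One cross-schema statement does the transform and the update together, which is the set-based approach r937 is recommending.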


 
r937 and DonQuichote, thanks for your help.

DonQuichote,
They are on different servers. I know it's not a good idea to use cursors.

r,
can it be done like this (in batches of 50,000, say)?
update <tablename> set col1=CONCAT('Prefix', SomeField, 'PostFix'),col2=CONCAT('Prefix', SomeField, 'PostFix') LIMIT 0,50000 ;

Will it be effective, taking the number of records into consideration?
 
the concats are fine -- you can do a lot to transform the data with SQL while loading

also, you'd want to use a multi-table update

taking numbers into consideration is necessary only if you have a situation where the update statement cannot be allowed just to run to completion

but don't use LIMIT

LIMIT only makes sense if you use ORDER BY, but ORDER BY would further slow it down

instead, use key ranges
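The key-range batching r937 describes can be sketched like this (sqlite3 stand-in; hypothetical table `t`, tiny batch size). Each batch is bounded by a `WHERE id BETWEEN ...` that rides the primary-key index, so no `ORDER BY` or `LIMIT` is needed:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, col1 TEXT)")
con.executemany("INSERT INTO t VALUES (?, ?)", [(i, "v") for i in range(1, 11)])

BATCH = 4  # in practice tens of thousands
lo, hi = con.execute("SELECT MIN(id), MAX(id) FROM t").fetchone()
for start in range(lo, hi + 1, BATCH):
    with con:  # one transaction per key range
        con.execute(
            "UPDATE t SET col1 = 'Prefix' || col1 || 'PostFix' "
            "WHERE id BETWEEN ? AND ?",
            (start, start + BATCH - 1),
        )
print(con.execute("SELECT COUNT(*) FROM t WHERE col1 = 'PrefixvPostFix'").fetchone())
```

Because each range is explicit, a restart after a failure can resume from the last completed range instead of re-scanning everything.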

 
If the destination server can reach the source server, take a look at the federated engine. It is a kind of "link table": you can define a table in your destination database that links to the source database. In that situation, you can just refer to the linked table as if all the data in that table were in the same database.

Instead of the LIMIT clause, give a WHERE clause with an ID range:
Code:
... WHERE ID BETWEEN 0 AND 500
That way, you know exactly which records you already processed and which records you still have to process.

 
Thanks.

Just to know: are dynamic queries possible in version 5.0.37?
 
Yep. See PREPARE and EXECUTE statements. You can build the query to be executed in an SQL variable (with the SET command or with SELECT ... INTO) and then prepare and execute it. Although you can give parameters, you don't need to give them.
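The same two-step idea (assemble the statement text first, bind the values at execution time) can be sketched from Python; in pure MySQL this would be `SET @sql = CONCAT(...); PREPARE s FROM @sql; EXECUTE s;`. The table and column names here are hypothetical, and since identifiers cannot be bound as parameters they go into the statement text while the values stay as placeholders:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, col1 TEXT)")
con.execute("INSERT INTO t VALUES (1, 'x')")

# Build the statement text dynamically (identifiers must be validated elsewhere,
# never taken raw from user input), then execute it with bound values.
table, column = "t", "col1"  # hypothetical names
sql = "UPDATE %s SET %s = ? WHERE id BETWEEN ? AND ?" % (table, column)
con.execute(sql, ("updated", 1, 1))
con.commit()
print(con.execute("SELECT col1 FROM t").fetchall())
```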

 
Thanks.
I will have a look at them. Thanks for your patience and support.
 