
Backup best practices/strategy

Status
Not open for further replies.

vbMonk

MIS
Jun 20, 2003
19
US
We run PostgreSQL 7.3 on a Red Hat 9 server.

I wonder if people would be willing to share a bit with me about their backup approaches. I'm not looking so much for details, e.g. the text of scripts, but rather, I'd like to know how you think about it and what your strategy is. Do you back up daily? More than once a day? Do you back up data and schema all the time? Or do you perhaps do the data more often than the schema? Do you use pg_dump or pg_dumpall? Or a combination?

I know that to some degree the answers to these questions depend on the nature of the data, etc., but I am curious as to how others approach this problem.
 
Hi vbMonk,

Roycrom wrote a FAQ "How can I backup several Postgres databases at once". You might give it a look.

I am running PostgreSQL 7.4 on my web server using RH 9. I am running Fedora Core 2 on my desktop, which also has PostgreSQL 7.4. I back up by piping the database off the web server onto the desktop computer.

After I'm done, I have two identical databases, both of which are fully functional. Here is a snippet of how I back up Postgres on the web server, IP 192.168.1.78, to my localhost desktop system.

You might experiment with some of the many options available for dropdb, dropuser, createuser, and pg_dump. The options can be found in the manual pages of the respective commands.

#!/bin/sh
#-----------------------------------------------
# Mirror four databases from the web server (192.168.1.78)
# onto the local desktop: drop the local copies and their
# users, recreate them, then pipe each dump into psql.

# Drop the existing local copies
dropdb -h localhost smvfp
dropdb -h localhost smdemo
dropdb -h localhost phorum
dropdb -h localhost lice

# Drop the owning users
dropuser -p 5432 -h localhost -U postgres -i -e leland
dropuser -p 5432 -h localhost -U postgres -i -e demouser
dropuser -p 5432 -h localhost -U postgres -i -e smgeneric
dropuser -p 5432 -h localhost -U postgres -i -e jim

# Recreate the users (prompting for passwords)
createuser -p 5432 -h localhost -d -A -P -e leland
createuser -p 5432 -h localhost -d -A -P -e demouser
createuser -p 5432 -h localhost -D -A -P -e smgeneric
createuser -p 5432 -h localhost -D -A -P -e jim

# Recreate the empty databases with the right owners
createdb -O leland -p 5432 -h localhost -E SQL_ASCII -e smvfp
createdb -O demouser -p 5432 -h localhost -E SQL_ASCII -e smdemo
createdb -O postgres -p 5432 -h localhost -E SQL_ASCII -e phorum
createdb -O postgres -p 5432 -h localhost -E SQL_ASCII -e lice

# Pipe each database from the web server into its local copy
pg_dump -X use-set-session-authorization -h 192.168.1.78 smvfp | psql -h localhost smvfp
pg_dump -X use-set-session-authorization -h 192.168.1.78 smdemo | psql -h localhost smdemo
pg_dump -X use-set-session-authorization -h 192.168.1.78 phorum | psql -h localhost phorum
pg_dump -X use-set-session-authorization -h 192.168.1.78 lice | psql -h localhost lice
#-------------------------------------------------

Regards,

LelandJ



Leland F. Jackson, CPA
Software - Master (TM)
Nothing Runs Like the Fox
 
Thanks for your response, Leland. I have read Roycrom's FAQ, and it may come in handy when I get around to actually scripting my dump processes. It doesn't really answer my question, though.

My questions, in fact, revolve around the options for pg_dump. In all the posts I've read at various forums, I don't see anyone using any of the options. No one seems to use -c or -C, for example. I can only assume that they intend to recreate their databases manually in the event of a problem, but I'd really like to know rather than assume. And no one seems to dump the schema and data separately. Again, I can only guess at the reason, which I assume is that it doesn't appear necessary in terms of file size, performance, etc. But I'd be interested in knowing people's thought processes on these things.
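For concreteness, the kinds of variants I'm weighing look something like this (database name "mydb" is just a placeholder):

```shell
# Full dump with CREATE DATABASE included (-C), so a restore
# does not require the database to already exist:
pg_dump -C mydb > mydb-full.sql

# Clean dump (-c): emits DROP statements before each CREATE,
# for restoring over an existing database:
pg_dump -c mydb > mydb-clean.sql

# Schema and data dumped to separate files:
pg_dump -s mydb > mydb-schema.sql   # schema (DDL) only
pg_dump -a mydb > mydb-data.sql     # data only
```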

 
Hi vbMonk,

Yes, if I used pg_dump, I would include the -Fc (custom) format option so the output is a custom archive suitable for input into pg_restore, which allows reordering of the data load as well as the schema elements. The custom format also compresses the output.

Also, the -Ft (tar) archive format is good and can be used by pg_restore to restore the archive in many ways.
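A minimal sketch of the custom-format round trip (database and table names are hypothetical):

```shell
# Dump in custom format (-Fc): compressed and readable by pg_restore
pg_dump -Fc mydb > mydb.dump

# Restore the whole archive into an existing (empty) database:
pg_restore -d mydb mydb.dump

# Or work selectively: list the archive's table of contents,
# then restore just one table from it
pg_restore -l mydb.dump > toc.txt
pg_restore -d mydb -t mytable mydb.dump
```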

You might consider pg_dumpall. It is good for backing up everything at once, and I have used it many times when backing up my current data prior to a new Postgres install. If I remember correctly, everything can be restored from a pg_dumpall, including users, indexes, sequences, constraints, and much more. Check out the manual page on it.
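The basic usage is along these lines (output filenames are just examples):

```shell
# Cluster-wide dump: all databases plus global objects
# (users and groups), as plain SQL on stdout
pg_dumpall > cluster.sql

# Globals only (users and groups), handy before a reinstall
pg_dumpall -g > globals.sql

# Restore by feeding the file to psql as a superuser
psql -U postgres -f cluster.sql template1
```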

I'm not running any giant databases. I use PostgreSQL mainly in connection with development of Perl applications. I set up a database and tables and load them with test transactions to evaluate and debug my applications. That is why I back up using a pipe between two Postgres databases.

Postgres also has replication software, but I haven't used replication.

Here is the Doc on using pg_dumpall:



Maybe someone will come in with some more ways to do backup.

Regards,

LelandJ


Leland F. Jackson, CPA
Software - Master (TM)
Nothing Runs Like the Fox
 
Hi vbMonk,

I did a Google search on "postgresql backup" and came up with the following link:


I tried the pg_backup.sh script and it worked great. I think I'm going to switch to it. It can be set up in crontab to run automatically several times a day.
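For example, a crontab entry to run it twice a day might look like this (the script path is hypothetical; edit your crontab with `crontab -e`):

```shell
# Run the backup script at 02:00 and 14:00 every day,
# appending output to a log file
0 2,14 * * * /usr/local/bin/pg_backup.sh >> /var/log/pg_backup.log 2>&1
```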

Regards,

LelandJ

Leland F. Jackson, CPA
Software - Master (TM)
Nothing Runs Like the Fox
 
Awesome script. Really gives me a leg up on the mechanics of the process anyway. Thank you, Leland.
 