
pg_dumpall problem


roycrom
Programmer
Aug 2, 2002
Hi all,

I am running PostgreSQL on Solaris 8. I have about 15 db's that I need to dump regularly but I'm having problems with pg_dumpall. Using command:
pg_dumpall -U user > dumpfile

I get this in my dumpfile:
--
-- pg_dumpall (7.2.1) -U user
--
\connect template1
DELETE FROM pg_shadow WHERE usesysid <> (SELECT datdba FROM pg_database WHERE datname = 'template0');

and this on screen:
connected to template1...
ERROR: pg_shadow: Permission denied.
ERROR: pg_shadow: Permission denied.

I don't have a pg_shadow!!!

Also if I do manage to get pg_dumpall to work - does it leave the current database structure in place with users and databases unchanged?

------------------------------------------
Thanx | your welcome !!

roycrom :)
 
Hi roycrom,

Per the PostgreSQL documentation, section 8.1.2:

The call sequence of pg_dumpall is simply:

pg_dumpall > outfile

pg_dumpall has one little flaw: it is not prepared for interactive authentication to each database it dumps. If you are using password authentication, you need to set the environment variable PGPASSWORD to pass the password to the underlying calls to pg_dump. More severely, if you have different passwords set up for each database, pg_dumpall will fail. You can either choose a different authentication mechanism for the purposes of backup, or adjust the pg_dumpall shell script to your needs.


You might try to make sure that postgres is a valid user in each of the databases. Also, make sure postgres has a password set up on the Solaris 8 system. Then log in as postgres to perform the backups. Make sure you have the environment variable PGPASSWORD set up. It can be done like:

PGPASSWORD='Pg_su_Password'
export PGPASSWORD

After the above is done, you can type env at the command prompt and PGPASSWORD would appear as one of the many environment variables for postgres. Then you will not be prompted to enter a password as the backups are performed. I have this setup in my .bash_profile so that whenever I login as postgres, PGPASSWORD is automatically set.
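For example, here is a minimal sketch of the two lines you might put in the postgres user's .bash_profile ('Pg_su_Password' is a placeholder, not a real password):

```shell
#!/bin/sh
# Sketch: export PGPASSWORD at login so pg_dump/pg_dumpall don't prompt.
# 'Pg_su_Password' is a placeholder - substitute the real postgres password.
PGPASSWORD='Pg_su_Password'
export PGPASSWORD

# Sanity check: the variable should now appear in the environment.
env | grep '^PGPASSWORD='
```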

*********** roycrom Asked ******************
Also if I do manage to get pg_dumpall to work - does it leave the current database structure in place with users and databases unchanged?
*************************************************

Yes, the database structure and users will be preserved. I'm not so sure about functions, stored procedures, triggers, and special constraints. Some of these may have to be re-established. Maybe someone will provide information about some of these finer points of a backup/restore.

Leland F. Jackson, CPA
Software - Master (TM)
Nothing Runs Like the Fox
 
Thanks Leland,

I wrote a little script to do this from cron, and it works just right now that I've set that environment variable. pg_dumpall does indeed appear to leave the database intact.

roycrom :)
 
Hi Leland,

Just thought I'd let you know that when I try to do:

pg_dumpall > outfile

it doesn't work unless I'm connected as user postgres.

However if I call it like this:

pg_dumpall -U postgres > outfile

Then it works fine with the extra option. Hope this helps you in future. I've also written a little script that uses pg_dumpall to dump the full cluster to one file and pg_dump to create individual database dumps.

I also wrote a little script to drop all the databases in a cluster, although this is something that wouldn't be used very often, I guess. If you would like these scripts just reply here and I'll post them. :)

roycrom :)
 
Hi,

I wrote them in vi for the Bourne shell. They rely on oid2name, which comes with the distribution, and you'll obviously need to change the paths, so I've just used PATH as a stand-in you can change yourself. I use the first script to get a snapshot of the database cluster just before a backup of the server is done, so that I know exactly what date and time it was taken and that it has not been changed during the backup. I call it like this from cron:

min hr * * * /PATH/dumpdb.sh >> /logs/cron.log 2>&1
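In case the `>> /logs/cron.log 2>&1` part is unfamiliar, here's a small sketch of what that redirection does (the log path below is a placeholder for the demo):

```shell
#!/bin/sh
# Sketch: ">> file 2>&1" appends stdout AND stderr to the same log,
# which is why errors from the cron job land in cron.log as well.
LOG=/tmp/redirect_demo.log
: > $LOG                                   # truncate the demo log
{ echo "normal output"; echo "an error" >&2; } >> $LOG 2>&1
cat $LOG                                   # shows both lines
```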

#!/bin/sh
#
# Purpose: This script creates a dump of the full database cluster
# into one file and then proceeds to dump individual databases
# into one directory - /PATH/dumpdir
#
# create variables

USER="postgres"
NAMECMD="/PATH/oid2name -U $USER"
TMPDUMP="/PATH/dump.tmp"
DUMPDAT="/PATH/dump.dat"
DUMPDIR="/PATH/dumpdir"
PGPASSWORD="*****" # You'll need to change this
PATH=$PATH:/PATH/bin
export PGPASSWORD PATH

echo "Script $0 starting at `date`"

# Remove any old files

[ -f $TMPDUMP ] && rm -f $TMPDUMP
[ -f $DUMPDAT ] && rm -f $DUMPDAT

# Get database names and put into .tmp file

$NAMECMD | cut -b10- > $TMPDUMP


# Filter unwanted lines out of the .tmp file and write the database names to the .dat file.

while read LineIn; do
  CHAR=`echo $LineIn | cut -c1`
  if [ "$CHAR" != "" ] && [ "$CHAR" != "-" ]; then
    # "ases:" is the tail of oid2name's "All databases:" header after cut -b10-
    if [ "$LineIn" != "template0" ] && [ "$LineIn" != "template1" ] && [ "$LineIn" != "ases:" ]; then
      echo $LineIn >> $DUMPDAT
    fi
  fi
done < $TMPDUMP

# May as well do a complete dump as well.

[ -f $DUMPDIR/alldata.dmp ] && rm -f $DUMPDIR/alldata.dmp
echo "Commencing full dump into $DUMPDIR/alldata.dmp..."
pg_dumpall -U $USER > $DUMPDIR/alldata.dmp


# Dump individual databases separately so we can restore just one if needed.

if [ -f $DUMPDAT ]; then
  echo ""
  echo "Dumping individual databases..."
  while read LineIn; do
    echo "dumping $LineIn to $DUMPDIR/$LineIn.dmp"
    [ -f $DUMPDIR/$LineIn.dmp ] && rm -f $DUMPDIR/$LineIn.dmp
    pg_dump -U $USER -f $DUMPDIR/$LineIn.dmp $LineIn
  done < $DUMPDAT
else
  echo "No individual databases to dump, exiting..."
  exit 0
fi

# Tidy up and change owner of files.

chown $USER:$USER $DUMPDIR/*

[ -f $TMPDUMP ] && rm -f $TMPDUMP
[ -f $DUMPDAT ] && rm -f $DUMPDAT

echo "Script completed at `date`"
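As a side note, the read loop that filters the oid2name output could arguably be collapsed into a single grep; a sketch, assuming one name per line as the cut produces:

```shell
#!/bin/sh
# Sketch: remove the template databases and the "ases:" remnant of
# oid2name's "All databases:" header with one grep instead of a loop.
printf 'ases:\ntemplate0\ntemplate1\nmydb\nsales\n' |
  grep -v -x -e 'ases:' -e template0 -e template1
```

The -x makes grep match whole lines only, so a real database whose name merely contains "template" would not be dropped from the list.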

-----------------------------------------------------------------------------

This next script would only need to be used if your databases became corrupted and you needed to get rid of them all before using the output file from pg_dumpall to do a full restore. You may find another use.
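Since a full restore comes up here: the pg_dumpall output is plain SQL, so restoring is just a matter of replaying the file through psql connected to template1 (this is standard usage; the path below is a placeholder). A sketch with a simple guard:

```shell
#!/bin/sh
# Sketch: restore the whole cluster from the pg_dumpall output file.
# The dump itself issues CREATE DATABASE and \connect for each database.
DUMPFILE=/tmp/alldata.dmp        # placeholder - point at your real dump
if [ ! -f $DUMPFILE ]; then
  echo "No dump file at $DUMPFILE, nothing to restore."
  exit 0
fi
psql -U postgres -f $DUMPFILE template1
```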

-----------------------------------------------------------------------------
#!/bin/sh
#
# create variables

USER="postgres"
NAMECMD="/PATH/oid2name -U $USER"
TMPDUMP="/lcc/bin/dump.tmp"
DUMPDAT="/lcc/bin/dump.dat"
PGPASSWORD="******"
PATH=$PATH:/PATH/bin
export PGPASSWORD PATH

echo "Script $0 starting at `date`"

# Remove any old files

[ -f $TMPDUMP ] && rm -f $TMPDUMP
[ -f $DUMPDAT ] && rm -f $DUMPDAT

# Confirm to proceed

echo "Are you sure you want to drop all databases? [y/n]: \c"
read ANSWER

if [ "$ANSWER" != "y" ] && [ "$ANSWER" != "ye" ] && [ "$ANSWER" != "yes" ] && [ "$ANSWER" != "Y" ] && [ "$ANSWER" != "YE" ] && [ "$ANSWER" != "YES" ]; then
  echo ""
  echo "Exiting, no databases have been affected."
  exit 0
fi

# Get database names and put into .tmp file

$NAMECMD | cut -b10- > $TMPDUMP


# Filter unwanted lines out of the .tmp file and write the database names to the .dat file.

while read LineIn; do
  CH=`echo $LineIn | cut -c1`
  if [ "$CH" != "" ] && [ "$CH" != "-" ]; then
    if [ "$LineIn" != "template0" ] && [ "$LineIn" != "template1" ] && [ "$LineIn" != "ases:" ]; then
      echo $LineIn >> $DUMPDAT
    fi
  fi
done < $TMPDUMP


# Drop all the databases

if [ -f $DUMPDAT ]; then
  echo ""
  echo "Continuing to drop all databases..."
  while read LineIn; do
    echo "Dropping database - $LineIn..."
    dropdb -U $USER $LineIn
  done < $DUMPDAT
else
  echo ""
  echo "No databases exist, exiting..."
  exit 0
fi

# Tidy up after myself!

[ -f $TMPDUMP ] && rm -f $TMPDUMP
[ -f $DUMPDAT ] && rm -f $DUMPDAT

echo "Script completed at `date`"

-------------------------------------------------------------------------

So there you have it. I hope you find these useful, feel free to post them on the FAQ. I don't really talk to other Postgres users so I'm not too sure how useful they would be.

As you can see, you'll have to edit them slightly, but the changes should only be in the variables at the top of each script. Thinking about it, I should have separated the logic into functions, but I haven't got the time at the moment.

Anyway, thanks for your previous help, hope this returns the favour somewhat.

roycrom :)
 
Hi roycrom,

I added your post to the FAQ, so it appears under my name. You should probably submit it again to the FAQ, so you would receive the credit for all the work you did. Then I could delete the one I submitted.

When I referred to the bash shell, I was really talking about the Bourne shell. I have written scripts using the Bourne shell (e.g. on SCO Xenix), but it's been quite a while. Nice work you did.

LelandJ
Leland F. Jackson, CPA
Software - Master (TM)
Nothing Runs Like the Fox
 
Hi again Leland,

Thanks for the star, it's my first one!! I have posted the FAQ with a couple of mods so it reads better to more people.

roycrom :)
 