
space... the cluttered frontier...? (help)


jbay | Technical User | Feb 24, 2000 | 22 | US
hi... my printer is on the fritz, one of my programs is slowing down, and overall this unix machine's performance is poor. i have a unix System V r3 system. the /usr and /bms areas are now at 90+% full, up from the low 70s a few months ago. what is eating the space, and how do i clear it up and get things somewhat normal again? email me details if need be please. thank you for your time and help... take care [sig]james bay | rgscom@aol.com | please bear with me with regard to my questions... i'm trying to comprehend the vast concepts as they apply to computers in general and at times forget the KISS philosophy, and i do so appreciate your insights, step-by-step guidance and, overall, your generosity.[/sig]
 
It's not easy to say how you can clear space when we don't know the nature of your files and data.

You must get a policy in place that says certain files have a particular life span, and then they're out of here.

As long as you back up regularly you shouldn't hit too many problems.

The man page for find has good examples on removing unwanted files.
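For instance, something along these lines (a sketch only: the path and the 30-day cutoff are placeholders, and you should always review the list before deleting anything):

find /tmp -type f -mtime +30 -print          # list files not modified in 30+ days; review this first
find /tmp -type f -mtime +30 -exec rm {} \;  # only once you are happy with the list, remove them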

[sig]Ged Jones | gedejones@hotmail.com | Top man[/sig]
 
"The man page for find has good examples on removing unwanted files"

With respect to Ged -- this is *NOT* something you should be fooling with if you're a beginner.... <grin>

Jbay -- what version of Unix is this? SCO? [sig]Mike | michael.j.lacey@ntlworld.com | Cargill's Corporate Web Site | Making mistakes, so you don't have to. <grin>[/sig]
 
Good advice, Mike; the last time I gave it I got chewed up. [sig]Ged Jones | gedejones@hotmail.com | Top man[/sig]
 
<wry smile> I only know because (a few years ago) it was one of the first commands that I tried out after researching it in the man pages. An interesting day but not one I would wish on anyone.... [sig]Mike | michael.j.lacey@ntlworld.com | Cargill's Corporate Web Site | Making mistakes, so you don't have to. <grin>[/sig]
 
I've been there; I messed up using mtime.
It would be a good idea to make sure you are used to using tar before playing with find to remove files.
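For example (the file and directory names here are placeholders, and the archive should go on a filesystem with room to spare):

tar cvf /tmp/oldstuff.tar ./old_reports    # copy the files you are about to delete into an archive
tar tvf /tmp/oldstuff.tar                  # list the archive contents to verify the backup is good

[sig]Ged Jones | gedejones@hotmail.com | Top man[/sig]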
 
James, long time no see - sorry to hear things aren't so good again. If I recall correctly, the last time you had this problem we tracked it down to a bunch of files in [tt]/usr/adm/acct/pacct[/tt], at least for the /usr filesystem.

This directory contains files that may be used to produce accounting information to charge for system usage. They aren't much use if you aren't charging for your server usage, so can be safely removed.

Before you remove anything... To be on the safe side, could you take a look in the directory and post back with some of the filenames? A full directory listing won't be needed, just post a few filenames. We'll be able to confirm then whether it's safe to go into that directory and remove all the files. That should then clear up a lot of space.
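For the listing, something simple will do (using the path mentioned above):

ls -l /usr/adm/acct/pacct    # shows filenames, sizes and dates of the accounting files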

The [tt]/bms[/tt] directory I can't help with that much. This isn't a standard Unix directory name, so it is probably related to the application you run on the server. Not knowing anything about the application, I can't really recommend anything. It's likely that the increase in disk usage on this filesystem is related to the entry of data into your application. Are there any data archiving options available that may clear some space?

Hope this helps. [sig]Andy Bold | "I've probably made most of the mistakes already, so hopefully you won't have to..." Me, most days.[/sig]
 
hello folks. andy, good to see you as well, ol' chap... hope things are well; wish i could say the same for this crummy old unix system. the version of unix is at&t's System V r3, according to uname.

the file systems are getting cluttered again. andy, i took your advice from before about rm *. in usr/adm/acct there are these, um, folders: core, fiscal, nite, and sum. each day i come in, i rm * in the sum and nite folders to empty them out. i try a cd core and a cd fiscal, but the output says "not a directory", so i dunno how to remove stuff in those two. if it isn't painfully obvious, i am a total newbie to unix in general; andy will attest to that. anyway, that does free up blocks, but it doesn't alleviate the build-up in the /usr and /bms filesystems, which are now well over 90% full and slowly climbing.

as for my man command: when i run it, things fly by so fast on the screen that i can't read them, and i don't know how to make the output appear one page at a time. i've been trying to locate temp/log and other non-essential, space-consuming files to remove and make space, but dunno where the flock they are. these tears i cry are from laughing so hard. folks, any and all detailed, comprehensible, newbie-language solutions and help are welcomed. as always, much mahalos and thanks to you for your time and help... take care [sig]james bay | rgscom@aol.com | please bear with me with regard to my questions... i'm trying to comprehend the vast concepts as they apply to computers in general and at times forget the KISS philosophy, and i do so appreciate your insights, step-by-step guidance and, overall, your generosity.[/sig]
 
Want to find your largest files? From the top level of a directory, type:

du -a | awk '$1 > 10000 { print $0 }'| pg

You can of course alter 10000, which is a size in 512-byte blocks (10000 blocks is roughly 5 MB). This will tell you which files are large, and which directories as well.

I use the compress command to reduce the size of files that I need to keep but that are not being used.
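For example (bigfile is a placeholder name):

compress -v bigfile      # replaces bigfile with a smaller bigfile.Z and reports the saving
uncompress bigfile.Z     # restores the original when you need it again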

Do you have space in other areas so that you could repartition to make more space for /usr?



[sig]Ged Jones | gedejones@hotmail.com | Top man[/sig]
 
try "man find | pg" to slow that output down a bit [sig]Mike | michael.j.lacey@ntlworld.com | Cargill's Corporate Web Site | Making mistakes, so you don't have to. <grin>[/sig]
 
hi all,

james, if you would like to take printouts of man pages and read them at leisure in your favourite editor, you can create a text file out of the man pages with:
$ man find | col -b > find.txt
Now use your favourite editor to read find.txt.

and since we were reading the "find" man pages... to look for files above a particular size (they might be the culprits hogging your disk(s)), do:
$ find . -size +10000k -print

here i am looking for any file larger than 10MB in the current directory (replace "." with any other directory you want to look into)
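one caution, since older System V finds vary: if the k suffix isn't understood on your box, express the size in 512-byte blocks instead (10MB is roughly 20480 blocks of 512 bytes):
$ find . -size +20480 -print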

if you see certain files that are larger than necessary, then compress them, or back them up and delete them. of course, check with the intended application first, or tell us here... maybe someone knows about it.

hope that helps,
shail
 
i will report back to you all on what i find, once i figure out what it is you all mean with your suggestions... this sad newbie has to take it in baby steps. as always, though, infinite thanks for your time and wisdom. keep it coming... [sig]james bay | rgscom@aol.com | please bear with me with regard to my questions... i'm trying to comprehend the vast concepts as they apply to computers in general and at times forget the KISS philosophy, and i do so appreciate your insights, step-by-step guidance and, overall, your generosity.[/sig]
 
at the prompt i put in "find /bms -size +10000k -print"... the other command suggested for finding large space-hog files wouldn't work, but anyway, the output of the above command was 10 files. they are phour.idx/.dat, pshift.idx/.dat, pdsc.idx/.dat, plog.idx/.dat, and lastly plogcom.dat. all of these files are in the /bms/data/ directory. should i delete them? i don't know what they are, or whether they are system/application crucial or just garbage taking up my valued space. also, what other methods are there for finding and removing garbage on this System V unix? thanks [sig]james bay | rgscom@aol.com | please bear with me with regard to my questions... i'm trying to comprehend the vast concepts as they apply to computers in general and at times forget the KISS philosophy, and i do so appreciate your insights, step-by-step guidance and, overall, your generosity.[/sig]
 
hi jbay,

I have no idea what these files are used for.

But there is a simple trick. In general, if a file is a text log then you might remove some lines at the top, after taking a proper backup (in case you want to look at the log later for incidents). [The names you have given certainly do not look like text log files!!]

If it is a binary file, then check the last time the file was updated with "ls -l <FileName>". If the last update is too old, back up the file and remove it. Just for safety, do a "touch <FileName>" (creates a zero-size file) just in case any application requires the file to be present in order to work. If some application starts giving problems, then it owns the file; go into its documentation to understand how the file can be cleaned officially (here your backup will be useful).
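putting that together for one of the files you listed (plog.dat; /tmp/plog.tar is a placeholder destination, so use whatever filesystem has space):

$ cd /bms/data
$ ls -l plog.dat                  # when was it last updated?
$ tar cvf /tmp/plog.tar plog.dat  # take a backup copy first
$ rm plog.dat                     # remove the original
$ touch plog.dat                  # leave a zero-size file behind, in case an application expects it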

B)
hope that helps,
shail
 
James,

It is not always big files causing the problem; a large collection of small files has the same effect. You can pick up on this by checking directory sizes:

du | awk '$1 > 10000 { print $0 }'| pg

or, to pick up both large directories and files:

du -a | awk '$1 > 10000 { print $0 }'| pg

With large log files I tend to do something like this:

mv xxx.log xxx.log_date;tail -100 xxx.log_date > xxx.log

If the log is just for the record, I would then compress it.
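One caveat, which depends on how your application handles its logs: if a program keeps the log file open while it runs, it carries on writing to the renamed file, and the space will not come back until the program is restarted. Copying and then truncating in place avoids that:

cp xxx.log xxx.log_date    # keep a copy under another name
> xxx.log                  # truncate the live log to zero length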


A note on cd to core: core is a file, not a directory. Core files are normally created when a job falls over or the system crashes; they contain information on what was being processed at the time of the failure. If you do not want to keep them they can be deleted; otherwise tech support may require them to analyze what the problem was. You can check what core actually is before deciding, for instance:
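Using the path from James's earlier post (adjust if yours differs):

file /usr/adm/acct/core     # reports what kind of file this is: a core dump, not a directory
ls -l /usr/adm/acct/core    # the size and date show when it was produced

[sig]Ged Jones | Top man[/sig]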
 
Re the /bms directory: who owns it? Do you know what uses it? Is it BMS Patrol, for instance?
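A quick way to check the owner (ls -ld shows the directory entry itself rather than its contents):

ls -ld /bms    # the third column is the owning user, which may hint at the application

[sig]Ged Jones | Top man[/sig]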
 