
move files to dedicated directory based on filename

raggmopp (MIS)
Hi all:

perl 5.8.8 on RH5.3 64 bit.

Have hundreds of thousands of files whose filenames are generated by the Oracle SR module.
The task is to move the files into a subdirectory structure based on the SR number.

Example: the subdirectory tree looks like

.../ticketinfo/
    SR14xxxxx/
        SR140xxxx/
            SR1400xxx/
            SR1401xxx/
            SR1402xxx/
            ...
        SR141xxxx/
            SR1410xxx/
            SR1411xxx/
            ...
    SR15xxxxx/
        SR150xxxx/
            SR1500xxx/
            SR1501xxx/
            SR1502xxx/
            ...
        SR151xxxx/
            SR1510xxx/
            ...
Here is the piece where I am trying to place the file in its appropriate directory:
Perl:
#my $ssh = Net::OpenSSH->new($VHOST);
            #$ssh->rsync_get({glob => 1, verbose => 1, times => 1},
                             #($anon_d/'%4sxxxxx'/'%5sxxxx'/'%6sxxx'/$SRdir/\"$df\"), (substr($SRdir,0,4),substr($SRdir,0,5),substr($SRdir,0,6)));

I'm thinking there's a better way, plus this is having problems as well.

I have searched high and low, and nothing I have found fits the bill.
Help is much appreciated.
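
To be clearer about the goal, here is a rough sketch of the end result I am after, assuming the files are already sitting in one flat directory on the receiving system. The paths are placeholders, and I am only using core File::Path and File::Copy (mkpath rather than make_path, since this is 5.8.8):

Perl:
use strict;
use warnings;
use File::Copy qw(move);
use File::Path qw(mkpath);    # the File::Path that ships with 5.8.8 provides mkpath

my $incoming    = '/path/to/landing';      # placeholder: flat directory holding the SR files
my $ticket_root = '/path/to/ticketinfo';   # placeholder: top of the SR tree

opendir my $dh, $incoming or die "Cannot open $incoming: $!";
while (defined(my $file = readdir $dh)) {
    next unless $file =~ /^SR\d{7}/;                  # only SR####### names
    # Build the three prefix levels from the name itself.
    my $dest = join '/',
        $ticket_root,
        substr($file, 0, 4) . 'xxxxx',                # e.g. SR15xxxxx
        substr($file, 0, 5) . 'xxxx',                 # e.g. SR153xxxx
        substr($file, 0, 6) . 'xxx';                  # e.g. SR1536xxx
    mkpath($dest) unless -d $dest;                    # create the branch the first time it is needed
    move("$incoming/$file", "$dest/$file")
        or warn "Could not move $file to $dest: $!";
}
closedir $dh;

Whether the move happens locally like this or at the tail end of a pull from another box, the substr arithmetic would be the same.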

Many thanks

 
What do you need to do with them afterwards?
What info will you need from them?
If you need to access them you could leave them where they are and catalogue them in a database.

Keith
 
I don't need to do anything with them.
The info contained in the files is related to product support.

By segregating the files we keep each subdirectory from holding too many files.
From my point of view, it is primarily about organization and efficiency.
The analysts only need read access to the files, so really the only thing changing is the location.

 
Assuming they are remote and you aren't running Perl on the system where they are stored?

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Travis - Those who say it cannot be done are usually interrupted by someone else doing it; Give the wrong symptoms, get the wrong solutions;
 
If the files were remote, then yes, I would still be running the script locally on the receiving system. (Why does that question even come up? Even if the system were located overseas, I would still have to put the script on that system.) The analysts grab a copy and perform their tasks in providing support to customers. All I have to do is place the data into its appropriate location, and that is determined by the name assigned to the customer's data for this support incident. (The name comes from the Oracle SR module, in case somebody wants to ask.)

Actually, there are two systems that receive data from customers. One system is located in the DMZ and accepts SFTP, HTTPS and anonymous FTP into a chroot jail. (I don't want to hear it! I know very well that anonymous FTP is the oxymoron in this config, but those are the marching orders. Believe it or not, there are some shops that believe ports 21, 23 & 80 are more secure than ports 22 & 443. I wish they would share whatever it is they're smoking.) Customers connect to the system in the DMZ and transfer their data using SFTP, HTTPS or FTP. Once the customer data successfully completes the upload, an entry is made in a log file - either the xferlog or the secure log. That log entry is the trigger for a daemon to pull the files into our network - YES, the customer data is 'pulled' into our network over port 22. The data cannot be 'pushed' into our network. And the data transfer must complete successfully before the log entry is made, so no worries about incomplete or corrupt data being retrieved.
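
Just to illustrate the shape of that trigger (the real daemon is already in place; the log path, the line format and the pull call here are all placeholders, not the actual setup):

Perl:
use strict;
use warnings;

my $xferlog = '/var/log/xferlog';            # placeholder log location
open my $log, '<', $xferlog or die "Cannot open $xferlog: $!";
seek $log, 0, 2;                             # start at the end of the log, like tail -f

while (1) {
    while (my $line = <$log>) {
        # Assumption: a line for a completed upload mentions the SR file name.
        next unless $line =~ /(SR\d{7}\S*)/;
        pull_file($1);                       # hand off to the existing pull-over-port-22 logic
    }
    sleep 5;
    seek $log, 0, 1;                         # clear EOF so the next read picks up new lines
}

sub pull_file {
    my ($name) = @_;
    # Placeholder for the real pull; routing by SR prefix happens after this.
    print "would pull $name\n";
}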

It is at this step that the data needs to be routed to the appropriate directory based on its name (which is assigned by the Oracle SR module). The landing place for the data has perms set so nobody else can write to the filesystem, so the analysts must grab a copy. The customer data remains unchanged on what is a 'ro' filesystem to everybody.

This is the last piece: I have the daemon watching the logs, syslog.conf is configured for the logs, the chroot jail is configured, data can only be pulled in, there are no interactive logins, etc. Even our analysts cannot get an interactive login to the DMZ system - just me and two co-workers have interactive logins. Once inside our network the analysts must grab a copy because the filesystem is 'ro' to them. One of the goals is to keep the customer's data unchanged so that when it does get archived it is the same as the day it arrived.

The format of the file names is SR####### - that is, a capital SR followed by 7 digits, for example SR1234567. The subdirectory it needs to go to would be /SR12xxxxx/SR123xxxx/SR1234xxx/. A real and current example would be SR1536390 - it needs to go to /SR15xxxxx/SR153xxxx/SR1536xxx/.
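
In code form, that rule works out like this for SR1536390 (just the prefix arithmetic, the same substr idea as the sketch further up):

Perl:
my $name = 'SR1536390';
if ($name =~ /^SR\d{7}$/) {                    # capital SR plus 7 digits
    my $l1 = substr($name, 0, 4) . 'xxxxx';    # SR15xxxxx
    my $l2 = substr($name, 0, 5) . 'xxxx';     # SR153xxxx
    my $l3 = substr($name, 0, 6) . 'xxx';      # SR1536xxx
    print "/$l1/$l2/$l3/\n";                   # prints /SR15xxxxx/SR153xxxx/SR1536xxx/
}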

I hope I have answered any other questions that may be forthcoming. And if I may ask, why the questions about what I am going to do with the data, or whether I am local or remote? And no, I can't leave them where they are - the directory/filesystem would end up with so many files that performance would suffer, and it would hurt.

Many thanks

 