georgeocrawford
Technical User
Hi,
My problem is this. On my computer (which can be accessed from my webserver - both machines are running OS X Server), I have a directory, 'Files', with a large number of subdirectories. I would like to recursively scan through the directory, recording the name and path of each file encountered. I would then like to enter the details for each file into a MySQL database in such a way that I can use a php script to graphically display a directory browser in a web page.
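To show what I mean, here's the sort of pure-PHP recursion I've sketched out so far (the function name and the mount point /Volumes/Files are just placeholders for this example):

function scan_dir($dir, &$results)
{
    $dh = opendir($dir);
    if (!$dh) return;                        // skip anything unreadable
    while (($entry = readdir($dh)) !== false) {
        if ($entry == '.' || $entry == '..') continue;
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            scan_dir($path, $results);       // recurse into the subdirectory
        } else {
            $results[] = $path;              // record the file's full path
        }
    }
    closedir($dh);
}

$files = array();
scan_dir('/Volumes/Files', $files);          // mount point is a placeholder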
'Files' should be scanned periodically (i.e. with a cron job launching the script) and the database updated - preferably only with respect to the changed files (i.e. those moved, added or deleted since last scan).
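For the incremental part, my rough idea (assuming for the moment a single files table with a path column, and placeholder connection details) is to compare the fresh scan against what's already stored and only touch the differences - a move would just show up as a delete plus an add:

mysql_connect('localhost', 'user', 'password');        // placeholder credentials
mysql_select_db('filedb');                             // placeholder database name

// $files is the array of paths produced by the directory scan
$stored = array();
$res = mysql_query("SELECT path FROM files");          // assumed table/column names
while ($row = mysql_fetch_assoc($res)) {
    $stored[] = $row['path'];
}

$added   = array_diff($files, $stored);    // on disk but not yet in the database
$deleted = array_diff($stored, $files);    // in the database but gone from disk

foreach ($added as $path) {
    mysql_query("INSERT INTO files (path) VALUES ('" . mysql_escape_string($path) . "')");
}
foreach ($deleted as $path) {
    mysql_query("DELETE FROM files WHERE path = '" . mysql_escape_string($path) . "'");
}

Cron would then just launch this script periodically.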
I can't get my head round how to do this. My thinking so far is to get a php script to invoke a shell script (perl?) to scan the directory and produce a text file with the resulting directory tree. The php script will then parse the text file and somehow enter the details in a logical way into the database.
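Something along these lines is what I'm picturing for the shell route, with PHP either calling find directly or reading the text file cron leaves behind (paths are again just guesses):

// Option A: call find(1) straight from PHP
$output = shell_exec('find /Volumes/Files -type f');   // mount point is a placeholder
$paths  = explode("\n", trim($output));

// Option B: read a listing that cron has already written to disk
$paths = file('/tmp/filelist.txt');                    // location is a placeholder
$paths = array_map('rtrim', $paths);                   // strip trailing newlines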
Problems -
1 - I don't know what the fastest method would be for this, or even whether I need a shell script or a text file at all - would php's file functions be as fast as shell? (see discussion at Thread434-682722)
2 - How can the search be stripped of all information except that relating to files moved, added or deleted since the last scan?
3 - I can't figure out the best way to record the path of each file in a database. How about two tables - FOLDERS with fields called 'Folder Name', 'Folder id' and 'Parent id', and FILES with fields called 'File name' and 'Parent id'? (rough sketch below)
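To make question 3 concrete, this is roughly the layout I have in mind - all table and column names are just my own placeholders:

mysql_connect('localhost', 'user', 'password');        // placeholder credentials
mysql_select_db('filedb');                             // placeholder database name

mysql_query("
    CREATE TABLE folders (
        folder_id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        parent_id   INT UNSIGNED NULL,                 -- NULL for the top-level Files directory
        folder_name VARCHAR(255) NOT NULL
    )");

mysql_query("
    CREATE TABLE files (
        file_id     INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        folder_id   INT UNSIGNED NOT NULL,             -- points at folders.folder_id
        file_name   VARCHAR(255) NOT NULL
    )");

The browser page could then pull a folder's contents with something like SELECT file_name FROM files WHERE folder_id = X, and walk up or down the tree via parent_id.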
My most pressing question at this stage is speed. The files are on an old G3 Mac, so I want the search to be as fast and processor-friendly as possible.
Thanks for all your help!
______________________
George