I have a friend who is a photographer and I'm helping him get his images online. He says there will eventually be 40,000 images. The image data that visitors to his web site need to search is:
1. the filename: in the format "John Doe tackles Bill Smith 101001.jpg"; and
2. the JPG caption with a fuller description of the image (generally around 200 characters).
I'm new to Perl but have so far written the code to extract the caption from images, and I'm wondering how the data should be stored. I want him to be able to dump images into any subdirectory he creates and have my Perl script check for new directories/images and add them to (??? a flat text file / DB_File / relational database ???). The user then navigates the directory tree.
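The scan step I have in mind looks roughly like this (just a sketch; `extract_caption()` stands in for my existing extraction code, and the paths are placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Walk the whole image tree and collect (path, caption) pairs.
# extract_caption() is a placeholder for the caption-extraction
# code I've already written.
my %index;
find(sub {
    return unless /\.jpe?g$/i;          # only JPEG files
    my $path = $File::Find::name;       # full path from the top dir
    $index{$path} = extract_caption($path);
}, '/path/to/images');

# Write one tab-separated "path<TAB>caption" line per image.
open my $fh, '>', 'index.txt' or die "index.txt: $!";
print {$fh} "$_\t$index{$_}\n" for sort keys %index;
close $fh;
```

This rebuilds the whole index on each run rather than detecting only new images, which is part of what I'm unsure about.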
I have started down the path of creating a flat file in each directory: new entries are added when images are placed in the directory and removed when images are deleted. But I'm concerned about performance, and about how I'm going to eventually write the code to search it. Does anybody have recommendations on how I should design this?
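For searching, the simplest thing I can see is a linear scan of the flat file(s), something like the sketch below (assuming the tab-separated "path, caption" format described above; `index.txt` is a placeholder name). At 40,000 entries of ~250 bytes each that's roughly 10 MB read per query, which is where my performance worry comes from:

```perl
# Naive case-insensitive substring search over the flat index.
# Returns the matching file paths.
sub search_index {
    my ($query) = @_;
    my @hits;
    open my $fh, '<', 'index.txt' or die "index.txt: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($file, $caption) = split /\t/, $line, 2;
        push @hits, $file
            if index(lc("$file $caption"), lc $query) >= 0;
    }
    close $fh;
    return @hits;
}
```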