Iterating through directories, but they go too deep

kmcferrin (MIS) · Jul 14, 2003
I recently wrote a script that can be run against a file server's directory tree to generate a list of files over a certain age that we could use for archiving. Everything works just fine until it hits an extremely deep directory, i.e., one whose full path is longer than 256 characters. At that point it crashes out. I tried adding On Error Resume Next in the hope that the script would just keep chugging along until it got to the next (shorter) directory, but that didn't do the trick. I'm not sure how the directories got that deep to begin with, since Windows doesn't let you create them that way. Rumor has it they may have been created from a Mac or UNIX machine.

At any rate, here's the subroutine that does the iterating:

Code:
Sub CrawlTree(strTargetDir)
    Dim objFolder, arrFolders, objFiles, Item, Item2
    Dim strAccessDate, objFileName, objFile, intDaysOld
    Set objFolder = objFSO.GetFolder(strTargetDir)
    Set arrFolders = objFolder.SubFolders
    Set objFiles = objFolder.Files

    ' Recurse into every sub-folder of this folder
    For Each Item In arrFolders
        CrawlTree(Item)
    Next

    ' Scan the files collection and log any file older than intCutoffAge days
    For Each Item2 In objFiles
        Set objFileName = objFSO.GetFile(Item2)
        strAccessDate = objFileName.DateLastAccessed
        intDaysOld = DateDiff("d", strAccessDate, Now)
        If intDaysOld > intCutoffAge Then
            WScript.Echo Now & " -- " & objFileName.Path & " is " & intDaysOld & " days old."
            Set objFile = objFSO.OpenTextFile(strOutputFile, ForAppending)
            objFile.WriteLine objFileName.Path
            objFile.Close
            intFileCount = intFileCount + 1
            intFileSizeCount = intFileSizeCount + objFileName.Size
        End If
    Next
End Sub
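
For completeness, here's roughly the setup the sub assumes: objFSO, intCutoffAge, strOutputFile, the ForAppending constant, and the two counters are all globals. The paths below are just hypothetical examples:

Code:
' Globals the sub relies on -- paths here are hypothetical examples
Dim objFSO, intCutoffAge, strOutputFile, intFileCount, intFileSizeCount
Const ForAppending = 8   ' OpenTextFile mode; must be defined, it's not built in

Set objFSO = CreateObject("Scripting.FileSystemObject")
intCutoffAge = 14                       ' flag files older than 14 days
strOutputFile = "C:\archive-list.txt"   ' hypothetical output file
intFileCount = 0
intFileSizeCount = 0

CrawlTree "D:\FileShare"                ' hypothetical root of the tree

WScript.Echo intFileCount & " files flagged, " & intFileSizeCount & " bytes total."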

It crashes on the "For Each Item In arrFolders" line a few lines into the sub. The error is "Path not found". Does anyone have any idea how I can work around this?

I suspect that "Set arrFolders = objFolder.SubFolders" is collecting the full paths of the subfolders even when they are longer than 256 characters.
 
So I was thinking about it, and I changed the offending line to:

Code:
    For Each Item In arrFolders
        If Len(Item) < 256 Then
            CrawlTree(Item)
        End If
    Next

I went ahead and fired up the script again to see if that fixed it, but I won't know for several hours whether it worked (it's parsing a 1TB+ tree on a very slow server). In the meantime, if anyone has any ideas, I'm open to suggestions.
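
One thought while I wait: as written, the guard drops over-long directories silently. Echoing them would at least leave a record of what the script couldn't reach, something like:

Code:
    For Each Item In arrFolders
        If Len(Item) < 256 Then
            CrawlTree(Item)
        Else
            ' Record what we skipped instead of dropping it silently
            WScript.Echo Now & " -- skipped, path too long: " & Item
        End If
    Next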
 
Many of the older Win32 API calls limit the pathnames they process to MAX_PATH (260 characters). The special "wide" (Unicode) API variants can handle paths of up to 32,767 wide characters.

I don't know of any scriptable components that wrap these Unicode versions of the file API functions.

Example: GetLongPathName

One could develop such an ActiveX DLL wrapper in VB6 or C++ but as I said I doubt there is anything standard.

Hard to say whether it might work, but I suppose you could try giving the FSO pathnames that begin with the extended-length prefix "\\?\" and see what happens. I have my doubts, though.
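
If anyone feels like experimenting, the test would look something like this. Purely speculative: the FSO may simply reject the prefix, and a network share would need the "\\?\UNC\server\share" form instead:

Code:
    ' Speculative test -- hand the FSO an extended-length path and see what happens
    Dim objFSO, objFolder, strLongPath
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    strLongPath = "\\?\" & "D:\Some\Very\Deep\Tree"   ' hypothetical local path
    Set objFolder = objFSO.GetFolder(strLongPath)
    WScript.Echo objFolder.Path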
 
I should add the clarification that GetLongPathName isn't about the "long" names we're talking about here; it was just an example.

The actual functions exported by Kernel32.dll are GetLongPathNameA (ANSI) and GetLongPathNameW (Wide).
 
Well, it looks like the Len() trick worked. I came in this morning and the script had happily skipped those directories and then crashed a couple hours later on a different issue. I got that issue resolved, but these guys have serious problems with their filesystem. On the bright side, by the time I'm done it will be nice and clean.
 