
Another approach.


marsd

IS-IT--Management
Apr 25, 2001
2,218
US
How would you guys consider dealing with this?
Problem:
1. Open a random file in a directory.
2. Seek to a random point in the selected file.
3. Read from the file until a certain char is found or EOF.
4. If the char is found: read and print to stdout until the matching
char is found at the other end of the text block.

My code is below; I'm interested in a more concise
approach:
Code:
set dir [expr {$argc > 0 ? [lindex $argv 0] : "c:/fortune"}]
set debug 0

proc randFile {dir} {
    # Collect only the plain files in the directory.
    set files {}
    foreach name [glob -nocomplain $dir/*] {
        if {[file isfile $name]} {lappend files $name}
    }
    if {[llength $files] == 0} {error "no files found in $dir"}
    # Random index in 0 .. [llength $files]-1.
    return [lindex $files [expr {int(rand() * [llength $files])}]]
}


proc fortunePrint {filename {tm 55000}} {
    global debug dir
    set flag 0
    if {[catch {set fd [open $filename r]} err_open]} {
        puts "ERROR: $err_open"
        return -1
    }
    set sz [file size $filename]
    seek $fd [expr {int(rand() * $sz)}] start
    puts "\n\n\n#################FORTUNE##################\n"
    while {1} {
        set char [read $fd 1]
        if {[eof $fd]} {break}
        if {$debug > 0} {puts -nonewline stdout $char}
        if {$char == "%" && $flag < 1} {set flag 1; set char [read $fd 1]}
        if {$char == "%" && $flag > 0} {set flag 0; break}
        if {$flag > 0} {puts -nonewline stdout $char; flush stdout}
    }
    puts "\n###########################################"
    close $fd
    after $tm
    return [fortunePrint [randFile $dir] $tm]
}


fortunePrint [randFile $dir]

Yes, it is a barebones Windows fortune script. ;)
 
Code:
  set rc [catch {
    set fn [lindex $argv 0]
    set fp [open $fn]
    set lines [split [read $fp] \n]
    close $fp
  } msg]
  if {$rc} { puts stderr $msg; exit }
  set index [expr {int(rand() * [llength $lines])}]
  puts [lindex $lines $index]
(9 lines, not tested)

ulis
 
That's concise but it needs to be a little more
granular with error detection. It also needs to
find the start of a message block and read till
the end of the block. The message block is delimited
with "%" chars. I don't see an easy way of doing
this right now besides reading consecutive bytes
one at a time.

Thanks.
 
Getting message blocks is a matter of replacing \n by % in the split command.
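
A minimal sketch of that suggestion, assuming the fortune file uses `%` on a line by itself as the delimiter (as standard fortune databases do); `$fn` is the filename from earlier:

Code:
  # Read the whole file and split it into fortune blocks on "%".
  set fp [open $fn r]
  set raw [split [read $fp] %]
  close $fp

  # Keep only the non-empty blocks, then print one at random.
  set blocks {}
  foreach b $raw {
      if {[string length [string trim $b]] > 0} {lappend blocks $b}
  }
  puts [lindex $blocks [expr {int(rand() * [llength $blocks])}]]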

ulis
 
Back to the granularity of error detection.
What is needed is for the file to be read. Tcl's error messages are (in my opinion) sufficient to understand what happens.
Only the close is questionable: once the file is read, there is no need to close it before the end of the script.
Two remarks:
1/ closing a file opened for reading (quasi) never fails
2/ if the close does fail, the system itself is in trouble.
So I find this granularity optimal.

ulis
 
The catch statement encapsulates multiple instructions,
so it will only catch the first failure. Functional, but
not granular enough.
Say, for example, I wanted to handle errors as they happen
using a write trace? More flexibility is needed.

A fortune db file can be very large. I am
not conversant enough with read to know how large
files are copied (mmapped, dynamically, on demand), but
the memory demands of splitting and holding a huge
file all at once are not really what I wanted.. thus the
roundabout method of emulating fgetc().
I would use C, but I find the C API for Windows
exceedingly obfuscated, requiring more attention
to a closed paradigm than I have time for.
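
If memory is the concern, one middle ground (a sketch, not tested against a real fortune db) is to keep the char-level delimiter scan but read the file in fixed-size chunks, so I/O happens 4 KB at a time while memory stays bounded by one chunk plus the current block:

Code:
  # Scan a file in 4 KB chunks, collecting the text between "%" delimiters.
  proc scanBlocks {filename} {
      set fd [open $filename r]
      set inblock 0
      set current ""
      set blocks {}
      while {1} {
          set chunk [read $fd 4096]
          if {[string length $chunk] == 0} {break}
          foreach char [split $chunk ""] {
              if {$char == "%"} {
                  # Closing delimiter: save the block. Either way, toggle state.
                  if {$inblock} {lappend blocks $current; set current ""}
                  set inblock [expr {!$inblock}]
              } elseif {$inblock} {
                  append current $char
              }
          }
      }
      close $fd
      return $blocks
  }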

Thanks for your feedback!
 
Oh..about the close..The proc is recursive
and opens a potentially different file each time.
If I don't close, I'll run out of file descriptors
eventually :(.
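
One way to sidestep that (a hypothetical rework, assuming fortunePrint is changed to return after printing one fortune instead of recursing): drive the repetition with a plain loop, so each file is opened, printed, and closed per iteration and the Tcl call stack never grows.

Code:
  # Print one fortune, sleep, repeat -- no recursion, one open fd at a time.
  # Assumes fortunePrint no longer calls itself at the end.
  while {1} {
      fortunePrint [randFile $dir]
      after 55000
  }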

 