
Using ftell and fseek

Status
Not open for further replies.

pjw001
Technical User
Aug 12, 2002
Hi,

I am having trouble reading a file on DOS. The program worked fine until we were presented with files whose records are terminated with (LF)(CR)(LF) instead of the usual (CR)(LF). I am trying to amend the program to cope with this.

The program loops over the file as follows:

1. ftell to get the file position.

2. fgetc until '\n' (LF) is found.

3. fseek to get back to the start of the record.

4. fgets to read the record for processing.

5. Start the loop again.
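As a sketch, the measure/seek-back scheme described above might look something like this (error handling simplified; the stream is assumed to be opened in text mode, as in the original program, and read_record is a hypothetical helper name, not the poster's actual code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read the next record using the measure/seek-back scheme:
   note the start position, count characters up to '\n', seek back,
   then fgets into a buffer sized to fit.
   Returns a malloc'd string (caller frees) or NULL at EOF/error. */
char *read_record(FILE *fp)
{
    long start = ftell(fp);          /* 1: remember start of record */
    if (start == -1L)
        return NULL;

    long len = 0;
    int c;
    while ((c = fgetc(fp)) != EOF && c != '\n')   /* 2: scan to LF */
        len++;
    if (c == EOF && len == 0)
        return NULL;                 /* no more records */

    char *buf = malloc((size_t)len + 2);  /* room for '\n' and '\0' */
    if (buf == NULL)
        return NULL;

    if (fseek(fp, start, SEEK_SET) != 0) {   /* 3: back to record start */
        free(buf);
        return NULL;
    }
    if (fgets(buf, (int)len + 2, fp) == NULL) {  /* 4: read the record */
        free(buf);
        return NULL;
    }
    return buf;                      /* 5: caller loops again */
}
```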

This works fine if the records are terminated with (CR)(LF), but when terminated with (LF)(CR)(LF) the program starts to go backwards in the file for a few iterations and then just stays in the same place. (This is based on printing out what is returned by various calls to ftell.)

Am I doing something fundamentally wrong here or do I just need to debug my code?

One possible way I can see round this is to manipulate the position returned by ftell to point to where it should be. Is this feasible or just totally mad? (It is opened as a text file.)

Any advice would be most welcome.

Thanks.
 
Why can't you use fgets throughout instead of running up and down with ftell and fseek?

fgets will read until \n or the buffer size is reached. Just get it into the buffer and strip off a \r if one exists.
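That stripping step might look like this (a sketch; it trims any run of trailing \r and \n characters so that records ending in (CR)(LF), (LF), or (LF)(CR)(LF) all come out the same):

```c
#include <string.h>

/* Trim trailing '\n' and '\r' characters from a line read by fgets. */
void strip_line_ending(char *line)
{
    size_t n = strlen(line);
    while (n > 0 && (line[n - 1] == '\n' || line[n - 1] == '\r'))
        line[--n] = '\0';
}
```

Note that with (LF)(CR)(LF) terminators, fgets stops at the first LF, so the leftover (CR)(LF) shows up as a spurious blank line on the next call; the caller would also want to skip lines that are empty after stripping.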
 
Thanks for the reply.

The reason, rightly or wrongly, is that it uses fgetc until it finds \n so that it knows how big the record is and can allocate an appropriate amount of memory for the fgets.

The routine has no knowledge of what the file might contain. All it knows is that it is a stream of bytes.

I have come round to thinking that perhaps I should just allocate a large amount of memory (1,000 characters perhaps) and throw an error if the fgets reaches this limit.

Given that the routine has worked perfectly for 17 years, common sense tells me to make any amendments with care. The fact that it does something in a seemingly odd way doesn't mean it is wrong; it could just mean that I don't understand it (a likely possibility where C language programs are concerned).

Having said all that, I return to my original question of "Am I doing something fundamentally wrong here or do I just need to debug my code?".



 
Nothing fundamentally wrong. I'd go with the allocation method - far easier and a lot faster than multiple fgetc calls and seeking backwards. Memory is cheap, so why not just allocate 4096? (On manipulating the ftell value: for a stream opened in text mode, the C standard only guarantees that the value returned by ftell can be passed back to fseek unchanged; doing arithmetic on it is not reliable, which would explain the odd positions you were seeing.)
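A fixed-buffer version of the loop might look like this (a sketch; the 4096 limit, the function name, and the error handling are illustrative, not the poster's actual code):

```c
#include <stdio.h>
#include <string.h>

#define MAX_RECORD 4096

/* Read records with a single fixed buffer instead of measuring first.
   Returns 0 on success, -1 if any record exceeds the buffer. */
int read_records_fixed(FILE *fp)
{
    char buf[MAX_RECORD];

    while (fgets(buf, sizeof buf, fp) != NULL) {
        size_t n = strlen(buf);

        /* fgets filled the buffer without reaching a newline:
           the record is longer than we allowed for. */
        if (n == sizeof buf - 1 && buf[n - 1] != '\n')
            return -1;

        /* strip trailing newline / carriage-return characters */
        while (n > 0 && (buf[n - 1] == '\n' || buf[n - 1] == '\r'))
            buf[--n] = '\0';

        if (n == 0)
            continue;   /* leftover (CR) from an (LF)(CR)(LF) terminator */

        /* process buf here */
    }
    return 0;
}
```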
 
That is what I have done. After spending about two days trying to figure out what was going on it has taken me less than a day to rewrite it without the fgetc.
 
That is normally the case. Sometimes, it takes longer to get an old piece of code to work than to completely rewrite it.

I've also spent 3 days battling like you did, 5 days writing the proposal and convincing the bosses that it needed to be done, and 1 day coding it. The most difficult part is convincing the bosses, because they will say "It has been working for the last 20 years, so why do you need to change it?". You can cause offence by saying "It is badly written" or "The author used an unmaintainable solution", so you have to come up with all sorts of other reasons why it needs rewriting. It is far easier when there is no traceability, or if you are the sole person in charge, answerable to no one for your changes.
 