Hi everyone,
I am writing a script using Perl/Tk whose goal is to register information and, from that information, generate switch configs.
This script has a couple of parts:
The switch part. In this part you can add, find, and change switch information. I have a file which contains the following information:
switchname, ip, subnet, gateway, and some more options. The file now contains about 2400 rows. I expect it to grow to about 10000 at most.
Like I said, there are three things that can be done with this file (add, change, and find). To describe my problem I will only cover adding a switch here.
This means I have a dialog where someone can enter the switch name, the IP information, and the rest. To check whether the entered information already exists, or to retrieve the needed information, I have three options:
[ol]
[li]Do a grep on the file. If the grep returns true, the entry exists; if not, add the information to the file.[/li]
[li]Read the whole file into an array. Iterate through the array; if the entry exists, do nothing; if not, add it to the file and clear the array.[/li]
[li]Read the whole file into a hash. Check if the entry exists; if not, add the information to the hash.[/li]
[/ol]
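For what it's worth, option 3 can be sketched roughly like this. This is only a minimal sketch: the switch names, addresses, and the assumption that the switch name is the first comma-separated field are mine, not taken from your actual file format.

```perl
use strict;
use warnings;

# Sample records inline so the sketch is self-contained; in the real
# script you would read these lines from your switch file instead.
my @lines = (
    'sw-core-01,10.0.0.1,255.255.255.0,10.0.0.254,vlan10',
    'sw-edge-02,10.0.1.1,255.255.255.0,10.0.1.254,vlan20',
);

# Build the hash once, keyed by switch name (assumed to be field 1).
my %switches;
for my $line (@lines) {
    my ($name, $ip, $subnet, $gateway, @rest) = split /\s*,\s*/, $line;
    $switches{$name} = {
        ip      => $ip,
        subnet  => $subnet,
        gateway => $gateway,
        rest    => \@rest,
    };
}

# The existence check is then a single hash lookup -- effectively
# instant, even at 10000 rows.
print "sw-core-01 already exists\n" if exists $switches{'sw-core-01'};
```

The point is that you pay the cost of reading the file once at startup, and every later lookup is a constant-time `exists` instead of a scan.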
My questions are:
Which of the three options is best? My concerns with each:
[ol]
[li]I would need to grep the file every time, and as the file grows, the grep will take longer. How long does a grep over a 10000-line file take?[/li]
[li]This does not seem doable and, I believe, puts a big strain on the script. Every time I want to add or query, I have to open the file, read it into an array, and iterate through it.[/li]
[li]I would put everything in a hash and have the data ready all the time, but when something is added I still have to open the file and append to it AND add to the hash. Is it smart to keep such a big file in memory all the time?[/li]
[/ol]
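On the memory question in point 3: 10000 short CSV rows is well under a megabyte, which is negligible for an in-memory hash. Adding a switch then means updating the hash AND appending one line to the file, so disk and memory stay in sync without ever re-reading the whole file. A minimal sketch of such an add routine, assuming the same hypothetical field layout as above (it writes to a temp file here only so the sketch is self-contained):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Temp file stands in for the real switch file in this sketch.
my ($tmp_fh, $file) = tempfile();
close $tmp_fh;

my %switches;    # assume this was loaded from the file at startup

# Add a switch: refuse duplicates, otherwise update the in-memory
# hash and append a single line to the file.
sub add_switch {
    my ($name, $ip, $subnet, $gateway) = @_;
    return 0 if exists $switches{$name};    # already registered
    $switches{$name} = {
        ip      => $ip,
        subnet  => $subnet,
        gateway => $gateway,
    };
    open my $fh, '>>', $file or die "Cannot append to $file: $!";
    print {$fh} join(',', $name, $ip, $subnet, $gateway), "\n";
    close $fh;
    return 1;
}

print add_switch('sw-new-99', '10.0.2.1', '255.255.255.0', '10.0.2.254')
    ? "added\n" : "duplicate\n";
```

Appending with `>>` is cheap regardless of file size, so the only full read of the file happens once when the script starts.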
This was a long post. If none of the options above are any good, I will have to rewrite my script to split the file up based on criteria predefined by the user. I do not have the ability to use MySQL or DBI.
Thanks in advance.
InDenial