Tek-Tips is the largest IT community on the Internet today!


Search results for query: *

  1. nanohurtz

    improved parsing out to .htm

This is the sample input string again: alias tq_test_line="/usr/cmvc/cmtools/bin/cm_move.sh -e cmtest/appsmore/27/hpux/text/cmexports -t /cmtest/appsmore/27/hpux -s /prod1/uat_stage/clmstux/ap_clms"
  2. nanohurtz

    improved parsing out to .htm

Thanks justice, very helpful. I was able to fix the $varsource and $vartarget issue using $varalias =~ s/=.*//g; $vartarget =~ s/"//g;. The carriage returns, though, still elude me. Here's a sample input string: alias tq_test_line="/usr/cmvc/cmtools/bin/cm_move.sh -e...
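Since the carriage returns are still the open question, here is a minimal sketch of stripping them; the shortened $line below is a stand-in for the real input, and the substitution assumes the flat file is CRLF-terminated (DOS/Windows line endings):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical one-record stand-in for a line read from the flat file.
my $line = "alias tq_test_line=\"/usr/cmvc/cmtools/bin/cm_move.sh\"\r\n";

$line =~ s/\r?\n$//;    # remove the newline and any carriage return before it
$line =~ s/\r//g;       # belt-and-braces: drop any stray CRs mid-line

print "[$line]\n";      # brackets make a leftover \r easy to spot
```

With the \r gone, the extracted fields no longer carry an invisible carriage return into the .htm output.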
  3. nanohurtz

    improved parsing out to .htm

I've written a script that will read an 8-part string that starts with the word -alias-. It extracts the 2nd, 6th, and 8th variables and prints them out to .htm with a count in table format. Example: http://www.graffece.com/dev/prevplanout.htm I have 3 problems. One: I would like to further isolate...
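The extraction described above can be sketched as follows; the field positions (2, 6, 8) and the cleanup substitutions follow the sample string and the fixes quoted elsewhere in the thread, while the single HTML table row printed here is an assumption about the .htm output format:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The sample 8-part input string quoted in the thread.
my $line = 'alias tq_test_line="/usr/cmvc/cmtools/bin/cm_move.sh -e cmtest/appsmore/27/hpux/text/cmexports -t /cmtest/appsmore/27/hpux -s /prod1/uat_stage/clmstux/ap_clms"';

# Split on whitespace and keep the 2nd, 6th, and 8th fields.
my @f = split /\s+/, $line;
my ($varalias, $vartarget, $varsource) = @f[1, 5, 7];

$varalias  =~ s/=.*//;     # keep only the alias name before '='
$vartarget =~ s/"//g;      # drop any stray double quotes
$varsource =~ s/"//g;

# One table row per alias line; the surrounding <table> markup is assumed.
print "<tr><td>$varalias</td><td>$vartarget</td><td>$varsource</td></tr>\n";
```

For the full job, this would run inside a while loop over the flat file, accumulating a count per alias before the rows are written out.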
  4. nanohurtz

    Exporting Recordsets to .xls

I'm wondering if an array would work. The problem is that it's limited in the number of recordsets it can hold
  5. nanohurtz

    Booting Users from Access

Forgot to mention: this is Access 97. -SubCon
  6. nanohurtz

    Exporting Recordsets to .xls

Forgot to mention: this is Access 97
  7. nanohurtz

    Booting Users from Access

Is there code that gracefully kills the .ldb instance that appears when a user is in Access, or in other words, kicks the user out? I have a script that will resync, lock down, and compress an .mdb. The only problem is that users will forget they have the app open when it's time to do maintenance and...
  8. nanohurtz

    Exporting Recordsets to .xls

Hi, I am trying to write a subroutine/SQL statement that will group data by zipcode and export each recordset into individual spreadsheets based on those groupings (i.e. 10472.xls, 10888.xls...). I've written the SQL to group all zipcodes and output to a recordset (rst). The second query uses...
  9. nanohurtz

Getting Rid of Redundant Data

Thanks Barbie. Can I purge the hash for every instance of /SCHEDULE/END? I want the @rray to handle duplicates within each schedule. It's OK if a duplicate job turns up in another schedule. How do I refine this?
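A sketch of the per-schedule purge being asked about: clearing the %seen hash at every /SCHEDULE/END marker confines duplicate suppression to a single schedule, so the same job can reappear in a later schedule. The %seen/@out names and the __DATA__ records are illustrative, not the thread's actual code:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %seen;    # jobs already kept in the current schedule
my @out;     # deduplicated output lines

while (my $line = <DATA>) {
    chomp $line;
    if ($line eq '/SCHEDULE/END') {
        %seen = ();    # purge: start fresh for the next schedule
        next;
    }
    # keep the line only on its first appearance within this schedule
    push @out, $line unless $seen{$line}++;
}

print "$_\n" for @out;

__DATA__
jobA
jobB
jobA
/SCHEDULE/END
jobA
jobC
```

Note that jobA survives in both schedules, while its repeat inside the first schedule is dropped.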
  10. nanohurtz

    Few lines of Code Needed

    Cool, I got it. I will definitely need to warm up to the power of arrays. Thanks again
  11. nanohurtz

Getting Rid of Redundant Data

    I meant to say..'getting rid of duplicate records it may dynamically create from reading a flat file'
  12. nanohurtz

    Few lines of Code Needed

Wow, looks great Neil. How do I incorporate the I/O (reading and writing to and from files) portion into your newly modified code?
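One way to bolt the file I/O on is to wrap the processing in a small read/transform/write routine. This is a generic sketch only: the filenames are parameters, and the uc() call is a placeholder standing in for the actual processing from the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read $infile line by line, transform each line, write to $outfile.
sub process_file {
    my ($infile, $outfile) = @_;
    open my $in,  '<', $infile  or die "Cannot open $infile for read: $!\n";
    open my $out, '>', $outfile or die "Cannot open $outfile for write: $!\n";
    while (my $line = <$in>) {
        chomp $line;
        # swap uc() for the real per-line processing
        print {$out} uc($line), "\n";
    }
    close $in;
    close $out;
}
```

Called as process_file('input.txt', 'output.txt'), it keeps the I/O plumbing in one place so the transform can change without touching the open/close logic.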
  13. nanohurtz

Getting Rid of Redundant Data

I need to modify this code to eliminate any duplicate 'words' it picks up from a flat file. THE CODE: #!/usr/bin/perl -w open(IN, "< scdreader.txt") or die "Cannot open file for read\n"; open(OUT, "> scdreaderout.txt") or die "Cannot open file for...
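The usual idiom for dropping duplicates is a %seen hash: the first time a word turns up it is recorded and printed, and every later occurrence is skipped. In this sketch, reading from __DATA__ stands in for the thread's scdreader.txt/scdreaderout.txt file handles, and splitting on whitespace is an assumption about what counts as a 'word':

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %seen;    # counts how many times each word has appeared

while (my $line = <DATA>) {
    for my $word (split ' ', $line) {
        # print the word only on its first appearance; $seen{$word}++
        # returns the old count, which is 0 (false) the first time
        print "$word\n" unless $seen{$word}++;
    }
}

__DATA__
jobA jobB jobA
jobB jobC
```

In the real script, the print would go to the OUT handle instead of STDOUT; the dedup logic is unchanged.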
  14. nanohurtz

    Few lines of Code Needed

Well, since no one helped with the pattern search part of my request today, I managed to muster up the rather lengthy script on my own. It was successfully tested using optiPerl3 (my favorite), flawless. One problem, though: it's too long. Can someone tighten this code up by a few lines and still...
  15. nanohurtz

    Advanced Pattern Searching Snippet

Hi, I'm looking for a .pl code snippet that can read a .txt flat file of 12,000 jobs like the ones included below (FLATFILE INPUT) and output a 6-column .xls as also included. Please note that some of the lines defy the $_=~/m^(.*)#(.*) or (SERVER)#(JOB) pattern because the flat file was...
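A sketch of the SERVER#JOB split: lines matching the pattern break at the first '#', and lines that defy it (as the post warns) fall through to a catch-all branch so nothing is silently dropped. Tab-separated text is one easy route into .xls; the __DATA__ records are made-up examples, not the real flat file:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @rows;    # tab-separated output rows, one per input line

while (my $line = <DATA>) {
    chomp $line;
    if ($line =~ /^([^#]+)#(.+)$/) {
        # SERVER#JOB: split into two columns at the first '#'
        push @rows, "$1\t$2";
    }
    else {
        # line defies the pattern: empty server column, keep the text
        push @rows, "\t$line";
    }
}

print "$_\n" for @rows;

__DATA__
hpux27#nightly_backup
hpux28#log_rotate
orphan_job_no_server
```

Opening the tab-separated output in Excel gives the columns directly; the remaining four of the six columns would come from whatever other fields the real flat file carries.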
