Having problems creating a function to convert a Julian date to a date format of YYYYMMDD. Not sure if I'm doing this right.
I'm getting an error:
./j2g[3]: 0403-057 Syntax error at line 6 : `jd=$1' is not expected.
#!/bin/ksh

function juliandate_to_gregdate{

...
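For reference, a minimal sketch of what I was aiming for: ksh wants whitespace before the opening brace (that's what triggers the 0403-057 error), and the month-table conversion below assumes the input is a YYYYDDD ordinal date, which may not match everyone's "Julian" format:

#!/bin/ksh
# Sketch: convert a YYYYDDD ("Julian") date to YYYYMMDD
function juliandate_to_gregdate {   # note the space before the brace
    typeset jd=$1
    typeset year=${jd%???} ddd=${jd#????}
    set -A mlen 31 28 31 30 31 30 31 31 30 31 30 31
    if (( year % 4 == 0 && ( year % 100 != 0 || year % 400 == 0 ) )); then
        mlen[1]=29                  # leap year: February has 29 days
    fi
    typeset month=0 day=$(( 10#$ddd ))
    while (( day > ${mlen[month]} )); do
        (( day -= ${mlen[month]} ))
        (( month += 1 ))
    done
    printf "%04d%02d%02d\n" "$year" $(( month + 1 )) "$day"
}
juliandate_to_gregdate 2024060      # prints 20240229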
Convert ASCII to EBCDIC - is this possible with a shell script?
See, I have to develop a script that converts a file from ASCII to EBCDIC, do more operations on the file, then send it to the Host. But I'm not sure how to convert the file to EBCDIC.
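For what it's worth, the stock dd utility can do the table conversion; the file names here are made up:

dd if=file.ascii of=file.ebcdic conv=ebcdic    # ASCII -> EBCDIC
dd if=file.ebcdic of=file.ascii conv=ascii     # and back again
(conv=ibm selects the alternate ASCII-to-EBCDIC table.)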
Good job guys, but when I use the input file below it does not work. It only gives me "DB: I34Q23SB - Node: I34Q23SB".
Database 1 entry:
Database alias = I34Q23SB
Database name = I34Q23SB
Node name = QIST23SB
Database release level...
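A sketch of one way the two-line lookup could go: remember the database name when it matches, then only print once the Node line of the same stanza arrives. The path is the one from my earlier post; the rest is an assumption:

# Sketch: pair each "Database name" line with the "Node name"
# line that follows it in the same stanza.
print ("Enter the search pattern:\n");
$pattern = <STDIN>;
chop ($pattern);
open(DB_list, "/home/GetNode/listdb.txt") or die "Can't open: $!\n";
while (<DB_list>) {
    if (/\bDatabase name\s+=\s+(\Q$pattern\E)\b/) {
        $db = $1;                       # remember the matched database
    }
    elsif (defined $db && /\bNode name\s+=\s+(\S+)/) {
        print "DB: $db - Node: $1\n";   # $1 is now the node, not the DB
        undef $db;                      # reset for the next stanza
    }
}
close(DB_list);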
Thanks rharsh,
it still does not read the second line with the node name. The output is DB: IBSQ35 - Node: IBSQ35. It should be
DB: IBSQ35 - Node: QIBS34.
I tried changing the print "Node: $1\n"; to print "Node: $4\n"; and it still gives me the pattern "IBSQ35".
By the way, what is the debug...
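A sketch of how the capture variables behave, since this is where the confusion is: $1..$9 come from capturing parentheses in the last successful match, not from field positions, so $4 finds nothing unless the regex has four capture groups:

my $line = "Node name = QIBS34";
if ($line =~ /\bNode name\s+=\s+(\S+)/) {
    print "Node: $1\n";   # prints "Node: QIBS34" - $1 is the (\S+) capture
}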
OK, I tried it, but got an error, see below.
Here is my full code:
print ("Enter the search pattern:\n");
$pattern = <STDIN>;
chop ($pattern);
open(DB_list, "/home/GetNode/listdb.txt");
print ("Matches found:\n");
while (<DB_list>)
{
if (substr($_, "Database name") != -1)
{ #Database...
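For reference, index() is the call that returns -1 when a substring is absent; substr() extracts by offset and never returns -1, which is what broke the original test. A quick sketch:

print index("Database name = IBSQ35", "Database name"), "\n";   # 0  (found at start)
print index("Node name = QIBS34", "Database name"), "\n";       # -1 (not found)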
PaulTEG, this is what I have so far. I know that I'm not going the right way; I'm too new at this. But here is what I have.
input file -
Database name = IBSQ35
Node name = QIBS34
while (<DB_list>)
{
if (/\bDatabase name\s+=\s+$pattern\b/)
{
print...
Hello, I need help with whitespace. I'm trying to search for the input listed below, called Database name. I'm very new to all this. Also, any clues on how to search for the first line and then get the Node name, called QIBS34?
code -
if (/\bDatabase name\s=\s$DB_Name/)
input -
Database name...
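A sketch of the whitespace fix: \s matches exactly one whitespace character, and the listing pads the = sign with many spaces, so \s+ (one or more) is the safer form. The sample line and variable value here are assumptions:

my $DB_Name = "IBSQ35";
my $line = "Database name                     = IBSQ35";
if ($line =~ /\bDatabase name\s+=\s+\Q$DB_Name\E\b/) {
    print "matched\n";    # \s+ absorbs however many spaces pad the =
}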
Hi, I'm getting this error:
Can't modify constant item in scalar assignment at ./pattern_search1.1 line 12, near """)"
Here is my code:
print ("Enter the search pattern:\n");
$pattern = <STDIN>;
chop ($pattern);
open(FILENAME, "/home/x1207/GetNode/listdb.txt") or die "Can't open: $!\n";
print ("Matches found:\n")...
Thanks duncdue, but what does this line do?
$data =~ m|$request[^\d]+(\d+)|i;
I know that $data will be searched, and that it looks for $request, but the next part I do not know, and why if ($1)?
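A sketch of what that line does, with a made-up inventory line: m|...|i is a case-insensitive match, [^\d]+ skips over the non-digit characters after the request word, and (\d+) captures the first run of digits into $1, which is why the next line tests if ($1):

my $request = "Cars";
my $data    = "Cars available: 17";
if ($data =~ m|$request[^\d]+(\d+)|i) {
    print "count = $1\n";    # prints "count = 17"
}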
The only code I have is to search for "Cars". I'm not even sure if this will work... I just started writing Perl 2 days ago.
$record = <STDIN>;
chop ($record);
open (INPUT, "inventorylist.txt") or die "Nope!! $!\n";
while ($record = <INPUT>) {    # <INPUT> reads a line; a bare INPUT does not
print $record;
if ($record =~ /Cars/) {...
Hello,
What I need help with is searching an input file for specific text given by the user (this part is OK with me). The next step is what I'm having problems with: the user only knows a certain input, we'll call it Cars. What the user needs is how many cars are available.
user input -...
Where do you think the problem is coming from?
I get an error when running nawk -v find_db="$db" -f find_db.awk IQSB;
Syntax Error The source line is 3.
The error context is
>>> dblist=/home/zzzzz/GetNode/listdb1. <<< txt
awk: 0602-500 Quitting The source line is 3.
#!/bin/ksh...
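My guess, sketched below: inside an awk program a file name must be a quoted string, and the error context shows the path sitting bare on line 3, so awk tries to parse the path itself as syntax. What find_db.awk might look like with the quoting fixed (everything but the quoted assignment is an assumption):

BEGIN {
    dblist = "/home/zzzzz/GetNode/listdb1.txt"   # quoted string, not bare
    while ((getline line < dblist) > 0)
        if (line ~ find_db)                      # find_db comes in via -v
            print line
    close(dblist)
}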
Does anyone know how to search for a line and then print out the next 10 or 20 lines?
Say I have an input file called A. In the input file I have the numbers 1 - 100, each number on a new line. What I want to do is search for a number, say 20, then print out 20, 21, 22..., up to 30 or 40.
Is this possible with awk?
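Yes - a sketch of one common awk idiom, using the file name A from the question and a window of 10 lines after the match as example values:

awk '$1 == 20 { found = NR }              # remember where 20 matched
     found && NR <= found + 10 { print }  # print it and the next 10 lines
    ' A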
Thanks,
but I have no clue how to use nawk. I'm just a newbie with awk. How do I use my three input files with the script you wrote, and how do I run it?
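For anyone else searching: a nawk script is normally run with -f and the input files listed after it; the script and file names below are placeholders:

nawk -f merge.awk file1 file2 file3
# inside the script, FILENAME names the current input file and
# FNR restarts at 1 for each file, so FNR == NR is true only in file1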
Hello, I have developed an awk script to look into three files.
Here is the problem: it works for 2 medium and 1 small input files, but if I use 3 medium files nothing comes out - the output is blank. Do you have any clues why?
See the script at the bottom.
How the script works from the command...