
Find duplicated words in a file 2

Status
Not open for further replies.

terrassa5

IS-IT--Management
Feb 22, 2005
40
GB
I'm new to scripting, and I need help creating a script to find users connected two or more times on a system.

I have a file that looks like this:

L2000 bx/bw mnt.545 L2000 22-02-2005 08:09
L2000 bx/bw nav.3675 L2000 22-02-2005 08:42
L2000 bx/bw ort.2232 L2000 22-02-2005 08:30
L2000 bx/bw osc.15224 L2000 22-02-2005 11:09
L2000 bx/bw pei.18965 L2000 22-02-2005 11:59
L2000 bx/bw pei.29141 L2000 22-02-2005 08:00
L2000 bx/bw per.29718 L2000 22-02-2005 08:04
L2000 bx/bw pos.29759 L2000 22-02-2005 08:05

The third column is the userID and process number. Columns 1 and 2 are always the same, and the file is sorted by the third column. I need a script that tells me which users appear more than once in the file; in this case, user 'pei' appears twice.

Could you help me with it?

Thank you very much.
 
You may try something like this:
awk '
{split($3,a,".");++u[a[1]]}
END{for(i in u)if(u[i]>1)print i,u[i]}
' /path/to/input

Hope This Helps, PH.
Want to get great answers to your Tek-Tips questions? Have a look at FAQ219-2884 or FAQ222-2244
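If you prefer standard shell tools over a full awk program, an equivalent pipeline is sketched below. It assumes the same input path (/path/to/input) as above; the userID is taken as the text before the dot in column 3, and uniq -d prints only IDs that occur more than once (the sort is technically redundant since the file is already ordered by column 3, but it keeps the pipeline safe for unsorted input):

```shell
# Print each userID (column 3, text before the dot), then
# sort and keep only the IDs that appear more than once.
awk '{split($3, a, "."); print a[1]}' /path/to/input | sort | uniq -d
```

On the sample file above, this prints just "pei". Note it reports only the duplicated IDs, not their counts; the awk version above also shows how many times each duplicated user appears.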
 
