-C, --cvs-exclude auto-ignore files in the same way CVS does
The -C option has nothing to do with compression. It excludes files from the transfer according to a built-in list of ignore patterns, the same way CVS decides which files to ignore.
I.T. where a world of chaos and confusion comes down to nothing but 1s and 0s.
Try accessing what you need through the Developer tab, Controls Panel, click 'Legacy Tools'
This may prove useful as well. Link
Best of luck and make it a great day.
Steven
Mark,
Here is what I did. I ended up dropping the code on the server and running it locally to avoid the conversion insanity.
#Setting of static variables
$PSServer = "mcaddnstdc01"
$DNSZone1 = "mclane.mclaneco.com"
$DNSZone2 = "mclaneco.com"
$hstname = "vcacdnstest"
$IP = "10.40.27.117"
#...
Thanks again Mark. PowerShell is not my normal coding playground and I am trying to get up to speed quickly in an environment that has both PS 3.0 and PS 4.0.
Mark,
Thanks!
The code should have been -
$command = "dnscmd $PS_Server /RecordDelete $myzone $myhost A 192.192.192.192 /f"
Invoke-Command -ComputerName $PS_Server -ScriptBlock $command
However, the issue is not resolved. The error now is:
Invoke-Command : Cannot bind parameter...
When I take the following and attempt to have parameters passed, I end up with File Path errors.
$command = { dnscmd PS_Server /RecordDelete myzone myhost A 192.192.192.192 /f }
Invoke-Command -ComputerName PS_Server $command
The change that fails and produces errors -
$command = { dnscmd...
D'OH!!! I have this figured out. I failed to run PowerShell as an admin.
I am new to PowerShell. I have been able to work with DNS entries using the following commands from the command line, but when I attempt to place them in a PowerShell script I get a File Path error.
Would someone enlighten me as to where I am going wrong? This is a small part of a larger project with...
Feherke,
Thank you. Single quotes, double quotes, and nested quoting are always fun to try to keep straight.
I went with double-quoting the variable and using single quotes, which are more standard for sed syntax, around the rest.
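A minimal sketch of that quoting (the file name, contents, and line number here are made up for the demo): the variable sits inside double quotes, and the rest of the sed script stays in single quotes:

```shell
# Demo file with a quoted string on each line.
printf 'a "x"\nb "y"\nc "z"\n' > demo.txt
# The line number to edit lives in a variable, so only
# "$ln" needs double quotes; the script body stays single-quoted.
ln=3
sed -i "$ln"'s/"[^"]*"/"mclane.com mclane.mclaneco.com"/' demo.txt
cat demo.txt
```

Only line 3 is touched; lines 1 and 2 keep their original quoted strings.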
Thank you once again.
SteveR77
This is one of those things that always seems to bite me.
I have the following in my script -
sed -i '284s/"[^"]*"/"mclane.com mclane.mclaneco.com"/g' <filename>
I need to configure the command so that the line number (284 above) can be read from a variable such as $ln, in case the file
gets modified and the line number that...
The issue here was that I forgot about ssh reading STDIN.
This resolved the issue.
while read line
do
ssh root@"$line" "lsof | egrep '/stale/fs|/export/backup' | awk '{print \$2;}' | sort -fu | wc -l" </dev/null
I am currently working on the logic for testing for stale NFS mounts in my environment.
However, I am at a loss as to why the loop in my first case condition only gives me the first value from the file being passed and then drops back to the menu options. I am aware that the stale...
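The stdin behaviour is easy to reproduce without ssh at all; any command in the loop body that also reads stdin will swallow the rest of the input file unless its stdin is redirected from /dev/null (the file name here is made up for the demo):

```shell
printf 'one\ntwo\nthree\n' > hosts.demo
count=0
while read line
do
    # Stand-in for ssh: without </dev/null this cat would consume
    # "two" and "three" and the loop would run only once.
    cat >/dev/null </dev/null
    count=$((count + 1))
done < hosts.demo
echo "$count"
```

With the redirection the loop runs three times; remove `</dev/null` and it runs once.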
PHV,
Thank you very much! I had been looking at this too long and forgot about the semicolon to group by the value I needed.
Thank you once again and make it a great day.
SteveR
I have the following code -
#!/bin/bash
#
while read line
do
awk -F":" -v frdt=$(date --date="$line" +%Y:%m:%d) '(substr($0,1,10)==frdt) { count++} { tbytes+=$12} END {print frdt ", " tbytes/1073741824 ", " count }' /usr/roc/om/server/adm/activity_log
done < DATES
I expected the...
The final solution:
The Command:
awk -F":" -v frdt=$(date --date="yesterday" +%Y:%m:%d) '(substr($0,1,10)>=frdt) { tbytes+=$12} END {print frdt ", " tbytes/1073741824 " GB" }' /usr/roc/om/server/adm/activity_log
Resulting Output:
2014:11:04, 28.8526 GB
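A self-contained check of that idiom against a fabricated three-line log (the real activity_log format is assumed to be colon-separated with the byte count in field 12 and a YYYY:MM:DD stamp in the first ten characters):

```shell
# Fake log: only the last two records fall on or after the cutoff date.
printf '%s\n' \
  '2014:11:03:a:b:c:d:e:f:g:h:1073741824' \
  '2014:11:04:a:b:c:d:e:f:g:h:2147483648' \
  '2014:11:04:a:b:c:d:e:f:g:h:1073741824' > activity_log.demo
frdt=2014:11:04
awk -F":" -v frdt="$frdt" '(substr($0,1,10)>=frdt) { tbytes+=$12 } END {print frdt ", " tbytes/1073741824 " GB" }' activity_log.demo
```

The two matching records sum to 3 GiB, so this prints `2014:11:04, 3 GB`.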
Apologies.
Here is the code - awk -F":" '($1 == 2014 && $2 == 11 && $3 == 03) { tbytes+=$12} END {print $1"-" $2"-" $3"," tbytes/1000000000 " GB" }' activity_log
2014-11-05,18.8843 GB
I am attempting to use awk to generate a single-line summary for a given date range. Currently the values are hard-coded, but the intent is to pass in formatted date values. Right now I am getting the last value in the file, and I am not seeing where the filtering is off in the following code...
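The behaviour described, where the last date in the file shows up regardless of the filter, is consistent with how awk's END block works: $1, $2, $3 there still hold the fields of the last record read, not of the last record that matched. A quick illustration:

```shell
# In END, the field variables refer to the final input record,
# so this prints the fields of "c:3" even though no filter ran.
printf 'a:1\nb:2\nc:3\n' | awk -F: 'END { print $1 "-" $2 }'
```

The fix, as in the final solution above, is to print a variable set from the filter (frdt) rather than the raw fields in END.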
If the Control-M processes are handled with crontab as the scheduler, you can set the times at which you wish the job sequence to fire.
Example
Run the following command at 4:00 a.m. and the next at 4:30 a.m., every day.
0 4 * * * /command
30 4 * * * /command
LKBrwnDBA,
Apologies, but I did indirectly provide the data file's field separator; it is part of the command line.
awk -F: '{ if($1 == 2014 && $2 == 9) if (index($15,"_")>0) print substr($15,length($15)-1); else print substr($15,1,2);print $2 $3 $1 $4 $5 $6 $8 $9 $10 $12 $13 $14 $15}'...