This is really odd...
<script type="text/javascript">
function checkCharacter(e) {
    // \w+ : one or more word characters
    var validString = /\w+/;
    if (!validString.test(e)) {
        alert('false');
    } else {
        alert('true');
    }
}
</script>
<form action="" onSubmit="checkCharacter(this.testChar.value);" method="post">
<input type="text" name="testChar">
<input name="submit" type="submit" value="submit">
</form>
So this should alert 'false' for any characters not in \w, meaning all non-alphanumeric characters. Normal strings work fine, but whenever I enter a string containing an underscore (_), it alerts 'false'. \w shouldn't do that, right?
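For comparison, here is a quick standalone sketch of what /\w+/.test() reports for a few sample strings, alongside the anchored /^\w+$/ which only matches when every character is a word character (the variable names and sample strings are purely illustrative):

<script type="text/javascript">
// Unanchored: matches if the string contains at least one word character anywhere.
var anyWordChar = /\w+/;
// Anchored: matches only if the entire string consists of word characters.
var allWordChars = /^\w+$/;

console.log(anyWordChar.test("under_score"));   // true  - underscore counts as \w
console.log(allWordChars.test("under_score"));  // true  - every character is a word character
console.log(anyWordChar.test("has space"));     // true  - "has" already satisfies \w+
console.log(allWordChars.test("has space"));    // false - the space is not a word character
console.log(anyWordChar.test("!!!"));           // false - no word characters at all
</script>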