
Handwriting recognition in VFP


markftwain

Technical User
Hi experts,

I am moving my VFP 9 input from keyboard to handwriting under Windows 7 with a Wacom pen/pad.

The Microsoft InkEd.InkEdit.1 ActiveX control does a very good job, but seems to lack a RightClick event.
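For reference, a minimal sketch of how I am hosting the control (assuming the Ink redistributable is installed so the ProgID is registered; the sizes are arbitrary):

* Host the InkEdit ActiveX control on a VFP form at runtime
oForm = CREATEOBJECT("Form")
oForm.AddObject("oInk", "OleControl", "InkEd.InkEdit.1")
WITH oForm.oInk
   .Visible = .T.
   .Left = 10
   .Top = 10
   .Width = 300
   .Height = 100
ENDWITH
oForm.Show(1)
* While the form is open, recognized text is exposed
* through the control's Text property:
* ? oForm.oInk.Text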

Are there better options available for embedded handwriting recognition in VFP? (Or how can I add a RightClick event to the ActiveX control?) Should I use an InkOverlay on top of an editbox?

I am extremely new to handwriting recognition, so any help is much appreciated.
 
Have you tried a Google search for: handwriting vfp ?

I just ran the search and came up with a number of 'hits', some of which look possibly useful.

Good Luck,
JRB-Bldr
 
Yes.

The best came from Microsoft a number of years ago. None specifically addressed RightClick with InkEdit or InkOverlay.

Thanks,
Mark
 
I haven't worked with this control, but does it have anything similar to MouseDown and MouseUp events? If so, it probably has a way to determine which mouse button was pressed/released.

Craig Berntson
MCSD, Visual C# MVP
 
Yes.

MouseDown will detect a right-click with a button value of 2.
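For anyone searching later, the event code looks something like this (a sketch; the parameter list is what the control passes to the VFP MouseDown event, and a value of 2 means the right button):

* MouseDown event of the InkEdit OLE control
LPARAMETERS nButton, nShift, nXCoord, nYCoord
IF nButton = 2
   * Right button (or stylus barrel button): show a shortcut menu, etc.
   WAIT WINDOW "Right-click detected" NOWAIT
ENDIF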

Thanks, Mark
 
I think right-click support is quite useless for a touchscreen application. As long as you don't have a pen with a right-click button, there is no such thing as detecting a right-hand vs. left-hand touch ;).

Right-click in a touchscreen application is something I would avoid.

Just a side note:

The normal controls like textbox, editbox, etc. automatically offer handwriting recognition or a virtual keyboard on tablet systems with XP Tablet PC Edition or higher. You just set focus to the textbox and Windows displays a symbol beneath it that you can click (left-click) to activate handwriting recognition. You don't need to program anything to get this feature.

The only reason to invoke handwriting recognition yourself is when you want to activate it by other means. I would say: teach your users to get used to the normal way Windows offers it automatically.

To your own application, it's just as if the input came from the keyboard.

Bye, Olaf.
 
<pedant>
Touch screen applications are not the same as tablet PC applications.

One of the critical differences is that tablet PC (in Microsoft's definition) requires an electromagnetic stylus. They are NOT touch-screen unless the manufacturer has gone above and beyond the spec.

(The post-PC tablet, as defined by Apple, is still emerging, so who knows where that will end up?)

Touch screen apps, which we've been doing since at least Fox 2.x, have their own unique requirements but handwriting recognition isn't one of them. Handwriting is unique to tablet PC and, as Olaf says, is built-in.
</pedant>
 
I don't know if this will help, but I developed a handwriting application several years ago. This was an application where the users used a tablet PC with a stylus to make notes while they were on site visits.

There was definitely no question of a right-click event, or any similar events, for the control that received the handwriting. You just wrote into the control, and that was all. The software converted your scrawl to text (often with comical results), but the user had no control over anything other than the stylus.

My client ran the application as a trial for several weeks. They then abandoned it. They decided the results were so poor as to be unusable.

We agreed that handwriting-based applications were fine where the user just ticked boxes or entered simple strings of letters or digits (in an inventory check, for instance), but not for anything remotely more sophisticated.

I should add that that was several years ago. It's possible that the technology has improved since then. I don't know.

Mike

__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro articles, tips, training, consultancy
 
I don't know about the details; actually, the Wikipedia article about tablet PCs talks of a touchscreen.

And I think it's the inverse: resistive screens, which recognise mechanical stress (i.e. touch), are the less sophisticated technology compared with capacitive screens. I may be wrong there, as I'm not deep into hardware or gadgets myself.

Nevertheless, I have a netbook which is a tablet convertible running Win7, and it works with a normal VFP control without any extra programming. You can choose handwriting recognition or an on-screen keyboard and switch between these options. You activate them from any Windows control for keyboard entry that shows the Windows text cursor, including VFP controls.

No matter whether you call it a touchscreen or not, if a stylus is involved, handwriting recognition is built into Windows.

Bye, Olaf.
 
Mike,

I can report that handwriting recognition is very good on Windows 7, even without the training you need for speech recognition. It's not good for writing long texts or source code. It could also be faster, but then you can't write that fast with a stylus anyway, and you don't have to wait for the recognition to start: you write on, and the text is even amended as you write longer sentences.

Bye, Olaf.
 
Based on my smartphone experience, is tap-and-hold on a touchscreen the equivalent of a right-click? Or is there a different event for tap-and-hold?

Tamar
 
It depends. <g>

Try it on three different laptop makers' touchpads. Sometimes it's scroll lock, on others it's a right-click. Universal agreement on the meaning of gestures is light years away. The great thing about standards is there are so many of 'em!
 
Hi Tamar, Dan,

A right-click event via gestures, hm. Typically a right-click is needed at a specific position; in the case of handwriting recognition, at least precise enough to point at the text you want to edit or the empty textbox you want to enter into.

I'm quite satisfied with the automatic symbol that appears when you set focus there. I've also seen a strikethrough gesture work, or selecting part of the text first.

At least you can say that, with regard to right-click, Apple was visionary in not having a right mouse button.

No matter if and how you can imitate a right-click on a touchscreen with a stylus involved, I'd expect a touchscreen app to enable activating handwriting recognition by some means other than a right-click event; I wonder why you would want to implement it that way. Of course the left-click is for setting focus only, but why not also pop up handwriting recognition at that moment? You should be able to detect whether the screen is a touchscreen, and even if not, let your users configure your application with pen mode on or off.
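For instance, a minimal sketch of such a check (assuming the Win32 GetSystemMetrics() flag SM_TABLETPC, index 86, is a usable indicator for the hardware in question; a user-configurable pen-mode setting remains the safer fallback):

* Ask Windows whether Tablet PC functionality is available
DECLARE INTEGER GetSystemMetrics IN user32 INTEGER nIndex
#DEFINE SM_TABLETPC 86
IF GetSystemMetrics(SM_TABLETPC) <> 0
   * Pen mode: e.g. pop up handwriting input when a control gets focus
   WAIT WINDOW "Tablet PC functionality detected" NOWAIT
ENDIF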

Bye, Olaf.

 
Hi experts,

As suggested by Microsoft, I am using a Wacom pad attached to a Windows 7 netbook running VFP.

The technology is such that the Wacom driver delivers pressure-sensitive strokes and three programmable buttons.

In my (limited) experience, Microsoft handwriting recognition is excellent. I am using the ActiveX object InkEd.InkEdit.1 for input, and the computer display as a last resort.

Thank you, Mark
 