Delphi 2010 – what a feeling!
One of the new features of Delphi 2010 is native support for touch, gestures and multi-touch (the latter supported by Windows 7). Although it has always been possible to support simple touch in Delphi applications, where the touch screen simply emulates a mouse (so you can touch the buttons on the screen without using a keyboard or mouse), it was not yet possible to have Delphi respond, out of the box, to special gestures or multi-touch events. These features are now possible with the release of Delphi 2010, and hence the topic of this article.
In order to play along, you need a copy of Delphi 2010 (you can download and install the 30-day trial edition if you wish). Start Delphi 2010. In order to create a new application, I do File | New – Other, which brings me to the Object Repository. This is the first area where we can see some changes already: a handy filter option at the top, to hide the icons and wizards that I'm not looking for. Just enter a few characters like "App" to filter the contents of the Object Repository down to the App-specific icons and wizards. A nice way to find what I need without getting lost in the dozens of items.
This filter is especially important now that the Object Repository shows all icons in all categories, even when they are not applicable at this time (for example in the ActiveX category, where a number of wizards cannot be used until an ActiveX library project has been created first). The benefit is that you will always see what's possible, even if it's not applicable just yet. The filter will also hide these items, although this is not visible in the screenshot above.

After starting a new VCL Forms Application, we first continue by building a small database application. Just use a TClientDataSet in order to produce a stand-alone executable (without the need for database drivers). As data contents, I'd like to show the biolife table, which can be found in the biolife.xml file in the Common Files\CodeGear Shared\Data directory (in the future this might become the Common Files\Embarcadero Shared\Data directory perhaps). As soon as the FileName property of the TClientDataSet component is pointing to the biolife.xml file, we can use the Object Inspector to set the Active property of the TClientDataSet to True to show all data at design-time. By clearing the FileName property while Active is set to True, the data remains visible, but this time we force the data to be stored in the DFM file itself, and no longer outside of the application. This is a special trick that I use every once in a while, to embed the data inside the executable. Note that there are consequences to this approach: first of all, the DFM file will become about 2 MB in size (so it takes a while to save or compile the project – especially the linking stage). Second, and sometimes worse, is the fact that since the data is embedded in the executable itself, you cannot make any changes to the data. This may often be a show stopper, unless you're building a brochure or catalog that you don't want or need people to change anyway (like a city plan with pictures, for example). For our demo, I'm assuming we do not need the biolife data to be changed by the user, so we're on our way to produce a standalone executable indeed.
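If you would rather not embed the data in the DFM, the same effect can be achieved at runtime by loading the XML file from disk. A minimal sketch (the component name ClientDataSet1 and the absolute path are my assumptions, based on the directory mentioned above):

```delphi
// Hypothetical runtime alternative to the design-time embedding trick:
// load biolife.xml from disk when the form is created.
procedure TForm1.FormCreate(Sender: TObject);
begin
  // Path is an assumption; adjust it to your installation
  ClientDataSet1.FileName :=
    'C:\Program Files\Common Files\CodeGear Shared\Data\biolife.xml';
  ClientDataSet1.Open;
end;
```

Of course, this version does require the biolife.xml file to be present on the target machine, so it trades the 2 MB DFM for an external data file.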
With the data persistent in the TClientDataSet, we can double-click on this component to start the Fields Editor. Right-click in the Fields Editor and select "Add All Fields" to see the available fields from the biolife table in the Fields Editor. In order to see them all (except one) on the form, we can simply drag them from the Fields Editor and drop them on the form. All, except for Length_In, which I don't need (unless you don't use centimetres as your measurement type, in which case you could decide to skip the Length (cm) field instead, of course). Each of these fields will be transformed into a label and a data-aware control when dragged onto the form. Most fields will end up being represented by a TDBEdit control, with the exception of Notes (inside a TDBMemo) and Graphic (in a TDBImage). After moving the controls around, my version of the form looks as follows:
Note that I’m not using a TDBNavigator control on purpose, since I want to use the touchscreen with the simple touch and gesture functionality for navigation.
Touch and Action
Speaking of touch: now we're getting to the interesting part of the article: the touch and gesture support in Delphi 2010. We need to start by adding a TActionList component on the form. This will help to connect gestures to (standard) actions later. Delphi 2010 already contains a number of pre-defined gestures, which can be connected to actions (saving a lot of work compared to writing individual event handling code), as I'll show shortly. The TActionList component does not have to be "filled" with (standard) actions right away, because we can do that "on the fly" (when needed). However, we do need a special TGestureManager component on the form as well (see next screenshot), and we should assign this TGestureManager component to the GestureManager subproperty of the Form's Touch property.
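If you prefer to wire this up in code rather than in the Object Inspector, the assignment could look like this (a sketch; GestureManager1 is the assumed component name):

```delphi
// Sketch: assign the gesture manager to the form's Touch property at
// runtime, mirroring the design-time setup described above.
procedure TForm1.FormCreate(Sender: TObject);
begin
  Touch.GestureManager := GestureManager1;
end;
```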
Using the Object Inspector, we can expand the Touch property in order to view the GestureManager subproperty (assigned to the TGestureManager control on the form). We also see a Gestures subproperty, with a list of standard (and later also custom) gestures supported by Delphi 2010. The list of standard gestures starts with Left, Right, etc. Next to each gesture in the Object Inspector, we see a checkbox (to indicate that the gesture is "hooked up" to an action or event) as well as a little drawing that depicts the gesture movement (albeit without direction, so the horizontal lines for Left and Right look the same). For each gesture we can use the Object Inspector to create a New Action (for which we then need to implement the OnExecute event handler), or we can select an existing standard action (which is then added to the TActionList component). For our example, let's connect the Left gesture to a standard action from the Database category, namely TDataSetPrior (in order to move to the previous record). The screenshot below shows the different submenus that have to be used to connect the TDataSetPrior action to the Left gesture.
After we've connected the Left gesture, it's just as easy to connect the Right gesture to the TDataSetNext action, the ChevronLeft gesture to the TDataSetFirst action, and the ChevronRight gesture to the TDataSetLast action. The two chevron gestures can be compared to the "greater than" and "less than" characters, but drawn with your finger on the touchscreen.
The only thing left to do is compile the project and run it. If you have a touch screen (like the LG L1510SF 1024x768 Flatron that I purchased for a good price) then you can move your finger over the screen to make the required gesture movements. If you do not have a touch screen, then you can still emulate this behaviour by using the mouse: click with the left mouse button and "drag" the gesture movement around the screen. It's not the same, but at least you can test your gesture movements without the need to purchase a touchscreen yourself. Either way, you can now use the Left, Right, ChevronLeft and ChevronRight gestures to navigate through the records of the biolife table without the need for a keyboard or TDBNavigator control.
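As an aside, the same navigation could also be hand-wired in the form's OnGesture event handler instead of via standard actions; a sketch, assuming the standard gesture ID constants (sgiLeft and friends, declared in the Controls unit) and a ClientDataSet1 component:

```delphi
// Sketch: manual gesture-to-navigation mapping, as an alternative to
// connecting standard actions in the Object Inspector.
procedure TForm1.FormGesture(Sender: TObject;
  const EventInfo: TGestureEventInfo; var Handled: Boolean);
begin
  Handled := True;
  case EventInfo.GestureID of
    sgiLeft:         ClientDataSet1.Prior; // Left gesture = previous record
    sgiRight:        ClientDataSet1.Next;  // Right gesture = next record
    sgiChevronLeft:  ClientDataSet1.First;
    sgiChevronRight: ClientDataSet1.Last;
  else
    Handled := False; // not one of ours
  end;
end;
```

The standard-action approach from the article remains the less error-prone route, since it needs no event handling code at all.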
Apart from the built-in standard gestures, we can also create our own custom gestures with Delphi 2010. This can be done with the TGestureManager control: right-click on it, and select Custom Gestures. Inside the dialog that follows, you can click on the Create button to create a new gesture, where you need to move your finger over the touch screen (or drag with the mouse) to draw the initial path of the gesture. Once the initial path is drawn, we can still make some modifications to it, to ensure it will be recognised correctly. As an example, let's draw the "Z" character (see next screenshot), and give it the name "Zorro" (a famous TV character in The Netherlands about an outlaw hero who uses his sword to draw a "Z" on the chest of his government victims).
When making custom gestures, it's important to realise that the gesture must be one fluent movement. As a result, you cannot make an X gesture, since that requires two separate strokes. For the "Z" gesture that I created, I had to change the sensitivity, which is set to 80% by default, to ensure that the gesture is easily recognised when drawn on screen by an end user. A high sensitivity requires that you redraw the blue dots exactly, while a lower sensitivity extends the grey area around them (until it gets so fuzzy that you might confuse one gesture with another). We can also use the Custom Gesture dialog to remove dots from the gesture line, add new dots, zoom in and/or out, modify the coordinates and play a simulation of the gesture, as well as run a test where you need to redraw the gesture to see if it is recognised correctly. It may take a while, but in the end you will have a near perfect custom gesture. When you click on OK, you are returned to the Custom Gestures dialog of the GestureManager component, where a preview of the gesture shape is drawn, as well as the name we've given it and the ID (which starts at -1 and counts further down).
After all this work, it's fortunate that we do not have to redraw our custom gestures for each new application: we can simply store them (export to a .dgf file – Delphi Gesture File) and load or import the .dgf file in the TGestureManager of another application. Once we have a custom gesture, we can work with it (i.e. respond to the recognised gesture) in two ways. First of all, we can use the OnGesture event of the form itself. In this event handler, we get the EventInfo parameter of type TGestureEventInfo, as well as a var parameter Handled to indicate – when set to True – that we've handled this gesture. Note that we will only get inside the OnGesture event handler if we didn't make one of the standard and already connected gestures such as the Left, Right, ChevronLeft or ChevronRight gestures. When inside the OnGesture event handler, we can use the EventInfo.GestureID field to identify the ID of the custom gesture. In our case, that value was -1, so my event handler can be implemented as follows:
  procedure TForm1.FormGesture(Sender: TObject;
    const EventInfo: TGestureEventInfo; var Handled: Boolean);
  begin
    if EventInfo.GestureID = -1 then
      ShowMessage('Zorro')
    else
      ShowMessage(IntToStr(EventInfo.GestureID))
  end;

If an unrecognised gesture is made, then the GestureID is equal to 0 (which may be an indication that you need to work on the sensitivity of your custom gestures, or explain more clearly how users should make their gesture). We have to remember the ID values of our custom gestures ourselves, of course. Fortunately, there is also another way to connect an action to the custom gesture. If you take a look at the third screenshot again, you'll see the Object Inspector with the Touch property and the Gestures subproperty. Right below that, the standard gestures are listed. However, as soon as we add a custom gesture, a new entry "Custom" is shown in the Object Inspector – right above the standard gestures. Custom contains the list of (currently only one) custom gestures that have been added to the TGestureManager. In this case, just the Zorro gesture, including the image preview of the "Z", as we saw before in the Custom Gestures dialog.
This means that we can now make a connection between the gesture and an action. In this case, I’d like to create a new action, select this action in the Object Inspector when done, and then implement the OnExecute event handler of the new action, for example as follows:
  procedure TForm1.Action1Execute(Sender: TObject);
  begin
    ShowMessage('Zorro also likes to touch Delphi 2010')
  end;

Once the gesture is connected to a custom action and its OnExecute is implemented, the OnGesture event handler of the form will no longer respond to the Zorro gesture (try it for yourself if you wish).
Running the demo project we've created so far – which I've called Touching Biolife – does not show a GUI different from a "normal" non-touching GUI. If you take a look at the next screenshot, there's no indication (yet) that we're dealing with a touch- and gesture-enabled application. The lack of a TDBNavigator may lead you to wonder how to navigate to the next record, but that's about it.
Sweeping your finger over the screen from left to right (for the next record) or from right to left (for the previous one) may feel a bit strange at first, but you'll quickly get the hang of it. The effect is of course much nicer if the form contains a city map or street plan, where you can "move" the map around with your fingers. This can also be done with gestures, and even better with the so-called multi-touch support found in Windows 7 (which also requires special, more expensive, hardware). Using multi-touch and interactive gestures, the EventInfo structure of the OnGesture event will hold field values for the inertia (speed of the gesture), as well as the start and stop coordinates of the gesture. Since this is only supported by Windows 7, in combination with hardware compatible with multi-touch and interactive gestures, I'll leave a full example for another time (until I can get my hands on the special multi-touch hardware, that is).
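Still, to give an impression, an interactive-gesture handler could look roughly like this. This is only a sketch under several assumptions: the form's Touch.InteractiveGestures set includes the pan gesture, a TImage called Image1 holds the map, and FLastPos is a private TPoint field I added to the form (the igiPan constant, the igfBegin flag and the Location field come from the Controls unit):

```delphi
// Sketch: panning an image with an interactive (multi-touch) gesture.
// Requires Windows 7 plus multi-touch capable hardware.
procedure TForm1.FormGesture(Sender: TObject;
  const EventInfo: TGestureEventInfo; var Handled: Boolean);
begin
  if EventInfo.GestureID = igiPan then
  begin
    if not (igfBegin in EventInfo.Flags) then
    begin
      // move the image by the distance travelled since the previous event
      Image1.Left := Image1.Left + (EventInfo.Location.X - FLastPos.X);
      Image1.Top  := Image1.Top  + (EventInfo.Location.Y - FLastPos.Y);
    end;
    FLastPos := EventInfo.Location;
    Handled := True;
  end;
end;
```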
One more thing I can show you is the final step to make the application completely independent from mouse and keyboard: for deployment in a kiosk or at a bus station, for example. Since these applications sometimes still need some textual input, Delphi 2010 contains a special TTouchKeyboard component, one which shows the keyboard layout for the current locale (which means that users from France will get the accents available on the keyboard when they need them). The following screenshot shows the TTouchKeyboard on my machine, using the US-International keyboard layout, at the bottom of the Touching Biolife form:
Obviously, placing the virtual keyboard permanently at the bottom of the form is not a very effective way to handle input. It would be more convenient to use some kind of pop-up dialog that shows the virtual keyboard only when needed, for example. But once the virtual keyboard is shown, we can use the keys on the screen as touch buttons and click on them to produce the input text. And if you have no touch screen but a mouse attached to your machine, then you can obviously still use the mouse to click on the buttons (so as a software developer you do not need a touch screen to work with the TTouchKeyboard – only when it comes to multi-touch and interactive gestures do you need Windows 7 and the special hardware).
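A simple show-on-demand approach could be sketched as follows, assuming the edit controls share OnEnter/OnExit handlers and the virtual keyboard component is called TouchKeyboard1 (the names are mine, not from the demo):

```delphi
// Sketch: show the virtual keyboard only while an edit control has focus.
procedure TForm1.EditEnter(Sender: TObject);
begin
  TouchKeyboard1.Visible := True;
end;

procedure TForm1.EditExit(Sender: TObject);
begin
  TouchKeyboard1.Visible := False;
end;
```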
A final word: in order to turn the current demo application into a really standalone executable, we should add the MidasLib unit to the uses clause of the project, so we do not need to deploy MIDAS.DLL. This results in a truly standalone executable that we can place on a USB stick, for example, to show an application that needs neither mouse nor keyboard in order to run.
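In the project (.dpr) file this comes down to one extra unit in the uses clause; a sketch (the unit name MainForm is an assumption):

```delphi
program TouchingBiolife;

uses
  MidasLib, // links the MIDAS code into the EXE, so MIDAS.DLL is not needed
  Forms,
  MainForm in 'MainForm.pas' {Form1};

{$R *.res}

begin
  Application.Initialize;
  Application.CreateForm(TForm1, Form1);
  Application.Run;
end.
```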