We envision an aggregation of applications that use post-WIMP, pen-centric interfaces to make computational assistance more natural and efficient; among such applications are MathPad2, the Music Notepad, and ChemPad, shown in the image above. We originally called that aggregation *Pad, using ‘*’ as a wildcard, but now call it starPad so that it can be found more easily through search engines.
The goal of the starPad SDK project is to make it easier for people to
write these sorts of applications by providing a convenient interface to a
broad layer of pen-centric functionality, along with some research
functionality. It is currently written for Microsoft .NET on Windows and uses
WPF. It includes an interface to stroke-level operations, a recognition
library for handwritten math and gestures, UI techniques such as GestureBar,
and a pen- and gesture-based application shell supporting selection, undo,
zooming, text input, images, and save/load.
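To give a concrete sense of the layer the SDK targets, here is a minimal sketch of the kind of pen-centric pipeline it wraps: capturing ink strokes in WPF and handing them to recognizers as each stroke is completed. Only the standard WPF ink types are real here; MathRecognizer and GestureRecognizer are hypothetical placeholders standing in for the SDK's recognition library, whose actual entry points are described in the README.

    // Sketch only: WPF ink capture feeding recognizers. The recognizer
    // names below are hypothetical placeholders, not the SDK's real API.
    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Ink;

    public class ScratchpadWindow : Window
    {
        private readonly InkCanvas _canvas = new InkCanvas();

        public ScratchpadWindow()
        {
            Content = _canvas;
            // Re-run recognition whenever the user finishes a stroke.
            _canvas.StrokeCollected += OnStrokeCollected;
        }

        private void OnStrokeCollected(object sender, InkCanvasStrokeCollectedEventArgs e)
        {
            StrokeCollection strokes = _canvas.Strokes;
            // Hypothetical calls standing in for the SDK's math and gesture
            // recognizers; see the README for the real entry points.
            // var expr = MathRecognizer.Recognize(strokes);
            // var gesture = GestureRecognizer.Classify(e.Stroke);
        }
    }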
We are releasing a beta version
(0.1.3) of our source tree; the README
file includes an overview of the included functionality.
Currently, the license that comes with it essentially allows free
non-commercial use. The authors disagree as to whether to make the code free
for commercial use as well (and thus open-source) and have agreed to postpone
the decision until and unless someone actually wants to use it for such a
purpose. So, if you want to use our code in a commercial product, please get in
touch with us: tsm, bcz, and acb at cs.brown.edu.
As future functionality for this SDK, we are investigating the feasibility of
retrofitting running applications, externally and without altering them, with
new interfaces and functionality, using a combination of
application-independent pixel-level recognition techniques and more
specialized techniques for inspecting data structures exposed by the window
system or the application. Users will be able to attach annotations to those
running applications, including hand-drawn ink, typed text, diagrams,
interactive widgets, and even links to other applications' user interface
components. We will then research registration techniques to associate each
annotation with specific elements of an application or document, so that an
annotation can be made to appear, for example, only at one place in a specific
file, or whenever a certain application runs. The facility we are developing
would support a variety of practices, including integrating functionality from
different applications, enriching collaboration, customizing interfaces for
particular tasks or users, and adding new fine-grained user interface elements
to applications. Registration will be supported through such means as system
and accessibility information, optical character recognition and shape
recognition for reading bitmaps, and handwriting recognition for ink. We also
expect to gain insight into how the usability of the resulting interfaces is
affected by artifacts of imperfect registration techniques and heuristics.
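As a rough illustration of registration via accessibility information, the sketch below uses Windows UI Automation (System.Windows.Automation) to locate an element of a running application by name and read its screen rectangle, which is where an overlaid annotation could be anchored. The window and element names are invented for illustration, and this is not code from our prototype; pixel-level or OCR-based techniques would take over where no accessible element is exposed.

    // Sketch only: anchor an annotation to a UI element of a running
    // application via Windows UI Automation (accessibility information).
    using System.Windows;
    using System.Windows.Automation;

    public static class RegistrationSketch
    {
        public static Rect? LocateElement(string windowName, string elementName)
        {
            // Find the application's top-level window among the desktop's children.
            AutomationElement window = AutomationElement.RootElement.FindFirst(
                TreeScope.Children,
                new PropertyCondition(AutomationElement.NameProperty, windowName));
            if (window == null) return null;

            // Find the element the annotation should be registered to.
            AutomationElement target = window.FindFirst(
                TreeScope.Descendants,
                new PropertyCondition(AutomationElement.NameProperty, elementName));
            if (target == null) return null;

            // The bounding rectangle, in screen coordinates, is where an
            // overlaid annotation would be drawn.
            return target.Current.BoundingRectangle;
        }
    }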
Additionally, we are planning to investigate prototype extensions to
applications that are meritorious in and of themselves, such as universal
spell-checking.