Behaviour Tree Editor
Brain Designer is a visual behaviour tree editor. It allows you to build behaviour trees using simple drag & drop. The editor supports plugins and exporters, and stores behaviours as XML files.
Please note that Brain Designer is only an editor: you have to write your own exporter which generates files for your AI. The editor contains no node logic, so it contains no AI - you have to implement the nodes yourself. Nodes can be added, removed or modified by changing the source code of the included example plugin.
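Since the editor stores only the tree structure and leaves node logic to you, the runtime side has to be implemented separately. The sketch below is a minimal, hypothetical illustration (not part of Brain Designer) of what implementing the classic behaviour tree node types might look like; the `Status`, `Sequence`, and `Selector` names are assumptions, not names from the editor.

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Node:
    """Base class: every behaviour tree node answers a tick()."""
    def tick(self):
        raise NotImplementedError

class Sequence(Node):
    """Ticks children in order; stops at the first child that does not succeed."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS

class Selector(Node):
    """Ticks children in order; stops at the first child that does not fail."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

class Action(Node):
    """Leaf node wrapping user-supplied game/AI logic."""
    def __init__(self, fn):
        self.fn = fn
    def tick(self):
        return self.fn()

# A tiny tree: try one action, fall back to another.
root = Selector(Action(lambda: Status.FAILURE),   # first option fails
                Action(lambda: Status.SUCCESS))   # fallback succeeds
print(root.tick())  # Status.SUCCESS
```

An exporter for the editor would walk the XML and emit a tree of such nodes for whatever runtime you target.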
Support for Workspaces and Plugins
Independent Component Analysis (ICA)
Sample application demonstrating how to use Independent Component Analysis (ICA) to perform blind source separation of audio signals.
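The core idea of blind source separation can be sketched outside the sample application as well. The following is a minimal illustration using scikit-learn's `FastICA` (an assumption of this sketch, not the library the sample uses) on two synthetic signals mixed with an unknown matrix:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
# Two independent "audio-like" sources: a sine wave and a square wave.
s1 = np.sin(2 * t)
s2 = np.sign(np.sin(3 * t))
S = np.c_[s1, s2]

# Mix the sources with a mixing matrix that ICA does not get to see.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])
X = S @ A.T  # observed mixtures

# Recover the sources blindly from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)

# Each recovered component should correlate strongly with one original
# source (ICA recovers sources only up to permutation, sign and scale).
corr = np.abs(np.corrcoef(S.T, S_est.T)[:2, 2:])
print(corr.max(axis=1))  # both entries close to 1
```

For real audio, the rows of `X` would be samples from two microphones instead of a synthetic mix.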
This sample application demonstrates usage of the API to control the Lego Mindstorms RCX robotics kit. With the sample application it is possible to connect to the RCX brick (a USB IR tower is required), read its sensors' values, control motors, get information about the device and play simple sounds on it.
This sample application is very similar to the one above, but demonstrates usage of the API to control the Lego Mindstorms NXT robotics kit. The application provides the same functionality - connecting to the NXT brick (over Bluetooth), checking its sensors' values, controlling motors, etc.
This application also demonstrates controlling Surveyor's Stereo Vision System board. The application allows receiving video feeds from both cameras, showing stereo anaglyph images, and manipulating the robot by driving it using predefined commands or direct motor control.
The sample application demonstrates the work of different motion detection algorithms. With this application it is possible not only to enable/disable motion detection, but also to turn on different motion post-processing algorithms, such as highlighting of motion regions. It supports a number of different video sources, including USB web cameras, JPEG snapshots and MJPEG streams over HTTP (IP cameras), and local video files. In addition, it allows specifying regions of interest where motion should be detected. Yet another feature of this application is motion history - a chart at the bottom, which shows the history of the detected motion level.
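The simplest of these algorithms, frame differencing, plus the region-of-interest and motion-level ideas from the paragraph above can be sketched in a few lines. This is a hedged, self-contained illustration with synthetic frames (the function name and threshold are assumptions, not the sample's actual API):

```python
import numpy as np

def motion_level(prev, curr, roi_mask=None, threshold=15):
    """Fraction of pixels (inside the ROI, if given) whose grayscale
    intensity changed by more than `threshold` between two frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > threshold
    if roi_mask is not None:
        diff &= roi_mask
        return diff.sum() / roi_mask.sum()
    return diff.mean()

# Synthetic grayscale frames: a bright 20x20 square shifts by 10 pixels.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = np.zeros((120, 160), dtype=np.uint8)
prev[20:40, 20:40] = 255
curr[30:50, 30:50] = 255

# Motion-level history, like the chart at the bottom of the application.
history = []
history.append(motion_level(prev, curr))
print(round(history[-1], 5))  # 0.03125 -> 600 changed pixels out of 19200
```

Real motion detectors add noise filtering and background modelling on top of this, but the per-frame motion level driving the history chart is essentially this ratio.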
Handwritten digit recognition using Non-linear (Multiple) Discriminant Analysis with Kernels (KDA).
Hands Gesture Recognition
This sample uses motion detection as its first step and then performs a more interesting routine on the detected object - hands gesture recognition. Suppose we have a camera monitoring some area. When somebody enters the area and makes hand gestures in front of the camera, the application should detect the type of the gesture and, for example, raise an event. When a gesture is recognized, the application may perform different actions depending on its type - for instance, it may control some sort of device or another application by sending it different commands depending on the recognized gesture. What type of hand gestures are we talking about? This particular application recognizes up to 15 gestures, which are combinations of 4 different positions of 2 hands - a hand is not raised, raised diagonally down, raised diagonally up, or raised straight.
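The gesture count follows directly from the combinatorics: 4 positions per hand gives 4 × 4 = 16 combinations, and excluding one pose leaves 15. Which pose is excluded is an assumption here - most plausibly "both hands not raised", since that is simply a person standing still:

```python
from itertools import product

POSITIONS = ("not raised", "diagonally down", "diagonally up", "straight up")

# All combinations of the two hands' positions: 4 * 4 = 16.
all_combos = list(product(POSITIONS, repeat=2))

# Excluding the (assumed) non-gesture pose of both hands down leaves 15,
# matching the count stated for the application.
gestures = [g for g in all_combos if g != ("not raised", "not raised")]
print(len(gestures))  # 15
```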