Abstract

Parkinson’s disease (PD) is a progressive neurodegenerative movement disorder in which motor dysfunction gradually worsens. PD-specific dopaminergic drugs can ameliorate symptoms, but neurologists also strongly recommend physiotherapy combined with regular exercise. As there is no known cure for PD, traditional rehabilitation programs eventually tire and bore patients to the point where they lose interest and drop out, simply because of the predictability and repetitiveness of the exercises. Current technology can help avoid this: character-based, interactive 3D games promote physical training in a nonlinear fashion and can provide experiences that change each time the game is played. Such “exergames” (a combination of the words “exercise” and “game”) challenge patients to perform exercises of varying complexity in a playful and interactive environment. In this work we present a Unity3D-based platform hosting two exergames tailored to PD patients with mild to moderate symptoms. The platform employs Microsoft Kinect, an affordable off-the-shelf motion capture sensor that can be easily installed in both home and clinical settings. Platform navigation and gameplay rely on a collection of gestures specifically developed for this purpose and based upon training programs tailored to PD. These gestures employ purposeful, large-amplitude movements intended to improve postural stability and reflexes and to increase upper and lower limb mobility. When the patient’s movements, as detected by the Kinect, “match” a preprogrammed gesture, an on-screen 3D cartoon avatar responds according to the game context at hand. In addition, in-game decision-making aims at improving cognitive reaction.

1. Introduction

Parkinson’s disease (PD) is a multisystem neurodegenerative disorder caused by the depletion of dopamine in the substantia nigra area of the brain, which results in both motor and nonmotor clinical symptoms [1]. Relevant symptoms usually appear around 60 years of age and progress slowly but irreversibly. While patients with PD in the early stages typically show some level of motor dysfunction, patients at more advanced stages also develop nonmotor cognitive, behavioral, and mental symptoms [1, 2]. Tremor, Rigidity, Akinesia, and Postural instability (grouped under the acronym TRAP) are the four distinct, fundamental symptoms of PD [3]. Akinetic symptoms include both hypokinesia (reduced amplitude of voluntary movement) and bradykinesia (reduced speed of voluntary movement). These symptoms, which often develop in different combinations, hinder common daily activities, strain patients’ social relationships, and reduce quality of life, especially as the disease progresses [4, 5]. Although a cure for PD has not yet been discovered, medications usually control symptoms and maintain body functionality at reasonable levels through the early years of the disease [1, 6]. However, as neurodegeneration progresses, new motor symptoms emerge, including motor complications related to the chronic administration of dopaminergic agents with short half-lives (such as hyperkinesis and motor fluctuations), drug-resistant postural instability, and freezing of gait. Moreover, standardized medication regimens are difficult to establish, as the course and symptoms of the disease vary significantly among patients [7]. During the intermediate and advanced stages of the disease, the majority of patients experience significant motor problems despite optimal pharmacological management.

In addition, physiotherapy appears highly effective in controlling PD-related symptoms: physical exercise involving stretching, aerobics, unweighted or weighted treadmill work, and strength training improves motor functionality (leg action, muscle strength, balance, and walking) and, therefore, quality of life [8, 9]. Exercise and physical activity have been shown to delay the deterioration of motor function, to improve cognition, and to prolong functional independence [9–11]. Furthermore, important new evidence [12] suggests a possible neuroprotective and neurorestorative role of exercise in PD. This beneficial effect of physiotherapy on motor and cognitive function applies to all disease stages. Caglar et al. [13] point out the effectiveness of home-based, structured physical therapy exercise programs tailored to the individual patient, with measurable improvements in motor capability. Finally, Farley and Koshland [14] demonstrate that PD patients practicing large-amplitude exercises (i.e., those involving large-scale limb movements such as reaching for an object, walking, twisting the torso to the side, or stepping and reaching forward or sideways) reap measurable benefits in terms of speed and agility for both upper and lower limbs. Farley and Koshland further suggest that this so-called “Training BIG” amplitude-based intervention strategy activates muscles to meet the distance, duration, and speed demands set forth by the relevant exercises.

At the same time, PD patients attending long-term repetitive exercise programs tend to get bored of the same daily physiotherapy routine [15]. In fact, [16] argues that exercise must be more frequent than two or three times a week to be effective in improving stepping performance, mobility, cognition, and reaction, as well as in reducing the frequency of falls; its authors also advocate further research to determine the training dose appropriate to the stage of the disease. PD patients must also adhere to regular and long-lasting clinical physiotherapy programs [17] before noticing improvement and in order to maintain gains from exercise in the medium and longer term. In fact, Mendez et al. [18] propose investigating the learning potential of patients with Parkinson’s disease by applying new therapeutic strategies and validating their utility. Indeed, exergames (exercise-based games) built on consoles and sensors such as Sony’s PlayStation Eye, Nintendo’s Wii, or Microsoft’s Kinect appear to be very promising exercise tools. Such games use audio and visual cueing in loosely structured 3D environments to avoid the repetitiveness (the seed of boredom) that characterizes more traditional physiotherapy programs. A review conducted by Vieira et al. [19] concludes that virtual-reality-based systems can serve as therapeutic tools, as they have been shown to improve motor function, balance, and cognitive capacities. A number of other recent studies corroborate not only the feasibility but also the benefits of exergames for PD patients, which are emerging as a highly effective physiotherapy practice [20].

Extreme care and forethought must be put into designing exergames specifically for PD patients. For example, exergames must provide motivation and positive feedback, be progressively challenging and adaptable to a specific patient’s condition, and at the same time be safe, so as to minimize and even eliminate accidents such as falls. Designing safe exergames for PD patients is therefore a challenging task, and certain design principles must be followed [21]. Amini et al. [22] designed a system that monitors PD patients’ movement and detects freezing-of-gait (FOG) and falling incidents so that they can be handled appropriately (contacting emergency services, relatives, etc.). In addition, that same system employs an automated laser system, built around servo motors and an Arduino microcontroller, that casts visual cues based on patient walking patterns in order to improve patient mobility. Barry et al. [20] propose that systems which do not require raised platforms, handheld controllers, and/or body markers be further evaluated with respect to the physiotherapy opportunities they offer with maximal safety for PD patients. Microsoft’s Kinect sensor falls in that category, as it requires no external input controllers and can capture the motion of the entire human body in 3D using an RGB camera and a depth sensor. Players can drive game interactions by moving their body in front of the sensor.

A Kinect-based game for PD patients developed by Galna et al. [23] uses specific upper and lower limb movements to improve dynamic postural control. In that game, the upper torso of the player is mapped onto a farmer avatar who drives a tractor, collects fruit, and avoids obstacles in a 3D environment. To steer the tractor away from obstacles such as sheep, high wires, and birds, the patient must hold one leg steady on the floor and take large steps (front, back, and sideways) with the other leg in the direction he/she wants the tractor to move. To preserve patient motivation, the game has several levels of increasing difficulty: game complexity increases from simple hand movements at lower levels, through more complex activities combining cognitive decisions and physical movements, all the way to dual tasking (simultaneous hand and foot work). The design principles for the game resulted from a workshop where participants played and evaluated a wide range of commercial games developed for Microsoft Xbox Kinect and Nintendo Wii. The workshop verified the difficulty some patients have in interacting with the Wii’s handheld controller and balance board. The authors conclude that Kinect seems both safe and feasible for PD patients and advocate further investigating the potential of the sensor as a home-based physiotherapy solution.

Another PD-oriented multiplayer game also based on the Kinect sensor was developed by Hermann et al. [24] to investigate whether cooperation within the context of a game can improve communication and coordination. Using hand movements, two players collect buckets of water in a flooded area to reveal an object hidden underneath. The game is played in two different modes: in the first (loose cooperation) both players drain the area, while in the second (strong cooperation) only one player drains, while the second player reveals the object. That work concludes that multiplayer games are indeed feasible for PD patients and that asymmetric roles (strong cooperation) can motivate communication between the participants and lead to a better game experience.

Cikajlo et al. [25, 26] delivered intensive upper-extremity physiotherapy via tele-rehabilitation to 28 early-stage Parkinson patients using a Kinect-based exergame called “Fruit Picking.” The objective of the game was to collect apples before they fell from a tree and place them in a basket at the bottom of the screen (raising an arm with an open hand and closing the hand on an apple to grab it). The system recorded the player’s movements and adapted the level of difficulty in real time: as the game progressed, the tree carried more apples, which ripened and fell from it more frequently. Further analysis showed evidence of clinically meaningful results for these target-based tasks.

2. Materials and Methods in the Design and Implementation of a Kinect-Based Interactive 3D Exergame Platform

The present work reports on a Kinect-based, interactive 3D platform hosting two exergames (the Balloon Goon game and the Slope Creep game) tailored to PD patients at stages 1 through 3 on the Hoehn and Yahr [27] scale, i.e., with mild to moderate symptoms, without severe postural instability and motor impairment. The Kinect sensor provides data streams from an RGB camera and an IR depth camera, which are processed by the MS Kinect SDK in real time to create a depth map and yield coordinates for the 3D skeleton of the human in front of the sensor in each frame. A custom collection of gestures tailored to the PD condition has also been designed to navigate the platform and game menu and to interact with game objects, in the sense that when the player’s movement “matches” a gesture programmed in a game, a 3D cartoon avatar moves as programmed in that game context. These gestures have been adapted from existing training programs (e.g., [28]) aiming at improving postural stability and reflexes as well as increasing overall mobility for the upper and lower limbs. This approach is in line with the findings of Farley and Koshland [14], who demonstrate measurable benefits in limb speed and agility for PD patients practicing large-amplitude training, in the sense of performing extended but purposeful and fluent movements.

The game platform, the gaming environments, the visual artefacts users can interact with, and the exergames themselves have been developed on the Unity game engine, a powerful platform for building 2D and 3D games. Several video and audio effects embedded in the game environment clearly communicate feedback on game progress to the user, including a score board, target/obstacle hit sound effects, instructive text, and applause. To achieve a game goal, patients have to complete either a prescribed or a dynamically generated sequence of tasks, such as hitting moving objects within a specific window of opportunity, avoiding obstacles, and collecting prizes. Game difficulty progresses either through distinct levels or by increasing the frequency of tasks as the game advances, and calls on various patient capabilities and dexterities, an approach intended to lessen the frustration of repetitive failure.

Platform-level navigation is facilitated via custom, PD-specific gestures intended to circumvent on-screen pointer systems driven by the location of a patient’s hand, a highly frustrating approach for people who have trouble keeping a hand steady to select a menu option. For the same reason, gestures were favored that cannot be triggered by inadvertent movement due to tremor or hyperkinesia. These gestures are by design reserved for navigation, so as not to further confuse the player. As a result, patients do not have to resort to alternative input methods and controllers. The main menu navigation gestures are discussed directly below.

“Twist Torso to Navigate” Gesture. This gesture implements a previous/next action to navigate a list by rotating the upper torso to the left or to the right, as shown in Figure 1. Because of this, the list must be short (a few items only); otherwise navigation becomes cumbersome. Gesture detection tracks the shoulder joints on a horizontal plane at shoulder level, and the movement benefits mobility of the upper torso as well as postural control, according to [29, 30].
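A minimal sketch of such a check, assuming the joint positions have already been extracted from the Kinect skeleton stream into Unity vectors in sensor space, is given below; the rotation threshold and direction heuristic are illustrative values, not the platform’s exact parameters.

```csharp
using UnityEngine;

// Hypothetical sketch of the "Twist Torso to Navigate" check. Joint positions
// (already extracted from the Kinect skeleton stream) are given as Unity
// Vector3s in sensor space; the threshold is illustrative only.
public static class TorsoTwistGesture
{
    // Returns -1 (previous), +1 (next) or 0 (no gesture) depending on how far
    // the shoulder line has rotated on the horizontal (X-Z) plane.
    public static int Detect(Vector3 leftShoulder, Vector3 rightShoulder,
                             float thresholdDegrees = 25f)
    {
        // Project the shoulder line onto the horizontal plane.
        Vector3 shoulderLine = rightShoulder - leftShoulder;
        shoulderLine.y = 0f;

        // Deviation of the shoulder line from the sensor's X axis,
        // independent of which way the line happens to point.
        float angle = Vector3.Angle(shoulderLine, Vector3.right);
        angle = Mathf.Min(angle, 180f - angle);
        if (angle < thresholdDegrees) return 0;

        // Sign of the Z component tells left from right rotation,
        // assuming the player faces the sensor.
        return shoulderLine.z > 0f ? +1 : -1;
    }
}
```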

“Stretch Arms High to Select/Pause” Gesture. The gesture requires concurrently raising both arms sufficiently high above the shoulders to either select the game or action shown in the center of the screen (outside a game) or pause the game under way (within the game). This is a common stretching exercise [31], which increases flexibility for both torso and arms. Figure 2 presents how the gesture operates in the context of navigation.
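A corresponding sketch of the selection/pause test, again under the assumption of precomputed joint positions and with an illustrative height margin, could look as follows.

```csharp
using UnityEngine;

// Minimal sketch (not the authors' exact thresholds) of the
// "Stretch Arms High to Select/Pause" test: both hands must be raised
// a margin above their corresponding shoulder joints at the same time.
public static class ArmsHighGesture
{
    public static bool Detect(Vector3 leftHand, Vector3 rightHand,
                              Vector3 leftShoulder, Vector3 rightShoulder,
                              float margin = 0.20f) // metres, illustrative
    {
        bool leftRaised  = leftHand.y  > leftShoulder.y  + margin;
        bool rightRaised = rightHand.y > rightShoulder.y + margin;
        return leftRaised && rightRaised;   // both arms raised concurrently
    }
}
```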

3. The Balloon Goon Game

This score-based, gesture-driven game, first presented in [32], was the first game added to the new platform. The game starts as shown in Figure 3 and calls for controlled arm and leg gestures reminiscent of “punches” and “kicks” in order to pop balloons which fall randomly along four vertical rods. Arm/leg extensions (punches/kicks) can be used to pop inner/outer rod balloons, respectively. Bilateral exercise is encouraged by requiring that balloons falling along the two left rods be popped by left punches/kicks and those falling along the two right rods by right punches/kicks. Successful gesture detection triggers appropriate predefined animations of the on-screen avatar. Three game levels of increasing difficulty have been implemented. As one progresses to higher levels, the speed and number of the balloons increase so that they drop more frequently. In addition, the third level is even more demanding in motor and cognitive capabilities and reaction time, as it includes higher-value (bonus) balloons along with “bomb” balloons which must be avoided, as popping them incurs a drop in the score. Finally, performance is prominently displayed to the patient during gameplay and upon game completion.

3.1. Gestures and Animations

Arm Extension (“Punch”) Gesture. The gesture is detected when the player, shown in the bottom-right insert of each screenshot in Figure 4, stretches the left or the right arm forward. As one might expect of exercises tailored to patients with Parkinson’s, the game requires a controlled, purposeful arm extension reminiscent of a punch, not an actual powerful and fast punch. Gesture detection compares distances between the joints of the extended arm and the corresponding shoulder along the relevant vertical plane. Successful detection triggers the corresponding animation of the 3D avatar (shown in the main screen in Figure 4), which may or may not cause a balloon to pop. The gesture is based on reaching-and-grasping therapy exercises [33] that increase arm mobility. As a side note, the “punch” gesture is also used to select game menu options, e.g., to activate an on-screen quit button or proceed to the next level.
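One way such a distance-based check could be implemented is sketched below; the completion ratio and the use of a horizontal reach are assumptions of this sketch rather than the platform’s exact logic.

```csharp
using UnityEngine;

// Illustrative check for the Arm Extension ("Punch") gesture, assuming joint
// positions in sensor space. The completion ratio is a made-up parameter.
public static class PunchGesture
{
    public static bool Detect(Vector3 hand, Vector3 elbow, Vector3 shoulder,
                              float completionRatio = 0.85f)
    {
        // Full arm length estimated from the current joint positions.
        float armLength = Vector3.Distance(shoulder, elbow)
                        + Vector3.Distance(elbow, hand);

        // Forward displacement of the hand relative to the shoulder,
        // ignoring the vertical axis.
        Vector3 reach = hand - shoulder;
        reach.y = 0f;

        // A controlled, purposeful extension: the hand has travelled most of
        // the arm length forward of the shoulder.
        return reach.magnitude > completionRatio * armLength;
    }
}
```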

Leg Extension (“Kick”) Gesture. The gesture is detected when the human actor, shown in the bottom-right insert of each screenshot in Figure 5, raises a leg by bending the knee and then stretches it out to the front. As with the “punch” gesture, the game does not require an actual powerful and fast kick, but rather a controlled and purposeful leg extension. The logic employed to detect a “kick” is similar to that of the Arm Extension (“punch”) gesture and is based on the positions of the Foot and Hip joints. Gesture detection triggers the corresponding 3D avatar animation shown in the main screen in Figure 5 and can cause the balloon to pop if performed in a timely manner. The required gesture combines a leg-lift-and-hold in marching position, which improves balance [29, 34], with a leg stretch exercise used to strengthen leg muscles [35]. It is important to note that patients unsure of their balance may use side supports without interfering with the gesture detection algorithm.
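Under the same assumptions, the kick test could mirror the punch logic on the leg joints, as in the following sketch (thresholds again illustrative).

```csharp
using UnityEngine;

// Sketch of the Leg Extension ("Kick") test, mirroring the punch logic on
// the Foot, Knee and Hip joints. Thresholds are placeholders for illustration.
public static class KickGesture
{
    public static bool Detect(Vector3 foot, Vector3 knee, Vector3 hip,
                              float standingFootY,      // foot height while standing
                              float completionRatio = 0.6f,
                              float minLift = 0.10f)    // metres
    {
        float legLength = Vector3.Distance(hip, knee)
                        + Vector3.Distance(knee, foot);

        // Forward displacement of the foot relative to the hip,
        // measured on the horizontal plane.
        Vector3 reach = foot - hip;
        reach.y = 0f;

        bool extendedForward = reach.magnitude > completionRatio * legLength;
        bool lifted = foot.y > standingFootY + minLift;  // leg is off the floor
        return extendedForward && lifted;
    }
}
```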

3.2. Game Logic

An overview of the Balloon Goon game logic appears in Figure 6. Upon game initialization, the human actor is identified and his/her skeletal structure is tracked using Kinect SDK functions. Subsequently, a live feed from the RGB camera of the Kinect sensor is shown in an insert within the main window; this can be seen in all in-game screenshots up to this point. To start the game the player must raise both hands; a detected “Stretch Arms High to Select” gesture causes the game environment to load. A countdown counter then appears to alert the player that the game will begin shortly. That same counter also appears as a recapping aid every time the game resumes to alert and prepare the player. For example, to take a break from the game being played, the player performs a “Stretch Arms High to Pause” gesture to pause the game and be presented with two options: quit or resume playing the current game, either of which can be activated by the corresponding “punch” gesture.
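The flow described above can be summarized as a simple state machine; the sketch below is a compressed, hypothetical rendering in Unity C#, where GestureInput stands in for the Kinect-driven gesture detectors and is not part of the actual platform code.

```csharp
using UnityEngine;

// Hypothetical sketch of the Balloon Goon flow as a Unity state machine.
public class BalloonGoonFlow : MonoBehaviour
{
    enum State { WaitingForPlayer, Menu, Countdown, Playing, Paused }

    State state = State.WaitingForPlayer;
    float countdown;

    void Update()
    {
        switch (state)
        {
            case State.WaitingForPlayer:
                if (GestureInput.PlayerTracked) state = State.Menu;
                break;

            case State.Menu:
                // "Stretch Arms High to Select" starts the game.
                if (GestureInput.ArmsHigh) { countdown = 3f; state = State.Countdown; }
                break;

            case State.Countdown:
                countdown -= Time.deltaTime;          // alert and prepare the player
                if (countdown <= 0f) state = State.Playing;
                break;

            case State.Playing:
                if (GestureInput.ArmsHigh) { state = State.Paused; break; }
                // punch/kick handling lives in the balloon collision logic
                break;

            case State.Paused:
                if (GestureInput.PunchLeft)
                    Application.Quit();                             // quit option
                else if (GestureInput.PunchRight)
                { countdown = 3f; state = State.Countdown; }        // resume option
                break;
        }
    }
}

// Assumed facade over the Kinect-based gesture detectors (not part of the SDK).
public static class GestureInput
{
    public static bool PlayerTracked, ArmsHigh, PunchLeft, PunchRight;
}
```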

At the end of the countdown, the game starts and balloons appear falling along the four vertical rods. Punch/kick gestures pop balloons along the inner/outer rods. Every game loop checks for the detection of either of these gestures (as well as for the third, “pause” gesture). If a punch or kick gesture is detected, the corresponding animation of the avatar is activated: on screen, the appropriate arm or leg of the avatar stretches to perform a punch or kick. Popping a falling balloon requires an actual collision of the hand or foot of the avatar with that balloon, meaning that the animation must be triggered in time, before the balloon leaves the pop-able region. A valid collision between the hand or foot of the avatar and a pop-able balloon triggers the following actions to notify the player: the balloon becomes brighter, an appropriate sound and balloon-popping animation play, and, finally, the score is incremented by the value of the popped balloon. At higher levels, balloons not only fall faster and more frequently, but new balloon types also appear mixed with regular balloons: bonus balloons, as shown in Figure 7(a), carry a higher value and should be collected, whereas bomb-type balloons, shown in Figure 7(b), carry a penalty and must be avoided.
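The collision-and-scoring step could be expressed as a per-balloon Unity component along the following lines; the tags, the ScoreBoard helper, and the use of Destroy in place of the popping animation are assumptions of this sketch.

```csharp
using UnityEngine;

// Hypothetical balloon component illustrating the collision/scoring step.
public class Balloon : MonoBehaviour
{
    public int value = 10;            // regular balloon; negative for "bomb" balloons
    public AudioClip popSound;
    bool poppable;                    // set true inside the reddish "active" segment

    void OnTriggerEnter(Collider other)
    {
        // Only the avatar's hand/foot colliders may pop a balloon, and only
        // while the balloon is inside the pop-able region of its rod.
        if (!poppable) return;
        if (!other.CompareTag("AvatarHand") && !other.CompareTag("AvatarFoot")) return;

        if (popSound != null)
            AudioSource.PlayClipAtPoint(popSound, transform.position);

        ScoreBoard.Add(value);        // bonus balloons add more, bombs subtract
        Destroy(gameObject);          // stands in for the popping animation
    }

    public void SetPoppable(bool inActiveRegion) => poppable = inActiveRegion;
}

// Minimal stand-in for the on-screen score board.
public static class ScoreBoard
{
    public static int Score { get; private set; }
    public static void Add(int points) => Score += points;
}
```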

At any point during gameplay, a “Stretch Arms High” gesture pauses the game. A pause screen appears, as shown in Figure 8(a), presenting two options, quit and resume play, which can be activated by a left or right “punch” gesture, respectively. In Figure 8(b) our player has left the room to take a break, as a result of which a “Player Not Detected” note appears on screen. When the player returns and is once more detected, he/she may select to resume play, at which point a countdown counter appears to alert the player and reaccustom him/her to the state of the game when it was paused, as in Figure 8(c).

When all balloons have dropped, the game level is considered complete and performance data for that level is shown on screen, as in Figure 8(d); this screen remains active for some time so the player can digest the information. A new screen then appears, as in Figure 8(e), presenting two options: quit or proceed to the next level.

By requiring the right decision within the allotted time, the game calls for real-time collaboration of the cognitive and neuromuscular systems and promotes cognition and agile reaction, as well as balance and flexibility. For example, in order to pop a balloon, the patient must first recognize along which (left or right) rod that balloon falls and then decide whether to use a punch (inner rod) or a kick (outer rod). Furthermore, the entire decision-making process and ensuing action must conclude while the balloon drops through the “pop-enabled” region.

3.3. Scene Design

The 3D avatar used in the exergames has been adopted from Unity’s free asset store, where it is provided already rigged and skinned. Additional 3D game assets (trees, prizes, obstacles, etc.) have been designed in Autodesk 3ds Max 2016, with textures created in Adobe Photoshop CS2. Some of these game objects can interact with the avatar or other game assets, while others are used as static scenery. These assets have been modelled with a sufficiently low polygon count to strike a balance between a visually appealing game environment and a responsive real-time game experience, even on computers with modest hardware resources. A third-person viewpoint affords a clear view of both the avatar and the game scene, which includes static assets such as a background terrain, two directional lights, and a skybox, carefully placed so as not to distract the player while hosting game information in a nonintrusive manner.

The main game assets are the four vertical posts and the balloons falling along those posts. A single texture has been used for the vertical rods, with the exception of a segment on each rod which is textured with a different (reddish) color to denote an “active” region and inform the player when to pop a balloon: when a balloon enters this active segment, it becomes brighter and “poppable.” In addition, balloons that can be popped by a “punch” gesture are differently textured from those that can be popped using a “kick” gesture (first two inserts in Figure 9). Finally, the third game level adds two types of balloons: bonus balloons (textured with a smiley face) which give extra points when popped and bomb balloons (textured with a bomb) that take away points (last two inserts in Figure 9) if popped.

4. The Slope Creep Game

In this newly developed gesture-driven game (snapshots of which appear in Figure 10) the player controls a 3D cartoon skier in a snowy landscape. The skier moves along either of two parallel lanes to collect prizes (rings, stars) of different values and to avoid obstacles. As in real cross-country skiing, the player pushes two imaginary ski poles down and backwards to make the avatar move forward. Leaning left/right causes the avatar to change lanes. Regarding prizes, rings on the ground can easily be collected, but stars sit much higher and can only be collected by making the skier jump over a ramp. A successful jump requires only a squat gesture (to pick up additional speed for a higher jump) performed some distance before the avatar reaches the ramp. Sparse rocks in the terrain can be avoided by changing lanes; otherwise part of the avatar’s life is lost. The game ends when the avatar crosses the finish line. During gameplay and upon game completion the player is kept informed of his/her performance (rings/stars collected, lives available). Progressive difficulty is introduced within the level as rings are gradually succeeded by stars and rocks.

4.1. Gestures and Animations

“Push Both Ski Poles” Gesture. The gesture involves stretching both arms to a full frontal extension and then lowering them back to the initial position (as shown in Figure 11) within a given time period. Positional deviations between the left/right hands and the left/right shoulder joints are calculated in real time and compared to the actual lengths of the left/right arms. The logic employed is similar to that of the Arm Extension (“punch”) gesture in the Balloon Goon game, with the additional requirement that both arms move in unison, first reaching a full frontal extension and then pushing down as if driving the ski poles. Successful gesture detection causes the 3D avatar to move forward in the game environment. The gesture is based on arm and shoulder strengthening exercises. For patients experiencing differential mobility issues between the left and right sides, holding a stick with both hands should help arm synchronization.
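A possible two-phase tracker for this gesture is sketched below; the reach ratios, the asymmetry limit, and the 1.5-second window are illustrative assumptions rather than the platform’s actual parameters.

```csharp
using UnityEngine;

// Sketch of the "Push Both Ski Poles" tracker: both arms must reach a frontal
// extension and then return towards the body within a time window, roughly in
// unison. All numeric values are illustrative assumptions.
public class SkiPoleGesture
{
    enum Phase { Idle, Extended }
    Phase phase = Phase.Idle;
    float phaseStart;

    public bool Update(Vector3 leftHand, Vector3 leftShoulder, float leftArmLength,
                       Vector3 rightHand, Vector3 rightShoulder, float rightArmLength,
                       float time, float window = 1.5f)
    {
        float leftReach  = HorizontalReach(leftHand, leftShoulder)   / leftArmLength;
        float rightReach = HorizontalReach(rightHand, rightShoulder) / rightArmLength;

        // Both arms must move together: discard the attempt if one lags far behind.
        if (Mathf.Abs(leftReach - rightReach) > 0.3f) { phase = Phase.Idle; return false; }

        switch (phase)
        {
            case Phase.Idle:
                if (leftReach > 0.8f && rightReach > 0.8f)      // full frontal extension
                { phase = Phase.Extended; phaseStart = time; }
                return false;

            case Phase.Extended:
                if (time - phaseStart > window)                 // took too long: reset
                { phase = Phase.Idle; return false; }
                if (leftReach < 0.3f && rightReach < 0.3f)      // arms lowered again
                { phase = Phase.Idle; return true; }            // gesture completed
                return false;
        }
        return false;
    }

    static float HorizontalReach(Vector3 hand, Vector3 shoulder)
    {
        Vector3 d = hand - shoulder;
        d.y = 0f;
        return d.magnitude;
    }
}
```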

“Lean Torso to the Side” Gesture. The gesture is detected when the human actor, shown in the bottom-right insert of the screenshots in Figure 12, leans the upper torso to the left or right side. Detection is based on deviations among the positions of multiple joints (the Hip Center, the Shoulder Center, and both Shoulder joints) together with the core and shoulder lengths. For the gesture to be detected successfully, the player must lean to one side while keeping the core of the body straight. Successful detection causes the 3D avatar to turn left or right in the game environment by applying the correct animation clip to switch between ski lanes (this appears in the main screen area in Figure 12). The exercise promotes torso flexibility and is based on existing physiotherapy exercises [35]. Beyond the obvious stretching benefits, the gesture, if executed from a standing position, may even promote balance as patients transfer their weight from one side to the other.
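A simplified version of this test, reduced to the tilt of the hip-to-shoulder axis and with an assumed threshold angle, could look as follows.

```csharp
using UnityEngine;

// Illustrative "Lean Torso to the Side" check: the hip-to-shoulder axis is
// tilted away from vertical beyond a threshold. Angle values are assumptions.
public static class LeanGesture
{
    // Returns -1 (lean left), +1 (lean right) or 0 (upright).
    public static int Detect(Vector3 hipCenter, Vector3 shoulderCenter,
                             float thresholdDegrees = 15f)
    {
        Vector3 spine = shoulderCenter - hipCenter;     // the "core" axis
        float tilt = Vector3.Angle(spine, Vector3.up);  // deviation from vertical
        if (tilt < thresholdDegrees) return 0;

        // Sign of the sideways (X) component tells left from right,
        // assuming the player faces the sensor.
        return spine.x > 0f ? +1 : -1;
    }
}
```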

Squat Gesture. The gesture requires that the human actor performs a squat, as in Figure 13, which is identified by looking for sufficiently small foot-to-hip distances (along the vertical axis) compared to those measured while standing. Squats improve patients’ balance and strengthen leg muscles [34]. In the Slope Creep game, squats cause the avatar to pick up speed so as to jump higher from an upcoming raised platform.
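A minimal sketch of the squat test, assuming a previously recorded standing hip height and an illustrative depth ratio, is shown below.

```csharp
using UnityEngine;

// Sketch of the Squat test: the vertical hip-to-foot distance shrinks to a
// fraction of its standing value. The 0.75 ratio is a placeholder.
public static class SquatGesture
{
    public static bool Detect(Vector3 hipCenter, Vector3 leftFoot, Vector3 rightFoot,
                              float standingHipHeight, float ratio = 0.75f)
    {
        // Current hip height above the (average) foot level.
        float footY = (leftFoot.y + rightFoot.y) * 0.5f;
        float hipHeight = hipCenter.y - footY;

        return hipHeight < ratio * standingHipHeight;
    }
}
```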

Patients with stability problems may resort to mobility support aids such as a four-legged walking cane, or may partially support their weight on a pair of chairs placed on either side to better control their posture and avoid a fall. If one opts to use chairs, they should be set slightly away from the sides of the body to avoid interference with the “Push Both Ski Poles” gesture.

4.2. Game Logic

The Slope Creep game mixes torso and arm gestures to guide an avatar along a ski lane, collecting prizes and avoiding obstacles along the way. In contrast to the Balloon Goon game, the Slope Creep game uses a single, longer level of progressive difficulty, implemented via the rate of appearance of bonuses and obstacles, which become more frequent as the game progresses. The gestures employed target flexibility, muscle strength, and balance and aim at improving cognitive reaction through a timely response to upcoming prizes and obstacles. Figure 14 displays the game logic.

Similarly to the Balloon Goon game, user identification is followed by a “Stretch Arms High to Select” gesture to start the game. However, no countdown counter is present in this game, because the initiative to start moving the skier avatar now rests with the user, who has to execute a “Push Both Ski Poles” gesture. In the meantime, an idle animation of the skier occasionally shifting his weight from side to side keeps the screen busy.

The game environment consists of a straight ski lane (composed of two parallel paths) and various game artefacts. The skier avatar can switch between paths (to collect prizes or avoid obstacles) by performing a “Lean Torso to the Side” gesture, which triggers the corresponding animation causing the 3D character to switch paths. However, leaning towards the left/right path when the avatar is already on the left/right path, respectively, has no effect. Prizes along each path appear either in the form of rings positioned near the ground or stars positioned well above the ground. To collect a prize, the avatar has to switch to the correct path so as to force an avatar-prize collision, upon which the score maintained at the top of the game environment is incremented by the value of the prize collected. As in the Balloon Goon game, collisions with game artefacts trigger various sounds to enhance the game experience.
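The lane-switching and prize-collection behaviour could be organized as a small Unity component like the following sketch; the lane offset, tags, and Prize component are assumptions made for illustration.

```csharp
using UnityEngine;

// Hypothetical lane controller for the skier: a lean gesture switches between
// the two parallel paths, but has no effect if the avatar is already on the
// requested side. Lane positions, tags and the Prize component are assumptions.
public class SkierLanes : MonoBehaviour
{
    public float laneOffset = 1.5f;   // lateral distance between the two paths
    int currentLane;                  // 0 = left path, 1 = right path
    int score;                        // value shown on the in-game score board

    // direction: -1 for a lean to the left, +1 for a lean to the right.
    public void OnLean(int direction)
    {
        int targetLane = direction < 0 ? 0 : 1;
        if (targetLane == currentLane) return;     // already on that path: ignore

        currentLane = targetLane;
        Vector3 p = transform.position;
        p.x = (currentLane == 0 ? -0.5f : 0.5f) * laneOffset;
        transform.position = p;                    // the real game plays a turn animation
    }

    void OnTriggerEnter(Collider other)
    {
        // Rings and stars increment the score by their value when collected.
        if (!other.CompareTag("Prize")) return;
        score += other.GetComponent<Prize>().value;
        Destroy(other.gameObject);
    }
}

// Minimal stand-in for a collectible ring or star.
public class Prize : MonoBehaviour
{
    public int value = 5;
}
```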

Whereas collecting a ring is straightforward, collecting a higher-value star is slightly more demanding. An upcoming star is preceded by a speed-up lane (to alert the player), which ends in a jumping ramp with the star even higher above that ramp. The implemented scenario never requires the player to actually jump at any point in the game, so as to avoid possible falls resulting from mobility and balance problems. The player must first switch to the speed-up lane, if he/she is not already on it, and squat when cued so that the avatar picks up speed. Successful detection of a squat gesture triggers an animation which speeds up the avatar. When the avatar enters the jumping ramp, a collision is detected and another animation performs a high jump over the ramp. The parameters of this jump guarantee collision of the avatar with the star above the ramp, resulting in collecting the star and incrementing the score on the score board accordingly. On the other hand, missing the squat results in a slower approach speed, so that the lower-jump animation that is triggered is not sufficient to collect the prize.
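The ramp logic described above might be wired up as in the sketch below, where the Boosted flag, the animator trigger names, and the component split are assumptions of this illustration.

```csharp
using UnityEngine;

// Sketch of the ramp logic: a squat cue boosts the skier's speed, and the
// ramp trigger then selects the high- or low-jump animation accordingly.
public class JumpRamp : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        var skier = other.GetComponent<SkierJump>();
        if (skier == null) return;

        // A boosted (post-squat) approach clears the ramp high enough to
        // intersect the star above it; a slow approach triggers the low jump.
        skier.Jump(high: skier.Boosted);
    }
}

public class SkierJump : MonoBehaviour
{
    public bool Boosted { get; private set; }
    Animator animator;

    void Awake() => animator = GetComponent<Animator>();

    // Called when the squat gesture is detected on the speed-up lane.
    public void OnSquat() => Boosted = true;

    public void Jump(bool high)
    {
        animator.SetTrigger(high ? "HighJump" : "LowJump");
        Boosted = false;              // the boost is consumed by the jump
    }
}
```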

Obstacles in the form of rocks along the ski lanes can be avoided by performing a “Lean Torso to the Side” gesture towards the other (empty) lane in a timely manner, as in Figure 15(a). If the player fails to avoid the obstacle, an animation is triggered showing the avatar falling over the obstacle and landing on the snow, as in Figure 15(b). One of a total of three lives is lost, and the player is prompted to perform a “Push Both Ski Poles” gesture to retry. Unless all lives are lost, which ends the game, the player eventually reaches the finish line, as in Figure 16(a), and is greeted by a screen displaying detailed performance information (prizes collected and lives retained) and a numerical score, as in Figure 16(b). This screen lingers for a few seconds before the player can opt between restarting the game and quitting. Finally, as in the Balloon Goon game, a “Stretch Arms High” gesture pauses the game and brings up the corresponding menu screen.

4.3. Scene Design

Several 3D game assets such as the terrain, trees, and fences have been designed in 3ds Max from scratch using basic shapes. Textures designed in Photoshop were subsequently applied to the objects (Figure 17). In addition, the ski equipment (skis and poles) has been designed and attached to the avatar, while other objects are allowed to interact with it dynamically. However, the ski equipment can also become detached from the avatar, e.g., in case the avatar collides with an obstacle and falls. A third-person camera view tracks the avatar during gameplay. Other assets that can interact with the avatar include ramps, speed lanes, sidebars, prizes (in the form of coins and stars), and obstacles (in the form of rocks). A darker skybox (compared to the Balloon Goon game) was selected to create a contrast with the snowy terrain.

Whereas the main game assets were created in 3ds Max, the majority of the dynamic assets that interact with the avatar were recomposed within Unity to allow for grouping. For example, the jumping-ramp asset group is composed of the ramp 3D model, the speed lane, and three sidebars. This group was saved inside Unity as a prefab to allow spawning in various locations in the game scene (Figure 18). Prefabs of stage parts with already positioned obstacles and prizes can be loaded at game start but also during gameplay. These premade stage sets accelerate the creation of game levels of varied difficulty, as they can simply be placed at the desired positions inside the game environment.
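Spawning such prefabricated stage sets along the lane could be done with a small spawner component like the following sketch; the prefab array, segment length, and random selection are illustrative assumptions.

```csharp
using UnityEngine;

// Minimal sketch of how premade stage-set prefabs (ramp + speed lane + sidebars,
// or obstacle/prize arrangements) could be spawned along the ski lane.
public class StageSpawner : MonoBehaviour
{
    public GameObject[] stageSetPrefabs;   // prefabs composed inside Unity
    public float segmentLength = 30f;      // distance covered by one stage set
    float nextSpawnZ;

    // Called at game start and again as the skier advances down the lane.
    public void SpawnNextSegment()
    {
        int index = Random.Range(0, stageSetPrefabs.Length);
        Vector3 position = new Vector3(0f, 0f, nextSpawnZ);
        Instantiate(stageSetPrefabs[index], position, Quaternion.identity);
        nextSpawnZ += segmentLength;
    }
}
```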

5. Discussion and Future Work

Regular physical exercise and appropriate training can significantly improve motor function, postural control, balance, and strength in Parkinson’s disease (PD) patients. This work presents an interactive 3D game platform built on the Unity 3D game engine which hosts two exergames (the Balloon Goon game and the Slope Creep game) developed specifically for PD patients with mild to moderate motor symptoms (Hoehn & Yahr stages II and III). The platform employs Microsoft’s Kinect sensor to capture patient movement in real time, without handheld controllers or external raised platforms that may endanger or impede PD patients in particular. Main platform navigation and gameplay adhere to PD-specific design requirements and principles drawn from the bibliography and presented in [21]. The choice of the Kinect sensor offers a unique opportunity to create game systems that facilitate patient monitoring during exercise and provide real-time feedback, such as on-screen guiding artifacts, repetition counters, and performance data.

A unique contribution of the present work is the design and implementation of a custom collection of platform gestures tailored to the PD condition. These gestures have been designed mainly to enforce correct exercise form and execution and are employed as a motion “vocabulary” to facilitate game navigation and interaction with game objects. For example, when a macroscopic physical movement of a player “matches” a preprogrammed gesture, an on-screen menu item is selected or a 3D cartoon avatar responds, according to the current game context. It is of paramount importance that these gestures have emerged from existing training programs aiming at improving postural stability and reflexes as well as increasing overall mobility for the upper and lower limbs. Therefore, to reap the benefits claimed by these training programs, the exercises should be followed accurately and in correct form; our way of enforcing this is through the vocabulary of gestures. Furthermore, within the context of each game, these gestures are parameterized in levels of difficulty so as to tax the cognitive and neuromuscular systems to various degrees and promote their real-time collaboration.

Our approach therefore departs significantly from existing works [22, 23, 25, 26], which adopt an objective-based approach that allows free body form during gameplay as long as the game objective is met; being gesture-based, our approach is much more disciplined and structured. That is, a given gesture is recognized as such only if the player’s motion follows specific motion patterns. For example, in the Balloon Goon game, proper detection of a “Punch” gesture requires the player to extend an arm from the resting position up and forward until it is fully extended to the front (as in panel (b) in Figure 3), and proper detection of a “Kick” gesture requires the player to both lift and extend a leg forward until it is fully extended to the front (see panel (c) in Figure 3). Moving on to the Slope Creep game, for the “Push Both Ski Poles” gesture to be recognized and have the avatar respond as programmed, the player must (a) stretch both arms in unison to a frontal extension, (b) lower them back to the initial position, again in unison, and (c) complete stages (a) and (b) within a specified time period. If, for example, one of the player’s arms lags “significantly” behind the other in stage (a) or in stage (b), or if the entire movement pattern (a)+(b) takes too long, the gesture tracking logic discounts the motion pattern being tracked as a legitimate gesture and does not allow the on-screen avatar to respond as programmed, i.e., to start moving forward in the game environment. Proper gesture execution is affirmed by calculating positional deviations among the relevant joints (hand, elbow, and shoulder for arm movements; hip, knee, and ankle for leg movements) and deriving the relevant angles in real time. Similar timing constraints are applied to the “Lean Torso to the Side” and “Squat” gestures of the Slope Creep game. Parameterization is such that parameter values can be set for a spectrum of patients (e.g., stage II on the Hoehn & Yahr scale) but also fine-tuned or even relaxed on a per-patient basis, according to the directions or feedback of an attending physiotherapist. The latter approach is meaningful in situations where patients are more affected and become frustrated by repeated failure as they get acquainted with platform navigation and the games themselves.
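To illustrate, such per-stage or per-patient tuning could be captured in a Unity ScriptableObject profile like the one sketched below; the field names, default values, and the Relax helper are assumptions of this sketch, not the platform’s actual configuration.

```csharp
using UnityEngine;

// Hypothetical gesture-parameter profile illustrating how detection could be
// tuned per Hoehn & Yahr stage or per patient.
[CreateAssetMenu(menuName = "Exergames/GestureProfile")]
public class GestureProfile : ScriptableObject
{
    [Header("Arm / leg extension")]
    [Range(0.5f, 1f)] public float punchCompletionRatio = 0.85f;
    [Range(0.4f, 1f)] public float kickCompletionRatio = 0.6f;

    [Header("Ski pole push")]
    public float pushTimeWindowSeconds = 1.5f;       // max duration of the push
    public float maxArmAsymmetry = 0.3f;             // how much one arm may lag

    [Header("Torso")]
    public float leanThresholdDegrees = 15f;
    public float squatDepthRatio = 0.75f;

    // Calling Relax with a factor below 1 loosens all thresholds at once,
    // e.g., for a more affected patient, per a physiotherapist's directions.
    public void Relax(float factor)
    {
        punchCompletionRatio *= factor;               // shorter reach accepted
        kickCompletionRatio *= factor;
        pushTimeWindowSeconds /= factor;              // more time allowed
        maxArmAsymmetry /= factor;                    // more left/right lag tolerated
        leanThresholdDegrees *= factor;               // gentler lean accepted
        squatDepthRatio = Mathf.Min(1f, squatDepthRatio / factor); // shallower squat
    }
}
```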

In addition, of significant importance to clinical motor assessment is the platform’s ability to quantify patient mobility/dexterity on a per-exercise basis using exercise-specific performance metrics. Although such detailed “kinesiological imprints” can be affected by various factors, such as time of day, patient tiredness, effectiveness of administered drugs, and on/off times, meaningful and statistically sound results are possible over a period of a few days of using the platform, for example by exercising early in the day and at a consistent time after taking medication. As a result, carefully customized daily exercise schedules afford the possibility of collecting a time series of performance data that can be usefully correlated with, e.g., detailed medication history records and disease progression.

The platform has recently been demoed to local physiotherapists attending Parkinson patients with mild to moderate motor symptoms, to obtain feedback with respect to the body kinesiology entailed in each individual game. We actively seek to identify and address safety issues not already foreseen by the principles laid out in, e.g., [21], and thus not already incorporated into our design, and to fine-tune parameters related to the game experience. The latter includes fine-tuning the parameter values describing the looseness or strictness of the individual gestures per PD stage or even on a per-patient basis, possibly reorganizing difficulty levels into customizable exercise paths, and dynamically motivating patient participation in game scenarios that are more challenging at both the physical and the cognitive level. Based on early positive feedback, we also seek funding to fully deploy the platform in selected physiotherapy establishments and, at a later stage, possibly obtain the proper licensing to allow willing PD patients to run the platform in their homes, partly to enable the collection of rich time series of performance data that can be usefully analyzed and correlated with, e.g., detailed medication history records and disease progression. It is our belief, however, that safe, carefully designed and implemented exergame platforms built specifically for PD patients can be used in conjunction with more traditional physiotherapy programs to enliven a daily exercise schedule, as they are based on movements drawn from existing training curricula tailored to the disease, yet offered in a looser and more playful environment.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

Initial/preliminary results of the research presented in this article, namely the Balloon Goon game, appeared as a conference paper [32]. The present article is significantly expanded in that it (a) incorporates that game into a new exergame platform for Parkinson patients, (b) adds an entirely new game, (c) describes the architectural and design components of each game, and, finally, (d) adds unifying, gesture-based game-level navigation appropriate for Parkinson patients. The authors would also like to acknowledge that the work described herein was partially funded by the first internal research funding program of TEI of Crete.