Gamified Digital Balance Assessment for Older Adults in Community Settings: Development, Validation, and User Experience Evaluation in a Mixed Methods Study

Sponsor

Shanghai Jiao Tong University School of Medicine

Status

Completed

Conditions

Elderly People
Digital Health
Fall Prevention

Treatments

Device: Brief-BESTest assessment
Device: Digital balance assessment tool
Device: Gamified Digital Balance Assessment (GDBA)

Study type

Interventional

Funder types

Other

Identifiers

NCT06958653
Balance assessment tool

Details and patient eligibility

About

The study was conducted in two sequential phases to evaluate the reliability and user experience of a Gamified Digital Balance Assessment (GDBA) tailored for community-dwelling older adults.

Phase 1: Reliability of the digitalized Brief-BESTest assessment

In the first phase, participants performed a single balance assessment session during which both the clinician-administered Brief-BESTest and the digitalized Brief-BESTest were scored concurrently. This approach enabled direct comparison between clinical and automated assessments under identical task conditions.

Testing was conducted in a controlled indoor setting featuring a 1 m × 1 m, 10 cm-thick EVA foam mat (35D density) and safety handrails on three sides. Prior to the assessment, participants completed a baseline questionnaire collecting demographic data (age, sex), anthropometric measurements (height, weight), and fall history (past 12 months). Written informed consent was obtained from all participants.

During the assessment, a certified physical therapist delivered standardized verbal instructions and rated each task using the validated Brief-BESTest rubric (maximum score = 24). Simultaneously, the digitalized Brief-BESTest system recorded participants' movements with a monocular 4K camera and calculated scores via an algorithm that mirrored the original scoring criteria. Torso and joint movements were analyzed in real time, and balance scores were computed automatically.
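The registry does not describe the scoring algorithm itself. Purely as an illustration under assumed inputs, the sketch below shows how pose keypoints extracted from the camera feed could be mapped to a 0-3 ordinal item score in the Brief-BESTest style; the keypoint choice, frame rate, and thresholds are hypothetical, not the study's actual criteria.

```python
import numpy as np

# Illustrative sketch only: the registry does not describe the scoring
# algorithm, so the inputs, thresholds, and score mapping below are
# hypothetical. It assumes 2D pose keypoints have already been extracted
# from the 4K video at a known frame rate.

FPS = 30  # assumed camera frame rate


def sway_amplitude(mid_hip_xy: np.ndarray) -> float:
    """Peak-to-peak mediolateral excursion of the mid-hip keypoint (pixels)."""
    return float(mid_hip_xy[:, 0].max() - mid_hip_xy[:, 0].min())


def score_single_leg_stance(foot_lifted: np.ndarray) -> int:
    """Map hold duration to a 0-3 ordinal item score, mirroring the
    Brief-BESTest convention of grading each item from 0 (worst) to
    3 (best). Thresholds are placeholders, not the study's criteria."""
    hold_seconds = foot_lifted.sum() / FPS  # frames with the foot off the mat
    if hold_seconds >= 20:
        return 3
    if hold_seconds >= 10:
        return 2
    if hold_seconds > 2:
        return 1
    return 0


# Example with synthetic data: a 15-second hold with modest hip sway.
rng = np.random.default_rng(0)
mid_hip = rng.normal(loc=[960.0, 1200.0], scale=[8.0, 3.0], size=(600, 2))
lifted = np.zeros(600, dtype=bool)
lifted[30:480] = True  # foot off the mat for 15 s at 30 fps
print(score_single_leg_stance(lifted), round(sway_amplitude(mid_hip), 1))
```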

To evaluate inter-rater reliability, a second trained clinician independently rated 20% of the sample. This concurrent scoring design ensured consistent task execution while allowing the automated system's scores to be evaluated against expert clinician judgment (inter-method reliability).
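The registry does not name the reliability statistics. As an illustration only, the sketch below computes two common choices for this design, quadratic-weighted Cohen's kappa for agreement between the two clinicians on the 20% subsample and Pearson's r for clinician versus automated total scores, using synthetic placeholder data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Sketch only: the registry does not specify the reliability statistics.
# Quadratic-weighted kappa is a common choice for ordinal item scores,
# and Pearson's r is shown for clinician vs. automated total scores.
# All data below are synthetic placeholders.

rng = np.random.default_rng(1)

# Inter-rater agreement on the 20% subsample (item-level 0-3 scores).
rater_1 = rng.integers(0, 4, size=64)                              # primary therapist
rater_2 = np.clip(rater_1 + rng.integers(-1, 2, size=64), 0, 3)    # second clinician
kappa_w = cohen_kappa_score(rater_1, rater_2, weights="quadratic")

# Inter-method agreement: clinician vs. automated total scores (0-24).
clinician_total = rng.integers(10, 25, size=40)
automated_total = np.clip(clinician_total + rng.integers(-2, 3, size=40), 0, 24)
r = np.corrcoef(clinician_total, automated_total)[0, 1]

print(f"weighted kappa = {kappa_w:.2f}, Pearson r = {r:.2f}")
```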

Phase 2: Impact of the GDBA on User Experience

The second phase was a parallel-group randomized controlled trial assessing the impact of gamification on user experience. Participants were randomly assigned (1:1) to either the control group (using the digitalized Brief-BESTest) or the experimental group (using the GDBA) through a simple coin-flip randomization performed by a blinded researcher. Testing was conducted in a 1 m × 3 m evaluation zone equipped with front, side, and rear safety railings and a centrally placed EVA foam pad (identical to Phase 1). The DBTS system included a display screen, a Logitech Brio 4K webcam (30 fps) for motion tracking, and a built-in speaker for voice prompts. A detachable, ergonomically designed user console, compliant with Chinese anthropometric standards, was mounted on the front railing for interface navigation (see Figure 2).
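The protocol specifies simple 1:1 coin randomization performed by a blinded researcher. The study used a physical coin rather than software; the minimal sketch below is only a programmatic equivalent of that allocation scheme.

```python
import secrets

# Minimal sketch of the simple 1:1 allocation described in the protocol.
# The study used a coin-based method administered by a blinded researcher;
# this programmatic equivalent is illustrative only.

def allocate(participant_ids):
    """Assign each participant to 'control' (digitalized Brief-BESTest)
    or 'experimental' (GDBA) with an independent fair coin flip."""
    return {
        pid: "experimental" if secrets.randbelow(2) else "control"
        for pid in participant_ids
    }

groups = allocate(range(1, 41))  # 40 enrolled participants
print(sum(1 for g in groups.values() if g == "experimental"), "assigned to GDBA")
```

Because simple randomization assigns each participant independently, an exact 20/20 split in a 40-person sample is not guaranteed; block randomization is the usual alternative when strict balance is required.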

In the control group, participants performed balance tasks following pre-recorded verbal instructions from a certified physical therapist. In the experimental group, tasks were presented via the GDBA interface, which included animated avatars, voice guidance, progress indicators, and real-time performance feedback. Each participant completed one practice trial per task to minimize learning effects, followed by the formal assessment. A 2-minute seated rest period was provided between tasks to reduce fatigue.

Immediately following the assessment, participants completed self-report measures on perceived exertion, intrinsic motivation, and intention for continued use. They then participated in a brief semi-structured interview exploring their perceptions of system usability and engagement. All interviews were audio-recorded and transcribed for thematic analysis. Participants received a nominal compensation (USD $10 equivalent) upon study completion.

Enrollment

40 patients

Sex

All

Ages

60+ years old

Volunteers

Accepts Healthy Volunteers

Inclusion criteria

  • aged 60 or older,
  • living independently,
  • able to walk with or without an assistive device (without external help),
  • willing and able to provide informed consent.

Exclusion criteria

  • conditions that impede walking (e.g., hip fractures, lower limb amputations, hemiparesis),
  • medications causing dizziness or affecting balance (e.g., psychotropic drugs),
  • self-reported cardiovascular, pulmonary, neurological, musculoskeletal, or mental disorders,
  • severe fatigue or pain,
  • severe uncorrected vision or hearing impairments that may affect their ability to interact with the digital system

Trial design

Primary purpose

Prevention

Allocation

Randomized

Interventional model

Parallel Assignment

Masking

Single Blind

40 participants in 3 patient groups

Control group
Active Comparator group
Description:
Participants in the control group used the Brief-BESTest to assess their balance ability
Treatment:
Device: Brief-BESTest assessment
Experimental group
Active Comparator group
Description:
Participants in the experimental group used the Gamified Digital Balance Assessment to assess their balance ability
Treatment:
Device: Digital balance assessment tool
Gamified group
Experimental group
Description:
The GDBA builds upon the digitalized Brief-BESTest by incorporating evidence-based gamification elements designed to enhance motivation and engagement among older adults. The gamification design was guided by self-determination theory, which posits that autonomy, competence, and relatedness are key drivers of intrinsic motivation, and by recent systematic reviews on gamification for older adult health interventions.
Treatment:
Device: Gamified Digital Balance Assessment (GDBA)

Data sourced from clinicaltrials.gov
