
BuskUs
Designers & prototype developers: Ke Zhang / Lisa Miller / Muzi Hu
For one of my Master's projects I focused on Sydney's growing busking culture. We observed that today's busking audience needs a way to communicate and engage with buskers; doing so could promote audience interaction and increase the profitability of street performance.

Aspect 01
Our project “BuskUs” has two aspects. The first is an interactive screen that uses lighting effects to attract the audience's attention and displays instruments that let the audience participate in the musical performance.

Aspect 02
The second aspect of BuskUs is an app/website that lets the audience make song requests to the busker. The busker can see the votes from the audience and choose what to play based on demand.
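The app itself only reached the sitemap, wireframe and mockup stage shown under Aspect 02 below, but the request-and-vote flow can be illustrated with a small, purely hypothetical data model; the class and method names here are illustrative and not part of the delivered prototype.

import java.util.*;

// Hypothetical sketch of the BuskUs request-and-vote flow (illustration only).
// Audience members request or upvote songs; the busker reads the list
// sorted by demand and decides what to play next.
class SongRequest {
  final String title;
  int votes = 0;
  SongRequest(String title) { this.title = title; }
}

class RequestBoard {
  private final Map<String, SongRequest> requests = new HashMap<>();

  // An audience member requests (or upvotes) a song: each call adds one vote.
  void vote(String title) {
    requests.computeIfAbsent(title, SongRequest::new).votes++;
  }

  // The busker's view: all requested songs, most-voted first.
  List<SongRequest> byDemand() {
    List<SongRequest> board = new ArrayList<>(requests.values());
    board.sort((a, b) -> Integer.compare(b.votes, a.votes));
    return board;
  }
}

On the busker's screen, a list ordered by vote count (as returned by byDemand) would be enough to show which requests are most in demand.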

HCD Design Methods
HCD methods were applied throughout the design process.
We ran multiple rounds of user testing to better meet the objectives of each stage.

Observation
Observing users' behaviour without interrupting them, physically or verbally
Logging
Writing down details from the observation as specifically as possible
Interview
Semi-structured interviews with the participant
To test the visualisation we used semi-structured interviews in which the user is asked to perform a series of tasks. The purpose is to uncover usability flaws so they can be revised. These interviews also gave users the opportunity to suggest how they would feel most comfortable using the interface.

I followed the Double Diamond process as a design framework. By working through background research, insight analysis, ideation, sketching, low-fidelity mock-ups, initial user testing, high-fidelity mock-ups, further user testing, and final refinement, we developed a better understanding of true user needs.
Aspect 01
Interactive Mockup
In the interactive visualisation I developed, people see a live silhouette of themselves holding different instruments depending on where they stand. Multiple participants can join in at the same time, so they can even form a “band”.
The coding logic behind the application is based on face tracking using OpenCV in Processing. Images captured from a video camera are processed frame by frame. Whenever a human face is detected, an instrument image is displayed, scaled and positioned according to the size and position of the face. At the same time, a silhouette is drawn to highlight the shape of the participant's body and provide visual feedback.
Development environment
Processing 2.2.6
Open Source Libraries used:
OpenCV and Video
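
A minimal sketch of this face-tracking logic, assuming the Processing environment and the OpenCV for Processing and Video libraries listed above, is shown below. The instrument graphic ("guitar.png") is a placeholder, and the instrument switching and silhouette drawing of the actual prototype are omitted.

import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture video;
OpenCV opencv;
PImage instrument;

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);  // frontal-face Haar cascade
  instrument = loadImage("guitar.png");            // placeholder instrument graphic
  video.start();
}

void draw() {
  opencv.loadImage(video);
  image(video, 0, 0);  // draw the live camera frame as the background

  // For every detected face, draw an instrument scaled to the face size
  // and positioned just below it, roughly at chest height.
  for (Rectangle face : opencv.detect()) {
    float s = face.width * 2.0f;
    image(instrument, face.x - face.width / 2, face.y + face.height, s, s);
  }
}

void captureEvent(Capture c) {
  c.read();  // read the next frame whenever one is available
}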




Photos of the usability test on the low-fidelity interactive prototype for stage 01
Aspect 02
Sitemap Example


Wireframe Example
Mockup Example

