Google Glass prototype: https://www.justinmind.com/usernote/tests/13029184/13029208/13039767/index.html

The prototype was created using Justinmind Prototyper, which supports prototyping for Google Glass. At the time we created the prototype we did not have a Glass device, so the prototype should be viewed in a web browser. The card at the top of the prototype view represents the user's view: it is what they see when they look at the Glass screen. To represent the gestures one would actually use to operate Google Glass (swiping up/down/left/right and tapping to select), the prototype includes buttons below the view screen, each labeled with the appropriate gesture. Thus, to simulate a swipe up, click the button labeled "swipe up". To demonstrate a typical usage scenario, the application shows a user trying to order food in a foreign restaurant. The phrase suggestions are all hard-coded because the prototyping tool cannot access geographic information, databases, or external translation tools. The prototyper also cannot record or produce sound, so we added buttons to the screen (highlighted in blue) that emulate events such as the waiter finishing speaking or the app pronouncing a word. These buttons will not appear in the final, finished UI, since there are no buttons on a Google Glass device; they exist only to help viewers of the prototype move from screen to screen.

Application Prototype: https://www.fluidui.com/editor/live/preview/p_WA6Pld5uRoSaCNqym8d8ZBBrFe6mtTdl.1414639067927

This prototype was created using FluidUI. The app is navigated by tapping with the cursor, which appears as a circle in the browser. The first tab is Progress, which shows a hard-coded progress page. The second tab is Practice, which takes you to an example screen of what a guided practice session would look like.
The next tab is Transcripts. Clicking the first transcript shows an example of how a transcript would appear. The last tab is Missed Words, which shows the words a user may have mispronounced or not known throughout the day. Tapping the speaker icon next to the first word pronounces it (our prototype cannot play sound, so it instead indicates that the app is speaking by changing the icon). Clicking the first word takes you to a translation of that word (although the real gesture we would like to use is swiping).