In this experiment, a haptic glove with vibratory motors on the fingertips was tested against the standard HTC Vive controller to determine whether the additional vibrations provided by the glove increased immersion in common gaming scenarios where haptic feedback is provided. Specifically, two scenarios were developed: an explosion scene containing a small and a large explosion, and a box interaction scene that allowed participants to touch a virtual box with their hand. At the start of this project, it was hypothesized that the haptic glove would have a significant positive impact in at least one of these scenarios. Nine participants took part in the study, and immersion was measured through a post-experiment questionnaire. Statistical analysis of the results showed that the haptic glove had a significant impact on immersion in the box interaction scene, but not in the explosion scene. In the end, I conclude that since this haptic glove does not significantly increase immersion across all scenarios when compared to the standard Vive controller, it should not be used as a replacement in its current state.
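As an illustration of the kind of within-subjects analysis described here, the sketch below compares hypothetical per-participant immersion ratings for the glove and controller conditions using a Wilcoxon signed-rank test. The scores, scale, and choice of test are assumptions for illustration, not the study's actual data or procedure.

```python
# Minimal sketch of a within-subjects comparison, assuming each participant
# rated immersion (e.g., on a Likert scale) for both conditions.
# The scores below are placeholders, not the study's data.
from scipy.stats import wilcoxon

glove_scores      = [6, 5, 7, 6, 6, 7, 5, 6, 7]  # hypothetical ratings, one per participant
controller_scores = [4, 5, 5, 4, 6, 5, 4, 5, 5]

stat, p_value = wilcoxon(glove_scores, controller_scores)
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p_value:.3f}")
# A p-value below the chosen alpha (commonly 0.05) would indicate a
# significant difference in reported immersion between the two conditions.
```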
This thesis brings together three different components: non-Euclidean geometric worlds, virtual reality, and environmental puzzles in video games. While all three exist in their own right in the world of video games, as well as combined in pairs, there are virtually no examples of all three together. Non-Euclidean environmental puzzle games have existed for around ten years in various forms, short environmental puzzle games in virtual reality have appeared over roughly the past five years, and non-Euclidean virtual reality exists mainly as short non-game demos from the past few years. This project seeks to bring these components together to create a proof of concept for how such a game should function, particularly the integration of non-Euclidean virtual reality in the context of a video game. To do this, a Unity package was used that relies on a custom system for building worlds in a non-Euclidean way rather than on Unity's built-in components for transforms, collisions, and rendering. This was used in conjunction with the SteamVR integration for Unity to create a cohesive and immersive player experience.
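To illustrate the idea behind a custom non-Euclidean world system (this is a conceptual sketch, not the Unity package used in the project), the example below models rooms connected by portals in 2D: crossing a portal remaps the player's position into the destination room's local frame, which is why a single consistent global transform, as provided by built-in engine components, is not enough. All names and numbers are hypothetical.

```python
# Conceptual 2D sketch of non-Euclidean traversal: space is defined per-room,
# and crossing a portal remaps the player's coordinates into the destination
# room's frame, so connectivity need not fit one global Euclidean layout.
import math
from dataclasses import dataclass

@dataclass
class Portal:
    dest_room: str
    dx: float        # translation applied when crossing
    dy: float
    rotation: float  # radians; how the destination frame is rotated

# Hypothetical two-room loop: exiting either room to the "north" places you
# in the other room with the frame flipped, a connectivity that a single
# consistent global layout cannot represent.
portals = {
    ("A", "north"): Portal("B", dx=0.0, dy=-10.0, rotation=math.pi),
    ("B", "north"): Portal("A", dx=0.0, dy=-10.0, rotation=math.pi),
}

def cross(room, side, x, y):
    """Remap a local (x, y) position when the player crosses a portal."""
    p = portals[(room, side)]
    c, s = math.cos(p.rotation), math.sin(p.rotation)
    nx, ny = c * x - s * y + p.dx, s * x + c * y + p.dy
    return p.dest_room, nx, ny

print(cross("A", "north", 1.0, 10.0))  # the player's coordinates in room B's frame
```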
This project seeks to motivate runners by creating an application that selectively plays music based on smartwatch metrics. It analyzes metrics collected through a person's smartwatch, such as heart rate or running power, and then selects the music that best fits the workout's intensity. This way, as the workout becomes harder for the user, increasingly motivating music is played.
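A minimal sketch of this selection logic follows, assuming heart rate is sampled from the smartwatch and each track carries a pre-assigned intensity tag from 1 (calm) to 5 (highly motivating); the zone thresholds and track names are illustrative, not values from the project.

```python
# Map the current effort level to a target music intensity, then pick the
# closest-matching track. Thresholds and library are hypothetical.
def intensity_for_heart_rate(bpm, max_hr=190):
    """Map current heart rate to a target music intensity (1-5)."""
    pct = bpm / max_hr
    if pct < 0.60:
        return 1
    if pct < 0.70:
        return 2
    if pct < 0.80:
        return 3
    if pct < 0.90:
        return 4
    return 5

def pick_track(library, bpm):
    """Choose the track whose intensity tag best matches the current effort."""
    target = intensity_for_heart_rate(bpm)
    return min(library, key=lambda track: abs(track["intensity"] - target))

library = [
    {"title": "Easy Jog", "intensity": 1},
    {"title": "Tempo Run", "intensity": 3},
    {"title": "Final Sprint", "intensity": 5},
]
print(pick_track(library, bpm=172)["title"])  # harder effort -> more motivating track
```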
As of 2020, every state and territory in the United States had an adult obesity rate of at least 20%, with Arizona's rate between 30% and 35% (CDC, 2021). Overweight and obesity are linked to increased risk of heart disease, stroke, type 2 diabetes, high blood pressure, certain cancers, and other chronic conditions (NIH, 2018). These high rates are partly due to a work environment that has become increasingly sedentary with the rise of labor-saving technologies such as computers; as a result, sedentary jobs have increased 83% since 1950 (American Heart Association, 2018). Our proposed solution to this problem of people not getting enough exercise is Bet Fitness, a mobile app that uses social and financial incentives to motivate users to exercise consistently. The core idea of Bet Fitness is to bet money on your health. You first create a group with friends or people you want to compete with, then put a specified amount of money into the betting pool. Users then have to exercise a specified number of days over a set period (for instance, three times a week for a month). Workouts are verified by the other members of the group: you can send photos in a group chat, link your Fitbit or other health data, or simply have another member vouch that you worked out. Anyone who fails to keep up with the bet loses the money they put in, and it is distributed equally among the other members of the group. According to our initial survey, this idea has generated much interest among college students.
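The settlement rule described above can be sketched as follows; the function, member names, and stakes are hypothetical and only illustrate the forfeit-and-redistribute mechanic.

```python
# Members who miss the workout target forfeit their stake, which is split
# equally among the members who kept up. All values are illustrative.
def settle_pool(stakes, completed):
    """Return each member's payout after the betting period ends."""
    winners = [m for m in stakes if m in completed]
    losers = [m for m in stakes if m not in completed]
    forfeited = sum(stakes[m] for m in losers)
    share = forfeited / len(winners) if winners else 0.0
    payouts = {m: 0.0 for m in losers}
    payouts.update({m: stakes[m] + share for m in winners})
    return payouts

stakes = {"Alice": 20.0, "Bob": 20.0, "Cara": 20.0}
print(settle_pool(stakes, completed={"Alice", "Cara"}))
# -> {'Bob': 0.0, 'Alice': 30.0, 'Cara': 30.0}
```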
This project, the Zoom Room, explores the use of virtual reality (VR) for workspace productivity. It is where Zoom and VR meet to form an enhanced, productive workspace for users. Equipped with two 3D printers that show how a 3D printer moves and the intricate parts that make it up, it is much more than a standard meeting room. It is a place to analyze machines and meet with others in a virtual space.
When the average person uses a computer, they interact with two main groups of devices: the Computer Input, which consists of a keyboard and a mouse, and the Computer Output, which consists of a monitor and speakers. For those with physical disabilities, traditional Computer Input and Output methods can be difficult or uncomfortable to use. I believe VR technology can make using computers much more accessible for these individuals, and my application demonstrates that belief.
The purpose of this thesis is to provide further insight into the development of extended reality (XR), augmented reality (AR), and virtual reality (VR) technologies in education, and to survey how well they are received and whether they provide additional learning benefit compared with more traditional mediums such as reading textbooks or watching videos on the subject matter. The research was a collaborative effort with personnel from the School of Biological and Health Systems Engineering (SBHSE), using their resources to generate a framework with the aforementioned technologies and to aid the development of a web-based XR system. That system serves primarily as a means for SBHSE students at Arizona State University (ASU) to enhance their learning of topics such as human anatomy and physiology, with the potential of extending the technology to other subjects, including other STEM-related fields. This document first covers the initial research, including an analysis of pertinent readings that support a benefit of using XR technology to deliver course content. It then covers the design and development of the base framework carried out in joint collaboration with the SBHSE. To conclude, it covers a case study conducted to generate data supporting the argument, along with its results, which point to a future development plan and next steps once the developed materials and research are handed off.