- Open Access
- Authors: Kale Arati Kashinath, Kudkyal Bhavana Ambadas, Desai Rushika Rajendra, Ahire Shruti Sanjay, Survase Vidhya Shimant
- Paper ID: IJERTV11IS050045
- Volume & Issue: Volume 11, Issue 05 (May 2022)
- Published (First Online): 13-05-2022
- ISSN (Online): 2278-0181
- Publisher Name: IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Augmented Reality in Anatomy
1 Mrs. Kale Arati Kashinath Department of Information Technology JSPMS JSCOE Hadapsar
Pune, India
2 Miss. Kudkyal Bhavana Ambadas Department of Information Technology JSPMS JSCOE Hadapsar
Pune, India
3 Miss. Desai Rushika Rajendra Department of Information Technology JSPMS JSCOE Hadapsar
Pune, India
4 Miss. Ahire Shruti Sanjay Department of Information Technology JSPMS JSCOE Hadapsar
Pune, India
5 Miss. Survase Vidhya Shimant Department of Information Technology JSPMS JSCOE Hadapsar
Pune, India
Abstract— Medical students who want to study concepts such as the digestive system, the heart, and other systems that are hard to imagine currently have to understand everything from pictures, diagrams, and books. AR in Anatomy is an app that lets students explore the human body to learn how it works. Using the app, users can see the anatomy of different body parts in action, rotate them, and zoom in and out; the app also labels even the small parts. Because anatomical diagrams are complex and printed in 2D in books, they are very difficult to visualise, and AR helps students better remember the knowledge they have just learned. With our app, students can visualise these diagrams easily and effectively.
Keywords: Augmented Reality, Unity, AR Core, AR Scene, Anatomy, Medical Student, Human Body, Cloud, Body Parts.
I. INTRODUCTION
This project investigates how effective learning experiences can be improved with a technology called Augmented Reality (AR). With the help of AR, students learn anatomy by memorising body organs and examining them within the range of body functions. Conceiving the three-dimensional relationships of various organs and their interdependent functions is a major difficulty in this task. The method described in this paper helps and guides students by providing an AR view of the anatomical details, which offers a structured learning approach to the material.
This is a research project to study whether augmented reality (AR) is an effective tool for learning anatomy concepts. Anatomy is a very important subject, fundamental to many related fields such as health science, and a medical student who wants to study concepts like the digestive system, the heart, and other systems that are hard to imagine currently has to understand everything from pictures, diagrams, and books. With the help of AR technology, it is expected that learning becomes more engaging and effective, since technology-aided learning also provides flexible accessibility.
II. PROBLEM DEFINITION
To solve the problem of visualising and imagining complex human anatomy parts using an Augmented Reality-based app, making studying simple, fun, and responsive.
III. AUGMENTED REALITY SYSTEM
Hardware — The system requires a camera to capture the real world in order to determine where the virtual components must be drawn. After processing the captured image, the system obtains the actual camera position and orientation relative to physical markers and computes where the virtual elements must be rendered. The image that appears on the phone also appears on the PC screen.
Software — We built the system using Unity 3D; the version used was Unity 3D with Vuforia support. Unity enables users to create games and experiences in both 2D and 3D, and the engine offers scripting in C#. Vuforia is an integrated Software Development Kit (SDK) for smartphones that allows the production of augmented reality applications; it uses computer-vision technology to continuously recognise and track planar images and 3D objects. The ability to register images lets developers position and place virtual objects relative to them.
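As an illustration of how such a marker-driven display could be scripted in Unity, the following C# sketch (not taken from the paper; the component and method names AnatomyMarkerHandler, OnMarkerFound, OnMarkerLost, and organModel are hypothetical) keeps a 3D organ model hidden until the image marker is recognised. In a Vuforia project these methods would typically be wired in the Inspector to the target's "found"/"lost" events exposed by the SDK's default event handler, whose exact name varies by Vuforia version.

```csharp
using UnityEngine;

// Minimal sketch (not the authors' actual code): show or hide a 3D anatomy
// model when the image marker is detected or lost.
public class AnatomyMarkerHandler : MonoBehaviour
{
    // 3D organ model (e.g., heart or digestive system) placed as a child of the image target.
    [SerializeField] private GameObject organModel;

    private void Start()
    {
        // Keep the model hidden until the marker is recognised in the camera feed.
        organModel.SetActive(false);
    }

    // Called when the marker is recognised: render the virtual organ over the marker.
    public void OnMarkerFound()
    {
        organModel.SetActive(true);
    }

    // Called when tracking is lost: hide the organ again.
    public void OnMarkerLost()
    {
        organModel.SetActive(false);
    }
}
```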
IV. RELATED WORK
Learning anatomy requires students to understand and remember a great deal of knowledge and to contextualise it within the spectrum of body functions. In this task, visualising the relationships of different organs in three dimensions and their interdependent activities is a major challenge. The framework outlined in that paper offers a systematic learning approach to the material by presenting a virtual version of the anatomical information under investigation, supporting students in their study. It is a research project exploring whether haptics are an efficient instrument for studying anatomy while providing access to the more immersive encounters of augmented reality (AR), and how a technology called Augmented Reality with haptics can enhance successful learning environments. Technology-aided learning typically provides flexible usability, and it is anticipated that an intuitive immersive approach such as AR with haptics can yield more stimulating and effective learning environments. [1]
In the subject of genetics, human body anatomy is an important concept that needs to be understood from junior high school onward. Many instructional resources are available in the form of books and anatomy mannequins (puppets), but these are still inadequate to help students understand the anatomy of the human body. Augmented Reality (AR) is a technology that interactively merges the real world with a simulated world. The aim of that work is to construct an AR application so that learning human body anatomy becomes more interesting and easier for students, helping them study the anatomy of the human body through interaction with 3D models in addition to the textbooks and mannequins used beforehand. The method of analysis is quantitative: information is gathered and a prototype is then created to show the effect. Application development follows the waterfall approach, which involves planning (data collection and analysis), design (user interface and diagrams), implementation, and testing. The outcome of the study is an AR framework for studying human body anatomy featuring 3D objects, organ explanations, and locations that can be viewed on the web. [2]
Due to limitations in visualising body anatomy from 2D images into 3D, students typically face difficulties in studying human body anatomy. That project aims to use augmented reality technology to build a human anatomy learning system with which students can clearly understand the human body's anatomy using a 3D rendering of the image. The tool used in the framework is a marker-based mobile augmented reality application: the marker is captured by taking a picture, the captured image is broken into parts, and the pattern is matched against the images stored in the archive. The authors used the Floating Euphoria framework combined with an SQLite database. The resulting anatomy system can reveal the whole body or parts of the human body interactively. They validated the system with high school students and medical students to ascertain the utility of the application, and the findings demonstrate that this visualisation of the human anatomy learning system lets students study human anatomy more effectively. [3]
Museums of human anatomy specimens are widely used by students of surgery, nursing, and paramedics. The specimens stored in these museums enable students to understand the intricate relationships of organs and systems in more detail, through dissection and prosection, than textbooks can offer. However, without additional descriptions from a docent or supplementary examples, it can be difficult for students, particularly novices, to recognise the different parts of these anatomical structures. Augmented reality (AR) has increasingly been used in museum exhibits to view simulated objects in images taken from the physical world, and this technology can greatly change the learning environment. Three AR-based support systems for tours in medical specimen museums were developed in that study, and their learning usability and efficacy were analysed. The first system was constructed using AR markers: by capturing the markers with a tablet camera, it could display virtual label information for specimens. Individual AR markers were needed for every specimen, however, and their presence in and on the exposed specimens could be obtrusive. The second method was therefore created to set the specimen image itself as an image marker, since most specimens were seen in cross-section. [4]
That paper proposes an augmented reality learning method that uses the feedback of a depth camera to teach high school students anatomy interactively. The goal is to illustrate human anatomy using the Microsoft Kinect depth camera by showing 3D models over a person's body in real time. Without using markers for mapping, users can see how bones, muscles, or organs are arranged in their bodies. An AR system can be described as one that enables physical and virtual objects to coexist in real time in the same space. A depth sensor includes a red-green-blue (RGB) camera and an infrared laser depth camera; this instrument measures the relative distance between parts of the human body and, on this basis, knows where each joint is in three-dimensional (3D) space, offering a full-body recording of 3D motion. The Microsoft Kinect is such a depth camera. Thanks to recent advances in computing and the introduction of low-cost depth cameras to the market, a great deal of visual processing can be carried out in real time without concerns about light quality or slow mathematical processing, which were among the key issues with AR technology in the past. [5]
Augmented Reality (AR) is a technology that augments reality with, and enables users to interact with, two- or three-dimensional computer-generated imagery (CGI), objects, and/or details. AR on mobile devices is evolving and provides a great deal of learning and training potential. That article explores the creation process of a prototype learning environment using mobile augmented reality (mAR). The prototype is called Mobile Augmented Reality Human Anatomy (HuMAR), and the anatomy of the human skeletal system is the selected learning subject. HuMAR's primary goal is to help students, which could eventually boost their academic experience; a study has noted that when learning this subject, students' ability to retain and reproduce long-lasting knowledge declines. The design, concept, prototype creation, and outcomes of HuMAR taken from a pilot test are described in the article. The pilot test used an experimental approach with science students from three separate universities, and its goal was to consolidate the user interface from both a didactic and a technological point of view. Based on the findings of the pilot test, it is concluded that students were pleased with HuMAR in terms of its usability and characteristics, which in turn may have a beneficial effect on their learning experience. [6]
A summary of the key aspects of Augmented Reality (AR) and the core principles of this technology is given in that article. It defines the key fields in which AR and essential AR instruments are used today, presents several aspects of AR applications with an outline of each, and discusses future directions. [7]
Augmented Reality (AR) is an increasingly prevalent concept with numerous fields of use. That work examines the outcomes of a survey explicitly designed to chart the ease of adoption of potential functionality to be introduced in AR-enhanced mobile outdoor applications. The survey was designed to give a clearer evaluation of consumer preferences in outdoor settings involving the use of AR, and the preference level was calculated on the basis of product indicators related to different current and relevant models. In relation to the targeted questions, the paper discusses the survey findings on the preference level and suggests a methodological framework for feedback on the adoption of AR technologies. [8]
V. GOALS AND OBJECTIVES
The major objective is to enhance the learning and teaching process by displaying 3D views of medical diagrams to make them more informative. The main goals are:
- Provide an all-angle 3D view of organs for a realistic experience, integrated with detailed labelling and information inside the virtual visualisation.
- Provide easy and clear visualisation of complex, cluttered organ diagrams by presenting an exact virtual replica of the real organs.
- Provide zooming in and out simply by moving the device forward and backward (a code sketch of this behaviour follows this list).
- Use zooming to explore inner, hidden parts of an organ that are otherwise out of sight.
- Provide a better understanding of concepts by learning from a first-person perspective with life-like virtual organs.
- Make the best possible use of technology in medical education and give teachers better tools to deliver content.
- Enable individual users to access learning materials anytime with just their mobile phones.
- Create a joyful environment and an interesting way of learning.
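The zoom-by-moving behaviour and inner-part exploration listed above could be approximated with a script like the following sketch. It is an illustration only, not the paper's implementation: the field names arCamera, outerLayer, and innerLayer and the 0.25 m threshold are assumptions. The idea is to measure the distance between the AR camera and the organ model anchored on the marker and reveal inner structures when the phone moves closer.

```csharp
using UnityEngine;

// Illustrative sketch: reveal inner organ parts when the device moves closer
// to the marker, approximating "zoom by moving forward/backward".
public class DistanceZoomReveal : MonoBehaviour
{
    [SerializeField] private Camera arCamera;        // the AR camera rendering the live feed
    [SerializeField] private GameObject outerLayer;  // e.g., outer surface of the organ
    [SerializeField] private GameObject innerLayer;  // e.g., hidden inner structures
    [SerializeField] private float revealDistance = 0.25f; // metres (assumed threshold)

    private void Update()
    {
        // Distance from the device (camera) to the organ model anchored on the marker.
        float distance = Vector3.Distance(arCamera.transform.position, transform.position);

        // Moving the phone closer than the threshold "zooms in" and exposes inner parts.
        bool showInner = distance < revealDistance;
        innerLayer.SetActive(showInner);
        outerLayer.SetActive(!showInner);
    }
}
```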
VI. SYSTEM ARCHITECTURE
Fig. 1. Architecture of AR in Anatomy
VII. EXPERIMENTAL RESULTS
Here, the students can pick the visual learning mode. The anatomy learning mode with AR starts with capturing a marker: the app captures the live feed through the phone camera and recognises the marker image. First, the application identifies the marker, which determines the content to be shown. In the 3D model presentation phase, the application displays the 3D model at a position relative to the marker. In the student input phase, the application responds to touches on the phone screen and detects any on-screen interaction. If the touch is not directly on an area marked as a displayable part, the procedure simply continues. Second is the information presentation phase: the application receives the touch coordinates and the 3D model coordinates, and the required details about the selected part are shown. The program displays the view obtained from the phone camera on the Android device, and the camera then determines the proximity of the markers.
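A minimal sketch of the touch-to-information step described above is shown below, assuming a Unity scene where each labelled organ part carries a collider. The infoText field and the use of the GameObject name as a stand-in for the stored description are assumptions made for this example, not the paper's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: convert a screen touch into a ray, and if it hits a collider
// on an organ part, show that part's name in a UI text field.
public class OrganPartSelector : MonoBehaviour
{
    [SerializeField] private Camera arCamera;  // camera rendering the AR scene
    [SerializeField] private Text infoText;    // UI element that displays the label/details

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Cast a ray from the touch position into the 3D scene.
        Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // Each labelled organ part is assumed to carry a collider;
            // its GameObject name stands in for the stored description.
            infoText.text = hit.collider.gameObject.name;
        }
    }
}
```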
The screenshots attached below show the actual output of the app.
At the point where the marker is identified, a 3D model of the human body is displayed. For example, once the student touches a part of the body on the 3D prototype shown over the marker, the application can present a list of details. The displayed 3D models can additionally be manipulated by rotation and magnification, achieved through the Android touch-screen input. This functionality uses Android to detect and evaluate the location of the contact on the screen: the program detects the student's touch on the smartphone display and relates it to the current coordinates of the item being shown. After mapping the touch coordinates to the 3D objects, the software looks up the details in the database and shows the corresponding information. For instance, once a student taps one of the body parts, a representation of that organ is shown in the application. Students may also rotate the marker to turn the displayed organ accordingly, and can zoom in and out by changing the distance between the marker and the camera.
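The rotation and zoom interactions described above could be realised with touch input roughly as in the following sketch (illustrative only; the rotation and scale speeds are arbitrary values, not taken from the paper): a one-finger drag rotates the displayed organ and a two-finger pinch scales it.

```csharp
using UnityEngine;

// Minimal sketch of the rotation/zoom interaction: one-finger drag rotates
// the displayed organ, two-finger pinch scales it.
public class ModelTouchControls : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 0.2f;  // degrees per pixel of drag (assumed)
    [SerializeField] private float scaleSpeed = 0.001f;   // scale change per pixel of pinch (assumed)

    private void Update()
    {
        if (Input.touchCount == 1)
        {
            // One-finger drag: rotate the model around the vertical axis.
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
                transform.Rotate(Vector3.up, -touch.deltaPosition.x * rotationSpeed, Space.World);
        }
        else if (Input.touchCount == 2)
        {
            // Two-finger pinch: scale the model up or down.
            Touch t0 = Input.GetTouch(0);
            Touch t1 = Input.GetTouch(1);

            float prevDistance = Vector2.Distance(t0.position - t0.deltaPosition,
                                                  t1.position - t1.deltaPosition);
            float currDistance = Vector2.Distance(t0.position, t1.position);

            float scaleChange = (currDistance - prevDistance) * scaleSpeed;
            transform.localScale += Vector3.one * scaleChange;
        }
    }
}
```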
VIII. CONCLUSION AND FUTURE WORK
The project aims to enhance the learning and teaching process in medical education. Different aspects of this model help greatly in understanding the diagrams and learning in a better way. The 3D model offers an all-angle view that can be zoomed in and out to explore minute details, with labels and other detailed information integrated, which not only provides detailed and better virtual visualisation but also creates interest in learning. Mobile applications with augmented reality technology also increase students' desire to use this technology as a learning and comprehension tool for human anatomy structures. Some improvements that could still be made to the mobile application are to give better guidance to novice users, to include visualisation materials with more multimedia choices such as audio or video, and to improve the material annotations with clearer language and a better user interface. Based on the results of this investigation, it can be concluded that applications using mobile augmented reality can help high school students and medical students learn the anatomy of the human body through interactive learning.
ACKNOWLEDGMENT
We would like to express our special thanks of gratitude to our teachers: our project guide Prof. Arti Kale, our honourable HOD Prof. Vishwas Kalunge, and our principal Dr. R. D. Kanphade, who gave us the golden opportunity to work on the topic AR in Anatomy, which also helped us do a great deal of research and learn many new things; we are really thankful to them. We would also like to thank our parents and friends who helped us a lot, and we hope our project will be helpful to medical students.
REFERENCES
[1] Soon-ja Yeom, "Augmented Reality for Learning Anatomy," Ascilite 2011, Wrest Point, Hobart, Tasmania, Australia; School of Computing and Information Systems, University of Tasmania, pp. 1377-1383, 2011.
[2] Rita Layona, Budi Yulianto, Yovita Tunardi, "Web based Augmented Reality for Human Body Anatomy Learning," Computer Science Department, School of Computer Science, Bina Nusantara University, Jakarta, Indonesia 11480; Elsevier, Procedia Computer Science 135 (2018), pp. 457-464, 2018.
[3] Michael H. Kurniawan, Suharjito Suharjito, Diana, Gunawan Witjaksono, "Human Anatomy Learning Systems Using Augmented Reality on Mobile Application," Bina Nusantara University, Jakarta, Indonesia 11480; Elsevier, Procedia Computer Science 135 (2018), pp. 80-88, 2018.
[4] Atsushi Sugiura, Toshihiro Kitama, Masahiro Toyoura, Xiaoyang Mao, "The Use of Augmented Reality Technology in Medical Specimen Museum Tours," Japan Society for the Promotion of Science (JSPS) Grants-in-Aid for Scientific Research (KAKENHI), Grant number: JP17K12944; Yamanashi Prefecture Satoshi Omura Professional Development Fund Grant; American Association for Anatomy, pp. 561-571, 2019.
[5] Cristina Manrique-Juan, Zaira V. E. Grostieta-Dominguez, Ricardo Rojas-Ruiz, Moises Alencastre-Miranda, Lourdes Muñoz-Gómez, Cecilia Silva-Muñoz, "A Portable Augmented-Reality Anatomy Learning System Using a Depth Camera in Real Time," The American Biology Teacher 79 (3) (2017), pp. 176-183, 2017.
[6] Siti Salmi Jamali, Mohd Fairuz Shiratuddin, Kok Wai Wong, Charlotte L. Oskam, "Utilising Mobile-Augmented Reality for Learning Human Anatomy," School of Engineering & Information Technology, Murdoch University, 90 South Street, Murdoch, Perth, WA 6150, Australia; Elsevier, Procedia - Social and Behavioral Sciences 197 (2015), pp. 659-668, 2015.
[7] Manjul D E, Omkar Desai, Shrikar Desai, Khushbu S. Tikhe, "Augmented Reality: A New Way to View the World," IJERT, Paper ID: IJERTV9IS030545, Volume 09, Issue 03 (March 2020), 2020.
[8] Tasneem Khan, Kevin Johnston, Jacques Ophoff, "The Impact of an Augmented Reality Application on Learning Motivation of Students," Department of Information Systems, University of Cape Town, South Africa; Hindawi, Francisca Rosique, 2019.
[9] Al Hamidy Hazidar, Riza Sulaiman, "Visualization Cardiac Human Anatomy using Augmented Reality Mobile Application," Volume 5, Issue 3, ISSN (Online): 2249-071X, pp. 2278-4209, 2018.
[10] Rui Pascoal, Bráulio Alturas, Ana de Almeida, Rute Sofia, "A Survey of Augmented Reality: Making Technology Acceptable in Outdoor Environments," CISTI 2018 - 13th Iberian Conference on Information Systems and Technologies, 2018.
[11] Ho-Gun Ha, Jaesung Hong, "Augmented Reality in Medicine," Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea, pp. 242-247, 2016.
[12] Karthik Adapa, Saumya Jain, Richa Kanwar, Tanzila Zaman, Trusha Taneja, Jennifer Walker, Lukasz Mazur, "Augmented reality in patient education and health literacy: a scoping review protocol," BMJ Open, 2020.
[13] Son-Lik Tang, Chee-Keong Kwoh, Ming-Yeong Teo, Ng Wan Sing, Keck-Voon Ling, "Augmented reality systems for medical applications," IEEE Engineering in Medicine and Biology Magazine, Volume 17, Issue 3 (May-June 1998), pp. 49-58, 1998.
[14] Jayashree C. Pasalkar, Vivek S. Deshpande, Dattatray Waghole, "Performance analysis of delay in wireless sensor networks," Trends in Innovative Computing, pp. 192-195, 2012.