Merge branch 'main' of https://github.com/hubmapconsortium/hra-ui into develop
bherr2 committed Jul 31, 2024
2 parents dc0ec35 + 0496bec commit dc8ac95
Showing 2 changed files with 9 additions and 9 deletions.
@@ -22,8 +22,8 @@
- heading: Overview
descriptions: |
The Human Reference Atlas (HRA) Organ Gallery is a virtual reality (VR) application that enables users to explore 3D organ models of the HRA in their true size, location, and spatial relation to each other.
- The HRA Organ Gallery has two main use cases: 1) introducing both novice and expert users to the HuBMAP data available via the HuBMAP Data Portal, and 2) providing quality assurance and quality control (QA/QC) for HRA data providers.
- More use cases are under development. Further information can be found in this <a href="https://doi.org/10.3389/fbinf.2023.1162723" target="_blank">publication</a>.
+ The HRA Organ Gallery has two main use cases: 1) introducing both novice and expert users to the 2D and 3D data available in the HRA via the HuBMAP Data Portal, the SenNet Data Portal, and similar efforts, and 2) providing quality assurance and quality control (QA/QC) for HRA data providers.
+ More use cases are under development. The concept of the application is described in this <a href="https://doi.org/10.3389/fbinf.2023.1162723" target="_blank">publication</a>.
Figure 1 compares the user interface for exploring the HRA via the [Exploration User Interface (EUI)](https://apps.humanatlas.io/eui/) to how it appears in the HRA Organ Gallery when using a Meta Quest 2 or 3.
styles:
margin-bottom: 2rem
@@ -50,34 +50,34 @@
<a href="https://apps.humanatlas.io/eui/" target="_blank">Exploration User Interface</a> in a
standard-size browser window on a 17-in display.
<span class="hra-bold">(B)</span>: The HRA Organ Gallery allows the user to view the organs, tissue blocks, and cell type counts of the HRA in true scale using
- immersive technology (VR). [Source](https://doi.org/10.1101/2023.02.13.528002).
+ immersive technology (VR). [Source](https://doi.org/10.1101/2023.02.13.528002). More information can be found in our [README](https://github.com/cns-iu/hra-organ-gallery-in-vr/blob/main/README.md).
- type: page-data
pageData:
- heading: Background
descriptions: |
- The HRA project, led by HuBMAP, aims to map the adult healthy human body at single-cell resolution through a collaboration across 17 (and counting) international consortia.
+ The HRA, funded by HuBMAP, SenNet, and similar efforts, aims to map the adult healthy human body at single-cell resolution through a collaboration across 17 (and counting) international consortia.
The project includes three main categories of data: biological structure, spatial, and specimen data. The ASCT+B tables are compiled by experts to capture biological structure data, describing the connection between anatomical structures, cell types, and biomarkers.
- type: page-data
pageData:
- heading: Why Virtual Reality?
descriptions: |
- Using a visually explicit method of data integration, the HRA benefits can be better defined despite the disparate data. Virtual reality (VR) presents a unique opportunity to explore both spatial and abstract data in an immersive environment that enhances presence beyond traditional interfaces like windows, icons, menus, and pointers, also called WIMP paradigm (Van Dam, 1997). Although some users may be able to learn how to explore 3D reference organs and tissue blocks on a 2D screen (Bueckle et al., 2021, 2022), many still struggle with interacting with 3D objects on a 2D screen.
+ Using a visually explicit method of data integration, the HRA enables users to explore disparate data. Virtual reality (VR) presents a unique opportunity to explore both spatial and abstract data in an immersive environment that enhances presence beyond traditional interfaces like windows, icons, menus, and pointers, also called the WIMP paradigm (Van Dam, 1997). Although some users may be able to learn how to explore 3D reference organs and tissue blocks on a 2D screen (Bueckle et al., 2021, 2022), many still struggle with interacting with 3D objects on a 2D screen.
- type: page-data
pageData:
- heading: Data Visualizations
descriptions: |
- The application's primary building blocks, presented in Figure 3 of <a href="https://doi.org/10.1101/2023.02.13.528002" target="blank">the preprint paper</a>, include the SceneBuilder which serves as the data manager for the application and retrieves data from the CCF API, and the DataFetcher which uses Node, NodeArray, and GLBObject classes to store 3D organs in GLB format. Once the setup is complete, the user can interact with the organs, and the entire scene takes about 5-7 seconds to load when running natively on the Meta Quest 2. As of early March 2023, through the HRA Organ Gallery, users can investigate 55 3D reference organs and 1,203 mapped tissue blocks obtained from diverse donors and providers, connected to over 5000 datasets, in a cohesive, immersive, and 3D VR environment at the convergence of VR, information visualization, and bioinformatics.
+ The application's primary building blocks, presented in Figure 3 of <a href="https://doi.org/10.1101/2023.02.13.528002" target="_blank">the preprint paper</a>, include the SceneBuilder, which serves as the data manager for the application and retrieves data from the HRA API, and the DataLoader, which uses Node, NodeArray, and GLBObject classes to store 3D organs in GLB format. Once the setup is complete, the user can interact with the organs, and the entire scene takes about 5-7 seconds to load when running natively on the Meta Quest 2 or 3. Through the HRA Organ Gallery, users can investigate 55 3D reference organs and 800+ mapped tissue blocks obtained from 300+ diverse donors and 20+ tissue data providers (as of HRA v2.1), connected to 6000+ datasets, in a cohesive, immersive 3D VR environment at the convergence of VR, information visualization, and bioinformatics.
- type: page-data
pageData:
- heading: Feedback
descriptions: |
- In order to engage a diverse group of experts in the HRA effort, we request feedback from subject matter domain experts at the NIH and its funded initiatives. We would like to involve specialists in fields such as bioinformatics, 3D modeling, medical illustration, anatomy, data curation, and biology. Additionally, we plan to conduct a user study to gather quantitative data on various QA/QC error detection methods and completion time, as well as to evaluate the app's usability, engagement, and presence. Information for test users is available <a href="https://github.com/cns-iu/hra-organ-gallery-in-vr/blob/main/INFORMATION_FOR_TESTERS.MD" target="_blank">here</a>. The code is available on <a href="https://github.com/cns-iu/ccf-organ-vr-gallery" target="_blank">GitHub</a>.
+ Information for test users is available <a href="https://github.com/cns-iu/hra-organ-gallery-in-vr/blob/main/INFORMATION_FOR_TESTERS.MD" target="_blank">here</a>. In order to engage a diverse group of users in the development of the HRA Organ Gallery, we request feedback from a broad range of specialists in fields such as bioinformatics, 3D modeling, medical illustration, anatomy, data curation, and biology. We are continuously evaluating the app's usability, engagement, and presence.
- Please email Andreas Bueckle (abueckle@iu.edu) if you are interested in contributing your input to the research and development of the HRA Organ Gallery.
+ Also, please email Andreas Bueckle (abueckle@iu.edu) if you are interested in contributing your input to the research and development of the HRA Organ Gallery.
- type: margin
bottom: 5rem
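The Data Visualizations description above mentions the SceneBuilder, which retrieves scene data from the HRA API, and the DataLoader, which stores the returned 3D organs as GLB objects. As a rough illustration of that fetch-and-load flow (not the app's actual implementation), the TypeScript sketch below requests reference-organ metadata and downloads one GLB model; the endpoint path and response fields are assumptions, not confirmed API details.

```typescript
// Hypothetical sketch only: fetch reference-organ metadata from the HRA API,
// then download the GLB model referenced by the first entry. The endpoint
// path and the response fields below are assumptions for illustration.

interface ReferenceOrgan {
  name: string;   // e.g. "Kidney (Left)" -- assumed field name
  glbUrl: string; // URL of the organ's binary glTF (GLB) model -- assumed field name
}

const HRA_API = 'https://apps.humanatlas.io/api'; // assumed base URL

async function loadFirstOrganModel(): Promise<ArrayBuffer> {
  // 1. Ask the API for the list of 3D reference organs (what a scene
  //    builder would do to learn which organs exist and where they live).
  const res = await fetch(`${HRA_API}/v1/reference-organs`);
  if (!res.ok) {
    throw new Error(`HRA API request failed: ${res.status}`);
  }
  const organs: ReferenceOrgan[] = await res.json();

  // 2. Download the GLB for the first organ; a viewer or game engine
  //    would then parse this buffer into a renderable 3D scene node.
  const glb = await fetch(organs[0].glbUrl);
  if (!glb.ok) {
    throw new Error(`GLB download failed: ${glb.status}`);
  }
  return glb.arrayBuffer();
}

// Example usage: log the size of the downloaded model.
loadFirstOrganModel().then((buf) =>
  console.log(`Downloaded GLB model (${buf.byteLength} bytes)`)
);
```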
2 changes: 1 addition & 1 deletion apps/medical-illustration/project.json
@@ -37,7 +37,7 @@
"maximumError": "4kb"
}
],
- "outputHashing": "all"
+ "outputHashing": "none"
},
"development": {
"buildOptimizer": false,
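For context on the project.json change: outputHashing is the Angular builder option that controls whether content hashes are appended to emitted bundle file names for cache busting; switching it from "all" to "none" produces stable file names such as main.js instead of main.<hash>.js. A minimal sketch of where the option sits in a production build configuration follows; the surrounding values are assumptions, not a copy of this project's file.

```json
{
  "configurations": {
    "production": {
      "budgets": [
        {
          "type": "anyComponentStyle",
          "maximumWarning": "2kb",
          "maximumError": "4kb"
        }
      ],
      "outputHashing": "none"
    }
  }
}
```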
