Design
VR-Cardiomics is a stereoscopic 3D application that has been designed for different IEs based on use-case-specific requirements captured from intensive discussions between domain experts and immersive analytics developers. It has been developed as a standalone application for IEs, with a focus on adaptability to different immersive platforms. The heart model, spatial gene expression data, and visual elements were based on the authors’ previous study on the 3D-Cardiomics web application [37], which has been further developed into VR-Cardiomics with additional IE-specific functionality.
VR-Cardiomics was developed for either FTVR or HMD-VR. Because the FTVR setup resembles a conventional desktop monitor, this strength was exploited, and the application is therefore designed similarly to a web application. Nevertheless, the three-dimensional capability of the FTVR is applied to the heart models, which form the focus of the application. The user interface and menu navigation are continuously present on the screen. The heart model in FTVR is perceived to be smaller than that in the HMD-VR version, and the limited screen size leads to a generally reduced overview of the environment and model.
By contrast, the HMD-VR version was implemented in a more explorative manner. Because an HMD is used, the user is provided with an entire 360° virtual environment. Owing to the limited screen size of the FTVR compared with the virtual space of an HMD-VR, the FTVR version is fixed to a side-by-side comparison of two models, whereas HMD-VR allows multiple models to be compared simultaneously. The larger virtual space also results in a larger perceived heart model. An example of a heart model in VR-Cardiomics is given in Fig. 1; the gene expression values were obtained from bulk RNA sequencing in a previous study by Mohenska et al. [37]. RNA sequencing was conducted for each of the 18 areas of the heart, and the expression levels obtained were mapped onto a 3D heart model in Unity using linear interpolation of a color gradient, from blue for low expression levels to red for high expression levels. Alternatively, for color vision deficiencies, a two-color gradient from blue to yellow can be selected. The menu panel in VR can be used either as a 3D canvas or as a portable touchpad.
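As an illustration of this color mapping, the following minimal Unity C# sketch interpolates between the two gradient endpoints for a single heart piece. The class name, field names, and the assumption that expression values are pre-normalized to [0, 1] are hypothetical and not taken from the VR-Cardiomics source.

```csharp
using UnityEngine;

// Hypothetical sketch: mapping a normalized expression value onto a heart slice's material.
public class ExpressionColorMapper : MonoBehaviour
{
    // Default gradient endpoints: blue (low) to red (high);
    // the color-vision-deficiency mode would swap red for yellow.
    public Color lowColor = Color.blue;
    public Color highColor = Color.red;

    // 'value' is assumed to be normalized to [0, 1] beforehand,
    // e.g. (expression - min) / (max - min) over the 18 heart pieces.
    public void ApplyExpression(Renderer slice, float value)
    {
        float t = Mathf.Clamp01(value);
        slice.material.color = Color.Lerp(lowColor, highColor, t);
    }
}
```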
The current HMD-VR menu navigation has not been tested through an adequate user study, and design alternatives still need to be explored. To exploit their full potential, detailed user studies and evaluations are necessary for both prototypes.
Prototype implementation HMD-VR
For the HMD-VR environment, an Oculus Rift S and an Oculus Quest 2 in link mode were used as inside-out tracked HMDs, applying a state-of-the-art, generic implementation. Thus, the environment can be adapted to other HMDs. The Oculus Integration SDK (version 28.0) was integrated in Unity to provide the basic functionalities.
To briefly describe the interaction, the 18 slices of the heart model can be moved, rotated, and enlarged individually (Fig. 1). A handle that can be gripped and rotated allows the entire model to be moved as a single unit (Fig. 2 A). Using the menu button, the entire model can be expanded horizontally in front of the user (Fig. 2 C), which allows the internal heart sections to be viewed while providing an overview of all pieces at the same time.
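A minimal sketch of how such an expand feature could be realized in Unity C# is given below; the class name, field names, and the spacing value are assumptions rather than the actual VR-Cardiomics implementation.

```csharp
using UnityEngine;

// Hypothetical sketch of the "expand" feature: the 18 heart slices are laid out
// horizontally in front of the user and can be collapsed back to their original pose.
public class HeartExpander : MonoBehaviour
{
    public Transform[] slices;        // the 18 slice transforms (assumed assigned in the editor)
    public float spacing = 0.15f;     // horizontal gap between slices in metres (assumed)

    private Vector3[] originalPositions;

    void Awake()
    {
        originalPositions = new Vector3[slices.Length];
        for (int i = 0; i < slices.Length; i++)
            originalPositions[i] = slices[i].localPosition;
    }

    // Called from the menu button: lays the slices out in a row along the local x-axis.
    public void Expand()
    {
        float offset = -(slices.Length - 1) * spacing * 0.5f;
        for (int i = 0; i < slices.Length; i++)
            slices[i].localPosition = new Vector3(offset + i * spacing, 0f, 0f);
    }

    // Restores the anatomical arrangement.
    public void Collapse()
    {
        for (int i = 0; i < slices.Length; i++)
            slices[i].localPosition = originalPositions[i];
    }
}
```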
Several input and output features have been developed to support the user while working in the HMD-VR, such as a backup file created as a background process for each session (Fig. 2 D). A chronological data list is generated, which stores all selected genes that have been visualized on the model and the compared expression patterns. In addition, export functions for tabular results, such as the similar genes table, have been introduced to generate a text file in CSV format. The application also offers various screenshot and recorder functions that can, inter alia, capture the user’s current field of view, either as a screenshot in PNG/JPG format or as a video recording in mp4 format, as well as simultaneous images of the heart model from four different angles. To avoid blocking the cameras’ view of the heart model, all other visual components in the environment are disabled for a duration of two frames (0.03 s) while a screenshot is taken.
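The screenshot routine could, for instance, be sketched as a coroutine along the following lines; the class and field names are hypothetical, and the exact timing and file handling in VR-Cardiomics may differ.

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Hypothetical sketch of the screenshot routine: UI and other visual components are
// hidden for two frames so they do not block the cameras' view of the heart model.
public class ScreenshotCapture : MonoBehaviour
{
    public GameObject[] componentsToHide;   // menu panels, handles, keyboard, etc. (assumed)

    public void TakeScreenshot(string fileName)
    {
        StartCoroutine(CaptureRoutine(fileName));
    }

    private IEnumerator CaptureRoutine(string fileName)
    {
        foreach (var go in componentsToHide) go.SetActive(false);

        yield return null;   // first frame: render the scene without the hidden components
        ScreenCapture.CaptureScreenshot(
            Path.Combine(Application.persistentDataPath, fileName + ".png"));
        yield return null;   // second frame: allow the capture to complete

        foreach (var go in componentsToHide) go.SetActive(true);
    }
}
```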
Prototype implementation FTVR
As a comparative model for FTVR, a zSpace All-in-One 200 24-GL monitor was used, which is driven by the computing power of an external PC. To enable core interactions with objects and UI elements within Unity, the zSpace core SDK (version 6.0.0.11) was integrated; it had to be adapted to the present needs owing to the end of life (EOL) of the zSpace 200 series in 2018.
The FTVR zSpace resembles a traditional 2D computer monitor; thus, its use will seem familiar to the user. Both applications are based on the same functionalities and features; differences arise only from a technical point of view. zSpace requires specific modules from the SDK mentioned above to allow interaction with objects and UI elements. All UI elements, as well as all interactive objects from the VR-Cardiomics implementation for the HMD-VR, therefore had to be adapted to zSpace accordingly. All comparison functions from the previously described HMD-VR application, such as the combined view, heatmap comparison, and group selection, were adopted for a side-by-side comparison of two models on zSpace.
Several adaptations were made for this device. For example, the handle (Fig. 2 A) used to move the entire heart model was made unnecessary by removing the single-piece interaction with the heart slices. The single-piece interaction was omitted because it did not improve the handling and had a negative impact on the intuitiveness of the application. The selection of individual heart slices for the group selection was transferred to the stylus instead, allowing the environment to be controlled entirely by the stylus pen, with the exception of keyboard input for the gene search.
Interaction: HMD-VR
Using an HMD-VR, the user is immersed in an entire 3D virtual world, providing the opportunity to interact with objects and the environment in a unique way. VR-Cardiomics attempts to exploit this feature to its full potential by providing numerous interaction possibilities between the user and the expression patterns visualized on the 3D models. The HMD-VR application was developed using an Oculus Rift S and an Oculus Quest 2 in link mode. Both devices use Oculus touch controllers. Each controller has three physical buttons and two trigger buttons, supplemented with a joystick for navigation within the environment. Finger positions are recorded using sensors to allow gesture control, such as grabbing or pointing. The main component of VR-Cardiomics is its menu panel, which can be used either as a portable touchpad or as a fixed menu canvas operated through a point-and-click approach. Both options are always present and can be used in the way that best suits the user. Point-and-click means that the user points the controller at a button on the menu panel and confirms the selection by pressing a controller button. In addition, the menu can be moved by grabbing its attached pink handle (Fig. 3 10). The touch controller recognizes whether the user is pointing in a certain direction with the index finger, imitating the gesture in VR. The user can therefore point at and touch a button on the menu panel with the index finger, receiving short vibration feedback from the controller to confirm the selection. The menu panel is also used to select an expression pattern from the dataset. For this purpose, a virtual keyboard was designed to allow text input into the search bar of the menu panel. Touching the search bar (Fig. 3 2) or the keyboard button (Fig. 3 4) makes the keyboard appear in front of the user. The keyboard is used similarly to the menu panel: it can be moved using the attached handle and operated through a point-and-click/touch motion. According to the current input, the dataset is searched for matching gene names, and the results are shown in a list below the search box. The user can scroll through this list and select a gene whose expression is then visualized on the model.
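The incremental gene search could be sketched as a simple prefix filter over the dataset’s gene names, as below; the class name, method signature, and result limit are illustrative assumptions.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the gene search: as the user types on the virtual keyboard,
// the dataset is filtered by gene-name prefix and the matches fill the suggestion list.
public static class GeneSearch
{
    // 'geneNames' is assumed to hold the names of all genes in the loaded dataset.
    public static List<string> Filter(IEnumerable<string> geneNames, string query, int maxResults = 20)
    {
        if (string.IsNullOrEmpty(query)) return new List<string>();
        return geneNames
            .Where(name => name.StartsWith(query, System.StringComparison.OrdinalIgnoreCase))
            .OrderBy(name => name)
            .Take(maxResults)
            .ToList();
    }
}
```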
Because the virtual environment allows multiple models to be used at the same time, a function to select one of the current models was implemented. Pressing a button of the touch controller makes a red illuminating circle appear below the first model; pressing the button again moves this circle to the next model. The current selection is confirmed by pressing an additional button on the touch controller. All subsequent single-object operations, such as selecting a gene from the dataset, switching between an absolute or normalized expression visualization, or using the expand feature, are then applied to the selected object.
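A possible sketch of this cycling selection in Unity C# with the Oculus Integration SDK is shown below; the button mappings, names, and circle offset are assumptions, not the confirmed VR-Cardiomics bindings.

```csharp
using UnityEngine;

// Hypothetical sketch of the model selection: a controller button cycles a red indicator
// circle through the loaded heart models; a second button confirms the current candidate.
public class ModelSelector : MonoBehaviour
{
    public Transform[] heartModels;      // all models currently in the scene (assumed)
    public GameObject selectionCircle;   // the red illuminated circle shown below a model

    private int candidate = 0;
    public Transform Selected { get; private set; }

    void Update()
    {
        // OVRInput button names are assumptions for an Oculus touch controller setup.
        if (OVRInput.GetDown(OVRInput.Button.One))          // cycle to the next model
        {
            candidate = (candidate + 1) % heartModels.Length;
            selectionCircle.transform.position = heartModels[candidate].position + Vector3.down * 0.5f;
        }
        if (OVRInput.GetDown(OVRInput.Button.Two))          // confirm the selection
        {
            Selected = heartModels[candidate];
        }
    }
}
```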
In addition to interacting with the UI elements of the application, the user can also interact with the 3D heart in several ways. Each added model has an additional handle, as presented in Fig. 2 A. This handle is an extension of the 3D model and is used to move and rotate the model. In addition, the user can interact with each of the 18 pieces of the model individually. A piece can be grabbed by bringing the controller to the piece and pressing both trigger buttons on the touch controller. To resize a piece, the user grabs it with both controllers simultaneously and moves the controllers away from each other, resulting in an enlargement of the piece; the opposite movement makes the piece smaller again.
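The two-handed resizing could be sketched as scaling the grabbed piece by the ratio of the current to the initial hand distance; all names below are hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch of the two-handed resize: while a piece is held with both
// controllers, the change in distance between the hands scales the piece.
public class TwoHandScaler : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;

    private Transform piece;
    private float initialDistance;
    private Vector3 initialScale;

    // Called when the slice is grabbed with both controllers.
    public void Begin(Transform grabbedPiece)
    {
        piece = grabbedPiece;
        initialDistance = Vector3.Distance(leftHand.position, rightHand.position);
        initialScale = piece.localScale;
    }

    public void End() { piece = null; }

    void Update()
    {
        if (piece == null) return;
        float current = Vector3.Distance(leftHand.position, rightHand.position);
        float factor = current / Mathf.Max(initialDistance, 0.001f);
        piece.localScale = initialScale * factor;   // hands apart -> larger, together -> smaller
    }
}
```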
Because direct interaction with an object is a feature unique to an HMD-VR, the authors exploited it for the functions of VR-Cardiomics to allow an intuitive and exploratory interaction with the data. The above-mentioned heatmap comparison, used to compare all pieces of two heart models concurrently, is therefore implemented in a drag-and-drop-like manner: the handle of one model is grabbed and the model is overlapped with the second model to be compared. Releasing the handle at this position triggers an automated comparison of both models, with one model presenting the intensity of the differences of each piece in a heatmap-like manner.
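A minimal sketch of such a per-piece heatmap comparison is given below; the white-to-red intensity mapping and the assumption of pre-normalized expression vectors are illustrative choices, not necessarily those of VR-Cardiomics.

```csharp
using UnityEngine;

// Hypothetical sketch of the heatmap comparison: when one model is dropped onto another,
// the per-piece expression difference is mapped to a color intensity on one of the models.
public static class HeatmapComparison
{
    // 'a' and 'b' hold one expression value per heart piece (18 entries each, assumed normalized).
    public static void Apply(float[] a, float[] b, Renderer[] pieces)
    {
        for (int i = 0; i < pieces.Length; i++)
        {
            float difference = Mathf.Abs(a[i] - b[i]);   // intensity of the difference
            pieces[i].material.color = Color.Lerp(Color.white, Color.red, difference);
        }
    }
}
```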
The group selection also benefits from these interactions because pieces for both groups are simply selected and deselected by touching them and confirming the selection with a button press on the touch controller. The corresponding piece is visually highlighted, and short vibration feedback confirms the selection.
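The group selection toggle could be sketched as follows; the highlight color and class structure are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the group selection: touching a piece and confirming with a
// button toggles its membership in the current group; selected pieces are highlighted.
public class GroupSelection : MonoBehaviour
{
    public Color highlightColor = Color.green;   // assumed highlight color
    private readonly HashSet<Renderer> group = new HashSet<Renderer>();
    private readonly Dictionary<Renderer, Color> originalColors = new Dictionary<Renderer, Color>();

    public void Toggle(Renderer piece)
    {
        if (group.Remove(piece))
        {
            piece.material.color = originalColors[piece];   // deselect: restore color
        }
        else
        {
            originalColors[piece] = piece.material.color;
            piece.material.color = highlightColor;          // select: highlight
            group.Add(piece);
        }
        // A short controller vibration confirming the toggle could be triggered here,
        // e.g. via OVRInput.SetControllerVibration (omitted for brevity).
    }
}
```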
Avatar movement within the environment is controlled using the joysticks of the touch controllers; therefore, the environment can also be used while seated. The controller of the dominant hand moves the avatar, whereas the other controller rotates the avatar around its vertical axis.
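A sketch of this seated locomotion scheme using the Oculus thumbstick axes is shown below; the speed values and the mapping of dominant/non-dominant hands to the secondary/primary thumbsticks are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the seated locomotion: the dominant-hand thumbstick translates
// the avatar, the other thumbstick rotates it around the vertical axis.
public class AvatarLocomotion : MonoBehaviour
{
    public Transform avatar;               // e.g. the camera rig root (assumed)
    public float moveSpeed = 1.5f;         // metres per second (assumed)
    public float turnSpeed = 60f;          // degrees per second (assumed)

    void Update()
    {
        // Thumbstick mappings are assumptions for the Oculus touch controllers.
        Vector2 move = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);   // dominant (right) hand
        Vector2 turn = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);     // other (left) hand

        Vector3 direction = avatar.forward * move.y + avatar.right * move.x;
        avatar.position += direction * moveSpeed * Time.deltaTime;
        avatar.Rotate(0f, turn.x * turnSpeed * Time.deltaTime, 0f);
    }
}
```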
Interaction: FTVR
Interaction with zSpace is mainly accomplished using the zSpace stylus pen; alternatively, input devices such as a keyboard or a computer mouse can be used. The user is required to wear polarization glasses with tracking markers so that the 3D object is perceived stereoscopically (Fig. 4). The movement of the user’s head is calculated from the tracking markers, and the rotation of the 3D object is adapted to the head movement accordingly.
With the stylus pen, the 3D object can be moved, rotated, or selected (Fig. 4). Likewise, it can be used to control UI elements such as, inter alia, the menu functions. The pen features three physical buttons and is tracked by the trackers of the zSpace desktop. Interactions with the menu elements are achieved by pointing at them and pressing one of the physical buttons to confirm the selection. To give the user a sense of where the pen is pointing, a virtual laser beam is displayed from the end of the pen to its intersection with an object. Because the stylus pen offers only three buttons in total and gesture controls are not supported, certain functions had to be implemented as buttons in the menu.
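The laser beam could be realized with a ray cast from the tracked pen pose and a two-point line renderer, as in the following hypothetical sketch; the field names and maximum length are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the stylus laser beam: a line is drawn from the pen tip to the
// first object hit by a ray along the pen's pointing direction.
public class StylusLaser : MonoBehaviour
{
    public Transform stylusTip;        // pose provided by the zSpace stylus tracking (assumed)
    public LineRenderer beam;          // configured with two positions
    public float maxLength = 2f;       // metres (assumed)

    void Update()
    {
        Vector3 end = stylusTip.position + stylusTip.forward * maxLength;
        if (Physics.Raycast(stylusTip.position, stylusTip.forward, out RaycastHit hit, maxLength))
            end = hit.point;           // stop the beam at the intersected object

        beam.SetPosition(0, stylusTip.position);
        beam.SetPosition(1, end);
    }
}
```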
The menu, which is used for data interaction, is therefore always present on the screen surrounding the 3D heart. Owing to the technical differences of the device, an implementation closer to the web application was more feasible for zSpace. However, most of the features are still implemented, and the application thus remains comparable to the HMD-VR application.
This means that the selection of a gene expression pattern from the dataset is conducted using the stylus pen, as mentioned above. The user has to click the search bar with either the stylus pen or a computer mouse. Text input is achieved using a keyboard, resulting in scrollable suggestions from which the user can select the expression pattern to be used. Because zSpace is based on a side-by-side comparison of two models at once, the selection of the object is automated: the second heart object can be used once the expression pattern for the first heart has been selected. Two additional buttons, one to enable a second 3D heart for comparison (Combined View) and one to enable the heatmap comparison of all 18 pieces simultaneously (Heatmap), appear next to the search bar. Both models are interactive and can be moved and rotated by pointing at them and pressing one of the physical buttons of the pen; the model follows the movement of the pen as long as the button is pressed. The 3D models are bound to either the search bar or the list of similar gene results. Selecting a gene from the similar genes list colors one of the models according to that expression pattern, whereas the other model retains the expression pattern previously selected via the search bar. This heart object can be changed using the search bar to select a new expression pattern from the dataset.
Pressing the heatmap button results in a comparison of both current expression patterns for each single piece, projected onto one of the models, similar to the HMD-VR application mentioned above. The group selection feature is likewise similar to that of the HMD-VR version: the selection of individual heart pieces is implemented using the stylus pen, allowing the user to point at a piece and confirm the selection by pressing the button on the pen, which results in a color-based highlighting of the selected piece.