Browsing by Author "Moreno-Lumbreras, David"
Showing 1–7 of 7
Item: BabiaXR: Facilitating experiments about XR data visualization (Elsevier, 2023)
Authors: Moreno-Lumbreras, David; Gonzalez-Barahona, Jesus M.; Robles, Gregorio
BabiaXR is a toolset for conducting experiments about 3D data visualizations in extended reality (XR) in the browser. BabiaXR provides components both for building complex data visualizations and for easily transforming them into scenes suitable for running experiments with subjects. For data visualization, it provides components to retrieve, filter, select, and visualize data. To facilitate empirical experiments with human subjects, it provides components for showing information to subjects, controlling their interaction with data, and recording their reactions. This enables the easy transformation of a given data visualization scene into an experiment directly usable by subjects. BabiaXR is extensible, being based on the A-Frame JavaScript framework for XR. As such, it composes easily with other A-Frame components, and complex data visualization scenes can be created using only HTML constructs. BabiaXR can be used on any XR device supporting WebXR, and, with limited capabilities, also on desktop and mobile devices.

Item: CodeCity: A Comparison of On-Screen and Virtual Reality (Elsevier, 2022)
Authors: Moreno-Lumbreras, David; Minelli, Roberto; Villaverde, Andrea; Gonzalez-Barahona, Jesus M.; Lanza, Michele
Context: Over the past decades, researchers have proposed numerous approaches to visualize source code. A popular one is CodeCity, an interactive 3D software visualization representing software systems as cities: buildings represent classes (or files) and districts represent packages (or folders). Building dimensions represent values of software metrics, such as the number of methods or lines of code. There are many implementations of CodeCity, the vast majority of them running on-screen. Recently, some implementations using virtual reality (VR) have appeared, but the usefulness of CodeCity in VR is still to be proven.
Aim: Our comparative study aims to answer the question "Is VR well suited for CodeCity, compared to the traditional on-screen implementation?" Methods: We performed two experiments with our web-based implementation of CodeCity, which can be used on-screen or in immersive VR. First, we conducted a controlled experiment involving 24 participants from academia and industry. Taking advantage of the feedback obtained, we improved our approach and conducted a second controlled experiment with 26 new participants. Results: Our results show that people using the VR version performed the assigned tasks in much less time, while maintaining a comparable level of correctness. Conclusion: VR is at least as well suited as on-screen for visualizing CodeCity, and likely better.

Item: CodeCity: On-Screen or in Virtual Reality? (IEEE, 2021)
Authors: Moreno-Lumbreras, David; Minelli, Roberto; Villaverde, Andrea; González-Barahona, Jesús M.; Lanza, Michele
Over the past decades, researchers have proposed numerous approaches to visualize source code. A prominent one is CodeCity, an interactive 3D software visualization that leverages the "city metaphor" to represent software systems as cities: buildings represent classes (or files) and districts represent packages (or folders). Building dimensions represent values of software metrics, such as the number of methods or the lines of code. There are many implementations of CodeCity, the vast majority of them running on-screen. Recently, some implementations visualizing CodeCity in virtual reality (VR) have appeared. While exciting as a technology, VR's usefulness remains to be proven. The question we pose is: Is VR well suited to visualize CodeCity, compared to the traditional on-screen implementation? We performed an experiment with our interactive web-based application for visualizing CodeCity. Users can fetch data from any git repository and visualize its source code.
Our application enables users to navigate CodeCity both on-screen and in an immersive VR environment, using consumer-grade VR headsets like the Oculus Quest. Our controlled experiment involved 24 participants from academia and industry. Results show that people using the VR version performed the assigned tasks in much less time, while still maintaining a comparable level of correctness. Therefore, our results show that VR is at least as well suited as on-screen for visualizing CodeCity, and likely better.

Item: Diving into Software Evolution: Virtual Reality vs. On-Screen (2024-09-04)
Authors: Moreno-Lumbreras, David; González Barahona, Jesus M.; Robles, Gregorio
Background: Traditional 2D visualizations have been widely used for software metrics and evolution analysis, offering structured views of complex data. However, the advent of Virtual Reality (VR) technologies introduces new possibilities for immersive and interactive software visualization, potentially enhancing comprehension and user engagement. Objective/Aim: This report aims to evaluate the effectiveness of immersive VR visualizations compared to traditional 2D on-screen visualizations for understanding code metrics across software releases. Specifically, we seek to determine whether VR provides better accuracy and speed for comprehending high-level software evolution tasks. Method: We will conduct a controlled experiment with 30 participants from academia and industry, using GitLab, GitHub, or an IDE for on-screen visualizations and a VR setup using the Meta Quest 3. Tasks related to software evolution will be completed in both settings. Accuracy and time will be measured and analyzed using mixed linear models and non-parametric tests to compare the two approaches.
Data will be sourced from GitHub repositories with similar project characteristics.

Item: Software development metrics: to VR or not to VR (Springer, 2024)
Authors: Moreno-Lumbreras, David; Robles, Gregorio; Izquierdo-Cortázar, Daniel; Gonzalez-Barahona, Jesus M.
Context: Current data visualization interfaces predominantly rely on 2D screens. However, the emergence of virtual reality (VR) devices capable of immersive data visualization has sparked interest in exploring their suitability for visualizing software development data. Despite this, there is a lack of detailed investigation into the effectiveness of VR devices specifically for interacting with software development data visualizations. Objective: Our objective is to investigate the following question: "How do VR devices compare to traditional screens in visualizing data about software development?" Specifically, we aim to assess the accuracy of conclusions derived from exploring visualizations for understanding the software development process, as well as the time required to reach those conclusions. Method: In our controlled experiment, we recruited N=32 volunteers with diverse backgrounds. Participants interacted with similar data visualizations in both VR and traditional screen environments. For the traditional screen setup, we utilized a commercially available set of interactive dashboards based on Kibana, commonly used by Bitergia customers for data insights. In the VR environment, we designed a set of visualizations tailored to provide an equivalent dataset within a virtual room. Participants answered questions related to software evolution processes, specifically code review and issue tracking, in both VR and traditional screen environments, for two projects. We conducted statistical analyses to compare the correctness of their answers and the time taken for each question. Results: Our findings indicate that the correctness of answers in both environments is comparable.
Regarding time spent, we observed similar durations, except for complex questions that required examining multiple interconnected visualizations. In such cases, participants in the VR environment were able to answer the questions more quickly. Conclusion: Based on our results, we conclude that VR immersion can be equally effective as traditional screen setups for understanding software development processes through visualization of relevant metrics in most scenarios. Moreover, VR may offer advantages in comprehending complex tasks that require navigating through multiple interconnected visualizations. However, further experimentation is necessary to validate and reinforce these conclusions.

Item: The influence of the city metaphor and its derivates in software visualization (Elsevier, 2024-04)
Authors: Moreno-Lumbreras, David; Gonzalez-Barahona, Jesús M.; Robles, Gregorio; Cosentino, Valerio
Context: The city metaphor is widely used in software visualization to represent complex systems as buildings and structures, providing an intuitive way for developers to understand software components. Various software visualization tools have adopted this approach. Objective: Identify the influence of the city metaphor on software visualization research, determine its state of the art, and identify derived tools and their main characteristics. Method: Conduct a systematic mapping study of 406 publications that reference the first paper on the use of the city metaphor in software visualization and/or the main paper of the CodeCity tool. Analyze the 168 publications from which valuable information could be extracted, and build a complete categorical analysis. Results: The field has grown considerably, with an increasing number of publications since 2001, and a changing research community with evolving interconnections between groups.
Researchers have developed more tools that support the city metaphor, but fewer than 50% of the tools were referenced in their papers. Moreover, 85% of the tools did not use extended reality environments, indicating an opportunity for further exploration. Conclusion: The study demonstrates the active and continually growing presence of the city metaphor in research, and its impact on software visualization and its derivatives.

Item: Virtual Reality vs. 2D Visualizations for Software Ecosystem Dependency Analysis – A Controlled Experiment (2024-08-30)
Authors: Moreno-Lumbreras, David; Gonzalez-Barahona, Jesus M.; Robles, Gregorio
Background/Context: Data is typically visualized using 2D on-screen tools. With the advent of devices capable of creating 3D and Virtual Reality (VR) scenes, there is growing interest in exploring these technologies for data visualization, particularly for complex data like software dependencies. Despite this interest, there is limited evidence comparing VR with traditional 2D on-screen tools for such visualizations. Objective/Aim: This registered report aims to determine whether comprehension of software ecosystem dependencies, visualized through their metrics, is better when they are presented in VR scenes than on 2D screens. Specifically, we seek to evaluate whether answers obtained through VR visualizations are more accurate, and whether it takes less time to derive them, compared to traditional 2D on-screen tools. Method: We will conduct an experiment with volunteers from various backgrounds, using two setups: a 2D on-screen tool and a VR scene created with A-Frame. The data will focus on web projects using the Node Package Manager (NPM) registry. Subjects will answer a series of questions in both setups, presented in random order. We will statistically analyze the correctness of their answers and the time taken to give them, in order to compare the two visualization methods.