Visualization of Information in VR Environments to Support the Monitoring and Analysis of Objects in Space
Dissertation Project
Master in Design and Multimedia – University of Coimbra
September 2024
- Live deployment: https://vr-satellite-visualization.vercel.app/
- Dissertation paper: https://baes.uc.pt/handle/10316/118087?mode=full
- José Pedro Antunes - design and development
- Evgheni Polisciuc - supervision
- Jorge C. S. Cardoso - supervision
This repository contains the source code and supporting materials for my dissertation project titled "Visualization of Information in VR Environments to Support the Monitoring and Analysis of Objects in Space." The project explores innovative techniques for spatial data visualization using virtual reality (VR), addressing the growing need for effective monitoring of satellites and other space objects to prevent collisions.
- The rapid increase in the number of satellites and space objects has amplified the risk of collisions.
- Traditional 2D visualization methods lack the depth and interactivity needed for effective spatial analysis.
- VR offers an immersive solution, allowing users to “step inside” the data, manipulate 3D representations, and gain better insights into complex spatial relationships.
- Explore VR Visualization Techniques: Develop methods to leverage immersive 3D environments for data visualization.
- Interactive Data Manipulation: Enable interactive exploration of satellite data, including position, trajectory, and uncertainty.
- Collision Prevention: Support the analysis and prediction of potential collisions through dynamic visualizations.
- Multi-Platform Integration: Create a tool that functions seamlessly across desktop, mobile, and VR devices.
The dissertation was developed under the guidance of Professors Evgheni Polisciuc and Jorge C. S. Cardoso and in collaboration with Neuraspace, a company specializing in space traffic management using artificial intelligence. The research focuses on combining traditional information visualization techniques with modern VR capabilities to create an effective system for monitoring and analyzing spatial data.
Key aspects of the dissertation include:
- Literature Review & State of the Art: Analyzing existing visualization methods, VR techniques, and their limitations.
- Data Integration: Utilizing satellite data provided by Neuraspace to simulate real-time monitoring and collision prediction.
- Design & Prototyping: Iterative design process including mockups, UI prototypes, and usability testing.
- Development & Implementation: Building the visualization module, integrating VR interfaces, and addressing technical challenges (e.g., performance, interaction design).
The development of the project followed a structured approach:
- Preliminary Work:
  - Analysis of spatial data and exploration of VR tools and frameworks (e.g., Three.js, A-Frame).
  - Initial prototyping and design iterations based on early mockups and user feedback.
- Design & Prototyping:
  - Creating detailed design mockups for the visualization module.
  - Conducting usability tests and refining the interface.
- Implementation:
  - Integration of data streams from Neuraspace into a VR environment.
  - Development of interactive elements such as globe visualization and dynamic dashboards.
  - Overcoming challenges related to performance, hand tracking, and VR sickness.
- Evaluation:
  - System testing to validate the accuracy and responsiveness of the visualization.
  - Collecting feedback from potential users to inform future improvements.
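As an illustration of the globe visualization step above: plotting satellites over a globe generally requires converting geodetic coordinates (latitude, longitude, altitude) into scene coordinates. The following is a minimal sketch in plain JavaScript; the function name and the scene scaling are illustrative assumptions, not code taken from this repository.

```javascript
// Convert latitude/longitude (degrees) and altitude (km) into Cartesian
// coordinates suitable for placing objects around a Three.js-style globe.
// The globe radius in scene units is a parameter; scene radius 1 is assumed
// to correspond to Earth's mean radius (~6371 km).
const EARTH_RADIUS_KM = 6371;

function latLonAltToCartesian(latDeg, lonDeg, altKm, globeRadius = 1) {
  const lat = (latDeg * Math.PI) / 180;
  const lon = (lonDeg * Math.PI) / 180;
  // Altitude scales the radius outward from the globe surface.
  const r = globeRadius * (1 + altKm / EARTH_RADIUS_KM);
  return {
    x: r * Math.cos(lat) * Math.cos(lon),
    y: r * Math.sin(lat), // y-up convention, common in Three.js scenes
    z: -r * Math.cos(lat) * Math.sin(lon),
  };
}
```

A point at latitude 0°, longitude 0°, altitude 0 km lands on the scene's +x axis; higher altitudes push the point proportionally farther from the globe surface.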
- Frontend & VR Frameworks:
  - Three.js, A-Frame, WebVR/WebXR APIs
- Data Processing:
  - Data integration from external APIs and Neuraspace datasets
- Design & Prototyping:
  - Figma, ShapesXR, Adobe Photoshop
- Other Tools:
  - Git for version control
  - Vite (build tool) and Vercel (deployment)
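To give a concrete sense of the data-processing step, conjunction records typically need to be normalized (dates parsed, numbers coerced, a severity bucket derived) before they can drive a visualization. The sketch below is illustrative only: the field names (`tca`, `missDistanceKm`, `collisionProbability`) and thresholds are assumptions, not the actual Neuraspace schema.

```javascript
// Normalize one raw conjunction record into the shape a visualization
// layer could consume. All field names here are hypothetical.
function normalizeConjunction(raw) {
  const tca = new Date(raw.tca);               // time of closest approach
  const missKm = Number(raw.missDistanceKm);   // predicted miss distance
  const pc = Number(raw.collisionProbability); // probability of collision
  return {
    tca,
    missKm,
    pc,
    // Simple severity bucket, e.g. for color-coding events in the scene.
    severity: pc > 1e-4 ? "high" : pc > 1e-6 ? "medium" : "low",
  };
}
```

A record with a collision probability of 2e-5 would be bucketed as "medium" under these example thresholds.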
The project structure is as follows:
Visualization_of_Information_in_VR_Environments_of_Objects_in_Space/
│
├── Documents/ # Documentation related to the project
│ ├── [DS] Design/ # Design files, mockups, and prototypes
│ └── [QA] Quality/ # Quality assurance documents and processes
│
├── Exploration/ # Experimental explorations and tests
│ ├── A-frame/ # Tests using A-frame for VR visualization
│ ├── Resources/ # Additional resources
│ └── exp_conjunctions.json # Sample from the Neuraspace dataset (exp_ => exploration)
│
├── src/ # Main source code for the VR application
│ ├── Experience/ # Core application logic and VR experience management
│ │ ├── Shaders/ # Custom shaders for rendering effects
│ │ ├── Utils/ # Utility functions and helpers
│ │ ├── World/ # 3D world setup and scene management
│ │ ├── Camera.js # Handles VR camera and user perspective
│ │ ├── Experience.js # Main experience manager
│ │ ├── Renderer.js # Manages rendering pipeline
│ │ └── sources.js # Handles asset sources
│ ├── fonts/ # Fonts used in the project
│ ├── index.html # Main HTML file
│ ├── script.js # Main JavaScript file
│ ├── style.css # Main CSS file
│ └── test.html # Test file
│
├── static/ # Static assets such as models and textures
│ ├── models/ # 3D models used in the VR environment
│ ├── textures/ # Textures for materials and objects
│ ├── conjunctions.json # Main sample from Neuraspace dataset for conjunction analysis
│ └── exp_conjunctions.json # Sample dataset from Neuraspace for exploratory analysis
│
├── .gitignore # Specifies files and folders to ignore in version control
├── LICENSE # License information for the project
├── package-lock.json # Auto-generated dependency lock file
├── package.json # Project dependencies and scripts
├── README.md # This file
└── vite.config.js # Configuration for Vite (build tool)
- Node.js
- A compatible web browser that supports WebVR / WebXR
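To check whether a browser meets the WebXR prerequisite above, the standard WebXR Device API can be queried before starting the experience. This is a generic sketch, not code from the repository; it simply falls back to `false` when the API is unavailable (older browsers or non-browser runtimes).

```javascript
// Feature-detect immersive VR support via the WebXR Device API,
// degrading gracefully where navigator.xr does not exist.
async function supportsImmersiveVR() {
  if (typeof navigator === "undefined" || !navigator.xr) return false;
  try {
    return await navigator.xr.isSessionSupported("immersive-vr");
  } catch {
    return false;
  }
}
```

An application can use this to show a 2D desktop/mobile fallback when the check resolves to `false`, and offer an "Enter VR" button otherwise.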
- Clone the repository:
git clone https://github.com/your-username/Visualization_of_Information_in_VR_Environments_of_Objects_in_Space.git
cd Visualization_of_Information_in_VR_Environments_of_Objects_in_Space
- Install dependencies:
npm install
- Run the application:
npm run dev
- Make sure you have access to the necessary satellite data, as described in the documentation. (By default, the application runs on a sample of the Neuraspace dataset.)
I would like to express my sincere gratitude to:
- Professors Evgheni Polisciuc and Jorge C. S. Cardoso for their invaluable guidance throughout this project.
- Neuraspace for providing the data and collaboration that made this research possible.
- My colleagues, friends, and family for their continuous support and encouragement during this journey.
For any questions or further information, please feel free to reach out:
- Name: José Pedro da Rocha Antunes
- Email: [email protected]
Made with ❤️ as part of my Master’s Dissertation.