From 5626d83c353b67fdf3b173739dc4ef00acc8d4b1 Mon Sep 17 00:00:00 2001 From: FabiKogWi Date: Wed, 11 Sep 2024 11:48:57 +0200 Subject: [PATCH 01/11] Created documentation file with structure --- documentation.md | 15 +++++++++++++++ 1 file changed, 15 insertions(+) create mode 100644 documentation.md diff --git a/documentation.md b/documentation.md new file mode 100644 index 0000000..b719ec5 --- /dev/null +++ b/documentation.md @@ -0,0 +1,15 @@ +# Documentation +ToDo: Convert this to ReadMe when finished + +## Overview + +## Table of Contents + +## Installation +### Via Docker +### By yourself on your machine + +## Code added and how it works + +## Tutorial on how to use the code via example of our own project + From 2cc7e251ce77023e73a5edcc9ffcd68fca417c0b Mon Sep 17 00:00:00 2001 From: FabiKogWi Date: Wed, 11 Sep 2024 12:17:23 +0200 Subject: [PATCH 02/11] Added code to be documented in documentation.md --- documentation.md | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/documentation.md b/documentation.md index b719ec5..91b6c27 100644 --- a/documentation.md +++ b/documentation.md @@ -10,6 +10,14 @@ ToDo: Convert this to ReadMe when finished ### By yourself on your machine ## Code added and how it works +What code was added? +- Scene Selector +- Minimap +- Teleportation +- 3D Bounding Boxes +- Pathing +- User Controls +- Performance Improvements (?) ## Tutorial on how to use the code via example of our own project From a3325ceb2773690665e53e2d1df16e1471de1ca5 Mon Sep 17 00:00:00 2001 From: FabiKogWi Date: Sat, 21 Sep 2024 16:13:05 +0200 Subject: [PATCH 03/11] install instructions --- documentation.md | 12 ++++++++++++ 1 file changed, 12 insertions(+) diff --git a/documentation.md b/documentation.md index 91b6c27..af613b2 100644 --- a/documentation.md +++ b/documentation.md @@ -6,8 +6,20 @@ ToDo: Convert this to ReadMe when finished ## Table of Contents ## Installation + ### Via Docker +- Install [Docker](https://www.docker.com/) (Required!) 
+- Download the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)
+- Create a directory called `public` in the same directory as the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)
+- Place any splats you want to use inside the `public` directory
+- Run `docker compose up` in a terminal with the working directory set to the directory where you saved the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)
+
### By yourself on your machine
+**Requirements:** [Node](https://nodejs.org/en) and yarn (install via `npm install -g yarn`)
+- Pull the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git`
+- Run `yarn dev` in a terminal in the root directory of the project (should be `hyperrealistic_indoor_streetview`)
+
+
## Code added and how it works
What code was added?
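The compose file itself is only linked in the install steps above, not reproduced in this patch series. As a rough illustration, a compose file matching those instructions might look like the sketch below; the image name and host port come from the Docker instructions added later in this series (`fabiuni/teamprojekt`, port 3000), while the service name and the container-side mount path are pure assumptions:

```yaml
# Sketch only -- the authoritative file is the linked docker-compose.yml.
# Image and host port appear elsewhere in this patch series; the service name
# and container paths are guesses.
services:
  viewer:
    image: fabiuni/teamprojekt
    ports:
      - "3000:3000"
    volumes:
      - ./public:/app/public   # splats dropped into ./public reach the viewer
```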
From 2be131f587a52f7d24cde88c498acaef2aa8bf7a Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Sun, 22 Sep 2024 19:17:39 +0200
Subject: [PATCH 04/11] Added more fluff

---
 documentation.md | 56 ++++++++++++++++++++++++++++++++++++++----------
 1 file changed, 45 insertions(+), 11 deletions(-)

diff --git a/documentation.md b/documentation.md
index af613b2..4ee5961 100644
--- a/documentation.md
+++ b/documentation.md
@@ -14,7 +14,7 @@ ToDo: Convert this to ReadMe when finished
 - Place any splats you want to use inside the `public` directory
 - Run `docker compose up` in a terminal with the working directory set to the directory where you saved the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)

-### By yourself on your machine
+### Via GitHub
 **Requirements:** [Node](https://nodejs.org/en) and yarn (install via `npm install -g yarn`)
 - Pull the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git`
 - Run `yarn dev` in a terminal in the root directory of the project (should be `hyperrealistic_indoor_streetview`)
@@ -22,14 +22,48 @@ ToDo: Convert this to ReadMe when finished

 ## Code added and how it works
-What code was added?
-- Scene Selector
-- Minimap
-- Teleportation
-- 3D Bounding Boxes
-- Pathing
-- User Controls
-- Performance Improvements (?)
-
-## Tutorial on how to use the code via example of our own project
+What features were added?
+**Scene Selector**
+*In Progress*
+
+**Teleportation**
+The teleportation feature lets the user teleport to points of interest in the scene at the click of a button.
+It introduces the `handleTeleport()` function in `CanvasLayer.tsx`, which is used to teleport the user to a location defined inside the function.
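The split between the button handler and the camera update described above can be sketched roughly as follows. This is an illustrative standalone sketch, not the project's actual implementation: the `Vec3`/`CameraState` types and all function bodies are assumptions, and the real `handleTeleport()` in `CanvasLayer.tsx` works on the three.js camera.

```typescript
// Illustrative sketch only: the real handleTeleport() lives in CanvasLayer.tsx
// and operates on the three.js camera; this version just shows the idea.
type Vec3 = { x: number; y: number; z: number };

interface CameraState {
  position: Vec3;
  lookAt: Vec3;
}

// Place the camera at a point and orient it towards a target.
function teleport(
  camera: CameraState,
  x: number, y: number, z: number,
  lookAtX: number, lookAtY: number, lookAtZ: number
): void {
  camera.position = { x, y, z };
  camera.lookAt = { x: lookAtX, y: lookAtY, z: lookAtZ };
}

// A handleTeleport() bound to a UI button then just calls teleport()
// with a hard-coded point of interest (coordinates here are made up).
function handleTeleport(camera: CameraState): void {
  teleport(camera, 0, 1.7, 0, 5, 1.7, 0); // keep a constant eye height
}
```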
+
+
+**Minimap**
+*In Progress*
+
+
+**3D Bounding Boxes**
+There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat.
+
+**Interactive Elements**
+
+**Improved User Controls**
+User Controls were changed to feel more like the conventional controls in first-person games (WASD control scheme). Camera height is constant.
+
+
+**Performance Improvements**
+
+
+## Example for Inspiration for your own projects
+**Outline**
+A small community center was mapped from the inside via Gaussian Splatting, with bounding boxes added to mimic the architecture of the building. The goal was to be able to move inside the building like inside a video game.
+
+The project was deployed on a server via GitHub. To keep the project updated, GitHub Actions was used to automatically pull merges and pushes to main onto the server and restart it.
+
+**GitHub Actions**
+GitHub Actions was used to automatically deploy updates to the server. It uses shimataro/ssh-key-action@v2 to SSH onto the server, reinstall Node and Yarn, pull the new version from GitHub and restart the server.
+
+**Splat Creation**
+*!Please note that the method described here is probably outdated and you should do your own research on how to create splats!*
+To create splats, Nerfstudio's [Splatfacto](https://docs.nerf.studio/nerfology/methods/splat.html) was used. Splatfacto relies on [Colmap](https://colmap.github.io/) and [FFmpeg](https://www.ffmpeg.org/), so make sure you have these installed when using Splatfacto.
+Colmap is difficult to work with, so make sure to only feed it clear, steady images taken from evenly spaced angles. For further install instructions consult the respective websites.
+
+**Splat Cleanup**
+The splats were cleaned up using [Supersplat]() and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a workaround for editing and moving point clouds, while Supersplat is a lightweight web editor and viewer for splats that makes it easy to edit splats by removing Gaussians and reorienting the splat.
+
+**Landing Page**
+A landing page was created to integrate the viewer into a larger context and offer information on the project.

From 5ec43524d0c4cf233119337fb9042739cc7960ca Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Mon, 23 Sep 2024 13:50:09 +0200
Subject: [PATCH 05/11] Added documentation about rooms

---
 documentation.md | 31 +++++++++++++++++++++++++------
 1 file changed, 25 insertions(+), 6 deletions(-)

diff --git a/documentation.md b/documentation.md
index 4ee5961..90fd15b 100644
--- a/documentation.md
+++ b/documentation.md
@@ -1,6 +1,4 @@
 # Documentation
-ToDo: Convert this to ReadMe when finished
-
 ## Overview

 ## Table of Contents
@@ -21,10 +19,7 @@ ToDo: Convert this to ReadMe when finished
-## Code added and how it works
-What features were added?
-**Scene Selector**
-*In Progress*
+## Features

 **Teleportation**
 The teleportation feature lets the user teleport to points of interest in the scene at the click of a button.
@@ -34,17 +29,41 @@ It introduces the `handleTeleport()` function in `CanvasLayer.tsx` which is used

 **Minimap**
 *In Progress*

+**Rendering Changes**
+The Web-Viewer now supports room-based rendering, meaning multiple splats can be used and loaded depending on where the user is in the scene.
+You can define rooms as objects following this pattern:
+```
+const roomConfig = [
+  {
+    splat: "../../public/Splat.splat",
+    name: "room1",
+    adjacent: [],
+    minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100,
+    slopes: [],
+    objects: [],
+    elements: {
+      arrows: [],
+      panes: [],
+      windowarcs: []
+    }
+  }
+];
+```
+
 **3D Bounding Boxes**
 There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat.

+**Arrows**
+
 **Interactive Elements**
+Interactive 3D Elements were added to help the user interact with the scene.

 **Improved User Controls**
 User Controls were changed to feel more like the conventional controls in first-person games (WASD control scheme). Camera height is constant.

 **Performance Improvements**
+Performance was improved on low-end machines via small optimizations, gaining about 5-10 fps.

 ## Example for Inspiration for your own projects

From 9021bdcca8f0f1db90b8272a75b2d1937f9b21cc Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Tue, 24 Sep 2024 16:19:42 +0200
Subject: [PATCH 06/11] Added documentation on Arrowgraphs

---
 documentation.md | 38 ++++++++++++++++++++++++++++++++++----
 1 file changed, 34 insertions(+), 4 deletions(-)

diff --git a/documentation.md b/documentation.md
index 90fd15b..8ca3022 100644
--- a/documentation.md
+++ b/documentation.md
@@ -29,8 +29,8 @@ It introduces the `handleTeleport()` function in `CanvasLayer.tsx` which is used
 **Minimap**
 *In Progress*

-**Rendering Changes**
-The Web-Viewer now supports room based rendering, meaning multiple splats can be used and loaded depending on where the user is in the scene.
+**Rendering Changes and Additions**
+The Web-Viewer now supports room-based rendering, meaning multiple splats can be used and loaded depending on where the user is in the scene.
You can define rooms as objects in the roomConfig array following this pattern:
```
const roomConfig = [
  {
    splat: "../../public/Splat.splat",
    name: "room1",
    adjacent: [],
    minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100,
    slopes: [],
    objects: [],
    elements: {
      arrows: [],
      panes: [],
      windowarcs: []
    }
  }
];
```
Also introduced were slopes, objects and elements like panes and windowarcs. To add a pane, simply push it into the pane array via `roomConfig[n].elements.panes.push(pane1)`.
To make a pane, create an object with the following pattern:
```
{position: xyz-coordinates, verticalRotation: number, horizontalRotation: number, sizeFactor: number, content: content}
```
Windowarcs work similarly, with the required attributes being
```
{position: xyz-coordinates, horizontalRotation: number, arcRadius: number, arcHeight: number, content: content}
```
Arrows were added to support guiding the user through splats. Arrows follow the pattern:
```
{position: xyz-coordinates, graphName: string}
```
To connect the individual arrows in an arrow graph you need to add the edges to an ArrowGraph. You create an ArrowGraph with its constructor like this:
```
const graph = new ArrowGraph(scene); // scene: THREE.Scene
```
It has three attributes:
```
graph: arrowgraph // Holds edges and arrows
arrowsShortestPaths: (THREE.Object3D|undefined)[][] // Holds the shortest paths between the arrows
scene: THREE.Scene // Holds the THREE Scene
```
Methods are:
- `addArrow(arrow: THREE.Object3D): void`
Adds arrows to the graph
- `addEdge(arrow1name: string, arrow2name: string): void`
Adds edges between the arrows (at this point it is necessary to connect the arrows manually)
- `findShortestPaths(): void`
Finds the shortest paths between all arrows and saves them in arrowsShortestPaths
+
+
 **3D Bounding Boxes**
 There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat.
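To make the pane pattern above concrete, here is a hedged sketch of building a room config and pushing a pane into it. The TypeScript interfaces are assumptions inferred from the attribute lists documented above, not taken from the project's source:

```typescript
// Sketch under assumptions: types are inferred from the documented patterns.
type XYZ = [number, number, number];

interface Pane {
  position: XYZ;
  verticalRotation: number;
  horizontalRotation: number;
  sizeFactor: number;
  content: string;
}

interface Room {
  splat: string;
  name: string;
  adjacent: string[];
  minX: number; maxX: number; minY: number; maxY: number; minZ: number; maxZ: number;
  slopes: unknown[];
  objects: unknown[];
  elements: { arrows: unknown[]; panes: Pane[]; windowarcs: unknown[] };
}

const roomConfig: Room[] = [{
  splat: "../../public/Splat.splat",
  name: "room1",
  adjacent: [],
  minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100,
  slopes: [],
  objects: [],
  elements: { arrows: [], panes: [], windowarcs: [] },
}];

// Build a pane following the documented pattern and push it into the room.
const pane1: Pane = {
  position: [0, 2, -3],       // placement is illustrative
  verticalRotation: 0,
  horizontalRotation: Math.PI / 2,
  sizeFactor: 1,
  content: "info text",
};
roomConfig[0].elements.panes.push(pane1);
```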
-**Arrows**
-
 **Interactive Elements**
 Interactive 3D Elements were added to help the user

From 0a7b64cba2320ce4f26ea70e7738b81c52e09239 Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Tue, 24 Sep 2024 16:33:11 +0200
Subject: [PATCH 07/11] Finished documentation of arrowgraphs

---
 documentation.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/documentation.md b/documentation.md
index 8ca3022..8bd763d 100644
--- a/documentation.md
+++ b/documentation.md
@@ -78,6 +78,9 @@ Adds arrows to the graph
 Adds edges between the arrows (at this point it is necessary to connect the arrows manually)
 - `findShortestPaths(): void`
 Finds the shortest paths between all arrows and saves them in arrowsShortestPaths
+- `updateArrowRotations(destinationArrowName: string): void`
+Updates the rotations of all arrows so they point towards the target arrow along the shortest path to it.
+

From a4ab7af9ca6354b3f3d740256d4a0fda1bc4c3d6 Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Tue, 24 Sep 2024 20:43:05 +0200
Subject: [PATCH 08/11] Added overview, user controls and extended teleportation

---
 documentation.md | 61 +++++++++++++++++++++++++++++++++++------------
 1 file changed, 45 insertions(+), 16 deletions(-)

diff --git a/documentation.md b/documentation.md
index 8bd763d..e780e6a 100644
--- a/documentation.md
+++ b/documentation.md
@@ -1,29 +1,31 @@
-# Documentation
+# ReadMe
 ## Overview
-
-## Table of Contents
+Hyperrealistic Indoor Streetview is a student project under the supervision of [Jan-Niklas Dihlmann](https://github.com/JDihlmann) at the University of Tübingen, aiming to use Gaussian Splatting to create photo-realistic, web-viewable 3D environments of indoor spaces, inspired by Google Street View. It supports modelling 3D environments with bounding boxes, offers a number of quality-of-life features and introduces a game-inspired control scheme.

 ## Installation

 ### Via Docker
 - Install [Docker](https://www.docker.com/) (Required!)
+**If you want to use your own splats**
 - Download the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)
 - Create a directory called `public` in the same directory as the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)
 - Place any splats you want to use inside the `public` directory
 - Run `docker compose up` in a terminal with the working directory set to the directory where you saved the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml)

+**If you just want to try the web-viewer**
+- `docker image pull fabiuni/teamprojekt` to download the image
+- `docker run -dp 3000:3000 fabiuni/teamprojekt` to run the image
+
 ### Via GitHub
 **Requirements:** [Node](https://nodejs.org/en) and yarn (install via `npm install -g yarn`)
-- Pull the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git`
-- Run `yarn dev` in the terminal in the root directory of the project (should be hyperrealistic_indoor_streetview)
-
-
+- Clone the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git`
+- Run `yarn dev` in a terminal in the root directory of the project (should be `hyperrealistic_indoor_streetview`)

 ## Features

 **Teleportation**
 The teleportation feature lets the user teleport to points of interest in the scene at the click of a button.
 It introduces the `handleTeleport()` function in `CanvasLayer.tsx`, which is used to teleport the user to a location defined inside the function.
+It introduces the `handleTeleport()` function in `CanvasLayer.tsx` which is used to teleport the User to a location defined inside the function. It uses the `teleport(x: number, y: number, z: number, lookAtX: number, lookAtY: number, lookAtZ: number)` which is defined in teleportControls.tsx and handles the actual logic behind the teleportation. **Minimap** @@ -73,33 +75,59 @@ scene: THREE.Scene \\ Holds the THREE Scene ``` Methods are: - `addArrow(arrow: THREE.Object3D): void` -Adds arrows to the graph +Adds a THREE.Object3D to the graph which serves as the arrows model in the 3D-Viewer - `addEdge(arrow1name: string, arrow2name: string): void` -Adds edges between the arrows (at this point it is necessary to connect the arrows manually) +Adds edges between the arrows (at this point it is necessary to connect the arrows manually). The arrowNames - `findShortestPaths(): void` Finds the shortest paths between all arrows and saves them in arrowsShortestPaths - `updateArrowRotations(destinationArrowName: string): void` -Updates the rotations of all arrows so they point towards the target arrow along the shortest path to it. +Updates the rotations of all arrows so they point towards the target arrow along the shortest path to it. - +Now simply add the edges of the arrowgraph to the graph via `arrowGraph.addGraph(arrow)`. +An example for adding arrows is below: +``` +// Declare arrowGraph +const arrowGraph = new ArrowGraph(scene); + +// Add arrows to room +roomConfig[1].elements.arrows[0] = {xyzCoordinates, arrowGraph}; +roomConfig[1].elements.arrows[1] = {xyzCoordinates, arrowGraph}; +// Add edges to the graph +arrowGraph.addEdge("arrowOneName", "arrowTwoName"); // the arrow names are the names the arrows have in the scene, so you +// need know these names before entering them +``` **3D Bounding Boxes** There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat. 
**Interactive Elements**
+Interactive 3D Elements were added to help the user interact with the scene. As the feature is in active development, it will be documented in the future.

 **Improved User Controls**
+User Controls were changed to feel more like the conventional controls in first-person games (WASD control scheme). Camera height is now constant. Below is an overview of the important functions.
+`FirstPersonControls ({ speed, rooms, updateCurrentRoom }): null` is the function that defines user controls and can be imported under the same name. Below the parameters are explained:
+```
+// speed is an object
+speed: {value: number, min: number, max: number, step: number}
+
+// rooms is an array of rooms (see the roomConfig[] array above for an example)
+rooms: array[room]
+
+// updateCurrentRoom is a function that handles the change from the current room to a new one (for an example
+// implementation see updateCurrentRoom in CanvasLayer.tsx)
+updateCurrentRoom(newRoom: room): void
+
+```
+The control scheme can easily be extended by adding keys inside the switch cases in onKeyDown and onKeyUp. Make sure to add each key to both, so that movement also stops on key release.

 **Performance Improvements**
 Performance was improved on low-end machines via small optimizations, gaining about 5-10 fps.

 ## Example for Inspiration for your own projects
+This is the project the code was written for; it may serve as an example of what you can do with this (and maybe what not).

 **Outline**
 A small community center was mapped from the inside via Gaussian Splatting, with bounding boxes added to mimic the architecture of the building. The goal was to be able to move inside the building like inside a video game.
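Extending the control scheme as described in the user-controls section above (adding a key to both switch statements) can be sketched like this. The state shape, key choice and handler names are illustrative assumptions, not the project's actual `FirstPersonControls` code:

```typescript
// Illustrative sketch: a movement-state object updated by paired key handlers.
// The project's real handlers live in its FirstPersonControls implementation.
interface MoveState {
  forward: boolean;
  backward: boolean;
  left: boolean;
  right: boolean;
  sprint: boolean; // example of a newly added key
}

const move: MoveState = { forward: false, backward: false, left: false, right: false, sprint: false };

function onKeyDown(code: string): void {
  switch (code) {
    case "KeyW": move.forward = true; break;
    case "KeyS": move.backward = true; break;
    case "KeyA": move.left = true; break;
    case "KeyD": move.right = true; break;
    case "ShiftLeft": move.sprint = true; break; // new key added here...
  }
}

function onKeyUp(code: string): void {
  switch (code) {
    case "KeyW": move.forward = false; break;
    case "KeyS": move.backward = false; break;
    case "KeyA": move.left = false; break;
    case "KeyD": move.right = false; break;
    case "ShiftLeft": move.sprint = false; break; // ...and released here, or it never stops
  }
}
```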
@@ -114,8 +142,9 @@ To create splats Nerfstudios [Splatfacto](https://docs.nerf.studio/nerfology/met Colmap is difficult to work with, so make sure to only feed it clear, unshaky images taken from very even angles. For further install instructions consult the respective websites. **Splat Cleanup** -The splats where cleaned up using [Supersplat]() and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a work-around to be able to edit and move point-clouds while Supersplat is a lightweight web-editor and viewer for splats that makes it easy to edit splats by removing gaussians and reorienting the splat. +The splats were cleaned up using [Supersplat]() and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a work-around to be able to edit and move point-clouds while Supersplat is a lightweight web-editor and viewer for splats that makes it easy to edit splats by removing gaussians and reorienting the splat. **Landing Page** A landing page was created to integrate the viewer into a larger context and offer information on the project. + From 49d6e2b71995709a983c37752a7d76a1ce6209a3 Mon Sep 17 00:00:00 2001 From: FabiKogWi <146583802+FabiKogWi@users.noreply.github.com> Date: Tue, 24 Sep 2024 20:48:33 +0200 Subject: [PATCH 09/11] Update documentation.md --- documentation.md | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/documentation.md b/documentation.md index e780e6a..b9ebf07 100644 --- a/documentation.md +++ b/documentation.md @@ -99,13 +99,13 @@ arrowGraph.addEdge("arrowOneName", "arrowTwoName"); // the arrow names are the n ``` -**3D Bounding Boxes** +**3D Bounding Boxes**\ There are now 3D Bounding Boxes which limit the space the user can move to. 
These can be moved to reflect the internal structure of a splat.

-**Interactive Elements**
+**Interactive Elements**\
 Interactive 3D Elements were added to help the user interact with the scene. As the feature is in active development, it will be documented in the future.

-**Improved User Controls**
+**Improved User Controls**\
 User Controls were changed to feel more like the conventional controls in first-person games (WASD control scheme). Camera height is now constant. Below is an overview of the important functions.
 `FirstPersonControls ({ speed, rooms, updateCurrentRoom }): null` is the function that defines user controls and can be imported under the same name. Below the parameters are explained:
 ```
 // speed is an object
 speed: {value: number, min: number, max: number, step: number}

 // rooms is an array of rooms (see the roomConfig[] array above for an example)
 rooms: array[room]

 // updateCurrentRoom is a function that handles the change from the current room to a new one (for an example
 // implementation see updateCurrentRoom in CanvasLayer.tsx)
 updateCurrentRoom(newRoom: room): void

 ```
 The control scheme can easily be extended by adding keys inside the switch cases in onKeyDown and onKeyUp. Make sure to add each key to both, so that movement also stops on key release.

-**Performance Improvements**
+**Performance Improvements**\
 Performance was improved on low-end machines via small optimizations, gaining about 5-10 fps.

 ## Example for Inspiration for your own projects
 This is the project the code was written for; it may serve as an example of what you can do with this (and maybe what not).

-**Outline**
+**Outline**\
 A small community center was mapped from the inside via Gaussian Splatting, with bounding boxes added to mimic the architecture of the building. The goal was to be able to move inside the building like inside a video game.

 The project was deployed on a server via GitHub. To keep the project updated, GitHub Actions was used to automatically pull merges and pushes to main onto the server and restart it.

-**GitHub Actions**
+**GitHub Actions**\
 GitHub Actions was used to automatically deploy updates to the server. It uses shimataro/ssh-key-action@v2 to SSH onto the server, reinstall Node and Yarn, pull the new version from GitHub and restart the server.
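The deployment workflow described above (SSH in via shimataro/ssh-key-action@v2, reinstall Node and Yarn, pull, restart) might look roughly like the sketch below. The workflow name, secret names, server address, paths and the restart command are all placeholders assumed for illustration, not taken from the project's actual workflow file:

```yaml
# Hedged sketch of the described deployment workflow; secrets, host, paths and
# the restart command are placeholders, not the repository's real configuration.
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: shimataro/ssh-key-action@v2
        with:
          key: ${{ secrets.SSH_KEY }}
          known_hosts: ${{ secrets.KNOWN_HOSTS }}
      - name: Pull and restart on the server
        run: |
          ssh user@example-server "cd hyperrealistic_indoor_streetview && \
            git pull origin main && yarn install && <restart command>"
```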
-**Splat Creation**
+**Splat Creation**\
 *!Please note that the method described here is probably outdated and you should do your own research on how to create splats!*
 To create splats, Nerfstudio's [Splatfacto](https://docs.nerf.studio/nerfology/methods/splat.html) was used. Splatfacto relies on [Colmap](https://colmap.github.io/) and [FFmpeg](https://www.ffmpeg.org/), so make sure you have these installed when using Splatfacto.
 Colmap is difficult to work with, so make sure to only feed it clear, steady images taken from evenly spaced angles. For further install instructions consult the respective websites.

-**Splat Cleanup**
+**Splat Cleanup**\
 The splats were cleaned up using [Supersplat]() and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a workaround for editing and moving point clouds, while Supersplat is a lightweight web editor and viewer for splats that makes it easy to edit splats by removing Gaussians and reorienting the splat.

-**Landing Page**
+**Landing Page**\
 A landing page was created to integrate the viewer into a larger context and offer information on the project.

From b06255a24099c2e3efa8cbef85ccb0a5498bd726 Mon Sep 17 00:00:00 2001
From: FabiKogWi <146583802+FabiKogWi@users.noreply.github.com>
Date: Tue, 24 Sep 2024 20:49:49 +0200
Subject: [PATCH 10/11] Update documentation.md

---
 documentation.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/documentation.md b/documentation.md
index b9ebf07..65874b3 100644
--- a/documentation.md
+++ b/documentation.md
@@ -23,15 +23,15 @@

 ## Features

-**Teleportation**
+**Teleportation**\
 The teleportation feature lets the user teleport to points of interest in the scene at the click of a button.
It introduces the `handleTeleport()` function in `CanvasLayer.tsx`, which is used to teleport the user to a location defined inside the function. It uses the `teleport(x: number, y: number, z: number, lookAtX: number, lookAtY: number, lookAtZ: number)` function, which is defined in `teleportControls.tsx` and handles the actual logic behind the teleportation.

-**Minimap**
+**Minimap**\
 *In Progress*

-**Rendering Changes and Additions**
+**Rendering Changes and Additions**\
 The Web-Viewer now supports room-based rendering, meaning multiple splats can be used and loaded depending on where the user is in the scene. You can define rooms as objects in the roomConfig array following this pattern:
 ```
 const roomConfig = [
   {
     splat: "../../public/Splat.splat",
     name: "room1",
     adjacent: [],
     minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100,
     slopes: [],
     objects: [],
     elements: {
       arrows: [],
       panes: [],
       windowarcs: []
     }
   }
 ];
 ```

From ca70192aa83b9d4f5c91b4e233cbb7cd5a9a10f8 Mon Sep 17 00:00:00 2001
From: FabiKogWi
Date: Wed, 25 Sep 2024 22:04:10 +0200
Subject: [PATCH 11/11] removed documentation and moved content to ReadMe.md

---
 README.md        | 299 +++++++++++++++++------------------------------
 documentation.md | 150 ------------------------
 2 files changed, 106 insertions(+), 343 deletions(-)
 delete mode 100644 documentation.md

diff --git a/README.md b/README.md
index 68a5b5a..65874b3 100644
--- a/README.md
+++ b/README.md
@@ -1,237 +1,150 @@
-# Hyperrealistic Indoor Street View
+# ReadMe
+## Overview
+Hyperrealistic Indoor Streetview is a student project under the supervision of [Jan-Niklas Dihlmann](https://github.com/JDihlmann) at the University of Tübingen, aiming to use Gaussian Splatting to create photo-realistic, web-viewable 3D environments of indoor spaces, inspired by Google Street View. It supports modelling 3D environments with bounding boxes, offers a number of quality-of-life features and introduces a game-inspired control scheme.

-

- - Exercise: getting familiar with the pipeline - -

+## Installation -https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/assets/9963865/0a519b96-e1d7-40c6-86f7-8eb69c25e82e +### Via Docker +- Install [Docker](https://www.docker.com/) (Required!) +**If you want to use your own splats** +- Download the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) +- Create a directory called `public` in the same directory as the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) +- Place any splats you want to use inside the public folder +- run `docker compose up` in the terminal with the working directory set to the directory you saved the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) +**If you just want to try the web-viewer** +- `docker image pull fabiuni/teamprojekt` Download the image +- `docker run -dp:3000:3000 fabiuni/teamprojekt` Run the image -# Overview -In order to create your own gaussian splat for the web, we have to go through some steps. That require setup on your local machine and the cluster from the university. 
+### Via GitHub +**Requirements:** [Node](https://nodejs.org/en) and yarn (install via `npm install -g yarn`) +- Clone the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git` +- Run `yarn dev` in the terminal in the root directory of the project (should be hyperrealistic_indoor_streetview) +## Features -- [Installation](#installation) - - [Local Setup](#local-setup) - - [Server Setup](#server-setup) -- [Data Acquisition](#data-acquisition) -- [Preprocessing](#preprocessing) -- [Training](#training) -- [Application](#application) +**Teleportation**\ +The teleportation features offer the possibility to let the user teleport to points of interest in the scene with a click of a button. +It introduces the `handleTeleport()` function in `CanvasLayer.tsx` which is used to teleport the User to a location defined inside the function. It uses the `teleport(x: number, y: number, z: number, lookAtX: number, lookAtY: number, lookAtZ: number)` which is defined in teleportControls.tsx and handles the actual logic behind the teleportation. -# Update: 02.05 -> Issues regarding SSH Conection -Use tmux to run a task on the server without relying on the SSH connection. After connecting to the server run the following commant to start a tmux session: -```bash -tmux new -s mysession -``` - -After starting the tmux session you can run the training command or installation in the tmux session. You can detach from the tmux session by pressing `ctrl + b` and then `d`. Or simply close the terminal. Your tmux session will still run in the background. - -After disconnecting from the server you can reconnect to the server and reattach to the tmux session. First connect to the server and then run the following command to list all tmux sessions: -```bash -tmux attach-session -t mysession -``` -You can also have multiple sessions, just use different names for the sessions. 
# Installation
First, we need to set up our local and server environments. We will set up our local environment to run the web server and display the Gaussian splat in our web browser. For the server setup we will use the computer graphics cluster at the University of Tübingen. We will optimize the Gaussian splats on the cluster and download the optimized splats to display them in our web browser.

Pipeline

> This installation process might be lengthy and take some time. However, it is important that every one of you understands the full pipeline of the project once. Help is always available in the Discord channel. Help each other first and if no one can help, ask me.

## Local Setup
We need to prepare our local environment to run the web server and display the Gaussian splat in our web browser. We will use [React](https://react.dev/) in combination with [React Three Fiber](https://docs.pmnd.rs/react-three-fiber/getting-started/introduction) to do so. I will provide the full code for that, so you only have to set up your environment to allow the code to run.

> This tutorial is optimized for macOS and Linux. If you are using Windows, you might have to adjust some commands; I highly recommend using [WSL](https://docs.microsoft.com/en-us/windows/wsl/install) to run a Linux distribution on your Windows machine.

### Node.js and package managers
First we need to install [Node.js](https://nodejs.org/en/download). You can download the installer from the website or use a package manager like [brew](https://brew.sh/) on macOS. You can also install node via the node version manager [nvm](https://github.com/nvm-sh/nvm) to manage multiple node versions on your machine (recommended).
-
-Check if you have installed node and npm by running the following commands in your terminal:
-```bash
-node -v
-npm -v
+**Rendering Changes and Additions**\
+The Web-Viewer now supports room-based rendering: multiple splats can be used and loaded depending on where the user is in the scene. You can define rooms as objects in the roomConfig array following this pattern:
```
-
-Next, we need to install [yarn](https://yarnpkg.com/) as our package manager of choice (replacing nmp). You can install yarn with npm by running the following command:
-```bash
-npm install -g yarn
+const roomConfig = [
+  {
+    splat: "../../public/Splat.splat",
+    name: "room1",
+    adjacent: [],
+    minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100,
+    slopes: [],
+    objects: [],
+    elements: {
+      arrows: [],
+      panes: [],
+      windowarcs: []
+    }
+  }
+];
```
-After that you can check if yarn is installed by running:
-```bash
-yarn -v
+Also introduced were slopes, objects, and elements like panes and windowarcs. To add a pane, simply push it into the panes array via `roomConfig[n].elements.panes.push(pane1)`.
+To make a pane, create an object with the following pattern:
```
-
-### Install the web application
-Next, we need to clone the repository to our local machine. You can do so by running the following command in your terminal:
-```bash
-git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git
+{position: xyz-coordinates, verticalRotation: number, horizontalRotation: number, sizeFactor: number, content: content}
```
-
-After cloning the repository we need to install the dependencies for the web application.
You can do so by running the following command in the root directory of the repository:
-```bash
-cd ./hyperrealistic_indoor_streetview
-yarn install
+Windowarcs work similarly, with the required attributes being
```
-
-If everything went well you can start the web server by running the following command in the root directory of the repository:
-```bash
-yarn dev
+{position: xyz-coordinates, horizontalRotation: number, arcRadius: number, arcHeight: number, content: content}
```
-
-Goto `http://localhost:5173/` (might be different for you) in your web browser to see the web application running. You should see a first gaussian splatting scene I created for you.
-
-
-
-## Server Setup
-We will use the cluster of computer graphics at the University of Tübingen to train our gaussian splatting scene. On the server we will first setup our [python](https://www.anaconda.com/download) environment and then install [nerfstudio](https://docs.nerf.studio/) to train our gaussian splatting scene.
-
-### Connecting to the cluster
-We have a full tutorial and cheatsheet on how to connect to the cluster. As I cannot share it publicly, you will find the `ssh_and_vnc.pdf` in the web page linked in the Discord channel or email you received. Complete steps 1-4 (including 4) before continuing with the next steps.
-
-> Important: Complete the steps 1-4 (including 4) before continuing with the next steps.
-
-### Python environment
-Next we will install miniconda to manage our python environment. Be sure that you connected to the cluster and are on one of the pool-machines (e.g. `cgpool1801,...,cgpool1803` or `cgpool1900,...,cgpool1915` or `cgpoolsand1900,...,cgpoolsand1907`).
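To illustrate how the bounds in `roomConfig` can drive room-based rendering, here is a hypothetical helper (not part of the codebase; names and types are assumptions) that looks up which room contains the user's position:

```typescript
// Hypothetical helper: pick the room whose axis-aligned bounds contain a point.
// Rooms follow the roomConfig pattern shown above (minX..maxZ bounds).
interface Room {
  name: string;
  minX: number; maxX: number;
  minY: number; maxY: number;
  minZ: number; maxZ: number;
}

function findCurrentRoom(rooms: Room[], x: number, y: number, z: number): Room | undefined {
  return rooms.find(r =>
    x >= r.minX && x <= r.maxX &&
    y >= r.minY && y <= r.maxY &&
    z >= r.minZ && z <= r.maxZ
  );
}
```

A function like this could feed `updateCurrentRoom` to decide which splat to load.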
-
-These five commands quickly and quietly install the latest 64-bit version of the installer and then clean up after themselves (enter them one after the other):
-```bash
-cd ~
-mkdir ~/miniconda
-wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
-bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
-rm -rf ~/miniconda3/miniconda.sh
+Arrows were added to guide the user through splats. Arrows follow the pattern:
```
-After installing, initialize your newly-installed Miniconda. The following commands initialize for bash and zsh shells:
-```bash
-~/miniconda3/bin/conda init bash
+{position: xyz-coordinates, graphName: string}
```
-
-You can also add the path to your `.bashrc` file to automatically activate the conda environment when you open a new terminal.
-```bash
-cd ~
-touch ~/.bashrc
-echo "source ~/miniconda3/etc/profile.d/conda.sh" >> ~/.bashrc
-source ~/.bashrc
+To connect individual arrows you need to add edges to an ArrowGraph. You create an ArrowGraph with the `new ArrowGraph` constructor like this:
```
-
-
-### Nerfstudio
-Next we will install [nerfstudio](https://docs.nerf.studio/) to train our gaussian splatting scene.
-
-#### Create a new conda environment
-```bash
-conda create -n nerfstudio python=3.8
-conda activate nerfstudio
-python -m pip install --upgrade pip
+const graph = new ArrowGraph(scene); // scene: THREE.Scene
```
-
-#### Dependencies
-We need to install some dependencies to run nerfstudio, like `torch`, `torchvision`, `functorch` and `tinycudann`. We will install `torch` and `torchvision` from the official pytorch website and `functorch` and `tinycudann` from the github repositories.
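To make the arrow-graph behaviour concrete, here is a simplified sketch of such a graph using plain string names instead of `THREE.Object3D` models (an assumption-laden sketch; the real `ArrowGraph` additionally stores and rotates the 3D arrow objects):

```typescript
// Simplified sketch of an arrow graph: arrows are plain names, edges are
// undirected, and a BFS from the destination yields, for every arrow, the
// neighbour it should point towards along a shortest path.
class ArrowGraphSketch {
  private edges = new Map<string, string[]>();

  addArrow(name: string): void {
    if (!this.edges.has(name)) this.edges.set(name, []);
  }

  addEdge(a: string, b: string): void {
    this.addArrow(a);
    this.addArrow(b);
    this.edges.get(a)!.push(b);
    this.edges.get(b)!.push(a);
  }

  // BFS outward from the destination: the parent pointers recorded here give
  // each arrow its next hop towards the destination.
  nextHopsTowards(destination: string): Map<string, string> {
    const next = new Map<string, string>();
    const queue = [destination];
    const seen = new Set([destination]);
    while (queue.length > 0) {
      const cur = queue.shift()!;
      for (const nb of this.edges.get(cur) ?? []) {
        if (!seen.has(nb)) {
          seen.add(nb);
          next.set(nb, cur); // nb should point towards cur
          queue.push(nb);
        }
      }
    }
    return next;
  }
}
```

`findShortestPaths()`/`updateArrowRotations()` in the real class play the role of `nextHopsTowards()` here, with the rotation applied to each arrow's 3D model.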
-```bash
-pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118
-conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
-pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
+It has three attributes:
```
-
-#### Nerfstudio
-Finally we can install nerfstudio and set it up with the following commands:
-```bash
-pip install nerfstudio
-ns-install-cli
+graph: arrowgraph // holds the edges and arrows
+arrowsShortestPaths: (THREE.Object3D|undefined)[][] // holds the shortest paths between the arrows
+scene: THREE.Scene // holds the THREE scene
```
+Methods are:
+- `addArrow(arrow: THREE.Object3D): void`
+Adds a THREE.Object3D to the graph which serves as the arrow's model in the 3D viewer
+- `addEdge(arrow1name: string, arrow2name: string): void`
+Adds edges between the arrows (at this point it is necessary to connect the arrows manually). The arrow names must match the names the arrows have in the scene.
+- `findShortestPaths(): void`
+Finds the shortest paths between all arrows and saves them in arrowsShortestPaths
+- `updateArrowRotations(destinationArrowName: string): void`
+Updates the rotations of all arrows so they point towards the target arrow along the shortest path to it.
-
-### COLMAP
-We will use [COLMAP](https://colmap.github.io/) to preprocess the images for the training. We install COLMAP with the following commands:
-```bash
-conda install -c conda-forge colmap
+Now simply add the arrows to the graph via `arrowGraph.addArrow(arrow)`.
+An example for adding arrows is below:
```
+// Declare arrowGraph
+const arrowGraph = new ArrowGraph(scene);
+// Add arrows to room
+roomConfig[1].elements.arrows[0] = {xyzCoordinates, arrowGraph};
+roomConfig[1].elements.arrows[1] = {xyzCoordinates, arrowGraph};
-
-# Data Acquisition
-The first step is to get the data. In our case of inverse rendering we need images from a static scene. So as a first step scan your room with your mobile phone.
Meaning you have to take a lot of pictures from different angles and positions.
-
-- Check that the room is well lit and there are no moving objects.
-- Take rouhgly 80-140 pictures of your room from different angles and positions.
+// Add edges to the graph
+arrowGraph.addEdge("arrowOneName", "arrowTwoName"); // the arrow names are the names the arrows have in the scene, so you
+// need to know these names before entering them
+```
-
-# Preprocessing
-In the next step we will preprocess the images to retreive camera poses and intrinsics for the images.We will use [COLMAP](https://colmap.github.io/) to do so. Neatly it is already installed on the cluster and can be interfaced with Nerfstudio.
+
+**3D Bounding Boxes**\
+There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat.
-
-#### Upload your images to the cluster
-First, we need to upload the images to the cluster. You can use `scp` to upload the images to the cluster:
+**Interactive Elements**\
+Interactive 3D Elements were added to help the user interact with the scene. As the feature is in active development, it will be documented in the future.
-
-```bash
-scp -r /path/to/your/images/ username@servername:/path/to/save/your/images/
+**Improved User Controls**\
+User controls were changed to feel like conventional first-person game controls (WASD control scheme). Camera height is now constant. Below is an overview of the important functions.
+`FirstPersonControls ({ speed, rooms, updateCurrentRoom }): null` is the function that defines user controls and can be imported under the same name. Below the parameters are explained:
+```
-Replace `/path/to/your/images/` with the path to your images on your local machine and `/path/to/save/your/images/` with the path to save your images on the cluster. Furthermore replace `username` with your username and `servername` with a viable servername (e.g.
`cgcontact` or `cgpool1801,...,cgpool1803` or `cgpool1900,...,cgpool1915` or `cgpoolsand1900,...,cgpoolsand1907`).
-
-> Alternatively if you are using visual studio code you can use the `Remote - SSH` extension to connect to the cluster and upload the images directly from the editor.
+// speed is an object
+speed: {value: number, min: number, max: number, step: number}
-
-#### Preprocessing
-Run the following command to preprocess the images with COLMAP:
-```bash
- ns-process-data images --data /path/to/save/your/images/ --output-dir /new/path/to/processed/images/
-```
-Replace `/path/to/save/your/images/` with the path to your images on the cluster and `/new/path/to/processed/images/` with the path to save your processed images on the cluster.
+// rooms is an array of rooms (see the roomConfig[] array above for an example)
+rooms: array[room]
+// updateCurrentRoom is a function that handles the change from the current room to a new one (for an example
+// implementation see updateCurrentRoom in CanvasLayer.tsx)
+updateCurrentRoom(newRoom: room): void
-
-# Training
-Before starting the training be sure to be connected to the cluster and on one of the pool-machines (e.g. `cgpool1801,...,cgpool1803` or `cgpool1900,...,cgpool1915` or `cgpoolsand1900,...,cgpoolsand1907`). Before starting a training always check the available resources on the cluster with the following command:
-```bash
-pool-smi
```
-If you are sure that no one is using the cluster you can start the training with the following command (use atleast a RTX 2080Ti):
-```bash
-ns-train splatfacto --data /new/path/to/processed/images/ --output-dir ./outputs
-```
-
-> You can see the progress of the training in the terminal. If you want to stop the training press `ctrl + c`.
+The control scheme can easily be extended by adding keys inside the switch cases in onKeyDown and onKeyUp. Make sure to add each key in both handlers so that movement stops on key release.
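The onKeyDown/onKeyUp pattern described above can be sketched as a key-state map (a simplified sketch assuming a WASD layout; the real `FirstPersonControls` additionally applies the movement to the camera every frame):

```typescript
// Sketch of the WASD key-state pattern: onKeyDown sets a movement flag,
// onKeyUp clears it, so holding a key moves and releasing it stops.
const movement = { forward: false, backward: false, left: false, right: false };

function onKeyDown(code: string): void {
  switch (code) {
    case "KeyW": movement.forward = true; break;
    case "KeyS": movement.backward = true; break;
    case "KeyA": movement.left = true; break;
    case "KeyD": movement.right = true; break;
    // extend the control scheme by adding cases here AND in onKeyUp
  }
}

function onKeyUp(code: string): void {
  switch (code) {
    case "KeyW": movement.forward = false; break;
    case "KeyS": movement.backward = false; break;
    case "KeyA": movement.left = false; break;
    case "KeyD": movement.right = false; break;
  }
}
```

Forgetting the matching case in `onKeyUp` is exactly the bug the note above warns about: the flag would never clear and the user would keep moving.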
-Nerfstudio Viewer
+**Performance Improvements**\
+Performance on low-end machines was improved through small optimizations, gaining roughly 5-10 fps.
-
-> (Optional) You can also visualize the training progress by using the nerfstudio viewer. Create a port forwarding for the cluster and open the viewer in your web browser. Run this command on your local machine (not on the cluster) or use the VS Code port forwarding:
-> ```bash
-> ssh -L 7007:127.0.0.1:7007 servername
-> ```
-> The viewer is available at `http://localhost:7007/` in your web browser. You may have to change the port to the one displayed in the terminal, when you started the training.
+## Example project for inspiration
+This is the project the code was written for; it may serve as an example of what you can do with it (and maybe what not to do).
+**Outline**\
+A small community center was mapped from the inside via Gaussian Splatting, with bounding boxes added to mimic the architecture of the building. The goal was to be able to move inside the building as in a video game.
-
-#### Export the optimized scene
-After the training is finished you can export the optimized scene with the following command:
-```bash
-ns-export gaussian-splat --load-config outputs/...[experiment_path].../config.yml --output-dir ./exports/splat
-```
-Replace `[experiment_path]` with the path to the experiment folder in the `outputs` directory. You can find the path in the terminal output of the training.
+The project was deployed on a server via GitHub. To keep the project updated, GitHub Actions was used to automatically pull merges and pushes to main onto the server and restart it.
-
-# Application
+**GitHub Actions**\
+GitHub Actions was used to automatically deploy updates to the server. The workflow uses shimataro/ssh-key-action@v2 to SSH onto the server, reinstall Node and Yarn, pull the new version from GitHub and restart the server.
-After you trained the model successfully you can download the optimized scene to display them in your web browser.
+**Splat Creation**\
+*!Please note that the method described here is probably outdated and you should do your own research on how to create splats!*
+To create splats, Nerfstudio's [Splatfacto](https://docs.nerf.studio/nerfology/methods/splat.html) was used. Splatfacto relies on [Colmap](https://colmap.github.io/) and [FFmpeg](https://www.ffmpeg.org/), so make sure you have these installed when using Splatfacto.
+Colmap is difficult to work with, so make sure to only feed it clear, steady images taken at evenly spaced angles. For further install instructions consult the respective websites.
-
-#### Download the optimized scene
-First, we need to download the optimized scene to our local machine. You can do so by running the following command in your terminal on your local machine:
-```bash
-scp -r username@servername:./exports/splat/splat.ply /.../hyperrealistic_indoor_streetview/
-```
-> Again if you are using visual studio code you can use the `Remote - SSH` extension to connect to the cluster and download the optimized scene directly from the editor.
+**Splat Cleanup**\
+The splats were cleaned up using SuperSplat and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a workaround to edit and move point clouds, while SuperSplat is a lightweight web editor and viewer for splats that makes it easy to edit splats by removing gaussians and reorienting the splat.
-
-#### Display the optimized scene
-After downloading the optimized scene you can display the scene in your web browser. But before that, we need to convert our `splat.ply` file to a `scene.splat` file. Be sure to have the `splat.ply` file in the root directory of the repository.
You can convert the `splat.ply` file to a `scene.splat` file by running the following command in the root directory of the repository: -```bash -node ./convert_ply_to_splat.js -``` - -Then the `splat.splat` file should be within the `public` folder of the web application. You can start the web server by running the following command in the root directory of the repository: -```bash -yarn dev -``` +**Landing Page**\ +A landing page was created to integrate the viewer into a larger context and offer information on the project. -Bildschirmfoto 2024-04-25 um 11 23 47 -Go to `http://localhost:5173/` (might be different for you) in your web browser to see the web application running. You should see the optimized gaussian splatting scene you trained on the cluster. Navigate with your mouse and zoom, see [here](https://github.com/pmndrs/drei?tab=readme-ov-file#controls) for full controls. diff --git a/documentation.md b/documentation.md deleted file mode 100644 index 65874b3..0000000 --- a/documentation.md +++ /dev/null @@ -1,150 +0,0 @@ -# ReadMe -## Overview -Hyperrealistic Indoor Streetview is a student project under the tutorship of [Jan-Niklas Dihlmann](https://github.com/JDihlmann) from the University of Tübingen aiming to use Gaussian Splatting for creating photo-realistic web-viewable 3D enviroments of indoor spaces inspired from Google Streetview. It supports modelling 3D enviroments with bounding boxes, offers a number of quality of life features and introduces a game inspired control scheme. - -## Installation - -### Via Docker -- Install [Docker](https://www.docker.com/) (Required!) 
-**If you want to use your own splats** -- Download the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) -- Create a directory called `public` in the same directory as the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) -- Place any splats you want to use inside the public folder -- run `docker compose up` in the terminal with the working directory set to the directory you saved the [docker-compose.yml](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview/blob/e70b5098cc485d6f7ac06f7e6e20f7c50f8afe00/docker-compose.yml) -**If you just want to try the web-viewer** -- `docker image pull fabiuni/teamprojekt` Download the image -- `docker run -dp:3000:3000 fabiuni/teamprojekt` Run the image - - -### Via GitHub -**Requirements:** [Node](https://nodejs.org/en) and yarn (install via `npm install -g yarn`) -- Clone the code from [GitHub](https://github.com/cgtuebingen/hyperrealistic_indoor_streetview): `git clone git@github.com:cgtuebingen/hyperrealistic_indoor_streetview.git` -- Run `yarn dev` in the terminal in the root directory of the project (should be hyperrealistic_indoor_streetview) - -## Features - -**Teleportation**\ -The teleportation features offer the possibility to let the user teleport to points of interest in the scene with a click of a button. -It introduces the `handleTeleport()` function in `CanvasLayer.tsx` which is used to teleport the User to a location defined inside the function. It uses the `teleport(x: number, y: number, z: number, lookAtX: number, lookAtY: number, lookAtZ: number)` which is defined in teleportControls.tsx and handles the actual logic behind the teleportation. 
- - -**Minimap**\ -*In Progress* - -**Rendering Changes and Additions**\ -The Web-Viewer now supports room based rendering, meaning multiple splats can be used and loaded depending on where the user is in the scene. You can define rooms as an object in the roomConfig array following this pattern: -``` -const roomConfig = [ - { - splat: "../../public/Splat.splat", - name: "room1", - adjacent: [], - minX: -100, maxX: 100, minY: 0, maxY: 5, minZ: -100, maxZ: 100, - slopes: [], - objects: [], - elements: { - arrows: [], - panes: [], - windowarcs: [] - } - } - ]; -``` -Also introduced were slopes, objects and elements like panes and windowarcs. To add a pane simply push it into the pane array via `roomConfig[n].elements.panes.push(pane1)`. -To make a pane create an object with the following pattern: -``` -{position: xyz-coordinates, verticalRotation: number, horizontalRotation: number, sizeFactor: number, content: content} -``` -windowarcs work similiarly with the required attributes being -``` -{position: xyz-coordinates, horizontalRotation: number, arcRadius: number, arcHeight: number, content: content} -``` -Arrrows where added to support guiding the user through splats. Arrows follow the pattern: -``` -{position: xyz-coordinates, graphName: string} -``` -To connect the single arrows in an arrowGraph you need to add the edges to an ArrowGraph. 
You create an ArrowGraph with the new ArrowGraph constructor like this: -``` -const graph = new ArrowGraph(scene: THREE.Scene); -``` -It has three attributes: -``` -graph: arrowgraph \\ Holds edges and arrows -arrowsShortestPaths: (THREE.Object3D|undefined)[][] \\Holds the shortest paths between the arrows -scene: THREE.Scene \\ Holds the THREE Scene -``` -Methods are: -- `addArrow(arrow: THREE.Object3D): void` -Adds a THREE.Object3D to the graph which serves as the arrows model in the 3D-Viewer -- `addEdge(arrow1name: string, arrow2name: string): void` -Adds edges between the arrows (at this point it is necessary to connect the arrows manually). The arrowNames -- `findShortestPaths(): void` -Finds the shortest paths between all arrows and saves them in arrowsShortestPaths -- `updateArrowRotations(destinationArrowName: string): void` -Updates the rotations of all arrows so they point towards the target arrow along the shortest path to it. - -Now simply add the edges of the arrowgraph to the graph via `arrowGraph.addGraph(arrow)`. -An example for adding arrows is below: -``` -// Declare arrowGraph -const arrowGraph = new ArrowGraph(scene); - -// Add arrows to room -roomConfig[1].elements.arrows[0] = {xyzCoordinates, arrowGraph}; -roomConfig[1].elements.arrows[1] = {xyzCoordinates, arrowGraph}; - -// Add edges to the graph -arrowGraph.addEdge("arrowOneName", "arrowTwoName"); // the arrow names are the names the arrows have in the scene, so you -// need know these names before entering them - -``` - -**3D Bounding Boxes**\ -There are now 3D Bounding Boxes which limit the space the user can move to. These can be moved to reflect the internal structure of a splat. - -**Interactive Elements**\ -Interactive 3D Elements were added to help the user interact with the scene. 
As the feature is in active developement it will be documented in the future - -**Improved User Controls**\ -User Controls were changed to feel more like conventional controls in a First-Person Games (WASD-Control-Scheme). Camera height is now constant. Below is an overwiew of the important functions. -`FirstPersonControls ({ speed, rooms, updateCurrentRoom }): null` is the function that defines user controls and can be imported under the same name. Below the parameters are explained: -``` -\\ speed is an object -speed: {value: number, min: number, max: number, step: number} - -\\ rooms is an array of rooms (see the roomConfig[] array above for an example) -rooms: array[room] - -\\ updateCurrentRoom is a function that handles the change from the current room to a new one (for an example of -\\ implementation see updateCurrentRoom in CanvasLayer.tsx) -updateCurrentRoom(newRoom: room): void - -``` -The control-scheme can easily be extended by adding keys inside the switch cases in onKeyDown and onKeyUp. Make sure to add it in both to be able to stop moving. - -**Performance Improvements**\ -Performance was improved on low-end machines via small optimization with about 5-10fps gained. - -## Example for Inspiration for your own projects -This is the project the codes was written for and may serve as an example for what you can do with this (and maybe what not). - -**Outline**\ -A small community center was mapped from the inside via Gaussian Splatting with bounding boxes added to mimick the architecture of the building. The goal was to be able to move inside the building like inside a video game. - -The project was deployed on a server via GitHub. To keep the project updated GitHub Actions was used to automatically pull merges and pushes to main on to the server and restarting it. - -**GitHub Actions**\ -GitHub Actions was used to automatically deploy updates to the server. 
Uses shimataro/ssh-key-action@v2 to ssh onto the server and automatically reinstall Node and Yarn to then pull the new version from GitHub and then restart the server. - -**Splat Creation**\ -*!Please note that the method described here is probably outdated and you should do your own research on how to create splats!* -To create splats Nerfstudios [Splatfacto](https://docs.nerf.studio/nerfology/methods/splat.html) was used. Splatfacto relies on [Colmap](https://colmap.github.io/) and [FFmpeg](https://www.ffmpeg.org/), so make sure you have these installed when using Splatfacto. -Colmap is difficult to work with, so make sure to only feed it clear, unshaky images taken from very even angles. For further install instructions consult the respective websites. - -**Splat Cleanup**\ -The splats were cleaned up using [Supersplat]() and [Blender](https://www.blender.org/) with the [Gaussian-Splatting](https://github.com/ReshotAI/gaussian-splatting-blender-addon) addon to load the files. Blender was used as a work-around to be able to edit and move point-clouds while Supersplat is a lightweight web-editor and viewer for splats that makes it easy to edit splats by removing gaussians and reorienting the splat. - -**Landing Page**\ -A landing page was created to integrate the viewer into a larger context and offer information on the project. - -