
# Indoor-floor-plans-prediction


This repository provides work-in-progress code, data, and pretrained models for a deep-learning approach that predicts complex indoor floor plans from registered omnidirectional images.

## Overview

The approach takes as input a set of spatially registered and vertically aligned equirectangular images. For each input image, a depth and room-shape prediction module, an end-to-end neural network, predicts an intermediate clutter-free depth map of the scene and a segmented floor projection (i.e., Nadir shape) of the predicted uncluttered room. We exploit camera registration to place all Nadir projections in the same reference floor plan. Given this joint representation, we adopt an encoder-decoder transformer-based architecture that processes relationships between Nadir projections and predicts the final room shapes (Nadir maps), using a two-level query embedding (i.e., room polygons and room corners). As a final result, we obtain the predicted room polygons.
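The sketch below is a minimal, illustrative view of this two-stage pipeline. The function names are placeholders standing in for the repository's modules (`eval_nadirshape.py`/`generate_nadirmaps.py` for stage 1, `eval_nadirfloor.py` for stage 2), not its actual API.

```python
# Conceptual sketch of the two-stage pipeline described above.
# All functions below are placeholders, not the repository's API.
import numpy as np

def predict_nadir_shape(equirect_image: np.ndarray) -> np.ndarray:
    """Stage 1 (placeholder): clutter-free depth -> segmented floor
    projection (Nadir shape) for a single panorama."""
    return np.zeros((256, 256), dtype=np.float32)  # dummy Nadir shape

def register_to_floor_plan(nadir_shapes, camera_poses):
    """Place all Nadir projections in a common reference floor plan
    using the known camera registration (placeholder transform)."""
    return np.stack(nadir_shapes)  # joint Nadir-map representation

def predict_room_polygons(nadir_maps: np.ndarray):
    """Stage 2 (placeholder): transformer encoder-decoder with two-level
    (room-polygon / room-corner) queries -> final room polygons."""
    return [np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=np.float32)]

panoramas = [np.zeros((512, 1024, 3), dtype=np.uint8) for _ in range(9)]
poses = [np.eye(4) for _ in panoramas]
shapes = [predict_nadir_shape(p) for p in panoramas]
rooms = predict_room_polygons(register_to_floor_plan(shapes, poses))
print(f"predicted {len(rooms)} room polygon(s)")
```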

## Python Requirements

See the file `requirements.txt`.

## Prerequisites

```bash
cd models/ops
sh make.sh

# unit test for deformable-attention modules (should see all checking is True)
# python test.py

cd ../../diff_ras
python setup.py build develop
```
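After building, a quick import check can confirm the extensions are available. This is a hedged sketch: the package names below assume the Deformable-DETR convention for the custom ops and a `diff_ras` module name, and may differ in your build.

```python
# Optional sanity check (the module names are assumptions; adjust them
# if your build installs the extensions under different names).
import importlib

for name in ("MultiScaleDeformableAttention", "diff_ras"):
    try:
        importlib.import_module(name)
        print(f"{name}: OK")
    except ImportError as exc:  # build missing or named differently
        print(f"{name}: not importable ({exc})")
```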

## Installation

We suggest creating a Python virtual environment and installing all the required Python modules using pip. After cloning the repository, run:

```bash
python -m venv .env
source .env/bin/activate
pip install -r requirements.txt
```

## Data

To test single-image depth estimation and its floor plan footprint estimation, we provide a panoramic indoor scene from Structured3D at `data/s3d_single/test`. To test floor plan reconstruction, we provide an example scene from Structured3D at `data/s3d_floor`, which includes as input 9 panoramic images from which an entire multi-room floor plan is reconstructed.
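As a quick sanity check, a short script such as the following can list the panoramas of the multi-room sample scene. It is illustrative only; the glob pattern and image extensions are assumptions about the sample data layout.

```python
# List image files under the sample multi-room scene (illustrative;
# adapt the extensions/pattern to the actual layout of data/s3d_floor).
from pathlib import Path

scene = Path("data/s3d_floor")
panoramas = sorted(
    p for p in scene.rglob("*") if p.suffix.lower() in {".png", ".jpg", ".jpeg"}
)
print(f"found {len(panoramas)} image file(s) under {scene}")
for p in panoramas:
    print(" ", p)
```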

## Download Pretrained Models

The pretrained models should be copied into your local `./checkpoints` directories. [FIXME]

## Usage

To test single-image depth prediction and its Nadir shape, run (example):

```bash
python eval_nadirshape.py --pth ./nadirshape/ckpt/DEMO_RUNS/s3d_depth/best_valid.pth --root_dir ./data/s3d_single/test/
```
- `--pth` path to the trained model.
- `--root_dir` path to the input equirectangular scene.
- `--output_dir` path to the output results.

To generate Nadir maps from a set of omnidirectional images, run (example):

```bash
python generate_nadirmaps.py --pth ./nadirshape/ckpt/DEMO_RUNS/s3d_depth/best_valid.pth --data_dir ./data/s3d_floor
```
- `--pth` path to the trained model.
- `--data_dir` path to the input scene (registered images).
- `--output_dir` path to the output results.

To predict a floor plan as a set of room polygons, run (example):

```bash
python eval_nadirfloor.py --pth checkpoints/DEMO_RUNS/nadirfloornet_s3d.pth --dataset_dir ./results/s3d_nadirmaps
```
- `--pth` path to the trained model.
- `--dataset_dir` path to the input scene (registered images).
- `--output_dir` path to the output results.
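
For convenience, the two floor-plan steps above can be chained from Python with `subprocess`. This is an illustrative sketch that reuses the example checkpoint paths; the assumption that `generate_nadirmaps.py` writes its output where `eval_nadirfloor.py` expects it (`./results/s3d_nadirmaps`) should be verified against your local setup.

```python
# Illustrative wrapper chaining Nadir-map generation and floor plan
# prediction; paths mirror the example commands above (assumptions).
import subprocess

NADIRSHAPE_CKPT = "./nadirshape/ckpt/DEMO_RUNS/s3d_depth/best_valid.pth"
NADIRFLOOR_CKPT = "checkpoints/DEMO_RUNS/nadirfloornet_s3d.pth"
NADIRMAPS_DIR = "./results/s3d_nadirmaps"

steps = [
    ["python", "generate_nadirmaps.py",
     "--pth", NADIRSHAPE_CKPT,
     "--data_dir", "./data/s3d_floor",
     "--output_dir", NADIRMAPS_DIR],
    ["python", "eval_nadirfloor.py",
     "--pth", NADIRFLOOR_CKPT,
     "--dataset_dir", NADIRMAPS_DIR],
]

for cmd in steps:
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)  # stop if a step fails
```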

## Acknowledgements

We acknowledge the support of the PNRR ICSC National Research Centre for High Performance Computing, Big Data and Quantum Computing (CN00000013), under the NRRP MUR program funded by the NextGenerationEU.
