Official Repository for SuperCaustics: Real-time, open-source simulation of transparent objects for deep learning applications
SuperCaustics is a simulation toolset made in Unreal Engine for generating massive computer vision datasets that include transparent objects. SuperCaustics is compatible with ClearGrasp: you can process the data you collect with the Dataset Creator into ClearGrasp-style data, though you don't have to use that pipeline. You can also use the Neural Networks provided in this repository.
RELEASE version: 1.00
Upcoming Features in version 1.1:
- Class and instance labeling: automatic labeling of objects with class tracking and instance counting support.
- Macros, bug fixes and cleanups: modular blueprint macros for fundamental tasks that can be reused elsewhere in the project.
- Features: automatic camera-space ground-truth materials for complex custom objects.

For suggesting future features, please open a discussion in Issues.
Item | Link |
---|---|
SuperCaustics v1.00 Project Files (Must run on NvRTX UE4.26-Caustics) | Download |
SuperCaustics Dataset (v1.00) | Download |
Contact: If you have questions, comments, or bugs, please open a GitHub issue or contact me at: mehdimousavi.redcap[at]gmail[dot]com
This repository is tested with Unreal Engine 4.26.1 on Windows 10 (SuperCaustics with the UE Editor) and Ubuntu 16.04 (neural networks), with Python 3.7, PyTorch 1.7.1, and Easytorch 2.8.3.
Hardware Requirements:
- NVIDIA GeForce RTX 2060 (or higher) for real-time ray tracing
- Intel Core i5-7600K
- 16 GB RAM
- 20 GB disk space for the project
- Ample disk space for the raw dataset
Software Requirements:
- Neural networks: `pip install easytorch torch torchvision pillow numpy imageio opencv-python`
- SuperCaustics editor: NVRTX Unreal Engine 4.26 - Caustics branch
- Probe data gatherer: `pip install pykeyboard`
SuperCaustics features a fully-fledged automatic scene generation system designed with compatibility and user-friendliness in mind. Using SuperCaustics to generate your own data is very easy. Here's how:
To use the SuperCaustics Editor, you need a compatible version of Unreal Engine 4.26.x or higher. To download Unreal Engine, follow the step-by-step instructions here to be added to the Epic Games GitHub, after which you can access and build UE4 from source here. You can also skip all of that and download the engine from here: Download NvRTX UE4.26.1-Caustics
To import your 3D meshes, follow these steps:
1. Export your 3D mesh to `.FBX` or any other format accepted by Unreal Engine.
2. Drag the `.FBX` file and drop it into a folder inside the UE4 Content Browser.
3. Go to `Content>Logic>Glass_Actors>Master>Actor.bp`.
4. Right-click on `Actor.bp` and create a child blueprint from `Actor.bp`.
5. Open the child blueprint you just created, go to the viewport, and drag-drop your 3D mesh into the `static mesh` component.
6. Repeat from (1) to create as many Glass actors as you wish.
You don't really have to do this, since SuperCaustics comes with free 3D meshes made for transparent object detection (curated for the SuperCaustics dataset).
Using the template scene `content>maps>template>realistic.map`, we can easily set up a simulation exactly how we like it. You can create a duplicate of this scene, or make your own scene and bring the components we're about to discuss into it yourself.
Set the bounds of your simulation by adjusting these settings:
1.1 Objects:
- `Object Range From, To` determines the range of how many objects are spawned in each generated scenario.
- `Objects to spawn` takes an `int` and spawns that many objects.
- `Glass Array` holds the objects within reach of the Generator. To add or remove items from this array, click the `+` or `x` button.
- `Add Force?` determines whether a physics impulse is added to each spawned object upon generation.
- `Max Impulse Modifier` sets the intensity of the impulse added at the beginning of the simulation.
- `Spawn Offset Distance` sets the minimum distance between random spawning points (see the sketch below this list).
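To make the interaction between `Objects to spawn`, `Spawn Offset Distance`, `Add Force?`, and `Max Impulse Modifier` concrete, here is a minimal Python sketch of what the Generator conceptually does when a scenario is built. This is not the Blueprint implementation; the bounds, uniform sampling, and retry limit are assumptions for illustration only.

```python
import random

def generate_scenario(objects_to_spawn, spawn_offset_distance,
                      add_force, max_impulse_modifier,
                      bounds=((-200, 200), (-200, 200), (50, 150)),
                      max_tries=1000):
    """Conceptual sketch of one generated scenario (not the actual Blueprint logic)."""
    points, impulses = [], []
    tries = 0
    while len(points) < objects_to_spawn and tries < max_tries:
        tries += 1
        candidate = tuple(random.uniform(lo, hi) for lo, hi in bounds)
        # Spawn Offset Distance: reject points closer than the minimum separation.
        if all(sum((c - p) ** 2 for c, p in zip(candidate, q)) >= spawn_offset_distance ** 2
               for q in points):
            points.append(candidate)
            # Add Force?: optionally attach a random impulse, capped by Max Impulse Modifier.
            impulses.append(random.uniform(0, max_impulse_modifier) if add_force else 0.0)
    return list(zip(points, impulses))

# Example: 5 glass objects, at least 30 units apart, with impulses up to 500.
print(generate_scenario(5, 30.0, True, 500.0))
```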
1.2 Camera:
- `Camera Space Normals?` switches between camera-space and world-space surface normals.
- `Intel Realsense Camera?` switches between the SuperCaustics custom camera and a simulated Intel RealSense camera.
You can simulate any camera you want by looking at its physical sensor and lens specifications and changing the settings on the camera module accordingly. This matters because image projection differs between cameras, so the same objects may look different from one camera to another.
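As one concrete example of turning sensor and lens specifications into a camera setting, the horizontal field of view can be computed from the sensor width and the focal length. The sketch below uses hypothetical full-frame numbers (36 mm sensor, 50 mm lens); substitute the specifications of the camera you want to simulate and enter the result into the camera module's field-of-view setting.

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view (degrees) from sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical example: a 36 mm wide sensor with a 50 mm lens -> roughly 39.6 degrees.
print(f"{horizontal_fov_deg(36.0, 50.0):.1f} deg")
```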
This module manages the visibility and position of props in each generated scene. You can click on each prop and change its properties however you wish, and it will be placed randomly in each iteration of the simulation. Out of the box, Prop Manager supports up to 6 unique props, and it can generate and output unique colors for uni-material objects at runtime (note how the tiger has a different color every time the simulation is reset).
Setting up props:
To set up your own props, create your object inside Unreal Engine (or create it elsewhere and import it), find Prop Manager in the World Outliner (top right of the screen), select the Prop Manager components, and drag and drop your object into the `static mesh` and `material` components.
Backdrop is a Blueprint system with the ability to detect and cycle through its materials at runtime. You can add or remove any `material` or `material instance` in the material bank found in the `AIP CORE` section of the Backdrop's details panel (see below).
Backdrop listens for the `m` key event to change its material at runtime.
HDRI maps are used to dictate diffuse lighting and the reflection patterns on specular items. For instance, if you want your glass objects to reflect fluorescent lights, you can do that with HDRI Backdrops.
The HDRI Backdrop system is adapted from a similar engine plugin to be compatible with SuperCaustics. Our version is capable of swapping HDRI Backdrops during runtime.
HDRI Backdrop works similarly to `Backdrop`. You can download a vast array of HDRI maps from the internet, drag and drop them into the `HDRI Bank` in the details panel, and you should be good to go.
HDRI Backdrop listens for the `h` key to change the scene's HDRI mapping at runtime.
Light Manager is a small system that stores lighting information (colors, rotations) and handles its runtime operations.
Light Manager has a `SuperCaustics` category where you can set the colors and rotations of the main light source; these are applied whenever the `l` control event is pressed.
Please note that the color map and the rotation map must have the same size.
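To make the size constraint concrete, here is a tiny, hypothetical Python illustration of how paired color and rotation entries could be cycled; the actual Light Manager is a Blueprint, and these values are placeholders.

```python
# Hypothetical illustration of the Light Manager constraint: one rotation per color.
color_map    = [(255, 255, 255), (255, 200, 150), (150, 200, 255)]  # placeholder RGB colors
rotation_map = [(-40, 30, 0),    (-60, 120, 0),   (-20, 210, 0)]    # placeholder pitch/yaw/roll

assert len(color_map) == len(rotation_map), "size of color map must equal size of rotation map"

def next_lighting(step: int):
    """Each press of 'l' advances to the next (color, rotation) pair, wrapping around."""
    return color_map[step % len(color_map)], rotation_map[step % len(rotation_map)]

print(next_lighting(0))
print(next_lighting(4))  # wraps around to the second entry
```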
Events, cameras, and data ablation events are handled by the `level blueprint`. The level blueprint can reach every object present in the scene. Events are triggered by keyboard control signals (keystrokes). To view and edit the level blueprint, click on `Blueprints` at the top middle of the screen, then click `Open level blueprint`.
- `m`: change backdrop material
- `d`: toggle ray tracing and DLSS
- `h`: cycle through the `HDRI Bank` array inside the `HDRIBackdrops` system for lighting profile and reflections
- `l`: lights - rotate the primary light source in `LightManager`
- `q`: fast restart of the simulation
- `c`: take a screenshot at the current resolution (to change your resolution, run the command `r.setres 1920x1080w`)
- `v`: switch views (cameras)
- `u`: summon the Blue ToteBox prop
- `g`: show transparent object masks
- `e`: toggle mesh caustics
- `t`: show depth ground truth
- `r`: show surface normals ground truth
- `o`: show outlines ground truth
- To save system resources during fast restarts, RTX and DLSS are disabled by default, so you need to press `d` every time before taking screenshots (Probe does this automatically).
- Due to the nature of real-time ray tracing (ray tracing at a lower resolution, then upscaling and denoising), screenshots taken at non-native resolutions will appear noisy, since there is no denoising at the higher resolutions where screenshots are captured. This is why `c` only takes screenshots at native resolution. If you wish to override that, you can set a custom screenshot size in the `level blueprint` or use the `HighResShot` console command.
- If you wish, you can edit these key events inside the `level blueprint`.
- The surface normals ground truth (`r`) only shows the surface normal modality you choose in the `Generator module`.
Probe is a data-gathering script that sends control signals to the data ablation module inside the SuperCaustics simulation. Probe works by sending specific keyboard commands to the SuperCaustics window. How to use Probe:
Set arguments:
- `setsize` (`int`): how many scenarios you would like collected.
- `moves_file` (file path): where to save your moves/scenarios, for later recreation if needed.
Run `Probe.py` and wait until the data is fully collected.
Probe works with any window/application; this is Probe running on another project.
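For readers who want to see what "sending keyboard commands to the window" looks like in practice, here is a minimal, hypothetical sketch in the spirit of Probe. It uses the `pykeyboard` package from the requirements above and the key bindings listed earlier; the capture sequence, timing, and moves-file format are illustrative assumptions, not the actual contents of `Probe.py`.

```python
import time
from pykeyboard import PyKeyboard  # from the pykeyboard requirement listed above

# Illustrative assumptions: these values and the capture order are not taken from Probe.py.
SETSIZE = 100                         # how many scenarios to collect
MOVES_FILE = "moves.txt"              # where to log the scenarios for later recreation
MODALITY_KEYS = ["g", "t", "r", "o"]  # masks, depth, normals, outlines (see key bindings)

keyboard = PyKeyboard()

def capture_scenario(index, log):
    keyboard.tap_key("q")             # fast restart: generate a new random scenario
    time.sleep(2.0)                   # let spawning and physics settle (assumed delay)
    keyboard.tap_key("d")             # re-enable ray tracing / DLSS before screenshots
    time.sleep(1.0)
    keyboard.tap_key("c")             # RGB screenshot at native resolution
    for key in MODALITY_KEYS:         # switch to each ground-truth view and capture it
        keyboard.tap_key(key)
        time.sleep(0.5)
        keyboard.tap_key("c")
    log.write(f"scenario {index}\n")

if __name__ == "__main__":
    time.sleep(5.0)                   # give yourself time to focus the SuperCaustics window
    with open(MOVES_FILE, "w") as log:
        for i in range(SETSIZE):
            capture_scenario(i, log)
```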
After setting up your data, copy it somewhere accessible to the Python code. Then, run:
python segmentation_main.py -ph train -ms 1 -data /path/to/dataset -nw 32 -ep 35 -b 8 -gi 4 -log experiments_supercaustics
Arguments:
- `-ms`: model scale (`double`)
- `-nw`: number of worker threads; usually keep it at half of your available CPU threads (`int`)
- `-b`: batch size (`int`)
- `-gi`: gradient accumulation; use it to increase the effective batch size (`int`)
- `-ep`: number of epochs to train (`int`)
- `-ph`: phase (`train` or `test`)
- `-data`: path to your dataset; use an absolute path for precision (example: `./home/data/folder/`)
- `-spl`: split ratio; easytorch will split the data into train/val/test randomly (example: `-spl 0.7 0.2 0.1`)
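After training, you can evaluate with the same script by switching the phase. The command below is only an illustrative combination of the arguments listed above; adjust the flags to your setup:

python segmentation_main.py -ph test -ms 1 -data /path/to/dataset -nw 32 -b 8 -spl 0.7 0.2 0.1 -log experiments_supercaustics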
If you end up using SuperCaustics or the neural networks, please cite the papers below:
@misc{mousavi2021supercaustics,
  title={SuperCaustics: Real-time, open-source simulation of transparent objects for deep learning applications},
  author={Mehdi Mousavi and Rolando Estrada},
  year={2021},
  eprint={2107.11008},
  archivePrefix={arXiv},
  primaryClass={cs.GR}
}

@InProceedings{10.1007/978-3-030-64559-5_41,
  author="Mousavi, Mehdi and Khanal, Aashis and Estrada, Rolando",
  title="AI Playground: Unreal Engine-Based Data Ablation Tool for Deep Learning",
  booktitle="Advances in Visual Computing",
  year="2020",
  publisher="Springer International Publishing",
  address="Cham",
  pages="518--532",
  isbn="978-3-030-64559-5"
}