🧬 Project 1: Molecular Structures in VR
Chemistry
Biology
Education
Download real protein structures from the RCSB Protein Data Bank and visualize them in VR.
Walk through hemoglobin, insulin, the SARS-CoV-2 spike protein, and thousands of other molecules at atomic scale.
What's Included:
• Jupyter notebook with complete working code
• Automatic download from Protein Data Bank
• Atom coloring by element type
• Examples: Hemoglobin (1A3N), Insulin (1MSO), SARS-CoV-2 Spike (6VXX)
Real Dataset: RCSB Protein Data Bank (rcsb.org) - 200,000+ protein structures
Get Started:
Open the Jupyter Notebook →
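The core of this workflow can be sketched in a few lines: PDB files use a fixed-column text format, so extracting coordinates and element types needs no special library. The snippet below parses two hard-coded ATOM records; the real notebook would read a full file downloaded from the Protein Data Bank (e.g. entry 1A3N), and the CPK color table here is a simplified assumption.

```python
# Sketch: parse PDB ATOM records into xyz points colored by element (CPK-style).
# Two hard-coded ATOM lines stand in for a downloaded file such as 1A3N.pdb.
PDB_LINES = [
    "ATOM      1  N   VAL A   1       6.204  16.869   4.854  1.00 49.05           N",
    "ATOM      2  CA  VAL A   1       6.913  17.759   4.607  1.00 43.14           C",
]

# Simplified CPK colors (RGB in 0-1) for common elements
CPK = {"C": (0.5, 0.5, 0.5), "N": (0.0, 0.0, 1.0),
       "O": (1.0, 0.0, 0.0), "S": (1.0, 1.0, 0.0)}

points = []
for line in PDB_LINES:
    if line.startswith(("ATOM", "HETATM")):
        # Fixed columns in the PDB format: x/y/z in columns 31-54, element in 77-78
        x, y, z = float(line[30:38]), float(line[38:46]), float(line[46:54])
        element = line[76:78].strip()
        points.append((x, y, z, *CPK.get(element, (1.0, 1.0, 1.0))))

print(f"{len(points)} atoms parsed")  # -> 2 atoms parsed
```

The resulting `(x, y, z, r, g, b)` tuples are exactly the shape of data a point-cloud viewer expects.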
🌌 Project 2: Astronomy & Star Catalogs
Astrophysics
Data Visualization
Education
Explore real star catalogs from ESA's Gaia mission and walk through the local stellar neighborhood.
See stars colored by temperature, create clusters like the Pleiades, and understand why constellations are just 2D projections.
What's Included:
• Generate realistic 3D star fields (100 light-years)
• Optional: Query real Gaia DR3 data (10,000+ nearest stars)
• Famous stars: Alpha Centauri, Sirius, Vega, Betelgeuse
• Create Pleiades-like star clusters
Real Dataset: ESA Gaia DR3 (1 billion stars mapped)
Get Started:
Open the Jupyter Notebook →
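The "generate a realistic 3D star field" step might look like the sketch below: uniform positions inside a 100-light-year sphere, plus a crude temperature-to-color ramp. The temperature distribution and the color mapping are illustrative assumptions, not the Gaia pipeline; querying real DR3 data would replace the synthetic part.

```python
import numpy as np

rng = np.random.default_rng(42)
n_stars = 1000

# Uniform positions inside a 100-light-year sphere: random directions,
# radii scaled by r^(1/3) so density is uniform in volume
directions = rng.normal(size=(n_stars, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
radii = 100.0 * rng.random(n_stars) ** (1 / 3)
positions = directions * radii[:, None]

# Rough effective temperatures in Kelvin (cool M dwarfs dominate nearby space)
temps = rng.lognormal(mean=np.log(4000), sigma=0.35, size=n_stars)

# Crude temperature-to-color ramp: cool stars reddish, hot stars blue-white
t = np.clip((temps - 3000) / (10000 - 3000), 0, 1)
colors = np.stack([1 - 0.6 * t, 0.6 + 0.2 * t, 0.4 + 0.6 * t], axis=1)

cloud = np.hstack([positions, colors])  # (N, 6): x, y, z, r, g, b
print(cloud.shape)  # -> (1000, 6)
```

Because every star gets true 3D coordinates, flying away from Earth immediately shows why constellations dissolve: they only line up from one viewpoint.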
🏙️ Project 3: LiDAR Urban Planning
Urban Planning
Architecture
Smart Cities
Process city-scale LiDAR scans and explore urban environments in VR.
Load LAS/LAZ files, color by elevation or building classification, and walk through cities at 1:1 scale.
What's Included:
• Load and parse LAS/LAZ LiDAR files
• Intelligent subsampling for VR performance
• Color coding: height, classification, intensity
• Links to free datasets (USGS 3DEP, OpenTopography, Amsterdam 3D)
Real Datasets: USGS 3DEP (US cities), Amsterdam 3D, UK Environment Agency
Get Started:
Open the Jupyter Notebook →
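The subsampling and elevation-coloring steps can be sketched as below. A synthetic array stands in for a real tile; with laspy you would obtain the same `xyz` array from a file (roughly `las = laspy.read("tile.laz")` followed by stacking `las.x`, `las.y`, `las.z` — filename hypothetical). The point budget and color ramp are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a LAS/LAZ tile: 200k points over a 1 km x 1 km area, 0-60 m heights
xyz = rng.random((200_000, 3)) * np.array([1000.0, 1000.0, 60.0])  # metres

# Subsample to a VR-friendly point budget
BUDGET = 50_000
keep = rng.choice(len(xyz), size=BUDGET, replace=False)
xyz = xyz[keep]

# Color by elevation: normalize z, then map low -> green, high -> red
z = xyz[:, 2]
t = (z - z.min()) / (z.max() - z.min())
colors = np.stack([t, 1 - t, np.zeros_like(t)], axis=1)

cloud = np.hstack([xyz, colors]).astype(np.float32)
print(cloud.shape)  # -> (50000, 6)
```

Uniform random subsampling keeps the overall shape of the scene; the same `keep` index trick works for coloring by classification or intensity instead of height.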
🎥 Tutorial: Capture Mixed Reality Videos with RealityMixer
Meta Quest
iPhone/iPad
Video Capture
Tutorial
Create professional mixed reality videos of yourself exploring point cloud data in VR.
RealityMixer lets you capture your VR sessions without a green screen, perfect for demos, presentations, and sharing your visualizations.
What You'll Need:
• Meta Quest (Quest 1, 2, or Pro)
• iPhone or iPad (A12 chip or newer, iOS 14+)
• RealityMixer app (Download on App Store)
• Same WiFi network for both devices
Step-by-Step Setup
1. Enable Developer Mode on Quest
• Open Meta Quest app on your phone
• Go to Menu → Devices → Developer Mode
• Enable Developer Mode
• Put on your Quest headset and accept the prompt
2. Install RealityMixer on iOS
Download from the App Store or build from source on GitHub.
3. Connect Your Devices
• Make sure Quest and iPhone are on the same WiFi
• Open RealityMixer on iPhone
• Put on your Quest headset
• Open any ImmersivePoints visualization on Quest
• RealityMixer should auto-detect your Quest
4. Calibrate Your Camera
RealityMixer uses ARKit to track your position and blend you into the VR scene. Follow the on-screen calibration instructions:
• Point your iPhone at yourself wearing the Quest
• Tap the calibration button
• Hold still for a few seconds
• Once calibrated, you'll see yourself in the VR environment!
5. Record Your Session
Use iPhone's screen recording to capture your mixed reality exploration:
• Swipe down to Control Center
• Tap the Record button (circle icon)
• Explore your point cloud in VR
• Stop recording when done
• Video saves to Photos app
Pro Tips for Great Captures
Lighting: Use good lighting on yourself for better person segmentation
Movement: Move slowly and deliberately; ARKit tracking works best with smooth movements
Framing: Position your iPhone 3-6 feet away for the best perspective
Performance: Close other apps on your iPhone for smoother capture
Quest Browser: Use the Quest browser in VR mode (not desktop mode) for ImmersivePoints
Use Cases for Mixed Reality Capture
Research Presentations: Show your team exploring LiDAR scans or molecular structures
Educational Content: Create tutorials demonstrating how to navigate 3D data
Conference Demos: Record demonstrations for remote presentations
Social Sharing: Share your VR explorations on YouTube, Twitter, LinkedIn
Documentation: Document analysis workflows for papers or reports
Troubleshooting
Can't find Quest device?
• Ensure both devices are on the same WiFi network
• Restart RealityMixer app
• Restart Quest headset
• Check that Developer Mode is enabled
Laggy or choppy video?
• Use 5GHz WiFi for better bandwidth
• Reduce point cloud size (subsample to fewer points)
• Close background apps on iPhone
Person segmentation not working?
• Improve lighting conditions
• Move to a less cluttered background
• Ensure iPhone has A12 chip or newer
Calibration issues?
• Hold still during calibration
• Ensure Quest headset is clearly visible
• Try recalibrating from a different angle
Alternative: SideQuest Workaround
Due to Android 12 compatibility issues on newer Quest firmware, you may need to use SideQuest
with the Mixcast Quest Bridge as an intermediate step.
See the RealityMixer README for detailed instructions.
View RealityMixer on GitHub →
Download on App Store →
💡 More Projects Coming Soon
We're working on practical examples for these use cases. Stay tuned!
🧠 Medical Imaging: Walk Through Brain Scans
Category: Healthcare & Medicine
Medical
Neuroscience
Diagnostics
Convert MRI, CT, or PET scan voxel data into point clouds. Doctors can literally walk through a patient's
anatomy at human scale, viewing tumors, blood vessels, or neural pathways from any angle.
Example: Color neural tissue by activation level from fMRI data, or blood vessels by
flow rate. Surgeons can plan complex procedures by examining anatomy in true 3D before surgery.
Particularly valuable for teaching medical students, explaining procedures to patients, and collaborative
surgical planning across remote locations.
🤖 Project 4: ML Embeddings Visualization
Machine Learning
Data Science
Embeddings
Reduce high-dimensional data to 3D using t-SNE, UMAP, or PCA and debug your ML models in VR.
Visualize MNIST digits, find misclassifications, and understand your model's decision boundaries.
What's Included:
• t-SNE, UMAP, and PCA dimensionality reduction
• Color by: class labels, prediction correctness, confidence
• Works with MNIST, Fashion-MNIST, or your own embeddings
• Train a classifier and visualize where it fails
Real Dataset: Scikit-learn digits (included), MNIST (70K samples)
Get Started:
Open the Jupyter Notebook →
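The projection step can be sketched with plain numpy: PCA is just an SVD of the centered data, and the top-3 components give the 3D coordinates. The notebook itself may use scikit-learn's PCA/t-SNE or umap-learn, which share the same idea; the "embeddings" below are synthetic clusters, not MNIST.

```python
import numpy as np

rng = np.random.default_rng(7)

# Fake "embeddings": 1500 samples, 64 dimensions, three synthetic classes
n_per_class, dim = 500, 64
centers = rng.normal(scale=5.0, size=(3, dim))
X = np.concatenate([c + rng.normal(size=(n_per_class, dim)) for c in centers])
labels = np.repeat([0, 1, 2], n_per_class)

# PCA to 3D: center, then project onto the top-3 right singular vectors
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords_3d = Xc @ Vt[:3].T               # (1500, 3)

# One RGB color per class label
palette = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
colors = palette[labels]
print(coords_3d.shape, colors.shape)  # -> (1500, 3) (1500, 3)
```

Coloring by prediction correctness instead of class label is the same `palette[...]` lookup, just indexed by a boolean correct/incorrect array, which is what makes misclassified points pop out in VR.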
🏺 Archaeological Site Reconstruction
Category: History & Archaeology
Archaeology
Cultural Heritage
Museums
Transform photogrammetry or LiDAR scans of archaeological sites into explorable VR experiences. Color
artifacts by age, material, or excavation layer.
Museum application: Visitors can explore Pompeii, Angkor Wat, or Egyptian tombs in VR,
with points colored by historical period or architectural element. Access restricted areas safely.
Document sites threatened by climate change or conflict. Share discoveries with researchers worldwide
before physical artifacts deteriorate.
📈 Financial Data Landscapes
Category: Finance & Business
Finance
Trading
Analytics
Create 3D landscapes where X/Y represent different securities or market sectors, Z represents performance,
and color represents volatility or trading volume.
Trading floor application: View your entire portfolio as a 3D landscape. Peaks are
winners, valleys are losers, color intensity shows risk level. Spot correlations and anomalies instantly.
Visualize correlation matrices, risk profiles, or time-series data in ways that reveal hidden patterns.
Perfect for algorithmic trading analysis or portfolio risk assessment.
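The X/Y/Z/color mapping described above might look like this sketch, with all numbers synthetic: tickers laid out on a sector grid, returns as height, volatility driving the color.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sectors, per_sector = 10, 20

# x = sector index, y = rank within sector, z = return (percent)
x, y = np.meshgrid(np.arange(n_sectors), np.arange(per_sector), indexing="ij")
returns = rng.normal(scale=5.0, size=(n_sectors, per_sector))
volatility = np.abs(rng.normal(scale=2.0, size=(n_sectors, per_sector)))

# Red = high volatility, blue = calm
t = volatility / volatility.max()
colors = np.stack([t, np.zeros_like(t), 1 - t], axis=-1)

points = np.stack([x, y, returns], axis=-1).reshape(-1, 3)
colors = colors.reshape(-1, 3)
print(points.shape)  # -> (200, 3)
```

Swapping `returns` for a rolling correlation or drawdown series reuses the same landscape layout for other risk views.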
🌊 Ocean and Atmospheric Data Visualization
Category: Environmental Science
Climate
Oceanography
Weather
Visualize 3D oceanographic or atmospheric data. Each point could represent a measurement location,
colored by temperature, salinity, pressure, or pollution levels.
Climate research: Explore ocean currents in 3D, seeing how temperature layers move.
Visualize hurricane structure from the inside, or track microplastic distribution in the ocean depths.
Perfect for climate modeling, weather pattern analysis, and communicating environmental data to
policymakers and the public in intuitive ways.
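A minimal version of the measurement-point idea, using a synthetic water column with a thermocline (real profiles might come from Argo floats or similar sources; the temperature model here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000
x = rng.uniform(0, 100, n)        # km east
y = rng.uniform(0, 100, n)        # km north
depth = rng.uniform(0, 1000, n)   # metres below surface

# Temperature: warm mixed layer, sharp drop across a ~100 m thermocline
temp = 4.0 + 16.0 / (1.0 + np.exp((depth - 100.0) / 30.0))  # deg C

t = (temp - temp.min()) / np.ptp(temp)
colors = np.stack([t, np.zeros_like(t), 1 - t], axis=1)     # warm=red, cold=blue
points = np.stack([x, y, -depth], axis=1)                   # depth rendered downward
print(points.shape)  # -> (5000, 3)
```

Walking "down" through the cloud makes the thermocline visible as an abrupt red-to-blue boundary, which is the kind of intuition a 2D section plot struggles to convey.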
🎨 Generative Art and Audio Visualization
Category: Art & Entertainment
Art
Music
Generative
Create point cloud sculptures that respond to audio, mathematical functions, or generative algorithms.
Transform music into 3D visual experiences where people can walk through sound.
Music visualization: Map frequency to X, time to Y, amplitude to Z, and timbre to color.
Create explorable sculptures of songs. Perfect for music therapy, live performances, or art installations.
Generate fractals, L-systems, or particle simulations as point clouds. Create VR art galleries where each
piece is an explorable 3D data sculpture.
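The frequency/time/amplitude mapping can be sketched end to end with numpy alone: synthesize a chirp, take short-time FFTs, and keep the loudest bins as points. The frame size, hop, and loudness cutoff below are illustrative choices.

```python
import numpy as np

sr = 8000
t = np.arange(0, 2.0, 1 / sr)
signal = np.sin(2 * np.pi * (200 + 400 * t) * t)   # rising chirp

frame, hop = 512, 256
points = []
for start in range(0, len(signal) - frame, hop):
    spectrum = np.abs(np.fft.rfft(signal[start:start + frame]))
    freqs = np.fft.rfftfreq(frame, 1 / sr)
    loud = spectrum > 0.2 * spectrum.max()          # keep prominent bins only
    for f, a in zip(freqs[loud], spectrum[loud]):
        points.append((f, start / sr, a))           # x=freq, y=time, z=amplitude

points = np.array(points)
print(points.shape[1])  # -> 3
```

Feeding in a real song instead of a chirp (e.g. decoded to mono samples) produces a ridge for each melodic line, which is what makes the sculpture walkable.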
🔗 Social Network and Graph Visualization
Category: Network Analysis
Networks
Social Media
Graph Theory
Use force-directed graph layout algorithms to position social network nodes in 3D space. Each person is
a point, colored by community, influence, or activity level.
Business intelligence: Visualize customer networks, organizational structures, or supply
chains. Walk through your network to discover hidden influencers, bottlenecks, or communities.
Applications include social media analysis, fraud detection networks, citation graphs in research, or
analyzing communication patterns in large organizations.
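A minimal 3D force-directed layout (Fruchterman-Reingold flavour) fits in a few lines of numpy; a real project might instead use networkx's spring layout with `dim=3`. The graph, force constants, and iteration count below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 30
# A ring plus longer-range chords stands in for a social graph
edges = [(i, (i + 1) % n) for i in range(n)] + [(i, (i + 5) % n) for i in range(n)]

pos = rng.normal(size=(n, 3))
for _ in range(200):
    # Repulsion between every pair of nodes (inverse-square)
    delta = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(delta, axis=-1) + 1e-6
    repulse = (delta / dist[..., None] ** 3).sum(axis=1) * 0.1
    # Spring attraction along edges
    attract = np.zeros_like(pos)
    for i, j in edges:
        d = pos[j] - pos[i]
        attract[i] += 0.05 * d
        attract[j] -= 0.05 * d
    # Cap the step size so the layout stays stable
    pos += np.clip(repulse + attract, -1.0, 1.0)

print(pos.shape)  # -> (30, 3)
```

The third dimension is the payoff: clusters that overlap in a 2D drawing separate cleanly when you can walk around the graph.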
⚽ Project 5: Sports Analytics & Player Tracking
Sports
Analytics
Gaming
Visualize player movement and game events in 3D space with time as the vertical axis.
Create tactical heatmaps, analyze player trajectories, and explore game strategy in VR.
What's Included:
• Load StatsBomb Open Data (World Cups, Champions League)
• Generate synthetic player tracking data
• 3D trajectory view (time as Z-axis)
• 2D heatmap view (position density)
Real Datasets: StatsBomb Open Data (free soccer event data), Metrica Sports samples
Get Started:
Open the Jupyter Notebook →
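The "time as Z-axis" trick can be sketched with a synthetic run across a standard 105 x 68 m pitch; StatsBomb supplies event data and Metrica frame-by-frame positions, but the smooth random walk and Z scale below are stand-ins.

```python
import numpy as np

rng = np.random.default_rng(11)
n_frames = 500                       # e.g. 20 s at 25 fps
t = np.linspace(0, 20, n_frames)     # seconds

# Smooth random walk standing in for real player tracking data
steps = rng.normal(scale=0.15, size=(n_frames, 2)).cumsum(axis=0)
xy = np.clip(steps + [52.5, 34.0], [0, 0], [105, 68])   # keep on the pitch

z_scale = 0.5                        # metres of height per second of play
points = np.column_stack([xy, t * z_scale])

# Color from kickoff (blue) to the final frame (red)
frac = t / t.max()
colors = np.stack([frac, np.zeros_like(frac), 1 - frac], axis=1)
print(points.shape)  # -> (500, 3)
```

Stacking all 22 players this way turns a match sequence into a braid of trajectories; collapsing Z back to zero recovers the familiar 2D heatmap.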
🚀 Ready to Try These Projects?
All projects include:
- ✅ Complete Jupyter notebooks with working code
- ✅ Links to real, publicly available datasets
- ✅ Step-by-step instructions
- ✅ Inline visualization + VR links
```shell
# Install ImmersivePoints
pip install immersivepoints

# Clone the repository
git clone https://github.com/rmeertens/ImmersivePoints.git
cd ImmersivePoints/examples

# Pick a project and start Jupyter
cd 01_molecular_structures
jupyter notebook molecular_visualization.ipynb
```
📂 Browse All Projects
📤 Upload Your Own Data