Heavily inspired by the Nerfstudio Google Colab notebook, this is a guide on how to train your first NeRF using Google's cloud computers.
Then go to Google Colab and start a new notebook.
Tip
If you are going to process video into stills, first connect to Colab with a CPU runtime to avoid draining your credits unnecessarily, since COLMAP doesn't make good use of the GPU on Colab.
In fact, if you have a moderately good processor, do the data processing locally instead. Here's a guide on how to install Nerfstudio with conda on your computer.
Connect with an A100, then an L4 as your second choice, and finally a T4 if the others are unavailable.
First, install the dependencies. Then mount Google Drive, where I have made sure to include my training video.
Run this to install the dependencies:
# @markdown <h1>Install Nerfstudio and Dependencies (~8 min)</h1>
# Change to content directory
%cd /content/
# Clean up existing installations
!pip uninstall -y torch torchvision nerfstudio
# Update and install Python 3.10
!apt-get update
!apt-get install -y python3.10 python3.10-venv python3.10-dev
# Create a Python 3.10 virtual environment
!python3.10 -m venv /content/nerfstudio_env
# Note: each ! command runs in its own subshell, so this activation does not
# persist to later cells; activate the venv again inside the xterm if needed
!source /content/nerfstudio_env/bin/activate
# Upgrade pip and install core dependencies
!pip install --upgrade pip
!pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118
# Install TinyCuda
!gdown "https://drive.google.com/u/1/uc?id=1-7x7qQfB7bIw2zV4Lr6-yhvMpjXC84Q5&confirm=t"
!pip install tinycudann-1.7-cp310-cp310-linux_x86_64.whl
# Install COLMAP
!apt-get install -y colmap
# Install additional requirements
!pip install colab-xterm
!pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
# Install Nerfstudio
!pip install git+https://github.com/nerfstudio-project/nerfstudio.git
# Downgrading Numpy
!pip install numpy==1.26.4
# Mount Google Drive
from google.colab import drive
drive.mount('/content/drive')
print("Installation complete. Environment is ready for Nerfstudio usage.")
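Before moving on, it's worth confirming that the key packages actually import in the runtime. A minimal sanity-check sketch (the package list is my assumption about what matters here):

```python
import importlib.util

def check_installed(packages):
    """Report which of the given packages can be imported in this environment."""
    return {pkg: importlib.util.find_spec(pkg) is not None for pkg in packages}

# Verify the core dependencies before processing any data
for pkg, ok in check_installed(["torch", "torchvision", "numpy"]).items():
    print(f"{pkg}: {'OK' if ok else 'MISSING'}")
```

If anything shows MISSING, re-run the install cell (or restart the runtime) before continuing.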
To process the video into stills:
# @markdown <h1>Process Video for NeRF Training with GPU Support</h1>
# @markdown <h3>Enter the path to your video in Google Drive</h3>
# Set video path
video_path = "/content/drive/MyDrive/NeRF_datasets/VR - NeRF - TEST.mp4" # @param {type:"string"}
output_dir = "/content/data/nerfstudio/custom_data"
%cd /content/
!pip install colab-xterm
%load_ext colabxterm
%env TERM=xterm
from IPython.display import clear_output
clear_output(wait=True)
%xterm
# Install xvfb for display handling
!apt-get install xvfb -y
import os
from google.colab import drive

def mount_drive():
    """Mount Google Drive if not already mounted."""
    if not os.path.exists('/content/drive'):
        drive.mount('/content/drive')

# Mount Google Drive if not already mounted
mount_drive()

# Process the video
if os.path.exists(video_path):
    # Create the processing command with xvfb-run
    colmap_command = f'xvfb-run -a ns-process-data video ' \
                     f'--data "{video_path}" ' \
                     f'--output-dir "{output_dir}" ' \
                     f'--verbose ' \
                     f'--num-frames-target 300 ' \
                     f'--camera-type perspective'
    print("\nCopy and paste the following command into the xterm terminal:")
    print("\n" + colmap_command + "\n")

    # Check for transforms.json and provide the training command
    if os.path.exists(f"{output_dir}/transforms.json"):
        train_command = (
            f'ns-train nerfacto --viewer.websocket-port 7007 '
            f'--viewer.make-share-url True --data "{output_dir}"'
        )
        print("\nOnce processing is complete, run this training command:")
        print("\n" + train_command + "\n")
    else:
        print("\nAfter running the processing command, check if transforms.json was created.")
        print("Note: The GPU should be automatically used if available.")
else:
    print(f"Error: Could not find video file at {video_path}")
    print("Please check the path and ensure the file exists in your Google Drive")
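The `--num-frames-target 300` flag tells `ns-process-data` to sample roughly 300 frames spread across the video rather than every frame. A minimal sketch of that even-spacing idea (my own illustration, not Nerfstudio's actual implementation):

```python
def target_frame_indices(total_frames, target):
    """Pick `target` frame indices evenly spaced across a video.

    Illustrates the spirit of --num-frames-target; not Nerfstudio's code.
    """
    if target >= total_frames:
        return list(range(total_frames))
    step = total_frames / target
    return [int(i * step) for i in range(target)]

# e.g. a 5-minute clip at 30 fps has ~9000 frames; targeting 300
# keeps roughly every 30th frame
indices = target_frame_indices(9000, 300)
```

Fewer, well-spread frames keep COLMAP's matching tractable while still covering the whole camera path.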
To process data locally:
ns-process-data images \
--data "/Users/miro/Documents/Work/_Global/NeRF_Within/data/250210-VR_NeRF_test/images" \
--output-dir "/Users/miro/Documents/Work/_Global/NeRF_Within/data/250210-VR_NeRF_test/processed" \
--verbose
Warning
Make sure to avoid any spaces in the path, or else you will get an error. Replace spaces with underscores.
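If your folders already contain spaces, a small helper like this can suggest a cleaned-up name (a hypothetical helper of my own, not part of Nerfstudio):

```python
from pathlib import Path

def sanitize_name(path_str):
    """Replace spaces with underscores in every path component.

    A hypothetical convenience helper; rename the actual files/folders
    on disk to match before running ns-process-data.
    """
    parts = [part.replace(" ", "_") for part in Path(path_str).parts]
    return str(Path(*parts))

print(sanitize_name("/content/data/VR - NeRF - TEST.mp4"))
```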
Then, to train the scene from the processed data (extracted frames) via nerfacto:
# @markdown <h1>Start Training</h1>
%cd /content
!pip install colab-xterm
%load_ext colabxterm
%env TERM=xterm
import os
from IPython.display import clear_output

video_path = "/content/drive/MyDrive/NeRF_datasets/250214-SG-DesksBank" # @param {type:"string"}
clear_output(wait=True)

if os.path.exists(f"{video_path}/transforms.json"):
    print(
        "\033[1m"
        + "Copy and paste the following command into the terminal window that pops up under this cell."
        + "\033[0m"
    )
    print(
        f"ns-train nerfacto --viewer.websocket-port 7007 --viewer.make-share-url True nerfstudio-data --data {video_path}"
    )
    print()
    %xterm
else:
    from IPython.core.display import HTML, display
    display(HTML('<h3 style="color:red">Error: Data processing did not complete</h3>'))
We can train with nerfacto or instant-ngp, though nerfacto is preferred.
To keep the Colab session alive while training, I periodically run:
!nvidia-smi
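If you'd rather not run the cell by hand, something like this loop can poll the GPU at intervals (a helper of my own; that periodic cell activity helps avoid idle disconnects is an assumption, not a Colab guarantee):

```python
import subprocess
import time

def keep_alive(cmd=("nvidia-smi",), interval_s=600, iterations=6, sleep=time.sleep):
    """Run `cmd` every `interval_s` seconds; return the number of successful runs.

    A hypothetical keep-alive helper: polling nvidia-smi also lets you watch
    GPU memory usage while ns-train runs in the xterm.
    """
    ok = 0
    for _ in range(iterations):
        try:
            if subprocess.run(list(cmd), capture_output=True).returncode == 0:
                ok += 1
        except FileNotFoundError:
            pass  # nvidia-smi not present (e.g. on a CPU runtime)
        sleep(interval_s)
    return ok
```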