Point Cloud Visualizer
How to install Optional Libraries?
- please see compatibility notes for your platform in documentation
- go to Preferences > Add-ons and find PCV
- expand to see PCV preferences
- click the `Install LIBRARY_NAME` button
- wait until Blender is responsive again (if you start Blender from the command line, you can observe the progress)
- restart Blender after each installed library
How to update to a new PCV version?
- download the latest version `point_cloud_visualizer-###.zip`
- start Blender, go to Preferences > Add-ons and find PCV in the list
- expand it by clicking the triangle in the left corner
- disable PCV by unchecking the checkbox next to the addon name
- click the `Remove` button
- click the `Save Preferences` button at the bottom (skip if you have `Auto-Save Preferences` enabled)
- quit Blender
- start Blender
- install the new PCV version and enable it
Installation
- download the latest version `point_cloud_visualizer-###.zip`
- install it as a regular Blender addon: https://docs.blender.org/manual/en/latest/editors/preferences/addons.html#installing-add-ons
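If you prefer to install from a script instead of the Add-ons UI (for example on several machines), here is a minimal sketch using Blender's standard operators; the zip path is a placeholder and the module name `point_cloud_visualizer` is an assumption, check the actual addon module name after a manual install:

```
import bpy

# placeholder path to the downloaded zip; the module name below is an assumption
zip_path = "/path/to/point_cloud_visualizer-###.zip"
bpy.ops.preferences.addon_install(filepath=zip_path)
bpy.ops.preferences.addon_enable(module="point_cloud_visualizer")
bpy.ops.wm.save_userpref()
```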
Is there a trial version?
- no, sorry, it is not possible
Large Datasets
- currently, PCV needs to load all points from the source file and keep them in system memory
- during loading, system memory usage will peak quite high, at 4 times (and in some cases more) the runtime usage
- for display, PCV needs to upload all points that are going to be displayed (you can control the amount with the PCV > Display > Percentage slider) to GPU memory
- to determine the approximate maximum number of points your GPU can display at once you can use the calculator in PCV preferences (for example an 8GB GPU can display ~300M points with the default shader and disabled scalars), the calculator formula is simple:

```
ram = 8192  # MB
b = (1024 * 1024) * ram  # bytes
# default shader uses 3x float32 for point location and 4x float32 for color (rgba)
# float32 takes 4 bytes, hence 3*4 + 4*4 bytes
n = int(b / (12 + 16))
print(n)  # 306783378
```

- trying to display more points (i.e. uploading more data than GPU memory can contain) will result in a Blender crash or freeze
- when extremely big data needs to be loaded and you do not need to work with exactly all points, the alternative loading methods `Every Nth` or `Slice` can be used to reduce the number of points during loading (illustrated in the sketch below)
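As a rough illustration of what these two reduction modes mean conceptually (this is not PCV's internal code, and the array and parameters are made up):

```
import numpy as np

points = np.random.rand(1_000_000, 3)  # hypothetical full point array

# "Every Nth": keep every n-th point
n = 10
every_nth = points[::n]

# "Slice": keep a contiguous range of points
start, stop = 200_000, 400_000
sliced = points[start:stop]

print(every_nth.shape, sliced.shape)  # (100000, 3) (200000, 3)
```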
Platform Compatibility
- works on Windows, Linux and macOS
- see Optional Libraries Compatibility for more details on 3rd party libraries
Blender Compatibility
- PCV 3.0 is for Blender 3.6 LTS and later
About Optional Libraries
- NONE of them is required for regular use; the majority of PCV functionality has no dependencies
- functionality that depends on any library is marked in the PCV panel with a plugin icon
- all of them are installed from the Python Package Index (PyPI) using the `pip` command
- libraries are not installed automatically, you need to open PCV preferences and click each install library button
- they are installed to the user `site-packages` directory as defined in `site.getusersitepackages()`; the exact location differs for each platform, and if you click the `Read Me First` button in PCV preferences, it will print out the path to the user site-packages directory (see the sketch below)
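If you want to check that location yourself, here is a minimal sketch you can paste into Blender's Python console; it only calls the standard-library function mentioned above, and the printed path depends on your platform and Python version:

```
import sys
import site

print(sys.version)                 # Python bundled with Blender
print(site.getusersitepackages())  # user site-packages directory used for the libraries
```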
Where are Optional Libraries used?
- `Open3D`: select filters
- `laspy`: import/export LAS files
- `lazrs`: import/export LAZ files
- `laszip`: import/export LAZ files
- `pye57`: import/export E57 files
- `PyMeshLab`: import E57 files and select filters
- `SciPy`: select filters
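A minimal sketch to check which of these optional libraries are currently importable from Blender's Python; the module names used here are the usual PyPI import names and are my assumption, not taken from PCV itself:

```
import importlib.util

modules = ["open3d", "laspy", "lazrs", "laszip", "pye57", "pymeshlab", "scipy"]
for name in modules:
    found = importlib.util.find_spec(name) is not None
    print(f"{name}: {'available' if found else 'not installed'}")
```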
Optional Libraries Compatibility
- Windows
    - do NOT install `Open3D` and `PyMeshLab` together, Blender will crash when one and then the other is used (see documentation workaround)
    - do NOT install `Open3D` and `pye57` together, Blender will crash when one and then the other is used (see documentation workaround)
    - do NOT install `Open3D` and `laspy` + `laszip` together, Blender will crash when one and then the other is used (see documentation workaround)
    - PCV will prevent crashes by default by monitoring which library has been used in the current Blender session and not allowing the problematic operation to be run; this behavior can be turned off in PCV preferences
- macOS
    - on Apple Silicon you need to use the Blender Intel build to install and use `PyMeshLab`
    - `pye57` has to be built from source, see the step by step guide in the PCV full documentation
- Linux
    - no known limitations/problems
Where is documentation?
- latest documentation is always here: Point Cloud Visualizer 3.0 Documentation
I got warning icons and `Missing vertex normals`, `Missing vertex colors` and `Missing scalars` messages in `Display` panel
- loaded data does not have normals, colors or scalar fields, so the affected shading options cannot work properly; missing values are substituted with default normals or colors (exact values are in PCV > Load > General)
How to render points in Cycles or Eevee (with colors)?
- points need to be converted to a Blender native data type that can be rendered by the engine; the conversion target is mesh vertices with Geometry Nodes turning them into instances or points, see the PCV > Convert panel (and the sketch after this list)
- converting points will create suitable color data and add a basic material using it, so colors are preserved and available for the render engine
- mesh instances can be rendered in Cycles and Eevee, points are supported only by Cycles
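As a rough illustration of what such a conversion produces, a mesh whose vertices are the points and whose colors live in a point-domain color attribute, here is a minimal bpy sketch with made-up example data; PCV's Convert panel is the supported way to do this, the sketch only shows the resulting data layout:

```
import bpy
import numpy as np

# hypothetical example data: N points with XYZ positions and RGBA colors in 0..1
n = 1000
positions = np.random.rand(n, 3).astype(np.float32)
colors = np.random.rand(n, 4).astype(np.float32)

# build a vertices-only mesh and store colors as a point-domain color attribute
mesh = bpy.data.meshes.new("ConvertedPoints")
mesh.from_pydata(positions.tolist(), [], [])
attr = mesh.color_attributes.new(name="Col", type='FLOAT_COLOR', domain='POINT')
attr.data.foreach_set("color", colors.ravel())

obj = bpy.data.objects.new("ConvertedPoints", mesh)
bpy.context.collection.objects.link(obj)
# a Geometry Nodes modifier can then instance geometry on these vertices and a
# material reading the "Col" attribute makes the colors available to the render engine
```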
Points are missing when blend file is saved and reopened?
- PCV does not store loaded points in the blend file because there is no suitable data type available; only the PCV settings and the path to the data file are stored, so when the blend file is reopened, points need to be loaded or imported from the linked file again by clicking the `Draw` button
- if you are working with multiple PLY clouds at once, you can use the PCV 3D viewport panel (viewport header, top right corner: cloud icon) to `Draw` all PCV instances in the scene at once
- any changes to points made with PCV have to be saved to PLY using the Export panel before quitting Blender or loading another blend file, otherwise they will be lost
- another option is the `Pack` option that will automatically convert points to a hidden mesh datablock on blend file save and restore them on blend file load, but since Blender uses only single precision floating point numbers for vertex positions, geolocated point data will lose its precision and cannot be moved exactly back; if you don't need to export points back to "world" locations (i.e. LAS/LAZ or E57 files) you can ignore this limitation (see the sketch below)
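To see why single precision matters for geolocated data, here is a small illustration; the coordinate value is a made-up UTM-like easting, not from any real dataset:

```
import numpy as np

x = 4521987.123        # hypothetical easting in metres
x32 = np.float32(x)
print(x32)             # 4521987.0 - the sub-metre detail is rounded away
print(float(x32) - x)  # roughly -0.123 m of error that cannot be recovered
```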