
Streamlining Your Agisoft Metashape Workflow with Python: Part III


Chapter 1: Introduction to Dense Point Clouds

In this third installment of my series on optimizing your Agisoft Metashape process with Python, we will focus on constructing and refining dense point clouds. The initial article introduced project setup and image alignment, while the second part showcased iterative gradual selection to filter out low-accuracy tie points.

Once we have eliminated these inaccurate tie points, we can proceed to compute depth maps, which lead to the creation of the dense point cloud. Any outliers that were missed during the gradual selection can be addressed with a confidence filter. This segment will outline the necessary background and provide Python code to automate this operation.

What is a Dense Point Cloud?

To summarize, Agisoft Metashape generates a dense point cloud from the initial sparse point cloud, accurate camera alignments, depth estimation, and point interpolation. This method effectively fills in the voids between sparse points, yielding a detailed and high-resolution 3D representation of the analyzed scene or object. Aashutosh Pyakurel offers a comprehensive explanation of this procedure in his article about OpenMVS, although the principles are analogous.

Why Generate a Dense Point Cloud?

After filtering out low-accuracy tie points, the sparse point cloud may appear quite sparse. The image below starkly contrasts the sparse point cloud (left) with the dense point cloud (right). From an aesthetic standpoint, particularly when creating 3D assets for gaming or virtual reality applications, the sparse point cloud is insufficiently detailed. In drone surveys, the dense point cloud is essential for calculating the digital elevation model (DEM) and subsequently creating the orthomosaic.

Comparison of Sparse vs Dense Point Cloud

Agisoft Metashape Parameters

As with every processing step within Metashape, specific parameters must be established. In this case, we need to define parameters for the functions chunk.buildDepthMaps and chunk.buildPointCloud. Let’s explore the parameters for each function before delving into the code.

chunk.buildDepthMaps

According to the Agisoft Metashape Python Reference Manual, the following parameters can be provided to the chunk.buildDepthMaps function:

  • downscale (int): Quality of the depth map.
  • filter_mode (FilterMode): Mode for filtering the depth map.
  • cameras (list of int): List of cameras to be processed.
  • reuse_depth (bool): Option to reuse existing depth maps.
  • max_neighbors (int): Maximum number of neighboring images for depth map generation.
  • subdivide_task (bool): Enable finer task subdivision.
  • workitem_size_cameras (int): Number of cameras in a workitem.
  • max_workgroup_size (int): Maximum size of the workgroup.
  • progress (Callable[[float], None]): Callback for progress tracking.

The most critical parameters are downscale, filter_mode, and reuse_depth. The remaining parameters can generally use their default values.

downscale

The downscale parameter corresponds to the quality setting in the graphical user interface, expressed as an integer. A little experimentation shows that Metashape expects the downscale value to be a power of two, although this is not stated explicitly in the manual. The following list maps the integer values passed to the chunk.buildDepthMaps function to the GUI quality settings:

  • 1 = Ultra High
  • 2 = High
  • 4 = Medium
  • 8 = Low
  • 16 = Lowest

For our case, we set downscale=2.
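
If you prefer to think in terms of the GUI names inside a script, a small lookup table keeps the mapping explicit. This is only an illustrative sketch; quality_to_downscale is our own helper dictionary, not part of the Metashape API:

# Illustrative helper: map GUI quality names to the downscale values listed above.
quality_to_downscale = {
    "ultra high": 1,
    "high": 2,
    "medium": 4,
    "low": 8,
    "lowest": 16,
}

downscale = quality_to_downscale["high"]  # 2, matching the setting used here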

filter_mode

The User Manual for Agisoft Metashape elaborates on the filter_mode parameter:

Due to various factors, such as noisy or poorly focused images, outliers may occur among the points. Metashape offers several built-in filtering algorithms designed to tackle the challenges posed by different projects. If crucial small details need to be preserved during reconstruction, it is advisable to use the Mild depth filtering mode to prevent important features from being misidentified as outliers. This setting is particularly useful in aerial projects, especially when dealing with poorly textured roofs.

For a drone survey, like the one being discussed, the mild depth filtering mode is optimal. This can be achieved by setting filter_mode=Metashape.MildFiltering.
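
For completeness, the other filtering strengths can be selected from Python as well. The sketch below assumes the usual FilterMode names from the Python API (NoFiltering, MildFiltering, ModerateFiltering, AggressiveFiltering); double-check them against your Metashape version:

# Map descriptive names to the Metashape depth filtering modes
# (enumeration values assumed from the Python API; verify for your version).
filter_modes = {
    "none": Metashape.NoFiltering,
    "mild": Metashape.MildFiltering,
    "moderate": Metashape.ModerateFiltering,
    "aggressive": Metashape.AggressiveFiltering,
}

filter_mode = filter_modes["mild"]  # the choice used for this drone survey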

reuse_depth

Referring again to the Agisoft Metashape User Manual:

Depth maps available in the chunk can be reused for the point cloud generation operation.

Enabling the reuse of depth maps for point cloud computation is accomplished by setting reuse_depth=True.
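
If you want the script to report whether depth maps are already present before reusing them, a small check works. This sketch assumes that chunk.depth_maps is None until depth maps have been generated:

# Assumption: chunk.depth_maps is None until depth maps exist in the chunk.
if chunk.depth_maps is not None:
    print("Existing depth maps found; reuse_depth=True will reuse them.")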

The function call then appears as follows:

chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering, reuse_depth=True)
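
Since the parameter list above also includes a progress callback, long-running builds can report their status as they go. A minimal sketch, treating the callback argument as a completion percentage:

def print_progress(p):
    # p is the progress value reported by Metashape (treated here as a percentage)
    print(f"Depth maps: {p:.1f}% done")

chunk.buildDepthMaps(downscale=2,
                     filter_mode=Metashape.MildFiltering,
                     reuse_depth=True,
                     progress=print_progress)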

chunk.buildPointCloud

The following parameters can be passed to the chunk.buildPointCloud function:

  • source_data (DataSource): Source data for point extraction.
  • point_colors (bool): Option to include color information for points.
  • point_confidence (bool): Option to calculate point confidence.
  • keep_depth (bool): Option to retain depth maps.
  • max_neighbors (int): Maximum number of neighbor images for depth map filtering.
  • uniform_sampling (bool): Option for uniform point sampling.
  • points_spacing (float): Desired spacing between points (in meters).
  • asset (int): Asset to be processed.
  • subdivide_task (bool): Enable finer task subdivision.
  • workitem_size_cameras (int): Number of cameras in a workitem.
  • max_workgroup_size (int): Maximum size of the workgroup.
  • progress (Callable[[float], None]): Callback for progress tracking.

The key parameters are point_colors, point_confidence, and keep_depth. These parameters are relatively straightforward. The point_colors parameter allows for the inclusion of color information from the images. Setting it to False can reduce processing time, but it is generally advisable to set it to True. The point_confidence parameter enables the filtering of the dense point cloud, aiding in the removal of any outliers that may remain. The keep_depth parameter ensures depth maps are retained, allowing DEM calculations from either the dense point cloud or the depth maps.

The function call then appears as follows:

chunk.buildPointCloud(point_colors=True, point_confidence=True, keep_depth=True)
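
Both calls can take a long time on larger projects, so it is worth saving the project after each step, just as the full script at the end of this article does:

chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering, reuse_depth=True)
doc.save()  # persist the depth maps before the dense cloud step

chunk.buildPointCloud(point_colors=True, point_confidence=True, keep_depth=True)
doc.save()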

Confidence Filter

The point confidence values obtained by setting point_confidence=True can be utilized to eliminate outliers after constructing the dense point cloud. These confidence values range from 0 to 255. Start by filtering out low-confidence points:

min_conf = 0
max_conf = 1

point_cloud = doc.chunk.point_cloud
point_cloud.setConfidenceFilter(min_conf, max_conf)
point_cloud.cropSelectedPoints()
point_cloud.setConfidenceFilter(0, 255)
point_cloud.compactPoints()

At this stage, only the points whose confidence falls within the chosen range are displayed after executing point_cloud.setConfidenceFilter(min_conf, max_conf). These points are then removed using point_cloud.cropSelectedPoints().

If it appears that all points have disappeared, don't worry! The command point_cloud.setConfidenceFilter(0, 255) resets the view to display the remaining points in the dense cloud.

Although it seems we are done, there’s one final step. The Agisoft User Manual states:

Deleting points via Crop Selection or Delete Selection tools merely invalidates them; the points remain stored in the project and can be restored. To permanently delete all selected points, use the Compact Point Cloud command from the Point Cloud submenu of the Tools menu and save the project.

In our Python script, we compact the point cloud with the following command:

point_cloud.compactPoints()

If we neglect to compact the dense point cloud, points that are no longer visible may still be considered during subsequent product generation, such as the digital elevation model and orthomosaic, which we will cover in the next article.
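
In a script, compacting pairs naturally with a save, mirroring the manual's advice to save the project after using the Compact Point Cloud command:

point_cloud.compactPoints()
doc.save()  # the manual recommends saving the project after compacting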

Python Code

The code below will facilitate the computation of the dense point cloud. Before executing the code in Agisoft Metashape, ensure that the bounding box encompasses the area of interest, as tie points outside this box will be disregarded.
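
Before running the script, you can sanity-check the bounding box from the Metashape console. The short sketch below reads the active chunk's region; note that region.center and region.size are expressed in the chunk's internal coordinate system:

import Metashape

chunk = Metashape.app.document.chunk  # currently active chunk
region = chunk.region                 # the bounding box shown in the GUI
print("Region center:", region.center)
print("Region size:  ", region.size)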

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import Metashape


def copy_new_chunk(chunk, old_label, new_label):
    """Copy a chunk and return the label of the copy."""
    new_chunk = chunk.copy()
    clbl_i = old_label + new_label  # e.g. "run2" + "-DC" -> "run2-DC"
    new_chunk.label = clbl_i
    return clbl_i


def find_chunk(doc, label):
    """Return the chunk with the given label, or None if it cannot be found."""
    if not isinstance(doc, Metashape.Document):
        print("Not a valid Metashape document!")
        return None
    if not isinstance(label, str):
        print("Please provide a string as input!")
        return None
    for chunk in doc.chunks:
        if chunk.label == label:
            print(f"Chunk {label} was found.")
            return chunk
    print(f"Chunk {label} was not found!")
    return None


# Main body of program
Projs = ["/data2/metashape_course/20230828_mcourse.psx"]

# chunk(s) to process within a project
clbls = ["run2"]

doc = Metashape.app.document

for proj in Projs:
    doc.open(proj)
    proj_name = proj.split("/")[-1]
    proj_path = "/".join(proj.split("/")[:-1])

    for clbl in clbls:
        for chunk in doc.chunks:
            if chunk.label != clbl:
                continue

            print("")
            print(f"Now processing project <{proj_name}>...")

            # find chunk by name and set it as the active chunk
            chunk = find_chunk(doc, clbl)
            doc.chunk = chunk

            # copy to a new chunk before proceeding
            dc_clbl = copy_new_chunk(chunk, clbl, "-DC")

            print("Now building depth maps")
            chunk = find_chunk(doc, dc_clbl)
            doc.chunk = chunk
            # downscale=2 corresponds to the High quality setting
            chunk.buildDepthMaps(downscale=2,
                                 filter_mode=Metashape.MildFiltering,
                                 reuse_depth=True)
            doc.save()

            print("Now building dense cloud")
            chunk.buildPointCloud(point_colors=True,
                                  point_confidence=True,
                                  keep_depth=True)
            doc.save()

            # filter dense cloud on a fresh copy of the chunk
            cln_clbl = copy_new_chunk(chunk, clbl, "-clean")
            chunk = find_chunk(doc, cln_clbl)
            doc.chunk = chunk

            point_cloud = doc.chunk.point_cloud
            point_cloud.setConfidenceFilter(0, 1)    # show only low-confidence points
            point_cloud.cropSelectedPoints()         # remove them
            point_cloud.setConfidenceFilter(0, 255)  # reset the filter
            point_cloud.compactPoints()              # permanently delete removed points
            doc.save()

Chapter 2: Conclusion

Now you can effectively leverage Python to automate the dense cloud processing stage in Agisoft Metashape. The following video illustrates the transition from the sparse point cloud generated by the initial alignment, through the thinned point cloud achieved via gradual selection, to the finalized dense point cloud.
