Brekel PointCloud v3 – FAQ

  • What are the minimum system requirements?

This depends on the number and type of sensors you want to use.
You can find more info in the documentation included with the app.

 

  • Can I mix & match different sensor types/brands?

Yes, definitely: the solvers were designed to handle data from different sensor types.
In fact, combining different sensor types can be beneficial in some cases, since each type delivers pointcloud data with different noise/accuracy characteristics.

 

  • How many sensors should I use?

The software works with one or more sensors (as many as your hardware can handle).
Adding additional sensors with different viewpoints can increase quality, since each extra viewpoint sees parts of the subject that are occluded from another sensor.
So basically the more the merrier.
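
As an illustration of why extra viewpoints help: once each sensor’s calibration (its extrinsic matrix) is known, every cloud can be transformed into a shared world frame and merged, so areas hidden from one sensor are filled in by another. The sketch below is a minimal, hypothetical Python/NumPy example; the matrices, array layouts and function names are illustrative assumptions, not Brekel’s internal code.

```python
# Illustrative sketch (not Brekel's internal code): merging point clouds from
# several calibrated sensors into one world-space cloud.
import numpy as np


def to_world(points_xyz: np.ndarray, extrinsic: np.ndarray) -> np.ndarray:
    """Transform Nx3 sensor-space points into world space with a 4x4 extrinsic matrix."""
    homogeneous = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    return (homogeneous @ extrinsic.T)[:, :3]


def merge_clouds(clouds, extrinsics):
    """Concatenate per-sensor clouds after moving them into the shared frame."""
    return np.vstack([to_world(c, e) for c, e in zip(clouds, extrinsics)])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two hypothetical sensors seeing different sides of the same subject.
    front_cloud = rng.normal(size=(1000, 3))
    back_cloud = rng.normal(size=(1000, 3))
    # Second sensor: rotated 180 degrees around Y and placed 2 m along Z.
    front_extrinsic = np.eye(4)
    back_extrinsic = np.array([[-1.0, 0.0,  0.0, 0.0],
                               [ 0.0, 1.0,  0.0, 0.0],
                               [ 0.0, 0.0, -1.0, 2.0],
                               [ 0.0, 0.0,  0.0, 1.0]])
    merged = merge_clouds([front_cloud, back_cloud],
                          [front_extrinsic, back_extrinsic])
    print(merged.shape)  # (2000, 3): one cloud covering both viewpoints
```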

 

  • How many sensors can I use on a single computer?

This depends on your machine’s USB bandwidth and CPU/GPU, and on the type of sensors used.

Kinect v2 (Xbox One) sensors have driver/SDK limitations restricting usage to a single sensor per machine.

For desktop machines you can add PCI-Express cards to expand your USB bandwidth.

You can also use additional machines connected over the network and use the sensors attached to them.

(see documentation for more information)

 

  • How much better is a setup with sensors A, B, C versus a setup with X, Y, or just a single sensor Z? (Substitute the letters with your favorite sensor brands/types.)

Generally speaking, multiple sensors are preferable since they see more angles of the subject and suffer fewer occlusions.
Newer sensor types almost always provide better/cleaner data than older ones, especially within the Kinect range.
I have most likely not tested your particular setup, so if you need a more specific answer, try it out with the trial and/or evaluation version.

 

  • Do I need multiple licenses when using multiple sensors/machines?

One “Multi Sensor” license allows you to connect to as many sensors/machines as you want.

When using multiple machines, the idea is to run the GUI on one machine and the headless/console version of the same app on the other machines, then use the “Network Sensor” option to receive data from the sensors connected to those networked machines.
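
To make the topology concrete, here is a minimal, hypothetical sketch of streaming pointcloud frames from a sender machine to a receiver over TCP. The port number, the framing (a 4-byte length prefix followed by float32 XYZ data) and the function names are assumptions for illustration only; the actual “Network Sensor” protocol used by the Brekel apps is not documented here and will differ.

```python
# Minimal, hypothetical sketch of a networked sensor feed over TCP.
# The framing and port are assumptions; the real Brekel protocol differs.
import socket
import struct

import numpy as np

PORT = 9999  # placeholder port, not the one the Brekel apps use


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes or raise if the sender disconnects."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("sender disconnected")
        buf += chunk
    return buf


def send_frame(sock: socket.socket, points_xyz: np.ndarray) -> None:
    """Send one frame as a 4-byte length prefix followed by float32 XYZ data."""
    payload = points_xyz.astype(np.float32).tobytes()
    sock.sendall(struct.pack("!I", len(payload)) + payload)


def recv_frame(sock: socket.socket) -> np.ndarray:
    """Receive one length-prefixed frame and decode it back to an Nx3 array."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    data = _recv_exact(sock, length)
    return np.frombuffer(data, dtype=np.float32).reshape(-1, 3)
```

In this sketch the receiving machine would accept one connection per sender and call recv_frame in a loop, while each sender calls send_frame once per captured frame.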

 

  • Is there interference between overlapping sensors?

Yes and no.
Structured Light sensors (like Kinect v1 & Orbbec Astra) can produce a bit more noise in overlapping areas.
Kinect v2 (time-of-flight) sensors can occasionally show some Z-wobble since they cannot be synchronized.
Azure Kinect sensors can be fully synchronized, reducing any interference.
Stereo sensors (like the Intel RealSense D400 series) generally don’t interfere.
In general, interference does not pose much of an issue for the solver.

 

  • Can sensors be synchronized?

Azure Kinect sensors have sync in/out ports on their backs (remove the cover) and can be daisy-chained using a simple 3.5mm audio jack cable; the Brekel app will automatically detect this and set things up accordingly.
Internally the software synchronizes all incoming data using timestamps.
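
As a rough illustration of timestamp-based matching (not necessarily how the software does it internally), the sketch below picks, for a given reference time, the frame with the closest timestamp from another sensor’s stream. The data and names are hypothetical.

```python
# Illustrative sketch of timestamp-based matching (not Brekel's internal logic):
# for a given reference time, pick the closest frame from a sensor's stream.
import bisect


def nearest_frame(timestamps, frames, t_ref):
    """Return the frame whose timestamp is closest to t_ref.

    `timestamps` must be sorted ascending and parallel to `frames`.
    """
    i = bisect.bisect_left(timestamps, t_ref)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t_ref))
    return frames[best]


if __name__ == "__main__":
    # Two hypothetical sensors running at slightly different offsets.
    ts_a = [0.000, 0.033, 0.066, 0.100]
    ts_b = [0.005, 0.038, 0.071, 0.104]
    frames_b = ["b0", "b1", "b2", "b3"]
    # For each frame time of sensor A, grab the closest frame of sensor B.
    for t in ts_a:
        print(t, "->", nearest_frame(ts_b, frames_b, t))
```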

 

  • How is v3 different from v1/v2?

v3 supports aligning & fusing data from multiple sensors, which can, depending on your setup, improve quality in occluded areas and/or increase the capture volume.
v3 has more output file format options, with possibly more coming in the future.
v3 is in active development.
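
To illustrate what “aligning” sensors means in principle (v3’s actual multi-sensor solver is not shown here), the sketch below uses the generic Kabsch / orthogonal Procrustes method to recover the rigid rotation and translation that maps one sensor’s points onto corresponding points seen by another. All names and data are illustrative assumptions.

```python
# Generic rigid-alignment sketch (Kabsch / orthogonal Procrustes); shown only to
# illustrate what "aligning" two sensors means -- not v3's actual solver.
import numpy as np


def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t so that R @ src[i] + t ~= dst[i].

    `src` and `dst` are Nx3 arrays of corresponding points.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    a = rng.normal(size=(500, 3))
    # Build a known rotation (30 degrees around Z) and offset, then recover them.
    angle = np.deg2rad(30)
    r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    b = a @ r_true.T + np.array([0.5, -0.2, 1.0])
    r_est, t_est = rigid_align(a, b)
    print(np.allclose(r_est, r_true, atol=1e-6), t_est)
```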

 

  • Are there upgrade discounts for v1/v2 license owners?

Yes, of course. You can find more info about updates & upgrades here.

 

  • Why are there no sample files?

Data quality depends on, for example, how many sensors you use, their brand/type, how you set them up in terms of angle/distance to the subject, and whether or not you use the deep tracker.
Due to these variables it’s best to try things out yourself for your particular setup.
