Dr Mike J Smith

Studio of Objects: External Camera Panoramic Spheres

Wed, 28 Jan 2015

Following on from the last blog post (Paolozzi's Studio of Objects), I wanted to touch on the external camera setup we used to integrate photographic imagery with the laser scans from the Leica P20. It's worth referring to the two "manuals" Leica have produced on using an external camera - they are helpful and give some good points on hardware, setup, software, processing and integration. The first (and older) is titled Cyclone External Camera Workflow-Nodal Ninja bracket, with the second (newer) titled Spherical Panoramas for HDS Point Clouds. In short though, the following setup is required:

1. We want to replace the scanner's internal camera with a better quality external camera and then import the imagery into Cyclone, allowing you to texture the point cloud.

2. To best achieve this, the focal centre of the lens must match the optical centre of the scanner - this enables accurate image-to-point-cloud texture mapping.

3. Once the laser scan is complete, the scanner is removed from the tribrach and a tripod head is attached in its place to mount the camera. This is actually two pieces of equipment:
a. Nodal Camera bracket: this allows you to precisely position the focal centre of the lens and is a standard camera accessory. Leica have recommended the Nodal Ninja 3 Mk2 in the past.
b. Spacer: this positions the Nodal Ninja in exactly the right location and is essentially a spacer with a tribrach adaptor. We used Red Door for both the adaptor (L2010 Tribrach Adapter for Leica Geosystems) and the spacer itself (L-N3-C-10 ScanStation Camera Adapter for Leica Geosystems). You (fairly obviously) need both bits - don't make the mistake I did!

4. Select the lens you want to use, with two aspects to consider:
a. Resolution: there is a trade-off between resolution and time. A wide angle lens will capture a panoramic sphere with relatively fewer photos (the reason Leica use a fisheye lens) but at the expense of resolution. There is a BIG caveat here: you need to convert the final panorama into a cubemap for import into Cyclone, and each cubemap "face" (or image) can have a maximum dimension of 4096x4096 pixels - the limit in Cyclone. With six faces of ~16MP each, the whole panorama is therefore limited to around 96MP. Given the scanning resolution of the P20 this is disappointing, as it limits the resolution benefits you can achieve from an external camera (although the dynamic range will be significantly better). I expect this to change in the future, not least because Leica's internal cameras will almost certainly improve - for the record, our panoramas with the 24mm lens were ~235MP! You also need to choose a lens suitable for your camera: this will almost certainly have either an FX (35mm) or DX (24mm) sensor, which applies a focal length multiplier to the image: 1x for the former and 1.5x for the latter. We selected 16mm fisheye and 24mm rectilinear lenses. When shooting, both lenses had a 30 degree horizontal rotation between shots (on the tripod) to ensure a large overlap between images. In addition, the 24mm lens required vertical rotations of +30 degrees and -15 degrees to give sufficient vertical coverage. For a full sphere you would also want to shoot vertically upwards (i.e. pointing at the ceiling) with 30 degree horizontal rotations, although we didn't in this instance.
b. Focus: objects "acceptably in focus" fall within the depth of field, which is determined by the focal length, aperture and focus setting. Each lens has a hyperfocal distance at which everything from roughly half that distance to infinity is in focus, typically requiring very small apertures (which increases softening in the image from diffraction). Where hyperfocal focusing isn't practical or possible, an alternative is to use a range of image processing techniques to stack multiple photos with different depths of field - this is known as focus stacking. Where you use a longer focal length for greater resolution, the depth of field is smaller and so focus stacking becomes more important. We used this technique with the 24mm lens - DoFMaster allows you to calculate the depth of field for objects at different distances, where you want a good overlap in DoF between focus settings.
c. Shooting: with the Nikon D810 mounted in the Nodal Ninja we shot in RAW-only mode to capture the "at sensor" data. The camera was set to manual focus and manual exposure. With the aperture at f/8 (based upon the DoF calculations) and ISO 100, metering gave an optimum shutter speed of 1/3s. The lens was pre-focused and a 2s timer used to release the shutter (essential at such a slow shutter speed). For the 24mm lens the focus settings were 0.8m and 2m - so two photos were shot at -15 degrees for each of the 12 horizontal rotations (i.e. every 30 degrees) and then repeated at +30 degrees, giving a total of 48 photos per scan location.
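The depth-of-field overlap behind those two focus settings can be sketched with the standard thin-lens approximations (the same maths DoFMaster uses). This is a rough check, not the tool we used: the circle-of-confusion value of 0.030mm is my assumption for a full-frame sensor like the D810's, and the function names are mine.

```python
# Hyperfocal distance and depth-of-field limits for the 24mm f/8 setup
# described above. The 0.030 mm circle of confusion is an assumed value
# commonly quoted for full-frame (FX) sensors.

def hyperfocal(f_mm, aperture, coc_mm=0.030):
    """Hyperfocal distance in mm: focusing here renders everything
    from roughly half this distance to infinity acceptably sharp."""
    return f_mm ** 2 / (aperture * coc_mm) + f_mm

def dof_limits(focus_mm, f_mm, aperture, coc_mm=0.030):
    """Near and far limits of acceptable sharpness (mm) for a given
    focus distance, via the standard thin-lens DoF approximations."""
    h = hyperfocal(f_mm, aperture, coc_mm)
    near = focus_mm * (h - f_mm) / (h + focus_mm - 2 * f_mm)
    if focus_mm >= h:
        far = float("inf")  # focused at/beyond hyperfocal: sharp to infinity
    else:
        far = focus_mm * (h - f_mm) / (h - focus_mm)
    return near, far

for focus in (800, 2000):  # the 0.8 m and 2 m focus settings used
    near, far = dof_limits(focus, f_mm=24, aperture=8)
    print(f"focus {focus/1000:.1f} m -> sharp from {near/1000:.2f} m to {far/1000:.2f} m")
```

With these assumed values the two ranges come out at roughly 0.6-1.2m and 1.1-11m, which is exactly the "good overlap in DoF" the focus-stacking approach relies on.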

5. Find the nodal point for your camera!! This is fiddlier than you might think, as you need to find the point of no parallax when taking an image - the point at which the focal centre of the lens remains fixed as the camera rotates. It varies between camera body and lens combinations. Nodal Ninja have a set of resources which include specific settings for different cameras - some in date and some out of date - however this page was very helpful, showing traditional and rapid techniques to find it. We used the rapid technique, utilising an empty ball point pen as our sighting device instead of a piece of card. It worked well!
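Returning to the cubemap cap in 4a, the numbers are easy to sanity-check. The 4096x4096 face limit and the rotation/focus counts are from the setup above; the rest is simple arithmetic (note the exact six-face total is just over 100MP - the ~96MP figure comes from rounding each face to 16MP).

```python
# Back-of-envelope check of the Cyclone cubemap limit and shot counts.

FACE_PX = 4096                      # Cyclone's maximum cubemap face dimension
FACES = 6                           # faces in a full cubemap
cubemap_mp = FACES * FACE_PX * FACE_PX / 1e6
print(f"cubemap cap: {cubemap_mp:.1f} MP")  # just over 100 MP total

# Shot count for the 24mm lens: 12 horizontal stops (every 30 degrees),
# two vertical angles (+30/-15 degrees), two focus distances for stacking.
shots = 12 * 2 * 2
print(f"photos per scan location: {shots}")
```

Set against the ~235MP panoramas the 24mm lens actually produced, the gap between capture resolution and Cyclone's import cap is clear.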

This covers the hardware - I'll talk about the processing side in a later post!



