Omnifocus Nonfrontal Imaging Camera

The concept of the omnifocus nonfrontal imaging camera, OMNICAM or NICAM, initiated a new chapter in imaging and digital cameras. NICAM introduces hitherto non-existent imaging capabilities, in addition to overcoming some problems with previous methods. NICAM can acquire seamless panoramic images and range estimates of wide scenes with all objects in focus, regardless of their locations. To understand the impact of NICAM, first consider imaging with conventional cameras. The camera’s field of view is generally much smaller than the entire visual field of interest. Consequently, the camera must pan across the scene of interest, focus on one part at a time, and acquire an image of each part. All the resulting images together then capture the complete scene. As a by-product of focusing, the range of the objects in the scene can also be estimated. Conventional methods for focusing, and for range estimation from focus, mechanically relocate the sensor plane, thereby varying the focus distance setting of the camera. When a scene point appears in sharp focus, the corresponding depth and focus distance values satisfy the lens law. The depth of the scene point can then be calculated from the focal length and the focus distance.
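For illustration, the lens law referred to above is the thin-lens relation 1/f = 1/u + 1/v, where f is the focal length, u the object depth, and v the focus distance (lens-to-sensor separation) at which the object appears sharp. The sketch below simply solves this relation for u; the function name is hypothetical and the code illustrates only the principle, not NICAM’s implementation.

    def depth_from_focus(focal_length_mm, focus_distance_mm):
        """Solve the thin-lens law 1/f = 1/u + 1/v for the object depth u,
        given the focal length f and the lens-to-sensor distance v at which
        the scene point appears in sharpest focus."""
        f, v = focal_length_mm, focus_distance_mm
        if v <= f:
            raise ValueError("focus distance must exceed the focal length")
        return (f * v) / (v - f)

    # Example: a 50 mm lens with the sensor 50.5 mm behind it images objects
    # at about 5.05 m in sharp focus.
    print(depth_from_focus(50.0, 50.5))  # -> 5050.0 (mm)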

DETAILS

Imaging a wide scene with a conventional camera thus involves two actions: panning the camera, and, at each chosen pan angle, focusing, i.e., finding the best focus distance setting. The purpose of the first action is to accumulate data for the entire visual field from the camera’s narrower fields of view; this action is therefore essential. The innovation of nonfrontal imaging lies in the elimination of the second action. The nonfrontal imaging camera has a sensor plane that is not perpendicular to the optical axis, as is standard. This imaging geometry eliminates the time-consuming mechanical translation of the sensor plane: camera panning, required for panoramic viewing anyway, also accomplishes focusing. Further, a range-from-focus estimate for each visible scene point is computed as a by-product of identifying the sharpest image. Thus, from pan motion alone, nonfrontal imaging obtains a composite focused image of all objects and points in a wide scene regardless of their depths, in complete registration with a range map obtained in parallel. While it is well known that focus distance control yields both range and focused images, nonfrontal imaging makes it possible for the first time to realize this dual functionality simultaneously for all visible scene points. Further, this functionality is achieved passively, i.e., without any active illumination of the scene, e.g., with a laser.
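As a rough sketch of the range-from-focus by-product described above, the code below (hypothetical names and a deliberately simplified focus measure, not the actual NICAM algorithm) takes one scene region observed at several effective focus settings during the pan, scores each observation with a Laplacian sharpness measure, and returns the sharpest sample together with the depth implied by that setting.

    import numpy as np

    def sharpest_sample(samples, depths):
        """samples: array of shape (n_settings, h, w) holding the same scene
        region seen at n_settings effective focus settings as the camera pans.
        depths: array of shape (n_settings,) giving the object depth implied
        by the lens law for each setting. Returns the best-focused patch and
        its depth estimate, chosen with a simple Laplacian focus measure."""
        lap = (-4.0 * samples
               + np.roll(samples, 1, axis=1) + np.roll(samples, -1, axis=1)
               + np.roll(samples, 1, axis=2) + np.roll(samples, -1, axis=2))
        focus_measure = (lap ** 2).mean(axis=(1, 2))  # one sharpness score per setting
        best = int(np.argmax(focus_measure))          # sharpest setting wins
        return samples[best], float(depths[best])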

Thus, nonfrontal imaging has the following novel capabilities: (i) It provides panoramic (up to 360°) images of a scene without any visible seams. (ii) Each object in an image is in complete focus regardless of its location, i.e., there is no need to explicitly perform the standard focusing action (accomplished mechanically in “manual” cameras and automatically in “automatic” cameras, but requiring mechanical movement in each case). (iii) Along with the sharp visual image, the camera also delivers the location (coordinates) of each focusable, visible scene point. One consequence of these capabilities is that a single nonfrontal imaging camera can provide stereo pairs of images for three-dimensional, omnifocused viewing of the entire scene in natural lighting. In fact, this visual 3D experience is in some ways even more informative than natural human viewing of the real world, since humans have a finite depth of field while the NICAM-driven display shows all parts in focus.

UNIQUENESS

Nonfrontal imaging represents a qualitative leap in what is feasible with current technology. It makes it possible to achieve hitherto infeasible functionalities and performance levels in imaging. None of the available techniques can deliver seamless, focused panoramic images. Most methods choose a scene point or object and bring it into focus by (manually or automatically) controlling the sensor location; thus, they can focus on objects only one at a time. Similarly, they can estimate range from focus only one object or point at a time. The following paragraphs describe the differences between NICAM and the various related existing technologies.

Panoramic Images: In conventional photography, generating panoramic images has been more of an art, pursued by artists who take independent photographs and create a mosaic from them. There are some fundamental problems with this, however. Whenever an image is taken by the camera, a choice of focus distance must be made. Usually, this is done by bringing into focus the object that appears at the image center. This means that objects at other distances in the camera’s field will not be in focus. In particular, the image borders may carry different amounts of blur. Since, in general, different focus settings are used to photograph contiguous scene parts, when these are mosaiced to form a panoramic image the discontinuities in sharpness across the scene parts straddling the borders give rise to seams. Thus, in the panoramic image, neither are all objects imaged in focus nor is the mosaic seamless. One may attempt to alleviate this problem by reducing the size of the camera’s visual field, but this does not eliminate it: the objects in the scene are not of the same shape as the camera’s visual field (e.g., rectangular), so they inevitably straddle image borders. Moreover, the smaller the visual field, the larger the number of images in the mosaic, which increases the seam density. In fact, a major issue in the construction of panoramic images has been how to process the mosaic to camouflage seams so as to avoid perceptual detection.

Panoramic Cameras: A number of panoramic cameras have been designed over the years for photographic applications. The scene scan is performed by moving the camera mechanically, or by pointing it at a special reflecting surface such as a conical mirror. To image certain scene points in sharp focus, either the points have to be at a specified depth from the camera, or the depth of field of the camera must be made sufficiently large by a combination of reducing the aperture and shortening the focal length. These solutions are not acceptable, since they require that the ambient light intensity levels be high or that the scene objects of interest all lie in a narrow depth range.
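The trade-off noted above can be quantified with the standard hyperfocal depth-of-field approximation. The sketch below uses textbook formulas and a hypothetical function name; it is meant only to show why a large depth of field forces a small aperture (and hence bright ambient light) or a short focal length.

    def depth_of_field_m(focal_mm, f_number, subject_m, coc_mm=0.03):
        """Approximate total depth of field (metres) around a subject at
        subject_m metres, using the standard hyperfocal-distance formulas
        with circle of confusion coc_mm."""
        f = focal_mm / 1000.0
        c = coc_mm / 1000.0
        hyperfocal = f * f / (f_number * c) + f
        near = hyperfocal * subject_m / (hyperfocal + (subject_m - f))
        if subject_m >= hyperfocal:
            return float("inf")   # everything beyond the near limit is sharp
        far = hyperfocal * subject_m / (hyperfocal - (subject_m - f))
        return far - near

    # Stopping down a 50 mm lens from f/2.8 to f/11 at a 5 m subject distance
    # increases the in-focus range from roughly 1.7 m to roughly 11 m.
    print(depth_of_field_m(50, 2.8, 5.0), depth_of_field_m(50, 11, 5.0))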

Panoramic Range Acquisition: Methods for panoramic range acquisition, the analogue of panoramic image acquisition, are not very common. Typically a narrow field-of-view device is pointed in different directions to obtain a panoramic range image, or multiple devices are arranged in a circle, as in sonar sensor rings. Almost all methods are invasive, i.e., they involve illuminating the scene, e.g., with a laser beam or structured lighting.

Patented Technologies: Most of the relevant U.S. patents in the last five years have been filed by Japanese corporations. However, none of these comes close to the NICAM methodology. The closest idea to NICAM appears in a patent filed by Asahi Kogaku Kogyo Kabushiki Kaisha of Tokyo, Japan. In this patent, a camera is described that can select one of three tilt angles between the CCD plane and the optical axis. The suggested application was photographing a single frame of a scene containing two subjects; the range to the subjects was determined by an unspecified “range-determining means.” EMI Limited of England holds a patent on using a tilted-plane sensor to determine the focus motor drive signal. Other companies holding patents on related ideas include Olympus Optical Corporation, Japan; Canon Kabushiki Kaisha; and Hitachi Ltd., Tokyo, Japan. A camera with a tilted sensor plane and a scanning mirror was also used by a researcher at the Jet Propulsion Laboratory to determine the range of scene points.

SAMPLE IMAGES TO ILLUSTRATE PERFORMANCE

Images 1-5 illustrate the performance of NICAM. Typically, a panoramic image is divided into multiple rows, each showing the view over a portion of the total angle covered. If the entire length of the panoramic image were printed in a single row, its height would have to be reduced significantly. To avoid such excessive compression of detail, and to maximize legibility by using all available space, a complete panoramic view is divided into smaller contiguous subangles, and the corresponding subimages are shown in successive rows of the image. Thus, the right end of each row connects to the left end of the following row.
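For concreteness, the row layout described above amounts to a simple strip-splitting step, sketched below under the assumption that the panorama is held as a NumPy array; the function name is hypothetical.

    import numpy as np

    def panorama_to_rows(panorama, n_rows):
        """Split a very wide panoramic image (h, w[, channels]) into n_rows
        contiguous horizontal strips for printing in successive rows; the
        right edge of each strip continues at the left edge of the next."""
        width = panorama.shape[1]
        strip_w = int(np.ceil(width / n_rows))
        return [panorama[:, i * strip_w:(i + 1) * strip_w] for i in range(n_rows)]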

Images 1, 2 and 3 are examples of omnifocused panoramic images acquired using NICAM. Image 1 shows a 100° view of an outdoor scene, split into two rows of 50° each. The objects in the scene lie at a range of distances (flowers at 1.5 ft, tree at 4.5 ft, bench at 8 ft, chair at 30 ft, and building at 50 ft and beyond), but all are imaged in focus and no seams exist across the entire panoramic view. Image 2 shows a 60° view of a room inside the Beckman Institute, where the distances range from a few feet to about 20 feet. Finally, Image 3 contains a 360° panoramic view of the Computer Vision and Robotics Laboratory in the Beckman Institute, where object distances of 2 to 30 feet from the camera are indicated.

Image 4 demonstrates the omnifocusing performance of NICAM compared to the limited depth of field of a regular camera. The upper row in Image 4(i) shows a 40° omnifocused image acquired by NICAM (angle chosen by the user), and the lower row shows a 20° view of the same scene (angle dictated by the camera) acquired using a regular camera focused at 4 ft (a choice must be made as to which object to focus on). The progressive loss of focus for objects located closer or farther than the focused depth of 4 ft can be seen. Image 4(ii) shows a “panoramic” view constructed from multiple images taken by a regular camera, by concatenating images of contiguous parts of the scene. Since the different images are taken with the camera focused on different objects, the borders between images give rise to seams. Further, since the camera is focused on a specific object as each image is acquired, not all parts of even a single image are in focus. This should be contrasted with the seamless panoramic imaging capability of NICAM shown in Images 1-3.

Image 5(i) demonstrates the impact of the range estimation capability of NICAM. A pair of contiguous planar patches, formed by wrapping newspaper over a step-like structure, is placed in front of NICAM. The resulting omnifocused panoramic image is shown in the first row. The second row illustrates the range estimation capability: it depicts the step structure recovered by NICAM. The range estimate available for each pixel in the omnifocused image determines the position and irradiance of the corresponding scene point. The omnifocused image and the recovered shape are combined to produce the 3D omnifocused step structure shown in a perspective view in the bottom row. If we assume that the irradiance due to a scene point is invariant under small perturbations of the viewpoint, then the intensity and range information can also be combined to produce pseudostereo images, as would be acquired by a pair of hypothetical cameras placed around NICAM. Such stereo images, when viewed through a stereo mechanism, e.g., stereo glasses, depict the scene in full 3D, using data obtained by a single NICAM! Image 5(ii) shows such stereo pairs for three scenes. For ease of viewing, the left- and right-“eye” images have been color coded red and green and overlapped; when viewed through the enclosed red and green glasses, one filter over each eye, the scenes can be seen in 3D and omnifocused. The top left scene consists of two planes perpendicular to the line of sight, the nearer one at a distance of 2 ft from the camera (right plane) and the farther one at a distance of 3 ft (left plane). The top right scene contains a single chess piece at a distance of 1 in. The bottom row shows multiple chess pieces placed at three different distances from the camera: 17 in., 20 in. and 25 in. All parts of all scenes are in focus, and the 3D structure is visible through stereo viewing. It appears that such 3D depiction using a single visual camera has never been accomplished before!
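The pseudostereo construction described above can be sketched roughly as follows. The code (hypothetical names; a simplification that ignores occlusions and is not the actual NICAM pipeline) shifts each pixel of the registered omnifocused image horizontally by a disparity proportional to inverse depth to synthesize left- and right-eye views, and packs them into the red and green channels of an anaglyph for viewing with red/green glasses.

    import numpy as np

    def pseudo_anaglyph(gray, depth, baseline_px=8.0):
        """Synthesize a red/green anaglyph from a focused grayscale image and
        a registered depth map of the same shape. Each pixel is shifted
        horizontally by a disparity proportional to inverse depth, once to
        the left for the 'left eye' and once to the right for the 'right eye'."""
        h, w = gray.shape
        disparity = baseline_px / np.maximum(depth, 1e-6)   # nearer points shift more
        rows = np.arange(h)[:, None].repeat(w, axis=1)
        cols = np.arange(w)[None, :].repeat(h, axis=0)
        left_cols = np.clip(cols - disparity, 0, w - 1).astype(int)
        right_cols = np.clip(cols + disparity, 0, w - 1).astype(int)
        left = np.zeros_like(gray)
        right = np.zeros_like(gray)
        left[rows, left_cols] = gray      # forward splat; occlusions are ignored here
        right[rows, right_cols] = gray
        anaglyph = np.zeros((h, w, 3), dtype=gray.dtype)
        anaglyph[..., 0] = left           # red channel = left-eye view
        anaglyph[..., 1] = right          # green channel = right-eye view
        return anaglyph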

Image 1: 100 degree Outdoor View

Image 2: 60 degree Indoor View

Image 3: 360 degree Indoor View

Image 4(i): 40 degree Omnifocused View

Image 4(i): 20 degree Standard Camera View of the Same Scene as in 4(i), Focused at 4 ft.

Image 4(ii): Regular Camera Mosaic

Image 5(i): On the following page:

Image 5(ii)

APPLICATIONS OF OMNICAM

Following are some examples of applications that depend on NICAM’s unique imaging capabilities.

Photography: Suppose a photographer wishes to capture a scene around the Washington Monument. She must make two decisions before she can press the camera button. First, she must decide which part of the scene she wants to capture in the photograph, and then point the camera in the desired direction using the appropriate zoom lens. Second, within the visible scene, she must determine whether the photograph should show the Washington Monument in sharp focus, or the trees in front, or the buildings behind, and set the focus control accordingly. The result will be a picture showing, say, the monument in sharp focus, with the lawn to the left and right as well as the trees and the buildings blurred. If she wants to show a wider scene than the camera’s field of view, she must take multiple pictures by moving the camera across the scene and then manually “paste” the individual photographs into a mosaic showing the large scene. Each picture in the mosaic will show a pre-selected object in sharp focus, with consecutive photographs taken, in general, at different focus settings. Consequently, the degree of blur changes visibly across picture boundaries, causing perceptible sharpness transitions across objects that happen to straddle inter-picture borders. Of course, even within each pasted image, objects other than the one focused on appear blurred in proportion to their distances from the focused object. Further, the entire process is very time consuming because of the required mechanical redirecting of the camera, mechanical focusing in each direction, and the subsequent cutting and pasting. Because of these difficulties, such mosaicing of images has been practiced as an art by photographers, who attempt to smooth out the inter-image transitions. Using the NICAM technology, the photographer could capture the Washington Monument, the buildings behind, the trees in front, the lawn, and objects as far to the left and right as she wants, up to a complete 360° view, all completely focused in a single photograph. The significance of NICAM to nature photography is mentioned in a letter (Appendix A) written by a former staff photographer of the National Geographic Magazine.

Security and Surveillance: Another major application area is security systems and surveillance. For example, consider multiple cameras located at one or more posts outside a building to achieve visibility in all directions. Typically, a guard inside the building monitors the images delivered by the cameras on separate monitors placed in front of him. The cameras, of course, show certain objects in focus while others, outside the depth of field, appear blurred. The guard could control the cameras to focus on different objects in different directions, but that would only switch which areas are monitored best. If a fast-panning NICAM is used, a single camera can replace the entire set of cameras and still obtain a focused image of all parts of the scene visible from the watchpost, with no loss of detail due to blurring. This image could be displayed over a surrounding screen, as well as on the usual separate monitors. Further, the guard could also be shown a 3D stereo display of the 360° surround, using a single NICAM. The display could be viewed in 3D using a head-mounted display. Alternatively, the guard could see it on a surrounding 360° screen, e.g., using stereo glasses, while being able to turn his head in any direction as if he were at the watchpost itself.

Analogously, inside a building, a small number of NICAMs can cover the entire premises to be monitored, instead of a much larger number of normal cameras. For example, a 360° view of a large building lobby may be covered by a few NICAMs instead of, say, 15 ordinary cameras, which may still not cover as much depth as NICAM. When a NICAM is located inside a nuclear plant or another hazardous area, clear views of the entire scene can facilitate much faster teleoperation with no adjustment of the camera.

Surgery: In endoscopic or laparoscopic surgery, a common problem is that the surgeon cannot clearly see the interior body structure in the vicinity of the area under operation, which leads to imprecision in surgery. A miniature NICAM could image the large, wide space inside the stomach in focus. The surgeon could then perform the operation using NICAM, either with the aid of the focused images alone or while stereoscopically viewing the space in 3D and in focus. Another example is neurosurgery. Neurosurgeons today must constantly adjust the focus of their high-magnification visual aid as they move their tools, even by millimeters, to nearby parts of the brain. This requires constant movements of the hands and attention. With a fast-panning NICAM, the focus would not have to be adjusted, so the surgeon’s hands could perform the surgery continuously, without interruptions to adjust focus.

3D Modeling: The range estimation capability of NICAM could be used to visually acquire models of objects, e.g., objects to be manufactured or to be manipulated for video synthesis, at speeds much faster than those of laser-based scanners, which must scan the object a small part at a time. The reliability of the range estimates varies with object depth in ways known a priori. NICAM could also be used to track objects moving nearby. The acquired 3D information could further be used to compose omnifocused stereo displays of scenes, e.g., for remote real-estate viewing.

ILLUSTRATIONS OF APPLICATIONS

Some of the applications in which NICAM, panning at sufficiently high speed, can be used are illustrated in Images 6-10 below. Image 6 illustrates the application of NICAM to remote monitoring of a 3D environment for home security. Image 7 demonstrates omnifocused viewing of the brain at high magnification, so the surgeon can avoid adjusting the focus and concentrate mind and hands on the surgery. In Image 8, NICAM is used as a sensor for locating and thereby avoiding nearby obstacles, and thus for autonomous control of vehicle motion. Image 9 shows how the 3D shape information extracted by NICAM, where necessary under textured illumination, can be supplied to a computer-aided design system and, in turn, to numerically controlled machines, which can then automatically make a duplicate of the model object. Finally, Image 10 illustrates how NICAM can be used for remote manipulation in hazardous environments. Here, NICAM is located next to a robotic manipulator in a hazardous environment; the images obtained by NICAM are transmitted to a safely located human operator, who sees a virtual 3D view of the scene from the robot’s viewpoint; the operator manipulates the virtual environment through a virtual manipulator; and the manipulation actions are then transmitted to and followed by the slave manipulator at the hazardous site.

Image 6

Image 7

Image 8

Image 9

Image 10

MEDIA REPORTS

NICAM technology has been the subject of several reports, including the following:

  1. Popular Mechanics, August 1995, p. 24.
  2. Discover Magazine, November 1995, p. 48.
  3. ZDF German Television’s Program Science and Technology, 1995.
  4. San Diego Post Tribune, January 3, 1996.
  5. Super Interessante Magazine, Vol. 10, No. 1, Brazil, January 1996, p. 10.
  6. San Francisco Examiner, January 1996.
  7. Boy’s Life, July 1997, p. 12.
  8. Optical Materials and Engineering News (OMEN), September 1993, pp. 5-6.
  9. Photonics Spectra, November 1993, p. 29.
  10. Canadian Broadcasting Corporation Science Program: Quirks and Quarks, December 18, 1993; repeated December 1995.
  11. International Science and Technology Satellite News, March 2, 1994.
  12. Arlington Heights Daily Herald, Lake County, Illinois, April 4, 1995.

SYSTEMS DEVELOPMENT

Several prototypes of the camera have been developed. One implementation with on-board processing is shown in Image 11. The camera has a touch interface, located on the right-hand side, that makes operation easy (as has been demonstrated by the experience of first-time users). The images can be stored and transported using the built-in Zip drive seen at the front bottom of the camera in Image 11. The system also contains a large hard disk and ports for connectivity to the outside. The camera has an on-board display, so the user can monitor and control the desired beginning and end pan angles. The system needs only a connection to the power line. Another self-contained unit with on-board processing, developed earlier, was delivered to the Department of Defense. Yet another implementation, as a remote accessory unit with the camera end separated from the host computer (a desktop or a laptop), is shown in Image 12.

Image 11: A stand-alone NICAM prototype

Image 12: A NICAM prototype as a peripheral device, with a single cable connection to the host laptop.

APPENDIX A: Letter