Basic Sensor Modules: TargetBoard, SimpleFieldSensor and Camera


Three fundamental sensor modules provided in the LightLike library are TargetBoard, SimpleFieldSensor, and Camera.  In the present section we concentrate on TargetBoard and SimpleFieldSensor, but we also describe several important properties common to all three basic sensors, as well as to more specialized LightLike sensors.

The following picture shows the interfaces of TargetBoard and SimpleFieldSensor.  The interfaces are essentially identical;  the principal difference is that TargetBoard outputs the integrated intensity incident at the sensor, while SimpleFieldSensor reports the integrated complex field incident at the sensor.  As defined in the linked sections, the "integrated" in these two terms refers to temporal integration.


All of LightLike's sensors are of the temporally integrating type, based on the specification of an exposure length.  The output variable in TargetBoard, called integrated_intensity, has the physical units of J/m2.  If the user is instead interested in the irradiance, W/m2, that number must be obtained by dividing out the exposure length in post-processing.  In the case of SimpleFieldSensor, the integrated complex field output (called simply fld in the interface) is somewhat eccentric: this quantity was explained in an earlier section on LightLike nomenclature.
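The division by exposure length can be sketched in post-processing as follows.  This is a minimal NumPy sketch outside LightLike itself; the array values and the exposure length are hypothetical.

```python
import numpy as np

exposure_length = 1.0e-3  # exposure length in seconds (hypothetical value)

# integrated_intensity as reported by TargetBoard, in J/m^2 (made-up data)
integrated_intensity = np.array([[2.0e-3, 4.0e-3],
                                 [6.0e-3, 8.0e-3]])

# Mean irradiance over the exposure, in W/m^2
irradiance = integrated_intensity / exposure_length
```

The result is the time-averaged irradiance over the exposure; any faster temporal variation within the exposure is, by construction, not recoverable from the integrated output.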

As we see from the above picture of the module interfaces, these (and other) LightLike sensors have a common set of timing inputs: on, exposureInterval, exposureLength, and sampleInterval.  These quantities have the same function in all the sensors, and their usage was explained in the section on timing and triggering.


Wavelength sensitivity

As the picture shows, the first parameter in TargetBoard and SimpleFieldSensor specifies the single wavelength to which the sensor responds.  If light of a different wavelength impinges on the sensor, zero signal will be reported by the sensor.  Note that some of LightLike's other sensor modules have a slightly different wavelength interface:  e.g., Camera allows the user to specify a minimum and maximum wavelength of response.  In this connection, note also that the LightLike library provides a number of spectral filter components.  However, the filter components are often unnecessary because of the wavelength selectivity built into the sensor interfaces.


Multiple beams incident on sensor, interference

If the light incident on a sensor contains beams from two or more sources, at wavelengths to which the sensor responds, then the reported output is a sum of a type consistent with the physical nature of the sensor output.  Specifically, intensity sensors such as TargetBoard and Camera add the individual integrated intensities, while SimpleFieldSensor adds the individual integrated complex fields.

If we wish to model the interference of two (or more) beams from sources that are temporally coherent, then special measures may be needed.  We consider two cases.  First, interference between coherent beams of identical wavelength is easy to model:  we use SimpleFieldSensor to compute the complex-field superposition, and then in post-processing take the squared magnitude of the net output if that is the quantity of interest.
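The post-processing step for the identical-wavelength case can be sketched as follows.  This is a minimal NumPy sketch, not LightLike's API; the two single-point integrated fields are hypothetical stand-ins for SimpleFieldSensor outputs on the same mesh.

```python
import numpy as np

# Two hypothetical integrated complex fields on the same sensor mesh
fld1 = np.array([[1.0 + 0.0j]])
fld2 = np.array([[0.0 + 1.0j]])

# SimpleFieldSensor reports the superposition of coherent same-wavelength beams
net_fld = fld1 + fld2

# Squared magnitude in post-processing yields the interference pattern
intensity = np.abs(net_fld) ** 2
```

For these sample values the two unit-amplitude beams are in quadrature, so the result (2.0) lies midway between the fully constructive (4.0) and fully destructive (0.0) extremes.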

The second case of interest occurs if the incident beams have different wavelengths but are still temporally coherent.  In that case, the field superposition created by SimpleFieldSensor only has meaning if the system includes special modeling provisions:  see the section on interference of polychromatic fields.


Spatial sampling, integration and interpolation

The remaining parameters in the interface specify the mesh (nxy,dxy) on which sensor output is reported.  The mesh registration for TargetBoard and SimpleFieldSensor is of "gwoom" type, and nxy may be even or odd.  Note that the mesh parameters only provide for square sensors.  An asymmetric sensor could be constructed as a composite subsystem by relatively displacing a set of square sensors.  The other option, of course, is simply to ignore the zeros reported by the sensor outside an asymmetric region of interest:  in practice, this is usually an adequate procedure.
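The "ignore the zeros" approach amounts to masking the square sensor output down to the region of interest in post-processing.  A minimal NumPy sketch, not part of LightLike; the mesh size and region of interest are hypothetical:

```python
import numpy as np

nxy = 4
# Stand-in for a square (nxy x nxy) sensor output
out = np.arange(nxy * nxy, dtype=float).reshape(nxy, nxy)

# Keep only an asymmetric (here, 2 x 4) region of interest; zero elsewhere
roi = np.zeros_like(out, dtype=bool)
roi[1:3, 0:4] = True
out_roi = np.where(roi, out, 0.0)
```

Any subsequent statistics (sums, centroids, etc.) are then computed from out_roi, so the unused portion of the square mesh contributes nothing.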

It is important to understand that the output of all sensor modules represents point samples, at the sensor mesh points, of whatever physical quantity the sensor reports.  For example, suppose that a TargetBoard dxy specification is two times the propagation mesh dxy of the incident field.  In that case, the sensor performs no spatial integration or smoothing of the incident field:  the sensor simply reports point samples seen at its own (nxy,dxy) mesh.  If the mesh points are not precisely registered because of transverse displacements (e.g., motion-induced), then the sensor modules will automatically do nearest-neighbor interpolation to best estimate the point samples.

Of course, to model practical situations it may be important to account for the spatial integration performed by physical sensor pixels that are large compared to the spatial intensity or field variation incident on the sensor.  To perform such calculations, the LightLike library provides spatial integration capabilities in a module called (somewhat inaptly) SensorNoise.
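To illustrate the distinction between point sampling and pixel integration, pixel-scale spatial integration could be approximated in post-processing by averaging fine-mesh samples over each pixel footprint.  This is a minimal NumPy sketch, not the SensorNoise module itself; the mesh size and binning factor are hypothetical, and the sketch assumes the pixel pitch is an integer multiple of the fine-mesh dxy.

```python
import numpy as np

n_fine = 8
bin_factor = 4  # each physical pixel spans a 4 x 4 block of fine-mesh samples

# Stand-in for fine-mesh intensity samples (e.g., on the propagation mesh)
fine = np.arange(n_fine * n_fine, dtype=float).reshape(n_fine, n_fine)

# Average the fine samples within each pixel footprint (spatial integration
# divided by pixel area), yielding one value per physical pixel
coarse = fine.reshape(n_fine // bin_factor, bin_factor,
                      n_fine // bin_factor, bin_factor).mean(axis=(1, 3))
```

Contrast this with a point-sampling sensor at the same coarse pitch, which would simply pick out individual fine-mesh values and discard all structure between them.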