The performance of today’s magnetometers is defined by measurement parameters such as sensitivity, resolution, and absolute accuracy. The definitions below will help you optimize your surveys, evaluate manufacturer specifications, and clarify often misused or interchanged terms, such as sensitivity and resolution.
Absolute accuracy defines the maximum deviation from the true value of the measured magnetic field. Since the true value of the field is not known exactly, absolute accuracy is determined from the factors involved in deriving the field value and their individual accuracies, such as the gyromagnetic constant and the maximum offset of the time-base frequency.
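As a sketch of how those factors combine, the snippet below converts a proton precession frequency into a total field reading and propagates a fractional time-base error into a field error. The gyromagnetic constant and the 1 ppm error figure are illustrative assumptions, not GEM specifications.

```python
# Sketch: how a time-base frequency offset propagates into field error.
# GAMMA_P is approximately the proton gyromagnetic ratio / 2*pi in Hz/nT;
# the exact value used by an instrument is a calibration matter.
GAMMA_P = 0.0425764  # Hz per nT (illustrative value)

def field_from_frequency(f_hz: float) -> float:
    """Total magnetic field (nT) from measured precession frequency (Hz)."""
    return f_hz / GAMMA_P

# A 50,000 nT field corresponds to a precession frequency of about 2.13 kHz.
f = 50_000 * GAMMA_P

# A fractional time-base error scales the measured frequency, and hence
# the derived field, by the same fraction: 1 ppm on 50,000 nT is 0.05 nT.
timebase_error_ppm = 1.0
field_error_nt = field_from_frequency(f) * timebase_error_ppm * 1e-6
print(round(field_error_nt, 3))  # → 0.05
```

This is why instrument time bases are specified in fractions of a ppm: the relative field error can be no better than the relative frequency error.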
Dead zone is the set of sensor orientations that do not produce measurements. Optically pumped magnetometers cannot be oriented at either 0° or 90° relative to the magnetic field direction; zones of 0° ± 5–10° and 90° ± 5–10° are dead zones for those magnetometers. Some other types of magnetometers, including Overhauser instruments, operate with omni-directional (isotropic) sensors.
Gradient tolerance defines the maximum gradient at which the magnetometer produces a meaningful reading, though not necessarily with the declared sensitivity. Gradient tolerance thus defines the limits of operation of the magnetometer.
Heading error is the maximum deviation of the measurement as a function of sensor orientation. Sources of heading error include contamination of the sensor by magnetic inclusions; in optically pumped magnetometers, heading errors also arise from fundamental physical principles. More information can be found in “A Brief Review of Quantum Magnetometers”.
Reading interval, or number of readings per second, defines the speed of operation. Sensitivity and accuracy should be specified at each interval, as noise increases in a way that is not easy to predict mathematically (i.e., it does not follow the general rule that noise is proportional to the square root of the reading rate).
Resolution is the minimum step of the counter used to measure the precession frequency and convert it into magnetic field. It is generally substantially finer (by an order of magnitude) than the sensitivity, to avoid the counter contributing to the overall noise of the system.
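The order-of-magnitude margin can be motivated by the standard quantization-noise result: a uniform step of size q contributes about q/√12 of r.m.s. noise. The sensitivity and resolution figures below are illustrative assumptions, not specifications of any particular instrument.

```python
import math

# Sketch: why the counter step should be much finer than the sensitivity.
# Uniform quantization with step q adds roughly q / sqrt(12) of r.m.s. noise,
# so a step an order of magnitude below the sensitivity is negligible.

def quantization_rms(step_nt: float) -> float:
    """r.m.s. noise contribution of a uniform quantization step (nT)."""
    return step_nt / math.sqrt(12.0)

sensitivity_nt = 0.022   # illustrative sensitivity, nT r.m.s.
resolution_nt = 0.001    # counter step roughly an order of magnitude finer

# Independent noise sources add in quadrature.
total_nt = math.sqrt(sensitivity_nt**2 + quantization_rms(resolution_nt)**2)
print(total_nt)  # barely above the 0.022 nT sensitivity
```

With the step an order of magnitude below the sensitivity, the combined noise exceeds the intrinsic sensitivity by well under one percent.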
Sensitivity is a statistical value indicating the relative uncertainty of repeated readings of the same magnetic field intensity. It is defined as an r.m.s. (root-mean-square) value per square root of a unit of bandwidth (√Hz). For example, a sensitivity of 1 pT/√Hz means that readings will scatter about any fixed (“etalon”) value of the applied magnetic field with 1 pT r.m.s. (about 3–4 pT peak-to-peak, depending on the character of the noise) per 1 Hz of measurement bandwidth.
For wider bandwidths the noise is supposed to increase with the square root of the bandwidth, i.e., it should double for 4 Hz of bandwidth, and so on. This does not strictly hold for most total-field measurements, so it is more correct to state the noise at a specific number of readings per second and its associated bandwidth. Sensitivity only defines the scatter; it says nothing about systematic error of measurement, or “offset” from the true value of the magnetic field.
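The idealized square-root scaling described above can be sketched as follows; as the text notes, real total-field noise often deviates from this rule, so treat it as the white-noise baseline only.

```python
import math

# Idealized white-noise scaling: r.m.s. noise grows with the square root
# of the measurement bandwidth for a given sensitivity spec.

def rms_noise(sensitivity_per_rt_hz: float, bandwidth_hz: float) -> float:
    """Expected r.m.s. noise for a white-noise sensitivity specification."""
    return sensitivity_per_rt_hz * math.sqrt(bandwidth_hz)

# A 1 pT/√Hz spec: noise doubles when the bandwidth quadruples.
print(rms_noise(1.0, 1.0))  # → 1.0 (pT r.m.s. in 1 Hz)
print(rms_noise(1.0, 4.0))  # → 2.0 (pT r.m.s. in 4 Hz)
```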
Temperature range defines the range of temperatures over which the magnetometer operates with its full specifications (sensitivity / accuracy).
Tracking speed is usually of importance in airborne magnetometry. It defines how fast a change in the field (in nT/s) the magnetometer can follow coherently. Like gradient tolerance, tracking speed defines the limits of operation, i.e., under those conditions maximum sensitivity and/or accuracy is not required.
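For a rough sense of the requirement, the field rate a moving sensor sees is approximately the survey speed times the along-track field gradient. The numbers below are illustrative survey assumptions, not a GEM formula or specification.

```python
# Illustrative check: field change rate seen by an airborne sensor is
# roughly survey speed (m/s) times along-track field gradient (nT/m).

def required_tracking_speed(speed_m_s: float, gradient_nt_m: float) -> float:
    """Field change rate (nT/s) experienced by a moving sensor."""
    return speed_m_s * gradient_nt_m

# A fixed-wing survey at 70 m/s crossing a 1 nT/m anomaly gradient needs
# a magnetometer that can coherently track a 70 nT/s field change.
print(required_tracking_speed(70.0, 1.0))  # → 70.0
```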