Multi-Variable Model for Identifying Crop Response Zones in a Field
Background and Summary of the Invention
Remote sensing is the science of acquiring information about the earth's land and water resources without coming into physical contact with the feature to be studied. Light (electromagnetic energy) passing through the earth's atmosphere and striking an object undergoes one of three basic outcomes: it can be absorbed, reflected or transmitted. In general, remote sensing measures that part of the electromagnetic spectrum that is either reflected or emitted (thermal energy) from an object. As an object (green plant) grows, generally, the leaf area of the plant increases, and the different portions of the electromagnetic spectrum respond accordingly (i.e., red reflectance decreases and near-infrared reflectance increases).
There are different methods of data collection from remote sensing systems: a single band (panchromatic), several bands (multispectral) or hundreds of bands (hyperspectral). These images of reflectance can be useful at a specific wavelength or waveband, but are often more useful when combined with images at other wavelengths (i.e., multispectral or hyperspectral). Multiple wavelength reflectance data allows for the creation of field maps that illustrate ratios of selected wavelengths. These mathematical ratios of wavebands (a type of vegetation index) have statistically significant relationships with vegetative conditions within an area, and when collected strategically over time are useful in visualizing crop growth and development change over the course of a growing season (temporal resolution).
Changes in reflectance values over time can be attributed to differences in plant growth and development or plant health. This assumes that environmental conditions that may affect the reflectance of light have remained the same over time. However, we know that it is unlikely that the sun will be at the exact same angle, that cloud patterns are the same, that particulate matter in the atmosphere will be constant, or that the position of the sensor over the object will be unchanged from one date of image capture to the next. These factors introduce variation between data sets not attributable to the growing crop, thus making it virtually impossible to accumulate data over a growing season (a growing season is considered to run from the end of harvest through the next harvest) that can be compared to identify changes in the crop alone. While there have been various prior art attempts to eliminate these kinds of unwanted variation (i.e., using laser light sources at night instead of the sun as a light source, schemes for adjusting the variation in photographic film, and others), the inventors are not aware of a successful methodology that has been developed for taking the data as collected and then satisfactorily adjusting the data itself for comparison over time (i.e., through a growing season). As reliable data comparisons have not been made in the prior art, there are few reliable conclusions that can be drawn for a grower to help him in making the few decisions that are within his power to decide.
To solve these and other problems in the prior art, the inventors herein have succeeded in developing a methodology for normalizing data taken at different times over a growing season which eliminates the effect of the changing environmental and other conditions on the data, so that the data is truly representative of the changing, growing crop in the field. This methodology can be applied to data in any form, but the inventors have chosen to apply it to visible and infrared reflectance data that have been converted to a form of vegetative index, such as the Normalized Difference Vegetative Index (NDVI). There are advantages to converting reflectance data to an NDVI, as is explained in greater detail below. Once converted, the data is then normalized using a statistical analysis over each data set independently of the other data. This is done by subtracting the mean value from each pixel value and then dividing the result by the standard deviation. By normalizing each data set, the extraneous variations introduced into the data are removed and the data may then be compared to gain insight about the crop and field. The power of this normalization should not be underestimated. It allows for the first time, as known to the inventors, agricultural data taken at different times and necessarily under different environmental conditions to be compared and to be combined as a tool for further analysis. This powerfully eliminates the effects of varying influences by factoring them out of the data, while the prior art has either rather ineffectively sought to control the conditions under which the data were collected, or to control the environmental conditions subject to control and ignore all others.
Still another aspect of the present invention is the temporal comparison of this normalized data, which provides for the first time information that a grower may find useful in his decision making process. The inventors have found that the data is useful to define different segments of the field that perform similarly for a growing crop and to create a story which characterizes the history of a growing season as it unfolds in these differently defined segments of the field. These "stories" for different parts of a field can be quite unique and yet produce very similar crop yield. Taking yield alone, a grower would see no difference between these different field areas, and previously would have been led to believe that he should make the same decisions for them, and as a result not achieve any improvement in yield. For example, one area might experience an early decline in vegetation, perhaps caused by too much moisture, which depresses its final yield. Another area may be dry, which also depresses its final yield. Yet the yield value alone would not distinguish between them. With the present invention, it is finally possible to create these "stories" or "histories" for the individually defined "pixels" of an entire field, and then to associate these pixels with field areas that share the same story, which enables the field to be divided into "like story" areas, or crop response zones as the inventors have defined the term. These crop response zones are areas of a field that have similar vegetative values at the time intervals in which the data is taken. So, for example, one such crop response zone might have low vegetation at the first and second intervals, mid level vegetation at the third interval, and high level vegetation at the last interval or end of the growing season. Another crop response zone might have high level vegetation at all intervals. Still other crop response zones would have other patterns of vegetation.
Crop response zones represent segments of the field where the crop grew similarly over time in response to certain static variables (soil texture, organic matter, elevation, slope) and dynamic variables (precipitation, solar radiation, air temperature). Thus, an understanding of the relationships between static and dynamic variables and resultant crop response will enable the grower to prescribe and apply certain combinations of controllable inputs such as seed, tillage, fertilizers and pesticides uniquely for specific field segments. For example, a grower will be able to identify those fields or segments of fields which respond positively to a certain hybrid/variety of seed. The inventors have utilized mathematical analysis to more rigorously define these crop response zones, and that more rigorous analysis is explained below. However, an important part of the invention is that a grower can now define segments of his field that share common characteristics for which specifically tailored decisions may be made to optimize the yield across the entire field. Previously, growers were not provided with any scientifically valid way to define these field segments, even though many growers were able to adjust their decision making based on their great skill and experience over many years with their own fields. While the innate good "feel" that a grower commonly uses may result in some yield improvement, the present invention will now, for the first time, provide some validation to the grower that specific field areas exhibit certain characteristics that require different decisions in order to maximize their yield. While some of the advantages and features of the present invention have been described above, a greater understanding of the invention may be attained by referring to the drawings and detailed description of the preferred embodiment which follows.
Brief Description of the Drawings
Figure 1 is a graphical representation of a computer system for operating the method of the present invention,
Figure 2 is a graphical representation of the electromagnetic spectrum,
Figure 3 is a graphical representation of a typical remote sensing model,
Figure 4 is a graph depicting the reflected electromagnetic energy sensed by a remote sensing model from various crops and naturally occurring surfaces,
Figure 5 is a graphical representation of the additive properties of colored light,
Figure 6 is a graphical representation of the pixel concept as it relates to digital numbers,
Figure 7 is a pictorial representation of a series of images illustrating the effects of differing spatial resolution,
Figure 8 is a pictorial representation of a series of images illustrating the effects of quantization level,
Figure 9 is a pictorial representation of two images illustrating different methods of resampling,
Figure 10 is a graphical illustration of a vegetative index known as NDVI,
Figure 11 is the formula for normalizing raw data,
Figure 12 is a pair of graphs illustrating the comparison of two data sets both before and after normalization,
Figure 13 is a graph depicting the initial step of segregating data into clusters,
Figure 14 is a graph depicting the iterative process of cluster delineation,
Figure 15 is a graph depicting the final phase of segmenting the data into clusters,
Figure 16 is a yield map,
Figure 17 is a set of processed aerial images taken through a growing season, including a reference bare soil image,
Figure 18 is a graphical depiction of a normalized layer stacked image and its corresponding time progression,
Figure 19 is a graphical depiction of a cluster map and its corresponding spectral curves,
Figure 20 is a table and corresponding graph illustrating the concepts of divergence and separability,
Figure 21 is an image of the final crop response zone map and corresponding spectral curves, and
Figure 22 is a graphical depiction of the normalization model.
Detailed Description of the Preferred Embodiment
The present invention takes advantage of the remote sensing of visible and infrared radiation reflected from crops in order to generate the initial raw data. This raw data is then converted to a vegetation index value. The converted data is then aggregated, clustered, and classified into crop response zones. The process and methodology of creating crop response zones may be readily achieved by processing data on a personal computer, preferably a more powerful PC such as a workstation. As shown in Figure 1, a personal computer 20 has a processor 22, a variety of input devices 24 such as a keyboard, mouse, etc. as is well known in the art, and a display 26 which preferably is of a larger size, such as a 22" computer monitor capable of producing color images. The majority of the computer programs used in the present invention are commercially available, except for the normalization step, which is performed by the particular software program mentioned and included in this disclosure. This process will now be explained in greater detail.
Overview of Remote Sensing in Agriculture
Remote sensing is the collection of data from a distance; that is, without physical contact between the sensor and the object being measured. Although there are many types of remotely sensed data, the one most commonly associated with the term remote sensing is simple photography collected from aircraft or satellites. In fact, since the collection of the first aerial photograph in 1840, views from airborne and space-borne platforms have become quite commonplace. Today, the value of this "view from above" is obvious when one only considers our reliance on weather satellites and space-based military surveillance.
This "view from above" has also played a major role in agriculture over the last fifty years with the collection of aeπal photography, in support of soil surveys. However, with the recent advancements m sensor technology, the concept of remote sensing in agriculture has grown to include, hand-held devices which measure how plants reflect certain portions of the light spectrum, hand-held devices that measure temperature, multiple sensors mounted on farm implements and spπnkler systems, and airborne and space-borne digital collection systems that measure wavelengths of light way beyond the abilities of human vision. All of these systems are based on the fact that if a plant is growing differently from the surrounding
plants, those differences can somehow be measured. This ability to measure the response of plants to wavelengths of light beyond human vision, coupled with its non-mvasive nature has put remote sensing in the forefront of agricultural research. Remote Sensing: Energy Matter Interactions There are basically two types of remotely sensed systems available for land cover evaluation; active systems and passive systems. Active systems (i.e., radar, sonar, laser and seismic) send out their own energy and look for some sort of energy response. The amount of energy reflected back to the sensor gives the scientist insight into the type of object being measured. Passive systems on the other hand, do not provide their own source of energy and rely solely other sources of object illumination (i.e., typical reflective based cameras / scanners and thermal imaging systems). The pπmary source of energy for most passive systems is the sun, which emits energy in all possible wavelengths called the electromagnetic spectrum (Figure 2). The following discussion relates only to passive systems using the sun as their source of energy. However, it should be understood by those of ordinary skill m the art that the initial raw data could be obtained by any method known in the pπor art, including both passive and active.
Basic Remote Sensing Model
As sunlight (I) travels through space and strikes the earth (plants, soil, etc.), it undergoes one of three processes. The different wavelengths of light coming from the sun are either absorbed by the object (A), reflected off of the object (R), or transmitted through the object (T) (Figure 3). Each object on earth reacts to these incoming wavelengths of light (termed the electromagnetic spectrum) in its own unique way, resulting in a spectral curve. Figure 4 gives the spectral curves for a variety of land cover types. Simply put, these curves indicate the amount of energy that is reflected from each object in the different portions of the electromagnetic spectrum.
In practice, the electromagnetic spectrum is divided into three basic sections (Figure 4). These subdivisions include the visible, the near infrared, and the middle infrared portions of the spectrum. Each is described in detail below.
The first subdivision deals with that portion of the light spectrum where humans can see (400 nanometers to approximately 700 nanometers). It is in this part of the spectrum where pigment dominates. For instance, a blue car appears blue to the human eye because the car is absorbing green and red wavelengths of light while at the same time reflecting the blue portion of the light spectrum. A green object, on the other hand, would absorb red and blue while reflecting green light. Based on the additive properties of light (Figure 5), an object that appears yellow to the human eye would be absorbing blue light while reflecting red and green light. A white object reflects all light and so is composed of all wavelengths of light, whereas a black object is absorbing all wavelengths of light, thereby reflecting no energy at all.
Based on the simple concept described above, one can begin to understand how objects on earth obtain their visual appearance. A green plant is green, for example, because the chlorophyll (pigment) absorbs both blue and red light, while not readily absorbing green light. The healthier the plant, the greater the chlorophyll production, resulting in absorption of both the blue and red wavelengths. As a green plant begins to undergo stress (or simply senesces), the chlorophyll production slows, resulting in (at first) an increase in red reflectance, giving the plant a yellow appearance (remember, red and green light mixed make yellow). Bare soil, on the other hand, obtains its color through a combination of minerals, moisture, and organic matter, each of which affects the visible portion of the spectrum in different ways. For the most part, a soil curve in the visible portion of the electromagnetic spectrum is flat to slightly increasing in reflectance with increasing wavelength. As well, the lower the overall spectral reflectance, the darker the soil; the higher the overall reflectance, the lighter the color of the soil.
The second major division of the electromagnetic spectrum ranges from about 700 nanometers to approximately 1500 nanometers and is called the near infrared. This portion of the light spectrum responds to the amount and health of plant cellular structure. In other words, objects like a soybean plant or maple tree will have high reflectance in the near infrared because they have large quantities of cellular structure that are oriented perpendicular to the incoming rays of light. Conversely, objects such as pine trees and less healthy vegetation will have lower reflectance of near infrared radiation, while non-vegetated objects will have an even lower reflectance. The environmental objects with the lowest reflectance of all in the near infrared portion of the spectrum tend to be wet bare soil and water. The third major division of the electromagnetic spectrum ranges from around 1500 nanometers to approximately 3000 nanometers and is referred to as the middle infrared. It is in this portion of the electromagnetic spectrum that moisture plays a dominant role. Although other factors such as organic matter, iron content, and clay content have an effect, moisture appears to be the primary mechanism affecting reflectance. More specifically, the higher the moisture content, the lower the reflectance. As objects lose moisture or begin to dry, their reflectance in this portion of the electromagnetic spectrum increases. While this concept has been proven in a laboratory setting, applying it in practice has been somewhat elusive.
Temporal Variations in Spectral Curves
While it is true that many objects have a spectral curve that is static, many more objects have a spectral curve that is dynamic over time. Certainly, an agricultural field begins with variations of bare soil (light to dark) which have unique spectral curves. Over time, the soil is worked (changing the soil color) and vegetation begins to emerge. As vegetation begins to fill the field, there is a lowering of the red reflectance (due to increased chlorophyll) and an increase in near infrared reflectance (due to increased cellular structure). As the crop begins to mature, the field no longer has a bare soil curve; instead it has taken on the spectral curve of healthy green vegetation. As individual plants undergo stress, there is within-field variability of the spectral curve, indicating variable amounts of chlorophyll production and a corresponding variable health of individual plant cells. Eventually the field begins to senesce and the chlorophyll begins to break down along with the vegetative cellular structure. This results in an increase in red reflectance and a decrease in near infrared reflectance (actually moving back toward the spectral curve of bare soil). As the crop is harvested and the bare soil is exposed, the spectral curve resets itself to that of bare soil.
This dynamic nature of spectral curves is not unique to agricultural crops. In fact, almost all things in nature have some sort of dynamic spectral curve based on the season. However, from an agricultural perspective, it is the dynamic nature of spectral curves that can be used to help determine the health (or potential stress) of vegetated areas during the season. The present invention is broad enough to be used with virtually any growing vegetation, although it finds particular application for a grower of an agricultural crop.
Resolutions in Remote Sensing
When one discusses remotely sensed systems, the issue of resolution inevitably arises. However, few people seem to understand that there are three basic types of resolution with regard to any given imaging system. These three resolutions include spectral, spatial, and temporal. While each plays a significant role in agricultural remote sensing, they are very different from each other. Each is discussed below.
Spectral Resolution
The spectral resolution of an imaging system simply indicates how many portions of the electromagnetic spectrum are being measured at a given time. This number of bands can range from only one band (termed panchromatic) to several hundred (hyperspectral). Typically, most imaging systems used in agriculture collect between 2 and 20 spectral bands (termed multispectral). Equally important to the number of bands are the band-widths and the exact positioning of the bands along the spectrum. Historically, multispectral imaging systems have collected reflectance data using bandwidths of between 0.05 and 0.2 micrometers (50 to 200 nanometers). These bands are typically bandpass in nature and rarely overlap each other, resulting in unique measurements of specific portions of the electromagnetic spectrum. The band placement of historical imaging systems generally relates to specific portions of the spectrum where soil, water, or vegetation is behaving in a unique way. These positions include the following:
0.4 - 0.45 μm - water turbidity and chlorophyll production in green plants
0.5 - 0.55 μm - peak of the green portion of the spectrum to measure plant health
0.6 - 0.65 μm - the trough of a green vegetative curve indicating amount of pigment
0.8 - 1.10 μm - estimate of cell structure and also indicates moisture content
2.3 - 2.50 μm - measurement of soil moisture, organic matter, and clay content
10.0 - 12.0 μm - thermal emittance indicating temperature of an object
Coupled closely with spectral resolution is the concept of quantization. Most current imaging systems have 8-bit detectors, which allow digital numbers (DN's) between 0 and 255 to be used. The better utilized the digital range is, the higher the potential for differentiating between spectrally different objects (Figure 8). Each band of panchromatic, multispectral, or hyperspectral data is arranged so that the lower the reflectance, the lower the number. The digital numbers themselves, however, are only relative to each other and cannot be compared from one day to the next or from one image to the next. In order to be transformed into percent reflectance (for comparison with other images), one must account for atmospheric interference, time of day, sensor calibration, and a variety of other factors, most of which are typically beyond the control of the data collector.
Spatial Resolution
Most current airborne imaging systems are comprised of charge-coupled device arrays (CCD arrays). These arrays are basically a grid of sensors, each of which collects or measures how much energy is being reflected off of the target in a particular wavelength (discussed above). Each individual grid cell is referred to as a pixel (Figure 6). The area on the ground that a pixel correlates with (pixel size) is determined by the sensor's optics and the altitude of the imaging system. Typically, the larger the pixel size, the blockier the image (Figure 7). The spatial resolution for most airborne imaging systems ranges between ½ meter and several meters. The spatial resolution for imaging systems mounted on space-borne satellites varies between 5 meters and several kilometers, depending on the application.
Historically, the spatial resolution of airborne digital cameras has been limited by the size of the sensor array and the above ground height of the measurements. Additionally, until recently, the technology has been too expensive to provide the required spatial resolution (combined with adequate areal extent) for most applications in agriculture. Recent advancements, however, in sensor technology are enabling more cost effective data collection, higher quality data, and more rapid information turn-around to the end users.
More recently, spatial data are being collected with GPS sensors in the form of point data, line data, and polygon data. Although theoretically a point and a line cannot have area, these data types as collected in an agricultural setting often imply an area of interest. Therefore, all vector types (points, lines, and polygons) can ultimately be considered to be, or related to, pixels.
Temporal Resolution
Temporal resolution is an underused term in remote sensing that relates to the exact time of year, time of season, or time of day that an image needs to be acquired over an area of interest. Coupled with the exact timing of image acquisition is the total number of images required to adequately characterize the area of interest. This type of resolution is probably the most misunderstood and under-researched area of remote sensing. What is the proper time for remotely sensed acquisition of a corn crop to help estimate yield, nitrogen stress, plant stand, etc.? One could ask the same question of soybeans, cotton, citrus, alfalfa, potatoes, and many other crops. The answer is that few researchers seem to understand the importance of the questions above, much less the answers. This may perhaps be due to the failure of the prior art to provide the necessary technology to fully adjust for the temporal differences in the data collected. Without this technology, there is no reason to think about when to collect data, because the data cannot be correlated or used in combination, due to the interferences introduced by the changes in environmental and other conditions which contaminate the data, at least without the benefit of the present invention.
Image Preprocessing
There are several steps involved in the preparation of airborne imagery prior to information extraction. These steps include band-to-band registration, vignetting correction, and geocorrection to a ground coordinate system, and are known in the prior art. Each of these steps is discussed below.
Band-to-Band Registration
When an airborne multispectral imaging system is flown, the cameras may be aligned in a row or set up in a two-dimensional array of their own. Nonetheless, the cameras are designed so that they image approximately the same area and are electronically triggered so that they image at virtually the same time. The result is a multi-band image in which each band is closely registered to the others. The problem is that with multispectral imagery, each pixel representing a given area on the ground in a particular waveband must be exactly registered with the other pixels/bands measuring the same ground area. If the bands are not aligned, the image will take on a fuzzy appearance when viewed on a computer monitor and will provide misleading results when processed for information extraction.
One process of band-to-band registration requires manual location of similar points between two different bands. Once several points are located, an automated process is often employed that passes a moving kernel (computer based window) over the two images looking for areas of good spatial correlation. This automated method of point picking generally locates dozens to hundreds of points for an image with an array of approximately 1000 pixels by 1000 pixels. The system uses these points to calculate a mathematical transformation (using two-dimensional least squares, for example) to warp one band to the base image. The result is a multispectral image with all pixels representing a given area on the ground being aligned or stacked so that they now represent a spectral vector.
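For illustration only, the two-dimensional least squares fit can be sketched in a few lines of Python. This is not the disclosed program; the function names and the choice of a first-order (affine) transformation are assumptions for the example.

import numpy as np

def fit_affine(src_pts, dst_pts):
    # Fit a first-order (affine) transform mapping control points in the
    # band to be warped (src_pts) onto the base band (dst_pts) by
    # two-dimensional least squares. Both inputs are (N, 2) arrays, N >= 3.
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    design = np.hstack([src, np.ones((len(src), 1))])  # columns: x, y, 1
    coeffs, _, _, _ = np.linalg.lstsq(design, dst, rcond=None)
    return coeffs  # (3, 2) matrix of transform coefficients

def apply_affine(coeffs, pts):
    # Map pixel coordinates through the fitted transform.
    pts = np.asarray(pts, dtype=float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coeffs

With dozens to hundreds of automatically picked points, the least squares solution averages out the error in any single point pair.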
Vignetting
Most remote sensing systems that employ the use of a lens have a unique type of distortion called vignetting. Vignetting causes a darkening of the image as you move from the center toward the edge of the image. The darkening is a function of using the edge of the lens and is apparent in most aerial photography along the four corners. In digital imagery, it is often very difficult to visually identify vignetting; however, it can be identified through a variety of computer based methods. Both empirical and theoretical correction equations can be generated; however, the empirical method is most often employed. Most companies flying airborne imagery have the mathematical correction equations for their cameras. These correction equations are similar to a quadratic trend surface of the lens distortion. Vignetting correction simply removes the trend equation to adjust (add to or subtract from) the radial darkening produced by the imaging system's lens. This process is well known in the prior art.
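As an illustration of the empirical approach, the following Python sketch fits and removes a quadratic radial trend from a single band. The function names are hypothetical, and a production correction would use the camera vendor's own equations rather than a trend fitted to scene content.

import numpy as np

def radial_distance(shape):
    # Distance of each pixel from the assumed optical center (image center).
    rows, cols = shape
    y, x = np.mgrid[0:rows, 0:cols]
    return np.hypot(x - cols / 2.0, y - rows / 2.0)

def correct_vignetting(band):
    # Fit DN = a*r**2 + b*r + c across the band, then remove the radial
    # trend so the image retains the brightness fitted at the center.
    r = radial_distance(band.shape)
    coeffs = np.polyfit(r.ravel(), band.astype(float).ravel(), 2)
    trend = np.polyval(coeffs, r)
    corrected = band.astype(float) - trend + coeffs[2]  # coeffs[2] = fitted center DN
    return np.clip(corrected, 0, 255).astype(np.uint8)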
Geocorrection
Most imaging systems flown on aircraft use the Global Positioning System (GPS) to know when the system is directly over the field of interest. However, small subtleties in aircraft attitude result in an image that is seldom oriented in a true north/south direction. As well, the imaging systems provide no location information for each pixel, thereby limiting the researcher's knowledge of its geographic position on the earth. This lack of geographic orientation can be corrected by locating known points on the earth (road intersections, center of a bush, corner of a house, etc.) and finding the corresponding pixels in the imagery. Once approximately ten to fifteen points are located, a transformation equation can be calculated (two-dimensional least squares, for example) and the image can be warped to overlay its correct geographic position (i.e., each pixel is positioned at its correct geographic coordinate). During this process, a map projection is chosen (i.e., state plane, UTM, etc.) to account for the flattening of the earth's curved surface. As well, a datum is chosen (NAD27, NAD83, WGS84) that is used as the coordinate system's origin of reference. This process allows the remotely sensed data to be registered with other geographically oriented data such as field boundaries, yield data, and GPS measured soil samples. During the geocorrection process, several decisions have to be made. One very important decision is that of resampling. Do the newly created pixels simply reflect the old digital values, or should each newly created pixel be a weighted average of the pixels around it? The first method is termed nearest-neighbor, while the second method may use bilinear-interpolation or cubic-convolution resampling, both of which are well known in the prior art. There are implications to using each method. Theoretically, nearest neighbor preserves the integrity of the original pixels while the other two methods can drastically change the data values (Figure 9).
Another decision to be made is that of the appropriate transformation polynomial. Typically, one wants to use the lowest order polynomial possible to eliminate poor extrapolation beyond the picked control points. However, at times the aircraft may be in a small bank, resulting in data that indicates an apparent trapezoid shaped field when the field is actually a rectangle. Under this scenario, a higher order polynomial may be required, or perhaps a rubber sheeting algorithm that performs a nonlinear stretch of the image. Again, these methodologies are well known in the prior art.
Image Enhancement
Image enhancement refers to the process of adjusting the image to enhance certain features within an image. For instance, a single band of imagery can measure light (energy) on a scale of 0-255 with digital numbers, but the human eye can only separate a few shades of a given color (less than 10). Often the colors in an image are adjusted so that the colors magnify the differences for the desired portions of an image. For example, in an agricultural image a field may have a brightness variation in a given band that ranges from 120 to 140, a farm road 80-82, and a barn roof 180-183. If no adjustments are made, the computer will segment the image into 12 equal categories from 80-183, which will only permit 2 colors to represent the variation in the field. But if we enhance the image, we can force the majority of the colors over the area of interest (i.e., so that 10 of the 12 colors in the range show field variation).
An entire image contains a wide range of brightness values. For instance, a road, a building, and an agricultural crop may range over 100 digital counts in the blue portion of the spectrum. However, within a single cornfield, the range of the digital numbers might be less than 10. Therefore, a grower that is more interested in looking at the crop in his field can have the image enhanced to adjust the color of the image to the scale of the differences within the field. This results in the ability to see more variability in the field and less variability for the road or rooftops (things that are of less interest to the end user).
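A minimal linear stretch along these lines might look as follows in Python; the 120-140 window is the hypothetical in-field brightness range from the example above, and the function name is illustrative only.

import numpy as np

def linear_stretch(band, low, high):
    # Spread the digital numbers between low and high across the full
    # 0-255 display range; values outside the window saturate to 0 or 255.
    scaled = (band.astype(float) - low) / (high - low) * 255.0
    return np.clip(scaled, 0, 255).astype(np.uint8)

# Devote nearly the whole color range to the crop rather than roads or roofs:
# enhanced = linear_stretch(band, 120, 140)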
Vegetative Indices
While a given band of data (i.e., red or near infrared) may be very informative to a grower, the combination of two or more bands (in the form of a vegetative index) is often more useful. Vegetative indices are often used for assessing the variability of vegetative health within a given field. The most common of these known in the prior art is the NDVI (Normalized Difference Vegetative Index), which is calculated as follows:
NDVI = (NIR - Red) / (NIR + Red)
This particular ratio plays on the inverse relationship between the red and near infrared reflectance of healthy green vegetation versus bare soil. As stated earlier in the "Temporal Variations in Spectral Curves" section, there is a temporal dynamic to various natural objects. An NDVI takes advantage of these temporal differences by measuring the deviations away from a soil spectral curve as an agricultural crop begins to grow. As a crop begins to emerge, there is more chlorophyll production, causing a decrease in red reflectance. As well, there is an increase in biomass or cell structure, causing an increase in near infrared reflectance. This inverse relationship is captured in an NDVI, resulting in a high value (near 1.0) for healthy green vegetation and a very low number (near 0.0) for stressed or unhealthy vegetation. One thing to note is that an NDVI is very sensitive to atmospheric and sensor variations (Figure 10).
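In code, the index reduces to a per-pixel ratio. The sketch below is illustrative and assumes the red and near infrared bands are equally sized arrays of digital numbers or reflectance values:

import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), computed per pixel; pixels where
    # both bands are zero are left at 0 to avoid division by zero.
    nir = nir.astype(float)
    red = red.astype(float)
    total = nir + red
    index = np.zeros_like(total)
    np.divide(nir - red, total, out=index, where=total != 0)
    return index  # near 1.0 for healthy vegetation, near 0.0 for stress or soil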
There are, however, a variety of so-called vegetation indices, or data that characterize vegetative growth, that are not mathematically based or are simple calculations at best. These include (but are not limited to) the near infrared (by itself) or the near infrared minus the red. In fact, there are many types of data that can be considered as vegetation indices or vegetation health monitors. These include (but are not limited to) yield monitor derived data, EM-38 data, soil surveys, and organic matter maps.
Image Normalization
Historically, the cost and labor required to obtain radiometrically corrected data has limited the feasibility of any process requiring the analysis of multi-temporal remotely sensed imagery. Radiometric correction, as discussed previously, is the method of accounting for specific sources of error in collected data. An important aspect of the crop response zone invention relies on vegetative indices calculated using multi-temporal imagery. Therefore, a method of pseudo-calibration is important to realizing the invention. The methodology developed to supplement calibration of the remotely sensed data is the focus of the next section.
The method of pseudo-calibration chosen by the inventors is a normalization technique, which can transform any type of data given its distribution about a given value. The technique requires only simple calculations to be performed after the field mean and field standard deviation have been determined (Figure 11). Using this formula, every eight-bit pixel value (0-255) is replaced by a positive (or negative) value corresponding to its position greater or less than the mean value. Figure 12 shows two data sets before and after normalization. The figure shows that the data can be meaningfully compared on similar scales after the normalization formula has been implemented. Being able to compare data of similar scale allows the analyst the ability to assess the relative vegetative health through the growing season (a growing season is considered to run from the end of harvest through the next harvest) by eliminating the undesirable effects of variations in the data introduced by environmental and other conditions. Additionally, similar scale is of the utmost importance in ISODATA clustering, which is an important aspect of the crop response zone invention.
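A minimal sketch of the formula of Figure 11, assuming the mean and standard deviation are computed over all field pixels of a single date's image:

import numpy as np

def normalize(image):
    # Replace each pixel value by its signed distance from the field mean,
    # expressed in units of the field standard deviation.
    data = image.astype(float)
    return (data - data.mean()) / data.std()

# Two acquisitions taken under different conditions land on a common scale
# after normalization and can be compared pixel by pixel (cf. Figure 12):
# june_z, july_z = normalize(june_ndvi), normalize(july_ndvi)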
The historical limitation of radiometric correction on the analysis of multi-temporal remotely sensed imagery has been overcome using a method of pseudo-calibration called normalization. This innovation provides an advantage that the crop response zone invention demonstrates over the prior art. The inventors are aware of some attempts in the prior art to provide a "standard" which could then be used to "normalize" the data. One such attempt involves placing in the field a set of placards, ranging from dark to light colored, whose image is collected at the same time that the image of the crop is collected. The theory is that the data representing the placards in each image would provide a gauge as to how the different conditions affected the data collected during each flight, and that the data could then be corrected to a common standard using some conversion factor calculated from the placard data. However, this was not found to be satisfactory.
Image Clustering and Classification
One of the most fascinating and powerful operations one can perform on multispectral imagery is that of grouping, i.e., clustering and classification. This process enables the researcher to identify natural groupings of spectral homogeneity. For instance, the average spectral signature (spectral curve) for a given land cover type (e.g., deciduous forest) can be calculated for a given data set. Once this statistic is calculated, each pixel in the image can be compared to this statistic to determine if it has any potential of being deciduous forest. The following gives an overview of how the clustering and classification process works.
Clustering
The first step in the classification process is to develop a set of mathematical statistics that represent each potential land cover in the study area. These statistics will be comprised of a mean and standard deviation (for each land cover class) for each band of the multispectral imagery. Although there are several basic methods of statistics generation, one primary method (the unsupervised approach) is used in areas where ground truth may be limiting.
The most popular method of developing a set of unsupervised statistics is the Iterative Self-Organizing Data Analysis Technique (ISODATA). The following is a listing of the steps involved in this iterative method of generating training statistics:
The software plots the data in multidimensional feature space (Figure 13)
The first principal axis is drawn through the data
Arbitrary cluster boundaries are set within the system
The mean value for each arbitrary cluster is determined (this is done for each band)
The Euclidean distance is calculated between each pixel and the cluster centers
Each pixel is regrouped into the cluster to which it had the smallest Euclidean distance
A new mean is calculated for the new clusters (Figure 14)
The process begins again
The process continues until the cluster centroids are stabilized and less than 5% (generally user defined) of the pixels change cluster classes (Figure 15)
Once the iterations stop, descriptive statistics (means and standard deviations) for each cluster are calculated
The clusters are then evaluated as to what type of land cover they represent
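The iteration can be sketched in Python as follows. This is a simplified, k-means-like rendering of the steps above, seeded along the first principal axis; it omits the cluster splitting and merging rules of a full ISODATA implementation, and the function name is hypothetical.

import numpy as np

def isodata_like(pixels, n_clusters=8, max_iter=10, convergence=0.95):
    # pixels: (N, bands) array of spectral vectors.
    mean = pixels.mean(axis=0)
    # First principal axis of the data (second step above).
    _, _, vt = np.linalg.svd(pixels - mean, full_matrices=False)
    axis = vt[0]
    # Arbitrary initial cluster centers spaced along that axis (third step).
    proj = (pixels - mean) @ axis
    centers = mean + np.outer(np.linspace(proj.min(), proj.max(), n_clusters), axis)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(max_iter):
        # Euclidean distance from every pixel to every cluster center.
        dist = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        new_labels = dist.argmin(axis=1)
        # Stop when fewer than 5% of pixels change cluster classes.
        if np.mean(new_labels != labels) < (1.0 - convergence):
            labels = new_labels
            break
        labels = new_labels
        for k in range(n_clusters):
            members = pixels[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return labels, centers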
The inventors have found in analyzing their experimental data that eight clusters was often the optimum number. It should also be noted that the clusters each have their own statistical identity, and can be quite different from other clusters. For example, one cluster may be quite compact, with little variation in its distribution of values, while another cluster might have a larger distribution. Neighboring clusters might even have data points that overlap. This anomaly is accounted for in the step of classifying, where probability statistics are used.
Classification
Once the clusters have been created and evaluated (i.e., identified as to land cover type), the classification process can be implemented. Each pixel is analyzed (independently) as to its probability of belonging to a given cluster class (based on a defined decision rule). Each pixel is then officially assigned (or classified) to the class to which it had the highest probability of belonging. The different decision rules include maximum likelihood, minimum distance, and Mahalanobis. Each utilizes slightly different parametric rules during the classification procedure. Typically, the decision algorithms utilize the mean, standard deviation, and covariance matrix of each cluster to compute the probability of association.
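A sketch of the maximum likelihood rule, assuming each cluster is summarized by a mean vector and covariance matrix and that all clusters have equal prior probability; this is illustrative only, not the disclosed software.

import numpy as np

def max_likelihood_classify(pixels, means, covs):
    # Score every pixel against each cluster's Gaussian log-likelihood and
    # assign it to the cluster with the highest score.
    pixels = np.asarray(pixels, dtype=float)
    scores = []
    for mu, cov in zip(means, covs):
        diff = pixels - mu
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        mahal = np.einsum('ij,jk,ik->i', diff, inv, diff)  # squared Mahalanobis distance
        scores.append(-0.5 * (logdet + mahal))
    return np.argmax(np.stack(scores), axis=0)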
The output from a classification is a two-dimensional array of numbers in which each pixel is given the value of the cluster class that it most closely matched. As well, most classification software outputs a mathematical distance layer, which indicates the spectral distance each pixel was from the cluster centroid. This distance layer can be utilized to evaluate which pixels were more closely associated with a given cluster and, conversely, which pixels had a higher potential of being misclassified. A variation on this distance layer evaluation is that of a fuzzy classifier. With this classification option, a multi-layer classification map is produced that has the following structure:
Layer 1 - Each pixel is assigned the cluster number to which it had the highest probability of belonging.
Layer 2 - Each pixel is assigned the cluster number to which it had the second highest probability of belonging.
Layer N - Each pixel is assigned the cluster number to which it had the Nth highest probability of belonging.
Using this multi-layer classification and the classification distance layer, a fuzzy filter is processed over the data. The decision rule (user defined) basically looks at each pixel in conjunction with those pixels directly around it to determine if the correct decision was made by the classifier. For instance, if a pixel in an image was categorized (classified) as soybean while all of the pixels around it were classified as pine forest, one would begin to question the validity of the classification. With this in mind, the fuzzy filter will look to Layer 2 of the classification to see if the next highest class the pixel belonged to was pine forest. If there was a moderate chance of the pixel belonging to pine forest and all of the pixels around it were categorized as pine forest, the fuzzy filter will change the pixel to a pine forest pixel. If, however, there was a very low probability of the pixel belonging to pine forest, the algorithm will leave it classified as soybean.
Crop Response Zone Generation
Concept of a Crop Response Zone
Over the past several years, growers throughout the country have begun to use yield monitors equipped with GPS systems to produce fairly detailed yield maps of their fields (Figure 16). Although initially these yield maps produced a great deal of interest and enthusiasm, over time the growers began to question what in fact was the cause of the variation in yield that they were observing and what (if anything) they should do about it. After intense analysis of various yield maps from around the country, it became apparent that all poor yielding areas were not "poor yielding" for the same reason. In fact, many poor yielding areas may reach the "poor yield" status through totally different mechanisms. It therefore became one goal of the inventors to attempt to understand, categorize, and eventually explain why certain portions of an agricultural field do not reach their yield potential.
One method used by the inventors was that of analyzing and processing multiple dates of digital aerial imagery. By looking at vegetation (through the eyes of digital remote sensing systems) it was thought that some refinement of the yield map might be possible (i.e., poor yielding areas could be segmented into different vegetative growth progressions). In fact, this proved to be the case. Areas with late germination due to a wet spring, and topographically high, well-drained areas subjected to late season drought, both produced a poor yield. However, each reached poor yield through a totally different path. Acting on this concept, it was believed that remote sensing was one of the only reliable mechanisms for monitoring within-field vegetative change over time. Based on the above logic, it is obvious that the mapping of crop response zones requires the processing of multiple dates of remotely sensed imagery acquired during a given growing season. One thing to note is that the term growing season, as defined earlier, is considered to run from the end of harvest through the next harvest. However, based on crop rotation patterns throughout the Midwest, the collection of data over a given field could be every year, every other year, or every third year. Furthermore, data from different crops during different growing seasons may be combined for analysis.
The following section will give a detailed account of the steps involved in crop response zone generation, including dates of image acquisition, reformatting of digital data, band-to-band registration, vignetting correction, geocorrection of aerial imagery, layer stacking of all dates, image normalization, calculation of vegetative indices, cluster generation, and image classification. Many of these processes are quite in-depth and require substantial background knowledge in agriculture and image processing in order to fully understand and appreciate the concepts involved. To facilitate a reader gaining a full understanding of the invention by reading this disclosure, short discussions pertaining to remote sensing processes and concepts have been provided and presented above. This explanation of crop response zones will reference these short discussions as certain concepts are discussed in connection with the example below.
Aerial Imagery
Aerial imagery was collected four times throughout the growing season. The image dates correlated with bare soil, V12, VT, and R4 crop stages (see section on "Resolutions in Remote Sensing"). The aerial imagery was flown with digital cameras with an array size of approximately 1500 pixels wide and 1000 pixels in the along-track dimension. The digital systems were 8-bit systems, and the images were collected and stored on an on-board computer in a Tagged Image Format (TIF). Four bands were collected, representing the blue, green, red, and near infrared portions of the electromagnetic spectrum (see section on "Spectral Resolution"). The cameras were aligned in a two-by-two matrix and were rigid mounted (pseudo-boresighted) with the lenses focused on infinity. The imagery was flown at approximately 5000 feet above ground level (AGL) to produce a spatial resolution of approximately one meter by one meter (see section on "Resolutions in Remote Sensing"). The digital cameras have square pixels and are not interlaced during image acquisition. The optimum time for image acquisition was two hours before or two hours after solar noon (see section on "Resolutions in Remote Sensing"). Images were not acquired during times of poor atmospheric conditions (haze, rain, clouds). No cloud shadows were acceptable in the imagery.
Vignetting Correction and Band-to-Band Registration
Once the plane landed, the images were extracted from the on-board computer and processed for vignetting. Positive Systems (the vendor that built the aerial imaging system, called the ADAR 5000) produces a vignetting equation (empirically) for each of their cameras (see section on "Image Preprocessing"). Each data file was processed through a semi-automated band-to-band registration program. This program ensures a root mean square error of less than one pixel (see section on "Image Preprocessing"). The data were ultimately converted into an ERDAS format for storage on CD. These processes are all well known in the prior art.
Reformatting
The data were received by the inventors on CD in ERDAS *.lan format. The data were reformatted (changed to a more software compatible format) using ERDAS Imagine 8.31. The resulting format was an Imagine *.img file with a statistics file that ignored zero, and corresponding pyramid layers for fast processing and rapid image display.
Geocorrection
The data were referenced to GPS collected field boundaries (which used an Ashtech survey grade GPS) (see section on "Image Preprocessing"). The geocorrection process utilized a minimum of 7 points per image with a root mean square error of less than one meter. A nearest neighbor resampling algorithm was used, along with a first order mathematical transformation. Rubber sheeting was used only in areas where there was significant local relief within a field (i.e., Council Bluffs). All images were rectified to the Universal Transverse Mercator projection (using the appropriate longitudinal zone) with a NAD83 datum (virtually no difference from WGS84). The inventors have referenced each successive image to the first image taken, as opposed to referencing each image to a reference map or the like. However, this is considered to be a matter of choice and not critical to the successful operation of the invention.
Normalization
Once rectified to a map base, the multi-date images were processed through a computer model, in accordance with a computer program as disclosed in the attached Exhibit A, to normalize the data. Normalization helps account for sensor variation, changes in growing season, changes in sun angle between acquisitions, and changes in atmospheric condition during image acquisition (see section on "Image Normalization"). Basically, normalization enables temporal comparisons of the data. The normalization model included (at the beginning of the model) the computation of an NDVI for each image (see the section on "Vegetative Indices"). The resulting NDVI images were then normalized by the model. For the bare soil image, the red band was used; however, it was also normalized during the model execution mentioned above. As well, the model produced a normalized image of the yield monitor data. Additionally, the model constructed a new five band data file (termed layer stacking) with the following data layers (Figure 17 and Figure 18):
Band 1 - normalized red band of bare soil image
Band 2 - normalized NDVI of the V12 flight
Band 3 - normalized NDVI of the VT flight
Band 4 - normalized NDVI of the R4 flight
Band 5 - normalized yield monitor image
As shown in Figure 22, this model has three levels of data processing. The first level (A) of processing is the computation of the NDVI values from the raw imagery (2, 3, 4). The second level (B) is the process of normalization of the input data. This involves the temporary storage of the mean and standard deviation of the data sets. These values are then used to compute the normalized data set. The third and final step (C) involves the stacking of the normalized data sets spatially. This data set is now in a format that lends itself to the grouping method.
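The three levels reduce to a short pipeline. The sketch below is illustrative only; it assumes the single-band inputs have already been geocorrected to a common grid, and it reuses the hypothetical ndvi and normalize functions sketched earlier.

import numpy as np

def build_layer_stack(soil_red, v12, vt, r4, yield_img):
    # Level A: compute the NDVI for each in-season acquisition;
    # v12, vt, and r4 are assumed to be (nir, red) band pairs.
    layers = [soil_red, ndvi(*v12), ndvi(*vt), ndvi(*r4), yield_img]
    # Level B: normalize each layer independently (mean 0, std 1).
    # Level C: stack the normalized layers spatially into a five-band image.
    return np.stack([normalize(layer) for layer in layers], axis=-1)

# The result is a (rows, cols, 5) cube ready for ISODATA clustering.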
Clustering and Classification
The five band data file was then processed through an ISODATA clustering algorithm (see section on "Clustering and Classification"). The parameters for ISODATA were as follows:
Initial cluster axis was set as a principal axis with automatic boundary segmentation (similar to the first principal component)
Number of clusters was set to eight (optimum amount based on an in-house study)
Convergence was set to 95%
Number of iterations was set to 10
All pixels were used (i.e., the increments for both x and y were set to 1)
An output image was created using the green, red, and near infrared statistics to drive the blue, green, and red color guns, respectively
The resulting clusters were analyzed both spectrally (looking at spectral curves) and spatially (using the cluster map produced by the software) (Figure 19). By looking at both the spectral and spatial information (along with information on spectral separability, or Transformed Divergence; see the ERDAS Imagine Field Guide), the clusters were grouped into zones of similar vegetative progression over time. The generic formula for separability, along with an actual table of Transformed Divergence, is shown in Figure 20.
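For reference, a common form of the Transformed Divergence statistic (following the formulation in the ERDAS Imagine Field Guide cited above) can be computed from the cluster means and covariance matrices; this Python sketch is illustrative and is not the disclosed software.

import numpy as np

def transformed_divergence(mean_i, cov_i, mean_j, cov_j):
    # Divergence between two clusters from their mean vectors and
    # covariance matrices.
    inv_i, inv_j = np.linalg.inv(cov_i), np.linalg.inv(cov_j)
    dm = (mean_i - mean_j).reshape(-1, 1)
    div = 0.5 * np.trace((cov_i - cov_j) @ (inv_j - inv_i)) \
        + 0.5 * np.trace((inv_i + inv_j) @ dm @ dm.T)
    # Transformed Divergence rescales divergence onto a 0-2000 range;
    # values near 2000 indicate fully separable cluster pairs.
    return 2000.0 * (1.0 - np.exp(-div / 8.0))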
Crop Response Zone Classification
Once the clusters were analyzed and labeled, the raw normalized data were processed through a maximum likelihood classifier (see section on "Clustering and Classification"). Unlike the clustering algorithm, which simply uses a "minimum distance to the mean" computation, the maximum likelihood algorithm employs the use of the cluster mean and the standard deviation to determine the probability of correct categorization. Although at times there is little difference, major differences have been noted depending on the data. The following are the parameters set during the classification process:
The non-overlap rule was set to non-parametric
The overlap rule was set to parametric
The parametric rule was set to maximum likelihood
All chi-square values were set equal to each other for the a priori probability
No thresholding was used during the classification process
Crop Response Zone Curve Evaluation
By evaluating the spectral curves generated during the clustering process, one can begin to understand a bit about the crop response zone environment, and the story that is told for the crop during the growing season. Figure 21 shows both the classification map and the corresponding spectral curves. A quick analysis reveals some interesting trends. The following is a brief analysis of three zones.
Cluster #1 (red) - This area has below average organic matter (band 1), has poor vegetation on flight 2, very poor vegetation on flight 3, poor vegetation on flight 4, and ends up having the lowest overall yield for the field.
Cluster #4 (purple) - This cluster has above average organic matter, lower than average vegetation on flight 2, above average vegetation on flights 3 and 4, and still ends up with a below average yield. This is an area of the field that is susceptible to too much early season moisture. Even though the vegetation looks good on flights 3 and 4, the yield loss was already established by flight 2.
Cluster #6 (white) - This cluster (or crop response zone) has above-average organic matter and excellent vegetative health throughout the growing season. Its final yield is among the best in the field.
The grower can use this kind of information as feedback in making the relatively few decisions available to him to increase his yield. Heretofore, raw yield data was not very useful, for the reasons given. However, this data now becomes useful, even powerful, for helping the grower decide on strategies for different locations in his field. And, with the increasing sophistication of farm equipment and its GPS capability, the grower has the ability to tailor his farming activities for these various crop response zones located at different areas in his field. Thus, the present invention actually increases the usability of the more sophisticated farming equipment, and makes it more cost effective so that its increased expense can be justified through increased yields. The invention also provides a value added for a seed supplier in that, upon doing a crop response zone analysis of a grower's field, the seed which provides the best yield for each crop response zone can be separately identified for the grower, while other seed suppliers not having access to the crop response zone information would not know how to specify seed variety, quantity, etc., with the same kind of precision.
While the principal advantages and features have been exemplified in the context of the preferred embodiment, one of ordinary skill in the art would recognize that the invention is not so limited. There are various changes and modifications that would be obvious to one of ordinary skill in the art while keeping within the scope of the invention. For example, various specific mathematical techniques have been used for various steps in the methods disclosed herein. Other mathematical techniques could also be implemented without representing a departure from the invention. The preferred embodiment utilizes a particular vegetative index, but other vegetative indices could be used as well. Certain statistical parameters have been utilized in certain of the steps, but other parameters could possibly be used as well. Still other changes could be visualized by those of ordinary skill in the art, but the invention should be considered as being limited only by the scope of the claims and their equivalents.
EXHIBIT A
#
# set cell size for the model
#
SET CELLSIZE MIN;
#
# set window for the model
#
SET WINDOW UNION;
#
# set area of interest for the model
#
SET AOI "d:/7_1_1/711.aoi";
#
# declarations
#
Integer RASTER n1_f2_ndvi FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT UNSIGNED INTEGER "c:/temp/f2_ndvi.img";
FLOAT TABLE n3_Output;
FLOAT TABLE n6_Output;
Integer RASTER n7_f2_ndvi_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/f2_ndvi_std.img";
Integer RASTER n8_f3_ndvi FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT UNSIGNED INTEGER "c:/temp/f3_ndvi.img";
FLOAT TABLE n10_Output;
FLOAT TABLE n13_Output;
Integer RASTER n14_f3_ndvi_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/f3_ndvi_std.img";
Integer RASTER n15_f4_ndvi FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT UNSIGNED INTEGER "c:/temp/f4_ndvi.img";
FLOAT TABLE n17_Output;
FLOAT TABLE n20_Output;
Integer RASTER n21_f4_ndvi_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/f4_ndvi_std.img";
Integer RASTER n24_PROMPT_USER FILE OLD NEAREST NEIGHBOR AOI NONE "d:/7_1_1/aerial_imagery/7_1_1_1997_zones.img";
FLOAT TABLE n26_Output;
FLOAT TABLE n29_Output;
Integer RASTER n30_yie_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/yie_std.img";
Integer RASTER n31_PROMPT_USER FILE OLD NEAREST NEIGHBOR AOI NONE "d:/7_1_1/aerial_imagery/7_1_1_1997_zones.img";
Integer RASTER n41_PROMPT_USER FILE OLD NEAREST NEIGHBOR AOI NONE "d:/7_1_1/aerial_imagery/7_1_1_1997_zones.img";
Integer RASTER n49_PROMPT_USER FILE OLD NEAREST NEIGHBOR AOI NONE "d:/7_1_1/aerial_imagery/7_1_1_1997_zones.img";
Integer RASTER n76_ls_1234y_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/ls_1234y_std.img";
Integer RASTER n78_PROMPT_USER FILE OLD NEAREST NEIGHBOR AOI NONE "d:/7_1_1/aerial_imagery/7_1_1_1997_zones.img";
FLOAT TABLE n80_Output;
FLOAT TABLE n83_Output;
Integer RASTER n84_f1_om_std FILE DELETE_IF_EXISTING USEALL ATHEMATIC 8 BIT SIGNED INTEGER "c:/temp/f1_om_std.img";
#
# function definitions
#
# normalize the flight-1 organic matter layer: (value - mean) / standard deviation
n83_Output = GLOBAL STANDARD DEVIATION ( $n78_PROMPT_USER , IGNORE 0 ) ;
SHOW $n83_Output;
n80_Output = GLOBAL MEAN ( $n78_PROMPT_USER , IGNORE 0 ) ;
SHOW $n80_Output;
n84_f1_om_std = ( ( ( $n78_PROMPT_USER - $n80_Output ) / $n83_Output ) * 1 ) ;
# flight-4 NDVI: (band 4 - band 3) / (band 4 + band 3), stretched to 0-255
#define n53_memory Float ( $n49_PROMPT_USER(4) + $n49_PROMPT_USER(3) )
#define n51_memory Float ( $n49_PROMPT_USER(4) - $n49_PROMPT_USER(3) )
#define n55_memory Float ( EITHER 0 IF ( $n53_memory == 0.0 ) OR $n51_memory / $n53_memory OTHERWISE )
n15_f4_ndvi = ( $n55_memory - GLOBAL MIN ( $n55_memory ) ) / ( GLOBAL MAX ( $n55_memory ) - GLOBAL MIN ( $n55_memory ) ) * 255;
# flight-3 NDVI
#define n45_memory Float ( $n41_PROMPT_USER(4) + $n41_PROMPT_USER(3) )
#define n43_memory Float ( $n41_PROMPT_USER(4) - $n41_PROMPT_USER(3) )
#define n47_memory Float ( EITHER 0 IF ( $n45_memory == 0.0 ) OR $n43_memory / $n45_memory OTHERWISE )
n8_f3_ndvi = ( $n47_memory - GLOBAL MIN ( $n47_memory ) ) / ( GLOBAL MAX ( $n47_memory ) - GLOBAL MIN ( $n47_memory ) ) * 255;
# flight-2 NDVI
#define n35_memory Float ( $n31_PROMPT_USER(4) + $n31_PROMPT_USER(3) )
#define n33_memory Float ( $n31_PROMPT_USER(4) - $n31_PROMPT_USER(3) )
#define n39_memory Float ( EITHER 0 IF ( $n35_memory == 0.0 ) OR $n33_memory / $n35_memory OTHERWISE )
n1_f2_ndvi = ( $n39_memory - GLOBAL MIN ( $n39_memory ) ) / ( GLOBAL MAX ( $n39_memory ) - GLOBAL MIN ( $n39_memory ) ) * 255;
# normalize the yield layer
n29_Output = GLOBAL STANDARD DEVIATION ( $n24_PROMPT_USER , IGNORE 0 ) ;
SHOW $n29_Output;
n26_Output = GLOBAL MEAN ( $n24_PROMPT_USER , IGNORE 0 ) ;
SHOW $n26_Output;
n30_yie_std = ( ( ( $n24_PROMPT_USER - $n26_Output ) / $n29_Output ) * 1 ) ;
# normalize the flight-4 NDVI layer
n20_Output = GLOBAL STANDARD DEVIATION ( $n15_f4_ndvi , IGNORE 0 ) ;
SHOW $n20_Output;
n17_Output = GLOBAL MEAN ( $n15_f4_ndvi , IGNORE 0 ) ;
SHOW $n17_Output;
n21_f4_ndvi_std = ( ( ( $n15_f4_ndvi - $n17_Output ) / $n20_Output ) * 1 ) ;
# normalize the flight-3 NDVI layer
n13_Output = GLOBAL STANDARD DEVIATION ( $n8_f3_ndvi , IGNORE 0 ) ;
SHOW $n13_Output;
n10_Output = GLOBAL MEAN ( $n8_f3_ndvi , IGNORE 0 ) ;
SHOW $n10_Output;
n14_f3_ndvi_std = ( ( ( $n8_f3_ndvi - $n10_Output ) / $n13_Output ) * 1 ) ;
# normalize the flight-2 NDVI layer
n6_Output = GLOBAL STANDARD DEVIATION ( $n1_f2_ndvi , IGNORE 0 ) ;
SHOW $n6_Output;
n3_Output = GLOBAL MEAN ( $n1_f2_ndvi , IGNORE 0 ) ;
SHOW $n3_Output;
n7_f2_ndvi_std = ( ( ( $n1_f2_ndvi - $n3_Output ) / $n6_Output ) * 1 ) ;
# stack the five normalized layers into the final data set
n76_ls_1234y_std = STACKLAYERS ( $n84_f1_om_std , $n7_f2_ndvi_std , $n14_f3_ndvi_std , $n21_f4_ndvi_std , $n30_yie_std ) ;
QUIT;