Figure 1. Relation of Mineral Composition of Tissue to Growth. (Prevot and Ollagnier, 1956; Smith, 1962)
Difficulties have been encountered in the use and interpretation of plant analyses, although the quantitative association between absorbed nutrients and growth has been studied by many. Reliable interpretive data are lacking for a number of crops, particularly for plants during the initial stages of growth and for concentrations at or near toxicity levels. Initially, single concentration values were sought, but it became evident with continuing study that ranges in concentration would better describe the nutrient status of the plant. Prevot and Ollagnier (1956) and Smith (1962) have drawn a figure to represent the association between plant growth and nutrient concentration of a selected plant part (Fig. 1). Although this response curve shows a fairly large slope change in the deficiency range, Ulrich (1961) has obtained response curves in which the slope change in the deficiency range is extremely small (see Fig. 2).
Figure 2. Relation of Mineral Composition of Tissue to Growth. (Ulrich, 1961).
The nature of Ulrich's observed curve indicates several areas for application of plant analysis technique. With extreme deficiencies, element concentrations may be greater than those found in plants free of the deficiency. The range in concentration between deficiency (with visual symptoms) and the critical concentration (no visual symptoms) can be small. For some elements and plants, the techniques needed to detect these small changes in concentration have yet to be adequately defined. One suggested solution is to determine total plant element content (uptake) and thereby eliminate or minimize the dilution effect. However, this technique has several limitations. It is not applicable when dry-matter differences are large. It requires careful sampling and plant sample preparation, as the dry-matter content must be determined for the entire plant.
The same trend that was followed in soil testing is being pursued in plant analysis; that is, great efforts are being made to define the entire left-hand side (deficiency range) of the response curve shown in Fig. 2. From a practical standpoint and in light of the current use of the plant analysis technique, the limits of the sufficiency range are in far greater need of exact determination. Most plant analysis recommendations are not made on the basis of degree of deficiency or excess. Considerable efforts have been made to define the deficiency area of the response curve. However, by comparison, little has been done to define and pinpoint where toxicity occurs.
Some have based plant analysis interpretations on "critical" or "standard values." A critical value is that concentration below which deficiency occurs. Critical values have been widely published and used, although they have limited value since they only designate the lower end of the sufficiency range. Kenworthy (1961) developed an interpretative system for fruit trees based on "standard values." These values were determined from the analyses of large numbers of leaf samples collected from normal producing orchards. An interpretation is made by comparing an analysis to the standard value for that tree species. Standard values are single values and, therefore, have the same limitations as those for critical values.
Ranges in concentration have been published, giving the limits of nutrient classifications (for example, low, adequate, and high). Several references (Chapman, 1966, 1967; Neubert and others, 1969; Walsh and Beaton, 1973; and Reuter and Robinson, 1986) give the most comprehensive listings.
The effects of time of sampling, variety or hybrid, and environmental factors, such as soil moisture, temperature, and light quality and intensity, may significantly affect the relationship between nutrient concentration and plant response. Consequently, a defined sufficiency range may not apply to all situations or environments. Nutrient uptake and internal mobility, as well as dry-matter changes, can affect the nutrient concentrations in plant tissues. Concentration and dilution occur due to the difference between plant growth and nutrient absorption and the movement of nutrients within and between plant parts. Under normal growing conditions, nutrient absorption and plant growth closely parallel each other during most of the vegetative growth period. Exceptions occur during the very early growth period shortly after germination, after seed set, and at the beginning of senescence. However, if the normal rate of growth is interrupted, nutrient accumulation or dilution can occur.
Jones and Mederski (1964) observed that nutrient concentrations in soybean plants oscillated considerably. Analyses to determine the nutrient concentration in leaves, stems, and pods, as well as dry-matter yield, were made every third day during the entire growth cycle. When the total plant uptake (concentration times dry matter) was plotted versus time, the curves were fairly smooth. Therefore, much of the oscillation in nutrient concentrations was essentially due to concentration or dilution associated primarily with changes in dry-matter production. Thus, it is essential that the time of sampling, stage of growth, and character of growth prior to sampling be known and considered when interpreting a plant analysis result.
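The concentration-versus-uptake distinction above can be sketched numerically. The figures below are hypothetical sampling data, not from Jones and Mederski; they only illustrate how a smoothly rising total uptake (concentration times dry matter) can still produce an oscillating concentration when dry-matter production surges and lags.

```python
# Hypothetical sampling data at five successive sampling dates.
# Total uptake rises smoothly, but concentration (uptake / dry matter)
# oscillates as growth flushes dilute the absorbed nutrient.
dry_matter = [10.0, 18.0, 22.0, 35.0, 40.0]   # g dry matter per plant
uptake     = [0.30, 0.48, 0.62, 0.80, 0.92]   # g nutrient per plant

# Concentration as percent of dry matter at each sampling.
concentration = [100 * u / dm for u, dm in zip(uptake, dry_matter)]
print([round(c, 2) for c in concentration])
```

Even though uptake increases at every sampling, the computed concentrations dip after each rapid gain in dry matter; this is the dilution effect that makes stage of growth essential to interpretation.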
It has been observed that plants within the same species will vary in their ability to absorb nutrients (Gorsline and others, 1965; Munson, 1969). Similar observations have been made for cotton (Anderson and Harrison, 1970). At first glance, one may conclude that such differences complicate the plant analysis technique of relating nutrient concentration to plant growth sufficiently to invalidate its use. A similar opinion was probably expressed some years ago when it was discovered that the same soil test interpretation did not fit all soil types. Soil type is usually considered when making soil test interpretations. Accordingly, genotype may become a factor in the interpretation of a plant analysis.
Gorsline and others (1965) noted that the ability of a corn plant to absorb a nutrient is an inherited characteristic and can be genetically transferred. The characteristic is one of imparting a high nutrient-accumulating ability. This characteristic should not affect the interpretation of a plant analysis. It has been noted that analyses of leaf tissue of different varieties or hybrids that were responding differently to the same environment did describe correctly the plants' appearance. However, much more research is needed to properly evaluate the effect of genotype on the interpretation of plant analyses.
Interactions or the balance of the elements within the plant have been given considerable study (Bingham, 1963; de Wit, Dijkshoorn, and Noggle, 1963; Emmert, 1961). Clark (1970) found that the nutrient concentrations of corn plants varied substantially as one nutrient was varied from deficiency to near excess. However, until recently, little had been done to apply balance concepts to a practical system for interpreting plant analyses. The importance of these interactions as they relate to yield has been revealed in work by Peck, Walker, and Boone (1969) and Walker, Peck, and Carmer (1969). These techniques of evaluating plant analyses should add much to our knowledge of the association and interaction of nutrient concentrations on plant growth and yield.
An interpretation of a plant analysis at the Soil, Plant, and Water Laboratory is based on comparing the elemental concentration found against a sufficiency range. The concentration of each element analyzed is reported as less than, greater than, or within the sufficiency range. If soil test data and cultural practice information are supplied, an explanation for element concentrations outside the sufficiency range is given. Corrective treatments when required are also normally given.
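The comparison step described above can be sketched in a few lines of code. The sufficiency ranges below are hypothetical placeholders (actual ranges vary by crop, plant part, and growth stage and are not given in this section); only the below/within/above logic reflects the text.

```python
# Hypothetical sufficiency ranges (% of dry matter) for illustration only;
# real ranges depend on crop, plant part, and stage of growth.
SUFFICIENCY = {
    "N": (2.75, 3.50),
    "P": (0.25, 0.50),
    "K": (1.75, 2.75),
}

def interpret(element, concentration):
    """Report a concentration as below, within, or above the sufficiency range."""
    low, high = SUFFICIENCY[element]
    if concentration < low:
        return "below sufficiency range"
    if concentration > high:
        return "above sufficiency range"
    return "within sufficiency range"

print(interpret("K", 1.20))  # below sufficiency range
```

In practice the laboratory report pairs each out-of-range result with a probable cause and corrective treatment, which requires the soil test and cultural information mentioned above.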
The causes for a nutrient concentration to fall outside the sufficiency range are many and varied. Low or high soil test levels, low or high soil water pH, improper fertilization, soil compaction, nematodes, and climatic factors are common causes. For most crops and cropping situations, the nutrient concentration found in leaf or plant tissue more closely follows the soil test level and/or soil pH than amount of fertilizer applied. The one major exception is nitrogen. The utilization of a balanced lime and fertilizer program over a period of years will do more to maintain the proper nutrient balance in plants than any one specific lime or fertilizer treatment. As a general rule, a soil testing MEDIUM to HIGH in the essential plant nutrients will produce plants with elemental concentrations which will normally test within the sufficiency range.
Some soil-plant relationships occur commonly. Soil test P and plant P, and soil test K and plant K, are usually significantly and positively correlated, irrespective of other soil factors. Phosphorus uptake can also be affected by cool soil temperatures, waterlogged soil conditions, and extremely low soil pH.
Soil test Ca and plant Ca are usually positively related, but soil pH, fertilizer treatments, and climatic factors can have some effect on this relationship. As the soil pH increases, the correlation between soil test Ca and plant Ca decreases. Heavy applications of N and K fertilizer will tend to decrease the uptake of Ca.
Plant Mg can be affected by several factors. A decreasing soil pH and an increasing K soil test level can markedly reduce the uptake of Mg, irrespective of the Mg soil test level. The uptake of Mg decreases sharply when the soil-water pH drops below 5.4. This is why a Mg deficiency can be partially corrected by just increasing the soil-water pH by liming. When the soil test level (in pounds per acre) of K to Mg exceeds 4:1 or when the soil test level (in pounds per acre) of Ca:Mg exceeds 8:1, Mg uptake by some plants may be depressed. This is of primary importance with forages where greater ratios could lead to increased incidence of grass tetany. Therefore, with some crops, extra precautions should be taken to ensure that the proper balance of Mg to both K and Ca is maintained. As with Ca, the correlation between soil test Mg and plant Mg decreases as the soil-water pH increases.
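The ratio checks above (4:1 for K:Mg and 8:1 for Ca:Mg, both on a pounds-per-acre soil-test basis) can be expressed as a short screening routine. The thresholds come from the text; the example soil-test values and the function name are illustrative assumptions.

```python
# Screen soil-test values (lb/acre) for ratios that may depress Mg uptake.
# Thresholds from the text: K:Mg > 4:1 and Ca:Mg > 8:1.
def mg_uptake_warnings(k_lb, ca_lb, mg_lb):
    """Return a list of soil-test ratio warnings for Mg uptake."""
    warnings = []
    if k_lb / mg_lb > 4:
        warnings.append("K:Mg exceeds 4:1")
    if ca_lb / mg_lb > 8:
        warnings.append("Ca:Mg exceeds 8:1")
    return warnings

# Made-up example: 260 lb K, 900 lb Ca, 60 lb Mg per acre trips both checks.
print(mg_uptake_warnings(k_lb=260, ca_lb=900, mg_lb=60))
```

For forage crops, where wide ratios raise the risk of grass tetany, such a screen would flag fields needing Mg fertilization or liming with dolomitic limestone.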
The effect of soil pH on the availability of most of the micronutrients is well known. In general, as the soil pH increases, the availability and, therefore, the uptake of Cu, Fe, Mn, and Zn decreases. Also, as the organic matter content of the soil increases, the soil pH effect is intensified. The primary exception is Mo where availability tends to increase with increasing soil pH.
Boron deficiencies are due primarily to a lack of adequate B in the soil. The corrective treatment is to apply B fertilizer according to current recommendations. Excesses would only result from over-fertilization with B.
Copper deficiencies occur primarily on high organic matter soils and possibly on sandy soils which contain low amounts of indigenous Cu and which have pH values approaching 7.0. Excessive Cu plant levels could occur where large quantities of some animal manures, particularly poultry litter, have been applied over a prolonged period.
Iron availability and uptake is a complex subject, as many soil and plant factors can influence the Fe level in the plant. Deficiency may occur when the soil-water pH is near neutral and the soil is high in organic matter. Iron deficiency has been observed in centipede grass, azaleas, blueberries, camellias, pecan trees, some sorghums, a few soybean varieties, and pin oak trees. In pecans, high Zn in the trees is thought to be a contributing factor in inducing Fe deficiency. Under some soil and plant conditions, the only corrective treatment is to change varieties or try another tree.
Manganese availability is markedly influenced by soil-water pH, probably more so than any other micronutrient. Manganese toxicities can occur when the soil-water pH is less than 5.4 and deficiencies when the soil-water pH is greater than 6.3. For most Georgia soils, soil-water pH exerts the greatest influence on Mn availability to plants.
Molybdenum is an interesting element. Deficiencies are not easily detected by a plant analysis. The Mo requirement of legumes is high since the N-fixing bacteria require higher levels of this element than the plant itself. The normal corrective treatment is a seed treatment with Mo. Also, the Mo-related deficiency of poor N fixation is affected by soil-water pH. The response to Mo seed treatments for most legumes is most pronounced at low soil pH (5.2) and decreases as the soil-water pH increases. Therefore, maintenance of the proper soil pH will do much to eliminate the potential of a Mo deficiency.
Zinc availability is related to both soil-water pH and level of soil Zn. Zinc uptake normally decreases as the soil-water pH increases. However, soil test Zn is usually a good indicator of Zn availability. A Zn deficiency can be readily corrected by applying Zn according to current recommendations.
Aluminum is not an essential plant nutrient, but it can be a factor affecting plant growth. High Al levels in the plant (if not due to soil or dust contamination) are the result of either a very low soil-water pH (less than 4.8) or anaerobic soil conditions such as flooded or heavily compacted soils. Aluminum does not readily enter the plant; therefore, its presence in the plant in high concentrations indicates an extreme soil condition.
It is evident that the interpretation of a plant analysis and a corrective recommendation based on such an analysis can become a complex task requiring considerable skill on the part of the interpreter and sufficient knowledge of the site conditions. One of the common errors made by those submitting plant tissue for analysis is failure to supply the essential information needed to properly interpret the analysis and prescribe corrective treatments. A properly completed Plant Submission Form is an essential part of the submitted plant tissue. Without it, proper evaluation of a plant analysis result is impossible.