1. General METSAT Information

1–1. Remote Sensing

001. Radiative transfer

002. Three laws of radiative transfer

1–2. METSAT Systems

003. METSAT advantages

004. Satellite types and coverage

005. Types of weather METSAT imagery

006. SSM/I products

007. Animated satellite imagery (ASI)

Remote sensing is one of the most important sources of meteorological data available. Since the launch of the first meteorological satellite (METSAT) in 1960, over 50 METSATs have been launched. Satellite data not only encompasses imagery but also provides input to numerical forecasts and research projects. Satellites provide real-time data covering all areas of the globe. Obtaining maximum value from this data requires correct interpretation of METSAT imagery. This unit covers the transfer of energy from the earth to the satellite and METSAT systems.

1–1. Remote Sensing

An understanding of how energy is transferred from the earth to the satellite gives you the basis for understanding how the sensors on the spacecraft are designed to function.

001. Radiative transfer

Energy is exchanged in three ways: radiation, conduction, and convection. Energy transfer by radiation is distinguished from the other two forms because it doesn’t require a medium through which to pass. Convection is energy transfer by the movement of the medium itself. Conduction is energy transfer through contact between two media. Energy transfer by radiation is the dominant mechanism through which energy is transmitted between the earth and the rest of the universe.

Electromagnetic radiation is a collection of an infinite number of waves (at all wavelengths) known as the electromagnetic spectrum. Passive remote sensing devices on satellites measure the reflected or emitted electromagnetic radiation in the visible, near infrared, infrared, and microwave wavelengths. To understand radiative transfer, we need to know some important terms.

Black body

A black body is a theoretically perfect absorber and emitter of radiation. The theory states that a black body absorbs all the radiation that strikes it and, when in thermodynamic equilibrium, emits all the radiation it absorbs. There are no perfect black bodies.

Emissivity

This is the ratio of the radiation emitted by an object to the radiation emitted by a black body at the same frequency (or wavelength) and temperature. Since no real emitter is a perfect black body, the amount of radiation emitted is only a fraction of the black body’s emittance at the same wavelength and temperature. For example, if iron emits at 50 percent of the ability of a black body (100 percent, or 1.0) at the same wavelength and temperature, its emissivity at that wavelength is represented as 0.5.

Absorptivity

Absorptivity is the counterpart of emissivity. It’s the ratio of the radiation absorbed by an object to the radiation absorbed by a black body at the same wavelength and temperature. For objects in thermodynamic equilibrium, absorptivity equals emissivity.

Reflectivity

An object’s reflectivity is the ratio of the total amount of radiation reflected from the object to the total amount of incident radiation. For example, we see the moon because light from the sun is reflected by the moon to the earth. Two factors affecting the moon’s brightness are the absorption of energy by the moon’s surface and the scattering of energy by the earth’s atmosphere.

In METSAT imagery, the albedo of an object is a measure of its reflectivity. The albedo is the percentage of incident energy actually reflected. An object with a high albedo, such as clouds, reflects much of the light from the sun. An object with a low albedo, such as a forest, reflects little of the sun’s light.
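
As a quick illustration of the albedo calculation just described, here is a minimal Python sketch expressing albedo as the percentage of incident energy reflected; the sample values are illustrative, not measured reflectivities.

def albedo_percent(reflected, incident):
    """Albedo: the percentage of incident energy actually reflected."""
    return 100.0 * reflected / incident

# A bright cloud reflecting 80 of 100 incident units has a high albedo;
# a forest reflecting 12 of 100 units has a low albedo.
print(albedo_percent(80.0, 100.0))   # 80.0
print(albedo_percent(12.0, 100.0))   # 12.0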

Scattering

Scattering occurs when energy at a specific wavelength contacts an object about the same size as the wavelength of the incident radiation. The energy is scattered by objects in all directions.

Transmissivity

This is the ratio of energy that passes through an object to the total amount of energy received. Radiation that is not absorbed, reflected, or scattered by the object is transmitted through the object unhindered.

002. Three laws of radiative transfer

Understanding the laws governing radiative transfer is essential as the depiction of satellite data improves. The use of multispectral color composite imagery (two or three channels of satellite data displayed as one image) and composite imagery (the process of deleting part of an infrared (IR) image and replacing it with visible data) is increasing. Without an understanding of these laws, interpretation of new forms of METSAT imagery becomes more difficult.

Planck’s law

This law says the amount of radiation emitted by a black body at a given wavelength is determined by its temperature. Figure 1–1 shows examples for black bodies having temperatures of 7000°K, 6000°K, and 5000°K (typical temperature values for stars). Here, E is the irradiance (the rate of energy transfer) and the Greek letter lambda (λ) is the wavelength. These curves show that the higher the temperature of an object, the greater the amount of radiation emitted by the object.
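
To make the relationship concrete, here is a minimal Python sketch of Planck's law for black-body spectral exitance, using standard physical constants and the star-like temperatures plotted in figure 1–1.

import math

# Planck's law for black-body spectral exitance, E_lambda, in watts per square
# metre per metre of wavelength.
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_exitance(wavelength_m, temp_k):
    """Spectral exitance of a black body at one wavelength and temperature."""
    return (2.0 * math.pi * H * C**2 / wavelength_m**5 /
            (math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0))

wavelength = 0.5e-6  # 0.5 micron, in the visible
for temp in (5000.0, 6000.0, 7000.0):
    print(temp, planck_exitance(wavelength, temp))
# The hotter the black body, the greater the exitance at this (and every) wavelength.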

Wien’s displacement law

This law, which is a derivation of Planck’s law, says the wavelength of the maximum irradiance of a black body depends on its temperature. A very hot object emits its maximum amount of radiant energy at shorter wavelengths than a cooler object. Figure 1–2 illustrates Wien’s displacement law. The figure shows the wavelength of maximum irradiance for the hot solar surface is in the visible light portion of the electromagnetic spectrum and the wavelength of maximum irradiance for the cooler earth’s surface is in the IR portion.
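
A short Python sketch of Wien's displacement law follows: the wavelength of maximum irradiance is Wien's constant (about 2898 micron-kelvins) divided by the temperature, which reproduces the solar and terrestrial peaks shown in figure 1–2.

WIEN_B = 2898.0  # Wien's displacement constant, micron-kelvins

def peak_wavelength_um(temp_k):
    """Wavelength of maximum irradiance, in microns, for a black body."""
    return WIEN_B / temp_k

print(peak_wavelength_um(5780.0))  # ~0.50 micron: the solar peak, in the visible
print(peak_wavelength_um(255.0))   # ~11.4 microns: the terrestrial peak, in the IR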

Figure 1–2. Normalized black body irradiances for the sun (5780°K) and the earth (255°K).

Kirchhoff’s law

Kirchhoff’s law says that for objects in thermodynamic equilibrium (a steady temperature), absorption of radiant energy must equal emission of radiant energy. If an object receives more energy than it emits, the object warms; if the object emits more energy than it absorbs, it cools. Objects that are good emitters are also good absorbers and vice versa.

These three laws directly relate to METSAT imagery. METSAT sensors measure the emitted energy of all objects in their line of sight. The one exception is the microwave sensor. It measures the absence of energy at a specific wavelength to detect specific data. All substances have different emissive characteristics though some differences may be very slight. In the wavelengths used by the IR sensors of all the satellites, the emissive characteristics of land, water, and clouds are nearly the same. Given this, if the temperatures of all three were the same, they would all appear to be nearly the same gray shade on an IR image.

Since the emissive characteristics are nearly the same, it is the temperature differences between land, water, and clouds that determine how they appear on IR imagery. Since high clouds are cooler than low clouds, they appear whiter than the low clouds. Another example of the importance of temperature differences is the "black fog" phenomenon. Generally, the temperature differences between fog and land are not sufficient to distinguish them on IR imagery. But if an area of fog were to move over an area of land that was sufficiently cooler than the fog, the fog would appear darker than the adjacent land.

Self-Test Questions

After you complete these questions, you may check your answers at the end of the unit.

001. Radiative transfer

1. Match the energy transfer terms in column B with the descriptions in column A. Items in column B may be used once.

Column A

_____1. Occurs when energy at a specific wavelength contacts an object about the same size as the wavelength of the incident radiation.

_____2. The ratio of emitted radiation from an object to the emitted radiation from a black body at the same wavelength and temperature.

_____3. The ratio of absorbed radiation by an object to the absorbed radiation by a black body at the same wavelength and temperature.

_____4. The ratio of the total amount of radiation reflected from the object to the total amount of incident radiation.

_____5. The ratio of energy that passes through an object to the total amount of energy received.

_____6. A theoretically perfect absorber and emitter of radiation.

Column B

a. Emissivity.

b. Black body.

c. Scattering.

d. Absorptivity.

e. Reflectivity.

f. Transmissivity.

002. Three laws of radiative transfer

1. Explain Planck’s law.

2. Explain Wien’s displacement law.

3. Explain Kirchhoff’s law.

1–2. METSAT Systems

Besides the United States, four other nations or groups of nations currently have METSATs in orbit. The Commonwealth of Independent States (the former Soviet Union) has METEOR, a polar orbiting satellite; Japan has the Geostationary Meteorological Satellite (GMS); India has the Indian satellite (INSAT); and Europe has the meteorological satellite (METEOSAT). The latter two programs are also geostationary satellites. These satellite orbits are varied and complex and determine the type and area coverage of the imagery produced. The United States has three METSAT programs.

003. METSAT advantages

There are advantages and drawbacks to METSAT imagery and its interpretation. We discuss the drawbacks later. The following lists the advantages of METSAT imagery and its use as an analysis tool.

a. METSAT imagery provides observations more frequently than synoptic reports.

b. It provides data in areas lacking conventional data, such as over ocean and desert regions.

c. It also enhances resolution in areas that have an organized, dense synoptic network. Examples of these would be the US and European observation networks.

d. Animated looping allows systems to be put in motion. This allows you to see system motion and the interaction between different pressure systems. You also see interaction between weather systems of different scales.

e. A single METSAT image gives you a more complete idea of the vertical structure of the atmosphere than one or two products. You can see the low-, mid-, and upper-level features simultaneously. You can also determine how they relate to each other.

004. Satellite types and coverage

Let’s now discuss the characteristics of different satellite types and the coverages they provide. Understanding these different characteristics will help you in interpreting METSAT imagery.

Polar orbiting

Polar orbiting satellites orbit the globe from pole to pole at a slight angle to the poles. Figure 1–3 illustrates an example of the polar orbiting satellite’s inclination angle. In an ascending orbit, the satellite has a south-to-north orbit. In a descending orbit, it has a north-to-south orbit.

The inclination angle (98.7°) and the altitude of the satellite, approximately 472 nautical miles (nm), allow for global coverage every 12 hours. Polar orbiter imagery covers 1600nm across an 8-inch-wide picture. This is about 26.7° latitude width at the equator. Figure 1–4 shows a polar orbiting scan line. Within 30°N/S of the equator, you’ll receive two satellite passes every 12 hours at a given location. The closer a station is to the N/S poles, the more satellite passes it will receive in a 12-hour period.
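
The 26.7° figure follows from simple arithmetic, sketched below in Python, using roughly 60 nautical miles per degree of latitude at the equator.

swath_nm = 1600.0          # swath width of the polar orbiter imagery
nm_per_degree_lat = 60.0   # nautical miles per degree of latitude
print(swath_nm / nm_per_degree_lat)  # ~26.7 degrees of latitude at the equator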

Figure 1–4. Satellite scan line.

You’ll only receive this imagery for real-time analysis at direct readout sites (DRO) or across a fax machine. (A DRO can receive a direct transmission of the imagery from a satellite as it passes over the site.)

Air Force Global Weather Central (AFGWC) receives all the satellite data from the spacecraft and stores it on disk. This allows METSAT imagery from around the globe to be meshed by a computer and to form a regional, hemispheric, or global METSAT image. This imagery is sent to military and civilian operations.

Geosynchronous

This satellite is stationed over the equator at 19,312nm/35,800 kilometers (km). The satellite stays in position because of a balance between centrifugal force (CeF) and gravity (g); (g = CeF). The satellite orbits the earth at the same angular velocity as the rotating earth. Each satellite covers 140° of longitude and latitude, giving you 120° of useful cloud data (see fig. 1–5).
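
As a minimal sketch of that g = CeF balance, the Python below sets gravitational acceleration (GM/r²) equal to centrifugal acceleration (ω²r) and solves for the orbital radius, assuming standard values for the earth's gravitational parameter, radius, and sidereal day. It recovers an altitude close to the one quoted above.

import math

GM = 3.986e14            # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY_S = 86164.0 # seconds in one sidereal day
EARTH_RADIUS_KM = 6378.0

omega = 2.0 * math.pi / SIDEREAL_DAY_S      # Earth's angular velocity, rad/s
radius_m = (GM / omega**2) ** (1.0 / 3.0)   # radius where gravity balances CeF
altitude_km = radius_m / 1000.0 - EARTH_RADIUS_KM
print(altitude_km)          # ~35,786 km above the surface
print(altitude_km / 1.852)  # ~19,300 nautical miles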

Figure 1–5. GOES coverage.

There are three to five satellites positioned around the earth that provide worldwide coverage (see fig. 1–6). They provide continuous coverage of the same location on the earth 24 hours a day.

Figure 1–6. Satellites.

Differences

Polar orbiting satellites give you global coverage in a 12-hour period. This works well with our global military mission. You can also see different cloud and terrain features more clearly because of the better resolution of polar orbiting METSAT imagery.

The geostationary satellites give you continuous coverage of the same area over a 24-hour period. This allows you to loop the imagery to follow fronts, lows, severe weather, and many other cloud and non-cloud features.

Operational satellites

There are two types of operational satellites—polar orbiting and geosynchronous. The polar orbiting satellites consist of the Defense Meteorological Satellite Program (DMSP) and National Oceanic and Atmospheric Administration (NOAA) satellites. Geosynchronous satellites consist of geosynchronous operational environmental satellites (GOES)—USA, geostationary meteorological satellite (GMS)—Japan, meteorological satellite (METEOSAT)—European, and Indian satellite (INSAT)–India.

005. Types of weather METSAT imagery

The wavelengths used by METSAT sensors to produce METSAT imagery are part of the electromagnetic spectrum. Figure 1–7 shows the electromagnetic spectrum.

Figure 1–7. Electromagnetic spectrum.

Gray shade description

Gray shades are how we see cloud and non-cloud phenomena on the imagery. A computer assigns 256 gray shades to features on the imagery, ranging from a very dark gray to a very light gray. The eye can only distinguish 15 to 20 gray shades.

A gray shade is assigned a specific brightness or temperature value, depending on the type of imagery you are working with. Figure 1–8 shows the specific gray shade assigned to a particular feature.

On visual imagery, you are seeing the reflectivity of features that are converted to brightness values. With infrared imagery, you’re seeing the temperature of features that are converted to brightness values. On water vapor imagery, you’re seeing the amount of moisture sensed in a vertical layer that is then converted to a brightness value.
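
A minimal Python sketch of that conversion follows; it is not the operational processing, and the input ranges and inversion choice are assumptions made only to show how a measured value could map onto one of 256 gray shades.

def to_gray_shade(value, low, high, invert=False):
    """Linearly map a value in [low, high] to a gray shade 0 (dark) to 255 (light)."""
    frac = max(0.0, min(1.0, (value - low) / (high - low)))
    if invert:
        frac = 1.0 - frac           # inverted IR: cold comes out light
    return round(255 * frac)

# Visible: 90% reflectivity (bright cloud) versus 9% (water surface).
print(to_gray_shade(90, 0, 100), to_gray_shade(9, 0, 100))
# Inverted infrared: a -60 C cloud top comes out lighter than a +30 C surface.
print(to_gray_shade(-60, -90, 40, invert=True), to_gray_shade(30, -90, 40, invert=True))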

We use the contrast in gray shades to help us interpret cloud and non-cloud phenomena on METSAT imagery.

Figure 1–8. Gray shades.

Visible (VIS)

Visible METSAT imagery measures the reflected visible radiation or the brightness of the reflected sunlight with a wavelength of 0.4–0.74 microns (µm) (see fig. 1–9). Next, we list some factors that affect the brightness measured.

Figure 1–9. Visual imagery.

Illumination

Sun angle considerations are important because they affect your interpretation of clouds, cloud patterns, terrain features, and other atmospheric phenomena. With experience, taking these considerations into account becomes second nature.

Time of year/day

These determine how much sunlight you have at a certain location depending on the season (for example, winter versus summer, noon versus dusk).

Latitude

The equator receives approximately the same amount of sunlight day to day throughout the year. The farther north or south you look, the less light you have, depending on the time of the year or the time of day.

Cloud height

Higher clouds cast shadows on the lower-level clouds; the lower-level clouds are harder to see due to less illumination. Figure 1–10 shows an example of how cloud shadows may obscure lower clouds.

Figure 1–10. Cloud shadows.

Cloud characteristics

Water droplets are five times more reflective than ice crystals when we view them from a satellite. Ice crystals scatter more sunlight away from the sensor than water droplets do. A thicker or denser layer of ice crystals has the same reflectivity as a thinner layer of water droplets. Brightness increases with increasing density (concentration) of cloud particles (crystals or droplets; for example, fog and stratus versus cirrostratus). Brightness increases with increasing cloud thickness (depth; for example, cumulonimbus). Brightness also depends on the ability of a surface to reflect sunlight (albedo). The table below shows some different phenomena and their reflectivities.

Phenomenon                   Reflectivity
1. Large thunderstorm        92%
2. Fresh new snow            88%
3. Thick cirrostratus        74%
4. Thick stratocumulus       68%
5. White Sands, NM USA       60%
6. Snow, 3–7 days old        59%
7. Thin stratus              42%
8. Thin cirrostratus         32%
9. Sand, no foliage          27%
10. Sand and brushwood       17%
11. Coniferous forest        12%
12. Water surfaces           9%

Near infrared (NIR)

NIR measures the amount of reflected sunlight and emitted energy. The factors affecting the brightness in visual imagery have the same effects in the near IR. The NIR wavelength is 0.75 to 2.0µm. Because vegetation reflects best in these wavelengths, you can detect vegetation, terrain features, and lithometeors, such as haze and dust, better than at the visual wavelengths alone.

Most METSAT sensors are designed so the visual imagery is a combination of VIS and NIR wavelengths. This allows for good land and water contrast and provides easier identification of features on the imagery. This wavelength is also very sensitive to lunar radiation. The sensitivity allows us to get nighttime visual imagery from DMSP spacecraft when there is a quarter moon or more. NIR imagery identifies vegetation, terrain features, and lithometeors such as haze and dust.

Water vapor (WV)

Water vapor imagery measures the earth’s radiation at 6.7µm on GOES imagery and 5.7 to 7.1µm on METEOSAT imagery. This is the band of maximum water vapor absorption in the electromagnetic spectrum. The WV channel is also called the moisture channel.

WV imagery is based on the amount of energy at a specific wavelength that does not reach the sensor. The more energy that is blocked from the sensor at a specific wavelength, the lighter the gray shade.

GOES WV imagery shows the amount of moisture in the atmosphere that is absorbing energy at 6.7µm (see fig. 1–11). The brightness of the radiation observed depends on both the atmosphere’s water vapor content and its temperature. By measuring this radiation, the WV imagery shows the moist and dry areas. However, since absorption is highest between 610 and 240mb, middle and high-level moisture affects the sensor much more than low-level moisture. Very moist areas in the middle and upper levels that contain cirrus clouds appear white. Areas that are moist in the middle and upper levels with no cirrus clouds appear as a light gray. Dry areas in the mid- and upper-levels with no clouds appear as dark gray or nearly black, provided the low-level temperatures are warm. Even dry areas in the upper levels appear light gray if the surface temperatures are cold.

Moisture pattern

The moisture pattern is the result of vertical motion and moisture advection in the atmosphere. A moist region is normally associated with upward vertical motion and appears as a light gray shade. A drier region is normally associated with downward vertical motion and appears as a dark gray shade. You can see moisture on the WV imagery even though no clouds are evident on other METSAT imagery.

Figure 1–11. Water vapor imagery.

Advantages

WV imagery provides information where no mid- and upper-level clouds are present. You can more accurately locate circulation centers, jet streams, wind maximums, troughs, and ridges. The moisture pattern and the changes to it appear more fluid than the cloud features on other METSAT imagery.

Uses

Since the sensor responds best to the mid- and upper-levels, features at these levels are more easily distinguished than lower-level features. Some uses of WV imagery follow.

Studies show that 80 percent of the time, well-defined "plumes" in the water vapor imagery accompany extreme rainfall (5 inches or more within a 24-hour period). On the global scale to synoptic scale, the 6.7µm water vapor imagery reveals northward movements or surges of mid- to upper-level moisture from the tropics into the mid-latitudes. These surges, called water vapor plumes, are usually associated with large-scale circulations. As the plumes progress from the tropics northward into the US, they often interact with low-level moist, unstable air (represented by ridge axes of equivalent potential temperature) and mechanisms that produce upward vertical motion such as jet streaks. These interactions often result in flash floods. This was the case during the upper Midwest floods of the summer of 1993. Water vapor plumes persisted on the back side of the subtropical ridge (located over the eastern US). Jet streaks, associated with a near-continuous series of short-wave systems from the west, repeatedly interacted with the western and northern boundary of the persistent moisture plume.

Far infrared (FIR)

FIR measures the long-wave radiation of objects (clouds and ground) using a wavelength of 10.2 to 12.8µm. The amount and frequency of long-wave radiation depends on the temperature and emissivity of the object. Objects at different temperatures emit at different IR wavelengths. The computer converts the wavelengths received by the METSAT sensors to brightness values that represent temperatures. Shades of gray ranging from black to white are assigned to temperature values. In normal IR, hot temperatures are white and cold temperatures are black. Gray shades are easier to interpret on the inverted IR, where clouds appear white as they do on visible imagery.

FIR METSAT imagery is normally inverted so colder clouds appear white and hot surfaces appear black. The advantage of FIR imagery is that it allows 24-hour a day viewing. It does not depend on reflected sunlight for an image. This allows looping of the data 24 hours a day, enabling you to follow systems easily with continuity.

Factors affecting the measured temperature

Knowing the factors affecting the accuracy of the temperature being measured determines your confidence in the data. In this next area you’ll learn about the factors you need to consider.

Time of year/date

Account for diurnal and seasonal temperature fluctuations. These are most apparent in low-level features.

Cloud characteristics

Certain cloud characteristics affect what the sensor detects and, finally, what you interpret. One characteristic is the cloud particle type. Ice crystals are semitransparent to radiation, causing thin cirrus to appear as a medium gray shade.

Another characteristic is the cloud thickness (depth) and density of the cloud. They determine how much radiation from below is allowed through to the sensor. The thicker the cloud and denser the cloud droplet/ice crystals, the more accurate the cloud top temperature reading (for example, cirrus to cirrostratus). Increasing cloud droplet/ice crystal density reduces radiation from below.

Upper-level clouds are most apparent on unenhanced infrared imagery. Low-level clouds do not stand out as easily because there is little temperature contrast between the cloud and the earth’s surface. Enhanced IR (EIR) imagery can give you a temperature contrast for any level in the atmosphere, depending on the enhancement you use.

GOES legend

The GOES legend supplies information on the imagery. Refer to figure 1–12 while you read the following information.

Figure 1–12. GOES legend.

Enhancement

Enhancement is a series of gray shades corresponding to thermal values that provide a thermal contrast to features on the imagery. Enhancing an image is done on the ground by computers. Unenhanced imagery shows a transition from warm to cold by a uniform increase in brightness. This is a linear-scale relationship where you have one gray shade to one temperature value. Figure 1–13 shows the relationship between the different gray scales and the temperature.

Figure 1–13. Gray scales.

Thresholding

Thresholding is assigning a gray shade to a temperature range. A series of threshold values (alternating dark and light gray shades) on METSAT imagery is known as step contouring. This allows you to see deep, vertically developed clouds, such as cumulonimbus, more easily.
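
The Python sketch below illustrates the idea: every temperature within a range gets one gray shade, and alternating dark and light ranges produce the step contours that make deep convective tops stand out. The breakpoints are invented for illustration, not an actual GOES enhancement curve.

STEP_BREAKS = [  # (coldest temperature of the range in deg C, assigned gray shade 0-255)
    (-32, 80), (-42, 200), (-53, 60), (-59, 230), (-63, 40), (-80, 255),
]

def step_contour(cloud_top_temp_c):
    """Return the stepped gray shade for a cloud-top temperature."""
    shade = 120  # default mid-gray for temperatures warmer than the first break
    for break_temp, step_shade in STEP_BREAKS:
        if cloud_top_temp_c <= break_temp:
            shade = step_shade
    return shade

print(step_contour(-45))  # falls in the -42 to -53 range: a light step (200)
print(step_contour(-70))  # a colder convective top lands on a dark step (40)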

Brilliance inversion

A brilliance inversion occurs by assigning a range of temperatures to a range of gray shades. This process allows you to see more detail in the data. An example would be the change from altostratus to cirrostratus clouds.

Common GOES enhancement curves

There are several types of GOES enhancement curves available for forecasters to use. Many enhancement curves have been applied to polar orbiting imagery. Each curve has a specific purpose or use.

ZA

The ZA curve is closest to a straight linear curve (see fig. 1–14). It has a slight low/high linear enhancement, making it one of the most commonly used curves. WV imagery is also enhanced using the ZA curve identifier. Color enhancements should only be accomplished on this curve, if you have this capability.

EC

This is a cool season, general purpose curve (see fig. 1–15). Segment five enhances temperatures ranging from –13°C to –50°C that are normally associated with precipitation during the cool season. Segments six and seven depict convective cloud tops. Figure 1–16 compares the ZA and EC enhancement curves.

 

 

Figure 1–14. ZA imagery.

Figure 1–15. EC imagery.

Figure 1–16. ZA/EC enhancement curves.

MB

This good, all-purpose curve is most commonly used for convective activity (see fig. 1–17). Cirrus and cirrostratus are also easily detected on this curve. Segment three gives a good definition to mid-level clouds. Segments four through seven contour convective activities; segment eight allows a more gradual gray shade change to better define convective tops.

 

Figure 1–17. MB imagery.

 

JG

This wintertime curve is used to define water currents, low stratus, and coastal fog (see fig. 1–18). It can help identify the surface freezing line, where the temperature changes from –0.2°C to –0.7°C between segments two and three. This is useful for temperature forecasting and identifying the freezing level. The portion colder than freezing is identical to the MB curve. Figure 1–19 compares the MB and JG enhancement curves.

Figure 1–18. JG imagery.

Figure 1–19. MB/JG enhancement curves.

CC

This curve is designed for cloud interpretation in colder northern latitudes in the winter (see fig. 1–20). The CC curve makes it easier to identify low and mid clouds associated with colder air in the winter.

Figure 1–20. CC imagery.

HF

This enhancement curve is designed for west coast forecasters to enhance system cloud tops over the Pacific Ocean. Figure 1–21 compares the CC and HF enhancement curves.

Figure 1–21. CC/HF enhancement curves.

 

C1/C3/C6/C9

This is combined VIS and IR imagery. The imagery is visual to a certain temperature and then the computer switches over to an IR image. Normally, enhancement begins between –43°C and –54°C, depending on which enhancement curve you use. These curves help you identify cumulus, towering cumulus, and cumulonimbus development in the VIS portion while, simultaneously, you can find the approximate heights of convective cloud tops. These curves also aid in finding convective cloud tops embedded in nimbostratus clouds. Figure 1–22 shows an example of the C9 imagery. Figure 1–23 shows the C3 enhancement curve.

Viewing considerations

The viewing considerations you’ll learn about are very important to remember when you interpret METSAT imagery. Satellite imagery doesn’t lie to you but it can be misinterpreted if you forget the following considerations during the interpretation process.

Figure 1–22. C9 imagery.

Figure 1–23. C3 enhancement curve.

Resolution

Resolution is the smallest individual element a sensor can detect. This is designated on the imagery as a small square, known as a pixel. Figure 1–24 shows what pixels look like in a digital image.

Resolution is greatest at the subpoint and decreases in all directions away from this point. The subpoint is located directly beneath the satellite; on polar orbiting satellites, the series of subpoints is known as the subtrack. This designates the track of the satellite (see fig. 1–25).

 

Figure 1–25. Satellite track.

If the object/cloud is below the resolution of the METSAT sensor, the sensor averages the brightness/temperature of the object with the background. Individual elements are not seen and a compromise gray shade results. This causes the clouds to appear warmer and lower in height than they actually are.
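
A minimal sketch of that averaging follows, with made-up temperatures; real sensors average radiance rather than temperature, so the linear blend below only approximates the effect.

def pixel_temperature(cloud_temp_c, ground_temp_c, cloud_fraction):
    """Area-weighted pixel temperature for a pixel only partly filled by cloud."""
    return cloud_fraction * cloud_temp_c + (1.0 - cloud_fraction) * ground_temp_c

# A -50 C cloud element filling 30% of a pixel over +25 C ground reads about +2.5 C,
# far warmer (and therefore apparently lower) than the real cloud top.
print(pixel_temperature(-50.0, 25.0, 0.30))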

Figure 1–26 illustrates the different sector dimensions for some satellites and their resolutions at their respective subpoints.

Figure 1–26. Sector dimensions.

All GOES IR imagery has 7km resolution despite the number on the legend. All GOES WV sectors have 14km resolution.

Attenuation

Attenuation is any loss of energy due to absorption and scattering of IR radiation by atmospheric elements. IR wavelengths are absorbed by water vapor and carbon dioxide. This occurs only in the IR wavelengths. Figure 1–27 illustrates an example of attenuation in a clear atmosphere.

Figure 1–27. Attenuation in a clear atmosphere.

This process reduces the amount of energy reaching the METSAT sensor, so cloud tops appear higher and colder than they really are. How much attenuation occurs depends on the depth of moisture in the atmosphere and on the viewing angle, which determines the length of the path the energy travels through that moisture.

Figure 1–28. Attenuation geometry.

Figure 1–29. Attenuation—cloud levels.

Attenuation is greatest in the tropics, due to a deep layer of moisture, and at the edges of the earth disk (on geosynchronous), due to the oblique (shallow) viewing angle, which increases the amount of moisture through which the energy must propagate.

Contamination

Contamination is energy sensed by the satellite from two or more sources along the same line of sight. The sensor averages the brightness/temperature, thus giving you an inaccurate cloud-top temperature reading. Figure 1–30 shows how contamination can occur through thin cirrus clouds. Contamination can occur on VIS and IR imagery. The amount of contamination depends on the viewing angle, the cloud element spacing, the cloud layer thickness, and the vertical temperature profile.

Figure 1–30. Contamination through thin cirrus clouds.

Viewing angle

The viewing angle of the METSAT sensor determines the amount of contamination affecting the sensor. If the sensor is directly over thin cirrus, there is more contamination than if the sensor must look through it at an oblique angle. Remember, the more oblique the viewing angle, the more attenuation is a factor.

Spacing

The land or ocean surface is evident between cloud elements (for example, in a stratocumulus field). This causes the sensor to sense two sources of radiation within the same region and to average them into a generally darker gray shade, which makes the clouds appear lower than they really are.

Thickness

A thin upper-level cloud layer allows IR energy from the warm earth or from a low-level cloud deck to pass through it, so the thin layer appears warmer and lower than it actually is (for example, thin cirrus over water). On VIS imagery, such higher clouds are thin enough that you can see the earth’s surface or the lower clouds through them.

Vertical temperature profile

This can mislead your interpretation in certain situations. If there is a strong inversion and moisture is trapped below it, temperatures differ markedly in the vertical over a region. Black stratus is a cloud feature in which the stratus traps radiation from the earth. This warms the cloud top significantly while the surrounding clear regions continue to cool as expected. As a result, the region where the black stratus is located appears to be a clear region with clouds surrounding it. Surface observations show the reverse is true.

Foreshortening

Foreshortening is a loss of resolution caused by an oblique (shallow) viewing angle that results in a distortion near the edge of the picture on any type of METSAT imagery. The METSAT sensor is looking into the sides of the clouds, not the tops of the clouds, and is overestimating the cloud coverage. The clouds are compressed onto the METSAT image at the disk edge (for example, scattered to broken cloud layers appear overcast). This is most noticeable on geosynchronous METSAT imagery along the edges of the disk.

Cloud location errors can occur due to the oblique (shallow) viewing angle. Figure 1–31 shows a diagram of an actual cloud versus its apparent location. This varies with cloud height. The error isn’t significant with synoptic-scale systems. Remember to always adjust the clouds toward the satellite subpoint.

Figure 1–31. Diagram illustrating actual versus apparent cloud location.

Time response

This is the time it takes for the sensor to heat up, cool off, and react to large temperature changes. This can result in the displacement of cloud edges to the east on a GOES METSAT image and the underestimation of cloud-top heights. The GOES IR sensor can only change its reading a maximum of 26°C per pixel. For example, suppose the surface temperature just west of a violent thunderstorm is 32°C and a significant part of the cloud top near the west edge of the storm is –80°C. One pixel would show the correct 32°C. The next pixel would show a temperature no lower than 6°C. The following pixel would show –20°C, while the fourth pixel would be –46°C. The time lag can occasionally cause the reported cloud-top temperature to be inaccurate.
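
Here is a short Python sketch of that ramp-limiting behavior. The 26°C-per-pixel limit and the 32°C/–80°C temperatures come from the example above; the scan itself is idealized.

MAX_STEP_C = 26.0  # maximum change in the GOES IR reading from one pixel to the next

def ramp_limited_scan(true_temps_c):
    """Simulate the reported pixel temperatures across a sharp temperature edge."""
    reported = [true_temps_c[0]]
    for true_temp in true_temps_c[1:]:
        change = max(-MAX_STEP_C, min(MAX_STEP_C, true_temp - reported[-1]))
        reported.append(reported[-1] + change)
    return reported

# Warm ground followed by a -80 C thunderstorm top:
print(ramp_limited_scan([32.0, -80.0, -80.0, -80.0, -80.0, -80.0]))
# -> [32.0, 6.0, -20.0, -46.0, -72.0, -80.0], matching the pixel values above.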

Gridding errors

Occasionally the geographic grid, whether it is manually or computer-produced, is mispositioned. There are two ways to compensate for this. One is to develop a master set of clear overlays to lay over the picture so that you have the correct depiction. The other is to identify geographic features, such as lakes, rivers, coastlines, or mountains, on the METSAT image and use them to reposition the grid.

Sun angle

A low sun angle enhances the cloud-top texture due to shadows. This is seen only on the VIS imagery. Visual image pictures early or late in the day are best for defining clouds, cloud patterns, and cloud layers. Considering the sun angle is your most important viewing consideration with visual METSAT imagery interpretation.

Latitude

Typically, temperatures decrease for clouds and features of the same level as they get closer to the poles. For example, fog in South Carolina and Georgia has a cloud-top temperature of 15°C. Fog west of Lake Winnipeg has a cloud-top temperature of 3°C. The gray shade for the fog west of Lake Winnipeg is lighter than the fog in South Carolina and Georgia.

Other considerations

Other considerations are the working condition of the satellite equipment, spacecraft problems, and communication line problems between the satellite center and your station. These problems may cause dropouts on the imagery or streaking on the imagery.

Special sensor microwave/imagery (SSM/I)

SSM/I is a joint Navy/Air Force operational sensor that measures critical atmospheric, oceanographic, and land parameters on a global scale by passive sensing of microwave radiation. The first operational sensor was launched in June 1987 on the DMSP satellite F8. The second sensor was launched on another DMSP satellite in December 1990. These were the first two of seven instruments scheduled to be launched over a 20-year period. The SSM/I has monitored the development and course of 75 percent of the tropical storms and cyclones that have occurred since it was launched. Its global, day-night, all-weather surveillance provides valuable, cost effective data for precisely locating storms and for determining their physical characteristics. Because the microwave radiation from storms can penetrate the dense overlying cirrus clouds with very little attenuation, it reveals physical details not always depicted by visible and infrared images, including the eye of the vortex and the structure of deeply convective regions.

Terms

To understand SSM/I, you must become familiar with some common terminology.

Polarization

Polarization is a term we use to describe the orientation of the electric and magnetic components of electromagnetic radiation waves. When the components are oriented more in one particular direction than in any other, or completely in one direction, the radiation is said to be polarized.

Algorithm

An algorithm is a procedure for solving a mathematical problem; basically, it is a formula.

Brightness temperature

Brightness temperature is the temperature an object/surface appears to have when we measure the intensity of its emitted radiation at a particular frequency/wavelength. It differs from the actual physical temperature according to the emissivity of the object (scale 0–1). Objects with low emissivity (calm water) appear colder than they really are, while objects with high emissivity appear at nearly their actual temperature. For a theoretical black body (perfect emitter/absorber), the brightness temperature would equal the actual temperature of the object at all wavelengths.
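
A hedged sketch of the idea in Python: at microwave frequencies the Rayleigh-Jeans approximation (an assumption, not stated in the text) gives brightness temperature as roughly emissivity times physical temperature, ignoring reflected sky radiation.

def brightness_temperature_k(emissivity, physical_temp_k):
    """Approximate microwave brightness temperature (Rayleigh-Jeans limit)."""
    return emissivity * physical_temp_k

# Calm water (emissivity ~0.5) at 290 K looks like ~145 K to the sensor;
# dry soil (emissivity ~0.92) at the same temperature looks like ~267 K.
print(brightness_temperature_k(0.5, 290.0))
print(brightness_temperature_k(0.92, 290.0))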

Polarization difference

Polarization difference is the arithmetic difference in temperature between the horizontal and vertical channels at any one frequency.

Sensor description

SSM/I is a satellite-borne sensor. It passively detects emitted and reflected microwave radiation at four frequencies/wavelengths, 19.3, 22.2, 37.0, and 85.5 Gigahertz (GHz). The 19.3, 37, and 85.5GHz frequencies each have two channels (one horizontal, one vertical). The 22.2GHz frequency is measured by only one channel since it is primarily used to measure water vapor, whose signal is unpolarized. A total of seven channels are sensed.

Using radiation laws, the sensed microwave energy, initially recorded as an electric voltage in each channel, is converted by algorithm to a "brightness temperature." By combining the brightness temperatures of different channels, we can derive other weather and land parameters by using statistically based algorithms.
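
The Python sketch below lays out the seven channels just described and the polarization difference at one frequency, taken here as vertical minus horizontal; the brightness temperatures are invented illustrative values, not real SSM/I data.

SSMI_CHANNELS = [  # (frequency in GHz, polarization)
    (19.3, "V"), (19.3, "H"),
    (22.2, "V"),               # water vapor channel: single, unpolarized measurement
    (37.0, "V"), (37.0, "H"),
    (85.5, "V"), (85.5, "H"),
]

def polarization_difference_k(tb_vertical_k, tb_horizontal_k):
    """Difference between the vertical and horizontal brightness temperatures."""
    return tb_vertical_k - tb_horizontal_k

print(len(SSMI_CHANNELS))                       # 7 channels sensed in total
print(polarization_difference_k(210.0, 150.0))  # a strongly polarized scene
print(polarization_difference_k(270.0, 265.0))  # a weakly polarized scene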

Advantages of SSM/I

Microwaves penetrate the clouds with little or no attenuation. The sensing is independent of solar illumination, as it is with IR. SSM/I provides information complementary to that available in the visible and IR spectral regions. One good example is dense ice clouds that completely obscure the ground, both visually and at IR frequencies, and are almost completely transparent to microwaves. Operationally this means we can sense certain surface weather parameters day and night and in cloudy or clear weather.

Orbital characteristics

The satellite orbits at a distance of 833km above the earth in a circular, sun-synchronous, near-polar orbiting path and is inclined at 98.8°. Figure 1–32 shows the SSM/I scan geometry. The orbital period is 102 minutes, giving 14.1 revolutions per day. The satellite track moves at 6.58km/sec along the earth’s surface. These orbital parameters were chosen as a compromise between providing the best global coverage (high orbit is better) at the highest resolution (low orbit is better).

Figure 1–32. SSM/I scan geometry.

Sensor scanning geometry/resolution

The reflector of the SSM/I is positioned at an angle of 45° to the SSM/I spin axis. This angle is called the cone angle because the beam of the antenna sweeps out a 45° cone around the spacecraft. SSM/I’s conical scan "sees" the ground at the same angle of incidence throughout the scan. This is beneficial, since the amount of polarization in a sensed pixel depends strongly on the incidence angle. Also, the field of view/resolution remains the same throughout the scan. The scan width at the earth’s surface is 1400km. Gaps in the coverage occur between 30°N and 30°S latitude (see fig. 1–33). Now, with two SSM/I sensors in orbit, there is a 99 percent probability of viewing a storm in the tropics at least once a day. The resolution of the five lower channels is greater than 25km, while sensor resolution at 85GHz is 12.5km.

Figure 1–33. The SSM/I swath/coverage in 24 hours.

006. SSM/I products

The radiation emitted by the earth is predominately in the infrared and microwave portions of the spectrum. The intensity of the radiation is a function of frequency, polarization, the incidence angle of observation, the emissivity of the scene, and the transmission through, and radiation by, the atmosphere. Since the atmosphere is more absorptive at infrared than at microwave frequencies, a microwave sensor receives a greater amount of the radiation from the surface and lower atmosphere. The SSM/I, operating at the frequencies listed above, can sense, by means of emitted and reflected radiation, not only the low-level clouds and water content, but also the wind-roughened sea surface and the type of land below the clouds. The emissivity of the surface is related to its physical properties; therefore, we can make conclusions about those properties.

The SSM/I retrieves information on cloud concentration, precipitation, cloud liquid water, humidity, and marine wind for meteorologists and naval operations. The SSM/I detects not only storms but also sea ice, outlining their boundaries for the safe routing of ships, and measures land parameters for geological, agricultural, and military purposes. In just one 13-minute overpass of the tropical zone, for example, it can sense the surface for any of these potential purposes in over 26,000 locations, day or night, in any kind of weather. Within 1–4 hours, receiving terminals around the world, on land or sea, can access the information.

SSM/I products are available in two forms, Sensor Data Records (SDR) and Environmental Data Records (EDR). SDRs are corrected brightness temperatures that comprise the basic raw data the SSM/I is sensing. EDRs are derived from SDRs and contain environmental parameters directly usable by meteorologists or oceanographers. Parameters that can be interpreted from SSM/I include: precipitation intensity of storms over land and water (whether tropical, extratropical, or polar), cloud concentration, cloud water content, and water vapor over the ocean. Also, parameters such as ocean surface wind speed, sea-ice concentration, ice/water boundaries, differentiation between new, first-year, and multi-year ice, land-surface temperatures, snow-water content, and soil moisture can be determined.

EDR generation

Processing of SDRs into EDRs depends upon the surface type. There are three processing paths possible for detection: land, ocean, and ice parameters. The table below shows which parameters are calculated for each surface type. Scenes tagged as coasts are not processed since mixed land/water footprints are difficult to interpret. The value of a particular EDR is usually calculated as a combination of the brightness temperatures (SDRs) of specific channels.

Surface type              EDRs to calculate
Ocean                     Rain rate, cloud water content, surface wind, water vapor, rain flag
Ice                       Ice concentration, ice age, edge flag
Rain over soil            Rain rate
Rain over vegetation      Rain rate
Water/land mix            None
Glacier                   None
Vegetation                Surface temperature
Moist soils               Surface temperature, soil moisture
Agricultural and range    Surface temperature
Dry soil, desert          Surface temperature
Coast                     None

Receipt of SSM/I data

Supercomputers process the data at two central locations, Fleet Numerical Meteorology and Oceanography Center (FNMOC) and Air Force Global Weather Central (AFGWC). The data is displayed on the Satellite Data Handling System (SDHS) at AFGWC while the Navy’s Naval Oceanographic Data Display System (NODDS) displays finished products as transmitted from FNMOC. The Navy’s Tactical Environmental Support System (TESS[3]) can also display SSM/I imagery.

Tactical sites acquire raw SSM/I data (imagery) through MARK 3 or TMQ–29 ground stations for their local area in real time as the DMSP satellite passes overhead. Data can be received only when the DMSP is above the local horizon, which lasts, at most, 17 minutes of a 101-minute orbit. Processing is done on a dedicated system known as the Mission Sensor Tactical Imaging Computer (MISTIC). MISTIC stores the SDR and EDR products in databases. Images are displayed and manipulated using the software on MISTIC. This software allows the user to enhance features by modifying the image or by providing information not readily apparent from the raw image. Data can also be stored and exchanged between MISTIC sites via modem or floppy disk.

Data accuracy

Accuracy varies based on the type of data. Brightness temperatures are accurate within 1° Kelvin (K) for SDRs. EDR accuracy depends on the parameter used. For geographic locations, the pixels are within 12km resolution.

SDR interpretation

The most primitive form of SSM/I data likely to be encountered by operational personnel is the SDR. The 19 and 37GHz channels are most useful for viewing surface phenomena. The 22.2GHz channel is for viewing atmospheric water vapor, while the 85GHz channel is for viewing rain and clouds.

Land parameters

During the day, the land surfaces of earth, because of their relatively low albedo, receive a great deal of energy from solar radiation. This energy is turned into heat and is reradiated at lower frequencies (infrared and microwave).

When it is dry, bare soil has a high emissivity (0.9–0.95) at microwave frequencies, and it is nearly constant with frequency, so all SSM/I channels record a similar brightness temperature. Emission from dry soil is generally highly polarized (the amount of vertically polarized energy is greater than the amount of horizontally polarized energy).

Wet soil has a dramatically different radiative characteristic and its appearance is different in the imagery. Thoroughly wet soil has an emissivity of about 0.6 and significantly lower brightness temperatures than the same soil in a dry condition. NOTE: The lower the emissivity is, the lower the brightness temperature will be.

Vegetation

In SSM/I imagery, vegetation appears as a combination of modified radiation from the underlying soil and emissions from the vegetation canopy itself. Generally, plants have a higher emissivity (and therefore higher brightness temperatures) than soil. Therefore, vegetation appears warmer than the surrounding ground.

Snow

Snow appears much colder (lower brightness temperatures) than bare soil and, up to a limit, gets even colder as more snow is added. It also reduces the polarization difference of the soil. As with dry soil, the situation changes dramatically when the snow becomes wet. Wet snow appears warmer than dry snow, as well as having a higher polarization difference. As a snowpack melts and refreezes repeatedly, it appears increasingly colder.

Ocean parameters

The emissivity of open calm sea water is relatively low, ranging from about 0.4 to 0.6. Due to the lower emissivity of water relative to land, land/water boundaries stand out dramatically. Salinity of the water has little effect on emissivity, so fresh and salt water bodies at the same temperature appear about the same. The result of these properties is that calm open ocean appears as a cool, uniform background. The presence of clouds, rain, water vapor, or wind changes the signal substantially; this allows us to make accurate measurements of these phenomena.

Using radiometric data, sea ice is classified into three categories: new ice (≤30cm); first-year (FY) ice (30cm–1 meter), which has yet to undergo the annual process in which upper layers melt and refreeze; and multi-year (MY) ice (>1 meter), which has undergone at least one melt cycle. Both new and FY ice have high emissivities (0.9–0.95). Since open water has an emissivity of about 0.5, the ice appears much warmer than the surrounding ocean. The emissivity of MY ice is slightly lower (0.8–0.9) than FY ice. Thus, MY ice appears cooler than the FY ice. Snow-covered ice appears warmer than ice with no snow.

Atmospheric parameters

SSM/I looks at the following atmospheric phenomena: water vapor, clouds, rain, and wind.

Water vapor

We measure water vapor using the 22.2GHz channel. At this frequency, WV appears warmer than the ocean background. The ocean, because of its uniformity, provides a superior background to land for identifying WV concentrations. Rain or a large amount of liquid cloud water alters the signal, making measurement of WV impossible.

Clouds

Clouds appear in SSM/I imagery as the result of a complex combination of emission, absorption, and scattering. The SSM/I responds to the amount of liquid water in the cloud, not to its thickness.

High-altitude clouds composed of small ice crystals (that is, cirrus) are virtually invisible at microwave frequencies. This is a valuable characteristic when we observe mature tropical storms having a cirrus overcast. Low and mid-altitude clouds appear significantly warmer than the background. Over the ocean, emissivities range from 0.4 to 0.6.

Detection of clouds over land is very difficult, particularly low-altitude clouds. Clouds with extensive vertical development or layers of high liquid water content appear much cooler than the underlying land.

Rain

If rainfall is light and the storm is only mildly convective (that is, nimbostratus), brightness temperatures increase with increasing rainfall rate. In convective storms where a lot of ice is found in the upper regions, scattering plays a major role (especially at 85GHz). In these storms the higher the rain rate, the lower are the brightness temperatures. In non-convective storms, it is sometimes difficult to distinguish rain from clouds. We can overcome this by using several frequencies. As with clouds, rain over land is much more difficult to measure.

Wind

Wind speeds above 15 knots roughen the surface and produce sea foam. Both effects increase the emissivity of the sea surface. Small waves on the surface, known as capillary waves, cause the greatest change in emissivity since they have about the same wavelength as microwaves. The effect from capillary waves saturates at 25–30 knots. Above these speeds the effect of foam is dominant, which under the right conditions can have an emissivity near 1.0. The horizontally polarized channels are more sensitive to the wind speed signal. Wind speed values are not accurate in areas of heavy rain. Even light rain degrades the signal. Degradation of the signal is determined on the 37GHz channels.

EDR interpretation

This section discusses each of the SSM/I-derived environmental products.

Integrated water vapor content (WV)

This EDR, also called oceanic total precipitable water, is a measurement of the amount of water vapor in a column extending from the top of the atmosphere to the surface along the sensor-to-ground path. WV is calculated in kg/m2 and ranges from 0 to 80 kg/m2, in levels of 0.5 kg/m2. WV is calculated only over points classed as water. Furthermore, if the rain rate or cloud water content values exceed preset limits, WV is not calculated. Generally, WV values are reliable except in areas of precipitation. Precipitation increases WV error dramatically since the square of the 22V channel temperature is used in the algorithm.

Sea-ice concentration (IC) and ice age (IA)

IC is a fraction of ocean area covered by ice. Operational experience has indicated that when real IC values are low, calculated IC values are biased too high. As a result, one can readily identify the ice edge, but specific concentrations near the edge are unreliable. The distinction between FY and MY ice types has been generally reliable.

Surface types

EDRs require a determination of the surface characteristics before calculation of the parameter. The current algorithm recognizes 13 surface types. The process of determining the surface type involves applying a series of "if-then-else" tests using combinations of brightness temperatures. The algorithms discriminate quite well between ocean, land, and sea ice. Over land, identification of desert areas is reasonably accurate, but sometimes snow appears where none exists. High terrain can be misclassified as wet soil or rain over vegetation.

Soil moisture (SM)

The SM is a measure of the approximate amount of recently deposited precipitation in the soil. Army ground units are interested in this data for determining trafficability estimates. The soil moisture EDR values range from 0mm to 150mm. Soil moisture estimation is made difficult by several factors:

  1. Vegetation or roughness of the soil can confuse the signal.
  2. Clouds with high water content can contaminate the signal.
  3. Penetration into the soil is less than 1mm of depth. This thin layer may not represent what is below it.

Rain rate (RR)

Rain rate EDR is measured in mm/hr. Small nonconvective rainstorms can be detected only over the ocean, while large convective storms with sufficient vertical development (for ice particles to form) can be detected anywhere. Generally, the EDR rain rates are underestimated, especially in tropical storms.

Rain flag (RF)

The flag is a dimensionless quantity ranging from 0 to 3 and provides a quality check on the accuracy of wind speed EDR estimates. The rain flag denotes the accuracy of the wind speed measurement as follows:

Rain Flag    Wind Accuracy
0            < 2m/s
1            2–5m/s
2            5–10m/s
3            > 10m/s

The rain flag is calculated based on a series of tests of SSM/I channels and polarization differences against fixed values.

Ocean surface wind speed (WS)

Wind speed is calculated only over the ocean and values are in units of meters per second (1m/s is approximately 2 knots), up to 29m/s. WS is not calculated in areas of heavy rain and is significantly degraded by other atmospheric phenomena. The presence of rain or cloud water results in WS values higher than they actually are. Thus, a WS value sensed in a dry atmosphere is much more reliable than one sensed in a rainy atmosphere.

Cloud water content (CW)

The cloud water content EDR measures the integrated total cloud water in a footprint in units of kg/m2. Values range from 0 to 12.6kg/m2 and are quantified in levels of 0.05kg/m2. In practice, values more than 10.6kg/m2 have not been seen.

Land surface temperature (ST)

The land surface temperature EDR is an estimate of the actual ground temperature. Surface temperature values are expressed in Kelvin with a range from 240 to 340°K. Indications are that ST values of desert areas are biased high, by about 12°K in Saudi Arabia.

SSM/I observations of tropical storms

You can use the SSM/I imagery of tropical storms in precisely locating the eye and in outlining near-gale and gale-force winds. Along with all the validated SSM/I retrievals, this imagery can be obtained day or night, under any kind of cloud cover.

Detection

The microwave radiation penetrates the dense overlying cirrus clouds, revealing physical details not usually detectable with visible and IR images. The eye of the vortex and the structure of deeply convective regions are two features that are better detected using SSM/I.

Accuracy of storm fixes is about the same as with aircraft reconnaissance. Accuracy of storm fixes at sea is better than with IR and visual images. Near-gale and gale-force winds (30 knots) are outlined.

Tropical storm tracking

Traditionally, storm tracking has been done using both WC-130s and satellite imagery (visual and IR). Satellite views were limited to top and middle levels of a storm. Field work by Joint Typhoon Warning Center (JTWC) indicates SSM/I products increase the precision of tracking to a level approaching that of all previous techniques combined. A primary factor in the precision of tracking is the ability to locate the center of the eye at the surface. The 85GHz channels reveal both the eye and windstreaming in the planetary boundary layer (PBL).

SSM/I provides information previously unavailable from meteorological satellites. Atmospheric, land, and oceanic data, available in EDR and SDR formats, provide valuable information. In the tropics the data is useful in determining winds at the ocean surface and with tracking tropical storms.

007. Animated satellite imagery (ASI)

Weather forecasters feel most comfortable preparing weather forecasts using AWDS products. The easiest way to understand satellite imagery is to infer how weather systems on satellite imagery would look on weather products. The problem is that cloud patterns on satellite imagery develop due to the movement of air parcels with respect to one another. Let’s look at how a rectangular cloud mass at 500mb changes shape in various wind fields in figure 1–34.

Figure 1–34. Rectangular cloud mass at 500mb.

Although the center of the cloud mass moves differently in each case, the cloud pattern maintains the same shape. Since the shape of the cloud depends on the motion of air parcels with respect to one another, the cloud "looks" the same in each case. The geographical center, however, depends on the motion of the cloud parcels with respect to the earth and is different in each case.

Principal reference frames

In viewing satellite imagery, we need to understand the changing cloud shapes and how they relate to the development of weather systems.

Earth-based reference frame

This reference frame’s origin (center) remains fixed at a geographical location on the surface of the earth. Observed winds and motion are viewed in the earth perspective. Wind and cloud movement is called earth relative. The winds reported on weather products are earth relative; they are plotted with respect to a fixed geographical location.

System-based reference frame

This reference frame’s origin (center) moves with the center of a weather system. Observed winds and motion are viewed in the system perspective. Wind and cloud movement is called system relative. This reference frame explains the evolution of cloud patterns over time. To fully understand the evolution of cloud systems on satellite imagery, we must understand the relationship between earth-based and system-based reference frames.

The relationship between the two reference frames

Consider the following situation:

The surface wind is from the northeast at 20 knots. Obviously, observers standing outside facing northeast would feel 20 knots of wind on their faces.

What if those observers were in a convertible driving toward the northeast at 20 knots? They would now feel 40 knots of wind on their faces. In other words, the measured wind depends on where the anemometer is placed. Here the moving car represents the system perspective and the fixed ground represents the earth perspective. The two observations are related:

Wind(car) = Wind(earth) - Speed of car

NOTE: Remember this is a vector equation. In this example, the system perspective wind is stronger than the earth perspective wind. In more general terms:

System perspective winds = Earth perspective winds - Movement of system
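To make the bookkeeping concrete, the sketch below (illustrative only, not from the text) treats the winds as 2-D vectors and recovers the convertible example: a 20-knot wind from the northeast, measured from a car driving toward the northeast at 20 knots, appears as a 40-knot wind.

```python
import math

def from_direction(direction_from_deg, speed):
    """Vector (east, north) of air moving at `speed` FROM the given compass direction."""
    to_rad = math.radians((direction_from_deg + 180) % 360)   # direction of motion
    return (speed * math.sin(to_rad), speed * math.cos(to_rad))

def toward_direction(direction_to_deg, speed):
    """Velocity vector (east, north) of a platform moving TOWARD a compass direction."""
    rad = math.radians(direction_to_deg)
    return (speed * math.sin(rad), speed * math.cos(rad))

def system_relative(wind_earth, system_motion):
    """System perspective wind = earth perspective wind - movement of system."""
    return (wind_earth[0] - system_motion[0], wind_earth[1] - system_motion[1])

wind = from_direction(45, 20)        # 20-kt wind from the northeast
car = toward_direction(45, 20)       # convertible driving toward the northeast at 20 kt
rel = system_relative(wind, car)
print(round(math.hypot(*rel), 1))    # 40.0 -> the observers feel a 40-kt wind
```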

Kinematics

This is the study of air movement by simply observing the motion of fluid parcels without regard for the forces driving that motion.

Four types of pure motion

Mathematically, we can break the wind field over a region into four types of pure motion (see fig. 1–35). Each type is independent of all the other types; the presence of one type does not imply the presence of another. This is analogous to vector components: the u-component (east-west) tells us nothing about the v-component (north-south).

Figure 1–35. Composition of the wind field.

Translation

Translation is movement in a straight line. Straight-line winds of a constant speed would characterize a wind field where only translation is occurring. All air parcels in this flow would move at the same speed and in the same direction. The speed of movement of a meteorological phenomenon (a vorticity maximum, 500-mb low, jet maximum, etc.) is the most commonly used measure of translation.

Rotation

This is turning about a point. Winds turning in a circular pattern about one point characterize a wind field where only rotation is occurring. Cyclonic rotation is positive. Anticyclonic rotation is negative. Air parcels within this flow are moving in curved paths. Vorticity is the most commonly used measure of rotation.

Divergence

This is the spreading or contracting of the wind field. Winds spreading out from/contracting toward a central point characterize a wind field of pure divergence. Positive divergence (or divergence) is movement away from the central point. Negative divergence (or convergence) is contraction toward the central point. Air parcels in a field of pure divergence are either moving directly toward the central point or moving directly away from the central point. Since numerical values are difficult to calculate accurately, meteorologists normally refer to divergence qualitatively (that is, it’s either occurring or not).

Deformation

This is the stretching or shearing of the wind field. A hyperbolic (saddle-shaped) streamline pattern, centered on a col, characterizes wind flow in a pure deformation zone.

Col (neutral point)

A col is the center of the deformation zone where winds are calm.

Axis of dilatation

The axis of most rapid stretching. Air parcels are traveling away from the col.

Axis of contraction

The axis of most rapid shrinking. Air parcels are moving toward the col.

Air parcels in a field of deformation are moving toward the col along the axis of contraction as rapidly as they are moving away from the col along the axis of dilatation. No divergence exists in a region of pure deformation. Since numerical values are difficult to calculate accurately, meteorologists normally refer to deformation qualitatively (that is, either it occurs or doesn’t). In using weather products, we normally identify the parts of the deformation zone. Satellite meteorologists normally refer to the axis of dilatation as a "deformation zone."
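To make the four components concrete, the sketch below estimates translation, rotation (vorticity), divergence, and total deformation from a small gridded wind field using centered finite differences. It is an illustrative decomposition on a uniform grid, not an operational analysis routine; the test field is pure solid-body rotation, so only the vorticity is nonzero.

```python
import numpy as np

def kinematic_fields(u, v, dx, dy):
    """Estimate translation, vorticity, divergence, and total deformation
    from gridded u (east-west) and v (north-south) wind components
    on a uniform grid with spacing dx, dy. Illustrative only."""
    dudx = np.gradient(u, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    dvdx = np.gradient(v, dx, axis=1)
    dvdy = np.gradient(v, dy, axis=0)

    translation = (float(u.mean()), float(v.mean()))   # mean flow over the region
    vorticity = dvdx - dudy                            # rotation (positive = cyclonic)
    divergence = dudx + dvdy                           # spreading (+) or contraction (-)
    stretching = dudx - dvdy                           # stretching deformation
    shearing = dvdx + dudy                             # shearing deformation
    deformation = np.hypot(stretching, shearing)       # total deformation magnitude
    return translation, vorticity, divergence, deformation

# Pure rotation test case: u = -y, v = +x (solid-body cyclonic rotation)
y, x = np.mgrid[-2:3, -2:3].astype(float)
u, v = -y, x
_, vort, div, defo = kinematic_fields(u, v, dx=1.0, dy=1.0)
print(vort.mean(), div.mean(), defo.mean())   # ~2, ~0, ~0: rotation only
```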

Combinations of the four primary types of motion

Because the observed wind field is composed of all four types of motion, locating just one of the pure types of motion on a weather product is rare. It is important to remember that we can consider each type of motion separately. The overall wind field may be represented by the following equation:

Total wind field = Translation + Rotation + Divergence + Deformation

Recall that the total wind shown on weather products is the earth perspective wind field, so we rewrite the equation as:

Earth perspective winds = Translation + Rotation + Divergence + Deformation

Also, recall from before that:

System perspective winds = Earth perspective winds - Movement of system

So,

System perspective winds = Translation + Rotation + Divergence + Deformation - Movement of system

But translation is really the movement of the system, so we simplify the equation to:

System perspective winds = Rotation + Divergence + Deformation

Divergence is typically smaller than rotation and deformation. Consequently, rotation and deformation are the two main types of motion responsible for system perspective winds.

The system perspective winds are responsible for determining the shape of the cloud mass on satellite imagery. Rotation and deformation are the main components of the system perspective winds. Therefore, satellite meteorologists spend more time analyzing signatures of rotation and deformation on satellite imagery. The signature of rotation is the vorticity comma cloud. The signature of deformation is the deformation zone cloud system.
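These relationships can be checked numerically: subtracting a constant system motion from an earth-relative wind field removes the translation but leaves the rotation, divergence, and deformation untouched, which is why those components shape the cloud pattern. The sketch below assumes the kinematic_fields helper from the earlier example and uses a made-up wind field, purely for illustration.

```python
import numpy as np
# Reuses kinematic_fields() from the earlier sketch.

y, x = np.mgrid[-2:3, -2:3].astype(float)
u_earth = -y + 15.0            # cyclonic rotation embedded in a 15-unit westerly translation
v_earth = x + 0.0

system_motion = (u_earth.mean(), v_earth.mean())   # movement of the system
u_sys = u_earth - system_motion[0]                 # system perspective winds
v_sys = v_earth - system_motion[1]

for label, (uu, vv) in [("earth", (u_earth, v_earth)), ("system", (u_sys, v_sys))]:
    trans, vort, div, defo = kinematic_fields(uu, vv, dx=1.0, dy=1.0)
    print(label,
          "translation:", tuple(round(float(t), 1) for t in trans),
          "vorticity:", round(float(vort.mean()), 1),
          "divergence:", round(float(div.mean()), 1),
          "deformation:", round(float(defo.mean()), 1))
# Only the translation changes; rotation, divergence, and deformation are identical.
```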

Initialization

This is the process of using still and looper imagery to help place surface and upper-air features. Looper imagery valid at the same time as an upper-air product can help with the initial placement of short waves, jets, or surface features. We can also determine the moisture coverage associated with synoptic-scale systems and analyze amplitude changes in the jet and in long-wave and short-wave troughs and ridges to help identify trends in intensity. If the placement on a product is incorrect, adjust the product.

Satellite loops

Geostationary satellite data is more conducive to looping than polar orbiting data.

Animated visible satellite imagery

Animating visual imagery has definite advantages and limitations.

Advantages

Visible imagery offers the best spatial resolution; one-kilometer GOES A-sector images are ideal for studying mesoscale phenomena. Visible imagery also offers the best temporal resolution, with GOES A-sector images available every half hour.

Limitations

Imagery is only available during daylight hours. The changing sun angle throughout the day is distracting, and imagery from early and late in the day is difficult to evaluate. While GOES A-sector images provide detailed mesoscale data, entire synoptic-scale systems generally do not fit on the image. Keep in mind that the time intervals between satellite images may not be close enough to observe all events.

Animated infrared satellite imagery

As with animating visual imagery, animated IR imagery has advantages and limitations.

Advantages

Infrared imagery is available 24 hours a day. Clouds on infrared imagery appear the same despite the time of day. Many different sectors are available to view various phenomena. It’s easier to determine cloud tops and their location on infrared.

Limitations

Non-cloud features, such as land, may change significantly from day to night. This can be distracting. The user is limited to preselected enhancement curves. Low-level phenomena are sometimes difficult to detect.

Animated water vapor satellite imagery

Animated WV imagery also has its own advantages and limitations.

Advantages

It is an excellent tool for analyzing the jet stream and the synoptic situation. Data is available on GOES C and D sectors, which are ideal for synoptic-scale evaluation. It reveals the interactions between middle-latitude and tropical systems. Often apparent is high-level moisture feeding from subtropical regions into mid-latitude storms. Also, we can see colder air aloft moving southward in low zonal index situations. The imagery also reveals areas of strong upward or downward vertical motion through changes in the moisture gradient.

Limitations

The imagery is sent with a ZA curve, which makes assessing subtle changes in gray shades difficult. Interpretation is not as straightforward as interpreting visible or infrared imagery. The user must become comfortable with assigning meaning to various satellite signatures, including similar signatures that occur for different reasons. For example, a dark band may indicate either a jet stream or the poleward side of an upper-level deformation zone.

Enhancement curves

Most operational satellite loopers allow the user to apply locally developed color enhancement curves to the imagery being looped. A few simple considerations make the development of enhancement curves easier. Most loopers enhance pixel brightness rather than temperature, so if you add an enhancement to an infrared image carrying a curve other than ZA, you cannot "split" repeated colors. For example, when you enhance an MB curve, it is impossible to enhance temperatures near 0°C without making the medium gray (–32°C through –42°C) portion the same color. It is easier to enhance a brilliance inversion than a threshold; brilliance inversions produce better contrast.

Where possible, it is best to obtain unenhanced or ZA enhanced imagery and to apply a custom enhancement over it. This eliminates the repeat color problem. Target a color enhancement for a single type of satellite image. Work with a sample image that has the largest brightness range possible. If you use a ZA curve, for example, use an image with the highest thunderstorm top you can find. This ensures all images in the loop appear with the desired color for the coldest cloud tops.

Both water vapor and far infrared images are available with ZA curves. Since land brightness fluctuates greatly on far IR imagery but not at all on water vapor imagery, the two require different enhancements. Keep the color schemes simple. Ask yourself, "Will another forecaster's attention be drawn to the significant features on the image?" For example, it may not be obvious that blue indicates colder tops than purple.

If you use too many colors on an enhancement, the enhancement detracts from rather than adds to the interpretation of meteorological phenomena. Generally, it is best to use no more than two colors on an enhancement; use different shades of each color to add detail to the scheme. A third color can add to the utility of the enhancement if it corresponds to extreme conditions that occur over a very small geographical area or if it occurs within an area enhanced with another color.

A good example would be using shades of yellow to enhance cold cloud tops. Add just enough red to enhance the highest cloud tops within a convective complex.
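As a rough illustration of how a looper applies such a scheme to pixel brightness values, the sketch below builds a 256-entry lookup table: grayscale for most of the image, shades of yellow for cold tops, and a touch of red for only the coldest tops. The brightness thresholds and colors are invented for the example and do not correspond to any operational curve.

```python
import numpy as np

def build_enhancement_lut(yellow_start=200, red_start=240):
    """Build a 256-entry RGB lookup table keyed on pixel brightness (0-255).
    Brightness thresholds here are illustrative, not an operational curve."""
    lut = np.zeros((256, 3), dtype=np.uint8)
    for b in range(256):
        if b >= red_start:                    # only the very coldest tops -> red
            lut[b] = (255, 40, 40)
        elif b >= yellow_start:               # cold tops -> shades of yellow
            shade = 155 + (b - yellow_start) * 100 // (red_start - yellow_start)
            lut[b] = (shade, shade, 0)
        else:                                 # everything else stays grayscale
            lut[b] = (b, b, b)
    return lut

# Apply the curve to an 8-bit IR image (brighter pixel = colder cloud top).
ir_image = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # stand-in data
lut = build_enhancement_lut()
colored = lut[ir_image]            # shape (480, 640, 3) RGB enhancement
```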

National Weather Service (NWS) Satellite Weather Information System (SWIS) units use the same color enhancement techniques the Alden 3000 and the Automated Digital Facsimile System (ADFS) receivers use. The numerical values of different colors on each system are identical.

NWS Technical Attachments often contain information on enhancement curves developed on SWIS units. They list the breakpoints of each curve and their meteorological significance. You can use the breakpoints to input the curve into satellite loopers.

Image intervals

You must consider the interval between images when you develop satellite loops. Generally, the smaller the size and time scales of the phenomena, the shorter the interval needed between pictures in the loop.

You must also consider missing pictures. Operationally, plan for at least one missing picture in a loop; a missing picture doubles the interval around it.

The following intervals were found useful for the data described: on the planetary scale, 3- or 6-hour intervals; on the synoptic scale, 2-hour intervals are satisfactory and 1-hour intervals are ideal; on the mesoscale, 1-hour intervals are essential.
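A small helper such as the hypothetical one below can pick the nominal loop interval by scale from the guidance above and show how a single missing picture doubles the effective gap between frames.

```python
# Nominal loop intervals (hours) by scale, from the guidance above; illustrative helper.
NOMINAL_INTERVAL_HR = {"planetary": 3, "synoptic": 1, "mesoscale": 1}

def effective_gaps(scale, frame_times_hr):
    """Return (nominal interval, gaps in hours between the frames that are available).
    A single missing frame shows up as a gap of twice the nominal interval."""
    nominal = NOMINAL_INTERVAL_HR[scale]
    gaps = [t2 - t1 for t1, t2 in zip(frame_times_hr, frame_times_hr[1:])]
    return nominal, gaps

# Synoptic loop with the 03Z image missing: the gap around it doubles to 2 hours.
print(effective_gaps("synoptic", [0, 1, 2, 4, 5]))   # (1, [1, 1, 2, 1])
```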

Interpretation techniques

Generally, the techniques learned for still-photo interpretation also apply to animated imagery interpretation; however, ASI interpretation is better suited to some specific problems. With looped imagery, you're examining changes in cloud systems and flow over time. Do not become locked into your area of interest. Always observe what is occurring around the area of interest before you make your final determination of the feature, system, or flow.

Low-level wind flow

Determine low-level wind flow by following the motion of low clouds with time, their location in relation to systems, and their interaction with terrain. On infrared imagery, low clouds appear murky gray. On visible imagery, use your cloud identification skills to identify low clouds.

Upper-level wind flow

Determine upper-level wind flow by following the motion of high clouds with time. On (unenhanced) infrared imagery, high clouds appear bright white.

Special forecasting situations

In this last section on ASI, we look at some of the special forecasting situations.

Gulf return flow

You can track the return flow of moisture from the Gulf of Mexico into the central United States by following the northward movement of stratus clouds. This is important to monitor when you forecast precipitation in the central and eastern United States.

Closed circulation centers

Recall from our discussion of relative motion that closed circulation centers are difficult to identify and locate on still imagery. On animated imagery, because you tend to follow the system and not its translation, cyclones and anticyclones are easier to locate.

Cyclones

Locate cyclones by identifying the center of a closed cyclonic circulation in the clouds. Infrared loops are best for identifying upper-level cyclones. Visible loops are best for identifying surface cyclones. As long as thicker, higher clouds do not obscure the lower clouds, water vapor loops generally reveal the circulation center along a moisture gradient. The dry slot generally starts southwest of the low and wraps around the east side of the low. As this dry slot darkens, the low intensifies. As the dry slot becomes diffuse, the system begins to weaken. Upper-level cyclones over subtropical and tropical regions tend to be almost circular clear areas.

Anticyclones

Because upper-level highs are normally associated with dry air and subsidence aloft, they are easiest to locate on water vapor loops. Enough water vapor may be present aloft to show anticyclonic spiraling of the moisture. You can usually observe an elongated moisture shield with a smooth inside border SW–N of the upper-level high. Because anticyclones are large, broad features, the exact location of the circulation center is difficult to place.

Split flow

Sometimes the westerly flow aloft splits into two or more significant branches. These branches merge a considerable distance downstream. Weather conditions vary, depending on which branch of the flow affects a region.

Forecasting for locations near the boundary between each flow regime is difficult. Weather varies significantly over small distances. Water vapor loops are invaluable when you deal with this type of forecast problem. Moisture rich air in the southern stream contrasts with dry air in the northern stream. This creates a well-defined boundary on water vapor imagery. The northward progression of moisture is easy to extrapolate using satellite loops. If a short-wave trough in the northern stream moves southward toward the moisture shield advancing from the southern stream, the moisture shield is normally deflected toward the southeast.

Water vapor loops are ideal for recognizing this scenario. Cloudiness associated with the northern stream short-wave trough is scarce or nonexistent. The trough shows up as a dark band on the water vapor imagery.

Upper ridges

Changes in the amplitude and sharpness of ridges are easy to monitor using ASI.

Sharpness

As a ridge sharpens, the amount of high cloudiness streaming over the ridge axis decreases. Satellite loops reveal the rapid movement of clouds over the ridge axis. The clouds rapidly evaporate as the subsidence strengthens downstream while the ridge sharpens. As a ridge flattens, the cloudiness flowing over the ridge axis increases and progresses further downstream before it dissipates.

Amplitude

While a ridge builds, the jet stream axis along the ridge shifts northward. This corresponds to a northward advance of the baroclinic zone cirrus cloud shield. As a ridge collapses, the cirrus shield slips southward.

Vorticity comma clouds wrapping into deformation zone cloud systems

ASI reveals the rapidly changing nature of the synoptic-scale comma-cloud system associated with extratropical cyclones. When we view still imagery, the natural tendency is to rely too heavily on extrapolation when we forecast the movement of the cloud pattern. While the system may change relatively little on weather products, the satellite cloud pattern may change significantly. The smaller scale structure of the system is often visible on satellite imagery.

Often, major storms develop the comma-cloud pattern when a vorticity comma-cloud system wraps back around the north side of the system and becomes a deformation zone cloud system. The process begins with an old deformation zone cloud system. This system is normally very well defined. A new vorticity comma cloud rapidly forms as convection develops in the dry air aloft behind the baroclinic zone cloud shield. As the new vorticity comma cloud grows, the old deformation zone cloud shield rapidly warms and fragments. As the thunderstorms continue to develop, high-level cloudiness (cold tops) expands rapidly. The expanding vorticity comma cloud then moves northward into the deformation field north of the closed/closing 500-mb low and is stretched into a new deformation zone cloud system. With some storms, this may occur only once. Sometimes it occurs several times. The synoptic-scale vorticity product often does not have sufficient detail to define each vorticity maximum involved in the process.

Often, the first clue in the above process is the movement of a band of high-level moisture into the dry slot of an extratropical cyclone. The moisture is associated with the divergence ahead of the vorticity maxima. As divergence acts on lower-level moisture close to the baroclinic zone cloud shield, convection associated with the new vorticity comma cloud develops.

Precipitation forecasting

Precipitation generally increases when the cloud tops on infrared imagery cool and/or the area of cold cloud tops expands. Rapidly expanding and cooling cloud tops are an indication precipitation will increase. The heaviest precipitation from synoptic-scale systems tends to fall along the southern edge of the coldest cloud tops. Use satellite loops to extrapolate the movement of the heavy precipitation areas associated with these coldest cloud tops.
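One way to apply this rule objectively is to track, frame by frame, the fraction of the scene colder than a chosen cloud-top temperature threshold and check whether it is expanding and cooling. The sketch below does this on arrays of cloud-top temperatures; the 220 K threshold and the synthetic frames are illustrative assumptions, not values from the text.

```python
import numpy as np

def cold_top_fraction(cloud_top_temps_k, threshold_k=220.0):
    """Fraction of the scene with cloud tops colder than a threshold (illustrative)."""
    temps = np.asarray(cloud_top_temps_k, dtype=float)
    return float((temps < threshold_k).mean())

def expanding_and_cooling(frames):
    """True if the cold-top area grows and the minimum top temperature falls
    from the first loop frame to the last -- a sign precipitation may increase."""
    first, last = np.asarray(frames[0], float), np.asarray(frames[-1], float)
    area_growing = cold_top_fraction(last) > cold_top_fraction(first)
    tops_cooling = last.min() < first.min()
    return area_growing and tops_cooling

# Two synthetic loop frames: the cold canopy expands and its coldest tops cool.
r2 = (np.mgrid[-5:6, -5:6] ** 2).sum(axis=0)            # squared distance from center
frame1 = 230.0 - 20.0 * np.exp(-r2 / 8.0)
frame2 = 230.0 - 35.0 * np.exp(-r2 / 20.0)
print(expanding_and_cooling([frame1, frame2]))           # True
```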

Long-wave pattern amplification

Amplification of the long-wave pattern normally proceeds around the globe from west to east. In other words, a trough deepens over the dateline. This results in a ridge building into the Gulf of Alaska, which deepens a trough over the West Coast. Loops of full-disk C-sector imagery are excellent tools for following this process because they show the evolution of the long-wave pattern.

Mesoscale boundary interaction

Intersecting boundaries are one of the best indicators of thunderstorm development. Satellite loops of high resolution visible imagery are useful for tracking boundaries. You can use extrapolation to estimate where and when boundaries are likely to interact. This shows where and when to forecast convection.

Short-wave trough interacting with the front

Usually the cloud pattern is the first indicator a short-wave trough is interacting with a frontal boundary. A slight "S" shape develops on the cold-air side of the cloud pattern. The cloud pattern becomes more defined and organized as the self-development process continues.

Changes in zonal index

Observing and forecasting changes in the overall zonal index, or flow, helps you forecast weather at your station. Apply the dynamics of system interaction to looped satellite interpretation. This allows you to anticipate changes in the zonal index before you receive your 12-hourly run of upper-air and model products.

Development of short-wave trough behind a cold front

Sometimes, short-wave troughs appear at the surface as a developing front in the cold air behind an existing cold front. Usually there is not a large amount of moisture associated with this system until it cuts off the cold air from the leading cold front. This is very common along and off the east coast of continents. It’s known as a "Cold-Air Vortex system."

Decay of an upper-level high

Upper-level highs are usually most apparent as they develop and persist. Knowledge of a weakening or dissipating upper-level high is very useful in forecasting weather changes. Upper-level highs in the subtropics can dissipate with the intrusion of cold air at the surface.

Self-Test Questions

After you complete these questions, you may check your answers at the end of the unit.

003. METSAT advantages

1. List the advantages of METSAT imagery.

004. Satellite types and coverage

1. What is the inclination angle of the polar orbiting satellites?

2. Of what areas do polar orbiting satellites provide coverage?

3. Geosynchronous satellites orbit the earth at the same angular velocity as what?

4. Which satellite allows you to loop imagery to follow fronts, lows, severe weather, and many other cloud and non-cloud features?

5. What satellites are considered geosynchronous satellites?

005. Types of weather METSAT imagery

1. Indicate for each statement below whether it represents visible, far infrared, near infrared, water vapor, or SSM/I imagery.

a. Operates at a wavelength of 0.4 to 0.74 microns (µm).

b. Operates at a wavelength of 0.75 to 2.0µm.

c. Passively detects emitted and reflected microwave radiation at four wavelengths.

d. Operates at 6.7µm on GOES imagery and 5.7 to 7.1µm on METEOSAT imagery.

e. Operates at a wavelength of 10.2 to 12.8µm.

f. The wavelength is very sensitive to lunar radiation.

g. Measures reflected sunlight.

h. This imagery is used for identifying vorticity maximums.

i. Does not depend on reflected sunlight for an image.

j. Gaps occur in the global coverage between 30° N and S latitude.

k. You are seeing the reflectivity of features that are converted to brightness values.

l. You are seeing the temperature of features that are converted to brightness values.

m. You are seeing the amount of moisture sensed in the vertical, which is then converted to a brightness value.

2. Match the GOES enhancement curves in column B with their descriptions in column A. Items in column B may be used once.

Column A

_____1. Cool season, general purpose curve.

_____2. WV imagery is also enhanced using this curve identifier.

_____3. Good all-purpose curve most commonly used for convective activity.

_____4. A winter time curve used to define water currents, low stratus and coastal fog.

_____5. Aids in finding convective cloud tops embedded in nimbostratus clouds.

_____6. Designed for cloud interpretation in colder northern latitudes in the winter.

_____7. Designed for west coast forecasters to enhance system cloud tops over the Pacific Ocean.

Column B

a. CC.

b. C1/C3/C6/C9.

c. EC.

d. HF.

e. JG.

f. MB.

g. ZA.

3. On a global scale, what critical parameters does SSM/I measure?

4. What frequencies have two channels to include both horizontal and vertical polarizations?

5. What are the three main advantages of the SSM/I?

6. How long does it take the SSM/I sensor to complete one revolution of the globe? How many revolutions occur per day?

7. What is the scan width covered by SSM/I at the earth’s surface?

006. SSM/I products

1. What are the parameters that can be interpreted from SSM/I using EDRs?

2. What two central locations process the data received from SSM/I satellites?

3. How accurate are brightness temperatures for SDRs?

4. When you interpret SDRs, what are each of the frequencies best for viewing?

5. What clouds are virtually invisible at microwave frequencies and why?

6. Match the EDRs in column B with their descriptions in column A. Items in column B may be used once.

Column A

_____1. Measures the integrated total cloud water in units of kg/m2.

_____2. Also called Oceanic Total Precipitable Water.

_____3. Values are expressed in Kelvin with a range of 240 to 340° K.

_____4. It is a fraction of ocean area covered by ice.

_____5. A dimensionless quantity ranging from 0 to 3.

_____6. Army ground units are interested in this data for determining trafficability estimates.

Column B

a. Rain flag (RF).

b. Integrated water vapor content (WV).

c. Soil moisture (SM).

d. Cloud water content (CW).

e. Land surface temperature (ST).

f. Sea-ice concentration (IC).

7. Using SSM/I, what two main features of tropical storms can we better detect?

007. Animated satellite imagery (ASI)

1. How are the winds reported on weather products?

2. What are the four types of pure motion? Describe each.

3. What are the two main types of motion responsible for system perspective winds? How do they appear on satellite imagery?

4. What satellite data is the most conducive for satellite looping?

5. Match the type of animated satellite imagery in column B with their descriptions in column A. Items in column B may be used more than once.

Column A

_____1. Shows the interaction between the mid-latitude systems and tropical systems.

_____2. Clouds appear the same despite the time of day.

_____3. Offers the best spatial resolution imagery and the best temporal resolution as well.

_____4. Interpretation is not straightforward.

_____5. Only available during daylight hours.

_____6. Low-level phenomena are sometimes difficult to detect.

Column B

a. Animated infrared satellite imagery.

b. Animated visible satellite imagery.

c. Animated water vapor satellite imagery.

6. Generally, when enhancing satellite imagery, how many colors is it best to use? Can you exceed that number? If so, what are the criteria?

7. What are the satellite imagery time intervals that are useful for looping data on a planetary scale? Synoptic scale? Mesoscale?

8. Cyclones are located by identifying the center of a closed cyclonic circulation in the clouds. Which satellite loops are best for identifying upper-level and surface cyclones?

9. Why is the exact location of the anticyclone’s circulation center difficult to place?

10. What should you look for on infrared imagery to indicate that precipitation is increasing?

11. What type of cloud pattern is the first indicator a short-wave trough is interacting with a frontal boundary?

Answers to Self-Test Questions

001

1. (1) c.

(2) a.

(3) d.

(4) e.

(5) f.

(6) b.

002

1. Planck’s law says that the amount of radiation emitted by a black body at a given wavelength is proportional to its temperature.

2. Wien’s displacement law, which comes from Planck’s law, says the wavelength of the maximum irradiance of a black body depends on its temperature.

3. Kirchhoff’s law says that for objects in thermodynamic equilibrium, the absorption of radiant energy must be equal to the emission of radiant energy.

003

1. METSAT imagery enhances resolution in areas that have an organized, dense synoptic network. It provides data in areas lacking conventional data such as over ocean and desert regions. Animated looping allows systems to be put in motion, which allows you to see system motion and the interaction between different pressure systems. You will also see interaction between weather systems of different scales. METSAT imagery as an observation is more frequent than any synoptic reports. Also, a single METSAT image gives you a more complete idea of the vertical structure of the atmosphere than one or two products. You can see the low-, mid-, and upper-level features simultaneously. You can also determine how these features relate to each other.

004

1. Polar orbiting satellites have an inclination angle of 98.7°.

2. The entire earth’s surface.

3. The rotating earth.

4. Geostationary satellites.

5. GOES, GMS, METEOSAT, and INSAT satellites.

005

1. a. Visible.

b. Near infrared.

c. SSM/I.

d. Water vapor.

e. Far infrared.

f. Near infrared.

g. Visible.

h. Water vapor.

i. Far infrared.

j. SSM/I.

k. Visible.

l. Applies to both far and near infrared.

m. Water vapor.

2. (1) c.

(2) g.

(3) f.

(4) e.

(5) b.

(6) a.

(7) d.

3. Atmospheric, oceanographic, and land parameters.

4. The 19.3, 37, and 85.5 Gigahertz frequencies.

5. The microwaves penetrate the clouds with little or no attenuation. The sensing is independent of solar illumination. Also, SSM/I provides information complementary to that available with visible and IR imagery.

6. 102 minutes. 14.1 revolutions per day.

7. 1,400 kilometers.

006

1. Parameters that can be interpreted from SSM/I include: precipitation intensity of storms over land and water (whether tropical, extratropical, or polar), cloud concentration, cloud water content, and water vapor over the ocean. Also, parameters such as ocean surface wind speed, sea-ice concentration, ice/water boundaries, differentiation between new, first-year, and multi-year ice, land-surface temperatures, snow-water content, and soil moisture can be determined.

2. Supercomputers process the data at two central locations—Fleet Numerical Meteorology and Oceanography Center (FNMOC) and Air Force Global Weather Central (AFGWC).

3. Within 1° Kelvin.

4. The 19 and 37GHz channels are most useful for viewing surface phenomena. The 22.2GHz channel is for viewing atmospheric water vapor while the 85GHz channel is for viewing rain and clouds.

5. Cirrus, because they are composed mostly of small ice crystals.

6. (1) d.

(2) b.

(3) e.

(4) f.

(5) a.

(6) c.

7. The eye of the vortex and the structure of deeply convective regions.

007

1. Earth relative.

2. They are translation, rotation, divergence and deformation. Translation is straight-line movement. Rotation is the turning about a point. Divergence is the spreading or contraction of the wind field. Deformation is the stretching or shearing of the wind field.

3. Rotation and deformation. The signature of rotation is the vorticity comma cloud. The signature of deformation is the deformation zone cloud system.

4. Geostationary satellite data.

5. (1) c.

(2) a.

(3) b.

(4) c.

(5) b.

(6) a.

6. Generally, no more than two colors. A third color can add to the utility of the enhancement if it corresponds to extreme conditions that occur over a very small geographical area or if it occurs within an area enhanced with another color.

7. On the planetary scale: 3 or 6 hour intervals. On the synoptic scale: 2 hours is satisfactory, 1 hour is ideal. On the mesoscale: 1 hour intervals are essential.

8. Infrared loops are best for identifying upper-level cyclones and visible loops are best for identifying surface cyclones.

9. Because anticyclones are large, broad features.

10. Rapidly expanding and cooling cloud tops is an indication precipitation will increase.

11. A slight "S" shape developing on the cold-air side of the cloud pattern.

Do the Unit Review Exercises (URE) before going to the next unit.

Unit Review Exercises

Note to Student: Consider all choices carefully, select the best answer to each question, and circle the corresponding letter. When you have completed all unit review exercises, transfer your answers to ECI Form 34, Field Scoring Answer Sheet.

Do not return your answer sheet to ECI.

Please read the unit menu for Unit 2 and continue.