Weather Forecasting with Digital Signals

Keywords: dsp technology, dsp applications

INTRODUCTION:

Digital signal processing (DSP) is concerned with the representation of signals by sequences of numbers or symbols and with the processing of those signals. Digital signal processing and analog signal processing are subfields of signal processing.

To digitize a signal, the analog waveform is sliced into equal time segments and the waveform amplitude is measured in the middle of each segment; the collection of measurements makes up the digital representation of the waveform. In other words, a continuously changing (analog) waveform is converted into a series of discrete levels (digital).
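
As a rough illustration of this sampling-and-quantization process, here is a minimal Python sketch; the sampling rate, tone frequency, and 8-bit level count are arbitrary choices for illustration, not tied to any particular device:

```python
import numpy as np

fs = 8000                          # sampling rate in Hz (assumed for illustration)
f = 440.0                          # frequency of the "analog" tone being sampled
t = np.arange(0, 0.01, 1.0 / fs)   # sample instants: equal time segments

analog = np.sin(2 * np.pi * f * t)           # stand-in for the analog waveform
levels = 256                                 # 8-bit quantization
digital = np.round((analog + 1) / 2 * (levels - 1)).astype(int)

print(digital[:10])   # the sequence of numbers representing the waveform
```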

Applications of DSP

DSP technology is nowadays commonplace in such devices as mobile phones, multimedia computers, video recorders, CD players, hard disc drive controllers and modems, and will soon replace analog circuitry in TV sets and telephones. An important application of DSP is in signal compression and decompression. Signal compression is used in digital cellular phones to allow a greater number of calls to be handled simultaneously within each local “cell”. DSP signal compression technology allows people not only to talk to one another but also to see one another on their computer screens, using small video cameras mounted on the computer monitors, with only a conventional telephone line linking them together. In audio CD systems, DSP technology is used to perform complex error detection and correction on the raw data as it is read from the CD.

Although some of the mathematical theory underlying DSP techniques, such as Fourier and Hilbert transforms, digital filter design and signal compression, can be fairly complex, the numerical operations required to actually implement these techniques are very simple, consisting mainly of operations that could be done on a cheap four-function calculator. The architecture of a DSP chip is designed to carry out such operations incredibly fast, processing hundreds of millions of samples every second, to provide real-time performance: that is, the ability to process a signal “live” as it is sampled and then output the processed signal, for example to a loudspeaker or video display. All of the practical examples of DSP applications mentioned earlier, such as hard disc drives and mobile phones, demand real-time operation.
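
To make this concrete, here is a minimal Python sketch of a finite impulse response (FIR) filter, one of the workhorse DSP operations; the coefficients and input are invented for illustration, and each output sample is nothing more than a chain of multiply-and-add steps:

```python
def fir_filter(samples, coeffs):
    """Convolve the input with the filter coefficients: each output
    value is just a sum of products, the kind of operation a DSP chip
    performs hundreds of millions of times per second."""
    out = []
    for n in range(len(samples)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                acc += c * samples[n - k]   # multiply-accumulate
        out.append(acc)
    return out

# 5-tap moving-average filter: smooths the input sequence
print(fir_filter([1, 2, 3, 4, 5, 6], [0.2] * 5))
```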

Weather forecasting is the science of making predictions about general and specific weather phenomena for a given area, based on observations of such weather-related factors as atmospheric pressure, wind speed and direction, precipitation, cloud cover, temperature, humidity, frontal movements, etc.

Meteorologists use several tools to help them forecast the weather for an area. These fall into two categories: tools for collecting data and tools for coordinating and interpreting data.

* Tools for collecting data include instruments such as thermometers, barometers, hygrometers, rain gauges, anemometers, wind socks and vanes, Doppler radar and satellite imagery (such as the GOES weather satellite).

* Tools for coordinating and interpreting data include weather maps and computer models.

In a typical weather-forecasting system, recently collected data are fed into a computer model in a process called assimilation. This ensures that the computer model holds the current weather conditions as accurately as possible before using it to predict how the weather may change over the next few days.

Weather forecasting is an exact science of data collecting, but interpretation of the data collected can be difficult because of the chaotic nature of the factors that affect the weather. These factors can follow generally recognized trends, but meteorologists understand that many things can affect these trends. With the advent of computer models and satellite imagery, weather forecasting has improved greatly. Since lives and livelihoods depend on accurate weather forecasting, these improvements have helped not only the understanding of weather, but how it affects living and nonliving things on Earth.

Weather forecasting is the application of science and technology to predict the state of the atmosphere for a future time and a given location. Human beings have attempted to predict the weather informally for millennia, and formally since at least the nineteenth century. Weather forecasts are made by collecting quantitative data about the current state of the atmosphere and using scientific understanding of atmospheric processes to project how the atmosphere will evolve.

Once an all-human endeavor based mainly upon changes in barometric pressure, current weather conditions, and sky condition, weather forecasting now relies on forecast models to determine future conditions. Human input is still required to pick the best possible forecast model to base the forecast upon, which involves pattern recognition skills, teleconnections, knowledge of model performance, and knowledge of model biases. The chaotic nature of the atmosphere, the massive computational power required to solve the equations that describe the atmosphere, error involved in measuring the initial conditions, and an incomplete understanding of atmospheric processes mean that forecasts become less accurate as the difference between the current time and the time for which the forecast is being made (the range of the forecast) increases. The use of ensembles and model consensus helps narrow the error and pick the most likely outcome.

There are a variety of end uses for weather forecasts. Weather warnings are important forecasts because they are used to protect life and property. Forecasts based on temperature and precipitation are important to agriculture, and therefore to traders within commodity markets. Temperature forecasts are used by utility companies to estimate demand over the coming days. On an everyday basis, people use weather forecasts to determine what to wear on a given day. Since outdoor activities are severely curtailed by heavy rain, snow and wind chill, forecasts can be used to plan activities around these events, and to plan ahead and survive them.

History of weather control and forecasting

Legends aside, Native Americans had methods that they believed could induce rain. The Finnish people, on the other hand, were believed by others to be able to control all weather. Thus Vikings refused to take Finns on their sea raids. Remnants of this belief lasted well into the modern age, with many ship crews being reluctant to accept Finnish sailors.


In the early modern era, people observed that during battles the firing of cannons and other firearms was often followed by precipitation. The first example of weather control that is still considered workable is probably the lightning conductor.

For millennia people have tried to forecast the weather. In 650 BC, the Babylonians predicted the weather from cloud patterns as well as astrology. In about 340 BC, Aristotle described weather patterns in Meteorologica. Later, Theophrastus compiled a book on weather forecasting, called the Book of Signs. Chinese weather prediction lore extends at least as far back as 300 BC. In 904 AD, Ibn Wahshiyya’s Nabatean Agriculture discussed the weather forecasting of atmospheric changes and signs from the planetary astral alterations; signs of rain based on observation of the lunar phases; and weather forecasts based on the movement of winds.

Ancient weather forecasting methods usually relied on observed patterns of events, also termed pattern recognition. For example, it might be observed that if the sunset was particularly red, the following day often brought fair weather. This experience accumulated over the generations to produce weather lore. However, not all of these predictions proved reliable, and many of them have since been found not to stand up to rigorous statistical testing. It was not until the invention of the electric telegraph in 1835 that the modern age of weather forecasting began. Before this time, it had not been possible to transport information about the current state of the weather any faster than a steam train. By the late 1840s, the telegraph allowed reports of weather conditions from a wide area to be received almost instantaneously, so forecasts could be made by knowing what the weather conditions were like further upwind. The two men most credited with the birth of forecasting as a science were Francis Beaufort (remembered chiefly for the Beaufort scale) and his protégé Robert FitzRoy (developer of the FitzRoy barometer). Both were influential men in British naval and governmental circles, and though ridiculed in the press at the time, their work gained scientific credence, was accepted by the Royal Navy, and formed the basis for all of today’s weather forecasting knowledge. To convey information accurately, it became necessary to have a standard vocabulary describing clouds; this was achieved by means of a series of classifications and, in the 1890s, by pictorial cloud atlases.

Great progress was made in the science of meteorology during the 20th century. The possibility of numerical weather prediction was proposed by Lewis Fry Richardson in 1922, though computers did not exist to complete the vast number of calculations required to produce a forecast before the event had occurred. Practical use of numerical weather prediction began in 1955, spurred by the development of programmable electronic computers.

* Modern aspirations

There are two factors which make weather control extremely difficult if not fundamentally intractable. The first one is the immense quantity of energy contained in the atmosphere. The second is its turbulence.

Effective cloud seeding to produce rain has always seemed to be some 50 years away. People do use even the most expensive and experimental forms of it, but more in hope than in confidence.

Another, even more speculative and expensive, technique that has been semi-seriously discussed is the dissipation of hurricanes by exploding a nuclear bomb in the eye of the storm. It is questionable whether it will ever even be tried, because if it failed, the result would be a hurricane carrying radioactive fallout along with the destructive power of its winds and rain.

* Modern-day weather forecasting system

Components of a modern weather forecasting system include:

  • Data collection
  • Data assimilation
  • Numerical weather prediction
  • Model output post-processing
  • Forecast presentation to end-user

* Data collection

Observations of atmospheric pressure, temperature, wind speed, wind direction, humidity and precipitation are made near the earth’s surface by trained observers, automatic weather stations or buoys. The World Meteorological Organization acts to standardize the instrumentation, observing practices and timing of these observations worldwide. Stations report either hourly, in METAR reports, or every six hours, in SYNOP reports.
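
As an illustration of what such a report contains, the sketch below extracts the wind and temperature groups from a simplified METAR string. This is a hedged, minimal parser: real decoders handle many more groups and edge cases, and the report shown is a made-up example:

```python
import re

def parse_simple_metar(report):
    """Extract wind and temperature from a simplified METAR report.
    Real METARs carry many optional groups; this handles only the basics."""
    w = re.search(r"\b(\d{3})(\d{2,3})KT\b", report)     # e.g. 09012KT
    wind = {"direction_deg": int(w.group(1)), "speed_kt": int(w.group(2))} if w else None
    t = re.search(r"\b(M?\d{2})/(M?\d{2})\b", report)    # e.g. 28/09 or M05/M12
    to_c = lambda s: -int(s[1:]) if s.startswith("M") else int(s)
    temps = {"temp_c": to_c(t.group(1)), "dewpoint_c": to_c(t.group(2))} if t else None
    return wind, temps

# Made-up report: wind from 090 degrees at 12 kt, 28 C with a 9 C dewpoint
print(parse_simple_metar("KDEN 121853Z 09012KT 10SM FEW120 28/09 A3005"))
```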

Measurements of temperature, humidity and wind above the surface are made by launching radiosondes (weather balloons). Data are usually obtained from near the surface to the middle of the stratosphere, at about 30,000 m (100,000 ft). In recent years, data transmitted from commercial airplanes through the AMDAR system have also been incorporated into upper-air observation, primarily in numerical models.

Increasingly, data from weather satellites are being used due to their (almost) global coverage. Although their visible-light images are very useful for forecasters to see the development of clouds, little of this information can be used by numerical weather prediction models. The infra-red (IR) data, however, can be used, as it gives information on the temperature at the surface and at cloud tops. Individual clouds can also be tracked from one time to the next to provide information on wind direction and strength at the cloud’s steering level. Polar-orbiting satellites provide soundings of temperature and moisture throughout the depth of the atmosphere. Compared with similar data from radiosondes, the satellite data have the advantage that coverage is global; however, the accuracy and resolution are not as good.

Meteorological radars provide information on precipitation location and intensity. Additionally, if a pulse-Doppler weather radar is used, wind speed and direction can be determined.
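
The principle behind the Doppler measurement is a one-line relation: the radial velocity of the precipitation particles is proportional to the measured frequency shift. A tiny sketch with illustrative numbers (the wavelength and shift are assumed, not taken from any particular radar):

```python
# Radial velocity from Doppler shift: v = f_d * wavelength / 2
# (factor of 2 because the signal travels to the target and back)
wavelength = 0.10      # m, typical of an S-band weather radar (~10 cm)
doppler_shift = 300.0  # Hz, illustrative measured shift

radial_velocity = doppler_shift * wavelength / 2.0
print(f"radial velocity: {radial_velocity:.1f} m/s")  # 15.0 m/s along the beam
```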

* Data assimilation

Data assimilation (DA) is a method used in the weather forecasting process in which observations of the current (and possibly past) weather are combined with a previous forecast for that time to produce the meteorological “analysis”: the best estimate of the current state of the atmosphere.

Modern weather predictions aid in timely evacuations, potentially saving lives and preventing property damage.

More generally, data assimilation is any method of using observations in the forecasting process.

In weather forecasting there are two main types of data assimilation: three-dimensional (3DDA) and four-dimensional (4DDA). In 3DDA, only observations available at the time of the analysis are used. In 4DDA, past observations are also included (thus adding the time dimension).

The first data assimilation methods were called “objective analyses” (e.g., the Cressman algorithm), in contrast to “subjective analyses”, in which (in past practice) numerical weather prediction (NWP) forecasts were corrected by hand by meteorologists. The objective methods used simple interpolation approaches and were thus 3DDA methods. Similar 4DDA methods, called “nudging”, also exist (e.g., in the MM5 NWP model). They are based on the simple idea of Newtonian relaxation: a term proportional to the difference between the calculated meteorological variable and the observed value is added to the right-hand side of the model’s dynamical equations. This term, which has a negative sign, “keeps” the calculated state vector closer to the observations.
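
A minimal sketch of nudging on an invented one-variable model: at each step a relaxation term drags the model state toward an observed value (the dynamics, time step and relaxation coefficient are all assumptions for illustration):

```python
def step(x, x_obs, dt=0.1, k_nudge=0.5):
    """One forward-Euler step of a toy model dx/dt = f(x), with nudging.
    The extra term -k_nudge * (x - x_obs) pulls the state toward the observation."""
    f = -0.2 * x + 1.0          # invented model dynamics (equilibrium at x = 5.0)
    return x + dt * (f - k_nudge * (x - x_obs))

x, x_obs = 10.0, 4.0
for _ in range(100):
    x = step(x, x_obs)
print(round(x, 2))   # ~4.29: pulled from the free-model equilibrium (5.0) toward the observation (4.0)
```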

The first breakthrough in the field of data assimilation was introduced by L. Gandin (1963) with the “statistical interpolation” (or “optimal interpolation”) method, which developed earlier ideas of Kolmogorov. This is a 3DDA method and a type of regression analysis that utilizes information about the spatial distributions of the covariance functions of the errors of the “first guess” field (the previous forecast) and the “true” field. These functions are never known exactly, so approximations have to be assumed.


In fact, the optimal interpolation algorithm is a reduced version of the Kalman filtering (KF) algorithm, in which the covariance matrices are not calculated from the dynamical equations but are pre-determined in advance. (The Kalman filter, named after its inventor Rudolf Kalman, is an efficient recursive least-squares method for tracking a time-dependent state vector with noisy equations of motion in real time.)
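
Below is a minimal sketch of the analysis update shared by optimal interpolation and the Kalman filter, on an invented two-point state with one observation. In OI the background error covariance B is prescribed in advance, exactly as here; a full Kalman filter would instead evolve B with the model dynamics:

```python
import numpy as np

# Background (first-guess) state at two grid points, e.g. temperature in deg C
x_b = np.array([20.0, 24.0])
B = np.array([[1.0, 0.6],      # prescribed background error covariance:
              [0.6, 1.0]])     # nearby points have correlated errors

H = np.array([[1.0, 0.0]])     # observation operator: we observe point 1 only
y = np.array([21.5])           # the observation
R = np.array([[0.5]])          # observation error covariance

# Analysis update shared by OI and the Kalman filter:
#   K = B H^T (H B H^T + R)^-1 ;  x_a = x_b + K (y - H x_b)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print(x_a)   # both points move toward the observation, point 2 via the correlation
```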

Once this was recognised, attempts were made to introduce KF algorithms as a 4DDA tool for NWP models. However, this was (and remains) a very difficult task, because the full version of the KF algorithm requires the solution of an enormously large number of additional equations. Special (suboptimal) kinds of KF algorithms were therefore developed for NWP models.

Another significant advance in the development of 4DDA methods was the use of optimal control theory (the variational approach) in the works of Le Dimet and Talagrand (1986), based on previous work by G. Marchuk. (Optimal control theory is a mathematical field concerned with control policies that can be deduced using optimization algorithms.) The significant advantage of the variational approaches is that the meteorological fields satisfy the dynamical equations of the NWP model while at the same time minimizing a functional characterizing their difference from the observations; thus, a problem of constrained minimization is solved. Variational 3DDA methods also exist (e.g., Sasaki, 1958).
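
A sketch of the variational idea in its simplest, unconstrained (3D-Var-like) form: minimize a cost function that penalizes departure from both the background and the observations. The numbers reuse the invented two-point example above; a real system minimizes over millions of variables with the model equations as constraints:

```python
import numpy as np
from scipy.optimize import minimize

x_b = np.array([20.0, 24.0])                              # background state
B_inv = np.linalg.inv(np.array([[1.0, 0.6], [0.6, 1.0]]))
H = np.array([[1.0, 0.0]])                                # observe the first point only
y = np.array([21.5])
R_inv = np.linalg.inv(np.array([[0.5]]))

def J(x):
    """3D-Var cost: background term plus observation term."""
    db = x - x_b
    do = y - H @ x
    return 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do

x_a = minimize(J, x_b).x
print(x_a)   # matches the optimal-interpolation analysis above
```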

As shown by Lorenc (1986), all the abovementioned kinds of 4DDA methods are equivalent in some limit, i.e., under certain assumptions they minimize the same cost functional. However, these assumptions are never fulfilled in practice.

The rapid development of the various data assimilation methods for NWP is connected with two main trends in numerical weather prediction:

1. Utilizing observations currently seems to be the most promising way to improve the quality of forecasts at different scales (from the planetary scale down to the scale of a city or even a street).

2. The number of different kinds of observations (sodars, radars, satellites) is rapidly growing.

DA methods are now used not only in weather forecasting but also in other environmental forecasting problems, e.g. hydrological forecasting, where basically the same types of DA methods as those described above are in use. Data assimilation remains a challenge for every forecasting problem.

* Numerical weather prediction

Numerical weather prediction is the science of predicting the weather using mathematical models of the atmosphere. Manipulating the huge datasets and performing the complex calculations necessary to do this on a resolution fine enough to make the results useful can require some of the most powerful supercomputers in the world.

An example of 500 mbar geopotential height prediction from a numerical weather prediction model
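
As a toy stand-in for the mathematical models mentioned above, the classic Lorenz (1963) system, originally derived from atmospheric convection, can be stepped forward with a standard numerical integrator; real NWP models do conceptually the same thing with millions of grid-point variables (the initial state and step size here are arbitrary):

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz's simplified convection equations: a 3-variable 'atmosphere'."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt=0.01):
    """One fourth-order Runge-Kutta step, the kind of time integration NWP uses."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.0, 1.0, 1.0])    # 'analysis' used as the initial condition
for _ in range(1000):                # integrate 10 time units into the future
    state = rk4_step(state)
print(state)
```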

* Model output post-processing

The raw output is often modified before being presented as the forecast, either through statistical techniques that remove known biases in the model or through adjustments that take into account consensus among other numerical weather forecasts.
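
A minimal sketch of such statistical post-processing: fit a linear correction from past model forecasts to verifying observations, then apply it to a new forecast. The training numbers are invented, and the method is a deliberately simplified stand-in for operational model output statistics:

```python
import numpy as np

# Invented history: raw model 2m-temperature forecasts vs. verifying observations.
# This toy model runs systematically ~2 degrees too cold.
model_fcst = np.array([10.0, 14.0, 18.0, 22.0, 26.0])
observed   = np.array([12.1, 15.9, 20.2, 23.8, 28.0])

# Least-squares fit of observed = a * forecast + b (MOS-style regression)
a, b = np.polyfit(model_fcst, observed, 1)

new_raw_forecast = 20.0
corrected = a * new_raw_forecast + b
print(f"raw {new_raw_forecast:.1f} C -> bias-corrected {corrected:.1f} C")
```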

In the past, the human forecaster was responsible for generating the entire weather forecast from the observations. Today, however, for forecasts beyond 24 hours, human input is generally confined to post-processing of model data to add value to the forecast. Humans are required to interpret the model data into weather forecasts that are understandable to the end user, and they can use knowledge of local effects that are too small in size to be resolved by the model to add information to the forecast. However, the increasing accuracy of forecast models continues to reduce the need for post-processing and human input. Examples of weather model data can be found on Vigilant Weather’s Model Pulse.

* Presentation of weather forecasts

The final stage in the forecasting process is perhaps the most important. Knowledge of what the end user needs from a weather forecast must be taken into account to present the information in a useful and understandable way.

* Public information

One of the main end users of a forecast is the general public. Thunderstorms can cause strong winds, dangerous lightning strikes leading to power outages, and widespread hail damage. Heavy snow or rain can bring transportation and commerce to a standstill, as well as cause flooding in low-lying areas. Excessive heat or cold waves can kill or sicken those without adequate utilities. The National Weather Service provides forecasts and watches/warnings/advisories for all areas of the United States to protect life and property and maintain commercial interests. Traditionally, television and radio weather presenters have been the main method of informing the public; however, the internet is increasingly being used due to the vast amount of information that can be found there.

* Air traffic

The aviation industry is especially sensitive to the weather. Fog and/or exceptionally low ceilings can prevent many aircraft from landing and taking off. Similarly, turbulence and icing can be hazards whilst in flight. Thunderstorms are a problem for all aircraft, due to severe turbulence and icing, as well as large hail, strong winds, and lightning, all of which can cause fatal damage to an aircraft in flight. On a day-to-day basis airliners are routed to take advantage of the jet stream tailwind to improve fuel efficiency. Air crews are briefed prior to take-off on the conditions to expect en route and at their destination.

* Utility companies

Electricity companies rely on weather forecasts to anticipate demand which can be strongly affected by the weather. In winter, severe cold weather can cause a surge in demand as people turn up their heating. Similarly, in summer a surge in demand can be linked with the increased use of air conditioning systems in hot weather.

* Private sector

Increasingly, private companies pay for weather forecasts tailored to their needs so that they can increase their profits. For example, supermarket chains may change the stocks on their shelves in anticipation of different consumer spending habits in different weather conditions.

* Ensemble forecasting

Although a forecast model will predict realistic-looking weather features evolving realistically into the distant future, the errors in a forecast will inevitably grow with time due to the chaotic nature of the atmosphere. The detail that can be given in a forecast therefore decreases with time as these errors grow. There comes a point at which the errors are so large that the forecast is completely wrong, and the forecast atmospheric state has no correlation with the actual state of the atmosphere.

However, looking at a single forecast gives no indication of how likely that forecast is to be correct. Ensemble forecasting uses many forecasts produced so as to reflect the uncertainty in the initial state of the atmosphere (due to errors in the observations and insufficient sampling). The uncertainty in the forecast can then be assessed by the range of different forecasts produced. Ensembles have been shown to be better at detecting the possibility of extreme events at long range.
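
A toy sketch of the idea, reusing the Lorenz system from the numerical-prediction section: perturb the initial state slightly many times, run each member forward, and read the forecast uncertainty from the spread across members (the perturbation size, member count and lead time are arbitrary):

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def run(state, n_steps=800, dt=0.01):
    for _ in range(n_steps):          # simple Euler stepping, adequate for a sketch
        state = state + dt * lorenz63(state)
    return state

rng = np.random.default_rng(0)
analysis = np.array([1.0, 1.0, 1.0])

# 20 members, each starting from the analysis plus a tiny random perturbation
members = np.array([run(analysis + 1e-3 * rng.standard_normal(3)) for _ in range(20)])

print("ensemble mean:  ", members.mean(axis=0).round(2))
print("ensemble spread:", members.std(axis=0).round(2))   # large spread = low confidence
```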

Ensemble forecasts are increasingly being used for operational weather forecasting (for example at ECMWF, NCEP, and the Canadian forecasting center).

* Nowcasting

The forecasting of the weather in the 0-6 hour timeframe is often referred to as nowcasting. It is in this range that the human forecaster still has an advantage over computer NWP models. In this time range it is possible to forecast smaller features, such as individual shower clouds, with reasonable accuracy; however, these are often too small to be resolved by a computer model. A human given the latest radar, satellite and observational data will be able to make a better analysis of the small-scale features present and so will be able to make a more accurate forecast for the following few hours.


Signal Processing

Generating imagery for forecasting terror threats

Intelligence analysts and military planners need predictions about likely terrorist targets in order to better plan the deployment of security forces and sensing equipment. We have addressed this need using Gaussian-based forecasting and uncertainty modeling. Our approach excels at indicating the highest threats expected for each point along a travel path and for a ‘global war on terrorism’ mission. It also excels at identifying the greatest-likelihood collection areas that would be used to observe a target.

This work builds on earlier research1 on geospatial analysis and asymmetric-threat forecasting in the urban environment, which showed how to extract distinct signatures from associations made between historical event information and contextual information sources such as geospatial and temporal political databases. We have augmented this approach to include uncertainty estimates associated with historical events and geospatial information layers.2

Event Forecasting Spatial Preferences

The notion of spatial preferences has been used to find potential crime1 and threat3 ‘hot spots.’ The premise is that a terrorist or criminal is directed toward a certain location by a set of qualities, such as geospatial features, demographic and economic information, and recent political events. Focusing on geospatial information, we assume the intended target is associated with features a small distance from the event location. We assign the highest likelihoods to the distances between each key feature and the event, and taper them away from these distances. This behavior is modeled using a kernel function centered at each of these distances. For a Gaussian kernel applied to a discretized map, the probability density function ρ for a given grid cell g and uncertainty estimates u takes the form

\[ \rho(g) = c \sum_{n=1}^{N} \sum_{i=1}^{I} \exp\!\left( -\frac{(D_{ig} - D_{in})^2}{2\,(\Phi_E^2 + \Phi_F^2)} \right), \]

where D_{ig} is the distance from feature i to the grid cell, D_{in} is the distance from the feature to event location n, c is a normalizing constant, Φ_E and Φ_F are the position uncertainties for events and features respectively, I is the total number of features, and N is the total number of events. Figure 1(a) shows a sample forecast image based on this approach, denoting threat level with colors ranging from blue for the lowest threat through red for the highest threat. For the same set of features and events, Figure 1(b) shows a more manageable forecast (in terms of allocating security resources), determined by aggregating feature layers prior to generating the likelihood values.
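
A minimal sketch of that computation on a small grid, with invented feature and event coordinates and assumed uncertainties, evaluating the Gaussian-kernel sum above at every cell:

```python
import numpy as np

# Invented layout: two features and one historical event on a 50x50 cell grid
features = np.array([[10.0, 12.0], [35.0, 40.0]])
events   = np.array([[14.0, 15.0]])
phi_e, phi_f = 1.0, 0.5          # assumed position uncertainties (grid units)

xs, ys = np.meshgrid(np.arange(50), np.arange(50), indexing="ij")
cells = np.stack([xs, ys], axis=-1).astype(float)    # (50, 50, 2) cell coordinates

var = phi_e**2 + phi_f**2
rho = np.zeros((50, 50))
for fx in features:
    d_ig = np.linalg.norm(cells - fx, axis=-1)       # feature-to-cell distances
    for ev in events:
        d_in = np.linalg.norm(ev - fx)               # feature-to-event distance
        rho += np.exp(-(d_ig - d_in) ** 2 / (2 * var))

rho /= rho.sum()                                     # normalize so it sums to 1
print(np.unravel_index(rho.argmax(), rho.shape))     # highest-threat cell
```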

Modeling Uncertainty

One of the most important aspects of forecasting is having an estimate of the confidence in the supporting numerical values. In numerical weather prediction, a confidence value is assigned to each forecast. For example, predicting an 80% chance of rain implies that, given variations in the input parameters, the numerical weather models predicted rain in eight out of ten runs.

Similarly, for our event forecasts, we have identified three key sources of uncertainty. These are: first, positional uncertainty associated with geospatial locations for geographic, demographic, economic, political-event, and historical-event data; second, error associated with feature reduction; and finally, methodological error associated with the event forecasting algorithms. Here, we will focus only on the positional error of historical event locations.

The historical event record of the data we used included the date, location, type of attack, organization claiming responsibility, a description of what happened, and the confidence of the recorded data. The confidence values for the locations are rated from 1 to 5, with error values starting at ±10 m and increasing by a power of 10 for each rank. The ratings represent analyst confidence in the precise event location. Error values in the event locations, u_E, are incorporated into the distance measurements by setting the feature-to-event distance D_{in} to D_{in} ± u_E. We account for this variation by discretizing the distance range and sampling by Monte Carlo simulation. Figures 1(c) and 1(d) show the impact of accounting for the uncertainty. These forecast images were converted to Google Keyhole Markup Language and are shown in the Google Earth program,4 overlaying the correct georegistered terrain.
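
A sketch of that Monte Carlo step for one feature-event pair: sample the feature-to-event distance across the stated error range and average the kernel responses. The nominal distance, kernel width, and sample count are invented; the rank-to-error mapping follows the ±10 m, times ten per rank, scheme described above:

```python
import numpy as np

def rank_to_error_m(rank):
    """Location-confidence rank 1..5 maps to +/-10 m, 100 m, ..., 100 km."""
    return 10.0 ** rank

rng = np.random.default_rng(1)
d_in_nominal = 250.0            # nominal feature-to-event distance, metres (invented)
u_e = rank_to_error_m(2)        # rank-2 event location: +/-100 m

# Sample D_in within [d_in - u_e, d_in + u_e] and average the kernel response
d_ig = np.linspace(0.0, 1000.0, 201)     # candidate cell distances from the feature
samples = rng.uniform(d_in_nominal - u_e, d_in_nominal + u_e, size=500)
kernel = np.mean(
    [np.exp(-(d_ig - s) ** 2 / (2 * 50.0 ** 2)) for s in samples], axis=0
)
print(d_ig[kernel.argmax()])    # peak stays near 250 m, but the ridge is broadened
```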

Satellites to Profile Weather, Improve Forecasts through GPS

BOULDER: A revolutionary, globe-spanning satellite network will furnish round-the-clock weather data, monitor climate change, and improve space weather forecasts by using signals from the Global Positioning System (GPS). Through atmosphere-induced changes in the radio signals, scientists will infer the state of the atmosphere above some 3,000 locations every 24 hours, including vast stretches of ocean inadequately profiled by current satellites and other tools. Nearly 100 scientists from over a dozen countries are meeting in Boulder on August 21-23 to help plan the use of data from this $100 million mission, which will begin operations in 2005.

Called COSMIC, the satellite network is now being developed through a U.S.-Taiwan partnership based on a system design provided by the University Corporation for Atmospheric Research, where the COSMIC Project Office is based. Taiwan’s National Science Council and National Space Program Office (NSPO) and the U.S. National Science Foundation are providing primary support for COSMIC.

“The increased coverage will improve weather forecasts by providing data where there previously was none or not enough,” says Ying-Hwa Kuo, project director for the Constellation Observing System for Meteorology, Ionosphere and Climate (COSMIC), also called ROCSAT-3 in Taiwan. With six satellite receivers, COSMIC will collect a global, 3-D data set expected to improve analyses of both weather and climate change. By tracking temperature in the upper atmosphere up to 30 miles high, COSMIC could help clarify whether these regions are cooling due to heat-trapping greenhouse gases closer to the surface. Also, by tracking moisture in the bottom 12 miles of the atmosphere, COSMIC provides much-needed information on the three-dimensional distribution of atmospheric water vapor, which is crucial for accurate prediction of precipitating weather systems. COSMIC will also measure high-altitude electron density, potentially enhancing forecasts of ionospheric activity and “space weather.”

COSMIC’s satellites will probe the atmosphere using radio occultation, a technique developed in the 1960s to study other planets but more recently applied to Earth’s atmosphere. Each satellite will intercept a GPS signal after it passes through (is occulted by) the atmosphere close to the horizon. Such a path brings the signal through a deep cross-section of the atmosphere. Variations in electron density, air density, temperature, and moisture bend the signal and change its speed. By measuring these shifts in the signal, scientists can determine the atmospheric conditions that produced them. The result: profiles along thousands of angled, pencil-like segments of atmosphere, each about 200 miles long and a few hundred feet wide.
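
The bending is governed by the atmosphere's refractivity, which a widely used approximation (the Smith-Weintraub formula) relates to pressure, temperature, and water vapor. A small sketch with illustrative surface values:

```python
# Refractivity N = (n - 1) * 1e6 from the Smith-Weintraub approximation:
#   N = 77.6 * P / T + 3.73e5 * e / T^2
# P: total pressure (hPa), e: water-vapor partial pressure (hPa), T: temperature (K)
def refractivity(p_hpa, t_kelvin, e_hpa):
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin**2

# Illustrative warm, moist surface conditions (invented values)
n_units = refractivity(p_hpa=1013.0, t_kelvin=300.0, e_hpa=20.0)
print(f"refractivity: {n_units:.0f} N-units")  # roughly 340-350 for these values
```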

Rather than replacing other observing systems, COSMIC will blend with them, filling in major gaps and enhancing computer forecast models. Many satellite-based products are like topographic maps that trace the contours of atmospheric elements in a given height range with high horizontal precision. COSMIC is more akin to a set of probes that drill through the depth of atmosphere with high vertical precision. Thus, says Kuo, “COSMIC will complement the existing and planned U.S. meteorological satellites.”

Radiosondes (weather sensors launched by balloon) have obtained vertical profiles since the 1930s. However, they are launched only twice a day in most spots, and few are deployed over the ocean. In contrast, the COSMIC data will be collected continuously across the globe. The GPS radio signals can be picked up by the low-orbiting COSMIC receivers even through clouds, which are an obstacle for satellite-borne instruments that sense in the infrared part of the spectrum.

UCAR and colleagues began exploring the use of GPS-based observing systems in 1995 with the successful launch of a test satellite. Several other systems have been launched by researchers in the U.S., Germany, and Argentina. All of these are research-based systems, with the data made available within days or weeks. COSMIC’s data will be available within three hours of the observations, making them a potential boon to everyday forecast operations. The COSMIC Project Office will serve as a clearinghouse for research use of the data from COSMIC and other GPS-based systems by scientists in the United States, Taiwan, and elsewhere.

UCAR is overseeing ground-based facilities, satellite payloads, launch services, and data processing structures for COSMIC. Orbital Sciences Corporation is responsible for spacecraft design. The first spacecraft will be built at Orbital’s facilities in Dulles, Virginia. The rest of the constellation will be built and tested in Taiwan, where the system’s mission control will be based. NSPO and Taiwan industrial partners will join in satellite system development. Other collaborators include NASA, the National Oceanic and Atmospheric Administration, the Air Force, Jet Propulsion Laboratory, and Naval Research Laboratory.

REFERENCES:

http://www.ucar.edu/communications/newsreleases/2002/cosmic.html

http://spie.org/x8500.xml?highlight=x2410&ArticleID=x8500

http://www.indopedia.org/Weather_forecasting.html
