31st Annual NAFIPS Meeting

Berkeley, CA, USA
August 6-8, 2012

Contact

Asli Celikyilmaz
Marek Z. Reformat 
nafips12@gmail.com

VLADIK KREINOVICH


Biography
Vladik Kreinovich received his M.Sc. in Mathematics and Computer Science from St. Petersburg University, Russia, in 1974, and his Ph.D. from the Institute of Mathematics, Soviet Academy of Sciences, Novosibirsk, in 1979. From 1975 to 1980, he worked with the Soviet Academy of Sciences, in particular, from 1978 to 1980, with the Special Astrophysical Observatory (on the representation and processing of uncertainty in radioastronomy). From 1982 to 1989, he worked on error estimation and intelligent information processing for the National Institute for Electrical Measuring Instruments, Russia. In 1989, he was a Visiting Scholar at Stanford University. Since 1990, he has been with the Department of Computer Science, University of Texas at El Paso. He has also served as an invited professor in Paris (University of Paris VI), Hong Kong, St. Petersburg (Russia), and Brazil.

Main interests: representation and processing of uncertainty, especially interval computations and intelligent control. He has published 3 books, 6 edited books, and more than 800 papers. He is a member of the editorial board of the international journal "Reliable Computing" (formerly "Interval Computations") and of several other journals, and co-maintainer of the international website on interval computations, http://www.cs.utep.edu/interval-comp.

Honors: President-Elect of the North American Fuzzy Information Processing Society; Foreign Member of the Russian Academy of Metrological Sciences; recipient of the 2003 El Paso Energy Foundation Faculty Achievement Award for Research from the University of Texas at El Paso; and co-recipient of the 2005 Star Award from the University of Texas System.


Talk
Need for Expert Knowledge (and Soft Computing) in Cyberinfrastructure-Based Data Processing


Abstract: A large amount of data has been collected and stored at different locations. When a researcher or a practitioner is interested in a certain topic, it is desirable that he or she get easy and fast access to all the relevant data. For example, when a geoscientist is interested in the geological structure of a certain area, it is helpful if he or she gets access to a state geological map (which is usually stored at the state's capital), NASA photos (stored at NASA Headquarters and/or at one of the corresponding NASA centers), seismic data stored at different seismic stations, etc. Similarly, when an environmental scientist is interested in the weather and climate conditions in a certain area, it is helpful if he or she has access to satellite radar data, to data from bio-stations, to meteorological data, etc. Cyberinfrastructure is a general name for the hardware and software tools that facilitate this data transfer and data processing, making it easier for the user. Ideally, a user should simply type in a request, and the system will automatically find and process the relevant data -- it should be as easy and convenient as a Google search.

At present, the main challenges in cyberinfrastructure design are related to the actual development of the corresponding hardware and software tools. Most existing tools concentrate on moving the data and on processing the data by using existing well-defined algorithms. As cyberinfrastructure becomes a reality, it becomes clear that while some of its results are exciting, other results require additional expert analysis and corrections. To make the results more relevant, it is therefore desirable to incorporate expert knowledge into the cyberinfrastructure. Some expert knowledge is formulated in precise terms; this type of knowledge is easier to incorporate. However, a large part of expert knowledge is formulated not in precise terms, but by using imprecise (fuzzy) words from a natural language (like "small"). To incorporate this knowledge, it is therefore natural to use fuzzy techniques (and, more generally, soft computing techniques) -- techniques specifically designed for formalizing such imprecise facts and rules.
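
To give a flavor of what such a formalization looks like: in fuzzy techniques, a word like "small" is described by a membership function that assigns to each possible value a degree, between 0 and 1, to which that value matches the word. The following minimal Python sketch shows a standard piecewise-linear membership function of this kind; the breakpoints 1.0 and 3.0 are hypothetical, chosen purely for illustration.

    def small(x):
        """Degree (from 0 to 1) to which a nonnegative value x is 'small'.

        The breakpoints 1.0 and 3.0 are illustrative only; in practice
        they would be elicited from the experts."""
        if x <= 1.0:
            return 1.0               # definitely small
        if x >= 3.0:
            return 0.0               # definitely not small
        return (3.0 - x) / 2.0       # gradual transition in between

    for x in (0.5, 1.5, 2.5, 4.0):
        print(x, "->", small(x))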

In this talk, we describe several problems in which such incorporation is needed, and we overview our experience of such incorporation in geosciences and environmental sciences applications of cyberinfrastructure.

1) Somewhat surprisingly, the need for such expert knowledge emerges even in situations when we simply want to "fuse" data from different sources. In such situations, seemingly natural statistical approaches (such as Maximum Likelihood methods) sometimes lead to physically meaningless results. To get physically meaningful results, we must supplement the data itself (and the corresponding statistical information) with expert knowledge describing which fusion results are physically meaningful and which are not. In the talk, we show how this expert knowledge can help.
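
As a toy illustration of how this can happen (all numbers below are made up): when we fuse two noisy measurements of a quantity known to be nonnegative -- say, a concentration -- the standard Maximum Likelihood estimate under Gaussian noise is the inverse-variance weighted average, which can come out negative; imposing the expert constraint "the quantity is nonnegative" restores a physically meaningful answer. The Python sketch below illustrates only this general point, not the specific fusion methods described in the talk.

    # Two noisy measurements of a nonnegative quantity (values made up).
    x1, s1 = -0.4, 0.5   # measurement 1 and its standard deviation
    x2, s2 = -0.1, 0.8   # measurement 2 and its standard deviation

    # Unconstrained Maximum Likelihood fusion (Gaussian noise):
    # the inverse-variance weighted average.
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    ml = (w1 * x1 + w2 * x2) / (w1 + w2)

    # Expert knowledge: the quantity cannot be negative. For a Gaussian
    # likelihood, maximizing it over x >= 0 simply projects the
    # unconstrained estimate onto the physically meaningful range.
    constrained = max(ml, 0.0)

    print("unconstrained ML estimate: ", ml)   # negative: meaningless
    print("with the expert constraint:", constrained)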

2) The need for expert knowledge is even more acute in the actual data processing, e.g., in solving inverse problems, when we need to reconstruct the values of the quantities of interest -- such as density at different depths and different locations -- from the measurement results. From the mathematical viewpoint, the corresponding problems are often "ill-posed", meaning that usually, several drastically different density distributions are consistent with the same observations. Out of all these distributions, we need to select the physically meaningful one(s) -- and this is where expert knowledge is needed, to describe what "physically meaningful" means. Using the above geophysical problem as an example, we show how this expert knowledge can be taken into account.
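
The following small numerical sketch (with a made-up smoothing operator standing in for the physics) illustrates the general phenomenon: a naive inversion of an ill-posed problem oscillates wildly, while encoding the expert statement "the density profile varies smoothly with depth" as a Tikhonov-style smoothness penalty singles out a reasonable reconstruction. This is one standard way to encode such knowledge, not necessarily the method used in the talk.

    import numpy as np

    np.random.seed(0)
    n = 50
    idx = np.arange(n)

    # Made-up forward operator: heavy smoothing, so the problem is
    # ill-posed (the matrix A is nearly rank-deficient).
    A = np.exp(-(idx[:, None] - idx[None, :]) ** 2 / 50.0)
    A /= A.sum(axis=1, keepdims=True)

    true = np.sin(idx / 8.0) + 1.5              # "true" density profile
    obs = A @ true + 1e-3 * np.random.randn(n)  # slightly noisy observations

    # Naive inversion: consistent with the data, but wildly oscillating.
    naive = np.linalg.solve(A, obs)

    # Expert knowledge "the profile is smooth" as a penalty ||D x||^2,
    # where D is the first-difference operator.
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]
    lam = 1e-2
    smooth = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ obs)

    print("range of naive solution:   ", naive.max() - naive.min())
    print("range of smoothed solution:", smooth.max() - smooth.min())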

3) The above two applications are related to processing the existing data, i.e., the data coming from the existing measuring instruments. In many practical situations, the data from the existing instruments is not sufficient, so new measuring instruments are needed. For example, to get a better understanding of weather and climate processes, we need to place more meteorological stations in under-covered areas -- such as the Arctic, the Antarctic, and desert areas. Which are the best locations for these new instruments? Which are the best designs? We would like to gain as much information as possible from these new instruments. The problem is that we do not know exactly what processes we will observe -- this uncertainty is what motivates us to build the new stations in the first place. Because of this uncertainty, to make a reasonable decision, we need to use expert knowledge. In this talk, we show how we have used NASA's experience of solving a similar problem of optimization under uncertainty -- when NASA selected the sites for the first Moon landings -- to find the optimal location of a meteorological tower.
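
To make the flavor of such decisions concrete: when all we know about each candidate site is an interval of possible information gains, one standard decision rule for optimization under interval uncertainty is Hurwicz's optimism-pessimism criterion, which weighs the best and worst cases. The sketch below shows this rule; the candidate sites, intervals, and optimism degree are all hypothetical, and the criterion actually used in the talk may differ.

    # Candidate sites with interval bounds [lo, hi] on the information
    # gain a station placed there might provide (numbers made up).
    sites = {
        "ridge":  (2.0, 9.0),
        "valley": (4.0, 6.0),
        "coast":  (3.0, 8.0),
    }

    def hurwicz(lo, hi, alpha):
        """Hurwicz optimism-pessimism value of an interval [lo, hi].

        alpha = 0 is pure worst-case (maximin); alpha = 1 is pure
        best-case; intermediate values interpolate between the two."""
        return alpha * hi + (1.0 - alpha) * lo

    alpha = 0.5  # degree of optimism (illustrative choice)
    best = max(sites, key=lambda s: hurwicz(*sites[s], alpha))
    print("chosen site:", best)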

