1

Perspectives on Distributed Data Fusion

David L. Hall

CONTENTS

1.1    Introduction

1.2    Brief History of Data Fusion

1.3    JDL Data Fusion Process Model

1.4    Process Models for Data Fusion

1.5    Changing Landscape: Key Trends Affecting Data Fusion

1.6    Implications for Distributed Data Fusion

References

1.1    INTRODUCTION

Multisensor data fusion has an extensive history and has become a relatively mature discipline. Extensive investments in data fusion, primarily for military applications, have resulted in a number of developments: (1) the widely referenced Joint Directors of Laboratories (JDL) data fusion process model (Kessler et al. 1991, Steinberg et al. 1998, Hall and McMullen 2004); (2) numerous mathematical techniques for data fusion ranging from signal and image processing to state estimation, pattern recognition, and automated reasoning (Bar-Shalom 1990, 1992, Hall and McMullen 2004, Mahler 2007, Das 2008, Liggins et al. 2008); (3) systems engineering guidelines (Bowman and Steinberg 2008); (4) methods for performance assessment (Llinas 2008); and (5) numerous applications (see, for example, the annual proceedings of the International Conference on Information Fusion). Recent developments in communications networks, smart mobile devices (containing multiple sensors and advanced computing capability), and participatory sensing, however, create the need to address distributed data fusion. Changes in information technology (IT) introduce an environment in which traditional sensing/computing networks for well-defined situation awareness (e.g., for military command and control (C2) or intelligence, surveillance, and reconnaissance [ISR]) are augmented (and sometimes surpassed) by uncontrolled, ad hoc information collection. The emerging concept of participatory sensing is a case in point (Burke et al. 2006). For applications ranging from environmental monitoring to crisis management to political events, ad hoc observers provide a huge source of information (albeit an uncalibrated one). Examples abound: (1) monitoring the spread of disease via Google search terms, (2) estimating earthquake events using Twitter feeds and specialized websites (U.S. Geological Survey (http://earthquake.usgs.gov) n.d.), (3) monitoring political events (http://ushahidi.com n.d.), (4) citizen crime watch (Lexington-Fayette Urban County Division of Police, see http://crimewatch.lfucg.com n.d.), (5) solicitation of citizens to report newsworthy events (Pitner 2012), and (6) use of citizens to collect scientific data (Hand 2010). While ad hoc observers and open source information provide a huge potential resource of data and information, the use of such data is subject to many challenges, such as establishing the pedigree of the data, characterizing the observer(s), assessing the trustworthiness of the data, and mitigating rumor effects, among many others (Hall and Jordan 2010).

Traditional information fusion systems, which involve user-owned and controlled sensor networks and an established system and information architecture for sensor tasking, data collection, fusion, dissemination, and decision making, are being enhanced or replaced by dynamic, ad hoc information collection, dissemination, and fusion concepts. These changes provide both opportunities and challenges. Huge new sources of data are now available via global human observers and sensor feeds accessible via the web, and these data can be accessed and distributed globally. Increasingly capable mobile computing and communications devices provide opportunities for advanced processing algorithms to be implemented at the observing source. The rapid creation of new mobile applications (apps) may provide new algorithms, cognitive aids, and information access methods “for free.” Finally, advances in human–computer interaction (HCI) provide opportunities for new engagement of humans in the fusion process: as observers, as participants in the cognition process, and as collaborating decision makers. However, with such advances come challenges in the design, implementation, and evaluation of distributed fusion systems.

This book addresses four key emerging concepts of distributed data fusion. Chapters 1 through 3 introduce concepts in network-centric information fusion, including the design of distributed processes. Chapters 4 through 8 address how to perform state estimation (viz., estimation of the position, velocity, and attributes of observed entities) in a distributed environment. Chapters 9 through 12 focus on target/entity identification and on higher level inferences related to situation assessment/awareness and threat assessment. Finally, Chapters 13 through 18 discuss the implementation environment for distributed data fusion, including emerging concepts of service-oriented architectures, test and evaluation of distributed fusion systems, and aspects of human engineering for human-centered fusion systems. The remainder of this chapter provides a brief history of data fusion, an introduction to the JDL data fusion process model, a review of related fusion models, a discussion of emerging trends that affect distributed data fusion, and finally some perspectives on distributed fusion.

1.2    BRIEF HISTORY OF DATA FUSION

The discipline of information fusion has a long history, beginning in the 1700s with the posthumous publication of Bayes’ theorem on probability (1763) and Gauss’ development of the method of least squares in 1795 to estimate the orbit of the newly discovered asteroid Ceres using redundant observations (redundant in the mathematical sense, meaning more observations than were strictly necessary for a minimum-data initial orbit determination). Subsequently, extensive research has been devoted to methods for processing data from multiple observers or sensors to estimate the state (viz., position, velocity, attributes, and identity) of entities. Mathematical methods in data fusion (summarized in Kessler et al. [1991], Hall and McMullen [2004], and many other books) span the range from signal and image processing methods to estimation methods, pattern recognition techniques, automated reasoning methods, and many others. Such methods have been developed throughout the period from 1795 to the present.
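To make the least-squares idea concrete, the following minimal Python sketch (an illustration added here, using hypothetical numbers rather than data from Gauss’ orbit problem) fits a constant-velocity motion model, position(t) = x0 + v·t, to five noisy position observations. With two unknowns and five observations, the system is redundant in exactly the sense described above, and the least-squares solution is the state that best explains all of the data.

```python
# Gauss-style least squares with redundant data (hypothetical measurements):
# estimate initial position x0 and velocity v from noisy 1-D positions.
import numpy as np

times = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # observation times (s)
positions = np.array([1.1, 3.0, 4.8, 7.2, 9.1])    # noisy positions (m)

# Model: position(t) = x0 + v * t  ->  two unknowns, five observations.
A = np.column_stack([np.ones_like(times), times])  # design matrix
state, residuals, rank, _ = np.linalg.lstsq(A, positions, rcond=None)
x0, v = state
print(f"estimated x0 = {x0:.2f} m, v = {v:.2f} m/s")  # approx. 1.00 m, 2.02 m/s
```

The same machinery generalizes directly to orbit determination: more observations than unknowns, a model linking state to measurements, and a solution that minimizes the squared residuals.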

A brief list of events in the history of information fusion is provided in the following:

•  Publication of Bayes’ theorem on probability (1763)

•  Gauss’ original development of mathematics for state estimation using redundant data (1795)

•  Development of statistical pattern recognition methods (e.g., cluster analysis, neural networks, etc.) (early 1900s–1940s)

•  Development of radar as a major active sensor for target tracking and identification (1940s)

•  Development of the Kalman filter (1960) for sequential estimation

•  Implementation of U.S. Space Track system (1961)

•  First demonstration of the Advanced Research Projects Agency computer network (ARPANET), the precursor to the Internet (1968)

•  Development of military-focused all-source analysis and fusion systems (1970s–present)

•  First cellular telephone network (1978)

•  National Science Foundation Computer Science Network (CSNET) (1981)

•  Formation of the JDL data fusion subpanel (mid-1980s)

•  Tri-Service Data Fusion Symposium (1987)

•  Formation of the annual National Symposium on Sensor and Data Fusion (NSSDF) (1988)

•  Creation of the JDL data fusion process model (1990)

•  Second-generation mobile cell phone systems (early 1990s)

•  Commercialization of the Internet (1995)

•  Annual ISIF Fusion conferences (since 1998)

•  Creation of the International Society of Information Fusion (ISIF) (1999)

•  Emergence of nonmilitary applications (1990s to present), including condition monitoring of complex systems, environmental monitoring, crisis management, medical applications, etc.

•  Emergence of participatory sensing to augment physical sensors (1990s)

While basic fusion algorithms have been well known for decades, the routine application of data fusion methods to real-time problems awaited the emergence of advanced sensing systems and computing technologies that allowed semi-automated processing. Automated data fusion requires a combination of processing algorithms, computers capable of executing the fusion algorithms, deployed sensor systems, communication networks to link the sensors and computing capabilities, and systems engineering methods for effective system design, development, deployment, and test and evaluation. Similarly, the emergence of distributed data fusion systems involving hard (physical sensor) data and soft (human observation) data requires a combination of new fusion algorithms, computing capabilities, communications systems, global use of smart phones and computing devices, and the emergence of a net-centric generation that routinely makes observations, tweets, reports, and shares such information via the web.

1.3    JDL DATA FUSION PROCESS MODEL*

In the early 1990s, a number of large-scale funded U.S. Department of Defense (DoD) efforts were underway to implement data fusion systems. An example was the U.S. Army’s All Source Analysis System (ASAS) (Federation of American Scientists [www.fas.org/]). The field of data fusion was emerging as a separate discipline, with limited common understanding of terminology, algorithms, architectures, or engineering processes. The JDL was an administrative group created to assist in coordinating research across the U.S. Department of Defense laboratories. The JDL established a subgroup to focus on issues related to multisensor data fusion; its formal name was the Joint Directors of Laboratories, Technical Panel for Command, Control and Communications (C3), data fusion subpanel. This subgroup created the JDL data fusion process model (see Figure 1.1). The model was originally published in a briefing (Kessler et al. 1991) to the Office of Naval Intelligence and was later presented in papers, used as an organizing concept for books (Hall and McMullen 2004, Liggins et al. 2008) and for national and international conferences, cited in government requests for proposals, and adopted, in a few cases, by government and industrial research organizations. The original briefing (Kessler et al. 1991) presented a hierarchical, three-layer model. The top layer of the model is shown in Figure 1.1. For each of the fusion “levels,” a second layer identified specific subprocesses and functions, while a third layer identified subfunctions and candidate algorithms to perform those functions. These sublayers are described in Hall and McMullen (2004).

FIGURE 1.1 Top level of JDL data fusion process model. (Adapted from Hall, D.L. and McMullen, S.A.H., Mathematical Techniques in Multisensor Data Fusion, Artech House, Norwood, MA, 2004.)

Since its inception, the model has undergone several additions and revisions. The initial model included only the first four levels of fusion processing: object refinement (level 1), situation refinement (level 2), threat refinement (level 3), and process refinement (level 4). Steinberg et al. (1998) extended the model by adding a precursor level of fusion and sought to make the model more broadly applicable beyond military applications. Level 0 fusion involves sensor-based data processing and estimation. Level 0 processing recognized the increasing role of smart sensors and processing at the sensor level. Hall et al. (2000) and, independently, Blasch and Plano (2002) extended the model to include human–computer interaction involving cooperative cognition between a human user and a data fusion system. Other extensions to the data fusion model have been discussed by Llinas, who presents the case for further consideration of current data fusion issues including distributed data fusion systems and ontology-based systems.

The six high-level processes defined in the JDL model are summarized as follows:

1.  Level 0 fusion (data or source preprocessing) involves processing data from sensors (e.g., signals, images, hyperspectral images, vector quantities, or scalar data) to prepare the data for subsequent fusion. Examples of data preprocessing include image processing, signal processing, “conditioning” of the data, coordinate transformations (to relate data from the origin or platform on which the sensor is located to a common set of coordinates), filtering, alignment of the data in time or space, and other transformations.

2.  Level 1 fusion (object refinement) combines data from multiple sensors or sources to obtain the most reliable estimate of an object’s location, characteristics, and identity. The term object usually indicates a physical object such as a vehicle or human. However, we could also fuse data to determine the location and identity of activities, events, or other geographically constrained entities of interest. The issues of object/entity location (estimation) are often discussed separately from the problem of object/entity identification. In real fusion systems, however, these subprocesses are usually integrated (a minimal sketch of the underlying estimate combination appears after this list).

3.  Level 2 fusion (situation refinement) uses the results of level 1 processing to develop a contextual interpretation of the observed entities: understanding how entities are related to their environment and how they are interrelated with one another. For example, the motion of vehicles in an environment may depend upon factors such as roads, road conditions, terrain, weather, and the presence of other vehicles. The actions of a human in a crowd might be interpreted much differently than the same motion and actions in the absence of other surrounding people. The techniques used for level 2 fusion may involve artificial intelligence, automated reasoning, complex pattern recognition, rule-based reasoning, and many other methods.

4.  Level 3 fusion (threat refinement/impact assessment) involves projecting the current situation into the future to determine the potential impact or consequences of threats associated with the current situation. Level 3 processing seeks to draw inferences about possible threats, courses of action in response to those perceived threats, and how the situation changes based on our changing perceptions. Techniques for level 3 fusion are similar to those used in level 2 processing but also include simulation, prediction, and modeling.

5.  Level 4 fusion (process refinement/resource management) seeks to improve the fusion process, making it more accurate, timely, and specific. This might be accomplished by redirecting the sensors or information sources, changing the control parameters of the other fusion algorithms, or selecting which algorithm or technique is most appropriate to the current situation and available data. The level 4 process involves functions such as sensor modeling, modeling of network communications, computation of measures of performance, and optimization of resource utilization.

6.  Level 5 processing (human–computer interaction/cognitive refinement) seeks to optimize how the data fusion system interacts with human users. The level 5 process seeks to understand the needs of the human user and respond to those needs by appropriately focusing the fusion system’s attention on things that are important to the user. Functions may include advanced displays, search engines, advisory tools, cognitive aids, and collaboration tools. This may involve traditional HCI functions such as geographical displays, displays of data and overlays, and processing of input commands, as well as nonvisual interfaces such as sound or haptic (touch) interfaces.
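As a concrete illustration of the estimate-combination step at the heart of level 1 fusion, the following minimal Python sketch (an illustration under simplifying assumptions, not an algorithm prescribed by the JDL model) fuses two independent scalar measurements, assumed to have zero-mean Gaussian errors, by inverse-variance weighting; the sensor values and variances are hypothetical.

```python
# Minimal level 1 fusion sketch: inverse-variance-weighted combination of
# redundant, independent scalar estimates (hypothetical values throughout).
import numpy as np

def fuse_estimates(means, variances):
    """Fuse independent scalar estimates by inverse-variance weighting.

    For independent Gaussian errors, this weighted mean is the
    minimum-variance unbiased combination of the inputs.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances                # more accurate sensors get more weight
    fused_variance = 1.0 / weights.sum()     # always <= the smallest input variance
    fused_mean = fused_variance * (weights * means).sum()
    return fused_mean, fused_variance

# Hypothetical example: a radar and an optical sensor both report the range
# to the same object, with different measurement variances.
fused, var = fuse_estimates(means=[101.5, 99.2], variances=[4.0, 1.0])
print(f"fused range = {fused:.2f} (variance {var:.2f})")  # ~99.66, variance 0.80
```

Note that the fused variance (0.80) is smaller than that of either sensor alone (4.0 and 1.0), which is the basic payoff of combining redundant observations.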

The originators of the JDL model fully recognized that the JDL levels were an artificial partitioning of the data fusion functions and that the levels overlap. In real systems, fusion is not performed in a sequential (level 0, level 1, …) manner. Instead, the processes are interleaved. For example, in level 1 processing, information about a target’s kinematics can provide insight into the target’s identification and potential threat (level 3). However, this artificial partition of data fusion functions has proven useful for discussion purposes.

1.4    PROCESS MODELS FOR DATA FUSION

There are a number of models that address cognitive and information processes that are related to data fusion. A survey and assessment of these process models was conducted by Hall et al. (2006). A summary of the models (and additional models) is presented in Table 1.1, along with references which describe the models in more detail. Hall et al. (2006) divided the models into two broad categories, data fusion models and decision making models. To a certain extent, this is an arbitrary partitioning but reflects how these models are referenced in the literature. In addition, models such as the observe–orient–decide–act (OODA) loop have several extensions and variations. Each of these models has advantages and disadvantages related to describing the fusion and decision making process. They are summarized here to indicate the potential variations in how to describe or characterize the process of fusing information to understand an evolving situation and ultimately result in a decision or action. A good discussion of higher level models for data fusion (viz., at the situation awareness and threat assessment levels) is provided by Bosse et al. (2007). It should be noted that the list of models in Table 1.1 is not exhaustive. There are a number of additional models related to specific application domains such as robotics and medicine. It should also be noted that these process models do not explicitly consider the distributed aspect of fusion.

TABLE 1.1
Summary of Data Fusion Models/Frameworks

Model: JDL data fusion process model
Description: A functional model for describing the data fusion process
References: Kessler et al. (1991), Steinberg et al. (1998), Hall et al. (2000), Blasch and Plano (2002), Hall and McMullen (2004), Liggins et al. (2008)

Model: Functional levels of fusion
Description: An abstraction of the input–output functions of the data fusion process, focusing on the types of data processed and the techniques appropriate to each data type
References: Dasarthy (1994)

Model: Transformation of requirements to information processing (TRIP) model
Description: Application of the waterfall development process to data fusion, with emphasis on linking inferences to required information and data collection
References: Kessler and Fabien (2001)

Model: Omnibus model
Description: Adaptation of Boyd’s OODA loop for data fusion
References: Bedworth and O’Brien (2000)

Model: Endsley’s model of situational awareness
Description: A cognitive model for situational awareness
References: Endsley et al. (2000), Endsley et al. (2003)

Model: Three-layer hierarchical model
Description: Three-layer modular approach to data fusion, integrating data at different levels: (1) data level (e.g., signal processing), (2) evidence level (statistical models and decision making), and (3) dynamics level
References: Thomopoulos (1989)

Model: Behavioral knowledge formalism
Description: Sequence of basic stages of fusion: extraction of a feature vector from data, alignment and association, development of pattern recognition and semantic labels, and linking of feature vectors to events
References: Pau (1988)

Model: Waterfall model
Description: Hierarchical architecture showing the flow of data and inferences from the data level to the decision-making level
References: Harris et al. (1998)

Model: General data fusion architecture (DFA) model using UML
Description: General data fusion architecture model based on the Unified Modeling Language (UML), using a taxonomy based on definitions of data and variables or tasks
References: Carvalho et al. (2003)

Model: Unified data fusion (λJDL) model
Description: Model that seeks to unify situation awareness functions, the common operating picture, and data fusion
References: Lambert (1999, 2001)

Model: Recognition-primed decision (RPD) making
Description: A naturalistic theory of decision making focused on recognition of perceptual cues and action
References: Kaempf et al. (1996), Klein and Zsambok (1997), Klein (1999)

Model: Observe, orient, decide, act (OODA) loop
Description: A process model of military decision making based on observing effective commanders; extended by several authors for general situation assessment and decision making
References: Boyd (1987), Rousseau and Breton (2004), Brehmer (2005), Grant (2005), Bryant (2006)

Model: Salerno’s model
Description: A framework that links data sources (categorized by perishability) to perception, comprehension, and projection
References: Salerno (2002), Salerno et al. (2004)

In the domain of military applications and intelligence, the two most utilized models are arguably the JDL data fusion process model summarized in the previous section and Mica Endsley’s model of situation awareness (Endsley et al. 2000, Endsley et al. 2003). Because of its extensive use in the situation awareness and cognitive psychology communities, Endsley’s model is illustrated in Figure 1.2. The model seeks to link aspects of a cognitive task (illustrated in the top part of the figure) to characteristics of the individual performing the cognition (shown in the bottom part of the figure). Note that the levels in Endsley’s model do not correspond to the levels in the JDL model; rather, they are meant to model the cognitive processes for situation awareness. Endsley and her colleagues have utilized this model for a variety of DoD applications, performing extensive interviews with operational analysts and knowledge elicitation to identify appropriate techniques for the Endsley levels of fusion. Salerno (2002) and his colleagues (Salerno et al. 2004) have compared the JDL and Endsley models and have developed a high-level information functional architecture.

1.5    CHANGING LANDSCAPE: KEY TRENDS AFFECTING DATA FUSION

The context of distributed data fusion involves (1) rapid changes in IT, (2) individual and societal changes impacted and enabled by IT, and (3) the role of IT as both a cause of and a solution for global problems. A summary of sample trends is provided in Tables 1.2 through 1.4. The tables list trends related to three main constructs: (1) IT, (2) information, and (3) people. Briefly, we see the following trends and associated impacts.

FIGURE 1.2 Endsley’s situation awareness model. (Adapted from Endsley, M.R. et al., Designing for Situation Awareness: An Approach to User-Centered Design, Taylor & Francis Group, Inc., New York, 2003.)

•  Information Technology—Very rapid changes are occurring in IT, ranging from ubiquitous, persistent surveillance of the entire earth via advanced sensors and human observers, to increasingly capable mobile computing devices (smart phones, embedded “invisible” computers in everyday devices, netbooks, notebook computers, etc.), to ubiquitous network connectivity with increasing access speeds, to improvements in HCI via multisensory input and output. This leads to near-universal connectivity among people, a tsunami of data on the web, and access to virtually unlimited computing capability. These trends affect all aspects of human life and enterprise, and they certainly affect the concepts and implementation of data fusion systems. A summary of key areas, including data collection, mobile computing, and network speed and connectivity, is provided in Table 1.2.

•  Information—The huge increase in available data (including signal, image, video, and text) from sensors and human input leads to major challenges in storage, access, archiving, distribution, and meta-data generation, as well as issues such as data pedigree. The fundamental limit on human attention (the limited number of people available to examine data and their limited capacity to attend to it) will create both opportunities and challenges in human–data interaction. Table 1.3 summarizes key areas, including data archiving and distribution, meta-data generation, and hard and soft fusion.

TABLE 1.2
Examples of Technology Trends Impacting Data Fusion

Area

Trends and Issues

Data collection

•  Ubiquitous, persistent surveillance: The capability now exists for worldwide ubiquitous, persistent surveillance. Resources such as national collection systems, long-duration unmanned aerial vehicles (UAVs), and leave-behind and resident ground-based sensors provide the opportunity for multispectral, multimode surveillance on a 24 × 7 basis. This allows focused and persistent surveillance of nearly any area of interest. The challenge is addressing the resulting avalanche of data: sorting through it to find useful information and generate meaningful knowledge. Such surveillance impacts areas such as environmental monitoring, understanding the distribution and evolution of disease, and crime and terrorism.

•  New sensors and sensing modalities: Physical sensors continue to be improved with new modalities of observation, increased sophistication in embedded signal and image processing, increased modes and agility in operation and control, and continuing improvements in sensor-level processing such as semantic meta-data generation, pattern recognition, dynamic sensor performance characterization, target tracking, and adaptive processing.

•  Open source information: Websites are available for all sorts of collected information. For example, sites based on reporting and mapping tools (ushahidi.com) provide information on emergency events, political uprisings, etc. Google Street View provides maps and photographs of numerous places around the world with ground level 360° photographs. The photograph sharing site Flickr (flickr.com) contains over 5 billion photographs taken by 10 million active subscribers. Commercial data providers such as DigitalGlobe (digitalglobe.com) provide access to satellite imagery including standard visual images, stereo images, and eight-band spectral images. Data regarding weather information, environmental data, detailed maps, video surveillance cameras, traffic monitoring, and many other types of information are all readily available.

Mobile computing

•  Mobile computing capabilities are rapidly increasing in functionality, memory, speed, and network interconnectivity. New smart phones typically offer 4–16 GB of memory (expandable to 32 GB), processing speeds of 1–1.2 GHz, fourth-generation communications, and touch screens with 480 × 800 to 960 × 640 pixels. Over 1 million open source applications have been developed. The result is incredible hand-carried computing/sensing/communications devices that have proliferated throughout the world.

Network speed and connectivity

•  Internet connectivity is nearing worldwide ubiquity. The original connections via telephone landlines at 56 kilobits per second have given way to connections via television cable coax or fiber optics at typical speeds of 4–6 megabits per second, with additional mobile connectivity via mobile broadband over terrestrial mobile phone networks, WiFi hotspots in urban areas, and satellite Internet connections. While the United States lags behind other countries, some countries provide connections into homes at speeds of 100 Mbps. Increasingly, mobile devices share and access video data via the mobile Internet, to the extent that video dominates mobile Internet data content. An excellent summary of the history of the Internet is available at zakon.org.

Cloud computing

•  Cloud computing involves the delivery of computing resources as a utility (sharing resources and information over a network), analogous to the electric grid. The ubiquity of Internet connectivity puts computing resources (large data storage, sophisticated computer models, large-scale computing capacity, etc.) at anyone’s fingertips for a fee. Ultimately such concepts could eliminate local IT staff and computers while providing access to unprecedented capability. An example is Wolfram Alpha (wolframalpha.com), which provides free access to large data sets and sophisticated physical and mathematical models.

Human computer interfaces

•  Human–computer interfaces: Advances in HCI involve increased fidelity in human access to data as well as multisensory methods of interaction. Examples include full-immersion three-dimensional displays, sonification (Ballora 2010) to enable visual and aural pattern recognition, and haptic interfaces to provide a sense of touch. The potential exists to create new multisensory, full-immersion interfaces that fully engage humans’ sophisticated abilities to recognize patterns and detect anomalies.

TABLE 1.3
Examples of Information Trends Impacting Data Fusion

Area

Trends and Issues

Data archiving and distribution

The exploding digital universe: According to a 2010 Gartner report, the top three challenges for large enterprises are data growth, followed by system performance and scalability (Harding 2010). In 2007, the digital universe comprised 2.25 × 10²¹ bits (281 exabytes); by 2011, it was estimated to have grown by a factor of 10. Fast-growing data sources include digital TV, surveillance cameras, sensor applications, and social networks. Major issues include how to store, archive, distribute, access, and represent such data (Chute et al. 2008).

Meta-data generation

Meta-data generation: Given the enormous amounts of data (signals, images, video) being collected and stored via the Internet of Things and human data collection, a key challenge is how to represent the data for subsequent retrieval and use. Significant advances are being made in automated linguistic indexing of pictures (viz., machine-generated semantic labels), with anticipated extensions to signal and video data. This would provide the ability to access signal, image, and video data via emerging advanced search engines (e.g., next-generation CiteSeer-type engines [citeseer.ist.psu.edu]).

Hard and soft fusion

Hard and soft information fusion: An emerging area in data fusion research is the fusion of hard (traditional physical sensor) data and soft (human observation) data. This topic was first discussed at a Beaver Hollow workshop held in February 2009, hosted by the Center for Multisource Information Fusion (CMIF) (see infofusion.buffalo.edu). The workshop explored issues in the fusion of hard and soft data, characterization of human source data, architecture issues, and even fundamental definitions of the terms hard and soft fusion. Since that workshop, special sessions on hard and soft fusion have been held at the International Society of Information Fusion (ISIF) FUSION 2010 conference and the FUSION 2011 conference.

TABLE 1.4
Examples of People Trends Impacting Data Fusion

Area

Trends and Issues

Digital natives

Net-generation: The current “net-generation” of people under the age of 30 has grown up with the Internet, cell phones, social networks, global connectivity, instantly available online resources, and significantly different social outlooks and cognitive approaches than previous generations (see Tapscott 2009). These “digital natives” have different expectations for everything from social interaction to business to problem solving, and those expectations are having a significant impact on all aspects of society. Shirkey (2010) describes some implications of the new era of collaboration, which has produced projects such as the world’s encyclopedia (Wikipedia), shareware software, PatientsLikeMe, Ushahidi, and other dynamic collaborative efforts.

Participatory sensing

Soft and participatory sensing: Several developments and trends have provided the opportunity for the creation of a new, worldwide, data collection resource. These include (1) the huge increase in smart phones throughout the world (estimated in 2010 to be greater than 4.6 billion cell phones), (2) the increase in processing capability and sensor “add-ons” to smart phones (including high fidelity cameras, video capability, environmental sensors, etc.), and (3) the emergence of the digital native generation (Palfrey and Gasser 2008) who routinely collect information and share personal information via Twitter, Facebook, and other social sites. This has led to the concept of participatory sensing, in which individuals and groups of people actively participate in the collection of information for purposes ranging from crime prevention to scientific studies.

•  People—Finally, changes in IT and availability of information lead to changes in human behavior and expectations. The net-generation (people younger than 30 years) has always had access to the Internet, cell phones, computers, and related technologies. These “digital natives” exhibit different ways of addressing problems, viewpoints on collecting and sharing personal information, ways of establishing distributed social networks, etc. This in turn has implications for education, collaboration, business, and information security. Table 1.4 summarizes the potential impacts of a new generation of digital natives and the emergence of participatory sensing.

1.6    IMPLICATIONS FOR DISTRIBUTED DATA FUSION

As indicated in the previous section, a number of changes in technology, information, and people are impacting, and will continue to impact, the design and implementation of information fusion systems. Certainly, the proliferation of cell phones (leading to an avalanche of human observations), ubiquitous high-speed networks, increased mobile computing power, cloud computing, new user attitudes (based on a digital-native outlook), and other factors are impacting data fusion systems. We are seeing the potential for “everyday” fusion systems supporting improved monitoring and operation of automobiles, medical diagnosis, monitoring of the environment, and even smart appliances. It is thus necessary to reconsider traditional data fusion technologies, designs, and implementation methods to extend them to these new applications and environments. While the changes in technology, information, and people provide increased opportunities, they also pose challenges to traditional thinking about fusion systems. As sensors and sources of information proliferate and new mobile applications become readily available, key challenges will include (1) calibration and characterization of information sources; (2) establishment of methods to automatically determine the trustworthiness and pedigree of information; (3) automatic generation of semantic meta-data to represent signal, image, and video data; (4) establishing the reliability of open-source software and algorithms; (5) meeting the expectations of increasingly sophisticated users; (6) creation of hierarchies of data and information fusion systems; (7) understanding how to utilize sensor-generated meta-data (e.g., from in situ pattern recognition); and (8) development of robust architectures that span data fusion to knowledge fusion, among many others.

It is hoped that this book will provide some additional insights to begin to address some of these issues.

REFERENCES

Ballora, M. 2010. Beyond visualization: Sonification. In Human-Centered Information Fusion, Hall, D. and J. Jordan (eds.), chapter 7. Norwood, MA: Artech House, Inc.

Bar-Shalom, Y. (ed.) 1990. Multi-Target-Multi-Sensor Tracking Advanced Applications, vol I. Norwood, MA: Artech House.

Bar-Shalom, Y. (ed.) 1992. Multi-Target-Multi-Sensor Tracking Advanced Applications, vol II. Norwood, MA: Artech House.

Bedworth, M. and J. O. O’Brien. 2000. The omnibus model: A new model of data fusion? IEEE Aerospace and Electronic Systems Magazine, 15(4), 30–36.

Blasch, E. and S. Plano. 2002. DFIG Level 5 user refinement issues supporting situational assessment reasoning. Proceedings of SPIE, vol. 4729, Wyndham, PA, pp. 270–279.

Bosse, E., J. Roy, and S. Wark. 2007. Concepts, Models and Tools for Information Fusion. Norwood, MA: Artech House.

Bowman, C. L. and A. N. Steinberg. 2008. Systems engineering approach for implementing data fusion systems. In Handbook of Multisensor Data Fusion: Theory and Practice, 2nd edn., M. E. Liggins, D. L. Hall, and J. Llinas (eds.), chapter 22, pp. 561–596. Boca Raton, FL: CRC Press.

Boyd, J. 1987. A discourse on winning and losing. Technical Report, Maxwell AFB, Montgomery, AL.

Brehmer, B. 2005. The dynamic OODA loop: Amalgamating Boyd’s OODA loop and the cybernetic approach to command and control. Proceedings of the 10th International Command and Control Research Technology Symposium, McLean, VA.

Bryant, D. J. 2006. Rethinking OODA: Toward a modern cognitive framework of command decision making. Military Psychology, 18(3), 183.

Burke, J. et al. 2006. Participatory sensing. Proceedings of WSW’06 at SenSys’06, October 31, 2006, Boulder, CO.

Carvalho, R., W. Heinzelman, A. Murphy, and C. Coelho. 2003. A general data fusion architecture. Proceedings of the Sixth International Conference on Information Fusion (Fusion’03), July 2003, Cairns, Queensland, Australia, pp. 1465–1472.

Center for MultiSource Information Fusion based at the University of Buffalo. http://www.infofusion.buffalo.edu, owned and maintained by Center for MultiSource Information Fusion, University of Buffalo, June 27, 2012.

Chute, C., A. Manfrediz, S. Minton, D. Reinsal, W. Schlichting, and A. Toncheva. March 2008. The diverse and exploding digital universe: An updated forecast of world wide information growth through 2011. IDC White paper sponsored by EMC.

CiteSeer. http://citeseer.ist.psu.edu/index, CiteSeerX: owned and maintained by The Pennsylvania State University College of Information Sciences and Technology, June 27, 2012.

Das, S. 2008. High-Level Data Fusion. Norwood, MA: Artech House.

Dasarthy, B. V. (ed.) 1994. Decision Fusion. Washington, DC: IEEE Computer Society.

DigitalGlobe. http://www.digitalglobe.com/, owned and maintained by Digital Globe corporation, June 27, 2012.

Endsley, M. R., B. Bolte, and D. G. Jones. 2003. Designing for Situation Awareness: An Approach to User-Centered Design. New York: Taylor & Francis Group, Inc.

Endsley, M. R., L. O. Holder, B. C. Leibrecht, D. C. Garland, R. L. Wampler, and H. D. Matthews. 2000. Modeling and Measuring Situation Awareness in the Infantry Operational Environment. Alexandria, VA: U.S. Army Research Institute for the Behavioral and Social Sciences, Infantry Forces Research Unit.

Federation of American Scientists, Intelligence resource Program. http://www.fas.org/irp/program/process/asas.htm, Federation of American Scientists: Intelligence Resource Program, All Source Analysis System, maintained by Steven Aftergood, updated November 25, 1998.

Flickr. http://www.flickr.com/, owned and maintained by Yahoo, downloaded June 27, 2012.

Grant, T. 2005. Unifying planning and control using an OODA-based architecture. Proceedings of the 2005 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Mpumalanga, South Africa, pp. 159–170.

Hall, M. S., S. A. Hall, and T. Tate. 2000. Removing the HCI bottleneck: How the human computer interface (HCI) affects the performance of data fusion systems. Proceedings of the 2000 MSS National Symposium on Sensor and Data Fusion, June 2000, San Diego, CA, pp. 89–104.

Hall, D. and J. Jordan. 2010. Human-Centered Information Fusion. Norwood, MA: Artech House, Inc.

Hall, D. and J. Llinas. 1997. An introduction to multi-sensor data fusion. Proceedings of the IEEE, 85(1), 6–23.

Hall, D. L. and S. A. H. McMullen. 2004. Mathematical Techniques in Multisensor Data Fusion. Norwood, MA: Artech House.

Hall, D. et al. 2006. Assessing the JDL model: A survey and analysis of decision and cognitive process models and comparison with the JDL model. Proceedings of the National Symposium on Sensor Data Fusion, June 2006, Monterey, CA.

Hand, E. 2010. Citizen science: People power. Nature 466, 685–687.

Harding, N. 2010. Gartner: Data storage growth is the top challenge for IT organizations. Posting on IT Knowledge Exchange, November 3, 2010. (see http://itknowledgeexchange.techtarget.com/server-farm/gartner-data-storage-growth-is-the-top-challenge-for-it-organizations/)

Harris, C. J., A. Bailey, and T. J. Dodd. 1998. Multi-sensor data fusion in defense and aerospace. Aeronautical Journal, 102(1015), 229–244.

Kaempf, G. L., G. Klein, M. L. Thorsden, and S. Wolf. 1996. Decision making in complex naval command-and-control environments. Human Factors, 38(2), 220–231.

Kessler, O. et al. November 1991. Functional description of the data fusion process. Report prepared for the Office of Naval Technology Data Fusion Development Strategy, Naval Air Development Center, Warminster, PA.

Kessler, O. and B. Fabien. 2001. Estimation and ISR process integration. Report for the Defense Advanced Projects Research Agency (DARPA), Washington, DC.

Klein, G. A. 1999. Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press.

Klein, G. A. and C. E. Zsambok (eds.). 1997. Naturalistic Decision Making. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Lambert, D. A. 1999. Assessing situations. Proceedings of the IEEE 1999 Information, Decision and Control, February 1999, Adelaide, South Australia, Australia, pp. 503–508.

Lambert, D. A. 2001. Situations for situation awareness. Proceedings of the ISIF Fourth International Conference on Information Fusion, (FUSION 2001), August 2001, Montreal, Quebec, Canada, pp. 545–552.

Lexington-Fayette Urban County Division of Police. http://crimewatch.lfucg.com, Lexington-Fayette Urban County of Division Police, Crime Map, owned and maintained by Lexington, Kentucky government, June 29, 2012.

Liggins, M., D. L. Hall, and J. Llinas. 2008. Handbook of Multisensor Data Fusion, 2nd edn., Boca Raton, FL: CRC Press.

Llinas, J. 2008. Assessing the performance of multisensor fusion processes. In Handbook of Multisensor Data Fusion: Theory and Practice, 2nd edn., M. E. Liggins, D. L. Hall, and J. Llinas (eds.), chapter 25, pp. 655–675. Boca Raton, FL: CRC Press.

Mahler, R. P. S. 2007. Statistical Multisource-Multi-Target Information Fusion. Norwood, MA: Artech House.

Palfrey, J. and U. Gasser. 2008. Born Digital: Understanding the First Generation of Digital Natives. New York: Basic Books.

Pau, L. F. 1988. Sensor data fusion. Journal of Intelligent and Robotic Systems, 1, 103–116.

Pitner, S. 2012. Reporting news with a cell phone, http://handheldjournalism.com/reporting-news-with-a-cell-phone/, March 14, 2010.

Rousseau, R. and R. Breton. 2004. The M-OODA: A model incorporating control functions and teamwork in the OODA loop. Proceedings of the 2004 Command and Control Research Technology Symposium, San Diego, CA, pp. 15–17.

Salerno, J. 2002. Information fusion: A high-level architecture overview. Proceedings of the 5th International Conference on Information Fusion, Annapolis, MD, pp. 1218–1230.

Salerno, J., M. Hinman, and D. Boulware. 2004. Building a framework for situation awareness. Proceedings of the 7th International Conference on Information Fusion, Stockholm, Sweden, pp. 680–686.

Shirkey, C. 2010. Cognitive Surplus: Creativity and Generosity in a Connected Age. New York: The Penguin Group.

Steinberg, A. N., C. L. Bowman, and F. E. White. 1998. Revisions to the JDL model. Joint NATO/IRIS Conference Proceedings, October 1998, Quebec City, Quebec, Canada.

Tapscott, D. 2009. Grown Up Digital. New York: McGraw Hill.

Thomopoulos, S. C. 1989. Sensor integration and data fusion. Proceedings of SPIE 1189, Sensor Fusion II: Human and Machine Strategies, November 1989, Philadelphia, PA, pp. 178–191.

U.S. Geological Survey. http://earthquake.usgs.gov, U.S. Geological Survey, earthquake hazards program, U.S. Department of the Interior, June 28, 2012.

Ushahidi. http://ushahidi.com/, Ushahidi, owned and maintained by Ushahidi, a nonprofit technology company, June 29, 2012.

Waltz, E. and J. Llinas. 1990. Multisensor Data Fusion. Norwood, MA: Artech House, Inc.

WolframAlpha Computational Knowledge Engine. http://www.wolframalpha.com/, owned and maintained by Wolfram Alpha Corporation, June 29, 2012.

Zakon Group, LLC. http://www.zakon.org/robert/internet/timeline/, Zakon Group Hobbes’ Internet Timeline 10.2, by Robert Hobbes Zakon, December 30, 2011.

*  The Joint Directors of Laboratories data fusion process model has been described in multiple references including (1) the original technical report (Kessler et al. 1991) and (2) various textbooks (Waltz and Llinas 1990, Hall and McMullen 2004, Hall and Jordan 2010), review articles (Hall and Llinas 1997), and revisions of the model (Steinberg et al. 1998, Hall et al. 2000, Blasch and Plano 2002). The JDL model has been referenced extensively in books, papers, government solicitations, and tutorials. This section of this chapter is thus not new, but rather a brief summary that paraphrases (and in some cases duplicates) the author’s previous writings on this subject.
