30

 

 

Fusion of Ground and Satellite Data via Army Battle Command System

 

Stan Aungst, Mark Campbell, Jeff Kuhns,
David Beyerle, Todd Bacastow, and Jason Knox

CONTENTS

30.1   Introduction

30.2   Description of the Army Battle Command System

30.2.1   Situational Awareness

30.2.2   Common Operating Picture

30.2.3   Information Fusion and Decision Making

30.2.4   Joint Command and Control

30.2.5   The Global Command and Control System-Army

30.2.6   Force Battle Command Brigade-and-Below

30.3   Evolution of the Army Battle Command System

30.3.1   Remote Sensing, Ground-Based Systems (Image and Nonimage)

30.3.2   Tactical Unmanned Aerial Vehicles and Aerostats (Sensor Networks)

30.3.3   Ground Sensors

30.4   Discussion and Implications for Disaster Management

30.4.1   Simulation

30.4.2   Intelligence Analysis, Data Mining, and Visualization

30.5   Summary and Final Recommendations

Glossary of Key Terms

Acknowledgments

References

 

 

30.1   Introduction

Purpose, method, and endstate are the components of commander’s intent as set forth in the 1996 version of the U.S. Army’s Field Manual 100-5 (FM 100-5).1 The basic premise is that once military commanders state their intent, all other matters become subordinate, supplementary information. Intent thus provides a framework for command and information operations in the field.

In Force XXI, the U.S. Army is concentrating on developing a tactical command and control (C2) structure from the ground up, taking into consideration all previous C2 nodes and echelons.2 Previously, the Army relied on a theater-level C2 structure. Such a rigid structure is no longer feasible because recent events demand missions that are global in reach and flexible in execution.

Vital to this flexible C2 structure is a reliable, secure communications infrastructure that integrates forces and assets through a variety of sensors connected by mission-specific support systems. It is the authors’ intent to introduce this architecture, its application areas, and its evolution. We recommend a flexible C2 organizational structure that can be expanded or contracted depending on the event. We will discuss joint command and control (JC2) and situational awareness (SA), in which real-time information and data fusion for decision making form the basis of an effective communications infrastructure. The effective implementation of fusion technology across all agencies, military and civilian, will also decrease response and recovery time, whether on the battlefield or in a natural disaster.

Our method is to describe the current state of technology for SA and the common operating picture (COP) and its evolution. We begin by introducing the Army Battle Command System (ABCS) in Section 30.2, covering subsystems and concepts such as SA, the COP, information fusion and decision making, and JC2.

In Section 30.3, we discuss the evolution of the ABCS to include remote sensing, armed and unarmed aerial sensor platforms (unmanned aerial vehicles, UAVs), and ground sensors. Next, in Section 30.4, the authors suggest that a flexible JC2 structure, the ABCS, and a set of software engineering recommendations could provide a framework for mitigating a disaster, responding to it, and minimizing recovery time.

 

 

30.2   Description of the Army Battle Command System

In the 1980s, a decision was made within the military to aid the warfighter through automation. The idea was to incorporate a multitude of sensors, networks, and software to increase responsiveness and collaboration among all echelons of the military. Such a system had to be an integral component of all phases of an operation: movement, logistics, and maneuver all require some level of automation and information sharing.3 The ABCS4 was introduced to address this need. The system is composed of several sensor systems and networks (stationary and mobile) with software covering the battlefield operating systems (BOS). The advantage of these systems is the automated fusion of sensor, planning, positioning, logistics, and other important information that facilitates decision making. Information is distributed throughout the systems in all directions, both vertical and horizontal, to develop a consistent SA picture.

30.2.1   Situational Awareness

SA is an extremely important capability, necessary to effectively plan and execute operating orders under rapidly changing variables beyond the control of the command organization. Possessing precise knowledge of an evolving situation is only part of the process needed to control and successfully complete a mission; the ability to accurately and quickly combine (fuse) data from heterogeneous sources is equally important. Fusing these data allows an organization to be more proactive while limiting the number of dynamic variables outside its control. Additionally, the versatility of heterogeneous data allows many different analysis techniques (signal and image processing, simulation, mathematical modeling, etc.) to be applied, providing more accurate predictions of future events. The primary purpose of these fused data is to give a more accurate view of the battlefield by providing organizations with a COP, along with datasets that can be archived or used for training purposes.

30.2.2   Common Operating Picture

The COP represents a subset of the heterogeneous data collected and subsequently fused to give individuals, across all organizations involved, the same near-real-time view of the event. The COP provides all individuals with the ability to proactively respond and reactively adjust to situations in compliance with the intent of both the higher organization and their own. It captures key aspects of the situation, allowing individuals to see significant events as they occur in near real time. Input data may come from ground, aerial, satellite, or remote sensing sources or from human interaction. These heterogeneous sources must be fused to form reliable information for real-time decision making in the field.
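To make the idea concrete, the following short Python sketch maintains a toy COP store that keeps only the most recent timestamped report for each observed entity, whatever the source. The field names and the freshness window are illustrative assumptions, not drawn from any fielded system.

# Minimal sketch of a common operating picture (COP) store: keep the most
# recent report for each observed entity, regardless of which source
# (ground, aerial, satellite, human) produced it.
from dataclasses import dataclass

@dataclass
class Report:
    entity_id: str   # identifier assigned to the observed entity
    source: str      # e.g., "ground_sensor", "uav", "satellite", "human"
    lat: float
    lon: float
    timestamp: float # seconds since epoch

def update_cop(cop: dict, report: Report) -> None:
    """Insert a report, keeping only the newest observation per entity."""
    current = cop.get(report.entity_id)
    if current is None or report.timestamp > current.timestamp:
        cop[report.entity_id] = report

def snapshot(cop: dict, now: float, max_age_s: float = 300.0) -> list:
    """Return the near-real-time picture: reports no older than max_age_s."""
    return [r for r in cop.values() if now - r.timestamp <= max_age_s]

cop = {}
update_cop(cop, Report("veh-17", "uav", 40.21, -76.58, 1000.0))
update_cop(cop, Report("veh-17", "ground_sensor", 40.22, -76.57, 1030.0))
print(snapshot(cop, now=1060.0))   # the latest fix on veh-17 wins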

30.2.3   Information Fusion and Decision Making

Multisensor information fusion seeks to combine information from multiple sensors and sources to achieve inferences that are not feasible from a single sensor or source. The proliferation of micro- and nanoscale sensors, wired and wireless communication, and ubiquitous computing enables the assembly of information from sensors, models, and human input for a variety of applications such as Department of Defense (DoD) missions, environmental monitoring, crisis management, disaster management, medical diagnosis, monitoring and control of manufacturing processes, and intelligent buildings. A key problem is how to integrate or fuse information from heterogeneous sources. Techniques for such fusion are drawn from a broad set of disciplines including statistical estimation, signal and image processing, artificial intelligence, cryptography, software engineering, computer engineering, and the information sciences. Major issues involve communicating across an architecture of distributed sensing and processing sites, selecting and integrating disparate types of algorithms, defining the role of the human-in-the-loop for analysis and decision making, and determining the degree of automation and computer-aided cognition. Information fusion and JC2 provide an effective combination for flexible battlefield management.
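As a hedged illustration of the statistical-estimation side of fusion, the Python sketch below combines independent sensor estimates of the same quantity by inverse-variance weighting. It illustrates the general principle only; it is not the fusion algorithm used by the ABCS.

# Fuse independent estimates of the same scalar quantity (e.g., range to a
# target) by inverse-variance weighting: more confident sensors get more
# weight, and the fused estimate is more certain than any single input.

def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs from independent sensors.
    Returns (fused_value, fused_variance)."""
    if not estimates:
        raise ValueError("need at least one estimate")
    weights = [1.0 / var for _, var in estimates]
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Three sensors report the same range with differing confidence; the result
# is pulled toward the most confident sensor.
print(fuse_estimates([(102.0, 25.0), (98.0, 9.0), (105.0, 100.0)]))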

30.2.4   Joint Command and Control

The concept of joint operations in the military involves situations where organizations develop mechanisms for bridging organizational differences and extracting strategic value from interorganizational cooperation. This can involve the operation itself, communication infrastructures, standard protocols, tactical structures, processes, and human expertise. JC2 is defined as an interrelated system of command links or nodes integrating maneuver forces and strike assets, informed by a variety of sensors (ground, satellite, or other communications and data links). An important step is to create JC2 structures that function at the operational level to help warfighters respond quickly to events in regional commands.3

A C2 structure provides a framework for organizations (a system of systems) that operate under different levels of responsibility. Each level contributes to the overall framework, providing both higher and lower organizations with key support assets. These interrelated systems allow a particular situation (e.g., within the battlefield) to be controlled, synchronized, and supported logistically from the top down. Each organization’s leader has his or her own requirements for achieving objectives, which must fit within the constraints set by the higher organization’s leader. In addition, the higher-level organization must give subordinate elements the support they need to achieve their aspect of the mission. The higher-level organization must task each subordinate organization and then synchronize the completion of these tasks. This synchronization acts as a force multiplier and produces a more effective, synergistic effort. SA allows leaders at a higher level to see accurately and in near real time how the situation is unfolding and how their intent is being met, allowing them to adjust quickly and stay proactive in the decision-making cycle. The COP allows each individual at any level of the organization to visualize the environment. It provides validation that they are meeting the higher-level organization’s intent while ensuring that they stay within the constraints placed on them by the higher organization.

The ability to maintain C2 at every level requires a communications infrastructure that fuses data from heterogeneous sources, distributes the fused information, and provides SA and the COP to the appropriate levels of the organization in a timely, accurate, secure, and reliable manner. This command structure is usually enforced from the top down, with the highest level of the organization providing its key components with pertinent, accurate information. This flexible C2 structure enables lower elements to communicate successfully with higher and lower subsystems of the organization, enabling the whole organization to maintain effective C2 at each level. Simultaneously, appropriate subject-matter experts (e.g., G2, S2) analyze, fuse, and send the data to organization leaders in a form that provides accurate, near-real-time SA to facilitate critical decision making. Under these conditions, information fusion must function within a secure, reliable, scalable communications infrastructure to facilitate decision making at each level and enable a more effective JC2 infrastructure.

The potential of a data-intensive fusion system supported by such an information infrastructure is very promising. If implemented effectively, it would enable organizations to expand and contract their assets quickly and efficiently. However, the designers (e.g., software engineers) must also be cognizant of the requirements to operate within a secure, reliable communication system and to provide a common interface, for the purpose of greater autonomy and wider distribution of information across the battlefield.

30.2.5   The Global Command and Control System-Army

The Global Command and Control System-Army (GCCS-A) (Figure 30.1) is the Army component of the Global Command and Control System-Joint (GCCS-J).


FIGURE 30.1
Diagram of information inclusiveness.

The Army Tactical Command and Control System (ATCCS) is a subsystem of the GCCS-A that provides an information network supporting seamless information flow from the brigade to the corps level of operations. ATCCS receives information from Force XXI Battle Command Brigade-and-Below (FBCB2) and facilitates the flow of information from brigade through corps units to support critical decisions. The same information aggregated and analyzed by ATCCS is then transmitted to the GCCS-A, providing information flow both horizontally and vertically.

30.2.6   Force Battle Command Brigade-and-Below

FBCB2 provides SA and C2 to the lowest tactical echelons. This permits a bidirectional flow of data providing the COP discussed earlier.

The system is composed of the components summarized in Table 30.1.

The FBCB2 system provides the user with the general capabilities given in Table 30.2.

It should be noted that FBCB2 utilizes a common messaging protocol to communicate between systems. To send a message to a destination, FBCB2 uses a tactical internet that provides a redundant and routable data link, critical for battlefield operations. Because it is based on an accepted standard, this protocol increases flexibility and interoperability.
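The actual FBCB2 message formats are not reproduced here; the Python sketch below only illustrates the general idea of a common, self-describing message structure and encoding that any participating system can parse. The message type and fields are hypothetical.

# Every system produces the same self-describing message structure and a
# common encoding (JSON here), so any receiver can parse and dispatch it.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PositionReport:
    msg_type: str    # fixed discriminator so receivers can dispatch
    unit_id: str
    lat: float
    lon: float
    sent_at: float

def encode(report: PositionReport) -> str:
    return json.dumps(asdict(report))

def decode(raw: str) -> PositionReport:
    fields = json.loads(raw)
    if fields.get("msg_type") != "POSITION_REPORT":
        raise ValueError("unexpected message type")
    return PositionReport(**fields)

wire = encode(PositionReport("POSITION_REPORT", "A-Co-3", 40.43, -76.57, time.time()))
print(decode(wire))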

TABLE 30.1
FBCB2 Components


TABLE 30.2
FBCB2 System Capabilities


The Defense Information Infrastructure Common Operating Environment (DII COE)5 is a program that aids in defining standards for preexisting technologies. The use of standards ensures that FBCB2 and other ABCS systems can interoperate.

Common operating standards improve the quality of information processed, refined, and analyzed, aiding the decision-making process for battlefield management. Some of this information can be supplied by sensors or humans.

 

 

30.3   Evolution of the Army Battle Command System

30.3.1   Remote Sensing, Ground-Based Systems (Image and Nonimage)

Remote sensing technologies, together with other geospatial technologies such as geographic information systems (GIS), the Global Positioning System (GPS), and position, navigation, and timing (PNT) systems, play a significant role in improving the nation’s homeland security mission, disaster management, and critical infrastructure protection. Remote sensing from space combines a broad synoptic view with the ability to detect changes in surface features quickly and routinely. Remote sensing from aircraft and UAVs offers the ability to examine areas in great detail from below the clouds, whereas ground-based systems make possible the close-in observation of events in real time. Each of these areas of remote sensing technology can contribute significantly to homeland security and disaster management.6

Examples of the use of remotely sensed data to proactively and reactively support nuclear power plant facilities and transportation security include the following capabilities:

  • Detect, classify, and analyze temporal and spatial changes in surface features

  • Develop accurate digital terrain models and 3D surface features as a means for modeling landforms along rights of way

  • Visualize terrain from different perspectives, with the potential for developing threat cones and viewsheds (a minimal viewshed sketch follows this list)

  • Classify vegetation types along transportation lifelines as a possible deterrent to concealment

  • Identify facilities where topography or identifiable hazards (e.g., nuclear, chemical, fuel facilities) place communities at risk

  • Analyze environmental factors quickly and effectively

  • Merge real-time sensor output (video, biochemical sensors) with archived geospatial data

  • Identify, characterize, and analyze a wide variety of risks to transportation networks through a gradual program of gathering image intelligence along rights of way

  • Create detailed maps of an area that has suffered attack to assist in response7,8
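As a minimal illustration of the viewshed capability noted in the list above, the following Python sketch computes which cells of a small digital terrain model are visible from an observer by sampling along each sight line. It assumes a uniform grid and ignores earth curvature and refraction; production terrain-analysis tools are far more sophisticated.

# Toy viewshed over a small digital terrain model (DTM).
import numpy as np

def visible(dem, obs, target, obs_height=2.0, samples=50):
    """Return True if the target cell is visible from the observer cell."""
    (r0, c0), (r1, c1) = obs, target
    z0 = dem[r0, c0] + obs_height
    dist = float(np.hypot(r1 - r0, c1 - c0))
    if dist == 0.0:
        return True
    target_angle = (dem[r1, c1] - z0) / dist     # elevation angle to target
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if (r, c) in ((r0, c0), (r1, c1)):
            continue                             # skip the endpoints themselves
        d = t * dist
        if (dem[r, c] - z0) / d > target_angle:
            return False                         # terrain blocks the sight line
    return True

def viewshed(dem, obs):
    """Boolean grid of cells visible from the observer."""
    vis = np.zeros(dem.shape, dtype=bool)
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            vis[r, c] = visible(dem, obs, (r, c))
    return vis

dem = np.array([[10, 10, 10, 10],
                [10, 30, 10, 10],   # a ridge that blocks some sight lines
                [10, 10, 10, 10],
                [10, 10, 10, 10]], dtype=float)
print(viewshed(dem, obs=(0, 1)))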

30.3.2   Tactical Unmanned Aerial Vehicles and Aerostats (Sensor Networks)

The U.S. military’s inventory includes a number of UAV platforms with a suite of sensors to aid in military sensing.

Larger UAVs have a greater payload capacity, including weapon systems. Typically, onboard sensors enable surveillance (full-motion video cameras, thermal and infrared [IR] imagers), communications intercept, retransmission, and electronic warfare (EW) jamming (Figure 30.2).

Nuclear, biological, and chemical (NBC) agent detectors include mass spectrometers, gas analyzers, radiation sensors, and similar instruments.

An aerostat is a static, tethered airship fitted with a similar sensor payload. The Joint Land Attack Cruise Missile Defense Elevated Netted Sensor System (JLENS) is an example. The aerostat can deliver robust performance from its array of sensors (Figure 30.3).


FIGURE 30.2
Unmanned aerial vehicle.


FIGURE 30.3
Joint land attack cruise missile defense elevated netted sensor system.

30.3.3   Ground Sensors

Numerous ground-based sensors have been deployed, including sensor networks with acoustic, seismic, IR, and video capability. The rapid evolution of low-cost sensor networks, with each node containing GPS self-location capability, local wireless communications, and local processing, provides the ability to deploy sensor networks for detecting humans, animals, vehicles, chemical and biological entities, and local environmental conditions. These ground sensors may involve unattended sensor networks, mobile robotic sensors, and sensors worn by individual soldiers. Indeed, the soldiers themselves can become a multisensor system involving their own observations and the observations of sensors that they carry or wear.9 Issues in the use of ground-based sensors include challenging observing environments (e.g., the effects of weather, terrain, and line of sight on observations), power requirements for observation and for communication with other sensors, and questions of how to task and manage the sensor resources.10
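The following Python sketch illustrates, under assumed thresholds and field names, how an unattended ground sensor node with GPS self-location and local processing might report only threshold-crossing acoustic events, conserving power and bandwidth.

# Sketch of an unattended ground sensor node: it knows its own position,
# applies simple local processing to an acoustic sample stream, and emits a
# detection report only when a threshold is exceeded.
import statistics

class GroundSensorNode:
    def __init__(self, node_id, lat, lon, threshold_db=70.0):
        self.node_id = node_id
        self.lat, self.lon = lat, lon      # from onboard GPS self-location
        self.threshold_db = threshold_db

    def process(self, acoustic_samples_db):
        """Local processing: report only if the mean level exceeds the threshold."""
        level = statistics.mean(acoustic_samples_db)
        if level < self.threshold_db:
            return None                     # stay silent, save power/bandwidth
        return {"node": self.node_id, "lat": self.lat, "lon": self.lon,
                "event": "acoustic_detection", "level_db": round(level, 1)}

node = GroundSensorNode("ugs-04", 40.41, -76.59)
print(node.process([55.0, 58.0, 56.0]))   # below threshold -> None
print(node.process([78.0, 82.0, 80.0]))   # vehicle-like signature -> report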

 

 

30.4   Discussion and Implications for Disaster Management

In the twenty-first century, emergency management coordinators and managers face unprecedented challenges and threats. Increasing demands are being placed on agencies to maintain the existing communications infrastructure while taking on new responsibilities to improve system safety, communications, performance, and security policies. A variety of advanced technologies are now employed to enhance the planning, design, management, operation, communication, and maintenance of all facets of the nation’s public and private utilities (e.g., nuclear power plants) and critical transportation systems. Aerial, ground, and satellite remote sensing, including UAVs, and information fusion represent areas of rapid development that can be leveraged to address these challenges. These technologies have significant and unique potential for application to a number of natural disasters and emergencies, including homeland security applications.

A fundamental problem with a disaster is that it does not respect boundaries, whether political, regional, organizational, geographical, professional, or sociological.11 In addition, a disaster does not have a definite beginning and end point. It is a dynamic process that has one defining characteristic: chaos. Situation reports typically reveal several concerns that inhibit optimal decision making during a disaster:

  • Disasters and critical emergency events can overwhelm communication

  • Coordination among agencies may be difficult

  • Information may be fragmented

  • The reliability of data may be questionable; under data overload, decision makers may receive misinformation on which to base their decisions

  • Data or information fusion may not be fully available

Considerable efforts are ongoing throughout the disaster management community to articulate concerns and characterize the dynamics between organizations. An important guide for this effort is found in the recommendations made by the Board on Natural Disasters (BOND) in their report to the National Research Council.12

The BOND’s primary goals are to

  1. Improve decision making before, during, and after emergencies to ensure better access and quality of data and information

  2. Identify users and their needs

  3. Provide information products specifically designed to meet users’ needs

  4. Promote efficiency and cost-effectiveness

  5. Stimulate and facilitate mitigation

These are important components of any fully functional communications system for comprehensive disaster management. It is also important that any software engineering process focus on commercially available software when developing solutions that reflect the needs of the disaster management community. Software engineering should concentrate on the lower, more informative operational levels of the systems development life cycle (SDLC); this work in fact forms the basis for developing the software and hardware stipulated by the BOND.12 As stated by the BOND, the assessment process will also be required to identify germane components associated with specific disaster events and scenarios. These components will then form the basis for future technical requirements and deliverables.

Information management and policy are important issues related to disaster and emergency management. Disaster management is an exercise in information fusion and information processing, logistics, simulation, intelligence analysis, and decision making. To effectively undertake these tasks requires a thorough understanding of disaster information requirements and the characteristics associated with the unique disaster event. Disasters come in different sizes, have different behavior, and can be categorized on the basis of their impact on natural resources, transportation systems, communities, and so on. They can also be discriminated and categorized along a number of dimensions such as impact, severity, duration, geographic setting, and advance warning.12 To develop the information technology and data fusion architecture and infrastructure for case scenario application(s) and response, it is essential to understand the disaster event from the perspective of those responsible for assimilating the data and making critical decisions. The effective use of this information for producing operational plans and policies that respond to the disaster event and its recovery is critical to minimizing loss of life, property, and environmental impact.

As previously stated, it is important that the disaster management community identify commercial off-the-shelf (COTS) products and implement solutions that reflect the complex needs of disaster management. It is equally important that hardware and software engineers design the hardware and software to reflect the needs of the disaster management life cycle (DMLC) of mitigation, response, and recovery. Thus,

software engineering = disaster life cycle management

In addition, the procedure to fuse data (information fusion) requires accurate, reliable data (data accuracy) to facilitate decisions. Fusing unreliable data is not the answer; the old acronym GIGO (garbage in, garbage out) is still applicable for data fusion. Information that arrives late (timeliness) from disparate sources and is transmitted in nonstandard formats (data consistency) requires significant effort to recompile into a coherent picture (COP and data understandability) of the disaster area (SA). Timely information fusion is thus key to ensuring that immediate actions can be taken to minimize response time and recovery operations.
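A hedged sketch of this point in Python: before any fusion step, incoming reports can be gated for completeness (consistency) and age (timeliness), so that unreliable inputs never reach the fused picture. The required fields and the five-minute staleness limit are illustrative assumptions.

# Minimal data-quality gate applied before fusion: discard reports that are
# stale or missing required fields.
REQUIRED_FIELDS = {"source", "lat", "lon", "timestamp"}

def usable(report: dict, now: float, max_age_s: float = 300.0) -> bool:
    if not REQUIRED_FIELDS <= report.keys():
        return False                       # inconsistent: fields missing
    if now - report["timestamp"] > max_age_s:
        return False                       # stale: too old to act on
    return True

reports = [
    {"source": "uav-2", "lat": 40.1, "lon": -76.5, "timestamp": 990.0},
    {"source": "radio", "lat": 40.2, "timestamp": 995.0},              # no lon
    {"source": "sat", "lat": 40.3, "lon": -76.4, "timestamp": 400.0},  # stale
]
print([r["source"] for r in reports if usable(r, now=1000.0)])  # ['uav-2']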

We need to consider software principles that aid in the improvement of the DMLC. Some recommendations for disaster management software engineers are as follows:

  • Understand the information requirements before fusing the data

  • Be cognizant of the unique disaster event and its impacts

  • Utilize live, virtual, and constructive simulation for preparation and training for disasters13

  • Effectively use information fusion to aid in the production of operational plans and policies for mitigation, response, and recovery

  • Utilize event warning systems (e.g., ground sensors, remote sensing, and aerial sensing from UAVs) that aid in the preparedness, response, and recovery phases of the DMLC

  • Ensure that the communication infrastructure is scalable and interoperable within and between agencies

  • Select a common interface for each agency with common message formats

The Open Geospatial Consortium (OGC) is developing a standard sensor description model (metamodel) as part of its Sensor Web Enablement (SWE) effort. The OGC is an international industry consortium organized to develop publicly available geoprocessing specifications. OGC members are developing a standard XML encoding scheme for describing sensors, sensor platforms, sensor-tasking interfaces, and sensor-derived data. The goal is to make web-enabled devices discoverable and accessible using standard services and schemas. The Sensor Model Language (SensorML) is a component that provides the sensor information necessary for discovery, processing, and georegistration of sensor observations.
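The sketch below illustrates the spirit of such a standard, machine-readable sensor description using Python’s ElementTree; the element names are simplified placeholders and do not follow the actual SensorML schema.

# Produce and consume a simplified XML sensor description so that a generic
# client can discover what a sensor observes without sensor-specific code.
import xml.etree.ElementTree as ET

def describe_sensor(sensor_id, sensor_type, lat, lon, observed_property):
    root = ET.Element("SensorDescription", {"id": sensor_id})
    ET.SubElement(root, "Type").text = sensor_type
    pos = ET.SubElement(root, "Position")
    ET.SubElement(pos, "Latitude").text = str(lat)
    ET.SubElement(pos, "Longitude").text = str(lon)
    ET.SubElement(root, "ObservedProperty").text = observed_property
    return ET.tostring(root, encoding="unicode")

doc = describe_sensor("uav-ir-01", "thermal_imager", 40.43, -76.57,
                      "surface_temperature")
print(doc)

parsed = ET.fromstring(doc)
print(parsed.findtext("ObservedProperty"))   # discovered without custom code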

Historically, the geolocation of sensor data has required software specifically designed for that sensor system. The availability of a standard model language for describing platform position, as well as instrument geometry and dynamics, allows for the development of generic, multipurpose software that can provide geolocation for potentially all remotely sensed data. The availability of such software in turn provides a simple, single application programming interface (API) through which software developers can incorporate sensor geolocation and processing into their application software. This allows the development of software libraries that can parse these descriptions and calculate the required look angles and timing for each sensor pixel.

SensorML provides an XML schema for defining the geometric, dynamic, and observational characteristics of sensors, along with a standard XML encoding scheme for observations and measurements of all kinds. This research has also led to specifications for open interfaces for

  • Sensor collection services. A software service that provides observed values from sensors of a specified type available in a specified region.

  • Sensor planning services. A software service that enables acquisition requests and notification of relevant events.

  • Sensor registries. A catalog that enables discovery of sensors and observed values.
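As a minimal, in-memory illustration of the registry interface listed above, the following Python sketch registers sensor descriptions and discovers sensors of a given type inside a bounding box. Real OGC services are standardized web services; only the discovery pattern is shown here.

# Toy sensor registry: register descriptions, then discover by type and
# bounding box (lat_min, lon_min, lat_max, lon_max).
def in_box(lat, lon, box):
    lat_min, lon_min, lat_max, lon_max = box
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

class SensorRegistry:
    def __init__(self):
        self._sensors = []

    def register(self, sensor_id, sensor_type, lat, lon):
        self._sensors.append({"id": sensor_id, "type": sensor_type,
                              "lat": lat, "lon": lon})

    def discover(self, sensor_type, box):
        return [s["id"] for s in self._sensors
                if s["type"] == sensor_type and in_box(s["lat"], s["lon"], box)]

registry = SensorRegistry()
registry.register("ugs-04", "acoustic", 40.41, -76.59)
registry.register("uav-ir-01", "thermal", 40.43, -76.57)
print(registry.discover("acoustic", box=(40.0, -77.0, 41.0, -76.0)))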

There are other issues to consider before data fusion can occur. First, there must be a physical path for the data to flow, and this data path must be designed to handle a variety of situations. Different organizations respond to different situations, and each situation can be loosely categorized; the data path must be designed around the characteristics of these categories. An organization may have many subunits, and the paths may be decomposed and designed around these subordinate units. The categories are level of coverage area, mobility, flexibility, security, and bandwidth. Some organizations have only one function and cannot deviate from it; in this case flexibility is categorized as low. However, the same organization may need to be constantly on the move to react to a developing situation, and it may also have a large coverage area. This presents a problem: how does one develop a system that can communicate on the move over a large geographic area? Such a situation may call for a wireless communication medium not limited by line of sight, most likely a satellite link. The same link may require different levels of security: Is there a threat of jamming, and a need to overcome it? Does the data being transmitted need to be encrypted, or does a nonsecure link suffice?
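The following Python sketch illustrates, with an assumed and deliberately small catalog of link types, how the categories above (coverage, mobility, security, bandwidth, flexibility) might drive the selection of a data path; it is a reasoning aid, not a survey of real communication systems.

# Categorize candidate data paths and pick one that satisfies an
# organization's requirements. The catalog values are illustrative.
from dataclasses import dataclass

@dataclass
class DataLink:
    name: str
    coverage_km: float
    on_the_move: bool       # usable while the unit is mobile
    encrypted: bool
    bandwidth_mbps: float

CATALOG = [
    DataLink("line_of_sight_radio", 30.0, True, True, 2.0),
    DataLink("commercial_cellular", 15.0, True, False, 20.0),
    DataLink("satellite", 5000.0, True, True, 5.0),
]

def select_link(min_coverage_km, need_mobility, need_encryption, min_bw_mbps):
    for link in CATALOG:
        if (link.coverage_km >= min_coverage_km
                and (not need_mobility or link.on_the_move)
                and (not need_encryption or link.encrypted)
                and link.bandwidth_mbps >= min_bw_mbps):
            return link.name
    return None

# A mobile unit with a large coverage area and a security requirement ends
# up on the satellite link, matching the reasoning in the text.
print(select_link(min_coverage_km=500, need_mobility=True,
                  need_encryption=True, min_bw_mbps=1.0))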

In the military, the data link can be categorized as highly secure, mobile, and flexible. There is also a large coverage area, and applications are bandwidth intensive. The military has many echelons, which requires each of these categories to be decomposed to the lowest level. Security is a constant requirement within the military, as is a high level of flexibility: the wide range of military missions forces units to adapt to the situation to complete the mission. The remaining categories vary across levels of the military. In general, the higher the level, the greater the need for bandwidth and the larger the coverage area. Mobility is the reverse: the lower the level, the greater the need for mobility.

Disaster management organizations have a wide range of communication assets. Given the multitude of possible scenarios, disaster management organizations may require a highly flexible data system. This system must be flexible enough to handle any type of reorganization that might occur, which may be achieved through interoperability between different communication assets.

Software can be designed to perform specialized functions, or it can aid in developing the COP and thereby provide standard SA. The specialized functions must take raw data and fuse them into information relevant to specific functional areas. This allows decisions in these areas to be executed with more confidence while allowing the organizations or individuals responsible for these areas to stay as proactive as possible, minimizing the need to be reactive. The software must share some type of common protocol for passing and interpreting the data. A common protocol aids in the efficient use of physical resources as well as interoperability between systems.

The military has several different software packages used to ensure near-real-time SA, and all provide, at a minimum, the COP. There are systems dedicated to the logistical aspects of their missions, as well as to fire support, air defense, and maneuver control. The primary communication infrastructure for providing the COP and SA to the lowest possible level is FBCB2.14

Preparation for disaster management can be accomplished through simulation, intelligence analysis, data mining, and visualization, which may help preclude disasters or at least minimize their impact.

30.4.1   Simulation

There are basically three types of simulation, each of which could easily be integrated into training to prepare for disasters.

  1. Live simulation uses communication equipment in a real-life environment, out in the elements (rain, snow, sleet, heat, etc.), with assigned equipment and simulated disasters. An example would be a digital field exercise (DFX) using state-of-the-art communications equipment, sensors, and UAV technology.

  2. Virtual simulation places individual first responders, EMS personnel, and disaster management teams (e.g., hazmat and CBR teams) in simulators (virtual reality) that replicate actual disasters and homeland security scenarios as if the responders were in the field.

  3. Constructive simulation entails large-scale computer simulation that represents battalion-size units and above.3
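As a toy example of constructive simulation (item 3), the Python sketch below runs a Monte Carlo model of evacuation teams clearing sectors and estimates the probability of finishing within a deadline. All unit counts, rates, and the deadline are illustrative assumptions.

# Aggregate units and events are represented purely in software and run many
# times to estimate an outcome distribution.
import random

def run_once(teams=4, sectors=12, mean_clear_min=35.0, deadline_min=240.0):
    # Each team clears sectors one after another; clearing time is random.
    team_clock = [0.0] * teams
    for _sector in range(sectors):
        team = min(range(teams), key=lambda t: team_clock[t])  # next free team
        team_clock[team] += random.expovariate(1.0 / mean_clear_min)
    return max(team_clock) <= deadline_min

def estimate_success(trials=10_000):
    random.seed(1)
    return sum(run_once() for _ in range(trials)) / trials

print(f"P(all sectors cleared by deadline) ~= {estimate_success():.2f}")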

30.4.2   Intelligence Analysis, Data Mining, and Visualization

In addition to simulation exercises, disaster managers will be required to answer questions not yet posed. Intelligence analysis, data mining, and visualization will assist disaster managers in answering these questions. The ability to predict future hurricane events, for example, is enhanced by daily, seasonal, and annual queries run through data mining algorithms. Mathematical techniques such as fuzzy logic and genetic algorithms help these queries interpret the intelligence contained in the imagery. Airport runways and possible path obstructions, underpasses, overpass bridges, pipelines, borders, and port facilities are all candidates for virtualization.
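To illustrate the fuzzy-logic idea in a hedged way, the short Python sketch below assigns an observation a degree of membership in several storm-severity categories rather than a single hard label. The breakpoints are illustrative assumptions, not an operational model.

# Fuzzy membership: an observation belongs to each category to a degree
# between 0 and 1, instead of crossing a single hard threshold.
def ramp(x, low, high):
    """Membership rising linearly from 0 at `low` to 1 at `high`."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def storm_memberships(wind_speed_kt):
    return {
        "minor":  1.0 - ramp(wind_speed_kt, 40, 75),
        "major":  min(ramp(wind_speed_kt, 40, 75),
                      1.0 - ramp(wind_speed_kt, 95, 130)),
        "severe": ramp(wind_speed_kt, 95, 130),
    }

# A 100 kt storm is mostly "major" but already partly "severe", which is
# more informative to a data-mining query than a single label.
print(storm_memberships(100))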

It is our belief that information fusion is applicable to disaster management and homeland security, both proactively (mitigation and preparedness phases) and reactively (response and recovery phases). Some of the goals of information fusion are congruent with those stated by the BOND in its report to the National Research Council, listed earlier in Section 30.4. These are important components of an operational infrastructure for comprehensive disaster management.

Finally, it is important to prepare for disasters via simulation, intelligence analysis, and visualization methods and to be proactive about potential disasters, preempting them where possible before they occur.

Disaster management is an exercise in logistics and communications. Disaster agencies and personnel (especially managers) rely on their hardware and telecommunications; both agency personnel and software engineers must have an understanding of the information needs (requirements) and the characteristics of the disaster event and its uniqueness. This knowledge will facilitate effective architectures and technologies that meet the needs of the disaster management community. There must also be a precise understanding of the DMLC of mitigation, preparedness, response, and recovery. Traditional software engineering methodologies must match this life cycle and also be cognizant that each disaster is unique. In addition, a common interface should be designed for each agency.

 

 

30.5   Summary and Final Recommendations

Information fusion for situation assessment and the development of a COP requires a system of systems that integrates the capabilities of unmanned aerial and ground sensors, satellite communications, remote sensing, and joint C4ISR (command, control, communications, computers, intelligence, surveillance, and reconnaissance), along with a common interface to a reliable communication infrastructure. With such a capability, regional commanders and disaster management personnel can operate independently and be confident in their ability to respond to any situation.

 

 

Glossary of Key Terms

COP: The common operating picture provides a fused synopsis of the heterogeneous data collected, giving individuals across all organizations involved the same near-real-time view of the event.

DII COE: The Defense Information Infrastructure Common Operating Environment5 is a program for the purpose of reducing cost and ensuring capability. It defines a standard for using preexisting technologies that have already been proven.

FBCB2: Force Battle Command Brigade and Below provides SA and C2 to the lowest tactical echelons.

SA: Situational awareness is an extremely important capability for effectively planning and executing operating orders during a crisis involving rapidly changing variables that are beyond the control of the command organization.

 

 

Acknowledgments

We gratefully acknowledge the assistance of LTC Tim Purcell, Battalion Commander, for permitting Dr. Stan Aungst to attend classes, observe, and ask questions at three DFXs at Fort Indiantown Gap. We also thank Dr. David Hall, Associate Dean of Research, College of IST, Penn State University.

 

 

References

1. US Army Field Manual 100-5.

2. Force XXI: Division Redesign, Army Times, 22 June 1998.

3. MacGregor, D. A., Command and Control for Joint Strategic Action, Digital War, The 21st Century Battlefield, edited by Robert L. Bateman, i-books, 1999.

4. Army Battle Command System (ABCS), 21 February 1999. Online. Internet. http://www.fas.org/man/dod-101/sys/land/abcs.htm.

5. Defense Information Infrastructure Common Operating Environment (DII COE), 11 January 2007. Online. Internet. http://www.sei.cmu.edu/str/descriptions/diicoe.html.

6. Tobin, G. A. and Burrel, E., Natural Hazards: Explanation and Integration, London, NY, Guilford Press, 1997.

7. National Simulation Center, Training and Simulation, Fort Leavenworth, KS, Combined Arms Center, 1996.

8. NRC Report, Information Infrastructure for Managing Natural Disasters, Board on Natural Disasters, Washington DC, National Academic Press, 1998.

9. Magnuson, S., Army wants to make “every soldier a sensor”, National Defense Magazine, May 2007.

10. Avasala, V., Mullen, T., and Hall, D., A comprehensive sensor management approach based on market-oriented programming, Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology, Hong Kong, 18–22 December 2006.

11. NRC Report, Information Infrastructure for Managing Natural Disasters, Washington DC, National Academic Press, 2000.

12. Statement of Bruce Baughman Office of National Preparedness, Federal Emergency Management Agency (FEMA), Before the Committee on Transportation and Infrastructure, Subcommittee on Economic Development, Public Building and Emergency Management, US House of Representatives, 11 April 2002.

13. Roper, W., Geospatial Technology Support to the Nation’s Navigation System, Transportation Research Board, Washington DC, National Research Council, January 1999.

14. Force XXI Battle Command, Brigade-and-Below (FBCB2), 12 September 1998. Online. Internet. http://www.fas.org/man/dod-101/sys/land/fbcb2.htm.
