
Big Data And Intelligence Essay

In 2003, when the National Geospatial-Intelligence Agency (NGA) was established, the concept of geospatial intelligence (GEOINT) was still in its infancy. Its roots had been laid in preceding decades, but the conditions were finally right for the discipline to develop, grow, and branch out, supported by a growing community practicing a proven craft.[footnoteRef:1] By definition, geospatial intelligence involves the exploitation and analysis of imagery and geospatial information to describe, assess, and visually depict physical features and geographically referenced activities on Earth. The term is meant to highlight the essential property of geographic location attached to the data that the NGA and the wider intelligence community generate and analyze, and to emphasize the value-added analysis the NGA performs to produce a specific kind of actionable intelligence.[footnoteRef:2] Given that the purpose of geospatial intelligence is to locate and track anything and anyone on the planet, collecting and examining GEOINT sources and generating timely, precise, relevant, and actionable intelligence requires strong global partnerships, an aspect that has grown in recent years. A decade later, GEOINT is in full bloom.[footnoteRef:3] The progress of the past decade is remarkable, and that is what makes the next decade so significant. This paper examines the biggest challenges and limitations facing GEOINT in the next decade. [1: Mapping Science Committee. New Research Directions for the National Geospatial-Intelligence Agency. National Academies Press, 2010.] [2: Flint, Colin. Introduction to Geopolitics. Routledge, 2016.] [3: Alderton, Matt. "The Defining Decade of GEOINT." Trajectory, 2014.]

Accomplishing Constant TPED

One of the challenges GEOINT will face in the next decade is accomplishing constant TPED: tasking, processing, exploitation, and dissemination. Today, persistent TPED of geospatial intelligence across geographic space and time is fundamental. Nonetheless, existing sensor networks carried on aircraft and satellites, together with existing database management systems, are insufficient to accomplish persistent TPED for several reasons. To begin with, existing sensor networks were designed to track fixed, permanent targets such as buildings and military equipment.[footnoteRef:4] They are sparse in both space and time, and it is time-consuming to retask the sensors to focus on the geographic area of interest during the relevant time interval. Moreover, even if a suitable network were deployed, existing databases cannot keep up with the fundamentally higher data rates and data volumes created by deployed sensor arrays. In short, the key challenges in accomplishing constant TPED are the effective use of sensor networks, spatiotemporal data mining and discovery, and spatiotemporal database management systems.[footnoteRef:5] [4: Perdikaris, John. Physical Security and Environmental Protection. CRC Press, 2014.] [5: National Academies of Sciences, Engineering, and Medicine. "Priorities for GEOINT Research at the National Geospatial-Intelligence Agency." 2006.]
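To make the data-management side of this challenge concrete, the sketch below shows, in schematic Python, the kind of query a persistent-TPED store must answer continuously: filtering a stream of sensor observations by both a geographic bounding box and a time window. All names here (`Observation`, `query_spatiotemporal`, the sensor IDs) are hypothetical illustrations, not any actual NGA system or schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record of a single sensor observation.
@dataclass
class Observation:
    sensor_id: str
    lat: float
    lon: float
    timestamp: datetime

def query_spatiotemporal(observations, bbox, start, end):
    """Return observations inside a lat/lon bounding box AND a time window.

    bbox = (min_lat, min_lon, max_lat, max_lon). A persistent-TPED store
    must answer such combined space-and-time queries continuously as new
    data arrives, which is what conventional databases struggle to do at
    sensor-network data rates.
    """
    min_lat, min_lon, max_lat, max_lon = bbox
    return [
        o for o in observations
        if min_lat <= o.lat <= max_lat
        and min_lon <= o.lon <= max_lon
        and start <= o.timestamp <= end
    ]

now = datetime(2014, 1, 1, 12, 0)
obs = [
    Observation("sat-1", 33.3, 44.4, now),                       # inside box, in window
    Observation("sat-1", 33.3, 44.4, now - timedelta(hours=6)),  # inside box, too old
    Observation("uav-2", 48.8, 2.3, now),                        # in window, outside box
]
hits = query_spatiotemporal(obs, (30.0, 40.0, 35.0, 45.0),
                            now - timedelta(hours=1), now)
```

Only the first observation satisfies both the spatial and the temporal predicate; the linear scan used here is exactly what spatiotemporal indexing research aims to replace at scale.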

In particular, the timeliness of geospatial intelligence is becoming increasingly significant owing to, among other things, the rising number of mobile targets. The field is therefore evolving from deliberate, planned targeting to time-sensitive targeting. It is becoming increasingly important to shift toward real-time data generation, processing, and distribution so as to reduce latency in intelligence production and delivery. Nonetheless, the traditional GEOINT production process relies heavily on manual interpretation of data collected from geospatial sensors and sources, which makes this an immediate problem given the rising volume of sensor data. Another significant challenge for the next decade will be to quickly and effectively determine which steps of the intelligence cycle are best suited to automated processing, which are suited to human cognition, and which require a mixture of human and machine support, so as to shorten the GEOINT timeline. This is imperative for current as well as future systems.[footnoteRef:4]
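The human-versus-machine triage described above can be sketched as a simple routing rule. The thresholds, task fields, and three-way split below are invented purely for illustration; no real NGA workflow or system is implied.

```python
# Illustrative triage of intelligence-cycle tasks between automated
# processing and human analysts. All thresholds and field names are
# hypothetical.

def route_task(task, auto_confidence_threshold=0.9):
    """Decide how a processing task should be handled.

    Returns "machine" when an automated detector is confident enough to
    act on a time-sensitive task immediately, "human" when the detector
    is effectively guessing, and "hybrid" when machine pre-screening
    should be reviewed by an analyst.
    """
    conf = task["detector_confidence"]
    if task["time_sensitive"] and conf >= auto_confidence_threshold:
        return "machine"   # act immediately to minimize latency
    if conf < 0.5:
        return "human"     # automated output too unreliable to use
    return "hybrid"        # machine pre-screen plus analyst review

tasks = [
    {"id": 1, "time_sensitive": True,  "detector_confidence": 0.97},
    {"id": 2, "time_sensitive": True,  "detector_confidence": 0.65},
    {"id": 3, "time_sensitive": False, "detector_confidence": 0.30},
]
routes = {t["id"]: route_task(t) for t in tasks}
```

The point of the sketch is the shape of the decision, not the numbers: shortening the GEOINT timeline means pushing as many tasks as safely possible into the first branch while keeping ambiguous cases in front of an analyst.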
Interoperability

Interoperability, which encompasses sharing with armed forces, external partners, and whole communities, will be a key challenge for geospatial intelligence in the next decade. This is largely because the NGA continues to pursue its goal of sharing geospatial intelligence not only with other U.S. organizations but also with coalition and foreign partners. A fitting example is the exchange of transportation maps among coalition forces. One source might emphasize terrain maps in which every road segment is traversable by land vehicle, presumably because it primarily serves army missions. Maps from another source may include water-based as well as land route segments for amphibious vehicles, presumably because they serve the Marines. If maps from the two sources are combined without accounting for these differences in semantics, land vehicles can lose their route in the course of combat or be directed into deep water. Another major challenge is the accurate tracking of moving targets when geopositions are recorded by two different sources that employ incompatible coordinate systems, data file formats, and map legends. Further problems in spatiotemporal interoperability include the role of real-time sensor inputs, coping with incomplete and sparse data, dissimilar ontologies, uncertainty management, moving targets, and profiles that fluctuate in time and space.[footnoteRef:3]
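The coalition-map example can be sketched as follows. The segment records and the `traversable_by` tag are hypothetical; the point is that a merge which preserves each source's semantic tags lets a land-vehicle router exclude water-only segments, whereas a naive union that drops the tags does not.

```python
# Two hypothetical map sources: one built for land forces, one for
# amphibious (Marine) operations. Merging them without semantic tags
# would let a land-vehicle router pick water-only segments.

army_segments = [
    {"id": "A1", "traversable_by": {"land"}},
    {"id": "A2", "traversable_by": {"land"}},
]
marine_segments = [
    {"id": "M1", "traversable_by": {"land", "water"}},  # amphibious route
    {"id": "M2", "traversable_by": {"water"}},          # water-only crossing
]

def merge_preserving_semantics(*sources):
    """Union of map sources that keeps each segment's traversability tag."""
    return [seg for source in sources for seg in source]

def segments_for(vehicle_mode, segments):
    """Only the segments a vehicle of the given mode can actually use."""
    return [s["id"] for s in segments if vehicle_mode in s["traversable_by"]]

merged = merge_preserving_semantics(army_segments, marine_segments)
land_routes = segments_for("land", merged)  # excludes the water-only M2
```

Had the merge discarded the `traversable_by` field, a land-vehicle planner would see segment M2 as an ordinary road, which is precisely the failure mode the paragraph above describes.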

Exploiting all Forms of Intelligence

The next decade will bring extensive change, above all because of advancing technology. Geospatial intelligence will accordingly face the challenge of exploiting all forms of intelligence. In the post-9/11 world, the NGA is required to draw on every kind of intelligence in order to counter denial and deception, track moving targets, and strike such targets with precision. Initially, in geospatial intelligence, this meant synthesis across imagery, maps, and sensor data. Today it is expanding to encompass fusion across all forms of intelligence.…

References

Alderton, Matt. "The Defining Decade of GEOINT." Trajectory, 2014.

Buxbaum, Peter. "Geospatial's Big Data Challenge." Intelligence Geospatial Forum, 2015.

Flint, Colin. Introduction to Geopolitics. Routledge, 2016.

Mapping Science Committee. New Research Directions for the National Geospatial-Intelligence Agency. National Academies Press, 2010.

National Academies of Sciences, Engineering, and Medicine. "Priorities for GEOINT Research at the National Geospatial-Intelligence Agency." 2006. https://www.nap.edu/read/11601/chapter/5