
What Technology Should the Government Use to Speed Up the Dissemination of Information and to Quickly Pursue Connective Threads on Potential Terrorist Threats?

Following the recent failed attempt to bring down a Northwest Airlines flight, President Obama has declared the streamlined sharing of terrorist threat information a top priority. What technology do you think the government should use to help speed up the dissemination of information and to quickly pursue connective threads on potential terrorist threats?

10 Replies

  • Semantic processing has moved from the research and experimental stages to being very viable for practical applications. Syntactic processing of text looks only at structure, grammar, word and phrase frequency, and so on. Semantic processing achieves limited understanding of text (full understanding is about a century away - ask anyone who has wrestled with natural language understanding and they will tell you how difficult it is!), but it can understand enough to push an item to the front of a list of alerts.

    The CIA, FBI and other intelligence agencies have gigabytes of text reports that need to be digested, highlighted and brought to the attention of people just in time for them to take action. If ten text reports about Umar Abdul Mutallab had been processed by semantic technology, it would have correlated the different reports and pushed the combination ahead of the pack so that technology and people could have "connected the dots"! It's all there today. We just need to make sure that all agencies share their reports and a centralized system correlates them all.
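    As a minimal sketch of what such correlation might look like - assuming reports have already been run through an entity extractor, and using hypothetical report IDs and entity names, not any agency's real schema:

```python
from collections import defaultdict

def correlate_reports(reports):
    """Group free-text reports by the entities they mention and rank
    the groups so the densest clusters surface first for analysts."""
    clusters = defaultdict(list)
    for report in reports:
        for entity in report["entities"]:  # output of an upstream entity extractor
            clusters[entity].append(report["id"])
    # The more independent reports mention the same entity, the higher it ranks.
    return sorted(clusters.items(), key=lambda kv: len(kv[1]), reverse=True)

reports = [
    {"id": "r1", "entities": {"Person A", "Country X"}},
    {"id": "r2", "entities": {"Person A"}},
    {"id": "r3", "entities": {"Country Y"}},
]
alerts = correlate_reports(reports)
# "Person A", mentioned in two reports, ends up at the front of the alert list.
```

    Real semantic processing would of course weigh relationships between entities, not just co-mention counts, but the ranking idea is the same.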

  • I am not an expert in these technologies and use cases but simply stated, it sounds like a use case where Complex Event Processing (CEP) could add significant value in correlating data bits that lead to potential threats.

  • Another technology that could be very useful in solving this problem is "Case-Based Reasoning". Interesting events, like Umar's father contacting the US authorities in Nigeria, could trigger the creation of a "case frame". Then additional evidence that mentions the same person or country gets added to the case frame. If the case frame grows rapidly, you have something brewing. If it does not grow rapidly, it slowly recedes into the background.

    The problem here is one of information coming in many forms: database entries (visa grants or air travel bookings, for example), text reports (CIA or FBI station reports), and bits and pieces of information in many other shapes. You have thousands of gigabytes of this information. What is interesting and needs attention? This is the problem the authorities face: there is so much information, and only so many hours in a day for so many people. Case-based reasoning and semantic processing provide a way to identify and chunk information that belongs together and push the package ahead of the pack for someone to pay attention to.
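    A toy illustration of the case-frame idea described above - the trigger event, entity keys, and priority rule are all made up for the sketch:

```python
class CaseFrame:
    """A case frame opened by a triggering event; later evidence that
    mentions the same person or country is attached to it, and rapid
    growth raises its priority for analyst attention."""

    def __init__(self, trigger, keys):
        self.trigger = trigger
        self.keys = set(keys)        # e.g. {"Umar", "Nigeria"}
        self.evidence = [trigger]

    def matches(self, item_keys):
        # Does the new item mention any person/country already in the frame?
        return bool(self.keys & set(item_keys))

    def add(self, item, item_keys):
        self.evidence.append(item)
        self.keys |= set(item_keys)  # the frame learns new entities as it grows

    def priority(self):
        # Crude growth signal: more attached evidence means higher priority.
        return len(self.evidence)

frame = CaseFrame("father's tip to US embassy", {"Umar", "Nigeria"})
for item, keys in [("visa record", {"Umar"}), ("one-way booking", {"Umar"})]:
    if frame.matches(keys):
        frame.add(item, keys)
# frame.priority() now reflects three correlated pieces of evidence.
```

    A frame that stops accumulating evidence would simply keep a low priority and fade from the analyst's queue.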

  • Putting aside the information collection and analysis challenges (which must be quite advanced, proprietary and beyond the public domain), I think another key challenge is lightweight (situational) exposure and integration. I would assume that new information sources are added to the resource pool every day, often with their own specific structure and format, and I expect a similar dynamic on the user side. If we could give the contributor of a source an easy-to-use technology for rapidly exposing it in some normalized (RESTful?) form, it would be easier to include it in the sophisticated analyzers and make it available and searchable to the community right away. Such technology is in fact becoming increasingly ubiquitous - mashup technology. I'm particularly familiar with the offering of convertigo.com, and I expect that similar products are already used to some extent in the intelligence community. I think, though, that an explicit assessment of the extensive use of mashup technology could provide value and address at least part of the challenge.

    What we are looking at here is a large scale, fine grained and dynamic SOA variety, with particularly powerful governance and registry. A nice case, isn’t it?
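    The normalization step that makes such a mashup work can be sketched very simply - the field names and the "visa_db" source below are invented for the example, not a real schema:

```python
def normalize(source_name, record, field_map):
    """Map a source-specific record into a shared schema so that
    analyzers and search see one format regardless of origin."""
    out = {"source": source_name}
    for common_field, source_field in field_map.items():
        out[common_field] = record.get(source_field)  # missing fields become None
    return out

# Each contributed source only has to supply its own field mapping.
visa_map = {"name": "applicant_name", "country": "issuing_country"}
normalized = normalize(
    "visa_db",
    {"applicant_name": "A. Person", "issuing_country": "X"},
    visa_map,
)
```

    Exposing that normalized record over a RESTful endpoint is then a thin layer on top, which is essentially what mashup platforms automate.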

  • With the risk of data overload, intelligence analysts are going to need ways to rapidly discover patterns, ask questions, pursue hunches and generally dig through data.

    Because it helps people see and understand data 10-100 times faster than typical table or static chart displays, shareable, interactive data visualization software would make a great addition to their toolkit.

    Polaris, now called VizQL, is a data visualization technology that got its start at Stanford as part of a DOD project. See the original Stanford article at http://window.stanford.edu/projects/polaris/. Note: I work for Tableau Software, which uses VizQL as the foundation of its products.

  • I agree with Kelly. Complex Event Processing (CEP) is a perfect technology for discovering patterns of events that might be evidence of a terrorist plot in progress. It is especially valuable if the CEP engine is closely integrated with the engine running the business process applications, so that it has easy visibility into the business data underlying the suspicious pattern (e.g. one-way ticket, no luggage, etc.).
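    In spirit, the CEP pattern match looks something like this - the event types and the "cash_payment" indicator are illustrative assumptions, and a real CEP engine would also apply time windows and sliding expiry:

```python
from collections import defaultdict

# Hypothetical pattern: alert when all three indicators co-occur for one passenger.
SUSPICIOUS_PATTERN = {"one_way_ticket", "no_checked_luggage", "cash_payment"}

def detect(event_stream):
    """Correlate booking events per passenger and emit an alert the
    moment the full pattern has been observed for that passenger."""
    seen = defaultdict(set)
    alerts = []
    for event in event_stream:
        seen[event["passenger"]].add(event["type"])
        if SUSPICIOUS_PATTERN <= seen[event["passenger"]]:
            alerts.append(event["passenger"])
    return alerts

stream = [
    {"passenger": "P1", "type": "one_way_ticket"},
    {"passenger": "P2", "type": "one_way_ticket"},
    {"passenger": "P1", "type": "no_checked_luggage"},
    {"passenger": "P1", "type": "cash_payment"},
]
alerts = detect(stream)  # only P1 matches the full pattern
```

    The integration point the reply mentions is exactly where `event["type"]` would come from: the business process engine emitting events as bookings and payments happen.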

  • Here's an interesting analysis on what the problem is and what may be needed:

    The National Counterterrorism Center was established in 2004 for the specific purpose of dot-connecting -- forcing the CIA, the NSA, the FBI, the State Department, military intelligence and other agencies to share what they know. But as those agencies gather more and more data, processing it inevitably becomes harder. The problem may not be that the system is improperly engineered but simply that it's grossly overloaded.

    Do we need more analysts? Faster computers? Better software? Maybe all of the above. But I doubt we need to reshuffle the bureaucracy yet again -- and I doubt we need more information.

    The very first task should be cutting that list of 550,000 "entities" down to a manageable size. The architect Ludwig Mies van der Rohe was right: Less sometimes really is more.

    A counterterrorism strategy that needs fewer dots
    Washington Post - Eugene Robinson

  • A few years ago I participated in a consulting assignment on behalf of Gartner Consulting. A penetration test was one of my tasks, and the preparation included reading security research notes. One of those articles explained why software-based security is weaker than security that combines physical hardware devices with software. This observation is valid for the question asked: no software can replace physical checks or biometric checks. Physical and biometric checks are not a 100% guarantee either; they should be supplemented by software systems, which may raise the probability of hits (a term from signal detection theory). As far as software is concerned, Business Intelligence (BI) is the most important technology, especially operational business intelligence. BI SOA services shared globally could be valuable, and usage of Event-Driven Architecture (EDA) is essential in addition.

  • Interpretation of information is highly relative in nature; what matters is the reference point - against what are you evaluating or processing the information? You may have information available, but deciding whether a particular sequence of events should raise an alert depends on previously observed patterns. That is actually not the difficult part; a variation, or a new approach to an attack, could go unnoticed in exactly that case.
    I believe the problem is not with the technology that could be used to avert or detect such situations. We should accept the basic fact that perhaps 50-60 percent of the time an impending disaster can be detected using pattern matching, and the rest has to be inferred through solid human intelligence networking involving agencies across the world. Even if 80% of impending disasters could be averted through pattern matching, the element of surprise in the remaining 20% looms large and could be far more fatal.
    I think we could use the concepts of ANN (Artificial Neural Networks) for pattern matching and for processing the information at hand, and classical approaches of human intelligence gathering to take care of the exceptions, which in my opinion are more dangerous and need more attention.
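    As the smallest possible illustration of the neural-network idea - a single perceptron trained on invented toy indicators, nowhere near what a real system would use:

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Train a single-layer perceptron to separate 'alert' feature
    vectors from benign ones; a real system would need deeper networks
    and far richer features."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy features: [one_way_ticket, watchlist_tip, cash_payment]
X = [[1, 1, 1], [1, 0, 0], [0, 1, 0], [0, 0, 0]]
y = [1, 0, 0, 0]  # alert only when the indicators co-occur
w, b = train_perceptron(X, y)
```

    This learns to fire only when indicators co-occur, which is precisely the case a trained network can catch; the genuinely novel attack pattern - the remaining 20% above - is where human intelligence has to take over.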

  • I would second the motion that Complex Events Processing (CEP) technology would go a long way to help preventing this kind of event. I actually demonstrated this for the US Navy a few years ago and they showed great interest in the potential intelligence applications. This would need to be combined with some type of mash-up or data virtualization to allow for seamless access to information from multiple underlying data sources.
