This is just too scary
The purpose of data mining is not to check individuals' personal information against information about known terrorists, or those suspected of terrorism on "reasonable grounds," as they cross borders, send emails, or access public services. Its purpose is to predict who might be a terrorist -- a little like the film "Minority Report," in which officials stop criminal acts before they happen by reading people's minds. The technology in use today, however, falls far short of Hollywood fantasy.
First, the information on which data mining and risk scoring depend is often inaccurate, out of context, dated, or incomplete. And like the ATS program, data mining and risk scoring programs never contain a mechanism by which individuals can correct, contextualize, or object to the information being used against them, or even learn what it is. Operating on a "preemption" principle, these systems are uninterested in this kind of precision; they would be bogged down if held to the ordinary standards of access, accuracy, and accountability. Second, the criteria used to sort masses of data will always be over-inclusive and mechanical. Data mining is like assessing guilt by Google keyword searches. And since these systems use broad markers for predicting terrorism, ethnic and religious profiling are endemic to them.