After the CDS Academy Awards last week, we caught up with some of our student winners to find out more about their work.
Scooping up the prize for the project with the greatest social impact was Yiqiu Shen, Zemin Yu, and Xinsheng Zhang’s fascinating research on how data science can be used to combat terrorism.
A major challenge facing us today is how to quickly identify the groups responsible for terrorist attacks so that the relevant individuals can be apprehended. With roughly 3,290 unique terrorist groups around the globe, each with its own set of characteristics and motivations, law enforcement agencies require additional tools to help keep our country safe.
To that end, Shen, Yu, and Zhang have been building a new model that predicts which terrorist group committed a particular act based on specific features like the location of the attack, the nationality of the victims, and the day that the attack occurred.
The model’s predictions are based on the wealth of textual data about previous terrorist attacks collected in the Global Terrorism Database (GTD). The database contains meticulous details about each terrorist attack that has occurred since 1970, such as its location and attack type.
After extracting the relevant details of each attack from the GTD, Shen, Yu, and Zhang converted the textual data into numerical values using Python. After ‘learning’ from this information, their model can then predict the likely perpetrator of future events.
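The team’s exact pipeline isn’t described in detail, but the encoding step they mention might look something like the following sketch, which maps each categorical GTD-style field (the field names and values here are invented for illustration) to an integer code:

```python
def fit_encoders(records, fields):
    """Assign an integer code to each distinct value of each field."""
    encoders = {f: {} for f in fields}
    for rec in records:
        for f in fields:
            codes = encoders[f]
            if rec[f] not in codes:
                codes[rec[f]] = len(codes)  # next unused integer
    return encoders

def encode(record, encoders):
    """Map one attack record to a numeric feature vector."""
    return [encoders[f][record[f]] for f in encoders]

# Toy records with invented values, for illustration only
attacks = [
    {"region": "South Asia", "attack_type": "Bombing", "weekday": "Mon"},
    {"region": "Western Europe", "attack_type": "Armed Assault", "weekday": "Fri"},
]
fields = ["region", "attack_type", "weekday"]
enc = fit_encoders(attacks, fields)
vectors = [encode(a, enc) for a in attacks]
```

Once every attack is represented as a numeric vector like this, standard classification algorithms can be trained on the historical data to predict the responsible group.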
The team speculated that the model could take the form of a web application in the future, where a human expert could quickly input data about an attack immediately after its occurrence through a front-end web user interface. The data would be sent to the back-end where the model is located, and the model would then return its prediction of which terrorist group is responsible for the event based on the data received.
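The back-end step the team envisions could be sketched as a small handler that decodes the front end’s JSON payload, runs the model, and returns the prediction. The handler and the `predict_group` placeholder below are hypothetical, standing in for whatever trained model and web framework an actual deployment would use:

```python
import json

def predict_group(features):
    # Placeholder: a real deployment would invoke the trained model here.
    return {"group": "Unknown", "confidence": 0.0}

def handle_request(body: str) -> str:
    """Decode the incoming JSON payload, run the model, encode the reply."""
    features = json.loads(body)
    prediction = predict_group(features)
    return json.dumps(prediction)

# Example request, as the front end might send it
reply = handle_request('{"region": "South Asia", "attack_type": "Bombing"}')
```

Keeping the model behind a single handler like this would let the team swap in improved models without changing the front end.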
Yet, as with any new invention, there are also some drawbacks to consider. A key problem, explained the team, is that terrorists could take advantage of the program and change their behavioral patterns to evade law enforcement agencies. Moreover, the model’s inaccuracies could lead to the persecution of innocent individuals, or exacerbate damaging stereotypes based on nationality or race. These ethical considerations remind us that our technological inventions are always double-edged: a valuable tool today could turn against us tomorrow, depending on who is in power.