As a forensic linguist, Professor Tim Grant uses a scientific approach to language to help convict criminals. His specialism is short-form messages - such as texts, Twitter posts and emails - and he has helped to secure justice in a variety of high-profile cases, from incidents of anonymous harassment to murders and threats of terrorism. He is the Director of Aston University’s Centre for Forensic Linguistics, which was the first of its kind in the world.
Thanks to crime films and TV detective series, most people are familiar with the idea of forensic scientists using DNA profiling or fingerprint analysis to secure convictions. Yet many aren’t aware that our written communications also contain traces of our identity: dialect words, slang, even the way we use punctuation could all be as distinctive as a fingerprint.
The team of forensic linguists at Aston work in a variety of areas, from the analysis of police interviews to the communication difficulties faced by people who represent themselves in court. Professor Grant - one of the few forensic linguists who work with police through the National Crime Agency (NCA) - specialises in authorship analysis and the use of language as evidence. This applies to cases where, for example, someone has died and their phone continues to send text messages, or to incidents of anonymous trolling on Twitter.
Professor Grant, who is particularly interested in short-form messages, has developed methods of analysing linguistic evidence that attempt to move the field away from qualitative expert analysis (i.e., a very close reading and analysis of the text) towards a more rigorous, carefully structured process that lessens the chance of human bias.
“We know that forensic scientists are influenced by the story you tell before they analyse the data,” says Professor Grant. “Forensic scientists are human and we know that even for fingerprint examiners, how you tell the story behind a case can affect their conclusion. So can we devise methods which help address that problem of expert bias?”
Professor Grant used his distinctive approach in the case of Jamie Starbuck, who was suspected of murdering his wife Debbie whilst they were travelling around the world together. Both had emailed home regularly, but the suspicion was that, at some point, Jamie Starbuck had begun to masquerade as his dead wife, sending emails and texts to friends and family to allay suspicion. Work by Professor Grant and his colleague, Dr Jack Grieve, eventually led to the international arrest warrant for Starbuck, who was found guilty and jailed for a minimum of 30 years.
“The suspicion was that the wife wasn’t sending the emails - that the husband was sending them on her behalf,” Professor Grant explains. “Jack did the language analysis but I didn’t at first show him these emails. Instead I showed him the undisputed emails from before they went away - emails sent separately by the husband and sent by the wife - and I said: ‘Jack, can you just do a contrastive analysis and tell me what is consistent within each of their styles and what distinguishes them?’ And then, only once he’d done that did I say ‘Here are the disputed emails, which ones match more?’ That way you can’t get the bias in - you’re not looking at the disputed ones and then looking back and saying ‘This email looks more like this person’s’.”
In the Starbuck case there was quite a lot of comparison data to work with, but Professor Grant admits that things get trickier with short messages such as tweets, which are a maximum of 140 characters. As he explains, computer scientists have traditionally played a part in forensic linguistics because they are able to write software that will process chunks of text into categories. At the moment a tweet is too short for that kind of straightforward statistical analysis, but cases such as the trolling of feminist campaigner Caroline Criado-Perez in 2013 show there is a need for robust methods of identification. This raises fundamental questions about how distinctive language really is. Will it stand up to a comparison with everybody else in the world?
“They are tough problems to solve,” says Professor Grant. “I did a project with [Aston academic] Dr Nicci MacLeod where we looked at tweets and we looked at the more open set problems - so if you’ve got a set of ten or 20 or 200 or 2,000 potential suspects, can you rank-order them as being closer to the target text or not? But it’s very much at the development end and it’s hard to do.”
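The rank-ordering task Professor Grant describes can be sketched in code. The snippet below is a minimal illustration, not his or Dr MacLeod's actual method: it profiles each candidate's known writing with character trigram counts and sorts candidates by cosine similarity to the disputed "target" text. All names and sample messages are invented for the example, and real forensic work would use far richer features and much more data.

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Count overlapping character n-grams - a crude proxy for writing style."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram count profiles (0.0 to 1.0)."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def rank_candidates(target, candidates, n=3):
    """Return (name, score) pairs sorted from most to least similar to the target."""
    profile = char_ngrams(target, n)
    scores = {name: cosine(profile, char_ngrams(text, n))
              for name, text in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Invented comparison data: known writing samples for each candidate author.
candidates = {
    "suspect_a": "gonna b late soz, c u at 8 xx",
    "suspect_b": "I will be late this evening; see you at eight.",
}
target = "soz gonna b late again, c u there xx"

ranking = rank_candidates(target, candidates)
print(ranking[0][0])  # the candidate whose style profile sits closest to the target
```

Even this toy version shows why short texts are hard: a single tweet yields only a handful of n-grams, so scores become noisy, and as the pool of potential suspects grows the chance of a coincidental stylistic match grows with it.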
As technology evolves, it presents more challenges in the area of crime detection. Tor, for example, is a system that routes traffic through a series of relay servers to conceal the geographical origin of a computer. There are now ephemeral messaging apps, such as Snapchat, whose messages self-destruct, apparently without trace. But as forensic linguists get better at solving these problems, other questions arise about the ethical use of their knowledge.
“We’re doing this work for hopefully good reasons, but then, there’s a flip-side as well. I’m very interested in the moral side, which is much discussed in my literature. If we’re analysing online chat to help hunt down a paedophile, most people would say that’s a good thing, but if it’s an environmental protester - or someone else who’s not flavour of the month with the government - that is more complicated.”