Please don’t shout
We’ve all heard about the convention of not using CAPITAL LETTERS when writing emails: apparently, doing so is the same as shouting at the person you’re writing to. David F. Swink, co-founder and Chief Creative Officer of Strategic Interactions, Inc., based in Fairfax, Virginia, USA, says that “email tone is conveyed through word choice, syntax, punctuation, letter case, sentence length, opening, closing, and other graphic indicators like emoticons and emoji.” He warns, “Just because you write in a certain way, it doesn’t mean it’s received the same way.”
But what if the way you type a message could be interpreted and analysed by a computer to determine your feelings? That would be interesting, particularly as so much of our life today is spent at a keyboard.
Yet communicating emotion through type can be hard, according to an article published in New Scientist last month.
By measuring the way someone is typing – the speed, rhythm and how often they use backspace – and then combining that information with an emotional analysis of the typed text, a computer program has been able to predict how they are feeling with 80 per cent accuracy. Tested on different emotions, the program successfully detected joy 87 per cent of the time, while anger was identified 81 per cent of the time (Behaviour and Information Technology, doi.org/vbt).
See: “Emailing angry? Your keyboard feels your pain”, New Scientist, by Hal Hodson, 29 August 2014.
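The study’s actual model isn’t described here, but the basic idea of fusing keystroke dynamics with an emotional analysis of the typed text can be sketched roughly. Everything below — the word lists, thresholds and feature names — is invented for illustration and is not taken from the paper:

```python
# Illustrative sketch (NOT the study's model): combine keystroke
# dynamics with a naive lexicon-based sentiment score to guess an emotion.
# Word lists and thresholds are made up for demonstration purposes.

ANGRY_WORDS = {"hate", "furious", "annoying", "stupid"}
JOY_WORDS = {"great", "love", "happy", "wonderful"}

def emotion_guess(text, keys_per_sec, backspace_rate):
    """Return a crude emotion label from typing behaviour plus word choice.

    keys_per_sec   -- typing speed (fast typing here stands in for arousal)
    backspace_rate -- fraction of keystrokes that are corrections
    """
    words = {w.strip(".,!?").lower() for w in text.split()}
    lexicon_score = len(words & JOY_WORDS) - len(words & ANGRY_WORDS)
    aroused = keys_per_sec > 5.0 or backspace_rate > 0.15
    if lexicon_score < 0 and aroused:
        return "anger"
    if lexicon_score > 0:
        return "joy"
    return "neutral"
```

A real system would of course learn its features and weights from labelled typing sessions rather than use hand-picked word lists, but the two signal sources — how you type and what you type — are combined in the same spirit.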
Photoshop photos using your brain activity
While we’re on the subject of technology analysing how we’re feeling as we type, what about analysing what we think when we study a picture?
Cornell University’s Library hosts an interesting paper which explores the potential of brain-computer interfaces (BCIs) in segmenting objects from images. The approach described in the paper centres on designing an effective method for displaying image parts to users so that they generate measurable brain reactions, scored from EEG activity. After several such blocks are displayed, the resulting probability map is binarised and combined with the GrabCut algorithm to segment the image into object and background regions. The study shows that BCIs and simple EEG analysis are useful in locating object boundaries in images.
It may sound complicated; New Scientist explained it more simply: “Forget fiddling around with a mouse to chop unsightly bits out of family snaps. A team at Dublin City University has designed a brain-computer interface that can automatically pick out the bits of photos that you care about, just by measuring brain activity while you look at them. Each picture is shown whole, then split into 192 pieces and shown in a very fast slide show. The pieces that make the brain light up correspond to the desirable components of the photo.”
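As a rough sketch of that pipeline (a simplification, not the paper’s implementation): each displayed tile receives an EEG-derived score, the score map is binarised, and the resulting mask seeds a segmentation algorithm such as GrabCut. GrabCut itself is only mentioned in a comment below; in practice one would hand the mask to something like OpenCV’s `cv2.grabCut`. The example scores are invented:

```python
# Minimal sketch of the EEG-to-segmentation pipeline. Assumption: per-tile
# EEG responses have been normalised to [0, 1]; the real system would feed
# the binarised map into GrabCut (e.g. OpenCV's cv2.grabCut) to refine
# the object boundary, which is not done here.

def binarise(score_map, threshold=0.5):
    """Turn per-tile EEG scores into a foreground/background seed mask."""
    return [[1 if s >= threshold else 0 for s in row] for row in score_map]

# Invented 3x3 score map: the strong responses cluster over the object.
scores = [
    [0.1, 0.2, 0.1],
    [0.3, 0.9, 0.8],
    [0.2, 0.7, 0.1],
]
mask = binarise(scores)
# Tiles marked 1 are the pieces that made the brain "light up"; this seed
# mask is what would initialise GrabCut in the paper's approach.
```

The interesting design point is that the user never clicks anything: the threshold step converts involuntary brain responses into exactly the kind of foreground/background hint that interactive segmentation tools normally get from mouse strokes.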
Woof, Woof… Dogs have brains and they are rather clever too!
You probably already know that dogs outperform humans in a number of ways. For example, some dogs bred for their sniffing or scenting capabilities are able to spot whether their diabetic handler is having a hypoglycaemic episode (blood sugar level being too low). These dogs can do this because their noses are several thousand times more sensitive than the human nose. Thanks to this incredible sense of smell, dogs have also been highly successful at sniffing out cancer in humans.
Late last year, Sam Webb wrote in the Daily Mail about a headset (No More Woof) that can supposedly read your dog’s mind, a gadget that claims to “analyse brain waves to transform inner barks into human speech”. The device combines the latest technology from three different areas: electroencephalography (EEG) sensing, microcomputing and special brain-computer interface software.