“The Creepy Line”

When asked in 2010 about the possibility of a Google “implant,” Google’s then-CEO Eric Schmidt famously said:

“Google policy is to get right up to the creepy line and not cross it.

With your permission you give us more information about you, about your friends, and we can improve the quality of our searches. We don’t need you to type at all. We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”

Since that reassuring depiction of what awaits us, Google has danced energetically around “the creepy line” many times, from subverting users’ privacy preferences in Safari and paying the largest FTC fine in history, to introducing the omniscient Google Glass, which gets as close to human tracking as possible without drilling into the brain.

When the internet behemoth raises the bar, others rush to conquer it, and some manage to surpass it. Buried in the minutiae of CES 2013, in a booth not much bigger than a 110-inch Samsung UHD TV, was Affectiva, showcasing its primary product, Affdex:

“Affdex tracks facial and head gestures in real-time using key points on viewers’ face to recognize a rich array of emotional and cognitive states, such as enjoyment, attention and confusion.”



Deciphering concealed emotions by “reading” facial microexpressions, popularized by Paul Ekman and the hit TV series Lie To Me, is nothing new, of course. What’s lifting us over the creepy line is the imminent ubiquity of this technology, all packaged into a web browser and a notebook with a webcam, no installation required.


Eyes Wide Shut

Today, Affectiva asks viewers’ permission before recording them as they watch TV commercials. What happens tomorrow? After all, DNA evidence was first used in courts in the late 1980s and has been controversial ever since: it has been used to exonerate incarcerated people, but also abused and misused to convict innocent ones. Like DNA analysis, facial expression reading technology will advance and may attain similar stature in law and in other fields… some day.

Currently, however, along with its twin, face recognition, microexpression reading isn’t yet firmly grounded in law. This uncertainty gives the technology the space it needs to evolve, but it also opens the door to significant privacy and security abuse.


The technology, when packaged into a smartphone, for example, can help some of those with Asperger’s syndrome read facial expressions. But it can also be used in a videotelephony app as a surreptitious “lie detector.” It could be a great tool for remote diagnosis and counseling in the hands of trained professionals. But it could also be used to record, analyze, and track people’s emotional state in public venues: in front of advertising panels, in courtrooms, or even in job interviews. It can help overloaded elementary school teachers better decipher the emotional state of at-risk children. But it can also lead focus-group-obsessed movie studios to further mechanize character and plot development.

The GPU in our computers is the ideal matrix-vector processing engine to decode facial expressions in real time, in the very near future. It is entirely conceivable, for instance, that a presidential candidate could peer into his teleprompter and see a rolling score of a million viewers’ reactions, passively recorded and decoded on the fly, allowing him to modulate his speech in sync with that feedback. Would that be truly “representative” democracy or abdication of leadership?
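To make the matrix-vector idea concrete, here is a toy sketch, not Affdex’s actual pipeline: the emotion classes, landmark features, and weights below are all invented for illustration, assuming the simplest possible model, a linear map from facial-landmark features to emotion scores, followed by a softmax. A GPU would run millions of these products in parallel, which is exactly the scale the teleprompter scenario implies.

```python
import math

# Hypothetical emotion classes and a toy "learned" weight matrix:
# one row per emotion, one column per facial-landmark feature.
EMOTIONS = ["enjoyment", "attention", "confusion"]
WEIGHTS = [
    [ 2.0, -0.5,  0.1],   # enjoyment: driven mostly by lip-corner raise
    [ 0.3,  1.8, -0.2],   # attention: driven mostly by eye openness
    [-0.4,  0.2,  2.1],   # confusion: driven mostly by brow furrow
]

def matvec(matrix, vector):
    """Plain matrix-vector product: one raw score per emotion class."""
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    peak = max(scores)                       # subtract max for stability
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def decode_expression(features):
    """features = [lip_corner_raise, eye_openness, brow_furrow]."""
    return dict(zip(EMOTIONS, softmax(matvec(WEIGHTS, features))))

# One video frame showing a strong smile, moderately open eyes, no furrow:
result = decode_expression([1.2, 0.5, 0.0])  # "enjoyment" gets the top score
```

Real systems replace the hand-written weights with a model trained on labeled video, but the per-frame arithmetic remains this same matrix-vector-plus-nonlinearity pattern.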

And if these are possible, even likely, scenarios, why wouldn’t we have the technology embedded in a Google Glass-like device or an iPhone 7, available all the time and everywhere? If we can use these gadgets to decode other people’s emotional states, why can’t the gadgets decode our own and display them back to us? What happens when, for the first time in Homo sapiens history, we have constant (presumably unbiased) feedback on our own emotions? The distance from machines detecting an emotional state to machines suggesting (and even administering) emotion-altering medicine can’t be that far, can it? How do we learn to live with that?

The technology is out there. With Apple’s Siri as a blueprint, Google already knows how to advance Google Now from searches to transactions. One would think the recent hiring of Singularity promoter Ray Kurzweil as director of engineering points to much higher ambitions, ambitions we’re not remotely prepared to parse yet. Much closer to that creepy line.

13 thoughts on "“The Creepy Line”"

  1. I would gladly lend my writing hand so as to render poetic justice to a work-in-lyrical progress… Siri deserves cloud-nine penmanship corseted ‘slim dunk’ from take-five diplomacy… A Steinbeck-Brubeck minor-keyed literary genre that ought to intertwine disparate epiphanies.

    Language unlocked human evolution. Right up to tongue-in-cheek and circumlocution. And now Siri. Can we take the language corpus at his words, and decipher…at last…the genteel equivalence between some-one and some-body, …in the same way as mass once expressed, in a coming-out equation, its equivalence to energy?

    And if any-one wishes fiddling upon the square, the speed of light ensconces any-body…

  2. Politicians’ only use to society is their ability to lie. In fact, alpha males have special abilities for deception, plus thinking fast on their feet.

    The whole moving of hands that politicians and TV people do is just to hide their body twitches.

    All this will be used to suppress anyone who is against the system, not for finding truth or justice.

    Sadly, deception is the first weapon, or law, of evolution, as stated by E. O. Wilson.

  3. I predict some future cop will use this technology to distinguish humans from Androids (of the Nexus brand of course).

  4. “Would that be truly “representative” democracy or abdication of leadership?”

    I’ll vote: neither of the above. But it could help candidates recognize how different audiences match their preconceptions, and tailor their message. That’s good politics.

    • There are claims that some of the microexpressions are deeply involuntary and cannot be gamed. And yet there must be third party business opportunities for those who already teach how to game lie detectors, I reckon.

    • I suppose there are situations in which you’d want to. But in politics, you probably mostly want the candidate to know what you like hearing.

  5. Great article.

    Do you envision a future (albeit bleak) where Apple will need to be as reliant on advertising as Google, when making great hardware becomes ‘good enough’, i.e. commoditized?
