The big thing in technology right now is the iPad. And for good reason. Apple is usually on the cutting edge of computer-tech popularity – if not advancement.
Personally, I’m not planning on buying an iPad – I don’t even own an iPod.
So I’m probably not the best candidate to write a technology blog entry, but as is often the case, I am far more interested in the philosophical implications of technological advances – and one technology experiment going on at ASU right now is particularly intriguing.
Saturday night I was emceeing at the ASU International Students’ Easter Celebration Banquet. At my table were Javier and Helen (from Mexico), two PhD candidates at ASU in computer science. What they are working on astounded me.
The conversation started out innocently enough when Javier and Helen shared that they were working on a project having to do with human-computer relations and intuitive computer software programming.
My first thoughts jumped to Facebook advertising and Google analytics.
I shared this and they said, “Yes…but no, that isn’t what we are doing.”
Javier posited, “Let me give you an example. There are computer game consoles in existence today where a character in the video game can move forward, backward, right or left, shoot a gun or thrust a sword with the game player just using their mind to manipulate the character to do so. But that’s easy stuff…”
I interjected, “You call that easy stuff?”
Javier responded, “Yes, because we are not talking about games; we are talking about a computer being able to tell that you are bored and responding by changing its interface with the computer user. Whether that means the screen changes color, the music gets louder or the computer prompts a change by asking you what you want to do next…or maybe it even knows what you want and changes automatically without you having to fuss with a mouse.”
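For the programmers reading along: here is a toy sketch, in Python, of the kind of adaptive response Javier describes. Everything in it – the boredom score, the thresholds, the responses – is invented for illustration; it is not their actual system.

```python
# Toy sketch of an interface that escalates its response as a
# (hypothetical) boredom estimate from some affect sensor rises.
# The score range and thresholds are invented for illustration.

def respond_to_boredom(score):
    """Given a boredom score in [0, 1], pick an interface response."""
    if score < 0.3:
        return "no change"
    if score < 0.6:
        return "shift screen color"
    if score < 0.8:
        return "raise music volume"
    return "prompt: what would you like to do next?"

print(respond_to_boredom(0.9))  # prompt: what would you like to do next?
```

The hard part, of course, is not this lookup – it is producing a trustworthy boredom score from brain signals in the first place.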
“I can’t do that Dave.” – H.A.L.
Helen then shared how much promise the technology holds – not only for commercial use, but for medical and scientific use as well. One’s first thoughts jump to paraplegics who will someday be able to use their prosthetic limbs just as they would a natural limb. Imagine the possibilities!
Of course, there are various “fun” applications of this technology as well: video games and computing, bionic sportsmen and women, an iPod that switches songs simply because you are thinking about it, or the once-remote-controlled-but-now-mind-controlled vehicles.
In fact, it was in one of these fun applications that Helen discovered an interesting challenge with the technology (one I picked up on as philosophically fascinating).
When she was manipulating the control of a “mind” controlled vehicle she had to take precious time to calibrate the machine to respond to her thought waves. For example, when she thinks “right” she thinks “right” in a certain way that is different than other people. Her mind’s expression of “right” is different than my expression of “right.” First of all, she speaks Spanish in her head. Second of all, she is a woman. Third of all, she is older than I am. Fourth of all, the context of when/where/how she learned the meaning of “right” is different than mine and thus there are different brain connections and synapses firing off when she thinks “right.” So varied are these differences that even after she manipulated the vehicle to be able to move forward, back, left and right just with her brain, the moment she passes the brain pad controls over to someone else, even someone as close to her as her husband, the vehicle won’t budge.
Throughout their research they’ve come up against the problem of “spoken language” versus “brain language.” They find it tough to get their software to perceive core thoughts (such as the meaning of “right”) rather than expressed thoughts (such as the word “right”). While they can easily train a machine to respond to brain word expressions, they can’t get it to respond to core thoughts. When that becomes possible they will have a real breakthrough, because then the technology’s applications will be universal regardless of culture, gender, sex or age.
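For the technically curious, here is a minimal sketch of why per-user calibration is needed at all. The idea – common to many brain-computer interfaces, though the numbers here are entirely invented – is that each user’s brain signal for a command like “right” sits at a different point in feature space, so a model trained on one person’s signals misreads another’s.

```python
# Why Helen's calibration doesn't transfer: a nearest-centroid
# classifier trained on one user's (invented) feature vectors
# misreads another user whose signals for the same words differ.

def train_centroids(samples):
    """samples: {command: [feature_vectors]} -> {command: centroid}."""
    centroids = {}
    for command, vectors in samples.items():
        n = len(vectors)
        centroids[command] = [sum(v[i] for v in vectors) / n
                              for i in range(len(vectors[0]))]
    return centroids

def classify(centroids, vector):
    """Return the command whose centroid is nearest to the vector."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, vector))
    return min(centroids, key=lambda cmd: dist(centroids[cmd]))

# Helen's (invented) feature vectors for "right" vs. "left"...
helen = train_centroids({
    "right": [[1.0, 0.1], [0.9, 0.2]],
    "left":  [[0.1, 1.0], [0.2, 0.9]],
})
# ...and her husband's, whose signals for the same words differ.
husband = train_centroids({
    "right": [[0.1, 0.9], [0.2, 1.0]],
    "left":  [[1.0, 0.2], [0.9, 0.1]],
})

probe = [0.95, 0.15]            # the husband thinking "left"
print(classify(helen, probe))   # Helen's model misreads it as "right"
print(classify(husband, probe)) # his own model reads it correctly: "left"
```

Hand the controls (the model) from Helen to her husband without recalibrating, and the vehicle does the wrong thing – or nothing at all.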
So far, they have been able to master this “core communication” only with eye-zoom technology. The Mexican researchers and their team were able to connect the human eye with a video camera and enable that camera to respond to the eye’s dilation and constriction as it adjusted to light conditions and tried to zoom in for details. That means if the eye focused on something, the camera responded by zooming in to focus on it as well; if the eye panned out to catch the periphery, the camera zoomed out in kind.
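A toy sketch of that coupling, for the curious: assume the system can read pupil diameter from a sensor, and map it linearly onto a camera zoom factor. The constants and the linear mapping are invented for illustration – a real system would calibrate both per user and per lighting condition.

```python
# Toy sketch: map a (hypothetical) pupil-diameter reading to a
# camera zoom factor. Constants are invented for illustration.

MIN_PUPIL_MM, MAX_PUPIL_MM = 2.0, 8.0   # rough human pupil range
MIN_ZOOM, MAX_ZOOM = 1.0, 10.0

def zoom_for_pupil(diameter_mm):
    """Linearly map pupil diameter to zoom: a constricted pupil
    (focusing on detail) drives the camera to zoom in; a dilated
    pupil (taking in the periphery) drives it to zoom out."""
    clamped = max(MIN_PUPIL_MM, min(MAX_PUPIL_MM, diameter_mm))
    t = (clamped - MIN_PUPIL_MM) / (MAX_PUPIL_MM - MIN_PUPIL_MM)
    return MAX_ZOOM - t * (MAX_ZOOM - MIN_ZOOM)

print(zoom_for_pupil(2.0))  # 10.0 — fully zoomed in on detail
print(zoom_for_pupil(8.0))  # 1.0 — zoomed out to the periphery
```

What makes this case tractable, unlike the “right”/“left” problem, is that pupil response is largely involuntary and works the same way across people – no language sits between the signal and its meaning.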
The humorous part of this advancement is that when they collected a sample of males and had them watch a screen feed of a crowd with a “woman in the red dress” walking by, all of the “eye-cameras” zoomed in on certain features of the woman’s anatomy. The men in the sample did not know about the “eye cameras,” and when asked what they noticed on the screen, very few of them mentioned the woman. Ha!
Back to the philosophical bits.
These researchers truly believe they will be able to break through language, culture, sex and gender differences to get at core communication between humans and computers.
Ludwig Wittgenstein’s early “picture theory” of language comes to bear on this conversation. In his Tractatus Logico-Philosophicus, Wittgenstein posits the following:
- The world consists of independent atomic facts—existing states of affairs—out of which larger facts are built.
- Language consists of atomic, and then larger-scale, propositions that correspond to these facts by sharing the same “logical form”.
- Thought, expressed in language, “pictures” these facts.
Javier and Helen are attempting to endow their software with the ability to read the “atomic propositions” and “facts” that the brain records as such and then chooses to express in language pictures.
They really think that breakthrough will happen sometime soon.
What I think is crazy is that the vast majority of humans can’t even do that yet. I mean, even in a marriage, a husband (or wife, for that matter) has the most difficult time trying to decipher a spouse’s verbal and non-verbal communication.
Men across the world opine, “She said this, but what does she really mean by ‘this’?”
The greatest gift this technology might provide to the human race is an advanced form of marriage counseling whereby a husband has a mini-computer in his ear that tells him what his wife is really saying when she talks to him. 🙂
But really, my mind is still trying to wrap itself around the implications of such technology. To be sure, there are thousands of positive, and negative, opportunities burgeoning from such advancement. However, there are some deeper challenges and questions that this type of programming poses to all of us.
Throughout our human history we have endeavored to understand each other, even going so far as to develop complex codes of communication in verbal, non-verbal and written form.
Despite all our effort we still miscommunicate every single day. And to be honest, there is a beauty in that.
Although I don’t care for the arguments I cause when I don’t understand Elizabeth, I do value the forgiveness she gives me and the grace we explore together as we try again and again to grow closer as companions for life.
Introduce a computer into that equation and I think something vital is sapped from our relationship as husband and wife.
Or, what if the computer ends up understanding my wife better than I do? Who will she want to spend time with more? Barring my stunningly good looks, the computer will win out every time.
I am usually not one to react negatively to technology. I love Facebook and think it is a great way for humans to connect all over the globe (I mean, there are people from America, Hungary, Mexico, South Africa, New Zealand and other places reading this blog right now). I even tweet and think it is a worthwhile conversation.
But this human-computer-intuitive-relationship stuff got my guard up.
What happens to the “ubuntu” spirit (a person is a person through other persons) when a computer can communicate to the core of a person’s thoughts faster and more efficiently than a human can?
Does the proverb change from “umuntu ngumuntu ngabantu” (a person is a person through other persons) to “umuntu ngumuntu ngakhomputha”?
I sincerely hope not.
What do you think?
When, not if, this technology comes around, do you think we will just turn off from each other and plug into a computer?
How do you feel?
Will our need for community be met in a computer interface rather than an interpersonal relationship?
Let me know.