I am fascinated with advances in computing, and have always approached computers from the perspective of what the tool can do to make my life easier. As a professional tester I have far more work to do than I can reasonably accomplish in the limited timeframe allotted to most projects. So, well-designed test automation is a great tool that frees up some of my cycles by taking over the mundane tasks that still need to be accomplished.
A few months ago I had an email exchange with an industry consultant regarding test automation and emotions, which I blogged about and which he later talked about at a STAR conference. In that post I also tried to illustrate a simple technique called polling for simulating the irritation (or frustration) an automated test might express when a task takes too long to complete.
The consultant I had the email exchange with wrote, "I would want my automation to feel frustration, to recognize it, and to act on those feelings in some way that provides valuable information to the product. But until we've got not only artificial intelligence, but also artificial emotional intelligence, that ain't gonna happen."
When I design an automated test I often think of the various ways to achieve exactly what it is I am trying to prove or disprove with the test. Then I think of the things that can go wrong (such as race conditions, errant message boxes, tasks taking too long to complete, making simple decisions based on Boolean states, etc.) and design the test in such a way that it can logically deal with those situations. So, as I thought about our conversation and automated test design, I asked myself: can automation do more than simple mundane tasks? Can automation make decisions or perform tasks based on practical reasoning or simulated emotions?
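The polling technique I mentioned can be sketched roughly as follows. This is a minimal illustration, not any particular framework's API; the `wait_until` function and the `on_timeout` hook are hypothetical names I am using to show the idea of a test that "gets frustrated" and gives up with useful diagnostics instead of hanging forever:

```python
import time

def wait_until(condition, timeout=10.0, poll_interval=0.5, on_timeout=None):
    """Poll `condition` until it returns True or `timeout` seconds elapse.

    This simulates "irritation" with a slow task: rather than blocking
    indefinitely, the test repeatedly checks the condition, then acts on
    its "frustration" by running `on_timeout` (e.g. logging state or
    capturing a screenshot) before reporting failure.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True       # the task completed within our patience
        time.sleep(poll_interval)  # wait a bit before checking again
    if on_timeout is not None:
        on_timeout()          # gather diagnostics before giving up
    return False

# Example usage: succeed immediately, then "lose patience" with a
# condition that never becomes true.
print(wait_until(lambda: True))
print(wait_until(lambda: False, timeout=0.2, poll_interval=0.05,
                 on_timeout=lambda: print("timed out; logging state")))
```

The key design point is that the test itself decides how long a delay is acceptable and what to do about it, which is the simple "emotional" decision-making the post is describing.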
It seems that some researchers in the Netherlands are unlocking doors with artificial intelligence that may eventually lead to advances in smarter test automation design. Researchers at Utrecht University are hard at work on an emotional robot (a cat, no less) that simulates 22 emotions, including "anger, hope, gratification, fear, and joy," used in complex decision-making processes. Marvin Minsky stated, "...we all have these things called emotions, and people think of them as mysterious additions to rational thinking. My view is that an emotional state is a different way of thinking."
I agree with the researchers: I "don't believe that computers can have emotions," and I also mostly agree with their statement "that emotions have a certain function in human practical reasoning." (I say mostly because I do know that some emotions expressed by some people are completely irrational and result in impractical reasoning.) Perhaps AI in test automation is still a long way off, but I am always looking for ways to become more effective and more efficient, and I am always learning and looking for ideas to improve myself and my skills. So, based on this research I now ask myself: are there cost-effective emotional logic patterns that simulate rational reasoning, and can or should I employ them in the design of some of my automated tests to make them more robust?
Just a thought. Isn't technology great!