In the past decade the mobile handset has gained more and more sensors. Today's device can "see", "hear", "speak", "know where it is", "tell how fast it is going", "respond to your sense of touch", and more capabilities are added every day. Developers around the world are building applications around these sensors. But in the broader perspective of human-machine interfaces, these apps are still infants. A child's development offers an interesting analogy for mobile development and where it's headed.
If you look at a newborn child you find the first few months are a struggle to cope with the senses. Light is stimulating all kinds of nerve endings in the eyes and the child is figuring out whether what they are looking at is the mom or the fridge.
As the child develops, the mind begins to draw conclusions from these senses. It ignores certain things and notices others. We call this perception: the process of becoming aware of something through our senses. What we perceive is then in turn used to drive our actions based on our goals and needs. Perception evolves over the first few years of a child's learning and is shaped by social constraints, physical responses of pain and pleasure, and the genetic component of needs and motivation.
Take your own mobile device as an example. Like most people you probably leave it near your bedside overnight. There is activity on the device most of the night, from received emails to spam to status updates on social media and what not. None of us want those interruptions, so we end up setting the phone to vibrate. The mobile acts like a child who just doesn't know when to stop talking (those of you with kids over 3 know what I am talking about). What then do mobile applications need to do? They can sense the world around them, but the mobile does not contain a perception engine that allows it to make a higher-order judgment. The mobile can SEE that the lights in the room are off, it can HEAR that the room is quiet, it can tell that the user doesn't usually turn the phone on during a certain part of the night, yet from all this host of information it is currently unable to conclude that it's probably best to be quiet at night.
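To make the gap concrete, here is a toy sketch of what even the crudest "perception" step might look like: combining raw sensor readings into a single higher-order judgment. Everything here is hypothetical — the sensor names, thresholds, and the `SensorSnapshot` structure are illustrative assumptions, not any platform's actual API.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    """Hypothetical bundle of raw 'senses' the device already has."""
    ambient_light_lux: float      # light sensor: is the room dark?
    ambient_noise_db: float       # microphone level: is the room quiet?
    minutes_since_last_use: int   # usage history: is the user likely asleep?

def should_stay_quiet(s: SensorSnapshot) -> bool:
    """A naive perception step: fuse three cues into one judgment.

    Thresholds are made-up illustrative values.
    """
    dark = s.ambient_light_lux < 5.0
    quiet = s.ambient_noise_db < 30.0
    user_inactive = s.minutes_since_last_use > 90
    # Suppress notifications only when all three cues agree.
    return dark and quiet and user_inactive

# 3 a.m.-style snapshot: dark, silent, phone untouched for four hours.
print(should_stay_quiet(SensorSnapshot(1.0, 25.0, 240)))   # → True
# Midday snapshot: bright room, background noise, phone just used.
print(should_stay_quiet(SensorSnapshot(300.0, 55.0, 2)))   # → False
```

The point is not the three `if`-style checks, which any app could write today, but that no shared framework exists to learn these rules per user instead of hard-coding them.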
In the coming years, as our mobile development industry grows up, frameworks for higher-order decision making will let mobiles perceive the world around them far better and interact with people much as one person is aware of another. Once devices can perceive, applications can begin to develop a social awareness, opening up a whole new dimension for these devices.
What apps are you building nowadays, and what level of interaction do they have with their users?