|Computer science professor Dr. Hamdy Soliman meets with his undergraduate research team. From left are Jeff Napier, Dr. Soliman, Max Brister, Mark Mendoza and Omar Soliman.
|Small packages! Dr. Soliman's team is reprogramming these little "motes" to serve as smart sensors in the field. A primary goal of current research is to reprogram these sensors so they can make independent decisions. In the background, Dr. Soliman meets with his team.
Dr. Hamdy Soliman, professor of Computer Science & Engineering, first landed a National Science Foundation grant in 2007 to begin his multidisciplinary research.
Since then, he and his fellow professors have employed more than 30 students to help develop the groundbreaking sensor technology.
“We conceived of a new science of smart sensor networks that can extend very widely – monitoring and probing the unknown, like forests, ocean floors, battlefields or volcanoes,” Soliman said. “How can we be there? We spread the sensors as arms, skin and eyes in the environment we are probing. And we must have a large number of sensors, proportional to the area to be covered.”
To thoroughly monitor a large target area, such a sensor network would require a massive number of sensors. Further, those sensors would need to be low cost and use minimal amounts of both computational power and battery power. The nature of the terrains where sensors are deployed requires an ad-hoc, possibly mobile, wireless sensor network (MASNET) with several key challenging characteristics – mobility, wireless communication, a low power and computation budget, security and, most importantly, low cost. Available sensors cost $200 to $300, and even those must be reprogrammed for different system environments.
“It’s very challenging to keep costs low while achieving an ambitious goal,” Soliman said.
Soliman explained the concept of the sensor network as collecting data that are used to build intelligence and make decisions. First, raw data, like sensed temperature, light or sound, are aggregated to create records of information. Such records are stamped with spatial and temporal fields to advance information to knowledge, Soliman said. Then, with the aid of trained neural models that process the obtained knowledge, the network is programmed to make a decision – “fire” or “no fire,” “intrusion” or “no intrusion” – which is intelligence, he said.
“That’s the main focus of the network – data to information to knowledge to intelligence,” he said.
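The data-to-intelligence pipeline Soliman describes can be illustrated with a minimal sketch. This is not the team's actual code; the function names, the temperature values and the decision threshold are all hypothetical, chosen only to show how raw readings become stamped records and then a decision.

```python
# Illustrative sketch of data -> information -> knowledge -> intelligence.
# All names, readings and thresholds here are hypothetical.
import time

def sense():
    """Raw data: one temperature reading (hypothetical value)."""
    return 72.0

def to_information(readings):
    """Aggregate raw readings into a record of information."""
    return {"avg_temp": sum(readings) / len(readings)}

def to_knowledge(record, node_id, timestamp):
    """Stamp the record with spatial/temporal fields."""
    record.update({"node": node_id, "time": timestamp})
    return record

def to_intelligence(knowledge, fire_threshold=140.0):
    """Decide 'fire' or 'no fire' from the knowledge record."""
    return "fire" if knowledge["avg_temp"] > fire_threshold else "no fire"

readings = [sense() for _ in range(5)]
knowledge = to_knowledge(to_information(readings), node_id=3,
                         timestamp=time.time())
print(to_intelligence(knowledge))  # -> no fire (readings are below threshold)
```

In a deployed network, the aggregation and stamping would happen on the motes or at the sink, with only the resulting decision, or a compact record, sent onward.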
The remote sensors gather and send data points wirelessly to a remote laptop (sink/gateway), also located in the field. That laptop then sends information to a central computer that ultimately makes decisions. However, Soliman’s team is now programming the sensors to throw up the proverbial red flag. They want the sensors to yell (self-decide), “Fire!” or “Intruder!”
This spring, four undergraduate students have been developing a class in wireless sensor networks for Dr. Soliman. Omar Soliman, Dr. Soliman’s son, and Mark Mendoza wrote the lab manual. Max Brister and Jeff Napier have been teaching labs.
The four students each have more than a year of experience on the project, writing code, reprogramming the sensing nodes and preparing the new lab.
“We’ve been code monkeys, implementing new protocols,” Napier said.
Brister said the trick is to implement twin protocols – collect more data at the laptop level, yet still program the nodes to do more processing of the data.
Transmitting data is the largest drain on the nodes’ batteries, Soliman said. So, if they can program the nodes to only transmit when necessary, the sensor network can remain deployed for weeks or months at a time.
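One common way to transmit only when necessary, sketched below, is to send a reading only when it deviates enough from the last transmitted value. This is a generic illustration, not the team's actual protocol; the sample values and the threshold are made up.

```python
# Hypothetical sketch: suppress radio transmissions for readings that
# barely differ from the last value sent, saving battery power.

def should_transmit(reading, last_sent, threshold=2.0):
    """Transmit only if the reading moved more than `threshold`
    away from the last transmitted value (or nothing was sent yet)."""
    return last_sent is None or abs(reading - last_sent) > threshold

sent = []
last = None
for r in [70.0, 70.5, 71.0, 75.0, 75.5, 90.0]:  # hypothetical samples
    if should_transmit(r, last):
        sent.append(r)
        last = r

print(sent)  # -> [70.0, 75.0, 90.0]: only 3 of 6 samples cost radio energy
```

The trade-off is resolution for lifetime: a larger threshold means fewer transmissions and longer deployments, at the cost of coarser data at the sink.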
Omar Soliman, who is also the Student Regent at Tech, said they’ve been forcing the nodes to operate at a faster speed than the manufacturer’s specifications in order to accommodate that goal.
Teague Bick, a 2010 electrical engineering graduate who is now a graduate student at the University of California, Irvine, figured out how to override the factory settings and push the nodes to complete 5,000 operations per second. The current team is implementing Bick’s breakthrough. He also expanded the sensors' capability to detect smoke, attaching a regular off-the-shelf fire alarm to the Mica-Z mote (sensor unit) – an essential step in designing a smart fire-detection sensor network.
“I have high caliber undergraduate students,” Dr. Soliman said. “They have to make the nodes smarter. These students have a heck of a task.”
A team of graduate students is working on the visualization – or neural network – aspect of the sensing network.
“The neural network models mimic how the brain works,” Soliman said. “No matter how powerful a supercomputer is, our brain and our visual system win out. Let’s see how the brain works. So, we’ve designed a neural network model that mimics the brain algorithmically and trained it to remember the knowledge collected by the sensor network in the field about a fire or intrusion. In operating mode, it recognizes similar knowledge and announces a fire or intrusion.”
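The idea of training a model on collected knowledge and then letting it announce decisions can be shown with the simplest possible neural unit, a perceptron. This is only a toy stand-in for the team's model; the (temperature, smoke) feature values and labels below are invented for illustration.

```python
# Toy sketch: a single perceptron trained on labeled (temperature, smoke)
# records, then used to announce "fire" or "no fire". All data invented.

def train_perceptron(samples, epochs=50, lr=0.1):
    """samples: list of ((feat1, feat2), label) pairs with label 0 or 1."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred  # perceptron update rule
            w = [w[0] + lr * err * x[0], w[1] + lr * err * x[1]]
            b += lr * err
    return w, b

def decide(w, b, x):
    """Operating mode: announce a decision for a new knowledge record."""
    return "fire" if w[0] * x[0] + w[1] * x[1] + b > 0 else "no fire"

# (normalized temperature, smoke level); label 1 means fire was observed
training = [((0.2, 0.1), 0), ((0.3, 0.0), 0),
            ((0.9, 0.8), 1), ((0.8, 0.9), 1)]
w, b = train_perceptron(training)
print(decide(w, b, (0.85, 0.9)))   # -> fire (resembles the fire examples)
print(decide(w, b, (0.25, 0.05)))  # -> no fire (resembles the quiet examples)
```

A real deployment would use a richer model and far more training records, but the training-then-operating split is the same one Soliman describes.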
More graduate students are working on a MASNET that does “routing without routes.” They are developing a system for transmitting data over arbitrary pathways, theoretical research that has been published in many conferences and academic journals.
Soliman and his students have tested the network’s forest-fire detection by barbecuing sensors at his house, and tested intrusion detection with his research students playing the intruders at one of the Tech fields.
“We’ve established our first successful experiments and publications,” he said.
In addition to many theoretical publications, he presented his first experimental research on smart forest-fire detection at the IEEE Sensors Conference in Hawaii in 2010.
– NMT –
By Thomas Guengerich/New Mexico Tech