Efficient time synchronization of sensor networks by means of time series analysis

For decades, researchers have been working on improving sensor networks. A key design goal is to keep the cost of individual sensors (such as cameras and thermometers) as low as possible to enable large networks with thousands of linked sensors. This entails a disadvantage: Low-priced sensors have limited energy and computing capacities. Therefore, methods designed to make the most of limited resources are of crucial importance.

This is where time synchronization plays a fundamental role. Tight synchronization can lower the energy consumption of the nodes by reducing their radio activity time. This extends their lifetime significantly. Researchers at the Institute for Networked and Embedded Systems at the Alpen-Adria-Universität Klagenfurt have developed a new synchronization technique to address this issue. Particular emphasis was placed on ensuring that the method is not too greedy in its consumption of resources, which would cancel out the advantages of the synchronization.

“Imagine that a group of friends has arranged a meeting. Usually you agree on a time and place. It is often the case that not all of them arrive on time, so the coordinator of the meeting calls the latecomers. This involves effort,” explains Jorge Schmidt, postdoctoral researcher in Professor Bettstetter’s team. Transferred to the sensor networks that he and his colleagues are investigating, this effort means a loss of energy and computing power for the individual sensors.

Working with doctoral student Wasif Masood, Schmidt and Bettstetter have now developed a technique that reduces the additional effort of synchronizing the oscillators of the individual sensors. Schmidt explains this in more detail with the help of an example: “With a group of friends, we already know who is usually late. Therefore, the coordinator of such a meeting could tell the individual friends different times in order to compensate for the expected delays.”

This is exactly what the newly developed technique does: using time series analysis, it learns the behavior of the sensor clocks and can anticipate and correct future drift before any asynchrony can develop. “While the idea of learning behaviors to predict future corrections is not new, we have shown that the behavior models extracted from our time series analysis work very well with commonly employed wireless sensor devices,” Jorge Schmidt adds.
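To make the general idea more concrete, here is a minimal Python sketch, not the researchers’ actual method: a node fits a simple drift model (clock offset as a linear function of time) to its past offset measurements and uses the prediction to pre-compensate its clock before the next wake-up. All function names, parameter values, and the synthetic measurements are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the published algorithm):
# learn a node's clock-drift behavior from past offset measurements and
# predict the offset at a future time so it can be corrected in advance.

import numpy as np

def fit_drift_model(sync_times, measured_offsets):
    """Fit a linear drift model offset(t) ~= skew * t + bias by least squares."""
    skew, bias = np.polyfit(sync_times, measured_offsets, deg=1)
    return skew, bias

def predict_offset(skew, bias, t):
    """Predict the clock offset at a future time t."""
    return skew * t + bias

# --- Illustrative usage with synthetic measurements (assumed values) ---
rng = np.random.default_rng(0)
sync_times = np.arange(0.0, 600.0, 60.0)      # past synchronization instants [s]
true_skew, true_bias = 40e-6, 2e-3            # assumed 40 ppm skew, 2 ms bias
measured_offsets = (true_skew * sync_times + true_bias
                    + rng.normal(0.0, 0.2e-3, sync_times.size))  # noisy offsets [s]

skew, bias = fit_drift_model(sync_times, measured_offsets)

t_next = 900.0                                 # next scheduled wake-up [s]
correction = predict_offset(skew, bias, t_next)
print(f"estimated skew: {skew * 1e6:.1f} ppm; "
      f"predicted offset at t={t_next:.0f} s: {correction * 1e3:.2f} ms "
      f"(pre-compensate the clock by this amount)")
```

Because the node can apply the predicted correction locally, it needs fewer radio exchanges to stay synchronized, which is where the energy savings described above would come from.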

The synchronization technique was tested both in the lab and outdoors under varying temperature conditions using commercially available sensor devices.

http://www.aau.at

Media Contact

Dr. Romy Müller, idw - Informationsdienst Wissenschaft
