Using Adaptive Range Control to Maximize 1-Hop Broadcast Coverage in Dense Wireless Networks

Xiaoyan Li
Rutgers University


Abstract

Many recent sensor network protocols, such as directed diffusion and ad-hoc positioning, rely on periodic broadcasts. In addition, many wireless protocols, such as code propagation and dynamic source routing, rely on aperiodic broadcasts. Indeed, the fundamental broadcast nature of wireless networks makes broadcast an ideal building block for the discovery, routing, and localization functions that will be critical for future sensor network systems. We thus consider the problem of how nodes in extremely dense wireless sensor networks can adjust their output power to maximize the number of 1-hop receivers of a broadcast message. In this talk, I will explain both our analytic model and the distributed algorithm we built upon it. First, we derive a geometric model that predicts the range that maximizes 1-hop broadcast coverage given the network density and node sending rate. Since the model equations can only be solved numerically, we next develop extrapolations that find the optimum range for any rate and density given a single precomputed optimum. Simulation results show that, despite the many simplifications in the model, our extrapolation predicts the optimal range to within 16%. Next, I will discuss the distributed range control algorithm we built on the extrapolation model. It allows each node to set the coverage-maximizing radio range using only the locally observed sending rate and node density. The algorithm is thus critically dependent on the empirical determination of these parameters. Our algorithm observes them using only message eavesdropping and therefore requires no extra protocol messages. Using simulation, we show that our algorithm converges fairly quickly and provides good coverage for both uniform and non-uniform networks across a wide range of conditions. We also demonstrate the utility of our algorithm for higher layer protocols by showing that it significantly improves the reception rate for a flooding application as well as the performance of a localization protocol.
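To make the node-local control loop concrete, the sketch below shows one possible shape of such a controller: it eavesdrops on overheard packets to estimate local density and sending rate, then extrapolates a new range from a single precomputed optimum. The class name `RangeController`, its callbacks, and the power-law exponents `alpha` and `beta` are illustrative assumptions only; the actual extrapolation in the talk is derived from the geometric model, not from this placeholder scaling.

```python
# Illustrative sketch only: the update rule and exponents below are placeholders,
# not the formulas derived from the geometric model described in the talk.

class RangeController:
    def __init__(self, ref_rate, ref_density, ref_optimal_range,
                 alpha=-0.5, beta=-0.5):
        # Single precomputed optimum (reference point) from the numeric solution.
        self.ref_rate = ref_rate
        self.ref_density = ref_density
        self.ref_range = ref_optimal_range
        # Hypothetical scaling exponents standing in for the real extrapolation.
        self.alpha = alpha
        self.beta = beta
        # Counters filled purely by eavesdropping; no extra protocol messages.
        self.heard_senders = set()
        self.heard_messages = 0
        self.window_seconds = 0.0

    def on_overheard(self, sender_id):
        # Every overheard packet updates the local density and rate estimates.
        self.heard_senders.add(sender_id)
        self.heard_messages += 1

    def on_timer(self, elapsed_seconds, current_range):
        # Locally observed parameters within the current radio range.
        self.window_seconds += elapsed_seconds
        area = 3.141592653589793 * current_range ** 2
        density = len(self.heard_senders) / area if area > 0 else self.ref_density
        rate = self.heard_messages / max(self.window_seconds, 1e-9)

        # Extrapolate the coverage-maximizing range from the single reference optimum.
        return (self.ref_range
                * (rate / self.ref_rate) ** self.alpha
                * (density / self.ref_density) ** self.beta)
```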