
Paving the Way to Ambient-Powered Wireless Connectivity

By Peter Siegumfeldt and Patricia Bower



Digital transformation, virtualization and automation are driving innovation and efficiency in residential, commercial and industrial facilities. This, in turn, requires connecting devices, instrumentation and materials, collectively referred to as “connected devices” or the internet of things (IoT). In many IoT applications where density and placement flexibility matter, such as environmental sensing, the combination of wireless connectivity and battery power is a hard requirement.


For battery-powered IoT devices, designers face the challenge of maximizing battery life to reduce costs and e-waste for end users. Where possible, the end goal is to eliminate batteries altogether and rely on ambient energy harvested from sources such as radio frequency (RF), solar, mechanical vibration and thermal gradients.


In the case of environmental sensors, micro- and nanoelectromechanical-system (MEMS and NEMS, respectively) fabrication has ushered in new levels of power efficiency and miniaturization. On the wireless connectivity side, however, battery life is often dominated by the radio subsystem. When architecting a sensor module or tag, designers must choose which wireless protocol the solution connects over, depending on the target application and environment, and select an appropriate radio system-on-chip (SoC). Existing solutions are not always optimized for long battery life, but advances in ultra-low-power RF design offer a roadmap to a single battery over the product’s life and, ultimately, to ambient-powered IoT devices.


Which wireless protocol for IoT connectivity?


Depending on the application and end-user requirements, a device or sensor may be connected via a licensed- or unlicensed-spectrum radio. Licensed-spectrum protocols, such as LTE-M or NB-IoT, offer wide-area network (WAN) coverage for cellular IoT devices. For a significant portion of residential and enterprise applications, unlicensed-spectrum protocols for home, building, facility or campus connectivity are common, chiefly Bluetooth and Wi-Fi. Bluetooth, a personal-area network (PAN) protocol, is extremely popular and widely adopted, while Wi-Fi is the most ubiquitous protocol for wireless local-area network (WLAN) connectivity.


Power-efficiency challenges for wireless SoC design


To reduce maintenance costs and battery waste for IoT devices, many applications are targeting the use of a single battery over the product’s life. This is particularly important where sensors are embedded in hard-to-reach places, such as sensors that monitor the integrity of building materials.



Power optimization has been a key focus for both Bluetooth and Wi-Fi chip design, but as the industry moves toward small, product-lifetime batteries and, eventually, ambient-powered IoT connectivity, SoC designers need new approaches to RF design. This is especially true when considering the total energy and discharge cycle of typical device batteries, such as the 3-V lithium CR2032 coin cell, a common choice for small IoT devices and sensor modules.


For an IoT sensor tag polled every two minutes, supporting greater-than-15-year battery life on a typical 220-mAh coin cell requires the wireless SoC to consume power in the sub-5-µW domain. This figure is an order of magnitude lower than what typical power-optimized Wi-Fi and Bluetooth LE chips achieve today.
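
That budget falls directly out of the cell’s stored energy. A minimal back-of-the-envelope check in Python (assuming the full nominal 220-mAh, 3-V capacity is usable, with no self-discharge or capacity derating):

    capacity_mah = 220          # nominal coin-cell capacity
    voltage_v = 3.0             # nominal cell voltage
    years = 15

    energy_j = capacity_mah / 1000 * 3600 * voltage_v   # ~2,376 J stored
    lifetime_s = years * 365.25 * 24 * 3600             # ~4.73e8 s

    budget_uw = energy_j / lifetime_s * 1e6
    print(f"Average power budget: {budget_uw:.1f} µW")  # prints ~5.0 µW

Real deployments have less headroom than this, since self-discharge and low-temperature derating eat into the nominal capacity.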


From a chip design perspective, the key power consumers on the transmitter side are the circuits that generate a stable RF carrier and the RF amplifiers required to reach sufficient output power. These elements can dominate power consumption, especially at high transmission duty cycles. On the receiver side, channel sensing and synchronization via beacons become the dominant power consumers at low transmission duty cycles.
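
To see how these terms trade off, consider a toy duty-cycle model in Python (the current and timing figures below are illustrative assumptions, not measurements of any particular chip):

    # Toy average-power model for a duty-cycled radio: short active
    # TX/RX bursts per polling period plus a sleep-current floor.
    def avg_power_uw(tx_mw, tx_s, rx_mw, rx_s, period_s, sleep_uw):
        active_uj = (tx_mw * tx_s + rx_mw * rx_s) * 1000   # mJ -> µJ
        sleep_uj = sleep_uw * (period_s - tx_s - rx_s)
        return (active_uj + sleep_uj) / period_s

    # Illustrative numbers: 2 ms of TX at 30 mW, 50 ms of RX at 20 mW
    # for channel sensing and beacon sync, a 2-µW sleep floor, polled
    # every 120 s.
    print(f"{avg_power_uw(30, 0.002, 20, 0.050, 120, 2.0):.1f} µW")

With these assumed numbers, the receive time dominates and the average lands near 11 µW, well above the ~5-µW lifetime budget, which is why cutting receiver and carrier-generation energy matters more than shaving transmit bursts alone.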


One approach that enables significant gains in SoC power efficiency is passive backscatter applied to Wi-Fi connections.


What is passive backscatter?


Passive backscatter is an ultra-low-power RF communications technology that, when applied to Wi-Fi, relies on detecting frames transmitted by a Wi-Fi station (STA), inserting data from a connected device and reflecting the modulated signal back to a receiving Wi-Fi STA in the WLAN. The passive nature of the reflection is what allows for extreme power efficiency: there is no high-power radio transmitting the sensor data back into the Wi-Fi network.
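
In control-flow terms, a backscatter tag’s job reduces to detect, modulate, reflect. The Python sketch below illustrates the general technique under assumed hardware interfaces (the detector, switch and sensor objects are hypothetical stand-ins, not a description of HaiLa’s implementation):

    # Conceptual event loop for a passive-backscatter tag. The tag never
    # generates its own carrier; it toggles its antenna impedance to
    # modulate reflections of incident Wi-Fi frames.
    def to_bits(value, width=16):
        """Serialize a sensor reading MSB-first."""
        return [(value >> i) & 1 for i in range(width - 1, -1, -1)]

    def service_frame(detector, switch, sensor):
        detector.wait_for_frame()        # envelope detector flags Wi-Fi energy
        for bit in to_bits(sensor.read()):
            switch.set_reflective(bit)   # impedance toggle encodes one bit
        switch.set_reflective(0)         # idle in the absorptive state

Because the only switching element is the antenna-impedance toggle, the energy per transmitted bit is orders of magnitude below that of an active radio.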


Wi-Fi frames are identified by an envelope detector implemented in standard CMOS and biased in weak inversion. The receiver’s sensitivity can be set via a register, or it can adapt automatically to the received RF energy using a successive-approximation analog-to-digital converter. The backscatter switch determines how efficiently RF energy is reflected back to the Wi-Fi STA and thereby sets the maximum communication distance. To optimize backscatter efficiency, the antenna and chip package are carefully modeled and used in circuit simulations together with post-layout parasitic effects.
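
The automatic sensitivity adaptation can be pictured as a binary search over threshold codes. The sketch below shows generic successive-approximation logic in Python (the comparator interface is a hypothetical stand-in; the BSC2000’s actual register map is not described here):

    # Successive approximation: find the largest threshold code that does
    # not exceed the detected RF envelope. `above_input(code)` is assumed
    # to return True when the DAC level for `code` exceeds the input.
    def sar_threshold(above_input, bits=8):
        code = 0
        for i in range(bits - 1, -1, -1):
            trial = code | (1 << i)   # tentatively set the next bit, MSB first
            if not above_input(trial):
                code = trial          # keep the bit while at or below the input
        return code

    # Example with a stand-in comparator: converges to code 137.
    print(sar_threshold(lambda code: code > 137))

An MSB-first search like this settles in one comparison per bit, which keeps the adaptation both fast and cheap in energy.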


HaiLa’s BSC2000 RF evaluation chip is the world’s first monolithic chip implementing passive backscatter on Wi-Fi. Based on HaiLa’s foundational backscatter technology and the result of a successful collaboration between HaiLa and Presto Engineering, it pushes well beyond the current limits for low-power IoT connectivity over Wi-Fi.


—Peter Siegumfeldt is the ASIC design manager at Presto Engineering, and Patricia Bower is the product manager at HaiLa.
