J.-F. Chamberland conducts research in information theory, statistical signal processing, and probability, as well as in their applications to communication and control systems. His current research focuses on statistical problems arising in wireless communications, networks, autonomous vehicles, and learning. He is also interested in vector space methods and optimization, with possible applications to classification, biological systems, and societal challenges. Furthermore, he seeks to develop algorithms, techniques, and paradigms that permit the analysis and design of complex systems. Some of the areas that he and his collaborators work on are listed below.

Low-Complexity Algorithms for Unsourced Multiple Access and Compressed Sensing in Large Dimensions

Wireless traffic is increasingly heterogeneous, with growth coming primarily from unattended devices. While early implementations of wireless communication systems focused on voice telephony, subsequent generations of cellular infrastructures have enabled users to connect more broadly with the Internet, in support of applications such as gaming, browsing, and video streaming. Looking into the future, unattended devices are predicted to grow rapidly and to generate a significant portion of the wireless data traffic. This evolution represents a formidable challenge for current infrastructures because such devices interact with the Internet in fundamentally different ways than humans do. Individuals tend to establish sustained connections through their phones or computers, whereas machines often sporadically transmit status updates or control decisions with very short payloads. Without a fundamental redesign of the medium access control layer, wireless infrastructures will be unable to efficiently carry machine-type traffic, thereby creating a bottleneck for growth and innovation. The main goal of this research effort is to devise pragmatic random access schemes for machine-type data, with an eye towards addressing the aforementioned issues associated with the digital traffic of tomorrow. Findings from this project are expected to (i) help strengthen digital infrastructures, now widely recognized as a key driver of the economy; (ii) train competent engineers with skills attuned to societal needs; and (iii) broaden participation in science, technology, engineering, and mathematics through recruiting and mentoring.

Close connections will be exploited between multiple-access communication, compressed sensing, and sparse graph inference. The crucial challenges and main innovations arise from the exceedingly large dimensionality of the engineering problems considered, compared to the state of the art. The envisioned structures and algorithms for performing at such scales are rooted in the divide-and-conquer approaches of stochastic binning and data splitting. Techniques ranging from graph-based codes to modern iterative methods and interference management are expected to play important roles in pushing the boundaries of unsourced random access and inference in large dimensions. The fundamental limits of complexity-constrained algorithms in wireless communications will be characterized by leveraging recently developed tools from finite-block-length information theory, statistical physics, and applied probability. Key attributes of the proposed models include uncoordinated access and the ability to operate without explicitly acquiring device identities. This departure from established schemes is crucial for eliminating a reliance on individualized feedback, which has enabled fast connections in the past but would become cost-prohibitive as a mechanism for machine-type traffic. Likely outcomes of this project include near-optimum, low-complexity schemes for the next generation of random access wireless systems, which will be broadly applicable to inference in exceedingly large dimensions.
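To make the splitting idea concrete, the toy sketch below shows how a long message can be divided into short slices so that sparse recovery operates over small sub-dictionaries rather than over a single dictionary of exponential size. The parameters, the Gaussian sub-dictionaries, and the single-device setting are illustrative assumptions only; practical unsourced schemes couple such slices with an outer code and iterative decoding to stitch fragments across many concurrent devices.

    # Toy illustration (not the project's actual scheme): splitting a long message
    # into slices so that sparse recovery runs over small sub-dictionaries instead
    # of a single dictionary with 2^B columns.
    import numpy as np

    rng = np.random.default_rng(0)

    B = 24            # total message bits; an unsplit dictionary would need 2^24 columns
    L = 4             # number of slices
    b = B // L        # bits per slice; each sub-dictionary has only 2^b = 64 columns
    n_per_slice = 32  # channel uses devoted to each slice (illustrative choice)

    # Random Gaussian sub-dictionaries, one per slice.
    A = [rng.standard_normal((n_per_slice, 2 ** b)) / np.sqrt(n_per_slice) for _ in range(L)]

    # A single active device encodes its message slice by slice as one-hot column indices.
    message = rng.integers(0, 2, size=B)
    slices = [int("".join(map(str, message[i * b:(i + 1) * b])), 2) for i in range(L)]

    # Received signal: each slice occupies its own sub-block, plus Gaussian noise.
    y = [A[i][:, slices[i]] + 0.05 * rng.standard_normal(n_per_slice) for i in range(L)]

    # Decoding each slice reduces to a small matched-filter search over 2^b columns.
    recovered = [int(np.argmax(A[i].T @ y[i])) for i in range(L)]

    print("transmitted slice indices:", slices)
    print("recovered   slice indices:", recovered)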

Multi-Agent Localization and Mapping Strategies for Autonomous Vehicles

The localization system on some autonomous vehicles can be viewed as performing information fusion from several sources. Coarse localization in some areas is afforded by real-time kinematic (RTK) positioning and cellular infrastructures. This process is complicated in urban canyons by the lack of enough line-of-sight paths to GPS satellites, and by cellular signals being affected by multipath, diffusion, and diffraction phenomena associated with scattering-rich environments. These challenges are well known to Ford Autonomous Vehicles. To circumvent these issues, hybrid algorithms that are partly based on computer vision have emerged, in addition to EKF-type methods.

The quality of the information acquired by cameras, LiDAR devices, and other imaging means is crucial to the inference and decision-making processes taking place within an autonomous vehicle. This includes localization, but also extends to early warning systems, threat detection, and collision avoidance. Vehicles often operate in harsh environments, and many factors can affect a system's ability to continuously provide reliable images at the pixel level. For instance, rain drops forming on the lens, glare, insects, pollen, and chips and cracks from rocks can all distort the pixel or point-cloud information afforded by the device. Inspired by our previous work with Ford Autonomous Vehicles, this research initiative seeks to pair the acquired information with a confidence map. This is perhaps best explained with images, but the idea extends to point clouds. Consider a dirty camera that is performing an image acquisition task for the purpose of (fused) localization. The localization process operates as usual, with an algorithm matching the transformed local image to a global map using, say, normalized mutual information (NMI). This information is then relayed to a decision-making module. When the localization is reasonably successful, it is possible to recreate the path a posteriori and answer the question: Which pixel locations offer the most reliable information? In this setting, frequent mismatches for certain pixels indicate compromised areas on the lens, whereas consistently reliable information indicates clean acquisition. This naive approach can therefore be thought of as a way to build a reliability map at the pixel level. As these soft reliability functions are formed, both the image and its current confidence map can be used in tandem in the localization process. For complex systems with multiple acquisition devices, the map can be created at the device level and used before aggregation. This philosophy extends to images captured by external agents or static structures.

This thought experiment invites the possibility of closed-loop systems and enhanced filtering techniques in which the confidence map is refined periodically and the estimation algorithm takes as input both an image and a confidence map. Iterative algorithms of this type have been used successfully in, for instance, communication systems. Turbo equalization is one such scheme: channel conditions are assessed using pilot estimates and, in turn, pilot estimates are refined based on the inferred channel conditions. This type of algorithm is appealing in autonomous systems because it is largely software based, and it is likely to improve estimates significantly in harsh conditions. The innovation can be approached from a data-only perspective, leading to manufacturer-agnostic solutions. A similar approach can be adapted to static external cameras or beacons that monitor conditions and feed back information to nearby autonomous vehicles. A multi-agent, multi-view scheme with local maps and edge detection is also conceivable, although the computational complexity of more intricate implementations should be explored.
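As a rough illustration of this philosophy, the sketch below accumulates per-pixel agreement statistics over frames for which localization is assumed successful, and then uses the resulting confidence map to weight a matching criterion. The function names, synthetic frames, and simple agreement test are hypothetical stand-ins; an actual system would compare NMI-based matches against a global map rather than synthetic scenes.

    # Minimal sketch (hypothetical interfaces) of building a pixel-level confidence
    # map from a posteriori localization residuals.
    import numpy as np

    def update_confidence(conf_map, local_img, predicted_img, counts, tau=0.1):
        """Accumulate per-pixel agreement between the captured image and the image
        predicted from the global map at the (successfully) estimated pose."""
        agree = (np.abs(local_img - predicted_img) < tau).astype(float)
        counts += 1.0
        # Running average of agreement frequency serves as a soft reliability score.
        conf_map += (agree - conf_map) / counts
        return conf_map, counts

    def weighted_match_score(local_img, candidate_img, conf_map):
        """Confidence-weighted similarity used in place of an unweighted criterion
        (a stand-in for the NMI-based matching mentioned above)."""
        w = conf_map / (conf_map.sum() + 1e-12)
        return -np.sum(w * (local_img - candidate_img) ** 2)

    # Toy usage with synthetic 64x64 frames and a simulated smudge on the lens.
    rng = np.random.default_rng(1)
    H = W = 64
    conf_map, counts = np.zeros((H, W)), np.zeros((H, W))
    smudge = np.zeros((H, W)); smudge[:16, :16] = 1.0   # corrupted lens region

    for _ in range(50):
        scene = rng.random((H, W))                                   # image predicted from the map
        captured = np.where(smudge > 0, rng.random((H, W)), scene)   # smudged pixels are unreliable
        conf_map, counts = update_confidence(conf_map, captured, scene, counts)

    print("mean confidence in smudged corner :", conf_map[:16, :16].mean().round(2))
    print("mean confidence elsewhere         :", conf_map[16:, 16:].mean().round(2))

    # Using the learned confidence map: the weighted score favors the true scene
    # even though the smudged corner disagrees with it.
    print("score (true scene) :", weighted_match_score(captured, scene, conf_map))
    print("score (wrong scene):", weighted_match_score(captured, rng.random((H, W)), conf_map))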

Massive Uncoordinated and Sporadic Multiple Access – Strengthening Connections between Coding and Random Access

The wireless landscape is poised to change, once again, within the next few years due to the emergence of machine-driven communications. This creates new challenges for wireless infrastructures, with packets originating from sporadic transmissions rather than sustained connections. Currently deployed scheduling policies are ill-equipped to deal with such traffic because they rely on gathering information about channel quality and queue length for every active device. The goal of this research initiative is to address this deficiency and devise novel access schemes tailored to massive uncoordinated and sporadic multiple access, thereby readying wireless infrastructures for the traffic of tomorrow.

The intellectual merit of this research initiative lies in exploiting the close connections between message-passing decoding and successive interference cancellation to create new access strategies. Linking advances in iterative methods to uncoordinated random access embodies the type of crosscutting research that can lead to disruptive technologies and paradigm shifts. This project embraces the evolving perspective of harnessing interference in wireless networks rather than fighting or avoiding it. This viewpoint underlies many recent successes in network coding and distributed storage, and this project brings such a perspective to the design of large-scale wireless networks. The broader impacts of this research program include providing pragmatic solutions to some of the challenges posed by an evolving wireless landscape, strengthening wireless infrastructures, and contributing to the training of a globally competitive science, technology, engineering, and mathematics workforce. The research tasks are attuned to societal needs in information technologies, an important economic driver for our nation. The wide dissemination of our findings will enhance the scientific understanding of wireless systems, access strategies, and iterative methods.
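The connection between successive interference cancellation and message passing can be illustrated with a toy peeling decoder for a coded slotted ALOHA frame, sketched below under invented parameters: each device transmits a few replicas of its packet in randomly chosen slots, singleton slots are decoded, and the corresponding replicas are cancelled from the bipartite graph until no singleton remains.

    # Toy peeling decoder for coded slotted ALOHA, illustrating how successive
    # interference cancellation mirrors message passing on a bipartite graph.
    # All parameters below are invented for illustration.
    import random

    def simulate_frame(num_users=50, num_slots=75, replicas=3, seed=0):
        rng = random.Random(seed)
        # Each user places `replicas` copies of its packet in distinct random slots.
        user_slots = [rng.sample(range(num_slots), replicas) for _ in range(num_users)]
        slot_users = [set() for _ in range(num_slots)]
        for u, slots in enumerate(user_slots):
            for s in slots:
                slot_users[s].add(u)

        resolved = set()
        progress = True
        while progress:
            progress = False
            for s in range(num_slots):
                if len(slot_users[s]) == 1:           # singleton slot: packet decodable
                    u = next(iter(slot_users[s]))
                    resolved.add(u)
                    for s2 in user_slots[u]:          # cancel its replicas everywhere
                        slot_users[s2].discard(u)
                    progress = True
        return len(resolved), num_users

    decoded, total = simulate_frame()
    print(f"decoded {decoded} of {total} users via peeling/SIC")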

Adapting to a Changing Digital Landscape with Reconfigurable Antennas

This research initiative investigates new and realistic approaches to the creation and management of spatiotemporal information channels linking users and devices in wireless communication networks. Fast reconfigurable antennas can be employed to establish ancillary virtual links between nearby devices and hence augment the dimensionality of the solution space for several communication scenarios. This enables new ways to harness fading and manage interference in wireless environments. Understanding reconfigurable antennas, fading, interference, and their interplay in the context of wireless communication networks forms the essence of this initiative. A central goal of the envisioned research is to circumvent the current bottlenecks that restrict Internet connectivity over access points and multi-cell networks. Prevalent obstructions that impede the development of superior systems include the complexity associated with interference management, limitations on feedback imposed by bandwidth and delay, and the fading characteristics of wireless environments. This research program introduces innovative technologies and algorithmic paradigms to address these fundamental issues. The proposed thrusts rely on a hierarchy of interlocking concepts; the three major tasks are summarized below.

  1. Design reconfigurable antennas and integrate them into mobile platforms.
  2. Provide a rigorous and thorough analysis of the repercussions of reconfigurable antennas on the foundations of wireless communication.
  3. Coordinate an integrative effort that blends the findings from these tasks into a working prototype system.

Fundamental Limits in Delay-Constrained Wireless Communication

This research initiative addresses current issues in wireless and hybrid data networks. Wireless technology offers a unique mixture of connectivity, flexibility, and freedom. It plays an instrumental role in bridging the gap between mobile devices and established communication infrastructures. Today, wireless technology is being embraced with increasing vigor. This trend is reflected in the growing interest in multihop wireless networks. Wireless systems have the potential to fulfill the long-standing promise of pervasive computing and ubiquitous network access. Recent breakthroughs in multi-antenna systems, user cooperation, active relaying, and network coding provide a foundation to realize the next radical advance in information technology: building reliable wireless multihop networks that can support delay-sensitive applications (e.g., VoIP, video conferencing, remote control, monitoring, gaming). The stringent delay constraints typical of real-time traffic suggest that a classical capacity/throughput analysis of the communication infrastructure associated with a multihop wireless network may not offer an accurate assessment of overall performance. In particular, existing models for wireless networks are limited in their ability to characterize time-variation issues in both single-hop and multihop channels. This research project seeks to improve the robustness and reliability of wireless multihop systems and to enable them to support delay-sensitive applications. To achieve this goal, we envisage a number of broad objectives for which both models and methodology must be advanced.

  1. Develop an integrated methodology for the analysis of wireless systems that support real-time traffic and delay-sensitive applications such as voice, video conferencing, inference, and control.
  2. Use this methodology to identify fundamental performance limits and to design algorithms which allocate system resources efficiently when confronted by stringent service requirements.
  3. Create communication schemes for delay-sensitive traffic that take advantage of novel paradigms in wireless communications such as network coding, active relaying, user cooperation, multi-antenna systems, and multipath routing.
  4. Develop a comprehensive evaluation and validation platform for system design and algorithm development in the context of small wireless multihop networks.

Inference with Applications to Distributed Sensing

The emergence of miniature sensors with low-power wireless transceivers holds the promise of a new phase in the wireless revolution. Wireless sensor networks possess the ability to collect and transmit environmental data through the deployment of inexpensive devices. The amount of information generated by systems composed of hundreds or thousands of wireless sensors is vast. This creates many new challenges for the processing and transmission of the gathered data. New analysis tools are required to provide insight into the efficient design of sensor networks, especially in the context of inference problems and other delay-sensitive applications. The goal of this research initiative is to capture the preponderant features of sensor networks, and to use these features to derive guidelines and heuristics for the design of such systems.

Education and Technology

Teaching provides an exceptional opportunity to share knowledge and contribute to the scholarship of students. As a teacher, J.-F. Chamberland seeks to achieve three main interrelated pedagogical objectives: provide students with a base of concepts and engineering skills, foster their interest in applied and theoretical research, and promote innovative and critical thinking. In college, a large portion of student learning occurs outside the classroom. One of the roles of an instructor is to facilitate and support this learning process. Understanding how emerging technologies such as content management systems, wikis, discussion boards, streaming content, and licensing influence learning and cooperation among students is an important research topic that will shape the future of education. This research area is an integral part of our education program.