About SmartSensing Workshop

The success of smart homes, smart cities, and other intelligent environments ultimately depends on how efficiently they can deliver services to their users. Service efficiency, however, is only one of the factors that contribute to user well-being; at the core of these systems, sensing plays the primary role. Advances in sensing enable the detection of fine-grained contextual details about a user with high accuracy, allowing smart environments to tailor their services to the specific needs of each user. Because the robustness and quality of sensing strongly affect the perception and reasoning capabilities of these environments, innovation is needed not only in developing the sensors themselves but also in designing the algorithms that analyze the data they collect. Since smart environments must also recognize complex, interleaved activities, continuous sensing is usually adopted. Continuous sensing, however, tends to produce systems that are inefficient in energy consumption, which can limit their deployment in realistic scenarios. Smart sensing therefore also encompasses schemes and devices for building power-efficient systems that save money and energy without degrading quality of service for the users.

The goal of this workshop is to bring together practitioners and researchers from academia and industry, providing a forum for discussion and technical presentations on the fundamental knowledge and principles of smart sensing systems. These systems enable value co-creation across sensing, IoT, wearables, digital health, smart environments, and the future of work.

Important Dates

  • Submission Deadline: September 20, 2019 (AoE)
  • Acceptance Notification: October 15, 2019
  • Camera-Ready Submission: October 30, 2019
  • Event Date: January 5, 2020

Submission link: Click here