Theme issue: Active Crowd Sensing [CFP]

Personal and Ubiquitous Computing

Guest Editors

Zhiyong Yu
Fuzhou University, China
Jiangtao Wang
Lancaster University, UK
Jordán Pascual Espada
University of Oviedo, Spain

Mobile Crowd Sensing (MCS) has emerged as a paradigm for gathering information about the physical world. Using smartphones, networked vehicles and other sensor-rich mobile, portable and wearable devices, massive volumes of data can be gathered, converged and mined to enable applications in intelligent traffic, environmental monitoring, urban planning or public safety management, amongst many others.

The cost and the quality of crowdsensed data are two key factors in MCS. A great deal of work has optimized them separately, for example by recruiting participants who can meet a task completion requirement, or by using already-collected data to infer uncollected data with minimal error. Optimizing them jointly is a promising direction. One such approach builds on active learning, which selects the most useful samples and thereby reduces the number of labeled samples needed; active learning has been applied successfully in other research fields, including signal and image processing.
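As a rough illustration of the sample-selection idea (the cell names, probabilities, and budget below are hypothetical, not taken from any particular MCS system), uncertainty sampling ranks candidate sensing locations by how unsure the current model is about them and spends the sensing budget on the least certain ones:

```python
import math

def entropy(p):
    """Binary prediction entropy: higher means the model is less certain."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def select_cells(predictions, budget):
    """Pick the `budget` cells whose current predictions are most uncertain.

    `predictions` maps a (hypothetical) cell id to the model's predicted
    probability for that cell. The most uncertain cells are treated as the
    most informative ones to actually sense next; the rest are inferred.
    """
    ranked = sorted(predictions, key=lambda c: entropy(predictions[c]),
                    reverse=True)
    return ranked[:budget]

# Example: with a budget of 2, cells with predictions near 0.5 are sensed
# first, while confidently predicted cells (near 0 or 1) are skipped.
preds = {"cell_a": 0.95, "cell_b": 0.50, "cell_c": 0.65, "cell_d": 0.05}
print(select_cells(preds, 2))  # → ['cell_b', 'cell_c']
```

A real Active Crowd Sensing strategy would replace the entropy score with a criterion matched to the specific learning algorithm and to the spatiotemporal structure of the data, which is precisely the challenge this issue addresses.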

In MCS, this approach is known as 'Active Crowd Sensing', and it presents several challenges: crowdsensed data are spatiotemporally autocorrelated, whereas traditional active learning assumes the data are independent and identically distributed; and data from different applications require different learning algorithms, so active selection strategies must be tailored to the specific learning algorithm used.

This special issue of Personal and Ubiquitous Computing provides an opportunity for researchers and product developers to review and discuss the state of the art in Active Crowd Sensing, to explore novel application areas, to demonstrate its benefits, and to identify open issues in sensing and learning in MCS.

Topics may include (but are not limited to):
Modeling the crowdsensed data cost and learned data quality in MCS
Participant/location/time selection which can benefit learned data quality
Missing data inference in Active Crowd Sensing
Spatiotemporal autocorrelation analysis of crowdsensed data
Relations between sensing rate and learning rate
Partitioning between sensing tasks and learning tasks
Learning directly from incomplete crowdsensed data
Adaptive measurement matrix for compressive sensing in MCS
Selective sensing and active learning by edge computing
Sub-sampling and super-resolution for crowdsensed data
Sparsity measurement and recoverability of crowdsensed data
Sparse mobile crowdsensing
Learning-assisted optimization in MCS
Active (deep/transfer) learning in MCS
Reinforcement/compressive/distilled learning for MCS
New datasets/features that are easily accessible and helpful for label inference
Problem modeling and framework development for Active Crowd Sensing
Novel applications and systems in Active Crowd Sensing


Submissions should be original papers and should not be under consideration for publication elsewhere.

Extended versions of papers from relevant conferences and workshops are invited as long as the additional contribution is substantial (at least 30% new content).

Authors should follow the formatting and submission instructions for Personal and Ubiquitous Computing.

For more information, visit the Springer Nature Information for Journal Article Authors pages.

During the first submission step in Editorial Manager, select "Original article" as the article type. In further steps, confirm that your submission belongs to this special issue by choosing the special issue title from the drop-down menu.

All papers will be peer-reviewed.

Important Dates

Manuscript Submission: June 15, 2020
Decision Notification: August 1, 2020
Final Manuscript Due Date: August 15, 2020
Publication Date (tentative): October, 2020