Best Practices

How to Make Choice-based Prioritization Work in a Mobile Environment


It’s common practice in surveys to include questions that ask respondents to prioritize a list of items. These “items” may be brands, product features, benefits, messages, potential names, etc.

In a traditional web-based survey, there are several different approaches that can be used:

  • Ratings
  • Rankings
  • Point or chip allocation
  • Q-Sort
  • MaxDiff

However, with the increased use of mobile platforms such as smartphones and tablets, caution should be exercised to ensure that the selected approach is optimal for the device. Smaller screen sizes limit the amount and complexity of information that can be displayed, and shorter attention spans on mobile devices can impact respondent engagement.

To address these realities, we have investigated extensively which technique works best for prioritizing items in a mobile setting. This includes conducting “research on research,” particularly as it relates to MaxDiff exercises.

Based on our research and experience, we recommend these best practices:

Ratings Scales:
For any survey conducted on a mobile device, we try to limit the amount of scrolling that individuals must do. When using ratings, that means limiting the number of items shown on the screen at any one time. We recommend displaying 3-4 items on the screen at most (depending on the question text, anchor labels, etc.).
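The 3-4 item guideline can be enforced mechanically when programming a ratings battery. As a minimal sketch (the `paginate_items` helper and its default are our own illustration, not part of any survey platform):

```python
from typing import List


def paginate_items(items: List[str], per_screen: int = 4) -> List[List[str]]:
    """Split a list of rating items into screens of at most `per_screen` items,
    so no single mobile screen requires scrolling through a long battery."""
    return [items[i:i + per_screen] for i in range(0, len(items), per_screen)]


# Ten items become three screens of 4, 4, and 2 items.
screens = paginate_items([f"item {i}" for i in range(1, 11)])
```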

Rankings and Allocations:
With rankings and allocations, it is also important to minimize scrolling. For rankings, a best practice is for numeric scores to appear as individuals rank order the items. Alternatively, a drag-and-drop function can be effective when the list of items is fairly concise.

Sorting:
Q-sort exercises work well on mobile devices when there is a larger number of items. In this case, respondents go through multiple steps, such as:

  • Initially selecting the top 2 items
  • Of the remaining items, selecting the bottom 2
  • Of the remaining items, selecting the top 3
  • Of the remaining items, selecting the bottom 3

This process continues until only a small number of items remain. To introduce greater precision, we also have respondents rank the items within each of the subsets (particularly the top items). An algorithm is then used to assign points to each of the items, resulting in very clear differentiation among the items.
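The winnowing and scoring steps above can be sketched in code. The article does not disclose the actual point-assignment algorithm, so the bucket sizes and point values below are purely illustrative assumptions:

```python
def qsort_buckets(ranked_items):
    """Simulate the winnowing steps on a respondent's latent best-to-worst
    ranking: pick the top 2, then the bottom 2 of what remains, then the
    top 3, then the bottom 3, leaving a small 'middle' group."""
    remaining = list(ranked_items)
    buckets = []
    for side, n in [("top", 2), ("bottom", 2), ("top", 3), ("bottom", 3)]:
        if len(remaining) <= n:
            break
        if side == "top":
            chosen, remaining = remaining[:n], remaining[n:]
        else:
            chosen, remaining = remaining[-n:], remaining[:-n]
        buckets.append((side, chosen))
    buckets.append(("middle", remaining))
    return buckets


def assign_points(buckets):
    """Assign descending points to buckets ordered from most to least
    preferred: earlier 'top' buckets score highest, earlier 'bottom'
    buckets score lowest. The point scheme is a hypothetical stand-in."""
    tops = [b for b in buckets if b[0] == "top"]
    mids = [b for b in buckets if b[0] == "middle"]
    bottoms = [b for b in buckets if b[0] == "bottom"]
    ordered = tops + mids + bottoms[::-1]  # best bucket first
    scores = {}
    for i, (_, items) in enumerate(ordered):
        for item in items:
            scores[item] = len(ordered) - i
    return scores
```

Running this on 12 items ranked "a" (best) through "l" (worst) yields five buckets, with "a" and "b" scoring highest and "k" and "l" scoring lowest.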

MaxDiff:
We have done extensive testing of MaxDiff exercises to ensure that we are able to get the most accurate information from them in mobile settings. Our “research on research” has revealed the following:

  • MaxDiff exercises can be successfully executed on smartphones, providing us with accurate data
  • MaxDiff screens should show 3 to 4 items to provide reliable data
  • The number of screens shown does not impact the accuracy of data (true up to 12 screens)
  • As much as possible, the verbiage associated with the items should be minimized (while still conveying important information)
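The article does not describe how MaxDiff responses are analyzed; production studies typically use logit or hierarchical Bayes models. As a simple illustration only, a common count-based approximation scores each item by how often it was picked as best versus worst:

```python
from collections import defaultdict


def maxdiff_count_scores(responses):
    """Count-based MaxDiff scores: (times best - times worst) / times shown.

    `responses` is a list of (shown_items, best, worst) tuples, one per
    screen. This counting analysis is an illustrative simplification, not
    the model used in formal MaxDiff studies.
    """
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for items, b, w in responses:
        for item in items:
            shown[item] += 1
        best[b] += 1
        worst[w] += 1
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}


# Two hypothetical screens of 3 items each (within the 3-4 item guideline).
responses = [
    (("A", "B", "C"), "A", "C"),
    (("A", "B", "D"), "A", "D"),
]
scores = maxdiff_count_scores(responses)  # A scores 1.0, C and D score -1.0
```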

Given the importance of surveys on mobile, we believe it is imperative to use best practices to collect the best data possible. Please contact us to discuss the best prioritization method for your particular needs.
