Crowdsourcing Explained
Written by Entromy Support

In today’s world, it is critical for leaders to pay attention to both individual and collective voices: What are your people thinking about? How are they feeling? Do they believe you are paying attention to what matters? Do they understand the decisions being made, and do they agree those decisions will drive positive change? What would they advise from their perspective closest to the ground?

When participants reach an open-ended question on your survey, they have the option to enter a response in their own words. They also have the option to agree or disagree with comments left by other survey participants.

This acts as a digital focus group, allowing you to see which comments and responses resonate most strongly across your organization. Rather than reading through every individual comment submitted by respondents, you can easily identify which comments gained the most support.

To maintain anonymity and confidentiality, our Natural Language Processing (NLP) engine reviews each comment as it is submitted for identifiable information (e.g., personal names) or concerning language (e.g., references to alcohol or drugs). These comments then become part of a digital focus group, providing a collaborative, transparent opportunity for employees to share anonymous feedback. The ability to agree or disagree with comments also helps preserve anonymity by allowing collective feedback without identifying individual participants.

Leveraging AI in the Crowdsourcing Process

We now also leverage AI in our crowdsourcing functionality to automatically flag potentially sensitive or concerning comments for review. This AI-driven approach streamlines the process of identifying comments that may contain sensitive language or require additional attention. It is important to note, however, that AI is not perfect: while it significantly improves efficiency, it may miss some issues or flag comments that do not require action. For example, the AI might not flag a comment in which a participant calls out a fellow employee by a nickname. Because the AI detects names based on how they are spelled in the census file uploaded into our system, it may overlook nicknames or abbreviations that are not an exact match. Similarly, international names with varying spellings may not always be flagged correctly.
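To make the exact-match limitation concrete, here is a minimal sketch (ours, not Entromy's actual detection logic) of name flagging against a hypothetical census file, showing how a nickname like "Bob" slips past a census entry of "Robert":

```python
# Illustrative sketch only; not Entromy's actual implementation.
# Assumes a census file has already been loaded into a set of formal names.

CENSUS_NAMES = {"Robert", "Smith"}  # hypothetical census entries

def flags_name(comment: str, census_names: set[str]) -> bool:
    """Return True if any word in the comment exactly matches a census name."""
    words = comment.replace(".", " ").replace(",", " ").split()
    return any(word in census_names for word in words)

print(flags_name("Robert never responds to emails.", CENSUS_NAMES))  # True
print(flags_name("Bob never responds to emails.", CENSUS_NAMES))     # False: "Bob" is not an exact match
```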

Because of these potential gaps, human oversight remains a critical part of the process, ensuring that all sensitive comments are reviewed appropriately.

Commonly Asked Questions

How does the crowdsourcing comments feature work when employees write in different languages? Do participants only see comments in their own language, or are they translated before crowdsourcing?

Participants see only comments written in their own language.

Will survey participants be biased in their own responses if they see comments submitted by others?

No survey method is entirely free from potential bias, and various external factors can influence an individual’s responses (e.g., their mood, recent meetings, etc.). The crowdsourcing feature allows the comments that resonate most deeply to rise to the top. If you are concerned about bias, you can enable the Rank Delay feature, which requires participants to submit their own responses before voting on others, or disable the crowdsourcing feature entirely.

Can crowdsourcing be disabled?

Yes, crowdsourcing can be disabled. However, we strongly encourage keeping it enabled for the additional insights it offers.

How do comment scores work?

When a participant submits a comment, it automatically receives 5 points. As others agree or disagree, the comment gains or loses points (e.g., a strong agreement adds 3 points, while a strong disagreement subtracts 3 points). The number displayed next to each comment represents its net score.
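As a worked illustration of that arithmetic: the starting value of 5 and the +3/-3 values for strong votes come from the description above, while the +1/-1 values for regular agree/disagree votes are our assumption, since the article only specifies strong votes. A net score could then be computed like this:

```python
# Illustrative sketch of the net-score arithmetic described above.
# Stated in the article: a new comment starts at 5 points; a strong
# agreement adds 3; a strong disagreement subtracts 3.
# Assumed for this sketch: regular agree/disagree votes are worth +1/-1.

VOTE_POINTS = {
    "strong_agree": 3,      # stated in the article
    "agree": 1,             # assumed
    "disagree": -1,         # assumed
    "strong_disagree": -3,  # stated in the article
}

def net_score(votes: list[str], starting_points: int = 5) -> int:
    """Return a comment's net score: starting points plus all vote points."""
    return starting_points + sum(VOTE_POINTS[v] for v in votes)

# Example: two strong agreements and one strong disagreement
print(net_score(["strong_agree", "strong_agree", "strong_disagree"]))  # 5 + 3 + 3 - 3 = 8
```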
