Abstract
In reasoning experiments, participants typically evaluate conclusions by clicking predefined response alternatives or rating scales. However, such response formats may themselves elicit specific cognitive reasoning strategies. To avoid such biases, we employed an open response format that allowed participants to formulate and explain their conclusions freely in complete sentences. Across two experiments, participants generated over 1,300 written responses, which we categorized, using a predefined coding scheme, as certain, uncertain based on counterexamples, or uncertain based on probabilities. Participants frequently gave certain responses rather than expressing graded degrees of belief, and when they did express uncertainty, they more often justified it with concrete counterexamples than with probabilistic reasoning. We introduce the term probabilistic masking for the phenomenon whereby graded response formats may suppress the consideration of certain inferences or counterexamples, thereby biasing empirical accounts of human inference toward probabilistic interpretations.