I propose modelling boundedly rational agents as agents who are not logically omniscient—that is, who do not know all logical or mathematical implications of what they know. I show how a subjective state space can be derived as part of a subjective expected utility representation of the agent's preferences. The representation exists under very weak conditions. The representation uses the familiar language of probability, utility, and states of the world, in the hope that this makes the model of bounded rationality easier to use in applications.
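To fix ideas, a subjective expected utility representation of the kind described above typically takes the following standard form; the notation here (acts $f$, state space $S$, probability $p$, utility $u$) is illustrative and not drawn from the paper itself:

$$
f \succsim g \quad \Longleftrightarrow \quad \sum_{s \in S} p(s)\, u\bigl(f(s)\bigr) \;\geq\; \sum_{s \in S} p(s)\, u\bigl(g(s)\bigr),
$$

where, unlike in the classical Savage framework, the state space $S$ is not given exogenously but is itself derived from the agent's preferences. Because the agent is not logically omniscient, the derived states may distinguish scenarios that are logically equivalent but that the agent has not recognized as such.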