This week’s Research Hero is Prof. Jay Edward Russo. Prof. Russo received his PhD in Cognitive Psychology from the University of Michigan. He has been at Cornell University since 1985, where he is the S.C. Johnson Family Professor of Management at the business school. He has also been on the faculty of the University of Chicago and the University of California, San Diego, and has held visiting positions at Bocconi University (Milan), Carnegie Mellon, Duke, and Penn (The Wharton School). Prof. Russo’s research focuses on managerial and consumer decision making; among his most important contributions is his work on information distortion and on process-tracing methods. Prof. Russo has published extensively in prestigious journals and has co-authored Winning Decisions (2002) and Decision Traps (1989). He has served on the editorial boards of leading journals, including the Journal of Behavioral Decision Making, Journal of Consumer Psychology, Journal of Marketing, Journal of Personality and Social Psychology, and Psychological Science, among many others. He has also done consulting work for the Federal Trade Commission, GTE Laboratories, and General Motors Research Laboratories.
I wish someone had told me at the beginning of my career…Throughout your career, but especially prior to tenure, you will very likely be forced to make a tradeoff between good science and careerist tactics. The research topic that would contribute most to understanding J/DM may not be one that is currently well recognized and accepted by the field. The more novel the topic of one’s research, the more challenging will be its path to publication in journals, to grant support, and to other markers of acceptance by the field. You are far more likely to publish many papers if you work on currently accepted topics. You will need those publications, maybe many of them, to achieve careerist goals, especially tenure. The price to good science may be work that is incremental at best and “backfill” at worst. I urge you to be fully aware of the tradeoffs that you make between better science and career advantage.
I most admire academically… because…
Herb Simon, because he aimed so high as a scholar and as a citizen of his university and of the world at large, and because he was so successful as both a scientist and a citizen.
The best research project I have worked on during my career…/the project that I am most proud of/ that has inspired me most….I stumbled on the phenomenon of decision makers’ distorting new information to support the currently leading alternative. I investigated this predecisional distortion of information for a decade or so, revealing some of its manifestations, boundaries, and consequences. One strategy for good science is to try to identify the underlying causes that explain why a phenomenon occurs, in the hope that even one of those causes may be fundamental enough to explain other phenomena as well. The attempt to explain predecisional distortion led to work that identified the goal of cognitive consistency as the main driver. This work relied on multiple methods, including some new to me (semantic priming and a lexical decision task) or simply new (in-progress assessment of goal activation). The result was unexpected and quite clear: only cognitive consistency caused information distortion, with alternative goals like saving effort playing no role at all. Subsequent work has confirmed that the goal of cognitive consistency is at least one driver of several other J/DM phenomena, thus validating the scientist’s strategy of seeking depth of explanation.
The worst research project I have worked on during my career…/the one project that I should never have done…There is no one project that I regret. Rather, my regret is having worked on too many projects, drawn to each one because it was so genuinely interesting. I probably should have focused on those that were both most interesting and most important.
The most amazing or memorable experience when I was doing research….After so many decades of research (five), there are many such experiences; but it is categories more than individual events that come to mind in responding to this question. For instance, when I was younger, it was a great pleasure to have a senior scholar whom I respected proffer kind words about my work. Now I have the pleasure of supporting young researchers, reminding them that it may take several good ideas to find one that is both worthwhile and feasible, and that, in their enthusiasm and impatience, they should remember that science is slow.
The one story I always wanted to tell but never had a chance…“There’s nothing new here.” These were the words of all three reviewers of one of the first submitted manuscripts on information distortion. Fortunately, each one identified a different well-known phenomenon of which information distortion was asserted to be merely another (unnecessary!) illustration. I do not recall the exact three, but early in this research stream the following were offered: attitude extremity/polarization, cognitive dissonance, confirmation bias, the desirability bias (wishful thinking), the halo effect, and the prior belief effect. Fortunately, the editor was sensitive to the unusual combination of reviewers’ complete agreement (“reject this manuscript”) and complete disagreement (“just another example of [three distinctly different phenomena]”). As a result, he gave me and my co-authors the chance to explain why there was, in fact, something new in the phenomenon of information distortion. The subsequent explanation was accepted, along with the manuscript. The lesson I took from this experience was how reviewers (which means most of us) can so naturally filter our judgments through our own lenses. The question that I ask myself is whether I have applied that lesson consistently when I evaluate others’ work. The answer: probably not, but I do keep trying.
A research project I wish I had done… And why did I not do it…I cannot claim to have no regrets whatsoever (that would be hubris), but none of them involve a research project that I regret not attempting.
If I weren’t doing this, I would be…Likely retired, an unpleasant thought. There is still tread left, so please don’t retire me.
The biggest challenge for our field in the next 10 years…One challenge is to encompass the growing breadth of J/DM phenomena and methods. Among the phenomena are those that are nonconscious, emotional, and contextual. Among the methods are those of neuroscience and of process tracing. In considering the opportunities and barriers to adopting these newer research topics and methods, I recall the observation that so often seems best to characterize a field’s response to such a situation: “We love progress; it’s change we hate.” My belief is that J/DM researchers, senior as well as junior, can master new methods and solve new problems. My hope is that more than a few will.
A second challenge is paradigmatic. J/DM emerged as a field by testing the optimal models of economics and statistics, especially EU and Bayesian updating. Violations of these models engendered the anomalies paradigm that has characterized J/DM for the last four decades. Let me suggest a challenge in the form of a question: what would J/DM look like if it were studied the way other higher-order psychological phenomena are approached, such as problem solving/reasoning and language comprehension? That is, what if we built theories of cognitive (and other) processes from process (and other) data, but without specifying optimal performance? Indeed, if we view behavior as driven by multiple goals, not all of which are even conscious, can we really specify optimal performance? What if, instead, we viewed our subjects as adapting to the task environment that we scientists create in order to perform sufficiently well rather than optimally? Great progress has been achieved in understanding how people read without the use of an optimal model of language comprehension. Might similar progress occur in J/DM by focusing less on how our observations compare to optimality criteria and more on the complexity of decision makers’ attempts to achieve multiple goals simultaneously?
My advice for young researchers at the start of their career is…Learn how to select research problems, not just how to solve them. Try to be strategic in how you approach your topics, colleagues, and journals. Often I’ve seen a graduate student (or a credentialed researcher) happy just to find a candidate problem: “That would make a dissertation topic” or “That could be publishable.” With my own students who are ready to find a dissertation problem, I ask them to identify three potential topics, to research each one for at least a week, and to evaluate their comparative merits. Then, and only then, do I want them to pick one.
Understand the J/DM paradigm in which you are working and think about whether a different one, maybe a newer one, might yield greater contributions to the field. Are input-output data sufficient, or would process data yield more insight? Is this the time or topic to bring in neuroscience? Should the analysis move from the attributes of the alternatives considered in a decision to the benefits that those attributes convey, or even to the goals that those benefits help to achieve? One of the books that most influenced my graduate training is Thomas Kuhn’s The Structure of Scientific Revolutions, which focuses on scientific paradigms. I still begin my doctoral seminar by asking students to read it.