Thursday, January 23, 2025

Sunita Sah on How to Say ‘No’

Defy: The Power of No in a World that Demands Yes

Capacity, knowledge, understanding, freedom, and authorization
**Valid consent requires five elements to be present**:
   capacity (implicit and explicit competency),
   knowledge (how to determine whether someone has the knowledge),
   understanding (how to tell whether they have the understanding),
   freedom (the freedom to say no, yes, maybe, not now, maybe later, wait), and
   authorization.
     author
     authority (Tennessee Valley Authority, TVA)
     authoritarian
     authorize
     authorship
     authorized edition

https://en.wikipedia.org/wiki/Sunita_Sah

Sunita Sah is a professor of management and organizations at Cornell University's SC Johnson Graduate School of Management, and a core faculty fellow in the new Cornell Health Policy Center.[1] Her book, Defy: The Power of No in a World that Demands Yes, will be published by Penguin Random House in January 2025.[2] Sah is the director of Cornell University's Academic Leadership Institute. She was formerly the KPMG Professor of Management Studies at Cambridge Judge Business School of the University of Cambridge, where she remains an Honorary Fellow.[3] 

Sah is known internationally for her research on conflicts of interest and disclosure, behavioral ethics, influence, compliance, and defiance. She identified the "panhandler effect" and "insinuation anxiety" in advisor-advisee dynamics.[6][7] She revealed how conflict of interest disclosures can increase compliance and have unintended effects.[8][9] Her research on forensic science demonstrates how a lack of scientific rigor in forensic processes can lead to improper conviction and incarceration.[10][11]

https://en.wikipedia.org/wiki/Sunita_Sah

https://www.sunitasah.com/


https://time.com/7204326/saying-yes-instead-of-no-essay/

By Dr. Sunita Sah
January 4, 2025 7:00 AM EST
Dr. Sah is an award-winning professor at Cornell University and a physician turned organizational psychologist. She leads groundbreaking research on influence, authority, compliance, and defiance. Sah is the author of Defy: The Power of No in a World That Demands Yes.

From an early age, we are taught that obedience is good, and disobedience is bad. Saying yes is polite and agreeable, while saying no is often seen as selfish or disruptive. These lessons shape us psychologically, socially, and even neurologically.

When we are rewarded for compliant behavior, our brain rewards us with a hit of dopamine, the neurotransmitter associated with pleasure. Repeated compliance strengthens the neural pathways associated with saying “yes.”

On the other hand, acts of defiance—especially when they are met with disapproval—receive no such reward, making those pathways weaker or less likely to develop. Over time, compliance becomes a default response.

This tendency is reinforced throughout our lives. At school, we are praised for obedience and penalized for questioning authority. At work, compliance is embedded in professional hierarchies. Even in our personal lives, studies show that those who are conscientious or have agreeable dispositions are more likely to acquiesce to others’ demands. It’s no wonder, then, that saying “yes” feels easier, safer, and even expected—while saying no can feel like swimming against a tide of social conditioning.

We feel immense pressure from others to meet their expectations, often prioritizing social harmony over our better judgment. This need for connection and acceptance drives us to comply, even when doing so conflicts with what we know is right.

Research consistently highlights how this pressure to comply shapes our behavior. In a series of experiments, my colleagues and I found that people frequently took bad advice, even when the flaws in the advice were glaringly obvious. In one study, participants were asked to choose between two lotteries, one of which was clearly inferior in value. When an advisor encouraged participants to select the subpar lottery, compliance rates were often as high as 85%. Requests to comply created a social obligation too uncomfortable to resist. However, when participants had the chance to revise their choices in private, compliance dropped to about 50%—still substantially high but a clear indication of how the physical presence of others magnifies the pressure to comply.

In another experiment, we had a middle-aged man approach 253 ferry passengers traveling from Connecticut to Long Island, offering them $5 or a chance to play a mystery lottery (with an average payout of less than $5) in exchange for completing a survey. Without advice, only 8% of participants chose the lottery. When the man recommended the lottery, 20% complied. Alarmingly, when he disclosed that he would receive a bonus if they selected the lottery—introducing a blatant conflict of interest—compliance jumped to 42%. Despite admitting that this revelation reduced their trust in the man, many passengers said they felt uneasy rejecting his suggestion outright, fearing it would imply they thought he was untrustworthy.

This discomfort they felt, which I call “insinuation anxiety,” is an aversive emotional state that arises when we fear that rejecting someone’s request will be interpreted as a signal of distrust or disrespect. Insinuation anxiety can explain why patients sometimes comply with unnecessary medical tests, or why employees accept unrealistic demands from their bosses. The thought of questioning someone’s expertise or intentions can be so unsettling to us that we’d rather choose compliance to avoid the awkwardness of implied doubt.

Even in extreme cases, insinuation anxiety may compel compliance. Stanley Milgram’s landmark psychology experiments on obedience to authority illustrate dramatic compliance. Participants were instructed to administer what they believed were dangerous electric shocks to a stranger. Many were visibly distressed, suggesting they did not wish to comply, yet two-thirds could not reject the experimenter’s directives, prioritizing obedience over their own moral values.

This drive to avoid insinuating mistrust stems from a desire to maintain social harmony, avoid embarrassment, and “save face” for the person giving advice. The cost, however, is that we often suppress our own values and judgments to placate others.

Stanley Milgram’s obedience experiments also reveal another dimension of compliance: the abdication of responsibility. Many participants administered what they believed were dangerous shocks because, as they said, they were “just following orders.” This tendency to transfer moral accountability to authority figures is known as “ethical fading”—it shrinks our sense of responsibility and numbs us to the consequences of our actions.

In my research, I’ve found a similar dynamic: People often say yes to bad advice because they believe it will shield them from blame if things go wrong. Ironically, however, the opposite is true. Compliance doesn’t absolve regret; it amplifies it. When we ignore our better judgment, we end up feeling more culpable for poor outcomes, not less.

Compliance and consent are often conflated but are fundamentally different. Compliance is reactive and externally dictated, imposed by systems or authority figures that leave us with little room to say no. Consent, by contrast, is deliberate—a deeply considered agreement or refusal rooted in one’s values. Valid consent requires five elements to be present: capacity, knowledge, understanding, freedom, and authorization. Studies have shown that when people feel rushed or overwhelmed, it is difficult for them to process information in a deliberate fashion, undermining their informed consent.

Defiance, too, is often misunderstood. Many people assume it must be loud, aggressive, or confrontational. But like consent, true defiance is deliberate and deeply personal. Defiance, as I see it, requires the same five elements as consent to ensure that we can act in accordance with our values. It’s not about rebellion for its own sake but about alignment—choosing actions that reflect your values, even under pressure.

Even when we recognize the need to defy, many of us lack the tools to translate internal discomfort into action. Defiance, like any other skill, requires practice. But society rarely gives us the space to develop it. Without practice, we default to compliance and continue to say yes when we want to say no. Developing the ability to defy starts with recognizing discomfort as a signal, pausing to reflect on your values, and taking small, deliberate steps toward action. The more we practice, the more confident we become in our capacity to align our actions with our principles.

While defiance is often associated with risk—social exclusion, professional backlash, or strained relationships—the costs of compliance are equally profound. When we comply without question, we erode our sense of agency, disconnect from our values, and, in many cases, perpetuate harm to ourselves or others.

Understanding the psychology of compliance is the first step toward reclaiming our autonomy. By recognizing the forces that drive us to say yes when we mean no, we can begin to create relationships, workplaces, and communities that value authenticity over unexamined obedience. Every decision we make—whether to comply or to defy—shapes the world we live in. When we align our actions with our values, we don’t just change our own lives, we create a culture where integrity and respect thrive. Compliance may be our default, but it doesn’t have to be our destiny.

https://time.com/7204326/saying-yes-instead-of-no-essay/


https://www.nytimes.com/2016/07/10/opinion/sunday/the-paradox-of-disclosure.html

Opinion

Gray Matter
The Paradox of Disclosure

By Sunita Sah

July 8, 2016

A POPULAR remedy for a conflict of interest is disclosure — informing the buyer (or the patient, etc.) of the potential bias of the seller (or the doctor, etc.). Disclosure is supposed to act as a warning, alerting consumers to their adviser’s stake in the matter so they can process the advice accordingly.

But as several recent studies I conducted show, there is an underappreciated problem with disclosure: It often has the opposite of its intended effect, not only increasing bias in advisers but also making advisees more likely to follow biased advice.

When I worked as a physician, I witnessed how bias could arise from numerous sources: gifts or sponsorships from the pharmaceutical industry; compensation for performing particular procedures; viewing our own specialties as delivering more effective treatments than others’ specialties. Although most physicians, myself included, tend to believe that we are invulnerable to bias, thus making disclosures unnecessary, regulators insist on them, assuming that they work effectively.

To some extent, they do work. Disclosing a conflict of interest — for example, a financial adviser’s commission or a physician’s referral fee for enrolling patients into clinical trials — often reduces trust in the advice.

But my research has found that people are still more likely to follow this advice because the disclosure creates increased pressure to follow the adviser’s recommendation. It turns out that people don’t want to signal distrust to their adviser or insinuate that the adviser is biased, and they also feel pressure to help satisfy their adviser’s self-interest. Instead of functioning as a warning, disclosure can become a burden on advisees, increasing pressure to take advice they now trust less.

Disclosure can also cause perverse effects even when biases are unavoidable. For example, surgeons are more likely to recommend surgery than non-surgeons. Radiation oncologists recommend radiation more than other physicians. This is known as specialty bias. Perhaps in an attempt to be transparent, some doctors spontaneously disclose their specialty bias. That is, surgeons may inform their patients that as surgeons, they are biased toward recommending surgery.

My latest research, published last month in the Proceedings of the National Academy of Sciences, reveals that patients with localized prostate cancer (a condition that has multiple effective treatment options) who heard their surgeon disclose his or her specialty bias were nearly three times more likely to have surgery than those patients who did not hear their surgeon reveal such a bias. Rather than discounting the surgeon’s recommendation, patients reported increased trust in physicians who disclosed their specialty bias.

Remarkably, I found that surgeons who disclosed their bias also behaved differently. They were more biased, not less. These surgeons gave stronger recommendations to have surgery, perhaps in an attempt to overcome any potential discounting they feared their patient would make on the recommendation as a result of the disclosure.

Surgeons also gave stronger recommendations to have surgery if they discussed the opportunity for the patient to meet with a radiation oncologist. This aligns with my previous research from randomized experiments, which showed that primary advisers gave more biased advice and felt it was more ethical to do so when they knew that their advisee might seek a second opinion.

To be sure, physicians who disclose a financial conflict of interest or a specialty bias do not necessarily give poor advice. Often physicians make great efforts to inform patients of facts relevant to their decision. It would be damaging if patients became distrustful of all expert advice. But the truth remains that it is often difficult to judge the quality of advice.

What can be done? When bias is unavoidable, as with specialty bias, options such as patient educational materials could alert patients to this problem without hearing it directly from the physician. Another solution could be multidisciplinary treatment consultations, in which patients meet multiple specialists at the same time. Another remedy is to incorporate mandatory “cooling off” periods for important decisions; this could reduce some pressure advisees feel to follow their advisers’ recommendations.

One situation in which conflict of interest disclosure can work well is when the conflict itself can be avoided. For example, in cases involving gifts, bonuses or commissions, which can be readily rejected, my research has shown that disclosure requirements may encourage advisers to reject the conflict so they can disclose the absence of any conflicts.

Bias disclosure can have a profound effect on both advisees and advisers. Consumers should be aware of their reactions to disclosure and take time out to reconsider their options and seek second opinions. And advisers and policy makers must understand the potential unintended consequences when using disclosure as a solution to manage bias.

Sunita Sah is an assistant professor of management and organizations at the Johnson Graduate School of Management at Cornell University.

https://www.nytimes.com/2016/07/10/opinion/sunday/the-paradox-of-disclosure.html


source: 
       https://www.kqed.org/radio/schedule
       Forum  
       10:00 am – 11:00 am
       Dr Sunita Sah on How to Say ‘No’
 
