This year brought questions about what “doing the right thing” actually means, whether “being nice” crowds out accountability and how to balance following procurement rules with compassion for struggling contractors. Ask an Ethicist columnist Vera Cherepanova examines the themes that emerged, from managing politically polarized teams and navigating the ethics of greenhushing to wrestling with AI-driven healthcare denials and candidates using technology to game hiring assessments.
Dear readers, thank you for being with “Ask an Ethicist” in 2025. As the year draws to a close, let’s look back at what kept you up at night and pushed you to raise ethical questions. Without too much pretension, I do think we can say this column has been an unscientific “ethical barometer” for the mood in our profession, reflecting what was happening in the wider world and how E&C professionals tried to adapt.
We opened the year, the day after Donald Trump formally began his second term as president of the US, with a question from a manager tasked with holding together a politically polarized team and still getting work done. I offered a reminder that colleagues, and people in general, are more than their political identities. Supporting a divisive political figure doesn’t necessarily mean someone shares all their flaws or vices. People vote or support ideas for many reasons: the policies they hope will come to pass, a belief in lesser evils or even a single issue they prioritize above all else. They may be wrong in their judgment, but misjudgment is a universal trait and can be found across the political spectrum. I drew on moral philosophy to suggest that it’s not only acceptable to maintain relationships with people who hold different views, but it can be an opportunity to practice and embody our ethical principles. That one feels as relevant as ever.
We returned to political polarization later in the year, this time through the lens of social media and freedom of expression at work. The collision of free speech, company reputation and political pressure was one of the hardest leadership tests of 2025. A public-company director asked whether an employee should be fired over a viral post about a polarizing event. We can fire someone for a problematic post, but should we? When almost anything can be read as politically incendiary, the temptation to move quickly and “get ahead of the story” is strong. Yet often the driving force isn’t a calm defense of values but something less noble: hedging against personal liability, for example. “This is what we stand for” gets repeated a lot in these moments; fewer companies pause to ask whether that’s actually true. Do your values genuinely inform your decisions, or do they show up only when convenient? What a great question.
As political winds shifted and executive orders came in, I received a timely dilemma about the ethics of greenhushing. The tables had turned: Instead of loudly advertising (and sometimes exaggerating) their sustainability credentials, companies were now considering “strategic silence.” Many opted to fly under the radar and tweak language, “diversity” becoming “respect for people,” and the like. It didn’t always work. Some were dragged into culture-war moments anyway and forced to take a stance. Internally, employees saw through the rebranding and were unimpressed, creating a governance problem inside the organization as well as outside it. One of those frustrated insiders wrote in, and I summed it up this way: “Companies that genuinely prioritize ESG must find ways to align business survival with ethical responsibility, not choose one over the other.” I still wish that line had traveled further.
We looked at the classics, too.
One reader wondered what “doing the right thing” actually meant, and rightly so. Putting this slogan on compliance swag has become a common practice, but does it offer any real guidance in situations where ethical choices are not easy? I centered my answer on the concept of judgment, something that came up a lot this year in connection with AI, too: “An ethical person doesn’t blindly follow rules; they think carefully about the consequences of their actions, how those actions align with their values and whether they’re treating others with fairness and respect. They ask how their choices reflect who they are or aspire to be.” Ethics isn’t math or accounting, but did they mention it at compliance training at all?
Another classic came from a reader not fully convinced by the uber-positivity at their workplace and worried that “being nice” was crowding out accountability. I invited philosopher Brennan Jacoby into the column, and he masterfully separated good ethics from good vibes, being supportive from being agreeable and shaming from holding people to account. His advice was to help the team rethink what ethics and accountability mean so that everyone is working from a shared, accurate understanding. I added a line from former Red Hat CEO Jim Whitehurst that felt painfully apt: “Cultures that are terminally nice are so nice that you never have the hard conversations, and you never make the hard changes until you go out of business.”
As the year went on, affordability and inequality came to the forefront. A procurement manager wrote about a small contractor asking to be paid in cash, presumably to save on taxes. Following the rules felt right; being compassionate felt human. I loved this question because it was so relatable and because it highlighted the tension between compliance and ethics, terms that are often treated as synonyms but are not the same thing. I wrote: “I understand your frustration with inequality; seeing individuals struggle while the wealthy seem to play by different rules can make ‘a little off the books’ feel harmless, even justified. But if the system is unfair, the right remedy is democratic reform, not individual acts of noncompliance.” Exactly.
And then, of course, there was AI. Everywhere, all the time.
Readers asked about the tension between rapid innovation and regulatory oversight. Connie the Puppet (yes, really), our compliance conscience-in-residence, weighed in as a living reminder that governance and experimentation can coexist when organizations genuinely try.
Readers also asked about the blurring line between “cheating” and “leveraging tech.” One dilemma centered on candidates using AI to pass technical interviews, but the question went much deeper: Should dishonesty be reframed as optimization, or ethics rebranded as inefficiency? I didn’t solve the generational reckoning in a single column, but I did challenge hiring practices: If AI can help a candidate easily game your assessments, maybe the problem isn’t the candidates. Maybe it’s time to rethink what you’re actually testing for.
Smart students at the International Business Ethics Case Competition, where I serve as a judge, brought another AI dilemma to life, this time at the intersection of ethics, business strategy and vulnerable patients: AI-driven denials in healthcare. With healthcare (and education) attracting intense private equity interest, the prospect of business rationale overpowering care decisions isn’t theoretical. We explored how to prevent ethics from being automated out of healthcare, landing on a core idea: Humans must stay in the loop where the stakes are irreversibly human.
Finally, as the pressure to “use AI everywhere” intensified, with some executives going so far as to threaten to fire employees who don’t adopt it fast enough, a reader asked the right question: How much AI is too much? To help answer it, I turned to Garrett Pendergraft, whose insight was wonderfully clear: Aim for better outcomes, not just “more AI,” and stay in charge. “Human judgment is crucial,” he said, “because human judgment means figuring out which parts of your work life, social life and family life need to have that human nuance and which ones can be outsourced.” Yes to that, and let’s carry it into 2026.
Let me end with a thank-you. To everyone who sent questions, comments, praise, critiques or random thoughts: I read and appreciated all of it, and I always responded. This column exists because of you, your real dilemmas, your curiosity and your willingness to wrestle with hard questions. Please keep being the thoughtful, demanding audience you already are.
Warmest wishes for the holidays, and see you in 2026.
Have a response? Share your feedback on what I got right (or wrong). Send me your comments or questions.