0:00:15.595,0:00:17.898 Phony philanthropist, 0:00:17.898,0:00:20.138 humanitarian hypocrite, 0:00:20.478,0:00:22.432 deceptive do-gooder, 0:00:22.802,0:00:24.917 fraudulent altruist, 0:00:24.937,0:00:27.440 charitable pretender - 0:00:28.210,0:00:31.489 however you describe them,[br]one thing's for sure: 0:00:31.489,0:00:34.852 There are few things in life[br]that we hate more 0:00:34.852,0:00:37.787 than moralizing hypocrites, 0:00:37.787,0:00:42.965 people who ask us to do charitable acts[br]but are themselves hypocritical. 0:00:42.965,0:00:44.723 Now, in my line of work, 0:00:44.723,0:00:45.856 working with charities, 0:00:45.856,0:00:47.688 social enterprises, 0:00:47.688,0:00:49.847 foundations, and aid agencies, 0:00:49.867,0:00:53.667 I hear the word "hypocrite" all the time. 0:00:54.137,0:00:55.701 When Bono, 0:00:55.701,0:01:01.250 the sunglass-wearing, tax-avoiding,[br]mansion-living, jet-setting Irishman, 0:01:01.700,0:01:04.383 when he asks people to donate to charity, 0:01:04.383,0:01:05.393 what do we say? 0:01:05.393,0:01:07.100 We say, "Hypocrite!" 0:01:07.480,0:01:10.180 When Al Gore campaigns on climate change, 0:01:10.180,0:01:13.768 a man who for many years 0:01:13.768,0:01:17.375 has had a utility bill[br]more than 20 times that of the average household, 0:01:17.395,0:01:19.475 we say, "Hypocrite!" 0:01:20.015,0:01:22.845 When the CEO of the Kony campaign 0:01:22.845,0:01:24.993 was, on one hand,[br]asking us to donate money 0:01:24.993,0:01:26.565 and saying he cared 0:01:26.565,0:01:31.245 but, on the other hand, was taking home[br]a charity salary of $90,000, 0:01:31.245,0:01:34.361 we said - you guessed it - "Hypocrite!" 0:01:34.673,0:01:36.467 You see, we hate hypocrites. 0:01:36.467,0:01:39.182 We hate people who purport[br]to have certain beliefs 0:01:39.182,0:01:41.109 that we don't actually think they have 0:01:41.109,0:01:43.641 when their actions[br]don't reflect those beliefs. 
0:01:45.031,0:01:49.845 And I want to ask us,[br]"Should we call out people for hypocrisy? 0:01:49.845,0:01:53.980 People who we think are hypocrites,[br]should we give them that label?" 0:01:54.880,0:01:59.364 Now, I hate genuine hypocrisy[br]as much as the next person, 0:01:59.364,0:02:01.959 but I want to suggest -[br]and this is a big "but" - 0:02:01.959,0:02:05.425 I want to suggest[br]that calling out people for hypocrisy 0:02:05.425,0:02:07.525 is misguided at best, 0:02:07.525,0:02:10.552 downright dangerous at worst. 0:02:11.692,0:02:13.741 The key problem here 0:02:13.741,0:02:16.973 is that often when we accuse[br]people of hypocrisy, 0:02:16.973,0:02:18.949 it's not actually hypocrisy. 0:02:19.679,0:02:24.018 And there are a few common mistakes[br]that we make time and time again. 0:02:24.548,0:02:26.321 The first mistake that we make 0:02:26.321,0:02:29.824 is that we assume that[br]all charitable acts are equivalent. 0:02:30.564,0:02:34.690 Say someone tells you[br]that they support a carbon trading scheme. 0:02:34.700,0:02:39.245 We interpret that as meaning that person[br]supports the environment, 0:02:39.245,0:02:42.329 and so if they don't recycle,[br]we say, "Hypocrite!" 0:02:42.679,0:02:47.996 If someone asks you for money[br]for water purification tablets 0:02:48.016,0:02:49.992 for a country like Myanmar, 0:02:50.012,0:02:53.847 we assume, oh, that person[br]supports fresh clean water, 0:02:53.847,0:02:57.346 and so if they don't themselves[br]give money to build water wells, 0:02:57.346,0:02:59.203 we say, "Hypocrite!" 0:02:59.543,0:03:00.544 But the reality 0:03:00.544,0:03:03.944 is that there are multiple different ways[br]of solving every problem, 0:03:03.944,0:03:06.144 some that are far more[br]effective than others, 0:03:06.144,0:03:09.394 and just because you support[br]some approaches to problems 0:03:09.394,0:03:14.439 doesn't mean you can or should or will[br]support every approach. 
0:03:14.439,0:03:16.645 That's the first mistake we commonly make. 0:03:17.045,0:03:19.698 The second problem[br]that we often come across, 0:03:19.698,0:03:21.360 the second mistake that we make 0:03:21.360,0:03:27.096 is that we compare to the extremes[br]of selflessness and selfishness. 0:03:27.556,0:03:29.680 Say you walk into a café, 0:03:29.680,0:03:34.897 and there's a sign on the wall that says,[br]"We donate 20% of our profits to charity." 0:03:34.897,0:03:38.270 You'd probably think,[br]"What a great café! What good people! 0:03:38.270,0:03:40.482 Donating a bit[br]of their profits to charity." 0:03:41.032,0:03:45.072 And so when we have mostly profit-making, 0:03:45.072,0:03:47.805 mostly selfishness but a bit of altruism, 0:03:47.805,0:03:50.029 we like it, we think of it[br]as a good thing. 0:03:50.733,0:03:53.023 But then if someone works for a charity, 0:03:53.023,0:03:56.404 if someone dedicates[br]their entire career to a good cause, 0:03:56.404,0:03:59.254 if someone is mostly selfless 0:03:59.254,0:04:02.036 but then takes home[br]a reasonably decent salary, 0:04:02.036,0:04:04.306 we say, "Hah, hypocrite!" 0:04:04.666,0:04:08.868 So we're fine with mostly selfish[br]with a touch of altruism 0:04:08.868,0:04:12.677 but not mostly altruistic[br]with a touch of selfishness. 0:04:12.937,0:04:14.653 You can be 10% altruistic, 0:04:14.653,0:04:16.483 but you can't be 90%, 0:04:16.483,0:04:18.823 which doesn't make any sense. 0:04:18.823,0:04:23.567 We prefer honest greed[br]to imperfect generosity. 0:04:23.567,0:04:25.564 We compare to the extremes 0:04:25.564,0:04:28.697 rather than comparing people[br]to other people. 0:04:28.907,0:04:30.499 That's the second mistake. 0:04:30.739,0:04:32.511 The third mistake we make 0:04:32.511,0:04:33.525 is that we assume 0:04:33.525,0:04:37.912 that because someone supports[br]a collective response to something, 0:04:37.912,0:04:39.934 individual action must follow. 
0:04:39.934,0:04:41.363 And so if a politician says 0:04:41.363,0:04:44.371 that they support[br]government-provided education, 0:04:44.371,0:04:47.047 but they send their kids[br]to private independent schools, 0:04:47.047,0:04:49.199 we say, "Hypocrite." 0:04:49.199,0:04:53.479 If someone was to say they supported[br]a global ban on meat consumption, 0:04:53.479,0:04:55.993 and yet they themselves ate meat, 0:04:55.993,0:04:58.341 we might say, "Hypocrite." 0:04:58.711,0:05:02.125 But the reality is it's totally rational 0:05:02.715,0:05:06.465 often to support a collective response 0:05:06.835,0:05:10.287 without necessarily wanting[br]to be the one to act alone, 0:05:10.287,0:05:12.551 to act individually, to bear the cost. 0:05:12.551,0:05:14.355 It's very rational. 0:05:14.355,0:05:17.326 For example, if you act in a certain way, 0:05:17.326,0:05:19.739 such as by taking really short showers 0:05:19.739,0:05:23.573 or taking the train instead of a plane[br]to save on carbon emissions, 0:05:23.573,0:05:26.800 you bear the full cost of your action, 0:05:26.800,0:05:30.172 and yet the benefits[br]are dispersed among seven billion people. 0:05:30.466,0:05:33.767 And so in order for it to be rational[br]for you to do that, 0:05:33.767,0:05:37.749 the benefits really need[br]to be seven billion times the cost, 0:05:37.749,0:05:39.603 which is rarely going to be the case. 0:05:39.603,0:05:44.348 That's why initiatives such as Earth Hour[br]often don't have a sustained impact. 0:05:44.688,0:05:47.703 It's not hypocritical to be rational. 0:05:48.803,0:05:50.853 The fourth mistake that we often make 0:05:50.853,0:05:53.228 is that we assume[br]that if someone really cares, 0:05:53.228,0:05:55.715 if someone really wants the best outcome, 0:05:55.715,0:05:59.043 they'll necessarily support[br]the ideal policy. 
0:05:59.513,0:06:02.270 So when Kevin Rudd,[br]the former Prime Minister of Australia, 0:06:02.270,0:06:06.179 said climate change is the greatest[br]moral challenge of our time, 0:06:06.189,0:06:10.258 and then he supported[br]watered-down environmental legislation, 0:06:10.258,0:06:11.958 we said, "Hypocrite." 0:06:12.548,0:06:15.720 But the reality is sometimes[br]you need to be strategic. 0:06:16.120,0:06:19.410 And if that ideal policy,[br]if the ideal situation 0:06:19.410,0:06:21.919 would not receive parliamentary support, 0:06:21.919,0:06:26.594 if that would be scrapped[br]by the next Parliament in a year or two, 0:06:26.594,0:06:29.670 then sometimes opting[br]for the second-best approach 0:06:29.670,0:06:32.047 is actually more sustainable[br]and actually better 0:06:32.057,0:06:34.132 and actually has a greater impact. 0:06:34.452,0:06:39.138 Another common mistake we make[br]is that we conflate legality and morality. 0:06:39.488,0:06:43.518 If someone was to stand up[br]and say they opposed prostitution, 0:06:43.518,0:06:46.053 they thought prostitution was wrong, 0:06:46.053,0:06:48.560 and yet then they voted[br]for it to be legal, 0:06:48.560,0:06:50.582 we might say, "Hypocrite." 0:06:50.582,0:06:53.954 But questions of legality and morality[br]are very different. 0:06:55.024,0:06:58.963 You see, if making prostitution legal 0:06:58.963,0:07:01.846 meant that victims of abuse[br]could come forward 0:07:01.846,0:07:05.282 without fear of persecution[br]or prosecution, 0:07:05.282,0:07:07.378 then it might be the right thing to do, 0:07:07.378,0:07:11.561 irrespective of whether you thought[br]it was morally right or wrong. 0:07:11.561,0:07:15.343 Likewise, it's entirely consistent[br]for someone to say 0:07:15.343,0:07:19.100 that they themselves,[br]say for religious reasons, 0:07:19.360,0:07:21.758 don't believe in gay marriage, 0:07:21.758,0:07:25.107 but for that same person to say[br]they think it should be legal. 
0:07:25.477,0:07:28.090 Because questions of legality[br]also take into account 0:07:28.090,0:07:32.804 other people's beliefs and opinions[br]and sexual preferences. 0:07:33.184,0:07:35.827 We shouldn't conflate[br]legality and morality. 0:07:36.587,0:07:38.497 And the final mistake that we often make 0:07:38.497,0:07:41.926 is we just don't distinguish[br]between different circumstances. 0:07:41.926,0:07:44.174 When Obama came out and said 0:07:44.174,0:07:49.525 that having armed security in every school[br]wasn't the answer to gun violence, 0:07:49.525,0:07:52.676 the NRA responded,[br]not by attacking the argument, 0:07:52.676,0:07:54.353 but by attacking the person. 0:07:54.353,0:07:57.523 They ran ad campaigns,[br]saying that Obama was a hypocrite 0:07:57.523,0:08:00.483 because he had armed security[br]for his daughters. 0:08:00.903,0:08:04.716 We often don't distinguish[br]different circumstances. 0:08:04.736,0:08:07.762 My point here is that often[br]when we accuse people of hypocrisy, 0:08:07.782,0:08:10.203 it's simply not hypocritical. 0:08:10.203,0:08:12.235 We assume that we know people's beliefs, 0:08:12.235,0:08:15.822 we assume we know why people[br]are acting in a certain way, 0:08:15.822,0:08:18.434 but it's often arrogant to assume that. 0:08:18.434,0:08:21.863 We're too quick to condemn,[br]too slow to ask why. 0:08:22.503,0:08:25.498 But let's assume for a moment[br]that it was hypocritical, 0:08:25.508,0:08:29.664 that these people did actually act[br]in a hypocritical manner. 0:08:30.194,0:08:34.647 The problem here is that[br]the existence of hypocrisy 0:08:34.647,0:08:38.645 doesn't actually undermine[br]the argument that is being made. 0:08:38.645,0:08:41.924 It's a convenient distraction,[br]but it's not a rebuttal. 0:08:41.924,0:08:44.640 I mean the argument that smoking is bad 0:08:45.030,0:08:49.429 doesn't change because the person[br]who is making it is a smoker. 
0:08:49.729,0:08:54.275 You can know right or wrong[br]without being morally perfect yourself. 0:08:54.275,0:08:57.574 And you should be able to ask people[br]to do what is right. 0:08:57.574,0:09:00.606 That shouldn't just be[br]the purview of the morally perfect. 0:09:02.056,0:09:05.772 And so, if we shouldn't[br]call out people for hypocrisy, 0:09:05.772,0:09:09.903 if we shouldn't focus[br]on the charitable messenger, 0:09:09.903,0:09:11.447 what should we do? 0:09:11.747,0:09:12.753 I want to say 0:09:12.753,0:09:17.132 that we should discuss and debate[br]and critique the charitable message. 0:09:17.702,0:09:19.778 Now, with me I have two jugs. 0:09:20.128,0:09:21.875 One of those represents the person, 0:09:21.875,0:09:25.264 the messenger in question, 0:09:25.264,0:09:28.141 and the other one represents[br]the argument, the message. 0:09:28.141,0:09:30.866 Now, when we call out people[br]for hypocrisy, 0:09:30.866,0:09:33.681 when we use that hypocrisy argument, 0:09:33.691,0:09:36.169 when we use it to attack a person, 0:09:36.179,0:09:37.970 this is what happens. 0:09:39.130,0:09:40.727 It's easy to make them bleed. 0:09:40.727,0:09:42.646 It's easy to inflict pain. 0:09:43.136,0:09:45.632 After all, they're a fallible person. 0:09:47.042,0:09:48.175 But what's interesting 0:09:48.175,0:09:50.298 is that we don't discuss,[br]we don't critique, 0:09:50.298,0:09:53.252 we don't criticize the charitable message. 0:09:55.862,0:09:57.246 And so that's the status quo, 0:09:57.246,0:09:59.552 that's the situation we find ourselves in, 0:09:59.552,0:10:03.371 where attacking the charitable[br]messenger is all too easy 0:10:03.391,0:10:07.240 and attacking the charitable[br]message is often taboo. 0:10:08.480,0:10:09.623 Why is this? 0:10:09.623,0:10:13.852 Well, I think we often think of charity[br]as somewhat of a taboo subject. 0:10:13.872,0:10:15.734 We don't like criticizing it. 
0:10:15.744,0:10:18.904 Indeed, we just think of it as doing good. 0:10:18.914,0:10:21.810 That's why you can do a lot of things[br]in the name of charity. 0:10:21.810,0:10:23.616 (Laughter) 0:10:24.396,0:10:27.565 If you want an excuse[br]to do a naked calendar, 0:10:27.565,0:10:29.420 do it in the name of charity. 0:10:29.750,0:10:32.988 If you want an excuse to do a marathon,[br]do it in the name of charity. 0:10:32.988,0:10:35.418 If you want an excuse 0:10:35.418,0:10:36.846 to force three of your friends 0:10:36.846,0:10:39.278 to pour a bucket of ice-cold water[br]over their heads, 0:10:39.278,0:10:40.275 (Laughter) 0:10:40.275,0:10:42.135 do it in the name of charity. 0:10:42.135,0:10:45.779 You see, we find it difficult[br]to criticize acts of charity. 0:10:45.779,0:10:48.091 We think of charity as one and the same, 0:10:48.111,0:10:50.834 but not all charities are created equal, 0:10:50.844,0:10:54.817 not all approaches to problems[br]are equally effective. 0:10:55.437,0:10:57.583 One of the things[br]that the organization I run, 0:10:57.583,0:10:58.742 180 Degrees Consulting, 0:10:58.742,0:11:00.114 specializes in 0:11:00.114,0:11:01.585 is measuring the social impact 0:11:01.585,0:11:04.851 of different programs[br]and different organizations, 0:11:04.851,0:11:06.206 and it's very clear to me 0:11:06.206,0:11:08.522 that some approaches, some charities, 0:11:08.522,0:11:13.368 are hundreds, even thousands, of times[br]more effective than other approaches. 0:11:13.368,0:11:14.572 And so what that means 0:11:14.572,0:11:17.857 is that it's more important[br]to do the right act, 0:11:17.857,0:11:19.341 the most effective act, 0:11:19.341,0:11:22.006 than to merely do an action. 0:11:22.591,0:11:24.737 An action is merely a means to an end. 0:11:24.747,0:11:27.794 We focus on it when we accuse[br]people of hypocrisy, 0:11:27.794,0:11:31.656 but focusing on the impact[br]is far more important. 
0:11:32.436,0:11:34.011 It's far more important 0:11:34.011,0:11:37.322 because in a world with unlimited problems 0:11:37.332,0:11:40.625 but limited time, limited resources,[br]and limited money, 0:11:40.635,0:11:45.975 we can't afford to not have[br]the greatest social impact possible. 0:11:45.975,0:11:47.474 We can't afford it. 0:11:47.474,0:11:52.642 We can't afford for doing good[br]to merely be a feel-good endeavor. 0:11:52.662,0:11:55.948 It must be an intellectual[br]endeavor as well. 0:11:56.308,0:11:58.215 Let me give you one example. 0:11:58.645,0:12:01.134 Say you have $42,000, 0:12:01.154,0:12:04.367 and you want to spend that money[br]helping blind people. 0:12:04.387,0:12:06.779 You can spend that money[br]in a few different ways. 0:12:07.099,0:12:10.125 One way is by not giving it at all. 0:12:10.135,0:12:14.440 The second way is by using the money[br]to train a guide dog. 0:12:14.450,0:12:17.326 It costs about $42,000[br]to train a guide dog. 0:12:17.336,0:12:18.640 And the third option 0:12:18.650,0:12:21.749 is that you can use it[br]to fund a low-cost eye surgery 0:12:21.759,0:12:22.812 in a place like India, 0:12:22.832,0:12:26.308 which costs about $75 per surgery. 0:12:26.328,0:12:28.985 And so with that $42,000, 0:12:28.995,0:12:31.429 you can either help no blind people, 0:12:31.699,0:12:33.306 one blind person, 0:12:33.316,0:12:35.902 or 560 blind people. 0:12:36.322,0:12:39.030 I do not think it should be taboo 0:12:39.290,0:12:44.233 to argue that you should not give money[br]to train a guide dog, 0:12:44.253,0:12:45.771 as cute as guide dogs are 0:12:45.791,0:12:49.222 and as important as guide dogs are[br]for the people who use them, 0:12:49.422,0:12:53.834 and that you should instead[br]give money for the low-cost eye surgery. 0:12:54.584,0:12:56.684 I know that sounds bad. 0:12:56.944,0:12:58.875 It sounds unethical. 0:12:59.455,0:13:01.313 It almost sounds evil. 
0:13:01.753,0:13:05.634 Once we've done the effective approaches,[br]we can do the less effective approaches, 0:13:05.654,0:13:07.877 but I don't think[br]less effective approaches 0:13:07.877,0:13:11.621 should come at the expense[br]of the more effective approaches. 0:13:12.371,0:13:15.245 Because as long as it is taboo 0:13:15.258,0:13:18.579 for us to talk about the impacts[br]of different charitable acts, 0:13:18.869,0:13:20.223 more people will be blind, 0:13:20.251,0:13:21.721 more people will be poor, 0:13:21.751,0:13:25.432 more people won't have access[br]to health, education, and sanitation, 0:13:25.432,0:13:28.191 and that is something I cannot stand for. 0:13:28.921,0:13:31.115 I want us to have[br]the greatest impact possible, 0:13:31.115,0:13:33.377 and I don't think[br]we'd have that greatest impact 0:13:33.397,0:13:37.097 by focusing on hypocrisy[br]or focusing on the messenger. 0:13:37.097,0:13:39.878 We have it by focusing[br]on the charitable message. 0:13:39.878,0:13:42.088 That's the most important thing. 0:13:42.708,0:13:44.141 Let me conclude. 0:13:45.401,0:13:48.741 Time and time again, when we can, 0:13:48.761,0:13:50.923 we target the messenger, not the message; 0:13:50.943,0:13:53.213 the campaigner, not the campaign; 0:13:53.243,0:13:54.872 the person, not the argument. 0:13:55.272,0:13:58.704 The exact opposite should be true. 0:13:59.764,0:14:02.123 The key point that I'm trying to make here 0:14:02.123,0:14:05.043 is that charitable messengers[br]should not be the target, 0:14:05.073,0:14:09.244 and critiquing charitable messages[br]should no longer be taboo. 0:14:10.024,0:14:14.794 Small minds rebut people;[br]great minds rebut arguments. 0:14:14.814,0:14:17.109 I think Eleanor Roosevelt would agree. 
0:14:17.529,0:14:18.552 So the next time 0:14:18.552,0:14:23.881 that a politician, a celebrity, a friend,[br]a religious leader, a charity worker 0:14:23.891,0:14:26.387 asks you to do something[br]that you don't want to do, 0:14:27.667,0:14:31.042 I want you to respond[br]by rebutting the message, 0:14:31.042,0:14:32.622 not the messenger. 0:14:32.632,0:14:35.759 The next time that a friend[br]calls out someone for hypocrisy, 0:14:35.769,0:14:39.767 I want you to tell them,[br]"Rebut the message, not the messenger." 0:14:40.477,0:14:43.316 By focusing on the hypocrisy[br]of the messenger, 0:14:43.316,0:14:44.858 we're being misguided, 0:14:45.078,0:14:47.782 but by focusing on[br]the validity of the message, 0:14:47.792,0:14:49.181 we're being productive, 0:14:49.201,0:14:51.957 we're helping to maximize impact. 0:14:51.977,0:14:53.927 And that is a cause worth fighting for. 0:14:53.927,0:14:55.077 Thank you. 0:14:55.077,0:14:58.069 (Applause)[br]