0:00:00.130,0:00:02.200
♪ [music] ♪
0:00:03.645,0:00:05.600
- [Narrator] Welcome[br]to Nobel Conversations.
0:00:07.250,0:00:10.240
In this episode, Josh Angrist [br]and Guido Imbens,
0:00:10.240,0:00:11.920
sit down with Isaiah Andrews
0:00:11.920,0:00:14.303
to discuss how their research[br]was initially received
0:00:14.999,0:00:17.090
and how they responded [br]to criticism.
0:00:18.700,0:00:20.380
- [Isaiah] At the time, [br]did you feel like
0:00:20.380,0:00:21.627
you were on to something,
0:00:21.627,0:00:25.152
that this was the beginning[br]of a whole line of work
0:00:25.152,0:00:27.202
that was going [br]to be important, or...?
0:00:27.600,0:00:30.000
- [Guido] Not so much[br]that it was a whole line of work,
0:00:30.000,0:00:31.894
but certainly I felt like, [br]"Wow, this--"
0:00:32.277,0:00:35.045
- [Josh] We've proved something [br]we didn't know before,
0:00:35.045,0:00:36.114
that was worth knowing.
0:00:36.114,0:00:38.033
- Yeah, going back to the...
0:00:38.741,0:00:41.080
compared to my job market[br]paper or something--
0:00:41.080,0:00:45.560
No, I felt this was actually [br]a very clear, crisp result.
0:00:46.400,0:00:49.530
- But there was definitely [br]a mixed reception
0:00:49.530,0:00:52.420
and I don't think anybody [br]said,
0:00:52.420,0:00:55.487
"Oh, well, this is [br]already something
0:00:56.043,0:00:59.386
which is the nightmare scenario[br]for a researcher
0:01:00.230,0:01:02.003
where you think[br]you've discovered something
0:01:02.003,0:01:04.461
and then somebody else says, [br]"Oh, I knew that."
0:01:05.000,0:01:07.220
But there definitely was [br]a need to convince people
0:01:07.220,0:01:10.370
that this was worth knowing,[br]that instrumental variables
0:01:10.370,0:01:12.687
estimates a causal effect [br]for compliers.
0:01:13.200,0:01:16.178
- Yeah, but even though[br]it took a long time
0:01:16.178,0:01:19.348
to convince a bigger audience,
0:01:19.820,0:01:24.346
fairly quickly, the reception [br]was pretty good
0:01:24.800,0:01:26.645
among a small group of people.
0:01:27.200,0:01:31.297
Gary clearly liked it a lot [br]from the beginning,
0:01:31.800,0:01:33.289
and I remember...
0:01:33.289,0:01:35.645
because at that point[br]Josh had left for Israel,
0:01:35.645,0:01:38.886
but I remember explaining it [br]to Don Rubin,
0:01:39.696,0:01:43.700
and he was like, "You know, [br]this really is something here."
0:01:43.700,0:01:45.197
- Not right away though.
0:01:45.932,0:01:47.173
Don took some convincing.
0:01:47.500,0:01:49.150
By the time you got to Don,
0:01:49.150,0:01:51.226
there had been [br]some back and forth with him
0:01:51.226,0:01:53.304
in correspondence, actually.
0:01:53.700,0:01:57.103
- But I remember at some point [br]getting a call or email from him
0:01:57.103,0:02:00.020
saying that he was sitting [br]at the airport in Rome
0:02:00.020,0:02:03.700
and looking at the paper [br]and thinking,
0:02:03.700,0:02:07.000
"Yeah, no actually, [br]you guys are onto something."
0:02:07.490,0:02:08.594
- We were happy about that.
0:02:08.594,0:02:10.550
But that took longer [br]than I think you remember.
0:02:11.030,0:02:12.500
It wasn't right away.
0:02:12.500,0:02:13.700
[laughter]
0:02:13.700,0:02:15.325
Because I know[br]that I was back in Israel
0:02:15.325,0:02:16.627
by the time that happened.
0:02:16.627,0:02:18.750
I'd left for Israel [br]in the summer of--
0:02:19.390,0:02:21.190
I was only at Harvard [br]for two years.
0:02:21.190,0:02:22.540
We had that one year.
0:02:22.540,0:02:25.700
It is remarkable, I mean, that[br]one year was so fateful for us.
0:02:25.900,0:02:27.200
- [Guido] Yes.
0:02:27.690,0:02:30.200
I think we understood there was[br]something good happening,
0:02:30.200,0:02:33.700
but maybe we didn't think it was[br]life-changing, only in retrospect.
0:02:33.700,0:02:35.620
♪ [music] ♪
0:02:35.620,0:02:37.495
- [Isaiah] As you said, it sounds[br]like a small group of people
0:02:37.495,0:02:39.190
were initially quite receptive,
0:02:39.190,0:02:42.190
perhaps it took some time[br]for a broader group of people
0:02:43.090,0:02:45.912
to come around to seeing [br]the LATE framework
0:02:45.912,0:02:47.620
as a valuable way to look [br]at the world.
0:02:47.620,0:02:50.100
I guess, over[br]the course of that,
0:02:50.100,0:02:52.128
were there periods [br]where you thought,
0:02:52.128,0:02:54.450
maybe the people saying [br]this wasn't a useful way
0:02:54.450,0:02:55.751
to look at the world were right?
0:02:55.751,0:02:58.360
Did you get discouraged? [br]How did you think about it?
0:02:58.360,0:02:59.755
- I don't think I was discouraged,
0:02:59.755,0:03:02.271
but the people who were saying[br]that were smart people,
0:03:02.784,0:03:06.117
well-informed econometricians,
0:03:06.117,0:03:07.800
sophisticated readers,
0:03:08.900,0:03:11.324
and I think the substance[br]of the comment
0:03:11.324,0:03:14.297
was, "This is not what [br]econometrics is about."
0:03:14.297,0:03:20.572
Econometrics, as it was being [br]transmitted at that time, was about structure.
0:03:21.300,0:03:24.490
There was this idea that[br]there's structure in the economy,
0:03:25.100,0:03:27.200
and it's our job to discover it,
0:03:27.200,0:03:30.952
and what makes it structure[br]is that it's essentially invariant.
0:03:32.570,0:03:34.900
And so we're saying, [br]in the LATE theorem,
0:03:34.900,0:03:37.699
that every instrument produces[br]its own causal effect,
0:03:37.699,0:03:41.386
which is in contradiction to that[br]to some extent,
0:03:41.386,0:03:43.640
and so that was [br]where the tension was.
0:03:43.640,0:03:45.551
People didn't want [br]to give up that idea.
0:03:46.300,0:03:50.369
- Yeah, I remember once [br]people started
0:03:51.200,0:03:55.664
arguing more vocally against that,
0:03:56.900,0:03:59.483
that never really [br]bothered me that much.
0:03:59.483,0:04:03.051
It seemed clear that [br]we had a result there,
0:04:03.051,0:04:05.878
and it became somewhat [br]controversial,
0:04:05.878,0:04:08.395
but controversial in a good way.
0:04:08.620,0:04:10.190
It was clear that people felt
0:04:10.820,0:04:13.835
they had to come out[br]against it because--
0:04:13.970,0:04:15.649
- Well, I think we think[br]it's good now.
0:04:17.426,0:04:19.238
We might not have loved it [br]at the time.
0:04:20.168,0:04:22.984
I remember being[br]somewhat more upset--
0:04:22.984,0:04:24.780
there was some dinner [br]where someone said,
0:04:24.780,0:04:27.455
"No, no, no, [br]that paper with Josh --
0:04:28.855,0:04:30.749
that was doing a disservice[br]to the profession."
0:04:32.050,0:04:33.850
- We definitely had [br]reactions like that.
0:04:35.410,0:04:38.200
- At some level, that may be [br]indicative of the culture
0:04:38.400,0:04:40.000
in general in economics [br]at the time.
0:04:41.400,0:04:44.097
I thought back later, [br]what if that happened now?
0:04:44.600,0:04:47.682
If I was a senior person[br]sitting in that conversation,
0:04:48.200,0:04:51.898
I would call that out because[br]it really was not appropriate--
0:04:53.000,0:04:54.200
- [Josh] It wasn't so bad.
0:04:54.600,0:04:56.600
I think the criticism is...
0:04:57.700,0:04:59.298
It wasn't completely misguided.
0:05:00.070,0:05:01.351
It was maybe wrong.
0:05:01.800,0:05:04.485
No, no, but you can say [br]that paper is wrong,
0:05:05.280,0:05:06.440
but saying that
0:05:06.440,0:05:08.128
it's a disservice [br]to the profession --
0:05:08.128,0:05:10.300
- that's not really--[br]- [Isaiah] It's a bit personal.
0:05:10.300,0:05:12.646
- Yes, and doing that not to me
0:05:12.646,0:05:14.442
but in front of [br]my senior colleagues.
0:05:15.191,0:05:17.369
- But nobody was saying [br]the result was wrong,
0:05:17.369,0:05:18.700
and I remember also,
0:05:18.700,0:05:21.579
some of the comments [br]were thought-provoking.
0:05:21.579,0:05:23.059
So we had some negative reviews,
0:05:23.059,0:05:25.861
I think, on the average [br]causal response paper.
0:05:26.500,0:05:30.361
Somebody said, "These compliers,[br]you can't figure out who they are."
0:05:31.967,0:05:33.891
It's one thing to say[br]you're estimating
0:05:33.891,0:05:35.678
the effect of treatment[br]on the treated
0:05:35.678,0:05:36.840
or something like that.
0:05:36.840,0:05:38.400
You can tell me who's treated --
0:05:38.700,0:05:42.289
people in the CPS --[br]you can't tell me who's a complier.
0:05:42.929,0:05:44.679
So that was a legitimate challenge.
0:05:44.679,0:05:47.800
- That's certainly fair,[br]and I can see why
0:05:49.880,0:05:53.502
that part made people[br]a little uneasy and uncomfortable.
0:05:54.300,0:05:56.400
But at the same time,
0:05:56.900,0:06:00.244
because it showed that you couldn't[br]really go beyond that,
0:06:00.800,0:06:03.775
it was a very useful thing [br]to realize.
0:06:04.630,0:06:06.180
I remember, on the day
0:06:06.500,0:06:09.771
we got to the key result, [br]I was thinking,
0:06:09.771,0:06:13.113
"Wow, this is as good as it gets.
0:06:14.221,0:06:16.978
Here we actually have an insight[br]but clearly--"
0:06:17.500,0:06:19.250
And we had to sell it [br]at some point.
0:06:19.480,0:06:21.261
For quite a few years, [br]we had to sell it
0:06:23.480,0:06:24.892
and it's proven to be quite useful.
0:06:25.500,0:06:28.761
I don't think we understood that[br]it would be so useful at the time.
0:06:28.761,0:06:29.871
No.
0:06:30.170,0:06:34.600
I did feel early on this was[br]a substantial insight.
0:06:34.600,0:06:36.440
- [Josh] Yeah, we [did] something.
0:06:36.440,0:06:40.041
But I did not think [br]goals were there.
0:06:40.700,0:06:42.600
I don't think we were aiming [br]for the Nobel.
0:06:42.600,0:06:43.730
[laughter]
0:06:43.730,0:06:46.243
We were very happy to get[br]that note in Econometrica.
0:06:46.859,0:06:48.829
♪ [music] ♪
0:06:49.770,0:06:51.500
- [Isaiah] Are there factors [br]or ways of approaching problems
0:06:51.500,0:06:54.186
that lead people to be better [br]at recognizing the good stuff
0:06:54.186,0:06:56.600
and taking the time to do it [br]as opposed to dismissing it?
0:06:56.600,0:06:57.830
- [Josh] Sometimes [br]I think it's helpful.
0:06:57.830,0:06:59.478
If you're trying to [br]convince somebody
0:06:59.478,0:07:01.247
that you have something [br]useful to say
0:07:01.900,0:07:04.176
and maybe they don't [br]speak your language,
0:07:04.894,0:07:06.541
you might need [br]to learn their language.
0:07:06.761,0:07:07.910
Yes, yes, exactly.
0:07:07.910,0:07:11.736
That's what we did with Don,[br]we figured out how to--
0:07:11.736,0:07:14.052
I remember we had a very hard time
0:07:14.052,0:07:15.816
explaining the exclusion restriction[br]to Don,
0:07:17.430,0:07:18.993
maybe rightfully so.
0:07:19.804,0:07:21.948
I think Guido and I [br]eventually figured out
0:07:21.948,0:07:24.420
that it wasn't formulated [br]very clearly,
0:07:25.400,0:07:27.450
and we came up [br]with a way to do that
0:07:27.450,0:07:29.316
in the potential outcomes framework
0:07:29.316,0:07:32.218
that I think worked[br]for the three of us.
0:07:32.218,0:07:33.419
- [Guido] Yeah.
0:07:33.419,0:07:35.454
Well, it worked for[br]the bigger literature
0:07:35.454,0:07:37.639
but I think what you're saying [br]there is exactly right,
0:07:37.639,0:07:40.860
you need to figure out [br]how to not just say,
0:07:40.860,0:07:43.894
"Okay well, I've got this language[br]and this this works great
0:07:43.894,0:07:45.900
and I've got to convince someone[br]else to use the language."
0:07:45.900,0:07:48.188
You could first figure out [br]what language they're using
0:07:48.680,0:07:51.028
and only then [br]can you try to say,
0:07:51.028,0:07:53.140
"Well, but here you thinking of it [br]this way,"
0:07:53.140,0:07:56.880
but that's actually [br]a pretty hard thing to do,
0:07:56.880,0:07:59.098
you get someone from [br]a different discipline,
0:07:59.098,0:08:02.300
convincing them that two junior faculty[br]in a different department
0:08:02.300,0:08:04.366
actually have something [br]to say to you,
0:08:04.596,0:08:06.516
that takes a fair amount of effort.
0:08:07.500,0:08:09.782
Yeah, I wrote Don [br]a number of times,
0:08:10.420,0:08:11.868
in fairly long letters.
0:08:11.868,0:08:13.805
I remember thinking [br]this was worth doing,
0:08:14.600,0:08:16.006
that if I could convince Don
0:08:16.780,0:08:19.444
that would validate the framework [br]to some extent.
0:08:20.300,0:08:22.924
I think both you and Don were
0:08:22.924,0:08:25.000
a little bit more confident [br]that you were right.
0:08:25.000,0:08:26.438
Well, we used to argue a lot
0:08:26.438,0:08:28.320
and you would sometimes [br]referee those.
0:08:28.320,0:08:29.500
[laughter]
0:08:29.800,0:08:30.800
That was fun.
0:08:32.760,0:08:34.125
It wasn't hurtful.
0:08:35.200,0:08:37.492
I remember it getting [br]a little testy once,
0:08:37.935,0:08:39.606
we had lunch in The Faculty Club
0:08:40.600,0:08:44.077
and we were talking about [br]the draft lottery paper.
0:08:44.930,0:08:47.430
We were talking about "never takes"
0:08:47.430,0:08:51.000
[as people who would not serve][br]in the military irrespective of
0:08:51.000,0:08:53.500
whether they were getting drafted
0:08:54.500,0:08:58.800
and you or Don said something[br]about shooting yourself in the foot,
0:08:58.800,0:08:59.800
[laughter]
0:08:59.800,0:09:01.530
as a way of getting [br]out of the military
0:09:01.530,0:09:03.230
and that maybe [br]the exclusion restriction
0:09:03.230,0:09:05.223
for never-takers wasn't working
0:09:06.300,0:09:08.520
and then the other one was going,
0:09:08.520,0:09:09.791
"Well, yes you could do that
0:09:09.791,0:09:12.008
but why would you want [br]to shoot yourself in the foot?"
0:09:12.008,0:09:13.225
[laughter]
0:09:13.225,0:09:15.400
It got a little [out of hand there]--
0:09:15.400,0:09:17.860
I usually go for moving to Canada[br]as my example,
0:09:18.690,0:09:20.096
when I'm teaching that.
0:09:20.096,0:09:21.365
[laughter]
0:09:22.030,0:09:23.575
But things are tricky,
0:09:24.860,0:09:26.595
I get students coming[br]from Computer Science
0:09:26.595,0:09:29.943
and they want to do things [br]on causal inference
0:09:30.566,0:09:33.460
and it takes a huge amount [br]of effort to figure out
0:09:33.460,0:09:35.230
how they're actually thinking [br]about the problem
0:09:35.230,0:09:37.000
and whether there's something there
0:09:37.000,0:09:38.310
and so, now over the years,
0:09:38.310,0:09:40.302
I've got a little more appreciation[br]for the fact
0:09:40.302,0:09:41.958
that Don was actually willing to--
0:09:42.630,0:09:46.000
It took him a while, [br]but he did engage first with Josh
0:09:46.400,0:09:47.500
and then with both of us
0:09:48.380,0:09:50.163
rather than dismissing it [br]and saying,
0:09:50.163,0:09:53.348
"Okay, well I can't figure out[br]what these guys are doing
0:09:53.348,0:09:56.435
and it's probably just [br]not really that interesting."
0:09:57.200,0:09:59.736
Everybody always wants [br]to figure things out quickly,
0:10:00.196,0:10:01.376
you want to save time
0:10:01.376,0:10:03.410
and you want to save [br]your brain cells
0:10:03.410,0:10:04.583
for other things.
0:10:05.000,0:10:07.000
The fastest route to that [br]is to figure out
0:10:07.000,0:10:08.460
why you should dismiss something.
0:10:08.460,0:10:09.560
Yes.
0:10:09.560,0:10:11.100
"I don't need to spend time on this."
0:10:11.100,0:10:12.498
♪ [music] ♪
0:10:12.498,0:10:13.880
- [Narrator] If you'd like [br]to watch more
0:10:13.880,0:10:15.822
Nobel Conversations, click here,
0:10:16.220,0:10:18.409
or if you'd like to learn[br]more about econometrics,
0:10:18.640,0:10:21.240
check out Josh's "Mastering[br]Econometrics" series.
0:10:21.800,0:10:24.540
If you'd like to learn more[br]about Guido, Josh, and Isaiah
0:10:24.860,0:10:26.502
check out the links [br]in the description.
0:10:26.992,0:10:28.307
♪ [music] ♪