Transcript

The video above was recorded at RailsConf 2019 in Minneapolis, Minnesota.

As engineers, we’ve spent years mastering the art of conducting technical interviews—or have we? Despite being on both sides of the table dozens of times, how often have we come away feeling that the interview didn’t work as well as it could have? How many of our interviews have been just plain bad? How much time do we spend designing and improving our own interview processes, and what signals should we be looking for when it comes to making those improvements?

This talk examines the technical interview in depth, developing a framework for interviewing candidates “where they are” by focusing on answering two major questions:

  1. How can we ensure our interview process identifies the people and skillsets we need to grow our teams?
  2. How can we interview candidates in an inclusive way that maximizes their ability to demonstrate their competencies?

This talk aims to equip you with a rich new set of tools you can immediately apply to the hiring process in your own company.

If you liked this talk, please share it! And if you know an organization that could benefit from additional developers who can help make the whole team better, we’d love to hear from you.

00:00
(upbeat music)
00:21
- So I think we'll get started.
00:23
And I'd like to start by telling you a story.
00:29
This story, I've anonymized, or rather left out
00:33
some names so as to protect companies
00:36
that have interesting interview practices.
00:40
But I do want to let you know ahead of time
00:45
this story, everything that happened in it is true,
00:47
but I've stitched together multiple interview loops
00:50
into one for the sake of the story.
00:52
So don't feel bad for me.
00:53
I did not have one super awful day
00:55
where everything went terribly wrong.
00:57
But these are all things that did happen to me
01:00
in the course of interviewing at small companies,
01:02
large companies, companies whose names
01:04
you know and read about in the news,
01:06
and then companies that maybe you don't.
01:10
So I drove to the company and I was very excited.
01:14
And I started my interview loop
01:17
by talking to the coordinator who was running it.
01:19
Everyone was super nice, friendly.
01:21
Restroom, water, coffee, all that.
01:24
And I go into the interview room and
01:26
my first interviewer comes in and introduces himself.
01:29
Pulls out a piece of paper
01:33
and starts asking me things like, "What is a pipe?"
01:37
Not how do they work, not why are they interesting,
01:39
not what are some Unix command-line utilities
01:42
that you'd think to use.
01:44
But literally what is that character right there,
01:48
(laughter) what does it do.
01:50
So I said, 'Okay.'
01:51
And I gave them the Wikipedia definition
01:54
and they said "Okay great."
01:56
And then they said "Let's move on."
02:00
"Does Ajax return things other than X amount?"
02:04
And I said 'Sometimes, I think.'
02:08
And I talked about JSON and I talked about
02:11
web standards and XMS script (chuckles),
02:13
XML script, XML script, anyway.
02:15
(audience laughs)
02:17
And they said "Okay, it sounds good."
02:20
And continued on in this way for 35 minutes
02:24
asking just, kind of, trivia questions
02:27
that ranged all over the place.
02:28
Back-end, front-end, command-line utilities, things like that.
02:32
And at the end they said, "Thanks for your time."
02:34
And they left.
02:36
That was interesting.
02:38
And then I had another person come in
02:41
and they introduced themselves and said,
02:43
"Let me ask you a problem."
02:44
And they asked me, you always kinda know, right?
02:46
When someone asks you a question
02:47
that's something like, "How would you validate
02:49
"that this data structure is correct?"
02:50
You know there's like a secret follow-up question
02:52
that they really wanna ask but
02:53
they want to ask the easy one first.
02:55
And then ask you something like,
02:56
"How would you generate a data structure?"
02:59
So I started doing all the things
03:00
you're supposed to do, you know?
03:01
Writing on the whiteboard,
03:02
saying, 'Let me make sure I understand the question.
03:05
'Let me make sure that I've got this right.'
03:09
And they interrupted me and said,
03:11
"This isn't hard, just write the code."
03:15
And I don't know what that would do to you,
03:17
but that is very effective at throwing
03:18
me off my game entirely.
03:20
So I then spent the next 35 minutes
03:23
mumbling and sweating and freaking out,
03:25
and producing a very poor answer
03:26
to not a very hard question.
03:28
Although I would ask things like,
03:30
'Does it seem reasonable, is this right?'
03:31
And they would say "No."
03:32
And after a lot of poking and prodding,
03:35
I would find out I had missed a semicolon,
03:36
or I had swapped two indices.
03:38
I's and J's look similar on the whiteboard.
03:43
So we didn't get to their favorite part
03:45
and it became abundantly clear that
03:46
they were very irritated that we
03:48
didn't get to their fun question.
03:49
But the nice thing about interviews
03:50
is they are eventually over.
03:52
(laughter)
03:53
And they thanked me for my time, took a picture
03:55
of the whiteboard and left.
03:57
And I felt pretty bad but it was okay
03:59
because now this was the lunch interview, right?
04:02
This is the one where you go and you have food
04:05
in the company cafeteria and somebody
04:06
tells you all about what it's like to work there,
04:09
the culture, other people, things like that.
04:12
And this person was very nice, a little bit older,
04:14
had been with the company for some time.
04:16
Told me about the culture,
04:17
what they liked working on, what they didn't.
04:20
And I don't (chuckles) know if HR told him
04:22
to say this or if this was just a thing.
04:24
All they could think of to tout to me,
04:26
as someone who's interested in things like inclusivity,
04:29
and this is a direct quote, was how we've
04:30
heard all about the unisexual bathrooms.
04:33
(laughter)
04:34
The unisexual bathrooms.
04:36
I think they meant, like, gender-nonspecific bathrooms.
04:38
But that was nice, like it was well-meaning.
04:40
So I finished my meal, I took my unisexual bathroom break,
04:45
and I came back. (laughter)
04:45
And I said, 'Okay, let's continue.'
04:51
So, and this one threw me the most.
04:52
This person came in, they were super friendly,
04:54
really energetic, very warm.
04:56
And they gave me a reasonably good,
04:57
you know, well-scoped interview question.
05:00
And they said, "Does that make sense?"
05:01
And I said, 'It does.'
05:02
And then they sat down and they
05:04
opened their laptop and I thought
05:05
maybe they'd be taking notes,
05:06
maybe there was like some interactive component.
05:09
And I started writing on the whiteboard
05:11
and they proceeded to ignore me, entirely.
05:15
As though I were not there for 40 minutes.
05:18
(laughter)
05:18
And I would say things like,
05:19
'Does this make sense, is this reasonable?
05:21
'Is this the right approach?'
05:22
And sometimes they would look at the
05:23
whiteboard and say, "Yeah."
05:25
And sometimes they would just say "Yeah."
05:27
Or nothing.
05:28
And then they thanked me for my time,
05:29
and took a picture of the whiteboard and they left.
05:32
This is the theme. (laughter)
05:35
And finally I'm getting to the end of the day,
05:37
I'm very tired but I know there's
05:39
only one interview left.
05:40
And so this is okay.
05:42
And the interviewer comes in,
05:43
also very nice, very friendly.
05:45
And he asks me a question but it
05:47
turns out you cannot meaningfully answer,
05:49
unless you know what a derangement is.
05:54
Which I did not.
05:56
And what made me really mad about this is that
05:58
derangements are super interesting mathematically.
06:02
And so I went home and looked up
06:03
derangements on Google and Wikipedia,
06:06
and I was super mad because I loved
06:08
the idea but I was mad that it came up
06:10
in an interview and I didn't know the answer.
06:13
So that was, I think, the trickiest one for me.
06:15
Whereas this would otherwise be super cool
06:16
and I kind of see why this person asked the question.
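(A derangement, for reference, is a permutation that leaves no element in its original position; of the n! orderings of n items, n! * (1/0! - 1/1! + 1/2! - ... + (-1)^n/n!) are derangements. As a minimal, hypothetical sketch in Ruby, not something from the talk:)

    # Brute force: keep only the permutations in which no element
    # remains at its original index.
    def derangements(items)
      items.permutation.select do |perm|
        perm.each_with_index.none? { |element, i| element == items[i] }
      end
    end

    derangements([1, 2, 3]) # => [[2, 3, 1], [3, 1, 2]]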
06:20
But anyway, so that was the end of my interview loop.
06:23
And I said, 'Thank you.'
06:23
And I went home.
06:25
And you'd think that they would've said no but they didn't.
06:29
And you'd think that they would've said yes but they didn't,
06:31
they said, "Would you like to come
06:32
"in for some more interviews?"
06:33
(laughter)
06:35
And I said, 'No, uh uh.' (laughter)
06:37
So I don't like the idea of comparing
06:42
software developers to medical professionals
06:44
or doctors. It is knowledge work, like medicine.
06:48
We are on call sometimes.
06:50
But I think that comparison is super fraught.
06:52
I don't like pretending that whether or not
06:55
the webpage is up is equivalent to,
06:58
you know, life or death situations.
07:00
But I have friends who are doctors
07:01
and this part resonates with me.
07:02
Where I realized, imagine if you like go in
07:05
and you're interviewing as a doctor,
07:06
and they say, "You're references all check out.
07:08
"You're a well-known professional.
07:11
"Clearly you have a track record of success.
07:13
"If you wouldn't mind just baring with me
07:16
"for just a brief exercise. (laughter)
07:19
"Just to demonstrate that you really
07:21
"do know how to do surgery."
07:22
(laughter)
07:24
This is what it makes me think of and I don't like that.
07:26
I don't like that interviewing
07:28
makes me think of an operation (laughter).
07:30
So interviewing is broken.
07:32
I don't say that lightly.
07:34
I don't like to say broken unless I'm willing to explain how.
07:37
Because when someone says something is broken
07:38
that immediately invites the question,
07:40
"How is it broken, what's wrong?"
07:42
So I'm gonna go through that and
07:44
I'm gonna talk about how we can fix it.
07:47
And that's the whole point of this talk.
07:50
So this talk is called Interview Them Where They Are.
07:53
And hello RailsConf, hello Minneapolis.
07:55
I'm delighted to be here, I've never been to Minnesota,
07:57
I've never been to Minneapolis before.
07:59
So this has been fantastic.
08:01
My name is Eric.
08:02
I am a Software Consultant for a
08:05
company called Test Double.
08:06
If you're not familiar with Test Double,
08:08
we are a distributed remote consultancy
08:11
that pairs with client teams to not only deliver
08:14
great software but ensure that the teams themselves
08:17
are better as a result of having worked with us.
08:20
So if you are thinking about your current projects,
08:24
your current environment, and you think
08:26
there's something that we could help with,
08:27
please don't hesitate to reach out to me.
08:29
I'm Eric Q Weinstein.
08:30
I'm on most things, Twitter, GitHub,
08:32
you can reach out to me at Eric@TestDouble.com.
08:36
If you are looking for your next adventure we are hiring.
08:39
So, and I'm happy to talk a little bit to you
08:41
about the interview process at Test Double.
08:43
And finally, if you do want to talk more
08:46
about interviewing, and diversity, and inclusion,
08:48
these are topics that are very important to me,
08:49
and I'm always happy to chat.
08:51
So please do come find me after the show.
08:54
And a few years ago I wrote a book
08:56
to teach Ruby to eight-, nine-, and ten-year-olds.
08:59
Published by No Starch Press.
09:00
It's called Ruby Wizardry.
09:02
Let me know if you're interested,
09:03
if for some reason you'd like a copy
09:04
and just can't afford it let me know
09:06
and we'll work something out.
09:09
So the points of interviewing in my mind are to
09:12
find the people that we need to grow our teams,
09:15
which I think is somewhat non-controversial.
09:18
But the one thing that I think is not
09:20
always talked about and one that I think is critical
09:21
is we're optimizing for demonstrating competencies.
09:26
Too often, someone, an interviewer
09:29
intentionally or inadvertently views the
09:30
interview process as sort of a challenge.
09:31
Kind of, a way to find something
09:33
that the candidate doesn't know.
09:35
To probe for weaknesses and say
09:36
this person really knows JavaScript,
09:39
really knows React, doesn't understand Active Record,
09:41
doesn't understand databases, etcetera, etcetera.
09:44
Without really thinking too much about
09:46
whether that's what you're looking for.
09:52
So I really like running tests.
09:53
I test a lot, I usually do test-driven development.
09:58
And this has happened to me a lot in my career
10:00
but this first time it happened it was very frustrating.
10:02
I remember writing some code and I was like
10:04
I'm gonna sit down, I'm gonna write the test.
10:06
So I wrote a test and it was red,
10:08
which is great because that is the
10:09
first step in writing your tests.
10:11
In red-green-refactor.
10:13
Which nobody tells you initially is
10:14
shorthand for red, red, red, red, green,
10:19
red, red, red, green refactor. (laughter)
10:23
So I'm writing my test and I'm excited
10:24
'cause it's failing and I write the
10:25
production code and my test stays red.
10:28
And so I start digging around, I'm trying to figure it out.
10:31
I'm writing down the execution paths on paper,
10:33
staring at my code, back to the paper, back to the editor.
10:37
Finally I looked back at the test after way too long,
10:40
and I realized that my test was wrong.
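(As a minimal, hypothetical Ruby sketch of that failure mode, not code from the talk: the production code is correct, but the test encodes the wrong expectation, so it stays red no matter what the implementation does.)

    require "minitest/autorun"

    # Production code: capitalizes each word.
    def titleize(string)
      string.split.map(&:capitalize).join(" ")
    end

    class TitleizeTest < Minitest::Test
      def test_titleize
        # The bug is here in the test, not in titleize:
        # the expected value itself is wrong.
        assert_equal "Ruby wizardry", titleize("ruby wizardry")
      end
    end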
10:42
And this is a thing that I think we don't think
10:45
about often outside the context of writing code.
10:48
If you're interviewing someone
10:50
and they have years of experience,
10:51
and all these great open source contributions.
10:52
And they're very smart and very sharp,
10:54
and very empathetic, and they don't know
10:56
the answer to some trivia question.
10:58
Or they don't do well on the whiteboard,
11:00
and you decide well the test is the test,
11:03
so this person is not qualified to work here,
11:05
maybe the test is wrong.
11:10
One thing I think a lot about is
11:11
the way that we construct interviews.
11:15
And a lot of people think interviews
11:16
start when you walk into the room and the candidates there,
11:19
or maybe a couple weeks before when you get
11:20
that calendar invite with the attached resume.
11:23
But it actually starts much much earlier,
11:25
months ahead of time.
11:26
And this is a job description.
11:29
I'm sure many of you have seen these,
11:30
we call them JDs in the biz.
11:32
Having been an Engineering Manager,
11:33
I've written several of them,
11:35
and written some really bad ones.
11:37
I wrote this one but for the talk,
11:38
this is not one that I wrote for work.
11:41
I won't read it to you but this
11:42
all seems familiar, right?
11:43
We're looking for some credential or some equivalent,
11:46
we want a certain amount of experience,
11:48
certain technologies.
11:50
And bonus points, whatever that means for
11:52
our particular stack.
11:54
I don't think this is a very good job description.
11:56
And I think we can kind of take our
11:58
inclination to refactor out of the realm of
12:02
writing tests and bring it to our job descriptions.
12:04
So that's what we're gonna do.
12:06
So I'm gonna break this up into the
12:08
four kind of pieces, you know?
12:10
Credential, experience, particular languages, bonus points.
12:15
And let's look at this first one.
12:16
"We're looking for someone with a
12:18
"bachelor's degree in computer science or equivalent."
12:22
I don't think this is a valuable
12:23
thing to put in a job description
12:25
because most of the time you don't need it.
12:28
Now there are some roles where, really,
12:30
a full grounding in computer science
12:32
is necessary or is important.
12:33
And you may work at a place where, you know,
12:36
for whatever reason they try and say,
12:36
"Hey we do need people to have
12:38
"four year degrees for X, Y, Z reason."
12:41
That's fine.
12:42
But I don't think a degree in
12:43
computer science should be necessary
12:44
for most of the work that we do.
12:47
And I don't think anyone has ever explained
12:48
to me what equivalent means.
12:50
It's sort of a content-free statement, right?
12:52
(laughter)
12:52
It's like equivalent experience.
12:53
And so what is that?
12:54
Like is there a secret bachelor's afterwards?
12:56
I don't know what that is supposed to be.
12:57
(laughter)
12:58
So I recommend we get rid of it.
13:01
So when we refactor our JDs, this is dead code,
13:04
basically; we're not gonna put this in.
13:07
"One to three years of experience."
13:09
This says to me that you're trying to
13:12
find someone with a certain level of seniority.
13:15
But what happens when you say one to three years
13:17
of experience is, one, it's extremely broad.
13:20
I was different in my third year of writing
13:22
software professionally than in my first.
13:25
But also there's such a broad range of experience.
13:30
If you start at a startup that's always on fire,
13:32
and you're wearing multiple hats,
13:33
and do all these different things.
13:35
You will have a much different set of skills
13:37
than if you go to a big bank and
13:38
you write Java for three years.
13:40
Not to knock big banks.
13:42
Maybe a little bit.
13:44
So I don't think this is really very valuable.
13:48
What this is really asking is
13:49
well what does this person know how to do?
13:52
And so what I would say is,
13:53
'Well we're looking for someone who is
13:55
'comfortable writing features and is
13:57
'looking to own entire services,
13:58
'or large swaths of a single application,
14:01
'if you manage just one application.'
14:04
And that'll get you closer to what
14:05
you actually are looking for,
14:06
and not some arbitrary number of years.
14:09
Especially because we probably have worked
14:10
with people who have three years of experience,
14:11
and we've probably worked with people
14:13
who have one year of experience three times.
14:15
That is very different. (laughter)
14:18
So moving on.
14:19
And so we want someone who knows
14:20
JavaScript, and React, and Go.
14:22
And I think this is reasonable,
14:23
this is the piece that's gonna change the least.
14:25
But I think it's kind of unclear where those
14:28
weights lie and what you're actually looking for.
14:31
So I would say something more like,
14:33
'We prefer that candidates know JavaScript,
14:34
'and React, or Go.'
14:36
But maybe for more experienced or
14:38
senior candidates it's fine if they don't.
14:40
It's not a hard requirement.
14:41
Kind of acknowledging that people
14:42
with more experience have sort of
14:44
learned how to learn in some capacity
14:45
and can come up to speed faster.
14:46
Than someone who's brand new to maybe
14:49
Git, an editor, and all these other things.
14:51
And finally bonus points for
14:53
Postgres, microservices, Kubernetes.
14:56
I also think this is kind of content-free,
14:58
it's unclear to me what this means.
15:00
But I think what it's driving at is
15:01
well here's what we do.
15:03
And my opinion is that you should just say that.
15:05
Our stack is JavaScript/React with Go on the back end.
15:09
Those Go services are organized as
15:10
microservices orchestrated by Kubernetes,
15:12
and the data store is Postgres.
15:15
So now we can actually put these
15:16
back together into a job description
15:18
that I feel much better about.
15:19
This feels really like an improvement,
15:21
it's not ideal, I don't think.
15:22
But I think it's really nice to say,
15:23
'Here's what we're looking for.
15:25
'Do you know these things?
15:26
'Or have you done these things?
15:27
'Are you looking for these things?'
15:29
We can have a conversation.
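(Put back together, the refactored description reads roughly like this; a reconstruction from the pieces above, not the exact slide:)

    We're looking for someone who's comfortable writing features and is
    looking to own entire services, or large swaths of a single
    application. We prefer that candidates know JavaScript and React, or
    Go; for more senior candidates, that's not a hard requirement. Our
    stack is JavaScript/React with Go on the back end; the Go services
    are organized as microservices orchestrated by Kubernetes, and the
    data store is Postgres.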
15:31
One thing people sometimes say
15:34
in response to this is, "Well I don't have
15:37
"specifics anymore this too vague.
15:38
"I don't have a degree requirement.
15:39
"I don't have information on how many years of experience.
15:43
"I'm gonna get unqualified people."
15:45
And I think the opposite is true.
15:47
I think when you have gatekeeping language
15:49
like degree, number of years,
15:51
what happens is people self-select out
15:52
even when they are qualified for the role.
15:55
And this is disproportionately true
15:57
of under-represented minority
15:59
groups in tech, like women in tech,
16:01
and folks who don't have medium-white-man syndrome.
16:05
Which I have.
16:06
As a medium white man, both in terms
16:09
of my average capacity and my propensity
16:12
to write things on Medium.
16:13
(laughter)
16:15
(chuckles) I feel like I should have a lot of things,
16:18
like here's what happened to my startup.
16:19
Anyway. (laughter)
16:22
The thing that doesn't happen is I'll say,
16:24
'Well I got three of these five checkboxes.
16:26
'I'll just send in my resume and see what happens.'
16:29
This is not something that happens
16:31
necessarily outside of my sphere of privilege.
16:35
And so you have to be mindful of the fact
16:36
that there are people who will select themselves out
16:38
if you've inadvertently put in all this,
16:40
or intentionally put in all this
16:41
gatekeeping language around years and credentials.
16:44
So I think this is a substantial improvement.
16:45
'Cause now, you know, you're gonna
16:48
start finding who you're looking for.
16:51
And there's a few key takeaways.
16:52
One of the big ones from this talk is know
16:54
what you're looking for before you interview.
16:57
And critically you have to know
16:59
how you're gonna measure success.
17:01
Now, I said that job description wasn't ideal.
17:03
I would love job descriptions that
17:04
include how we measure success,
17:05
how the company thinks about reviews,
17:08
and what we want to see from you as a Software Developer.
17:12
But if you don't include it in the JD,
17:13
you at least need to think about it
17:15
when you write it, before you interview,
17:17
because you're setting people up for failure
17:18
if you don't know how you'll evaluate
17:20
them once they join your team.
17:22
So now we have people in our pipeline.
17:25
Minus, you know, people like me who don't listen
17:27
and don't know how to self-assess.
17:29
(chuckles) We have people who theoretically
17:30
are who we're looking for, right?
17:32
Which is, we now populate the space, which is nice.
17:36
And so now the question is how do we evaluate them, right?
17:39
Historically this has been done
17:41
on the whiteboard which is a blackboard,
17:42
because I'm limited by the available emojis.
17:45
(laughter)
17:46
And the way we're interviewing,
17:48
I honestly think really was valuable at one point.
17:51
Back in the day you had a certain amount of compute time,
17:53
you would schedule time to use the computer.
17:55
In the meantime you'd write down all
17:57
your code on paper, make sure it was right.
17:59
And then type it in as fast as you could (chuckles),
18:01
and compile it and see if it worked.
18:03
That is not the world that we live in anymore.
18:06
And I think that whiteboarding interviews
18:08
can have a place, sort of, and I'll expand on that.
18:12
But probably it's not a good proxy
18:14
for writing code with other humans,
18:15
which is what we do. (laughter)
18:16
We work on teams, we write code collaboratively,
18:18
we communicate and then spend time together.
18:20
And whiteboarding by yourself on a board
18:22
while somebody says, "Yeah that's right."
18:23
Or "No that's not."
18:24
Even if they're more engaged, I think
18:26
it's not a good way of doing it.
18:29
So there are some obscure programmers
18:32
on the internet who also agree with me.
18:34
I hope DHH is, no he's not here.
18:37
But he said the same thing and
18:39
I agree with DHH obviously.
18:41
I would fail to do this, I have failed to do this.
18:44
I don't fear whiteboarding anymore, I used to,
18:47
I used to be super afraid of whiteboarding.
18:50
And now I just don't like it.
18:51
(laughter)
18:53
It's because the medium doesn't really make sense,
18:56
and again the content is not correct
18:59
for what we're evaluating.
19:00
We're trying to figure out how do you communicate,
19:02
how do you take ambiguous requirements
19:04
and turn them into increments of work.
19:05
How do you write code (chuckles) with other people?
19:09
And this has been noted in texts like
19:12
Programming Interviews Exposed.
19:14
Where they kind of said, "Well based on
19:15
"these constraints you're not gonna be
19:18
"asked any real-world problems."
19:20
Unfortunately, real-world problems are all that we solve.
19:25
And so we're back here.
19:27
So, what are some ways that we can actually
19:30
get a signal and interview people
19:32
in a way that's more inclusive?
19:34
That gets us a better, like, higher-fidelity signal
19:36
around what they can do.
19:38
So this is Ada.
19:41
Ada is a fresh computer science graduate,
19:43
she went to college.
19:45
Maybe has an internship or two
19:47
but has not really written production code
19:51
in a group, in a company for very long.
19:53
So the first question is well what
19:54
is Ada going to be good at?
19:57
Now, I would assume her strengths are algorithms right?
20:01
Sorting, searching, things like that.
20:03
Things that are taught in a traditional
20:04
computer science program, data structures,
20:08
and maybe also Stack Overflow.
20:09
But I think everyone is well-versed in Stack Overflow.
20:13
And, kind of, well-defined, tightly scoped problems.
20:16
Things that are, sort of, there's a right answer,
20:18
there's an input, there's an output.
20:20
And this is maybe the only time
20:23
I would still be okay with whiteboarding.
20:25
And this is absent, you know, projects,
20:28
a GitHub profile, internships.
20:30
Things that demonstrate that collaborative
20:32
nature of writing code together.
20:34
You also want to make sure you're
20:36
asking for things that are going to
20:38
be relevant to their experience.
20:40
So, if Ada has spent months and months
20:41
preparing for whiteboard interviews,
20:42
and I say 'Hey, let's pair on something.'
20:44
And she has never paired before.
20:45
That's kinda like me going into a room
20:47
and expecting to pair and getting whiteboarding.
20:50
So again, it's flexing the interview
20:52
and sort of meeting people where they are that's critical.
20:55
I call these made-to-measure interviews,
20:57
you can construct modules around
20:59
what you're looking to assess based on
21:01
who you're looking for in the world.
21:04
And you can mix and match them a little bit
21:07
or even allow our interviewees, our
21:08
candidates, to select what they want.
21:10
And say, "Well you know, I'd really like to do
21:12
"a take-home and then pair on it."
21:13
Right, that's one of the options?
21:14
It's almost like when you go and have
21:16
dinner at a wedding, you know?
21:17
It's like the fish or the chicken.
21:18
No one yells at you if you pick the wrong one.
21:20
(laughter)
21:21
No one's like you didn't go to dinner
21:21
if you had a different dinner than they had.
21:24
No one gives you just a plate of garbage
21:27
and says here you go, everyone gets the
21:28
same plate of garbage so this is fine.
21:29
(laughter)
21:30
Fairness is, I think, a toxic idea sometimes.
21:33
So this is what I would recommend for you.
21:37
Like I said, there are projects,
21:38
opportunities to pair, and they really simulate the work.
21:40
We prefer that.
21:43
So this is Ben.
21:44
Ben graduated from a bootcamp maybe
21:46
three or four years ago.
21:47
So he does not have a traditional CS background,
21:49
but has worked in the industry for a few years.
21:52
And has attended a bootcamp.
21:53
So he spends a lot of time thinking about
21:54
code quality, writing modular,
21:56
easy-to-change, easy-to-test code.
21:58
So I would expect Ben could potentially
22:00
be really good at pairing.
22:01
We do a lot of pairing at Test Double.
22:03
I know there're other companies that do a lot of pairing.
22:05
Bootcamps that emphasize pairing.
22:08
Probably good at testing, refactoring,
22:11
changing code, preserving behavior
22:13
but making the code base nicer.
22:15
I would expect him to be able to
22:17
talk about the trade-offs involved in
22:20
his language or framework of choice, okay?
22:24
To have that sort of deeper conversation.
22:27
So one thing I might ask Ben to do is,
22:30
'Hey, let's pair on a small, winnowed-down
22:34
'version of a real production thing.
22:35
'Let's maybe if you have the time,
22:37
'do a take-home and then we can pair on either
22:40
'refactoring it or adding new functionality afterwards.'
22:43
These are pieces that are actually
22:44
used in the Test Double interview process.
22:47
But here I'm advocating we often
22:49
give them the choice and say,
22:51
'Hey, could you do this and could you do that?'
22:52
And being respectful of other people's time
22:54
because there are people who are new
22:55
parents or have other obligations.
22:57
Maybe they don't have time for a full
22:58
interview loop but they can do a take-home.
23:00
Maybe they don't have time for a take-home
23:01
but they're happy to pair.
23:02
And I think the key here is having
23:04
that flexibility and looking for strengths
23:06
rather than prodding for weaknesses.
23:11
Finally this is Charlie.
23:14
So Charlie did do a traditional computer science degree
23:18
but they graduated 10 or 12 years ago.
23:21
So they have not spent a lot of time
23:22
thinking about algorithms or data structures.
23:25
They've been thinking about production problems,
23:28
they've been thinking about keeping the lights on,
23:30
they've been thinking about fires,
23:31
management, dealing with product stakeholders.
23:34
All the things that we do every day, above and beyond
23:37
the, sort of, underlying competencies.
23:40
So, again I would expect Charlie
23:42
to be able to do programming, pairing.
23:45
Maybe I might give them a thornier pairing task,
23:48
than I might give to Ben, something
23:49
that's gonna have more edge cases.
23:51
I would expect Charlie to have evolved
23:53
to the point where it's not if but when,
23:55
you know, if things go wrong but when things go wrong.
23:58
And having that mindset of like this will
24:00
eventually fail so what is the failure mode.
24:04
I would expect Charlie to have a lot of strength
24:07
in terms of one-to-many communication, right?
24:09
Maybe Ben I would expect to have mastered
24:11
maybe one-to-one communication but not to
24:13
have mastered getting buy-in from a large group,
24:16
or giving a conference talk and getting
24:19
consensus from a large number of people.
24:22
So I would expect to be able to
24:24
test that out and say things like,
24:26
'Hey tell me about a time you had to
24:27
'get consensus or get buy-in from a group,
24:30
'especially a group that didn't report to you
24:31
'or didn't have an obligation to you.'
24:34
I would look for service- and systems-level thinking
24:36
and say 'Tell me about this big project
24:39
'you did on your resume.'
24:40
Or, 'Hey we have a service XYZ at our company
24:43
'how would you go about developing that service?'
24:46
Right?
24:46
And really, again, not prod their weaknesses
24:49
but ask for clarification and say, 'What was the trade off?
24:52
'What happened when you did this?'
24:54
Or 'What were you trying to avoid
24:55
'by having this architectural pattern?'
24:59
And that's kind of like, I don't like whiteboarding
25:02
code necessarily on the whiteboard
25:03
but I do like drawing systems diagrams.
25:05
It is super fun.
25:06
And so just having them write on the whiteboard,
25:08
hey you know this service talks to this one,
25:10
this service uses this database.
25:11
Here's where we cache, here's where we don't,
25:13
here's the trade-offs involved, things like that.
25:14
That's what I would look for.
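(As a hypothetical illustration, with made-up service names, that kind of whiteboard diagram might look like:)

    [ web client ] --> [ API service ] --> [ orders service ] --> [ Postgres ]
                             |
                             v
                         [ cache ]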
25:18
Now I will take a minute to talk
25:20
a little bit about bias, right?
25:23
People have asked me, "Doesn't this introduce
25:25
"a lot bias now because not everyone
25:28
"gets the same interview.
25:29
"Some people will be interviewed in
25:30
"this particular way and these people
25:32
"will be interviewed in this other way."
25:35
And that's a good question.
25:37
There's two aspects to it.
25:38
One, this already happens anyway.
25:40
Even though we say everyone gets the same interview,
25:42
not everyone gets the same interview.
25:44
People are asked different interview questions,
25:46
people have different standards,
25:47
people have different experiences.
25:48
So you're already getting a lot of human volatility
25:52
depending on who is on your interview loop.
25:55
And further, I think, just because
25:57
everything is the same, doesn't mean it's correct, right?
26:01
Again, the test can be wrong.
26:03
And I think it's important to understand that uniformity
26:06
is not the same thing as being unbiased.
26:07
You can have systemic bias and we do.
26:11
This is, I think, present in all parts of the
26:13
tech community and in human organizations at large.
26:16
So there's this notion that just because you
26:20
don't give everyone the same interview,
26:22
you're somehow privileging some people and not others.
26:26
And again, I think it's the opposite.
26:28
If you don't flex to someone's interview style,
26:30
if you don't help them to shine and to make clear
26:36
where they're really good and where they're not,
26:37
I think this is where we do a disservice.
26:39
This is where we exclude people by not
26:41
allowing them to demonstrate what they know.
26:43
Because everyone gets the same test
26:44
and you have to pass this test,
26:46
it has nothing to do with the work.
26:48
It's like the SAT, right?
26:49
The SAT measures absolutely nothing,
26:51
other than are you good at the SAT, right?
26:54
But it's a proxy for college admissions.
26:55
Which then if you get your four year degree
26:57
and then you go, you can say,
26:58
"Hey I have a four year degree in computer science."
27:00
And really all that means is you
27:01
did well on the standardized test
27:02
and maybe your parents have enough
27:03
money to send you somewhere nice, right?
27:05
It has nothing to do with your ability.
27:08
So, the major takeaways, I think,
27:12
are knowing who you want and what you're
27:15
looking for months before your interview,
27:18
when you get to that job description.
27:21
Trying out what I call made-to-measure interviews.
27:23
Again, it's not bespoke.
27:24
Not everyone gets their own completely
27:27
personalized interview but they get
27:29
to select or you can help flex, you know?
27:31
And give them the modules that
27:33
are gonna be valuable to them.
27:35
And again, allow them to say,
27:37
"Hey, I'm really good at pairing.
27:38
"I'm really good at refactoring.
27:39
"These are the things I'd like to be tested on."
27:43
Looking for strengths and not for weaknesses.
27:45
We don't spend a lot of time with our own peers
27:48
saying, "Hey I think that you're
27:50
"super bad at XYZ and that bothers me."
27:51
Right?
27:53
We usually say, "I know that we need to do XYZ
27:55
"and this person on this team has
27:56
"a deep expertise and I'd like to find him."
27:58
So, starting from the get-go and saying,
28:00
this is what we're looking for
28:03
and allowing people to demonstrate
28:04
that competency is critical to having
28:06
an inclusive and a positive interview pipeline.
28:10
Finally, the interview fails the interviewee
28:13
when that interview does not anticipate people like them.
28:17
When someone has designed a loop and it's for
28:20
people with a certain amount of privilege,
28:21
who went through a certain set of programs,
28:23
with a certain amount of experience.
28:25
And this is just the way things are
28:26
and everyone else is kind of tough luck,
28:27
you have to figure out some way to learn enough of this.
28:31
And you go and you buy a Cracking the Coding Interview,
28:32
you buy Programming Interviews Exposed,
28:34
or things like that.
28:36
This is truly, I think, toxic.
28:39
I think it gets us to a place,
28:41
not only where interviews feel bad.
28:43
As I mentioned in my earlier story,
28:45
but you don't get a good signal, right?
28:47
I have told stories where people
28:50
didn't do super well in the interview loop,
28:51
but I knew they were capable, I'd worked with them before.
28:53
Or they came back and did a second round
28:55
and were asked different things,
28:56
and they did completely differently.
28:58
I know people who have been hired
29:00
in an organization who crushed the interview loop,
29:02
and were unsuccessful in the role.
29:05
And so that tells us that again,
29:07
the test is not meaningful if the
29:09
test is not testing what it's supposed to.
29:12
Anyway, I am very happy to talk for
29:14
40 minutes forever, and ever, and ever.
29:17
I have an unlimited amount of language
29:20
but a limited amount of wisdom.
29:22
So, that is all I've got.
29:24
Thank you so much for coming to this talk.
29:27
(audience applauding)
29:33
(upbeat cheerful music)
29:38
(squeaking)
29:39
(upbeat cheerful music)

Eric Weinstein
