Photo credit: Cori Crider/Foxglove
This transcript was created using the awesome Descript. It may contain minor errors.
Note: This is an affiliate link; This is HCD makes a small commission if you sign up for a Descript account.
Chris Gray, a very warm welcome to Bringing Design Closer. How are you?
Chris Gray 00:04
I’m good, mate. How are you?
Gerry 00:05
Not so bad. We’re both going through the throes of the COVID-19 pandemic at the moment, but today we’re going to be discussing more around your background. So maybe tell the listeners a little bit about your background and what we’re going to discuss today.
Chris 00:20
Well, a couple of years ago, I was working for a company called CPL under contract to Facebook, here at their main headquarters in Dublin. I spent my days looking at all the content that people had reported for various reasons and deciding whether or not to leave it up on the platform.
Gerry 00:39
So what did your role involve when you were in CPL?
Chris 00:43
Basically, I would go down to the Facebook building and log into my computer and sit there all day just looking at one video or photo or written content after another. It’s just this never-ending stream of stuff that somebody has found objectionable. And then you have all these rules and regulations and reference material to look at to make decisions about it.
Gerry 01:09
So for anyone who’s listening and using Facebook: if you see a piece of content in your feed, your stream, that you don’t like, and you report it, the team that Chris was on were the people who had to make the decision about whether that content was going to get pulled down or stay on the system. Is that correct, Chris?
Chris 01:28
That’s right. It could be absolutely anything. I even would get people reporting pictures of puppies because they don’t like animals being bought and sold. So you have to look at everything, even though nobody is really worried that it’s on the platform. And then, obviously, there’s the much more extreme stuff as well.
Gerry 01:48
Looking at puppies all day might be a dream job for some people though, Chris.
Chris 01:51
Well, my first few weeks it was naked ladies, actually. I was working on the global photo pornography queue, which was mostly, you know, quite fun.
Gerry 02:02
I’m sure that was an interesting kind of conversation about trying to make those kinds of calls. Porn and naked photos are not allowed on Facebook, is that right?
Chris 02:13
Well, I mean, this is the question of moral injury. You will make decisions that you personally disagree with, you know? So I will be looking and saying, ‘Oh, nothing wrong with that, but it’s broken the rule. Sorry, darling. You can’t show me that online.’
Gerry 02:28
So how did they determine that? I actually saw an interview the other day with Anthony Kiedis, the lead singer from the Red Hot Chili Peppers. And not many people realize, but when they did the Super Bowl halftime show a number of years ago, they had something like 55 complaints, because Anthony Kiedis was showing – he always sings topless. People took offense to the fact that there was a topless man on the Super Bowl. So how does Facebook get around that?
Chris 03:01
I have no idea how they made the rules, but there does have to be a rule. There has to be a definition of what is okay and how we measure that. So, I mean, the question of nipples is a good one. Male nipples, female nipples. What about breastfeeding? What about women protesting about sexual exploitation and the right to own their own – you see, the context. The amount of flesh that’s being displayed? How clearly can you see the nipple? Is the nipple partly obscured or not? You have to have detailed rules about all of these things. I do pity the poor bugger that has to make these rules.
Gerry 03:45
But what’s the whole point of the rule system in Facebook and CPL? Is it trying to create a kind of unified, singular way of thinking? Because surely you can’t please everybody?
Chris 03:58
Well, obviously nobody ever explains anything like that to me. My job is just to implement the rules; I’m not supposed to bother myself by worrying about what the policy should be, or what the thinking behind the policies is. What I think is that they’re trying to get consistency. If you report something that you don’t like, and I delete it, and then it comes up again, and you report it again, and the next person doesn’t delete it, then there’s no point having content moderation if you’re going to change the rules every time. So you really do need to kind of lock this down. This is what’s okay, and this is what isn’t. And then – I know this is hard – but they’re trying to have one set of rules for the whole world, one global ‘right or wrong’.
Gerry 04:47
Yeah. Which isn’t good, because certain content is good for creating conversation and topics. Otherwise we’ll end up with a very bland-looking world – which is kind of what you get if you follow those kinds of rule structures, ones that a number of very exclusive and more than likely non-diverse, non-inclusive minds are looking at. It’s not a good thing.
Chris 05:15
It’s interesting that the people who complain most about having their free speech taken away are the ones who want to say things that a lot of the rest of us find offensive. So if you compare it to something like Mumsnet – a website for, generally, women and family-oriented people – then you won’t see those kinds of opinions and extreme content on display. The amount of moderation that’s needed is much less. But it’s the combative people, the people who like to win arguments, the people who think that there are much stricter definitions of what’s okay and what isn’t – they will push the boundaries. They will want the right to say things that many other people will think are not really okay.
Gerry 06:08
Yeah. It’s a bit of a slippery slope if it’s not managed correctly, though. And it’s a slope that you in turn inherited – those increasing rules, added in response to, I suppose, the social conversation that was gaining pace in the last number of years, with Facebook trying to respond to that. How did you see your role change at CPL over the years, from when you started to when you ultimately left?
Chris 06:40
Well, I mean, I talked with people who were there in the very early days, when there were a handful of guidelines and the decision that you would make would be to ignore the content or to delete it, and it was that simple. And then when I was there, that had expanded. There were probably 10,000 or 12,000 words of densely written text saying what was okay and what wasn’t, and hundreds of pages of reference material. And when you made a decision, that was a very granular process. There were literally about 100 options to choose from. So, okay, you get naked people. Well, okay, we’re going to delete that because it’s nudity, subcategory nipples, or subcategory penises – no, that’s an erect penis; no, that’s not an erect penis. You’re into this insane level of detail, and they’re organized hierarchically. So you know, if you’ve deleted because you noticed the nipples, but you didn’t notice somebody’s bare bum, well, maybe the bare bum ranks higher on the hierarchy, so now you’ve made the wrong decision.
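[Editor’s note: to illustrate the hierarchy Chris describes, here is a minimal sketch of how a prioritized violation taxonomy might behave. The category names, their ordering and the function are hypothetical, invented purely for illustration; this is not Facebook’s actual taxonomy or tooling. The point is just that the highest-ranked violation present determines the ‘correct’ answer, so choosing a real but lower-ranked violation still counts as a wrong decision.]

```python
# Hypothetical sketch of a prioritized violation hierarchy.
# Category names and their ordering are invented for illustration;
# this is not Facebook's actual policy taxonomy.

VIOLATION_HIERARCHY = [
    "identity_theft_phishing",  # highest-ranked
    "graphic_violence",
    "nudity_genitals",
    "nudity_bare_buttocks",
    "nudity_nipples",           # lowest-ranked
]

def scored_decision(detected_categories):
    """Return the category the moderator is scored against:
    the highest-ranked violation present, or None (leave it up)."""
    for category in VIOLATION_HIERARCHY:
        if category in detected_categories:
            return category
    return None

# A moderator who notices the nipples but misses the bare bum
# has picked a real violation, yet is still marked wrong,
# because a higher-ranked violation was present.
content = {"nudity_nipples", "nudity_bare_buttocks"}
moderator_choice = "nudity_nipples"
print(scored_decision(content))                      # nudity_bare_buttocks
print(moderator_choice == scored_decision(content))  # False
```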
Gerry 07:51
Yeah.
Chris 07:52
So your quality score comes down and you’re in danger of getting fired. And I was talking to somebody recently that is still there. They told me there are 250 options now.
Gerry 08:02
Wow. So 250 options, that’s more than a pilot has–
Chris 08:07
Yeah, they just keep adding new rules. Every time something happens that some new special interest group gets upset about, you get more. And I just don’t see how that is sustainable, because it’s beyond your mental capacity. You don’t have the bandwidth to hold all this in your head and make lots of decisions very quickly.
Gerry 08:33
So when we spoke initially, Chris, you mentioned that there was a 1% – or maybe it was a 2% – failure rate in terms of making the right decision on that content. How would I describe it?
Chris 08:49
2% was permitted. You had to be correct 98% of the time.
Gerry 08:54
You had to be correct 98% of the time. Now, walk me through how a 100- or 200-option decision matrix and the risk of making the wrong decision – surely that has an impact on being able to process things effectively, but also on your mental health?
Chris 09:15
In terms of the processing, very often, it’s a simple, simple choice. You know, if I say I don’t like Irish people, well, you know, that is clearly hate speech against Irish people. There’s no thinking required for you to say, okay, that’s not good. And I would say maybe 90% of the time, you would very quickly make a decision; it only takes a few seconds. The problem is, of course, though, that sometimes you’re kind of on autopilot and you get the obvious violation and then realize afterwards that, ‘Oh, wait a minute, there was something else that’s further up in the hierarchy.’ One example would be the naked ladies. Well, if people are using naked ladies to get you to go and visit their website and hand over your login credentials – you know, standard phishing – then that’s a completely different violation. So if I’m deleting for bare nipples or bare bums when actually I should have been deleting for identity theft–
Gerry 10:21
Yeah.
Chris 10:21
Then multiply that 100 times once you get into much more diverse content, so that’s challenging.
Gerry 10:28
It is challenging, but you have to deal with – I don’t know how many tickets you’d have in a night, and they aren’t all black and white; many of them, I’m sure, are going to be in the middle. And you have to make that decision with a potential success or failure rate hanging over your job and the safety of your livelihood. That creates a lot of stress.
Chris 10:58
There were the tickets that you’d get where you could see it was a problem and you didn’t know what to do, and you’d end up having all these different options and conversations–
Gerry 11:06
Walk me through that. Say you had a piece of content that came in that was a little bit more difficult to define if this was something to be taken down or let it stay on the network. Walk me through what you had to do in that instance.
Chris 11:23
It could be very time-consuming. You’ll start and you’ll look at it, and then you’ll maybe look at the hierarchy and say, ‘Okay, this is not spam. This is not child abuse,’ and you’re looking through to see, okay, maybe it fits into this particular category. And then you find something that you think might fit, so then you go and look in detail at the rules. And even the placement of a comma can change the meaning of something so that it suddenly becomes violating or not violating, and you’re often having to interpret the context of something and then, finally, reach the decision: ‘no’ or ‘I think that’s okay’. And then you carry on working down the hierarchy, looking for other categories where it might be violating. And then you can just find yourself really not sure. So you ask some colleagues. We have a group chat, so you’ll type into the group chat and you’ll put the ticket number. You might get three or four other people looking at it, giving you their opinions; very often, the opinion will be split. There’s nothing you can do at that point. You just have to pick an option and go with it. You would post then – we have like a Facebook group called a tribe, and you would post a description and what your thinking is and the ticket number. By the next day, you might get an answer to that, but it’s too late. It’s only useful for future reference, but at least it’s some guidance.
Gerry 12:55
It’s very similar to the police trying to police society and going around and enforcing–
Chris 13:02
No, because the police have a lot of discretion.
Gerry 13:05
They do have discretion, but also if the person who–
Chris 13:07
They have discretion, but they also – how many decisions do they make in a day?
Gerry 13:13
Yeah, that’s a good point.
Chris 13:15
I’d guess 600, one every 30 seconds.
Gerry 13:18
600! And how long are the shifts, six hours or eight hours?
Chris 13:21
It’s an eight-hour shift. So you’ve got a bit of admin to do, you’ve got some emails, yada, yada, so maybe six hours of work.
Gerry 13:31
Okay, so some of them are being processed–
Chris 13:34
A hundred an hour, one every 30 seconds.
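[Editor’s note: taken at face value, the rough figures quoted here are consistent: 600 decisions over roughly six hours of screen time is 600 / 6 = 100 decisions an hour, or 3600 / 100 = 36 seconds per decision – close to the ‘one every 30 seconds’ ballpark.]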
Gerry 13:36
Yeah. So you’re dealing with lots of content very quickly and trying to make that kind of decision. And basing it off a rule book that, I guess, is getting bigger by the day.
Chris 13:50
Well, it’s getting bigger; it’s also being tweaked. So every two weeks, there’ll be a policy update. They’ll change rules that you’ve been implementing. I’ve had days where I’ve made a decision on a Tuesday on some content, and then on the Wednesday they’ve changed the rule. And then on Thursday or Friday, I’ve been audited, and the auditor has applied the new rule.
Gerry 14:12
Yeah, it almost seems like with the title of the role, ‘content moderator’, you’d assume that there was some level of power to be able to make a decision, because you’re a moderator. You’re actually controlling the situation, but it seems like–
Chris 14:28
We’re not moderators. The job title is community operations analyst.
Gerry 14:33
Okay.
Chris 14:35
There’s no discussion about moderation at all. That’s the popular understanding of the job. The job is to apply rules and to pass sentences and to process exactly what you see in front of you.
Gerry 14:50
Yeah, it’s a very robotic approach to a human problem.
Chris 14:55
If you’re in a discussion with somebody online or an argument and you say something they don’t like, and they report that, all I see is your comment. I don’t see what the person said before you. I don’t see what you’re trying to rebut. I only see your comment, totally out of context.
Gerry 15:11
Yeah. It’s almost as if you’re being set up to fail, like you haven’t been given the tools to do the job. You’ve been given some tools to do the job, obviously, but you haven’t been given the bigger picture to be able to make an informed decision.
Chris 15:26
I think you’re being treated as a cog in a machine. It’s being treated as an engineering problem.
Gerry 15:33
Yeah.
Chris 15:34
There’s a thinking that we can apply rules and algorithms and systems, and each of these people will do that little bit of processing. We don’t have an AI that can do that yet, so we’ll have to use these little pre-trained neural nets called community operations analysts, and they’ll implement the system for us. That’s how they think about it.
Gerry 15:59
Surely people working for CPL and Facebook – you’re not the only one who will have expressed that this is quite a difficult job to do. It’s very hard to get a 98% success rate every week. What was the communication from CPL – and Facebook, ultimately – around how they can actually improve the situation?
Chris 16:22
No, no, no. Don’t ever bring problems to the table. Don’t ever question the system. This is a command-and-control situation. I think it starts with the way that Facebook is structured as a share company: Mark Zuckerberg controls the voting shares, and whatever Mark wants, Mark gets. He gives instructions, and then the next level of people go and implement these policies, saying ‘Mark wants this’ or ‘Mark wants that’, and there’s no argument. And that percolates down so that we’re at the bottom. I could negotiate with my auditors about specific decisions that I’d made, but I could not have any kind of conversation with anybody about training systems or rules that weren’t working. You can’t give any feedback.
Gerry 17:13
Yeah. It’s interesting, because in my notes here I’ve got a quote from Mark Zuckerberg on this actual problem, from some leaked audio of an old staff meeting, where he described reports about the working conditions in the company’s moderation centers as being a little overdramatic.
Chris 17:32
I think he was actually referring to the criticisms that had been made in the press. I take that personally, because it was me making those criticisms.
Gerry 17:40
So he said that from digging into them and understanding what’s going on, it’s not that most people are looking at just terrible things all day long. It would be interesting to know whether Mark Zuckerberg has ever actually sat down in a moderation center and tried to do the job himself.
Chris 17:56
Well, in fairness to Mark Zuckerberg, this organization is too big for one person to know what’s going on in every corner. He’s dependent on what he’s told by various people. And every person in that chain has an incentive to paint a picture that makes them look good.
Gerry 18:18
Yeah. But this problem has been growing for such a long time. From when we were speaking initially, we mentioned that the content moderation team grew from a team of 10 to how many thousand people now?
Chris 18:31
Well, the last quote I heard a couple of weeks ago, he was in Germany, he said there were 35,000 people employed by Facebook doing this – and that’s just Facebook. You know, we have TikTok and Twitter and all the others.
Gerry 18:46
So you could say that Mark’s obviously a very, very intelligent person. But the fact that there’s been an attempt to hire 35,000 people might show that there’s a bigger need and these problems aren’t going away. So I was asking purely because I wonder: has Mark Zuckerberg ever gone into these centers and actually examined what might be going on himself?
Chris 19:09
Well, what I’ve heard from people that are still there right now is very occasionally you’ll get somebody coming from Facebook to see what goes on. And the whole building becomes this hive of activity and everybody has to clean their desks and look happy. You have to put on a brave face for the client. So somebody like Mark Zuckerberg, it would be like a royal visit. He’s going to have his entourage, there’s going to be all these chains of people all desperate to ensure that they look good. Everything’s going to be sanitized and protected. He’s not just going to walk in anonymously and shadow people doing their normal job. I don’t think he would ever be able to do that.
Gerry 19:55
Yeah, to get an effective and accurate evaluation of the real world, I suppose – that’s a really, really good point. So when you were at CPL, you obviously would have raised some of the issues that you were seeing in terms of being able to do your job effectively. Ultimately, you left your job at CPL. Walk me through what happened after that, Chris?
Chris 20:21
Well, let’s just walk through how I left, shall we?
Gerry 20:24
I was trying to frame it in a way that’s – I’m trying to be very political.
Chris 20:30
You’re asking me this question about giving feedback. And I’d given some feedback to the senior person on site, who was actually very receptive. This was the senior CPL person. He was quite pleased that somebody was actually telling him the truth, and he acted on a couple of recommendations that I gave, and that was the last time I ever saw him. He vanished from his role – like, two days later, he literally was never seen again.
Gerry 21:00
He left?
Chris 21:02
I don’t know what happened; nobody will tell you anything. But I do know that the whole evening shift was really struggling and underperforming at the time, and my team in particular. So the various changes were implemented. And you know, I kind of felt, ‘Wow, this is great, somebody’s listened.’ And then I was accused of resisting the changes, and it seemed like somebody new had got some drama – so just normal office politics bullshit, basically. And eventually I tried to talk to my team leader – just to clear the air, smooth things over – and I was really upset and emotional and just could not deal with a minor confrontation. And I had started to realize already that I was having emotional problems, mental health issues, and this really kind of shocked me. I felt like I was going to be eased out as a result of this.
So I went and talked to my regional manager. I was very, very calm; I really prepared myself very well. I just went in and said, ‘Look, we’ve got this problem that my whole team is not reaching the quality standard, has not been reaching the standard for six months. You’ve been providing all this additional training for months, and it’s not working.’ I have 15 years’ experience in adult education, designing courses and so forth. I said, ‘At this point, we have to just look and say, well, instead of saying that these people are not doing their job well enough, maybe we have to look at the training systems. Is there something we can do differently?’ And the answer was, well, no – if you can’t reach the standard with the resources that are there, then you really need to start looking for another job. And two weeks later – it wasn’t my choice to be out. I was led out of the building. So you’ve given some feedback, you’ve tried to be constructive. And anybody that’s trying to be constructive just seems to get pushed out of the door, because they’re challenging the status quo. My poor team leader – who wasn’t equipped or trained to recognize that I was under a lot of stress, who wasn’t recognizing the danger signs – just perceived me as a problem. And then he had to fire me. He was embarrassed and stressed about doing that. Just a cog in the machine.
Gerry 23:40
Yeah. It sounds like there was a lot of pressure to deliver hyper accuracy to their client – Facebook, ultimately – because that’s kind of what was needed, but–
Chris 23:54
Right? But, you see, Facebook have defined what success looks like in a way that then puts us into this incredibly stressful situation where the huge cognitive load of interpreting the content is compounded by the near impossibility of reaching the quality target. So eventually you just become obsessed with, ‘How can I make a decision that I could justify to an auditor?’ And it’s all to conform with these rules, but you keep changing the rules. And if you keep changing the rules, surely that’s because the rules are not perfect.
Gerry 24:32
Yeah. I mean, this is a huge opportunity. There is a huge, huge opportunity there: Facebook, CPL and the content moderation team are on the ground, they’re seeing things. That granularity, that conversation and feedback loop, would do so much in terms of improving the process, improving the situation. But it just doesn’t sound like there was any opportunity.
Chris 25:03
It would work if you have the ability to make change, but you don’t. You have 35,000 people locked into a process. Can you imagine trying to change that process? And I’m guessing that what has happened is that 10 years ago, this started to become an issue. And somebody said, ‘Okay, I’ll have a look, I’ll put together some rules or some guidance and look at getting a subcontractor in to implement them.’ They see it as a call center job, and they’ve applied call center thinking to this, with very, very simple measures of success. And then something else has happened, so they’ve had to make more rules, and they’ve had to get more people, and it’s kept getting bigger and bigger. And now there’s a slew of people who’ve just kind of found themselves doing this work. And they’ve got an empire, and huge budgets, massive share options and a huge vested interest in keeping on doing what they’re doing right now.
Gerry 26:06
Yeah, I suppose that the more people that they add to the situation, they can then point back to people in Congress and say, ‘Well, we’ve hired more people, we’re trying to tackle the problem’ when really that’s not the main problem. It’s actually the stuff that goes on in the network.
Chris 26:26
Yeah. I mean, their solution now is just more of the same, more of the same. We are going to keep trying to improve what we’re doing. It’s like petrol-driven motorcars – we’re going to keep on improving the efficiency instead of building electric cars.
Gerry 26:42
Yeah, it’s so true. Chris, I just want to chat a little bit more around the content. Obviously you’ve had a lot of exposure to the content that you’ve seen on Facebook, but how would you describe what you’ve seen? Give us a snapshot. Obviously you mentioned puppy dogs and some nudity and stuff, but how does it make you think about humanity, now you’ve seen behind the curtain?
Chris 27:10
People are awful. There’s the disturbing content, the terrorism and the graphic violence, and you know that’s going to be shocking and you think that you can prepare yourself for it. But one of the things that really got me was just the banal nastiness of so many people’s lives. You can imagine, you know, Dave and Diane are married or they’re together, and they have a kid, and then they separate, and one or both of them has children by a previous relationship. And one or both of them is an alcoholic or a drug user or something, and they get into an argument, and then somebody’s mother gets involved, and all this stuff is going backwards and forwards. They’re not highly educated. They don’t write in full sentences. You get a 500-word stream of consciousness without a single punctuation mark. And they’re all using the reporting tool against each other, and you get the whole argument in dribs and drabs, not in order, so you have no idea what’s going on. And you have to parse not just every sentence, but every clause in every sentence, every verb-object combination – you have to look at that and apply these rules to it. And it’s just depressing. Having to think in detail about these sad, angry little lives that some people have to lead, it really brings you down. And then you get people being unloaded off a truck by men with guns, and they’re lining them up by a trench and waiting for the shooting to start. Then you get puppy dogs, so it’s all right for a minute. Or you get nasty racist jokes, and you laugh at them before you delete them, because this is your world now and your humor becomes very dark.
Gerry 29:15
Yeah. Well, what would you like to see CPL and Facebook do to try and improve the situation?
Chris 29:22
Well, we’ve got content moderation in general, but then you’ve also got – over time, this content does really affect people. I think you get depressed from some aspects of it, and then there’s the trauma side as well. I think they need to recognize that this is real and provide proper support for people. I mean, I’m in court – I’m taking Facebook to the High Court to make them acknowledge that PTSD is real. That’s bullshit. Just turn around and say, okay, out of 35,000 people, we recognize that some of those people are going to be harmed by stuff that we want to take down off our platform because it’s harmful. So they’ve got to do that and start making provision for people to get the proper support that they need. And then, I think, we need to recognize that the working environment makes it much, much harder to cope with that stuff in the first place. PTSD is linked to feelings of powerlessness, to being unable to take action to protect yourself, to deal with difficult situations. And if you’re constantly under threat and under attack over quality scores and stupid office politics and so forth, then you’re already in a weakened state. You’re already much more vulnerable before you even see the content.
Gerry 30:54
So you mentioned there about taking your case to the High Court. Where are you at in the whole process of taking Facebook and CPL to court?
Chris 31:04
Well, when you phoned me, I was actually sat answering a list of questions from Facebook’s lawyers. It’s still very early days. We’ve filed the writ, and then we go through this whole process where the lawyers ask a bunch of questions to clarify matters, and we start to put together a case and they start to put together a defense. We have to go through the whole process of discovery now, where we request information. So, for example, how many hours a day or a week did I spend looking at penises or terrorist content? How many items did I mark as graphic violence? Facebook has all this data. We need to obtain it; we need to quantify what we were exposed to. So this is going to take a long time, and of course everything’s slowed down right now anyway because of the corona situation.
Gerry 32:00
Yeah, absolutely. Has there been any Facebook response to either your case or how they handle moderation feedback like this?
Chris 32:11
Oh, yeah. I mean, people within the company have obviously asked Mark Zuckerberg directly about it, and he’s said that we’re just being overdramatic. The Facebook PR machine puts out all these statements about how they take care of us and take this stuff seriously, which is either a blatant lie or, again, information is being filtered and the people who are making these statements genuinely believe what they’re saying, because the truth is hidden from them. But it doesn’t sound like anything’s improved on the shop floor. I had an email from somebody a couple of weeks ago telling me that even their toilet breaks are controlled now; they have to account for every moment of every day. So the response has actually been to tighten this command-and-control mentality.
Gerry 33:03
It’s a crazy, crazy situation. It’s such an old-school way of thinking for a new world problem. It’s hard to believe.
Chris 33:15
There’s a marketing guy called Seth Godin that you might have heard of, and he talks about companies being stuck. You know, ‘This is what we do. This is the way we’ve always done it.’ And there’s too much inertia, too much resistance to change; you’re not open to new ideas. The example he gives is the airlines – the flight attendants show you how to fasten the safety belt. There’s no need for that; we could all get along fine without it. But this is what we’ve always done, and to take it away means taking it away from someone, and they’re going to fight to prevent that. And I think the same thing is happening within Facebook. I think the organization is married to this one solution and is not open to alternatives. I think it was Bruce Lee who said if you’re married to one outcome then you’re dead.
Gerry 34:06
I’m a big Bruce Lee fan, back in the day. It’s a crazy situation. Chris, if people want to reach out to you – I know that we do have listeners in San Francisco and in Facebook, and definitely in other large tech companies as well. If they’re suffering from this, what advice would you give them?
Chris 34:24
Oh, I’m not qualified to give medical advice. Certainly in Ireland or Europe, the first step is your general practitioner, your local doctor. My doctor – I burst into tears in her surgery when I finally – it took me a year to realize that I was a complete mess. And I went to see my doctor, and I literally just cried my eyes out, because this was the first time I’d ever talked to anybody. You’re not allowed to talk about this stuff. We’re under this omerta, this vow of silence. We sign these nondisclosure agreements saying that we’re never going to talk to anybody about what we do at work. And breaking that, getting out there and acknowledging that you’ve got a problem, is the first step. And then from there, you’re into trying to find the right kind of therapy or counseling that’s going to work for you. That’s really where you need to go. And then you need to talk to a lawyer.
Gerry 35:27
You mentioned – I can’t remember, is it Foxglove, the name of the legal–
Chris 35:31
There’s a group in London called Foxglove who are a nice kind of intermediate step – maybe people are a bit nervous about going to see a lawyer right now. Cori Crider in London, she’s a great campaigner. She’s supported by various social organizations in the UK. She works closely with Amnesty International, who are taking a big interest in this case, because there’s the question here of your employer knowingly putting you in harm’s way, so that’s the human rights violation. So she’s very clued in, and she can connect people with various different resources and represent you with the media and so forth, without you having to take that big step of committing to a lawyer. And then my lawyers in Ireland are great. They’re helping out with the financial side; nobody has to worry about ‘What’s it going to cost me to sue Facebook?’ – that can be taken care of. You can come and talk to them. They’re very receptive. They listen; they’ve never pushed me to take action. They’re not there to sell a service. I found them very good as well.
Gerry 36:45
Okay. So I’ll put a link to Foxglove’s website and details in the show notes. Chris, it has been wonderful to hear your story. Thank you so much for taking the time to share it. If anyone wants to reach out to you, how might they do that?
Chris 37:00
To me personally? My name is Chris Gray. I am the real Chris Gray, so if you email chris@therealchrisgray.com then that should come through to me.
Gerry 37:14
Chris, thank you so much for your time today. It was really good. I enjoyed our conversation. Thank you.
Chris 37:18
Great. Thanks for having me.