Transcript: JUST 150 Jay Cameron Witness Testimony

Lawyer testifies in Canadian Parliament in opposition to proposed Internet censorship scheme

May 21, 2019


On May 16, 2019, the Standing Committee on Justice and Human Rights (JUST) met in the Canadian Parliament for meeting 150, which was part of their "study on online hate".

Jay Cameron, Barrister and Solicitor, of the Justice Centre for Constitutional Freedoms was a witness and shared his testimony on the subject of "online hate", including proposed Internet censorship schemes and the prosecution of Canadians for "hate speech" for content they post online.

The following is a transcript of Cameron's testimony from the audio recording of the meeting which can be listened to here. This transcript is not guaranteed to be accurate or complete, and it is released into the public domain.

________________________________


Jay Cameron: Thank you very much. Honorable members, thank you very much for the invite to appear here today. I'm with the Justice Centre for Constitutional Freedoms. We're a not-for-profit, non-political, non-religious organization dedicated to the protection of Canadians' fundamental freedoms and constitutional rights.

I'm going to talk about three things this morning. First, the problem with setting out to censor hate without proper parameters. Second, the reality on the ground with human rights tribunals in the context of this study. And third, the dangers of state censorship and big tech combined. Then I'll provide you with four recommendations.

Like the Canadian Civil Liberties Association, I say the starting point for this conversation is, or should properly be, the Constitution of this country. That is Canada's foundational document, but it is not mentioned anywhere in the outline for this committee's study, and most of the witnesses before the committee made no mention of it, except to urge you to infringe it as fast as possible.

Set out in section 2(b) of the Canadian Charter of Rights and Freedoms is the fundamental right to have an opinion and express it. This committee is studying online hate and preventing online hate, but it has not established parameters or definitions as to what constitutes hate. I think it behooves the committee to ask, "What is hate?", and what is the incitement of hate. Because the reality is that crying hate has become one of the favorite tools, in some circles, to prevent dialogue and discredit disagreement.

You disagree with my religion? That's hate. You disagree with my politics? That's hate. You disagree with my gender identity? That's hate. You have concerns about immigration, and resources, and security? That's hate.

What if you're a single woman working out of your house as an aesthetician, and you aren't comfortable waxing a pair of testicles? That's hate.

You want to peacefully express your opinions on a university campus regarding abortion? You can't, because that's hate.

You just heard from a previous witness [Morgane Oger] who says Meghan Murphy is hate, Feminist Current is hate, and the Post Millennial is hate, all without any examples whatsoever. Therein lies the problem. The same witness demonstrated in front of the Vancouver Public Library and compared the feminist talk going on inside to a Holocaust denial party, because the women were talking about the interests and rights of biological women.

And last, but not least, US Senator Elizabeth Warren, within the last couple of days, described all of Fox News as hate.

None of this is hate. It's disagreement and it's dialogue, but it's not hate. It's protected speech under the Constitution, and it is entirely legal.

Now, I alluded to the woman in the waxing case. You've heard about this case; it made international news. The Justice Centre represented this woman. She's a single woman, she works out of her home, she has a small child, and she provides aesthetician services to the community. She advertises on the Internet and tells the world that she provides waxing services to women. She's trying to make ends meet. She doesn't have the supplies to wax somebody's scrotum, and she doesn't really want to work on somebody's scrotum. She didn't start out intending to work on somebody's penis, and it was irrelevant to her whether that person thought they were a man or a woman, because it was about physiology. She had a human rights complaint made against her, which terrified her.

She told me she went to 26 different lawyers first, before she found the Justice Centre. And all 26 lawyers refused to take her case. Why? Well, they gave her a variety of different reasons according to my client. Some were afraid of activists, some were afraid of the different procedure at the Human Rights Tribunal. Some were afraid of representing somebody who had allegedly engaged in discrimination, and they didn't want the stigma attached to representing somebody like that in that context. There's also not much money in these cases, so they're not particularly attractive to lawyers.

And that creates a significant access-to-justice problem that this committee needs to consider. It needs to consider people like my client, who have a complaint made against them despite the fact that they didn't do anything wrong. A lot of people who have complaints against them are common people; many of them have few means. They're facing a bewildering process, but even worse, they're facing the stigma of a human rights complaint. And in this day and age of hypersensitivity and social media, where gossip travels around the world in an instant, being accused of discrimination is worse than a criminal accusation, in many cases. It's enough to destroy your reputation. Even the lawyers don't want to be involved in it, because they're afraid of stigma. They don't want to hear "You represented that bigot, that racist, that misogynist, that homophobe, that Islamophobe... how could you, in good conscience, represent those disgusting, filthy human beings?"

Now, is the state going to appoint counsel and pay for it if people can't? In the woman's complaint, the complainant's name was withheld by the tribunal and kept private, but my client's name was publicized for the whole world to see. As a single mom, my client didn't need the complaint. She was trying to make ends meet, it caused her months of terror, and life was hard enough. She told me she wept when the complaint was withdrawn. I'm going to say that again: the complaint was withdrawn. It never made it to a hearing, so there was never any vindication for her, simply the accusation that she had discriminated on the basis of gender identity or expression.

Now, there are 14 other cases before the BC Human Rights Tribunal from the same complainant. Every single one, to my knowledge, requests damages against the people who refused to wax the complainant. None of them have a lawyer, to my knowledge, so there's lots of pressure to settle; indeed, some of them have.

Only the Tribunal knows who the parties are until a hearing date is set, and then the parties are publicized three months in advance. Now, the Justice Centre offered to represent these complainants, or rather these respondents, for free. And we asked the BC Human Rights Tribunal, given the fact that there's an access-to-justice problem, to pass along that offer to all of the respondents, but the BC Human Rights Tribunal refused to do so, and that's something you need to consider as well. Because human rights tribunals are not the savior for these problems; in many cases they create more problems than they fix.

I want to say a little bit about the fine under former section 13 of the Canadian Human Rights Act. It was $10,000. That fine was found to be unconstitutional at the first stage of hearings; that finding was overturned by the Federal Court of Appeal, but it never made it to the Supreme Court of Canada. The fine for a conviction for drunk driving is $1,000, and that is a crime under the Criminal Code, which is a grave social evil. And what you have heard this morning is that people like Meghan Murphy, who is not here to defend herself, should be punished for the crime, the vague crime, with no specifics, of transphobia or misgendering. And that's part of the problem that you need to think about.

How much time do I have left?

...

We recommend four things. First, that the Canadian Human Rights Act, if it is to be amended, be amended to define what is and is not hate speech pursuant to the Supreme Court of Canada's decision in Saskatchewan (Human Rights Commission) v. Whatcott, [2013] 1 SCR 467, at paragraphs 90 and 91.

The Supreme Court of Canada sets out what is hate speech, and most of what you've heard from the witnesses who are telling you something is hate speech, doesn't even come close to hate speech.

Second, if there is any new legislation to be implemented, we say that there ought to be defences to a complaint of hate speech mirroring the defences in section 319(2) of the Criminal Code. Specifically, that no person shall be convicted of an offence under subsection 319(2) if he establishes that the statements communicated were true, or if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text... I'll pause here to note that the Bible, under the parameters that you've been asked to consider, and the Koran, and other religious books, could be considered hate speech just because they're posted online, for saying things like "God created male and female"... that's not hate, that's a statement, and it's entirely permissible, but it would be protected under the defences that I'm outlining here.

Continuing those defences: if the statements were relevant to any subject of public interest, the discussion of which was for the public benefit, and if on reasonable grounds he believed them to be true; or if, in good faith, he intended to point out, for the purpose of removal, matters producing or tending to produce feelings of hatred toward an identifiable group in Canada.

Third, we recommend that the maximum fine for any finding of, uh, hate be capped at no worse than the Criminal Code fine for drunk driving, at $1,000.

And fourth, that Parliament launch an initiative to encourage people to come forward with their big tech censorship stories, so that it can understand the extent of that problem, which is significant, and not embark on a mission of censorship without all the facts.

Those are my submissions, thank you.

Unknown voice #1: Thank you very much Mr. Cameron. We'll go to Mr. Cooper.

Michael Cooper: Thank you very much to the witnesses. Mr. Cameron, your client, who endured a complaint through a Human Rights Commission that was ultimately withdrawn, and who was subject to enormous costs on her life, would not be entitled to costs, is that right?

Unknown voice #2: We applied for costs.

Jay Cameron: Because the complaint was precipitously withdrawn once counsel became involved, and submissions were filed, and evidence was filed, we applied for costs, based on a number of misrepresentations which we said the complainant made, and the fact that the complaint was started. It put this woman through a terrible amount of crisis for five months, and then it was withdrawn. What is she supposed to do? There is no recourse for her.

Michael Cooper: And to that end, you'll just confirm that under the Canadian Human Rights Act, under that framework, if there was a frivolous and vexatious complaint, a respondent would be statutorily barred from suing, is that right?

Jay Cameron: That's my understanding, and that's a problem, because there's no disincentive for launching multiple complaints, and the person involved in the waxing case made 16 of these complaints. You know, some of them are in various stages of settling, and some of them are proceeding to a hearing, as far as I understand. But the point is that obviously there is a problem when people can just destroy somebody's reputation with a charge of discrimination, and then nothing happens to them when it was done maliciously.

Michael Cooper: Right. Now, you touched on big tech and big government. I think we've seen some steps that have been taken in Europe by the European Commission; I would suggest that there is a real issue of censorship creep in terms of some of those steps taken. You touched on it, but you didn't have an opportunity to elaborate, so I'd be interested in your thoughts on big tech and big government coming together, and the dangers there.

Jay Cameron: Yeah, it's not a myth that big tech is censoring opinions that it disagrees with. Two days ago, Democratic representative Tulsi Gabbard, the representative for Hawaii, appeared on the Joe Rogan show, and she voiced opposition to the censorship of Facebook users, arguing instead that companies like Facebook had betrayed the long-standing American commitment to free expression by ousting unpopular political commentary from their platforms. Just listen to those words, what she's saying: unpopular political commentary, and that is what is being ousted. Not hate, but simply stuff that the censors at Google and Facebook disagree with, and because they have the power, and little oversight, they do whatever they want. And so, you know, it's a dangerous proposition for government to consider, and to propose, teaming up with these institutions and these entities which are already engaged in gross censorship, which is well documented. We'll be submitting a paper about it, but it's well established at this point. Google as well is routing search results away from certain media outlets, you know, conservative voices it disagrees with, so it's routing traffic away from those entities, and that's unacceptable.

Michael Cooper: In that vein, we've seen examples of Antifa, for example, which has expressly incited violence, and social media platforms have refused to take that content down, so we've seen the inconsistency... In terms of the European Commission, in 2016 they entered into an agreement with YouTube, Facebook, Twitter, and Microsoft, whereby those platforms agreed to take down content that constituted hateful conduct or violent extremism (relatively vague terms, I would submit) within a 24-hour time frame. Should we be concerned about ordering social media platforms to take down content within 24 hours? It seems to me there's not a lot of time for deliberation... And should we also be concerned about when state actors make requests for social media platforms to take down certain content, given the fact that state actors might have their own agendas?

Jay Cameron: Absolutely. Twitter is particularly notorious for this type of thing. There is a movie that has come out called 'Unplanned'. There was a US Senate hearing where the producer, the co-producer, of the movie appeared before the US Senate, testifying that Google refused to take their ad dollars. Twitter took down their account, and deleted hundreds of thousands of followers from the 'Unplanned' movie. And it's a true story about a director of Planned Parenthood. But we have the same problem in Canada, where theaters are refusing to screen the movie. Despite all of the questionable content that is in the theater, they're refusing to show this true story, right? Essentially censoring it for the public. Whether you agree with pro-life positions or not, that still should concern you as Canadians.

The other thing I would say, just quickly: Twitter is bizarre, right? They banned permanently... they permanently banned Meghan Murphy for the crime of misgendering... they've taken down... she's not a conservative; she's quite far left on the feminist side of the spectrum. So they take down accounts like this, but I went looking for something on Twitter and accidentally stumbled onto a page with this guy's penis in front of this woman's face. You can have all this stuff on Twitter, but if you want to talk about conservative viewpoints or things that Jack Dorsey at Twitter disagrees with, they take them down. So it's such a double standard, and it quite frankly should offend more of the people in government than it does, I think.

Unknown voice: Thank you very much, we're going to go to Mr. Ehsassi.

Ali Ehsassi: Thank you Mr. Chair. You mentioned in your opening remarks that you were somewhat insulted that there was no mention of the constitution in the motion which we're examining here. Is that correct, is that a correct...?

Jay Cameron: No, I didn't say I was insulted. I said that I think the starting point for the conversation needs to be section 2(b) of the Charter, because it protects fundamental rights, which the Supreme Court of Canada has said are the foundation of Canada's liberal democracy, and that it can't function without freedom of expression. And so I think that the context of this conversation needs to be section 2(b) at the start. I'm not offended though, sir.

Ali Ehsassi: Okay. But surely you understand that, in any recommendations we adopt, we would obviously be well aware of section 2(b) of the constitution, correct?

Jay Cameron: You know, sir, I'm not sure of that at all. I mean, this government's track record regarding section 2(b) of the Charter is not good. There was the Canada Summer Jobs fiasco in 2018, where people were compelled to...

Ali Ehsassi: But you understand that members of this committee obviously are mindful of that, correct?

Jay Cameron: I understand that people should be mindful of it, but as to whether or not this government takes section 2(b) seriously, I'm not convinced of that at all.

Ali Ehsassi: So, uh, you had the opportunity, you were sitting here, you had the opportunity to listen to the previous witnesses. What were your thoughts on some of their concerns?

Jay Cameron: I think that the representative from [unintelligible] made an excellent point about the dangers of attempting to take down speech and fine ISPs for content. I think that that's a legitimate concern. I think that people in Canada have a right not to be subjected to criminal hatred, and insofar as those concerns are based on the incitement of criminal hatred, I think those concerns are legitimate, and I support the prosecution of the incitement of violence or genocide against identifiable groups of people. The problem is that a lot of the concerns that are being expressed are couched in vagaries...

Ali Ehsassi: You constantly cite cases that you find to be extreme, but obviously you would agree with us that there is a public interest here in making sure that hatred does not spread. You would agree with that objective, would you not?

Jay Cameron: I think that in order for me to agree with that you would have to define what hatred is. How are you defining it? If you are defining it like some of these witnesses, then no, I don't think that the government legitimately has an objective...

Ali Ehsassi: But you did hear from the witnesses that some of the witnesses did have to deal with sexism on Facebook. Correct?

Jay Cameron: Is sexism hate? Has that been established? I don't know that it has been sir. I mean, what is sexism? Is arguing against a woman's legal right to have an abortion because somebody has a perspective that's different than that, is that sexism?

Ali Ehsassi: No, but those weren't the examples they were providing us. Surely, you were sitting here, but that didn't concern you in the least bit?

Jay Cameron: No, please don't put words in my mouth. I'm not here to argue with you.

Ali Ehsassi: I'm asking you very direct questions, and the responses don't seem...

Jay Cameron: I'm giving you direct answers. I don't know that it has been established that sexism, which is not defined for the purposes of this committee, is hate. If you're telling me it is, then I think that what we need to establish is what you mean by sexism, and establish parameters.

Ali Ehsassi: But you would agree that they did face sexism, would you not? The previous witnesses that were before us.

Jay Cameron: Which witness are you speaking of?

Ali Ehsassi: The mayor, one of the mayors who showed up, an example she provided.

Jay Cameron: If she personally experienced sexism?

Ali Ehsassi: Correct.

Jay Cameron: I don't know what she experienced, I'm not her. I can't give a testimony...

Ali Ehsassi: But you were here, you were sitting here, and I take it you were listening.

Jay Cameron: Well, she said...

Ali Ehsassi: Are we supposed to be concerned about sexism, online?

Jay Cameron: Well I think, again, I think that it depends how you define sexism. What is sexism? Tell me what you're talking about, and I'll tell you whether or not I think you should be concerned about it online. Give me an example.

Ali Ehsassi: Just to let you know, in my particular riding, Willowdale, we experienced the van attack. And as I'm sure you're well aware, the suspect in that particular case was very much influenced by incel, which, for those people who don't already know, is a group for men who feel rejected by women. So, right before the actual van attack, he posted "incel rebellion has already begun." Do you think it would have been irresponsible, in this particular instance, for Facebook, for example, to have eliminated that comment?

Jay Cameron: Do I think it would be irresponsible for Facebook to eliminate...

Ali Ehsassi: For Facebook... Do you think anything should be done?

Jay Cameron: Yeah, no, I...

Ali Ehsassi: So far as you are concerned, we see all these instances, and from your perspective, should governments be concerned?

Jay Cameron: Yeah, I think that governments are prosecuting offences under sections 318 and 319 of the Criminal Code, for example, and you need permission from the attorney general to prosecute that offence. But obviously that's something serious, and when there is a breach of the Criminal Code, I think it should be prosecuted. So I support that. As far as... my point, sir, is that not all of the people who are charged with a human rights offence, or are the subject of a human rights complaint, are lunatics plotting a van attack against women. There are lots of common people who are innocent...

Ali Ehsassi: I think we all understand that.

Jay Cameron: Well, I think it's important to clarify it, because I'm not sure that we all do understand it.

Ali Ehsassi: All I've asked you is, is there a public interest for governments to be concerned about these types of things? You've expressed to us that gossip can spread on the net in no time, and it spreads wide. Hatred, you would agree, actually spreads as well.

Jay Cameron: Sure, communication spreads. And that's the same for speech that infringes the Criminal Code as well.

...

Colin Fraser: Thank you very much Mr. Chair. Thank you to the witnesses for joining us today, I'll split my time with Mr. Erskine-Smith.

Mr. Cameron, I just would like to pick up on a couple of comments that you made in your presentation. Look, I take what you're saying about section 2(b) of the Charter, freedom of expression, being a fundamental freedom. But section 2 also includes other fundamental freedoms, such as religion and association, and of course all of the rights in the Charter of Rights are read together, and it's oftentimes a balancing of the different rights, of those fundamental freedoms, that can come into conflict and have to be balanced. So I take issue with the suggestion that we should look at section 2(b) first and that it's the most important and paramount consideration over all of the other rights. I disagree with that. And also, section 1 of the Charter makes it very clear that, of course, all of the rights, including the fundamental freedoms, are subject to reasonable limits, and the courts have ruled on that, and I think it's misleading to just say that section 2(b) is the paramount consideration.

Another comment that you made was your third recommendation, that any fine for anything involving hate speech online, or whatever, should be capped at the Criminal Code fine for impaired driving, which is $1,000. Well, that's a minimum fine, first of all, and second of all, you can actually go to jail for impaired driving, so it is a serious offence, but of course it depends on all of the circumstances.

Third, you mentioned that the BC Human Rights Tribunal should, in some fashion or another, be promoting your legal services, to give you a platform in order to take on clients that would help you get the word out there about your organization and what you stand for. And I don't think that's the role at all for the BC Human Rights Tribunal, to be sort of promoting any legal services over others.

I want to move, though, to something you said, which was that there's this sentiment out there that disagreeing with someone's point of view is considered hate, and you went through a list of them and said "you disagree with that, that's hate." Well, I don't think that's true. I think that the essential point here is that it's about spreading misinformation that angers people and riles people up online, and spreading that disinformation, which turns members of a community against one another. That's the fundamental problem that we're seeing here: things online that are not true, and they're being propagated here by people with insincere motives, and motives that are outside of the bounds of civil society, I would suggest.

So what I'd like to ask you sir is: Does it trouble you that terrible individuals, when we see the Toronto van attack, or what happened in Christchurch, or the Quebec City mosque shooting, does it trouble you that those terrible individuals have been inspired by provocative and hateful content on social media platforms?

Jay Cameron: Does it trouble me that they were inspired by social media?

Colin Fraser: Does it trouble you that they were inspired by hateful content on social media platforms?

Jay Cameron: I don't presume to know what inspired these people, I'm not them. I don't know what their childhood was like or what they were subjected to...

Colin Fraser: It's been widely reported that hateful content on social media platforms was at least partially responsible for the ideologies that they hold. Does that trouble you?

Jay Cameron: I think that any time somebody commits a heinous act against people, it troubles me. And I think that any time they're motivated to do that, in part or in whole, by something somebody said, that's troublesome. But crimes happen every single day, and they are... people are influenced by what other people say, across the country, and, I mean, that's... that's troublesome.

Colin Fraser: Ok well we'll leave it there and I'll turn my time to Mr. Erskine-Smith.

Nathaniel Erskine-Smith: So, uh, Mr. Cameron, I share your love of section 2(b). I once, as a young law student, volunteered for the CCLA -- hi [unintelligible], it's nice to see you again -- so, I want to ask about different ways we already restrict speech, and just be brief. Do you agree with laws that restrict speech related to terror? Just say yes or no.

Jay Cameron: Related to what sir?

Nathaniel Erskine-Smith: To terrorism.

Jay Cameron: That's a Criminal Code offence.

...

Nathaniel Erskine-Smith: Defamation.

Jay Cameron: The law of defamation is tried in civil court. I mean, it's punished; it's not censored prior to the defamation, though. There's a difference.

Nathaniel Erskine-Smith: No, I understand. It's still defamation.

Jay Cameron: Okay.

Nathaniel Erskine-Smith: Harassment.

Jay Cameron: Like criminal harassment?

Nathaniel Erskine-Smith: Yeah, criminal harassment.

Jay Cameron: Absolutely, like a restraining order or something like that? Sure.

Nathaniel Erskine-Smith: Threats.

Jay Cameron: Sure.

Nathaniel Erskine-Smith: And hate under the Criminal Code, I understand you do support the existing laws.

Jay Cameron: It's the law.

Nathaniel Erskine-Smith: Great. So now let's talk about how we enforce those laws. The problem online is that, in many ways, these existing laws, which you and I both support as restrictions on speech, are unenforceable in effect. The Criminal Code is a very cumbersome instrument, and it can't properly apply in so many instances when there is so much... there is a voluminous amount of hate online, and our law enforcement agencies and our courts can't possibly keep up with the comments, whether it's on Twitter or Facebook or wherever it may be. And I'm not talking about censoring your favorite conservative commentator; I'm talking about what you and I agree with: enforcing existing laws under the Criminal Code. So, do you think there should be liability for online platforms if they fail to take down content that is hate, according to the Criminal Code, in a timely way?

Jay Cameron: I think that there's... you're presuming, as a foundational premise, that there's a problem with the Criminal Code and the way that it's enforced. And I'm not sure... that hasn't been established. I know that people are complaining...

Nathaniel Erskine-Smith: So you think the Criminal Code is an effective instrument, right now, based on everything we've seen, at enforcing the hate speech laws on content online? Your answer is that it's an effective instrument?

Jay Cameron: Here's the problem with what is being proposed here. You're contemplating taking the prosecution of hate speech away from a prosecutor, a Crown attorney...

Nathaniel Erskine-Smith: I'm not contemplating that, I'm contemplating a complementary mechanism.

Jay Cameron: I would like to answer the question if I may. And the approval of the attorney general, and giving it to a tribunal, which is an entirely different entity...

Nathaniel Erskine-Smith: You're inventing the suggestion. I'm not talking about the Human Rights Tribunal. I'm talking about imposing some liability through new legislation on social media platforms that fail to take down hateful content, according to the law as it is, in a timely manner.

Jay Cameron: Who determines if it's hateful?

Nathaniel Erskine-Smith: Who determines if it's hateful? If it's obviously hateful... ultimately... uh... the... uh... there would be a judicial mechanism, a government agency would find it, and ultimately there would be a judicial mechanism if Facebook, or Google or whomever disagreed.

Jay Cameron: Right. What mechanism? What mechanism is there for somebody to determine that something is on an online platform... like, for example, Facebook. Facebook says that it's content neutral, it's a marketplace of ideas, it doesn't police speech...

Nathaniel Erskine-Smith: Facebook takes down terrorism content already... they take down content already according to existing laws, they apply existing laws, so we're asking them to apply the laws with respect to hate speech, and if they get it wrong, and someone says "hey Facebook, you got it wrong!" then they take it to court. Why is that so hard?

Jay Cameron: The criticism out of the United States, and certain other pundits, is that Facebook is violating its own premise by taking down speech which it says, when it represents to the public...

Nathaniel Erskine-Smith: You're raising distracting concerns that I'm not raising at all!

Jay Cameron: I'm not. I'm not. I'm answering your question, maybe you don't understand the answer that I'm trying to give you, but it is an answer to your question, okay? Facebook says that it's a marketplace of ideas, that it's neutral, and yet it is policing speech, okay? They're taking down speech. My question for you is: Who is determining whether or not something is hateful? Which is what you're advocating, that there should be liability for an organization like Facebook if it doesn't take down hate speech fast enough. My question is: Who's deciding that? And that's the problem.

Nathaniel Erskine-Smith: We already have laws! Oh, gaaahhh!

Unknown voice #3: Your time is up, it was very interesting...