Family Matters – Nathan Seiuli on the new civil liberties group in town

There’s a new civil liberties group in town! In this episode, Simon sits down with Nathan Seiuli from PILLAR NZ. PILLAR is an acronym standing for Protecting Individual Life, Liberty, And Rights, and Nathan talks about its mission to restore the pillars of democracy and advocate for individual life, liberty, and rights. He discusses how he and his colleagues will focus on a broad spectrum of civil liberties issues, including online privacy and foreign interference. Nathan also discusses a particular case to illustrate their work: seeking greater clarity around AI-powered surveillance by police. Simon and Nathan also debate whether banning social media for under-16-year-olds is a good idea. They discuss the competing needs to protect children, support parents, and ensure privacy. Despite the complex and often concerning topics, Nathan emphasises optimism and community engagement as driving forces for their advocacy work.


Show script auto-generated by Descript app:

FM – Nathan Seiuli

Simon O’Connor: Hi everyone. Welcome to another Family Matters and, shock horror, I am actually interviewing someone from New Zealand. I am staying in New Zealand for a change. As you’ll know, over recent weeks and months we’ve been all over the globe. But I’m really excited to be talking today to a new, I would say cutting-edge, civil liberties organisation called PILLAR New Zealand.

And so, very excited to introduce, when I find the right button, someone who will not be completely unfamiliar to the Family First crew: it’s Nathan Seiuli. Nathan, welcome to the show. Thanks, Simon. Great to be here. Look, you’re a familiar face for many reasons, including of course that you are one of our wonderful panellists on Straight Talk on a Monday night.

And of course you’ve been involved with a number of organisations, but now you are leading something called PILLAR New Zealand. And look, I think a lot of viewers won’t necessarily have heard of it, simply because it’s only just been organised. And the main thing I know is that PILLAR is actually quite a nice double entendre.

It’s an acronym. So tell us about PILLAR. What’s this all about?

Nathan Seiuli: Yeah, you’re very kind. First of all, thanks for having me. Thank you for the kind introduction. I need to get my wife to watch that, get a bit of respect around here. No. So PILLAR is a new civil liberties organisation that me and a small team launched in September.

It is, like you say, a double entendre. It speaks to restoring the pillars of democracy. We really believe in democracy at PILLAR, but it’s also an acronym for Protecting Individual Life, Liberty and Rights. So yeah, some of you may recognise me from things like Straight Talk, but you may also recognise me from the Free Speech Union, where I spent close to three years working in that space.

And effectively I came to the point where I realised that yes, speech is under threat and defending it is really important, but there’s also a broad scope of issues that I wasn’t able to engage in, that I wasn’t able to advocate for, and the success that we’d had at the FSU I really saw being applicable and probably beneficial in these other arguments as well.

So I decided I’m going to step out on my own, or with a small team, independently and pursue change in more than just the speech area. And it’s been fun, it’s definitely been a steep learning curve, but we are having a lot of fun doing it and we are really hopeful for the future of PILLAR and for the country.

Simon O’Connor: I do love the name. People who follow me, be it the podcast or the live shows, know I enjoy a good pun or a double entendre or a play on words. It’s a small point, but who came up with it? Was this hours and hours of talking around the table, or did someone throw something into Grok or ChatGPT?

How did you come up with it?

Nathan Seiuli: No, we’d been having a lot of conversations as a team in my previous role around what we were trying to achieve, what we were trying to do. And this theme of restoring the foundations kept coming up, and from there it fed into this idea of restoring the pillars of democracy.

To be honest as well, I had about a month off on parental leave. My wife and I just welcomed our third son, and after the two days we spent in hospital, I got home and I thought, what am I going to do for 28 days? Not exactly sure. So that’s where I was left with a bit of time and the seed of an idea.

The acronym came second. Initially we were only using one L. We were talking a lot about liberty. And then someone, I think it was actually Bob, brought it to my attention and said, what about life? And there’s a real tension in there, advocating for life and liberty. It can sometimes seem like a contradiction in some of these arguments, but we really want to walk into these conversations and enter into these discussions while balancing the tension of how we protect individual life, vulnerable life, while also balancing individual liberty. I think these are discussions that need to be had and decisions that need to be made that haven’t necessarily been.

Simon O’Connor: Bob will now want a royalty for that, but we can discuss that later and offline. But jokes aside, I would make the quick observation that life and liberty actually go together.

Life gets messy, your life, my life, everyone’s; things clash and we have our demands. But actually, there’s no freedom without life. It’s one of the most key things. So I love the fact that you’ve got the two Ls there. And by the way, congratulations on number three. I’m amazed you’re awake.

Nathan Seiuli: Look, honestly, people keep saying, how’s sleep going? I get eight hours every night, I’m not going to lie. But my wife might have a different answer to that question, so we’ll let her speak for herself later on.

Simon O’Connor: It was something, even though I’ve got five stepkids, Nathan. The youngest was six when I popped along, which was a coincidence.

But it could also have been good planning to miss those early years. Very good planning. As you say, you obviously have been involved with the Free Speech Union, which by and large does a good job, but quite obviously has a singular focus. So it sounds to me, from what you’ve just described with PILLAR, that you are trying to look across a much broader array of rights that are all interconnected?

Nathan Seiuli: Oh, 100%. So obviously speech was a tightly scoped remit at the FSU, and that’s the secret sauce as well, I would say, of the FSU. They were able to focus and drill in on this one issue, a really important issue and an issue that underpins nearly every other right, really. At PILLAR, we’ve got a slightly broader view, and that’s not to say that we are easily distracted or not focused, but we would include things like speech, thought, inquiry, conscience, belief, religion, and especially this area of the right to privacy, which is becoming more and more controversial.

So things like digital surveillance, digital ID, facial recognition technology, number plate scanning, a lot of these more modern threats to our rights, especially our right to privacy. They all sit just outside the remit of what speech allowed us to engage with before. So yeah, stepping out independently from the FSU, and I should make that pretty clear.

We are independent of the FSU. Like you say, they are very effective and very successful in what they’ve done and what they continue to do. But yeah, having that sense of freedom and independence has really allowed me and the team to think a bit broader and address other issues, foreign interference being a major one as well, and just how we can engage beyond that speech thread, where there are so many things we can touch on and bring to the forefront of the conversation.

So yeah, we’re excited to touch a broader spectrum of issues, but at the end of the day, all falling under that category of what we believe in: civil liberties and human rights.

Simon O’Connor: Yeah. And look, it won’t surprise you that it’s something I agree with, including that whole notion that rights intersect with one another.

And I must admit, I hate using the word intersect these days. It seems to have a different meaning, but yeah, rights don’t sit as individual silos – they interconnect. Or in your case, if I could draw out your name and analogy, it’s the pillars which hold up the important structures of our society.

They might be independent, but together they’re working. So in some ways it’s quite a massive area that you are moving into. And we should have a bit of a debate actually, or a discussion, discussion is a better word, a little bit later on this whole age verification and digital ID issue, particularly around under-16-year-olds.

But in some ways, PILLAR New Zealand sounds to me like it could be exceptionally broad. What are some of the topics you’ve already delved into or are beginning to make comment on?

Nathan Seiuli: You mentioned the under-16 social media ban, obviously. It’s a big one. We are looking at this from two key perspectives, parental rights first and foremost.

I don’t think the state should intervene in how parents parent, but then there are also things like online freedom and privacy; there’s a huge risk for privacy. Foreign interference is another one that we’ve mentioned, with the NZSIS report coming out and saying that foreign interference is definitely happening, to the greatest degree in our history. Other issues around the election will come out and be more apparent. And then things like surveillance. The police are using AI-powered cameras. We don’t know how many, or how much they are being charged to use cameras from a private company, but we’re looking into that issue as well.

So they are all interconnected, like you say, all pillars of democracy. We’re looking a lot at privacy and online privacy at the moment, but also, yeah, foreign interference, and then still those classic cases of freedom of speech, freedom of thought. We are looking into a case with a teacher who was relieved of her duties recently in Dunedin for upholding her beliefs.

There’s just so many things on the table, which is why, when people say, why start something new? Isn’t this a saturated market? The problems don’t seem to be slowing down. The change isn’t happening as fast as we think it could happen. So we just wanted to put more hands out there, many hands make light work, and really try and lift this thing to the level that we think it should be at.

Simon O’Connor: It’s a massively broad area, but for what it’s worth, I do see a lot of these too. You’ll see them clearly, or more clearly than me, because of the work you’re leading, but there is just an array of challenges, some that you could even describe as threats. Certainly the foreign interference space.

I won’t go down that rabbit hole, that’s one of my soapbox issues. But I would just underline what you mention: it is happening, and it’s happening frequently. It’s actually good to see our intelligence services speaking out on it. But as you say, there are even the challenges around expressing one’s beliefs in society. I see supermarkets now are rolling out facial recognition, and my sense on that one, or on what the police are doing, is that we as a society haven’t really had a conversation about these things.

It’s just been foisted on us.

Nathan Seiuli: No, I had a conversation with someone this morning over coffee (I’m in the real world now), and I said the easiest way to describe my job to someone who doesn’t understand it right now is that I look at things that three years ago were conspiracy theories and try to unpack how they’re actually happening right now.

Like the whole digital ID conversation. I have to consistently step back from my computer and wonder, am I connecting real pieces with string here? Are they really filming us and watching us and scanning our faces? There’s extensive reporting being done by a very small number of journalists, especially from RNZ, and just looking at these stories and putting them in context next to each other, it’s hard to deny that our privacy is being infringed on hugely by AI-powered cameras on our roads and in our big retailers. So there hasn’t been a conversation. And I’ll just say this, we don’t have to go deep into it, but there was a request made for more information in the Court of Appeal that was shot down by both the court and the private company.

Big questions to be answered, big questions to be asked. At PILLAR, we are really keen to ask those questions and help lead that conversation. But yeah, I think we’ve taken our privacy for granted in New Zealand. We’ve taken our freedoms for granted in New Zealand. They’re not gone by any means, not by any stretch of the imagination, and they’re hard to kill. They’re hard to stamp out. We want to make sure, though, that people are being equipped with information so that they can stand up for their rights as well.

Simon O’Connor: It’s probably more of a historical comment, and it relates back to, dare I even mention the C word, COVID times, that New Zealand sort of has inherited a tradition of rights.

We’ve never really had to fight for them in the way that many other countries have. And so I think at times, and again, feel free to push back on this, we take them a little bit for granted, because they’ve just always been here in New Zealand, which has always had freedom of speech and association and movement and so on.

But actually they are being consistently challenged, and to the extent that you and PILLAR can gently, not in the violent sense, but gently shake New Zealanders and say, hey, look, there are challenges. So I will ask you to go down a little bit of that rabbit hole on AI and the police, mainly because I actually don’t know much about it. I know broadly what AI is and what police might deploy, but what is it that you are seeing, no pun intended, and what’s the problem?

Nathan Seiuli: So there’s a company called Aura, which a decade ago was founded in conjunction with the police. What they specialise in is AI-powered cameras.

Now, the technology, to public knowledge, is at the moment being used to scan number plates and look up things like: does this car have a warrant? Does it have a WOF? Things like that. But it can also tell where that car has been for up to 60 days. Now, it’s scanning every number plate. It’s not being selective.

This is how they do it: they do large capture and then whittle it away to what they say they need. The issue, though, is that an internal police report revealed that their system is accessible by anybody that has a police login. So eight and a half thousand people. And when you log in to look at information, you don’t have to say why you are there or what information you took.

So it’s different from the regular police database in that sense. If you log into the regular police database, there’ll be a flag attached to that login. It’ll show what information you accessed. You have to say why you’ve accessed it and what you took out. With this database, there are no such safeguards.

So an internal police report and review revealed this. Now, when the police were pushed on this, they said, yeah, we’re absolutely too loose, we need to tighten our controls. A few weeks later, they came back to say how they’d gone about tightening the controls, and they said, we actually haven’t done anything, and we’re probably not going to look into any of the potential misuse and abuse of the system.

Now, all of this is in RNZ articles. This is not hidden or conspiracy; it’s just not widely known. Secondly, in the same report it was referred to multiple times that the company was using facial recognition technology, despite them denying that they had that capacity.

Once those allegations were made, though, a few days later they came out and said, actually, we can now do facial recognition technology, but it’s not called that, it’s called connect the dots. They changed the terminology and said, actually, we can do that. And that’s what we’re seeing rolled out, mainly at Foodstuffs retailers. I think Christchurch just announced three supermarkets. But the Pak’nSave near my house has a big sign in the window that says, when you walk in, you’ll be scanned. Now, it scans every face that walks in, and it deletes you if you’re not in the system, which sounds great.

There’s a safety there, but it also says that there’s a percentage of error where you can be mistakenly identified, and then you stay in the system, right? On the basis of how many scans are being done and the percentage being deleted, that still leaves a huge number of faces in the system, and it means that people are being wrongfully removed from supermarkets.

It also means their biometrics are being stored in a database that is not that secure. And if you look at things like the Discord leak, or the recent Microsoft outage, we know that these systems aren’t as secure as they always claim to be. Even with Google, I think there were over a hundred million login credentials leaked in the UK yesterday. So that’s what we are looking at. That’s what we are trying to get people talking about. Sure, you are in public, but should you really be open to being scanned every time you go into a store or drive your car? And should they know where you’ve been for the last 60 days and all your behavioural habits like this?

Because it’s the first step towards restricting access on the basis of things you might say online or things you might do in private that they know about. So it sounds very Orwellian, it sounds very conspiratorial. I’m still coming to terms with understanding and believing it myself, but it’s on good authority that this is taking place.

Simon O’Connor: You can immediately see the giant reach of it. And particularly at the start, when you talked about what it can do for police and the access level, there’s quite a paradox: it’s a phenomenal amount of information and yet there are very few safeguards. Whereas, as you say, I forget what they call the police’s national intelligence software now, but it’s hard to get into.

Rightly so, and you have to log your way in. I remember, granted, not the police, Nathan, but when I used to work at the Ministry of Social Development, if I ever accessed someone’s records, legitimately, I might add, you always had to leave a note and it was audited. But yeah, I hadn’t appreciated the AI stuff.

I knew of it, but I didn’t realise it was like 60 days. That’s phenomenal.

Nathan Seiuli: The police have also said that it’s producing way more information than they know what to do with. And it’s funny, I made a video about this and then got a phone call from an unknown number. I answered, and it was a journalist from RNZ, who said, how do you know all of this stuff?

I said, I’ve been reading what you’ve been writing, and they said, that’s great, you’re on the right track; also, look into this and this. So there is an effort happening in the media to try and bring this to light. And they were the ones who told me about the rejected request for information that the Court of Appeal put forward.

The Crown and Aura both said no because of the public and private interests in the issue. But that’s the question I’m asking: why don’t the police, or Aura, the private company, or the Crown want it out there? First of all, how many cameras there are, how much police are paying for those cameras, and what those cameras’ capacities and abilities are.

It seems like stuff that should be in our interest, considering they’re being used on citizens. But there’s been a drop in retail crime as a result, and over four months, two cameras generated over $900,000 worth of fines. So there are incentives to not be transparent around this as well. So let’s be really clear: there are questions that need to be asked.

Simon O’Connor: 100%. And part of it too is I don’t think you compromise the intelligence by sharing some of the things you just mentioned. Again, this is high-level surveillance. It’s not tinfoil hat stuff, because we can look at the likes of China, for example, and how its social credit system works. Yeah, it’s a phenomenal amount of information, and it’s great that you guys exist to be chasing this up and saying, hold on a moment. We don’t want, well, maybe we do, but we don’t want all the information on the system. But yeah: where’s it been deployed? How often? How frequently?

What are the oversights? Has the Privacy Commissioner, just as a side note, been involved and engaged in any way in this space?

Nathan Seiuli: Yeah, he’s made multiple comments. He is playing very coy. Paul Goldsmith’s made comments. One of the comments that was really concerning, though, came from the Privacy Commissioner’s office. I believe either one of their legal representatives or the Commissioner said that there are going to be trade-offs with the system, because unless they can enforce it across everyone, they won’t get the full benefits.

Now, with a statement like that, you have to take a step back and say: so you’re admitting that it only works if it’s all-intrusive and all-inclusive. Which, to me, sets the framework for them having an incentive to take it a bit further than what we would probably be comfortable with. I was already hesitant around, like you brought up, COVID, having to scan into every area you went with your COVID pass.

This takes that option off you now. It’s not about you scanning; it’s about you being scanned as you enter. So it’s a pretty crazy concept, but it’s happening. And like you say, in China it happens. Nations around the world are trying to implement it; the UK’s trying to implement it. And then slowly what it does is whittle away your different rights: your right to rent first, your right to work second, your right to purchase third.

So these things can easily be restricted or revoked on the basis of whether you have all the right credentials.

Simon O’Connor: Oh, absolutely. And for me too, there’s the whole argument that some would deploy: if you’ve got nothing to hide, then what have you got to fear? But this is so ubiquitous, so much in every element of your life.

And again, it strikes me that a key part of what PILLAR’s doing is just raising awareness, so Kiwis go, oh, it’s happening. And it’s even happening in the private space. One of my neighbours has a, what do you call it, a licence plate recognition camera. I was coming up my driveway at one point and this faint blue light kicked in.

And I was like, that’s strange. And then I recognised it, I went, oh, I know exactly what this is. Now, it helps him open his gate, but from a privacy standpoint, and I’m not picking on him by the way, it’s clearly on, and yeah, there are implications there, because it’s shared land and so on.

It’s just, yeah, it’s everywhere now.

Nathan Seiuli: Amazing, isn’t it. It’s part of being in the modern era. But the lie is that we have to be sort of complacent about it. We can actually push back; we can actually fight for fundamental rights like privacy. But it does take being part of the conversation. We can’t just sign everything away because we’re being told it makes a safer environment.

That’s what the police said, and I believe someone in government also came forward and said that this technology is essential to creating safer communities. At PILLAR, we would absolutely say that a safe community is great, but a free community is far better. So yeah, that’s what we’re looking at at the moment in that area.

Simon O’Connor: Can I say, I think that’s a great distinction actually. A free society becomes a civil society, and it actually enables a lot, to the point where you don’t need policing. But I’m actually struck by one of my favourite quotes, from a book called A Canticle for Leibowitz, which talks about how the ‘state rightly sought to minimise suffering and maximise security, but in seeking both, they found the opposite’, which may be a little bit mysterious. But for me what it says, and it joins what we were just saying about the police, is this: oh no, we need all these powers, we need to be able to do this and this to keep everyone safe. But actually, in doing so, the opposite happens. We all become nervous and suspicious. Yeah.

So with PILLAR then, is there clearly an educative side and an advocacy side? Is there almost a legal side as well? Just how big is your work and advocacy now?

Nathan Seiuli: Yeah, we’re looking to grow into those spaces. We are set up with the structure prepared, and we’re slowly looking to take on a few cases in defence of individuals, especially around the foreign interference issue.

That’s a really big one, where there are a lot of Kiwis who need our help. And I say Kiwis because they are Kiwis. They’re not Chinese people or Iranian people; they’re Kiwis of that descent, but they need our help.

Simon O’Connor: You spoke out with one of my friends, Portia Mao. Portia’s amazing. Great to see you backing her and supporting her.

Nathan Seiuli: Now, Portia’s story is just the tip of the iceberg. Chinese Kiwis in New Zealand have a right to enjoy all the freedoms afforded by our democracy. They come here, they work hard here, they’ve pursued freedom. It’s just absolutely un-Kiwi, in my view, to let them suffer under the thumb of communism in a democratic nation. So we are absolutely looking to take on some legal cases there. We will continue to do legislative work as well, with the under-16 ban being our first one. But we’ve got a few on the bill for next year as well. We’ll do some education, we’ll do community events.

We’re looking at bringing some speakers over from overseas; got some really exciting names on the list there. And, yeah, community building. This is something that we don’t want to exist in just the philosophical or digital space. We want there to be communities of freedom fighters, of PILLAR people, all across the country.

We’re excited about fostering that. So we call it Stand, Shift, Speak, Strengthen: standing for people who need our help, shifting legislation for a freer community and a freer society, speaking up for those who don’t have a voice, and strengthening our communities. Those are the four areas that we focus on and work in.

Simon O’Connor: So before we wrap up, I’d love to talk with you briefly around the whole age verification issue, because I don’t just think, I know, that your perspective, or PILLAR’s perspective, is slightly different to Family First’s around the under-16 ban, and it’d be nice to tease that out a little. We can go into it deeper another time.

But what ultimately do you hope to achieve? What would be an end goal with all the work that you’re doing?

Nathan Seiuli: Yeah. It’s probably not wise of me to make some sort of utopian statement, especially since I don’t believe in such a thing.

But our goal is that through our work, people would be re-energised. There’d be an optimism that they find about New Zealand, about our nation, and about the freedoms we’re afforded. I talk to so many people who are negative about New Zealand and negative about the country we live in, and there’s a consistent theme.

A lot of them haven’t travelled, haven’t looked outside of our borders, and have this romanticized idea of what exists out there that’s better than what we have. I think that we should be extremely grateful, thankful, and optimistic about the future of New Zealand. I think what we have is one of the greatest countries in the world, if not the greatest.

And I want our work to help people feel that, find that, and promote that in their communities and in their families and in their lives. So if anyone who’s engaging with PILLAR can walk away thinking and feeling one thing, it would be optimism. It would be optimism that’s attached to freedom. It would be optimism that’s attached to responsibility.

It would be optimism that’s attached to community. We just want people who see our work not to feel like this is a doomsday message. Yes, some of the things we are talking about are really serious and should be of concern, but you should walk away feeling like, I know what I’m going to do about it. Not, oh, what am I going to do?

So hopefully everyone who sees what we are doing feels optimistic and feels empowered to make a difference, because it’s people, at the end of the day. It’s not the system that’s going to fix all of these things. It’s people engaging together democratically, and that’s how we can actually start to see New Zealand become what I believe it is: the greatest nation in the world.

And I am absolutely never going to apologise for that. People call me crazy, I don’t care. I think New Zealand’s amazing, and I’ll say that’s the hill I’m going to die on.

Simon O’Connor: Oh, look, good on you. And I’m with you. I think most New Zealanders are too. It’s those who often get hold of the microphone in media and elsewhere, in academia, and I’m going very broad here, who knock our society down.

And as someone who has travelled a lot, I’ve been around to many places in the world, we are very fortunate to live here with the rights and freedoms that we have. And as I was saying to you before we came on the show, here’s an anti-optimism dynamic: I’m often depressed when I’m watching these short videos coming out of America, as these young Americans dish on their society, on how terrible America is.

And you’re going, have you been to Iraq or Afghanistan? Have you been to Tunisia – actually, it’s probably not fair to pick on Tunisia at the moment. Anything we can do, and like PILLAR’s going to do, to try and inculcate that optimism, I think is just brilliant.

Nathan Seiuli: Oh we need it. Honestly. We really need an optimistic generation.

Simon O’Connor: I’m too old and tired. I’ll leave it to the younger ones. No, only kidding! So, speaking of younger ones, tell us, and you’ve touched on two reasons, parental rights and online freedom. PILLAR is concerned, as I understand it, about any moves to try and regulate or restrict access to social media or the internet for under-16-year-olds, which is slightly different to Family First’s position. So talk me through your side.

Nathan Seiuli: So we’re not against any and all moves; we just think that there should be good solutions.

We can’t have solutions that make the problem worse. First and foremost, parents should be the first line of defence when it comes to what their children do online and their access to devices, right? I don’t think the state should take a role in telling parents what they can and can’t do, especially around something as vague as harm.

Now, the issues online are real; they’re realised. Some people will still debate that, but we at PILLAR don’t. We agree there is harm online for children. However, the state coming in with blunt, broad bans is never going to be the answer, because what it does is catch everyone in this practice of verifying your age.

Banning under-16s doesn’t just make under-16-year-olds prove their age; it makes everyone prove they’re over 16. It becomes a sort of exercise in data collection. Now, people have been out there saying they don’t keep the data. How many leaks are we going to see before we stop believing this?

There’s no such thing as a perfectly secure system. The other thing is, the government isn’t even the most effective body to enforce these sorts of safeguards in online spaces. We have community groups, we have churches, we have advocacy groups that are far better equipped to deal with the case-by-case, specific issues faced by certain children. It’s a minority problem. And people are going to say that it’s not, but 500 parents were recently surveyed, and 20% of them said that their children had an unhealthy relationship with social media. Now, 20% is a big number, but it’s still a minority. We can’t make majority rules for minority problems. We need to target them intentionally and bring in actual tools that will make a difference, not just put a band-aid on it and ban them. Because we’ve seen it with COVID, right?

You overregulate, you threaten with fines, you put these restrictions on private companies, and what do they do? They overregulate, they over-censor. Things that are legitimate, not illegal, get removed from online. Information gets controlled, narratives get controlled. The state should have no role, and we’re seeing it with state-controlled media as well.

The state should have no role in changing, challenging and controlling the narratives that come out in these spaces. Yes, we should protect children. Children should be safe. But that should be the parent’s role first and foremost. We’re not saying the harm is not real; we’re just saying there are better options than blunt, broad state bans.

And I would say this to Family First, and to you and Bob: I remember you guys were part of leading an amazing campaign against the conversion therapy ban, for the very reason that it infringed on the rights of parents to lead their children in line with their evolving capacities, right?

This is exactly that in a digital form. So we’re not saying we should just let kids loose online. We would never advocate for kids being online unsupervised. I’ve got children, you have stepchildren, Nick has two girls. Our team understands deeply what it means to look after children and protect children. We do it every day.

We just don’t think the government should have a say in that.

Simon O’Connor: Yeah, there’s some crossover between what you’re saying and Family First’s position. I suppose I’m partly speaking for myself and Family First here; Bob and I have been debating and discussing this at length.

We certainly don’t want to see overarching government control in the space. In fact, we’re highly sceptical of allowing the government anything more than very light-touch regulation, to enable what we would see as more the private sector stepping in and giving the tools to parents. Because you’re right.

This should be a parental issue. Though we’ve been somewhat influenced here by Jonathan Haidt’s research and take. My counter-argument to that 20% would be that some parents get it, a lot don’t, but the harms are still there for kids. So parents do need tools. And I suppose the other side for us is big tech, as we like to call it.

So your Metas and your Googles and Apples do need to step up to the plate and actually take some steps, rather than just dragging children in. So I suppose ultimately for us it’s a child safety issue, and you’re in the same space there, yeah, but we don’t want heavy-handed government.

One example: I was just talking earlier with someone who’s very much involved in the age identification and verification side, who pointed this out, and I’d actually be interested in your take on it. There’s a multiplicity of technologies now. With one, you don’t even require the internet, in so far as your phone can scan your passport, you then use the video on your phone, and it basically matches and says, oh yep, that is Nathan, that is Simon. At no point does it connect to the cloud or the internet. And then, because it’s been, say, your passport or driver’s licence, it says, okay, Simon is over 18. At that moment a short bit of data is fired off to Facebook or wherever saying, yep, this guy’s over 16, and that’s the end of the transaction. I don’t know, is technology like that a way forward we could look at, or have you still got concerns that privacy could be compromised in some way?

Nathan Seiuli: There are absolutely still concerns there, but what you’re speaking to, and I agree with you, is the question of what better solutions than the state look like, right?

Absolutely, at app store level and at device level; that’s something the Australians are exploring at the moment. And I know that some of the submitters, from the likes of Meta, TikTok and Snapchat, came forward and said these would be better options: if the device creators, sellers, and carriers would put parental guards or blocks in at the device level.

They currently don’t, and that would be a far better place to implement parental controls as opposed to state controls. I know Meta uses a third-party platform called Yoti, which is what you’ve described there. But the resounding thought and opinion amongst the experts is that state bans won’t work.

They don’t work. There are way too many workarounds. Black markets pop up. You force people into less regulated, less visible spaces, especially young people; tech-savvy young people will not be kept off the internet. And then what will happen is that tech giants, as you call them, will be fined $50 million, in Australia for example.

So what’s their incentive to not be fined $50 million? What’s the way to avoid a fine? To overregulate. They’ve already said that they’re going to look not just at what people register as their identity or ID; they’re going to look at behaviour. And that should be really concerning.

The expert from TikTok said that they’re going to regulate behaviour, and anyone they deem to be behaving under the age of 25 will have their account restricted and eventually deleted. So this is the thing, and this is what we always have to remember with legislation and with some of these things we put into law.

You might implement it now and it’s fine, but what are you going to do if the person you don’t vote for gets into power and that tool is sitting there, waiting for them to brandish and use however they see fit? You have to think through things like this, because if we bring it in now and it works for us, it might not work tomorrow, or next year when the election happens. It can be used any way you want.

Simon O’Connor: I’m with you. Having been in Parliament, it was always one of the things in my mind whenever we passed something: as you articulately put it, group X takes it in one direction, and then the next group comes in and takes it another. It might be a strange analogy, but I remember the Treaty Principles Bill, controversial as it was. One of the things I was saying to those who supported it was:

okay, great, you like ACT putting through this law to define the principles. But once it’s done, why wouldn’t the next group come in and redefine it? You’ve created a very strange dynamic. But how would you apply what you’ve just said? And again, for viewers to understand, this is a good-natured discussion and debate.

But we do regulate access to alcohol. We fence swimming pools. We tell people not to speed on the roads. We still have drug regulations to try and protect people, despite the fact we know that kids get over fences, sneak alcohol, find drugs, and so on and so forth.

To put it slightly crudely, do we just need to give up on all that as well?

Nathan Seiuli: Key word there: we regulate, but do we stop it? I couldn’t imagine the number of under-18-year-olds who are drinking in New Zealand right now. The car analogy is an interesting one as well.

I’ve had it brought up multiple times: seat belts and speed limits are there for safety. But this is not comparable at all. This is like getting in your car and having to tell it where you want to go, how long you want to be there, and what you’re going to do there before it even turns on. That’s what this is.

It’s not the same as having restrictions in place. This is a barrier to entry and participation that everyone will have to go through; everyone’s going to have to prove their identity and their age. It removes pseudonymity and anonymity, things that have protected vulnerable speakers online. So it’s not exactly the same as regulating and creating a safer environment.

What it effectively creates is a surveillance environment. Yeah, if you can find me a good example where drinking ceased for under-18s when an under-18 ban came in, maybe you’ll win me over. But from personal experience, I definitely was not waiting until I was 18.

Simon O’Connor: Just need to remind you, this is a live podcast…

Nathan Seiuli: hypothetically. allegedly

Simon O’Connor: Yeah. One of the things that’s captured me, well, I suppose for me it’s twofold. One is just what does seem to be comprehensive harm to children. Look, I’m with you. I’m the first person to get upset and worried about the word ‘harm’, because it gets stretched and twisted. And again, putting my cards on the table.

For Family First, I think again of Jonathan Haidt’s research. One of his conclusions, Nathan, is basically that we need a collective response. Which is why, even as someone who leans heavily into parental rights, I’d go: if parent X is doing something and parent Y isn’t, the children are still going to be impacted and affected without a collective response. Which might sound strange coming from a former centre-right politician.

But it’s the area of the app stores that has particularly caught my attention, for two reasons. One, pretty much all apps are accessed through only two stores, Google’s and Apple’s, and needing an age affirmation there could work. But the real kicker is consent: that parents need to give consent.

Because one thing that strikes me, and I don’t know about you: when I sign up for an app and I get pages and pages of eight-point-font conditions, I just go to the bottom and click agree. Young people can’t meaningfully consent to these things. The sort of approach that the likes of Utah, Texas and others are taking keeps the parent in the loop.

Is there something that we should explore there from your perspective?

Nathan Seiuli: Absolutely, yeah, you’re right there. There’s a monopoly there with Apple and Google, and they do need to implement better controls for parents. I have explored the Apple parental controls, with a dummy account that I made for my six-year-old.

Not that he has a device or a phone or anything he could use it on, but I just wanted to see what the controls were, and they could be improved a lot. I just keep coming back to the thought, though, that bringing in state bans and state enforcement to replace parental rights, parental responsibility and oversight is just never going to work.

So no, Apple and Google, and this is exactly what they’re talking about in Australia, need to do a better job at the device and app store level, and that’s what every privacy expert I’ve seen speak publicly on the issue has said. I also just want to say, you’ve mentioned Jonathan Haidt a few times, and I think he’s done fantastic work in this space. He doesn’t view it so much as an addiction to social media but as a social participation thing, right? He says if you can get a third of kids off, you’re likely to get the whole group off; they just want to be there because their friends are there. And he says in an ideal world the state would come in and achieve this.

But he also acknowledges that the state is not the right actor to do so. So I want to encourage people who love Jonathan Haidt: I’m on your team. I’m team Jonathan. It’d be great to get him to New Zealand to do a talk about this. But even he is realistic about the limitations, and the threats, that having the state enforce these things brings.

So we can’t just say “Jonathan Haidt said this and this.” His research is amazing; he showed that there’s an argument there. But his solution isn’t one-dimensional either.

Simon O’Connor: No, agreed. And I have to say, I’m probably more aligned to what you’re saying, in that I don’t want to see big government here at all, or overly heavy government regulation.

I suspect there will need to be some, though. Where the tipping point comes for me, because I can understand a lot of the debate and argument, is, I suppose for us at Family First, child safety: the need to actually safeguard, for want of a better term, in the way we perceive it.

Because no one here is saying we shouldn’t safeguard children. But it tips the balance a bit, to go: okay, we actually need to tool up parents. I suppose for me it’s very light government regulation, which puts a little bit of pressure on these big tech companies to do something, primarily through the private sector, though I understand the tensions there too. We should come back to this another day, because literally we could do a full podcast on this, but I thought it would be good to tease out a few ideas.

Nathan Seiuli: If I could add one thing: the Bill is out there.

It’s only seven pages. We’re going to release a sort of deep dive on it soon. The Bill, in its opening statements, talks about encouraging education and encouraging resources that educate parents, and then later on proposes no such thing, nor points in the direction of any education.

So more than anything, this is lazy legislation, and we have to be really cautious of that as well. This is just virtue signalling; we’re trying to follow other countries, and the work hasn’t been done. We’ve been trying to talk to the Minister for a while now. She’s cancelled two meetings with us, so we’re looking forward to talking to her soon, hopefully, and we’ll get some answers.

Simon O’Connor: Yeah, look, for what it’s worth, I’m pleased the Bill’s been drawn, but only because it’s enabling more conversation, or it should do. I actually think the Bill is exceptionally light in some ways. To me it’s a classic Member’s Bill: it just flags an issue.

It reminds me of years ago, and look, there have been multiple Members’ Bills like this, but I remember one around organ donation when I was chairing the health committee. The idea in the Bill was great, but we took around nine months to a year basically doing all the heavy lifting and drafting, because it really was just a couple of pages of nice statements.

And I think unfortunately Catherine’s social media one is exactly the same. It’s somewhat modelled on Australia’s, but it’s very light on detail, and if it passed as-is it would be a mess, to be honest. But to the extent it starts a conversation, fine, although I do feel sorry for the MPs on the select committee, because they’ll have to do all the heavy lifting and they’ll get absolutely no credit for it whatsoever. And the other thing, and I don’t mind saying this while we’re on the podcast: we’ve actually struggled to engage with her or the Minister, Erica Stanford, who’s been meant to look into this issue. We had Jonathan Haidt last year at our Forum on the Family, granted via Zoom, but he spoke with us. We had Dawn Hawkins here recently, another expert. There’s a whole number of other contacts we have been offering. Family First has been saying to the government and to the MPs: we can have these people talk to you, we don’t even need to be there. They just don’t seem to want to listen, and I don’t know about you, but that’s a big red flag for me in this space in particular.

Nathan Seiuli: Oh, absolutely. And if I could just quickly say: I went to the Forum on the Family this year, and it was incredible. If you haven’t gone before, you should go next year. Family First, you guys do an incredible job.

I sat there during the intro of the conference, watching a highlight reel of everything Family First has done for New Zealand over the last two decades, and I thought, why am I even starting a new organisation? You’re already doing it all. But then I thought, no, I’m encouraged; I want to be even a fraction as successful as you guys are. Good on you. Like you say, we’re aligned on the core fundamental principle, and this is what I’d say across all advocacy groups: we all want to protect children. There’s no disagreement there, but the solution is really important.

We need to make sure we get the solution right. That’s where I’d say we’re a bit of an outlier. I’ve gotten some cold looks and some cold shoulders from different people in the space. I’m a big boy, I’m happy to handle it, but I just want people to know that at PILLAR, yes, we believe in protecting children.

The solution is really important, though, because if we don’t do this well now, it’s only going to come back and get us in the future.

Simon O’Connor: Yeah, and arguably it will not help the children, and it’ll have repercussions in the adult space. But this, again, is why I think it’s good having multiple groups. You’ve obviously got Family First; Bob’s been around for about 20 years. PILLAR’s just started, and there are the likes of Ethos Alliance and others. There’s no shortage. I myself think it’s better to have multiple groups, and points of disagreement, because as we discuss and debate, obviously here on a podcast, we’ll also be discussing a lot offline.

Hopefully it’ll hone our thinking. So, for people who want to get hold of PILLAR and check out your work: you’re young guys, you’re media savvy. What are your website and socials? Where can people find you?

Nathan Seiuli: Yeah, so we’re on every social platform. Our handle is PILLAR NZ, and we have a website, www.nzpillar.com, where you can sign up to our mailing list.

We send out updates and information every week. We’re hosting a petition at the moment on the under-16 ban, so make sure you sign that. But if you want to engage with us, probably the best place is social media, at PILLAR NZ. So we’ll see you there. If you write us a comment, we’ll write back.

Well, I’ll make Arian write back, and it’ll be a good time.

Simon O’Connor: I have to say, I’ve only just signed up to your newsletter, so I encourage people to do that. I’m also loving your social media game already, and I know you guys have started your own podcast. So look, Nathan, I’m incredibly excited that PILLAR New Zealand is up and running.

I just love the breadth of your work, and a big thank you for taking the time to come and talk with me and all the viewers today.

Nathan Seiuli: Oh, thanks for having me. And thanks to all the viewers.
