PODCAST

THEOS Cybernova: The Cybersecurity Podcast for APAC Leaders

THEOS Cybernova delivers expert cybersecurity insights for business and security leaders in APAC. Hosted by THEOS Cyber CEO Paul Jackson, each episode dives into real incidents, strategic responses, and the evolving role of cyber leadership.

Episode Summary

Disclaimer: This episode discusses child sexual abuse material (CSAM) and includes references that some listeners may find distressing.

For Mick Moran, child sexual abuse material (CSAM) is not just a law enforcement issue; it is a cybersecurity blind spot. As CEO of the Irish Internet Hotline and a former INTERPOL Assistant Director, he argues that every CISO needs to know how to detect CSAM, what to do if it surfaces, and how to protect both staff and reputation.

Through wargames at the Council of Europe, Mick shows how easily organisations falter without a CSAM policy: HR rushing to dismiss, legal silenced by uncertainty, and security teams exposing staff to trauma. He connects these lessons to Asia Pacific, where remote abuse and sextortion networks highlight the urgent need for corporate readiness.

This is not a topic widely discussed in cybersecurity circles, but it is one every CISO must factor into their playbook. Detection, wargaming, reporting, and welfare cannot be ignored.

About the Guest

Mick Moran is the Chief Executive Officer of the Irish Internet Hotline, Ireland’s national reporting service for illegal and harmful content online, including child sexual abuse material (CSAM), intimate image abuse, and other forms of online exploitation. It is an independent not-for-profit, operating with the support of its members, the EU, and the Irish Department of Justice.

Mick brings more than 30 years of frontline and international experience combating online crimes against children. A former officer with An Garda Síochána, he was seconded to INTERPOL in Lyon as Assistant Director for the Vulnerable Communities sub-directorate, where he built and led global teams addressing online child exploitation and human trafficking. He also served as Garda Liaison Officer to the Irish Embassy in Paris and in intelligence at Garda HQ in Dublin.

Alongside his leadership role, Mick lectures at University College Dublin, sharing his expertise in cybercrime, forensic computing, and child exploitation. His career has been defined by a commitment to protecting vulnerable individuals from online harm, and his leadership at the Irish Internet Hotline is strengthening Ireland’s role in global online safety.

 


Mick Moran
Chief Executive Officer
Irish Internet Hotline


Paul Jackson: Welcome to another episode of the THEOS Cybernova podcast with me, your host, Paul Jackson. I’m proud today to have the legendary Mick Moran with me for another episode in season two of the podcast. Mick, thanks for joining us today on the show.

Mick Moran: Hey, thanks very much, Paul. Legendary? I think I’m legendary in my own lunchtime, a bit like yourself. Truth be told, we are two self-proclaimed legends here in this recording box.

Paul Jackson: Yes, indeed we are. We are in this little recording box in the fabulous venue that is the Council of Europe in Strasbourg.

Mick Moran: That’s right. We’re in the Council of Europe, and the Council of Europe, to be fair to them, were one of the first outfits really to pay attention to cybercrime.

They have a convention that goes way back, I think to 1999 or 2000. All through the late 90s they were doing a lot of work on cybercrime, and then they have their Budapest Convention. And from my perspective, my expertise is in the whole child exploitation online area.

My interest in the Budapest Convention is that it’s the only convention that mentions CSAM, or child pornography, as a cybercrime. Other people tend to think that you can’t have something that’s content labelled a cybercrime, whereas I would say no, it’s not the content that’s the problem, it’s the transferring of it on networks, the moving of it.

Paul Jackson: Indeed. And we are going to drill into that. But first I’m going to kick off with a very startling statistic, because you gave an excellent presentation yesterday, an excellent workshop, and one of the metrics you gave there shows the connection between the private sector, if you like, and the exploitation of children, because most listeners are probably thinking, well, what’s it got to do with cybersecurity?

But we are going to cover that. This metric is pretty staggering, and it comes from an organisation called NetClean. They state that 1 in 500 business machines contain CSAM, as we call it. Now, this is pretty staggering. But let’s start by explaining: what is CSAM?

Mick Moran: Okay. So CSAM is an acronym that stands for child sexual abuse material.

Now that’s also known as child pornography. In most of the laws and in most of the conventions, like the aforementioned Budapest Convention and the EU directive on child abuse online, they refer to it as child pornography, and in fact in laws even in Japan and in China it’s referred to as child pornography.

So I have no problem with the term. I think Thailand recently made child pornography illegal, and I know they’ve been working hard in the UAE, for example, to address the issue there. So it’s child pornography. But because you can’t have child pornography without having child abuse, we started pushing a few years ago to use the acronym CSAM instead of child pornography, because pornography gives you the impression that it is the same type of pornography that you would see on, you know, RedTube or any of the commercial pornography websites, or on OnlyFans or somewhere like that.

Whereas this is not pornography in that sense, because it involves the actual abuse of children. It’s a recording of the actual abuse of children. And when I say children, I’m not talking about young people, I’m talking about children. The vast majority of CSAM that’s out there is, in fact, prepubescent children. The age distribution is a classic bell curve: it starts at zero, peaks at around 12 years of age, and falls away between 12 and 18. Of course, it’s not a child any more once the person is over 18. But the point being, and it’s an important point to make, the vast majority of it peaks at 12.

But there’s an awful amount of material out there of children of seven years of age, six years of age, and a lot of movies and images involving pre-speech children, so under three years of age, under two years of age. There is an awful amount of stuff like that out there.

Paul Jackson: That’s incredibly shocking. I’m sure many of our listeners will be shocked by these metrics, and I should perhaps have warned them at the beginning that they may find some of this a little upsetting. But we have to face reality and the kind of work you do. I mean, I am in awe of the work that you do.

And also NCMEC, the National Center for Missing & Exploited Children, our friends over there who do great work, but it’s not highlighted very often in the context of cybersecurity. Now here at THEOS, we do loads of cybersecurity drills, tabletop exercises for ransomware, for other kinds of breaches, for DDoS attacks. Nobody ever does them for incidents where CSAM is found on their systems: how it got there, who might be responsible, what the impact is in terms of how many people within the organisation might be involved. It’s actually potentially extremely damaging, reputationally and, obviously, morally.

Mick Moran: Yeah, there’s a moral aspect, but I think more importantly, one of the things that I highlight when I do that exercise is that unless you have policy in place, you’re running around panicking, right? You saw yesterday how we split up into three groups: the HR group, the legal and compliance group, and the CISO group.

And we saw very clearly that none of them was talking to each other. They all got the same problem, and they were all off thinking about it in their independent silos. So straight away you could see that was a problem. Now, another place where CSAM differs from ordinary pornography is that most companies will have pornography covered in their acceptable use policy, or they’ll make it very clear to their end users, whether they’re getting company devices or bringing in their own.

And one of the main things is you don’t use it. Look, if you’re working for me, you’re working between 9 and 5, and you shouldn’t be browsing pornography between 9 and 5. And if we find pornography on your system, it’s a disciplinary offence, an HR matter. It’s certainly not illegal. The problem with CSAM, from a cybersecurity perspective, is that it is out there.

You rightly cited that figure of 1 in 500 machines, but it’s out there, and it attracts special attention. The reason it attracts special attention is that it’s illegal in most countries. Right? That’s one reason it attracts attention. So now you have an immediate legal and compliance problem: you have, in effect, a serious criminal operating in your company.

If you find this on your system, somebody has put it there. Who put it there? And you rightly pulled out the risk of reputational damage. But here’s another one that companies don’t think about: you are exposing your staff to very traumatic images or videos.

Right. Your staff are getting exposed to that because you didn’t care about it, because you didn’t deal with it as a threat, because you didn’t have signatures in your system that would flag this up as an indicator of compromise.

Paul Jackson: So what can a company do? I mean, how would you detect it?

Mick Moran: Now, let’s get there, let’s get there in a second. But just let me finish this point, because it’s an important point when you’re dealing with these videos. Let me give you an example of a video. And again, a warning to listeners that this could be upsetting and could be triggering to some of your audience.

So please skip forward ten seconds if you’re of a sensitive nature. There is some material out there that is actively shared online that involves the rape of babies, including the screaming, including the soundtrack that goes with it. Now, if you’re not warning people, then a staff member working on your helpdesk, or the secretary of an executive who has this on his machine, gets exposed to this.

And then in the post-incident review, you have half a dozen people in a room watching it on a big screen, and you’re exposing them to it. You are dealing with material that can be traumatising to people, and that is another big risk involved. So part of our action plan in relation to dealing with it is the complete forensic quarantine of the material, only exposing those for whom it’s absolutely necessary, and even then only after they’ve been trained properly in relation to it.

Paul Jackson: All right. So before we go into the detection of this, I want to keep touching on the trauma side, because you raise an important point. We’re both experienced former police officers, and sadly we’ve both had to deal with this kind of material; we’ve seen how horrific it can be and the impact it has. The audience may be quite curious: psychologically, how do police officers deal with this? And from a leadership point of view, how do you help, or try to identify when officers may be suffering from trauma through repeated viewing?

Mick Moran: So, I’m the CEO of Hotline.ie, the Irish Internet Hotline. We manage CSAM reports from members of the public, and we are a member of the INHOPE network, so we manage notice and takedown for sites and URLs all around the world. And I have analysts there who are trained primarily as internet analysts and then secondarily as CSAM analysts.

And they understand fully. One of the key parts of their work is the whole welfare package that we are building around them. Now, when I was at INTERPOL I was the Assistant Director of the Vulnerable Communities section, and there I was dealing with trafficking in human beings, with people smuggling, and with online child exploitation. And as you know, INTERPOL houses the International Child Sexual Exploitation (ICSE) database, which is basically a registry, a record, of all of the CSAM that’s floating around out there on the internet, and that has many uses. And one of the key things you have to build around a team in this area is welfare.

Now, when I started this game back in 1997, there was none of that. Nobody thought about welfare. But I mean, I laugh, and people say to me, oh, how can you do this? And I just say, look, as a police officer you deal with fatal traffic accidents, you deal with suicides, you deal with the dark side of life in general.

You never know from one end of the day to the other what you’re going to be facing when you go out there. At least when you’re working in this area, you’re dealing with CSAM all the time, and therefore you can prepare yourself in advance. You can step away from the computer and switch it off if you’re feeling a little bit overwhelmed.

And of course, then you have welfare. So my team at the minute at the Irish Internet Hotline have obligatory visits to a supervising psychologist, who has a chat with them once every quarter, and they’re free to go at any other time they want to. They call it vicarious stress or vicarious trauma.

In other words, an analyst can start to take on board the trauma that the child is experiencing in the image or movie, and once that starts to happen, I think the analyst can become unwell very quickly if they’re not careful. And you see, you add that vicarious trauma, that vicarious stress, to just the regular stress of life in general. So if you’ve had a bad day, maybe your significant other has been giving you some shit that morning, then you’re sitting in traffic on the way into work, and then you get into work, you sit down with all that stress on your shoulders, and then you sit down in front of a screen full of stress.

Paul Jackson: Yeah. Not going to be a good day.

Mick Moran: Not going to be a good day.

Paul Jackson: Yeah. So I’m going to ask quite a sensitive question now, a difficult question. As a police officer, you’re obviously going to decide who does these kinds of jobs, because categorising CSAM is a laborious job; many cases may have thousands of images, and each one needs to be categorised. How do you prevent somebody who actually thinks, well, I want to view these images, and the best way of viewing them without getting arrested is to become a police officer and view them there?

Mick Moran: Yeah. That’s something that we manage as a risk within this environment. We manage it through a selection process, obviously, as an addition to the recruitment process, and beyond that we try to spot it. Not easy, not easy. But look, that’s the world of risk we live in, you know, cybersecurity and working online.

Our everyday life is risk. So we identify it as a risk and flag it within the recruitment process. And then, as I said, there’s a supervising psychologist there, and the supervising psychologist’s job is behaviour, so we would like to think that they might spot it. But ultimately, you can’t know. You can’t know, and that’s just the way it is.

Paul Jackson: Okay. So let’s go back to the CISO, because that’s primarily who our audience is here. How do they implement a program whereby they can detect, or perhaps be on the lookout for, this kind of material?

Mick Moran: Okay, so first and foremost, I think the CISO is the perfect person. And that’s why I’m really thankful to you and to THEOS for allowing this podcast to be made, and I’m very thankful to the Underground Economy conference and to Team Cymru, who run it here. I’m very, very happy to talk to CISOs and to explain to them that, first and foremost, it’s a nice segue when we talk about risk, because that’s a CISO’s world, isn’t it?

It is a CISO’s world. It’s all about risk. And if he looks at his blinking lights of stress, he just wants to know which one is blinking brightest. Right? The problem with CSAM is that most CISOs don’t have a light on their dashboard for CSAM. They’ve never heard of it.

They’re not interested in it. They think it’s just some sort of child welfare thing that “ladies who lunch” worry about, nothing to do with them. Well, unfortunately, as the statistic from NetClean says, one in 500 machines means it has something to do with you. And because it involves children, it’s, as the French say, en chaîne: it’s connected to, linked to, many, many other compliance issues.

So when, for example, you talk about GDPR: in GDPR there are special categories of data, special categories of PII. And let’s face it, one of the most dangerous, shall we say one of the most mined, areas of data is data relating to children. So that’s another reason it’s differentiated from normal pornography.

It’s another reason why your in-house policy in relation to pornography is not good enough, because there are other factors that link into it. I’m based in Ireland, obviously, and I’m the CEO of the Irish Internet Hotline; if anyone’s interested in knowing more about the hotline, you’ll find it at hotline.ie. But in Ireland, we have two pieces of law that are related to CSAM.

One of them is the Child Trafficking and Pornography Act, where it is defined, and from which the Gardaí, our police service, draw their powers in relation to it: warrants, powers of arrest, all that sort of thing. And then down at the bottom of the Act, in Section 9, there is a provision that basically makes the body corporate responsible where they have CSAM on their system and they know that they have it on their system.

Here’s the interesting line: or are reckless as to whether they have CSAM on their system. They can be held responsible for it under the Act. And it’s a no-fault thing. So, in other words, the CISO might not know about it, but he might still be standing in front of a court facing a criminal charge for having CSAM on his system.

Paul Jackson: Yes, the reckless. Yeah, it’s an interesting angle, isn’t it? And, obviously, we won’t go down that legal rabbit hole.

Mick Moran: No.

Paul Jackson: Neither of us are lawyers, but we know from experience.

Mick Moran: Yeah. And here’s the second piece of law that this attracts, right? It’s the Criminal Justice Act relating to the disclosure of sexual abuse of children.

So in other words, if you are a CISO in a company, you find some of this and you treat it as pornography, then you’re likely just to delete it and pass it over to HR to make it an HR problem. Well, here’s why you need a CSAM policy. If you delete this material and you don’t inform the authorities, you have a problem, because don’t forget that at their base these are pictures of a rape scene.

Basically, child sexual abuse. So if it is a picture of the rape of a child, then that is a serious crime happening to that child. You don’t know who that child is. That child might be from Canada, the child might be from the UK, but it might equally be your employee’s daughter. So it’s very important to remember that you are obliged under the Act in Ireland to report this to the authorities.

Paul Jackson: I think that’s the case in many…

Mick Moran: Probably, probably. It probably is.

Paul Jackson: Obviously, we have an Asia focus here, and I’ll come to it in a bit, because I spend a lot of time in the Philippines, where a lot of our employees at THEOS are based. Let’s come to that later, because that’s…

Mick Moran: We can talk about the Asia angle because there are angles there in Asia.

Paul Jackson: 100%. But I want to go back to the exercise that you ran yesterday here at the Council of Europe, at the amazing Underground Economy conference, which I do hope more of our listeners in the private sector in Asia Pacific would consider attending next year. It’s absolutely priceless in terms of the content and the people you meet, and you get to meet Mick.

Mick Moran: Of course, if I’m invited back next year, who knows, you know.

Paul Jackson: But, in all seriousness, the exercise threw up some interesting points for me. Let me recap the scenario. You had a helpdesk and a computer that wasn’t functioning properly, so helpdesk or IT support were fixing it, and whilst they were fixing it, they happened to notice these images of CSAM.

A simple scenario that could happen to any company. But then, of course, it was split into groups, and immediately people think, well, what should I do about the employee who owned it? It’s kind of guilty until proven innocent rather than innocent until proven guilty, because there are all sorts of circumstances that could have led to those images being on the machine.

And there’s a lot to think about. How do you act in regard to that employee? It’s a serious allegation. What do you do? Where do you take this? And it threw up so much confusion with the audience, and I could tell that this has never been done as a tabletop exercise in any of the organisations that were present yesterday.

Mick Moran: Yeah, yeah. And that’s it. So basically, as you say, the helpdesk scenario, and then I split the people up into groups. I did the old one, two, three, counting people off around the room, which split up the bodies, and that was very useful.

So one was the legal and compliance team, two was the HR team, and three was the CISO team. Then they all went into different corners and talked about it for a few minutes, and they only got about seven or eight minutes, because the last line in the scenario was: in ten minutes you have a meeting with the dragon CEO.

And she’s not happy about this. She’s going to have her poker face on, and you’re going to have to explain to her what the situation is, what you know about it, and, more importantly, what you are going to do about it. So, I mean, HR wanted to sack them immediately.

You know, the CISO was panicking about not exposing too many of his staff, and I thought that was very useful, not exposing too many staff. And one of the people there, who was a CISO, was very interesting, because she just kept bringing it home: this guy is innocent.

We don’t know how this stuff got there, but most people start with: it’s theirs, it’s his. That’s not necessarily the reality, and it absolutely cannot be assumed. And that’s why I say you need a policy, you need to wargame it, and everyone needs to understand their role in it, because at that point you cannot know how it got there.

And I thought that she was being very clever, because she kept bringing this up. Every time they came to a solution, she’d say: “Yes, but…” Because none of them were thinking, and this, I think, should have been the CISO’s thinking on it: the CISO shouldn’t have even talked to HR or legal and compliance until he or she knew what they were actually dealing with.

Paul Jackson: Agree. Right. Agree.

Mick Moran: Because it might be that everybody went into a tizzy based on what the guy on the helpdesk, who may not be the brightest sandwich in the picnic, right, or chip off the old block, says he saw. They’re all basing everything on that. So what are you doing? Are you triggering this whole internal panic based on what the helpdesk guy saw?

But has anyone confirmed what he saw? Is anyone satisfied that this is CSAM? Or did he just see some nudity that he thinks looks young? Is this actually confirmed CSAM?

Paul Jackson: Agree. Yeah. You know, I’ve seen this time and again where people run off on the assumption that the junior guy is right. Wow. Yeah. Mistakes can be easily made.

Mick Moran: So look, the CISOs all got into: “Do we get… let’s take an image of the machine. Let’s find out what else the machine is connected to. Let’s run down some of the leads that you can get. What else have we got around the image?”

So if it’s an image or a movie, okay, it’s one file. So what else have we got? Have we got thousands of these files on our system? Are they on our network? Are they on a share? Are they in the cloud? All of those are very important aspects, not just from a legal perspective, but also in terms of how you react.

And what are the next steps the company will take? So again, wargaming, policy, and training are so important for people to understand how to deal with the incident, and to map it out properly before you pull the lever that says “CSAM policy”. Get HR involved, get whoever involved, but at least have some knowledge of what you’re dealing with first.

Paul Jackson: Right. So, let me put your INTERPOL hat back on, and you know, I know you spent many happy years in Lyon. Yeah, at INTERPOL, where I met you many times at various conferences. But we won’t go into that. But obviously in our part of the world. So this podcast, THEOS, etc., we’re focused on the Asia-Pacific region.

And sadly, a lot of this content does originate from our part of the world. A lot of the abuse does originate—oh, you’re shaking your head.

Mick Moran: No, no, no, no, no. There is some, okay. If you were to talk about online: the umbrella phrase we use is online child sexual exploitation and abuse.

So some people say OCSA, but I just say it’s O-C-S-E-A, okay: online child sexual exploitation and abuse. If you’re going to talk to me about the Asia-Pacific region, CSAM is not the first thing that comes to mind, one of the main reasons being that a lot of the countries over there haven’t yet got law that makes it illegal.

So that’s one reason why CSAM is not the primary concern over there. The primary concern from certain countries in that region is sexual extortion of kids in the West, and sexual exploitation of kids in real life, which actually involves travelling sex offenders. And linked to the travelling sex offenders angle are people who travel from the West to the East to abuse children.

And unfortunately, they get access to children in certain environments in the East. In certain countries in the East there’s also remote sexual abuse. Remote sexual abuse is an interesting one, and it’s one that unfortunately happens in the Philippines a lot; there have been newspaper articles and everything else about it.

So it’s not just me saying it, I want to make that clear. I’m not dissing the Philippines in any way, or Filipino people in any way. But there is a certain type of exploitation that’s facilitated by the internet, which is called remote child sexual abuse. Basically, a Western sex offender, or a person with a sexual interest in children, makes contact with a mamasan, a madame if you like, in a poorer region of the world.

And often this is the Philippines, and the child is basically abused to order on webcam.

Paul Jackson: Yeah. That’s horrific.

Mick Moran: Yeah, it does sound pretty horrific. But it’s a very common practice in the Philippines, and it’s an unfortunate one. Law enforcement do their best. I know the French do a lot of work in this area, the English do a lot of work in this area, the Americans do a lot of work in this area, because it’s their citizens who are the clients, the consumers of it. And they’re paying like €30 or $30, wiring the money in some way, Western Union or whatever. And to be fair to Western Union and PayPal, I know that they are aware of this and they deal with it to some degree.

Another thing that comes out of Asia, of course, is sexual extortion. So there are certain countries in Southeast Asia, for example, where they are grooming kids and adults online until they engage in some sort of sexual activity on the webcam, and then they threaten them.

Paul Jackson: Yeah. Well, we had an excellent presentation yesterday. I’m not going to name the company, but obviously we’re under NDA here. But it’s reassuring to hear that the social media companies are taking this kind of exploitation seriously. You’re shaking your head.

Mick Moran: I’m not shaking my head. I’m kind of nodding, kind of doing that “yeah, all right”. They could do more, right? And they are doing loads, that’s true. And when you talked earlier on about the welfare of police officers dealing with CSAM, it’s very important to remember my staff in hotlines all around the world; there are 54 hotlines under the INHOPE umbrella.

But also trust and safety people. You know, in all these big companies the trust and safety people are there, and they are some genuinely good people. They’re doing really hard work, constantly, to deal with bad behaviour on their platforms, and they don’t get the credit for it. We will go after the head of Meta, and he gets his head slapped in some congressional hearing in the United States of America, but nobody talks about the trust and safety people.

And, you know, trust and safety itself has come in for some significant flak recently, with the change of attitude in the States and the, shall we call it, change of direction in relation to certain politically linked aspects.

But don’t forget that behind that flak you’re firing, there are trust and safety professionals. They’re doing great work, keeping the platforms you’re using every day spotlessly clean of the types of stuff like CSAM, for example.

Paul Jackson: Yeah, it’s good to hear. And, you know, even though the Underground Economy conference is focused on financial crime, on organised crime that is financially motivated, and that’s the primary reason we’re here, let’s face it: the world you live in, the murky world, I have to say, is also financially motivated. A lot of these images are made to order, for financial profit.

Mick Moran: Well, I’m going to shake my head at you again. Okay. The financial crime and organised crime angle of this is absolutely the sexual extortion that we’ve just talked about: the extortion of money from people using their sexual activity as a threat. Southeast Asia is good at that. Africa is very good at that.

So that’s certainly one aspect of the financial side of it. There is CSAM available on the darknet for payment, so you can pay a bitcoin, or a bit of a bitcoin, and get access to some stuff, and sometimes the people who are producing it will charge for it. But the vast majority of CSAM that’s circulating out there is circulating like for like. It’s aficionados sharing, you know, like baseball cards. And yeah, it is awful.

It is awful. It’s wrong. And people collect it, and they are always after the next “best”, the next images or movies in a series, and series are constructed generally around a victim. And so, certainly during my 10 or 11 years at INTERPOL, I actually watched certain children growing up in abuse videos.

Paul Jackson: Oh that’s horrible.

Mick Moran: Yeah. Do you know what I mean? I’d see her, and now she’s seven, now she’s… When I saw her first, she was four, and now she’s 14. And when I was leaving INTERPOL, we still hadn’t found her. We don’t know who she is, and she’s still being abused every day.

Yeah. So that’s sad. But all in all, I’m not here to shock, and I’m not here to moralise. I’m here to make the point that this is a cybersecurity threat. It’s an insider threat that maps very well onto any of the threat models that are out there, the Gartner or NIST threat models; it absolutely maps onto them very, very well. And so if you want to talk about preparation for an incident: having your policy, having wargamed it a little bit like we did yesterday, and then the detection. Right. How are you going to detect it?

And there are ways. There are commercial ways; NetClean have it, and Cloudflare, for example, on your website. Have you got it switched on in Cloudflare? It’s a setting on Cloudflare; if your website sits behind Cloudflare, it’s a setting. And if your hosting company is not providing this type of scanning, especially if people are uploading files to your network for whatever reason, you absolutely should have it switched on.

And then how else do you detect it within your own network? Well, you need to have the signatures, the hash values. They’ve got to be in your SIEM, and you’ve got to be getting a red flag when one of these triggers. And once it triggers, then you’re putting your policy in place. How have you wargamed this? How does it fit your NIST-style incident response process? What is your information security officer going to do? What are the next steps to take? Have you got somebody in-house to do forensics? And have you got an advisor on the end of a phone, even one sitting in Ireland? We are advisors, we are consultants to companies, especially member companies, companies who are members of the Irish Internet Hotline. They get consultancy from us in relation to this, and we can help them to set it all up.
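To make the hash-matching idea concrete, the sketch below shows the simplest form of it: compute a cryptographic digest for each file on a share and compare it against a vetted list of known-CSAM hashes obtained from an authorised body (a hotline, NCMEC, or a commercial vendor). This is a minimal illustration, not how NetClean, Cloudflare, or any hotline actually implements detection; the file paths and the hash-list format are assumptions, and production tools generally rely on perceptual hashes such as PhotoDNA so that renamed or re-encoded copies still match.

```python
# Minimal sketch of hash-list matching (illustrative only, not any vendor's implementation).
# "known_hashes.txt" is a hypothetical file: one lowercase SHA-256 hex digest per line,
# supplied by an authorised source. A match should trigger the CSAM policy, never manual viewing.
import hashlib
import json
import sys
from pathlib import Path


def load_hash_list(path: str) -> set[str]:
    """Load one lowercase hex digest per line into a set for O(1) lookups."""
    with open(path, "r", encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}


def sha256_of_file(path: Path) -> str:
    """Hash the file in 1 MiB chunks so large files do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan(root: str, hash_list_path: str) -> None:
    """Walk a directory tree and emit a SIEM-friendly JSON alert for every match."""
    known = load_hash_list(hash_list_path)
    for file_path in Path(root).rglob("*"):
        if not file_path.is_file():
            continue
        try:
            file_hash = sha256_of_file(file_path)
        except OSError:
            continue  # unreadable file: skip rather than abort the sweep
        if file_hash in known:
            # Structured event for the SIEM; do not open, copy, or display the file itself.
            print(json.dumps({
                "alert": "known_hash_match",
                "path": str(file_path),
                "sha256": file_hash,
            }))


if __name__ == "__main__":
    # Hypothetical usage: python scan.py /mnt/fileshare known_hashes.txt
    scan(sys.argv[1], sys.argv[2])
```

As Mick stresses, the alert is only the trigger: what follows should come from the wargamed CSAM policy, which means preserving the evidence, limiting who is exposed to it, and reporting to the relevant authority or hotline rather than quietly deleting the file.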

Paul Jackson: Well, this is exactly where, a little plug for our own company THEOS, the incident response retainers we offer aren’t just for ransomware; they cover the insider threat and internal investigations too. We have tons of experience dealing with employee investigations.

Mick Moran: Okay. And so I would say to you that you need to bring this into THEOS too, and I’d be very happy to consult on it.

Paul Jackson: All right. Before we hit the last question, which is a music question, I’ve got to talk about AI. Because nowadays, I’m sure of this, CSAM can be generated via AI. Are you seeing this?

Mick Moran: Yes, we’re seeing it. We’re seeing it happening already. To be fair, most of the big LLMs and the big generative AI engines won’t allow it to happen. I mean, even I have problems when I’m asking them questions in this area. It goes: “I’m sorry, I’m only an AI model. I can’t help you.” And then I have to come at the question again and explain that I’m a professional, I’m an academic, you can find out about me here, I’m an academic who studies this, so can you please…?

But again, if I can get around it, so too can offenders. So it is a big question. People talk about child protection and the role that LLMs are having there, and recently ChatGPT brought in rules in relation to parental controls and things like that, because there was a very sad case of a young man who took his own life, because the LLM kept, if you like, validating his opinions about it.

And eventually the kid killed himself. So that’s an unfortunate one. It’s an example of how an LLM can be dangerous, especially around children, and of how vulnerable children can be. We see huge regulation coming into platforms and things like that now, especially in the European Union, and LLMs are going to have to go the same way.

But the one we’re more concerned about is the CSAM itself, because if you take a very large collection of CSAM and feed it in as source data to a generative AI, alongside a much larger collection of non-CSAM, and then ask it to produce new CSAM, well, we’re going to see a huge increase in this. And that increase will come at a time when law enforcement, hotlines and CISOs are already snowed under, and now here’s another problem coming down the line for them. I mean, forensic teams in law enforcement all over the world will tell you that somewhere between 60 and 80% of their work is CSAM.

Paul Jackson: Yeah. That’s shocking.

Mick Moran: Yeah, it is. But if that’s what they’re dealing with now, what will it be like when this AI-generated CSAM arrives, which is still illegal, by the way? That’s an important point to make: it’s still illegal to possess, to produce, to disseminate, to distribute.

And what’s going to happen to those teams, who are already snowed under, as I said? It will also form part of the AI slop that’s out there. And my biggest fear, my biggest fear, is that it starts to normalise it. People see it, they’re shocked initially, then they get over the shock, and then what?

Paul Jackson: I truly hope it’ll never be normalised. And look, we’re coming towards the end of the podcast now. I’m sure this has been an eye-opener, because it’s not a subject that’s discussed enough, and I hope it opens a few people’s eyes. I’m sorry if it shocked a few people, but it’s the reality. Anyway, your job is stressful, more stressful than most, when you deal with these things.

But I always rely on music to unwind a little. So I’m going to ask you: what do you listen to? What’s on your turntable at the moment? I’m going to guess Val Dudek.

Mick Moran: Oh yeah. Well, it has to be said that I love music and I absolutely adore Spotify.

I’d be a bit of a vinyl man, and I’ve got quite a big vinyl collection. I have a lot of music, gigs and gigs and gigs [gigabytes] of MP3s that I haven’t looked at in years because of Spotify. What I love about it is that I’ll have a song in my head, I wake up sometimes with a song in my head, I put the song on in Spotify, and then I let the algorithm do the rest.

And through that I pick up some really, really cool bands, some good bands. And BBC Radio 6 Music: fantastic alternative, just a little bit off the beaten track, and you discover some really, really good stuff on there.

Eight Radio and other stations are on in our office all the time. I don’t know who put it on, and they didn’t put it on for me, but it’s one of my favourite radio stations, ticking away there in the background, and it’s also a great source of alternative stuff.

But what am I listening to right now? I suppose I’m listening to a band called Kingfisher, who have just released an album called Halcyon. Kingfisher are an Irish group, and they fuse rock and, I don’t know, think Dermot Kennedy, Hozier and a bit of the Wolfe Tones, kind of Irish music, which the likes of Kneecap are also doing right now.

There’s another great band out there at the minute called Amber doing the same, and CMAT is doing a little bit of that as well, kind of rock fusion, rock-country fusion. Kingfisher have just released Halcyon, as I said, and it’s been on repeat for the last while.

So I totally recommend it to people.

Paul Jackson: Fantastic! This is why I ask the question, because I get great recommendations, and I will be checking that out. Look, Mick, I can’t thank you enough. You really are a legend. The work you do is so vital and so important. Keep doing the good work.

And, you know, everybody listening, please check out Hotline.ie. And if you enjoyed this show, then please click the like or subscribe button on whatever platform you happen to be listening on. And hopefully you’ll join us again for future episodes of the THEOS Cybernova podcast. Mick, thank you so much.

Mick Moran: Much obliged. Thanks, Paul.

Recent Podcast

Episode 7 | Season 2
CSAM as the Insider Threat Missing from Your Playbook
Mick Moran on why CISOs must treat CSAM as a cybersecurity risk, with lessons on detection, policy, and response.

Episode 6 | Season 2
The Anatomy of Crisis Management: Preparation, Communication, and People
When crisis strikes, will you be ready? Hear how Tim McNulty turns disruption into resilience.

Episode 5 | Season 2
Cyber Scams in Asia: Victim Blaming, Underreporting, and the Need for Change
Why are cyber scams soaring in Asia while victims are blamed and fraud goes unseen?

Episode 4 | Season 2
APAC Cybersecurity Challenges, Brain Drain, Data Privacy, and AI
Cybersecurity, privacy, and regulation—how are APAC companies keeping pace?

Episode 3 | Season 2
From ROOTCON Pioneer to Leading Offensive Security in APAC
Jayson “JV” Vallente’s journey reflects the rise of ethical hacking and offensive security across APAC.

Episode 2 | Season 2
From Scotland Yard to Manila—Building a Cross-Border Cyber Investigations Practice
From vice squads to digital forensics, one journey shows how cybercrime and response evolved.

Episode 1 | Season 2
Inside the High-Stakes World of Digital Forensics and Incident Response
How do the best in DFIR respond when there’s zero room for error—and no time to waste?

Episode 12 | Season 1
Navigating Privacy, AI, and Cyber Law in APAC
How can organizations stay resilient as privacy regulations lag behind rapid digital threats?

Episode 11 | Season 1
Cybersecurity, Leadership & Breaking Barriers
How do you secure a multinational company while navigating complex cyber regulations and evolving threats?

Episode 10 | Season 1
Cyber Journalism, Crisis Comms & the Power of Storytelling
How do journalists uncover the truth behind cybercrime?

Episode 9 | Season 1
Turning the Tables – 100 Days as CEO
What happens when the host becomes the guest?

Episode 8 | Season 1
What Every Business Needs to Know About Cyber Insurance
When a cyber incident occurs, can your cyber insurance policy come to the rescue?

Episode 7 | Season 1
The Leadership Playbook for Aspiring CIOs and CISOs
What does it take to transition from a cybersecurity practitioner to a strategic leader?

Episode 6 | Season 1
The Modern CISO's Balancing Act - Security, Business, and Innovation
Is the traditional CISO role obsolete?

Episode 5 | Season 1
Crisis Leadership When Cyber Attacks Strike
What happens when a ransomware attack hits, and every decision counts?

Episode 4 | Season 1
The View from Down Under
What makes Australia’s cybersecurity landscape unique?

Episode 3 | Season 1
Cracking the Code to Cyber Talent and Recruitment
Explore strategies for finding and nurturing top talent in the cybersecurity industry.

Episode 2 | Season 1
From Cybercrime Investigator to Private Sector Leader
Follow the transition from cybercrime investigator to a leader in the private sector.

Episode 1 | Season 1
Building THEOS Cyber, Embracing Growth, and the Journey Ahead
Discover the story behind THEOS Cyber, its growth journey, and future aspirations.