
The Balance Between Privacy and Protection with John Pizzuro

“What’s more noble than trying to protect a child?” - John Pizzuro

Age-appropriate conversations need to take place with children about how technology is being used to groom them. Unfortunately, many are already addicted to the dopamine fix, and their relationship with a device can numb them to being preyed upon. Today’s guest is John Pizzuro. John serves as CEO of Raven, a lobbying and advocacy group that focuses on protecting children from exploitation. John is a former commander of the New Jersey Internet Crimes Against Children Task Force and has created a framework for other countries to use to investigate child exploitation.

“The apps and technology will change, but what has really changed in the last 20 years is behavior as a result of that technology.” - John Pizzuro

Show Notes:

“Everyone is on the platforms but the platforms don’t do nearly enough.” - John Pizzuro

Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review. 

Links and Resources:

Transcript:

John, thank you so much for coming on the Easy Prey Podcast today.

Thanks for having me.

Can you give me and the audience a little bit of background about who you are and what you do?

My name is John Pizzuro. I am retired from the Internet Crimes Against Children Task Force, known by the acronym ICAC, with the New Jersey State Police. I spent 25 years there, and now I am the CEO of Raven, which is a first-of-its-kind lobbying organization for law enforcement and technology, working to change a lot of the funding mechanisms and legislation happening today that impact all of us.

Awesome. How did you get into the Internet Crimes Against Children Task Force when you were in law enforcement?

It’s one of those things where people get transferred for various reasons. I got transferred there as the commander back in 2014/2015, in that time range. I ended up taking over the task force, which is responsible for all of the cyber trafficking and child exploitation cases in New Jersey.

How did you get appointed? Was that something that you wanted to do, or was it because of casework you had done outside of that division previously?

I just got put there, transferred as people got transferred around. I had spent five years in organized crime, narcotics, and corruption. But when I got there, it became probably the most purposeful thing that I've ever done. So much so that I'm still in it today.

That says a lot about how passionate you are about this field.

I mean, what's more noble than trying to protect a child, right?

Yeah. I imagine the advice that you give and the situations have changed a lot in the last 20 years. Maybe the whole “stranger danger” warning is more true now than it was then, when the reality was that it really wasn't strangers perpetrating a lot of the crimes against kids. Now, has it flipped the other way?

Well, think about it. We text a stranger to get in a car with them. It's completely changed. Technology has really changed. [inaudible 00:03:09] In 2008, there was on average one computer per household. I think when I was leaving, we would do a search warrant and you're talking about 20 internet-capable devices, because there are laptops, phones, heck, now your refrigerator, your Nest. Really, technology has had an impact, but I think that's what a lot of people don't understand: the apps and technology have changed, but what has really changed in the last 20 years is the behavior as a result of that technology.

The apps and technology have changed, but what has really changed in the last 20 years is the behavior as a result of that technology. - John Pizzuro

Let's talk about that. What was the behavior previously and how has it changed over the years?

Right now, you can look at the CDC. They talk about cell phone addiction. Today, we are dopamine-dependent, meaning it’s likes, followers, and views. Because of that, it’s really changed where our attention goes and how easily we get duped. For example, I talk a lot about social engineering. If you were sitting there right now and an earthquake started, you would jump for cover automatically. But in your own house, you're not worried about anything when you're talking to someone online, and that's because the fight-flight-freeze center of your brain is not activated.

But in your own house, you're not worried about anything when you're talking to someone online and that's because the fight-flight-freeze center of your brain is not activated. - John Pizzuro

That is where, today, more people are getting scammed, and it's not just children. You can look at adult fraud, you can look at any of those areas, but it's because we're in a safe environment. When you're in a safe environment, you don't tend to believe that something bad's going to happen.

When you're in a safe environment, you don't tend to believe that something bad's going to happen. - John Pizzuro

My home is a safe place and, by extension, in a weird way, the internet is now a safe place.

Correct.

That's an awful thought.

And you're more trusting. If you talk to neuroscientists, they'll tell you that when you see an image all the time, you believe it. That's why, from a political perspective, we've become so polarized. It’s because ads basically target what we like. It was funny, I had a phone conversation with someone before the podcast and we were talking about the destabilization of the US dollar. As I scroll on social media, the first ad I see is about the US dollar being destabilized. There you go, right?

And a little bit of positive reinforcement.

Yeah, absolutely.

Let's talk about some of those behaviors. What are some of the behaviors that have changed? Obviously, we're more trusting of our devices now than we were 10 or 20 years ago. How has the behavior changed in terms of predators and kids being targeted?

There are a couple of things. One is, a lot of predators and offenders, what they do is manipulate language. What do I mean by language? Chris, what's your favorite rock band or favorite music?

Oh. Let's just say it's U2, even though it's not because I can’t think of one off the top of my head.

All right. I love U2, right? Say we're going back and forth in written words and you say, “Hey, you know what, I just listened to Ordinary Love, a new track by U2.” I'm like, “Ordinary Love, really? U2? That's my favorite band as well.” What happens is, what do we like to talk about most?

Things that we like.

And ourselves, right? What happens is, good offenders, and this is social engineering 101 in general, will actually mimic someone's language. When I'm reading something, what do I see? I see my exact language repeated back at me. “Wow, this person is just like me.” Now, all of a sudden, my guard is down. They repeat the same language, the same phrases, and now I want to believe it. Because all my self-esteem and popularity comes from social media and the likes, followers, and views, I'm hooked. It's that simple.

Was it that way 20 years ago, with kids being groomed by their coaches and things like that? Or has the way grooming happens really changed?

It's changed because of the reliance on technology. If you're talking about grooming 20 years ago, obviously it was the ice cream shop or the actual park. But think about this, and this is what I tell a lot of people. Your child right now who's playing Fortnite or Roblox might be 12 and playing with a 40-year-old. If you went to a neighborhood playground and you saw a 40-year-old approach your 12-year-old, what would you think? But you're at home. You don't see it.

I don't necessarily want to jump into the legislation yet. I think we have a lot of questions. But I do want to come back to the question of whether the platforms have a responsibility to identify who's using their platforms to other people. Let's come back to that, because I think it probably ties in a little bit with the stuff we want to talk about later on.

Yup. When I testified before the Senate Judiciary Committee in February, I was asked this question: are the social media companies doing enough? My response was that they don't do anything. It was one of those responses. But the reality is, most of the companies have limited moderation, right? If I start an app company tomorrow, I have two or three people. I don't have the ability to moderate, so then there's the scanning.

The reality is, I could be whoever I want with any device I want, enjoying whatever I want. A six-year-old can pretend to be a 24-year-old. A 24-year-old can be a six-year-old and there are no identity checks or age verification…

Then there's the people on their platform. The reality is, I could be whoever I want with any device I want, enjoying whatever I want. A six-year-old can pretend to be a 24-year-old. A 24-year-old can be a six-year-old and there are no identity checks or age verification checks.

Is that one of the things that you are pushing forward through your lobbying efforts?

One of the challenges is privacy. Everyone's worried about privacy. The thing is, I can guarantee you that Google, Amazon Web Services, and Meta, for that matter, know more about you than law enforcement could ever intrude on. From a law enforcement perspective, we want to have that ability, but it's hard. We respect privacy. I don't care what anyone else does, but how do we protect children? I think we need something along those lines where we at least have the ability, and I'll use Tinder, for example, so that if I'm 13, even if I lie and say I'm 18, Tinder knows that that individual is really 13. Without geolocation, without any of the other parameters. We're just looking to protect children more. Everyone is on the platforms, but the platforms don't do nearly enough.

Do you see AI as being something that will help in the long run? In terms of, if AI is monitoring conversations, it can see this stuff and recognize the patterns?

All right, not that I agree a lot with Elon Musk here on AI, but there is a lot that we don't know about AI, and honestly, there has to be regulation right now with AI. There are a lot of challenges with that, especially people using AI to actually groom. Because let's face it, if I can download a child's history, what better than to have the AI program do the mimicking rather than me?

Again, it's one of those things where technology has its uses, but then how is it regulated, how is it introduced? From a scanning standpoint, we've got encryption now on a lot of platforms. Think about this. Last year, there were 32 million cyber tips from the National Center for Missing and Exploited Children. Thirty-two million. Do you know how many of them came from Apple?

Four.

How many people have an iOS device?

I think there are about a billion out there.

Every time I ask, 60%-70% of the people raise their hands that they have an Apple device. We're not getting tips from that. Because again, they are locking everything down with end-to-end encryption. If we don't know what's happening on the platform, it's as if it's not happening. But the reality is, there are more cases of teen suicide through sextortion victimization today. These platforms make billions and billions of dollars and, by the way, they pay lobbyists millions of dollars to go in and lobby for them.

The question is, where is the duty of care? I think everything's a balance.

I think that's the hard thing for people to understand: in all these things, there's always going to be a balance that has to be achieved. You can’t have absolute privacy and still have good law enforcement. But if you have [inaudible 00:13:11] law enforcement doing everything, then you lose privacy. There's a balance that has to be struck.

I think that's ultimately what you try to do. If we had a back door for end-to-end encryption so that we knew from scanning that a child was in there, then the privacy people would be like, “Well, then my information isn't safe.” It’s just one of those things where it’s tough. I’m asked about Section 230, which is about liability. When you take a look at some of these platforms, you have mothers and fathers who are imploring Congress and the Senate because their child committed suicide. Yet the current law doesn't allow for any lawsuits regarding the use of social media by those children who committed suicide.

Again, how do you come up with a solution to the problem? I think today we're so polarized. You have two groups, but ultimately, instead of complaining, can't we come up with a solution? I think that's kind of why Raven exists.

Do you find that people are either all one way or all the other way, when realistically, for 80% or 90% of Americans, it’s not black or white, it’s gray for most of us?

Yes. Maybe that’s because I’m a Libra, I guess, because I’ve got the scales or whatever. I see both sides and I'm down the middle. There's a saying, “Don't let the perfect be the enemy of the good.” If we don't do anything, then everyone's going to suffer. I'd rather do something good, at least a step in the right direction, and then hopefully we're able to implement something that helps.

Let's talk about what we can do to help. Let's talk about what should be happening outside of the platforms first, and then what should be happening on the platforms or with the technology, if we can split that. Because, hopefully, there are people involved in children's lives who can have conversations and can see things. What should they be doing? What kind of conversations should they be having to help the kids? And then let's talk about the technology side or the intervention side.

All right, so we’re talking about children first. Parents need to have conversations with their children. Kids understand technology, and it drives me insane every time I go to a restaurant and see someone hand a phone to a two-year-old. At two, they already know that system better than the adult does. I think it starts with an early conversation. I think open communication is really important. We don't talk about how offenders and predators will actually try to victimize children and how they groom them.

I was talking about language before. I think it starts there, and it starts at an early age. If you're talking to your child when they're 12 or 13, it's already too late. I like to tell this story: a couple of years back, I was walking on a beach. This is when TikTok had just started. Literally, there are 67 people on the beach. All the kids are on their phones. It's 80 degrees and sunny out. There are 12 kids making TikToks, and then there's a four-year-old who can't find their parents. If you take that microcosm and blow it up, I think that's where it starts. That’s the preventative piece at home. Beyond that, I think it needs to be implemented in school.

You know what, we talk about vaping, right? We talk about drugs, weed, alcohol, or teen pregnancy. But why aren't we talking about social media and its impacts? Why isn’t that part of the health class in third or fourth grade, the long-term impacts of it? That’s what I would talk about on the preventative side.

We talk about drugs, weed, alcohol, or teen pregnancy. But why aren't we talking about social media and the impacts there? Why isn’t that part of the health class? - John Pizzuro

What's your position on why it's not being taught as part of education?

Part of it is the subject matter. No one wants to talk about child exploitation in general. Just to give you a comparison: trafficking. Trafficking is 1% of the problem in comparison to child exploitation. All you have to do is take a look at the tips from the National Center for Missing and Exploited Children. But the challenge becomes that no one wants to talk about it, because your six-year-old or seven-year-old could be victimized.

The other thing is that they've implemented trafficking education in the schools a little bit more, but that is based on the movie Taken with Liam Neeson and the way it got marketed. I'm not saying trafficking doesn't exist or isn't a problem, but it's taking up that space. Honestly, if I were able to help children at a younger age, the likelihood of them becoming a trafficking victim or a narcotics or opioid user lessens.

I think what people don't understand is that if I'm six years old and an offender made me do something sexual and I sent that image, I have been assaulted, I'm going to be impacted, and I'm traumatized later on. That leads to those other coping mechanisms and behaviors.

Really, it’s having age-appropriate conversations way, way early. As opposed to, “Hey, you’re 16 now. You should be careful about sending photos.”

Now my son will be on the couch for the wrong reasons. He's 18 now, but when he was 12, he was chatting with someone online while playing Fortnite. I'm like, “Who are you talking to?” “Ah, don’t worry about it.” I said, “No, who are you talking to?” “Ah, don't worry about it.” I said, “Colin, who are you talking to?” He says, “I don't know, some guy just started asking me questions.” I said, “Hey, that's a predator.” So he'll be on the couch for other reasons, not trusting anyone.

You've done something right when he's looking at people he doesn't know as potential predators. Hopefully he's not paranoid and afraid of everybody.

Yeah, well there's the balance.

There’s balance. I think with any of these topics, when we as adults are uncomfortable talking about something, the idea of having productive discussions with kids about it scares us. We hope that if we don't talk about it, it'll go away, it won't happen.

Yeah. Here's the other thing. The parents are so addicted to their own devices that they don't have time. I love Netflix. You just start binge-watching something. It goes from one episode to another, to another, to another.

As a matter of fact, this is a great one. People can google this: Sean Parker, Axios. In 2017, he did an interview. In the interview, he said, and I’m paraphrasing a little, “Hey, as a former hacker, we wanted to consume as much of your time as possible. We knew what we were doing, that we would give you a hit of dopamine and get as much of your attention as possible. We knew that, and we did it anyway.”

It's a really important aspect to understand. That goes for all of us. If you put a couple of TV shows on right now and I'm sitting there, the next thing I know, I'm like, “Oh, my God. It's 2 AM.”

How much of the problem is us being addicted to our devices, not being aware of what's going on, and valuing our relationship with the device over the relationship with the people beyond the device, if that makes sense?

It comes down to that relationship with that device. It's that need for constant self-esteem or constant feedback. I think that's where it really starts. Instant gratification. Think about it.

I can find out anything that I want. If someone asks me, “Hey, how old is this actor?” I'm like, “Hey, how old is Marisa Tomei?” “Marisa Tomei is 57 years old.” My point is, just take a look wherever you go. Go sit on an airplane. Everyone is just mindlessly scrolling.

No one's reading books anymore. No one's meditating or that sort of thing, but I think that's where that constant attention goes. What happens over time is you become so dopamine-dependent that you need more, and more, and more, and more.

Are the predators aware of this? Are they thinking, “Hey, I've done the research. I'm giving dopamine hits to the people I'm trying to take advantage of,” or is this just ingrained under the surface and they don't even realize they're doing it?

I think there's some extent to that, but I'll give you an example from a community on the darknet: a post on how to seduce a four-year-old that's been read 55,000 times.

Again, thoughts on oral sex with zero to two-year-olds, read 17,000 times. The point is that there are communities where people share techniques. If you look at the darknet and Tor, you'll see a lot of that. They're not even trading the images; it's best practices.

What will happen is, I'll give a presentation somewhere about what to do. The next thing I know, that's in the Tor community, and they're saying, “Hey, let's change our behavior here because they're looking over here.”

With a lot of things, it's a cat-and-mouse game.

Yeah. Here's a perfect example. You remember Napster back in the day, right? Peer-to-peer networks? Right now, there are 100,000 people at IP addresses in the US who, over the last 90 days, have distributed and shared rape and toddler sexual abuse videos. There are only 728 being worked right now, just to give you an [inaudible 00:24:24].

Dr. Michael Bourke did a study called the Butner Study. In that study, he found that of those 100,000 people, 50%-85% of them are hands-on offenders, with an average of between 10 and 13 victims each. If I just cut that number in half and extrapolate, there are 450,000 victims of child sexual abuse in the US that we haven't even gotten to.

And 75,000-85,000 perpetrators.

That's part of why Raven exists. Take funding, for example. The ICAC task forces that do a lot of these investigations, 61 of them, only get $33 million, yet 33 HIDTA (High Intensity Drug Trafficking Areas) task forces get $596 million. They don't have the money. They don't have the resources. They don't have the laws. As a result, we're at a point where there's more child victimization now than there has ever been at any point in our history.

Is part of the difficulty that internet crimes are more likely to cross jurisdictions, state borders, cities, and countries, whereas with drug trafficking, we know the guy has to come into our town to deliver the drugs?

Here's the interesting thing about that. The ICAC program has 61 task forces throughout the US. We work jointly. We would have four meetings per year, where we would sit and actually talk.

I might get a call from the South Dakota ICAC saying, “Hey, I got this suspect. They’re talking to this victim,” and then we will move the victim. It's probably the best-run, most efficient of any task force in the US.

In terms of great cooperation between groups.

Yeah, because you have federal partners, state partners, and local partners. There are training mechanisms together, conferences together. They do over 90% of all the child exploitation cases in the US.

Wow. One group.

Yeah. Take New Jersey ICAC. It was run by the New Jersey State Police, but I had every prosecutor's office. We had 71 agencies that were part of the task force and did some part of that work, and each state is similar to that. Texas has three ICACs. California has five ICACs because of the population and the size of the state.

Got you. They're way understaffed, way underfunded, way under-resourced.

Yeah. Think about this. 2008 was the Protect Our Children Act. That was what started the ICAC Task Force. In 2008, they authorized $60 million. They've never received the $60 million. It’s 2024. At that time, there was one computer per household. Since then, cyber tips have gone up 100%, yet funding has only gone up 30%.

Yeah, major shortfall.

One of the things is, and you know this too from a technology standpoint, it costs money to investigate. If I want to, for example, use forensic software like Magnet AXIOM, that's $6,500. If I want to use a GrayKey, it's $80,000. All that adds up, and the tools have changed, right? Before, it was just one license, a small piece of software, and a computer, but now we've got terabyte iPhones. Do you know how long it takes to go through that data?

Yeah, way more time-consuming, way more resource consumption.

And then storage. The other thing is that we have local servers, and we're talking about cloud storage too because of the amount of digital evidence. For us, Raven is about the funding mechanism, the technology mechanism, holding the tech companies accountable, and having some robust legislation in place to protect things.

Let's talk about it. What are the legislative changes and the funding changes that you want to see start to take place, not just domestically but internationally?

One of the things, like you said, is international; the internet is everywhere. The same problems in Australia and England are here in the US. One of the things that we want is laws that will protect children better, for example, age identification or identity verification, without infringing on privacy.

We have one act that we call the Parental Empowerment Act, where we want to make sure that we can at least protect kids on their devices. That's probably one. Number two is probably the reauthorization of the Protect Our Children Act of 2008, which could take funding from $33 million to hopefully somewhere in the four hundred millions, but also add some more protective fixes.

For example, when the National Center for Missing and Exploited Children gets cyber tips, they get certain pieces of information. Meta might send three pieces of information, Snapchat might send four, so law enforcement is already behind the curve on things like screen names, email addresses, and geolocation. We want to modernize the amount of data. Everyone should be sending the same data.

For example, NCMEC will ask, “Hey, we need you to provide 30 points of data.” Some only give two or three, and that counts as having complied. That is voluntary compliance. I think we're at a point where it can't be voluntary anymore.

What are some of the hurdles that you're facing in trying to get legislation and funding?

All right. A couple of things. The Republicans have the House; the Democrats have the Senate. One of the major, major tenets is spending, because spending on government programs is high. Again, that's an argument to have, to just say, “Hey, this is where money needs to be put, and this is why.” One of the things we can do is articulate it and explain every level of it.

One good thing about that Senate Judiciary hearing is that this is a bipartisan issue. Whether you're a Democrat or a Republican, I don't know how you can argue against saving children. The challenge comes in how to do it.

Everyone agrees we should do it, but then the devil is in the details, because people and other organizations might want to take certain things out of those bills that impact them. That is where a lot of time has to be spent educating and ensuring that those bills get to committee and get passed.

Where are you seeing the most common ground between the different sides of the political spectrum, and where are you seeing the largest disagreements?

All right, the largest disagreement right now would probably be around Section 230: do we hold all these tech companies liable, or do we give them some immunity? I think that's part of it. Because it's a broader scope, I think that's where you get privacy advocates.

I think the challenge is that privacy has had the loudest voice there. I don't think they've heard from this side, basically. I think that's probably the biggest hurdle. And then, beyond that, we're talking about regulation.

If we regulate someone in our free capitalist society, the argument is that it goes against what we stand for. Again, we have to bear some responsibility. I think that is going to be the biggest fight, but we've got to find a compromise or something that does both.

Is that one of the challenges when you start talking about Section 230? One side says it's enshrined and we can't touch it, and the other side says we need to get rid of it entirely, as opposed to asking how we modify it in a way that makes sense?

Yes, those are the two extremes. That's what makes it challenging. First, we can have a bill, but the larger the bill, the more likely someone is going to take issue with something in it.

I am not a professional lobbyist; we just advocate for what we do. When I go in, it's amazing the number of groups that want their own little signature stamp. They want this taken out. I'm like, “Wow. Now I understand why it's so hard to pass something.”

Everybody wants their fingerprint on some little piece of the pie.

You're old enough to know Schoolhouse Rock!, right? Remember the video, “What's a Bill?” and stuff? What I'm really disappointed about is that it's completely wrong. Everything I learned in that was completely wrong.

Yeah. It's a lot of politicking now and less about the goal, unfortunately. At least, that's what it appears to be from a layman's perspective.

The good news is that there is a little bit of a groundswell. We've talked to 40 legislative staffers and 40 different legislators in the last three months alone. Everyone's in agreement. I think it's the how that's going to be the challenging part. But at least we're there, because if it was any other issue, forget it.

Yeah. Everybody has a certain amount of common ground that we need to do what's in the best interest of children, versus an issue where only 5% of the people in the country care about it; you can't tackle that.

Yeah, absolutely.

Is there an easier front on the money side and the funding side?

On the funding side, the Democrats generally will fund more, but now you're talking about fiscally conservative Republicans who really want to watch every penny. I think that becomes hard because of the funding and the amount of our debt. If you just take us as a country and look at our debt, the money that we're spending, and our deficit, I think that becomes problematic. That becomes the argument.

What's going to happen is that they're not going to create more programs with more money and more spending. They're going to have to take it from other places. That's where it really becomes dicier. On this issue, if we can articulate things the right way, we can show why it matters. Maybe I'm naive, but I think I've got an argument. Maybe that's why I'm still doing this, because I can still make a difference.

Is it easier on the local or the state side as opposed to the federal side when it comes to both legislation and funding?

Yes and no. Federally, it's easier to talk to people because of that process. The challenge with states is that every process is different and every state is different. Ultimately, the states need to pony up. They can't just rely on the federal government all the time.

It should be done in conjunction. I'll use New Jersey as an example. New Jersey had $600,000 in their ICAC program, which isn't even enough to buy one or two tools, and that's it. Realistically, they should have at least $2 million federally, but then the state should be kicking in another million or 1.5 million, and then maybe another 500,000.

The state should maybe even be matching funds, specifically because, and I'll give you some stats from New Jersey, in 2015 we had 120 arrests, and we were like, “Oh, my God. This is horrible.” In 2019, we had 423 arrests.

When we talk about the number of offenders and victims in this state, these are kids. I think the challenge is that there's money in all these other places, and when it comes down to money, people fight tooth and nail for their existence over that money. That's what makes it difficult.

Yeah, that will always be the challenge. There are always more programs than money, whether it's at the local, state, federal, or international level. There's always more need than funds.

You know what gets cut first? The victim services. Child advocacy centers do all the victim care and forensic interviewing. They get cut before anyone else.

I would love to have a study where we talk about the value of a child. If a child is victimized, what is the harm of that traumatization to that child, or better yet to the US? What does it cost in terms of their value to society and the country in general, compared to if that had not happened?

I've seen studies like that or the discussion of studies like that. The fear is now you're placing a financial value on somebody's life, and we shouldn't be valuing people's lives in terms of money. But there is definitely an impact to society as a whole when someone's gone through a traumatic event.

Yeah. How many doctors did we lose? How many lawyers did we lose? How many engineers did we lose? How many valuable people could have changed the way we grew as a country? I think those are things that you look at, but the challenge with research is every researcher I ever saw just wants to go from research project to research project. There's not enough money.

It's funny because I teach college part-time, and you have the career educator that wants to talk about theory all the time. At some point, there's got to be some action to that theory, otherwise it doesn't work.

That's a good segue. If people want to get involved and want to help move legislation forward, help the lobbying efforts, or help the advocacy, what can they do?

You can go to www.raven.us. It's very easy. My email, by the way, is jp@raven.us. We'll make it nice and simple. There are things like donations and funding, obviously. Have a talk with your legislators. If you have an event where you need support in your town, or you need someone to talk to elected officials, we're more than happy to provide that expertise.

Provide who to talk to, how to talk to them, the talking points.

And ultimately, your vote. If enough of you want something done, well, the last time I checked, people who get elected want to stay elected.

Take this issue seriously or I will not vote for you again.

Yes.

That's what I think it comes down to, and this is my personal soapbox: voting for people not by which party they belong to, but by their values and the things they say they're going to do. If they don't do it, you replace them with somebody else who will.

Yeah. I think that's what we're trying to do, Chris. We want accountability. We want people to be accountable. For too long, no one's been accountable. That goes for the tech companies. That goes for the status quo. The status quo has been the same for this crime type since 2008, and nothing has changed. That's why we created Raven.

We're all part-time. Raven is made up of nine retired Internet Crimes Against Children commanders and policy expert Dr. Michael Bourke. These are people who have dedicated their lives to this.

We don't personally benefit from this. We're not advocating for ourselves. Most of our children are grown at this point. But we started this, and we just want to move the needle. John Madsen, who's the president of our board, said when I first called him about this, “John, if not us, who? And if not now, when?”

Yeah, that's always the call. Are there any other particular resources you want to make sure that we mention before we wrap up today?

If you are looking for any education courses, you can go to johnpizzuro.com. I cover child exploitation, the darknet, grooming, anything along those lines. If you need someone to come in and speak about that to your institution or organization, I teach a lot of that. Other than that, no, I appreciate your time.

OK, John. Thank you so much for coming on the Easy Prey Podcast today. I really appreciate the mission you've taken on in your post-work life.

Awesome. Thanks, Chris.

Thank you.
