Easy Prey Podcast

Balancing Privacy, Security, and Accountability with Kurt Long

“The public is left with a collection of tools that were never intended to secure our communication.” - Kurt Long

In this episode, we navigate the evolving dynamics of messaging apps, examining the challenges and opportunities in striking the right balance between preserving user privacy, ensuring robust security, and maintaining accountability in a changing digital landscape.

Today’s guest is Kurt Long. Kurt is an entrepreneur with over 25 years of experience in starting, growing, and building Information Security and Privacy businesses. Kurt is the Executive Director of The Long Family Force for Good Foundation which focuses on supporting not-for-profits dedicated to improving the mental health of families and children. Kurt is also the Co-Founder and CEO of BUNKR.

“There need to be alternatives in the marketplace that balance the public’s right to privacy and legal due process.” - Kurt Long

Show Notes:

“The entire planet deserves to conduct their affairs with dignity at bank level security.” - Kurt Long

Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review. 

Links and Resources:

Transcript:

Kurt, thank you so much for coming on the Easy Prey Podcast today.

It's great to be here. Thanks for having me, Chris. I'm a fan of the podcast. I like the format. I like that you can geek out a little bit and still keep it in understandable terms that people can follow along with.

Awesome. Can you give myself and the audience a little bit of background about who you are and what you do?

Longtime entrepreneur for the last 25 or 30 years. Today, we have a startup called BUNKR. Also, the chair of a foundation called the Long Family Force for Good. That is meant toward the spiritual and mental well-being of children and families, and have a deep and lasting interest in aligning all these things in terms of, can we create businesses that are financially prosperous for all involved but do good in the world?

You'd be surprised at how tricky that topic can be. I've spent a lot of time thinking about it. I'd like to think I spent a lot of time practicing it. It's just a passion of mine that I continue.

I love that philosophy. It's one of the reasons why I started the podcast in that I have a profitable business that gives me a soapbox and a platform where I can run the podcast so far without any advertising, on my dime, to try to help the world be a better place, protect people from all the crazy stuff that goes on out there, and educate people where they don't have to pay for it. I think our philosophies are definitely aligned.

How did you get involved in technology? Was there a passion when you were a kid of your Atari 400? What was your technology journey like?

I grew up in Florida. We spent a lot of time outdoors. I did spend time in school and try to do well, but a lot of time outdoors. I was old enough to see the Apollo launches at night from across the state. Obviously, you can't see detail from 150 miles away, but you can literally see the launch. You'd look up a couple of days later and you'd literally see these astronauts on the moon or in orbit and you think, “That's real. This is a real thing.”

I wrote to the Space Center, maybe when I was 13 or 14. They wrote back. That confirmed it was real. “This is a real thing, and maybe I could work there someday.” About eight years later, I did just that. I graduated when I was 21 from the University of Florida in Computer Information Science, and I was fortunate enough to work at Kennedy Space Center.

That was a foundational experience, my first technology experience, where I worked with really bright people who cared about one another and cared about the mission that we were all there for. Had it not been for them and that mission of space flight, by then it was the space shuttle, I wouldn't have stayed in technology my whole life. But I got addicted to doing hard things that are bigger than yourself with people who care and seeing it come true.

I like that concept, particularly with technology of, “How can we do something that's never been done before that ‘they’ say can't be done, and let's achieve the unachievable,” is a fascinating thing for me.

I just think that we should aspire to do things bigger than ourselves. Let's just say there are 30,000 people in Brevard County, which is by the Space Center industrial complex, working on that mission. Clearly, it's just so overwhelmingly obvious—this is a team effort, but to know that you contributed something, an important part. If you made a mistake, you're letting everyone down.

When we had launches, the entire county was so pleased and happy that night. I'm sure we drank a little extra alcohol on successful launches, as did people across the entire launch complex in the United States and around the world. Everybody was so proud. You knew you did your part. It has to work, but you knew there were all these other people that you had to work together with.

I love that whole concept. I love that philosophy. I look at it as when you're doing this stuff, one little mistake has massive consequences. Everybody has to get everything right 99.999999%.

It's all true. That's a whole ‘nother path to go down. Everything you said is absolutely true. But if you get me going on that, we'll never get to security and privacy.

The only thing I would say is, in my bio, you'll see Hubble Space Telescope, Venus Radar Mapper, Galileo, and Ulysses. Those are the four scientific missions that I like to think I contributed something to, meaning I did my part. It's an important part, but it's a small part. Those are things that my entire life I will feel great about. I wouldn't trade them for any other aspect of my career. But it's in the past, and that's where we'll leave it for now.

One more question about it. When you see photos taken by Hubble, do you think not that, “I took that photo,” but a little bit of the, “Hey, that's because of the work I did”?

Yes is the short answer. It's built in, like your intuition. There's a sense of ownership. No matter how big or small, it matters. It's meaningful; it matters every time. Even the James Webb telescope, when you see that, it's part of a heritage.

Hubble proved these things could be done, and James Webb came along and took it even further. While I don't have anything to do with that mission, you feel like you're part of that heritage. You feel connected, and it's an amazing feeling to have part of your whole life.

You help lead, you help lay the foundation for the next generation of technology in space telescopes and whatnot.


Just as people before us, whether we say it's in space or in computing, wherever you want to pick, we stand on the shoulders of the people who come before us. The older you get, the more you realize that. I think that's a really important idea that there's this deep respect.

For example, my lead, Bill Galloway, graduated from Berkeley, electrical engineer. His father worked on the Manhattan Project, which famously was the conception/creation of a nuclear bomb that ended World War II.

I know there's a movie out, Oppenheimer, and you can twist that up a little bit, but I can tell you for sure, Bill Galloway passed his pride and his father's work along to me. I can assure you that whatever standards his father held him to, Bill held me to those standards as well.

That's cool. Let's transition into talking about privacy and security. We'll have some fun rabbit holes to go down. I know that your specialty—what you're working on is messaging platforms, the intersection of government and law, and how all these things roll in together. Let's take a step back. Can we talk a little bit about the history of messaging platforms, if that's a little bit of background that you have, where it's come from, and where you see it going?

You know what? I might just go a little bit further back if you don't mind, Chris, to the beginnings of my security career. I was a TCP/IP programmer, what's called a sockets programmer, if you're old enough to know what sockets were. I was pretty good at it. Basically, that is the lower-level programming that lets you send messages between computers.

You use these different protocols depending on the use case. TCP is Transmission Control Protocol and IP is Internet Protocol; of the two, TCP is the reliable protocol. I knew all about that. I was very invested in that and just thought, “Oh, my gosh. We can connect computers around the world.”
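For readers curious what the sockets programming Kurt describes looks like, here is a minimal sketch in Python (the API closely mirrors the classic C sockets interface he would have used): a TCP server and client exchanging one message over a reliable connection. The loopback address and the message text are purely illustrative.

```python
import socket
import threading

def run_echo_server(listener: socket.socket) -> None:
    """Accept one TCP connection and echo back whatever arrives."""
    conn, _addr = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# Create a listening TCP socket on an ephemeral loopback port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=run_echo_server, args=(server,))
t.start()

# Client side: open a TCP connection and send a message.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello over TCP")
    reply = client.recv(1024)

t.join()
server.close()
print(reply.decode())
```

TCP's "reliability" means the bytes arrive in order and intact, or the connection fails; everything above that, such as who the sender really is, is left to the application.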

When the commercialization of the Internet hit in the mid-90s, I just thought, “Oh, my gosh. This is the greatest thing that could ever possibly happen to a TCP/IP programmer, a sockets programmer.” I made my way to the Netscape Communications company when there were 30 people. They were still called Mosaic. I would go out there, meet with the product managers, and start my first security business around the Netscape platform.

People's names like Ben Horowitz, Frank Chen, Greg Sands, and even Marc Andreessen, are all people that were in that circle, a very brilliant set of people that would go on to shape the Internet. I knew how to apply the products and the programming to write applications.

In effect, I wrote a reverse proxy server to enable a giant telecommunications company to offer very important billing information online. The proxy puts the data behind what's called a demilitarized zone, with multiple layers of security, so you don't ever have to actually expose it. That's basically how I got into security.
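As a rough illustration of that architecture, here is a toy reverse proxy in Python: the client only ever talks to the proxy, which relays traffic to a backend that is never reachable directly. Real deployments add TLS, firewalls between network zones, and full protocol parsing; the addresses and the "billing" payload here are made up for the sketch.

```python
import socket
import threading

def backend(listener: socket.socket) -> None:
    # Stand-in for the protected internal system (e.g., billing data).
    conn, _ = listener.accept()
    with conn:
        request = conn.recv(1024)
        conn.sendall(b"BILLING-RECORD for " + request)

def reverse_proxy(listener: socket.socket, backend_addr) -> None:
    # The proxy is the only exposed host; it relays one request and
    # one response between the client and the hidden backend.
    conn, _ = listener.accept()
    with conn, socket.create_connection(backend_addr) as upstream:
        upstream.sendall(conn.recv(1024))
        conn.sendall(upstream.recv(1024))

# Internal backend, bound to loopback, never exposed to clients.
b = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
b.bind(("127.0.0.1", 0))
b.listen(1)
backend_addr = b.getsockname()

# Public-facing proxy on its own port.
p = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
p.bind(("127.0.0.1", 0))
p.listen(1)
proxy_addr = p.getsockname()

threads = [
    threading.Thread(target=backend, args=(b,)),
    threading.Thread(target=reverse_proxy, args=(p, backend_addr)),
]
for t in threads:
    t.start()

# The client knows only the proxy's address.
with socket.create_connection(proxy_addr) as client:
    client.sendall(b"customer-42")
    response = client.recv(1024)

for t in threads:
    t.join()
print(response.decode())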

I was an infrastructure guy, and then I started a whole company around single sign-on, access controls, and then that company wound up selling products all around the world. That was OpenNetwork, and it was acquired by BMC. But it all started with, literally in the very early days, me writing some programs.

I know you're asking about messaging. I'm maybe one or two steps away from that, so bear with me. My first foray was single sign-on, access control, identity management.

It's funny because I had some involvement as a consumer, let’s say, on the Internet in the early 90s. I got involved in email messaging and looking at how email messaging works. One of the things that I found interesting about the early internet was this concept of almost implicit trust.

When you're sending data, it is always assumed that you are the legitimate originator of the data, and that everybody who touches that data as it transits around the world is going to do the right thing. They're going to hand it off to the next entity to get that data where it's supposed to go. The person who gets it is going to say, “Yes, I'm the intended recipient.” Everybody trusts everybody throughout that entire process. To me, it was like, “Oh, this was definitely made by educators and purists who didn't think about the bad guys yet.”

That's a great point, and that's a really good segue to the messaging idea. I spent the first 10 years of security and privacy on this infrastructure: how we authenticate people, and how we make sure people can only go to the right places.

But during that journey, you start finding out that the tools we're using, like email, were exactly what you said. They were assumed to be for internal department and DARPA communications, and there's no real concept of authentication built in, if you will. There's no real concept of agreed-upon encryption standards that keep those messages secret. It turns out that they pass through many different routers and switches, and those routers and switches can be compromised.
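The missing-authentication point is easy to demonstrate. In the original mail format, the From header is just text the sender fills in; nothing in the protocol itself verifies it (mechanisms like SPF, DKIM, and DMARC were bolted on decades later). A short Python sketch, with a deliberately forged, made-up address:

```python
from email.message import EmailMessage

# Nothing in the message format authenticates the sender: the
# "From" header is simply whatever text the sender chooses.
msg = EmailMessage()
msg["From"] = "ceo@yourbank.example"   # forged; no proof required
msg["To"] = "victim@example.com"
msg["Subject"] = "Urgent: verify your account"
msg.set_content("Please confirm your password at the link below.")

print(msg["From"])
```

This is exactly the gap phishing exploits: the receiving side, absent later add-on checks, has no built-in way to tell a forged sender from a real one.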

You eventually figure out that, “Oh, wow. Email is just absolutely insecure. It's vulnerable to imposter attacks.” You might call it social engineering, but I'll just say imposter. That turns into these phishing attacks where you send these false emails. That leads to account takeover and credential stuffing.

The public is almost forced to use these insecure communication channels because businesses use them: email, most certainly text messaging, and I would include social messengers in there too, not so much for the encryption but because they're vulnerable to imposter and social engineering attacks. The public is left with a collection of tools that were absolutely never intended to secure our communications.

They have fed this incredible outbreak of crime where cyber crime is going to be the third largest economy in the world after the US and China by 2025, I think it is, and it has hit $8 trillion of damages now. That's my response to the messaging question. You go from, “Hey, we're all good people, and this is all about democracy. We're all going to be better,” to, “Oh, my gosh. The world is maybe not what I thought it was out there.”

I don't know that it's called a knee-jerk reaction, but I think now we're swinging the opposite direction on messaging and moving data in that we're end-to-end encrypting stuff, we have messages that disappear after we receive them, that Apple is telling federal government, “Hey, we can't tell you the text messages between these people because it's encrypted. We don't see this; we're not going to give it to you,” and that has almost swung from this implicit trust to now we're intentionally putting in systems that have implicit distrust of everybody in the process.

I don't know if I can say anything better. That's about it. It's really tough to respond when the host does it so well. That's right. If you think about things like WhatsApp and the founders of WhatsApp, their intention was to create a messenger that gave people privacy and security. I think their interests were really well-intended and genuine, and they did that.

Famously, Facebook bought them while it was still called Facebook and not Meta. The treatment of privacy and security frustrated at least one of the founders into departing WhatsApp and backing Signal, which is another secret messenger.

Snapchat was probably the first to come along and make disappearing messages famous, and that was basically for kids to send messages that their parents couldn't see. There's an origin story around Snapchat, but we'll leave that out for now. Disappearing messages are certainly a big part of it.

Now, we've got a collection of messengers, when we include Telegram and others in there, whose entire purpose is to prevent legal due process. In other words, even with a warrant, law enforcement is not able to do their job.

While maybe that's a great use case if I'm a reporter somewhere around Gaza in fear of my life from one side or the other, and I can't ever have anybody look at my messages, I get that. But encryption combined with “we're not going to cooperate with law enforcement at all” breeds criminal use of these messengers. Famously, criminals, terrorists, and every possible nefarious activity that you can think of flourishes, in part, off the ability to communicate without any inspection ever by law enforcement.

I don't want to characterize any one messenger. They're all slightly different, including iMessage. But as a general sentence, I'd stick by what I said.

It's an interesting argument between philosophies, so to speak, that while we should have absolute security and privacy, we should also have safety. Yes, we want the reporter that is reporting on oppressive regimes to be able to do that in a way that doesn't put their family at risk, but we also don't want the local crime boss disseminating his hit list to his hitmen via secure messenger and us not being able to do anything about it or know about it. It sets up this almost purist argument between these two things.

Yes, I think there's a different use. By the way, this is all great debate. In other words, I don't think it's as simple as saying, “This is the answer,” but there need to be alternatives in the marketplace.

For example, a balance between the public's right to privacy, and we do have a right to privacy under the Fourth Amendment and protection from warrantless search, and I think we all deserve that, versus legal due process where, yes, Kurt is suspected of insider trading and the SEC or a legal subpoena of some kind says, “We can take a look at his messages to see if he really is trading on that insider information and making money.” So I have the right to be protected until there's a legitimate court order saying otherwise.

The only other thing I'd say right there, Chris, is you could make an argument to say, “Yeah, but you can't trust the courts, or I don't trust politicians and I don't trust this.” At some point, I think we all have to make a stand and say, “We have to trust somebody, and I'm going to continue to trust the court system.” That's probably my best chance at having something I can trust, because if we don't trust anyone, civilization breaks down.

We founded a company called BUNKR that does just this. It tries to strike a balance in messaging between the public's rights under the Fourth Amendment, the right to legal due process, and the right of victims to be able to seek justice through legal due process.

If you're a financial institution, because you were talking about the SEC earlier, and your executives want to communicate about the business with one another, maybe they shouldn't be doing that via, let's pick on Snapchat for this conversation, because they can send a message to someone else in the company, it disappears, and there's no paper trail of what the executives of those financial institutions are doing.

I think consumers can look at that and say, “Yeah, in this situation, maybe they shouldn't be allowed to use end-to-end encrypted, non-subpoenable apps, and should instead use platforms where, under the right circumstances, yes, the courts can reveal those conversations.” Is that what BUNKR is trying to be?

Yeah. We can break that down a little bit. For example, in the finance industry, the Securities and Exchange Commission has weighed in and said, “Hey, listen. These apps that don't support legal due process, all the ones we just named basically, you are going to be fined heavily if you use them.” For the public good, I think we want transparency when it comes to our money.


Money's hard-earned. It does not come easy, as all of us know. I don't think any of us really want a bunch of insider trading on our money. The SEC has stepped in and fined the banks billions of dollars saying, “You guys can't use these apps.” Your question was a little more oriented toward the public and to say, “Well, why can't I use them?”

I'll give you a use case. Let's take law enforcement, and it could be the FBI. The FBI is handling, almost by definition, dangerous people. Some of them provide very important information to the FBI. In this case, that agent has to be accountable for their interactions with a dangerous person. Maybe that person is providing a tip. Maybe it's an inside source. Law enforcement might take actions based on that informant.

Law enforcement doesn't want a secret messenger, because if things go wrong, it's, “Hey, you told us this.” “My guy told me that too.” That doesn't work for them. Law enforcement wants to be seen as legitimate. They want to be able to show, “Hey, this is the chain of evidence as to why I took the action based on this informant.” The last thing they want is not to be able to prove themselves. That's law enforcement.

I'll roll over to one more, Chris, if you'll indulge me. This is based on a real story, and it's happening more and more frequently. I'm going to tell it a little bit abstractly. Forgive me, but I want to do it that way to protect people. Let's say we have an insider trader who isn't just an insider trader; they do legitimate business and they have legitimate friends. Insider trading is not their sole purpose in life.

I'm just going to call it a secret messenger. As they use their secret messenger, they communicate with a large number of people who are legitimate. Now, all of a sudden, law enforcement has no ability to look into the messages. They know they have the one person who is indictable as an insider trader. All they know is this person communicates with all these other people, but they're using a secret messenger with disappearing messages.

Guess what? Law enforcement shows up at the house of, what you could say, the innocent, and says, “We know you have a relationship; we know you communicate. You need to produce the evidence that shows you're not talking about insider trading.” Guess what? They can't do it. You're not convictable, but now you're dragged into a lawsuit or a criminal investigation because you can't defend yourself.

It's not intuitive how this is playing out in the marketplace, but law enforcement doesn't have any choice but to be, and I want to choose my words carefully, a little bit suspicious of people using these messengers. Ultimately, you can get caught up in the web of criminal activity, and now you can't defend yourself.

It's guilty by association, but not legally guilty by association.

That's right. That's why I used the word suspicious, so I was trying to balance that out.

It's like, “Oh, hey. You've been communicating with this bad guy. We think you might have been part of the scheme.” I suppose those get out into court records, those get out to the public, and all of a sudden it's, “Hey, Kurt communicated with this guy who was convicted of insider trading. Is Kurt part of it also? We see that he's doing well financially. I wonder if, I wonder if, I wonder if, I wonder if…” that sort of thing.

Correct, and then it can even cost people their employment, because their employer doesn't want the affiliation, whether it's a director, an executive, or anyone. For those positions particularly, employers do not want people in sensitive roles having any affiliation with criminal activity. You can say, “Hey, listen. I didn't do it,” but you can't really prove it without a very, very lengthy discovery process, which is, in itself, not fun.

Yeah. Do you want to have to turn out your entire life of, “OK, here's every financial transaction I've ever made. Look at all my banking, look at all my stock trades.” Turn all that out and have someone turn your life upside down and inside out and say, “OK, yeah. I guess we're OK with that.”

You're right. In healthcare, it's the same conversation in terms of regulatory action. In finance, it's the SEC enforcing these laws and rules; on the healthcare side, it's more famously HIPAA and the Office for Civil Rights, which is the enforcement body. You've got multiple drivers of the need for a balanced messenger that protects privacy and security for the public, but respects the Fourth Amendment and tries to balance that with law enforcement and regulatory bodies.

It's interesting when you talk about medical because I definitely want my doctor to be able to communicate with my pharmacist in a way that is secure and protects my privacy. But if the doctor does something wrong and prescribes the wrong medicine, I want a chain of evidence showing that the doctor made the mistake in communicating to the pharmacist. There are some ways where security and privacy help us retain rights that we might otherwise lose, if that makes sense.

Yeah. Again, the host has said it better than I can. It's 100% correct. You want to, again, strike the balance. I think every physician knows you're not supposed to text message with your patients. It's not an indictment of all text messengers, it's just that generally, it's very difficult to get cooperation. You can't use disappearing secret messages for some of the reasons that you said.

Briefly, I founded a company called FairWarning. We invented and patented a whole series of analytics and techniques to discover bad actors within the healthcare system, doing everything from identity theft to Medicare, Medicaid fraud, to opioid diversion, to snooping on your neighbor, to blackmailing people, to moving yourself up donor lists. That's a whole wild 15-year story, but we did that.

One point: I don't know what it is now, Imprivata owns FairWarning now, but about 60% of the hospitals in the US and many around the world used our software to protect patient privacy. The issues that you just raised are incredibly important, because even the Chinese were famously siphoning medical data from health systems and insurers in order to blackmail ambassadors, people on the ground, and senior-level defense contractors. Our medical information is a particularly human vulnerability for us. The healthcare industry is something near and dear to my heart. They have special challenges, and yes, they have the same challenges as you described with the messaging.

Do you see more and more legislation coming around messaging platforms that have a retention policy versus communication platforms that intentionally have a non-retention policy? Like you were talking about with the SEC starting to fine companies, do you see other industries where that's just going to become the standard of, “Well, you just can't do this anymore”?

Let me start in the States first; then I'll pick up the UK and France as examples, and the European Union in totality. In the US, we've got some logjams. Congress, shockingly, has drafted legislation at different times and has made it partway through the process. But to my knowledge, it's not anywhere close to being done.

In the United States, we're going to live with this industry regulation, state regulation, lack of federal regulation. This is the condition that we're in for the foreseeable future. In the UK and France, it's really different. The UK has passed legislation that bans these secret messengers. At the same time, they can't quite get the courage to enforce it.

They've got themselves where they've passed legislation, but I don't really see any evidence that they're going to want to enforce it. They're bordering on over surveillance. In other words, skipping the warrant process as we know it and maybe over-surveilling. That's the UK.

France has similar-type circumstances. I don't recall if that legislation actually passed, but it was on the precipice. Then you have similar legislation in the EU. In those markets, it's like, we do want to open this up so that we can look after the bad guys, but we can't overreact and just open it up, giving the EU the ability to insert some malware on your phone so that they can listen to your mic without even having to go through a warrant.

I do believe that this is amongst the most important issues of the 21st century: trying to carry forward hard-earned human rights that we say we believe in, in our Constitution and through the United Nations and its special rapporteurs, but they're getting very loosey-goosey with it all. That's what we're trying to do with BUNKR: bring a balance to this thing. That's my viewpoint on that.

Are you also trying to move that for consumer communication, or is your particular drive at the moment, this is predominantly for businesses where there are regulatory issues that need to be managed, and your position isn’t, “Hey, no consumer should ever use end-to-end encryption between family members”?

We've got public usage in 32 countries around the world at this point. It's super popular with families. It contains other elements of what might add to a family security posture, like password management, secure cloud storage. We've got families using the product around the world. We have athletes using the product around the world. We have entrepreneurs, we have physicians, we have pilots. We have a lot of great use cases for the public usage of it, and it grows every day.

It's like a grownup WhatsApp, WhatsApp for grownups. You're not going to get any spam. You're not going to get any imposter attacks. Yes, you might have to cooperate with law enforcement, but if you want to avoid something like that, go use a different messenger. It's worked out really great for legitimate public users and legitimate businesses.

I think you make a good point. If the platform has a little bit more control about what is happening on the platform, you're going to have less of the criminality on that platform.

Yeah. Free means there's no, and I'll use the word authentication, no certainty that you are who you say you are. Let's just leave it at that. You open it up, you've just got a registration page, and anyone can come on and start using it. What you get is criminals from around the world absolutely bombarding the platform. They are using it to communicate. They're using it for secure file storage. That is well-established.

The first step is having someone pay for it just a little. We're offering it to the public at 99¢, and they have to have payment on file with the App Store or the Play Store. Is that perfect? No. Does it get rid of about 99.9% of the bad guys? Yes. Then when you say, “Oh, hey, listen. We might cooperate with law enforcement,” that's going to get rid of the other bad guys. What you get is legitimate people who want to conduct legitimate business on the platform, and that's what we have.

I like that you're able to chase away the criminals to some extent from using the platform by making it clear that, “Well, we're going to cooperate with law enforcement so you, Mr. Criminal, really don't want to be here.” It just makes it a better environment for the people that use the platform. I like that.

It's the radical idea of providing a legitimate software service at a low price for legitimate public users and legitimate business. That's radical now, but that's who we are. By the way, you chase away the bad guys. We've got invitation-only messaging, so you literally cannot get a message from someone that you don't know.

Yeah, and that's a lot of where the scammers come in, all those text messages. “Hey, this is Bob. Is our appointment still at 2:00 today?” It's not Bob, and he doesn't have an appointment.

Yes, and there's not an Amazon package on its way that you need to fill out the credit card for or whatever it is, all these crazy things that you get. We have a lot of important people on the platform. We tend to attract a lot of very important people. They tell us, if you ever open that platform up and let people look for us and try to connect to us, we will quit. It's a very private, very secure, legitimate platform.

I like that. Maybe this is not something you can necessarily talk about, but how do you deal with the platform being used in countries with oppressive regimes? Let's just say North Korea, because I don't think anyone from North Korea is going to come after me for calling them out.

A lot of it will come down to the payment. We will get some registrations from difficult countries, but we don't get a lot of them. This is the vision. The vision is more like, “Listen. The entire planet deserves to conduct their affairs with dignity at bank-level security.” That's the statement.

Now you start saying things like, “Well, does that mean that we block Yemen?” The answer to that is a no. We're not going to block Yemen. What it means is that someone from Yemen can register, pay for the platform, and communicate with their family. But by the nature of our architecture, they literally cannot reach out and go after people outside of their connectedness. And it's worked out great.

I really do want to reach legitimate people all over the world and maybe even more so if you're in a difficult country where we could help you have a stable, decent life. That's a pretty good idea.

I guess my question was more of: what if Yemen came to you and said, "Hey, we are legitimate representatives of the Yemeni government, and there are two individuals in our country who we know have been communicating on your platform. Our laws say we're allowed to see their messages, but US law says, 'No, you're not.'"

Yeah. I'm going to be forthright with you. We literally have lawyers. If you're the average of the five people that you're around the most, I'm like, "I'm going to turn into a lawyer, I'm pretty sure." We have lawyers to help us figure that out. My tilt would be toward the US State Department.

That is my allegiance. I'm a citizen and a law-abiding citizen of the United States. I think it's the greatest country on earth. While I have incredible respect for everyone, my leanings are going to be toward what our State Department thinks. We might have some difficult decisions in the future for us, but I welcome them.

Got you. I know there are a number of entities out there that say, "Hey, we have customers in Yemen, and if the Yemeni government comes to us, we follow the rules there, even if it's an authoritarian government and we don't think it's a good use. That's just what our policy is." You've taken the stance of, "Look, we're going to adhere to US values and US laws as much as we possibly can and treat all of our users with that kind of respect."

Yeah, and that's not a legal position; it's more of a values and emotional position. The answer to that is yes, but I very much value legal counsel's input. We'll always seek their advice in all the decisions we make, but my leanings are as you described.

And that should be comforting to people in those countries who want to use the platform. But again, that assumes it's being used for legitimate purposes. With authoritarian regimes, you and I might see something as legitimate, and they don't see it that way.

I agree with you.

Let's not paint you as a target for those countries to come after you.

Hey, listen. They've already tried to break in. We had 85 countries in parallel trying to break in on day one. We're used to being attacked. It's the nature of the business I've been in for 30 years. There are always people on the outside that would love to compromise you.

I have a behind-the-scenes question. If you can't or don't want to answer the question, we'll either remove it or let you not answer it on the podcast. Do you see attacks on your platform ebb and flow with the news cycle?

Yes. There's a layer of defense around the platform. You can see them hit it. They make no penetration whatsoever, but there is definitely a hacktivist element to anything that's on the Internet now.

There's even a politicization of cyber attacks: some go after infrastructure, and some aim to embarrass people. Maybe they take down a website and put something up in its place. Maybe that's not an infrastructure attack, but yes.

The proxy wars we're involved in, in Russia and in the Middle East, drive behaviors around the world. On my blog, blog-bunkr.life, I've written about that extensively.

Very interesting. As we wrap up, if people are interested in BUNKR, where can they find out more about it? Where can they find out more about you?

Bunkr.life is the website. My personal profile on LinkedIn is Kurt Long. Whether you search for Kurt Long at BUNKR or Kurt Long at FairWarning, you're going to find me. It's Kurt Long. I'm actually kitesurfing in the picture. If you see me kitesurfing, that's me in between trying to stop the bad guys. We're on Instagram under @bunkr.life. On X, I think it's @bunkrlife or @bunkr.life.

These days, you can never trust your intuition when it comes to names of apps and platforms because for one, every properly spelled word has been used.

Yes, that's a real thing. It's a real story. Between the domain name holders and the trademark holders, it's tough to get some space in there.

You and I are laughing, but it is unfortunately very true that coming up with a consistent name that you can use for a domain, your app, and the social media accounts is really painfully hard to do.

Yeah. One last thing about the name: we conceived of this thing somewhere in the COVID era. People were not yet acclimated to the new world we live in. They were like, "Are you sure BUNKR is the right name?" At the time, I thought, "That's a good name." As time went by, they said, "Man, you got a great name, BUNKR. I sure wish I had a BUNKR right now."

Everything comes full circle.

It does. It's been great to be with you, Chris. You had a great intuition for all these topics. I appreciate your questions.

It's only because I talk to so many people like you. To me, it's always fascinating, even seeing how our views on privacy and security have shifted over the last couple of years. I think probably four years ago, having a conversation about, "Hey, we need to have a platform that's secure and subpoenable," I would have been tarred and feathered. But I think now, we're starting to see the impacts of, "Oh, if everything is secret, maybe some bad things actually happened because of that."

Yeah, I mean we've got to look to law and order from somewhere. I'm the first person not to trust a politician. We all have our suspicions, but we've got to look to law and order from somewhere. We can't allow ourselves to decay into a society that doesn't respect privacy whatsoever, where people are easily taken advantage of through the information that's known about us.

On the other end of the spectrum, just having so much protected criminal activity that law enforcement can't do their job, I don't think that's the route we want to go. I think we want to go for the right balance of law and order. I do say that in a balanced way. I mean that with all my heart.

I love it. Kurt, thank you so much for coming on the Easy Prey Podcast today.

Thank you, Chris.
