In a world where we’re told to carry our entire lives in our pockets, we’ve reached a strange tipping point where the very devices meant to connect us have become windows into our private lives for those who wish us harm. It’s no longer a matter of looking for the “shady” corners of the internet; today, the threats come from nation-state actors, advanced AI, and even the people we think we’re hiring. We are living in an era where the most sophisticated hackers aren't just trying to break into your phone, they’re trying to move into your business by pretending to be your best employee.
Joining the conversation today is Jared Shepard, an innovative industry leader and the CEO of Hypori. A U.S. Army veteran with over 20 years of experience, Jared’s journey is far from typical; he went from being a high school dropout to serving as a sniper and eventually becoming the lead technical planner for the Army’s Third Corps. He is also the founder of Intelligent Waves and the chair of the nonprofit Warriors Ethos, bringing a perspective shaped by years of advising technologists in active war zones.
We’re going to dive deep into why Jared believes everything you own should be considered already compromised and why that realization is the first step toward true security. From the terrifying reality of his own 401k being stolen via identity theft to the future of “dumb terminals” that protect your privacy by storing nothing at all, this discussion challenges the status quo. We’ll explore how to navigate a future where AI can fake your identity in real-time and why the ultimate battle in cybersecurity isn't against a specific country, but against our own human tendency toward laziness.
“If your security model depends on perfect behavior, it’s going to fail. People are human, and attackers know that.” - Jared Shepard

Show Notes:
- [02:12] Jared Shepard of Hypori is here to discuss how modern cyber threats actually play out in real life.
- [04:48] How modern attacks unfold slowly instead of triggering obvious alarms.
- [05:55] Why many victims don’t realize anything is wrong until secondary systems start failing.
- [07:56] What identity theft looks like when accounts are targeted methodically over time.
- [08:48] How attackers prioritize persistence and access over immediate financial gain.
- [10:32] A real attempt to take over long-term financial accounts and how it surfaced.
- [13:07] Why financial institutions often respond late even when fraud is already underway.
- [15:44] The limits of traditional identity verification in an AI-driven threat environment.
- [16:52] Why layered authentication still fails when underlying identity data is compromised.
- [18:21] Deepfakes, voice cloning, and why video calls no longer prove much.
- [20:57] How laptop farms are used to bypass hiring controls and internal access checks.
- [22:18] Why insider-style access is increasingly coming from outside the organization.
- [23:33] Why some companies are quietly bringing back in-person steps for sensitive roles.
- [26:09] SIM farms, mobile identity abuse, and how scale changes detection.
- [28:47] The growing tension between personal privacy and corporate device control.
- [31:22] Why assuming device compromise changes everything downstream.
- [33:58] Isolating data from endpoints instead of trying to secure the device itself.
- [35:12] How moving compute and data off the endpoint reduces exposure without requiring device monitoring.
- [36:35] How pixel-only access limits data exposure even on compromised hardware.
- [39:11] Why AI training data introduces new security and poisoning risks.
- [41:46] Why recovery planning is often overlooked until it’s too late.
- [44:18] The problem with victim-blaming and how it distorts security responses.
- [46:52] Why layered defenses matter more than any single tool or platform.
- [47:58] What practical preparation looks like for individuals, not just enterprises.
- [49:12] Rethinking privacy as controlled access rather than total lock-down.
Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review.
Links and Resources:
- Podcast Web Page
- Facebook Page
- whatismyipaddress.com
- Easy Prey on Instagram
- Easy Prey on Twitter
- Easy Prey on LinkedIn
- Easy Prey on YouTube
- Easy Prey on Pinterest
- Jared Shepard – Hypori
- Jared Shepard – LinkedIn
- Warriors Ethos – Jared Shepard
Transcript:
Jared, thank you so much for coming on the podcast today.
Appreciate you having me.
So can you give myself and the audience a little background about who you are and what you do?
Sure. So my name is Jared Shepard. I'm the CEO of Hypori. I also have many other identities. I'm the founder of a company called Intelligent Waves, which was my first IT company; I started it about 20 years ago. I spun Hypori out of it. I'm also the founder and chair of a nonprofit called Warriors Ethos that focuses on helping veterans transition out of the military into their next career, whatever they're going to do next, to be successful. And a father of three kids. You know, I have a good life.
You've got your hands full as well, it sounds like. So my understanding is that you served in the military, and when you were in the military, were you doing cybersecurity for the military?
Not initially. So I've kind of had the anti-CEO story, as in I don't come from the common pedigree. I was a high school dropout, homeless kid at one point, who ultimately joined the Army to get off the streets. I joined the infantry. I got to do some cool stuff in the infantry, to include serving as a sniper and getting to go to different cool places. And it led me to Fort Hood, Texas, where I made the decision that I was going to do something different. And I ultimately re-enlisted and became an IT guy, which is what taught me the baseline for IT.
But I ended up being picked up by a three-star general on September 11th of 2001 to be his communications NCO. So I got to be a strap-hanger in a corner of a room with one of the more senior leaders in the US Army, being mentored by the most senior leaders in the government, well-known names that we all know. And I'm just a strap-hanger in a room and watching. But I realized the world was way bigger than I ever dreamed it was. And so I wanted to have a bigger impact. And ultimately, I made the decision to get out of the military and became a consultant, a contractor for a little while, and then started my first company. I actually started my first company in Baghdad, Iraq. And I've been kind of running my own companies ever since.
That's neat. So was your intent to go into IT or was this just kind of like, I think this is where my future is?
So, yeah, I don't think it was ever my intent to go into IT. It was actually one of my best friends who joined the army with me. I was getting ready to go to what's called SFAS, which was the selection to go become a Green Beret. And he was like, why would you do that? Go be an IT guy. And I was like, well, why would I do that? And he said, you could go get one certification and make $70,000 a year. And I was like, that's a lot of money. Like, wow.
And so ultimately, I did. I decided to reenlist in the Army. The Army trained me, gave me the skills, and gave me the opportunity to really create the career field that I'm in today. What I do today as an entrepreneur, as a CEO, as a guy who's spent over four years in war zones as a technologist and as an advisor to technologists, is I took this fusion of understanding of technology, understanding of the operational requirements, the business requirements of the very ragged edge, and then just an entrepreneurial spirit and desire to have an impact. And thus, that's what I do.
I like it. So before we get into the meat of our discussion, I really like to ask my guests, particularly those in cybersecurity, if they've ever been a victim of a cybersecurity incident, fraud or scam. Because if you and I, who are the experts, can't get it right 100% of the time, or our teams can't get it right 100% of the time, the audience shouldn't be discouraged, depressed, humiliated, embarrassed if they can't get it right 100% of the time, because the truth is, we're all being targeted all the time. So do you have a story that you could tell?
Yeah, multiple. I mean, I was in Iraq actually when the massive USB exploit happened and that caused us to have to do backflips and try to protect our networks because we knew there were nation-state exploits that existed. I've had my first 401k stolen with identity theft. And they were really, really advanced. Although it was crazy when the bank sent me a picture of my passport, but it was an African man from West Africa. But it had all the right information on it, right? They were very, very advanced in a way that they were able to do that. And luckily the bank was able to protect me from the loss there with the methodology that they use to do that.
And then all the way to today across my two companies, I probably on an every other day basis, one of my employees is like, hey, did you just send me a text message asking me for gift cards or for Starbucks cards or for Apple cards? And so it happens every day. And you have to just, the requirement for diligence is just never-ending. Like, before you click, know what you're doing.
“The requirement for diligence is just never-ending … before you click, know what you’re doing.” - Jared Shepard

Yeah, I have a similar experience to the passport story you were telling. Luckily, it wasn't a 401k account, but I'm going to try to be intentionally vague here: someone tried to do an account takeover on a significant platform against me, and they produced fake corporate documents, a fake passport, all with my information on it. But clearly not my photo. I guess even if it was my photo, it's not like the people at that company would have known. And it started me down this process of thinking, okay, so does that mean if I open a 401k, I should be providing them a copy of my passport and my photo so they know in advance what I look like in case this sort of thing happens?
Like with me, the first thing they did was they just kind of very casually called in. I got to work with the FBI some on it. They very much just casually called in and said, hey, my cell phone number changed. And so they changed their phone number first. And then once that was done, a couple of weeks later, they called back and they were like, hey, I need to update my e-mail address. And so then they updated their e-mail address. And then it was like, well, I'd really like to take my 401k and I'd like to move it to a different bank. They didn't want to withdraw the money, because that would have raised all kinds of flags. They were like, hey, I want to consolidate my 401k all to this one bank. And they said, okay, that's a kind of normal thing.
So they set up a call with the other bank and they were working with that. And the guy calls the other bank to confirm they received the money. And at the last minute, he's like, you know what, actually, I'd really like to look at gold or Bitcoin. So then they put all the money into Bitcoin, and then the money's gone, right? Instantly gone. And then they kind of figured that out. They were like, did you just liquidate your entire 401k and put it into Bitcoin? And I was like, who is this? And they were like, yeah, that's what we thought the answer was. They're like, okay.
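The takeover pattern described here, a phone change, then an email change, then an outbound transfer, weeks apart, is exactly the slow sequence that trips no single alarm. As a minimal sketch, here is how a fraud team might score it; the event names, window, and scoring are all illustrative assumptions, not any real bank's system:

```python
# Hypothetical heuristic: individually routine profile changes become
# suspicious when they occur in the classic takeover order within a
# short window. Everything here is a toy illustration.
RISKY_SEQUENCE = ["phone_change", "email_change", "external_transfer"]

def takeover_risk(events, window_days=45):
    """events: (kind, day) pairs. Returns how many steps of the
    takeover sequence occurred, in order, inside the window (0-3)."""
    stage, start = 0, None
    for kind, day in sorted(events, key=lambda e: e[1]):
        if stage < len(RISKY_SEQUENCE) and kind == RISKY_SEQUENCE[stage]:
            start = day if start is None else start
            if day - start <= window_days:
                stage += 1
    return stage

# A lone email update is unremarkable; the full chain scores maximum.
print(takeover_risk([("email_change", 5)]))                  # 0
print(takeover_risk([("phone_change", 0), ("email_change", 14),
                     ("external_transfer", 30)]))            # 3
```

A score at the top of the range could trigger the out-of-band identity check that, in the story above, only happened after the money had already moved.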
And so were you actually able to get the money back or the transaction gets stopped?
The bank actually protected me, in that I think they recognized they had a breakdown in validating who I was before they allowed the money to be transferred. Interestingly enough, though, the FBI has a threshold of something like $300,000 or more; anything shy of that, they really don't investigate.
Which is like …
That's the average person's livelihood. And they're like, man, that's not enough money for us.
I think that's probably more than the average. I seem to remember that the average person in the US, when they hit retirement, has about $100,000 in their bank account. And for the FBI to say, nope, that's 1/3 of the threshold for us to even get involved.
Yeah. And what's scary is when you think about like, what was the total cost to execute September 11th, right? It was not a massive multiplier over that $300,000. You pull that off 3, 4, 5, 6, 7 times in a row, all of a sudden you could fund a terrorist activity, right? I mean, the potential impacts to something like this is real.
Yeah. And it's only gotten more complicated since then. Now we've got, it used to be, hey, I'm just going to call up on the phone. Now we've got devices, we've got all sorts of equipment floating around.
Take AI. Look at any of the new AI generation mechanisms, and guys like me and you who have podcasts out, where we've been out doing things, right? It would take AI no time at all to reproduce something with us. So now, yeah, you could send a passport, and then I could literally meet that person live on video and say, I'm looking at changing my money over, and nobody would be the wiser.
I know, and it's awful. When I started the podcast, the advice I would often give people was, hey, jump on a FaceTime call, jump on Zoom with someone, because faking that in real time was just too complicated for someone to do. And if there's anything hinky about that, if they're like, oh, I'm in the airport, I've got a bad internet connection, okay, then you know something hinky's going on. Nowadays, that advice doesn't hold up.
A friend of mine's a CTO over at Mandiant, and I know another lady who runs threat defense for Google. And some of the stories they talk about, like these companies that are unknowingly hiring multiple North Korean operatives into their business. And they have laptop farms in the eastern United States, in Virginia, New York, et cetera, where they're remoting into these laptops to then be present on the laptop to do an interview. So even if cyber is checking, like, hey, where is this guy calling in from? Oh, he's calling in from Reston, Virginia. Okay. But no, actually he's remoting into a Reston, Virginia laptop and then interviewing in.
And then when they finally reveal it to this company and say, hey, we think you have six employees who are potentially North Korean operatives working in your business, the poor software developers are like, well, who? And when they tell them who, they're like, no, that's my best employee. Like, what do you mean? And they go, yeah, of course it is. Because it's actually like five people doing the job, because they want to outperform. And they're doing it so that they can then get your code or introduce new things into your environment. Or they can recommend somebody else for a job that has access to more sensitive things. The advancements are amazing.
I guess this is now a commercial for the end of work-from-home.
I mean, so you talk to some of the big firms now, a lot of the big firms now for any of the senior level jobs, they require an in-person interview. So they'll put you on a plane to make you come and do an in-person interview.
And if you think about it, the cost of that is trivial compared to someone's annual salary.
100%, especially when you're thinking about what the cost of exploitation is if you're wrong.
Yes, I mean, multi-millions of dollars, tens of millions, hundreds of millions of dollars.
Maybe your entire business, your IP, I mean, you name it. Yep.
Let's fly the person in.
Yeah, I mean, imagine what happened to SolarWinds, right? If that was perpetrated by an inside actor, right? I mean, you could have collapsed the entire multi-billion dollar value of an organization by simply opening a back door.
Well, I think of any company that runs a large portion of the internet, or a large portion of the security infrastructure it runs on. Look, we're recording this in late November; Cloudflare had an outage just a couple of days ago, and 1/3 of the internet was down for like 8 hours.
CrowdStrike, AWS's DNS errors. When they're that large of a provider, it's just amazing how even something internally inflicted can be crippling.
Yes.
And I would argue that we've never really seen a near-peer like China actually go against our infrastructure yet.
Yeah, I mean, these things are accidental, human-caused errors, let alone what concerted nation-state actors could do deliberately.
Look at the phone farm they found in New York just a couple of months ago, that had, what, 4,000 phones on it. They were able to do social media influence and liking, right? Okay, well, yeah, that's cool because it influences social media from a very specific location. But it could also be an absolutely focused DDoS. All launching simultaneously, you could produce petabytes' worth of data against a single target, and what would you do?
It's funny, I have not read much about the SIM farm, but I was talking with someone about it, I think maybe it was my brother or something like that. The original news stories were like, we think this is an attack on, I think there was a UN event right around the time. And they're like, oh, well, it was clearly an attack on the UN event. I'm like, to me, no, that doesn't make sense. What makes sense is it's really easy to stick 4,000 SIM cards in an abandoned building in one of the most dense cities in the world, where an extra 4,000 phones wouldn't be noticeable.
Doesn't even blip the radar.
If I were to take that same 4,000 sim farm to, I don't know, some little city of 600 people?
Arkansas, right? Yeah, I mean.
That has like 3 cell towers, and all of a sudden 4,000 cell phones activate, they're gonna be like, ha, this is suspicious. But in the midst of a super population-dense area? So to me, it wasn't location-specific. I mean, maybe it could have been, but it's just easy to hide in the noise of however many tens of millions of people funnel through New York on any given day.
Yeah, I would say to your point, I would say actually it was location-specific, but probably not event-specific. It wasn't driven just off the NATO thing going on. I think that was a nation state or similar activity that was being put into place for strategic use and probably for long-term strategic use if they could have.
Yeah, but probably not targeting actually New York. Or maybe, who knows?
Well, I mean, just using New York as a launch point for attacks, probably anywhere. Yeah.
So that kind of brings us to, we've been talking kind of around the edge of edge devices and mobile. And I was having a conversation earlier in the week with someone who's talking about the privacy aspects. Things are getting really, really murky, particularly in the corporate private world where you now have a single device that people are using both for corporate and for personal, as opposed to companies saying, Hey, here's your corporate device. This is only for corporate use. And don't you dare do your personal stuff on it or vice versa. How do we start dealing with that from a privacy and a security perspective?
I mean, you just kind of underscored the tagline for my company, but because that's what we do. We do it because we think that the days of carrying around multiple devices are probably largely dead, right? I mean, it was an archaic practice. Who wants to do that? I think the up and coming young men and women in our nation and across the world, really, they want the sensation of the ability to do anything from this platform. And if you look at even the big producers, the Apples of the world, the Androids, the Googles of the world, the Microsofts of the world, they're all trending towards mobile operating systems, right? Because I think they kind of know that the world is going to go mobile.
So how do you, as a business, leverage the capabilities of having an employee with twenty-four-seven access from a device like this, but do so in a way in which doesn't compromise corporate information or corporate infrastructure or corporate data? And then inversely, how do you do it in a way in which you're also not then compromising, if you're the user, your privacy, because it's your device, right? And how do you accomplish that? Now, I'm going to tell you how Hypori works to address part of that.
But then you have the next problem, which we haven't seen yet, but we've gotten all the indications that it is going to make this problem even more complex. How many of these operating systems are now coming with AI built into them organically? And you know what the number one downloaded free AI in the world is? DeepSeek. And it's a great, high-performing AI if you're looking at benchmarks. You just have to be comfortable with the fact that, like, midnight every night, it's exporting gigabytes' worth of data out to China. So yeah, the real world of privacy …
Europe's been struggling with this for a while with GDPR and the separation in GDPR from a privacy standpoint. But then in America, corporate rules have taken over more so than individual privacy rules. I think we're seeing an emergence of individual privacy rules, but these massive engines of AI are kind of just undermining that, because how do you train these AI models if it isn't going to be off of a combination of personal and corporate data?
Yeah. And then what are they going to do with that data? Even with, like, ChatGPT, let's use them as kind of the generic example. Their original policy was, we're going to store the prompts for 30 days. Then they got sued, and now they have to store prompts indefinitely for everybody. And I think that, in theory, just ended about a month ago.
But it's a slippery slope. Like inversely, like you look at Europe, like you can't move data across national borders without written consent from an individual, which is cool right up until you actually try to create a collaborative platform of any kind.
Yeah. Yeah. Because then you can't move anything. Right. And to me, as a user, I'm not even in a position to be able to keep track of one AI's legal status and what lawsuits are out there. Like, now do I have to have like AI generated news to tell me which AIs I shouldn't be using on any given day of the week?
So, I mean, the way we've approached the problem, we've gone against the grain, I think, in some ways, right? We've created essentially our own product category. And it was funny, Gartner told me early on, they were like, hey, congratulations, you actually are your own product category. By the way, we have some really bad news for you: you actually are your own product category. And I didn't get the sense of humor of that at the time. Now I do, right? Because we're not Chevy competing against Ford, where we're just trying to say that we make a better truck. We do something entirely different. For lack of a better comparison, we're Tesla competing against Ford, trying to say, look, yeah, we have four wheels and four doors, but that's where the similarities end.
And so the genesis of Hypori was actually in the military. On the military side of the house, we were doing hard stuff in hard places, where it was not just difficult from a nation-state and data standpoint, but also dangerous. And the idea was, can we enable you to use a device from, say, a host-nation market, knowing that device has already been compromised, and knowing that the network has already been compromised, and do so in a way which doesn't jeopardize the enterprise or the enterprise's data? And what we found is that we had to build a platform, this was before Zero Trust became a thing, that assumes your device is already compromised. Which, by the way, if you have social media on it, it is. It's been compromised, at least at the core of the word “compromise,” anyways.
So it assumes that your device has already been compromised. What my application will do is it won't interact with any of the software or middleware layers of your edge device, meaning that it doesn't trust anything that's on your phone; it assumes it's compromised. And the side effect of that is you get to keep your privacy. What you do on your phone, on your time, with your applications is your business, and it will never be exposed to the enterprise. And inversely, I'm never going to expose the enterprise to what you're doing on your phone.
And the way we do that is, ironically, not a new idea, although it's a lot of new technology. You remember the old days of mainframes and dumb terminals? Yep. That's what we've essentially been able to do: we've turned any edge device into a dumb terminal that can receive pixels from, we'll say, the mainframe, but in this case a cloud-based operating system. I can view it, I can interact with it in real time, but I'm never actually in possession of anything. The pixels are writing over themselves constantly, which means the data never actually left the enterprise, which means my device actually stores nothing. So that's how we're providing that level of separation between personal and corporate that you're talking about.
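The dumb-terminal model described here can be sketched in a few lines. This is not Hypori's actual protocol, just a toy illustration of the separation: the server owns the data and rendering, while the client keeps only the latest frame, overwritten on every update, and sends raw touch telemetry back.

```python
class VirtualDeviceServer:
    """Stand-in for the cloud-hosted virtual device: data and compute
    live here and never leave."""
    def __init__(self):
        self._documents = {"q3_forecast.xlsx": "sensitive contents"}

    def handle_input(self, event):
        # Interpret touch/type/swipe telemetry server-side, re-render.
        return self.render()

    def render(self):
        # In a real system this would be an encoded video frame;
        # a string stands in for the pixel buffer. Note it exposes
        # what is on screen, not the underlying file contents.
        return f"<pixels showing {list(self._documents)}>"


class ThinClient:
    """The edge device: presents pixels, collects telemetry, and
    stores nothing except the current frame."""
    def __init__(self, server):
        self.server = server
        self.framebuffer = None  # continuously overwritten

    def tap(self, x, y):
        # Send raw telemetry; receive pixels. No data lands here.
        self.framebuffer = self.server.handle_input(("tap", x, y))


client = ThinClient(VirtualDeviceServer())
client.tap(10, 20)
# The client can display the document but holds no copy of its data.
```

The design point is that a compromised client can only ever leak what was on screen, never the files, credentials, or databases behind it.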
Now, if you really want to get further into it, then you start thinking about, okay, well, what about AI? And how is AI learning on the edge? Well, it's not just, how do I keep the AI I don't want from learning about my platform? It's also, how do I let the AI I do want learn from my stuff, but not from bad stuff? Like, I don't want data poisoning. I don't want it to learn bad data. Which then comes down to a very similar type of isolation challenge: hey, how do I isolate my GPT or my LLM so that it gets what I want it to get, but doesn't get exposed to stuff I don't want it exposed to? And so you can use almost that same level of isolation exercise.
And where we really go against the grain with a lot of this is we're taking compute away from the edge. And by doing so, I would argue that I'm actually empowering much more vast capability and compute speed and capability on the edge, because all I really need the edge to do anymore is present pixels and collect telemetry, i.e., touch, type, and swipe. And I allow all that processing and storage and networking to occur inside the cloud, which is, by the way, as best suited to do so as any platform is.
Yeah, and then if you need more compute power, you scale it up in the cloud.
Turn up the knob, right? Yeah, you turn the knob to 11.
Okay, everybody, you've got to mail back your phones, and we're going to ship you out a new phone.
You don't have to worry about it. It was interesting, because one of the parlor tricks I love to do when I demo Hypori is showing, okay, here's my virtual phone versus my physical phone. And look, here I am on 2 bars of 4G. So in my virtual phone, I go to a bandwidth test and I go, how much bandwidth do you think I can get? And people go, two bars of 4G, 20 megs, 30 megs, 50 megs. It doesn't matter what they say, because they never get it right unless they're true technologists. And then they pick it up right away and they go, it's not here. It's in the cloud. Yes. So when we do the speed test, you get like 5 gigs of throughput, because it's running at data center speeds.
So now if you think about the implications of that, that means I could give you a computer that could operate at data center speed. So to your point, I can dynamically allocate lots of processors, lots of RAM, lots of bandwidth, and you could access it from any edge device, to include maybe even your 72-inch TV on the wall, right? You could do it for cheaper than you could buy a new PC from Best Buy, and get a better computer.
Now, I guess the risk or the downside is if you don't have any internet connectivity, you've got nothing.
“If there's any idea that's starting to go away, it's the idea that you're ever going to be in a place that you don't have connectivity.” - Jared Shepard Share on XIf there's any idea that's starting to go away, it's the idea that you're ever going to be in a place that you don't have connectivity.
Yeah. As I've traveled internationally over the last 20 years, not having connectivity somewhere was definitely a reality. In the last couple of years, much less so.
It's still a reality here in the US. I mean, I was in the backwoods of West Virginia a couple of months ago and I was like, oh, zero bars. Okay. You know, it is what it is.
But with Starlink and any competitor service, it's just a matter of time. Look, with T-Mobile, you can do Starlink; I think T-Mobile has partnered with SpaceX. So you can do…
We're going to be right around the corner, I think, for cars starting to come out with standard Starlink minis built into them. And then they're going to act as Wi-Fi hotspots anywhere in the world. And then we're going to see townships that are going to incorporate them into stoplights, right? And you start to pay for it like a utility rather than as a service.
Yeah, I mean, there are a lot of municipalities that run citywide Wi-Fi as a service to their community.
Yep, which from a connectivity standpoint is brilliant. From a privacy standpoint, it starts to become challenging, because now who's managing that network?
I would not want to be the city employee that was responsible for maintaining a citywide Wi-Fi network of 50,000 compromised devices.
“My #1 rule to cybersecurity is everything is compromised.” - Jared Shepard

That's why we tell people, and I think this should be a takeaway for your viewers too, being a guy that's been doing cybersecurity for 20 years now, that's been doing it in the most difficult places in the world, including when people's lives rely on it: my #1 rule to cybersecurity is everything is compromised. And if you just operate with that assumption in play, then you can start to figure out, okay, is what I'm putting on this super sensitive? And if so, what protections am I using to prevent that from being compromised?
So in theory, I'm just trying to work this out, and you can tell me if I'm on the right track. If my device is compromised and can screen record, an attacker can at least see what I've been doing, potentially, but they can't interact with that information.
Except for the fact that we have the ability to detect screen scraping and screen recording and disable it. Now, that's an administrative setting. You may choose that you want to allow that to happen for other purposes, right? One of the methodologies that we actually have, for instance: iPhones won't allow you to develop in the operating system, right? Nor do we want to, because that's sovereign space for Apple. But what we can do is, when you go to do a screen capture, I can just feed it black pixels rather than anything else. And so that's a methodology in which we can protect ourselves.
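The black-pixel substitution is simple to picture. A minimal sketch, with the capture-detection signal stubbed out as a boolean; a real app would hook a platform notification for this (on iOS, for example, the system exposes a screen-capture state change that apps can observe):

```python
# Placeholder for an all-black frame; real frames would be pixel data.
BLACK_FRAME = b"\x00\x00\x00\x00"

def frame_for_display(real_frame: bytes, capture_active: bool) -> bytes:
    """Show real pixels normally; substitute black pixels whenever the
    platform reports a screenshot or recording is in progress."""
    return BLACK_FRAME if capture_active else real_frame
```

The legitimate viewer never notices anything; the recorder captures a black rectangle.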
Nice. I'm glad there's a workaround for that. I guess the only other workaround is someone snooping over your shoulder watching what you're doing.
I always get that, because we do all kinds of cool, sensitive stuff for the Department of Defense and others. And they go, what happens if somebody takes one phone and takes another phone and they take a picture? And I'm like, then you have an insider threat problem. That's a different problem that technology can only solve for so much.
Technology can solve for technological problems. Humans have to solve for some human problems.
You know, ironically, there actually is technology out there that can solve for that problem. I've seen it; we've even played with it. When I'm using my device, the camera detects whether my eyes are looking at the device or not. And the minute I take my eyes off of it, the screen goes black. And it can detect if there's another set of eyes looking at it, and it can make the screen go black then too. The issue is, users hate inconvenience.
No. And I suppose you could do things like polarized glasses and polarized filters on the displays.
There's some cool tech out there that will use the screen refresh rate. If your screen is refreshing at 60 hertz, then every second or third frame it will blur a letter in the words being displayed. So any time a camera captures a frame, a third of the letters are blurred out. But it'll give you a headache if you have to stare at it long enough.
I remember seeing a video of someone who came up with technology to render 3D images in a video such that if you took a still frame, you couldn't see the 3D image at all, because it requires your brain to process multiple frames to build the image.
Yeah. Look at the new Meta glasses and stuff like that. There are going to be mechanisms in there where the glasses can see your retina, so I can authenticate that you are who you're supposed to be. I can also say you're only allowed to see certain kinds of images if I can validate who you are. Where that tech is going to go is going to be obnoxious; it's just whether or not users are going to be willing to adapt to those environments. Because in the end, as a secure platform provider, the number one enemy for me in the world is not China. It's laziness, right? And that's always our number one enemy.
“As a cybersecurity secure platform provider, the number one enemy for me in the world is not China. It's just laziness.” - Jared Shepard Share on X
And I think that's true for so many things. I've had this conversation with so many people, and I think this is primarily a Western viewpoint, though maybe it's not. We have such a drive, and everybody in marketing talks about it: how do we reduce friction? How do we make transactions easier? How do I make it easier to buy something? How do I make it easier to return something? And friction is the friend of cyber; friction is the friend of security.
Oh yeah.
How do we find that point where we have enough friction that we get security, but not so much friction that we produce too much frustration?
My chief security officer, a guy named Matt Stern, used to run the Army Cyber Battalion and helped run US-CERT. I mean, he's done cyber since before cyber was a thing, back when it was considered information operations. He's been doing this for a long time. And one of the lines he coined, which I enjoy, is: cyber is one of the few fields where we blame the victim, not the perpetrator. The first thing that happens when somebody gets compromised is we go, well, did you have security?
Yeah.
Rather than being like, oh man, I'm sorry, you just had things stolen from you. You were truly the victim, but we victim-shame almost immediately in cybersecurity.
Yeah. And it carries over to scams and frauds as well. You should have known that this person wasn't who they claimed to be. Yep.
I mean, in some ways, banks have tried to address that. Think credit card fees, right? I've got a friend who literally says, hey, I don't care if I put my credit card on the internet. Doesn't matter to me. That's why I pay credit card fees: because they have to insure bad transactions. That's not my problem. That's their problem.
Yeah, but my 401k is my problem. As you well know.
You don't think it's your problem until they're charging you two hundred bucks a month in credit card fees, you know. Then all of a sudden it becomes everybody's problem.
Yeah. So you've talked about the solution on the corporate or business front. Is there a parallel consumer product for that kind of environment?
So we haven't released it yet, but in the next, I'd say, 12 to 18 months, yes. And ironically, the epiphany for that came from my wife. One day I was getting ready to go to work, and my wife stops me and says, hey, when can I have Hypori? I'm like, well, why would you want Hypori? And she said, well, it's a phone, right? Yeah. It's in the cloud? Yeah. She's like, so if I lose this thing, it doesn't matter, right? Yeah. Okay, cool. She's like, because the worst thing in the world is when I have to replace this thing because it gets old. That's the most stressful event in my life, because I'm worried about losing contacts, friends, pictures, you name it. And I was like, okay. And she's like, and by the way, why can't I take the big one with the cameras to the kids' events, and then take my little flippy one with me when I go out with my girlfriends to dinner? She's like, why can't I change out my phone the way I change out my purse?
Yeah.
And I was like, oh, crap. So we can actually commoditize the edge device, make it a fashion accessory. Imagine: what if I could give you something faster than you could ever put into this form factor, and I could do it for twenty-five bucks a month, hardware included?
Yeah, I mean, if you're just streaming pixels to the display, I guess you've got to work around the camera issues and things like that. But let's just talk about usage outside of the sensors. If you're just checking email and going on the web or using apps, you no longer need an $1,800 edge device. I can now stick with my…
Maybe that $1,800 edge device can now be built around the fact that it comes with $1,800 worth of cameras on it, and you get the best cameras on the planet on that one. Then when you're going to the kids' swimming event and you know you're going to get wet, you grab the cheap waterproof one. Having that level of flexibility, where you can make a decision based upon what you're about to do, would be empowering users in a way I don't think they've been empowered in the past.
So is the trade-off that you're now, in some sense, handing privacy off to another company to manage that virtual device?
“If you look at a product and you're not paying for it, you are the product, right? Your data is the product.” - Jared Shepard Share on X
I think that's a valid concern, although realistically, that's always been happening, right? Today, who is your privacy protected by? Microsoft, or Apple, or name the security vendor, the VPN vendor, the endpoint management vendor, whoever it happens to be. Somebody is going to tell you they're running your privacy. Now, I would say any time you use a product, if you look at a product and you're not paying for it, you are the product, right? Your data is the product. So if you're going to go buy something specialized, whose product to you is the protection of your privacy, then I think you have a higher likelihood of getting what you're actually buying, versus buying convenience, ease, and features, where privacy is just a secondary thing. I mean, how did the big search engines in the world make their money? Off of data. Yeah. Data is the currency of today, right? Data is worth more money than money.
“Data is the currency of today... Data is worth more money than money.” - Jared Shepard Share on X
And we see that when you go to a store and they say, hey, join our loyalty program, because by knowing what you do and associating that with your account, we're going to be able to convince you to buy more of our product.
Because we know more about where you go and what you do. And that's before the "download our app" push gets them even more information. I mean, that's the thing. I love the people who are out there complaining about privacy and individual rights, and they're doing it in a video on TikTok.
Yeah.
And I'm like, you're killing me, Smalls. Because like, look, I challenge you, if you're watching this now and you have TikTok on your phone, for one, I don't know what you're doing, but for two, go to TikTok's website and read their openly disclosed privacy declaration. They just openly tell you, hey, we're going to go through every other message on your phone. We're going to pull every piece of data off your phone we possibly can. They're open about it.
They're not even hiding it. Which leads you to question like, okay, if that's just how much they're telling me, what are they not telling me? Because there's something that every company is not telling you. Maybe it's not illegal or, you know, but…
And that's what we have to look at: how is privacy weaponized, right? It's not just weaponized in the form of theft, like we started talking about, where they steal your identity and produce something out of it. In the time of AI, look at our nation's politics today. How much of what the average person is being influenced by actually comes from legitimate sources of information, versus how much is being generated by people who are interested in destabilizing our country? And by the way, they would pay really good money to Vendor X, name the vendor, simply for the ability to prioritize what kind of media gets in front of you.
Yeah, because that will have some level of influence on you. Even if you think, hey, I'm pretty sure that that's fake. I'm pretty sure that that's not real. If it's in front of you enough times, your brain just takes it as truth.
I mean, in sales they say it takes something like seven to eleven touch points to make a sale. Okay, social media is the same thing. And if you look at the Chinese, the Russians, people who have bad intentions for America, for Western ideology as a whole, they don't care if you're left wing or right wing. That's not their thing. They just want to sow disruption. So if you're on the left, they're going to feed you crap that makes you upset about the right. If you're on the right, they're going to feed you stuff that makes you upset about the left. And all of it is going to make you question your government and question the stability of the system. That's the purpose. That's what they want. They want you to be unhappy. They want you to have second thoughts. They want you to question the system. And they're willing to pay lots and lots of money to do that. And we carry around the mechanism by which they're going to execute that plan.
It's not that they are going to execute that plan. It is they have been executing that plan.
“Within the next 24 months, we'll be at a point where there is no such thing as video content that you can believe anymore.” - Jared Shepard Share on X
For a long time now, right? And it's just getting better. Now, if you look at some of the new AI models, the video-producing AI systems that are out there, I would almost say, if we're not there today, and we're very close, then within the next 24 months we'll be at a point where there is no such thing as video content that you can believe anymore. It's just going to be that accurate. I have friends who are cybersecurity professionals who have sent me videos and said, oh my God, can you look at this? Can you believe this? And I'm like, guys, that's fake. And we end up in a 20-minute debate over whether it's fake or not, and then come to find out it was AI-generated. So these are professionals being duped, let alone people who aren't.
It's not as simple as, hey, this person has seven fingers or three ears.
I saw a guy who was making a joke and he was like, look, if you're going to be a criminal today, what you do is you put on a sixth finger on your hand when you're going to go do it. So then that way, when they show the video of you doing it, you can be like, oh, that was AI generated. That wasn't me. I don't have six fingers.
That's going to happen in a court case sometime. It's just bound to happen.
But you know what, though? I think before court cases catch up with that, AI is going to outpace it and won't make those mistakes anymore. AI is going to get to a point where it simply just won't make those mistakes anymore.
But the argument will be, it's bad AI. It's cheap AI. It's old AI. They just didn't have the latest one.
Look, to the purpose of what we're talking about: I've said this on multiple panels. AI is our generation's version of splitting the atom. And much like Oppenheimer's paradigm, where Oppenheimer had to decide whether he was going to push the button or not, when he was being told there was like a 10% chance that when he pushed the button, the whole atmosphere would light on fire and destroy the planet. But he kind of knew: look, if I don't do it, bad guys are going to do it. So one way or the other, somebody has to push the button, and it'd be better if I'm at least in control of it, right? That was the decision he had to make. And it's the same decision our country is going to make, and other countries are going to make; the Chinese have already made it. Everybody's going to do that.
But here's the difference. The reason splitting the atom was so important was that we could use it for good and maybe power the whole world, or use it for bad and maybe destroy the whole world. Now fast forward: AI is kind of the same thing, except that unlike splitting the atom, which was only available to maybe six nations in the world, AI is available to every 16-year-old kid on the planet.
And soon it will be every six-year-old kid.
Well, I mean, that's a whole different regulatory problem, right? But yeah, that's exactly it. I can have some level of hope that six nations can know what's in the best interest of the world. But 600 million 16-year-old boys, or 600 million 16-year-old girls? No chance.
I think we've taken a big detour, which is always fun, and I love going on detours. So let's bring it back: what you guys do is a good option for businesses, and there's maybe an option coming in the future for consumers. What should consumers do in the meantime to de-risk or mitigate some of these things? Or companies that say, hey, we're looking at going this way, but we're not ready yet; how do they decouple some of these issues?
Yeah, so I'll start with the corporate side. Before Hypori, companies really only had the option of buying an employee a second cell phone or putting MDM, mobile device management, on somebody's personal cell phone, right? Which is just an egregious problem from a policy perspective, from a regulatory perspective, from a … it means a company is taking ownership of your phone, essentially. No employee wants that, and no company really wants to deal with the implications of it. Or you could do MAM, mobile application management, which gives you a little bit of security and a little bit of privacy, but it really doesn't give you either. Hypori now offers true separation from the edge device.
But if you're the person who owns that edge device and you're concerned about protecting your information, obviously until Hypori comes out with a personal offering, which I would advocate for, I come back to my golden rule: everything's compromised. So every time you do something, you have to take that into consideration.
Just because something's compromised doesn't mean you don't do it. It means you have to consider how you do it, what you do it for, and what you do afterwards. So pay special attention to things like this: if you're going to put a credit card online and buy something, yes, you have credit card protections that will cover you if somebody defrauds you, but you have to actually be looking at your credit card statement to catch suspicious behavior. Be diligent about your passwords and how you protect them. And by the way, don't use your dog's or your kid's name. Don't use the birth date of your dog or your kid, because that's just silly. You might as well post it on the internet and make it easier.
Which you probably have if you’re using one of those.
And by the way, if you're doing that, you're fooling yourself into believing you have some level of security. There are a lot of systems out there: password keepers, VPN mechanisms. And don't ever go without endpoint defense. It doesn't have to be your company's endpoint defense; it could be your personal endpoint defense. You could go buy Bitdefender, Microsoft Defender, or whatever else it happens to be; there are a lot of them out there. Something is better than nothing, and no one thing is as good as everything together, right? You have to have a layered defense if you want to protect yourself.
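(A quick aside on Jared's password advice: in practice a password manager's built-in generator does this for you, but as an illustrative sketch, and the function names here are ours, not any product's, Python's standard `secrets` module is the right primitive for generating passwords an attacker can't guess from your dog's birthday.)

```python
import secrets
import string

def random_password(length: int = 16) -> str:
    """Build a password from letters, digits, and punctuation
    using a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def random_passphrase(words: list[str], count: int = 5, sep: str = "-") -> str:
    """Build a diceware-style passphrase by picking random words
    from a word list; easier to remember than random characters."""
    return sep.join(secrets.choice(words) for _ in range(count))

if __name__ == "__main__":
    print(random_password(20))
    print(random_passphrase(["orbit", "gravel", "maple", "sonar", "lantern", "quartz"]))
```

The point of `secrets` over `random` is that it draws from the operating system's secure entropy source, so the output is suitable for credentials.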
And even in the end, with all that protection, you could still be a victim. So, like most cybersecurity practices now, we're not telling you how to avoid ever being compromised; we're telling you what to do once you have been. Learn what to do: okay, I just got an indication that I've been compromised, cool. Know where to go to change all your passwords. Have a plan to change them all to strong passwords. Don't write it down and stick it on the back of your keyboard, but writing it down and keeping it somewhere referenceable is actually helpful.
For instance, I had a family healthcare problem where someone had a brain injury. Suddenly we couldn't get into their phone, their emails, anything. And it was fascinating; you don't realize how tied you are to this stuff. We had to file a claim with their insurance, and they were like, okay, cool, open an insurance claim. Great, we just sent a four-digit code to your phone to verify it's you. Oh, crap, we don't have access to the phone. So think about that. Have a plan in place, document it in some way, put it somewhere safe, stored and encrypted, but where somebody knows how to get access to it. That way, if something like that happens, you're not left high and dry.
Yeah, that's a really good point, because if my wife were not able to access my email: bank accounts, retirement, all the things we do are often tied to our email address. So if we lose access to that, everything else gets stuck until we can fix it.
Yeah. The beauty of MFA, multi-factor authentication, is that it requires multiple factors to authenticate. One of those is often a code sent to your phone or a text message, and you validate that you are who you say you are, right? Which is cool, until somebody else needs to have access. Then it becomes problematic. So think about those things. Be like, hey, this is the kids' healthcare information, or this is the bank account information. And yeah, I take care of all the bills, but if I got smacked by a bus tomorrow, my wife needs to be able to pay the bills at the end of the month.
I think that's interesting because that's advice that is, I don't want to say contrary to privacy, contrary to security. But it's the reality of what do you do if you lose access or if you get compromised? What do you do then? Everyone just says, well, just don't get compromised, just don't lose access, and then we'll all be good.
Okay, but we agree on the premise that everybody eventually gets compromised; nobody's gotten out of this thing called life undefeated. So we all have to assume it's going to happen, and plan for it. And I would argue that the idea of privacy is not the isolation of all of your information to just you. The idea of privacy is that you get to choose who gets access to what.
“The idea of privacy is not the isolation of all of your information to just you. The idea of privacy is you get to choose who gets access to what.” - Jared Shepard Share on X
Yeah.
That's privacy, right? That should be your choice. And when.
And hopefully when. So as we wrap up here, if people want to connect with you or Hypori, how can they find you?
So the easy one is just go to hypori.com, H-Y-P-O-R-I.com. You can click contact us. I'm on there. My team's on there. We're very responsive. You can find me on all the normal social medias. LinkedIn, I'm on all the time. If you really want to hunt me down and find me on regular social media, you can. I don't hide.
But if this has intrigued you, the idea of isolating the edge device so it essentially no longer holds information, so that you can protect your enterprise in a way that doesn't impact your employees' privacy, reach out to us. We'd love to show it to you. It's a really cool system and platform, and I think we're ahead of the curve in that the whole world is going to move toward this kind of capability.
Or back around to it.
Yeah, that is, back to dumb terminals again.
Jared, thank you so much for coming on the podcast today.
Chris, I really appreciate you having me.