Many people are comfortable sharing data in an environment that they believe to be a safe space. But not everyone considers where that health data goes after it has been collected. Today’s guest is Katie Lips. Katie is an author and entrepreneur who helps consumers understand the value of their data and protect themselves online. Drawing on her extensive experience, she is currently creating a data-centric health app. “Always consider, ‘Is that something I really want to share?’” - Katie Lips
- [0:51] – Katie shares her broad and diverse career in technology.
- [2:06] – The internet has grown exponentially since its inception.
- [4:04] – Covid opened everyone’s eyes to how useful data is in the health space.
- [5:16] – For many, sharing health data during the pandemic made them nervous.
- [6:19] – Many of us have health tracking apps and devices that collect an enormous amount of data.
- [7:57] – Sharing data means that it could land in the hands of companies that you may not want to have so much information on you.
- [8:43] – Consider who will see your data as it can paint a picture of who you are and your life.
- [10:00] – Fitness trackers use GPS, which is a great tool for runners, but Chris shares an example of how it puts military personnel at risk.
- [12:29] – If you share health data online, like on social media, it is considered personal data and isn’t as strictly governed as official health records.
- [14:05] – Some companies may use social media to determine how healthy someone is.
- [15:36] – The truth is, we don’t know who can take that data and use what they want.
- [17:01] – Chris and Katie discuss an example of unintended consequences of using health tracker apps.
- [19:50] – What is a data play?
- [21:23] – Some data is best kept on a device rather than online.
- [22:52] – It’s important to consider what data would be accessed in a breach as a consumer and a business owner.
- [24:14] – The possibility of data being sold to health insurance companies is a scary concept.
- [25:47] – There is a difference between sick care and health care, and this space has changed in recent years.
- [27:37] – Can your health choices be held against you with your health provider?
- [29:44] – Chris describes an insurance company incentivizing healthier choices.
- [31:56] – Companies need to be more upfront about what shared data will be used for. Chris and Katie believe that will become more common.
- [34:30] – Katie describes her startup and health journey.
- [37:07] – Something can be data-driven without collecting a lot of unnecessary data.
Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review.
Links and Resources:
- Podcast Web Page
- Facebook Page
- Easy Prey on Instagram
- Easy Prey on Twitter
- Easy Prey on LinkedIn
- Easy Prey on YouTube
- Easy Prey on Pinterest
- Eatiful Website
- Katie Lips Website
- Katie Lips on Instagram
- Katie Lips on LinkedIn
Katie, thank you so much for coming on the Easy Prey Podcast today.
Hi, Chris. It's great to be here.
Glad to have you. Can you give myself and the audience a little bit of background about who you are and what you do?
Sure. My name is Katie Lips. I've had quite a broad and diverse career in digital technology across all sorts of things, from building digital products for people, to developing digital strategy, to developing digital policy, advising consumers on how to protect themselves from scams, and advising governments on how to protect consumers in the online space. I've worked in Tech for Good businesses. I set up my own startup as far back as 2003.
I've had this really broad career, which has meant that throughout that time, I've just been hugely interested in data, in my own personal data, how we use it online and how other people use it online, and also in helping consumers, or just regular people, I guess, to protect themselves in the online world.
As someone who's been around the block a bit in terms of her digital career, obviously, when I started work, the Internet was a much tinier space than it is today. It was much simpler. There weren't as many threats or dangers out there.
Today, the opportunities for people to do amazing stuff are enormous. We can put our whole lives online and also use the Internet for fabulous things like improving our health, for example. Whilst all of that's absolutely brilliant, we're all much more at risk, as you, of course, and your listeners all know.
I'm just interested in that intersection of, how can we build great products that really help people online whilst at the same time, protecting them? That's been an interest throughout my career, which is why, I guess, I'm here today to talk a little bit about what I'm up to.
Absolutely. To me, it's interesting because growing up, our data was never something that we really thought a whole lot about. It was often stored on paper in vast file cabinets and vaults. At some point, the sprinklers broke, the paper got wet, and our data, the information about us, was gone. But with the advent of computers, more and more of that has gotten digitized.
In a lot of cases, that's advantageous. If your doctor's office needs records, they can access them much faster. But then came the advent of the Internet, and now that data is potentially moving to places where we don't know where it's going or what it's being used for. Let's spend some time talking about health data privacy, what our exposure risks are, and what we can do better as consumers.
It is a fascinating space, isn't it? I think, actually, Covid opened everyone's eyes to just how useful data could be in the health space. I guess we all knew that data can generally help with everything. If you have good data, and you have the ability to analyze it well and use it properly, you can do all sorts of stuff with it. But in the health space, it really hits home for regular folks, if you like. The data is hugely important.
I'm sure in the States, as much as in the UK, we were all asked to share our data during the pandemic: about tests that we were having, whether we were testing positive, where we had been. Some of us were tracking our movements to help people study the spread of the disease. I think that was a really good example of how, at scale, data could actually help fix huge problems like that.
Obviously, a cure wasn't found, but vaccines were, and lots of lives were saved. I think that's a good example. Perhaps, for some of us, it was an obvious choice to share our data. For others of us, it was perhaps a really scary thing we didn't want to.
I was very much on the fence. I could see the benefits of it, but I could also see that if it wasn't treated really, really well, who was going to get access to this in the future? How long did they want it for, and for how long were they going to use it? Were they going to delete it? All these things that I just knew, in the rush, people weren't really considering. Because when you're building something in a rush, there are things that you don't have time to consider.
You're in this emergency situation where you're being asked for your data about stuff you know could help you and others. Do you trust it all? Maybe, maybe not. I think that that opened everybody's eyes to the fact that data is really important in the health space, which is a good thing.
But of course, you're right. People have been using digital health products for ages. Many of us use Fitbits, connected watches, or whatever to monitor our actual physiology. As I'm wearing my Apple Watch, it's probably listening to my heartbeat right now. Oh, my God, isn't that a terrifying prospect? Why? Is that necessary? But it's just stuff that you set up once and forget about.
Then all of a sudden, there's this enormous slew of data about things that you don't even think about anymore. Also, in the health space, there are things like calorie tracking, for the plenty of people who want to lose weight. I left my corporate life because I lost a lot of weight and wanted to share a weight loss product with people. I know loads of people who do calorie counting on their phone using apps, basically telling another company every single thing they eat.
Now, that can be great if you get the data back about the insights so that you can track. “Oh, I've eaten more calories than I'd like,” or, “I tend to eat lots of calories at three o’clock in the afternoon. Is there a way I could stop that?” Or, “I'm eating too many carbs, not enough carbs, or whatever it is.”
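The kind of insight Katie describes here, spotting when in the day calories cluster, only needs a simple local computation; no server ever has to see the raw food log. A minimal sketch of that idea in Python, using entirely hypothetical data and function names (this is not the API of any real calorie-tracking app):

```python
from collections import defaultdict
from datetime import datetime

def calories_by_hour(entries):
    """Sum calories per hour of day from (ISO timestamp, kcal) entries."""
    totals = defaultdict(int)
    for ts, kcal in entries:
        totals[datetime.fromisoformat(ts).hour] += kcal
    return dict(totals)

def peak_hour(entries):
    """Return the hour of day with the highest calorie total."""
    totals = calories_by_hour(entries)
    return max(totals, key=totals.get)

# Hypothetical local log: every entry stays on the user's device.
log = [
    ("2024-03-01T08:10:00", 350),  # breakfast
    ("2024-03-01T13:05:00", 600),  # lunch
    ("2024-03-01T15:00:00", 450),  # afternoon snack
    ("2024-03-02T15:20:00", 500),  # another 3 p.m. snack
]
print(peak_hour(log))  # → 15, i.e. "I tend to eat lots of calories at three o'clock"
```

The point of the sketch is that the "fair exchange" insight Katie mentions is cheap to compute where the data already lives.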
If you get the insights back from the data you're giving up, then I think it's possibly a fair exchange. But in a lot of services that I see, it's a bit one way. When it comes to health-related data, I think that that's possibly opening people up to problems, and the problems are many.
They start with things like if your data falls into the wrong hands, then you're scammed as a result of that. There's more risk of just companies getting more data about you than you possibly let them have for a purpose that is ill-defined. Then there's the risk that you don't really know what might happen with that data in the future, or who it might be sold on to, or even who it could be inadvertently used by.
The idea that maybe someone talks about—to use the weight loss example again—how much weight they've lost, what they're eating, or how much exercise they're doing, maybe they post lots of pictures on Instagram and things like that. Now, Instagram might not be doing anything bad with their data, but anyone can come along and look at that and get a picture of that person.
You have to always consider, certainly on social platforms, who else is in the room, I think. When it comes to health data, I think that as consumers and patients—let's say people—as people, we're not quite attuned really to the future issues that might bite us when it comes to the health data that we're sharing and whom we're sharing it with.
I know, as people, we're not good at foreseeing unintended consequences of things. I remember a great example: the US military at one point banned the use of, let's just call it Fitbit, or any type of fitness tracker, because they realized that it was uploading GPS coordinates of military service members.
If someone knew the routes that these guys were running around on the bases, it gave away where people could and couldn't go on military bases. They're like, "Oh, this might be great to track the workout, but now you're putting military servicemen at risk." I'm sure no fitness tracker company, in developing their product, thought, "Does this put military servicemen at risk?"
Right. It's insane that we don't think about that. We just think, "Oh, that's a nice thing. It will help me track my run." You're absolutely right. That's a brilliant example, in that it even caught out the military, who you'd think has a really strong cyber policy. But they hadn't gone as far as, "But you've all got these Fitbits on, because you like showing off about how far you're running."
In a more consumer world, if you're uploading your run data every day and, “Oh, I'm five miles away. It’s going to take me half an hour to get back or whatever,” and then people wanting to burgle your house might be able to use that data. I don't know. Maybe I'm sounding like a real cynic and someone who's afraid of everything. That's not how I want people to feel. But I do think that you're right, it's the unintended consequences. We possibly will need to be a little bit more savvy about some of the risks we're opening ourselves up to.
One of the things that I think about is—and I think I've gotten more aware of, and I think people think of it from a credit perspective, like a financial credit perspective—if you have your financial habits and the credit agencies collate that information and merge it with other details, and people offer you loan products based on your income and your financial habits, and things like that, people go, “Well, that's OK.” I start worrying when, is my health data being merged with my eating habits, my exercise habits, my financial data? Is that now going to affect my ability to get a life insurance product? Because we know you eat too much chocolate, therefore we're going to raise your insurance rates.
It's possibly not on such a level as the chocolate eating that you need to worry about, but it's there. It's absolutely there. Health data is much more strictly governed than personal data. If you tell the Internet that you're eating a lot of chocolate, or that you're going running, or even that you've been losing weight, it's not in a health context. It's personal data that you've actively shared online. You shared some of it publicly, let's say.
There are different ways that different types of data are governed. Health data is actually governed by much stricter rules and regulations than regular personal data is. That makes sense. It means that people who officially take care of your health data or your personal health record, et cetera, have to treat that very, very carefully indeed, and make sure that it's not going to fall into the wrong hands.
Actually, when you look at where we are as consumers with digital health, our understanding and experience of it goes far wider than just when you visit your doctor or go to hospital for something. It's much broader. It covers all sorts of different types of lifestyle data: your exercise, what you eat; it might include weight loss, personal care stuff, a whole bunch of things.
Actually, what's happening is organizations and companies are able to infer things about your health based on your personal lifestyle data. There's no need to regulate that specially as health data. It can all just abide by the standard rules that personal data needs to abide by. That's OK in the main, but it can actually be really problematic, because some companies might think, "OK, I want to know how healthy someone is. I'm just going to go and look at their social media and find out."
Actually, for a lot of people, it's all there. If you're someone who uploads your run data, or even just says, "I've been for a run," then they know that you've got some exercise going on, and you're probably trying to be quite healthy. If you're the person who, as you were saying, likes a lot of chocolate, then they're going to know that about you, which may be absolutely fine. But there are certain lifestyle behaviors that may impact a risk profile negatively for you from a life insurer or health insurer, for example.
Saying things like, "I like ziplining, bungee jumping, or skiing," sports that could be seen as extreme or risky, might impact a life insurance premium. Similarly, going down the pub a lot, or going out to bars and looking like someone who drinks alcohol more often than other people, would also negatively impact you.
But people seem completely comfortable with sharing this data in environments that they believe are safe spaces. It's just them and their friends or them in a community of like-minded individuals on the Internet. Because we've not got that eye on what sits behind, who else can see, and who else can be in that room and in that space, and ultimately, who else can either directly take that data through approved APIs into the service or who can just scrape the data, because it's actually just sitting there on a webpage.
I think we need to be a little bit more cautious and cognizant of that other side of it and just think, “Is that something we really want to share for how it is today? Are we aware that it could be taken in a different context by somebody else?”
I know here in the US, we've had legislative changes on abortion rights. Without getting into the pros and cons of that, by no means, that is definitely not the point of mentioning it. But there was suddenly this great concern for women in states where that's no longer a legal service, that if they have applications that are tracking their menstrual cycles, that data suddenly becomes potentially risky for them. And like, OK, where's this data being stored? Who has access to it?
Previously, a year ago wasn't a concern. All of a sudden, now, it depends on the circumstances. It could be a jail sentence or be used as evidence in a court case against them.
Yes, I think that's a really good example of the unintended consequences, both of the user in using a service that could be something you set up to track your cycle, and then ends up being so utterly terrifying in terms of how that could be potentially used if it falls into the wrong hands.
But it's not even that it's the wrong hands. We're not even talking about cyber criminals anymore. We're just talking about changes in legislation that could make something that you think is completely legal and acceptable one month, completely not legal and acceptable another month. We, as systems designers or product designers, are also not thinking in terms of that level of unintended consequences.
That's a real challenge for businesses setting up products, because you design your product for the best-case scenario. Everyone's going to come and use my product, and they're going to love it. It's going to be brilliant, and everyone's going to be happy. You don't design products thinking, "Oh, these awful things are going to happen, and then we're going to be putting people at risk accidentally, completely unintentionally, but we might be doing that."
I'm advocating for people definitely doing some unintended-consequence thinking. I think that's the important part of a data strategy. You have to worry about bad stuff that can happen with it as opposed to just the good stuff that you want to achieve.
Because I know that you're working on health products now, in that space, do you see maybe a push from developers, rather than storing the data about the customers in the cloud on their own servers, even if it's anonymized or whatever, to say, "OK, well, let's leave the data on the device and move the analysis"? Rather than the analysis being done in the cloud, move the analysis of the data to the device so that the developer never actually sees any of the data?
That's something that we're working on at the moment with my latest project. You can look at this several ways. In the old way of thinking, if you were building a data product, when people were building a startup in the old days, we'd say, "Oh, it's a data play," which basically meant your job was to go and grab as much customer data as you could.
You didn't necessarily know what you wanted it for or what you were going to do with it, but you would grab everything you could. You'd profile people. You'd ask them a bunch of questions. You'd track a bunch of things. You'd then figure out what in there was useful and valuable to you.
I think there's a shift now. I hope there's a shift. I think some people are definitely still in that bad, old school way of thinking, but I think there are plenty more businesses who are thinking, “Actually, no. We want to be privacy preserving.” The reason why is trust, because if your customers don't trust your service, you don't have a customer base either in your business. You have to be more trustworthy nowadays.
We hear scary stories all the time. Some accidental awful data breaches, but some just sloppy thinking in terms of how data is handled. Or just actual commercial deals that aren't any good for the end user. I definitely think there's a move towards wanting to store data on the device and never let it leave there.
If I want to tell my phone everything I'm eating in a day, track my calories, track my weight, and that's very personal to me, it's important if that's my mission to lose weight, let's say, but I certainly don't want that […] knowing. I don't want that to end up on social media or to fall into the wrong hands and allow that to mean that I'm the victim of a scam.
Definitely store it on the device, and definitely do the analysis on the device. Maybe there's some angle of aggregate data where you can say, "OK, people can hear the general stats of what people did. But no, I don't want to own all the data about what every single individual ate or did every day, because as a business owner, that gives me a huge level of responsibility I maybe don't need." If you think about it from a future perspective, we don't need to do that anymore, because it can be handled by the device.
For example, we're building a prototype for our app, a native iOS app, using HealthKit and ResearchKit provided by Apple. It pulls bits of data together and helps you manage and store that safely on device so that you don't need to own it and worry about it, which I think is great. There are great tools out there to help developers behave in that much more privacy-preserving way.
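The on-device pattern Katie describes (Katie's actual prototype uses Apple's HealthKit on iOS) can be sketched language-agnostically: raw readings never leave the device, full-detail insights are computed locally, and at most a coarse aggregate is ever shared. A hypothetical Python illustration of that design, not the real app's code, with an invented class name and an arbitrary sample threshold:

```python
import statistics

class OnDeviceTracker:
    """All raw readings stay in this object, standing in for 'on the device'."""

    def __init__(self):
        self._readings = []  # raw data: never serialized or uploaded

    def record(self, value):
        self._readings.append(value)

    def local_insight(self):
        """Full-detail analysis, computed and shown only on the device."""
        return {
            "count": len(self._readings),
            "mean": statistics.mean(self._readings),
            "max": max(self._readings),
        }

    def shareable_aggregate(self, min_samples=20):
        """The only value that may leave the device: one rounded average,
        and only once enough samples exist to blunt re-identification.
        The min_samples threshold here is an arbitrary illustrative choice."""
        if len(self._readings) < min_samples:
            return None  # too few samples to share safely
        return round(statistics.mean(self._readings), -1)

tracker = OnDeviceTracker()
for v in [72, 75, 71, 78, 74]:      # e.g. resting heart rate readings
    tracker.record(v)

print(tracker.local_insight()["mean"])  # → 74 (rich detail, stays local)
print(tracker.shareable_aggregate())    # → None (below the sharing threshold)
```

The design choice Chris praises next follows directly: if the server only ever receives `shareable_aggregate()` outputs, a breach of the company's database exposes almost nothing about any individual.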
I like the idea of a company being able to say, “There is no database that can be compromised on our side. We build an app. It does this. We might see some wide-ranging analytics or some aggregate data, but if someone does get into our database, there is nothing there for them to get.” To me, it's like, as a company, I'd be happy to have a minimized cybersecurity risk.
That's right. I think you're completely right. I hope that more and more businesses are shifting towards that mindset and that strategy. As they are, then we'll see much less risk for consumers. If they're not, it's often because what they're doing isn't quite right.
I've seen a few businesses in the health space at the moment where, at the consumer level, you won't see it. You'd only see it if you research the company a little bit and find out that there are plenty of businesses in this space taking investment from insurance tech firms or funds, because it's still seen as a data play. They're still thinking you just grab the consumer health data and there must be a use for it. You must be able to then sell that to health insurers, which I find quite scary, actually.
Yeah, and that's the challenge. To me, I think, when it comes to if I'm going and seeing the doctor, if I'm on the way to the hospital because something unknown has happened to me, I want doctors to be able to have access to all relevant health data as quickly as possible. But the flip side is, if they don't need it for life-saving emergencies, I don't want them sharing that data with each other.
My optometrist doesn't need to know what my cardiologist thinks about me or things like that, unless it's a life-saving emergency. To me, it's trying to balance those, what data should be segregated, what data should be allowed to be merged, and how does that benefit me as a consumer, not the pocketbook of the company that I'm working with.
I think that's absolutely right. There's this ethical debate, I suppose, or issue that needs really careful consideration. Digital transformation is enabling all healthcare providers to move from an era of, let’s say, sick care. They call it health care, but it's really sick care in the olden days, up until now.
Let's say you go to the doctor when you're sick and to get fixed. You have a problem and they fix it. Maybe real healthcare is where they keep you healthy as opposed to waiting for you to get sick. Now to keep you healthy, you've got to know how someone might get sick and then prevent it. For that, you probably do need a lot of data analyzed brilliantly by some very clever people. We probably, as patients, should opt into systems like that.
Whereas at the moment, I think we're still a bit too much in the era of sick care. Only when we get much more advanced data systems can we move effectively to that era of proper, preventative health care.
That's always going to be the challenge to people. If you know you're not living a healthy lifestyle, do you want your health care—not your sick care—provider knowing that you're not living in a healthy fashion? Are they going to raise your rates? Are they going to come knocking on your door, saying, “Hey, why aren't you exercising more?”
It's quite scary, isn't it? My story is that I used to live quite unhealthily. I used to drink a fair amount of alcohol, not a massive amount but a fair amount, more than I do now. I used to overeat food, and I put a lot of weight on. And then I decided to do something about all of that. I lost a lot of weight. I lost about 90 pounds.
That is my story. The reason behind why I'm starting my new business in the health space is to help other people do the same. As you say, Chris, if you go to a health provider and say, “I'm behaving unhealthily,” then could that be held against you? That's a real worry. It might stop people from looking for help.
Or you need to find an alternative provider who's not aligned with those systems that could make bad decisions about you, who can just help you to change your habits, let's say, or to start behaving a little bit more healthily, or doing some nicer things for yourself, let's say.
There is this really fine balance, because also, healthcare providers don't really want to put people off. They do want to prevent disease. They genuinely, I hope, want to help fix people when they get ill. That involves knowing the full extent of the problem.
You can't go to the doctors and say, “Oh, I've got a problem with my leg,” but then not tell them that you fell over and bumped it or whatever it is. You've got to be honest about what the root causes and what's the problem with it.
I think we're starting to see some of that now. I have a life insurance policy where my health behavior does not affect the rate on the policy, but my health behavior affects whether or not they're going to have to pay out, if that makes sense. If I'm healthier, then they're not going to have to pay out on an inopportune death.
One of the things that this particular provider does is, "We will give you an Apple Watch for free if you exercise a certain amount, and you self-report your blood pressure." And you earn points if you go see your doctor. If you earn enough points, they waive the cost of the Apple Watch. They're incentivizing. Rather than raising your rates because of bad behavior, they're going to give you a toy for good behavior.
I totally understand. I worked in life insurance for a while. If you can get your clients to be healthier, they're going to die less. As an insurance company, you're not going to pay out the policies, and you're going to make more money. It's in your interest to keep your clients healthy.
They found a way of doing it, which more aligns with my values in terms of, I'm in control of what data I share with them. If I decide I want to stop sharing, at any point, then I just pay the balance off of the watch. But it's an interesting idea of starting to see that we're going to incentivize good behavior as opposed to punish bad behavior.
That's a really interesting way of looking at it, actually. Yes, there are plenty of schemes out there that will incentivize good behavior, but I think people have to be slightly careful that they're not paying for these devices with their data. It sounds a bit like that.
There's a certain amount.
There's a scheme in the UK like that as well with, I think, a couple of health providers. It sounds nice and shiny, doesn't it? Get a device. But ultimately, the device is tracking you. Even if it's giving you an incentive to be healthy that benefits everybody—it benefits them and you—it's also gathering lots of data. They can use it in aggregate to refine their risk profiles about people in general.
I think you are somebody who understands this stuff. Plenty of people who have taken up this deal don't quite understand that data exchange and the implications of it. Now, I don't know quite what I would suggest. I'm not suggesting that everyone needs to go and become like a data expert and really understand this stuff, but I am advocating that companies like that—healthcare providers—are a bit more honest and upfront about that value exchange with their customers so that everybody understands really what they're getting themselves into.
I think transparency is going to be one of these future-looking things that we're going to see talked about a lot more when we install apps on phones. I think we're starting to see it now. The platforms are now saying, “This is what systems your app is requesting access to.” If I'm installing an app to track my weight, they don't need access to my contacts. They don't need access to my camera and the microphone, who knows what else—my network settings and my GPS data. They only need access through Apple's HealthKit. They just need access to my weight, and that's it.
I think we're starting to see people get more aware of reviewing, “What data are my apps really asking for, whether it's health data or just aggregate data?” Hopefully, we're going to start seeing that more when we're dealing with other business entities. “What are you doing with this data?” People start asking that question.
I've asked that in a few places, and the people at the front desk are like, “I don't know where this data goes. I don't know what it's used for. I don't know why we need it.” I’d respond, “Am I required to fill this out on the form?” They're like, “No.” You shouldn't be asking for it if you don't really need it, but I think people have to start getting more conscious about what data you're providing, what information you're voluntarily giving up.
That's the thing. If you're curious or cautious or worried at all, always ask. If you don't get an answer you like, don't do it. I think we should all be encouraged to ask that question more often, because if we do, then companies will listen. They'll know that it's awkward when their staff are like, "I don't know," and that that's not a good position to be in. They'll come up with better answers or they'll change their practices, but they'll only do that if more people ask.
If they truly don't need whatever they're asking and enough people say, “Why are you asking this?” they'll stop asking for it. What is this product that you're currently working on your startup?
I mentioned my weight loss journey. I lost 90 pounds. I wrote a book about it, actually, called Love Yourself & Lose Weight. That was really the catalyst for me. I wrote about self-love and how anybody could use it to take a bit of a break from their behavior and think about whether they wanted to make a personal change, a bit of weight loss or something else, and how they could do that.
A lot of people say, "Well, what did you eat all day long? The book doesn't tell me what you ate, if I want to diet. How many calories? How far did you run, blah-blah-blah?" Everyone wanted to know the specifics.
I've been working more recently on a product built around my weight-loss method, which was conscious eating: just slowing down, taking time. I've done research, a study recently, interviewed lots of people, and found that 78% of people mostly eat everything on their plate. That's people in the UK, Europe, and the US. You just get a plate of food and it all goes in.
How many times as kids were we told, “Clean your plate”?
Right. We're all told to do it. Don't leave anything; eat everything. It's wasteful otherwise.
You're told to eat too much because there are starving kids in some far-off land. You should be grateful for what you have, therefore finish your plate.
Yes, don't upset the starving kids by leaving food over here. That would be bad. We're all brought up with that, and that's what we do. I believe that that causes weight gain.
I'm building out this method and platform, with free content and an app, to help people not do that: to just take more time, slow down, eat really slowly. Think about what you're eating a lot more. Enjoy it a lot more. Don't worry about what it is. You don't necessarily need to go on a diet—don't start eating lettuce or cabbage every day if you don't want to. Keep eating your chocolate, but just do it more thoughtfully.
Some people call it conscious eating. Some people call it mindful eating. My business is going to be called Eatiful. I'm going to be launching this product for people who want to lose anything from a tiny bit of weight up to a lot of weight, with a much easier, more enjoyable, and more sustainable method.
Hopefully, you won't be collecting lots of data.
No, we won't, but it is going to be data-driven. That's an interesting point, actually, Chris, because when I was losing weight, I had to learn about what worked for me. I didn't just start a thing one day, and then a few months later, I'd lost all this weight. I had to trial and error, test and learn, iterate and figure it out, and figure out what works.
Our product helps you do that by learning about your own habits. It helps you track your habits and then get insight into your habits. But as to what we're talking about earlier, we're not going to be storing a massive amount of data. The data is stored on the device. The analytics happen on the device. You get to learn about what works for you without us knowing a whole bunch of stuff about you, which I think is a really nice way of looking at it.
I like that perspective. What will the website for that be?
The product is called Eatiful, and the website is weareeatiful.com.
Awesome. If people want to find more about you, where can they find you online?
I'm at katielips.com, where I talk about conscious eating in general and a bit about what I'm up to. I've got lots of free resources for people who want to learn more about that side of things. I also sometimes talk there about data and data strategy. I'm on LinkedIn, where I talk a lot more about that side of things as well.
A little bit more on that. Yeah, the data conversations are more appropriate for LinkedIn.
It is, yeah.
We will make sure to link all those resources so people can find them easily on the podcast's notes page.
Katie, thank you so much for coming on the Easy Prey Podcast today.
Chris, thanks very much for having me. It's been a pleasure.