
The Human Side of Cyber Security with Jessica Barker

“Being vigilant is helpful, but the right phish at the right time can catch any of us.” - Dr. Jessica Barker

The cybersecurity circle of information will always have some human involvement, and raising awareness of how systems are designed, developed, used, and abused is critical. Today's guest is Dr. Jessica Barker. Jessica is an award-winning leader in the human side of cybersecurity and has delivered face-to-face awareness sessions to over 50,000 people. She is a best-selling author, international keynote speaker, and media commentator, and serves on numerous boards, including the UK government's Cyber Security Advisory Board.

“If it's unexpected, if it makes you feel something, and if it asks you to do something, that's a very toxic combination.” - Dr. Jessica Barker

Show Notes:

“There are lots of elements where psychology, sociology, behavioral economics, marketing, neuroscience, all of these amazing disciplines, with all that they can teach us about people, that we can take and draw on in cybersecurity.” - Dr. Jessica Barker

Thanks for joining us on Easy Prey. Be sure to subscribe to our podcast on iTunes and leave a nice review. 

Links and Resources:

Transcript:

Jessica, thank you so much for coming on the Easy Prey Podcast today.

It's my pleasure. Thanks, Chris.

Can you give myself and the audience a little bit of background about who you are and what you do?

Sure. I work on the human side of cybersecurity. For the last 12 years, I've really made it my mission to shine a light on cybersecurity awareness, behavior, and culture. I'm interested in understanding what makes people tick when it comes to cybersecurity, what we can do to better engage people, how we can empower people and move them towards practicing more secure behaviors.

What got you into the field?

Luck, essentially. I was finishing my PhD, which was about the Internet but not about security. I came to my PhD from a background in sociology and politics, and working in urban regeneration. Then for my PhD, I was looking at the growth of the Internet economy, and really what the Internet had meant and meant to society, to culture, and to communities. I didn't know what I wanted to do next, but I knew I didn't want to keep doing what I was doing at the time.

I essentially was headhunted for a small cybersecurity firm specializing in defense, who felt they had all of the technical capability that they wanted. They really were looking for someone who could speak to people, who could interview people, who understood qualitative data, and who could really make inroads on the human side. Luckily for me, after googling “what is cybersecurity” when I first got approached, it ended up being me and I really haven't looked back.

That's awesome. One of the things that I ask, particularly of my cybersecurity guests, is: have you been a victim of a cybersecurity incident, or almost been a victim of a scam?

I think probably all of us have at least almost been a victim. Have I actually been a victim? Not that I know of. I think it's important to add that caveat, because sometimes you don't know for a while. As far as I'm aware, I have never been a victim. There's certainly been times where I have received an email, a message, or where someone's approached me in person. I've been flustered. I've been busy. I've come close.

Then there's also been times where I've received an email or a message, and I've thought, “This looks a bit fishy.” Then when I checked it out, it turned out to be legitimate, which is an interesting feeling. 

I was recently invited to the Royal Garden Party at Buckingham Palace, celebrating King Charles' coronation. That was one of those times. I received the email and I was like, “Yeah, right. This is a phish.” It wasn't. Being vigilant is helpful, but I always say the right phish at the right time can catch any of us. I very much include myself in that.

Being vigilant is helpful, but I always say the right phish at the right time can catch any of us. I very much include myself in that. -Jessica Barker

It's interesting, because an invitation like that is the sort of thing that is very flattering. It's like, “Oh, that's something I would want to do,” but I think I would have the same reaction of, “No, this sounds too good to be true. This must be fake.” How long did it take you to figure out that it was actually a real invite and not a phishing attempt?

Well, I sent it to my husband, who is an ethical hacker. He runs Cygenta with me. It's very handy having him as tech support. He checked it out and came back and he was like, “No, this is real. We're going to Buckingham Palace.”

I think that's actually one of the key things to me: you farmed it out to someone else. It was like, let's take myself out of the equation. That almost always seems to be a really good stopping point for scams.

It's true. I mean, I'm lucky; obviously, he's been ethical hacking for nearly three decades, and it's great having that at home. But at the same time, and all the research supports this as well, if you receive something that looks a bit fishy, then even just reading it aloud to someone else helps: (a) you get their opinion, but (b), I think even more importantly, it allows you that chance to take stock, take a step back, and slow down the way you're processing that information. You are more likely to identify that it's a scam.

I know that with reading, if I read something silently, I interpret it one way; if I read it out loud, it can come across differently. Sometimes I catch things. I always tell people who are writing for me: read it out loud and see if it makes sense. Something about the process of reading it out loud changes the way you interact with it. Is that one of the techniques for dealing with cybersecurity?

I think it is, because you're right: you process that information differently. You're able to spot something that, for some reason, you're less likely to identify if you're just looking at it internally. There's also coming back to something later, maybe looking at it the next day. These are all techniques we can use to slow down how we're interacting with that information. And if it's got that time pressure in it, like you've got to deal with it right away, that in itself is a red flag.

I always say my recipe for spotting a scam is if it's unexpected, if it makes you feel something, and if it asks you to do something, that's a very toxic combination.

I always say my recipe for spotting a scam is if it's unexpected, if it makes you feel something, and if it asks you to do something, that's a very toxic combination. -Jessica Barker
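For readers who want to see the recipe made concrete, it is simple enough to sketch as a checklist in code. Here is a minimal, illustrative Python version; the trigger phrases are hypothetical stand-ins rather than a real detection list, and no filter like this replaces the slow-down habit Jessica describes:

```python
# A minimal sketch of the "unexpected, makes you feel something, asks you
# to do something" recipe. Trigger phrases are hypothetical examples only.

EMOTION_TRIGGERS = {"urgent", "suspended", "congratulations", "winner", "final notice"}
ACTION_TRIGGERS = {"click", "verify", "log in", "pay", "transfer", "gift card"}

def scam_red_flags(message: str, expected: bool) -> list[str]:
    """Return which of the three red flags a message raises."""
    text = message.lower()
    flags = []
    if not expected:  # 1. it's unexpected
        flags.append("unexpected")
    if any(t in text for t in EMOTION_TRIGGERS):  # 2. it makes you feel something
        flags.append("makes you feel something")
    if any(t in text for t in ACTION_TRIGGERS):  # 3. it asks you to do something
        flags.append("asks you to do something")
    return flags

if __name__ == "__main__":
    msg = "URGENT: your account is suspended. Click here to verify your details."
    flags = scam_red_flags(msg, expected=False)
    print(f"{len(flags)}/3 red flags: {', '.join(flags)}")
```

All three flags together are the toxic combination; any one on its own is simply a prompt to slow down.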

Isn't that also marketing in general?

Certainly there's a fine line, I think, between persuasion and manipulation. I know they're very different, but at the same time it can seem like the same techniques are being used, particularly when we get into more aggressive sales.

I actually had that experience in Las Vegas on my first visit. I live here now, but when I first visited, walking through one of the hotels, somebody started chatting to me. They were very complimentary about my accent. They were very friendly. Then before I knew it, they had seamlessly moved into trying to sell me a timeshare. I walked away. I didn't sign up. It took a lot, though, to get out of that conversation. I walked away thinking, “Wow, I've just had a master class in social engineering.”

I think most of Vegas is in itself a class of social engineering.

That's absolutely true. Yeah, we're taking notes.

We're all very excited to go there and give people our money.

Exactly.

The human side of cybersecurity, how do you define that? I think we all get the grasp of the technical aspects of cybersecurity: having data segregated, limiting access, and two-factor authentication. How does the human side come into play?

The way I think about it is that if we look at the lifecycle of information or technology, people are involved at every stage. Whether it's the initial concept of a new piece of technology, whether it's being designed, developed, used, or abused, or the impact of that technology, people are absolutely vital at every stage.

We are not securing information for the sake of that information or the sake of the technology itself. There is no such thing as malicious machines, despite what we may be seeing in the media about AI. It's still people at the end of the day. 

There is no such thing as malicious machines, despite what we may be seeing in the media about AI. It's still people at the end of the day. -Jessica Barker

We can think of the human side in terms of the work I do day to day, which is awareness-raising, helping organizations understand the behaviors of people that work for them, helping them understand their cybersecurity culture and develop it in that way.

But there are lots of other human elements: attacker motivations and methods, the psychology of accidental issues, malicious insiders and what motivates them. Then, of course, in terms of the impact, there are lots of elements where psychology, sociology, behavioral economics, marketing, neuroscience, all of these amazing disciplines, have so much that they can teach us about people that we can take and draw on in cybersecurity.

If we're talking generational differences or neuroscience: you and I grew up in an age where we were not as connected to our devices as people are now. Are there generations or age groups that tend to do better on the cybersecurity side of things, whose thought processes about cybersecurity somehow differ?

Yeah, it's a good question. Obviously, there are lots of factors that come into play. But if we're speaking generally, a lot of research shows there are different approaches between different generations.

What you will often hear people say is an assumption that the younger generation, 18-to-24-year-olds, maybe even younger than that, don't have any respect for cybersecurity or privacy. They're very open. They don't practice secure behaviors.

Actually, all the research shows that's not generally true. Younger generations usually have more of an understanding and engagement with cybersecurity. They're more inclined to practice secure behaviors. They're more likely to make conscious decisions. 

Yes, they might share their Netflix password. Netflix is very aware of that at the moment, and they've been really clamping down. They might share certain passwords, but they’ll be more aware of their choice to do that and their risk appetite.

Are they generally more inclined to use password managers because they're more comfortable with the technology? Whereas maybe someone else, not that they don't want to be cybersecurity-conscious, but, “Oh, my gosh. There’s just so much technology that I'm not familiar with. I'm just happy that I can remember one password, let alone 40 of them.”

Exactly. They've been taken along every step of the way, whereas those of us in older generations, who didn't grow up with the Internet in the same way, have potentially been left behind a little bit, and the generations older still even more so. If we think of our senior citizens, they're more likely to find a bit of friction when using, for example, a password manager.

Of course, I'm generalizing; lots of people are not like that and don't struggle with it. But this is where I can get frustrated, when people who are maybe very technical look down on things like a password notebook. They will assume that their threat model is the same as the threat model of somebody who's, say, in their 70s, who is using their password notebook at home and being very careful with it. Actually, it enables them to have more secure passwords, rather than trying to demand that they use the same tools that might be appropriate for someone in the workplace.

Well, a physical password notebook is not great if you're being intentionally targeted and someone's walking into your house, but someone online is not going to be able to get your password notebook.

Exactly. I always say the likelihood of someone breaking into your house and stealing your passwords, and you not knowing about it, is far lower than the likelihood of someone being able to break into your accounts, if you're using the same weak password over and over.
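One way to make that likelihood concrete: passwords that have already appeared in breaches are among the first things criminals try in credential-stuffing attacks. The free Pwned Passwords range API lets you check a password against known breach corpora without ever sending the password itself; only the first five characters of its SHA-1 hash leave your machine. A minimal sketch, assuming network access (no API key is needed for this endpoint):

```python
# Check how many times a password appears in known breaches via the
# Pwned Passwords k-anonymity API. Only the first 5 hex characters of
# the SHA-1 hash are sent; the password never leaves this machine.
import hashlib
import urllib.request

def times_pwned(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-check-sketch"},
    )
    with urllib.request.urlopen(req) as resp:
        # Each response line is "HASH_SUFFIX:COUNT" for hashes sharing the prefix.
        for line in resp.read().decode("utf-8").splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

if __name__ == "__main__":
    n = times_pwned("password123")
    print(f"Seen {n:,} times in breaches" if n else "Not found in known breaches")
```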

Are most of the people that you work with corporate clients or end users?

Mostly corporate clients. It will often be a CISO or security leader who brings us in, and then through that relationship we will often get face-to-face with the people in that organization. It might be delivering awareness sessions, creating content, or running focus groups as part of a culture assessment. Lots of conversations about cybersecurity, which I really appreciate. It's a real privilege for me.

What kinds of things can you do to shift the culture at a company where maybe people are a little bit frivolous with their password management or their cybersecurity stance? How do you help a company make that shift for the employees? 

It's one thing if the boss comes out and says, “OK, everybody change your password right now or you're fired.” That doesn't really change people's behavior long-term. That's only going to have an impact in the moment.

There's so much I could say about this, Chris. I'll try to hold myself back, but be prepared to shut me down. I will just say a few key things. Firstly, it's about understanding the culture that the organization has. This is both in terms of the organizational culture, but also the existing security culture. I will often hear people say, and I've said it myself, “We need to build a security culture.”

You already have a security culture; the question is what kind of culture it is, whether positive, negative, or ambivalent. It's about understanding the wider values in the organization and how security can tap into those. How aligned is the current security culture already? Where do we want to get it to?

It's then really about values. Values are fundamental in a security culture. Do people understand the importance of cybersecurity and how it relates to them? That can be really beneficial, tapping into that both in terms of their day-to-day work, but also their personal life, enabling them to be more secure at home, or have better security for their family.

Then it's also thinking about the perceptions. Do people see themselves as being capable of engaging with cybersecurity? I've seen this time and again, research also backs this up, that self-efficacy, feeling that I can, is the most important factor in actually shifting behaviors. It's not scaring people. It's not droning on about the threat. It's actually empowering them. Like I said, so much more I could say about this, but I'll leave that there for now.

I came from an IT background and have always watched the conflict between the marketing department and the compliance department: “Oh, you can't say that.” Compliance is the Department of No. The same analogy gets drawn when a company actually does have a cybersecurity or security team: “Well, they're the Department of No. They're just there to get in our way, make our life more difficult, and make our job more challenging.”

As employees, we're always trying to find our ways around the security department as opposed to working with them. How do you help a company change that adversarial relationship?

Yeah, it's a really good point. It's something we find particularly when we run the first culture assessment with an organization. Not always, but often there will be some perceptions around security being a blocker, the security team saying no, and often saying no without explaining why, which can be really frustrating for people.

It's partly about finding out where those blockers are and where the workarounds are. When we hold focus groups, a key thing for me and for the team is making sure that we open up a rapport and a dialogue where people don't feel that they're being judged, and where people feel it is a trustworthy space, so that they will tell us some of these issues that they have. It's very common for us to end a focus group, and people say, “Oh, that was a bit like therapy.”

People have these frustrations, and finding out about them is a gift, because then we can start to open up that dialogue between the security team and everybody else in the organization, or wherever there might be these pockets of resistance. Often, it is about stronger communications, having more of a consultative approach.

If you're bringing in controls, or you're changing access, or access is locked down, have that conversation with people: either look at whether your controls are appropriate and could maybe be tweaked, or if not, explain why. Why is that control there? And help people understand it's not necessarily because you don't trust them; it's for wider security reasons.

I'm a big believer in shining a light on the work of the security team. We hear this very regularly; people will say things like, and we had it in a recent assessment, “I know the security team does good work, but I haven't got a clue what it is.” So being able to actually highlight that work to people matters. People have an appetite to know, so why not share it with them?

It comes back to the compliance department: if the compliance department's job is to prevent lawsuits, and no lawsuits happen, there's that little challenge of, “Hey, we did a good job. We know we did a good job because nothing happened.” Isn't some of the challenge of measuring the success or efficacy of the cybersecurity team that nothing happens as a result of them doing a good job?

That's it, isn't it? It is a big challenge to be able to almost prove a negative. But I think sometimes because of that, we can overlook what is interesting about our work to people or what does demonstrate success, and just being able to tell people about near misses, about things that could have gone wrong but didn't, being able to tell the stories. We have lots of amazing stories in cybersecurity.

In one way, the media is a challenge to us, I think, when we communicate cybersecurity messages. There's a lot of scaremongering out there. There's a lot of fear, uncertainty, and doubt. But at the same time, the media is a gift because it's rare that a day goes by without cybersecurity being in the news. It's a theme, even if it's just a subplot or an element of a lot of TV shows and movies. We have stories there to bring this to life and we should use them.

Do you find that people view the attackers more as, “Hey, it's this guy in a hoodie doing malicious stuff, and he's intentionally targeting our company,” versus an opportunistic person?

That's a big myth. I think that we really need to consider our language, because we often talk about cyber attacks being targeted. How that's usually interpreted is literally that there is, as you say, somebody, usually in a hoodie, purposely targeting us. As we know, that's not how most attacks happen.

That unfortunately leads to this perception where people think, “Well, it would never happen to me. Why would we be targeted?” Helping people understand that (a) you do have valuable information or assets, but (b) it's usually not like that. It's usually, as you say, more opportunistic, more of a wide net. That's one of the perceptions we're battling against. Combined with the optimism bias, it's really tough to unpick that.

By the optimism bias, you mean like, “No one would attack us because we just provide pool-cleaning services. No one would want to attack us.”

Yeah. It's this human tendency. This is where I think we can learn a lot from those social and human sciences I was talking about. Great research in neuroscience shows us that about 80% of people all around the world skew towards optimism in their personal lives. We think the world is getting worse, but we think bad things will never happen to us. We think we'll never get ill, we'll never get divorced, and we'll never get hacked.

Is part of the training helping companies deal with, “OK, you've been hacked. Now what?” Because it seems like one of those good things is to plan for it in advance. If something bad happens, how are we going to deal with it? As opposed to, “OK, we've been locked out of our systems. What do we do now? Who do we call?”

Our work is focused on prevention. Whether the work is pen testing, or the work we do with cultural assessments and awareness raising, we're usually helping organizations that want to prevent an attack. But as we know, usually that approach comes from experiencing an incident, unfortunately. So often, organizations will come to us when they say, “This thing happened,” or on a good day, “This thing nearly happened,” or, “This thing happened to a partner, client, supplier, or friendly competitor, and we want to make sure it doesn't happen here.”

This is maybe outside of your scope, and if it is, just say, “I don't know.” I'm fine with that. I always laugh whenever I get a notification of a data breach: “We value your security. It's the most important thing to us.” Well, no. If it were the most important thing, that probably wouldn't have happened. Is there a better way to notify people of a data breach or a similar incident, other than saying, “We clearly didn't value something enough, but we're going to pretend that we did”?

It's interesting, because that drives us mad in cybersecurity, doesn't it? We feel like it's a platitude and we feel like an organization rolls out that line, “We take your security seriously,” and we all roll our eyes. One thing I haven't done or seen—actually even now I'm going to go and have a look into it—is any research seeing how people outside of cybersecurity feel about those communications.

There must be some research out there. I just haven't seen it yet. I'd be fascinated to see that and to see if they feel it's as empty as we do. From my point of view, the best way to handle data breach communications is to have a plan first, as you've indicated, but also to really focus on the right level of transparency: being able to tell people what you know, when you know it, in the most appropriate way, and coming from the right level.

I know it's a long time ago, but we'll never forget the lessons we learned from TalkTalk, where the CEO was on the news every day. Actually, there was too much information. 

At the same time, there was a data breach at one of the supermarkets in the UK. It was actually the worse data breach, but it got no news time, and little reputational damage because of that, because the coverage was all dominated by TalkTalk's over-communication.

For sure, it's a fine line to tread. I'm not an expert in incident communications, but I think when it goes well, it can actually build more trust.

As a good example, one of my vendors last year, I got an email from them saying, “Hey, we just want to let you know that there was this particular platform that you've heard about in the news. We use this platform and we're concerned about their cybersecurity incidents. We don't believe that it has impacted us but here's what we're going to do. We're going to change every password that we use on every system. This will take us some time but we're going to follow up with you and let you know. It'll probably take us a couple of weeks. It's a very manual process, but we'll let you know when it's done.” 

It made me trust them so much more. It's like, you admitted that, yeah, there’s a security issue. You admitted that you don't know whether it impacted you, but you're going to take precautions, you detailed what the precautions were and a timeline. I was like, “Oh, OK.” There were, like, no open loops for me to wonder like, “Well, what are you not telling me?”

That's a great example. I think people appreciate that. It also hints at some of the complexity we're dealing with. It wasn't a data breach, as you've described, that impacted them directly, but it was a supplier. It was somebody in their supply chain. They were taking these precautions. I think it gives a glimpse into this isn't necessarily really straightforward. It's more complicated than we'd like it to be sometimes.

That's always one of my fears: supply chain stuff. I feel like I could do a reasonably good job in my sphere of influence, but I don't know what other people are doing.

Exactly. We're becoming more and more connected. That's only going to increase. I think that's the challenge that we're facing. Supply chain is an issue now; what's it going to be like in five years unless we get a handle on this?

Things are going to be way more integrated, and at multiple more layers. I had an incident where I did have a supply chain issue, and someone down the line got compromised. A library that was getting included in something I was doing was running a cryptocurrency miner. It's one of those things: “OK, I see that there's a problem. Now, how do I figure out where it's coming from? Turn this off. No, dig away. Turn that off, turn it back on, turn this off, turn that on.” Finally it's like, “OK, it's them. But it's probably not them. It's probably a supply chain issue for them as well.”

Yes, a very good point. It's those layers of complexity that are really challenging. But at the same time, we can recognize the challenges we're dealing with. We can recognize they are going to evolve, but I am still optimistic that we are making progress. 

Cybersecurity is such a young industry, and I think we often overlook that. We are often so focused, as of course we have to be, on the thing that's going wrong, or may go wrong, or has gone wrong in the past, that we can overlook the 100 times that everything worked, back to your earlier point, Chris. We can also fail to take stock of the progress and everything that we have achieved so far.

I think it's important for us as a profession, but also as individuals, to be able to recognize that things do go wrong. Not everything is in our control. It's important for us to be as prepared as we can, and to have the controls that we can to mitigate the risk where we can, but also to not always feel like the sky is going to fall in.
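Coming back to the compromised-library story Chris told a moment ago, one concrete control is hash-pinning: record the digest of each dependency at the point you vet it, and refuse anything whose bytes later differ. Package managers support this natively (for example, pip's --require-hashes mode and npm's lockfile integrity fields); the sketch below only illustrates the idea, with a hypothetical artifact name and a placeholder digest:

```python
# The idea behind hash-pinned dependencies: compare each artifact's
# SHA-256 digest against the value recorded when it was reviewed.
import hashlib

PINNED = {
    # Hypothetical artifact; the digest here is the SHA-256 of empty input,
    # standing in for the real value recorded at review time.
    "vendor-lib-1.2.3.tar.gz":
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def verify(name: str, payload: bytes) -> bool:
    """Accept an artifact only if its digest matches the pinned value."""
    expected = PINNED.get(name)
    return expected is not None and hashlib.sha256(payload).hexdigest() == expected

if __name__ == "__main__":
    ok = verify("vendor-lib-1.2.3.tar.gz", b"")           # matches the pinned digest
    bad = verify("vendor-lib-1.2.3.tar.gz", b"tampered")  # modified bytes are rejected
    print(f"clean accepted: {ok}; tampered accepted: {bad}")
```

This would not catch a miner that shipped in the very version you vetted, but it does stop a dependency from silently changing underneath you afterward.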


Do you think there are things we can do at the educational level, as part of regular schooling, that will help? Not necessarily, “Hey, this is your Social Engineering 101 defense class,” because by the time you write a curriculum, all the vectors will have changed. But teaching a critical thinking course at a younger age really helps people think about what's happening and how it's happening. That way, as the attack vectors change, it's not so much, “Well, here's how you stop someone from pickpocketing, stealing the wallet out of your pocket, by putting it somewhere else.” You're just more aware of what's going on around you and can adapt to those things.

That kind of critical thinking, I do think there's probably more that can be done at the education level, but I think we all need it. Maybe, actually, those of us who are out of school need it more. There is a lot around education where, even if it's not directly a critical thinking course or class, if we think about history, sociology, or English literature, there's a lot of critical thinking taught in those disciplines.

I think it probably can be drawn out more and applied to technology. But I think, for those of us who are maybe further away from our education days, having some of that could be really beneficial. 

This, I think, applies of course to social engineering and scams, but also to disinformation, because we're seeing those same tactics being used. They're as old as time and have always been used in political and social manipulation, but now it's accelerated with the Internet.

Helping people to understand that, and understand it, as you say, not at the tactical level but more at the strategic level. That's why I talk about “it's unexpected, it makes you feel something, it asks you to do something,” because the tactics might change, but that's the kind of strategy that generally underpins any kind of manipulation.

It's almost more about understanding how people are being manipulated, or how you're being manipulated, versus “if they use Venmo, then you run for your life,” or “if they say, ‘Pay me with a gift card,' run for your life.” Well, in six months, gift cards aren't going to exist anymore. I don't know. But it'll be, “Pay with this new app that's really hot and fancy.”

We know that the criminals evolve their tactics depending on how successful we are. -Jessica Barker

Exactly. We know that the criminals evolve their tactics depending on how successful we are. Just as we managed to shore up our defenses in one area, and absolutely, as you say, as new tech comes along, they'll follow those numbers and look at how to manipulate from that angle. Taking that strategic approach, and really helping people understand how they think, how they process information, and what can influence their actions and their reactions, that's a great way, I think, of building up resilience in people.

It's almost that the tech follows cybercrime: “We found a hole; let's patch it. We found out about a technique; let's block it.” It's not proactive. We have to teach humans the proactive side, and the tech can handle the reactive side.

I think that's a good way of looking at it, actually. I had not thought about it like that before. So often the human side is an afterthought. If we lead with the human side, if we could center our security more on people, then everything, I think, would follow from that.

Centering on people: does it help when the employees at an organization believe in the company's mission, as so many of us have in a job? The company has a mission: we're going to help people do this. If you tie security in, being good at the way we handle security helps us achieve our mission, as opposed to security being just a hindrance, a stumbling block.

Exactly, it comes back to the values point, and if you have strong organizational values. If you can also tap into people's individual values, then that will really help to tap into that intrinsic motivation for engaging in cybersecurity. 

Some of the most healthy cybersecurity cultures that we work with are organizations where they have a very customer-focused value and mission. Then, cybersecurity obviously can really prove how it supports those values and that mission, and people then will see the customer data that they're handling and think, “What if this was my data? And if I'm caring for our customers, then how do I handle that data with care?”

That makes perfect sense. As we start to wrap up: I know we talked before we started recording about your company having some resources concerning cybersecurity culture. Where can people find them, and what are those resources?

You can head to our website, cygenta.co.uk, and look at the resources. We have a security culture guide. This really has been in my head for five years, maybe, and I finally got it on paper. It's really the ultimate guide to cybersecurity culture.

A lot of people, I find, wonder, “What is cybersecurity culture? What do we actually mean when we use those words? How can we develop a more positive and proactive cybersecurity culture? How can we track it? What makes a good culture? What's going to work for your organization?”

We've tried to bring all of this together, and showcase some of the great academic research that is out there that will help you, in a practical sense, apply that to your day-to-day work on your security program.

I like that—trying to help people change the culture as opposed to the hardware.

Neither is easy. 

Yes. I know you've authored a number of books. What is your most recent book?

The first edition of Confident Cyber Security was published in 2020. I was just hoping maybe one person would read it and find it beneficial, and then I was delighted to get a great response. It went on to become an Amazon number one bestseller.

A lot of people tell me it helps them in their career. It helps them communicate cybersecurity. It helps them share with recruiters what they're looking for and what this field actually means. The second edition is coming out in September.

What are the major changes from the previous version to your 2023 version?

The book has a lot of case studies, and the vast majority of them have been updated, bringing it up to date with more of a look at AI, and more of a look at geopolitics and what we can learn about cyber war from, for example, the Russian invasion of Ukraine. I also talk more about deepfakes.

I also was keen in this edition to not just focus on cybersecurity, which of course was a lot of the first edition, but to bring in confidence. What do we mean by confidence? How can we deal with things like imposter syndrome, or imposter phenomenon? And how do we handle confidence, whether it's under-confidence or overconfidence, when we're trying to engage people more effectively in cybersecurity?

I can definitely relate to some of those. I know you have a YouTube channel as well where you talk about the human side of cybersecurity. Where can people find that YouTube channel?

You can find my channel at Dr. Jessica Barker. I share weekly videos centered on the human side, essentially for anyone who wants to communicate more effectively about cybersecurity, or who wants really accessible explainers, either for themselves or to use with the people they care about. Head to my channel.

Awesome. We will make sure to link to all of those in the show notes for everybody, because it's really hard to remember these things when you're driving, and I think most podcasts are listened to while people are driving. Jessica, thank you so much for coming on the podcast today.

Such a pleasure. Thank you.
