Is Privacy Possible in a High-Tech Data-Driven World?

The idea of privacy has evolved. From the early days of the internet to today’s complex global data ecosystems, privacy concerns have changed, and the rise of AI and biometric data has increased those concerns exponentially. Is it even possible to have privacy anymore? Let’s take a look at the current state of digital privacy and where it might go in the future.

See Privacy is Dead with Pam Dixon for a complete transcript of the Easy Prey podcast episode.

Pam Dixon is the founder and executive director of World Privacy Forum, a nonprofit public interest research group. She has been working on privacy since the early 1990s, making her one of the earliest people in the field. As soon as she got on the internet, she realized it would have unique privacy implications. The idea of what the world would look like once everything was fully digital became a passion. In 1993, she co-authored Be Your Own Headhunter Online, one of the first books to talk about internet privacy.

While researching workplace privacy, Pam found significant privacy and security issues in popular online resume databases and wrote a 90-page report about them. Richard M. Smith, lead director at the Denver School of Law’s Privacy Foundation, was impressed by the report and hired Pam as a principal investigator. There, she learned the technical aspects of privacy. After 9/11, she decided the best thing to do would be to get into broad research, so she founded the World Privacy Forum to do research on privacy and data governance. They research the privacy implications of complex data ecosystems, data brokers, identity, healthcare systems, financial systems, global privacy, AI, genetic databases, machine learning, and more.

The Evolution of Privacy Concerns

The first modern age of privacy began in the 1970s. Hesse, Germany, passed the very first privacy law in the world, though it was less than a page long. The United States passed the Fair Credit Reporting Act, the first major privacy legislation in the world. The Organisation for Economic Co-operation and Development (OECD) in Paris took notice and used it as a foundation to develop the OECD Privacy Principles. Those principles then percolated into laws across the world. The European Data Protection Directive, passed in the 1990s, became the first multinational privacy law.

Around 2012, Europe realized that privacy laws created before the majority of people were online no longer matched the privacy concerns of where the internet had gone. They started the conversation around what is now known as GDPR. GDPR has an extraterritorial provision: even though it’s a European law, it applies to any country that wants to do business in Europe. This incentivized many other countries to pass similar legislation. Now, the only countries that haven’t are small island nations, countries without the economic infrastructure to do so, and the United States.

The idea of privacy has to change along with technology. Just look at how much has changed since the beginning of digitization in the late 1980s. The internet era has run its course, and now we’re moving into machine learning and AI. Everyone is talking about large language models and trying to understand how AI affects everything. We’re seeing extraordinary systemic impacts and changes, even in places we wouldn’t expect. This is a unique transformation, and we’re lucky to be seeing it firsthand.

We are right now in a transition that happens once every thousand years. We’re very lucky people. – Pam Dixon

Consumer Perspectives on Digital Privacy

Initially, consumers were unequivocally excited about internet technology. When Google launched Gmail, though, Pam realized the company was letting its ad engine scrape people’s emails for keywords, and that wasn’t okay. She co-wrote a passionate letter about how bad it was – and got death threats for years over it. But she was right, and Google eventually stopped the practice because it violated European privacy laws.

This shows how consumers have become more aware of privacy concerns and more ambivalent about the technology. We saw the same thing with social media. People posted everything enthusiastically on Facebook, then migrated to other social media. Now we’re seeing kids suing their parents over sharenting and more concern about what tech is doing to us. Public perception has gone from “This is a great innovation!” to “It has some harms, and we need ways to be safe from those.”

It goes from, yeah, this is a great innovation, to … we need some guardrails. – Pam Dixon

Other parts of the world have different mindsets, too. Asia has less rigid ideas about privacy: there are more guardrails, but those guardrails are more flexible. India has the Aadhaar system, which does fully digital, real-time biometric identification. Even the smallest, poorest villages are digitized. India also has a lot of digital, technical, and legal protections for privacy. But because of how quickly technology is changing, there are still gaps.

Adapting to AI

At this year’s OECD meeting, the University of Melbourne presented a study that built a trust index for AI systems. The developing world had the highest measured trust in these systems. This has led to an interesting situation where developing nations are the earliest adopters of new AI tech, while developed nations are more hesitant. There’s real potential for these nations to “leapfrog” others in the next decade.

These systems are very complex. In the US, the themes in privacy around these systems are control and consent: control the flow of data and give people an opportunity to consent. But the quantity of data is so great that it’s hard to give people genuine insight into every piece of it. Data free flow with trust (DFFT) might be a solution; it allows data to flow more easily between trusted and verified entities. Privacy advocates tend to want more guardrails than DFFT has, while the innovation side appreciates that it provides fewer roadblocks. But everything is so complex that no single solution fits every situation.

We are in such a complex situation that it’s almost impossible to characterize everything under one rubric anymore. – Pam Dixon

The Future of Privacy

The privacy principles we know today have flowered, and now they’re reaching the end of their usefulness. New principles are waiting, but we don’t yet know what they are. There are people who will tell you they know. Pam thinks she sees hints of them. But we need to watch the ecosystems as they evolve before we make assumptions. Cars have been around for a while, and we can make them better because it’s tweaking a known system. Managing privacy concerns is a challenge because we don’t know where we’re going or what systems will look like in the future. Some aspects will get better and some will get worse – it’s not an either/or situation and it’s not linear.

We do have hints of where some problems are, especially with machine learning, neural networks, and AI. Geospatial Information Systems (GIS) are used a lot around the world, primarily for good purposes like providing healthcare to nomadic tribes. But a system that can identify and track different ethnicities can also be used for evil. Algorithms have been criticized for bias – racial, age, gender, and more. If everything is tracked, even if the purpose is to help people, it could both help and hurt.

Privacy Concerns in Tracking and Machine Learning

A policy study found that if you’re a member of an indigenous tribe or another identifiable group, you could be genetically identifiable. If you’ve provided genetic information to a biobank and that data is analyzed with AI or machine learning, there’s a good chance you could be identified even if the data is anonymized. If you’re a member of a Native American tribe, have a rare disease, or belong to any other small data set, you could still be identifiable. Current privacy mechanisms don’t cover this concern.
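A minimal sketch can show why small data sets defeat simple anonymization. The records and field names below are entirely hypothetical; the point is that even with names removed, a combination of quasi-identifiers (group membership, a rare diagnosis, a birth year) that occurs only once in the data set pins down a single person – the k = 1 case in k-anonymity terms.

```python
# Illustrative only: hypothetical "anonymized" biobank rows.
# Names are stripped, but rare attribute combinations remain unique.
from collections import Counter

records = [
    ("tribe_A", "rare_disease_X", 1971),
    ("tribe_A", "common_cold",    1985),
    ("tribe_A", "common_cold",    1985),
    ("tribe_B", "rare_disease_X", 1990),
    ("tribe_B", "rare_disease_X", 1990),
]

def k_anonymity_report(rows):
    """Count how many rows share each quasi-identifier combination.
    Any combination seen only once (k = 1) identifies one person."""
    return dict(Counter(rows))

report = k_anonymity_report(records)
unique = [combo for combo, k in report.items() if k == 1]
print(f"{len(unique)} of {len(records)} records are unique (k = 1):")
for combo in unique:
    print("  re-identifiable:", combo)
```

In this toy data set, the one person from tribe_A with the rare disease is re-identifiable despite the "anonymization" – exactly the gap current privacy mechanisms leave open for members of small groups.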

Every individual has their own level of concern about this. Some people with a rare disease are concerned about the privacy of their health data and don’t want it out there. Health privacy is an important area of research. Other people are willing to risk potentially being exposed to help other people who also have that disease. There’s vulnerability there, and we have to understand the complexity of privacy. It gets even more complex when AI is involved.

Privacy, when it comes down to it, is highly contextual, and this is what makes it so difficult to actually implement and legislate. – Pam Dixon

Right now, we’re in a transitional era. It’s essential to understand what we’re doing. We need to move slowly and carefully, but we do need to move. AI requires guardrails, even if we’re not sure what those look like right now. We need to develop principles and apply them. Right now, a lot of this technology has no governance systems in place. But those tools are important if we want to address privacy concerns.

The Importance of Testing Solutions

Any kind of tracking needs guardrails. Pam isn’t sure what those should look like, and a lot of factual work is needed to figure them out. Whatever is proposed should be tested – you’d be surprised how many legislative solutions to privacy concerns never have been. We’re past the era where untested solutions could work.

There are all these proposals for fixing privacy that literally have never undergone a single test. – Pam Dixon

Lots of legislators are worried about hurting innovation. But anyone who has looked at how existing privacy law actually applies in the world knows it’s almost inapplicable to any meaningful AI. People want to take advantage of this technology, but they have a vague sense that something could go wrong, and it’s hard to see the privacy concerns. Guardrails will actually help people feel more comfortable engaging with these technologies.

Privacy Across the World

In Rwanda in the 1990s, there was a terrible genocide carried out with the help of ID cards that listed ethnicity. Pam had the privilege of teaching at Carnegie Mellon University Africa in Rwanda in 2023. She taught about identity ecosystems in the developing world and invited Josephine Acacia of the Rwandan National Identity Authority to teach part of a class. Josephine spent an hour and a half talking about what she, specifically, had done to make sure their ID system could never be abused like that again.

The Aadhaar system in India is the largest ID system in the world, covering over a billion people. When it was first built, it was a privacy nightmare. The issues were adjudicated by the Supreme Court of India, whose decision struck down the law that allowed some of the problems, mandated more privacy legislation, federated the centralized database, and added significant privacy controls. Today, the system is quite resilient.

No system is perfect. It requires good governance and good guardrails. But we can build great systems that do a lot and still address privacy concerns. Plenty of countries are already doing it: Asia is doing really well, and Europe is doing pretty well too. We just have to figure it out.

Privacy in the United States

The United States has the most computing power in the world. We have some of the largest models and strong power centers for AI and machine learning. We also use a lot of quantum capacity compared to the rest of the world. We’re very dominant in that way. Other regions of the world are looking at policy, at how technology and privacy concerns are impacting people, and at new permutations of technology, people, regions, and countries. Civil society, academia, and governments are all involved in this work.

The US is ahead in computing, but behind in understanding and implementing guardrails. There’s a lot of resistance to anything that looks like it might slow innovation. But it’s a balancing act. Studies looking at trust in digital ecosystems show that lack of trust means tech isn’t adopted as quickly. If legislation is too tight, it becomes an innovation problem. But if it’s too loose, it becomes a trust problem. We need to find the sweet spot of fostering innovation while protecting privacy.

Learn more about Pam Dixon and the World Privacy Forum at worldprivacyforum.org. There you can find op-eds, reports, and thousands of articles on various privacy topics.

About Your Host

Chris Parker

Chris Parker is the founder of WhatIsMyIPAddress.com, a tech-friendly website attracting a remarkable 6,000,000 visitors a month. In 2000, Chris created WhatIsMyIPAddress.com as a solution to finding his employer’s office IP address. Today, WhatIsMyIPAddress.com is among the top 3,000 websites in the U.S.
