
    Can Big Tech Read My Mind? with Sean Pauzauskie, MD 

By This Is Your Brain producer | November 28, 2025

Whether it’s our shopping habits, Google searches, or Facebook friends, we are all aware that our personal information is being collected and often sold online. But what about our thoughts, what’s going on in our minds? With all the advances in consumer neurotechnology and AI, harvesting data from our brains is no longer science fiction, and at least for now, that data can be collected and used without our consent.

Can anything be done to stop this intrusion into our most personal information? Dr. Sean Pauzauskie, Medical Director at the Neurorights Foundation, discusses how companies collect our brain data and whether or not your mind has any right to privacy.

    Phil Stieg

Our personal data is online and it’s for sale. But what about the information in our brains, in our thoughts? Is that up for sale as well? With the exponential growth of neurotechnology, the private data in our minds can be, and already is being, collected and used without our permission. How are companies getting access to it? What are they using it for? And is there anything we can do to stop it?

Today, we are here with Dr. Sean Pauzauskie, the Medical Director at the Neurorights Foundation, which is working to protect all of our brains’ rights to privacy, finding ways to safeguard our thoughts, emotions, even our subconscious before it’s too late. Sean, thank you for being with us today.

    Sean Pauzauskie

    Yeah. Thank you so much for having me.

    Phil Stieg

Before we get going on the whole concept of neurorights, which obviously is incredibly important, can you explain to us what the Neurorights Foundation considers to be neurotechnology?

    Sean Pauzauskie

So neurotechnology is broadly defined as any device that can either record or manipulate brain activity.

Right now on the market, there are about 30 devices that are capable of recording brain activity, mainly using EEG. Actually, the headphones that I’m wearing right now have EEG sensors in them. Just to use a few examples, there are devices like the Neurable headphones, Muse, Emotiv, MindLift, and others that are really capable of collecting brain signals at the consumer level.

    Phil Stieg

    So you have an app associated with your earphones, right? Explain how you use that app.

    Sean Pauzauskie 

Yeah. So you put the headphones on; it’s a very seamless process. The app starts collecting brain waves and will tell you in real time, using biofeedback, how focused you are at that moment. As you and I know, gamma and beta waves are some of the fastest frequencies that the brain produces, and we know from studies that people who are in a more focused state are in a gamma or beta frequency. So the headphones can tell that very quickly and very seamlessly and help people stay in that state.

    Phil Stieg

And as a consumer, if I bought your earphones, would I know that they’re monitoring my EEG activity, or is that something the company has added on without my knowledge?

    Sean Pauzauskie

Yes, the companies are very open about what they’re doing with their devices. In fact, these capabilities are the reason they want you to buy the product.

The devices themselves are all marketed for various indications: things like sleep optimization, meditation, or cognitive tracking. So they’re health-adjunctive products. They’re not FDA-approved clinical tools, but they are capable of collecting very useful data and helping consumers and patients optimize their brain health in ways that were never possible before.

    Phil Stieg

Again, I’m having a hard time making the link. You’ve got those headphones on, and the company is storing some EEG data from your brain, but right now the company has no idea what you’re thinking or doing. So how are they going to technically steal my thoughts?

    Sean Pauzauskie

These algorithms are developed to track certain states. As you know, brain waves are broadly divided into five different categories, and we know certain things about the brain. We don’t know everything, but we can know certain things with a fine degree of granularity from the frequency the brain is in. What they’re doing is capturing broad states of the brain and then inferring that the data are directly related to various mental states.
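The kind of band-based inference described here can be sketched in a few lines of Python. This is a simplified illustration of the general technique, not any company’s actual algorithm; the band boundaries and the periodogram-based power estimate are common textbook choices assumed for the example:

```python
import numpy as np

# Conventional EEG frequency bands in Hz (boundaries vary slightly by source).
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 80),
}

def band_powers(signal, fs):
    """Estimate the power in each EEG band from a 1-D signal sampled at
    fs Hz, using a simple periodogram (squared magnitude of the FFT)."""
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

def dominant_band(signal, fs):
    """Return the band carrying the most power -- the kind of coarse label
    (e.g. beta/gamma read as 'focused') that a consumer device reports."""
    powers = band_powers(signal, fs)
    return max(powers, key=powers.get)
```

A device samples a short window of EEG, computes something like `dominant_band`, and maps the result onto a state label; the privacy questions in this conversation concern the inferences layered on top of these simple measurements.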

    Phil Stieg

But you bought those earphones knowing that they’re monitoring your brain waves. By doing that, are you giving up your rights and just saying, "I don’t care if the company does that"?

    Sean Pauzauskie

Yeah. So I am first and foremost an enthusiast for neurotechnology, and so I feel comfortable using it. Today, I don’t see any inherently harmful uses of the data that are being collected. So I find the applications and the uses very exciting, while at the same time I think there should be guardrails that make the devices safer and give people control over their own data.

At the Neurorights Foundation, we feel it is first and foremost part of our mission to promote the use of neurotechnology by doing things like creating safety standards that will make people more likely to adopt the technology. We think it will increase investor confidence if there is less exposure to regulatory risk during development. We feel that companies are going to be able to work more closely with medical centers in more validated settings, and the FDA has actually said that companies that do pre-market diligence will have a faster road to FDA approval. So ethical guardrails are something companies may not necessarily want in the short term, but they can only create an environment where this technology is more readily adopted, through consumer confidence and the legitimization of the technologies.

    As you know, people are skeptical about, do these things work? What are they doing? And so if we can create just basic safety standards and development principles, we feel like it’s actually going to drive the adoption and help more people in the long run.

    Phil Stieg 

    So what are the “Neurorights” that your foundation is seeking to protect?

    Sean Pauzauskie 

Neurorights are broadly defined as five different categories of ethical interest as they relate to neurotechnology.

The first neuroright is mental privacy. With consumer devices collecting EEG data, we want people to be able to own and control their own brain waves.

The second category is fair access to mental augmentation. Some of these devices not only record brain data but are capable of influencing it. To give one example, there are devices that introduce low-amplitude electrical currents into the frontal lobes and have been proven clinically in the lab to help people fall asleep faster and stay asleep longer. It’s that influencing of brain activity through a signal that we define as mental augmentation.

The third neuroright is personal identity. This gets more into things that could influence your sense of self, probably more in the future with things like implantable neural devices that could change a person’s sense of who they are.

    The fourth Neuroright is free will. We don’t want anyone to lose their sense of agency through neurotechnology. We want people to maintain control not only of who they are, but what they do.

    And then the final neuroright is freedom from bias. So this delves into the realm of the algorithms that are interpreting the neural data. We don’t think that any brainwave should be viewed as good or bad based on any biases in how these brainwaves are interpreted.

    Phil Stieg

So in essence, the measurement isn’t revealing your emotions; it’s measuring your brain’s electrical activity, and then it’s you taking that feedback and turning it into whatever you want. But it’s not somebody else having that. Or is it also that somebody at the app company has the measurement of your brain activity, and we’re unclear about how that’s going to get used?

    Sean Pauzauskie

    Yeah. So that’s actually where a lot of the work on mental privacy comes in. So as you know, the first laws in the United States were passed last year in Colorado in April and in California in September to give consumers the right to have control over this data. 

As you and I well know, there are a dozen or more medical conditions that you could also detect from this type of data. That could be great for driving insights into various mental health disorders, cognitive disorders, seizures, you name it. But we feel that should only happen if the user has consented to give that data away. It should be an individual, personal decision whether their data is aggregated into some algorithm that wants to create insights into things like depression. We’re not against insights being teased out of the data; we just think it should be a choice.

    Phil Stieg

That’s the other thing, too. As I was reading through some of the Scientific American coverage of this, I understand it’s something we’ve got to be worried about down the road. But what I don’t want to do is set up legislation now, when we don’t understand what "down the road" means, that will limit us down the road. And I think you and I probably feel the same way about this: you’re into neurotechnology, but we also want to protect the individual from some company that isn’t using it appropriately.

    Sean Pauzauskie

Yeah. As our sponsor in Colorado, Representative Cathy Kipp, who was really a pioneer of the first neural data protection law, put it: if everyone were a good actor, the statutes would be two lines long. It’s preventing potential misuses or abuses that means we have to have these big, long, bulky privacy laws.

We’re not worried about any particular company’s data practices now. In fact, this could be a rare example in a technological revolution where we don’t have to wait until there’s been a Three Mile Island or a Chernobyl. We can get out ahead of it and make sure that people enjoy only the benefits from the beginning.

    Interstitial theme music

    Narrator: 

    Long before technologies could record or interpret brain activity, entertainers known as mentalists astonished audiences by seeming to read minds. Mentalists still perform on the stage and screen, and they have a few tricks up their sleeves for making feats of mind-reading look remarkably real. 

    Like magicians, mentalists combine techniques such as misdirection, sleight-of-hand and psychology, to suggest that they possess superhuman abilities. Mentalists also carefully study human behavior and mannerisms, to uncover subtle clues that support the illusion of telepathy.

    During the 1800s, mentalism’s popularity soared. A mentalist would try to guess a secret name or number, describe a hidden object, or predict which card a person would draw from a deck. 

    Some mentalists said they could communicate with ghosts. Others professed to read people’s thoughts or see into the future. 

    But while mentalists of the past claimed to be psychic, modern mentalists readily admit that mentalism isn’t mind-reading at all.  

    Oz Pearlman has spent decades perfecting his act as a mentalist, appearing on television and stages worldwide. During a podcast appearance, Pearlman stunned the host by correctly guessing the PIN code for the host’s bank account. In a recent TED talk, Pearlman explained that he was not a psychic, and that mentalism was “a learnable skill.” He uses it to reverse-engineer the human mind and intuit correct answers by reading a person’s body language. 

    For example, when he asks an audience member: “Think of someone famous, living or dead,” he looks at their face and body. Their posture tends to be stiffer and their facial expression more somber if the famous figure is no longer alive. When male audience members choose a famous man, they put their hands in their pockets; but if they choose a famous woman, they clasp their hands in front of them.

    As Pearlman said in the talk, “If I know how you think, I know what you think.”

    While mentalism isn’t mind-reading, it’s still entertaining to watch — especially when the audience can’t guess how the trick was done.  For Dutch mentalist Timon Krause, his goal as a performer is simply inspiring the audience with a sense of wonder. He says: “I just want them to have an experience where they start to think that, well, maybe that was magical!” 

    Interstitial Theme Music

    Phil Stieg

    When you talk about neuro discrimination, give me an example. What does it mean?

    Sean Pauzauskie

Yeah. So I think the example that gets used most frequently is insurance companies. If they were to get hold of the fact that I have a problem with anxiety or depression or some mental health disorder that’s diagnosable by brain waves, I should not be denied coverage by an insurance company, as one example.

    Phil Stieg

And does that occur currently? Or is that what we’re worried about in the future?

    Sean Pauzauskie

    That’s what we’re trying to prevent, yes.

    Phil Stieg

    Do you envision a day where a company is going to say, Sean, put these earphones on while you’re working, monitor your brain activity, and then possibly use that in your job performance evaluation?

    Sean Pauzauskie 

There’s actually a bill we’re still advancing in California right now that is more focused on the labor applications of this data. My feeling, and I think the feeling of most neurorights advocates, is that these technologies are not inherently good or bad when it comes to things like focus tracking or making sure that people are paying attention, especially in a high-stress, high-demand job: air traffic controllers, truck drivers, anything where public safety is at stake. But it should be the choice of the individual to enter into that agreement, and they should have certain rights. They should have the right to fair consent. They should know why they’re using the device and what the device can do. As long as those standards are in place and the employee cannot be discriminated against based on their neural information, then no one would have a problem with increasing safety through technology. It’s just that the people using it should have certain rights.

    Phil Stieg

And you’re obviously having some impact. There are some recent bills passed in Montana and Colorado. What are the bills that you are happy with?

    Sean Pauzauskie

Yeah. So the bills that we passed in Colorado, California, and Montana were a direct result of the efforts of the Neurorights Foundation. The strategy of all of our bills so far has been to amend state privacy acts. California was the first state to pass a broad privacy act protecting all kinds of personal information, including your cell phone, your address, and your email, things that everyone believes should remain private.

And so our strategy was really a pretty simple one: just to go into those statutes and amend in a definition of neural data, as we talked about earlier, covering anything that can record or manipulate brain activity through the use of neurotechnology or a device. The same strategy was employed in Montana.

    Phil Stieg

    I was impressed with how international this is. Can you give us an example of how broadly-based the Coalition for Neuro-Rights is?

    Sean Pauzauskie 

The first government to actually take action on neurorights was the country of Chile, which amended its Constitution in 2021 to include protections for brain data. In 2023, the state of Rio Grande do Sul in Brazil included protections for brain activity. Now, I think there are about half a dozen bills throughout South America, including in Colombia, Uruguay, Mexico, and Costa Rica; the movement is spreading there. And the province of Cantabria in the north of Spain just introduced the first bill outside of the Americas for the protection of broadly defined neurorights.

    Phil Stieg

    I don’t want people listening to this podcast running away thinking, Oh, my God, they’re going to start stealing my thoughts, my emotions, and putting them out there in public. 

    Sean Pauzauskie

So the thing about thoughts and emotions is that there was a study in Australia last year that showed the capability to create an algorithm to translate thought to text with about 40% accuracy, which isn’t ready for consumer use yet. But this stuff is on the horizon; if we know anything about technological advancement, it tends to outpace our expectations. It wouldn’t surprise me if in five years we have the ability to use the headphones I’m wearing right now to take thoughts and translate them into text. That’s really more of a future concern. But some things are here now: emotion recognition, as you may know, is detectable with just two channels of EEG and a pair of earbuds. You can tell with pretty high fidelity whether somebody is anxious or sad or happy. So some of these things are here today, and some are future concerns. The implantable devices are already going through the rigorous FDA approval process; that will take some time, but it will be here soon as well. So there are today’s problems, and then there are the problems we’re looking at in the future. We just want to get out ahead of this to avoid any slowing of adoption or any potential misuse or abuse.

    Phil Stieg

    What wakes you up at 3:00 in the morning regarding your worries about this data acquisition and the data companies and how they might use it?

    Sean Pauzauskie 

So we talked about the potential discrimination piece with things like insurance companies. I worry about manipulation. I worry about somebody wearing a pair of headphones or an earbud and that data being sold to an advertiser. We’re entering an age where it’s not just time spent scrolling, where the company knows how long you spend on a website; we’re entering an age where you can tell how interested somebody is in the content they’re looking at, how focused they are on it. It concerns me that a person could have content manipulated into their algorithmic feed based on their mental activity. And then probably the worst thing, because I’m a practicing neurologist, is people essentially giving away their health conditions to companies who can then do whatever they want with them. There are no standards on what could be used. If somebody knows that I’m anxious or depressed, they could put that into my feed and try to sell me things they think I might need, based on health conditions I didn’t even know I gave away.

    Phil Stieg

Do you think things are moving in the right direction? Are you encouraged? Are you positive about this? Or do you feel like people are fighting you, trying to undermine you? Is there a balance?

    Sean Pauzauskie

I think there’s definitely a balance, but on the whole, I am positive about the future of neurotechnology. I think the future is incredibly bright. When I was a medical student, a lot of people looked at me funny, like, why would you want to do neurology? You can know what’s going on, but you can’t really do anything. Well, this technology has really changed that, to the point that I feel like I’m going to have literally dozens more tools and options to treat patients. And so I’m very excited about that, and about the 30,000-foot picture of cultural progress and what we’re going to be able to do for both patients and consumers. We’re going to be living completely augmented lives, in a way, very soon. So I’m very excited about that aspect as well.

    Phil Stieg

Dr. Sean Pauzauskie, thank you so much for spending this time with us. Probably one of the most interesting areas of neuroscience right now is neurotechnology and how we can regulate it to make sure that your individual privacy is not invaded. Thank you so much for making that clear to us. It’s been a real pleasure talking with you.

    Sean Pauzauskie

    Thank you for having me.
