Introductions
Scott: Thank you guys so much for joining us today. I’m Scott.
Nayna: I’m Nayna, and today we are interviewing Andrea Downing and Ashley Dedmon from The Light Collective. Could you guys give us a brief introduction before we start with our questions?
Andrea: I am Andrea Downing and I'm co-founder and board president of The Light Collective. We're an organization on a mission to advance the rights, interests, and voices of patient communities in health technology. And as a patient advocate, I am very excited to be here speaking as part of the Sawyer Series at Rice about Cybersecurity Awareness Month, in addition to Breast Cancer Action Month. It's kind of the convergence of both things.
Ashley: Hello, I'm Ashley Dedmon. I feel like I have to say first that I'm a native Houstonian. I live right here in Houston, Texas, so it's great to be right here in Houston, right up 288. So thank you all. Thank you to the Sawyer Series and Rice University for hosting this important conversation with The Light Collective. As I shared, I'm Ashley Dedmon, and I am a supporter and advocate for The Light Collective. I actually worked with The Light Collective from the time it was getting off the ground. It was a cohort of how many?
Andrea: So we started with you back in 2022.
Ashley: Yes. That was the first cohort.
Andrea: And it was funded. You know, our cohorts were funded by the Robert Wood Johnson Foundation. And that first cohort was all about setting our vision together. We brought in this coalition of community-based organizations to say, you know, hey, what does technology advocacy look like? How do we advance the rights and interests of patient communities when we're thinking about adopting new technology? And I was just so glad to be able to work with you then and to see how you have grown and gone on your path, because you've been part of all of these amazing organizations that intersect with The Light Collective. But by no means is that all; you do all this other amazing work within the breast cancer community. Do you want to talk a little bit about that?
Ashley: Yes. And I'm glad you laid that foundation. So that's really where I was able to connect with The Light Collective and really be part of that foundational work. I think it was maybe 10 or 15 other people. Is that number right? It was a collaborative group.
Andrea: Yeah, we had originally 13. And now we have 28 member based organizations that have been part of cohorts.
Ashley: So, yes. I was a part of that foundational group. But as Andrea shared, I've been in the breast cancer space for almost two decades now. My first form of advocacy was being a caregiver for my mother, who had metastatic breast cancer, and my father, who had prostate cancer. I've worked in the Houston community. I was a K-12 health educator, worked at Harris County Public Health and Blue Cross Blue Shield, and all along this pathway, I was doing my advocacy work. And then in 2021, I actually merged what I call my professional work and my purposeful work when I went to work for Susan G. Komen. Then I worked for the American Cancer Society. Now I'm at Living Beyond Breast Cancer. And I'm also a doctoral student at the UT School of Public Health, which I know is also a partner and collaborator. And so this has really just been my advocacy path. And I have to say, when I was first introduced to The Light Collective, it was different. It was something that I didn't know existed. It was a critical need. And it made so much sense. So when I was introduced to Andrea and she was sharing her vision and her mission of what The Light Collective was and would be, I was just fascinated, because I really felt like it aligned with my advocacy work. So thank you, Andrea, for allowing me to be a part of that work then, and even now.
Andrea: That means the world to me. Thank you.
Life Experiences
Nayna: I know you guys touched on this a little bit, but could you share some of the more pivotal moments in your life that have inspired you to pursue your current work and how these experiences have shaped your approach and perspective within your field?
Andrea: Sure. So I will actually start by saying I did not foresee myself being on this path of being an ethical hacker. I was originally an advocate from the BRCA community, along with Ashley. So, being immersed in the breast cancer community, I will share a very pivotal moment, something I remember like it was yesterday, and it was here in Houston. I was at an ethics and data-sharing event, talking all about the good ways that we could share data for research to improve health outcomes for our communities and find better cures. And it happened to be the week that Cambridge Analytica hit. For anybody who remembers that story, that big scandal, it was March of 2018. I was literally sitting at dinner with all of these mentors of mine and I was in tears. The reason I was in tears was that, with my background, I was starting to understand what it all meant. You have to understand, I started my career in technology back when I was 23, and I was kind of a geek, right? So understanding that you could scrape user profiles from Facebook to create predictive models or behavioral-targeting models in order to change the behavior of a whole population was really scary.
And I wasn't just asking myself, okay, what can you do in politics? My question was, how does that type of approach impact vulnerable communities who reside on a platform like Facebook, or any social media platform? So that was the start of me falling down the rabbit hole and becoming an accidental hacker. And here we are five years later. We've taken our reports to the FTC. We have advocated for better policies. We've tried to organize our communities. We're publishing research in order to inform these policies, and still we have so much more to go, so much further to go, in order to make these places safer. And I'll just pause and say, if you remember back to 2018 and earlier, back when social media was a magical place, and you see how bad things have gotten, I ask myself, how are we going to turn the tide? How are we going to change things so these vulnerable communities are in a better place to practice collective governance, instead of being owned by the social media platforms that we are on today?
Ashley: I know for me, losing my mother to metastatic breast cancer, as I shared earlier, and supporting my father through his prostate cancer diagnosis were the pivotal moments that deeply influenced my decision to pursue my current research career at the UT School of Public Health, as well as my advocacy work. These experiences really shaped my approach, giving me a profound understanding of the emotional, physical, and systemic challenges faced by people who are diagnosed with cancer and their families. They also fueled my commitment to addressing disparities in cancer care and advocating for health equity, especially for underserved communities. And these moments of loss and caregiving really solidified my perspective that research, policy, and advocacy must be patient-centered. They must also be empathetic as well as focused on improving health outcomes. As Andrea was sharing, I was in class the other day, just listening to some of my colleagues in school and really identifying with a lot of their research paths and their career paths. I think we are so used to working in a silo.
You know that in industry everybody stays in their silo, right? But there's so much overlap. There's so much collaboration, there's so much synergy. And I think it takes people coming outside of their box, coming outside of their shell, and even challenging the way we think. As I shared earlier, when I was introduced to The Light Collective, and I think I've shared this with Andrea a lot of times, everything she was sharing challenged me, and not in a negative way. It just caused me to think differently, to use another part of my brain that I wasn't used to using, because I'm so focused on community and so focused on interventions and programming that you don't see the invisible things that could bring harm, but that you need to be aware of, because that then goes into educating the community as well. Right? We're already out there educating them, and as a public health student, I think if we're already out there educating them, this is a part of that education.
Emerging Risks in Healthcare Technology
Scott: You guys touched a little bit on the invisible things that can come with risk in healthcare technology. For our next question: as technology continues to evolve in healthcare, what emerging risks do you see that patients need to be aware of, and how can they better protect themselves while still benefiting from these digital health tools?
Andrea: I'm going to talk a little bit about artificial intelligence. And I have a book recommendation for everyone. It's a book called AI Snake Oil: What Artificial Intelligence Can Do, What It Can't, and How to Tell the Difference. And the question is, how do we protect ourselves? The first thing I would say is that we as patients and individuals, or “users,” are not going to move the needle on digital rights or protecting our communities alone. It really takes a village. And what we have today is a lot of folks signing these consents, either for care, when you go into the hospital and sign your consent form, or when you go online and check the box and sign up for an account, with a lot of fine print saying things you may not know about how your data are being shared. So when we think about this emerging paradigm of AI, we have to understand that there are a lot of data brokers out there scooping up data and creating predictive algorithms or analytics in ways that we as patients are not yet aware of, and we have no laws yet to protect us. You have to keep in mind, HIPAA is the one federal health privacy law here in the US that has really strong protections for patients. Yet there are a lot of loopholes in HIPAA, and HIPAA was designed in the age of fax machines, back in 1996. It does not cover anything that you post online: what you buy related to your health on the internet, your browsing activity, and your behavior. So I would say the first thing is to understand and recognize that we need new laws. We need new policies, and a call to action here is to start really going to Congress, maybe in the next administration, depending on what happens in this election, and say it's time to take our power back and push for stronger consumer privacy laws that specifically protect our health information.
Ashley: I would just add that, when we look at technology, it's definitely been able to advance access. And I also think that we have to make sure that patients, or consumers, or people who are navigating the healthcare system as a whole, are aware of their rights, of the emerging privacy risks, and even of misinformation. When I really look at disparities, health literacy and misinformation are critical, because we can see what they've already done to underserved communities. Because this is a topic that might be new for some communities, information on how to use certain technologies is critical, as is training on the technology and the different platforms. But I think it's also important that people and patients who are accessing the health care system seek guidance from trusted medical professionals as well. Being aware, as I said earlier, of not only how to access the technology but how to use it effectively, efficiently, and safely, engaging with technology thoughtfully, and prioritizing security can help patients really reap the benefits of a platform while minimizing those risks. And I think that's really what's important. We know that it's helpful, but we want to make sure that people and patients know how to use it safely and effectively as well.
Andrea: So, one more thing I want to tag on to what Ashley is saying, and double down on this. The ways that we share knowledge, and the underlying data we share, impact our lives and are potentially life saving. You know, there are ways that a Facebook community or a social media group can be a lifeline for people. There are ways that sharing genomic data within the BRCA community is going to help us classify more variants so people can get clarity on their own diagnosis. And we want that better diagnostic excellence, right? You know, we want faster cures. So we have to understand that there's good data sharing and bad. It's not all lumped into one. It really depends on the type of data being shared, with whom, and how it's used. And I really think about technology this way: think about a stick. You can either use it to make a wonderful tool, or you can bop somebody over the head with it or stab them and turn it into a weapon. That's a very simple metaphor for the ways that technology can be used as well.
Need for Patient Protection
Nayna: I have a follow-up question, just based on what I'm hearing about technology and how we can further mitigate any future harm. I know policies are a great way for that to happen. But as we know, that takes time and a lot of effort, and it doesn't happen overnight. So what strategies or approaches do you think these social media companies, or technology-based companies in general, can take to help prevent this private data from becoming public?
Andrea: That is a really tough question, and it depends on the type of technology and the type of data. I'll give you one example of how we thought strategically at The Light Collective about this. Back in 2022, we downloaded our archives off of Facebook, and we started to look at our cross-site tracking data to see what advertising companies were sharing about us and using information to target us. It was our very first study, and it only looked at five companies. We published it in Patterns, a Cell Press journal. What we learned was that three of the five companies were not following their own privacy policies. And that comes down to: were they disclosing in the fine print of their privacy policies that the data being shared was used for advertising? So we published that, and there was a follow-up investigation using the same method by The Markup.
The Markup looked at the top 100 hospitals in the country. They found that about a third of them were sharing data with Meta, including data from authenticated patient portals, without consent. And that includes who your doctors are, your medications, your health conditions. They were sharing that with Meta. So when we think about how we can have an impact, it's all about the fact that we are David versus Goliath, right? We don't have a lot of resources. We're this scrappy organization. What we did was look at our own data to try to understand, okay, what were the potential harms of seeing different companies tracking us across the internet? And that led to a follow-up investigation that ended up creating a ban on surveillance trackers, a joint ban between the Federal Trade Commission and the Health and Human Services Office for Civil Rights. Long story short, there's been a wave of federal and state class action lawsuits, thinking about how courts also inform policy. And there has been a big shutdown of the use of these trackers among HIPAA-covered entities. It's now considered a bad practice, which it absolutely, absolutely should be. And yet we're still facing some blowback from that, where hospitals, seeing the liability, are trying to limit their exposure in these consumer lawsuits.
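To make the kind of investigation Andrea describes a little more concrete, here is a minimal, hypothetical sketch of one way a reader could check whether a public webpage references the Meta Pixel, the tracker at the center of The Markup's findings. This is not The Light Collective's or The Markup's actual tooling; the URL below is a placeholder, and real audits inspect live network traffic rather than static HTML.

```python
# Minimal illustrative sketch (not the actual study tooling): scan a page's
# HTML for the standard Meta Pixel loader script or its tracking endpoint.
import re
import urllib.request

PIXEL_SIGNATURES = [
    r"connect\.facebook\.net/\S*fbevents\.js",  # Meta Pixel JavaScript loader
    r"facebook\.com/tr\?",                      # Pixel tracking endpoint
]

def page_mentions_meta_pixel(url: str) -> bool:
    """Fetch the page and report whether any Meta Pixel signature appears in its HTML."""
    req = urllib.request.Request(url, headers={"User-Agent": "pixel-check-demo"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return any(re.search(pattern, html) for pattern in PIXEL_SIGNATURES)

if __name__ == "__main__":
    # Placeholder URL; substitute a page you are authorized to test.
    test_url = "https://example.org"
    print(f"{test_url}: Meta Pixel signature found? {page_mentions_meta_pixel(test_url)}")
```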
Successes
Nayna: What has been the most significant outcome or success of The Light Collective's work so far? I know you talked about the studies that you guys have done, but can you touch a little bit more on some significant outcomes and how you've measured the impact of your advocacy on policy changes or patient empowerment?
Andrea: So I'll tie it back, just to put the thread together, to the state and federal class action lawsuits. There are quite a few that we're watching right now that haven't been decided. And the biggest one is a case called Doe v. Meta, which actually looked at cross-site tracking and some of that original research that we did. The plaintiffs have made all these claims about whether or not Meta broke federal privacy laws, but also federal wiretapping laws. So that precedent is still being set. And I would say it's a big impact in the sense that getting that level of rigor, and understanding that, yeah, we really can band together, do research, and shed light on privacy or security problems that are going to create these cases that will set legal precedent, is, I think, a really important thing.
The other things that we're doing, in terms of how we measure impact and how we think and do things differently: the first I'll point to is that it's Cybersecurity Awareness Month, and we have a “Don't Get Hacked” bootcamp that we share with patient communities. We partnered with a very famous ethical hacker, Rachel Tobac, founder of SocialProof Security. We've delivered this training, and at this point, representatives of 79 organizations have taken it. We're trying to increase that impact.
But we're really proud that we've had 349 course completions. We have also, in addition to the other research that we've published, just done a new study called “Tangled in the Web,” and that research surveyed 377 participants: patients, advocates, and caregivers in our own network. Going back to this new wave of AI, we didn't have data yet to really show what the community is concerned about, and one of the key statistics from that study was that 91% of participants want to be informed if AI is being used to communicate with somebody, or being used to make predictions or decisions about a patient at the point of care. But we have no standards or policies being enforced yet to make sure that happens. So I would say the story is still being written. We're at the forefront of these waves of technology washing over communities. Those are small ways that we've had impact, and we hope that time will tell the greater impacts to come.
Ashley, would you be willing to talk a little bit about your experience of going through our cohort and learning together? What was that like, just as one organization involved? How was it impactful in your thinking?
Ashley: It's a great question, and I think I touched on it a little earlier. It really challenged my thinking. It really made me, as I say, use another part of my brain. I'm a connector, so I like to connect the dots. I like to connect people. And so I think initially, as I shared earlier, it was taking information from an industry that has kind of been siloed, information that really hadn't surfaced up to the patient community level yet, and trying to process it, but also to see its importance, its value, and its application in the space that I'm in, the advocacy space. And not only that, but in the day to day of people going to and from their health care provider's office. And so when I think back to that cohort, The Light Collective and Andrea did a great job of bringing cross-sectoral contributors into the room, with different backgrounds and different levels of expertise, but all with that common goal, right? And so it was challenging, but you had to trust the process, right? Anything new that's different, that's outside your comfort zone, you're just like, okay, where is this going, you know? But when you stay the course, you really realize that it's truly a beautiful process, a meaningful process, a needed process.
I think having been introduced to The Light Collective and being a part of that foundational work has made me that much stronger an advocate. So much so that, and I've never told you this, after that cohort... I mean, unfortunately, we've all experienced things in the health care system when it comes to privacy that should not have happened, right? We know we have amazing health care providers, and they are taking care of everyone. We saw that during COVID; they put everyone else ahead of themselves. And we know that things happen. We're all human. But having been a part of this, as I was navigating my health care journey after being introduced to The Light Collective, there were things I noticed. I remember one time I went to an appointment, and when I went in, the computer had accidentally been left on, and it had another person's information. I immediately went out and was like, hey, y'all need to log out of this, right? And they typically do. They'll either scan the card or they'll exit out.
And they were like, oh my gosh, thank you so much. And like I said, it's something that happens. But I was left alone in a room with someone else's health information, and I knew if that were me, I would not want that. And so it was reporting it, making them aware, right? Because guess what? I was going to be the next person whose information was going to be pulled up on that screen, and then someone else was going to be coming in after me. And so I think it's accountability. But I also think it's continuing to train our community, train our health care providers, train everybody who's a part of this system, that we have to be aware. Things are always evolving and changing, and we need to be aware of that too. But it has definitely helped me to be a better advocate, because I'm really big on patient rights. These are the different types of rights that, prior to connecting with The Light Collective, I just didn't know about. You're not hypersensitive to them or aware of them, right? I think, like Andrea said earlier, you just check a box, but do you really know what you're checking?
Personal Boundaries in Activism
Scott: So this is a question for Andrea. Your work focuses on advocating for the protection of personal and private data, yet your activism has also led you to share aspects of your own life publicly. How do you navigate setting boundaries between your activism and your personal life? Has it been challenging to find a balance, and what kinds of strategies have you developed to regulate this relationship?
Andrea: I hadn't really thought about this question until you actually asked me. And I would say the key here is boundaries. Setting boundaries is something that, in the physical world, we can do by saying, hey, no, I would like you to step back, or I don't want you to do this. And I think when we talk about the word privacy in health care, a lot of people think about keeping your information secret. It's really not about keeping things secret. It's about setting boundaries and having those boundaries respected. Now, from my own personal perspective, I'm just going to say I have done a lot of work to try to set better boundaries. But we often, as patient advocates, find ourselves in this paradox where we choose to come out and tell our stories publicly, and in doing so, we know that it may cause some opportunities to be closed to us in the future because of our public persona. For me, I really hope that we have a future and live in a world where we respect both our digital selves and digital boundaries, as well as our physical selves and boundaries, which in and of itself is a hard thing to do.
But for me, it's also been really about choosing deliberately. Yes, I want to be public about some of the things that are private to other people, and I want to be vulnerable in order to help others. Unfortunately, I think a lot of us, as patient advocates, find ourselves in a place of exploitation. When that happens, we're often asked to advise companies or to become influencers on social media for a certain product, in ways that may benefit the company or may generate data from our community that is going to make a lot of money for that company. And yet the benefit does not get shared back. So in terms of my own personal boundaries, I've just had to work at saying no more often to people who ask me, hey, do you want to speak at this thing? Of course, I chose to speak at Rice because I really believe in being able to speak to the next generation of med students. But I'm saying no to a lot more things, just because I have to ask myself, is it worth the time that we have to actually advance the mission? And yeah, it's been a personal struggle for me.
Advocacy Work
Nayna: Now we have a question for Ashley. You've kind of touched on this, but you've used your experience as both a public health professional and a BRCA advocate to push for patient rights and data privacy. So how has your personal and familial journey influenced the way that you've approached your advocacy work? And what advice would you give to others who want to turn their personal experiences into a platform for change?
Ashley: That's a great question. When I think about my path to advocacy, it really started with my personal experience. If my parents hadn't been diagnosed, if my mother had not passed away, I don't know if I would be here. I know I would be some sort of advocate today, but I don't know if I would be an advocate in this place, in this space. But I'm also an advocate for child health and well-being, and an advocate for other things that are important to me as well. And I think those skills are very transferable when you find something that you believe strongly in, which is why and how I was able to advocate for The Light Collective, because it's something that I truly believe in and I see the value in. And I see how it's protecting, and helping to protect, people. I think it all starts from a personal experience, or something you can relate to, to really help drive you. It's also, like I said, been almost 20 years of being in this space and being trained, whether through organizations like The Light Collective or through different types of trainings. I remember I went through a patient navigation training through GW, and that was because I realized that for a lot of people here in my community, in the Houston community, if someone was diagnosed, they would call Ashley, right? They knew I would know how to connect them to people.
But I realized early on that I was also helping to guide people from a place of fear and trauma from my own parents' experience. I recognized that very early on in my path to advocacy, and I needed formalized training to really be able to help support people without bringing my own experience into it. And so now, when people call me, I'm able to help guide them, provide them with resources and with key questions they should ask when they go see their doctors, and really just empower them to advocate for themselves. Because I think, in the patient advocacy space, while it's very rewarding to be able to help people, it also can take a toll on you, because you yourself are going through your own experience and your own journey. And that continues. But you are now a lived experience expert. There are women and men out there who have been diagnosed. I'm an undiagnosed woman. They are more expert than I am, because they have lived it. I have experience as a caregiver. I have experience as an undiagnosed woman who knows she is high risk. I've had a preventative double mastectomy, but I want to acknowledge that it's nothing like being a person who has heard those words: you have cancer. That's a totally different path.
And so I think it takes everyone coming together, multiple experiences, to be able to advocate, and to use that to drive them forward on what's important to them. I think that's another thing I love about the advocacy space: there is room for everyone. There are certain niches when it comes to advocacy. There's research advocacy, fundraising, policy advocacy. There are so many arms of advocacy that there's space and room for everyone. And I think with The Light Collective, we now know there's digital advocacy. We did not know that two, three, or ten years ago. I think there's space for everyone. So my advice for others would be to use their unique experience to fuel their passion, and to build credibility through education. You have to educate yourself. You have to go through different programs. I know The Light Collective has a boot camp that's focused on patient AI rights, so that's another form of education, right? My advice is also to collaborate with others who share their vision for change, and finally, to turn their personal stories into powerful platforms for advocacy and to ensure that they're compensated for their work. As Andrea shared earlier, it is so important that we share our experience, but it should not be shared for free. I'm a huge advocate for making sure that anytime a person opens their mouth to relive their trauma, they are compensated for it. And it's a space that I think we're all still trying to work through.
But I think it's important for advocates and patients to know that it's okay to share your story, and it's okay to get compensated for it. I think I had a point where I felt bad for receiving an honorarium or compensation based on my pain and trauma. But you realize that it is an experience, and it's okay to receive compensation for that. And I think it plays back into the fact that it takes time. It takes time away from work. It takes time away from people's families. They have to relive a trauma over and over again, as I shared, every time they open their mouth to speak. And it's okay. I know for me, there are things that I do absolutely for free. But those are things where I've said, you know what, when it comes to children, or my child's school, or my church, or whatever, those are the things I know fill Ashley up. And then there are other things where, depending on what it is, it's really the offering that matters. Sometimes I will turn it down. Sometimes I'll say thank you. Sometimes I'll donate it back. But I think it's just the offering of it that makes it equitable. And I think that's so important.
Andrea: If I could, I want to add a few things to that. I'm really glad you brought that up, because I want to go back to what we talked about in terms of setting boundaries. I think within the space of technology advocacy, we have a lot of resources we can point to that are best practices around the science of engagement, practices that have been applied in clinical research but have not yet been applied in technology when it comes to partnering with patients and patient communities. So the first thing I want to add, and this is true for me too, is that when you stand up and tell your story in an empowering way, it can be a healing experience, but that lived experience is absolutely trauma. And at this point, I think in general advocacy and in healthcare, we as patient advocates lack institutional support in the same way that professors, doctors, and others have institutional support. Yet we're doing work on behalf of communities, so it's important to create a professionalized path, free from conflicts of interest, that truly supports this work, that can address these disparities and bring more people to the table. So I do want to point out a couple of great resources.
One is from the Patient-Centered Outcomes Research Institute. Earlier this year they published a really great guide called Foundational Expectations for Partnership. And it's all about how you think when you approach a community and say, hey, we want to “partner.” Going back to boundaries, the question I keep asking myself is: okay, how are they going to treat our community as a true partner on equal footing, and not a token for PR? Some of the keys to that approach are: What decision-making power do we have as part of any initiative that we're in? How diverse is the representation? At The Light Collective, we have a guiding principle, which is no aggregation without representation. Are you asking us just to be on a committee so you can have that face on your website? Or are we actually part of a board where, when things go down, we're going to be able to represent and advance patient interests in those rooms? I would say there's a whole other set of things that need to happen. But especially within the space of technology advocacy, and increasingly when we're setting standards or developing AI, we're just not there yet.
Takeaways
Nayna: Thank you guys so much for taking the time to speak with us. Before we end things off, we wanted to ask: knowing all this information and hearing about your stories and personal experiences, what would you like people who are listening to take away from your talk and everything that you've been speaking about today?
Andrea: So I'll start with what I said before: no aggregation without representation. We can't advance the rights, interests, and voices of patient communities in health technology as individuals; we have to take collective action. It's time for health privacy laws in this country that protect patient communities. It's time for us to respect the role of digital advocacy as we work with technology, standards, government, and industry. The patient voice truly needs to have a role in that. And if you're interested in this, go check out The Light Collective. Sign up for our bootcamp and come talk to us. We're happy to talk to anybody, especially if you are a med student out there trying to shape the future of medicine, because we're trying to do the same thing. Let's do it together.
Ashley: To add to what Andrea just shared, I would just amplify that the patient voice is so important. I know Andrea and I shared our experience today, and for some it may be, well, how do I do that? No matter who you are, no matter your background, your socioeconomic status, your educational background, your voice matters. And I think that's so important. And I believe it's about being connected, getting connected to organizations like The Light Collective, which can help you get more education and can help open up more doors in areas of advocacy that you're passionate about. The Light Collective has great collaborations with other organizations, multisectoral organizations. I think it's just knowing that the patient voice is so important. And I think the last thing is, we have a panel discussion this afternoon in the Sawyer Seminar on “No Aggregation Without Representation,” as Andrea just shared. We're super excited, because Andrea and I will be on that panel today with a mutual friend, sister, and colleague, Valencia Robinson. We're looking forward to seeing everyone who's coming out. But if you miss it, I know Rice University will have it online somewhere for people to watch for days and months and years to come. So I think, yeah, those are two great first steps.
Photo credit: Michael Busch