TechTank, a biweekly podcast from the Center for Technology Innovation at Brookings, explores today’s most consequential technology issues. Moderators Nicol Turner Lee and Darrell West speak with experts and policymakers to share data, ideas, and policy solutions that address the challenges of our digital world.
Conversations around technology in 2025 were dominated by topics like the labor impacts of generative artificial intelligence (AI), the data center boom, and children’s online safety, among others.
These discussions are likely to continue or even expand in 2026, as lawsuits around kids’ use of social media unfold and the Trump administration prioritizes U.S. leadership in AI. Concerns about fairness, privacy, transparency, and security cut through each of these conversations.
In this episode of TechTank Podcast, co-hosts Nicol Turner Lee and Darrell West discuss what issues they think will take precedence in 2026, and expectations for action at the state and federal level.
Listen to the episode and subscribe to the TechTank Podcast on Apple, Spotify, or Acast.
Transcript
[00:00:00] CO-HOST NICOL TURNER LEE: You are listening to Tech Tank, a biweekly podcast from the Brookings Institution exploring the most consequential technology issues of our time. From racial bias in algorithms to the future of work, Tech Tank takes big ideas and makes them accessible.
[00:00:26] CO-HOST DARRELL WEST: Thanks for joining our Brookings Tech Tank podcast. I’m Darrell West, senior fellow in the Center for Technology Innovation at the Brookings Institution. Now that it is 2026, there are questions as to how technology policy will unfold and what actions governments will undertake. We are seeing dramatic advances in generative AI, but there is concern about how it and other tech developments will be regulated and what that means for questions such as fairness, privacy, and transparency.
[00:00:57] To discuss these issues, I am pleased to be joined by my colleague Nicol Turner Lee. She’s the director of the Brookings Center for Technology Innovation and the author of a Brookings Press book entitled “Digitally Invisible: How the Internet Is Creating the New Underclass.” It’s an excellent book and I highly recommend it. What Nicol and I are gonna do today is discuss what to expect in the coming year on the technology front and what people should watch from the Trump administration and Capitol Hill. Nicol, welcome to our Brookings Tech Tank podcast.
[00:01:31] CO-HOST NICOL TURNER LEE: Hey, Darrell, it’s always great to be with you at the beginning of the year, so I feel like this is something we’ve been doing for quite some time.
[00:01:38] CO-HOST DARRELL WEST: Yeah, it’s always great to do a tech outlook and kind of compare notes, and it’s always good to be on with you as well. Yes, let’s just jump right in. Nicol, I know there are lots of exciting developments on the AI front. President Trump has made news by issuing an executive order limiting the ability of states to regulate AI.
[00:01:57] We know there are court challenges to that order, but based on what we know now, what is your assessment of that executive order and how do you think it will affect the states?
[00:02:08] CO-HOST NICOL TURNER LEE: This is an interesting question. I think the Trump administration in the second presidency has been really clear that AI is going to be the driver for our global competitiveness and position us much better when we are compared to China and Russia and other areas.
[00:02:26] So I think that conversation and that narrative is one that came out of the Biden-Harris administration as well, and one in which I think it is really important for the United States to lead. By the same token, where there are really interesting highlights in the executive order that allow for that type of global competitiveness
[00:02:44] in the area of AI, there are a lot of other areas that are restricting what the U.S. is able to do, one being the conversation around preempting states as they begin to think about ways in which they can either regulate, legislate, or experiment with AI. And I think that’s pretty much a sticking point in the conversation right now. State rights are particularly important, not just on the consumer protection side. I consider states the protector of your grandmother’s internet. They are the ones working to ensure that the attorney general has the power that he or she needs so that we can avoid things like fraud, scams, and abuse. But states are also being put on notice that if they do not comply with the very assertive demands of the White House, they will be considered out of order and out of line. And in some instances, the executive order implies that they could lose federal funding, such as the funds being given to expand broadband infrastructure, simply because they’re not necessarily working alongside the government. So I think, Darrell, for me, that’s a sticking point, alongside some of the other areas of the executive order that are very confusing and concerning, in terms of the limitations and restrictions on higher education and the lack of interest in setting up guardrails to avoid predatory algorithmic decisionmaking, among other things. It’s an interesting time, but the state preemption issue is one we can’t seem to shake. It seems like preemption is on the mind of this White House, in the absence of actually doing anything to protect consumers or to provide some certainty around AI legislation.
[00:04:42] CO-HOST DARRELL WEST: No, I think those are all terrific points. And I do think it’ll be interesting to watch the legal challenges. There are a number of states that already have started to regulate AI: California, New York, and other places. They’re clearly concerned. I’m personally worried about how Trump is trying to repeal federalism at a variety of different levels. We’re seeing this not just on AI, but in terms of crime, law enforcement, the use of the National Guard in cities across the country, immigration, and the use of federal grants, pulling federal grants back from states that are not Trump states. But the one thing I did notice about the Trump executive order is that even though he wanted to limit the ability of states to regulate AI, it was actually fairly narrowly crafted in the sense that it did create several exceptions. For example, there’s a lot of contentiousness over data centers in states and localities. States that have permitting rules, where data centers have to comply with the local regulations in order to move forward, were exempted from this rule. There are a number of more conservative states that have passed legislation regulating social media platforms; that was exempted. There are land use requirements in a number of communities across the country. So there were some things that were actually set aside. Trump realized he can’t just prevent states from doing anything, so I think he did try to craft it a little more narrowly than a lot of people realized, just to avoid the legal challenges, which he knew were going to pop up.
[00:06:22] CO-HOST NICOL TURNER LEE: And if I could actually also add in, when we had this conversation, maybe two or three months ago when the executive order came out, I think that, again, this ambiguity that was encased in how the White House was putting down these restrictions, like you said, allowed for some interpretation that could essentially cover them if they needed to be covered. But I was on LinkedIn, and I was reading a piece in Science Magazine that just came out by my friend and your friend Alondra Nelson. And I think she also pointed to another area that people have not quite talked about.
[00:07:02] And you just brought it up with all of the things that are happening across the world: that in this White House effort to be the AI star, there’s also an association with some of these global challenges that we’ve had, such as what’s going on with Venezuela and Greenland. Critical minerals are a huge part of this, Darrell, and I’m curious what you are thinking about this as well, that even in this AI plan of accelerating our competitiveness against China, some of these geopolitical concerns actually tie back to AI in many respects, right? And people are not talking about them as much as they should. And I think that’s what Alondra was trying to do in her article in Science Magazine, which is absolutely brilliant, by the way. Something I’ve also heard is that the interest in Greenland has a lot to do with critical minerals, but this is beyond my pay grade, Darrell. I know you’ve looked at a little bit of this. I’m just curious to see what you think about this as well.
[00:08:04] CO-HOST DARRELL WEST: Certainly the critical minerals aspect is an important part of a number of things that Trump is doing. People have said that is actually one of his interests in Greenland, that they actually do have some of the critical minerals that are needed for the whole digital revolution. The other comment that I would make on the executive order is that the other thing I worry about is, if you limit the states’ ability to regulate AI for whatever time period, three years, five years, or longer, it’s gonna be too late, because that revolution is accelerating so rapidly. If we don’t put any guardrails in place in the next three years, the world is gonna be so completely different. Three years from now or five years from now, these AI tools are gonna be embedded in everything we do and all the tools that we use online. It’ll be too late; it’ll be almost impossible to regulate at that point, which I think is what a lot of the tech companies would like. But as a consumer, I actually think that would not be a good approach to public policy.
[00:09:04] CO-HOST NICOL TURNER LEE: Yeah, no, I agree. I think it’s something that we should probably look a little bit more into as well. And I would just say the last thing, Darrell, that’s really consuming this AI action plan that the Trump administration has put out is data centers. And just a plug for everybody who’s listening: Darrell and I are data center deep. We have put out a paper that’s an explainer, and we have a new paper coming out shortly on what communities should be asking. But that has been, I think, another area that is predominantly showing up in the media cycle as one of the major concerns that is going to have to be addressed if we are gonna have the compute power, again, to outwit China on these issues.
[00:09:47] CO-HOST DARRELL WEST: So Nicol, AI is not the only topic likely to be important this year. We’re also seeing several states and localities ban cell phones in schools, and I know that you are doing work on technology and education. So I just want to ask you: are these bans a wise policy move, and why are states moving in this direction?
[00:10:09] CO-HOST NICOL TURNER LEE: Oh, Darrell, how can we do this? How can we go from AI to cell phones? What’s going on here? Obviously, cell phones have become deeply entrenched in our society. We have upwards of 98 to 99% penetration, even of the more modern smartphones versus a flip phone. We’ve seen this as a primary gateway to the internet for many people as well. And then on top of these benefits of smartphones, we’ve also seen the downsides, which are connected to these conversations on whether or not young people have just way too much access when it comes to screen time. And this is not a new conversation, honestly. This is a conversation that has been well researched and discussed. There have been many efforts over the years to talk about better hygiene when it came to the use of technology, and especially technology in classrooms. Now we’re seeing these national bans, which I find to be so interesting, where we want young people to leave their cell phones at the door when it comes to learning, which makes a lot of sense in some instances, because the research is quite real in terms of distractions as well as the
[00:11:26] way in which attention is deployed by students when they’re learning something new. But I still stand on the side that it’s really important for us to also recognize that some of these technologies are a lifeline for students in unsafe home environments; it is a way for them to get the type of help they need. When I was doing the research for my book, I found that in a community in Maricopa County, Arizona, young people under the age of 10 had cell phones simply to be able to track whether or not a family member was deported while they were in school. And in some instances, and this also jibes with my research, we’ve seen young people unable to buy a calculator or computer who have had to rely upon their cell phone to be able to do their homework. I think Governor Phil Murphy actually had a really good way of talking about the cell phone ban. He’s basically suggesting that we wanna get these screens out of the classroom, but we wanna do it on a case-by-case basis, giving more authority to the teachers to really think about ways in which it can be creatively used by students, or in areas where the students have critical need. For me, that sounds like a smarter solution and one that is less blunt. I wanna say this, and I might get a little heat on this, but bring it forward, I’m up for it. I do think that much of the cell phone ban and the technology ban conversation is being driven by middle-class families who have the ability to provide other choices to their kids. And I also think that much of the conversation around cell phone bans has a lot to do with this fear that young people will go on there and just do really bad things. And listen, I just remember my dad telling me about when his mom tried to ban the record player from his room because she was afraid he was gonna listen to Marvin Gaye.
To me, these are just the questions that commonly come up with technology. Instead of banning, perhaps we need more thoughtful responses: much more media literacy and digital literacy education, AI literacy education as well, and just better ways to equip parents to be able to respond to these things in meaningful ways. ’Cause I could tell you, Darrell, I think there were some things that were banned during my lifetime, and I still managed to use them and get access to them. And I think that’s really important with these cell phones, that it’s done in a very thoughtful manner, where we’re not really penalizing the technology per se, but perhaps our inability to do our job as a society when it comes to media literacy.
[00:14:13] CO-HOST DARRELL WEST: I think you’re right on that. Bans historically have hardly worked on any new technology, because people do find a way to use it. But I remember in your “Digitally Invisible” book, you also argued that in a number of underserved communities, both in rural areas as well as urban areas, schools often were the lifeline and the vehicle by which people got access to the digital world, because they didn’t have access at home. And so they needed to have access during the school day in order to do their homework, access electronic resources, and so on. The other thing that I worry about, and I know that there’s a lot of concern about the whole distraction issue in the classroom, which I think is a very real issue. I even talk to friends of mine who teach at the college level. They complain that college students are using their cell phones during class time as well and not really paying close attention to what is going on in that classroom. With teenagers in particular, there’s a lot of concern about the role of cell phones, and social media platforms in particular, on the mental health of those individuals. We all remember that’s a very formative and vulnerable age for a lot of people, especially young girls: body image problems and so on, just the whole set of mental health problems that seem endemic in that generation. People talk about loneliness; people feel isolated. And I do think that the cell phone issue is a reasonable one to raise, and we have to think about how to balance getting the advantages of cell phones in terms of access with some of the problems that we definitely see.
[00:15:59] CO-HOST NICOL TURNER LEE: Yeah, and I agree with that. I’m working on a piece with Josie Stewart, which is framed around Robert Putnam’s distinguished book “Bowling Alone,” where he argued the same thing in terms of loneliness in society coming from the erosion of collective cohesion, because people didn’t bowl in leagues anymore when the television came out, and people found themselves doing things singularly, by themselves, without community at hand. And for me, that makes a lot of sense in terms of television, and it makes a lot of sense in terms of new technology. But I also caution people: when the cell phone was just a gateway to the internet, I don’t think we saw as many of the challenges that we see today, as more of these applications have taken up space on these phones. So I always tell people, don’t always blame the phone, blame the applications, and be sure that people understand that many of these applications are filling the void of loneliness, that bowling-alone phenomenon that Robert Putnam described. And they are doing so against a backdrop where, in addition to the mental health concerns of young people, there have been reductions in mental health funding, the abandonment of Boys and Girls Clubs, and the increasing cost of getting your kid into some type of extracurricular sports activity.
[00:17:21] And so I really think that it’s a combination of us looking at society and the extent to which we want young people to have access to the technology so they can compete in this new labor economy, while at the same time we’re giving some guidance, maybe a public service campaign, or we’re ensuring that there’s some type of course that’s done as a requisite in the schools, but we’re really making this part of our values and norms on how to use these things in meaningful ways. Think about it: when a kid is learning to drive, probably every public school across the country requires a certain number of hours of driver’s education. We’ve also historically required education when it comes to reproductive health. We’re not doing that with cell phones or technology or generative AI. And I think that’s something I’d like to see happen in my lifetime within our schools, where we can start with some baseline learning.
[00:18:17] CO-HOST DARRELL WEST: Yeah, we definitely need digital literacy training programs just to help people adapt to the new digital world: what the advantages and benefits are, as well as the risks. And I think, especially with young people, that would be very important. On another topic, there are powerful video editing tools that have emerged on various platforms, including Grok, that alter images and have been used even to undress women, and many of course are completely outraged by this. So Nicol, I’m just curious, is the outrage that has emerged over this particular application enough to stimulate congressional action?
[00:18:55] CO-HOST NICOL TURNER LEE: Oh, I think so. I think we’ve had some bipartisan consensus, and I think this is gonna be a 2026 issue around non-consensual images being shared on the internet and being able to use these types of AI tools to manipulate bodies, as well as voice and opinion and all these other things. I think it’s really important that Congress begin to see that their lack of action on this issue, when it was very fundamental, the link to deepfakes, has now gotten out of control. We’ve essentially given tech companies loopholes for people to do just really horrible and disturbing things on the same platform that has the potential to be a problem solver. There was news recently that Grok is actually being embedded into the Department of Defense. I can tell you that I had some nightmares about potential applications based on what the public is actually utilizing that tool for. There are a lot, for example, of mandatory requirements that are being assessed on women in the military.
[00:20:05] There’s a lot of pushback on transgender people in the military. I can only imagine how Grok could be used not only as a tool to shame people based on these just disturbing requirements and pushbacks, but also, in many respects, to surveil people. And so I do think that one of the things that I’m trying to do in my research, and Darrell, you’ve done the same in your work with federal governments, is to really draw some fine lines between the commercial applications that are being used by government entities, including Signal, and the extent to which they have no regulatory safeguards at all in place to ensure that the public is protected, as well as the information that the government is stewarding, from these commercial products that are just being procured and used without any type of assessment of their security value.
[00:21:04] CO-HOST DARRELL WEST: This has become a tool for shaming prominent women. And there are just so many examples: almost any woman who has advanced to the top of the political world or the business world sometimes becomes the object of this type of degrading application. Clearly we need more legislation in this area. I did note that recently the people who run Grok claimed that their applications have to conform to state and/or national laws. But of course, the problem is we don’t have a national law that prohibits this type of activity. There are a few states that actually have passed legislation. I think Minnesota has, and I think a few other states have as well. But most states do not have legislation in this area, and we clearly don’t have that legislation at the national level. The company’s defense that they’re conforming to state or national laws doesn’t make any sense given the fact that they know we actually don’t have much legislation to prohibit this kind of behavior. But internationally, we are seeing other countries actually step up. The U.K. and the European Union have launched investigations into this particular area. So I do think companies need to be held accountable for this type of thing. There are lots of ways in which technology has created advantages for people, but there are so many clear risks and problems and outright disasters. We need some rules that put guardrails in place in those particular areas.
[00:22:39] CO-HOST NICOL TURNER LEE: Yeah, and I think it is such a different time, right? Because, Darrell, you remember, when it came to the federal government using many of these commercial products, there was such greater scrutiny, not just on the regulation side of it, but also just in practice. And I just don’t think that we’re seeing the same type of scrutiny. And when we do see these mistakes, as we saw with the use of Signal, for example, it’s brushed off as something that was just, oh, I didn’t know this was gonna happen, when there could be some really serious consequences. And like I said, I really do have some reservations about some of these more active social media tools being used by our government agencies, especially at a time when they’re being used against individuals. Come on, I hate to go down this path, but look at what’s happening with facial recognition technology as it’s being used around immigration and deportation efforts. Now we are seeing, in places like Minnesota, that somebody can have their photo taken and then that information could be read by third-party applications to determine whether or not that person should be in the United States. With that type of surveillance, I think in the last few years many of us have really thought about ways to execute this in responsible and fair ways, and the government modeling behavior that doesn’t emphasize those values can be very disturbing, I think, and could set a precedent for the future that we may not be able to return from.
[00:24:07] CO-HOST DARRELL WEST: Yeah, it is disturbing the way in which some of these facial recognition tools are being abused. You mentioned the case of using it to identify immigrants who may be subject to deportation, but it’s also being used to identify protesters.
[00:24:22] CO-HOST NICOL TURNER LEE: Yep.
[00:24:22] CO-HOST DARRELL WEST: ICE agents are out there with cameras; they’re videotaping people who are protesting ICE policies. And the fear is that facial recognition is being used to identify those protestors and subject them to arrest or possible legal intimidation. So that’s something that we’re certainly paying attention to. Now, I know there’s been interest in children’s safety issues in regard to technology. Congress has actually come close to passing legislation in this area but has not quite managed to push it over the finish line. Australia just banned social media usage for those under the age of 16. Nicol, will this be the year something actually moves in Congress in the United States?
[00:25:15] CO-HOST NICOL TURNER LEE: It’s gonna be interesting. In Australia, I just read this the other day, 5 million social media accounts of young people under the age of 16 were deactivated the day after the law went into effect, and we’re about a month in and they have continued to ban these accounts for young people, in ways that are not only surprising the tech companies, that they actually followed through on what they said, but also individual teens who are, I understand from reading an article, waking up saying, where is my social media platform? I think, again, this happened in the absence of any type of thoughtful, balanced conversation on how this technology evolves. I would say that we had plenty of time to address these issues. I still go back to when Mark Zuckerberg addressed a bunch of parents in Congress who sat in the audience, just very upset about the impact of these technologies on their children. And I think about the fact that we’ve had time to really consider where social media makes sense for young people, based on the various privacy bills that have been in limbo in Congress. With that being said, I think Australia is quickly finding out that these blanket bans do have disproportionate effects on teens in remote areas, where perhaps social media was the only means of communication because they didn’t have telephone service. They’re finding that it’s having an effect on children with disabilities and on people dealing with mental health challenges, who were using these social media platforms to be in community. And again, that’s the trade-off that Australia has made. The challenge that we have in the United States when it comes to social media is that the companies are still protected by Section 230. I think that there’s some appetite, but not a lot, to do a rewrite of that. There are still no data privacy laws in place that could potentially rein in some of the grievances that most people have about the type of content that young people see.
And even with the advances in age verification, one company’s effort to have people use facial recognition, for example, to verify whether or not they’re 16 and over, there are still concerns as to whether or not a potential pedophile can grab a young person’s phone, or a young person’s face for that matter, and use that as an entryway to still do the stuff that he or she does. So I think until we have a conversation around what data privacy looks like in this country, until we revisit the very strong legislation protecting tech companies when it comes to harm done to young people as a causal relationship with the technology that they are interacting with, and until we have more conversation on quelling and averting the use of data by third-party actors, it doesn’t matter what we do. As we said before, young people are still gonna find ways to get on it, and tech companies, unfortunately, are gonna find ways to find loopholes in the laws. And so I sit there, right? I don’t know what is going to be the next step of this conversation in the absence of children’s privacy protections.
[00:28:28] CO-HOST DARRELL WEST: We definitely need a data privacy law in the United States. Both you and I have called for that. Our colleague Cam Kerry has written extensively on this front as well. That would not solve all these problems, but it certainly would address some of the issues that we’ve raised in this podcast. Now, the other arena to watch may be at the local level, and specifically in regard to Mayor Mamdani of New York City. He was elected as a democratic socialist, and it’ll be interesting to watch how he handles this whole technology area and whether he puts some new guardrails in place for tech companies that operate within New York City, which is basically most of the tech companies. New York City is like the state of California in the sense that it’s such a big player that all the tech companies are operating there. And what California has discovered is that when it passes privacy laws or AI regulation laws, they almost become a de facto national standard, because companies don’t want to design different types of products for California than for other states. And so I’m interested in watching Mayor Mamdani just to see how he uses New York City as a testing ground for some of the ideas that are percolating. We should also note he is being advised by Lina Khan, the former chair of the Federal Trade Commission under President Biden. She’s been very active in calling for new guardrails on the tech sector. So I’m just curious, Nicol, what we might expect from New York City and how that might serve as an incubator of ideas for the national level.
[00:30:08] CO-HOST NICOL TURNER LEE: I’ll just say it like this: if any tech company was looking for a boogeyman, or boogeywoman, one is coming out of New York, right? Between New York City and what the governor of New York State has done as well, when it comes to cracking down on data brokers, establishing a data integrity task force, and thinking about AI content that’s used in elections, I am certain that the new mayor is going to follow in her footsteps and build on the legacy of what’s already been done in New York.
[00:30:38] When it comes to bans on facial recognition, or bans on the use of FRT in particular by law enforcement, or the way that New York has banned cell phones, they’ve also put some bans in place for ChatGPT in New York City public schools. So I do think that they are going to be an interesting example to watch, because, like California, the state of New York has managed over the years to get some stuff done, right, even if it’s at the state level. And I really think that the moratorium may be more targeted toward them than anybody else, as well as states like Illinois. But I think Mayor Mamdani is really going to think carefully, with his colleague Lina Khan, around consumer protection really geared toward public interest applications. Housing is a big issue for them. I could almost be certain that housing algorithms, tenant algorithms, anything that is being used as an algorithm to determine eligibility that does not comply with any new standard, will be scrutinized. And so if you just take every issue that this mayor has run on and you just apply technology, I think you’d better be prepared that there will be some type of legislative directive coming out of the city as well as the state of New York. And I also think that there’ll be some consumer backup, as he’s also put in place a variety of task forces to pay attention to these segments within the various verticals and more broadly in the community.
[00:32:11] CO-HOST DARRELL WEST: Yeah, the new mayor has already put together some task forces and advisors who are helping to move him in this direction, and I think you’re right. I would expect there to be some action calling for greater transparency from the tech sector, more disclosure about what is happening with a number of different applications. I actually just did an interview with a New York radio station, and they were talking about the use of facial recognition in retail outlets.
[00:32:41] CO-HOST NICOL TURNER LEE: Yep.
[00:32:42] CO-HOST DARRELL WEST: And the need for disclosure of that, so that when people walk into the store, they know that that software is being deployed. And we also need to pay attention to the data retention part of that, like how long the images are retained in the store databases. I also think the whole issue of bias and the impact of technology, how it affects women, various members of ethnic groups and minority groups, and other protected categories, could be an area where New York City does some interesting things. And then, of course, there’s the whole privacy area. Yes, in the absence of a national privacy bill, we’ve already seen some states adopt stronger privacy protections, and I could see New York moving in that direction as well.
[00:33:34] CO-HOST NICOL TURNER LEE: Yeah, and I wanna be clear, being a New Yorker myself, I do think that New York will find ways to incubate entrepreneurs and small businesses to utilize AI in ways that allow them to flourish. So I do see embedded in the mayor’s agenda an economic opportunity agenda as well, which, Darrell, I’m over here scratching my head thinking, hey, I think a blog may be coming out of this: how New York City under Mamdani might actually frame some of the future regulatory heft that I think many people wanna see, on the consumer protection side as well as on the opportunity side. Maybe I’ll start drafting that, my friend, as a post.
[00:34:20] CO-HOST DARRELL WEST: That would be a great topic. I think there would be a lot of interest in that. So Nicol, it has been great to be on the same podcast with you. I enjoyed our conversation and look forward to an exciting year coming up. And I wanna thank everyone for joining Tech Tank. We hope that you’ve enjoyed this conversation. Don’t forget to follow both our TechTank podcast as well as our TechTank blog for more details on many of the issues that we have discussed today. You can find us at brookings.edu. So thank you very much for tuning in.
[00:34:52] CO-HOST NICOL TURNER LEE: Yeah, thank you, and always a pleasure to be with my co-host. It’s gonna be a great year, 2026, lots and lots of issues to discuss. Thank you for listening to Tech Tank, a series of roundtable discussions and interviews with technology experts and policymakers. For more conversations like this, subscribe to the podcast and sign up to receive the TechTank newsletter for more research and analysis from the Center for Technology Innovation at Brookings.
The Brookings Institution is committed to quality, independence, and impact.
We are supported by a diverse array of funders. In line with our values and policies, each Brookings publication represents the sole views of its author(s).
Tech outlooks for 2026 | The TechTank Podcast
February 3, 2026