Lawyers should be using AI themselves
EdTech counsel Melodie Wiseman on why attorneys need to test the tools themselves
What’s the worst that could happen? As society barrels toward a full embrace of AI technology, Melodie Wiseman, the assistant general counsel at Stride, Inc., says the worst thing attorneys can do is sit back and watch. As someone who has worked in EdTech for more than 20 years, she spends a lot of time weighing the risk of innovation against the impact it may have on the most impressionable and vulnerable among us.
—Interview by Emily Kelchen, edited by Bianca Prieto
You’ve spent years working in a highly regulated, data-sensitive field. Now, the whole world seems to be embracing big data with few controls. How do you wrap your head around that?
We can no longer say "if this happens" or "when that happens." Employees, children and clients are using AI tools—whether sanctioned or not.
I noticed you didn’t say attorneys.
Lawyers should know how to use the popular AI products. They should regularly use those tools to understand the type and quality of information clients receive when they ask AI the same questions they ask us.
So, just get out there and experiment?
I may be a bit biased thanks to the industry I’m in, but I think relentless self-education is fundamental.
That also means I must constantly sharpen my own expertise, staying informed and evolving my knowledge so I can guide people rather than slow them down.
I actually use and test more products than I otherwise would because I want to understand how they function in real-world conditions. At the same time, I am plugged in to regulatory updates—especially around trust and safety developments. Responsibility and regulations remain a constant presence, even if they may not anticipate emerging technologies. They offer boundaries and principles that guide us.
If the regulatory framework isn’t quite there, what are you looking at to form an opinion about a product?
Lawyers should be paying attention to output reliability, responsible use and safety standards. I try not to approach it from a “No, because” perspective—I aim to approach it as a “Yes, if.” This makes risk management a more collaborative process.
When you are advising someone who is adopting or building a new tech tool, how do you communicate the balance between innovation and risk?
Innovation and risk management are not in conflict. They are complementary pillars of responsible product development and long-term scalability.
Safety and digital accessibility should be foundational design and programming considerations—not secondary concerns—especially for products used in schools or by minors. Incorporating these considerations from the onset provides innovators with a product they can rapidly scale.
If you could give one piece of advice to lawyers who want to be innovative, or advise innovators, what would it be?
Don’t be committed to being right. Be committed to understanding the law as it is today and predicting, as best you can, where regulations are headed.
