Artificial Intelligence: How to Start the Learning Process in Your School District … Safely

In this episode of the Digital Learning Today Podcast, Jeff welcomes Dr. Jayne Lammers, Director of Learning Design at Edmentum, to discuss how your school district can begin using Artificial Intelligence to support staff, students, and the community. If you are a new listener to TeacherCast, we would love to hear from you. Please visit our Contact Page and let us know how we can help you today!
In This Episode …
- How do we define the term “Generative AI”?
- Do we need to be worried about Privacy Agreements when using applications with AI?
- What questions should Tech Directors be asking EdTech Companies about their AI features?
- How will school districts be notified of changes in AI in apps?
- What switches will the district have to control AI features?
- What types of language should be in an AI policy?
- How to introduce AI to staff at a group meeting
- Start with a problem that is broad and discuss how AI can help solve it.
- Language Support for MLL students and families (claude.ai)
- Will AI ever replace teachers in the classroom?
- How to teach AI to students as a “thought buddy”
- The importance of sharing and reflecting after using AI so others can learn about it together
AI Applications Mentioned on the Podcast
Website Links Mentioned on the Podcast about AI
Articles Referenced
Follow Our Podcast And Subscribe
Follow Our Host
- Jeff Bradbury | @JeffBradbury
- TeacherCast | @TeacherCast
About our Guest:
Dr. Lammers began her education career as a middle and high school literacy teacher and found a passion for supporting teachers in meeting the needs of striving readers. She earned her Ph.D. in Curriculum & Instruction at Arizona State University and spent 15 years in higher education, preparing teachers for the challenges of today’s classrooms. Her research explored the intersection of students’ interests and technology’s affordances, aiming to make literacy instruction more meaningful and impactful. Now, as the Director of Learning Design at Edmentum, Dr. Lammers helps ensure that Edmentum's products, designed to accelerate learning, leverage research-based best practices and consider the realities of teachers’ work and students’ needs.
About Edmentum
Edmentum creates learning technology solutions designed to support educators and supplement existing curriculum with one goal in mind: positive student outcomes. They reach more than 43,000 schools, 420,000 educators, and 5.2 million students in all 50 states and more than 100 countries worldwide. Edmentum believes that every student deserves the opportunity to thrive everywhere learning occurs - whether they seek to catch up, stay on track, or chart their own path. When you pair Edmentum’s comprehensive, research-backed learning acceleration solutions with empowered and supported educators, you can change the direction of students’ lives.
Links of Interest
- Website: https://www.edmentum.com/
- Twitter: https://twitter.com/edmentum
- Facebook: https://www.facebook.com/Edmentum
- LinkedIn: https://www.linkedin.com/company/edmentum
- YouTube: https://www.youtube.com/user/edmentum
Join Our PLN
Are you enjoying the TeacherCast Network? Please share your thoughts with the world by commenting on Apple Podcasts today. I enjoy reading and sharing your comments on the podcast each week.
Let’s Work Together
- Host: Jeff Bradbury @TeacherCast | @JeffBradbury
- Email: info@teachercast.net
- Voice Mail: http://www.TeacherCast.net/voicemail
- YouTube: http://www.TeacherCast.net/YouTube
- iTunes: http://www.TeacherCast.net/iTunes
Check Out More TeacherCast Programming
- Educational Podcasting Today (http://www.educationalpodcasting.com)
- Ask The Tech Coach (http://www.AskTheTechCoach.com)
- EdTech in the Classroom (http://www.edtechintheclassroom.com)
Need A Presenter?
- Jeff Bradbury (@TeacherCast) is available as a Keynote Speaker, Presenter, or to Broadcast your conference LIVE!
00:00:03.782 --> 00:00:04.062
Hello,
00:00:04.243 --> 00:00:05.522
and welcome to the TeacherCast
00:00:05.644 --> 00:00:06.464
Educational Network.
00:00:06.484 --> 00:00:07.564
My name is Jeff Bradbury,
00:00:07.623 --> 00:00:09.605
and welcome to Digital Learning Today.
00:00:10.025 --> 00:00:10.845
On today's episode,
00:00:10.865 --> 00:00:11.605
we're going to be talking
00:00:11.804 --> 00:00:13.766
all about artificial intelligence.
00:00:14.326 --> 00:00:15.785
We're going to be defining what it is,
00:00:15.805 --> 00:00:17.227
how school districts are
00:00:17.466 --> 00:00:18.766
currently taking advantage of it,
00:00:19.106 --> 00:00:20.327
and how you can safely
00:00:20.588 --> 00:00:22.047
deploy it in your classroom.
00:00:22.347 --> 00:00:24.007
I have a fantastic guest on today,
00:00:24.428 --> 00:00:26.609
Dr. Jayne Lammers from Edmentum.
00:00:26.949 --> 00:00:28.190
Dr. Lammers, how are you today?
00:00:28.370 --> 00:00:29.289
Welcome to TeacherCast.
00:00:29.974 --> 00:00:30.795
Thank you, Jeff.
00:00:30.875 --> 00:00:31.736
I'm doing great.
00:00:31.815 --> 00:00:32.475
And I'm really looking
00:00:32.536 --> 00:00:33.737
forward to talking with you
00:00:33.798 --> 00:00:35.079
about this subject that
00:00:35.418 --> 00:00:36.981
everybody is talking about.
00:00:37.401 --> 00:00:39.402
I am so excited to have you on.
00:00:39.603 --> 00:00:39.823
You know,
00:00:39.862 --> 00:00:40.683
we've been talking about
00:00:40.743 --> 00:00:42.145
artificial intelligence here.
00:00:42.204 --> 00:00:43.447
It seems like forever.
00:00:44.408 --> 00:00:46.448
Every single time we have a guest come on,
00:00:46.869 --> 00:00:48.892
it's the topic that we have to bring up.
00:00:49.271 --> 00:00:50.613
But I'm excited about having
00:00:50.654 --> 00:00:51.555
you on today because we've
00:00:51.595 --> 00:00:52.597
never really had a chance
00:00:52.637 --> 00:00:55.000
to really start from ground zero,
00:00:55.081 --> 00:00:56.523
start from the beginning here.
00:00:57.064 --> 00:00:58.625
Before we get into those fun questions,
00:00:58.665 --> 00:00:59.487
tell us a little bit about
00:00:59.506 --> 00:01:00.429
yourself and what's
00:01:00.469 --> 00:01:01.731
happening these days at Edmentum.
00:01:02.911 --> 00:01:04.173
Well, thanks for that invitation.
00:01:04.313 --> 00:01:05.635
So, Jeff,
00:01:05.894 --> 00:01:09.298
I was a teacher educator for 15 years.
00:01:09.498 --> 00:01:10.819
I worked in higher education.
00:01:11.540 --> 00:01:13.162
I was a tenured professor at
00:01:13.201 --> 00:01:14.602
the University of Rochester.
00:01:14.623 --> 00:01:16.424
I was a Fulbright scholar
00:01:16.465 --> 00:01:17.766
who got to travel to
00:01:17.846 --> 00:01:19.328
Indonesia just before the
00:01:19.388 --> 00:01:20.709
pandemic and conduct
00:01:20.769 --> 00:01:22.490
research with a partner
00:01:22.531 --> 00:01:24.313
down there on the digital
00:01:24.332 --> 00:01:27.034
literacy practices of Indonesian youth.
00:01:28.557 --> 00:01:30.037
Had a lot of fun doing that
00:01:30.778 --> 00:01:33.700
and I ran an English
00:01:33.740 --> 00:01:35.123
teacher preparation program
00:01:35.162 --> 00:01:36.584
and also advised doctoral
00:01:36.623 --> 00:01:38.165
students and really enjoyed
00:01:38.286 --> 00:01:39.966
studying how young people
00:01:40.167 --> 00:01:41.709
use technology for their
00:01:41.909 --> 00:01:44.070
own interest driven learning purposes.
00:01:44.631 --> 00:01:46.552
So that's where I kind of came from.
00:01:46.572 --> 00:01:48.295
I was an English teacher
00:01:48.415 --> 00:01:49.936
prior to going into higher ed.
00:01:50.757 --> 00:01:51.977
But as with many folks,
00:01:52.097 --> 00:01:54.280
the pandemic changed things, right?
00:01:55.081 --> 00:01:57.662
And I wanted to be closer to family.
00:01:57.862 --> 00:02:00.245
I wanted to have greater impact.
00:02:00.305 --> 00:02:01.727
There was something about
00:02:02.266 --> 00:02:05.290
the pandemic and the remote
00:02:05.370 --> 00:02:07.131
emergency instruction that
00:02:07.212 --> 00:02:08.673
happened as a result that
00:02:08.872 --> 00:02:12.056
really put what I had been studying,
00:02:12.195 --> 00:02:13.477
which was how are people
00:02:13.637 --> 00:02:14.918
using technologies,
00:02:15.998 --> 00:02:17.579
into the forefront and into
00:02:17.598 --> 00:02:18.419
the conversation.
00:02:18.460 --> 00:02:19.719
And then an opportunity came
00:02:19.860 --> 00:02:21.620
up and I am now the
00:02:21.860 --> 00:02:24.580
Director of Learning Design at Edmentum.
00:02:25.420 --> 00:02:29.421
We are a K-12 digital curriculum provider.
00:02:29.921 --> 00:02:31.882
We aim to be the premier
00:02:31.961 --> 00:02:33.143
learning acceleration
00:02:33.242 --> 00:02:34.983
company that helps get
00:02:35.502 --> 00:02:37.423
young people all across the
00:02:37.462 --> 00:02:39.743
country and in countries around the world
00:02:40.484 --> 00:02:42.566
back up to speed and beyond
00:02:42.825 --> 00:02:43.507
in their learning.
00:02:43.546 --> 00:02:45.568
And we use technology to do
00:02:45.628 --> 00:02:46.911
that in a variety of ways.
00:02:46.991 --> 00:02:48.671
We have intervention programs.
00:02:48.752 --> 00:02:50.875
We also have fully online
00:02:50.935 --> 00:02:54.237
courses and a fully online academy.
00:02:54.699 --> 00:02:57.260
And so my job is to make
00:02:57.360 --> 00:02:58.862
sure that what we know
00:02:58.962 --> 00:03:00.724
about good learning is
00:03:00.805 --> 00:03:03.487
built into the design of our products.
00:03:04.931 --> 00:03:06.592
And when you say good learning,
00:03:06.612 --> 00:03:08.155
how do you define that,
00:03:08.615 --> 00:03:09.596
especially in 2024?
00:03:09.596 --> 00:03:11.918
What does good learning look like?
00:03:12.437 --> 00:03:13.558
What does good learning look like?
00:03:13.598 --> 00:03:13.919
Well,
00:03:14.580 --> 00:03:17.943
engagement is on top of mind for most
00:03:17.983 --> 00:03:19.525
of the educators I talk to
00:03:19.625 --> 00:03:21.066
and most of the school district folks.
00:03:21.186 --> 00:03:23.147
How do we actually get young
00:03:23.228 --> 00:03:25.931
people to be and remain
00:03:26.050 --> 00:03:27.491
engaged in the learning
00:03:27.532 --> 00:03:28.652
that happens in their
00:03:28.712 --> 00:03:29.995
formal schooling environments,
00:03:30.474 --> 00:03:32.116
especially after the years
00:03:32.257 --> 00:03:33.318
of disengagement?
00:03:34.199 --> 00:03:36.159
And what I would argue is
00:03:36.580 --> 00:03:38.081
couple that with all of the
00:03:38.141 --> 00:03:39.364
engagement that they get in
00:03:39.503 --> 00:03:40.884
other kinds of social media
00:03:41.324 --> 00:03:42.586
and other kinds of social
00:03:42.667 --> 00:03:43.527
learning spaces.
00:03:44.935 --> 00:03:47.679
Young people need engaging learning.
00:03:47.758 --> 00:03:48.520
So that's the first
00:03:48.560 --> 00:03:49.681
definition of good learning.
00:03:50.163 --> 00:03:51.223
It's engaging learning.
00:03:51.425 --> 00:03:52.967
It meets learners where they
00:03:53.187 --> 00:03:55.349
are and then helps to
00:03:55.670 --> 00:03:57.293
leverage what they already
00:03:57.353 --> 00:03:59.276
do know and get them to
00:03:59.395 --> 00:04:00.176
where they need to be.
00:04:01.344 --> 00:04:02.467
I love that definition.
00:04:03.207 --> 00:04:04.650
Just a few hours before we
00:04:04.669 --> 00:04:05.450
did this recording,
00:04:05.570 --> 00:04:06.953
I brought home a Google
00:04:06.973 --> 00:04:08.735
Sheets project that I'm
00:04:08.756 --> 00:04:09.516
going to be giving in my
00:04:09.556 --> 00:04:10.358
middle school soon.
00:04:11.118 --> 00:04:12.662
And my middle school kids
00:04:12.681 --> 00:04:14.463
know that before they get any assignment,
00:04:14.503 --> 00:04:16.627
it has to pass a series of three tests.
00:04:17.168 --> 00:04:18.189
And those tests, of course,
00:04:18.228 --> 00:04:18.870
are my triplets.
00:04:19.269 --> 00:04:19.730
So tonight,
00:04:19.769 --> 00:04:20.990
my triplets were doing these
00:04:21.031 --> 00:04:22.012
Google Sheets homework.
00:04:22.612 --> 00:04:23.413
They're in fourth grade,
00:04:23.432 --> 00:04:24.634
but they were doing the
00:04:24.673 --> 00:04:25.654
middle school level work.
00:04:26.136 --> 00:04:27.216
And just as you were saying,
00:04:27.416 --> 00:04:28.737
meet the kids where they are,
00:04:28.817 --> 00:04:30.238
give them something engaging,
00:04:30.779 --> 00:04:33.422
and just sit back and watch what happens.
00:04:33.802 --> 00:04:35.083
Edmentum, of course, is in 43,000 schools,
00:04:35.103 --> 00:04:35.624
hitting 420,000 educators
00:04:35.644 --> 00:04:36.564
and 5.2 million students in
00:04:36.624 --> 00:04:36.944
all 50 states.
00:04:43.550 --> 00:04:45.211
I'm looking forward to this conversation.
00:04:45.230 --> 00:04:46.350
You want to just dive right into this?
00:04:46.891 --> 00:04:47.391
Absolutely.
00:04:47.451 --> 00:04:47.730
Let's go.
00:05:04.516 --> 00:05:05.338
Some people think that
00:05:05.377 --> 00:05:08.300
artificial intelligence is only ChatGPT.
00:05:08.480 --> 00:05:09.821
We've got different terms, right?
00:05:09.860 --> 00:05:11.362
We've got generative AI.
00:05:11.442 --> 00:05:12.882
We've got design AI.
00:05:12.942 --> 00:05:15.245
We've got text-based AI.
00:05:15.305 --> 00:05:17.826
We've got AI in different
00:05:17.886 --> 00:05:19.346
products like Canva and
00:05:19.427 --> 00:05:21.369
Adobe and Microsoft and all
00:05:21.389 --> 00:05:22.350
of these different things.
00:05:22.850 --> 00:05:23.930
So I'm going to ask you,
00:05:23.951 --> 00:05:24.831
I'm going to put you on the
00:05:24.870 --> 00:05:25.591
hot seat here.
00:05:26.271 --> 00:05:27.552
Millions of educators have
00:05:27.593 --> 00:05:28.894
just stopped their cars and
00:05:28.934 --> 00:05:29.935
pulled over to the side.
00:05:30.555 --> 00:05:30.855
Jayne?
00:05:31.716 --> 00:05:35.000
How do you define the term generative AI?
00:05:36.201 --> 00:05:38.625
When I'm talking about generative AI,
00:05:38.646 --> 00:05:41.670
I am talking about any of
00:05:41.790 --> 00:05:44.733
the tools that will generate
00:05:45.918 --> 00:05:48.021
new content because they are
00:05:48.182 --> 00:05:49.302
using the artificial
00:05:49.362 --> 00:05:50.524
intelligence that they have
00:05:50.543 --> 00:05:54.588
been programmed with to look for patterns,
00:05:55.028 --> 00:05:57.911
to pull from whatever large
00:05:57.971 --> 00:05:59.372
language model usually,
00:05:59.574 --> 00:06:01.636
so whatever big batch of
00:06:01.716 --> 00:06:05.639
data they were given, to give the user
00:06:06.961 --> 00:06:09.502
a response or a creation,
00:06:09.523 --> 00:06:10.322
because it could be
00:06:10.442 --> 00:06:11.624
image-based if we're
00:06:11.663 --> 00:06:12.704
talking about something
00:06:12.744 --> 00:06:16.286
like DALL-E or Midjourney or in Canva,
00:06:16.545 --> 00:06:18.666
could be design-based, right?
00:06:19.067 --> 00:06:20.827
But it gives the user what
00:06:20.908 --> 00:06:22.548
it thinks the user wants.
00:06:23.709 --> 00:06:25.370
So that's important to know
00:06:25.891 --> 00:06:28.192
that it provides what it
00:06:28.291 --> 00:06:29.612
thinks you want based on
00:06:29.653 --> 00:06:31.173
the prompt you gave it and
00:06:31.233 --> 00:06:34.815
based on its training and in its model.
00:06:35.891 --> 00:06:37.411
That sounds like my triplets.
00:06:37.451 --> 00:06:38.572
Let me see if we can get a
00:06:38.632 --> 00:06:39.291
couple of things here.
00:06:39.992 --> 00:06:44.713
If I go on to a Google search and I say,
00:06:44.733 --> 00:06:46.994
I need a recipe for cookies.
00:06:47.593 --> 00:06:49.194
Is that generative AI?
00:06:49.274 --> 00:06:50.274
I'm putting in a prompt.
00:06:50.714 --> 00:06:51.716
It's giving me something.
00:06:52.036 --> 00:06:53.156
Is that generative AI?
00:06:53.762 --> 00:06:55.463
No, that's a search, right?
00:06:55.644 --> 00:06:57.245
And artificial intelligence
00:06:57.305 --> 00:06:58.766
might be involved in the search,
00:06:59.086 --> 00:07:01.086
but that's not necessarily generative AI.
00:07:01.146 --> 00:07:02.608
We'll say like when you're
00:07:02.668 --> 00:07:03.649
starting to type,
00:07:04.168 --> 00:07:05.670
give me a recipe and you
00:07:05.730 --> 00:07:07.850
see all the stuff that pops up below it,
00:07:08.350 --> 00:07:09.732
that is generative AI
00:07:09.752 --> 00:07:11.012
because here it is using
00:07:11.052 --> 00:07:12.713
its training to make a
00:07:12.754 --> 00:07:14.014
prediction to give you what
00:07:14.055 --> 00:07:15.314
it thinks you want, right?
00:07:15.435 --> 00:07:16.536
But it's just filling in the
00:07:16.596 --> 00:07:17.456
search right there.
00:07:19.098 --> 00:07:20.459
But what I'm talking about
00:07:20.500 --> 00:07:21.240
when I talk about
00:07:21.281 --> 00:07:23.382
generative AI is when
00:07:23.401 --> 00:07:24.742
you're using a tool like
00:07:24.843 --> 00:07:26.142
Microsoft Copilot,
00:07:26.262 --> 00:07:27.843
and any of the schools that
00:07:28.165 --> 00:07:29.785
are on Microsoft tools,
00:07:30.245 --> 00:07:32.307
they likely now have access,
00:07:32.466 --> 00:07:34.108
whether they use it or ignore it,
00:07:34.747 --> 00:07:36.428
to Copilot,
00:07:36.749 --> 00:07:39.091
to have a chat feature that
00:07:39.211 --> 00:07:41.291
they can put in and ask a question.
00:07:41.312 --> 00:07:44.233
And when you ask that program a question,
00:07:45.293 --> 00:07:46.595
Unlike a Google search,
00:07:46.975 --> 00:07:48.714
you'll get a different sort
00:07:48.754 --> 00:07:50.216
of generative answer.
00:07:50.315 --> 00:07:52.276
You'll get a text-based answer,
00:07:52.716 --> 00:07:53.956
often with different sources.
00:07:54.377 --> 00:07:55.798
What Google gives you is a
00:07:55.978 --> 00:07:58.798
list of what it thinks are
00:07:58.879 --> 00:08:01.199
your most likely or best
00:08:01.319 --> 00:08:04.661
paid-for choices to answer your question.
00:08:04.841 --> 00:08:06.242
And you then have to go out
00:08:06.302 --> 00:08:07.343
and look at the site that
00:08:07.382 --> 00:08:08.062
it links you to.
00:08:08.742 --> 00:08:10.163
What the generative AI tool
00:08:10.204 --> 00:08:11.324
will do with your question
00:08:11.464 --> 00:08:13.706
is it will create text that
00:08:13.747 --> 00:08:15.228
it thinks answers your question,
00:08:15.307 --> 00:08:17.050
pulling and synthesizing
00:08:17.449 --> 00:08:18.771
from a variety of sources.
00:08:19.552 --> 00:08:19.891
And, you know,
00:08:19.932 --> 00:08:21.052
while you're giving me that
00:08:21.132 --> 00:08:22.533
amazing answer, of course,
00:08:22.593 --> 00:08:24.935
I go on to Microsoft Copilot and I say,
00:08:25.615 --> 00:08:26.896
start a knock knock joke.
00:08:26.997 --> 00:08:28.358
And of course, it says knock knock.
00:08:29.175 --> 00:08:29.754
Who's there?
00:08:30.375 --> 00:08:30.834
Banana.
00:08:31.295 --> 00:08:32.495
So this is where artificial
00:08:32.556 --> 00:08:33.576
intelligence is, right?
00:08:34.775 --> 00:08:36.817
If I go into these different programs,
00:08:36.856 --> 00:08:38.256
we know that there's, as you mentioned,
00:08:38.317 --> 00:08:40.697
a variety of different kinds.
00:08:40.977 --> 00:08:43.577
I think the two biggies that are out there,
00:08:44.118 --> 00:08:47.698
ChatGPT and Microsoft Copilot.
00:08:48.019 --> 00:08:48.798
And I would even throw a
00:08:48.879 --> 00:08:49.600
third one in there.
00:08:49.879 --> 00:08:52.100
Google has their Bard,
00:08:52.220 --> 00:08:53.841
they're now calling it Gemini, right?
00:08:54.160 --> 00:08:54.400
Right.
00:08:54.640 --> 00:08:54.780
And...
00:08:56.380 --> 00:08:58.542
The scary part is these
00:08:58.642 --> 00:09:00.043
things are now starting to
00:09:00.082 --> 00:09:02.043
be turned on at the admin
00:09:02.063 --> 00:09:04.384
level for school districts.
00:09:05.445 --> 00:09:07.686
This is not the AI world and
00:09:07.765 --> 00:09:09.826
then the school world.
00:09:09.947 --> 00:09:11.447
Microsoft is now every
00:09:11.508 --> 00:09:13.288
single day putting out videos going,
00:09:13.749 --> 00:09:16.049
here is Copilot with PowerPoint.
00:09:16.090 --> 00:09:17.630
Here is Copilot with Outlook.
00:09:18.270 --> 00:09:19.792
I got to be transparent.
00:09:19.892 --> 00:09:20.971
I'm personally one of those
00:09:21.072 --> 00:09:22.173
ones that are paying 30
00:09:22.173 --> 00:09:23.833
bucks a month for Copilot.
00:09:24.614 --> 00:09:25.575
I love it.
00:09:26.075 --> 00:09:27.677
I love the fact that I can sit,
00:09:27.736 --> 00:09:28.397
if nothing else,
00:09:28.397 --> 00:09:30.359
30 bucks a month is paying
00:09:30.438 --> 00:09:31.899
for me to look at a strand
00:09:31.919 --> 00:09:34.201
of emails and have it read
00:09:34.241 --> 00:09:35.683
the emails and give me like
00:09:35.724 --> 00:09:38.326
a three sentence synopsis
00:09:38.745 --> 00:09:40.687
of what the entire email thread is.
00:09:40.707 --> 00:09:43.049
I absolutely love that.
00:09:43.682 --> 00:09:45.144
Yeah, it's a time saver.
00:09:45.325 --> 00:09:46.525
Huge time saver.
00:09:46.947 --> 00:09:48.628
I'm still trying to figure things out.
00:09:48.729 --> 00:09:49.789
Last night I was doing a
00:09:49.870 --> 00:09:50.850
chat with a friend who was
00:09:50.890 --> 00:09:52.873
at a Rangers game and I said, you know,
00:09:53.153 --> 00:09:54.674
please take this picture
00:09:54.855 --> 00:09:55.956
and put it in a Rangers
00:09:55.996 --> 00:09:57.437
jersey and put the Stanley Cup.
00:09:57.999 --> 00:09:58.538
I was doing the
00:09:58.578 --> 00:10:01.342
designer.microsoft.com thing and
00:10:02.503 --> 00:10:04.024
We were just having fun with it, right?
00:10:04.283 --> 00:10:04.504
Right.
00:10:05.284 --> 00:10:08.086
And let me stop you there, Jeff,
00:10:08.187 --> 00:10:10.028
because what you're doing
00:10:10.769 --> 00:10:12.769
is exactly what I'm trying
00:10:12.809 --> 00:10:15.751
to advocate for to school leaders.
00:10:16.091 --> 00:10:17.173
You're playing with it.
00:10:17.293 --> 00:10:18.673
You're getting your hands in there.
00:10:18.714 --> 00:10:20.816
You're trying different use cases.
00:10:21.115 --> 00:10:22.096
The use case may be
00:10:22.177 --> 00:10:23.177
entertaining your friend.
00:10:23.496 --> 00:10:25.298
The use case may be digging
00:10:25.318 --> 00:10:25.759
through your email
00:10:25.798 --> 00:10:27.559
and saving yourself time.
00:10:27.899 --> 00:10:29.341
The use case may be for our
00:10:29.380 --> 00:10:30.601
teacher friends listening,
00:10:31.123 --> 00:10:32.503
designing a lesson plan or
00:10:32.563 --> 00:10:35.065
giving a student sample to
00:10:35.264 --> 00:10:36.865
meet the needs of their students, right?
00:10:37.226 --> 00:10:39.447
You've spent time to play
00:10:39.687 --> 00:10:41.590
and figure out where it
00:10:41.750 --> 00:10:43.110
could be useful for you.
00:10:43.951 --> 00:10:45.091
And that's what we're
00:10:45.211 --> 00:10:46.751
advocating for our
00:10:47.251 --> 00:10:48.491
education partners to do.
00:10:49.072 --> 00:10:50.293
So at Edmentum,
00:10:50.552 --> 00:10:52.052
we ran a series of
00:10:52.113 --> 00:10:54.092
experiments to try to
00:10:54.253 --> 00:10:56.594
figure out how would we want to advise,
00:10:56.734 --> 00:10:57.833
especially last summer,
00:10:58.014 --> 00:10:59.414
everybody was talking about it.
00:10:59.855 --> 00:11:01.355
School districts had shut it down.
00:11:01.735 --> 00:11:02.855
We're trying to figure out
00:11:03.014 --> 00:11:05.076
what we could suggest to
00:11:05.296 --> 00:11:07.096
our partners, and so we
00:11:07.136 --> 00:11:08.136
got in there and started
00:11:08.197 --> 00:11:09.658
running experiments and
00:11:09.738 --> 00:11:11.077
that's the kind of thing
00:11:11.118 --> 00:11:12.457
that we learned is that
00:11:12.859 --> 00:11:14.379
teachers need or school
00:11:14.399 --> 00:11:15.899
district leaders need to
00:11:16.200 --> 00:11:17.360
get their hands in it, try
00:11:17.399 --> 00:11:18.500
different tools, see how
00:11:18.541 --> 00:11:19.961
they work, so they can
00:11:20.041 --> 00:11:21.522
figure out where it might
00:11:21.562 --> 00:11:22.381
be useful for them.
00:11:23.673 --> 00:11:24.615
I'd like to have this
00:11:24.695 --> 00:11:26.034
conversation from a couple
00:11:26.134 --> 00:11:27.075
different chairs.
00:11:27.696 --> 00:11:28.395
I'll try to tell you the
00:11:28.436 --> 00:11:29.596
chair I'm doing the question from.
00:11:29.635 --> 00:11:29.936
Right now,
00:11:29.956 --> 00:11:30.956
I want to do this from the tech
00:11:30.976 --> 00:11:31.636
director chair.
00:11:32.297 --> 00:11:33.238
When I'm working with a
00:11:33.278 --> 00:11:34.918
company and they say they
00:11:34.977 --> 00:11:37.339
now are using artificial intelligence,
00:11:38.178 --> 00:11:39.340
I know as a tech director,
00:11:40.220 --> 00:11:41.740
I need to have a privacy
00:11:41.780 --> 00:11:43.042
agreement signed with that
00:11:43.121 --> 00:11:45.283
company for my users to log in.
00:11:45.844 --> 00:11:47.403
Do I also need to ask
00:11:47.504 --> 00:11:49.725
questions such as where is
00:11:49.806 --> 00:11:51.767
that company getting their
00:11:51.826 --> 00:11:53.107
artificial intelligence
00:11:53.368 --> 00:11:54.769
originally sourced from?
00:11:55.129 --> 00:11:57.129
And do I need to worry about
00:11:57.190 --> 00:11:58.490
having a privacy agreement
00:11:58.530 --> 00:11:59.511
with that source?
00:12:01.942 --> 00:12:03.826
I think what you're hitting
00:12:03.926 --> 00:12:05.948
on with that question there, Jeff,
00:12:06.009 --> 00:12:08.172
that is on the minds of
00:12:08.272 --> 00:12:10.894
every tech director and the
00:12:10.955 --> 00:12:12.918
legal folks in districts, right,
00:12:13.359 --> 00:12:15.782
is how do the data that get
00:12:15.902 --> 00:12:17.945
put into an AI get used?
00:12:18.945 --> 00:12:21.067
So one of the benefits of
00:12:21.226 --> 00:12:23.769
using Microsoft Copilot, for example,
00:12:24.370 --> 00:12:26.231
is the way that it's
00:12:26.312 --> 00:12:28.693
attached to any enterprise
00:12:28.833 --> 00:12:30.975
is it protects the privacy of the data.
00:12:31.116 --> 00:12:33.538
That data doesn't get fed into the model.
00:12:33.918 --> 00:12:34.899
But the important thing for
00:12:34.960 --> 00:12:37.221
teachers to know, if, for example,
00:12:37.282 --> 00:12:39.303
the only generative AI that
00:12:39.344 --> 00:12:40.745
they think of is ChatGPT,
00:12:42.427 --> 00:12:43.307
What they need to know is
00:12:43.347 --> 00:12:44.889
that ChatGPT will take
00:12:45.110 --> 00:12:46.392
anything that you input
00:12:46.511 --> 00:12:48.433
into it and it starts to
00:12:48.614 --> 00:12:49.975
use it to train the model.
00:12:50.755 --> 00:12:52.378
So the question from a tech
00:12:52.418 --> 00:12:54.921
director seat is probably yes.
00:12:55.100 --> 00:12:56.962
They need to figure out where...
00:12:58.432 --> 00:13:01.874
what a company is using and which data,
00:13:02.033 --> 00:13:03.615
like whether or not the data gets shared,
00:13:04.176 --> 00:13:05.235
you're safer if they're
00:13:05.275 --> 00:13:06.756
using Microsoft Copilot.
00:13:07.157 --> 00:13:10.019
And there's also almost always,
00:13:10.178 --> 00:13:10.860
as I've seen it,
00:13:11.000 --> 00:13:12.620
data sharing agreements or
00:13:12.780 --> 00:13:15.442
not that protect the privacy.
00:13:15.522 --> 00:13:17.302
So even as an ed tech company,
00:13:18.364 --> 00:13:20.004
all of the same rules and
00:13:20.044 --> 00:13:22.986
regulations for protecting
00:13:23.086 --> 00:13:25.207
student data apply to us as
00:13:25.268 --> 00:13:26.509
they do to a school district.
00:13:26.849 --> 00:13:28.953
So we can't use and share
00:13:29.094 --> 00:13:31.980
and email and feed into a
00:13:32.201 --> 00:13:34.004
system any student data.
00:13:35.304 --> 00:13:36.586
When a tech director is
00:13:36.686 --> 00:13:38.427
looking at an application
00:13:38.927 --> 00:13:40.488
or when an application
00:13:40.548 --> 00:13:42.288
comes to a tech director that says, hey,
00:13:42.349 --> 00:13:44.150
now we have this extra thing on us,
00:13:44.971 --> 00:13:46.792
what questions should a
00:13:47.513 --> 00:13:49.134
tech director be asking of
00:13:49.173 --> 00:13:50.894
their ed tech partners when
00:13:50.934 --> 00:13:53.996
it comes to the topic of AI, AI features,
00:13:54.177 --> 00:13:56.258
perhaps can I turn the AI
00:13:56.317 --> 00:13:57.599
features on and off on my
00:13:57.678 --> 00:13:59.019
side or are they now just a
00:13:59.100 --> 00:14:00.039
part of this world?
00:14:00.400 --> 00:14:01.561
What questions should school
00:14:01.581 --> 00:14:02.461
districts be asking?
00:14:03.336 --> 00:14:03.436
Well,
00:14:03.475 --> 00:14:05.017
I think lots of people are asking
00:14:05.216 --> 00:14:08.698
questions around age restrictions.
00:14:09.080 --> 00:14:10.559
So those keep changing.
00:14:10.580 --> 00:14:13.162
I would also,
00:14:14.023 --> 00:14:16.183
speaking of the idea of changing,
00:14:16.724 --> 00:14:19.405
this landscape and these
00:14:19.466 --> 00:14:21.726
technologies are ever changing, right?
00:14:22.128 --> 00:14:24.009
All of the models keep getting updated.
00:14:24.489 --> 00:14:27.671
So I might want to ask if I
00:14:27.691 --> 00:14:28.672
were a tech director,
00:14:29.884 --> 00:14:31.504
How will I be notified of
00:14:31.585 --> 00:14:32.965
changes to the model?
00:14:34.706 --> 00:14:37.147
I think it is a good question to ask.
00:14:37.346 --> 00:14:39.729
Is there a way to limit access,
00:14:40.369 --> 00:14:42.149
turn features on and off?
00:14:44.585 --> 00:14:45.644
The other thing to note
00:14:45.965 --> 00:14:47.566
about the perspective that
00:14:47.785 --> 00:14:50.626
I bring from Edmentum is
00:14:50.706 --> 00:14:53.488
that we're not putting AI
00:14:53.687 --> 00:14:56.068
into our products at this point.
00:14:56.828 --> 00:14:58.528
We've taken a more kind of
00:14:58.609 --> 00:14:59.789
cautious approach.
00:15:00.549 --> 00:15:02.770
We're absolutely looking at
00:15:02.990 --> 00:15:04.230
use cases for our own
00:15:04.291 --> 00:15:06.331
efficiencies and the work
00:15:06.351 --> 00:15:08.033
that we need to do to create things.
00:15:08.572 --> 00:15:11.453
But when it comes to in our products,
00:15:12.173 --> 00:15:12.494
No.
00:15:12.974 --> 00:15:13.414
Rather,
00:15:13.634 --> 00:15:15.075
what we're doing is trying to
00:15:15.154 --> 00:15:16.576
figure out how to help
00:15:16.855 --> 00:15:18.657
teachers who use our
00:15:18.697 --> 00:15:21.538
products think about when
00:15:21.577 --> 00:15:23.558
and if or how students
00:15:23.698 --> 00:15:26.900
might use AI to complete assignments,
00:15:27.380 --> 00:15:28.160
what they should worry
00:15:28.221 --> 00:15:30.322
about or not when it comes to that,
00:15:30.662 --> 00:15:31.462
and how to have the
00:15:31.543 --> 00:15:32.582
teachers find their own
00:15:32.623 --> 00:15:34.384
efficiencies with AI in
00:15:34.443 --> 00:15:36.184
terms of using our products.
00:15:37.072 --> 00:15:37.972
I wanna throw one more
00:15:38.033 --> 00:15:38.793
question at you from the
00:15:38.833 --> 00:15:39.634
tech director chair,
00:15:39.673 --> 00:15:40.375
and this is a biggie.
00:15:40.894 --> 00:15:43.076
And there are spreadsheets
00:15:43.177 --> 00:15:44.278
running around the internet
00:15:44.317 --> 00:15:46.679
right now that have all of
00:15:46.720 --> 00:15:47.660
this information,
00:15:48.501 --> 00:15:49.322
but I think there's a lot
00:15:49.363 --> 00:15:50.403
that's premature right now.
00:15:51.556 --> 00:15:53.136
Do you have any recommendations?
00:15:53.197 --> 00:15:54.778
I know you're not legal counsel, right?
00:15:55.158 --> 00:15:56.100
But do you have any
00:15:56.179 --> 00:15:57.900
recommendations on language
00:15:58.241 --> 00:16:00.864
that should be in or things
00:16:00.923 --> 00:16:02.625
that should be in some kind
00:16:02.664 --> 00:16:05.668
of an official board doc AI policy?
00:16:06.248 --> 00:16:07.830
I know school districts are jumping in,
00:16:07.870 --> 00:16:09.311
but they don't have an AI policy.
00:16:09.350 --> 00:16:10.432
Some school districts are saying,
00:16:10.831 --> 00:16:11.773
why do I need one?
00:16:11.793 --> 00:16:12.572
And then there's some school
00:16:12.592 --> 00:16:14.195
districts that are making
00:16:14.235 --> 00:16:16.256
a document that has everything in there.
00:16:16.876 --> 00:16:17.236
Do you have any
00:16:17.256 --> 00:16:18.437
recommendations or do you
00:16:18.457 --> 00:16:19.898
have a chance to see what
00:16:19.918 --> 00:16:20.798
other school districts and
00:16:20.818 --> 00:16:21.418
stuff are doing?
00:16:22.359 --> 00:16:24.299
So I've had a couple of
00:16:24.379 --> 00:16:25.620
opportunities to see what
00:16:25.660 --> 00:16:27.081
other school districts are doing.
00:16:28.802 --> 00:16:29.662
One of the things I do on
00:16:29.701 --> 00:16:31.163
the side — I
00:16:31.202 --> 00:16:32.663
couldn't leave academia altogether.
00:16:32.722 --> 00:16:34.043
So I still teach an
00:16:34.124 --> 00:16:35.244
instructional technology
00:16:35.303 --> 00:16:37.024
course at the University of Pennsylvania.
00:16:37.345 --> 00:16:38.426
And I taught it last fall.
00:16:38.905 --> 00:16:40.046
And the course gets taught
00:16:40.206 --> 00:16:41.346
to school leaders.
00:16:41.586 --> 00:16:43.268
And so I had a cohort of 20
00:16:43.268 --> 00:16:44.469
something school leaders.
00:16:44.808 --> 00:16:45.568
And of course,
00:16:45.708 --> 00:16:48.090
in an instructional technology module,
00:16:48.190 --> 00:16:48.890
we were talking about
00:16:48.931 --> 00:16:50.831
generative AI and policy.
00:16:50.871 --> 00:16:52.493
So I got to see some of the
00:16:52.533 --> 00:16:57.595
policies that those folks — my students,
00:16:57.674 --> 00:16:59.155
but in their day jobs — were
00:16:59.235 --> 00:17:00.356
creating in their districts.
00:17:00.736 --> 00:17:02.258
I've also been following
00:17:02.418 --> 00:17:03.818
what New York City schools
00:17:04.019 --> 00:17:05.559
have been doing, right?
00:17:05.680 --> 00:17:07.020
They were one of the first
00:17:07.201 --> 00:17:09.242
school districts to ban
00:17:09.883 --> 00:17:11.763
ChatGPT when it first came out,
00:17:11.825 --> 00:17:12.625
when everybody was trying
00:17:12.644 --> 00:17:13.425
to figure things out.
00:17:13.865 --> 00:17:15.366
And now we see that they
00:17:15.406 --> 00:17:16.488
have come around and
00:17:17.048 --> 00:17:18.689
created a more thoughtful approach.
00:17:18.729 --> 00:17:20.650
They've got a group working
00:17:20.769 --> 00:17:22.151
on it and they're trying to
00:17:22.270 --> 00:17:23.330
make things public.
00:17:23.371 --> 00:17:24.571
So I think if I were your
00:17:24.612 --> 00:17:26.873
tech director and I was in that chair,
00:17:27.373 --> 00:17:28.634
what I would do is I would
00:17:28.653 --> 00:17:30.334
probably go look to some of
00:17:30.354 --> 00:17:31.454
the bigger districts who
00:17:31.515 --> 00:17:33.016
have the resources and the
00:17:33.056 --> 00:17:34.276
money and the manpower to
00:17:34.316 --> 00:17:36.297
be thinking about this more deeply.
00:17:36.798 --> 00:17:37.938
And I would look to see what
00:17:37.978 --> 00:17:40.598
their current policy is and
00:17:40.679 --> 00:17:41.940
see what language might
00:17:42.039 --> 00:17:43.740
need to be included in my
00:17:43.820 --> 00:17:45.662
own district's policy.
00:17:47.045 --> 00:17:48.486
So let's take that hat off
00:17:48.526 --> 00:17:49.788
for a second here and let's
00:17:49.827 --> 00:17:51.288
put on the coaching hat or
00:17:51.308 --> 00:17:52.430
the curricular hat, right?
00:17:52.911 --> 00:17:53.991
One of the questions and
00:17:54.672 --> 00:17:55.613
topics that have come up on
00:17:55.653 --> 00:17:57.234
our Ask the Tech Coach show has been,
00:17:57.734 --> 00:17:59.856
how do you introduce this
00:17:59.978 --> 00:18:01.939
concept to teachers, right?
00:18:03.019 --> 00:18:04.721
We think of this as the calculator, right?
00:18:04.801 --> 00:18:06.584
Teachers are saying you can't use it,
00:18:06.644 --> 00:18:07.924
you can't use it, you can't use it,
00:18:07.964 --> 00:18:09.446
but now everybody has a calculator.
00:18:11.993 --> 00:18:13.355
There's so many coaches out
00:18:13.395 --> 00:18:14.938
there right now that are
00:18:15.398 --> 00:18:16.461
jumping in and saying,
00:18:16.500 --> 00:18:17.782
can I have 20 minutes at a
00:18:17.823 --> 00:18:19.664
faculty meeting just to put
00:18:19.704 --> 00:18:20.666
that first toe in,
00:18:21.126 --> 00:18:23.049
just to have that conversation?
00:18:24.031 --> 00:18:25.814
Even myself as a technology teacher,
00:18:25.974 --> 00:18:26.976
I want to try this.
00:18:27.676 --> 00:18:29.739
But I don't want to teach my
00:18:29.838 --> 00:18:30.599
kids something.
00:18:30.640 --> 00:18:31.601
I feel weird saying this.
00:18:31.862 --> 00:18:32.642
I don't want to teach my
00:18:32.682 --> 00:18:34.463
kids something that my
00:18:34.523 --> 00:18:35.826
colleagues are going to be
00:18:35.986 --> 00:18:37.968
uncomfortable with them knowing.
00:18:38.509 --> 00:18:38.689
Right.
00:18:39.169 --> 00:18:39.430
Right?
00:18:39.809 --> 00:18:41.491
So all of that being said,
00:18:41.511 --> 00:18:43.193
if you were somebody who
00:18:43.213 --> 00:18:43.894
was in charge of
00:18:43.934 --> 00:18:45.016
professional development...
00:18:46.635 --> 00:18:46.955
How do,
00:18:47.056 --> 00:18:48.457
and this is gonna be a two-part question.
00:18:48.537 --> 00:18:50.679
How do you start the conversation?
00:18:51.138 --> 00:18:53.140
What's an application that you would use?
00:18:53.460 --> 00:18:54.500
Do you have an example of
00:18:54.540 --> 00:18:56.882
maybe a first group assignment?
00:18:57.643 --> 00:18:59.584
What's that 30 second pitch
00:18:59.624 --> 00:19:01.224
or speech or anything that
00:19:01.265 --> 00:19:02.066
you would do if you were
00:19:02.086 --> 00:19:03.586
that coach and you were
00:19:03.626 --> 00:19:04.987
given a faculty meeting and said,
00:19:05.567 --> 00:19:06.567
introduce the topic,
00:19:06.587 --> 00:19:07.828
but don't go too far in the water.
00:19:08.308 --> 00:19:09.569
Right.
00:19:09.589 --> 00:19:10.510
I love this question.
00:19:10.691 --> 00:19:11.411
So this,
00:19:11.510 --> 00:19:12.771
I have actually done a bunch of
00:19:12.791 --> 00:19:14.933
thinking about: how to get it started.
00:19:16.144 --> 00:19:17.785
I think I would take a
00:19:18.305 --> 00:19:20.625
problem that is broad for
00:19:20.826 --> 00:19:21.905
most of my colleagues.
00:19:22.006 --> 00:19:23.185
And I would venture a guess
00:19:23.226 --> 00:19:24.386
that most of your listeners
00:19:25.126 --> 00:19:25.987
are dealing with the
00:19:26.027 --> 00:19:28.127
challenge of the various
00:19:28.268 --> 00:19:30.229
languages that our students
00:19:30.288 --> 00:19:31.308
come to our classrooms
00:19:31.328 --> 00:19:33.890
speaking and their families, right?
00:19:34.009 --> 00:19:36.290
We have a huge variety of
00:19:36.371 --> 00:19:38.771
multilingual learners who
00:19:39.051 --> 00:19:41.633
are trying to learn our content,
00:19:41.833 --> 00:19:43.272
but they still don't know
00:19:43.333 --> 00:19:44.252
the English language that
00:19:44.272 --> 00:19:45.114
we're speaking to them in.
00:19:45.973 --> 00:19:46.894
So one of the things that I
00:19:46.954 --> 00:19:48.655
might show my colleagues if
00:19:48.756 --> 00:19:50.477
I were a coach is I'd show
00:19:50.517 --> 00:19:52.978
them Claude AI,
00:19:53.038 --> 00:19:54.078
which is one that we have
00:19:54.118 --> 00:19:55.019
not yet mentioned.
00:19:56.160 --> 00:19:59.142
But Claude AI was the tool
00:19:59.182 --> 00:20:00.563
that we found at the time
00:20:00.603 --> 00:20:01.844
when we ran our experiments
00:20:01.884 --> 00:20:03.444
a few months ago, to be the
00:20:03.484 --> 00:20:06.686
best at taking a prompt
00:20:07.809 --> 00:20:09.172
and you put it in there and
00:20:09.211 --> 00:20:11.114
you ask it to translate
00:20:11.173 --> 00:20:12.935
that prompt and explain the
00:20:12.976 --> 00:20:16.401
concept to a speaker of, say,
00:20:16.480 --> 00:20:17.521
Moroccan Arabic.
00:20:18.470 --> 00:20:20.611
So what Claude AI does,
00:20:20.631 --> 00:20:22.512
it's better than Google Translate,
00:20:22.752 --> 00:20:24.034
which just gives you a
00:20:24.094 --> 00:20:25.634
one-to-one translation and
00:20:25.855 --> 00:20:27.155
who knows how good it is.
00:20:27.816 --> 00:20:29.777
But what Claude AI does is it
00:20:29.817 --> 00:20:31.137
will give you the translation and
00:20:31.459 --> 00:20:33.700
explain in both English and
00:20:33.779 --> 00:20:35.540
in the language of choice,
00:20:35.721 --> 00:20:36.541
the target language,
00:20:37.122 --> 00:20:39.003
why it made the choices that it did,
00:20:40.625 --> 00:20:42.665
to explain the concept and
00:20:42.705 --> 00:20:43.866
to make it more accessible.
00:20:44.346 --> 00:20:45.807
So if you've got a teacher
00:20:46.127 --> 00:20:47.588
who's got students who are
00:20:47.628 --> 00:20:49.349
speaking maybe a handful of
00:20:49.410 --> 00:20:50.631
languages in their class
00:20:50.671 --> 00:20:51.471
and they're just trying to
00:20:51.511 --> 00:20:53.692
teach them math and you're
00:20:53.732 --> 00:20:55.354
trying to explain to them
00:20:55.394 --> 00:20:56.815
the Pythagorean theorem and
00:20:56.894 --> 00:21:00.198
how that works and you need
00:21:00.837 --> 00:21:02.219
your multilingual learners
00:21:02.239 --> 00:21:02.798
to understand it,
00:21:02.919 --> 00:21:04.200
I might show them how that
00:21:04.259 --> 00:21:06.221
works and how easy that is
00:21:07.162 --> 00:21:09.963
to give them the explanation
00:21:10.044 --> 00:21:11.644
that will allow them to differentiate.
00:21:12.226 --> 00:21:13.646
The next thing I'd do is I'd say,
00:21:14.126 --> 00:21:16.147
so who's interested in learning more?
00:21:17.388 --> 00:21:18.410
And I think professional
00:21:18.450 --> 00:21:19.810
development in this area
00:21:20.310 --> 00:21:21.311
needs to start with a
00:21:21.412 --> 00:21:22.492
coalition of the willing.
00:21:24.061 --> 00:21:25.461
So bring together the
00:21:25.501 --> 00:21:28.303
teachers who aren't fully afraid of it,
00:21:28.442 --> 00:21:30.183
who want to dip their toes in the water.
00:21:30.923 --> 00:21:33.144
And what we advocated when
00:21:33.203 --> 00:21:34.664
we wrote about this last year,
00:21:36.365 --> 00:21:38.145
we advocated for bringing
00:21:38.185 --> 00:21:39.246
this group together and
00:21:39.346 --> 00:21:41.605
creating a culture of experimentation.
00:21:41.987 --> 00:21:43.227
So getting the school to
00:21:43.326 --> 00:21:45.166
give them some space, some time,
00:21:45.227 --> 00:21:46.127
maybe some professional
00:21:46.167 --> 00:21:47.008
development hours
00:21:47.768 --> 00:21:50.151
to start running their own experiments,
00:21:50.211 --> 00:21:51.352
to start using these
00:21:51.412 --> 00:21:53.194
different tools to see what works.
00:21:53.474 --> 00:21:53.955
There's also,
00:21:54.016 --> 00:21:55.217
we haven't talked yet about
00:21:55.297 --> 00:21:56.598
all of the generative AI tools
00:21:56.679 --> 00:21:59.001
that are school-focused, right?
00:21:59.082 --> 00:22:00.522
That are not these other ones.
00:22:00.563 --> 00:22:02.746
So like SchoolAI, MagicSchool,
00:22:03.626 --> 00:22:05.107
these ones that essentially
00:22:05.188 --> 00:22:07.469
take a ChatGPT engine,
00:22:07.888 --> 00:22:08.990
put it in a wrapper and
00:22:09.049 --> 00:22:10.609
start to program it and
00:22:10.650 --> 00:22:11.911
give it a personality and a
00:22:11.951 --> 00:22:13.392
persona that meets
00:22:13.511 --> 00:22:15.132
different grade levels or
00:22:15.211 --> 00:22:17.192
subject areas and starts to
00:22:17.232 --> 00:22:19.534
do some of the design work for teachers.
00:22:19.595 --> 00:22:20.855
So it lessens the load,
00:22:20.894 --> 00:22:22.496
the burden of designing
00:22:22.516 --> 00:22:23.276
your own prompts.
00:22:24.057 --> 00:22:25.616
And just have these folks
00:22:25.676 --> 00:22:27.397
experiment and learn about
00:22:27.458 --> 00:22:28.659
the different cases where it
00:22:28.739 --> 00:22:30.199
might work and then let it
00:22:30.299 --> 00:22:31.059
start to spread.
00:22:32.067 --> 00:22:32.807
When you're looking,
00:22:33.407 --> 00:22:34.307
I'm gonna go back a hat.
00:22:34.887 --> 00:22:36.588
When you're looking at, you know,
00:22:36.729 --> 00:22:38.648
here's Claude, here's MagicSchool.
00:22:41.009 --> 00:22:42.730
You're suggesting that this
00:22:42.789 --> 00:22:44.069
be at the teacher level,
00:22:44.549 --> 00:22:45.589
which to the best of my
00:22:45.650 --> 00:22:46.549
knowledge means I don't
00:22:46.690 --> 00:22:48.651
need to worry about privacy agreements,
00:22:49.010 --> 00:22:50.451
or is this where you go to
00:22:50.490 --> 00:22:51.570
your tech director as the
00:22:51.590 --> 00:22:52.471
tech coach and say,
00:22:53.291 --> 00:22:54.951
I'd like to try this Claude thing,
00:22:56.152 --> 00:22:56.951
go get an agreement.
00:22:57.451 --> 00:22:59.593
So I can now go do my 30
00:22:59.593 --> 00:23:01.333
minute faculty meeting.
00:23:03.666 --> 00:23:05.769
What are the legalities on that?
00:23:06.068 --> 00:23:06.970
At what point do school
00:23:06.990 --> 00:23:07.631
districts need to be
00:23:07.691 --> 00:23:08.551
reaching out to all these
00:23:08.592 --> 00:23:09.392
different companies?
00:23:10.492 --> 00:23:11.233
There's no students.
00:23:11.674 --> 00:23:14.237
You have not yet said "student logs into,"
00:23:14.436 --> 00:23:14.636
right?
00:23:14.656 --> 00:23:14.917
Right.
00:23:15.417 --> 00:23:16.278
But you're still asking
00:23:17.038 --> 00:23:18.580
teachers to log into that.
00:23:18.820 --> 00:23:19.701
And that's kind of where I
00:23:19.781 --> 00:23:21.523
am right now: I'd love to
00:23:21.544 --> 00:23:22.744
start trying these things,
00:23:23.224 --> 00:23:24.046
but I don't want to be
00:23:24.125 --> 00:23:25.446
crossing the district line
00:23:25.487 --> 00:23:27.048
that I might not know exists.
00:23:27.790 --> 00:23:29.374
As the director of learning design,
00:23:29.674 --> 00:23:30.215
thankfully,
00:23:30.477 --> 00:23:33.182
the legal aspect of it is not my purview.
00:23:33.321 --> 00:23:34.785
So I am not the best person
00:23:34.825 --> 00:23:35.707
to answer that question.
00:23:37.175 --> 00:23:39.277
Fair. Okay, so you're the tech
00:23:39.297 --> 00:23:40.498
director — so you're the tech
00:23:40.518 --> 00:23:42.818
coach. And I love the idea:
00:23:43.200 --> 00:23:44.440
let's have a conversation
00:23:44.480 --> 00:23:45.980
with a problem. The problem
00:23:46.040 --> 00:23:47.082
is I've got students that I
00:23:47.142 --> 00:23:47.741
need to be able to
00:23:47.781 --> 00:23:49.063
communicate with. Here's how
00:23:49.103 --> 00:23:50.864
this works, if anybody else
00:23:50.923 --> 00:23:51.924
wants more... And we talk a
00:23:51.964 --> 00:23:53.345
lot on this show about the
00:23:53.384 --> 00:23:54.445
innovation curve, where once
00:23:54.465 --> 00:23:55.866
you get to that 13 or so
00:23:55.906 --> 00:23:57.607
percent, now you've got your
00:23:57.667 --> 00:23:59.148
first followers, right? How
00:23:59.189 --> 00:24:00.429
do you get to that next 23
00:24:00.429 --> 00:24:03.010
to get your... We talk about
00:24:03.050 --> 00:24:06.192
that one a lot on here. Excellent. What
00:24:07.873 --> 00:24:11.895
other ideas do you have for
00:24:12.477 --> 00:24:14.057
bringing these topics in? By
00:24:14.097 --> 00:24:15.419
the way — and I love this,
00:24:15.618 --> 00:24:16.359
coming from a school
00:24:16.380 --> 00:24:17.840
district that supported 75
00:24:17.840 --> 00:24:19.863
languages, and being the guy
00:24:19.923 --> 00:24:21.163
that brought in things like
00:24:21.523 --> 00:24:22.545
PowerPoint Live and
00:24:22.585 --> 00:24:23.746
Microsoft Translator, and
00:24:23.766 --> 00:24:26.807
"here's the app" — I love the idea for MLLs.
00:24:29.539 --> 00:24:31.060
What's the dog and pony, right?
00:24:31.121 --> 00:24:31.882
Is the dog and pony:
00:24:31.922 --> 00:24:33.403
here's Microsoft Designer,
00:24:33.442 --> 00:24:34.364
give me a prompt and it's
00:24:34.384 --> 00:24:35.183
going to make a picture.
00:24:35.243 --> 00:24:37.826
Now go try something.
00:24:38.125 --> 00:24:38.886
What's the next thing
00:24:38.946 --> 00:24:40.268
outside of MLL students?
00:24:42.088 --> 00:24:42.470
I think...
00:24:44.321 --> 00:24:46.483
I think you'll get most
00:24:46.663 --> 00:24:49.244
teachers to buy in and want
00:24:49.306 --> 00:24:51.467
to understand more if we
00:24:51.567 --> 00:24:53.147
help solve problems for them.
00:24:53.607 --> 00:24:55.189
So I don't actually think
00:24:55.470 --> 00:24:57.911
it's the cool whiz-bang dog and pony.
00:24:58.811 --> 00:25:01.394
Even I myself don't always
00:25:01.453 --> 00:25:04.214
appreciate what an AI tool
00:25:04.296 --> 00:25:06.596
can do in terms of making a
00:25:06.656 --> 00:25:07.758
presentation look better
00:25:08.057 --> 00:25:09.679
because I've got years of
00:25:09.719 --> 00:25:10.680
doing a presentation.
00:25:10.720 --> 00:25:12.221
That's not a problem I feel
00:25:12.260 --> 00:25:13.362
like I'm trying to solve.
00:25:14.281 --> 00:25:16.343
I think if you go at
00:25:16.982 --> 00:25:17.903
authentic problems that
00:25:17.923 --> 00:25:18.583
teachers are trying to
00:25:18.863 --> 00:25:20.844
solve and then think about it.
00:25:21.243 --> 00:25:22.545
I think there's always the
00:25:22.585 --> 00:25:25.105
problem of differentiation.
00:25:25.305 --> 00:25:26.986
We talked about multilingual learners,
00:25:27.086 --> 00:25:28.385
but another way of thinking
00:25:28.445 --> 00:25:29.467
about differentiation...
00:25:31.606 --> 00:25:33.208
A lesson I have learned when
00:25:33.248 --> 00:25:34.628
it comes to generative AI
00:25:34.749 --> 00:25:36.230
is you always need to keep
00:25:36.330 --> 00:25:37.270
humans in the loop.
00:25:37.951 --> 00:25:40.193
You cannot just totally rely
00:25:40.453 --> 00:25:42.555
on what the generative AI
00:25:43.076 --> 00:25:44.017
produces for you.
00:25:44.477 --> 00:25:46.137
You've got to keep checking it.
00:25:46.638 --> 00:25:48.660
And so that's why I think
00:25:51.294 --> 00:25:55.316
we'll never see AI fully replace teachers.
00:25:55.375 --> 00:25:57.497
We need their humanity and
00:25:57.537 --> 00:25:58.377
their understanding and
00:25:58.397 --> 00:25:59.419
their relationships with
00:25:59.479 --> 00:26:00.239
kids in the loop.
00:26:00.278 --> 00:26:02.460
So one way that we can leverage that is,
00:26:02.980 --> 00:26:04.141
say this is a middle school
00:26:04.181 --> 00:26:05.521
teacher in your context,
00:26:05.942 --> 00:26:08.483
and they have multiple classes of kids,
00:26:09.104 --> 00:26:10.305
and they're still, again,
00:26:10.464 --> 00:26:11.704
trying to teach maybe a
00:26:11.765 --> 00:26:14.027
social studies concept or a math concept.
00:26:14.747 --> 00:26:16.548
But they've got five sections of kids,
00:26:16.968 --> 00:26:18.409
and they all like different things.
00:26:18.609 --> 00:26:20.270
And the teacher can't keep
00:26:20.411 --> 00:26:21.731
coming up with all of these
00:26:21.771 --> 00:26:22.711
different examples...
00:26:41.203 --> 00:26:44.165
Now give me an example to teach that
00:26:44.465 --> 00:26:46.367
X math concept to all of
00:26:46.407 --> 00:26:49.049
these kids and it will
00:26:49.089 --> 00:26:51.411
generate it in seconds. And
00:26:51.451 --> 00:26:52.511
I love that you just said
00:26:52.551 --> 00:26:53.913
that because a couple weeks
00:26:53.952 --> 00:26:55.054
ago I was teaching my kids
00:26:55.074 --> 00:26:56.315
how to do autobiographies
00:26:56.775 --> 00:26:57.776
and right in front of them
00:26:57.836 --> 00:26:58.997
I opened up Copilot and
00:26:59.017 --> 00:26:59.577
said I need an
00:26:59.637 --> 00:27:00.959
autobiography that has this
00:27:00.999 --> 00:27:02.179
and this, and I basically
00:27:02.259 --> 00:27:03.079
plugged in what their
00:27:03.119 --> 00:27:06.001
assignment was and the kids
00:27:06.021 --> 00:27:08.763
were just like, "Wait, how'd you do that?"
00:27:09.809 --> 00:27:10.653
That was kind of fun.
00:27:11.335 --> 00:27:12.519
But let me put on my third
00:27:12.558 --> 00:27:14.425
hat here as the technology teacher,
00:27:14.486 --> 00:27:16.010
as somebody who's in the classrooms.
00:27:17.289 --> 00:27:19.751
I'm still nervous to show
00:27:19.811 --> 00:27:21.073
this stuff to my students,
00:27:21.153 --> 00:27:22.512
even though it's on my accounts,
00:27:22.614 --> 00:27:23.894
even though they're not
00:27:23.913 --> 00:27:25.035
getting their hands on it.
00:27:25.695 --> 00:27:26.836
I still feel like I'm the
00:27:26.875 --> 00:27:27.696
guy that's teaching them
00:27:27.737 --> 00:27:28.917
how to use the calculator
00:27:28.938 --> 00:27:30.939
when the math teacher says no calculator,
00:27:30.979 --> 00:27:31.199
right?
00:27:31.419 --> 00:27:31.598
Right.
00:27:31.679 --> 00:27:33.039
I still feel like if I go in
00:27:33.079 --> 00:27:34.240
there and I show them how
00:27:34.340 --> 00:27:36.162
to use these things,
00:27:36.321 --> 00:27:37.803
eventually they're going to
00:27:37.843 --> 00:27:38.604
find the... And I don't
00:27:38.644 --> 00:27:39.443
want to be blamed as the
00:27:39.845 --> 00:27:40.644
guy who's teaching them all
00:27:40.664 --> 00:27:41.246
the back doors.
00:27:41.726 --> 00:27:41.885
Right.
00:27:42.465 --> 00:27:42.626
So...
00:27:43.487 --> 00:27:45.888
We talked about when you're doing the PD,
00:27:46.650 --> 00:27:48.250
help the teacher solve the problem,
00:27:48.270 --> 00:27:49.112
get them interested,
00:27:49.152 --> 00:27:50.472
and then you start to build from there.
00:27:51.614 --> 00:27:51.815
What...
00:27:52.994 --> 00:27:54.236
advice would you have for
00:27:54.336 --> 00:27:55.916
anybody trying to show off
00:27:56.698 --> 00:27:58.960
artificial intelligence
00:27:59.099 --> 00:28:01.182
to students, but doing it in a
00:28:01.221 --> 00:28:02.863
way that's not the oh, it's
00:28:02.883 --> 00:28:03.824
going to help me cheat on
00:28:03.844 --> 00:28:05.826
my... you know, right? That
00:28:05.945 --> 00:28:06.945
stuff, right? How do you
00:28:07.106 --> 00:28:08.207
actually start to bring in
00:28:08.227 --> 00:28:10.128
this as a tool? And we can
00:28:10.169 --> 00:28:11.349
discuss the Canvases of the
00:28:11.390 --> 00:28:12.371
world and the Fireflies
00:28:12.391 --> 00:28:14.392
and the like, but what's a good
00:28:14.432 --> 00:28:15.894
couple of intro lessons for students?
00:28:16.874 --> 00:28:19.914
So where I like to go is
00:28:20.634 --> 00:28:21.776
Common Sense Education.
00:28:22.215 --> 00:28:23.455
I don't know if you've looked at,
00:28:23.576 --> 00:28:24.997
they're really well known
00:28:25.096 --> 00:28:27.237
for their digital citizenship curriculum,
00:28:27.777 --> 00:28:29.637
and they've now put out a
00:28:29.718 --> 00:28:32.078
series of lessons for students
00:28:33.038 --> 00:28:35.361
on AI that explains what it
00:28:35.500 --> 00:28:38.223
is and also kind of takes
00:28:38.284 --> 00:28:39.785
this digital citizenship
00:28:39.884 --> 00:28:42.887
approach to teaching and
00:28:42.948 --> 00:28:44.449
learning about AI.
00:28:44.750 --> 00:28:47.071
So if I were in your shoes
00:28:47.271 --> 00:28:48.373
as the tech teacher,
00:28:48.813 --> 00:28:49.953
I'd probably start there
00:28:50.134 --> 00:28:51.476
with their lessons because
00:28:51.496 --> 00:28:52.115
you're building an
00:28:52.135 --> 00:28:53.657
understanding of the tool,
00:28:53.758 --> 00:28:55.058
not just showing the cool
00:28:55.118 --> 00:28:56.619
whiz-bang "how it would help me"
00:28:57.461 --> 00:28:58.041
kind of a thing.
00:28:58.541 --> 00:29:00.063
So I think it's really
00:29:00.123 --> 00:29:01.805
important when we're
00:29:01.884 --> 00:29:03.886
talking with students that
00:29:03.946 --> 00:29:05.268
we help them understand
00:29:05.347 --> 00:29:07.289
what the tools do and don't do.
00:29:07.529 --> 00:29:08.570
We help them understand the
00:29:08.631 --> 00:29:10.612
biases that are built into them.
00:29:11.853 --> 00:29:15.296
We help them understand what
00:29:15.355 --> 00:29:16.657
they need to look out for
00:29:16.738 --> 00:29:18.058
that they can't just
00:29:20.865 --> 00:29:22.847
put in a prompt and turn in
00:29:22.968 --> 00:29:24.190
whatever it spits out.
00:29:24.510 --> 00:29:25.172
So again,
00:29:25.633 --> 00:29:26.894
translating the humans-in-the-loop
00:29:26.934 --> 00:29:28.096
idea back to them.
00:29:28.576 --> 00:29:30.079
I would start with that
00:29:30.200 --> 00:29:31.141
resource and that
00:29:31.201 --> 00:29:33.345
collection of lessons as my
00:29:33.384 --> 00:29:34.105
first place to go.
00:29:35.156 --> 00:29:36.439
Then I would probably, if
00:29:36.939 --> 00:29:39.821
your school allows you... You know,
00:29:39.883 --> 00:29:40.923
you've raised
00:29:40.943 --> 00:29:41.865
a bunch of important
00:29:41.924 --> 00:29:45.608
questions about the
00:29:45.648 --> 00:29:47.471
legalities of data sharing
00:29:47.550 --> 00:29:48.732
and having the right agreement.
00:29:48.813 --> 00:29:50.253
So let's say you do have
00:29:50.375 --> 00:29:52.836
permission to show it and
00:29:53.057 --> 00:29:54.098
your school has worked out
00:29:54.159 --> 00:29:55.380
all those legal details.
00:29:56.573 --> 00:29:58.355
I would probably start with
00:29:58.615 --> 00:30:02.877
the brainstorming capacity that AI offers.
00:30:03.137 --> 00:30:05.721
So not doing the finished
00:30:05.861 --> 00:30:07.342
product part of it,
00:30:07.521 --> 00:30:09.344
because that's where some
00:30:09.384 --> 00:30:10.223
of your colleagues have
00:30:10.344 --> 00:30:13.067
probably kind of got their
00:30:13.126 --> 00:30:15.028
hackles up about cheating
00:30:15.208 --> 00:30:16.529
and the potential for cheating.
00:30:17.190 --> 00:30:19.310
And until we get all of our
00:30:19.351 --> 00:30:20.932
colleagues to change their
00:30:21.011 --> 00:30:23.792
pedagogy from the kinds of
00:30:23.853 --> 00:30:25.334
assignments that could be
00:30:25.634 --> 00:30:27.134
replicated and spit out by
00:30:27.173 --> 00:30:28.055
a generative AI,
00:30:28.855 --> 00:30:30.855
what I think we're best to
00:30:30.915 --> 00:30:32.957
do with the youth is to
00:30:33.096 --> 00:30:34.778
teach them how the tools
00:30:34.857 --> 00:30:36.218
could be a thought buddy,
00:30:36.238 --> 00:30:37.558
a brainstorming partner,
00:30:37.699 --> 00:30:38.900
an idea generator.
00:30:39.839 --> 00:30:41.980
Tools like ChatGPT are great for that.
00:30:46.894 --> 00:30:47.734
All of these topics that
00:30:47.755 --> 00:30:48.815
we're talking about are
00:30:48.855 --> 00:30:50.636
going to be detailed in our show notes.
00:30:50.717 --> 00:30:51.636
I'm making sure that we have
00:30:51.696 --> 00:30:53.459
links to all the different AI tools.
00:30:54.118 --> 00:30:55.019
I found the link to the
00:30:55.079 --> 00:30:56.201
Common Sense article.
00:30:56.641 --> 00:30:58.402
And speaking of articles, Dr. Lammers,
00:30:58.561 --> 00:31:00.002
you recently at Edmentum
00:31:00.943 --> 00:31:03.346
published an article about generative AI.
00:31:03.786 --> 00:31:05.867
And that article was called
00:31:06.567 --> 00:31:08.690
AI in Education Experiments,
00:31:09.190 --> 00:31:10.290
Lessons Learned.
00:31:11.030 --> 00:31:11.830
Talk to us a little bit
00:31:11.892 --> 00:31:13.551
about this post and specifically,
00:31:13.751 --> 00:31:15.212
what have some of the
00:31:15.252 --> 00:31:16.794
lessons been that you and
00:31:16.814 --> 00:31:18.314
your team have learned about AI?
00:31:19.075 --> 00:31:19.214
Well,
00:31:19.255 --> 00:31:20.535
I've already shared a couple of them.
00:31:20.634 --> 00:31:22.076
So the experiment with
00:31:22.155 --> 00:31:23.636
Claude and translating
00:31:23.696 --> 00:31:24.997
comes directly from that
00:31:25.396 --> 00:31:27.678
article that you'll link to.
00:31:28.199 --> 00:31:29.679
The other thing that we did is...
00:31:31.962 --> 00:31:34.865
the need to try different
00:31:34.964 --> 00:31:37.007
tools and to try them over
00:31:37.067 --> 00:31:38.788
time to see how they work
00:31:38.828 --> 00:31:39.788
and how they change.
00:31:40.410 --> 00:31:42.290
So to see whether or not
00:31:42.371 --> 00:31:44.373
ChatGPT might be better at
00:31:44.413 --> 00:31:47.576
something versus Copilot
00:31:47.675 --> 00:31:50.178
versus Gemini versus Claude AI.
00:31:51.898 --> 00:31:54.121
The other thing is — when I
00:31:54.161 --> 00:31:55.541
go back to this idea of the
00:31:55.602 --> 00:31:56.782
coalition of the willing
00:31:56.942 --> 00:31:57.884
who are going to run
00:31:57.983 --> 00:31:59.724
experiments and try things,
00:32:00.586 --> 00:32:02.866
I think this works best
00:32:04.048 --> 00:32:05.607
if they can then have the
00:32:05.647 --> 00:32:06.788
time and space to come
00:32:06.848 --> 00:32:08.289
together and critically
00:32:08.369 --> 00:32:10.030
reflect on what they've learned,
00:32:10.391 --> 00:32:11.570
to share resources,
00:32:11.971 --> 00:32:14.413
that some sort of hub be created.
00:32:14.913 --> 00:32:16.693
For us at Edmentum,
00:32:16.713 --> 00:32:18.555
we used a Microsoft Teams channel,
00:32:19.555 --> 00:32:21.915
which we called our AI brainstorming hub.
00:32:22.076 --> 00:32:23.836
And any resource gets shared
00:32:23.876 --> 00:32:25.337
there so that anyone who's
00:32:25.397 --> 00:32:26.898
interested can follow along,
00:32:27.419 --> 00:32:29.259
can dialogue about it.
00:32:30.079 --> 00:32:33.488
So I think that idea of
00:32:33.627 --> 00:32:35.070
creating this space for
00:32:35.152 --> 00:32:37.155
experimentation is really helpful.
00:32:38.048 --> 00:32:40.329
The article also shares the
00:32:40.369 --> 00:32:41.631
lesson we've already talked about,
00:32:41.671 --> 00:32:43.511
about keeping humans in the loop,
00:32:43.811 --> 00:32:45.231
that you need to have
00:32:46.692 --> 00:32:49.815
people look over what the AI creates,
00:32:50.694 --> 00:32:52.276
find hallucinations,
00:32:53.375 --> 00:32:54.576
which is another key
00:32:54.636 --> 00:32:56.218
term that we haven't touched on,
00:32:56.657 --> 00:33:00.059
but because of the way AI is designed,
00:33:00.720 --> 00:33:01.619
it could generate
00:33:01.780 --> 00:33:04.602
falsehoods that look very believable,
00:33:04.781 --> 00:33:05.362
because again,
00:33:05.461 --> 00:33:06.563
it's just trying to please you.
00:33:07.002 --> 00:33:08.222
It's trying to give you what
00:33:08.262 --> 00:33:09.003
it thinks you want.
00:33:10.644 --> 00:33:12.203
And so if you get to the
00:33:12.284 --> 00:33:14.224
point where you are using
00:33:14.244 --> 00:33:16.105
AI with students,
00:33:16.645 --> 00:33:18.986
that article also has some
00:33:19.046 --> 00:33:20.086
lessons learned that
00:33:20.227 --> 00:33:23.446
specifically speak to working with students.
00:33:24.688 --> 00:33:25.907
And this idea that we need
00:33:25.928 --> 00:33:27.208
to promote critical
00:33:27.368 --> 00:33:29.388
thinking and reflection on
00:33:29.409 --> 00:33:30.848
the student's part as they
00:33:30.990 --> 00:33:32.509
analyze AI's output.
00:33:33.988 --> 00:33:35.631
You mentioned Claude AI
00:33:35.730 --> 00:33:39.237
earlier about being a good tool for MLL.
00:33:39.257 --> 00:33:43.084
I want to say this the right way.
00:33:43.565 --> 00:33:45.749
Have you focused these AI tools
00:33:46.957 --> 00:33:48.376
for certain subjects?
00:33:48.436 --> 00:33:49.037
For instance,
00:33:49.958 --> 00:33:51.518
have you noticed that Copilot
00:33:51.557 --> 00:33:52.958
might be good at some subjects,
00:33:53.018 --> 00:33:55.818
but Gemini is better at others?
00:33:57.419 --> 00:34:00.819
I find there are people in
00:34:00.839 --> 00:34:01.579
certain circles who
00:34:01.720 --> 00:34:02.381
are just going to try
00:34:02.381 --> 00:34:03.560
100 different AI tools,
00:34:03.941 --> 00:34:04.760
and they're going to always
00:34:04.820 --> 00:34:06.020
have 100 AI tools because
00:34:06.300 --> 00:34:07.201
they know what's there.
00:34:07.662 --> 00:34:09.242
But the majority of teachers are either,
00:34:09.262 --> 00:34:10.282
I don't want it,
00:34:10.402 --> 00:34:12.682
or show me the one that I need.
00:34:13.163 --> 00:34:14.143
Right, exactly.
00:34:14.362 --> 00:34:15.344
In a school district, look,
00:34:15.623 --> 00:34:17.164
if you're a Google school,
00:34:17.204 --> 00:34:18.105
you're going to do this one.
00:34:18.144 --> 00:34:19.264
If you're a Microsoft school,
00:34:19.304 --> 00:34:20.266
you're going to do this one.
00:34:20.326 --> 00:34:20.865
If you're not,
00:34:21.385 --> 00:34:22.405
here are some other options.
00:34:23.327 --> 00:34:24.067
Have you found some
00:34:24.166 --> 00:34:26.768
favorites yet and for specific reasons?
00:34:28.788 --> 00:34:29.068
Well,
00:34:30.108 --> 00:34:34.371
I know that when we were trying to
00:34:34.490 --> 00:34:37.512
use ChatGPT to do certain calculations,
00:34:37.813 --> 00:34:38.813
it couldn't always be
00:34:38.873 --> 00:34:40.014
trusted with the math.
00:34:40.655 --> 00:34:40.936
Now,
00:34:41.416 --> 00:34:43.777
I say that with a huge caveat that
00:34:44.257 --> 00:34:46.958
when we were doing our experiments,
00:34:47.039 --> 00:34:47.940
that was last year.
00:34:48.280 --> 00:34:49.300
That might as well be a
00:34:49.380 --> 00:34:51.722
decade ago in AI terms, right?
00:34:52.483 --> 00:34:54.184
So it is ever-changing.
00:34:55.126 --> 00:34:57.148
So I don't know that there
00:34:57.188 --> 00:34:58.829
is a great answer to your
00:34:58.889 --> 00:35:01.012
question definitively, Jeff.
00:35:01.974 --> 00:35:03.514
I think that as these models
00:35:03.574 --> 00:35:04.516
continue to change,
00:35:04.576 --> 00:35:05.637
that's why we need a
00:35:06.978 --> 00:35:08.561
culture of experimentation.
00:35:10.161 --> 00:35:10.782
There is another...
00:35:11.800 --> 00:35:13.402
form of AI that we haven't
00:36:13.463 --> 00:36:14.764
talked about yet, and I
00:36:14.824 --> 00:36:15.905
really haven't talked about
00:36:15.945 --> 00:36:17.047
it much on this channel
00:36:17.108 --> 00:36:17.887
because I'm still
00:36:18.869 --> 00:36:20.371
fascinated by how it works.
00:36:20.592 --> 00:36:21.413
I don't
00:36:21.432 --> 00:36:22.012
even know what this is
00:36:22.253 --> 00:36:23.675
specifically called, but I
00:36:23.715 --> 00:36:24.996
like the term second brain,
00:36:25.637 --> 00:36:27.119
so I call it your second
00:36:27.139 --> 00:36:29.302
brain AI: specifically, tools that
00:36:30.333 --> 00:36:32.173
will look at all of
00:35:32.273 --> 00:35:34.096
your personal information
00:35:34.335 --> 00:35:35.976
and help you make decisions,
00:35:36.577 --> 00:35:37.757
help you organize.
00:35:38.378 --> 00:35:39.900
I'll give you two examples
00:35:40.159 --> 00:35:41.380
that helped me run my life
00:35:41.420 --> 00:35:42.641
and helped me run TeacherCast.
00:35:43.672 --> 00:35:45.032
I'm a big fan of an
00:35:45.152 --> 00:35:46.974
application called Notion.
00:35:47.735 --> 00:35:49.436
And Notion is a note-taking
00:35:49.516 --> 00:35:51.777
application on one level,
00:35:51.836 --> 00:35:53.398
but it's also a way to
00:35:53.518 --> 00:35:54.818
create databases and
00:35:54.878 --> 00:35:56.840
take notes, and you name it.
00:35:56.880 --> 00:35:57.320
Basically,
00:35:57.521 --> 00:35:59.021
everything that you've ever seen
00:35:59.081 --> 00:36:00.182
on TeacherCast for the last
00:36:00.242 --> 00:36:02.304
couple of years is designed in Notion.
00:36:03.105 --> 00:36:03.804
And recently,
00:36:04.224 --> 00:36:06.487
Notion came out with their own AI tool,
00:36:07.067 --> 00:36:09.047
but instead of searching the world,
00:36:09.528 --> 00:36:11.389
it's searching itself, right?
00:36:11.650 --> 00:36:12.911
So when we say things like
00:36:12.951 --> 00:36:14.130
the term second brain,
00:36:14.150 --> 00:36:16.112
it literally is thinking for me.
00:36:16.532 --> 00:36:18.193
And I can actually go into
00:36:18.233 --> 00:36:19.954
the AI tool and I can ask it,
00:36:20.333 --> 00:36:21.894
tell me how many times Dr.
00:36:21.954 --> 00:36:23.195
Lammers was on the show and
00:36:23.255 --> 00:36:24.795
what the episodes were about.
00:36:25.695 --> 00:36:26.916
Maybe because in six months
00:36:26.936 --> 00:36:27.876
you're going to be back on
00:36:27.916 --> 00:36:28.838
and I want to make sure
00:36:28.878 --> 00:36:30.057
that we're having a similar
00:36:30.097 --> 00:36:31.278
yet different conversation.
00:36:31.838 --> 00:36:32.659
Or I can say,
00:36:33.079 --> 00:36:34.500
show me all the podcast
00:36:34.559 --> 00:36:36.081
episodes where we discussed
00:36:36.661 --> 00:36:37.842
artificial intelligence,
00:36:37.902 --> 00:36:38.882
because maybe I'm doing a
00:36:38.902 --> 00:36:40.364
blog post on my top 10
00:36:40.364 --> 00:36:41.585
whatever, and I want to
00:36:41.626 --> 00:36:43.668
start to reference other
00:36:43.708 --> 00:36:45.009
shows. So Notion is a way
00:36:45.048 --> 00:36:46.630
that it'll actually take
00:36:46.690 --> 00:36:48.592
your, again, your second
00:36:48.632 --> 00:36:50.313
brain; it'll only think
00:36:50.574 --> 00:36:53.637
inside of that. Copilot is
00:36:53.697 --> 00:36:55.719
another option. Copilot,
00:36:55.760 --> 00:36:56.880
depending on how you're
00:36:57.061 --> 00:36:58.501
using it, and again, I
00:36:58.541 --> 00:36:59.963
pay for it inside of my
00:37:00.003 --> 00:37:00.965
TeacherCast domain,
00:37:02.483 --> 00:37:05.023
which has a switch that says internal,
00:37:05.043 --> 00:37:06.164
I forget what the exact words are,
00:37:06.184 --> 00:37:07.505
but basically it's internal
00:37:07.545 --> 00:37:09.186
of your domain or the web.
00:37:09.206 --> 00:37:10.925
So if I click on the
00:37:11.086 --> 00:37:12.567
internal switch and I don't
00:37:12.586 --> 00:37:14.947
remember the name of it, but I can say,
00:37:15.387 --> 00:37:17.469
show me all of my podcast
00:37:17.568 --> 00:37:19.710
episodes and it'll find
00:37:20.210 --> 00:37:21.951
only the podcast episodes
00:37:22.170 --> 00:37:23.992
inside of my OneDrive.
00:37:25.012 --> 00:37:26.733
Whereas if I search the web,
00:37:26.833 --> 00:37:28.393
now it's basically doing a Bing search.
00:37:30.347 --> 00:37:32.168
And so I love these
00:37:32.889 --> 00:37:34.032
companies that are coming
00:37:34.112 --> 00:37:35.994
up with ways for us to do
00:37:36.114 --> 00:37:38.458
more using the tools that
00:37:38.498 --> 00:37:40.981
we're currently building, right?
00:37:41.320 --> 00:37:42.682
So I spend a lot of time
00:37:43.815 --> 00:37:45.695
on my Notion, on my dashboards,
00:37:46.217 --> 00:37:46.896
I'm making sure that
00:37:47.036 --> 00:37:48.577
everything is there and named correctly,
00:37:48.597 --> 00:37:50.199
because I know someday soon
00:37:50.659 --> 00:37:51.679
I'm going to need to pull
00:37:51.719 --> 00:37:52.940
that information out.
00:37:53.842 --> 00:37:55.362
And the same thing with Microsoft.
00:37:55.422 --> 00:37:56.663
Microsoft is checking all of
00:37:56.702 --> 00:37:57.704
your PowerPoint, Word,
00:37:57.744 --> 00:37:58.925
and Excel files, and it's
00:37:59.005 --> 00:38:00.266
checking the entire
00:38:00.346 --> 00:38:02.987
knowledge graph out there of yourself.
00:38:03.047 --> 00:38:05.009
So that way you can find what you need.
00:38:05.048 --> 00:38:05.628
Now, obviously,
00:38:06.090 --> 00:38:07.811
if I'm searching my own stuff,
00:38:08.451 --> 00:38:09.891
it doesn't know what you as
00:38:09.931 --> 00:38:10.952
my coworker are doing.
00:38:11.833 --> 00:38:13.173
But that's okay because I
00:38:13.193 --> 00:38:14.295
don't always want to know
00:38:14.315 --> 00:38:15.695
what the entire planet's doing.
00:38:16.114 --> 00:38:16.775
I just want to know what's
00:38:16.815 --> 00:38:18.817
in my own bedroom or my own house.
00:38:20.077 --> 00:38:21.518
Do you have any experience
00:38:21.637 --> 00:38:23.438
using any applications like that?
00:38:23.579 --> 00:38:24.500
Or you were shaking your
00:38:24.539 --> 00:38:27.021
head about using the Copilot stuff.
00:38:28.021 --> 00:38:29.282
Are you one or is your team
00:38:29.882 --> 00:38:32.543
one to be making these second brains,
00:38:32.603 --> 00:38:33.445
second thinking,
00:38:34.505 --> 00:38:35.985
digital versions of yourself?
00:38:36.146 --> 00:38:36.666
And if so,
00:38:36.686 --> 00:38:39.027
do you have any suggestions on those?
00:38:39.047 --> 00:38:39.387
Yeah.
00:38:40.106 --> 00:38:42.507
The only place that I have used this,
00:38:42.588 --> 00:38:45.289
I have not dug into this
00:38:45.329 --> 00:38:47.670
kind of second brain AI for
00:38:47.731 --> 00:38:50.572
myself very much beyond, you know,
00:38:50.672 --> 00:38:51.932
working in a corporation
00:38:51.972 --> 00:38:53.313
that uses Microsoft
00:38:53.353 --> 00:38:56.076
products and also Atlassian products,
00:38:56.215 --> 00:38:57.617
Confluence, right?
00:38:58.137 --> 00:39:01.179
We use SharePoint, there's stuff on Teams,
00:39:01.219 --> 00:39:03.179
there's files that get emailed to you,
00:39:03.400 --> 00:39:03.780
all of this.
00:39:04.340 --> 00:39:07.083
So I often use the tool
00:39:07.322 --> 00:39:10.865
Delve in Microsoft to find, okay,
00:39:11.005 --> 00:39:13.307
I know this person sent me a file.
00:39:14.469 --> 00:39:15.190
Where is it?
00:39:16.250 --> 00:39:17.030
Help me find it.
00:39:17.231 --> 00:39:18.411
And so I don't have to
00:39:18.452 --> 00:39:19.773
search email and then
00:39:19.853 --> 00:39:21.954
search Teams and then search, you know,
00:39:22.414 --> 00:39:22.976
Confluence.
00:39:23.275 --> 00:39:25.358
That's probably the best one that I use.
00:39:25.398 --> 00:39:26.539
And I use it regularly
00:39:26.559 --> 00:39:28.139
because I know I saw that
00:39:28.199 --> 00:39:29.221
file from somebody.
00:39:29.240 --> 00:39:29.840
Yes.
00:39:30.382 --> 00:39:30.621
Mm-hmm.
00:39:32.449 --> 00:39:33.710
There's a lot, right?
00:39:33.731 --> 00:39:34.271
There's a lot.
00:39:34.951 --> 00:39:37.072
And I think where we are
00:39:37.172 --> 00:39:38.373
right now is we're at that
00:39:38.492 --> 00:39:39.793
point in the curve where
00:39:40.213 --> 00:39:41.594
people are jumping on board
00:39:41.614 --> 00:39:44.115
or some of them are even saying,
00:39:45.356 --> 00:39:45.996
I don't have the time.
00:39:47.257 --> 00:39:50.438
So much stuff, grades, curriculum, parents,
00:39:51.159 --> 00:39:53.380
post-pandemic, behavior.
00:39:54.021 --> 00:39:56.141
I don't have time for one more thing.
00:39:56.641 --> 00:39:57.742
And you've got this wave of
00:39:57.862 --> 00:40:00.344
educators coming in going, no, no, no,
00:40:00.583 --> 00:40:01.744
this is the thing.
00:40:02.085 --> 00:40:02.425
Right?
00:40:02.505 --> 00:40:04.168
And even a couple of shows ago, we did the,
00:40:04.208 --> 00:40:04.449
you know,
00:40:05.329 --> 00:40:07.353
how would you relate AI to other
00:40:07.452 --> 00:40:08.295
recent things?
00:40:08.355 --> 00:40:08.835
And we're like, no,
00:40:08.856 --> 00:40:10.617
this isn't Google Cardboard
00:40:10.677 --> 00:40:12.481
where many people try it
00:40:12.521 --> 00:40:13.483
and now it's in the corner.
00:40:13.523 --> 00:40:13.663
Like,
00:40:14.485 --> 00:40:16.045
This is the thing, right?
00:40:16.065 --> 00:40:17.246
Like this is the thing that
00:40:17.266 --> 00:40:18.286
we're going to look at and go,
00:40:18.306 --> 00:40:20.268
this isn't going anywhere.
00:40:21.168 --> 00:40:22.588
This is the calculator that
00:40:22.628 --> 00:40:23.469
suddenly you turn around
00:40:23.489 --> 00:40:24.670
and everyone's got one in their pocket.
00:40:24.690 --> 00:40:27.451
Like everything is going into here.
00:40:27.590 --> 00:40:28.630
So how do we learn?
00:40:28.650 --> 00:40:32.333
And let's take one final lap around here.
00:40:32.932 --> 00:40:34.034
If you were listening to
00:40:34.074 --> 00:40:35.614
this show and you wanted to
00:40:35.653 --> 00:40:37.474
take that first step to try things,
00:40:37.514 --> 00:40:39.516
as you said, button push, test things out,
00:40:39.976 --> 00:40:40.757
play with things.
00:40:41.577 --> 00:40:42.318
what would be one of the
00:40:42.418 --> 00:40:43.498
first things that you would
00:40:43.538 --> 00:40:44.778
do or the first
00:40:45.380 --> 00:40:46.420
applications that you would
00:40:46.460 --> 00:40:48.822
look towards just to sit in
00:40:48.862 --> 00:40:49.742
your office one day and
00:40:49.782 --> 00:40:50.443
push some buttons?
00:40:51.224 --> 00:40:51.364
Well,
00:40:51.403 --> 00:40:53.945
if I'm at a school that uses Microsoft,
00:40:54.065 --> 00:40:55.025
I would use Microsoft
00:40:55.045 --> 00:40:56.067
Copilot because it's
00:40:56.106 --> 00:40:57.588
probably the easiest one to
00:40:57.628 --> 00:40:59.188
know that the data is protected,
00:40:59.248 --> 00:41:00.230
so I won't get in trouble.
00:41:01.166 --> 00:41:02.688
If I'm not at a school that uses that,
00:41:02.788 --> 00:41:04.088
I'd just go to ChatGPT
00:41:04.148 --> 00:41:05.068
because there's a lot of
00:41:05.469 --> 00:41:07.590
talk about ChatGPT.
00:41:07.789 --> 00:41:10.090
And so I could find resources easily.
00:41:10.451 --> 00:41:11.731
So I'd pick one of those two,
00:41:11.871 --> 00:41:13.552
whichever one is the most accessible.
00:41:14.251 --> 00:41:15.393
And then I would sit down
00:41:15.432 --> 00:41:16.472
and think about what are
00:41:16.612 --> 00:41:18.373
all of the repetitive tasks
00:41:18.614 --> 00:41:20.175
that take me lots of time
00:41:21.235 --> 00:41:25.896
and how might I find or try myself
00:41:28.108 --> 00:41:29.949
a prompt that helps me save
00:41:30.048 --> 00:41:32.028
time with any one of those tasks,
00:41:32.170 --> 00:41:34.070
whether it's parent communications,
00:41:34.550 --> 00:41:36.570
whether it's designing
00:41:37.070 --> 00:41:39.152
student samples for essays
00:41:39.251 --> 00:41:40.331
as I'm trying to teach
00:41:40.411 --> 00:41:42.052
something in my English class,
00:41:42.413 --> 00:41:43.413
whatever it may be,
00:41:43.472 --> 00:41:46.054
if it's a differentiation task,
00:41:46.114 --> 00:41:47.755
and I'm trying to make sure that
00:41:48.894 --> 00:41:50.476
all of the kids have
00:41:51.416 --> 00:41:53.378
examples that relate to
00:41:53.438 --> 00:41:54.719
their particular interest.
00:41:55.019 --> 00:41:55.920
Whatever it may be,
00:41:55.940 --> 00:41:58.083
I would use one of those
00:41:58.143 --> 00:41:59.463
tools to try to create
00:41:59.543 --> 00:42:00.644
things that save me time.
00:42:02.481 --> 00:42:03.483
And I would add in there,
00:42:04.063 --> 00:42:05.804
try prompts that are serious.
00:42:06.565 --> 00:42:07.947
Try prompts that are silly.
00:42:07.987 --> 00:42:09.007
There's nothing wrong with
00:42:09.088 --> 00:42:10.668
opening up Copilot or any
00:42:10.688 --> 00:42:11.369
of these and saying,
00:42:11.409 --> 00:42:12.490
tell me a knock-knock joke.
00:42:13.271 --> 00:42:13.472
Right.
00:42:13.771 --> 00:42:14.932
Just try something.
00:42:14.972 --> 00:42:15.193
You know,
00:42:15.253 --> 00:42:16.634
today was the last day of our
00:42:16.653 --> 00:42:17.414
marking period.
00:42:17.474 --> 00:42:19.396
I had to write those emails to parents.
00:42:19.777 --> 00:42:20.838
There's nothing wrong with
00:42:20.878 --> 00:42:21.639
going in and saying,
00:42:22.039 --> 00:42:23.199
write me a letter to this
00:42:23.280 --> 00:42:24.561
parent about their student
00:42:24.601 --> 00:42:26.041
who is not doing so well.
00:42:26.402 --> 00:42:27.744
And you don't have to send it,
00:42:28.264 --> 00:42:30.246
but just see what it comes back with.
00:42:30.952 --> 00:42:32.153
And what I like to do in the
00:42:32.233 --> 00:42:35.094
write me a letter kind of
00:42:35.255 --> 00:42:37.996
use case is write me a letter about,
00:42:38.097 --> 00:42:38.896
you know, student.
00:42:39.077 --> 00:42:40.418
You put the student's name in.
00:42:40.797 --> 00:42:43.039
You're still protecting privacy because,
00:42:43.099 --> 00:42:44.320
you know, they don't know that student.
00:42:44.340 --> 00:42:45.219
You use the first name.
00:42:45.621 --> 00:42:46.221
And you say,
00:42:46.501 --> 00:42:47.661
and I want to make sure that I
00:42:47.681 --> 00:42:49.222
tell the parent three things.
00:42:49.262 --> 00:42:51.344
And you just put it in bullet point form.
00:42:51.563 --> 00:42:52.945
And I need it to be clear.
00:42:54.025 --> 00:42:56.206
Two paragraphs long, I need it to come,
00:42:56.565 --> 00:42:56.985
whatever.
00:42:57.186 --> 00:42:58.626
However much you want to give it,
00:42:59.007 --> 00:43:00.126
and you'll see that it
00:43:00.186 --> 00:43:01.206
creates something for you.
00:43:01.588 --> 00:43:02.407
And then the other thing
00:43:02.427 --> 00:43:03.088
that I would tell the
00:43:03.128 --> 00:43:04.889
teachers who are just trying this out,
00:43:05.389 --> 00:43:06.708
remember that this is a chat.
00:43:06.869 --> 00:43:08.489
So if you don't like what it gave you,
00:43:09.309 --> 00:43:11.030
tell it to change something, right?
00:43:11.411 --> 00:43:12.670
So you don't have to take
00:43:12.731 --> 00:43:15.192
the initial output and then use it or say,
00:43:15.692 --> 00:43:16.353
this doesn't work.
00:43:16.652 --> 00:43:18.632
Because where the real power
00:43:18.733 --> 00:43:21.474
comes is in its ability to
00:43:21.574 --> 00:43:23.275
iterate based on feedback from you.
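The workflow described here, draft one structured prompt, then refine the output through follow-up messages instead of starting over, can be sketched in a few lines. This is a hypothetical illustration in Python: the `build_parent_letter_prompt` helper is invented for this example, and the role/content message format mirrors common chat-style AI interfaces without assuming any particular product.

```python
def build_parent_letter_prompt(first_name, points, constraints):
    """Assemble a structured prompt: student first name only (protecting
    privacy), bullet-pointed talking points, and formatting constraints."""
    bullets = "\n".join(f"- {p}" for p in points)
    return (
        f"Write a letter to the parent of {first_name}, a student who is "
        f"struggling this marking period.\n"
        f"Make sure the letter covers these three things:\n{bullets}\n"
        f"Constraints: {'; '.join(constraints)}"
    )

# A chat is just a growing list of messages. Iterating means appending
# feedback to the conversation, not rewriting the original prompt.
conversation = [
    {"role": "user", "content": build_parent_letter_prompt(
        "Alex",
        ["missing homework", "low quiz scores", "how to get help after school"],
        ["clear language", "two paragraphs", "warm but direct tone"],
    )},
    # ... the model's reply would be appended here as {"role": "assistant", ...}
    {"role": "user", "content": "Make the tone less formal and add a "
                                "sentence inviting the parent to call me."},
]
```

You never have to accept the first draft: each new message steers the next revision, which is where the real power lies.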
00:43:23.942 --> 00:43:26.623
I first got into ChatGPT
00:43:27.103 --> 00:43:28.465
when I was redesigning my
00:43:28.525 --> 00:43:30.025
resume. I popped in
00:43:30.226 --> 00:43:32.387
the entire resume and said,
00:43:32.467 --> 00:43:33.327
make it better, right?
00:43:33.367 --> 00:43:34.728
Because that's basic.
00:43:34.748 --> 00:43:35.949
You're learning how to do stuff.
00:43:36.710 --> 00:43:37.811
And it was okay,
00:43:37.871 --> 00:43:40.092
but still on that overall horrible side.
00:43:40.552 --> 00:43:41.612
And so then I just ended up,
00:43:41.693 --> 00:43:43.173
I went bullet point by bullet point.
00:43:43.653 --> 00:43:45.335
Here's a thing that's on my resume.
00:43:45.815 --> 00:43:47.757
Please make this sound more professional.
00:43:49.382 --> 00:43:50.983
And little by little,
00:43:51.045 --> 00:43:52.885
I just started carving out my documents.
00:43:52.905 --> 00:43:54.208
And then I went into my bio.
00:43:54.708 --> 00:43:55.728
Here's what I have.
00:43:56.128 --> 00:43:59.132
Please add these three or four new things.
00:43:59.713 --> 00:44:01.394
And then here it is.
00:44:01.434 --> 00:44:02.476
And then you put down,
00:44:02.976 --> 00:44:07.159
please give me this for a job interview.
00:44:07.199 --> 00:44:08.802
Please give me this for my website.
00:44:08.842 --> 00:44:10.563
Please give me this for a presentation.
00:44:10.623 --> 00:44:11.744
Please give me this in 150 words or less.
00:44:13.346 --> 00:44:14.226
And again,
00:44:14.445 --> 00:44:15.626
whether you use them or not is a
00:44:15.646 --> 00:44:16.086
different kind,
00:44:16.126 --> 00:44:17.407
but you're just trying
00:44:17.527 --> 00:44:18.387
things and you're putting
00:44:18.447 --> 00:44:19.146
stuff out there.
00:44:19.186 --> 00:44:20.168
You're putting your toe in
00:44:20.188 --> 00:44:22.288
the water and seeing where it is.
00:44:23.407 --> 00:44:23.708
Obviously,
00:44:23.728 --> 00:44:25.528
you mentioned that your team
00:44:25.548 --> 00:44:27.289
started doing this research last year.
00:44:27.768 --> 00:44:28.849
Where are you today?
00:44:29.269 --> 00:44:31.510
Where do you plan on being tomorrow?
00:44:31.829 --> 00:44:33.130
What's in the future for
00:44:33.170 --> 00:44:34.951
your team in studying and
00:44:34.990 --> 00:44:36.411
in using and in sharing the
00:44:36.451 --> 00:44:37.492
knowledge about artificial
00:44:37.532 --> 00:44:38.351
intelligence with the world?
00:44:39.277 --> 00:44:39.498
Well,
00:44:40.137 --> 00:44:42.760
we continue to run experiments and do
00:44:42.840 --> 00:44:44.541
projects to help figure out
00:44:44.621 --> 00:44:45.860
how to save us time.
00:44:45.960 --> 00:44:47.802
So as you mentioned at the beginning,
00:44:48.822 --> 00:44:50.664
we have our products in
00:44:51.264 --> 00:44:52.545
districts around the country.
00:44:52.605 --> 00:44:53.824
So we're always looking to
00:44:53.864 --> 00:44:55.266
make sure that our products
00:44:55.365 --> 00:44:57.067
meet the standards for all
00:44:57.106 --> 00:44:58.288
of these different states.
00:44:58.947 --> 00:45:00.268
And since we don't have a
00:45:00.329 --> 00:45:01.469
centralized curriculum in
00:45:01.489 --> 00:45:02.230
the United States,
00:45:02.650 --> 00:45:04.251
you can imagine that a
00:45:04.831 --> 00:45:06.152
large language model and
00:45:06.574 --> 00:45:07.634
different machine learning techniques
00:45:07.693 --> 00:45:09.034
could help us look across
00:45:09.275 --> 00:45:10.456
all of the state standards
00:45:11.556 --> 00:45:12.818
to make sure that we have
00:45:12.838 --> 00:45:14.338
the alignment that we say we do.
00:45:14.900 --> 00:45:16.681
So that's one very popular
00:45:16.721 --> 00:45:18.402
project and one use that
00:45:18.442 --> 00:45:19.702
we're using AI for.
00:45:21.143 --> 00:45:23.166
But what we're continuing to
00:45:23.206 --> 00:45:24.447
do is to try to have
00:45:24.547 --> 00:45:25.849
conversations with our
00:45:25.929 --> 00:45:27.090
education partners and the
00:45:27.170 --> 00:45:28.293
folks in the schools who
00:45:28.452 --> 00:45:29.815
use our products and who
00:45:29.875 --> 00:45:31.436
are worried about our AI.
00:45:31.737 --> 00:45:33.298
And we're continuing to have
00:45:33.318 --> 00:45:35.101
this kind of internal
00:45:35.161 --> 00:45:37.364
experimentation so that we
00:45:37.483 --> 00:45:39.927
know how to advise our
00:45:39.987 --> 00:45:40.708
education partners.
00:45:40.768 --> 00:45:41.487
One of the things that I
00:45:41.527 --> 00:45:43.088
really enjoy about working
00:45:43.188 --> 00:45:46.070
for a company that really
00:45:46.150 --> 00:45:48.110
values educators first,
00:45:48.250 --> 00:45:49.271
like Edmentum does,
00:45:49.710 --> 00:45:50.871
is that we're not just
00:45:50.951 --> 00:45:52.391
trying to sell our products.
00:45:52.492 --> 00:45:54.271
We're really trying to be in
00:45:54.331 --> 00:45:55.632
relationship with those
00:45:55.693 --> 00:45:56.693
folks who use it and to
00:45:56.813 --> 00:45:57.974
understand their daily
00:45:58.594 --> 00:46:00.393
realities and to help them
00:46:00.494 --> 00:46:02.635
figure out how to make
00:46:02.735 --> 00:46:04.056
things work best for those
00:46:04.096 --> 00:46:04.936
daily realities.
00:46:05.922 --> 00:46:07.204
Talking today to Dr. Jayne
00:46:07.284 --> 00:46:08.565
Lammers from Edmentum.
00:46:08.746 --> 00:46:08.967
Jayne,
00:46:09.027 --> 00:46:10.088
where can we learn more about the
00:46:10.128 --> 00:46:11.190
great work you're doing and
00:46:11.269 --> 00:46:12.010
how do we get in touch with
00:46:12.030 --> 00:46:13.413
you if you have any other questions?
00:46:13.893 --> 00:46:15.295
I think LinkedIn is the best
00:46:15.376 --> 00:46:17.278
way to reach me and I'll
00:46:17.318 --> 00:46:18.400
make sure you have that to
00:46:18.440 --> 00:46:19.242
put in your show notes.
00:46:19.914 --> 00:46:20.295
And of course,
00:46:20.315 --> 00:46:20.916
you can find out more
00:46:20.936 --> 00:46:22.876
information over at edmentum.com.
00:46:22.996 --> 00:46:23.737
All of our show notes are
00:46:23.757 --> 00:46:24.398
going to be over there.
00:46:24.438 --> 00:46:25.898
This is Digital Learning Today.
00:46:25.938 --> 00:46:26.460
You can, of course,
00:46:26.500 --> 00:46:27.740
check out everything we
00:46:27.780 --> 00:46:28.840
have going on over at the
00:46:28.860 --> 00:46:30.322
TeacherCast Educational Network.
00:46:30.722 --> 00:46:31.802
Find out more information,
00:46:31.822 --> 00:46:32.643
like and subscribe,
00:46:32.684 --> 00:46:33.644
all that great stuff over
00:46:33.684 --> 00:46:34.864
at teachercast.net.
00:46:35.085 --> 00:46:35.686
Dr. Lammers,
00:46:35.865 --> 00:46:37.827
thank you so much for joining us today.
00:46:38.347 --> 00:46:38.867
Thank you, Jeff.
00:46:38.987 --> 00:46:39.568
It was a pleasure.
00:46:40.048 --> 00:46:40.809
And that wraps up this
00:46:40.849 --> 00:46:42.190
episode of Digital Learning Today.
00:46:42.271 --> 00:46:43.550
I hope you guys had a good
00:46:43.590 --> 00:46:44.692
time and I hope you learned
00:46:45.012 --> 00:46:45.833
something that you can
00:46:45.893 --> 00:46:46.934
share with your faculty.
00:46:47.233 --> 00:46:48.054
There's one thing that we
00:46:48.094 --> 00:46:49.715
know here about artificial intelligence.
00:46:49.914 --> 00:46:51.737
It ain't going away.
00:46:52.197 --> 00:46:53.757
So have a good time with it.
00:46:53.838 --> 00:46:54.938
Let us know what you're thinking.
00:46:54.958 --> 00:46:56.340
And if you're interested,
00:46:57.059 --> 00:46:57.619
reach out to me.
00:46:57.900 --> 00:46:58.920
Would love to have you be a
00:46:58.981 --> 00:47:00.121
guest on this show as we
00:47:00.141 --> 00:47:01.123
get into the summertime.
00:47:01.443 --> 00:47:02.143
And that wraps up this
00:47:02.182 --> 00:47:03.123
episode of TeacherCast.
00:47:03.143 --> 00:47:04.304
On behalf of Dr. Lammers and
00:47:04.405 --> 00:47:05.585
everybody here on TeacherCast,
00:47:06.146 --> 00:47:07.226
my name is Jeff Bradbury,
00:47:07.407 --> 00:47:08.186
reminding you guys to keep
00:47:08.226 --> 00:47:08.967
up the great work in your
00:47:08.987 --> 00:47:10.148
classrooms and continue
00:47:10.168 --> 00:47:11.929
sharing your passions with your students.