OpenAI CEO Sam Altman discusses the future of generative AI — Transcript

OpenAI CEO Sam Altman discusses the future of generative AI and its impact on education in a fireside chat at University of Michigan Engineering.

Key Takeaways

  • Generative AI is rapidly evolving from chat models to complex reasoning capabilities.
  • OpenAI's latest model marks a foundational step toward generalized intelligence.
  • AI has transformative potential in education by personalizing learning and expanding access.
  • The adoption of AI technologies can be unpredictable but impactful, as seen with ChatGPT.
  • Collaboration between universities and AI innovators is crucial for future advancements.

Summary

  • Sam Altman, CEO of OpenAI, engages in a fireside chat hosted by University of Michigan Engineering about generative AI's transformative role.
  • The event included over 500 student-submitted questions highlighting strong community interest in AI.
  • Altman describes the launch of OpenAI's o1 model (codenamed 'Strawberry') as a significant step toward complex reasoning in AI.
  • He explains the progression of AI capabilities from chat-based models to reasoning models, marking a new phase in AI development.
  • The discussion touches on the path toward generalized and superhuman intelligence as a continuous exponential growth.
  • Altman reflects on the unexpected rapid adoption of earlier GPT models, especially the ChatGPT moment with GPT-3.5.
  • The event emphasizes AI's potential to personalize learning, improve educational outcomes, and broaden global access to knowledge.
  • University leaders and investment directors highlight collaboration opportunities between AI technology and academic ventures.
  • The conversation also includes personal and informal moments, such as Altman sharing his favorite fruit, to humanize the dialogue.
  • The session is positioned as a rare paradigm shift in technology and education, with ongoing rapid advancements anticipated.

Full Transcript

00:00
Speaker A
Good afternoon.
00:03
Speaker A
I'm Karen Tolly, the Robert J. Vlasic Dean of Engineering. Welcome to our fireside chat with OpenAI CEO Sam Altman.
00:14
Speaker A
Presented to you today by the College of Engineering in collaboration with our colleagues from across the campus.
00:23
Speaker A
I'm grateful to all of you because you've taken the time to join us here for this opportunity to have this really important discussion with a pioneer in the field.
00:35
Speaker A
Students, thank you so much for taking your time to engage.
00:40
Speaker A
As someone new to campus, it is wonderful to see such a tremendous response from our community, including representatives from the Board of Regents, University Executive Officers, and other leaders from across campus.
00:57
Speaker A
Generative AI is transforming how we work, how we learn, how we teach, and it's a very popular topic.
01:08
Speaker A
The number of requests for this event was overwhelming, and you, students, all submitted more than 500 questions for our guest of honor today.
01:58
Speaker A
Okay, it's going to be a long night.
02:02
Speaker A
No, just kidding. We're not going to be able to cover all of those questions today, but the university's senior managing director of investments, Dan Fader, and OpenAI's Sam Altman will discuss a wide range of questions motivated by what you submitted.
02:21
Speaker A
To ensure an enjoyable experience for all, we ask that you please refrain from personally recording or photographing today's event.
02:33
Speaker A
We will share a recording of the event next week.
02:37
Speaker A
Thank you for your cooperation.
02:40
Speaker A
Now, I'm pleased to introduce the chair of the University of Michigan Board of Regents, Kathy White, who will introduce our speakers.
03:27
Speaker A
Regent White.
03:37
Speaker B
Well, good afternoon, everyone, and thank you.
03:43
Speaker B
For the kind introduction, Dean Tolly, I'm excited to introduce today's distinguished guest, Sam Altman.
03:52
Speaker B
A leader in technology, innovation, and education.
04:01
Speaker B
As CEO of OpenAI, Sam Altman drives a groundbreaking organization in artificial intelligence.
04:10
Speaker B
His leadership has expanded AI's capabilities from natural language processing to reinforcement learning.
04:20
Speaker B
OpenAI's work sets the stage for AI to enhance human intelligence in many ways.
04:29
Speaker B
That are new.
04:31
Speaker B
Sam combines technical expertise with a deep passion for innovation in education.
04:41
Speaker B
As president of Y Combinator, he mentored thousands of entrepreneurs who created technologies that shape our daily lives.
04:50
Speaker B
His approach to teaching emphasizes real-world application, hands-on learning, and cultivating a growth mindset.
05:00
Speaker B
Today's program will be moderated by Dan Fader, senior managing director of investments for the university's endowment.
05:10
Speaker B
He leads the university's investments in venture capital and private equity.
05:20
Speaker B
Which also includes actively seeking out opportunities for collaborations between the university's investment program and the broader university community.
05:30
Speaker B
Together, they will discuss how technology will shape the future of education.
05:39
Speaker B
Sam will explain how AI and machine learning can personalize learning, boost outcomes, and broaden access to knowledge worldwide.
05:49
Speaker B
His insights will challenge us to rethink traditional education and use technology to prepare students for the complex challenges of the future.
05:57
Speaker B
So please join me in welcoming a visionary leader transforming technology and education, Sam Altman.
06:19
Speaker C
Okay.
06:21
Speaker C
Welcome to Ann Arbor.
06:23
Speaker D
Thanks for having me.
06:24
Speaker C
Yeah.
06:25
Speaker C
Absolutely.
06:27
Speaker C
Um, this is this is Sam Altman.
06:30
Speaker C
Just, you know, this is.
06:33
Speaker D
Yeah.
06:35
Speaker C
Um.
06:37
Speaker C
So, uh, we've known each other for, I was looking back, about 10 years now.
06:44
Speaker C
And it's.
06:46
Speaker D
It's been awesome.
06:47
Speaker C
To be, um, around you and to see the work you've done.
06:50
Speaker D
It's been awesome.
06:51
Speaker C
Over the time.
06:52
Speaker C
Of this time.
06:54
Speaker C
Um, before we get going, I just want to say thank you to your team.
06:59
Speaker C
Um, especially Teresa Lopez, uh, for making this happen.
07:03
Speaker C
There's a lot that goes into this and your team is fantastic.
07:08
Speaker C
And then here at the university, um, Mike Drake.
07:12
Speaker C
Emily, uh, Dickman from engineering and Ally LaVine.
07:18
Speaker C
Uh, from the investments office, uh, along with a bunch of other people.
07:24
Speaker C
Uh, did a lot to make today happen.
07:28
Speaker C
Um, so I actually, I love my job.
07:30
Speaker D
Me too.
07:31
Speaker C
And.
07:32
Speaker D
Yeah.
07:33
Speaker C
I.
07:34
Speaker C
Up and down.
07:35
Speaker D
But mostly.
07:36
Speaker C
Um, and, uh, you know, a couple of the reasons why I love my job is that a part of my job is to just.
07:45
Speaker C
Meet with exceptional people and try to, um, find ways to collaborate and to.
07:53
Speaker C
See where it goes, basically.
07:56
Speaker C
And, um, the other part of why I love my job is exactly this.
08:02
Speaker C
Which is to find ways to bring those people, uh, to the university.
08:08
Speaker C
To see if there are things that can come back.
08:11
Speaker C
The other direction.
08:13
Speaker C
And.
08:14
Speaker C
And keep going.
08:16
Speaker C
Um, we have a ton to talk about.
08:20
Speaker C
I, one of the things that, uh, I know about our conversations.
08:25
Speaker C
Is we cover a lot of material in a very short period of time.
08:28
Speaker C
So I have a lot of pages.
08:30
Speaker D
We'll get through.
08:31
Speaker C
We might get through all of them.
08:33
Speaker C
Um, but you kind of messed up this plan a little bit.
08:37
Speaker C
Uh, and I had a speed round at the end.
08:41
Speaker C
And I'm just going to do one of the speed round questions right now.
08:45
Speaker C
And I just want to ask you, what, what's your favorite fruit?
08:50
Speaker D
Oh.
08:52
Speaker D
Um.
08:54
Speaker D
I actually don't even like strawberries.
08:56
Speaker C
Okay.
08:57
Speaker D
But.
08:58
Speaker C
I.
08:59
Speaker D
I do today.
09:01
Speaker D
Uh.
09:03
Speaker D
Favorite fruit.
09:06
Speaker D
Pineapple's pretty good.
09:07
Speaker C
All right.
09:08
Speaker D
Yeah, I pick pineapple.
09:09
Speaker C
Comes from Hawaii.
09:10
Speaker C
I like that.
09:12
Speaker C
Um, so for those who don't know.
09:15
Speaker D
We launched strawberry.
09:16
Speaker C
Or.
09:17
Speaker D
We launched o1, which we called Strawberry for the last two years.
09:20
Speaker D
We're very excited.
09:22
Speaker C
Yeah.
09:23
Speaker C
Um.
09:25
Speaker C
Well, what is it?
09:28
Speaker D
Um.
09:31
Speaker D
You know, when we, when we finished GPT-4, one of the things we were most excited about is can we teach.
09:40
Speaker D
Can we use this thing that we've created?
09:46
Speaker D
Uh, and on top of it, can we teach models to reason?
09:51
Speaker D
And we thought that if we could do that, it would be a very significant step forward.
10:01
Speaker D
And in some sense, you know, was the next.
10:07
Speaker D
Most important, most obvious, whatever you want to call it, missing piece.
10:12
Speaker D
So we got to work on that.
10:16
Speaker D
Uh, it was an idea that many different teams at OpenAI were working on in different ways.
10:24
Speaker D
And, you know, we tried some things.
10:28
Speaker D
Some worked, most didn't, but we put more and more effort around the things that worked.
10:33
Speaker D
And the model that we launched today, uh, is.
10:41
Speaker D
I think the first, the first model that I would say is.
10:51
Speaker D
It's still very early, but the first model that is a true.
10:58
Speaker D
General purpose complex reasoner.
11:03
Speaker D
Um, we have this idea that there are these like five levels.
11:10
Speaker D
Of AI that we plan to work on.
11:13
Speaker D
Uh, the first one where we've been for the last few years is these sort of chat-based models.
11:19
Speaker D
And level two, which is about reasoning.
11:22
Speaker D
Um, I think this is the first time we've gotten there.
11:26
Speaker D
It'll get rapidly better from here.
11:30
Speaker D
But even this one is pretty good.
11:33
Speaker C
Um.
11:35
Speaker C
So, uh, does this put us on the path to generalized intelligence?
11:40
Speaker C
Or superhuman intelligence?
11:42
Speaker C
Where, where, where does this land us?
11:44
Speaker D
I think we are on that path, we've been on that path for a long time.
11:48
Speaker D
Uh, this is the next step on that path.
11:51
Speaker D
But, you know, I think it's all one long exponential curve that we are very fortunate to get to live and witness.
12:01
Speaker D
This is an exciting time.
12:02
Speaker C
Cool.
12:03
Speaker C
It's a big day.
12:05
Speaker D
Yeah.
12:06
Speaker C
Thank you for being here for us.
12:07
Speaker D
No, I.
12:09
Speaker D
I think.
12:12
Speaker D
Like we're all tired and exhausted, but these paradigm shifts don't come along that many times.
12:20
Speaker D
Um, you know, GPT-4 was one and this is one.
12:25
Speaker D
And it's, it's really special.
12:28
Speaker C
So, uh.
12:30
Speaker C
A thing I'm curious about is that when GPT-2 came out.
12:38
Speaker C
Um, it really caught fire quickly.
12:41
Speaker C
And I think it maybe caught you a little bit by surprise.
12:46
Speaker C
At how, what the uptake was.
12:49
Speaker D
You know.
12:50
Speaker D
You know, you never know, looking forward, when the thing that kind of catches fire is going to be.
12:57
Speaker D
And why it was the ChatGPT moment.
13:01
Speaker D
And not, that was with like when it launched.
13:04
Speaker D
It was with GPT-3.5.
13:06
Speaker D
But why it wasn't quite GPT-3 or GPT-4.
13:11
Speaker D
Um, that was sort of hard to predict.
13:14
Speaker D
We knew it would happen at some point.
13:17
Speaker D
Uh, but we were caught off guard by the particular time it did happen.
13:23
Speaker C
So.
13:25
Speaker C
I'm just trying to connect a couple dots here in terms of where we are with with this launch.
13:30
Speaker C
And we'll see where it goes.
13:33
Speaker C
And you, you know that it's a big deal.
13:36
Speaker C
Or you believe it's a big deal.
13:39
Speaker C
Um, but in looking back, um, I've heard this from a number of people.
13:48
Speaker C
Um, a comment is, well, how could you not have known that two was a big deal?
13:53
Speaker C
It's magic.
13:54
Speaker C
And so is there something about when you're doing work on something that you, you're, you're focused on kind of metrics and development.
14:02
Speaker C
That takes you away from seeing the impact?
14:06
Speaker D
So.
14:08
Speaker D
You know, I think, when we put out GPT-3, the tech community cared about it.
14:16
Speaker D
But most of the rest of the world didn't.
14:20
Speaker D
And I was a little confused by that because I thought it should have gotten more attention than it did.
14:25
Speaker D
And then we put out ChatGPT with GPT-3.5 on November 30th of 2022.
14:34
Speaker D
And we had finished training GPT-4 on August 2nd, I believe, of 2022.
14:40
Speaker D
And we've been using it internally.
14:42
Speaker D
So we'd had months of getting used to GPT-4, which we thought was really good.
14:50
Speaker D
And GPT-3.5 was like very old news to us.
14:54
Speaker D
And so I think that's maybe why we were surprised. But again, it's hard to predict when things catch fire.
15:02
Speaker D
I think you could also say, like, why weren't people more excited about GPT-3?
15:07
Speaker C
Yeah.
15:09
Speaker C
And then just sort of bouncing back to strawberry.
15:13
Speaker C
What, what are the things, specific things that you see that can be done with this?
15:19
Speaker C
That are maybe most notable or most important in your mind.
15:24
Speaker D
Did any of you take the AIME math competition?
15:27
Speaker D
It got a 93.
15:30
Speaker D
Percent.
15:31
Speaker D
Like that's, I retook one for fun this year.
15:34
Speaker D
I hadn't done it since high school.
15:37
Speaker D
It's pretty hard.
15:39
Speaker D
I certainly didn't get a 93.
15:42
Speaker D
Um.
15:44
Speaker D
So, uh, I I think there's like, you can point to metrics like that.
15:50
Speaker D
Um, for programming, the utility this brings to people to help them write software is going to, I think, be astonishing.
15:58
Speaker D
I think a lot of researchers in many different fields will be able to use this to enhance their research.
16:05
Speaker D
Uh.
16:08
Speaker D
And also, as always happens, people will find new ways to use this that we never dreamed of.
16:19
Speaker C
Yeah.
16:21
Speaker C
And how does this compare to what could be done with the state of the art prior to today?
16:28
Speaker D
It's super different.
16:30
Speaker D
Um.
16:31
Speaker D
Look, we numbered this o1.
16:35
Speaker D
Because it is a very early thing.
16:39
Speaker D
So rather, you know, we thought at one point about, it doesn't fit perfectly, but maybe we'll call this GPT-5 or whatever.
16:47
Speaker D
Um, but this is a new paradigm.
16:50
Speaker D
It's a different way to use a model.
16:52
Speaker D
It's good at different things.
16:55
Speaker D
It takes a long time for hard problems.
16:58
Speaker D
Which is annoying, but we'll make that better.
17:01
Speaker D
Um, but it can do things that the GPT series just didn't, and it struggles with a lot of things too.
17:11
Speaker D
Um.
17:13
Speaker D
So it'll take some time to adapt to.
17:16
Speaker D
But, you know, just when I was standing backstage, I was looking at what people were saying about it online.
17:24
Speaker D
And it's amazing to watch people who have not, you know, been staring at this every day for the last year.
17:33
Speaker D
Look at it with fresh eyes and be like, wow, this thing wrote this incredibly complex piece of code.
17:39
Speaker D
Or it helped me reason through this problem I've been stuck on.
17:43
Speaker D
Uh.
17:45
Speaker D
So, I think we'll see a lot in the next few weeks of how people adapt and begin to use this.
17:50
Speaker D
But.
17:53
Speaker D
I, I think these models have been so impressive in some ways.
18:00
Speaker D
That we've overlooked how bad they've been at reasoning.
18:04
Speaker D
And now they can.
18:06
Speaker D
This is a real, like, phase change in that sense.
18:11
Speaker C
Yeah.
18:12
Speaker C
So that's what I was trying to get at a little bit.
18:16
Speaker C
Which is when we look at what is being used now in terms of trying to get maybe to this kind of output.
18:24
Speaker C
With, you know, whether it's recursive self-improvement or open-endedness.
18:30
Speaker C
Or agentic AI.
18:32
Speaker C
You know, as ways of kind of getting there.
18:34
Speaker C
Is, is this override?
18:36
Speaker D
I think this does start to enable agents.
18:38
Speaker D
I think one of the things that has been blocking agents is just the lack of a system.
18:43
Speaker D
That had enough reasoning capabilities and was robust enough that you, you could.
18:51
Speaker D
Do these long horizon tasks with enough confidence or rigor.
18:59
Speaker D
Um, I think we're still a ways away from like.
19:03
Speaker D
A self-improvement loop.
19:07
Speaker D
Um, and when we get closer to that.
19:10
Speaker D
We'll have to be very careful.
19:13
Speaker D
But I do think the agent's vision is now like.
19:18
Speaker D
You know, it's not going to be.
19:22
Speaker D
It's not like a next month kind of thing.
19:26
Speaker D
But it's it's within grasp.
19:29
Speaker C
Yeah.
19:31
Speaker C
What does this mean for reinforcement learning?
19:34
Speaker C
And human input.
19:36
Speaker C
So RLHF.
19:38
Speaker C
Where, you know, it's that has been an important tool.
19:43
Speaker D
Yeah.
19:44
Speaker C
Yeah.
19:46
Speaker D
Well, o1 really is like all about reinforcement learning.
19:51
Speaker D
Um, that that is the secret here.
19:54
Speaker D
And we've done smaller versions of that in the past.
19:58
Speaker D
RLHF is a great example.
20:01
Speaker D
Um.
20:03
Speaker D
But, you know, this is the dream of RL and language models finally.
20:08
Speaker D
And it really works.
20:10
Speaker C
Okay.
20:12
Speaker C
Um.
20:14
Speaker C
Maybe we could just.
20:16
Speaker C
Go back a little bit.
20:19
Speaker C
What is.
20:21
Speaker C
What does AGI mean?
20:24
Speaker C
What does it mean to you?
20:26
Speaker D
You know, whatever is like two years away.
20:30
Speaker D
Or five years away.
20:32
Speaker D
Or like whatever's coming in the future.
20:36
Speaker D
Uh, I think, I think the term has become essentially meaningless.
20:40
Speaker D
Because people have such a huge variety of things that they mean for it.
20:47
Speaker D
Like some people call GPT-4 AGI, which I think it is definitely not.
20:54
Speaker D
Um, some people talk about this vague thing and they're not sure what it is.
21:00
Speaker D
But they think it'll come in, yeah, some set number of years.
21:04
Speaker D
Always the same number in the future.
21:08
Speaker D
Um, and some people mean that it's like the legitimate recursively self-improving super intelligence.
21:13
Speaker D
Uh.
21:15
Speaker D
I would love to banish the word.
21:19
Speaker D
Uh, because I think it's become so overloaded with different meanings.
21:26
Speaker D
Uh, you know, one of the reasons we like this levels framework is you can at least then like kind of agree on what the milestones are.
21:32
Speaker D
And talk about, talk about something with a little more rigor.
21:37
Speaker D
Um.
21:40
Speaker D
I think if you could go back in time, not even very long.
21:46
Speaker D
If you could go back in time like five years.
21:52
Speaker D
And show someone o1, they would be absolutely astonished.
21:57
Speaker D
Most people.
21:58
Speaker D
They would say, there's no way that AI is going to do this in 2024.
22:04
Speaker D
And I think if you could fast forward another five years.
22:10
Speaker D
And we could show you what we'll have in 2029.
22:15
Speaker D
Most people would say, absolutely not.
22:18
Speaker D
There's no way.
22:20
Speaker D
Can't keep going like this.
22:22
Speaker C
Right.
22:24
Speaker C
I didn't know there were.
22:25
Speaker D
I think it's great.
22:26
Speaker C
I didn't know they were looking for revenge.
22:27
Speaker D
Well.
22:28
Speaker D
I just think it's been sad.
22:30
Speaker D
Like, I was a 20-year-old founder.
22:34
Speaker D
I was like very grateful for the experience.
22:37
Speaker D
It was a lot of fun.
22:39
Speaker D
Um, and it's just been much less of that, I think, as the tech industry has, like, ossified somewhat in the intervening time.
22:46
Speaker D
And now I think it'll start to swing back.
32:34
Speaker D
I also want to give a special thanks to all the team members who worked behind the scenes to make this event happen.
33:20
Speaker D
It was really a special event for all of us.
33:24
Speaker D
And I want to thank you students for joining us today.
33:29
Speaker D
And being so engaged in this discussion.
33:32
Speaker D
Have a good evening and go blue.
Topics: Sam Altman, OpenAI, Generative AI, GPT-4, Artificial Intelligence, Education Technology, University of Michigan, AI Reasoning, Machine Learning, Future of AI

Frequently Asked Questions

What is the significance of OpenAI's 'Strawberry' model mentioned in the video?

'Strawberry' was OpenAI's internal codename for the o1 model, which represents a major advancement by incorporating true complex reasoning abilities, marking a new level beyond chat-based AI.

How does Sam Altman view the future of AI in education?

Altman believes AI can personalize learning, improve educational outcomes, and broaden access to knowledge worldwide, challenging traditional education models.

Why was the ChatGPT moment with GPT-3.5 surprising to OpenAI?

Although OpenAI expected AI to gain widespread adoption eventually, the rapid and massive uptake of ChatGPT with GPT-3.5 caught them somewhat off guard in terms of timing and scale.
