
MIT 6.034 Artificial Intelligence, Fall 2010
View the complete course: http://ocw.mit.edu/6-034F10
Instructor: Patrick Winston

This lecture explores genetic algorithms at a conceptual level. We consider three approaches to how a population evolves towards desirable traits, ending with ranks of both fitness and diversity. We briefly discuss how this space is rich with solutions.

License: Creative Commons BY-NC-SA
More information at http://ocw.mit.edu/terms
More courses at http://ocw.mit.edu

Comments

Tomas Alejandro says:

44:58 he looks like an angry gorilla, mad because he couldn't get the food

muhammed salih says:

Can anyone give me the link to the demo shown in the video?

Akshat giri says:

I thought the title meant 13 different genetic algorithms.

Karim Saad says:

Best greetings from Germany!

I'm a high school student in Germany, and I think AI and these algorithms are very useful and interesting.

In Germany most people don't care about this today, but our politicians are trying to move people in this direction, which is new for them: toward self-learning machines, machines that do most of our work for us. For example, helping doctors while they run diagnostics on their patients, or performing surgical tasks… 😉

Maybe it's a huge step forward for the future.

Sedit T says:

I utterly love these MIT lectures and have been watching them for the past couple of days non-stop… However, the way this guy breathes in this one is almost making me want to shut this the fuck off. I REALLY need to hear this for an evolution simulator that I am working on, but I almost can't take this dude fucking breathing like he's doing some strenuous workout when he's just writing on a fucking chalkboard and talking. I wanna tell him to sit down and take a break; don't push yourself there, man, you are giving a lecture, not running a marathon. Damn, it's driving me fucking crazy.

Grandfather_Din_Racket says:

Poor type 2 diabetic. Too bad the FDA killed stevia-sweetened chocolate.

0 1 says:

Kinda disappointed by this lecture:
1. The lecturer said mutation is essentially hill-climbing, which I agree with. But he didn't explain what crossover is or why it is important. At the very least he should have stressed that it is still a mystery.
2. Crediting the artificial-creature program's "rich solution space" rather than the genetic algorithm, without even justifying it, is kinda irresponsible, because that's a bold and non-trivial claim.
3. Yes, GA requires fine-tuning of parameters; in machine learning we have feature engineering, which does the same thing. Isn't it naive to think an algorithm as general as GA would work well on all problem instances without feature engineering? There is no universal problem-solving algorithm that works well on all problem instances (the no-free-lunch theorem).

Overall, I have the impression that the lecturer is prejudiced against GAs.

Anonymous says:

Actually, this video is almost 3 years out of date. OpenAI's neuroevolution algorithm (run in parallel across 2,000 cores) was able to solve Atari games faster than Google's DeepMind, which uses reinforcement learning and backpropagation. Basically, if you have a whole company's worth of cores, then neuroevolution is the fastest way to teach an AI to play video games, because it's much more parallelizable.

Piyush Pratap Singh says:

Awesome and easier, thanks!

Sposchy says:

Well, there we go. I can at least get one mark on an MIT exam: he's definitely a creationist.

TerraNova says:

Highly intriguing and informative.

DeenanTheKemon I says:

What a dry and miserable class; they never laugh or react to anything he says, and he's funny and interesting to listen to. Spoiled brats.

Jeff Brasfield says:

I'm glad I did not pay for that. But thanks anyway.

John David Deatherage says:

I watched your lecture with great interest. I'm teaching myself Python by coding a GA. Often, when selection and reproduction are discussed, the biological model is two parents combined into one offspring. I have a different idea. Say you have a starting population of 200. You apply your fitness function to score each member, and then a "grim reaper" function to kill the bottom half in terms of fitness, leaving a population of 100 members. Why not combine each member with every other member (think nested loops)? 100 × 100 crossovers produce 10,000 new members. Apply a mutation function randomly across the population, against each cell in the DNA string. Then reduce the population by fitness back to the original level of 100, a 99% cut, in effect producing the next generation from the top 1 percent of the current generation. Have you considered such an approach? Can you give me your opinion? Thank you!
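The scheme the commenter describes (score, kill the bottom half, cross every survivor with every other survivor, mutate, then cull the 10,000 children back to the top 100) can be sketched in a few lines of Python. Everything here is an illustrative assumption rather than anything from the lecture: the OneMax fitness function (count the 1-bits), the single-point crossover, and all parameter values are made up for the demo.

```python
import random

# Illustrative parameters; none of these come from the lecture or the comment
# beyond the 200 -> 100 -> 10,000 -> 100 population sizes.
GENOME_LEN = 20
POP_SIZE = 200
KEEP = POP_SIZE // 2      # survivors after the "grim reaper": 100
MUTATION_RATE = 0.01      # per-bit chance of flipping

def fitness(genome):
    # Hypothetical OneMax fitness: maximize the number of 1-bits.
    return sum(genome)

def crossover(a, b):
    # Single-point crossover: take a prefix of one parent, suffix of the other.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit independently with probability MUTATION_RATE.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

def next_generation(population):
    # 1. Score everyone and keep only the fittest half.
    survivors = sorted(population, key=fitness, reverse=True)[:KEEP]
    # 2. All-pairs crossover (the nested loops): 100 * 100 = 10,000 children.
    children = [mutate(crossover(a, b)) for a in survivors for b in survivors]
    # 3. Cull back to the top 100, i.e. the top 1% of the children.
    return sorted(children, key=fitness, reverse=True)[:KEEP]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(10):
    pop = next_generation(pop)
best = max(pop, key=fitness)
print(fitness(best))
```

One thing worth noting about the design: the all-pairs loop includes self-pairings (`a` crossed with `a` just clones the parent before mutation), and such truncation-heavy selection tends to collapse diversity quickly, which is exactly the failure mode the lecture's diversity-rank idea is meant to counter.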

Some Random Guy says:

Wow, this is a ridiculously convoluted way of explaining such a simple concept.
