AI learns to play Google Chrome Dinosaur Game using RL with NEAT-Python

UPocek/Dinosaur_Game_AI


Dinosaur AI Game

Artificial intelligence learns to play a replica of Chrome's offline-mode Dino game using reinforcement learning with the NEAT-Python genetic algorithm.

Progress

Stage 1:

The initial population size is 30, the activation function is set to tanh, and the agents (individuals) with the highest scores reproduce into the next generation.

START
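The repository's actual configuration file isn't shown here, but an abridged neat-python config consistent with the setup described above (population of 30, tanh activation, fitness maximized) might look like this; a complete config needs several more sections (reproduction, species set, stagnation), and the input/output counts below are assumptions for illustration:

```ini
[NEAT]
fitness_criterion     = max
fitness_threshold     = 10000
pop_size              = 30
reset_on_extinction   = False

[DefaultGenome]
# tanh everywhere, never mutated to another activation
activation_default      = tanh
activation_mutate_rate  = 0.0
activation_options      = tanh
# hypothetical topology: a few game observations in, one jump signal out
num_inputs              = 3
num_hidden              = 0
num_outputs             = 1
```

With `fitness_criterion = max` and fitness set to the in-game score, NEAT's reproduction step naturally favors the highest-scoring dinos each generation.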

Stage 2:

After a few obstacles, most of the individuals in the population have died, but the survivors carried forward what worked: their neural networks were optimized across generations, and they adapted and learned how to beat the game.

MID GAME
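Each dino's evolved network is, at its core, a small tanh feed-forward function mapping game observations to a jump decision. A minimal, self-contained sketch of that idea (the inputs, weights, and threshold here are hypothetical, not taken from the repo):

```python
import math

def tanh_forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """Feed-forward pass with tanh activations, mirroring the kind of
    small network NEAT evolves for each dino (illustrative only)."""
    hidden = [math.tanh(sum(w * x for w, x in zip(weights, inputs)) + b)
              for weights, b in zip(w_hidden, b_hidden)]
    out = math.tanh(sum(w * h for w, h in zip(w_out, hidden)) + b_out)
    return out

# Hypothetical observations: distance to next obstacle, obstacle height, game speed
inputs = [0.4, 0.2, 0.6]
# Hypothetical weights, standing in for values NEAT would evolve
w_hidden = [[0.5, -0.3, 0.8], [-0.6, 0.9, 0.1]]
b_hidden = [0.0, 0.1]
w_out = [1.2, -0.7]
b_out = 0.05

signal = tanh_forward(inputs, w_hidden, b_hidden, w_out, b_out)
action = "jump" if signal > 0.5 else "run"
```

NEAT mutates the weights and topology of exactly this kind of function; generations whose outputs trigger jumps at the right distances score higher and reproduce.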

Stage 3:

Later, when the game started to speed up, only one individual was able to survive and keep going. I ran the game for a few hours, and this T-rex became so good that, in my opinion, it could keep going forever. As shown, this game is very easy for a computer to learn, and with a good initial population and game settings, the AI was able to beat it in just a few generations.

SUPREMACY
