MultiAgentObjectCollectorEnv

First install CleanRL by changing into the `cleanrl` directory and running `poetry install`.

To run the PPO trainer (from the repository root):

cd cleanrl

python ..\ppotrainer.py --env-id oc-v1
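
The `--env-id oc-v1` flag is resolved through Gym's environment registry. Below is a minimal, self-contained sketch of how a custom environment can be registered under that id so `gym.make("oc-v1")` succeeds; the class name, spaces, and reward logic are placeholders rather than this repository's actual environment, and the sketch assumes the classic (pre-0.26) Gym API used by older CleanRL releases.

```python
import gym
import numpy as np
from gym import spaces
from gym.envs.registration import register


class DummyObjectCollectorEnv(gym.Env):
    """Placeholder standing in for the repository's object-collector environment."""

    def __init__(self):
        # The real observation/action spaces are defined by the repo's env;
        # these are placeholders just to keep the sketch self-contained.
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(4,), dtype=np.float32)
        self.action_space = spaces.Discrete(4)
        self._steps = 0

    def reset(self):
        self._steps = 0
        return self.observation_space.sample()

    def step(self, action):
        self._steps += 1
        obs = self.observation_space.sample()
        reward = 0.0
        done = self._steps >= 100  # arbitrary episode length for the placeholder
        return obs, reward, done, {}


# Registering under the id passed via --env-id lets gym.make("oc-v1") build the env,
# which is how CleanRL-style trainers typically construct their environments.
register(id="oc-v1", entry_point=DummyObjectCollectorEnv)

env = gym.make("oc-v1")
obs = env.reset()
obs, reward, done, info = env.step(env.action_space.sample())
env.close()
```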
