Installation

Install envpool with:

pip install envpool

Note: envpool only supports the Linux operating system.
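After installing, you can sanity-check envpool on its own; the snippet below follows the batched-API example from envpool's README:

```python
import numpy as np
import envpool

# Create a batch of 4 CartPole environments backed by a shared C++ thread pool.
env = envpool.make("CartPole-v1", env_type="gym", num_envs=4)
print(env.observation_space)  # per-environment space; data arrives batched

obs = env.reset()             # batched observations, shape (4, 4)
act = np.zeros(4, dtype=int)  # one action per environment
obs, rew, done, info = env.step(act)  # newer gym versions return 5 values instead
```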

Usage

You can use OpenRL to train CartPole with envpool via:

python train_ppo.py
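For reference, train_ppo.py follows OpenRL's standard PPO quickstart pattern, with the environment built from envpool rather than gym; the exact environment construction lives in the script itself:

```python
from openrl.envs.common import make
from openrl.modules.common import PPONet as Net
from openrl.runners.common import PPOAgent as Agent

env = make("CartPole-v1", env_num=9)  # train_ppo.py swaps in an envpool-backed env here
net = Net(env)                        # create the policy/value network
agent = Agent(net)                    # initialize the PPO agent
agent.train(total_time_steps=20000)   # start training
env.close()
```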

You can also add custom wrappers in envpool_wrapper.py. Currently, it provides the VecAdapter and VecMonitor wrappers, which are chained around the envpool environment as sketched below.
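The snippet below is a rough sketch of how these wrappers might be applied; the constructor signatures are assumptions for illustration, so check envpool_wrapper.py for the actual interfaces:

```python
import envpool
from envpool_wrapper import VecAdapter, VecMonitor

env = envpool.make("CartPole-v1", env_type="gym", num_envs=8)
env = VecAdapter(env)  # assumed: adapts envpool's batched API to the vec-env interface OpenRL expects
env = VecMonitor(env)  # assumed: records per-episode returns and lengths on top of the adapter
```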