OpenAI Gym: How to Take Actions

Update (please read this after watching the video)

The website https://gym.openai.com is now redirecting to https://gymlibrary.ml, the new documentation site for the Gym library.

The new site documents the environment's spaces and the meanings of the observation elements, so it replaces sources 1, 2, and 4 in the lesson.

However, be careful with the new site: it documents only the latest Gym version. The conda package may lag behind, since Gym releases new versions at breakneck speed.

If something in the documentation does not match what you see in your code output, check your Gym version and use source number 3: the docstrings in the environment's source code for that particular version (you can find it on GitHub).
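A quick way to check which version you have installed (a small sketch, assuming the `gym` package is installed):

```python
import gym

# Print the installed Gym version so you can match it against the docs
print(gym.__version__)
```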

Download the Jupyter notebook used in this lesson

You can download the Jupyter notebooks from a GitHub repository. The repository includes the notebooks used in the video lessons and all coding exercises. Find it here: https://github.com/gutfeeling/fastdeeprl

The notebook for this particular video is here: https://github.com/gutfeeling/fastdeeprl/blob/master/10_actions/actions.ipynb

Lesson Description and Notes

Learn how to take actions in OpenAI Gym environments.

Using the CartPole-v1 environment as an example, I will show you how to find the allowed actions in any environment.

We will also be introduced to the `Discrete` data type.
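As a sketch of what this looks like in code (assuming the CartPole-v1 environment used in the lesson and the classic `gym` package):

```python
import gym

# CartPole-v1, as used in the lesson
env = gym.make("CartPole-v1")

# The action space describes which actions the environment accepts.
# For CartPole it is Discrete(2): the two valid actions are 0 and 1.
print(env.action_space)
print(env.action_space.n)         # number of allowed discrete actions
print(env.action_space.sample())  # a random valid action

env.close()
```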

We will also learn to interpret the meaning of the actions using the following documentation sources.

  1. The environment page on Gym's website e.g. https://gym.openai.com/envs/CartPole-v1/
    1. This has been replaced by the new documentation site https://www.gymlibrary.ml/environments/classic_control/cart_pole/
  2. The Wiki page of the environment in Gym's GitHub repository e.g. https://github.com/openai/gym/wiki/CartPole-v0
    1. This is now old and probably unmaintained. The latest information is here: https://www.gymlibrary.ml/environments/classic_control/cart_pole/
  3. The environment's source code e.g. https://github.com/openai/gym/blob/master/gym/envs/classic_control/cartpole.py
  4. Research paper referenced on the environment page e.g. https://gym.openai.com/envs/CartPole-v1/
    1. This has been replaced by https://www.gymlibrary.ml/environments/classic_control/cart_pole/#description

At the end of the tutorial, we will take many actions in a row, visualize the results, and bring the simulation to life.
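A minimal version of such a loop might look like the sketch below. It takes random actions and skips rendering; the tuple-length checks are a guard for the API change in Gym 0.26, where `reset()` and `step()` started returning extra values:

```python
import gym

env = gym.make("CartPole-v1")

# env.reset() returns just the observation in gym <= 0.25,
# and an (observation, info) tuple from 0.26 onwards
result = env.reset()
obs = result[0] if isinstance(result, tuple) else result

total_reward = 0.0
for _ in range(50):
    action = env.action_space.sample()      # pick a random valid action
    result = env.step(action)
    if len(result) == 5:                    # gym >= 0.26: terminated/truncated split
        obs, reward, terminated, truncated, info = result
        done = terminated or truncated
    else:                                   # gym <= 0.25: single done flag
        obs, reward, done, info = result
    total_reward += reward
    if done:                                # start a new episode when the pole falls
        result = env.reset()
        obs = result[0] if isinstance(result, tuple) else result

env.close()
print(f"Reward collected over 50 random steps: {total_reward}")
```

In the notebook, adding `env.render()` inside the loop (or `gym.make("CartPole-v1", render_mode="human")` on newer versions) is what brings the simulation window to life.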
