mountaincar-v0
Here are 22 public repositories matching this topic...
MountainCar Deep-Q Network
Updated Jan 24, 2022 - Jupyter Notebook
MountainCar-v0 is a Gym environment. The continuous state space is discretized and the task is solved with Q-learning (a minimal sketch of this approach follows the list below).
Updated Mar 31, 2019 - Jupyter Notebook
An OpenAI Gym continuous mountain car model trained with the actor-critic method
Updated Oct 1, 2021 - Python
Applied various Reinforcement Learning (RL) algorithms to determine the optimal policy for diverse Markov Decision Processes (MDPs) specified within the OpenAI Gym library
Updated Dec 18, 2023 - Python
A solution for the MountainCar-v0 problem of the Gym environment
Updated Jul 2, 2020 - Python
PGuNN - Playing Games using Neural Networks
Updated Feb 25, 2019 - Python
Mountain Car is a Gym environment. I used this environment to train my model with Q-learning, which is a reinforcement learning technique.
Updated Oct 3, 2023 - Python
This repo contains implementations of the REINFORCE and REINFORCE-with-baseline algorithms on the mountain car problem.
Updated Feb 13, 2022 - Python
Deep RL on an OpenAI Gym environment
Updated Dec 8, 2022 - Python
Mountain car problem via Q-learning.
Updated Apr 4, 2020 - Python
OpenAI MountainCar-v0 DeepRL-based solutions (DQN, DuelingDQN, D3QN)
Updated Aug 11, 2021 - Python
Deep RL agent for solving MountainCar-v0 environment.
Updated Jun 8, 2020 - Python
An implementation of the main reinforcement learning algorithms, in solo-agent and ensemble versions.
Updated Feb 7, 2019 - Python
A simple baseline for mountain-car @ gym
Updated Jan 15, 2020 - Python
TensorFlow-based DQN and PyTorch-based DDQN agents for the 'MountainCar-v0' OpenAI Gym environment (a compact DQN sketch appears at the end of this page).
Updated Mar 30, 2022 - Python
Solving the MountainCar-v0 environment in Keras with Deep Q-Learning, a deep reinforcement learning algorithm
Updated Oct 2, 2020 - Python
RL with OpenAI Gym
Updated Jun 1, 2021 - Jupyter Notebook
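For the deep RL entries (DQN, DDQN, and related variants), the usual ingredients are an online Q-network, a replay buffer, and a periodically synced target network. The compact PyTorch sketch below illustrates that general pattern rather than the implementation of any listed repo; it again assumes the classic 4-value gym step API, and the network size, buffer size, and update schedule are placeholder values.

# Compact DQN sketch for MountainCar-v0 (classic gym API and PyTorch assumed)
import random
from collections import deque
import numpy as np
import gym
import torch
import torch.nn as nn

env = gym.make("MountainCar-v0")
obs_dim, n_actions = env.observation_space.shape[0], env.action_space.n

def make_net():
    # Small fully connected Q-network mapping observations to action values.
    return nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                         nn.Linear(64, 64), nn.ReLU(),
                         nn.Linear(64, n_actions))

q_net, target_net = make_net(), make_net()
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
buffer = deque(maxlen=50_000)
gamma, batch_size, epsilon = 0.99, 64, 0.1
step_count = 0

for episode in range(500):
    state = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action from the online network.
        if random.random() < epsilon:
            action = env.action_space.sample()
        else:
            with torch.no_grad():
                action = int(q_net(torch.as_tensor(state, dtype=torch.float32)).argmax())
        next_state, reward, done, _ = env.step(action)
        buffer.append((state, action, reward, next_state, float(done)))
        state = next_state
        step_count += 1

        if len(buffer) >= batch_size:
            # Sample a minibatch of transitions from the replay buffer.
            batch = random.sample(buffer, batch_size)
            s, a, r, s2, d = map(np.array, zip(*batch))
            s = torch.as_tensor(s, dtype=torch.float32)
            a = torch.as_tensor(a, dtype=torch.int64)
            r = torch.as_tensor(r, dtype=torch.float32)
            s2 = torch.as_tensor(s2, dtype=torch.float32)
            d = torch.as_tensor(d, dtype=torch.float32)
            # TD target computed from the frozen target network.
            with torch.no_grad():
                target = r + gamma * target_net(s2).max(dim=1).values * (1 - d)
            q_pred = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
            loss = nn.functional.mse_loss(q_pred, target)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

        # Periodically sync the target network with the online network.
        if step_count % 1000 == 0:
            target_net.load_state_dict(q_net.state_dict())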