Atari, Dockerfile, PPO

@kengz released this 16 May 15:37 · 2427 commits to master since this release · 99f54b4

New features and improvements

  • Some code cleanup to prepare for the next version.
  • DQN on Atari is working, though not yet optimized.
  • Dockerfile finished; the lab is ready to run at scale on a server.
  • Implemented PPO in TensorFlow, based on OpenAI's version, along with the supporting utils (see the sketch after this list).
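
For reference, below is a minimal sketch of the PPO clipped surrogate loss in TensorFlow, illustrating the objective the new implementation targets. The function and variable names (`ppo_clipped_loss`, `log_prob`, `old_log_prob`, `advantage`, `clip_eps`) are assumptions for illustration, not the lab's actual API.

```python
import tensorflow as tf

def ppo_clipped_loss(log_prob, old_log_prob, advantage, clip_eps=0.2):
    """Negative PPO clipped surrogate objective, to be minimized.

    Args are tensors: new/old action log-probabilities and advantages.
    clip_eps is the clipping range epsilon from the PPO paper.
    """
    ratio = tf.exp(log_prob - old_log_prob)   # pi_new(a|s) / pi_old(a|s)
    surr_unclipped = ratio * advantage
    surr_clipped = tf.clip_by_value(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantage
    # Take the pessimistic (minimum) surrogate, then negate to form a loss.
    return -tf.reduce_mean(tf.minimum(surr_unclipped, surr_clipped))
```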