2019 TutORial: Recent Advances in Multiarmed Bandits for Sequential Decision Making

Presented by Shipra Agrawal at the 2019 INFORMS Annual Meeting in Seattle, WA. This tutorial discusses some recent advances in sequential decision making …
