Multi-Armed Bandit Experimentation

Multi-Armed Bandits are one of the simplest forms of reinforcement learning. A bandit is a type of test that exploits and explores at the same time: it keeps sending most traffic to the best-performing variant while continuing to try the alternatives, so it can optimize an element quickly and automatically. Bandits allow for a quicker, more scalable way to optimize experiences with less risk than a traditional A/B test.
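To make the exploit/explore tradeoff concrete, here is a minimal sketch of an epsilon-greedy bandit in Python. The variant names and conversion rates are hypothetical, and the post does not specify which bandit algorithm is used in practice; this is only meant to show the mechanics of splitting traffic between exploration and exploitation.

```python
import random


class EpsilonGreedyBandit:
    """Minimal epsilon-greedy bandit over a fixed set of variants (arms)."""

    def __init__(self, arms, epsilon=0.1):
        self.arms = list(arms)
        self.epsilon = epsilon                           # share of traffic reserved for exploration
        self.counts = {arm: 0 for arm in self.arms}      # impressions served per arm
        self.rewards = {arm: 0.0 for arm in self.arms}   # cumulative reward (e.g., conversions) per arm

    def select_arm(self):
        """Explore a random arm with probability epsilon, otherwise exploit the best arm so far."""
        if random.random() < self.epsilon:
            return random.choice(self.arms)
        # Untried arms get priority (treated as infinitely promising) so every arm is sampled at least once.
        return max(
            self.arms,
            key=lambda arm: self.rewards[arm] / self.counts[arm] if self.counts[arm] else float("inf"),
        )

    def update(self, arm, reward):
        """Record the observed reward (1 = conversion, 0 = no conversion) for the served arm."""
        self.counts[arm] += 1
        self.rewards[arm] += reward


# Hypothetical example: serve 10,000 impressions across three variants with
# made-up true conversion rates, then inspect how traffic concentrated.
true_rates = {"variant_a": 0.03, "variant_b": 0.05, "variant_c": 0.04}
bandit = EpsilonGreedyBandit(true_rates, epsilon=0.1)

for _ in range(10_000):
    arm = bandit.select_arm()
    reward = 1 if random.random() < true_rates[arm] else 0
    bandit.update(arm, reward)

print(bandit.counts)  # the best-performing variant should end up receiving most of the traffic
```

Unlike a fixed 50/50 A/B split, the bandit shifts traffic toward the winner while the test is still running, which is where the reduced risk comes from.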

[Image: multi-armed bandit illustration]

The Dynamic Page Architecture team at Wayfair is responsible for Content Platform tools, as well as Experimentation Platform tooling used across all of Wayfair.

[Diagram: multi-armed bandit testing]