Optimizing Against You

A discussion about Engagement Optimization Algorithms

As it stands, social media companies seem to be acting like the cigarette companies of the 21st century: spending millions on advertising to get as many people as possible onto apps they know to be harmful. While it's a company's responsibility to take care of its investors, who is responsible for making sure this isn't done at the expense of its users? I consider myself pretty cognizant of the addictive algorithms in use by corporations, and I still got caught up in one unknowingly.

My Experience

When Canada's lockdown started and leaving the safety of your house felt like putting your life on the line, I started to go back through all the games I'd missed since quitting before university. I played just about every type of game there is before finally landing on a game called Apex Legends. The most important thing to note about Apex Legends in this context is that it is free to play; the only methods of supporting the game's continued development are cash-shop cosmetic items and a once-a-month 'battle pass' that lets you unlock unique cosmetics by playing more.

Unfortunately, I didn't often have friends to play with, so I would play with random teammates. This was fine while I was learning, but over time the inconsistency of my teammates' skill levels started to wear on me. Even in the Ranked mode, where you're supposed to be matched with people of roughly equal skill, I found myself matched with players 5-6 ranks away from my own. When I posted screenshots of this online, it seemed that many other players were experiencing the same weird matchmaking and were similarly displeased. Eventually, the time cost of playing the game was too much, and I had to quit. But I was curious: the strange matchmaking wasn't a mistake; players of unequal rank were being matched against each other intentionally. Why was it done this way when it seemed to cause so much anger in the community? As it turns out, it appears to be another case of a "free" product taking advantage of its users.

Skill Based Matchmaking (SBMM)

When you want to play something competitively, you likely want to be matched with people in a similar skill bracket. Generally, you'd want a player a little above your rank to help you improve, or a player a little below your level to help you practice. The first popular system for ranking players this way was created by a physics professor named Arpad Elo around 1960, initially for rating chess players.

The Elo rating measures a player's strength relative to the other players in their league. Your rating is inferred from your opponents' ratings and the results of the games you've played against them. Many modern rating systems in online games have, at their core, a similar idea: every player is assigned a variable, let's just call it Elo, that represents how skilled the player is relative to the population. During regular play, players should generally be matched against those with similar ratings. When you win, your Elo goes up by an amount that depends on your opponent's rating; the same happens in reverse when you lose. The bigger the disparity, the larger the change. So, over time, your rating should stabilize somewhere around your true current skill level. For most, this feels like a fair way to match players, but is it really the 'best'?
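That update rule can be sketched in a few lines. This is a minimal version of the standard Elo formula; the K-factor of 32 is an illustrative choice, and real leagues tune K per player or per rating band.

```python
def expected_score(rating_a: float, rating_b: float) -> float:
    """Expected score (win probability plus half the draw
    probability) for player A against player B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def update(rating: float, opponent: float, score: float, k: float = 32.0) -> float:
    """New rating after one game; score is 1.0 for a win,
    0.5 for a draw, 0.0 for a loss."""
    return rating + k * (score - expected_score(rating, opponent))

# An upset win over a stronger opponent moves the rating much more
# than a win over an equal one.
print(round(update(1200, 1400, 1.0), 1))  # → 1224.3 (upset: big gain)
print(round(update(1200, 1200, 1.0), 1))  # → 1216.0 (even match: gain of K/2)
```

Because the two players' expected scores always sum to one, the points one side gains are exactly the points the other side loses, which is what keeps ratings relative rather than absolute.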

Engagement Optimized Matchmaking (EOMM)

In 2016 and 2017, EA filed patents for Dynamic Difficulty Adjustment (DDA) and Engagement Optimized Match Making (EOMM). Shortly thereafter, EA worked with a professor at UCLA to write and publish a paper that aimed to show the benefits of EOMM compared with other matchmaking methods. The EOMM system they described is designed to learn enough about your playing habits to keep you engaged with the game for as long as possible. The patent states: "The longer a user is engaged with the software, the more likely that the software will be successful". The patent never defines success, though the paper does say that an EOMM system can be optimized for both in-game time and real-money spending.

So, how do you keep users engaged with the game for as long as possible? EA concluded that the best way to keep players engaged is to vary the difficulty of the game on the fly. The system does this through what the patent calls knobs: controllable game parameters that affect the player's experienced difficulty. The choice of knobs is one of the more important factors in DDA because the adjustment needs to go unnoticed by the user. The patent uses the example of a race car: if you adjust the max speed of the car based on whether the user is winning or losing, that's going to be a very jarring experience for everyone involved. This limits the scope of what can be adjusted in an online game; because most of the entities you engage with are other human players, it would be unfair to change an opponent's stats to affect the outcome of a duel. So, EA concluded that the fairest way to vary the difficulty of a match is through the skill level of the players themselves.

But how does the system know who should be matched up with whom? Using machine learning, the game's operator can continually monitor each player's gaming habits and, after a certain threshold of time or matches, assign you to a group of similar players, called a cluster. Your habits can include anything from how often you quit after a win or loss to when you spend money and how quickly you start another match. The cluster definitions (the approximate description of everyone in a cluster) and your assignment to a cluster change over time. The churn risk between you and a potential opponent is calculated from your habits together with the cluster details of you and your opponent(s). The ideal set of matches is the one that minimizes total churn risk across all possible matches that can be made.
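As a toy sketch of that last step, consider picking the set of 1v1 pairings with the lowest total predicted churn. Everything here is invented for illustration: the real system learns a churn model per cluster from logged behaviour, and it would not brute-force the search at scale.

```python
def predict_churn(player_a: dict, player_b: dict) -> float:
    """Hypothetical churn model: risk that either player quits after
    this match, higher on loss streaks and with big skill gaps."""
    gap = abs(player_a["skill"] - player_b["skill"])
    streak_risk = player_a["loss_streak"] + player_b["loss_streak"]
    return min(1.0, 0.05 * streak_risk + 0.001 * gap)

def best_pairing(players: list) -> tuple:
    """Exhaustively search all perfect matchings of an even-sized
    pool; return (pairings, total churn risk) for the cheapest one."""
    if not players:
        return ((), 0.0)
    first, rest = players[0], players[1:]
    best = None
    for i, opponent in enumerate(rest):
        remaining = rest[:i] + rest[i + 1:]
        sub_pairs, sub_cost = best_pairing(remaining)
        cost = predict_churn(first, opponent) + sub_cost
        if best is None or cost < best[1]:
            best = (((first["id"], opponent["id"]),) + sub_pairs, cost)
    return best

pool = [
    {"id": "a", "skill": 1500, "loss_streak": 3},
    {"id": "b", "skill": 1520, "loss_streak": 0},
    {"id": "c", "skill": 1480, "loss_streak": 2},
    {"id": "d", "skill": 1700, "loss_streak": 0},
]
pairs, risk = best_pairing(pool)
print(pairs, round(risk, 3))  # → (('a', 'c'), ('b', 'd')) 0.45
```

Note what falls out even of this toy version: the optimizer happily pairs the 1520-rated player against the 1700-rated one because the total churn risk is lower, which is exactly the kind of rank-mismatched lobby players complain about.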

In its paper, EA concludes, based on a simulation, that for games with a sufficiently large player population, EOMM will at least match, if not outright beat, the churn-avoiding performance of all other matchmaking methods by around 1% per game, which, compounded over a whole play session, ends up increasing retention by 10-15%. This was taken as conclusive proof that the EOMM system was the best for player engagement, but does that necessarily mean it's the best for the players themselves? How does it affect the mentality of a player when the outcome of their matches seems to follow no obvious pattern?
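The jump from roughly 1% per game to double digits per session is just compounding. A quick back-of-the-envelope check, assuming session lengths of 10 to 15 games (my assumption for illustration, not a figure from the paper):

```python
per_game_gain = 1.01  # ~1% relative retention advantage per game

for games in (10, 15):
    # Each game's advantage multiplies across the session.
    session_lift = per_game_gain ** games - 1
    print(f"{games}-game session: ~{session_lift:.0%} cumulative gain")
```

This prints roughly 10% for a 10-game session and 16% for a 15-game one, landing in the same ballpark as the 10-15% figure quoted above.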

Terms of Engagement

Engagement Optimized Matchmaking is just one of many time-optimizing algorithms in use on the web today. Many of the world's most popular platforms, including YouTube, Facebook, and TikTok, have algorithms whose sole purpose is to make sure you spend as much time on the platform as possible. Eugene Wei has written an amazing series of articles on his blog about how the TikTok algorithm sees and interacts with the users of the platform. In it, he describes how quickly the algorithm can lock on to your particular content preferences, serving up content that even you didn't know you'd enjoy. What happens when an algorithm gets to know your habits and weak points even better than you know yourself? Anecdotally, it means people report long sessions on the platform without really noticing how much time has passed. What makes these apps even more dangerous is that for every hour you spend consuming media, at least another hour of content that the algorithm can recommend to you has been uploaded. You can never reach the end of your infinitely scrolling timeline.

What I want to ask is: who is responsible for taking care of the users of these platforms? The companies are concerned mainly with their responsibility to their shareholders, and rightly so. In the past, when an industry was built around something harmful (tobacco, alcohol, etc.), there was government intervention in the form of heavy regulation. It seems now, though, that the problems we face from social media and the systems that create them are so complex that trying to regulate the allowable use of these algorithms in industry will be akin to playing whack-a-mole.

I think, in the end, it will come down to consumer education; it's the responsibility of those using these platforms to understand how they're being taken advantage of. But first, that information needs to be made public: we need to study how these algorithms work and publish the effects they have on their users. We should implement a warning similar to what you see on North American cigarette boxes or South American sweets, so people know ahead of time which apps to watch out for.

[Image: TikTokWarning.png]

Tinfoil Hat

Some people talk about 'the singularity': a point at which AI becomes smarter than humans and we are immediately enslaved by the superior brainpower of an artificial master. What if, instead of happening all at once, it were a gradual erosion of our free will as we fall under the control of algorithms meant to optimize how we act? It seems to me that these social media algorithms are already doing something like that: taking control of how we spend our time by understanding us better than we understand ourselves.