The ever-increasing level of corporate surveillance of the networked environment is perhaps best highlighted by the use of algorithms designed to collate a vast amount of data from individual consumers to create highly detailed profiles for the purposes of targeted advertising. Information from posts made on social media, or data collected via internet browsing surveillance through cookies or fingerprinting technologies, is collected, sorted, and collated by algorithms designed to sort individuals into specific categories or “clusters.” These clusters can be based on socio-economic status, age, interests, lifestyle, race, and other preselected categories.
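To make the sorting described above concrete, here is a minimal, purely illustrative sketch of how a profiling pipeline might assign people to preselected clusters. Everything here is invented for the example (the cluster labels, the features, the numbers); it is a toy nearest-centroid assignment, not a description of any real ad-targeting system.

```python
# Toy sketch of cluster assignment (all labels, features, and values are
# hypothetical): each person is reduced to a feature vector and assigned
# to whichever preselected cluster centre they fall closest to.
from math import dist

# Hypothetical cluster centres, as (age, annual income, ad-click rate).
CENTROIDS = {
    "budget-conscious students": (20, 15_000, 0.8),
    "affluent professionals":    (45, 120_000, 0.3),
    "retirees":                  (70, 40_000, 0.1),
}

def assign_cluster(profile):
    """Return the label of the nearest centroid (Euclidean distance)."""
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], profile))

users = {"alice": (22, 18_000, 0.7), "bob": (68, 35_000, 0.2)}
labels = {name: assign_cluster(p) for name, p in users.items()}
print(labels)
```

Note that the sketch already illustrates the essay's point about proxies: the system never knows who "alice" is; it only measures a few numbers and treats proximity to a centroid as if it were identity.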

The largest problem – aside from the privacy aspects of surveillance – is that these algorithms can, and often do, make false assumptions about people based on extrapolated data. They also recreate and reinforce existing biases and stereotypes based on the information that is fed into them. While it has been argued that algorithms are objective calculation tools, they are only as objective as the humans who program them, the data they “learn” from, and the “proxies” they use to measure human behaviour.

In addition, these algorithms can be adjusted to produce specific desired results, including the mapping of particular clusters of individuals. By filtering and tuning an algorithm to surface specific desired characteristics, an individual’s position will vary among several clusters, with each grouping highlighting different characteristics about them by placing them with similar individuals along specific lines.
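The tuning described above can be sketched in a few lines: simply re-weighting which features "matter" moves the same person from one cluster to another. As before, this is a hypothetical illustration; the cluster names, features, and weights are invented for the example.

```python
# Illustrative sketch (all names and numbers invented): adjusting per-feature
# weights shifts the same individual between clusters without any change in
# the underlying data about them.
CENTROIDS = {
    "sports fans":     (30, 0.9, 0.1),  # (age, sports interest, luxury interest)
    "luxury shoppers": (30, 0.1, 0.9),
}

def weighted_dist(a, b, weights):
    """Euclidean distance with per-feature weights."""
    return sum(w * (x - y) ** 2 for x, y, w in zip(a, b, weights)) ** 0.5

def assign(profile, weights):
    """Assign a profile to the nearest centroid under the given weighting."""
    return min(CENTROIDS,
               key=lambda c: weighted_dist(profile, CENTROIDS[c], weights))

person = (30, 0.6, 0.7)
print(assign(person, (1, 1, 1)))   # all features weighted equally
print(assign(person, (1, 10, 1)))  # sports interest weighted up
```

With equal weights the person lands among the "luxury shoppers"; weight the sports-interest feature up and the identical person becomes a "sports fan." Whoever sets the weights decides which version of the person the system sees.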

So, what does this look like? The eQuality Project partnered with David Phillips and Andrea Villanueva to produce a short film entitled You are What We Make You (YAWWMY). YAWWMY highlights the ways in which young people are sorted and categorized by algorithms that make decisions about who they are and what they’ll do based on the data they drop as they go about their daily lives. By physically representing the kinds of assumptions algorithms make about people, YAWWMY explores the impacts of this kind of algorithmic sorting on all of us and highlights the ways in which these algorithms can recreate and reinforce discriminatory biases.