Algorithms seem to be a special breed: acting in the shadows of the internet, they silently collect data about users, analyze it, and draw assumptions and inferences. Most of this stays hidden, so their potential power remains an almost magical mystery to the average user who feeds them with information. Only occasionally are we allowed minor glimpses ("You liked X, so you might also like Z"). What if we could make these algorithmic inferences visible?
Looking behind the algorithmic curtain could enable us to see through the tale of the all-knowing algorithms and give insight into their true capabilities, potentials, and threats.
Thanks to the GDPR, it is now possible to access most of the data we have given to online services. This raw data, combined with some of the services' algorithmic outcomes, could be used to gain insight into the algorithms themselves through reverse engineering.
For the data at hand, the cookie machine uses statistical tools to cross-reference information from the different data sets. It then phrases certain insights about the data's owner as single statements and distributes them. Each cookie contains an algorithmic truth that needs to be interpreted by its owner. Some insights might seem logical, others far-fetched, but some might feel like a sought-after glimpse of serendipity.
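The cross-referencing step could be sketched as follows. This is a minimal, hypothetical illustration: the data sets, field names, and the phrasing rule are invented here, and a real GDPR export would need parsing specific to each service.

```python
# Hypothetical sketch of cross-referencing two data exports into statements.
# All records and field names below are invented for illustration.
from collections import Counter

# Invented sample records standing in for two GDPR data exports.
streaming_history = [
    {"artist": "Artist A", "hour": 23},
    {"artist": "Artist A", "hour": 23},
    {"artist": "Artist B", "hour": 9},
]
search_history = [
    {"query": "insomnia remedies", "hour": 23},
    {"query": "coffee near me", "hour": 9},
]

def cross_reference(streams, searches):
    """Intersect the activity hours of both data sets and phrase
    one single statement per recurring overlap."""
    stream_hours = Counter(rec["hour"] for rec in streams)
    search_hours = {rec["hour"] for rec in searches}
    statements = []
    for hour, count in stream_hours.items():
        # Only recurring activity that appears in both exports
        # becomes a statement.
        if hour in search_hours and count > 1:
            statements.append(
                f"You are repeatedly active around {hour}:00 "
                f"across at least two services."
            )
    return statements

print(cross_reference(streaming_history, search_history))
```

With the sample data above, only the 23:00 overlap recurs, so a single statement is produced; each such statement would become the "algorithmic truth" inside one cookie.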