Because of the complexity and volume of the gathered raw data, and because I am repurposing it, I am currently programming custom data crawlers (small purpose-built programs that iterate through the datasets line by line) to arrange the data so that it is legible to future creative coding programs. This step, however, first required a number of analytical procedures, such as a ‘semantic’ log for this first dataset: which events and processes are being obfuscated, what are they called, and which of these can be said to relate specifically to the interface in question.
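The crawler step described above can be sketched roughly as follows. This is a minimal, hypothetical illustration only: the file layout, field names (`event`, `timestamp`, `payload`), and the idea that each line is a JSON record are my assumptions, not the actual structure of the dataset.

```python
import json

# Hypothetical line-by-line crawler sketch. All field names and the
# JSON-lines input format are assumptions for illustration only.
def crawl(in_path, out_path, wanted_events):
    """Read a log line by line, keep selected events, write them out
    in a reduced form that later programs can read."""
    rows = []
    with open(in_path, encoding="utf-8") as src:
        for line in src:
            line = line.strip()
            if not line:
                continue
            try:
                record = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip malformed lines rather than abort
            if record.get("event") in wanted_events:
                # keep only the fields later programs will need
                rows.append({"t": record.get("timestamp"),
                             "event": record["event"],
                             "data": record.get("payload")})
    with open(out_path, "w", encoding="utf-8") as dst:
        for row in rows:
            dst.write(json.dumps(row) + "\n")
    return len(rows)
```

The point of such a pass is reduction: each crawler reads one raw dataset once and emits only the events and fields that the ‘semantic’ log identified as relevant.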
Above is a Google DevTools visualization of the resulting dataset. After numerous crawling attempts, this is a first tentative result:
This diagram was drawn computationally from all the positions Facebook tracked and reacted to during my interaction. Significantly, it was generated entirely independently of any classically visual digital data (such as information from HTML / CSS). The next step, at the level of data analysis, is to correlate the various extracted datasets chronologically.
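The chronological correlation mentioned above could, in its simplest form, amount to merging the separate extracted streams by timestamp into a single timeline. The sketch below assumes each dataset is a list of `(timestamp, source, event)` tuples; the real field names and structures will differ.

```python
from heapq import merge

# Hedged sketch of chronological correlation: each dataset is assumed
# to be a list of (timestamp, source, event) tuples. Streams are
# sorted individually, then merged into one timeline by timestamp.
def correlate(*datasets):
    streams = [sorted(d, key=lambda r: r[0]) for d in datasets]
    return list(merge(*streams, key=lambda r: r[0]))
```

With the streams interleaved on one time axis, events from different datasets that occurred close together can then be inspected side by side.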