AR experience revealing emotional surveillance dangers
This project is a deep learning-powered AR experience that analyzes facial reactions to highlight the dangers of emotional surveillance by Big Tech. It targets artists, researchers, and users interested in AI ethics and augmented reality, offering a critical commentary on data privacy.
How It Works
The experience uses a custom computer-vision engine built on TensorFlow.js for facial analysis; the author also experimented with Dlib compiled to WebAssembly. Frame-accurate video synchronization is achieved by embedding encoded frame information in the video's overscan area, avoiding floating-point timing errors. For performance, an OffscreenCanvas zero-copy hack on Chrome runs the computer vision engine on a separate thread.
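The overscan trick could be sketched as follows. This is a minimal illustration, not the project's actual encoding: it assumes the frame index is written as 16 bits of black/white blocks along the top (overscan) pixel row, so reading the row back from a canvas recovers an exact integer frame number with no floating-point drift. The bit count, block width, and luminance threshold are all assumptions.

```javascript
// Sketch: decode a frame index from one row of RGBA pixel data.
// Assumes `bits` bits encoded MSB-first as black/white blocks of
// `blockWidth` pixels along the top (overscan) row of the frame.
function decodeFrameIndex(rgba, bits = 16, blockWidth = 8) {
  let index = 0;
  for (let bit = 0; bit < bits; bit++) {
    // Sample the center pixel of each block; rgba is [r, g, b, a, ...].
    const x = bit * blockWidth + Math.floor(blockWidth / 2);
    const r = rgba[x * 4];
    const bitValue = r > 127 ? 1 : 0; // white block = 1, black = 0
    index = (index << 1) | bitValue;
  }
  return index;
}

// In the browser, the row would come from a canvas draw of the video:
//   ctx.drawImage(video, 0, 0);
//   const row = ctx.getImageData(0, 0, video.videoWidth, 1).data;
//   const frame = decodeFrameIndex(row);
```

Because the index is read from pixel values rather than derived from `video.currentTime`, the decoded frame number is exact regardless of timer precision.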
Quick Start & Requirements
The project uses face-api.js for its computer vision engine. Specific installation or execution commands are not provided in the README. It requires a browser environment, with optimal performance on Chrome due to its implementation of OffscreenCanvas.
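A minimal face-api.js setup might look like the sketch below. The model directory path is an assumption, and the `dominantExpression` helper is hypothetical; face-api.js does expose `loadFromUri` model loaders and `detectSingleFace(...).withFaceExpressions()`, but the project's actual pipeline is not documented.

```javascript
// Hypothetical helper: pick the strongest expression from a
// face-api.js expressions object ({ happy: 0.9, sad: 0.02, ... }).
function dominantExpression(expressions) {
  return Object.entries(expressions)
    .reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}

// Browser glue (sketch, assuming models are served from /models):
//   await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
//   await faceapi.nets.faceExpressionNet.loadFromUri('/models');
//   const result = await faceapi
//     .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
//     .withFaceExpressions();
//   if (result) console.log(dominantExpression(result.expressions));
```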
Highlighted Details
The project includes a politically biased algorithm for dramatic effect.
Maintenance & Community
The project was a recipient of Mozilla's 2018 awards for art and advocacy. Further community or maintenance details are not provided.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial use or closed-source linking is not addressed.
Limitations & Caveats
The project's code is described as unoptimized and was developed under significant time constraints, leading to potential performance issues on non-Chrome browsers or average hardware. The README notes that Firefox's OffscreenCanvas implementation was incomplete during development, and regressions in Chrome have caused browser crashes.
Last updated: 2 years ago. Status: inactive.