Google Paper Signals was used by ARC Worldwide and Leo Burnett to help foster interagency teamwork. Employees from different accounts and departments were grouped into teams and tasked with creating a Paper Signal that physically displayed online data pulled via APIs and SDKs. Each team was given an Arduino-compatible Adafruit Feather HUZZAH with ESP8266 (Wi-Fi) and a micro servo.
“The Nose Knows” - This signal used Apify to web-scrape pollen.com and gather its data in JSON format. Every morning at a set time, the Signal would check the daily pollen forecast for your location and, if the forecast called for high pollen, proactively dispense your allergy medicine.
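A minimal sketch of what the firmware side could look like, assuming the Apify scrape is exposed as a plain JSON endpoint. The URL, the `pollenIndex` field name, the servo pin, and the dispense threshold are all placeholders rather than the actual build's values (ESP8266 Arduino core, ArduinoJson 6):

```cpp
#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>
#include <ArduinoJson.h>
#include <Servo.h>

const char* WIFI_SSID = "your-ssid";       // placeholder
const char* WIFI_PASS = "your-password";   // placeholder
// Assumed endpoint and field name -- the actual Apify dataset URL
// and the JSON shape scraped from pollen.com were not documented here.
const char* POLLEN_URL = "http://example.com/pollen-forecast.json";
const float DISPENSE_THRESHOLD = 7.0;      // assumed "high pollen" cutoff

Servo dispenser;

void setup() {
  dispenser.attach(14);                    // servo pin is an assumption
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  WiFiClient client;
  HTTPClient http;
  http.begin(client, POLLEN_URL);
  if (http.GET() == HTTP_CODE_OK) {
    StaticJsonDocument<512> doc;
    if (deserializeJson(doc, http.getString()) == DeserializationError::Ok) {
      float index = doc["pollenIndex"];    // hypothetical field name
      if (index >= DISPENSE_THRESHOLD) {
        dispenser.write(180);              // tip the tray to drop the pill
        delay(1000);
        dispenser.write(0);                // return to rest position
      }
    }
  }
  http.end();
  // The real build checked once each morning; this sketch simply
  // re-polls every six hours (delay() yields, so the Wi-Fi stack stays alive).
  delay(6UL * 60UL * 60UL * 1000UL);
}
```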
“Emoji Signal” - Can a robot show emotion? We asked Google to check #FACESOFCHICAGO on Instagram and use facial recognition to show us how Chicago is feeling.
This signal used a web scraper to collect image URLs from #FACESOFCHICAGO posts. A Node.js server was set up on Google Cloud Platform to iterate through each image and send it to the Google Vision API for face detection and emotion analysis. The emotion data for the detected faces was saved to a JSON file, which the Arduino then read to display the matching emoji.
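The Arduino side of that pipeline might look like the sketch below, which polls the server's JSON output and turns the servo to point at the matching paper emoji. The endpoint URL, the `dominantEmotion` field, and the dial angles are assumptions; the emotion names follow the Vision API's joy/sorrow/anger/surprise likelihoods (ESP8266 Arduino core, ArduinoJson 6):

```cpp
#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>
#include <ArduinoJson.h>
#include <Servo.h>

const char* WIFI_SSID = "your-ssid";      // placeholder
const char* WIFI_PASS = "your-password";  // placeholder
// Hypothetical URL where the GCP server publishes the aggregated
// Vision API results as a small JSON summary.
const char* EMOTION_URL = "http://example.com/chicago-emotion.json";

Servo dial;

// Map the dominant emotion to a dial position; the emoji faces sit
// around the servo horn (the angles here are assumptions).
int angleFor(const char* emotion) {
  if (strcmp(emotion, "joy") == 0)    return 0;
  if (strcmp(emotion, "sorrow") == 0) return 60;
  if (strcmp(emotion, "anger") == 0)  return 120;
  return 180;                              // surprise / unknown
}

void setup() {
  dial.attach(14);                         // servo pin is an assumption
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  WiFiClient client;
  HTTPClient http;
  http.begin(client, EMOTION_URL);
  if (http.GET() == HTTP_CODE_OK) {
    StaticJsonDocument<256> doc;
    if (deserializeJson(doc, http.getString()) == DeserializationError::Ok) {
      // "dominantEmotion" is a hypothetical field name; fall back to
      // "unknown" if the server response omits it.
      dial.write(angleFor(doc["dominantEmotion"] | "unknown"));
    }
  }
  http.end();
  delay(5UL * 60UL * 1000UL);              // refresh the dial every 5 minutes
}
```

Keeping the Vision API calls on the server and handing the Arduino a pre-digested summary suits the ESP8266 well: the board only has to fetch and parse a tiny JSON file rather than authenticate to Google Cloud itself.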