This guy rigged his MacBook with code that describes the city it sees in real time
There are so many things you can do with a MacBook. For example, loading it with code that lets it describe what it sees in real time.
What happens when you take a MacBook loaded with a neural network that describes what it sees in an image, and parade it through town to see people’s reactions?
This is exactly what Kyle McDonald had in mind when he created the video above. Using Andrej Karpathy’s “NeuralTalk” code, modified to run from a MacBook’s webcam feed, McDonald took to the streets of Amsterdam to see the neural network in action. And I must say, the results are pretty fascinating.
As you can see in the video, the neural network doesn’t get everything correct. For example, it takes the software a little longer to decipher whether someone in the shot is eating a hot dog or holding their smartphone, and sometimes it sees things that the average person would never see.
But let’s keep it real here: the fact that this is code deciphering the pixel colors and brightness of whatever is shown in an image is pretty fucking amazing.
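To get a feel for what “deciphering pixel color and brightness” means, here’s a toy sketch (not NeuralTalk itself, which pairs a convolutional network with a recurrent language model). All the software ever receives is a grid of RGB numbers per frame; everything else is learned on top of that. The function and sample frame below are purely illustrative:

```python
def average_brightness(frame):
    """frame: list of rows, each row a list of (r, g, b) tuples in 0-255.

    A real captioning network learns rich features from these raw values;
    this stand-in just reduces them to one number, perceived brightness.
    """
    total, count = 0, 0
    for row in frame:
        for r, g, b in row:
            # Perceptual luma approximation (ITU-R BT.601 weights).
            total += 0.299 * r + 0.587 * g + 0.114 * b
            count += 1
    return total / count

# A 2x2 "image": one white pixel, three black pixels.
frame = [
    [(255, 255, 255), (0, 0, 0)],
    [(0, 0, 0), (0, 0, 0)],
]
print(average_brightness(frame))
```

That single number is obviously a far cry from a sentence like “a man is eating a hot dog,” which is exactly why a trained neural network pulling full descriptions out of the same raw pixel grid is so impressive.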
To learn more, check out the video above or click here.