Facebook announced that its artificial intelligence division is working on a new project to teach AI to perceive the world through human eyes, that is, from a first-person view of the life that surrounds us.
The company hopes that this AI will eventually be used in its augmented reality glasses to help people with their daily routines and household chores.
The research project is called Ego4D, and it involves 13 universities and laboratories across nine countries. Using over 2,200 hours of first-person video of people going about their daily lives, the researchers will train the AI to perceive the world in an "egocentric" way, seeing it from a human perspective.
It is important to note that the universities, not Facebook, were responsible for collecting the data. Some of the participants were paid for taking part in the experiment. They recorded videos of their daily activities, such as cooking, chatting with friends, or playing with pets, using smart glasses and GoPro cameras. The videos were then anonymized.
Ego4D has two main components: a massive dataset of first-person videos and a set of benchmark tasks the AI should be able to tackle. These include episodic memory, forecasting, hand-object interaction, audio-visual diarization, and social interaction.
Facebook noted that this is a research project, not a commercial one. However, The Verge notes that privacy experts are already concerned that Facebook's AR glasses will allow wearers to secretly record people, and the newly announced AI capabilities only deepen this concern.