Some Instagram users have recently reported that Instagram has begun asking them to take a video selfie to verify their identity and log into their accounts.

One such user is social media consultant Matt Navarra, who shared screenshots of Instagram asking him to take a video selfie showing his face from different angles to confirm that he is a real person.

Instagram says it asks users to verify their identity when it detects suspicious activity, for example, when an account rapidly likes many posts in a row or starts following a large number of accounts in quick succession.

However, in practice, an account can be flagged for additional verification even when the activity originates elsewhere, such as when someone else mass-likes its posts or directs suspicious activity at it.

The platform also assured users that the feature is not used for facial recognition but only to confirm that the account is not operated by a bot. The video selfies are reviewed by Instagram teams, that is, by real people, and are deleted after 30 days.

Instagram has long battled bots that spread spam, harass users, or artificially inflate follower and like counts.

Meta, which owns Instagram, recently announced that it is shutting down its facial recognition feature. However, the company clarified that it was retiring only a specific Facebook feature, not abandoning the use of facial recognition in general.