Regulating fake video

Let's regulate videos like we regulate food.

Deep fakes, or the ability to manipulate people’s actions in videos, were in the news recently. Maclean’s wrote a very concerned article in December 2018.

“This will be the year that you can no longer believe your eyes. Video manipulation techniques that previously only special effects studios could pull off will soon be available in off-the-shelf programs that anyone can use, and we are not ready for the way that will change the world.”

I rarely believed my eyes when it came to video even before this development.1 Cameras are operated by people, who set a narrative based on what they record/don’t record through their camera(s). Editing and audio commentary/subtitles add further bias. So, videos are to be interpreted in context, not taken as truth. I view video-based evidence with the same level of scepticism as a text news article. To a lot of people, though, video feels more real than someone’s written account, because a camera is supposed to be an extension of your eyes. The camera sees, and seeing is believing.

This fake video development adds another scary element, even without the shaming/humiliating element of fake sexual explicitness. A video, especially of a public figure, can be invented to make them look like they’re saying things they did not say, or doing things they did not do. Left unchecked, such fakes decimate the notion that a video recording can be a reliable record of events.

Deep fake video is thus a serious problem. But it fits within the overall framework of video authenticity. When anyone can make a video and broadcast it to the world, authenticity is hard to verify. When the video is malevolent misinformation, it’s an even bigger problem, because the maker has designed the video to manipulate. In India, for instance, fake videos (not deep fakes, crude fakes) have been used to incite violence.

Regulating fakes

There is a regulatory way forward, though, and you only need to look to something more vital to life than YouTube.

Think of video like you’d think of food.

Everyone consumes food. A significant fraction of those people produce it as well. The food most people make is intended either for personal consumption, or for consumption by a few other people who all have a significant relationship with the cook. It’s the same for most video now, produced on cellphones for the consumption of a few people, all known to the producer.2 If your food is undercooked, or insufficiently cleaned, it might make a few people sick/unhappy, but it’s almost always an accident, with repercussions for only a few. Also, malevolence is almost non-existent.

This is level 1: cooking for your family or for a dinner party, the same as home video. No regulation is required; the product is not intended for public consumption.

Professional food production, whether it’s on a farm, in a factory, or at a restaurant, you may notice, is highly regulated. And for good reason: improperly produced food can kill. When you’re selling to the public, you’re feeding strangers, and the numbers are now in the hundreds, thousands or millions.

This is level 2. While people may argue about the extent of food regulation, there’s no question that every aspect of food production is subject to some safety procedures, inspections, certifications and penalties for non-compliance. This regulation did not happen overnight; industrial food production was fraught with abuse/cheating before regulations assured people that, for the most part, if the system is working reasonably well, the food is what it is claimed to be and will not kill them.

So, is it time to regulate public video like public food? Where would you start? At the host (youtube/facebook) level? No, that’s too late.3 That’s like regulating food only when it gets to the grocery store. It would have to be regulated at the production stage, in the software/firmware. So, to begin, you’d have to certify that your raw video recording software records what the camera is pointing at. In addition, any software that edits video would have to be regulated so that edits are transparent and saved with an audit trail, allowing the details of changes to be retrieved later. This is just the beginning.4
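To make that concrete, here is a minimal sketch (in Python) of what such an audit trail could look like, assuming a hypothetical record format: the capture event and every subsequent edit are appended as records, each hashed and chained to the previous one, so any after-the-fact tampering breaks the chain. The field names and functions are illustrative only, not an existing standard.

```python
import hashlib
import json
import time


def _record_hash(record: dict) -> str:
    """Stable SHA-256 over a record's JSON form (excluding its own hash)."""
    body = {k: v for k, v in record.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()


def new_capture_trail(device_id: str, software_id: str) -> list[dict]:
    """Start an audit trail with a capture event from (certified) recording software."""
    record = {
        "event": "capture",
        "device_id": device_id,
        "software_id": software_id,  # would map to an entry in a certification registry
        "timestamp": time.time(),
        "prev_hash": None,
    }
    record["hash"] = _record_hash(record)
    return [record]


def append_edit(trail: list[dict], software_id: str, description: str) -> list[dict]:
    """Append an edit event, chained to the hash of the previous record."""
    record = {
        "event": "edit",
        "software_id": software_id,
        "description": description,  # e.g. "trimmed 0:00-0:05", "colour correction"
        "timestamp": time.time(),
        "prev_hash": trail[-1]["hash"],
    }
    record["hash"] = _record_hash(record)
    return trail + [record]


def chain_is_intact(trail: list[dict]) -> bool:
    """True if no record in the trail was altered or dropped after the fact."""
    prev_hash = None
    for record in trail:
        if record["hash"] != _record_hash(record) or record["prev_hash"] != prev_hash:
            return False
        prev_hash = record["hash"]
    return True
```

A real scheme would also need cryptographic signatures tied to certified devices and software, plus key management and a certification registry; a hash chain on its own only shows that the recorded history hasn’t been quietly rewritten.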

So, is this too onerous? Possibly. But remember, this applies to public video only. If you were recording a video of your toddler being cute,5 and sending it to your friends/family, that’s not public. If you took the next step of posting it on facebook, all facebook would need to do is confirm that your android/iphone video software was cleared for public video, and that any edits were applied by software that could authenticate the edits.6,7
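Continuing the hypothetical sketch above, that host-side check could be as simple as: is the chain intact, and did only certified software touch the video? The certification list here is, of course, made up.

```python
import hashlib
import json

# Illustrative IDs only; a real registry would be maintained by a regulator.
CERTIFIED_SOFTWARE = {"android-camera-11", "iphone-camera-16", "editor-pro-3"}


def cleared_for_public_posting(trail: list[dict]) -> bool:
    """Host-side gate: trail starts with a capture, hash chain intact, software certified."""
    if not trail or trail[0]["event"] != "capture":
        return False
    prev_hash = None
    for record in trail:
        body = {k: v for k, v in record.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["hash"] != recomputed or record["prev_hash"] != prev_hash:
            return False  # trail was tampered with after the fact
        if record["software_id"] not in CERTIFIED_SOFTWARE:
            return False  # recorded or edited with uncertified software
        prev_hash = record["hash"]
    return True
```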

People who produce video professionally already have a greater responsibility to adhere to professional standards. From fact-checking to transparent editing to commercial software and more, the structures to regulate them are in place. The structures are also in place to remove unauthorized public videos produced by professional organizations, since they already operate within a regulatory framework of financial/legal standards, professional codes and more.

If you think video has a great influence on people, for good and for bad, and you want to minimize misinformation, you should be deeply concerned about the complete lack of standards and regulation for public video.

  1. I watch fiction/documentary/music video 99% of the time. News video, political debates, speeches, etc. either bore me or increase my stress levels. I think it’s because it’s a very non-interactive form of communication.
  2. Hi mom! I create 10 videos a week of my toddler!
  3. Also, we don’t want these profit-maximizing eyeball-lusting overlords to have even more power
  4. I’ve been told I like focusing big picture and need to work on dotting my i’s, but this is my blog post, so, that’s all you’ll get :)
  5. Mine’s the best!
  6. Perhaps, akin to antivirus software, algorithms could detect unauthorized manipulation, but not necessarily if the video was not made public.
  7. I presume this is possible in this brave new world of AI, machine learning, insert robots-will-take-over-the-world jargon here.