Facebook makes its own chips and uses AI to moderate violent live broadcasts

Live streaming is enormously popular, and enormously controversial. Beyond stripping, rumor-mongering, and fake hype, live-streaming platforms have been flooded with something far worse: violence, murder, and suicide.

In 2017, Facebook went through a bloody Easter: a man shot and killed another person, posted the footage to the platform, and later took his own life.

After a string of livestreamed killings, major online media platforms began developing "violence-tagging algorithms". Facebook recently announced that it is developing its own AI chips to detect, tag, and filter this kind of live video.

Social media companies make their own chips

Software upgrades alone can no longer meet Facebook's needs; more powerful algorithms require the support of custom AI chips. Although plenty of companies (Intel, Samsung, Nvidia, and others) are building energy-efficient chips, Facebook decided to do it itself. Yann LeCun, Facebook's chief AI scientist, said:

More and more smartphones are equipped with powerful chips that let users run speech recognition, augmented and virtual reality, and image and video processing directly on their devices. This trend will grow, prompting more software-focused companies to consider hardware.

Facebook recently announced a system in which users can proactively upload their own nude photos so the platform can generate a digital fingerprint; the AI then recognizes that fingerprint and blocks anyone else from uploading the image. The system is aimed at combating “revenge porn”.
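Facebook has not published the implementation details, but a "digital fingerprint" of this kind is commonly built with perceptual hashing: a compact hash that stays stable under re-compression and resizing, so a re-uploaded copy of the same photo can be matched and rejected. The sketch below illustrates that general idea with the open-source Pillow and ImageHash libraries; the fingerprint/is_blocked helpers and the distance threshold are illustrative assumptions, not Facebook's actual API.

```python
# Minimal sketch of hash-based image fingerprinting (an illustration of the
# general technique, not Facebook's system).
# Requires:  pip install Pillow ImageHash
from PIL import Image
import imagehash


def fingerprint(path: str) -> imagehash.ImageHash:
    """Perceptual hash that survives re-compression and resizing."""
    return imagehash.phash(Image.open(path))


def is_blocked(upload_path: str, blocklist: list[imagehash.ImageHash],
               max_distance: int = 5) -> bool:
    """Reject an upload whose fingerprint is within `max_distance` bits
    (Hamming distance) of any registered fingerprint."""
    candidate = fingerprint(upload_path)
    return any(candidate - registered <= max_distance for registered in blocklist)


# Usage: register the reported photo once, then screen future uploads.
# blocklist = [fingerprint("reported_photo.jpg")]
# print(is_blocked("new_upload.jpg", blocklist))
```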

Identifying violent video is far harder than identifying nude photos. It is difficult for AI to tell whether a piece of content is promoting violence, signaling suicidal intent, or just a joke. Beyond scanning the content itself, the AI needs to be able to read it in context.
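To make the role of context concrete, here is a toy sketch (not Facebook's system; the signal names and weights are invented for illustration) of how a frame-level violence score might be fused with contextual signals from the caption and viewer chat, so that identical footage is scored differently depending on what surrounds it.

```python
# Toy illustration of context-aware scoring (all signals and weights are
# hypothetical).  The frame model alone cannot tell a staged fight from a
# real threat, so its score is combined with text and audience signals.
from dataclasses import dataclass


@dataclass
class ClipSignals:
    frame_violence: float   # 0..1, from a hypothetical frame-level classifier
    caption_threat: float   # 0..1, from a hypothetical text classifier
    chat_alarm: float       # 0..1, share of viewers posting alarmed comments


def violence_score(s: ClipSignals) -> float:
    """Weighted fusion of visual and contextual signals."""
    return 0.5 * s.frame_violence + 0.3 * s.caption_threat + 0.2 * s.chat_alarm


# The same footage scores very differently with and without threatening context.
staged = ClipSignals(frame_violence=0.8, caption_threat=0.1, chat_alarm=0.1)
threat = ClipSignals(frame_violence=0.8, caption_threat=0.9, chat_alarm=0.7)
print(violence_score(staged))   # ≈ 0.45
print(violence_score(threat))   # ≈ 0.81
```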

(Image source: Fortune)

Current moderation is too slow, and it is very resource-intensive.

Facebook's current moderation system pairs an AI system with human moderators. This traditional approach carries heavy compute and labor costs.
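In practice, this division of labor is usually arranged as a triage pipeline: the model acts on its own only when it is highly confident and sends uncertain cases to a human review queue. The sketch below shows that common arrangement under assumed thresholds and scores; it is not a description of Facebook's implementation.

```python
# Minimal sketch of an AI + human-moderator triage pipeline (thresholds,
# names, and scores are hypothetical, not Facebook's implementation).
from queue import PriorityQueue

AUTO_REMOVE = 0.95   # act automatically only when the model is very confident
HUMAN_REVIEW = 0.60  # anything above this goes to the human queue

review_queue: PriorityQueue = PriorityQueue()


def triage(stream_id: str, ai_score: float) -> str:
    """Route a live stream based on the model's violence score (0..1)."""
    if ai_score >= AUTO_REMOVE:
        return "remove"                      # cut the stream immediately
    if ai_score >= HUMAN_REVIEW:
        # PriorityQueue pops the smallest item first, so negate the score
        # to review the most alarming streams first.
        review_queue.put((-ai_score, stream_id))
        return "escalate"
    return "allow"


print(triage("stream-42", 0.72))   # -> escalate
```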

Video sites struggle to balance how strict review is against how long it takes. Under current conditions, tightening review inevitably slows it down, and if review takes too long, users leave. Imagine a creator's uploaded video sitting in review for three days: fans would complain loudly, and the creator's income would take a serious hit.

When it comes to fighting violent crime, review speed matters even more. In the Easter killing described above, the murderer posted a video announcing his intentions before the killing, but it drew no attention, and his atrocity was neither discovered nor stopped in time.

The “killing video” stayed on Facebook for more than two hours, long enough to be downloaded and re-uploaded to other platforms, spreading the crime even further.

(Image: Yong Zihua giving advice to suicidal viewers; source: 哔哩哔哩)

On May 20, 2018, a girl announced on Weibo that her whole family was going to commit suicide; the post alarmed netizens and she was eventually rescued. It was not the first suicide note to appear on Weibo. A generation that grew up in the internet age leaves even its final words online.

Although treating life this way is not advisable, and plenty of suicides and suicide notes have appeared online before, such public “suicide announcements” also give those contemplating suicide a chance to be rescued. The key is that they are discovered in time.

AI moderators armed with stronger “violence-tagging algorithms” are the future: they can not only flag problems in time, helping prevent violence and suicide, but also cut compute and labor costs.

(Image source: Innovation Village)

At the end of 2016, China promulgated its "Provisions on the Administration of Internet Live-Streaming Services", and platforms worked with the relevant authorities to clean up live content. Judging from the current situation, however, pornography, violence, rumors, fraud, and similar content have not completely disappeared from live-streaming platforms.

On October 26, 2017, the e-sports player “Death Announcement” was suspected of beating his girlfriend during a live stream and was taken in by police to be investigated for assault. League of Legends officials subsequently banned him from competition for 20 months, and multiple live-streaming platforms banned his account.

Violent behavior is no more common online than in real life, but violence spread through live broadcasts has a far wider social impact. With that broadcast, Death Announcement “announced” the death of his own career.

(Image: screenshot from Death Announcement's live stream; source: TechWeb)

Recently, the “eye-cancer girl Xiao Fengya” incident saw a live-streamed fundraising effort questioned by netizens, and live-platform tipping was dragged into the public controversy over charity fundraising.

The earthy, down-home flavor of live streaming can be a bit endearing, but live-streamed violence is ugly, and play-acted “suicide” broadcasts burn through netizens' goodwill. Whether in rules or in moderation technology, live-content supervision still has plenty of homework to do.
