The Japan News by The Yomiuri Shimbun

JR East’s facial recognition watch list attracts scrutiny

- The Yomiuri Shimbun

Since July, East Japan Railway Co. (JR East) has been using facial recognition technology at stations and other locations to monitor individuals on a surveillance watch list that includes people such as former prisoners, it has been learned.

According to JR East and other sources, the AI-equipped facial recognition system can be used at places such as JR East stations to monitor people who have served sentences for serious crimes, as well as wanted suspects and people behaving suspiciously. The company said it conducts baggage checks on people flagged by the system if necessary.

However, JR East’s use of the technology is likely to spark controversy, as it involves monitoring the movements of people who have finished serving their sentences and restricting their activities.

Images of people on JR East’s watch list are stored in a database, which its facial recognition system uses to automatically cross-check people captured by security cameras operated by the company. A total of 8,350 networked cameras are installed at locations including 110 major stations and electric power substations, but JR East has not disclosed how many are currently being used under the system.

JR East receives information from the Public Prosecutors Office based on a notification system in which victims, witnesses and operators of relevant sites are informed when offenders have completed sentences or are released on parole.

When JR East receives information about the release of offenders who committed serious crimes that involved JR East or its passengers, it saves their names, details of their offenses and mug shots that were published by media outlets at the time of their arrests. As of early September, there were no such cases registered in the database, according to sources.

The watch list does not cover crimes such as groping or theft.

When cameras detect subjects, including suspicious people and wanted suspects, security guards visually confirm their identity and, if necessary, alert the police or inspect their baggage.

JR East announced the introduction of facial recognition cameras on July 6 as part of terrorism countermeasures for the Tokyo Olympics and Paralympics, and the cameras went into operation on July 19.

Information about the company’s use of facial recognition cameras is clearly indicated on its website and in stations. However, the company does not state that its watch list includes former prisoners and parolees.

Under the Law on the Protection of Personal Information, criminal records are classified as sensitive information, and obtaining such information without the consent of the individual is prohibited.

However, exceptions are made for cases based on laws and regulations such as the victim notification system.

“This is a necessary measure that puts the safety of passengers first,” a JR East official said. “We can’t disclose the details for security reasons. We are thoroughly managing the information.”

EUROPE, U.S. DEVISING GUIDELINES

Facial recognitio­n cameras can collect biometric data remotely without subjects being aware. The data can then be used to conduct sophistica­ted surveillan­ce by linking it to other types of informatio­n, such as travel and purchase histories.

If used responsibly, facial recognition cameras can improve public safety, but there are concerns that such a significant infringement on people’s privacy could have a chilling effect on society.

In light of this, efforts to develop rules specifically about facial recognition cameras are underway in Western countries.

Under the General Data Protection Regulation (GDPR), which is equivalent to Japan’s Law on the Protection of Personal Information, the European Union defines biometric data, including information on facial features, as “special categories of personal data,” prohibiting the handling of such data without the consent of the individual. The EU has also created guidelines specific to facial recognition cameras.

A draft EU regulation on AI released in April this year also calls for strictly restricting the use of facial recognition cameras in public spaces.

In July, a Spanish retail chain operator was fined about ¥300 million for violating the GDPR by using facial recognition cameras to monitor people who had committed robbery and other crimes in its stores and were later released from prison.

British police operated facial recognition cameras in high-crime areas, with an external auditing organization monitoring their use. However, a court ruled in August last year that the use of the cameras in one case was unlawful because there was no clear guidance on whom to target or where to install such cameras.

Some U.S. states are also considering regulations on facial recognition cameras, and several have already passed laws.

Japan’s Law on the Protection of Personal Information treats facial feature data as information equivalent to photographs and does not provide higher levels of protection. Under the law, consent from individuals is not necessary to obtain such biometric data; public notifications are sufficient.

However, the Personal Information Protection Commission has interpreted the law to mean that such notifications and announcements are unnecessary in the case of security cameras.

As few business operators disclose information regarding security cameras, it is difficult to determine the scale of facial recognition technology use, prompting concerns.

Amid such criticism, the commission changed its interpretation this month. From April next year, users of facial recognition technology will be required to notify the public of the purpose of its use.

However, there will be no need to disclose details or obtain consent. (Sept. 22)
