Facial recognition is one of the most remarkable technologies of modern times, combining established hardware (cameras) with highly complex software that enables actions such as tracking. The technology has significant applications, the consequences of which are not yet clear given its infancy. However, it is widely understood in the industry to be a powerful and consequential tool, which emphasizes the question of ethics in its development, implementation, and execution (Noorden). Faced with the ethical choice of implementing a facial recognition system at the University of Hawklandia, I, as the President, would agree to the use of FaceTech with certain exemptions. Although active in the stadium, the system will not be used against students unless they were involved in criminal conduct in the past. Furthermore, the in-classroom add-on will not be activated. This decision is based on the goal of providing comprehensive campus security without limiting students' rights and while promoting academic learning free of intervention.
The first ethical issue is the right to privacy: each person should be able to reasonably avoid being filmed if they do not wish to be and to retain their personal space. For individuals to be recorded and analyzed on a continuous basis such as this, they would ordinarily have to give consent. This applies to all stakeholders, including students (some of whom may be minors), staff, and visitors. Rob Watts discusses consent as part of ethical implementation, suggesting that organizations provide the ability to opt in or out, educate the public, and work with a transparent provider that uses the technology in a contractually responsible manner. On the scale of a university, opting out of being filmed is unrealistic and defeats the system's point. However, the administration can still be transparent, making people aware of how facial recognition is used for security purposes, such as automatically identifying non-students and non-staff, tracking their actions, and intercepting any potential criminal or threatening activity.
While FaceTech may serve as a brilliant security system, instantly recognizing those who have been banned from the stadium or who pose a danger to the public based on law enforcement databases, this information can also be greatly abused. Such systems are already popular at stadiums worldwide to prevent banned, rowdy fans from returning to games; at the same time, they offer the convenience of paying for concessions seamlessly by simply looking at a camera (Rosenberg). A similar approach would be used at the University, but unlike those venues, it is not a tremendous business hosting multibillion-dollar sporting events. To avoid overstepping, security will intervene only if a student has been criminally charged for their behavior during previous games.
The biggest concern is that a system meant to protect the university population can easily be turned against it. For example, vulnerable populations such as the Uyghur people in China are being tracked down and placed in concentration camps simply because of their ethnicity and religion (Noorden). It was the decision of the government and of those controlling the facial recognition system to single out that specific group. Similar concerns can arise even on a U.S. campus. Facial recognition systems have been found to recognize minorities far less accurately than their white counterparts; even the best algorithms misidentify African Americans five to ten times more often (Simonite). Hypothetically, the system could therefore erroneously flag a minority student as a match in a criminal database. For this reason, its use has to be monitored very carefully.
Finally, there is the issue of data collection. Similar to the right-to-privacy concern, most people will not know that they are being recorded while a complex algorithmic database analyzes their faces and figures. Depending on the private contractor, that data is stored, the AI learns from it, and it can even be sold to third parties (“Facial-Recognition Research Needs an Ethical Reckoning”). This once again affects all stakeholders, and it is questionable where that data ends up later. It can be argued that virtually every other device, including computers and cellphones, already records data ranging from voice to geolocation and sends it to outside servers. While true, this comes back to the principle of consent. When purchasing and activating a phone, users accept terms of service that notify them of ongoing tracking. Furthermore, if need be, an individual can knowingly distance themselves from a phone, but with a facial recognition system, one never knows when and where one is being recorded.
The FaceTech system was selected for the University of Hawklandia primarily to enhance security. Unfortunately, due to the socio-political environment, campuses have often become targets of violence (mass shootings, political protests) as well as sites of potential inappropriate behavior, so a comprehensive security system can help promote safety. Using Badaracco’s four frameworks for a decision, this step can be justified. It offers the best net-net outcome, improving security without significant compromise of rules or freedoms. In terms of rights, the issue of vulnerable groups was considered, and measures will be taken not to violate students’ rights or discriminate against any racial group. In terms of the message sent about character, the decision will present the university as a strong leader in adopting innovative technology. Finally, the realities of the world dictate that the FaceTech system is needed to protect the campus. Ethically, the decision stands because it is made with the objective of the common good. While compromises remain, all attempts to regulate and ethically control the system will be put in place as safeguards.
Works Cited
“Facial-Recognition Research Needs an Ethical Reckoning.” Nature, vol. 587, 2020, Web.
Noorden, Richard V. “The Ethical Questions That Haunt Facial-Recognition Research.” Nature, vol. 587, 2020, Web.
Rosenberg, Michael. “Facing Consequences.” Sports Illustrated, 2021, Web.
Simonite, Tom. “The Best Algorithms Struggle to Recognize Black Faces Equally.” Wired, 2019, Web.
Watts, Rob. “Facial Recognition Unmasked: How Companies Can Ensure an Ethical Implementation.” Forbes, 2018, Web.