
The Community Reporting of Algorithmic System Harms (CRASH)

Editorial

Dr. Tamaro J. Green

TJG News:

2021-02-08

 

The Community Reporting of Algorithmic System Harms (CRASH) project of the Algorithmic Justice League is an essential effort to review how algorithms affect communities.  The project may provide insight into how we can better develop and audit algorithms to prevent harm.  This is no small undertaking: the risk that algorithms pose to communities grows as algorithms gain more influence over decision-making processes.  The Community Reporting of Algorithmic System Harms, CRASH, is a critical project for the protection of constitutional rights.

In U.S. v. Volkswagen, 16-CR-20394, Volkswagen was charged with one count of conspiracy to defraud the United States and its customers and to violate the Clean Air Act by misleading the EPA and customers about compliance with U.S. vehicle emissions standards.  The case brought against Volkswagen is analogous to the development of algorithms that cause harm to customers in the United States.  Companies that design algorithms while misleading customers, and the agencies charged with protecting civil rights, about how those rights are protected may be engaging in practices similar to Volkswagen's.  This is not to accuse any one company.  Many companies develop and employ algorithms to make decisions that were once made by humans.  They should all be held accountable when the decisions those algorithms make violate civil rights.  They should also be held accountable when they conspire to defraud research and audits of those algorithms.

Title 18, U.S.C., Section 241 – Conspiracy Against Rights makes it unlawful for two or more persons to conspire to injure, oppress, threaten, or intimidate any person in the free exercise or enjoyment of any right secured by the Constitution.  This statute makes it clear that conspiring to threaten or intimidate any person in the exercise of constitutional rights is unlawful.  The difficulty is that the statute was written before the design of algorithms.  When an algorithm performs the violation, there is ambiguity about who is held accountable.  If algorithms are not audited and they take actions that violate civil rights, accountability may fall on whoever authorized the algorithm, designed it, developed it, or operated it.  This is an area that may require further review for policy and regulation.  A small error in the development of an algorithm could have a profound effect on people's constitutional rights.
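One part of preventing such harms is auditing the decisions an algorithm actually makes.  As a rough illustration only, the short Python sketch below applies one common screening test, a disparate-impact check based on the "four-fifths rule," to hypothetical decision data.  The group labels, outcomes, and threshold are assumptions made up for this example; they are not drawn from the CRASH project or any specific legal or audit standard.

    # A minimal sketch of one kind of algorithmic audit: a disparate-impact
    # check using the "four-fifths rule". All data here is hypothetical and
    # only illustrates the idea of auditing a system's decision log.

    def selection_rates(decisions):
        """Rate of favorable outcomes per group.

        `decisions` maps a group label to a list of 0/1 outcomes
        (1 = favorable decision, e.g. an approval)."""
        return {group: sum(outcomes) / len(outcomes)
                for group, outcomes in decisions.items()}

    def disparate_impact_audit(decisions, threshold=0.8):
        """Flag groups whose selection rate is below `threshold` (four-fifths)
        of the highest group's selection rate."""
        rates = selection_rates(decisions)
        best = max(rates.values())
        return {group: rate / best for group, rate in rates.items()
                if rate / best < threshold}

    if __name__ == "__main__":
        # Hypothetical decision log from an automated system.
        decisions = {
            "group_a": [1, 1, 1, 0, 1, 1, 0, 1],   # 75% favorable
            "group_b": [1, 0, 0, 0, 1, 0, 0, 1],   # 37.5% favorable
        }
        print("Groups below the four-fifths threshold:",
              disparate_impact_audit(decisions))

In practice, a check like this would run against real decision logs and be combined with many other tests; community reporting efforts such as CRASH can help surface where such audits are needed.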

There are other federal civil rights statutes that may also be affected by the design of algorithms.  To read more about federal civil rights statutes, visit the FBI website at https://www.fbi.gov/investigate/civil-rights/federal-civil-rights-statutes.  For information about the Community Reporting of Algorithmic System Harms (CRASH) project, please visit http://crash.ajl.org/.

 

Dr. Tamaro Green is a computer science researcher and the founder of TJG Web Services.  TJG Web Services, LLC is a consulting firm in the field of information technology.  Dr. Green writes on topics of privacy, security, and ethics in information technology and computer science.

TJG News Editorials are opinion pieces and do not necessarily express the opinion of TJG News.  To publish editorial pieces in TJG News send an email to editor@tjgnews.com.