ORCAA has been featured in Fast Company (“The food we eat has quality certifications. Why shouldn’t the algorithms that shape our world?”), in MIT Technology Review, and in Wired.
Companies and organizations increasingly use mathematical models to streamline important decisions, yet these new decision-making processes have gone largely unaudited. Indeed, they are often assumed to be fair and objective simply by dint of their mathematical nature.
We at ORCAA call this the “authority of the inscrutable.” It’s real, but it’s undeserved, and it is increasingly being questioned. Lawsuits have been filed against the use of dubious algorithms, such as personality tests used for hiring and recidivism risk models used for sentencing, parole, and bail.
ORCAA’s mission is twofold. First, to help companies and organizations that rely on time- and cost-saving algorithms get ahead of this wave: to understand and plan for their litigation and reputational risk and, most importantly, to use algorithms fairly.
The second half of ORCAA’s mission is to develop rigorous methodology and tools, and to set standards, for the new field of algorithmic auditing.
ORCAA is a consulting firm that helps companies and organizations manage and audit their algorithmic risk.
ORCAA audits existing algorithms for accuracy, bias, consistency, transparency, fairness, and timeliness, following a four-step approach:

1. A deep dive into the high-level design of the algorithm: the assumptions about the type, quality, and quantity of data; the nature of the data pre-processing; the algorithm itself; the structure of and assumptions behind the objective function and its optimization; the way the algorithm was trained; and the approach taken to monitoring and updating it.
2. Matching the design to the implementation. This is done at the code level where the code is available, and via black-box testing where it is not.
3. The execution audit, in which we kick the tires of the algorithm’s real-world constraints.
4. A write-up of the process, including a “to-do” list of improvements and suggestions.
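To make the black-box testing idea concrete, here is a minimal sketch of one such check: probing a decision algorithm through its inputs and outputs alone, and comparing selection rates across groups. The function names, the toy model, and the applicant data are all illustrative assumptions, not ORCAA’s actual tooling.

```python
def selection_rates(model, applicants, group_key):
    """Rate of positive decisions per group, treating `model` as a black box."""
    totals, positives = {}, {}
    for person in applicants:
        g = person[group_key]
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + (1 if model(person) else 0)
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest."""
    return min(rates.values()) / max(rates.values())

# Toy black-box model: approves anyone with a score of 50 or more.
toy_model = lambda person: person["score"] >= 50

applicants = [
    {"group": "A", "score": 70}, {"group": "A", "score": 55},
    {"group": "A", "score": 40}, {"group": "B", "score": 65},
    {"group": "B", "score": 30}, {"group": "B", "score": 20},
]
rates = selection_rates(toy_model, applicants, "group")
print(rates)                          # group A selects at ~0.67, group B at ~0.33
print(disparate_impact_ratio(rates))  # 0.5, which would flag a possible disparity
```

A real audit would use far richer outcome measures, but the pattern is the same: the auditor never needs to see inside the model to measure how its decisions fall across stakeholder groups.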
Algorithmic Auditing Training
ORCAA provides in-person corporate training on conducting audits and building stakeholder matrices.
The Ethical or Stakeholder Matrix
A construction borrowed from the world of bioethics, the ethical or “stakeholder” matrix is a way of answering the question: does this algorithm work? It does so by considering all of the stakeholders and all of their concerns, be they positive (accuracy, profitability) or negative (false negatives, bad data). In particular, it allows the deployer to think through and gauge best-case and worst-case scenarios before they happen. The matrix is color-coded with red, yellow, and green boxes to alert people to problem areas.
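As a rough sketch of the structure involved, the matrix can be thought of as a grid with stakeholders as rows, concerns as columns, and a color grade in each cell. The stakeholders, concerns, and grades below are invented for illustration; an actual audit would derive them from the deployment context.

```python
GRADES = ("green", "yellow", "red")

def make_matrix(stakeholders, concerns):
    """Start every cell at 'green'; cells are downgraded as issues surface."""
    return {s: {c: "green" for c in concerns} for s in stakeholders}

def set_grade(matrix, stakeholder, concern, grade):
    """Record a color grade for one stakeholder/concern cell."""
    assert grade in GRADES, f"unknown grade: {grade}"
    matrix[stakeholder][concern] = grade

# Hypothetical example: a hiring algorithm with three stakeholders.
matrix = make_matrix(
    stakeholders=["applicants", "employer", "regulator"],
    concerns=["accuracy", "false negatives", "data quality"],
)
set_grade(matrix, "applicants", "false negatives", "red")    # worst case for applicants
set_grade(matrix, "employer", "data quality", "yellow")      # a concern worth monitoring
```

The point of the structure is completeness: every stakeholder/concern pair gets an explicit cell, so no worst-case scenario is silently skipped.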
The Stamp of Approval
Once an ethical or stakeholder matrix has been built, an ORCAA stamp of approval can be awarded if there are no red boxes: in other words, if the algorithm is reasonably robust to worst-case scenarios that would rise to the level of litigation or a violation of stakeholder rights.
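The approval rule itself is simple to state: the stamp is awarded only when no cell in the matrix is red. The sketch below assumes, for illustration only, that the matrix is a nested mapping of stakeholder to concern to color.

```python
def stamp_of_approval(matrix):
    """True only if no stakeholder/concern cell in the matrix is graded red."""
    return all(grade != "red"
               for concerns in matrix.values()
               for grade in concerns.values())

# Hypothetical matrix with one red box remaining.
matrix = {
    "applicants": {"accuracy": "green", "false negatives": "red"},
    "employer":   {"accuracy": "green", "data quality": "yellow"},
}
print(stamp_of_approval(matrix))  # False: the red box blocks the stamp

# After remediation reduces the worst case to yellow, the stamp can be awarded.
matrix["applicants"]["false negatives"] = "yellow"
print(stamp_of_approval(matrix))  # True: no red boxes remain
```

Yellow boxes do not block approval under this rule; they mark areas to monitor rather than scenarios that rise to the level of litigation or rights violations.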
The information contained in this website is provided for informational and educational purposes only and should not be construed as professional advice on any matter. Before taking any actions based on such information we encourage you to consult with the appropriate professionals. We make no representations or warranties of any kind, express or implied, about the completeness, accuracy, reliability, validity or suitability with respect to the website or the information, products, services or related graphics contained on, distributed through or linked, downloaded or accessed from this website for any purpose. All information is provided on an “as is” basis without any obligation to make improvements or to correct errors or omissions. The use of, or reliance on, any information contained on this site is solely at your own risk.
The transmission and receipt of information contained on this site, in whole or in part, or communication with ORCAA via the internet or e-mail through this website does not constitute or create a professional relationship between us and any recipient. You should not send us any confidential information in response to this webpage.
In no event will we be liable for any loss or damage including, without limitation, indirect or consequential loss or damage, or any loss or damage whatsoever arising from loss of data or profits arising out of, or in connection with, the use of this website. We disclaim all liability in respect to actions taken or not taken based on any or all the contents of this site to the fullest extent permitted by law. Do not act or refrain from acting upon this information without seeking professional advice.
This website may contain links to other websites which are not under the control of ORCAA. ORCAA makes no guarantees or promises regarding such websites. The inclusion of any links does not imply a recommendation or endorsement of the views expressed within them.
Cathy has been an independent data science consultant since 2012, with clients including the Illinois Attorney General’s Office and Consumer Reports. She is the author of Doing Data Science (2013) and Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (September 2016).
Thomas Adams has over twenty-five years of business and legal experience. He has represented banks, companies, and individuals on corporate, securities, and business law matters, and has provided strategic advice, litigation support, and expert witness testimony on issues relating to the financial crisis. An expert in crafting solutions for complex financial and corporate transactions, he has advised banks, insurance companies, private equity firms, hedge funds, and a variety of other companies. He graduated from Fordham Law School in 1989 and Colgate University in 1986, and is admitted to practice in New York.