Algorithmic transparency is openness about the purposes, structure and underlying actions of the algorithms used to search for, process and deliver data.
Transparency in algorithms could let us see why we receive the content we do on Facebook and other sites whose content varies for different users: how and why some posts are made more or less visible, and why some pages get greater visibility and therefore more hits.
Transparency, in general, is adopted, at least in part, to demonstrate scrupulous behavior: it shows that one has nothing to hide. In algorithms, as in other areas, a lack of transparency can facilitate behaviors that would not stand up well to public scrutiny. Search engine algorithms, for example, are supposed to deliver the most relevant content for a given query, perhaps personalized to the user's interests. In practice, however, search algorithms have often been implemented in ways that deliver skewed or discriminatory results, favoring the search provider's business interests or discriminating against particular individuals and demographics.
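To make the concern concrete, here is a minimal, hypothetical sketch of how an opaque ranking function can skew results: the published behavior is "sort by relevance," but the actual scoring quietly boosts the provider's own properties. All names, weights and data here are invented for illustration, not drawn from any real system.

```python
# Hypothetical illustration of an opaque ranking function. The
# hidden_boost term is invisible to users; transparency would expose it.

def score(relevance, owned_by_provider, hidden_boost=0.3):
    """Return a ranking score for a search result."""
    base = relevance          # the behavior users are told about
    if owned_by_provider:
        base += hidden_boost  # the part only transparency would reveal
    return base

results = [
    {"name": "competitor-site", "relevance": 0.9, "owned": False},
    {"name": "provider-service", "relevance": 0.7, "owned": True},
]

ranked = sorted(
    results,
    key=lambda r: score(r["relevance"], r["owned"]),
    reverse=True,
)

# The provider's lower-relevance service outranks the competitor,
# because 0.7 + 0.3 = 1.0 beats 0.9.
print([r["name"] for r in ranked])
```

A user who only sees the final ordering has no way to distinguish this from an honest relevance sort, which is precisely what transparency advocates object to.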
In the case of search giant Google, FTC staffers investigated after complaints were made to the Federal Trade Commission. They found that Google's algorithm generally placed the company's own services, including its flight search and shopping tools, ahead of competitors in search results, even when those listings were of lower quality and received fewer clicks. The FTC also discovered that Google scraped content from competitors' sites and presented it prominently in its own answers while simultaneously burying the competitors' sites in the results. Despite these findings, however, the FTC concluded that the practices were not anti-competitive and that they helped consumers and stimulated market competition.
Google has also come under fire for demonstrating bias unrelated to its direct business interests. A 2013 Harvard study reported that Google searches for names that might be considered "black-sounding," such as Trevon James, generated more ads suggestive of an arrest record. In another case, searches by female job seekers resulted in more prestigious jobs being filtered out of the search engine results pages (SERPs). In these cases, as in the business-related ones, the algorithm must contain some element that causes such results. It is unlikely that those elements would be included if all the details of the algorithm were publicly available.
Algorithmic transparency is also central to open security, an approach to safeguarding software, hardware and other information system components with methods whose design and details are publicly available. The philosophy of open security is based on Kerckhoffs's principle, which maintains that a cryptographic system should be secure by design rather than through obscurity. The mathematician Claude Shannon further developed that idea, stating that "one ought to design systems under the assumption that the enemy will immediately gain full familiarity with them." In other words, a system such as an algorithm might as well be transparent to begin with.
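Kerckhoffs's principle can be sketched with Python's standard-library HMAC support: the algorithm (HMAC-SHA256) is completely public, yet publishing the code does not weaken the system, because security rests entirely on the secret key. The key and message values below are invented for illustration.

```python
# Kerckhoffs's principle in miniature: a fully public authentication
# algorithm whose security depends only on the secrecy of the key.
import hmac
import hashlib

def tag(key: bytes, message: bytes) -> bytes:
    # Anyone can read this function; knowing it helps an attacker not at all.
    return hmac.new(key, message, hashlib.sha256).digest()

key = b"secret-key-known-only-to-sender-and-receiver"  # illustrative value
msg = b"transfer $100 to alice"

mac = tag(key, msg)

# The receiver, holding the same key, verifies in constant time.
assert hmac.compare_digest(mac, tag(key, msg))

# An attacker with full knowledge of the algorithm but the wrong key
# cannot produce a matching tag.
forged = tag(b"wrong-key", msg)
assert not hmac.compare_digest(mac, forged)
```

This is the sense in which Shannon's maxim says a system "might as well be transparent": if its safety would be destroyed by publication, it was never safe.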