
Transparency, diversity, and AI ethics

Due to the significant role of AI in the digital economy and culture, many governments around the world have developed relevant policies to advance AI-related areas, as mainly discussed in Chapter 3, while reducing unnecessary negative impacts as much as possible. Some countries, like Singapore, have even developed legal measures to curb negative phenomena such as fake news, as discussed in Chapter 6, while other countries focus on the development of ethical codes instead of regulating through legal and policy mechanisms. In Australia, the country’s Human Rights Commission published its report in December 2019 and emphasized that companies and government departments using AI must be accountable through laws—not merely industry codes of ethics—to allow customers to understand and potentially challenge decisions made using AI. The commission’s discussion paper stated that although self-regulatory efforts by digital platform firms and ICT companies creating principles to govern the ethical use of AI are commendable, they are not a substitute for national laws preventing discrimination and should be closely monitored by government (Australian Human Rights Commission, 2019). It is an ongoing question whether people should emphasize laws and policy over ethical codes or vice versa; however, people need to secure both dimensions for the sake of human rights, transparency, and diversity. As one of the most significant concerns in our contemporary capitalism is the divide between the haves and the have-nots, and AI has expanded the gap between them, it is critical to advance socio-economic justice and equality.

More specifically, it is understandable to note the increasing importance of AI for the growth of the digital economy and culture; however, we need to develop relevant ethical codes to advance and guarantee diversity and cultural identity, which are also primary components of democracy. As van Dijck et al. (2018, 161) point out, “Considering governments as developers and as partners in multi-stakeholder corporations requires a more comprehensive approach” to the AI society, and therefore, it is critical to create “an approach that reaches beyond governments’ common roles as regulators and exemplary users.” Governments, media and platform corporations, and consumers have sought to generate socio-cultural and political conditions for the safe and responsible use of AI (Elliott, 2019). This normative agenda also applies to the new global media and cultural environment where AI plays a pivotal role in cultural production, because only a few artists and entrepreneurs know how to use tools such as ML. As Kulesz (2018a, 2) argues,

The commercial logic of the large platforms may lead to increasing concentration of supply, data and income and to the impoverishment of cultural expressions in the long term. In a tech world dominated by the United States and China—and to a lesser extent by Europe, Israel, Canada, Japan and the Republic of Korea—there is a risk of fomenting a new creative divide, which would result in the increasing decline of developing countries.

The media divide and the creative divide are increasingly converging as new platforms, based mainly in the Global North, emerge, while these mega media and platform owners monopolize AI and big data. Kulesz (2018b, 72) also points out that the growth of AI-supported digital platforms raises various concerns, including

the structural transformation of the creative chain, which is shifting from a pipeline-like configuration to a network model, and the new risks resulting from the rise of large platforms: market concentration, a lack of public statistics and a monopoly on artificial intelligence.

In fact, as several countries have experienced negative impacts, including fake news, inequality, and the digital divide due to the convergence of AI and several digital platforms, they have developed relevant ethical guidelines. AI, algorithms, big data, and digital platforms have become integrated into a wide range of public services. These new digital technologies have begun to play a pivotal role in “the realization of important public values and policy objectives associated with these activities, including freedom of expression, diversity, public safety, transparency, and socio-economic equality” (Helberger et al., 2018, 1). Therefore, it is critical to develop social mechanisms to secure these significant social norms and socio-economic and cultural fairness. As for the acceptable use of AI, governmental and corporate discussions on AI have centered on fairness, transparency, and diversity. These terms question the acceptability of AI—“whether its models introduce bias, produce reliable results, and can be understood or explained—as well as suggest what should be the ethical framework of the industry” (Barocas et al., 2013; McKelvey and MacDonald, 2019). As one of the most recent efforts, for example, the U.S. government released its ten principles for government agencies to adhere to when proposing new AI regulations for the private sector as part of the American AI Initiative. The principles have three major goals: 1) to ensure public engagement; 2) to limit regulatory overreach; and 3) to promote trustworthy AI that is fair, transparent, and safe (Hao, 2020).

Ethics has become a potential solution often pushed by the industry, hoping to emphasize the individual choices of designers over the scrutiny of critics calling for more regulation and accountability around AI development (Bostrom and Yudkowsky, 2014; Campolo et al., 2017; McKelvey and MacDonald, 2019). AI and digital platforms also hold “the promise of empowering individuals to effectively take up their role as producers of public goods and services, as well as to act as autonomous and responsible citizens” (Helberger et al., 2018, 1). However, in practice, AI technology and various platforms have, to date, “not fulfilled this promise. Instead, in many cases they appear to be further intensifying the pressure of the market on important public values, such as transparency and non-discrimination in service delivery, civility of public communication, and diversity of media content” (Helberger et al., 2018, 1).

AI continues to function as a double-edged sword. In the AI era, again, there are several caveats as well as fruits that AI has brought about in the media and cultural industries. As David Gunkel points out in a newspaper interview (Parisi, 2019), innovations in AI, particularly deep-learning algorithms, have made great strides over the past decade. People continue creating a world full of idiot savants that will control every aspect of our lives. This might actually be more interesting, and possibly more terrifying, than superintelligence. This means that we have to understand the significance of developing the necessary ethical codes as well as policy mechanisms to resolve several socio-cultural matters arising from the growth of AI in media and popular culture. When the European Commission (2020, 2) published its white paper in January 2020, it claimed that “AI is a collection of technologies that combine data, algorithms and computing power” and that, therefore, “advances in computing and the increasing availability of data” are key drivers of the current upsurge of AI. What the European Commission (2020) emphasized is that AI is becoming a central part of every aspect of contemporary society, and therefore, people should be able to trust it. Without trustworthiness based on fairness, uptake in the right direction is not possible.
