Harvard University Berkman Klein Center for Internet and Society smhlambi@cyber.harvard.edu

Abstract

The design and production of machine learning systems, and the application of algorithms in mediating human affairs without communal accountability and regulation, present significant ethical challenges under Ubuntu, a Sub-Saharan African relational-humanism philosophy. When the role of an individual is defined in a communal context, personalization and algorithmic individualization are considered a process of dehumanization. Furthermore, the centralization of data and of power amongst a few companies, countries, and continents violates ubuntu’s principles of equitable distribution, restoration, and reciprocity. The current conversations and efforts around ethics and justice in machine learning suffer from the same flaw as biased algorithmic systems: their exclusiveness reinforces discriminatory social biases. As algorithms and automated decision making increasingly mediate our lives, stronger concepts of fairness and justice that prioritize communal good over individualistic interests are needed to better society by offering the widest range of social protection.

This paper examines Ubuntu’s notions of justice and equity and how they can be applied in creating machine learning solutions. It will focus on 1) greater inclusion, based on the principle of bottom-up governance and the distribution of power, and 2) data sovereignty, based on the principle of communal good and the community’s responsibility to the individual.

Introduction

Ubuntu is based on Sub-Saharan Africa’s traditional worldview and spirituality, which define the universe as a “spiritual whole in which all beings are organically interrelated and interdependent” [1]. An individual is inextricably interrelated with their community in the same way that all dimensions of reality are interrelated [2]. In the context of ubuntu, the community is the environment a human exists in and is composed of previous and future generations, and of nature itself. Ubuntu asserts that a person is a person through other persons, and that it is by reinforcing the humanity of others that one uplifts one’s own humanity and grows in personhood [3].

The relationship between the individual and the community is reciprocal, and while an individual’s obligation to the community is prioritized, it does not necessarily crush the individual. The community is also expected to provide the necessary environment for an individual to thrive. Sub-Saharan African spirituality’s assertion that all dimensions of reality are interrelated serves as the moral foundation of ubuntu, which obligates a person and the community to maintain and affirm this universal harmony: the interconnectedness of everything. Acting individualistically, by prioritizing one’s own interests over the community’s well-being or by holding excessive surplus while others needlessly suffer, violates harmony and brings about chaos.

Personhood is an idealized status not immediately conferred on an individual at birth. An individual must be educated and nurtured by the community in order to learn the community’s order and the interrelatedness amongst humans. In this light, the saying “a village raises a child” takes on a significant meaning: the community is expected to provide the necessary environment for a child to become a fully functioning member of society who has attained full personhood. This is not to say that humans should not be protected at every level of development, but rather to underscore that the goal of a human is to reach, through the community’s dedicated commitment, a level at which the individual can live and remain in harmony with the community. To be human is to be in community. Attaining full personhood is marked by a heightened maturity of ethical sense through the fulfillment of one’s responsibilities to society [3].

Equity, restorative justice, and the decentralization of power follow naturally from ubuntu. The individual is neither neglected nor left as an afterthought, and the individual’s good is not set against the community’s. These principles challenge and offer a framework to reimagine internet technology, ethics, and algorithmic decision making from an inclusive and justice-oriented approach.

Equity: Extending the Community to Vulnerable Populations

Extending community to vulnerable populations extends protection and recourse to marginalized communities. Vulnerable populations should be recognized as part of the community and should be considered in the design, implementation, and ecosystem of machine learning. Their voices should be included in the discourse on ethics in machine learning as a condition for addressing the inequalities exacerbated by machine learning systems. The current discourse on ethics and machine learning is not only exclusive but also fails to consider alternative ethical systems. Discussions of ethics in technology assume a homogeneity that posits Western ethics as the default framework for addressing the harmful effects of technology.

Ethics is not missing in technology; equity is. “The technology society creates and chooses not to create is a window into the ethics and values of the powerful. The technological feats and the business models that sustain them are a mirror to the priorities of the few who hold the capital and capability to create technology”. When technology companies, in the process of amassing power, harm society socially, economically, or politically, it is not just an oversight; it is an insight into their ethics and into power at work under individualistic values. Machine learning is not amoral. Machine learning acquires moral agency as it is designed by humans and mediates on behalf of humans [4].

The process of building machine learning systems requires time, effort, capital, and material: conscious, value-based decisions. This process is not an unbiased one; it directly embeds the creators’ ethical values through the prioritization of features. The ethics of those who hold the power to bring about technology matters. It is their ethics that is applied through their spheres of influence. The more power one has, the more imperative it is that one be ethical and capable of fulfilling the social responsibility afforded by that power. Ubuntu’s principle of restorative justice requires restoring the voices of the marginalized and including them in the design of technology.

Injustice: Individualization and Commodification

The process of individualizing a person strips them of their connectedness and of their status as an integral part of the whole, thereby making the person, the community, or both vulnerable. The reduction or objectification of an entity leads to the commodification of that entity. This pattern was historically demonstrated in the Euro-North American reduction and objectification of the humans of the African continent and their land, which led to the commodification of both. Today the commodification of data, spurred by greedy business models and now termed “surveillance capitalism”, is an instance of coloniality. The power asymmetries between users and the online platforms that collect and commodify user data in order to build predictive models that can influence the user, sometimes in subtle ways and not for communal good, are individualistic, violate the principles of ubuntu, and are characteristic of colonialism. This relationship is even more pronounced when algorithms are applied to vulnerable populations, especially members of the global south.

In content recommendation systems, the reduction of a human being to non-causal metrics optimized for a company’s profit strips away the interrelatedness and complexity of a person and enables an individualistic business model that maximizes the company’s gain over communal good. The process of individualization that breaks a person’s connectedness, their humanity, is a process of dehumanization. Because artificial intelligence cannot infer causality from the data it processes, it is difficult to provide recommendations based on an assessment of a user’s actual needs. Most widely used recommendation systems cannot determine the motivations behind a user’s interactions on their platforms and cannot provide value-based recommendations. The metric of success quickly becomes maximizing a user’s attention and engagement rather than collaborating with the user to provide value-based recommendations. When a company is incentivized to prioritize its own profits over the user’s good and societal good, it creates a precarious situation in which harmful content may be promoted whenever doing so maximizes the company’s profits, and filter bubbles may emerge.
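As an illustration of the contrast described above, the following sketch compares a ranking rule that scores items purely by predicted engagement with one that blends in a communal-value signal. This is a minimal, hypothetical example: the Item fields, the predicted_communal_value score, and the weighting parameter alpha are assumptions made for illustration, not features of any real platform.

    # Illustrative sketch (hypothetical): two ways of ranking recommended items.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Item:
        item_id: str
        predicted_engagement: float      # e.g., predicted click or watch probability
        predicted_communal_value: float  # assumed score of benefit to the user and community

    def rank_by_engagement(items: List[Item]) -> List[Item]:
        """Engagement-only ranking: optimizes the platform's attention metrics."""
        return sorted(items, key=lambda i: i.predicted_engagement, reverse=True)

    def rank_with_communal_value(items: List[Item], alpha: float = 0.5) -> List[Item]:
        """Blended ranking: trades predicted engagement off against an assumed communal-value signal."""
        return sorted(
            items,
            key=lambda i: (1 - alpha) * i.predicted_engagement
                          + alpha * i.predicted_communal_value,
            reverse=True,
        )

The point of the sketch is not the arithmetic but the choice of objective: whichever quantity the ranking key rewards is the quantity the system will maximize.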

Ubuntu would classify the negative effects of machine learning that arise from automating existing inequalities, effects that are especially damaging to the most vulnerable communities, as a process of dehumanization. These effects reduce an individual’s ability to be in harmony with society. For example, non-discretionary filtering of content may create filter bubbles that limit humans’ ability to act in community.

Data Sovereignty and Communal Good

Users having full control of their own data and being able to aggregate it for communal good is a model that can lead to justice and equity. When data ownership is removed from private companies that are not accountable to their users or to the public, the ability of private entities to exploit personal data is weakened. Under ubuntu’s framework, individual personal data could be aggregated for social good provided that it is sufficiently anonymized, removing the ability of any party to marginalize another. In this model the individual prioritizes communal good by offering useful data, while the community affirms the humanity of the individual by providing an environment that will not deanonymize, and thereby dehumanize, the individual.
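One possible technical reading of this model, offered here only as a hedged illustration rather than a method proposed in this paper, is differentially private aggregation: each person’s raw value stays in their custody, and only a noised community-level total is released. The function names and the privacy parameter epsilon below are illustrative assumptions.

    # Illustrative sketch (assumed approach): release only a noised communal aggregate,
    # never the underlying individual values.
    import random

    def laplace_noise(scale: float) -> float:
        # The difference of two i.i.d. exponential samples is Laplace-distributed.
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    def private_count(individual_flags, epsilon: float = 1.0) -> float:
        """Noisy count of how many individuals report a 0/1 flag.

        Each person contributes at most 1 to the total (sensitivity 1), so Laplace
        noise with scale 1/epsilon gives an epsilon-differentially-private release.
        """
        true_count = sum(individual_flags)
        return true_count + laplace_noise(1.0 / epsilon)

In this sketch the community benefits from the aggregate while no single member’s contribution can be read back out with confidence; smaller values of epsilon add more noise and stronger protection.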

In order to truly have data sovereignty and machine learning systems with positive effects on society, we must develop the sense that the internet is a shared space and a shared resource. The internet reinforces an interconnected world whose reality and governance can be described by Sub-Saharan African philosophy: disrupting, in a communal sense, the interconnectedness the internet makes possible is bound to create power asymmetries and lead to chaos. Harmful machine learning systems applied at scale through the internet will further fragment society and disproportionately affect already marginalized communities. A party that is not representative of the internet community should not control the future and direction of the internet. Efforts must be made to make the internet accessible while simultaneously keeping individual data secure and in the custody of individuals. Machine learning systems that users do not access online but that are nonetheless applied to them must also be designed to maintain the humanity and dignity of those they affect.
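One concrete pattern for keeping data in individual custody while still training shared models, again offered as an assumed illustration rather than a proposal from this paper, is federated averaging: each person refines a model locally on data that never leaves their device, and only the updated parameters are aggregated.

    # Illustrative sketch (assumed approach): federated averaging on a toy linear model.
    from typing import List, Tuple

    def local_update(weights: List[float],
                     local_data: List[Tuple[float, float]],
                     lr: float = 0.1) -> List[float]:
        """One on-device gradient-descent pass; local_data never leaves the person's custody."""
        w0, w1 = weights
        for x, y in local_data:
            err = (w0 + w1 * x) - y   # prediction error of the toy model y ~ w0 + w1 * x
            w0 -= lr * err
            w1 -= lr * err * x
        return [w0, w1]

    def federated_average(weights: List[float],
                          all_local_data: List[List[Tuple[float, float]]]) -> List[float]:
        """Aggregate only the locally updated weights, not anyone's raw data."""
        updates = [local_update(list(weights), data) for data in all_local_data]
        n = len(updates)
        return [sum(u[k] for u in updates) / n for k in range(len(weights))]

Only model parameters move between parties in this sketch; the raw observations remain with the individuals who generated them.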

Conclusion

Ubuntu offers a reimagination of equity and justice that may address the inequalities exacerbated by machine learning. This framework provides the necessary ideology to protect individual data and to aggregate only anonymized user data on behalf of the community’s expressed interests. Curbing the excessive greed that arises when individual interests outweigh those of the community can motivate the design of algorithmic predictive systems that intentionally account for the user’s motivations and maximize value to the user. Ubuntu posits equity as a necessary component of ethics, one that warrants the inclusion of vulnerable communities in the design of and decision making about algorithmic systems. Machine learning will also have to better society by equitably addressing the extremes of surplus and lack that arise when individuals take priority over the community. The process of being human is directly tied to ethical responsibility, and as machine learning continues to increase its role in mediating the human experience it will need stronger ethical constraints. The promised gains of artificial intelligence may remain elusive until we reimagine artificial intelligence and internet technology within a more interrelated and interconnected framework.

References

  • [1] Asante, M., & Vandi, A. (1980). Contemporary Black Thought: Alternative Analyses in Social and Behavioral Science (Sage Focus Editions, Vol. 26). Beverly Hills: Sage Publications.
  • [2] Hord, F., & Lee, J. (2016). I Am Because We Are: Readings in Africana Philosophy (Revised ed.). Amherst: University of Massachusetts Press.
  • [3] Tutu, D. (1999). No Future Without Forgiveness. New York: Random House.
  • [4] Menkiti, I. (1979). “Person and Community in African Thought.” In African Philosophy: An Introduction. University Press of America.
  • [5] Heikkerö, T. (2012). Ethics in Technology: A Philosophical Study. Lanham, MD: Lexington Books.
