This is the third and final installment in DAMA Norway’s series on data ethics. All topics touched upon throughout these three articles are derived from DAMA’s DMBoK2 and its chapter on data handling ethics. The aim of this endeavour has been to illuminate some key universal truths about how data ethics should be approached. Article number one uncovered the importance of data governance maturity, while the second article took a look at dedicated roles for data ethics. This third article, however, deviates from that notion of homogeneity, as it delves into the impact culture and geography have on data ethics.
Ethics, values and culture 101
Let’s set the stage. Ethics denotes the principles that make up one’s belief system, governing what conduct is deemed appropriate and what is not. These principles are based on a set of underlying values, which reflect an individual’s standards of right and wrong. Right, wrong and everything in between are not absolute, static values, however. They are inherently dynamic and subject to constant influence, perception and change.
Cue culture. As one interacts with a group, one exercises one’s values and principles upon its members. If this behavior is deemed appropriate, it will contribute to the group’s collectively accepted set of values and become part of its culture. If not, it will be rejected as inappropriate. This evaluation is the group exerting its values back onto the individual, creating a dialogue of influence based on perceptions and ultimately fostering change.
Change is inevitable when faced with the opportunities and challenges brought on by digitalization. Møller Mobility Group, the Nordics’ largest car importer and retailer, recognized this fact as their capital, relationships and decisions became increasingly data-driven. When a data governance function was set up, it provided a forum to discuss differing viewpoints, such as on data sharing, deciding what is ‘right’ or ‘wrong’, and how to formalize this in policy.
Norwegian culture tends to place a high value on trust, whereas the level of trust is not equally high in other cultures. One viewpoint, less rooted in trust, held that policies should be made more explicit. Norwegian culture, however, is not hierarchical and relies on a high degree of delegation to facilitate autonomous knowledge workers. An implicit understanding of policy is therefore more likely to be accepted: when questions arise, the trust-driven Norwegian culture lets people decide to do the right thing at a more decentralized level. The new data governance function used principles of change management to clarify these cultural values of trust and the consequent preferences for a more implicit versus explicit policy. Questions that would otherwise have remained invisible, such as what the collective trust-appetite for data sharing should be and who gets to decide what is right, became transparent as a result. Whereas frustrations might have festered had these questions stayed invisible, they were now discussed and resolved with a new appreciation for different cultural viewpoints.
Møller recognized that focused change management could help illuminate the underlying values, principles and biases making up the different positions. Furthermore, they believed that a common forum for discussion and compromise was the only way for the different stakeholders to truly understand each other and reach a common understanding. This process, and others like it, have provided some key learnings about managing different cultural perspectives and their impact on data handling ethics in organizations.
As algorithms and automated processes rapidly encroach on key societal activities, legislation governing their use must follow suit. And in the cracks between law and regulation lie the interpretation and morality of the actors themselves. When discussing the morality of actors in the space of technology and the vacuum of legislation, the most commonly used example is that of automated cars and the trolley problem. One version goes something like this: an older person and a child are on the sidewalk when the child suddenly runs into the street in front of an automated car. The only way for the car to avoid hitting and killing the child is to go up on the curb, and by doing so it will surely hit and kill the old person. The automated car has enough time and maneuvering capability to make a decision and save one life, but not both.
«Which life should the algorithm save?»
That is a question, but not the most interesting or pertinent one. When confronted with near-impossible dilemmas of who or what to favor, algorithms will rely on sequential decisions in highly dynamic environments. The better question then becomes: what intrinsic values and principles will the algorithm be programmed to fall back on and rely on for guidance? And most importantly, who decides what those values and principles are? Not the programmers developing the algorithm, at least not on their own.
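To make the fallback idea concrete, those values and principles can be represented as explicit, externally governed data rather than logic buried in the code. The sketch below is purely illustrative: the `Outcome` type, the party names and every weight are invented for this article and do not describe any real system.

```python
from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    harmed: list[str]  # parties harmed if this outcome is chosen


# In this sketch, the weights stand in for the "values and principles"
# the algorithm falls back on. Crucially, they are plain data that a
# societal process (regulators, public debate) could set and audit,
# not constants chosen by the development team. All values are placeholders.
POLICY_WEIGHTS = {"bystander": 1.0, "passenger": 1.0}


def weighted_harm(outcome: Outcome, weights: dict[str, float]) -> float:
    """Sum the policy weights of everyone harmed by an outcome."""
    return sum(weights.get(party, 1.0) for party in outcome.harmed)


def fallback_choice(outcomes: list[Outcome], weights: dict[str, float]) -> Outcome:
    """Pick the outcome with the least weighted harm under the given policy."""
    return min(outcomes, key=lambda o: weighted_harm(o, weights))
```

The design choice worth noticing is that changing the ethics of `fallback_choice` requires no code change at all, only a new weight table, which is exactly what makes external governance of those values possible.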
Just as police officers act on the basis of specific protocols and principles provided to them by the society they serve, so too should these algorithms. Programmers are not equipped to make these decisions on behalf of the rest of us, nor should it be the responsibility of legislators alone. Leonora Onarheim Bergsjø, a professor at the University of Oslo currently researching digital ethics, argues that it needs to be the responsibility of society as a whole, across lines of occupation and technical proficiency. Values and ethical principles are universal, after all, regardless of the arena in which they are applied. A public debate is needed in which the perspectives of those outside the realm of data management are voiced, heard and included in the discourse. It takes a village.
Today, we are witnessing data ethics step onto the world stage in a meaningful way, and thus far, Western ideas have hogged the spotlight. With this global perspective, one must ask: is it even feasible to establish an international ‘golden standard’ of moral code for data managers to follow? And if so, is Western culture creating a biased foundation for these data ethics principles?
The social credit system implemented by the Chinese government is an obvious example of data policy that conflicts with traditional Western notions, and consequently with the status quo of data ethics. While this system might rub those with Western sensibilities the wrong way, it is important to acknowledge that the policy is not born of malicious intent. Chinese culture is collective at its core, which in turn dictates that the prosperity of the individual is a result of the collective good. One can therefore argue that this core value of collectivism, inherent in Chinese culture, justifies the practice. Here in Europe, however, our valuation of individual privacy makes such a practice unimaginable, as illustrated by continental regulation and legislation protecting individual rights online, spearheaded by the GDPR.
Let us revisit the trolley problem and introduce this global perspective. Bergsjø believes that the majority of Norwegians would be inclined to save the child, while the opposite might be true in China. She justifies this by arguing that in Chinese culture, respect for one’s elders is held in very high regard. Neither inclination can claim the moral high ground or claim to be the correct one, as both are inherently good. This may all seem a tad abstract and hypothetical, but consider this: with no legislation in place, a Chinese car manufacturer programs a protocol reflecting its culture’s values into its automated cars, or any other product for that matter. When exporting to Norway, should the protocol remain untouched, or be changed to fit Norwegian cultural values? Which moral compass should guide these data-driven processes and data management once borders are crossed?
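In engineering terms, a manufacturer shipping to several markets has to decide which region’s value set a product loads at deployment time. The sketch below only illustrates that mechanism and deliberately takes no side on whose values apply: the region codes, weight names and numbers are invented placeholders, and in practice each market’s table would be set by its regulators or public deliberation, not by the manufacturer alone.

```python
# Hypothetical per-market policy lookup. Every code and number below is a
# placeholder; the point is the mechanism, not the specific values.
DEFAULT_POLICY = {"bystander": 1.0, "passenger": 1.0}

REGIONAL_POLICIES: dict[str, dict[str, float]] = {
    "NO": {"bystander": 1.2, "passenger": 1.0},  # placeholder values
    "CN": {"bystander": 1.0, "passenger": 1.2},  # placeholder values
}


def load_policy(region_code: str) -> dict[str, float]:
    """Return the value weights for the market a product is deployed in,
    falling back to a default when no regional table exists."""
    return REGIONAL_POLICIES.get(region_code, DEFAULT_POLICY)
```

Whether the right answer is one global table or one per border is exactly the cultural question the article raises; a structure like this merely makes either answer implementable and, importantly, auditable.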
Biases and diversity
Biases are born from culture and may lead to unforeseen, unwanted and damaging consequences. The only way to confront one’s own bias is to step outside one’s circle of like-minded people and expose oneself to input from different viewpoints. Only then can one acquire a more neutral outlook and let one’s values and ethics be challenged. According to Bergsjø, the way to do this is through public discourse: letting oneself be challenged while remaining open to others’ perspectives.
The consequence of not opening up to outside influences is a cultural echo chamber, in which one only amplifies the notions that fit one’s current views and rejects all others. This is the breeding ground for groupthink, which is highly detrimental to innovation and overall progress. Groupthink is a psychological effect in which members of a group or community reach a level of conformity where consensus is not only the goal, but the default. New or controversial ideas that don’t align with the status quo are not merely left unsought, but actively avoided.
The only way to break these vicious cycles is through conscious discourse and proactively seeking exposure to viewpoints that differ from one’s own, thereby stimulating critical thinking across cultures and the sharing of ideas.
Forum for fostering discourse
Even though we have described a dynamic world comprising countless iterations of challenges and perspectives, we have distilled some universal learnings that we believe transcend this complexity. They are drawn from Møller Mobility Group’s journey and others like it, and are summarised below as a list of actions we believe apply to all situations where culture has an impact on data handling ethics.
- Clearly define the ethical challenge at the core of the problem.
- Create a forum for collaboration and discussion.
- Identify (a) key perspectives and (b) the key stakeholders representing that perspective who are best equipped to translate/carry across that perspective’s viewpoint.
- Communicate the challenges, objectives and risks, in addition to the consensus reached, to all stakeholders. Each representative knows their audience best and should tailor the message.
The aim is to end up with a result that is greater than the sum of its parts, by utilizing the viewpoints representing the different cultures within one’s organization as the assets they are.
«It takes a village.»
This was the third and final article in our three-piece series on Data Ethics. Thank you for following along. If you haven’t had a chance to read the first two entries, they are linked to at the top of this page. The themes touched upon throughout these articles are derived from DAMA’s DMBoK2, and its chapter on data handling ethics.
Subscribe to DAMA’s Data Nugget to be notified of other great content from DND/DAMA Norway, straight in your inbox.
The DAMA group (Data Management Association – Norway Chapter)
DAMA’s vision is to standardize and formalize data management in Norway in order to raise competence and knowledge within the field. We want to facilitate the sharing of experience around data management, and we aim to have a positive social impact on society. Find links to LinkedIn, the newsletter and more on the "Get involved" page. Our LinkedIn page carries all our latest updates, and our newsletter Data Nugget delivers a monthly dose of data news.