<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom">
	<channel>
      <title>2020 Week 3: Feminist AI — CCS Working Group 2020</title>
      <link>http://wg20.criticalcodestudies.com/index.php?p=/</link>
      <pubDate>Sat, 04 Apr 2026 22:43:54 +0000</pubDate>
          <description>2020 Week 3: Feminist AI — CCS Working Group 2020</description>
    <language>en</language>
    <atom:link href="http://wg20.criticalcodestudies.com/index.php?p=/categories/2020-week-3:-feminist-ai/feed.rss" rel="self" type="application/rss+xml"/>
    <item>
        <title>Week 3: Feminist Search (Code Critique)</title>
        <link>http://wg20.criticalcodestudies.com/index.php?p=/discussion/88/week-3-feminist-search-code-critique</link>
        <pubDate>Mon, 03 Feb 2020 23:14:19 +0000</pubDate>
        <category>2020 Week 3: Feminist AI</category>
        <dc:creator>Christine.Meinders</dc:creator>
        <guid isPermaLink="false">88@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>by Christine Meinders, Jana Thompson, Sarah Ciston, Catherine Griffiths</p>

<p><strong>Approaches to Co-Creation </strong><br />
In this example, community-sourced data can be traced both visually and in code, and can be used to inform the very model used to process this information. Rather than simply coding, the prototyping process itself is incorporated into the code from a critical perspective. This process is guided by the <a rel="nofollow" href="https://aidesigntool.com/">Cultural AI Design Tool</a>, which refocuses the design process so that questions of creator, data origin, and rule-creation are centered rather than marginally examined or ignored. Using these questions as a basis for this particular critical code context, contributors are credited, while the prototype also remains open for co-creation and reformulation by the community.</p>

<p><strong>Modeling Binaries:</strong> <br />
There are several pieces that contribute to <em>Feminist Search</em>: <a rel="nofollow" href="https://aidesigntool.com/feminist-search">personal data donation</a>, interface design, and the use of binaries in data collection and model creation.</p>

<p>The <em>Feminist Search</em> project explores what is safe and what is dangerous. Binary notions of safety and danger are just the starting point. Within the last five years, dangerous rhetoric has once more become socially acceptable, with a corresponding rise in violent acts globally. Beyond this, the pressures of misogyny, racism, and other forms of bigotry increase an individual's or community's constant awareness of what it takes to keep themselves safe. What makes people feel safe? Safety can be categorized in different ways, such as physical, emotional, and professional safety. </p>

<p>These binary definitions can be expanded by examining the grey spaces through the questions in the personal data donation. By having people discuss what safety means to them, and the semantics of this term and related concepts, models can be built that reflect these spectrums. This allows for exciting design and technical challenges but, more importantly, for creating technology that serves the people who contribute their data. <em>Feminist Search</em> explores the challenges of search from a community perspective---with the goal of reflecting the shared data of communities in Los Angeles and San Francisco. </p>

<p>One highlight is that computation is fundamentally binary, as are labels in machine learning---the data donation portion of <em>Feminist Search</em> uses the labels safe and dangerous. However, the goal is to move beyond a true/false dichotomy, because truth value is subjective, particularly in categorizations of feelings and sentiments.</p>
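<p>One concrete way past a hard true/false output (a hypothetical sketch, not the project's code) is to ask a classifier for a probability rather than a label: scikit-learn's predict_proba returns a position on the spectrum between the two poles instead of collapsing it.</p>

<pre><code>import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy 1-D data: low values labelled 'safe' (0), high values 'dangerous' (1).
X = np.array([[0.0], [0.5], [1.0], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression().fit(X, y)

# A hard prediction collapses everything to 0 or 1 ...
print(model.predict([[2.0]]))
# ... while the probability keeps the grey space in between.
print(model.predict_proba([[2.0]]))
</code></pre>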

<p>For those who are not familiar with the details of machine learning: fundamentally, machine learning works with mathematical representations of geometric spaces that have distance functions as part of their definition. In defining classes geometrically, there will be a division between classes in an n-dimensional space (as with a linear classifier), or instead perhaps something such as a centroid in a clustering algorithm that is the most representative point of a cluster. Predictions as to the class or type of an image depend on its geometric location in the vector space.</p>
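<p>The geometric framing above can be illustrated in miniature (a toy sketch using scikit-learn, not part of the <em>Feminist Search</em> code): a clustering model finds one centroid per cluster, and a new item is assigned purely by its position in the vector space.</p>

<pre><code>import numpy as np
from sklearn.cluster import KMeans

# Two loose groups of 2-D points standing in for a vector space.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[0, 0], scale=0.5, size=(20, 2))
group_b = rng.normal(loc=[5, 5], scale=0.5, size=(20, 2))
points = np.vstack([group_a, group_b])

# KMeans finds one centroid per cluster -- the single most
# representative location for each group of points.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(model.cluster_centers_)

# A new item is classified purely by geometric position:
# it joins whichever centroid it is closest to.
new_point = np.array([[4.8, 5.1]])
print(model.predict(new_point))
</code></pre>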

<p>The interesting problems in data science and machine learning aren't in churning out mathematically good predictions, however. The outcome of an algorithm is only as good as the data given to it and how the person(s) constructing it use that data in the creation of a model. In the construction of models, outliers are often thrown out, or drowned out in the majority vote of the more "normal" data points. This leads to models in which a literal tyranny of the majority can occur, since the majority of opinions carry more weight statistically---instead of all the data being treated equally. </p>
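<p>This statistical tyranny of the majority can be made concrete (a hypothetical sketch, not the project's code): with a heavily imbalanced label set, an unweighted classifier sides with the majority, while scikit-learn's class_weight='balanced' option reweights each class inversely to its frequency so that minority data points are not drowned out.</p>

<pre><code>import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# 95 majority points around (0, 0) and only 5 minority points around (3, 3).
X = np.vstack([rng.normal(0, 1, size=(95, 2)),
               rng.normal(3, 0.3, size=(5, 2))])
y = np.array([0] * 95 + [1] * 5)

# Unweighted: every sample counts once, so the 95 majority
# votes dominate where the decision boundary falls.
plain = SVC(kernel='linear').fit(X, y)

# 'balanced' weights each class inversely to its frequency:
# each of the 5 minority samples counts 95/5 = 19 times as much.
balanced = SVC(kernel='linear', class_weight='balanced').fit(X, y)

probe = np.array([[2.0, 2.0]])  # a point in the grey space between clusters
print(plain.predict(probe), balanced.predict(probe))
</code></pre>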

<p>In this approach, the simple act of search becomes a way to understand the binary decisions that are used to form a model, and data donation becomes a way to understand who is contributing to search and data collection. This is the central starting point: it prioritizes visualization and creates a space in which to develop a community search engine. In <em>Feminist Search</em>, communities create and provide contexts for evaluation, with the goal of sharing these decisions, along with donated personal data and the "why" behind the search results. </p>

<p>An additional goal of <em>Feminist Search</em> is to highlight thoughtful data donation and model weighting processes, while also showing how search is used---thus incorporating Feminist.AI approaches by exploring the act of search through embodied, multi-sensory (movement, sound, and image) methods and critical prototyping. <em>Feminist Search</em> is a way to solidify and continually honor the work of feminist communities.</p>

<p>Here is the code for <a rel="nofollow" href="https://github.com/FeministAI/feminist_search">Feminist Search</a>.<br />
<em>Thompson, 2020, Python</em></p>

<pre><code>import numpy as np
import cv2
import glob
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

def import_image(path):
    &quot;&quot;&quot;
    INPUT: path to image file in jpg
    OUTPUT: machine readable image file
    &quot;&quot;&quot;
    image = cv2.imread(path)
    return image

class ClusteredImages:
    def __init__(self, positive_images_path, negative_images_path, image_suffix, number_of_clusters):
        self.positive_images = set(glob.glob(positive_images_path + '/' + image_suffix))
        self.negative_images = set(glob.glob(negative_images_path + '/' + image_suffix))
        self.no_of_clusters = number_of_clusters

        self.image_paths = [[path, True] for path in self.positive_images] + [[path, False] for path in self.negative_images]
        self.image_array = np.array(self.image_paths)

        # Extract SIFT descriptors from the greyscale version of each image.
        sift = cv2.xfeatures2d.SIFT_create()
        self.descriptors = []
        for path, label in self.image_array:
            image = import_image(path)
            b_and_w = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
            kp, each_descriptors = sift.detectAndCompute(b_and_w, None)
            if each_descriptors is not None:  # skip images with no keypoints
                self.descriptors.append(each_descriptors)

    def return_labels(self):
        return np.array(self.image_paths)[:, -1]

    def generate_features(self, clustering_model):
        # rename function to reflect that it returns both training data and predictable data
        number_of_clusters = clustering_model.n_clusters
        descriptors_pre_array = [desc for desc_list in self.descriptors for desc in desc_list]
        descriptors_array = np.array(descriptors_pre_array)
        clustering_model.fit(descriptors_array)
        clustered_words = [clustering_model.predict(words) for words in self.descriptors]
        return np.array([np.bincount(words, minlength=number_of_clusters) for words in clustered_words])

class ParameterFinder:
    def __init__(self, X, y):
        # use gammas for rbf, poly and sigmoid
        # degrees for poly
        self.X = X
        self.y = y
        self.kernels_to_try = ['linear', 'rbf', 'poly', 'sigmoid']
        self.C_params = [0.001, 0.01, 0.1, 1, 10]
        self.gamma_params = [0.001, 0.01, 0.1, 1]
        self.degree_params = [0.0, 1.0, 2.0, 3.0, 4.0]

    def find_best_params(self, kernel, param_grid):
        grid_search = GridSearchCV(SVC(kernel=kernel), param_grid)
        grid_search.fit(self.X, self.y)
        return grid_search.best_params_

    def return_all_best_params(self):
        best_params = {}
        # should rewrite to pass kernel and find parameters
        for kernel in self.kernels_to_try:
            if kernel == 'linear':
                param_grid = {'C': self.C_params}
            elif kernel == 'rbf':
                param_grid = {'C': self.C_params, 'gamma': self.gamma_params}
            elif kernel == 'poly':
                param_grid = {'C': self.C_params, 'gamma': self.gamma_params, 'degree': self.degree_params}
            else:
                continue  # sigmoid not yet handled
            best_params[kernel] = self.find_best_params(kernel, param_grid)
        return best_params
</code></pre>
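<p>The generate_features step above can be made concrete with a self-contained sketch (synthetic stand-ins for the SIFT descriptors, not the project's data): each image contributes a variable-length set of descriptor vectors, clustering all descriptors builds a shared visual vocabulary, and each image then becomes a fixed-length histogram counting how many of its descriptors fall into each cluster---a bag of visual words.</p>

<pre><code>import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# Synthetic stand-ins for per-image SIFT output:
# each image yields a different number of 128-D descriptors.
descriptors_per_image = [rng.normal(size=(n, 128)) for n in (12, 7, 20)]

n_clusters = 5
model = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
model.fit(np.vstack(descriptors_per_image))  # shared visual vocabulary

# One fixed-length histogram per image: counts of that image's
# descriptors assigned to each cluster.
features = np.array([
    np.bincount(model.predict(desc), minlength=n_clusters)
    for desc in descriptors_per_image
])
print(features.shape)
</code></pre>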
]]>
        </description>
    </item>
    <item>
        <title>Week 3: Feminist AI (Main Thread)</title>
        <link>http://wg20.criticalcodestudies.com/index.php?p=/discussion/87/week-3-feminist-ai-main-thread</link>
        <pubDate>Mon, 03 Feb 2020 17:48:43 +0000</pubDate>
        <category>2020 Week 3: Feminist AI</category>
        <dc:creator>Christine.Meinders</dc:creator>
        <guid isPermaLink="false">87@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>by Christine Meinders, Jana Thompson, Sarah Ciston, Catherine Griffiths</p>

<p><strong>Feminist Legacy and Theory</strong><br />
As long as there has been code, there have been feminist approaches to coding and pattern finding---such as the first <a rel="nofollow" href="https://twobithistory.org/2018/08/18/ada-lovelace-note-g.html">complex program</a>, created by Ada Lovelace in the 1840s, as well as <a rel="nofollow" href="https://www.atlasobscura.com/articles/knitting-spies-wwi-wwii/">the use of knitting for encoding secrets during World War II</a>, often hidden in plain sight and overlooked for its perceived lack of value.</p>

<p>As artificial intelligence developed in the latter half of the twentieth century, most of its theorists and developers were overwhelmingly white and male. The roots of Feminist AI can be traced to contemporary theorists like British academics Alison Adam and <a rel="nofollow" href="https://psycnet.apa.org/record/1996-97938-005">Lucy Suchman</a>. In <a rel="nofollow" href="https://www.routledge.com/Artificial-Knowing-Gender-and-the-Thinking-Machine-1st-Edition/Adam/p/book/9780203005057">Artificial Knowing</a>, Alison Adam critiques traditional AI, both symbolic and connectionist, for failing to address embodiment in the production of knowledge. Referencing <a rel="nofollow" href="https://www.taylorfrancis.com/books/9781315003221">Tong</a> (1994) and <a rel="nofollow" href="https://www.psupress.org/books/titles/0-271-00802-4.html">Wajcman</a> (1991), Adam argues that technology is inherently social, political, and cultural in its usage and production, and that AI research can and should be informed by feminist theories such as liberal feminism, eco-feminism, postmodern feminism, and standpoint theory. Additional feminist theoretical approaches include participatory (<a rel="nofollow" href="https://dl.acm.org/doi/10.1145/1978942.1979041">Bardzell</a>), embodied (<a rel="nofollow" href="https://www.rowmaninternational.com/book/little_vast_rooms_of_undoing/3-156-5577c24f-e725-498e-8d26-ad80990d7a3b">Blumenthal</a>), implementation into practice (<a rel="nofollow" href="https://www.hup.harvard.edu/catalog.php?isbn=9780674728943">McPherson</a>), and design as research (<a rel="nofollow" href="https://mitpress.mit.edu/books/design-research">Burdick</a>). These approaches examine the mutable relationships of form, making, theory, and community.</p>

<p><strong>Feminist Practices and Projects</strong><br />
Building on earlier work, new critical approaches and projects have emerged in recent years, including volumes on racism and feminism such as Safiya Umoja Noble's Algorithms of Oppression, Cathy O'Neil's <a rel="nofollow" href="https://weaponsofmathdestructionbook.com/">Weapons of Math Destruction</a>, Joy Buolamwini's <a rel="nofollow" href="https://www.ajlunited.org/">Algorithmic Justice League</a>, and Catherine D'Ignazio and Lauren F. Klein's <a rel="nofollow" href="https://mitpress.mit.edu/books/data-feminism">Data Feminism</a>. Additional feminist projects which explore the role of the body in knowledge production, critical prototyping, and critiques of science and technology include Ursula Damm's generative video project <a rel="nofollow" href="http://ursuladamm.de/membrane-2019/">Membrane</a>, <a rel="nofollow" href="http://www.wekinator.org/">Wekinator</a> by Rebecca Fiebrink, design approaches from <a rel="nofollow" href="https://feministinternet.com/">Feminist Internet.com</a> and <a rel="nofollow" href="https://feministinternet.org/">Feminist Internet.org</a>, <a rel="nofollow" href="https://lauren-mccarthy.com/LAUREN">LAUREN</a> by Lauren McCarthy, Anne Burdick's digital humanities design fiction <a rel="nofollow" href="http://micromegameta.net/trina/">Trina</a>, the <a rel="nofollow" href="https://gendersec.tacticaltech.org/wiki/index.php/Main_Page">Gender and Tech resources project</a> by Tactical Tech, <a rel="nofollow" href="https://adanewmedia.org/2019/02/issue15-ciston/">ladymouth</a> by Sarah Ciston, Catherine Griffiths' <a rel="nofollow" href="https://isohale.com/VISUALIZING-ALGORITHMS">Visualizing Algorithms</a>, and Caroline Sinders' work on the <a rel="nofollow" href="https://carolinesinders.com/feminist-data-set">Feminist Data Set</a>, which details a path for building data collections and ontologies in a feminist reference and framework.</p>

<p>The organization Feminist.AI (to be distinguished from the conceptual approach Feminist AI) is a collective based across three cities (LA, SF, and Holdenville, OK) which strives to redefine AI, shifting its development from private companies and academic settings toward community-driven and socially-driven futures. Feminist.AI is developing a project called Feminist Search that explores many of the issues and approaches of Feminist AI practices through community-driven research and prototyping. Feminist Search actively highlights the work of feminist theorist Dr. Safiya Umoja Noble and her book <a rel="nofollow" href="https://nyupress.org/9781479837243/algorithms-of-oppression/">Algorithms of Oppression</a>. </p>

<p><strong>Feminist.AI Project: Feminist Search (Searching for Ourselves) </strong><br />
Feminist.AI emphasizes and employs critical prototyping, participatory-focused approaches (Bardzell), acknowledging creators, de-centering the human (Adam, Hayles, Braidotti), and viewing embodiment as beyond the enfleshed, as a body-self---living and indefinite (Blumenthal). Feminist.AI is an explicitly feminist practice, and this value appears in our projects, including the recent offering Feminist Search, which involved co-creating a Feminist Search Engine. Sarah Ciston has <a rel="nofollow" href="http://2019.xcoax.org/pdf/xCoAx2019-Ciston.pdf">written</a> elsewhere in more detail about bringing intersectional methodologies to artificial intelligence.</p>

<p>Feminist Search addresses this multi-faceted approach to search and offers visual entry points as a starting place. It requires participatory engagement and critical prototyping. Feminist Search also engages the challenges of working within a binary, asking how that construct impacts the weighting and utilization of specific models, as well as how the interface design can highlight data bias or community contributors.</p>

<p>Search is one of the most commonly used algorithms in the world, and it is dominated today by several large private corporations, most notably Google. Google's search algorithm is driven by a combination of usage and ad revenue. Google's first search algorithm, PageRank, measured and prioritized websites according to a number of factors; a more recent explanation is detailed in the video <a rel="nofollow" href="https://www.youtube.com/watch?v=0eKVizvYSUQ">"How Google Search Works (in 5 minutes)."</a> To summarize, a search engine is composed of a database (which can include things like images, webpages, videos, and PDFs) and algorithms that interact with information on the web (such as text, metatags, and links). When searching, a user enters a text-based query and results are returned via text, voice, or images. Search engines use web crawlers (spiders, spiderbots) to collect information about pages. These crawlers summarize links, tags, and metatags, and share this information with the search engine. The "spiders" crawl across pages to do several things---find new content, index information, and rank pages. There is no human intervention---these algorithms are deployed in real time to gather information.</p>
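<p>The indexing step described above can be sketched in miniature (a toy illustration only, far from how any production engine works): crawling reduces to mapping each term encountered to the set of pages that contain it, and answering a query becomes a lookup and intersection in that inverted index.</p>

<pre><code># A toy inverted index: map each term to the set of pages containing it.
pages = {
    'page1': 'feminist approaches to search and data',
    'page2': 'community data donation for search',
    'page3': 'visual search prototype',
}

index = {}
for url, text in pages.items():
    for term in text.split():
        index.setdefault(term, set()).add(url)

def query(*terms):
    # Pages containing every query term: a set intersection.
    results = [index.get(t, set()) for t in terms]
    return set.intersection(*results) if results else set()

print(sorted(query('search', 'data')))  # prints ['page1', 'page2']
</code></pre>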

<p>While these explanations clarify the process somewhat for the non-expert, they do not address problematic features such as those covered by Safiya Umoja Noble, beginning with her 2012 article <a rel="nofollow" href="https://safiyaunoble.files.wordpress.com/2012/03/54_search_engines.pdf">Missed Connections: What Search Engines Say About Women</a>. In both this article and her 2018 book, Algorithms of Oppression, Noble discusses the troubling intersection of search results and Black bodies, including searches for the term "black girls" returning a page of primarily pornographic results, and searches for professional hairstyles returning pictures primarily of white women. Additional problems with web search involve indexing algorithms and filter bubbles. Specifically, information is linked, indexed, and personalized using factors and decisions that are not transparent. Professor Noble highlights the lack of a review process for determining what is hyperlinked (Noble 41). Unwanted bias can also occur with filter bubbles, which can serve up narrow information in personalized search.</p>

<p>Starting with a visual search prototype that uses community-sourced images, Feminist.AI will build a larger visual search engine powered by community definitions and informed by library science and critical theory. With a belief in users not only as contributors but also as owners of their experience and information, an editable, co-created search tool will be developed in which users have control over access to their information and how it is used for development. Using this as inspiration, the Feminist.AI community proposes to create an alternative to private search engines with the working title Feminist Search.</p>

<p>Feminist Search seeks to add transparency to the search process and promote education about factors that determine how search results are created. The project creates multiple entry points for community members to learn about how information is classified, trained and accessed, making search more understandable and accessible. In our next post, we will go further into details of the <a rel="nofollow" href="http://wg20.criticalcodestudies.com/index.php?p=/discussion/88/week-3-feminist-search#latest">Feminist Search</a> project.</p>

<p><strong>Critical Questions:</strong><br />
1 - How does engaging feminist principles and practices around embodiment, community, and critical prototyping shift how code might be read or written? How do they shift how you understand or engage AI more broadly?</p>

<p>2 - Historically, the contributions of female-identified persons have often been overlooked in the main narratives of events, as seen in the stories of one of humanity's most famous scientific achievements - <a rel="nofollow" href="http://wg18.criticalcodestudies.com/index.php?p=/discussion/18/week-1-colossus-and-luminary-the-apollo-11-guidance-computer-agc-code#latest">the Moon landing in July 1969</a> - or in the algorithmic processes of weaving or knitting. How can we reframe and re-evaluate traditionally "female" practices in light of today's emphasis on STEM education and create feminist spaces for both learning and developing coding practices?</p>

<p>3 - Can we imagine different sensory approaches for querying information? Currently, we use computers, with typing, voice search, and touch on screens. Could we simply improve our screen interfaces to incorporate new visual and interactive models of knowledge? How might we synthesize human and natural environments through search? How can we incorporate culturally specific ways of exploring knowledge through artifacts?</p>

<p><strong>Contribute to this research!</strong><br />
We invite you to contribute to this research. We are currently seeking data donations for our Feminist Search prototype. You may donate your data here: <a rel="nofollow" href="https://aidesigntool.com/feminist-search">https://aidesigntool.com/feminist-search</a></p>

<p>Thank You CCSWG 2020 community: Mark Marino, Jeremy Douglass, Zach Mann.</p>
]]>
        </description>
    </item>
   </channel>
</rss>
