Making Machines

Image Credit: Robot by stevent

On Wednesday, July 10, 2013, a robot will be giving me a writing assignment. That robot is a Twitterbot called @makingmachines, and it will pull two random texts from the table of contents of Patricia Bizzell and Bruce Herzberg’s The Rhetorical Tradition.
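
For those curious about the mechanics, a pairing generator of this sort needs very little code. The sketch below is not the bot’s actual source (the toc.txt file and the generate_assignment function are stand-ins for illustration); it simply assumes a plain-text list of the book’s table of contents, one entry per line:

    import random

    # Minimal sketch of a @makingmachines-style pairing generator.
    # Assumes "toc.txt" lists the table of contents of The Rhetorical
    # Tradition, one entry per line (e.g. "Gorgias, Encomium of Helen").
    def generate_assignment(toc_path="toc.txt"):
        with open(toc_path, encoding="utf-8") as f:
            entries = [line.strip() for line in f if line.strip()]
        first, second = random.sample(entries, 2)  # two distinct texts
        return "Your assignment: mash up {} and {}.".format(first, second)

    if __name__ == "__main__":
        print(generate_assignment())

Posting the resulting pairing to Twitter is a separate step that the sketch leaves out; the point here is only the random selection of two texts.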

So, my “assignment” will be generated by an algorithm, but the writing itself will also follow a set of procedures. That is, it will be constrained in various ways. I’ll say more about those constraints below, but first I should take a moment to answer the question many are probably asking right now: “Why?”

Let me begin by explaining the roots of this project. A few years ago, I began thinking about using the “mashup” as a method for generating theoretical concepts. I described this method in a previous blog post and also in a Computers and Composition article. Without getting into too much detail about these previous writings, my aim was to treat the musical mashups of DJs like Girl Talk as a method for rhetorical invention. I asked students to mash up rhetorical theorists, and I even carried out some of these mashups myself. I began to consider using this method as part of an extended argument about digital rhetoric, and I started thinking about how I might randomly combine rhetorical theorists to generate new concepts for digital rhetoric. These concepts would be attempts to build theories for creating and analyzing texts and objects in digital spaces.

In the last few years, my interest in this project has expanded. I am now interested both in the mashup method and in a detailed consideration of the procedures that might generate those mashups. These expanded interests are partly linked to my attempts to consider how nonhumans participate in rhetorical ecologies. In addition, I have been inspired by the work of Darius Kazemi, Mark Sample, and others who have explored how computation can be used to generate things (from shopping lists to jokes to rap lyrics).

From this group of interests and influences comes a project I’m calling Making Machines, which takes up the making of machines (digital, discursive, or otherwise) and how we might consider theories to be “making machines” – machines that help us make things. Given that I’m a rhetorician, Making Machines is focused on understanding and expanding “the available means of persuasion.” My hope is that the concepts that emerge out of this project will be useful both for those trying to persuade and for those trying to understand how persuasion operates. My primary focus is on digital rhetoric – on how persuasion operates in digital environments – but my hope is that this discussion can extend into other types of environments as well.

With this background in mind, here are two reasons for pursuing this project. As I write and make, more reasons will emerge, but this is what is currently motivating the project:

1) This project aims to make explicit the algorithms and procedures that already shape writing, argument, and theoretical innovation. I see rhetorical theory as a set of algorithms for generating and interpreting arguments. Rhetorical theory offers us a set of procedures by which we might make arguments or understand the arguments of others. Making Machines is an attempt to take that premise one step further, to view rhetorical theories as machines and to make some of my own machines.

2) Making Machines attempts to demonstrate what emerges from human-machine collaborations. We make machines, and they make us. For me, one of the clearest articulations of this comes in Katherine Hayles’ Electronic Literature: New Horizons for the Literary. Hayles urges us to consider “the human and the digital computer as partners,” and she describes how “humans engineer computers and computers reengineer humans in systems bound together by recursive feedback and feedforward loops” (47-8). This new project puts me into these recursive loops, alongside various machines, and asks what emerges from such collaborations. In order to ask this question, humans have to be willing to see that they are sometimes “machinic.” This is what I’m trying to do by placing myself at the mercy of various machines, rules, and procedures.

I’m writing about this project in the interest of sharing both my process (the procedures that shape the project) and the eventual products (the theoretical “mashups” that will emerge from this process), but I’m also writing this so that I am accountable to a broader audience. (For instance, sharing this process means that I can’t just run the @makingmachines bot until I get a favorable pairing.)

So, here are the constraints that shape the writing and making of Making Machines:

1) I am tasked with “mashing up” the two texts named by @makingmachines in order to create a new rhetorical concept. I can only use these two texts. If the bot selects a text that is excerpted in The Rhetorical Tradition, I will be using the entire text and not just the excerpt.

2) This “mashup” will be an essay, and that essay must be exactly 3000 words.

3) I must then use the concept that I’ve created to generate some kind of digital object.

4) I must complete composition of both the essay and the digital object before the bot generates the next pairing (the bot currently generates a new pairing every month). [Note: This time period will likely have to change once the academic year starts. For this “pilot” run, I’m giving myself a month.]

5) I will publish the results at http://makingmachines.jamesjbrownjr.net. [Note: This page does not yet exist.]

I’ll post more about this project as it progresses. You can also follow @makingmachines if you’re interested in seeing the pairing it generates on Wednesday.

