inferentialkid

Codes

In Sessions on September 24, 2015 at 7:45 pm


Agenda:

  • Q & A with special guest James J. Brown, Jr.
  • Demonstrations of text visualization projects (Corey, Sarah)
  • Demonstrations of “creative visualization” (Ruth), “composing voice” (Val), and “zooming out” (Sean) projects.
  • Discussion of Ethical Programs

Reading for Next Week:

Assignments for Next Week:

  • Deanna & Chris: create a lesson plan/assignment around MyHistro
  • Ruth: create a lesson plan around NB
  • Corey: create a lesson plan around a.nnotate.com
  • Nathaniel: create a lesson plan around Audacity
  • Everyone else: write a 300-500 word response to the above reading and bring a hard copy with you to class
  1. The text of the question for Jim Brown:

    Earlier this year, we learned the story of Chris Roberts, a security researcher who hacked into the control systems of an in-flight airplane he was on. His hack apparently targeted the plane’s thrust system, and he was able, as a passenger, to command one engine of the plane to climb, causing the aircraft to change course laterally.

    Roberts had previously presented on the security vulnerabilities (exploits accessed through in-flight entertainment systems) and claims to have spoken to (“yacked” with) airplane manufacturers multiple times before determining that those conversations were dead ends. It was only after the yacking that he went about hacking.

    In the controversy that followed, Roberts was subjected to a search by the FBI, and some called for a criminal investigation and jail time for his alleged actions. Assuming Roberts did hack the airplane in flight, thereby endangering the lives of himself and everyone aboard, I’m interested in pursuing the issue of the ethics of hacking.

    On the one hand, I completely agree with you when, in your book, you question the utility of Mashable.com and others admonishing Delphin and the others who hacked Twitter through the onMouseover exploit (a hypothetical sketch of that kind of attribute injection follows this question). I’m with you all the way when you say that “Moralism and recommendations about what should have been done by these hackers is not a particularly useful response to this situation.” And I do see that much of your work here is concerned with the software infrastructure of networked, hospitable spaces.

    But I think it may be helpful to grapple with the ethics of this particular situation. Clearly the stakes are higher in the airplane exploit than in the other kinds of exploits, so this is where I’d like to dwell for discussion. My question for you, beyond the general avenues of discussion this issue may have opened, is a bit layered: What kinds of responses do you think are appropriate in the Roberts case? What kinds of ethics might we imagine for white-hat hackers? And how might we begin to think through the kinds of communication ecologies or networks that arise when an unknown or intrusive white hat transgresses established laws of hospitality in a manner that could produce dire ramifications for guests whom we might frame as innocent bystanders?
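
    For context on the exploit mentioned above, here is a minimal, hypothetical sketch (in TypeScript, not Twitter’s actual code) of how unescaped user text can break out of an HTML attribute and attach its own onmouseover handler. The function names, payload, and escaping rules are illustrative assumptions only.

```typescript
// Hypothetical sketch of an onmouseover-style attribute injection.
// Nothing here is Twitter's real code; it only illustrates the general flaw.

// Naive renderer: drops user text straight into an HTML attribute.
function renderTweetUnsafe(text: string): string {
  return `<span class="tweet" title="${text}">${text}</span>`;
}

// A crafted tweet closes the title attribute and injects its own handler.
const payload = `hello" onmouseover="alert('injected')`;
console.log(renderTweetUnsafe(payload));
// -> <span class="tweet" title="hello" onmouseover="alert('injected')">...</span>
// Hovering over the rendered tweet would now run the injected script.

// Minimal fix: escape characters that can break out of the attribute context.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/"/g, "&quot;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

function renderTweetSafe(text: string): string {
  const safe = escapeHtml(text);
  return `<span class="tweet" title="${safe}">${safe}</span>`;
}

console.log(renderTweetSafe(payload)); // the injected handler stays inert text
```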

  2. Rationale:
    • In chapter 4 of Remixing Composition, Palmeri claims that Ira Shor’s Critical Teaching and Everyday Life demonstrates the “ways that writing teachers might employ video composing to contribute to social change” (137). For Shor, “popular films and television programs encourage people to view the problems in their lives as wholly individual rather than social—to ignore the ways they can act collectively to challenge social hierarchies of class, race, and gender” (137). For Palmeri, Shor “asserts that mass media produces false consciousness by fostering a culture of ‘spectatorism’—a culture in which people merely consume but never produce the texts of mass media that so pervade their lives” (138). According to Shor, although students are consumers of vast amounts of media, “Each student is not trained to analyze critically the message thrown at her or him or to be a creator of the media filling daily life” (qtd. in Palmeri 241-42). I have designed the following lesson with Shor’s work, and Palmeri’s call to extend Shor’s work, in mind. In this lesson/unit, students will study the persuasive strategies of commercials. At the end of the lesson/unit, students will produce a parody commercial that undermines or criticizes the ideological assumptions of an advertisement. In this way, students will think critically about media and become producers of media.

    Learning Objective:
    • By the end of this lesson, students will be able to both identify and think critically about the persuasive strategies used in commercials

    Materials:
    • Commercials on YouTube
    • Video cameras, editing software, etc.

    Discussion:
    • Students will begin by discussing visual rhetorical analysis.
    • We will then discuss the techniques used by advertisers, such as editing, composition, angle, lighting, voice-over narration, music, etc.
    • Students will then spend time analyzing a variety of advertisements.
    • The questions that drive class discussion are: What message (both implicit and explicit) is being communicated by these advertisements? More importantly, how is the message communicated? What ideological assumptions are driving the advertisements? What ideological assumptions are reinforced by the advertisement?
    • We will then look at the way that these ideological assumptions are either subverted or criticized in parody ads.

    Assessment:
    • Students will produce a commercial parody that is a response to one of the commercials viewed in class.
    • Students will write an essay explaining the choices they made in the production of their parody ad.

  3. Tori Reeder

    James Brown’s text takes up a sort of pentadic criticism of scholars’ lack of concern with infrastructure. He states that his focus is not on digital hospitality per se, but on addressing the issue of invited and uninvited audiences (Brown 2). Brown uses the term ‘ethical programs’ to mean, in this context, how software’s infrastructure is developed to address ‘ethical predicaments’ (Brown 5). He then goes on to say that “Ethical Programs focuses on how tools such as MediaWiki and Twitter enact ethical programs and express arguments about how best to contend with hospitality” (Brown 6). What is especially interesting here is Bogost’s concept of “procedural rhetoric.” I found it productive to think about procedural rhetoric as a device as well as an analytical tool for addressing infrastructural concerns (Brown 16). Moreover, procedural rhetoric, in Brown’s words, offers an extensive ‘rhetorical theory,’ and that theory is itself infrastructural (Brown 30).

    This leads me to think about delivery, not just as a rhetorical canon but in the sense of virtual physicality: to consider how digital writing’s infrastructure is maintained through such systems, and how that becomes an alternative form of delivery, moving from delivery as performance, a sort of digital collage, toward a virtual physicality. That is to say, infrastructural concerns ultimately turn into concerns about delivery. I find that delivery acts as a circulatory system, and, much like Rice, I would like to focus on how a digital infrastructure influences the spatial relation of ideas.

    I also find it interesting to examine the connection with computational ethics. Brown articulates it this way: “every act of internetworked writing requires ethical questioning. Given this view, ethics is not just a matter of the occasional problematic episode; rather, ethics, like audience, is a factor of every rhetorical act” (Brown 31). That is to say, “questions of ethics are always tied to these questions of ethos.” It goes without saying that delivery is inextricably linked to both of these elements and that infrastructural ideas weigh heavily on ethics and ethos. Brown brings to our attention that “Hyde draws upon Aristotle and Martin Heidegger to rethink ethos in terms of how a rhetorical situation ‘transforms the spatial and temporal orientation of an audience, its way of being situated or placed in relationship to things and to others’” (Brown 36). With that in mind, I had the same question that Brown had: “How does that conversation affect users and developers alike” (Brown 99)? I would be interested to know the implications for both parties involved. On a different note, I appreciated that this text urges us to place delivery, or an “attempt to speak well,” in the context of digital infrastructure, and to address these attempts by examining constraints as well as limitations on speech, code, and writing (Brown 35).

  4. Unsettling Systems
    If software helps to code our rhetorical and ethical engagements, then do we run the risk of putting in place an ethical structure that is immovable, that loses sight of the unconditional, that focuses too much on a specific context, and that treats ethical questions as settled? While this is a concern with any kind of ethical infrastructure, a general distrust of computing makes us especially aware that software may code our ethical dwellings in final, inflexible ways. (38)
    While Brown’s question and subsequent analysis in the quotation above are relevant and require our attention, research, and analysis, I wonder if this awareness may carry implications more far-reaching than what Brown has suggested. Of specific concern is the treatment of ethical questions as “settled.” The coding of software runs the risk of becoming “evidence” of ethical principles and/or concepts as they are interpreted by users. After a user runs into an ethical program enough times, or is confronted with what is rhetorically presented through the delivery, style, and arrangement of the ethical program, that user, as Brown indicates, may develop a “general distrust” or, worse still, fall prey to what were previously “unsettled” ethical arguments.
    If our interactions are with programs that are written, at least primarily, by people, do we risk a certain “conditioning” of ethical issues through the rhetorical arguments that these programs “send” through their arrangement, style, and delivery? In other words, our relationship and interaction with systems and software (ethical programs) has the reverse effect of dictating back to us what is ethical; the messages software sends, even in the face of evidence that something is wrong with “the system,” become a disembodied authority, above reproach and beyond question. This is quite unsettling to me.
    Also, Brown hinges his theory on Derrida’s Law of Hospitality, and while I agree with its main premise, I wonder if a more appropriate lens for what is taking place with, in, and about ethical programs in networks is Foucault’s notion of culture, wherein culture is established not only by inclusion but also by systems of exclusion and selection. I am thinking here about the Wikipedia example, for it seems more representative of Foucault’s theories of culture. Without a system of exclusions and inclusions, the system will lose its ethos. While Wikipedia creates its artistic, or situated, ethos by allowing a free and open society of “knowledge workers” to edit, its inartistic ethos is always in question.

  5. The whole idea of hospitality in regard to the internet is intriguing to me. There has to be hospitality in order for the internet to function and be of use to anyone, but, as anywhere else in life, given enough hospitality there will always be someone who wants to exploit it. If there is hospitality, then eventually a gap will be found and exploited. But what happens to those who exploit the gaps?
    The most interesting aspect was the manipulation of volunteers through the MyBO program. At first, points were awarded to volunteers based on the activities they participated in. Of course, there were users who took advantage of the hospitality and did as little work as possible to gain points and prizes. The whole idea behind the point system was to control the volunteers and push certain activities on them by assigning larger point values to certain campaign activities (a hypothetical sketch of this kind of weighting follows this response). To make matters worse, the point system was eventually scrapped and a different system was implemented to motivate volunteers.
    The idea of assigning points seems unethical. The word volunteer itself insinuates that the act being performed is done without hope of payment. Furthermore, the information that the volunteers gained was collected, charted, and used to help Obama gain the advantage and, ultimately, the presidency.
    Later on, Brown discusses the exploit of Twitter. The “Twitter-verse” was affected, but it was not a malicious attack. Twitter itself is now losing users to third-party platforms, and therefore losing revenue from ad space. What happened to the person who exploited Twitter? His actions ultimately injured Twitter, its revenue, and its number of users. The Obama campaign bribed volunteers with prizes. Are those in power above reproach when it comes to hospitality and its misuse? Does the fact that these unethical events happen online, in a virtual world, make them any less worthy of a response or punishment?
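
    As noted in the response above, here is a small, hypothetical sketch (in TypeScript) of the kind of weighting a point system like MyBO’s implies. The activity names and point values are invented for illustration and are not drawn from the actual platform.

```typescript
// Hypothetical point weights: larger values steer volunteers toward the
// activities the campaign values most. All names and numbers are invented.
const pointValues: Record<string, number> = {
  hostEvent: 15,
  makePhoneCalls: 10,
  knockDoors: 10,
  writeBlogPost: 3,
  joinGroup: 1,
};

// Total score for a list of completed activities.
function score(activities: string[]): number {
  return activities.reduce((total, a) => total + (pointValues[a] ?? 0), 0);
}

// A volunteer gaming the system stacks cheap, low-effort actions...
console.log(score(["joinGroup", "joinGroup", "writeBlogPost"])); // 5
// ...while the weights push others toward the higher-value campaign work.
console.log(score(["hostEvent", "makePhoneCalls"])); // 25
```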

  6. Reading Response: Ethical Programs

    The section of “Processing Power” that describes phone bank scripts from Obama’s presidential campaign led me to think about scripts that I’ve followed in past jobs (training manuals, etc.) or the automated phone systems I’ve called so often that I’ve memorized the prompts and can key them in without listening first. For many of us, both students and teachers, maybe a script for the five-paragraph essay sits in the back of our minds. Maybe we have in our minds the most efficient way to order a coffee, put in a lunch order, etc., from any number of retail or service jobs we’ve held. If you answered your phone and the person at the other end just started talking about Obama and asking you who you planned to vote for, wouldn’t it seem off-putting? We have become comfortable with procedures and protocols in a variety of circumstances. Perhaps this is the same reason students fall back on known genres when they are unsure of a new writing situation. It’s not that they believe this path to be correct, but they believe it has gotten them to their destination (a good grade on an essay, cooperation with coworkers) in the past. Why do we choose scripts or procedure over original thinking? Do we trust machines more than people? Answering that question yes or no suggests that people and machines “think” entirely differently. I think Brown would say that we are making a rhetorical choice to persuade others with procedure, since studying digital procedures and human procedures reveals that both are rhetorical.

    Brown describes Derrida’s definitions of Law and laws, pointing out that the “pure” hospitality of Law ultimately turns into “its opposite” (24). The rhetorics of procedure “continually oscillate between these two poles, between the Law and the laws” (10). Thinking through this idea brought to mind our discussion (last week) of grading with rubrics. Is it clearer to use a rubric or use a mentor text? (And, in the end, is there a difference?) On the surface, using a rubric might seem clearer. But strict adherence to Law (rubric) or law (flexible rules based on example) is not an ethical procedure. Perhaps fair grading is always a combination of these two ideas? What would networked grading look like? Brown’s discussion of ethos suggests that the responsibility for arrangement is shared with readers in a digital context. Readers select their route into a text. For this reason, for digital projects, should multiple readers be included in grading? Perhaps this idea doesn’t totally work, since people who “show up” in a network aren’t limited to classmates. Still, I think it’s an interesting consideration for grading.

    The swarm is self-organized (4). Edbauer calls for a “more fully theorized rhetoric as a public(s) creation” (169). In both of these ideas, the specific “who” seems to be left undefined. Bitzer’s rhetor is similarly undefined, so we try to construct a profile. I like the idea that analyzing the processes of machines could lead us to reconsider or rearticulate the terms we use to discuss human processes for writing or producing texts.
