Author: Rebecca Miller

A Woman on Walden Pond? (a sample blog post)

(This is an example blog post I created so English 340 students could see the potential style of a blog post and begin to create their own. To get started, log in and then click on “+ New” at the top of the page. You can write about almost anything you want related to tech, digital lit, Thoreau, etc. But let me know if you have blogger’s block; I have some ideas I can share!)

I began this post with a question: What would it look like for a woman to go into the woods, as Thoreau did? What would it look like for a woman to join network marketing site Wake Up Now like Damien?

I don’t believe there are many popular narratives offering that possibility, or exploring that outcome. The only literary option that came to mind was Eat Pray Love, but one could argue that the protagonist’s ability to drop everything and travel the world puts her in a privileged position not representative of all women. So, what happens when a regular-Joe lady decides to escape our capitalist rat race of a society in a search to find herself?

Maybe I’d like Thoreau more if he ate more gelato…

Consider the world of gaming. Virtual reality scenarios could be considered a modern-day escape from the world. They don’t make you a living or offer the possibility of sustaining yourself financially, so I suppose you never truly leave the rat race, but they offer a temporary escape into another world where less is at stake and you can play out scenarios impossible in real life. Women are essentially barred from this space in a number of ways, including misogynistic and violent story lines and, maybe more importantly, actual harassment and threats. In the Gamergate controversy of 2014, gamers coordinated to threaten women involved in gaming, including Anita Sarkeesian, the creator of Feminist Frequency, a site dedicated to exploring “representations of women in pop-culture narratives.”

Follow Anita Sarkeesian on Twitter at @femfreq

Sarkeesian received really upsetting death threats (and more), but she continued to speak about the representation of women in video games, and about cool+smart digital humanities topics such as anonymity, narrative control, and questions of access.

So, when women try to enter spaces offering escape, they usually aren’t safe. This is sad, because even Thoreau acknowledges that a retreat to the woods isn’t for everyone, and there are a lot of ways to find oneself in this world. Maybe the whole project of finding oneself is meant for the privileged, and is therefore totally flawed; or maybe the system at large that people are escaping from should be fixed rather than run from; or maybe we need to work harder to ensure open access, as it were, in these escape scenarios…

I think it was Allison who said in class that she felt Thoreau came from a position of great privilege. And I agree. And in the end, I think the question we as digital humanists and admirers of Thoreau need to ask is, how can we engage with this work responsibly? How can we create safe spaces for everyone in comment sections, in games, on social media, or surrounding digital texts?

One heartening example: Remember our class discussion about Storify? As Darby described in class, it’s a place of narrative control, where people literally embed things into narrative form. Another oft-harassed Twitterer is Black Girl Dangerous. She uses Storify to save whole Twitter conversations as they happen, so even if people go back and delete comments, or make false claims about who said what first, she has archival proof of what happened and how. This is one way people venture into escapist places (Twitter) and use digital humanist-style tools to work toward safety in those spaces.

Any more examples of online harassment? Ideas for remedies? Arguments or agreements? Comment below!

“iridescence, for Piano and iPad”

Just a quick thought:

So a friend of a friend of a friend of mine is an extremely talented pianist (check out his website if you don’t believe me…). Among his other groundbreaking work, he experiments with classical music and technology. The video I’ve embedded here is of a piece titled “iridescence, for piano and iPad.” How many pieces ARE there for piano and iPad? It’s hauntingly beautiful, and speaks to classical music fans and perhaps to fans of electronic music as well… he’s almost like a DJ, making things he plays repeat on the iPad to blend with what he’s currently playing, essentially making his own background music as he goes.
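For the technically curious, that looping idea can be sketched in a few lines of Python. This is purely my own toy model, not how the actual piece or any real iPad looper works: a recorded phrase cycles underneath whatever the performer plays live, so he accompanies himself.

```python
# Toy illustration (my own invention, not the pianist's actual setup):
# a recorded phrase repeats under the live notes, DJ-style.
from itertools import cycle

def layer(live_notes, recorded_loop):
    """Pair each live note with the next note of the looping background."""
    background = cycle(recorded_loop)
    return [(live, next(background)) for live in live_notes]

loop = ["C", "E", "G"]            # phrase captured earlier on the iPad
live = ["A", "B", "C", "D", "E"]  # notes played over it in real time
print(layer(live, loop))
# [('A', 'C'), ('B', 'E'), ('C', 'G'), ('D', 'C'), ('E', 'E')]
```

Each live note sounds over the loop, and the loop simply wraps around when it runs out, just as the background material in the video keeps circling back under the new playing.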

This is a random post, but listening to this music reminded me of what Dr. Schacht said in class today… that we’re using computers to do what humanists have always done, which is to look at what humans do or make and ask, What does this mean? What an important task. And we’re not the only ones doing it; artists are as well. Just think of the digital literature we read, and how difficult it was to interpret it and determine its value. The world is being analyzed in different ways with different tools, but these “ways” aren’t really new, just old methods blended with new tools. And the results can be beautiful. It just means humanists have to catch up to our subjects.

So take a listen if you’re stressed with papers or finals; I promise you’ll love it!

It’s Written All Over Your Screen

So this isn’t my most thoughtful or deep post, but I’d like to backtrack a moment to possibly the number one concern I hear from people who are wary about digital texts: reading off of a screen. And it really can be uncomfortable! We’ve all experienced the tired eyes and headaches that come from staring at our laptops too long, and the sun’s glare that forces us inside when we need to use our computers. According to one article, “the issue has become so prevalent in today’s work environment that the condition has been officially labelled by the American Optometric Association as ‘Computer Vision Syndrome.’”

I would also argue that there’s a more philosophical relevance to this concern: the screen is the face of the computer. The field of interaction between user and machine is primarily located on the screen (the tactile experience of typing is of course also relevant, though perhaps less complained about). So, if this interaction is to be comfortable and integrated into everyday life, the screen needs to be user-friendly.

Let’s see what’s being done to improve this experience, focusing on laptop computers (of course ebook tools such as Nook and Kindle have done more on this front, but these aren’t the devices people are using for hours on end).

Laptops used to be black and white, prone to blurry screens and ghost images. But by about 1991, color LCD screens came into use, which improved visual quality as well as cost for consumers. Nowadays, of course, laptop screens are at a whole new level, with the emphasis being on resolution and the implementation of touch screens. Apple’s MacBook Pro has a new “Retina display” that’s supposed to have incredibly high resolution (the MacBook Pro 15″ Retina Display is literally advertised as “eye-popping”). However, there seems to be little to no push for these screens to be easy on the eyes. Improvements generally focus on bright color and high definition, but that “brighter and bolder” sort of thinking would intuitively seem to me to make things worse.

The MacBook Pro Retina Display…does seem to reduce glare!

In fact, there are articles dotting the web about how to reduce eye strain yourself. These range from buying computer glasses to sitting up straighter to taking breaks. Clearly this is a common concern. However, the physical screens themselves are not being made more ergonomic and healthy by engineers. I could only find one company on a mission to reduce eye fatigue through engineering better screens. They are concerned with using direct current to reduce perceived flicker. However, this is a niche company, not a mainstream laptop producer truly implementing revolutionary technology into its products.

So my mom WAS right when she said I should sit up straight….
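To build intuition for why direct current matters here: the usual alternative for dimming a backlight is pulse-width modulation (PWM), which switches the light fully on and off many times per second. The following is a toy Python sketch of my own (not the company’s actual engineering): both methods average out to the same brightness, but only one of them pulses.

```python
# Toy sketch (my own illustration, not any vendor's implementation):
# two ways of dimming a backlight to 30% brightness.

def pwm_brightness(t, freq_hz=200.0, duty=0.3):
    """PWM dimming: full power for `duty` of each cycle, zero otherwise."""
    phase = (t * freq_hz) % 1.0
    return 1.0 if phase < duty else 0.0

def dc_brightness(t, level=0.3):
    """DC dimming: a constant, reduced drive level -- nothing to flicker."""
    return level

samples = [i / 10_000 for i in range(10_000)]  # one simulated second
pwm_avg = sum(pwm_brightness(t) for t in samples) / len(samples)
dc_avg = sum(dc_brightness(t) for t in samples) / len(samples)
# Both average out to ~0.3 brightness, but the PWM signal jumps between
# 0.0 and 1.0 two hundred times a second -- the "perceived flicker"
# that direct-current dimming avoids.
```

The numbers (200 Hz, 30% duty cycle) are made up for illustration; the point is just that your eye integrates PWM pulses into the same average brightness as a steady DC level, while some people still perceive the underlying on/off switching.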

Does anyone else know more about this, or have special tips/screen appliances they use to reduce eye-strain?

WWMS: What Would Marx Say about Digital Commons?

Perhaps this is just because I’m currently reading The Manifesto in Humanities, but with all this talk about the consequences for private property in the digital age, I was wondering what Marx would have to say about all of this. The answer I arrive at is vague and pretty unhelpful (like Marx himself on the whole), but I’ll get there in a minute.

Before all this talk of communism, John Locke wrote about the implications of property ownership as early as the 17th century. He writes in his Second Treatise of Government (based on my limited Humanities knowledge) that property was originally defined by what you could hunt and gather for yourself without wasting. This created a level of equality among people, because amassing enormous wealth would be physically taxing, and people stopped collecting things when their “natural” needs were met. Things like berries and meat spoil quickly, so it would make no sense to hoard them. The appearance of money and its triumph over the barter system changed the way people owned things. Now people could own things unequally, and theoretically amass unlimited amounts of wealth that lasts and accumulates. This sounds like the capitalist system we have today.

John “Locke”: Because he was exercising his natural right to liberty…

Marx, of course, credits any social development throughout history to economics: essentially, the distribution of property. Engels is really the one who describes the property distribution between the upper-middle bourgeoisie class and the working proletariat. The workers are stuck in an endless cycle of poverty. Marx writes in his Manifesto that the typical system of “synthesis” that happens when “haves” and “have-nots” clash will not be possible when the bourgeoisie and proletariat of capitalism inevitably meet their end. Capitalism will finish, and some sort of unimaginable revolution will happen. Communism, or the equal distribution of wealth, is the best way to stop putting band-aids on capitalism and urge on this “revolution.”

Blamed Capitalism before it was mainstream….

So, WWMS about the question of digital commons, or places online such as Digital Thoreau in which anyone with internet access can “own” something? How can anyone truly own something/the rights to something if digital sites are open-access?

I think the important thing to remember is that the nature of property distribution has changed as texts, ideas, images, etc. have moved online. Nowadays, an artist can’t be assured for one second that she’ll receive money for everything she has published; someone somewhere will undoubtedly have found a way to copy-paste or download or screenshot her work. There’s a block in the money-centered, capitalistic flow of trade that people such as Scott Turow, Paul Aiken, and James Shapiro would argue discourages creativity and production.


This is where things get eerie, because Marx predicts the destruction of means of production as a way to combat the over-production of final-stage capitalism. The sheer volume of things produced on the web makes it a perfect example of capitalism in its final stages. There’s overproduction, then unwillingness or inability to pay on the part of consumers, and then a disincentive for producers to continue… producing.

Communal spaces on the web of course sound kind of communistic in that they equalize people as consumers. However, they’re different from the material property and situation that Marx and Engels were so sure determined everything in the world. In fact, web content seems to me more similar to the berries and meat Locke spoke of. It doesn’t really have an expiration date, but there’s only so much you can download and read and listen to on a computer or in a day. And the amount you download on your computer doesn’t determine your wealth or material situation (unlike money). This is arbitrary property that falls not really under the supply/demand chain of capitalism, but more under the take-what-you-need-but-it-will-take-time-and-effort model of the hunter/gatherer system.

Of course where it differs is that people have to produce online content, whereas deer produce venison for us (thanks, deer). So we still have the problem of production. But, Marx would definitely say that that anxiety is the capitalist in all of us which can’t envision any other way of viewing the world except as a giant factory of creation. However, that still doesn’t help us very much in finding pragmatic ways to encourage production in a communal world without guaranteed payback for your time and effort.

So I think Marx would look at the digital age and the way property has changed in nature and in distribution, shake his head, think of the end of capitalism, smile, and say, I told you this was coming.

Off the Grid… (for a day)

I just got very excited about a New Yorker blog post I stumbled across. The writer, Casey N. Cep, is specifically addressing the “National Day of Unplugging,” in which participants spent a day without technology and posted photos of themselves holding signs about what they did with that free time. She dismisses the idea that this movement is truly meaningful, citing different ways in which technology enhances our lives and explaining why attempts to “escape” it are ultimately unsustainable.

@Pontifex #coolestpopeever

Citing statistics regarding relationships forged online or the Pope’s perspective on the validity of online identities, Cep argues that the concept of a “real” world versus a “virtual” one is an inaccurate binary. She believes that, in its essence, turning off technology is just an extension of the age-old journey to find a “core” and escape “the hustle and bustle of life.” Here’s one quote I found particularly provocative:

“But how quickly the digital age turned into the age of technological anxiety, with our beloved devices becoming something to fear, not enjoy. What sex was for the Puritans, technology has become for us. We’ve focussed our collective anxiety on digital excess, and reconnecting with the ‘real’ world around us represents one effort to control it.”

This guy needs a vacation….

I do understand this sense that technology is out of control and needs to be somehow regulated to curb a feeling of over-excess. I’ve felt it myself sometimes when I’m sitting with Facebook, Twitter, my email, a homework assignment, some syllabuses, etc. open in different windows and tabs on my laptop; my eyes start to blur, and I begin to daydream about how simple everything would be if only technology would just go away.

Cep has an alternative solution in her article. Instead of an over-excess of technology, or rigid nonexistence, we should consider ways to make technology work for us. How can technology function in ways that aren’t overwhelming and socially isolating? How can people be in front of screens and still be healthy and happy? She says, “[b]ut let’s not mistake such experiments in asceticism for a sustainable way of life. For most of us, the modern world is full of gadgets and electronics, and we’d do better to reflect on how we can live there than to pretend we can live elsewhere.” Taking a break from technology is never forever; people participating in the National Day of Unplugging have no intention of going off the grid. So what’s a sustainable, practical way technology can be improved as a permanent fixture of everyday life?

"I'm going off the grid, man."
“I’m going off the grid, man.”

I recently had a conversation with my mom about this. She was saying that while watching TV with me, it feels like we’re doing something together. Like, I’m watching TV, she’s watching TV, we’re watching TV together, almost as if there are three people in a room all having a conversation. However, in the case of laptops or phones, it feels like I’m communicating with the screen while she’s trying to communicate with me. Three participants, but no well-rounded conversation. This may be because of the nature of internet-related activities vs. television viewing (the former active, the latter passive), or it could be the ergonomic superiority of one over the other.

If so, how can a computer screen be enhanced or tweaked to become a better fixture in living rooms, dining rooms, etc? How can it have better manners (excuse my cheesy personification) and not hog or interrupt social interactions? It’s not unreasonable for us to desire these improvements, but it is unreasonable for us to assume they can’t be made.

Of course, decisions about technology usage are subject to personal preference and need. But, specifically regarding the use of technology in literary studies, I think proactive, optimistic attempts at improvement are certainly more useful than rejection and denial. Keep in mind that if the Pope has 3.81 million followers on Twitter, there’s definitely no turning back from the digital age we live in!

(Digital) Literacy

This past summer I spent a bit of time volunteering for an organization called Literacy Volunteers of Greater Syracuse. When I came into the office for an interview, the woman running the program was ecstatic that an inexperienced, untrained 20-year-old had come in to help teach adults who need to learn to read in order to find a job. Why?

Because I’m young and, therefore, “naturally understand the internet.”

Being a member of this generation put me in a position of privilege I didn’t know I had and had never really appreciated before. I was placed in a program geared toward digital literacy that works one-on-one with clients in a computer lab. The program is very much tailored to each client’s needs, abilities, and interests: it can range from setting up a Facebook account to write to their grandchildren to learning how to move a mouse and turn a computer on and off.

The client I worked with the most was a 60-year-old man from the city of Syracuse. He told me that he never did well in school (from my short time with him, I’d guess he had dyslexia or another type of mild learning disability) and had to drop out after 3rd grade. That wasn’t a problem ~50 years ago, though, and he had an easy time finding a job in an airplane parts factory. He said that he never had any incentive to learn how to read, because he knew how to do his job well and could learn by watching other people. Then, this year, when he was only a few years away from retirement, his company shut down and he was left without a job in a market vastly different from the one he knew.

My client vowed he would take two years off to learn how to read and get his GED so he could find something else to do for the last few years before retirement. However, he quickly realized that wasn’t going to be enough. Most job applications are online now, and employers want to communicate via email. What’s the point of learning how to read and write in this day and age if you can’t Google search or type? A lot of the resources LVGS uses to help people are websites such as USA Learns, which use games, quizzes, videos, etc. to supplement their tutoring and make it more interesting. None of this is accessible to someone who has low literacy AND low digital literacy, and what I found is that these often go hand in hand. And this just reinforces the idea for me that nowadays reading and writing are inextricably linked to our screens and keyboards.

I guess technology and literacy have always been intertwined. How much good would it have done you 20 years ago to know your numbers if you couldn’t dial a telephone? But in this particular moment it seems especially crucial to be literate not just in a “knowing how to read” sense, but in a “knowing how to read in different contexts” sense. I’ve been thinking this in class as we wade through learning about XML, blogging, online texts, etc. I mean, why isn’t that sort of knowledge part of a well-rounded education, whether at the primary or secondary level? As important as it was for my client to learn how to email hand in hand with learning how to write, it seems important for English majors to learn how to access digital texts while learning how to read critically.

So, my time tutoring was very eye-opening regarding the idea of the internet/digital world as a collaborative space. This space is becoming as important to access as pencil and paper were in the past, for many reasons. I see the future of English classes as emphasizing what is done online, and becoming intertwined with what my high school called “computer class,” because really, what can one be without the other?

Does anyone have any thoughts? 🙂