Seeing Ourselves Through Technology: Week 2

As I continue with Jill Walker Rettberg’s book Seeing Ourselves Through Technology, I’m enjoying how much I’m able to relate her writing to my own life and social media use. As I said in my post last week, I am becoming more reflective about my self-representation online and how I use and filter my social media.

In chapter 4, Rettberg focuses a great deal on algorithms. She notes how automated our lives have become thanks to apps and other programs that track our routines for us. No longer do we have to be aware of the steps we take or how we sleep at night. Our phones track our locations, our sleep patterns, our social lives, and our health, sometimes without our lifting a finger. However, the main flaw Rettberg identifies is the computer's inability to recognize what is important to each individual. Technology has made incredible strides in data tracking; the next step is getting the computer or phone to analyze that data for us in a meaningful way. One example Rettberg gives is Facebook's annual "Year in Review" videos.

I remember the first year that Facebook piloted these videos. I was excited to be hit by a wave of nostalgia as Facebook replayed my most meaningful life moments for me. Strangely, though, as I watched I found that Facebook had attached meaning to events I barely remembered. I hadn't assigned those events the level of meaning that Facebook had. An excellent example of this happened the day after I shared my Year in Review video. I got a call from my dad. He was surprised and a little hurt that my brother, mom, and step-dad had been included in my video while he hadn't. Facebook's algorithm didn't understand that my dad, though less active on the site than my mom, was just as important to me. As far as the algorithm knew, I didn't interact with him online as regularly, so he didn't need to be in my video. Luckily, my dad quickly understood once I explained it to him. But the incident shows perfectly what Rettberg is getting at: computers cannot reliably identify which events, pictures, locations, etc. hold the most meaning for us. They are absolutely trying, but this is a level of artificial intelligence that still eludes us.
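To make the point concrete, here is a hypothetical sketch (not Facebook's actual code, which isn't public) of how a naive "Year in Review" selector might work if it ranked people purely by interaction counts. The names and numbers are invented for illustration; the point is that anyone who matters deeply but is rarely active online simply falls off the list:

```python
from collections import Counter

def pick_review_people(interactions, top_n=3):
    """Rank people by raw interaction count, a naive proxy for importance.

    `interactions` is a list of names, one entry per like, comment, or tag.
    Anyone who rarely shows up online is dropped, however much they matter.
    """
    counts = Counter(interactions)
    return [name for name, _ in counts.most_common(top_n)]

# Dad matters just as much, but barely appears in the activity log.
activity = ["mom"] * 40 + ["brother"] * 25 + ["step-dad"] * 18 + ["dad"] * 2
print(pick_review_people(activity))  # → ['mom', 'brother', 'step-dad']
```

A heuristic like this has no notion of offline relationships; it can only see the data it is fed, which is exactly the gap Rettberg describes.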

As the internet and our devices keep track of our lives automatically, we find ourselves caught in a web we cannot escape. We have reached the point where our lives are documented and we can't turn that off. Rettberg points out that even deleting an app or discontinuing use of an account doesn't remove the data. Anything you turn off can be turned on again another day. Our data is out there, and our phones and devices are continuously monitoring us. One quote from Rettberg that really struck me was "even your Facebook timeline doesn't stop with death". You would assume that a social media account focused on the story of your life would end when that life ends. Yet even after you've passed, your social media is still there to act as a grieving outlet for some and a place to share information about your life.

Though many may hear this and be upset or frightened by how much of ourselves we've given over to technology, I find it consoling. I appreciate knowing that I can visit the Facebook timeline of my friend who passed away four years ago. I can scroll through her wall and read the kind things people continue to say about her as she passes through their minds. I can look through her pictures and catch a glimpse of her life. In this way, social media continues to keep us close to loved ones we've lost.

Technology has automated so much of our lives that we aren't always fully aware of how much we're being monitored. Now we just have to ask ourselves how far is far enough. At what point does the tracking make us uncomfortable? Or will we continue down this path, trusting that the loss of privacy is outweighed by the benefits of data collection?


One thought on “Seeing Ourselves Through Technology: Week 2”

  1. thehunkfin says:

Your book sounds so interesting! The example you give about your dad and Facebook's interaction algorithms rings true for so many people in our lives. Just because the technology decides that something is important and something else is not, the humans involved end up with hurt feelings. It is interesting to think how much those algorithms are dictating how we feel.

It reminds me of an episode of the podcast Note to Self that I listened to last quarter. The host invited Jacky Alciné on the show to talk about his personal experience with a racist Google algorithm. Because the majority of web designers, coders, and engineers are white men, the ‘brains’ of these technologies largely reflect white, male perspectives. Because of this, when Jacky and his friend took a selfie and uploaded it to Google Photos, it labeled them as ‘gorillas’. Google’s architects and engineers were inputting data from their own perspective and completely overlooking other cultures, races, socioeconomic statuses, genders, etc.

    I’m sorry I don’t know how to make a hyperlink in a comment but here’s the podcast:
    http://www.wnyc.org/story/deep-problem-deep-learning/

