Blog

Robotics Workshop Announcement

29 Jun, 2016

Two of our contributors, Dr. Christoph Lutz and Aurelia Tamò, are co-organizing, together with Eduard Fosch Villaronga and Jo Bac, a twinned workshop on robotics to be held in both Barcelona (Spain) and Yokohama (Japan). The two workshops take place on 2 and 14 November 2016, respectively, and will have the same content and format, so participants can choose which one to attend.

Blog

Robots, robots everywhere!*

3 May, 2016

We were among the lucky ones chosen to present at the 2016 We Robot Conference, and for both of us it was one of the best conferences we have ever attended. We Robot 2016 took place at the University of Miami and was organized mainly by Professor Michael Froomkin. From the organization to the speakers, everything was amazing! We won’t address every topic discussed at the conference, but we will give you a taste of the topics and point you to some interesting readings in this area. Interested parties should also check out the We Robot website and consider applying for next year’s edition, taking place on March 31 and April 1 at Yale University.


Blog

Long Live hitchBOT — How to Deal with Robots and the Ethical Issues they Trigger

3 Sep, 2015

The abrupt death of hitchBOT on August 1, 2015 shocked its fans. hitchBOT, the friendly hitchhiking robot, had traveled across Germany, the Netherlands, Canada and parts of the USA. In Philadelphia, however, the robot was vandalized—a scenario he had not been programmed to deal with. And so his journey ended.


Things that caught our eye

The Perils of the New Connectivity…Or How iPhones Can Kill

5 Feb, 2015

I am currently based at the Oxford Internet Institute, where I have access to a wide range of great talks by scholars researching the Internet (in fact, there are almost too many events to attend!). On Monday and yesterday, two talks touched upon the negative and exploitative aspects of the Internet and its connective culture.

The first talk, by Gina Neff, focused on venture labor, a concept that Neff introduced in her seminal study “Venture Labor”. It connected Neff’s work on high-tech workers in the “Silicon Alley” of the late 1990s to current developments related to micro-labor, such as click farms in Bangladesh, low-cost transcription services in the Philippines, or Mechanical Turk workers and Uber drivers all over the world. Interesting parallels between the venture labor of the late 1990s and micro-work today were revealed in the narratives of the workers themselves. Many of them seem to see their employment as an entrepreneurial investment – a view that is also imposed on them from the outside via government policies and initiatives.

The second talk featured Jenny Chan, a lecturer at the University of Oxford’s School of Interdisciplinary Area Studies. She presented an impressive long-term ethnographic study of the terrible working conditions of Foxconn employees manufacturing iPhones and iPads in South China. Her talk took the 2010 suicides at Foxconn as its starting point and showed how these workers are quite literally “Dying for an iPhone”.

Both talks highlighted that behind the shiny facade of newness, innovation and progress associated with the Internet, many people pay a heavy price: they are suffering or even dying.

Stay tuned for more updates from the OII!


Things that caught our eye

Internet Censorship and The Streisand Effect

28 Nov, 2014

Just a few days ago, I read a new and very interesting study on Internet censorship. The author argues that censorship is largely futile and, in some cases, can even be counterproductive. The latter outcome, the so-called “Streisand effect”, is defined by the author as:

unintentional virality of any information, online or otherwise, as a consequence of any attempt to censor, suppress, and/or conceal it.

Using web traffic data, the author demonstrates the Streisand effect at work in Pakistan and Turkey – two countries with heavy Internet censorship. In March 2014, the Turkish government blocked YouTube and Twitter because of videos implicating government officials in massive corruption scandals. Despite the censorship, YouTube still ranked among the top five sites in Turkey. It turned out that users relied on circumvention tools, such as proxies and Tor, to access the blocked content. Even more surprisingly, one of the videos in question spiked in popularity after the YouTube ban. In the author’s words:

Using limited and sparse data from multiple sources we have been able to show that not only does censorship not work but it also inadvertently causes restricted content to become popular.

You can find the full study here.



Blog

The Ghettoization of Facebook

19 Nov, 2014

(Disclaimer: I first wanted to name this post “Applying dynamics of urban sociology to the Internet”… I think “The Ghettoization of Facebook” is much sexier, though)

Can we analyze the digital sphere with the same concepts that have proved helpful in describing urban developments in the “real world”? Yes, we can! Today, I want to apply two such concepts to the Internet: gentrification and ghettoization.

Things that caught our eye

Privacy Online: From Access to Content, to Access to Meaning

1 Oct, 2014

danah boyd’s latest blog entry is – once again – a very insightful read. She argues that privacy on the Internet is more complex than merely controlling access to personal information:

Achieving privacy requires a whole slew of skills, not just in the technological sense, but in the social sense. Knowing how to read people, how to navigate interpersonal conflict, how to make trust stick. This is far more complex than people realize, and yet we do this every day in our efforts to control the social situations around us.

(Many of) today’s teenagers know that they cannot control access to their online information. So, they try to control their privacy via access to meaning. One tactic for doing so is social steganography, i.e., hiding messages and meaning in plain sight. boyd and Marwick – in the longer, more academic version of the text – provide an example of social steganography:

Carmen, a 17-year-old Latina from Massachusetts, uses Facebook to talk to friends and family. She loves her mother’s involvement in her life, but feels that her mother has a tendency to jump in inappropriately and overreact unnecessarily online. Carmen gets frustrated when her mother comments on her Facebook posts “Because then it scares everyone away. Everyone kind of disappears after the mom post … And it’s just uncool having your mom all over your wall, that’s just lame.” When Carmen and her boyfriend broke up, she wanted sympathy and support from her friends. Her inclination was to post sappy song lyrics that reflected her sad state of mind, but she was afraid that her mother would overreact; it had happened before. Knowing that her Argentinean mother would not recognize references to 1970s British comedy, Carmen decided to post lyrics from a movie that she had recently watched with her geeky friends. When her mom saw the update, “Always look on the bright side of life,” she commented that it was great to see Carmen doing so well. Her friends, recognizing the lyric came from the Monty Python film Life of Brian where the main character is being crucified, immediately texted her.

However, exercising control over one’s privacy via access to meaning can be very difficult as well, since others can publish unwanted content and meanings, for example by tagging embarrassing pictures or commenting in undesired ways.

In a networked setting, teens cannot depend on single-handedly controlling how their information is distributed. What their peers share about them, and what they do with the information they receive cannot be regulated technically, but must be negotiated socially. […] no technical solution can provide complete reassurance. Instead, teenagers often rely on interpersonal relationship management to negotiate who shares what about them, who does what with their information, and how their reputations are treated.

