

Wednesday, May 15, 2013

Q&A re. Extreme Lifelogging with Autographer

I was interviewed by Imogene O'Neil of Autographer about my thoughts on lifelogging with cameras.

The interview is at http://blog.autographer.com/2013/05/the-future-of-lifelogging-interview-with-gordon-bell/ and I have copied it below.

The Future of Lifelogging – Interview with Gordon Bell

14th May 2013

Lifelogging pioneer Gordon Bell has been using a wearable camera in some form since early 2000. He was the subject of the MyLifeBits experimental lifelogging project and is a Principal Researcher at Microsoft. Here he talks with the Autographer team about the future of lifelogging.


Gordon, you've been involved at the forefront of technology for many years now, from minicomputers, timesharing and multiprocessors in the 1960s, to the birth of the internet, through to your fascinating work with Microsoft Research. What was it that led you to first become interested in wearable technology?

Gordon Bell with an Autographer

LUCK! Or as Pasteur said: “Chance favours the prepared mind.”

In 1998 I started the quest to capture bits of my life so as to be paperless. That soon evolved to include "everything" in life… without really thinking a lot about what that meant. By 2001 we had written the first paper on storing everything, except anything real time. We started the MyLifeBits project based on the necessity of a database and Gates' 1995 observation that "someday you will be able to store everything you see and hear". Vannevar Bush's 1945 design of Memex was our design spec.

In 2000, I met Dr. Astro Teller, the founder of BodyMedia, maker of a wearable armband for tracking energy expenditure and heart rate, which was similarly intriguing for health monitoring. I started wearing this device in late 2002, so the idea that there would be full body monitoring was already coming into view.

In the late 1990s I had seen the MIT "Cyborgs" (Steve Mann, Thad Starner, and others at the Media Lab), who were doing various forms of lifelogging. In September 2003 the founder of DejaView contacted me about using their wearable video capture device that stored snippets; and then in October, Lyndsay Williams of Microsoft Research produced the first wearable SenseCam with a fisheye lens, based on the Philips USB Key Camera. So it was pretty clear that something for visual capture was going to happen. Lyndsay sent me one of the first prototype SenseCams in 2004.

Steve Mann with "Digital Eye Glass" (wearable computer and Augmediated Reality systems)

It was also during this time that I began to speculate about body area networking (BAN) and the body mainframe, which eventually turned out to be the cell phone we now call a smartphone.


These events illustrate the "Carver Mead eleven-year rule": it takes 11 years for anything coming out of a lab to achieve any kind of uptake, based on his observations of the inventions of the transistor and the integrated circuit.

Did you anticipate the lifelogging trend would catch on with the wider public as quickly as it has?

No, people have adopted lifelogging more rapidly than I thought, or at least they have recognised the value of saving everything. I attribute this partly to social media, e.g. Facebook and Twitter, and to the smartphone, which makes it easy to chronicle all sorts of aspects of your life. These devices are the capture agents of life events, and social media is where the content is held.

You’ve been using a wearable camera in some form since early 2000 – from early SenseCam models to the Vicon Revue and now an Autographer. What have you found most fascinating as a user?

Image sampling is an effective capture mechanism for special events, walks, conferences, and site visits. Constant monitoring is especially useful in social situations as a means to capture a lot of faces for eventual person retrieval. (To make the most effective use of this, the faces have to be identified and ideally matched with contact information in professional settings, such as conferences.) It also lets me recall exactly what I've eaten and then size this up. However, the most fascinating aspects still reside in my mind, waiting for software and hoping for some surprising, compelling, "killer" apps.

What excites you most about wearable technology?

On-body 24 x 365 logging of personal health data: capturing every heartbeat and ultimately being able to use this information for understanding, e.g., stress, and then being able to provide early warnings of heart attack, stroke, etc.

BodyMedia aim to provide accurate information about your body.

What do you see as the main technology and behavioural enablers for wearable tech and lifelogging?

Technology: Much of the hardware exists: peripherals for smartphones to monitor and diagnose health, e.g. heart, eyes, ears, echo sensing, even the possibility of small MRIs or X-rays. These will trickle down to be used for personal health, i.e. lifelogging. Instead of being asked about diet and exercise, these things will be captured automatically.

eMemory is what I believe to be the significant use: helping immediate recall. This covers a range from the distant to the immediate past, and then a way to provide immortality.

How do “extreme life-loggers” deal with what many people may see as information overload?

I don't think we have many "extreme" lifeloggers. Cathal Gurrin of Dublin City University is the most extreme for picture capture, and he doesn't record audio. Cathal has tools to analyze the millions of images that are his life. One can imagine all the software and insight you can get, from the time and motion of everything you do to the amount and healthiness of all your food intake.


Thad Starner does the most useful and extensive lifelogging: he has an on-body computer and uses his "Twiddler" keyboard to take notes, so his content is easily accessible by searching the database he uses. BTW: he was on the Google Glass design team.


Ask that question again in a year, when there starts to be software and more cameras, including Google Glass, that can do "extreme lifelogging" with audio. These will cut new paths as people record audio and then get challenged for doing it.

Where do you see life-logging going next?

I really believe we are going to have to see how wide and deep it goes, i.e. how much of life people are going to bother to log, and how many people do it. One could argue that there are a billion shallow lifeloggers who comment and tweet about everything. Let me posit the following taxonomy that illustrates the possibilities as to the depth of lifelogging:

Implicit, light lifelogging: You don't delete anything on your computer, cloud stores, or social sites
Professional lifelogging: Communication, professional material
Personal and family lifelogging: iLife, Google
Lifelong learning logging: Books, magazines and journals you read
Social lifelogging: Communication, ideas, etc., e.g. FB, LinkedIn, Twitter, Yammer
Health-wellness lifelogging: Quantified Self groups
Conversations & thoughts lifelogging: Transcribing notes from conversations, e.g. Thad Starner, c1993-
Extreme lifelogging: Everything you see and hear, aka sousveillance, e.g. the likes of Autographer products and services (camera and image cloud store)
Lifelog tracks: Everywhere you've been, aka lifetrack / lifetrek
"Image" or reputation lifelogging: What society thinks it knows about you, e.g. Reputation.com
After-lifelogging: Only your avatar knows. TBD
Institutional lifelogging of the famous: e.g. LoC, British Library
Property lifelogging: A catalogue of all the stuff we own

And how about looking to the future of wearable technologies and life-logging, for instance in the next five to 10 years?

Two possibilities: a plethora of special appliances like we have today, and a body mainframe based on smartphones, with all the devices connected to them to hold data and to do special post-processing. There's a social aspect too, where people's state is distributed and held by others. I will stick with my 2010 prediction that extreme lifelogging will be commonplace in 2020, based on the next generation of devices.

You can find out more about Gordon Bell on his website and Wikipedia page.