Friday, May 30, 2008

Latest Ethogram


An image of the latest ethogram.

What the field researchers want...

From Whitney, a great summary of the work we've done so far on the LiveScribe pen project.

"I think we've come up with some really great design ideas, and pinpointed opportunistic places for users to have control of the data/the ability to code. Examples that stand out to me are: The buttons are a fantastic solution to time stamping a behavior, and I really like being able to make more "record" buttons for voice notes. Having access to data points, as well as being able to code per region allows for flexible use of the space (er, not to mention the making of the buttons)."

Whitney's take on more LiveScribe field research ideas from the Johnson research group and others:

"As far as other projects/interest: I know Chris is itching to find a way to integrate the pens on the bonobo project, and the bioacoustician I've been working with at SeaWorld thinks they might be the holy grail of field research - she took down the name and model of the pen, and I've promised to keep her posted on the pen's integration.

Friday, May 23, 2008

The Arduino LilyPad and Wearable Electronics

A concise review of the microcontroller for wearable electronics and light-up clothing. Gives a balanced view of the benefits and design issues of using this system to build your own costumes with embedded microcontrollers.


Wednesday, May 21, 2008

Wonder how we change our behavior?

One really interesting question is how we adapt our behavior knowing what the LiveScribe pens are capable of. Whitney said,

"I've gotten used to the pen, and I'm not taking as many notes. Yeah, and it was funny 'cause I was like, oh recording, you know, that's like cheap, right? Recording's cheap and you can just do it and upload it to the computer, but the paper is finite ((laughs))"

Another Productive Brainstorming Session

Today was another productive Wednesday meeting for the Beluga group. Although Prof. Johnson could not be in attendance, I feel like we made a lot of progress on the ethogram design. I recorded our meeting, and found the Paper Replay function to be very useful for writing up summary notes.

Whitney is very excited about the possibility of using pictures, icons, and symbols to separate the ethograms. She said, "I think it's so cool. I think what we first saw with the pens was like, 'We can do things in pictures now! This can make things more...human...rather than us having to do things in a computer way.'"

She brought a prototype meant to have blocks of space cut out below a picture. The whole thing can be printed on plain paper and laid over Anoto paper, so the exposed Anoto paper can still be marked on and the strokes timestamped. I suggested using vellum, which can be purchased for ordinary laser printers.

I feel pretty enthusiastic about the surface, diving, orientation, and proximity categories we sketched out. We're still a bit unsure about the event sheet and how best to build an ethogram.

Some of the remaining challenges: Should states and events be on separate sheets, or separated by some other feature (health behaviors like breathing and nursing, versus synchrony data)? Are we staying 'true' to DCOG - that is, should we only care about the dyad and not individual behaviors?

"I don't think the baby can really spyhop. That is something that the adult can do but not the baby, it's not something the calf is physically able to do."

Friday, May 16, 2008

Brainstorming with Jim

During our meeting yesterday, Jim and I talked about the power of the LiveScribe pen for collecting ideas, and helping re-establish context after time goes by. We're interested in looking more deeply at how the pen can support some of the creative/brainstorming processes that are an essential part of doing science. I mentioned how the class of activity I've witnessed at the Beluga group's last set of meetings is common to many, if not all, scientists who do observational research. You have to figure out what you are going to look for, what you can see, how you can record data - and feed these constraints into the design of a data sheet.

Wednesday, May 14, 2008

Meeting Notes, 5/14


Today's meeting felt quite fruitful. We started out with an update from me - I outlined for the group what kind of data we should be able to access once the SDK becomes available. I also told the group how the pen timestamps strokes. Chris asked how fine-grained the timestamp is.
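
To make the timestamp question concrete, here is a minimal sketch of how a timestamped stroke record might look once we get SDK access. The field names and millisecond resolution are assumptions for illustration, not the actual LiveScribe format.

    # Hypothetical stroke record, sketched while we wait for SDK access.
    # Field names and millisecond resolution are assumptions, not the
    # actual LiveScribe data format.
    from dataclasses import dataclass

    @dataclass
    class Stroke:
        page_id: str     # which Anoto page the stroke landed on
        t_start_ms: int  # pen-down time, ms since the session started
        t_end_ms: int    # pen-up time
        points: list     # (x, y) Anoto coordinates sampled in between

    def stroke_duration_ms(stroke):
        """How long a mark took; granularity depends on the pen's clock."""
        return stroke.t_end_ms - stroke.t_start_ms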

Whitney then reported that she was still very unsatisfied with the current state of both ethograms. She also doesn't think they fully take advantage of what the pens can offer and record. She said she had gone back and reflected deeply on the kinds of questions the project is trying to answer.

Chris led the group through a very careful outline of all the data that needs to get recorded. We started out with the All Occurrences sheet. What behaviors need to be captured?

Breathing - rate is important for SeaWorld to know the animals are healthy
Nursing - again, indicates the health of the infant
Floating - too much could indicate poor health
Bubbles - indicate breathing, and can pinpoint who vocalized if the caller is unknown

Chris pointed out that both events and states are recorded here. Separating them can be useful.

Possible States:
-Floating
-Static underwater
-Swimming
-Spyhopping

Possible Events:
-Breathing
-Nursing
-Surfacing (B only, M only, B-->M, M-->B, Synch.)
-Diving (B only, M only, B-->M, M-->B, Synch.)

Chris also noted that it is important to know what state an animal is in when it performs an event. The timestamped pen data can help delineate this. There may also be a work-around, such as a slash (/) mark, to indicate state changes during a time block.
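
As a rough sketch of how the timestamped data could delineate this, the snippet below looks up which state interval an event timestamp falls into. The state records and times are invented for illustration; the real layout depends on the SDK.

    # Sketch: find the state an animal was in when an event was scored.
    # The state intervals and the event time are invented examples.

    states = [  # (state_name, start_ms, end_ms), assumed non-overlapping
        ("floating", 0, 4000),
        ("swimming", 4000, 9500),
        ("static underwater", 9500, 15000),
    ]

    def state_at(t_ms, intervals):
        """Return the state whose interval contains time t_ms, if any."""
        for name, start, end in intervals:
            if start <= t_ms < end:
                return name
        return None  # gaps are where a '/' work-around mark could help

    print(state_at(5200, states))  # -> swimming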

_________________

Interaction/Relative Dynamics Sheet

We had a discussion about the importance of recording proximity information. Whitney had outlined a proximity scale, and we discussed/refined definitions.

0 - touching/in contact
1 - slipstream (a baby's width or less apart)
3 - proximal (between a baby's width and an adult's width apart)
5 - other

We also talked about the importance of relative orientations. Chris sketched out several possibilities (shown on the attached sheet). We brainstormed about putting pictures of postural configurations along the top of the sheet. In the current version of the sheet, we can assign values to the orientations. In the future, we can assign Anoto address space to these icons, so that touching the pen to a box will trigger a recording event.
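
Here is a rough sketch of the kind of hit-testing we have in mind, assuming the SDK eventually hands us pen-down coordinates. The box coordinates and orientation names are invented placeholders.

    # Sketch: map a pen-down point to an orientation icon along the top
    # of the sheet. Box coordinates are placeholders; real values would
    # come from the Anoto address space assigned when the sheet is made.

    ICONS = {
        # name: (x_min, y_min, x_max, y_max) in page coordinates
        "parallel": (0, 0, 100, 50),
        "perpendicular": (100, 0, 200, 50),
        "facing": (200, 0, 300, 50),
    }

    def icon_hit(x, y):
        """Return the icon whose box contains the pen-down point, if any."""
        for name, (x0, y0, x1, y1) in ICONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return None

    # A hit inside a box would then trigger a timestamped recording event.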

The quality of the contact between mother and baby is important too. We discussed which kinds of touches are important - all of them, or just a subset? Is fluke-to-fluke contact important to distinguish, or is knowing that a touch occurred in general enough? We decided that the only important touches were touches at the mother's mammary area, and rostral touches. Rostral touches are touches where the echolocation organ is facing the target, which might indicate that some kind of spatial mapping development is going on. Therefore, the touch categories are:

On mother:
-Rostral
-Mammary
-Body

On baby:
-Rostral
-Body

Play behavior is also important to record, as it may indicate imitation activities.

The researchers also pointed out an additional feature of using the LiveScribe pens to pilot this study - they can record meta-level observations about the data collection while simultaneously taking data, e.g. "We really should add another column for xyz..."

____________________________

Remaining questions:

How fine-grained is the timestamp?
What is the human error between two scorers (i.e., inter-rater reliability)?
Can we mock something up for next week using transparency sheets, and...?

Tuesday, May 13, 2008

LiveScribe Data Update


A big question we've been asking is what kind of access we'll have to pen data, and what kind of data the pen is recording. Jim recently got (some) access to information about LiveScribe's SDK, so we have more of an idea of what we're dealing with.

First of all, what does the data look like? In other Anoto pens, the pen records and timestamps all address information during a stroke, rather like tiny video frames of what the camera sees as you write. Data is recorded about 72 times per second. In the LiveScribe pen, data is only timestamped at the onset and offset of a stroke.

What this means is that we wouldn't be able to get real-time data if we recorded whale paths through the tank as one continuous line. We need to pick the pen up often to get stroke information timestamped. This constraint should feed into the design we choose. The good news is that Whitney's design using specialized symbols induces several strokes by design, so we should get good data from this kind of ethogram.
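
To see why a continuous trace is a problem, consider this sketch: with timestamps only at pen-down and pen-up, the best we can do for a mid-stroke point is interpolate, which assumes constant pen speed. Everything here is illustrative, not LiveScribe API.

    # Sketch: estimate when a mid-stroke point was drawn, given only the
    # stroke's onset and offset timestamps. Linear interpolation assumes
    # constant pen speed, so long strokes give crude estimates; lifting
    # the pen often keeps the error small.

    def estimate_point_time(t_start_ms, t_end_ms, point_index, n_points):
        if n_points < 2:
            return t_start_ms
        frac = point_index / (n_points - 1)
        return t_start_ms + frac * (t_end_ms - t_start_ms)

    # One 60-second tracing of a whale's path yields two known times;
    # sixty one-second strokes pin the time down about once per second.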

Another question we've had is about encoding special regions of the paper to become icons or buttons to register certain behavior events. Jim said that this is what the SDK should allow us to customize. The bad news is that we don't yet have access to the full SDK. But prototyping or mocking a few things up to try out in the field should give us a good idea of what we'd want eventually.

Thursday, May 8, 2008

3 Laws of D-COG Analyses

From Chris Johnson's 4/30/08 lab meeting:

1. Interaction as a unit of analysis
2. Consider multiple time scales
3. Attend to configural change

LiveScribe Beluga Project


Yesterday's meeting was quite productive. We spoke at length about ethogram designs that were scientifically sound and also technologically feasible. Two that really stood out were tracing/coding behaviors on a top-view version of the tank, and creating a document with pictograms of behaviors. Touching the pen to a pictogram would enter a data point into a spreadsheet at that time code.
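
As a sketch of the pictogram idea, assuming we can eventually export (time code, pictogram) pairs from the pen, logging them into a spreadsheet could be as simple as appending CSV rows. None of this is actual LiveScribe API.

    # Sketch: append one row per pictogram touch to a CSV file that
    # opens directly in a spreadsheet. The (time code, behavior) pairs
    # are assumed to be exported from the pen somehow.
    import csv

    def log_touches(touches, path="ethogram_data.csv"):
        """touches: iterable of (time_code_seconds, behavior_name) pairs."""
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for t, behavior in touches:
                writer.writerow(["%.1f" % t, behavior])

    log_touches([(12.4, "breathing"), (37.9, "nursing")])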

Yesterday Jim said he had gotten some documentation from LiveScribe on the SDK. Hopefully in the next few days we'll have a better idea of whether we can support those designs.

In other news, apparently the beluga whale Ruby has "dropped", meaning her baby might be due sooner than expected!

Thursday, May 1, 2008

TOOLS FOR ETHNOGRAPHERS

As with any science, doing ethnography involves creating cascades of representations. Below are the general cascade levels at which many folks have expressed a need for better tools.

First-pass tool

Many of us do a general "first-pass" through the data and create an annotated Table of Contents describing events within a video. It would be nice to have a program that would a) allow rapid creation of "chapter headings" that could be integrated with the video record, and b) feed easily into other levels of analysis. (A minimal sketch of one possible chapter-heading record follows the questions below.)

What should the form of the Table of Contents look like?
Timeline? Spreadsheet?
Perhaps integrates the LiveScribe pen?
Multitouch table or stylus?
Directly on the video (e.g. dots on the slider)?
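
Whatever form wins out, here is a minimal sketch of what one chapter-heading record might carry. The field names are placeholders; the point is just that each heading keeps a video time code so later passes can seek straight to it.

    # Sketch: one possible "chapter heading" record for a first-pass
    # Table of Contents. Field names are placeholders; the key idea is
    # that each heading carries a video time code for later analyses.
    from dataclasses import dataclass

    @dataclass
    class Chapter:
        video_file: str
        start_s: float  # seconds into the video
        title: str      # quick annotation made during the first pass

    toc = [
        Chapter("session1.mov", 0.0, "session start, mother at surface"),
        Chapter("session1.mov", 312.5, "nursing bout begins"),
    ]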

Coding tool
As one develops increased familiarity with the content, categories of activity emerge. A video can then be coded for categories.

What should the form of the coded data be?
Spreadsheet?
Timelines? (Multiple)?
Integrated with the LiveScribe pen?
Superimposes coded categories on the video?

Event table tool
Timeline of events in a smaller video segment (Chapter)
What should it look like? Many of us currently use Excel - what are the advantages and disadvantages?
Should we incorporate transcripts of other kinds (line drawings, cartoons, etc.)?