Brainstorming Immersively: Of Streams, Position, & Coordination

August 30, 2009

Friday was the 8th installment of the ThinkBalm Innovation Community’s Brainstorming series.  I’ve consistently found ThinkBalm immersive events incredibly beneficial, particularly in refining my understanding of enterprise use of immersive technologies.  This one was no exception.

It was a focused, vigorous one-hour discussion led by Erica Driver & Sam Driver (ThinkBalm) on the topic of “How to write an immersive technology business case.”  We used my newly released BrainBoard version 1 as our primary collaboration orchestrator.  At the close of the session, Erica gathered evaluation feedback from the participants using the Attitudometer.

Brainstorming Area

Here are some observations & thoughts from my experience:

I saw some fascinating non-verbal problem solving and coordination

Erica & Sam structured the discussion agenda around 3 points, so contributions appearing on the board were placed under one of the 3 points.  The initial setup used the 4 quadrants of the main board to sort the user-generated notes: one quadrant per point, with any miscellaneous notes sorted into the 4th quadrant.  A supplemental board was placed off to the side just in case we needed more room.  Well, we needed it.  Due to the sheer number of user notes, we quickly moved the 3rd point and the miscellaneous notes to the supplemental board, leaving the main board for the 1st & 2nd point notes.  I emphasize “WE” deliberately: all this was conducted without verbal coordination.  It happened organically (or digitally, I guess).  Several individuals saw the need for more room and coordinated a solution purely through visual observation and interaction.  And all this without breaking the momentum of the ongoing discussion.

Simultaneous voice & text = better discussion communication

My computer was not playing nice that day: I could hear participant voices, but no one could hear mine (a fact that my wife humorously pointed out might have been a good thing).

However, the curse was actually a blessing.  I was reminded yet again that although voice is a powerful & flexible communication tool, it is inferior to text for synchronous multi-person contributions.  I found myself following the stream of voice discussion, while simultaneously contributing relevant thoughts & ideas into notes on the BrainBoard.  I could see several others doing the same thing, their thoughts popping into existence onto the board.  This visual/textual discussion stream at times tracked the vocal, qualitatively and quantitatively expanding & enhancing it.  At other times, it branched away from the vocal, following the rabbit-like course of thought in a separate exploration of the core discussion topic.

This approach to brainstorming allows multiple contribution roles

Brainstorming Main Board

I also found that the way in which I contributed to the discussion changed over the course of the hour.  In the first half, I followed the voice discussion rather closely, contributing many textual notes to the board without much concern for their position (or their position relative to other notes).  My role seemed to be primarily to add thought.  As we moved into the 2nd half, I found myself spending more time reviewing & sorting the notes.  I started grouping similar notes together, looking for patterns that would inform the ongoing discussion.  The act of positionally sorting the notes during the discussion seemed to help me connect the concepts into a working, developing understanding of the overall context.

I would be fascinated to learn what impact the repositioning/sorting of the notes had on other participants’ understanding of and contributions to the discussion.

Immersive/virtual environments enable positional relevance in discussions

Utilizing voice and text simultaneously enabled multiple discussion streams to which participants could contribute.  Having these streams in an immersive environment, where participant contributions exist as discrete, spatially positioned notes on the BrainBoard, allowed their position to become relevant.  For example: you contribute a thought in a note.  You move that note to a position on the board.  As more contributions are added, someone moves your thought next to another thought.  Seeing the two thoughts placed closely together sparks an insight in yet another person, who adds their own contribution to the board.
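The core idea here (notes whose board position carries meaning, with proximity forming implicit groupings) can be sketched in a few lines of code.  This is purely my own toy illustration of the concept, not BrainBoard’s actual implementation; all names and numbers are made up:

```python
# Toy model of spatially positioned notes on a shared board.
# Illustrative only -- not BrainBoard's real data model.
from dataclasses import dataclass
import math

@dataclass
class Note:
    text: str
    x: float
    y: float

def neighbors(board, note, radius):
    """Return the other notes placed within `radius` of `note`."""
    return [n for n in board
            if n is not note
            and math.hypot(n.x - note.x, n.y - note.y) <= radius]

board = [
    Note("voice is flexible", 0.0, 0.0),
    Note("text scales to many contributors", 0.5, 0.2),
    Note("budget approval steps", 5.0, 5.0),
]

# Notes someone has dragged close together form an implicit
# grouping that other participants can see and react to.
cluster = neighbors(board, board[0], radius=1.0)
print([n.text for n in cluster])  # -> ['text scales to many contributors']
```

The interesting part is that the grouping is never declared anywhere; it simply emerges from where people drop their notes, which is exactly what made the silent reorganization during the session possible.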


Meerkat Viewer – The Buggy Coolness

August 23, 2009

Downloaded and tested the Meerkat V0.1.6 viewer this weekend.  Despite the occasional crashes and inoperable extra features, I like what it has to offer.

Avatar List

Accessible via the “Meerkat” dropdown menu on the top bar, this little window tool offers some pretty handy features.

  • Shows all nearby avatars
  • Allows you to mark, track, get key, and instantly TP to avatars

Avatar List

Chat Bar as Command Line

Probably my favorite little extra.  With it, you can set your own custom chat-based commands.  The teleport-to-camera-position command is great fun; however, your mileage may vary.  After I TP’d a few times, other avatars could not see me.

Command Line
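For those curious how a chat-bar command system works in principle: incoming chat is checked against a table of command prefixes, and anything that matches is routed to a handler instead of being sent as ordinary chat.  The sketch below is my own generic illustration of that pattern; the command name and handler are hypothetical, and Meerkat’s actual implementation (C++ inside the viewer) is not shown in this post:

```python
# Toy chat-command dispatcher: route chat lines that start with a
# registered prefix to a handler; everything else is ordinary chat.
# Command names here are made up for illustration.

def make_dispatcher(commands):
    """commands: dict mapping a chat prefix (e.g. '/tp2cam') to a handler."""
    def dispatch(chat_line):
        parts = chat_line.strip().split(maxsplit=1)
        if not parts:
            return None  # empty line: nothing to do
        name, args = parts[0], (parts[1] if len(parts) > 1 else "")
        handler = commands.get(name)
        if handler is None:
            return None  # not a command: treat as ordinary chat
        return handler(args)
    return dispatch

# Example: a fake "teleport to camera position" command.
dispatch = make_dispatcher({
    "/tp2cam": lambda args: "teleporting avatar to camera position",
})

print(dispatch("/tp2cam"))       # handled as a command
print(dispatch("hello world"))   # ordinary chat -> None
```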

Visual Environmental Settings

This one is an immersive photographer’s dream.  Via the arrowed button on the bottom toolbar, you can pull up a rather lengthy list of environmental settings that change the way the world looks.

Environmental

Make your region look ghostly…

Ghost

…or all funky alien-like.

Alien

Again, your mileage may vary.  I encountered several inventory-locking quirks that I suspect have something to do with the viewer.  It also crashed on me once when I selected one of the environmental presets.  However, the neat extras make up for the early-version bugs.  I won’t be using it for a business meeting or event yet, but for fun and experimentation it’s great.  I’m definitely keeping my eye on this viewer.  Visit the Meerkat development home to download and try it out yourself.

Have you tried it yet?  What were your impressions/favorite features?