Andrew F.: Thoughts on Memory Place (August 2011)

>Here are my somewhat rambling impressions of where we are at, from my viewpoint...hope this is useful to getting the next round going.

 

Information and Other-Than-Information: Examining space before it gets to be information (Relevance to new-media, dance, architecture)

We live in a world of data, of encoded information. This is especially true of our visual world. Phenomenology, as we have been at it in these experiments, has something to say about experience of space, of being and space, prior to it being monetized as information that can be parsed in the realm of computation. This seems like a key area where the ‘experiments’ and experiences of Memory Place are a useful corrective to assumptions coming from science and technology (assumptions which, to be fair, also permeate art practice). Key question: is there a way of thinking about this information threshold (when experience/space becomes info) in a way that is useful to a better understanding of technologised culture and what we can do with it?

In relation to many new-media practices, the space of movement/performance is of prime interest as a hybrid between two encoded cultural spaces; firstly, of action grounded in the human body, with its proper perceptual world, and, secondly, of the ‘without-ground’ or ‘without body’ of new media and the virtual. Somewhat naively, electronic arts often use the theatre as a model for virtual/immersive space without examining the implications of that importation. One could argue that dance- and movement-based performance as well as architecture have become more relevant in contemporary visual culture precisely in critical relation to electronic screen-based media, virtual-reality and immersive interfaces. That is, we are trying to understand two different kinds (registers?, grasps?...) of space which are nested or interleaved one with the other. Virtual or immersive ‘reality’ relies on this confusion; that we believe (or are asked to believe) one is the other.

Returning to bodily movement as a form of encounter may be an important corrective to assumptions engendered by new technologies. For the human animal, watching another in movement is immersivity without technology. This makes movement- and space-based art practices (such as dance and architecture) the most relevant areas of artistic research today and a key point of overlap between traditional artistic disciplines and new technology in their ability to confront a culture of information with a culture anchored in the experience of space. How are movement and space ‘fixed’ by technology and new media? The nature of this ‘fixing’ should be a significant preoccupation for creative and critical practitioners in the face of new technologies.

Methods

As outlined in the grant summary, the methods of data organization, experimental set-up, interview and so on are all part of the same conundrum of distinguishing scientific/technological method from what it seeks to understand. Our recent experiments have shown that observing the participants ‘working’ in space is itself fascinating and useful (see the underlined sentence above). We have often said, “it was interesting watching so-and-so doing this or that”, or that the particular ‘style’ of exploration someone has is exceptional. So incorporating rigorous observation into our method (and descriptions and write-ups) seems a good direction. There is a parallel here which might be worth following up in dance observation / dance movement therapy (DMT).  All to say, if we are intrigued by observing movement we should do some work on bringing observation notes into the method, alongside alternate debrief methods such as multiple interviewees, three-way conversation, etc.  In line with this is an idea in the grant summary that participants can be trained, to some extent, to help them leave behind some presumptions and tune their own observation skills towards the simpler experiences we are interested in (what the breathing and moving before the last round did; could this be more comprehensive?). All in all, this means abandoning the naive subject and the invisible experimenter. This is complicated. Scientific experimental method is useful because it is effective, and vice versa. It 'moves forward', as everyone says these days. Our method is deficient in that regard: it is fuzzy in both the activity and analysis phases. The question becomes not how 'accurate' a method can be, but whether it is rigorous enough to be generative/creative of...something.

analogue /digital

David’s suggestion in the last follow-up of a chair-based mechanical device was interesting on two counts. One, it gets our feet off the ground, thereby interfering with some intuitive ways of getting around and sounding out a space. Two, it is mechanical, therefore analogous to bodily mechanisms.  Somehow I think we might adapt to it in a different way than to a device which uses information processing (something mechanical operates in the world of body physics we understand and believe; something computational can be programmed to lie, so we have to choose to believe it; perhaps a subtle difference).  There are a variety of simple ‘analogue’ experiments which could complement the more ‘digital’ ones to give us a sense of this.

Perspective and architectonic space

Is linear perspective learned from living in and amongst buildings? What prejudice does this layer onto our experience of space? (Linear perspective is, after all, the structuring basis of most virtual-visual environments.) Yet perspective, as a mathematical construction of points, is a representation of something that it itself is not. What other kinds/shapes of rooms are there?

Interdisciplinary Relevance

By definition a cross-disciplinary practice straddles multiple discourses and audiences, which brings about either a flattening or a sharpening of the work’s critical positioning. Memory Place sits across philosophy, art/design, and computational environments in a superbly interesting way: one that puts stress on the assumptions coming from these disciplines rather than using them purely as a generator of a hybrid method for production/invention/innovation.  That could be a redefinition of ‘innovation’ right there. This makes ‘M-P’ relevant to the elaboration of what ‘inter-disciplinary’ can be, in that it partly questions the disciplinary partitions themselves and asks how something creative, experimental and philosophically rigorous can work.  Specifically, thinking about movement, memory and space puts M-P in a key position in relation to ‘traditional’ disciplines (dance, performance, architecture, computation, philosophy), each of which makes a claim to various parts of the shoreline of such an inquiry. Is the end product a work of art or a cognitive-science paper? Probably neither; more likely it is a method of working, observing, and generating understanding of movement, memory and space that is relevant to multiple disciplines, and that helps those disciplines shed some prejudices.

on interdisciplinarity, and transdisciplinarity...

Speaking of interdisciplinarity ...

I was part of the symposium around this book, sponsored by the Rockefeller and the National Academies in the USA, when it came out in 2003.   It was mostly written from the perspective of the engineering sciences, in particular information technology, computer science, and telecommunications.    To their credit, the committee was trying to be open to art and design, guided by the best examples of multidisciplinary, interdisciplinary, and transdisciplinary collaborations between artists and engineers over the past century.   Unfortunately their examples were dated.  (i.e. they had not seen the TML ;)

It may be interesting to see how disciplines inter-percolate (an Alexander patterning transposed to the ecology of practices?)

We are far from transdisciplinarity as defined in chapter 4, but I expect that does not have to happen in order for significant work to be done.   I'm not even sure that it ought to be an institutional goal, pace Simon Penny, who created the exemplary ACE program at UC Irvine: the late ACE program from which Erik Conrad, one of the Memory Project's forebears, got his Masters before returning to the TML.

Attached is Chapter 4: The Influence of Art and Design on Computer Science Research and Development
from: Beyond Productivity (2003), NAS, William J. Mitchell,  Alan S. Inouye,  and Marjory S. Blumenthal,  Editors.

GyroOSC is good, IR is good

Hi David,

By the way, GyroOSC is a great little app -- built atop the Adrian / John / CNMAT OSC lib. ;)  No need for a Polhemus if we use the iPhone?   And maybe the familiar form factor works in our favor?

I've downloaded it but don't have time to play with it just yet in Max -- I expect they have demo patches?   Have you looked at the data coming out in Max/MSP?  I bet it's clean, but I would like to check the effects of interference and range.

If GyrOSC works we can get attitude from onboard sensors, so the only remaining issue is x-y position on the floor.   I hope the demo patches include a simple little calibration to output orientation  relative to local room.  ("Let's do the time warp again!")
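A minimal sketch of that calibration, in Python rather than a Max patch (that GyrOSC reports yaw in degrees is an assumption here): capture one reference yaw while the phone points along the room's chosen axis, then report every subsequent yaw relative to it.

```python
def room_relative_yaw(raw_yaw_deg, calib_yaw_deg):
    """Report device yaw relative to a local-room reference axis.

    calib_yaw_deg is captured once while the phone points along the
    room's reference direction; the result is wrapped to [-180, 180).
    (Degree units for GyrOSC's yaw output are an assumption.)
    """
    return (raw_yaw_deg - calib_yaw_deg + 180.0) % 360.0 - 180.0

# calibrated while the phone read 30 deg; a later reading of 350 deg
# is 40 deg to the "negative" side of the room axis:
print(room_relative_yaw(350.0, 30.0))  # → -40.0
```

In a Max patch the same arithmetic is just an [expr] applied to the incoming OSC yaw value.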

Tracking x-y location by an LED on the body or a handheld accoutrement is delicate -- that's the solution TG2001 used in Linz and Rotterdam, 2000-2002.   We abandoned on-the-body LEDs after 2002 because they impose stiffening constraints on people moving around: people have to keep parts of their bodies or accoutrements presented to the camera.    It's quite annoying when the system stops responding to you simply because the line of sight is broken, or because of light interference.   For our purposes this may be ok for now -- but we'll want to carefully fold the artifactual "stiffening" constraints into our design.

We (mostly) eliminate interference from visible light conditions by tracking in infra-red.  CV has improved so much that we just track with cv.jit tools + overhead camera(s).  Students and I have dreamed up various schemes for minimally encumbering ID-based tracking.  But I have deliberately avoided tracking ID'ed people, expecting interesting ethico-epistemic implications.   (For wearables, to be considered minimally encumbering, I use jewelry as the criterion. :)
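As a toy illustration of that pipeline (a stand-in for cv.jit, not the TML's actual patches; the threshold and frame are made up), overhead IR marker tracking reduces to finding the centroid of the above-threshold pixels in each frame:

```python
def ir_centroid(frame, thresh=200):
    """Threshold a grayscale frame (list of rows of pixel values) and
    return the centroid of all above-threshold pixels as (row, col),
    or None if no bright IR marker is visible."""
    pts = [(r, c) for r, row in enumerate(frame)
           for c, v in enumerate(row) if v >= thresh]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# a 6x8 frame with a bright 2x2 IR spot at rows 2-3, cols 5-6
frame = [[0] * 8 for _ in range(6)]
for r in (2, 3):
    for c in (5, 6):
        frame[r][c] = 255
print(ir_centroid(frame))  # → (2.5, 5.5)
```

Real pipelines add blob labeling and temporal smoothing, but the centroid step is the core of camera-based x-y tracking.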

Speaking of infra-red, it could be interesting to fold our prosthetic sense-organ "back" into the "ordinary" sensorium by using an IR bright source (assuming our photocells can pick it up).   Then we have more amplitude for the event design.

Cheers,
Xin Wei

On 2011-06-10, at 10:13 PM, Sha Xin Wei wrote:

Hi David,

True 6DOF was solved elegantly by Marek Alboszta's team (ESPi; quaternions are the way to go) and is supplied by Polhemus using a more elaborate set of sensors.  Each solution has trade-offs, for good engineering reasons. I wouldn't try to solve it myself.   These are not cheap gadgets, for good reason.


On 2011-06-10, at 1:20 PM, David Morris <davimorr@alcor.concordia.ca> wrote:

Actually, I wonder if this might do what we need:

Glue rigid wood or fibreglass dowel to iPhone case.

Fix a ring of LEDs around the dowel shaft at two points, so an overhead camera can always pick up two points of light at known distances from the iPhone.

How would these distances be measured?
There are not enough degrees of freedom to recover 6DOF, if I understand what you have in mind.
And at what time and space accuracy and frequency?  (Over the years many generations of TMLabbers and I have conceived different solutions, but I'm interested in fresh, feasible approaches.)

so this is a line of sight approach.

if we use a line-of-sight approach, then I'd simply use a Wii and hack up its IR receiver/sender pair, since it's so well engineered.

ditto Kinect for camera based tracking.

Adrian and I agree it makes sense to buy and hack this inexpensive off-the-shelf gear to get a feel for what can be done.  Good practice: "rapid prototyping, quick fail."   The question now is who will do the DIY electronics.

In that spirit, before that or meanwhile, we can Wizard-of-Oz it, walk through some scenarios, and get a lot more juice out of the gloves by using them in many alternative scenarios, I think.

Looking forward to talking with you again, and catching up w people Wed. next week!
Xin Wei

Put the iPhone in the case.

Run GyroOSC on the iPhone.

We now know the tilt of the iPhone relative to 3 axes—that gyroscope is pretty amazing.

From the overhead camera monitoring the LEDs we can get the position of the iPhone in the XY plane, and its rotation relative to the X and Y axes.

And then, I am thinking that by measuring the apparent distance between the two LEDs in the camera image, and combining this with the gyroscope tilt data, we could solve for the height of the iPhone from the floor.

Xin Wei, would the math work here? Does the gyroscope data plus the two LEDs in the image give us a unique 6DOF solution?
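For what it's worth, here is a hedged sketch of the height part of that math under pinhole-camera assumptions (all numbers and names are illustrative, not a worked-out answer): if the LEDs sit a known distance apart along the dowel and the gyroscope gives the rod's tilt from vertical, the pixel separation in the overhead image is roughly focal_px * led_sep * sin(tilt) / Z, which can be inverted for the depth Z and hence the height above the floor. Note the degenerate case: a near-vertical rod makes the two LEDs coincide in the image.

```python
import math

def depth_from_led_pair(pixel_sep, led_sep_m, focal_px, tilt_rad):
    """Invert pixel_sep ~= focal_px * led_sep_m * sin(tilt) / Z for Z.

    Assumes a pinhole camera pointing straight down, both LEDs at
    roughly the same depth, and tilt_rad measured from vertical
    (supplied by the iPhone gyroscope). Illustrative sketch only.
    """
    horiz_sep = led_sep_m * math.sin(tilt_rad)
    if pixel_sep <= 0 or horiz_sep < 1e-6:
        raise ValueError("degenerate: rod near vertical or LEDs unresolved")
    return focal_px * horiz_sep / pixel_sep

# camera 3 m above the floor, LEDs 0.30 m apart, focal length 800 px,
# rod tilted 45 degrees from vertical, LEDs 80 px apart in the image:
Z = depth_from_led_pair(80.0, 0.30, 800.0, math.radians(45))
height_above_floor = 3.0 - Z
```

So tilt plus the LED pair does constrain height, but the error blows up as the rod approaches vertical; whether the combination yields a unique full 6DOF solution would still need checking.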

From: David Morris [mailto:davimorr@alcor.concordia.ca]
Sent: June-10-11 12:57 PM
To: 'Niomi Anna Cherney'; 'zohar'; 'p.a. duquette'; 'Sha Xin Wei'; 'Noah Brender'; 'Tristana Martin Rubio'; 'Andrew Forster'
Subject: Update: Further Research on 'Magic Wand'

I spent some further time today researching possibilities for the tech for the ‘magic wand’, first to see if we can get this via the iPhone. Results:

--it turns out that if you want to simulate lightsaber duels on the iPhone, you need to solve the 6-degrees-of-freedom problem: position and tilt in 3 dimensions. The app Lightsaber Duel from THQ apparently lets you have virtual duels over Bluetooth. I guess it’s doing relative position through Bluetooth? There is also iSamurai. I’ve downloaded these, and I am hoping someone else can bring an iPhone so we can see how this works. What I don’t know is whether this only gives you audio feedback when you hit the other person’s virtual lightsaber, or also when you are pointing their way. (NB, the thing we are trying to do is like Luke in the first Star Wars movie tracking the practice drone while blindfolded with his lightsaber.)

--there are also a bunch of apps for doing read-outs of the gyroscope sensor on the iPhone. One, GyroOSC, sends the data via OSC. If an overhead camera tracking a fiducial on the hand could give us 3D location, then we could use an iPhone or WiiMote to give tilt.

--I also came across this: http://www.free-track.net/english/. It’s a freeware package for Windows that uses LEDs mounted on someone’s head to track 6DOF data through a webcam or Wii camera. Maybe we could mount the LEDs on the end of a stick and use an overhead camera?

Hexagram Research-Creation Seminar on Self-Powered Wireless Biomedical Devices - June 6

Hi Laura,

Thanks for taking the time to share such a delightfully opinionated report.   As Maturana and Varela said at the beginning of their Tree of Knowledge, everything said is said by someone.
It's characteristic of "design" not to ask framing questions at your depth.   It's interesting that Yong Lian used "simulations" rather than "experiment".   Had he used "experiment", qualified to characterize typical industrial drug research, then I might agree more.    Yong Lian's responses are very revealing: to say that TCM is "a closed system" expresses epistemic incommensurability.  To say that TCM has no theory reflects the inadequacy of western medical knowledge, which I have always thought overwhelmingly un-theoretical compared with mathematics and physics.

Cheers,
Xin Wei

On 2011-06-07, at 12:10 AM, laura emelianoff wrote:

Hello lab and others,

I attended the talk on 'Wireless self-powered biomedical devices' today and wanted to share my notes. 
I tried to record, but my Edirol ran out of battery power and I was unable to get the whole talk. I did capture at least a lengthy introduction outlining the speaker's achievements and the academic positions he has held.
I can share the file if anyone wants it.
Yong Lian is a researcher developing self-powered devices for monitoring biomedical data, such as heart activity and chemical levels. One such product was designed to report whether a patient had taken prescribed medication, in cases where the patient's memory was no longer reliable. It was an ingestible, powered by the acid in the stomach. 
Other biomedical interventions include pacemakers, cochlear implants, EEG sensors to detect seizures, and deep brain stimulation, used to treat Parkinsonian nervous disorders.
In general, his products (Wireless Biomedical Sensors, WBS) are intended to collect vital information and share it with medical staff via a mobile phone, for example. Dr. Lian says that 'wireless healthcare is the solution' for prevention-oriented care, though he did not clearly address who should receive these prosthetics; we assume they are primarily intended for more extreme cases, for those who are not capable of autonomous self-care.  However, he seemed very casual about using them to replace person-to-person examinations, so perhaps we should assume that everyone should have a few ECG or EEG sensor pods stuck here and there.

Why should biomedical products be self-powered? Currently, a patient with a pacemaker needs open-chest surgery every 8-10 years to replace the battery. As we are in movement all the time, we are capable of generating electricity through electromechanical, thermal, and chemical means. Batteries should no longer be used.

His main points were:
1. 'There are not enough doctors, and they are overworked. The problem is that 30% of med students are female, and they have kids and quit the workforce.' Constant ambient monitoring of biosignals will reduce diagnostic costs and time.
'Only the technology can reduce the workload of doctors'
2. Wireless is lucrative! Just look at the chart of the value of the US dollar at the points in the last century when key technologies were introduced (TV, mobile phone, internet, etc.). 
3. Current monitoring systems for biodata are invasive and cumbersome- a heart-rate monitor requires several electrodes and a mess of wires, while the autonomous devices are much smaller. 
4. Streamlining signal processing can improve energy efficiency: using 'event-based' sampling instead of a Nyquist-style periodic sampling rate can reduce power usage. Data can be transmitted in pulses instead of continuously, and fewer op-amps can be used in the circuitry.
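Point 4 can be illustrated with a tiny "send-on-delta" sketch (my term and threshold, not Dr. Lian's): instead of transmitting every periodic sample, transmit an event only when the signal has moved appreciably since the last transmission.

```python
def send_on_delta(samples, threshold):
    """Event-based sampling sketch: emit an (index, value) event only
    when the signal moves more than `threshold` from the last
    transmitted value, instead of transmitting every periodic sample."""
    events = []
    last = None
    for i, v in enumerate(samples):
        if last is None or abs(v - last) >= threshold:
            events.append((i, v))
            last = v
    return events

# a slowly drifting signal produces far fewer events than samples
sig = [0.0, 0.01, 0.02, 0.5, 0.51, 0.52, 1.1]
print(send_on_delta(sig, 0.3))  # → [(0, 0.0), (3, 0.5), (6, 1.1)]
```

Fewer transmissions means the radio sleeps most of the time, which is where the power savings come from.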

My first question was, which power harvesting methods are found to be most useful?
Answer: vibration is often effective, but we are exploring uses of 
glucose as a fuel.

Second question- there are so many hands-on techniques of listening to vital signs and movements: practitioners of Traditional Chinese Medicine palpate tissues and listen to the pulse, as it reveals metabolic functions, many doctors learn to listen to their patients' respiration and heart activity using stethoscopes (auscultation) and cranio-sacral therapists feel the tides of spinal fluids with their own hands. How do you regard these existing methods?
Answer: (quote) 'Traditional Chinese Medicine has no theory. It's a closed system, simulations are hard to do'…. 

meaning that TCM doesn't provide 'data'; perhaps it provides more qualitative information. 
But….  health and wellness is not just about data, it is more complex. 

My third question which I didn't get to ask was, How should we remedy this problem of female students going off and having kids, then quitting the workforce? Is that really why they are dropping out of school? What about the obvious gender inequality in certain disciplines, such as we have here in this lecture room- 25 men and 5 women from the Engineering department?

But there are many more questions- why continue to develop unilaterally, where this monitoring technology feeds the isolation of patients? Why is that acceptable? How long will it take to address problems like implant rejection? 
If prevention is really the focus, why not emphasize healthcare long before measures such as stents and pacemakers become necessary?
The proposed technologies are useful and effective, in the specific applications where they are really necessary, but there seems to be a lack of perspective about what is important in healthcare, beyond just life support systems…. One woman I know who regularly has epileptic seizures is alerted four hours ahead of time, when her dog and cat begin to follow her around the house. She doesn't need an implanted EEG sensor.

comments anyone?
Laura E


From: shaxinwei@gmail.com
Subject: Hexagram Research-Creation Seminar on Self-Powered Wireless Biomedical Devices - June 6
Date: Fri, 20 May 2011 22:08:50 -0400
CC: memory-place@concordia.ca; artcrd@langate.gsu.edu
To: tml-active@concordia.ca

TMLabbers -- of possible interest to those who swear by bio-sensing, or would like to peek at the future of biopolitics -- docile mitochondria! :)  Will someone who can attend please send notes to tml-active?
Thanks!
Xin Wei

Begin forwarded message:

From: Momoko Allard <hexinfo@alcor.concordia.ca>
Date: May 20, 2011 10:56:08 AM EDT
To: Momoko Allard <hexinfo@alcor.concordia.ca>
Subject: Hexagram Research-Creation FW: Seminar on Self-Powered Wireless Biomedical Devices - June 6

-----Original Message-----
From: ENCS Communications <communications@encs.concordia.ca>
Subject: [All-ftfac-announce] Seminar on Self-Powered Wireless Biomedical Devices - June 6

Dear ENCS Members,

The following seminar may be of interest to you. This announcement is sent at the request of the Department of Electrical and Computer Engineering.

INVITED SPEAKER SEMINAR IN ELECTRICAL AND COMPUTER ENGINEERING

CO-SPONSORED BY:
The Department of Electrical and Computer Engineering IEEE Circuits and Systems Montreal Chapter IEEE Montreal Section

Monday, June 6, 2011
6:00 p.m.
Room EV002.184
(Refreshments will be served.)


“Towards Self-Powered Wireless Biomedical Devices”

DR.YONG LIAN
DEgr. (h.c.), FIEEE
Lutcher Brown Endowed Chaired Professor
The University of Texas
San Antonio, TX, USA

ABSTRACT
Body Sensor Network (BSN) combined with wearable/ingestible/injectable /implantable biomedical devices are envisaged to create next era of healthcare system. Such systems allow continuous or intermittent monitoring of physiological signals and are critical for the advancement of both the diagnosis as well as treatment. With the advances of nanotechnologies and integrated circuits, it is possible to build system-on-chip solutions for implantable or wearable wireless biomedical sensors. Such wireless biomedical sensors will benefit millions of patients needing constant monitoring of critical physiological signals anytime anywhere and help to improve the life quality. This talk will cover several topics related to the wireless biomedical sensors, especially on the development of self-powered wireless biomedical sensors and associated low power techniques. A design example of sub-mW wireless EEG sensor is discussed to illustrate the effectiveness of the low power techniques.

BIOGRAPHY
Dr. Yong Lian received the Ph.D degree from the Department of Electrical Engineering of National University of Singapore in 1994. He worked in industry for 10 years and joined NUS in 1996. Currently he is a Provost's Chair Professor and Area Director of Integrated Circuits and Embedded Systems in the Department of Electrical and Computer Engineering. His research interests include biomedical circuits and systems and signal processing. Dr. Lian is the recipient of the 1996 IEEE CAS Society's Guillemin-Cauer Award for the best paper published in the IEEE Transactions on Circuits and Systems II, the 2008 Multimedia Communications Best Paper Award from the IEEE Communications Society for the paper published in the IEEE Transactions on Multimedia, and many other awards.
Dr. Lian is the Editor-in-Chief of the IEEE Transactions on Circuits and Systems II (TCAS-II), Steering Committee Member of the IEEE Transactions on Biomedical Circuits and Systems (TBioCAS), Chair of DSP Technical Committee of the IEEE Circuits and Systems (CAS) Society. He was the Vice President for Asia Pacific Region of the IEEE CAS Society from 2007 to 2008, Chair of the BioCAS Technical Committee of the IEEE CAS Society (2007-2009), the Distinguished Lecturer of the IEEE CAS Society (2004- 2005). Dr. Lian is a Fellow of IEEE.


For additional information, please contact:
Dr. Wei-Ping Zhu
514-848-2424 ext. 4132
weiping@ece.concordia.ca

random idea Re: "scientific" gesture / movement research ?

This is definitely an aside to the ongoing thread. But may be
applicable for next memory-place experiments (not as sophisticated
as UI device)... Just something I've been loosely imagining.....
Without yet thinking through the tech process or synthesis....

A couple of references first:

Depth and gesture mapping / tracking of a participant:
http://jmpelletier.com/freenect/

I was thinking that this environmental installation project
might offer some intriguing possibilities:
http://kinecthacks.net/spinny-glowy-foil-in-a-kinected-bunker/
In particular, notice the white spindly sculptural things (you'll see
them toward the end of the video) hanging from the ceiling.

I was imagining, for example, what would happen if we built
the equivalent of the white 'glowy foil' sculptures, but made
them out of strips of IR LEDs. Then use a matched IR detector
apparatus for triggering vibratory feedback (e.g.:
http://www.radioshack.com/product/index.jsp?productId=2049723)
pretty much as our last device did. Only in this case, it would
be the overall temporal experience of detection that might add
up to the participant's recognition of distinct shapes. They might
undergo a slow process of discerning one detection from another
as they explore the space, slowly developing a physical
relationship with the way they share the space with the
sculptures (known to them only as on/off vibrational feedback).

Pelletier's note (first link above) says that more than one Kinect
can be in use at once. It might be messy to work the bugs
out (interference-wise), but one Kinect program could be utilized
to trace participant movement in relation to the IR sculptures,
and another could be employed for retrieval of depth and gesture
information... if that's the sort of information we hope to
retrieve.... I haven't thought out the means of integrating the
info at this point at all....

Also, I was thinking about the issue of participants bumping
into the sculptures.... So.... What if we used transparent cloths
as dividers, situated between the participants and the
sculptures? The IR sculptures could be encircled with these
cloths, hung from ceiling to floor... Lightweight and soft,
and distanced enough from the sculptures to serve as an
adequate indicator (to the participant) that they should move
in another direction. This instruction could be provided in
advance of the participant's blindfolded exploration (whenever
they feel the cloth against their skin, they should stop and
turn away from it). It would simplify the process and avoid collisions...

We could use battery power, rather than cabling, for the IR
sculptures.... clearing out and freeing up space for participants
to move in.... with the exception of the cloth veils, of course...

?

x patricia


----- Original Message -----

From: Sha Xin Wei

Sent: 05/31/11 06:41 AM

To: Michael Fortin

Subject: Re: "scientific" gesture / movement research ?

I suggested that someone open up a WiiMote and re-assemble the parts into a suitable form factor.  We still need a visible-light photocell though, and can't use line of sight, so that solution is also clunky...
 
We need a good versatile engineer to own this project and work with the MP group.
 
And strategically in June I'd like to define a non-hobbyist grant to NSF/NSERC/FQRNT parallel to an MP grant to do a gesture/movement tracking research project that meets different interests around the TML -- MP, Adrian (+Wessel), MM, Satinder.   I'll propose this  to my EU colleague as well.
 
Xin Wei
 

On 2011-05-30, at 6:43 PM, Michael Fortin wrote:

This might be a jaded comment.... 
 
I'll call it an advanced WiiMote (WiiMote just tracks the x-y-rotation, they have some idea of angle and distance to the display which the WiiMote doesn't have).   (Morgan -- WiiMote has vibrotactile feedback)
 
Speaking of odd remotes, there's this unrelated interesting toy; http://www.thinkgeek.com/interests/dads/cf9b/
 
Cheers,
~Michael();


On Mon, May 30, 2011 at 03:55, Sha Xin Wei wrote:
Hi Adrian, and scientific researchers,
 
Raising the stakes and thinking ahead to more robust and precise instrumentation, here's the
 
NaviScribe 6-DOF 3D wand by Electronic Scripting Products, Inc. (ESPi) in Palo Alto
 
The exclusive patent describes 6 DOF: x, y, z + Euler angles.   The company was founded by a physicist friend from Stanford, Marek Alboszta.  Not productized yet.  "Commercial" co-development would require O(100K) USD.  I've not discussed how to enter into an actual relation with this company, but we could perhaps work out a deal.  This would make sense in a real NSERC/NSF co-development grant.
 
Shall we think about this in context of a scientific gesture research proposal, along with high FPS cameras, and EONYX etc?  Let's discuss this in June.
 
Xin Wei
 
 
 
 

On 2011-05-20, at 9:32 PM, Marek Alboszta wrote:

Hello Xin Wei,
 
We can definitely do everything you ask (briefly - up to 100 Hz and better with all degrees of freedom (6DOF) reported in compact stream (right now not compressed), requires at most 120 MIPS to do everything (during periods of a lot of activity) - unit is small so can be in a ring or glasses or headgear or whatever you choose - we give you intervals so you can compute your derivatives, resolution in 3D space is considerably better than 1 cm (in plane it's down to 0.2 mm and better)).  I can't do wireless unless somebody gives me money to properly design a wireless beta unit (it is not a problem of technology but pure resources).  
...

Is your party ready to pay for this work ...?  If not then we should reschedule for when they are ready to commit resources for technology development (or if they/your side wants to do the work).  Anyway, we can talk about it if the allocation of resources is a given - let me know.
 
warm greetings,

_____________________
Marek Alboszta
 

On May 20, 2011, at 10:08 AM, Sha Xin Wei wrote:

Hi Marek,
 
For a memory & place experiment, we would like to give people a wand that they can hold that can report position, euler angles, and their time derivatives.  Ideally at better than 30 Hz for the entire 12-vector.
 
We need it wireless, range of say 10m suffices.
 
Spatial resolution is important, for tracking "pointing" at virtual objects that people infer by indirectly mapping position & angle to a vibration motor that will be embedded somewhere on their body.   I expect any pen-based input device has more than adequate time-space resolution.
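A rough sketch of that position-and-angle-to-vibration mapping (the cone width and the linear falloff are assumptions of mine, not the actual MP design): drive the motor harder the closer the wand's direction comes to the inferred virtual target, silent beyond a cutoff angle.

```python
import math

def pointing_vibration(device_dir, target_dir, max_angle_deg=20.0):
    """Map pointing error to a vibration intensity in [0, 1].

    device_dir and target_dir are 3-vectors; intensity falls off
    linearly with the angle between them and is 0 beyond
    max_angle_deg. The 20-degree cone and linear falloff are
    illustrative assumptions, not the experiment's actual mapping.
    """
    dot = sum(a * b for a, b in zip(device_dir, target_dir))
    norm = (math.sqrt(sum(a * a for a in device_dir))
            * math.sqrt(sum(b * b for b in target_dir)))
    # clamp before acos to guard against floating-point overshoot
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return max(0.0, 1.0 - angle / max_angle_deg)

print(pointing_vibration((1, 0, 0), (1, 0, 0)))  # → 1.0 (dead on)
print(pointing_vibration((1, 0, 0), (0, 1, 0)))  # → 0.0 (90 degrees off)
```

The returned intensity would then scale the PWM duty cycle driving the motor on the body.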
 
We would also like to have a "wand" small enough to fit anywhere, attached to the body in some not-too-obtrusive way.
 
We can write our own code to parse the data if you tell us the format coming in some standard protocol, serial or ethernet/port stream.
 
The person may be free to wander around the room and point in any direction whatsoever.
Does the wand need to see an IR array in "front", i.e. be constrained to a half-sphere, or can it be pointed in any direction provided a set of IR beacons...?
 
Cheers,
Xin Wei
 

 


...............................................................................................
"Because the essence of technology is nothing technological, essential reflection upon technology and decisive confrontation with it must happen in a realm that is, on the one hand, akin to the essence of technology and, on the other, fundamentally different from it. Such a realm is art. But certainly only if reflection upon art, for its part, does not shut its eyes to the constellation of truth, concerning which we are questioning." - Heidegger

"scientific" gesture / movement research ?

I suggested that someone open up a Wiimote and re-assemble the parts into a suitable form factor.  We still need a visible-light photocell though, and can't use line of sight, so that solution is also clunky...

We need a good versatile engineer to own this project and work with the MP group.

And strategically, in June I'd like to define a non-hobbyist grant to NSF/NSERC/FQRNT, parallel to an MP grant, to do a gesture/movement-tracking research project that meets different interests around the TML -- MP, Adrian (+Wessel), MM, Satinder.  I'll propose this to my EU colleague as well.

Xin Wei


On 2011-05-30, at 6:43 PM, Michael Fortin wrote:

This might be a jaded comment.... 

I'll call it an advanced WiiMote (the WiiMote just tracks x-y rotation; this device also has some idea of angle and distance to the display, which the WiiMote doesn't).   (Morgan -- the WiiMote has vibrotactile feedback)

Speaking of odd remotes, there's this unrelated interesting toy: http://www.thinkgeek.com/interests/dads/cf9b/

Cheers,
~Michael();


On Mon, May 30, 2011 at 03:55, Sha Xin Wei <shaxinwei@gmail.com> wrote:
Hi Adrian, and scientific researchers,

Raising the stakes and thinking ahead to more robust and precise instrumentation, here's the

NaviScribe 6-DOF 3D wand by Electronic Scripting Products, Inc. (ESPi) in Palo Alto

The exclusive patent describes 6 DOF: x, y, z + Euler angles.  The company was founded by a physicist friend from Stanford: Marek Alboszta.  Not productized yet.  "Commercial" co-development would require O(100K) USD.  I've not discussed how to enter into an actual relation with this company, but we could perhaps work out a deal.  This would make sense in a real NSERC/NSF co-development grant.

Shall we think about this in the context of a scientific gesture-research proposal, along with high-FPS cameras, EONYX, etc.?  Let's discuss this in June.

Xin Wei


On 2011-05-20, at 9:32 PM, Marek Alboszta wrote:

Hello Xin Wei,

We can definitely do everything you ask (briefly - up to 100 Hz and better with all degrees of freedom (6DOF) reported in compact stream (right now not compressed), requires at most 120 MIPS to do everything (during periods of a lot of activity) - unit is small so can be in a ring or glasses or headgear or whatever you choose - we give you intervals so you can compute your derivatives, resolution in 3D space is considerably better than 1 cm (in plane it's down to 0.2 mm and better)).  I can't do wireless unless somebody gives me money to properly design a wireless beta unit (it is not a problem of technology but pure resources).  
...



Polhemus; Path dependence

Indeed Polhemus is the standard instrument.  From 2004, Polhemus gear seemed unacceptably clunky to be wearable and cost-effective by my "jewelry" standard, but Memory-Place could perhaps put its Verfremdungseffekt to good use.   And it's now sleeker.  If the MP group decides to really go after 6DOF in a future phase of this research, perhaps someone could source and borrow one?    Check TAG, Dr. Mudur, or CIISE.

Naviscribe seems to be an interesting case of the "good enough" solution.   Marek's patent is for reporting Euler angles, but the other degrees of freedom composite that information, which is why his method is so compact, with very nice optics in a tiny lens.   The problem is path dependence.

All Souls College, Oxford & Stanford University 2000.

As with other TML work less tightly coupled to the consumer commodity market (eg. game controllers), we can try to go our own way and leverage our particular knowledge and friendship networks, subject to practical constraints.

Xin Wei


On 2011-05-30, at 7:52 PM, <adrian@adrianfreed.com> <adrian@adrianfreed.com> wrote:

As official grumpy old man on these things I should point out that a high-precision 6DOF device is the holy grail, and hard to do at any price. It is a question of making an inventory of where the various solutions proposed fall down. There are dozens of companies that have crashed and burned on this already, so we should be cautious as to where we put our eggs.

The Naviscribe core technique was patented 4 years ago. What happened? Why can't they implement it, or raise the money to implement it, for a gaming controller?

For the particular needs of the Memory/Place work it may be easier to borrow or rent the market leader for a short time:

http://www.polhemus.com/?page=Motion_PATRIOT%20Wireless


-------- Original Message --------
Subject: Re: "scientific" gesture / movement research ?
From: Morgan Sutherland <morgan@morgansutherland.net>
Date: Mon, May 30, 2011 10:36 am
To: Sha Xin Wei <shaxinwei@gmail.com>
Cc: Adrian Freed <adrian@adrianfreed.com>, memory-place@concordia.ca,
Satinder Gill <spg12@cam.ac.uk>


Scientifically speaking, I'd love to see somebody integrate vibrotactile feedback into this. There is the feedback problem (vibration adds noise to the signal, which adds noise to the vibration motor, ad infinitum) to figure out.

http://www.cim.mcgill.ca/~haptic/pub/HY-VH-RE-CAS-05.pdf ("A Tactile Enhancement Instrument for Minimally Invasive Surgery")
http://www.cim.mcgill.ca/~haptic/pub/HY-VH-JASA-10.pdf (on better vibration motors)

I remember I had a specific idea for this kind of device recently – I'll see if I can remember...

As for the wireless part, it would be dead simple to use an XBee to beam the data over to a USB data-acquisition device (Teensy or whatever), just not elegant (XBees are bulky in comparison to a pen). The question there is whether we get analog or digital output from the pen itself. If it's digital, then there could be synchronization problems unless Marek can provide the spec for the communication protocol. If it's analog, then it's dead simple – just cut the wire and put 2 XBees in between. Add 1 extra week for unforeseeable headaches that always crop up when doing wireless.

I'm actually very excited about this – if this gets commercialized and keeps its form factor, I can see myself using one regularly, hopefully by then in conjunction with a fast RGB E-ink display. Viva post-television.

Morgan
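[The synchronization worry above — framing a digital stream when the packet spec is unknown — is conventionally handled with a sync byte, a length field, and a checksum, so the receiver can resynchronize after dropped or corrupted bytes. A minimal sketch; the frame layout here is invented for illustration and is not the NaviScribe or XBee protocol.]

```python
SYNC = 0xA5  # arbitrary start-of-frame marker (invented for this sketch)

def frame(payload: bytes) -> bytes:
    """Wrap a payload as: SYNC, length, payload, 1-byte checksum."""
    chk = (sum(payload) + len(payload)) & 0xFF
    return bytes([SYNC, len(payload)]) + payload + bytes([chk])

def deframe(buf: bytes):
    """Scan a byte buffer, yielding payloads of valid frames, skipping noise."""
    i = 0
    while i + 2 <= len(buf):
        if buf[i] != SYNC:
            i += 1          # resync: discard bytes until the next marker
            continue
        n = buf[i + 1]
        end = i + 2 + n + 1
        if end > len(buf):
            break           # incomplete frame; wait for more bytes
        payload = buf[i + 2:i + 2 + n]
        if (sum(payload) + n) & 0xFF == buf[end - 1]:
            yield payload
            i = end
        else:
            i += 1          # bad checksum; treat as noise and resync

# A valid frame survives surrounding line noise:
print(list(deframe(b"\x00" + frame(b"hi") + b"\xff")))  # [b'hi']
```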





"scientific" gesture / movement research ?

Hi Xin Wei,

Interesting device.

For the memory place investigation, though, it wouldn’t work, because it depends on line of sight between the wand and the LEDs around the TV. Also, at the moment, I wonder if it’s fast enough: the people in the demo are moving really slowly.

Best,

David
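[For reference, the indirect pointing loop this thread keeps circling — mapping wand position and angle to vibrotactile intensity so a person can "feel" a virtual object — might be sketched as below. The target location, error threshold, and duty-cycle mapping are all invented for illustration.]

```python
import math

def pointing_error(pos, direction, target):
    """Angle (radians) between the wand's pointing direction (a unit
    vector) and the direction from the wand to a virtual target."""
    to_target = [t - p for t, p in zip(target, pos)]
    norm = math.sqrt(sum(c * c for c in to_target))
    dot = sum(d * c for d, c in zip(direction, to_target))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def vibration_duty(error_rad, max_error=math.pi / 4):
    """Map angular error to a motor duty cycle in [0, 1]: strongest when
    pointing straight at the target, off beyond max_error (45 degrees here)."""
    return max(0.0, 1.0 - error_rad / max_error)

# Pointing straight at a target 2 m ahead gives full vibration:
print(vibration_duty(pointing_error((0, 0, 0), (0, 0, 1), (0, 0, 2))))  # 1.0
```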


"scientific" gesture / movement research ?

Hi Adrian, and scientific researchers,

Raising the stakes and thinking ahead to more robust and precise instrumentation, here's the

NaviScribe 6-DOF 3D wand by Electronic Scripting Products, Inc. (ESPi) in Palo Alto

The exclusive patent describes a  6 DOF, x,y,z + euler angles    The company's founded  by a physicist friend from Stanford: Marek Alboszta.  Not productized yet.  "Commercial" co-development would require O(100K) USD.  I've not discussed how to enter into actual relation with this company, but we could perhaps work out a deal.  This would make sense in a real NSERC/NSF  co-development grant.

Shall we think about this in context of a scientific gesture research proposal, along with high FPS cameras, and EONYX etc?  Let's discuss this in June.

Xin Wei


On 2011-05-20, at 9:32 PM, Marek Alboszta wrote:

Hello Xin Wei,

We can definitely do everything you ask (briefly - up to 100 Hz and better with all degrees of freedom (6DOF) reported in compact stream (right now not compressed), requires at most 120 MIPS to do everything (during periods of a lot of activity) - unit is small so can be in a ring or glasses or headgear or whatever you choose - we give you intervals so you can compute your derivatives, resolution in 3D space is considerably better than 1 cm (in plane it's down to 0.2 mm and better)).  I can't do wireless unless somebody gives me money to properly design a wireless beta unit (it is not a problem of technology but pure resources).  
...

Is your party ready to pay for this work ...?  If not then we should reschedule for when they are ready to commit resources for technology development (or if they/your side wants to do the work).  Anyway, we can talk about it if the allocation of resources is a given - let me know.

warm greetings,
<pastedGraphic.jpg>

_____________________
Marek Alboszta



On May 20, 2011, at 10:08 AM, Sha Xin Wei wrote:

Hi Marek,

For a memory & place experiment, we would like to give people a wand that they can hold that can report position, euler angles, and their time derivatives.  Ideally at better than 30 Hz for the entire 12-vector.

We need it wireless, range of say 10m suffices.

Spatial resolution is important, for tracking "pointing" at virtual objects that people infer by indirectly mapping position & angle to a vibration motor that will be embedded somewhere on their body.   I expect any pen-based input device has more than adequate time-space resolution.

We would also like to be able to have a "wand" small enough to fit anywhere attached to the body in some not too obstrusive way.

We can write our own code to parse the data if you tell us the format coming in some standard protocol, serial or ethernet/port stream.

The person may be free to wander around the room and point in any direction whatsoever.
Does the wand needs to see an IR array in "front" ie be constrained to a half-sphere, or can it be pointed in any direction provided a set of IR beacons ...

Cheers,
Xin Wei