Archive for the ‘technology’ Category

Robotic Wireless Sensor Networks

January 27th, 2009

In the last decade, Wireless Sensor Networks (WSNs) have been successfully deployed to perform numerous automation tasks such as environmental monitoring, surveillance and inventory tracking. By introducing actuation capabilities (in particular controlled-mobility), robots have the potential to improve the capabilities of existing WSNs significantly. Recent advances in robotics as well as the availability of inexpensive robotic platforms have made it feasible to develop hybrid networks in which multiple mobile robots interact with each other and other static sensors to perform complex tasks. On the other hand, design and implementation of such hybrid systems bring forth new algorithmic and systems challenges related to coordination, planning, and resource management.

The goal of this workshop is to explore the algorithmic and systems aspects at the intersection of robotics and sensor networks. We seek work in a variety of areas including:

  • Development of hardware and software platforms
  • Experiences from deployments
  • Resource allocation algorithms
  • Novel research challenges and applications
  • Localization and route planning
  • Sensor tasking, control and planning

http://hinrg.cs.jhu.edu/RWSN09/Home

“Robotica: Control inside the panopticon” by Stanza

November 11th, 2008

Copyright Image by Stanza: Robots making paintings. 2008.

The world premiere of Stanza’s Robotica: Control inside the panopticon, a playful robot installation – with performative and interactive aspects – that questions ideas of surveillance and tracking in popular culture using robots, CCTV and sensor technologies.

Twelve robots – each named after a prison inmate number – roam freely on a canvas on the floor of the gallery. These robotic prisoners are sent out across the canvas with small tasks to complete. This robotic “wandering” is captured over the evening onto the canvas: they create their own painting in their own little prison. The idea of the Panopticon originated with the English utilitarian philosopher Jeremy Bentham as a prison design that would allow an observer to monitor all the prisoners at all times, without any prisoner being aware of whether he was being monitored or not.

Like people, robots have common behaviours and can be programmed accordingly: they can follow a path (path-following mode), they can avoid obstacles (avoidance mode) and they can operate in wander mode. They all try to avoid one another – depending on their proximity to one another – while searching the space. In doing so they demonstrate social behaviour.
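As a very rough illustration of the behaviour modes described above (a sketch under my own assumptions – the distance threshold and the Java form are mine, not the installation’s actual control code), a robot might pick its mode from the distance to the nearest obstacle or neighbour:

    // Illustrative sketch only, not the Robotica robots' real firmware.
    // A robot chooses between avoidance, path-following and wander modes
    // depending on how close the nearest obstacle or other robot is.
    public class RoboticaModeSketch {

        enum Mode { WANDER, PATH_FOLLOWING, AVOIDANCE }

        static final double AVOID_DISTANCE_M = 0.2; // assumed threshold, in metres

        static Mode chooseMode(double nearestDistanceM, boolean hasTask) {
            if (nearestDistanceM < AVOID_DISTANCE_M) return Mode.AVOIDANCE; // steer away first
            if (hasTask) return Mode.PATH_FOLLOWING;                        // complete the small task
            return Mode.WANDER;                                             // otherwise roam the canvas
        }

        public static void main(String[] args) {
            System.out.println(chooseMode(0.1, true));  // AVOIDANCE
            System.out.println(chooseMode(0.5, true));  // PATH_FOLLOWING
            System.out.println(chooseMode(0.5, false)); // WANDER
        }
    }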

In moving through the gallery, people create a ‘memory space’ – a reference to a past created by the traces and paths left behind. The patterns we make, the forces we weave, reveal different ways of moving through the space. These patterns disclose new ways of seeing the world. All the robots are recorded via CCTV, and each robot also carries its own CCTV camera, whose feed is shown on a monitor that also records the event. Police “tape” keeps the robots inside their controlled space. The robots mimic and trace the patterns people make – but based on algorithms. The robots are tracked – everything is watched and recorded – and, unlike people, their movements can be networked into retrievable data structures that can be re-imagined and sourced for information. The digital patterns of the robots are re-made as analogue patterns. The robot path is in effect replaced with a series of ‘brushes’ – and it is these that are wandering around the canvas. A series of actions are applied to the movement of the digital brush across the rectangular canvas to create these robotic generative paintings.

This artwork investigates the relationship between the analogue and the digital aesthetic. The robots wander over the canvas to make the image – and this also protects the floor. The suggested canvas size is 2.5 by 5 m, so a reasonable floor space is needed. All the robots will see the edges of the canvas and turn around automatically, i.e. they are roped off and will not go wandering off on their own!

Sensity: An Urban Scale Wireless Sensor Test Bed.

October 23rd, 2008

Copyright Stanza artwork image, London.

From Sensity by Stanza, 2004. The sensor network can be moved from an urban to a rural setting, and different types of visualization can be made depending on the environment. Sensity is an open social sculpture that informs the world and creates new meaningful experiences. Sensity is also a highly technical project that will give vast amounts of information about the fabric of our cities. By embedding the sensors like this we can re-engage with the urban fabric and enable new artistic metaphors within city space. The sensors are positioned across the city. Custom-made software enables these sensors to communicate with one another in a network, over a proxy server, in real time. The data can also be used to create visualizations in an open source environment. Other online users can also re-interpret the data and interrogate the various sensors in the network, as this is open sourced as well (see the XML streams); a rough sketch of reading such a stream follows below. Representations of these data sets will allow a unique understanding of the urban environment in real time.


Copyright image by Stanza: live city data from Sensity, around Goldsmiths College, London, for Meta Data.
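To give a flavour of what interrogating such an open XML stream might look like (the URL and the element and attribute names here are placeholders of my own, not the actual Sensity feed), here is a small Java sketch that polls a feed and prints the readings:

    // Rough sketch: fetch an open XML sensor feed and print each node's readings.
    // The URL and the <node> element/attribute names are hypothetical placeholders.
    import java.net.URL;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class SensityFeedReader {
        public static void main(String[] args) throws Exception {
            URL feed = new URL("http://example.org/sensity/nodes.xml"); // placeholder URL
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(feed.openStream());

            NodeList nodes = doc.getElementsByTagName("node");
            for (int i = 0; i < nodes.getLength(); i++) {
                Element n = (Element) nodes.item(i);
                System.out.printf("node %s: light=%s temp=%s noise=%s%n",
                        n.getAttribute("id"), n.getAttribute("light"),
                        n.getAttribute("temp"), n.getAttribute("noise"));
            }
        }
    }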


Google are our big brothers. Masters of our Universe.

July 7th, 2008

And we thought Google was just a search engine. Well, for the past few years they have come up with some cool tools. But now it seems their cards are on the table, their intentions are clear. It’s world domination by surveillance culture. We are just data, and Google aims to “own” us.

Think of Will Smith in the film “Enemy of the State”, and then maybe we are getting close: Google in two months, two years, or twenty years. Maybe it’s cool that we can all watch each other going peacefully about our business as long as Google blurs our faces. What planet are Google on, Google Earth? Yeah, right.

So just what are Google’s longer-term intentions here? Spying on us through search algorithms in the digital world is one thing, tracking us via the open internet is another. It’s time for the ethical debate to be brought to centre stage, with some creative input beyond “it makes the world a safer place”.

Ref:

http://news.bbc.co.uk/1/hi/technology/7492844.stm

Google has defended its controversial Street View photo-mapping tool, saying it will meet local privacy laws in European countries at launch.

The tool, which matches real world photos to mapped locations, has drawn fire from some privacy campaigners.

In the UK, Privacy International said the tool could breach data protection laws if people’s faces were shown.

Google has said it is using face blurring technology to preserve the privacy of individuals photographed.

“In our view they need a person’s consent if they make use of a person’s face for commercial ends,” Simon Davies, of Privacy International told BBC News.

Street View has already been launched in the US and includes photos of streets in major American cities. Photographing of areas in the UK, including London, is believed to have started last week.

Mr Davies has written to Google asking for details of the face-blurring technology, saying he would ask the UK Information Commissioner to intervene if he did not receive a satisfactory response.

He told BBC News that he was concerned that Google’s technology would not work.

Google’s senior privacy counsel Jane Horvath has responded saying that the technology had already been deployed.

So, Jane, I guess that’s all right then, is it?


Copyright image by Stanza: Global, live visualisation of media over the net, 2004.

This shows my live maps work from 2004; now Google Maps can just take what they want.

They even have a photo of me in my house on Google Earth, which they took just as the Google van passed (glad I had my shorts on). Will Google become the enemy of the state?

Stanza image from 2004: “Global… Never the same again, always different… forever.”

Citysense passively “senses” the most popular places based on actual real-time activity and displays a live heat map.

June 18th, 2008

Stanza Artwork. Shanghai 2004.

Here is the sales pitch from Citysense, a system for gathering and representing real-time city data from San Francisco. A nice idea for a company.
Quoted.
Citysense is an innovative mobile application for local nightlife discovery and social navigation, answering the question, “Where is everybody?”

Citysense shows the overall activity level of the city, top activity hotspots, and places with unexpectedly high activity, all in real-time. Then it links to Yelp and Google to show what venues are operating at those locations. Citysense is a free demonstration of the Macrosense platform that everyone can enjoy.

Instead, it evolves searching to sensing. Citysense passively “senses” the most popular places based on actual real-time activity and displays a live heat map.
Location data is everywhere. Cars, buses, taxis, mobile phones, cameras, and personal navigation devices all beacon their locations thanks to network-connected positioning technologies such as GPS, WiFi and cell tower triangulation. Millions of consumers and businesses use location-enabled devices for finding nearby services, locating friends & family, navigating, asset- and pet-tracking, dispatching, sports, games, and hobbies.

These forces have lowered the cost of technology, ignited interest in location-enabled services, and resulted in the generation of significant amounts of historical and real-time streaming location information. Sense Networks was founded on the idea that these datasets could provide remarkable real-time insight into aggregate human activity trends.

Macrosense employs patent-pending technology to learn from these large-scale patterns of movement, and to identify distinct classes of behaviors in specific contexts, called “tribes.”

Once it’s known which tribes are where, by sampling the distribution of tribes at any given place and time, it’s possible to understand what it means when a user is there at that place and time.

For example: rock clubs and hip-hop clubs each retain distinct tribal distributions. When a user is out at night, Citysense learns their preferred tribe distribution from time spent in these places. When that user visits another city, they see hotspots recommended on the basis of this distribution and combined with overall activity information.

Users who go to rock clubs see rock club hotspots, users who frequent hip-hop clubs see hip-hop hotspots, and those who go to both see both. The question “where is everybody like me right now?” is thus answered for these users – even in a city they’ve never visited before.

Citysense is an application that operates on the Sense Networks Macrosense platform, which analyzes massive amounts of aggregate, anonymous location data in real-time. Macrosense is already being used by business people for things like selecting store locations and understanding retail demand. But we asked ourselves: with all this real-time data, what else could we do for a city? Nightlife enhancement was the obvious answer. This release is just a test, and we’re interested in your feedback on how to make the application better. You’ll find a feedback button in Citysense.

Principles…

People should own their own data
People should have full control over the use of any data that they generate. All data collection should be “opt-in,” and users should be able to easily remove themselves and their data from the system without questions or hassle. The system doesn’t “remember” a user for later, but completely deletes data at the user’s discretion.

People should receive a meaningful benefit in exchange for sharing data
Meaningful benefits include compelling applications to help manage life better, or personalized services based on anonymous learning from “users like me.” People should be able to enjoy the benefits of these services simply in exchange for their data.

We’re looking for additional common good uses of aggregate, anonymous location data. If you would like to submit a project for consideration, please contact us at ….
http://www.citysense.com/home.php

All of the above is from their website.

From my Sensity projects.
Citysense… sounds like Sensity backwards. Various types of data can be re-imagined within the context of city space and the environment. This includes pollution data recorded via sensors in the street, used to create audio acoustic files expressing the pain and suffering of the air as it pollutes. Weather and forecast data, acquired via weather station equipment, can be used to create ambient soundscapes and morphing visualizations as the wind shifts direction or the rain increases. Noise monitor levels and noise maps create a symphony of true urban sounds that can be used to make sound-reactive sculptures. The patterns we make, the forces we weave, are all being networked into retrievable data structures that can be re-imagined and sourced for information. These patterns all disclose new ways of seeing the world. The value of information will be a new currency as power changes. The central issue that will develop will be privileged access to these data sources.
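As a toy illustration of this kind of data-to-sound mapping (the value range and the frequency mapping are my own assumptions, not the actual Sensity sonification), a reading can simply be scaled onto an audible frequency:

    // Toy sketch of mapping a sensor reading onto a tone.
    // Ranges and mapping are assumptions, not Sensity's actual sonification code.
    public class SonifySketch {
        // Map a reading in [min, max] linearly onto a frequency range in Hz.
        static double toFrequency(double reading, double min, double max) {
            double clamped = Math.max(min, Math.min(max, reading));
            double t = (clamped - min) / (max - min);
            return 110.0 + t * (880.0 - 110.0); // A2 .. A5, an arbitrary musical range
        }

        public static void main(String[] args) {
            // e.g. a pollution reading of 65 on an assumed 0..100 scale
            System.out.printf("%.1f Hz%n", toFrequency(65, 0, 100)); // ~610 Hz
        }
    }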
I like their pitch about people owning their own data; couldn’t agree more, in fact all royalties should be shared. It’s not just about privacy, it’s about ownership. Once you enter the grid your body is externally giving away data and information. Companies are now rushing to harvest this information (information services), making new products for mobile devices. I think we are going to see a lot of this.

Motes technical: temperature responses for the MTS310 series on MICA2

June 3rd, 2008

Stanza: Sensity – real-time city mash-ups, merging my online XML feeds from each city and making online visualisations.

The temperature sensors in MoteView on four of my motes show -273.15. The rest of the board, i.e. light and sound, is working. How do I get the temperature working, or reset it?

Did you “Reboot” into the new image (slot) that you OTAP’d?

This is required in order for the nodes to start executing the new image. You may choose any of the available slots where you want to store the new image.

Please refer to the MoteConfig manual for details on OTAP.

http://www.xbow.com/Support/Support_pdf_files/MoteConfig_Users_Manual.pd

——————————————————


Response:

—————————————————————

It looks like you are able to query the nodes (i.e. they are running OTAPImage).

Which Motes are you trying to reprogram with the new code?

Which slot have you selected to OTAP?

Please check and make sure that you have at least 2.7V battery voltage on these nodes.

Response:

—————————————————————

Unless you had previously enabled the nodes to be OTAP’d, you won’t be able to OTAP them with a new application.

What I would suggest is to bring the nodes in, attach them to the MIB board, and then program them using MoteConfig’s Local Program tab. You need to check the OTAP enable box if you wish to OTAP them in the future.

—————————————————————

Response:

Maybe I wasn’t clear earlier. You can use both CA and CB boards in the same network. The main difference is that they use different power control lines for the temperature sensor. To get accurate temperature readings, you should program the mote attached to the CA sensor board (without jumper wire) with “XMTS300CA__.exe” and program the mote attached to the CB sensor board (with jumper wire) with “XMTS300CB__.exe”.

Response:

—————————————————————

It is quite likely that you have an MTS310CB board (look for a jumper wire on the bottom side of the sensor board).

The MTS310CA uses INT2 for temperature power control, whereas the CB version uses PW0. It sounds like you are using CA code on CB hardware (or vice versa), and hence the temp sensor never gets turned on and returns 0.

If you have a CB board, then you need to use the CB version of the app.
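For what it’s worth, the -273.15 figure is consistent with this explanation: if the power control line is never switched on, the raw reading comes back as 0, and converting a zero Kelvin result to Celsius gives exactly -273.15. A rough sketch of that arithmetic (purely illustrative, not Crossbow’s actual MoteView conversion):

    // Illustrative only: why an unpowered temperature sensor can show -273.15.
    // Not Crossbow's actual conversion code; the thermistor maths is a placeholder.
    public class TempReadingSketch {

        // Pretend conversion from a raw ADC value to Kelvin.
        static double rawToKelvin(int raw) {
            if (raw == 0) return 0.0;       // power line off -> no signal -> 0
            return thermistorConversion(raw);
        }

        static double thermistorConversion(int raw) {
            // Placeholder for the real thermistor maths (e.g. Steinhart-Hart).
            return 298.15;                  // dummy value, about 25 C
        }

        public static void main(String[] args) {
            double kelvin = rawToKelvin(0);        // sensor never powered up
            System.out.println(kelvin - 273.15);   // prints -273.15
        }
    }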

Question:

—————————————————————

We want the batteries to last longer.

How do we do this?

How long should batteries last without a change?

If they are in low power mode, how long will they last?

I know this is difficult to answer, but how do I get them to last longer?

Also, do you have a solar panel one can plug into the motes for power?

Response:

—————————————————————

The high power (HP) version of the apps doesn’t duty cycle the radio and hence would deplete the battery in a few days. The low power (LP) version of the apps draws an average of 330 uA of current on the MICA2 platform and, when used with alkaline AA batteries, can easily deliver over 6 months of battery life.
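Those figures roughly check out. At a 330 uA average draw, an alkaline AA cell (assuming something like 2000–2500 mAh of usable capacity, which is my assumption rather than a Crossbow spec) gives on the order of 6000–7500 hours, i.e. roughly eight to ten months:

    // Back-of-envelope battery life check for the XMesh-LP figures quoted above.
    // The capacity figure is an assumption about typical alkaline AA cells.
    public class BatteryLifeSketch {
        public static void main(String[] args) {
            double capacityMah = 2500.0;  // assumed usable capacity of an alkaline AA cell
            double averageMa = 0.330;     // 330 uA average draw quoted for LP mode on MICA2

            double hours = capacityMah / averageMa;
            double months = hours / (24 * 30);

            System.out.printf("~%.0f hours, ~%.1f months%n", hours, months);
            // ~7576 hours, ~10.5 months -- comfortably "over 6 months"
        }
    }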

In order to make the batteries in the kits last a long time, you need to use them in XMesh-LP mode.

We do have a solar panel implementation in our next-generation eKo Pro series products that can deliver a battery life of over 5 years.

http://www.xbow.com/Eko/index.aspx

“Gallery” by Stanza, is a dynamic public sculpture viewable over the internet.

May 28th, 2008


“Gallery” by Stanza is a dynamic public sculpture viewable over the internet. Gallery describes the space, in this case the upper gallery in Plymouth Arts Centre, England. It was made during an artist-in-residence project, in situ in the gallery space, during February 2008.

The gallery interior has been made virtual and placed online. “Gallery” is part of a series of process-led experiments in data visualization within the context of an art gallery. This is an experimental engagement with data in the art gallery using sensors and CCTV. Stanza asks, “what happens during the process of visiting the gallery as a dataspace?”; i.e. what happens to the gallery, and what do the visitors do?

The sensors are used as real time recording devices to gather information about the sensory behaviour of the real space. The gallery becomes the artwork formed by the emergent real time data in the space.
The gallery laid bare as a work of art. Gallery proposes that the data is art. The art is a real time flow of the things around us that allow our senses to invoke understanding. The gallery space becomes the art described by the shifts in light, temperature and noises in the space over time.

http://www.stanza.co.uk/gallery/index.html

Internet Art and technology. Stanza 2003

January 9th, 2008

Artwork by Stanza: CCTV Media Visualisation, 2005. Large print on canvas.

The computer has now become a central tool within new media creativity. We are starting to see more and more traditional artists move onto the web from other media. This has happened because of a combination of economic conditions and the artist’s continual ‘search for the new’.

The internet offers various economical and valuable distribution benefits for artists and artworks. New media creativity also offers a variety of shifting parameters within which the interpretation of previous art histories may be re-evaluated. This is why ‘expression’ and the use of the internet as a medium and a resource has expanded to envelop our new world framework and is embraced by so many artists and art colleges. The use of this new technology also offers a sense of belonging which was never exposed through various other art histories.

This sense of connection is one of several qualities inherent to the internet as a medium for creative expression; sound, visual effect, time, movement and interaction all provide new parameters for the development of contemporary art. Here we have the convergence of painting and printmaking, photography, film and music.

The merging of the audio visual is increasingly becoming a central issue in the development of interactive media. Web artists are fusing the arts, incorporating a wide range of approaches to the medium of the internet and audio visual practice. Artists are producing new audio visual experiences, and this includes art, games, generative music and interactive environments.

Artists have always been influenced by technology. Previously the artist’s or musician’s studio was a place of many hardwares, softwares, and bits and pieces. Today, instead of brushes and paint and wooden stretchers and huge space for storage, we have a small box that can be both studio and gallery. For musicians, instead of all sorts of instruments and masses of expensive gear, we have the same small box, the PC. So the convergence of hardware and software has enabled many types of creatives to meet or converge. The PC acts as studio and gallery. Works can be disseminated globally. The distribution system has changed, and the artist has direct access to a bigger audience through his very own “white cube” gallery.


It could be said we are now starting to see the emergence of a new art form. As the newness unfolds, a history will unfold with it. At the moment there is a blurring of the boundaries as many approaches are adopted, and this is confused further because of the constantly changing and developing nature of technologies, which also allows the artworks themselves to change. We are starting to see a much bigger emphasis on works that generate and evolve.


Certainly there is now a whole new category of online art and music driven by computer technology.

Online, we have net art. These works encounter and engage the user, without whose presence in the interactivity the work is not only meaningless but does not exist. Within the global exhibition of such works, the parameters of the artist’s relationship to his audience have shifted ground. We see emerging a shared multidimensional relationship to these works. So now we find that the computer, this box, is in fact the gallery, the exhibition space: the computer as white cube. This box has become the specification for which these works are made and in which they are experienced.


Fifteen to twenty years ago very few computers were being used by creatives in colleges or universities. Now a visit to any college will see classes of art, fashion, graphics and music all huddled up around the computer screen.

Artists are specifically looking at creative possibilities for the computer and the internet as a medium. One thing seems obvious, more and more artists are being drawn to new media. The diverse range and plurality of backgrounds means that the specifics of this form are hard to evaluate. From design, music, art, and programming various skills are needed to produce work in the digital domain.

So to fully engage with the internet as a medium, the artist must adopt multiple skills and languages in addition to those traditionally associated with the arts. Presented with an internet specific artwork, the visitor must physically engage with the work to experience it as it is meant to be and by that I mean that the work must utilise the qualities inherent to the medium if it is to be considered internet art at all – time must pass, things must change, connection must be made for the experience to be complete.

Text: Stanza, 2003

Art on spheres. Stanza 3d display globes with data and surveillance.

June 1st, 2007

I have for the past four years been trying to make a large scale display of live data in a globe screen; and I have made several works in this area. I first made this proposal to the Watershed in 2004 and developed several concepts and prototypes with them through my Clarks bursary.


Stanza. Art on spheres. Stanza 3d display globes.

3D globe with live data. 2004.

This research was also followed through with my Nesta Dreamtime Award in 2004, and I also pitched it to the Nesta business unit to make a real globe in 2004 (to develop this as a display device) as part of my Nesta Dreamtime outputs.


Pitch to Watershed to place a data globe outside the media centre; image 2004. The pitch was made again in 2006, when I sent the proposal for SOUL on the globe.

As part of this research I looked at other 3d globes which have appeared in the last five years, such as Omniglobe and Pufferfish. Indeed, Pufferfish now has a really nice 3d globe that they market as a display for trade fairs.


Live CCTV. Stanza. Art on spheres.

Stanza image: live CCTV on globe. Shown here at County Hall on the Pufferfish globe, 2006.

I was going to make my own globe at one stage in 2003 but didn’t have the money to pursue it. I was also involved with BDS at its inception, i.e. as a director of BDS, when we made a globe in China by projecting images onto Armand’s weather balloon, which we took to China and which looked quite nice. (I subsequently left them and only worked with them for that brief period in China.)


Stanza image from China. We tried this experiment in China on a large weather balloon. The image shows a live CCTV camera of mine, which was showing a picture from my house. 2003.


Also this one, showing live images from Mexico CCTV being hacked, 2003. The image on the globe is the date and time stamp of the CCTV camera.

I have subsequently developed some relations with Pufferfish to show work on a 3D globe which they developed. Indeed, they were kind enough to let me test some work on their display.


Stanza image on the Pufferfish display: live sensors from across the city showing live data in a real 3d globe, in this case the work Sensity. 2006.

I cannot claim to own any patents on globe technologies, unfortunately; however, what all my ideas in this area have in common is the goal to make a 3d display technology and place it in an artistic context as a sculptural display.


LIVE CCTV for a globe hung above the city. 2004.

My artistic goal was much more focused on the nature and use of this technology.

So for the past four years I have been researching the use of live data, for art gallery use indoors and outdoors, to make a piece of work from a fine art perspective, i.e. for cultural use, that would appear as a 3d globe.

I have used live CCTV images, real-time news feeds and various forms of live data from my own sensor networks. Indeed, the idea of the “globe” was essential to my metaphor of a world full of data, an archive, a meta ball of information.

This process hasn’t been easy. As well as trying to develop prototypes for four years, I have been applying for arts grants to install a 3d globe to show live data in an art gallery. I pursued this idea of a globe showing live data from a cultural perspective, i.e. real-time information from various sources, so the whole piece becomes a data globe, a world focused on the nature of live data and information flow. I want this to be both an outdoor public sculpture and a piece indoors in an art gallery.


Stanza live feeds on globe. 2004

I had also applied to the Space 4 gallery in Peterborough, where they said yes to the idea. (They had agreed a one-man show that subsequently fell through.) Indeed, their curator Lisa Helin made a bid to the Arts Council on my behalf – or rather she filled the form in, only for it to be rejected in 2006 by London Arts.

I had also pitched the globe project to The Watershed where my original ideas were formed as part of a Clarks Bursary in 2004.

I also pitched this at Sunderland Winter Gardens, where I came second in a shortlist process in 2006.

Series of Sketches for SOUL…live data in the city.

I was also recently approached by a consortium in the United Arab Emirates about this concept, but it fell through. This was such a disappointment, as they were planning a series of them.

I think it would be great at the Tate Gallery, Turbine Hall if it looked like this with live data in it.


Amber Stanza in the Turbine Hall; the image shows live data (CCTV feeds from around the Tate) on a huge globe.

It would look like this…………………..

Links to my works on this…

2006: http://www.stanza.co.uk/sensity/index.html
2004: http://www.stanza.co.uk/micro_city/index.html
2004: http://www.stanza.co.uk/global/index.htm
2006: http://www.stanza.co.uk/soul_globe/index.html
2006: http://www.stanza.co.uk/biocities/index.html
2006: http://www.stanza.co.uk/newsfeeder/index.html

In short, I am still trying to do this, so if anyone wants to commission me, get in touch.

As you might know if you read this blog, I am the recipient of an AHRC research fellowship. The concept of displaying live data on unique technologies is also one of the listed outputs of my fellowship.

Indeed, Helen Sloan of Scan and Gill Haworth and all at the Watershed Media Centre are supporting me in my endeavour to find galleries and public art spaces that are interested in exhibiting my work.

Professor Janis Jefferies at Goldsmiths is also helping to look for outputs for this.

If you are a gallery and you want to exhibit my work contact me.

All images on this page copyright Stanza

Sensity in Italy Festival at Share IT. The art of environmental data

March 21st, 2007

PIEMONTE SHARE FESTIVAL | TORINO, 23–28 January 2007: From over 200 submitted works, an international jury shortlisted six works that were exhibited at the festival, from which one prize winner was selected.

Sensity was selected but they just set up a projected version rather than the live sensor network. Like with many other festivals, there is one prize only that is not tied to a specific technology or genre but rather to their combination and to the expression of ideas.

A series of artworks based on connecting city spaces. The results are visualisations and sonifications of real time spaces using my own wireless sensor networks and environmental sensor technologies.

Sensity artworks are made from the data that is collected across the city. The sensors interpret the micro-data of the interactive, responding city space. The outputs from the sensor networks then display the “emotional” state of the city online, in real time. The information is also used to create offline installations and sculptural artworks. Several artworks (sonifications and visualisations) have been made connecting up spaces and cities.

All the artworks in this series by Stanza use data from the real-time environment. A new city experience results, based on the mash-up of meta data from these multiple city streams. Sensity leverages these real-time city data streams and represents them online, showing the life of the system, opening up the system, and publishing the emerging, changing behaviours of the space.

As all things become connected and networked, my concept will become a system that senses not just the city but the whole world. Eventually sensors will be linked to give a real-time global visualization – a public domain data resource for art and environmental monitoring.

 

Crossbow Motes. Details of experience to date. 2004 – 2006

April 9th, 2006

Crossbow Motes.

What follows is a sort of diary / journey of my experiences of dealing with and researching new technologies at the fuzzy edge, where ‘stuff’ is getting developed and marketed to end users as research tools. The technology in this case is Crossbow Motes.

My interest and work with Crossbow motes (MICA2 wireless boards and sensors) and wireless technology goes back to early 2004, when I went over to Crossbow’s head office in San Jose and then attended a workshop in Boston in late 2004. For the workshop, which cost over $500, you have to turn up owning or having ordered the technology; I actually wanted to evaluate it, or at least see if it worked, before I bought. But I was looking for a technology that could fit my concepts (live real-time data from a sensor network represented online for monitoring public space) and you have to start somewhere, so I paid upfront for the course and the technology, booked the flight from London, and got myself to a hotel in Boston, USA.

It should have been a clue as to how difficult Xbow motes actually are. Out of 200 people at the workshop, they said about ten would turn up to the free two-hour session the day before (for the pre-install). Well, what I mean is, the clue should have been in the pre-install day: nearly every person or group who had previously bought this technology needed to be shown how to install it. This audience included some Navy SEALs, software engineers with PhDs, and all sorts of BSc and MSc graduates working in professional fields, i.e. hardware and software engineers. That is, nearly everyone who had bought this couldn’t get it to work and needed their hand held.

Anyway, I went ahead and bought the kit at the workshop, except the director of Crossbow couldn’t get mine working and he eventually gave me his used MIB510 board.

It’s been three years now that I have been messing around with these motes, and there are some simple truths.

The first is that this is expensive; the second is that there is hope offered but failure is always close at hand; the third is that you’re on your own.

I also wonder, out of all the people that bought or buy these kits off Crossbow Technology, who has actually used them successfully. If anyone has, can you write to me? I did ask Crossbow for a list but they wouldn’t give me one. How many of these kits are deployed somewhere and working, or are they all left in dusty cupboards? My guess is the latter.

Not only is it difficult to appraise what they should do, they don’t even do what the company promises (more on this later). To try to develop your own ideas you have to read one of five manuals (which I did); this isn’t the problem, I mean at least there are some manuals. The problem is lots of the stuff is either not true or misleading. (The online forum is useless.)

Maybe I just ask the wrong question; however, for three years my question has been the same: can I get the data online using motes? I want to make a real-time online environment with the live data, not the locally saved data. And after much questioning I was told to use XML-RPC. The answer to this has always been yes, but how?

They actually sold me software which they said does this out of the box. Well, I can tell you and them it does not. It does not take XML-RPC into Flash, for example. The version of XML-RPC is proprietary and unique to Crossbow; believe me, it took months to figure this out, i.e. it’s a non-standard version. I repeated the same question after I bought more stuff, and they said I would now need to develop either a bridge or a PHP or CGI script in order to do this. (Which I have now also achieved, as of 2007; see below.)

I asked them for examples, or to at least show me that they at Crossbow have done this, and their reply was to say that the code is Crossbow property, i.e. proprietary intellectual property. OK, what does this mean? Well, first it means that the product doesn’t do what they say it does. Second, it means either that they are still developing this or that they weren’t prepared to prove to me that this worked. Thirdly, it meant that I still hadn’t got my sensors blasting out data online over a network so that I could manipulate it.

However this isn’t the point.

The distributors in the UK (Willow) and Crossbow in the USA all said I could get real-time data online to my website using the software they sold me for £800 – err, that’s why I bought it. They said it is built in. I made several requests about this, and they assured me before I purchased that it was. Anyway, you cannot get real-time data to be presented to an online source using the kits and the software they supply you with. What they said is not true and as such is misrepresentation.

However, I did get my real-time data, but only after having written my own custom software, a mote proxy bridge in Java (thanks to Eamonn). There is also a way to do this via PHP and PostgreSQL, but they didn’t help with this. In other words, you have to do a lot of your own software development to get this to happen, so be prepared.
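To give a flavour of what such a bridge involves, here is a minimal sketch of the idea: read readings arriving as text lines from the local mote server and re-publish the latest values as XML over HTTP, so a web page or Flash movie can poll them. The host, port and line format are assumptions of mine for illustration, not the actual XServe interface or the bridge described above.

    // Minimal sketch of a "mote proxy bridge" in Java.
    // Assumes readings arrive as "nodeId,light,temp" lines on a local TCP socket;
    // host, port and line format are illustrative assumptions only.
    import com.sun.net.httpserver.HttpExchange;
    import com.sun.net.httpserver.HttpServer;
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.net.Socket;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class MoteProxyBridge {
        private static final Map<String, String> latest = new ConcurrentHashMap<>();

        public static void main(String[] args) throws Exception {
            // HTTP side: a browser or Flash movie polls http://host:8080/sensors.xml
            HttpServer http = HttpServer.create(new InetSocketAddress(8080), 0);
            http.createContext("/sensors.xml", MoteProxyBridge::serveXml);
            http.start();

            // Mote side: read comma-separated readings from the local mote server (assumed).
            try (Socket s = new Socket("localhost", 9001);
                 BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] parts = line.split(",");
                    if (parts.length >= 3) {
                        latest.put(parts[0], String.format(
                                "<node id=\"%s\" light=\"%s\" temp=\"%s\"/>",
                                parts[0], parts[1], parts[2]));
                    }
                }
            }
        }

        private static void serveXml(HttpExchange ex) throws java.io.IOException {
            StringBuilder xml = new StringBuilder("<sensors>");
            latest.values().forEach(xml::append);
            xml.append("</sensors>");
            byte[] body = xml.toString().getBytes("UTF-8");
            ex.sendHeaders(200, body.length);
            try (OutputStream os = ex.getResponseBody()) { os.write(body); }
        }
    }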

The other issue with the motes is that I am always having problems restarting the XServe program and trying to re-establish connections. In other words, getting them running is one thing, keeping them running is another.

And as of last month (mid 2007), Xbow announced they are now giving the MoteWorks software away. I bought this from their UK distributor Willow Technology two months before for £800, i.e. about $1600 US (no joke). Er, can I have my money back?

Recently I updated from MoteView version 1.4 to version 2.0 and my mote bridge stopped working. They have probably updated their XML structure, so I rolled back to an earlier version. Well, OK, I hear you say, it’s the cutting edge of technology… er, bleeding edge. So my bridge was re-written and updated again, and I am now running version 2 of MoteView with my own mote proxy.

I also bought a “Stargate” wireless gateway (it cost £700 via the UK distributor Willow Tech). This was advertised with a built-in micro wireless camera – err, they could have said what it was. The built-in micro camera was in fact an external USB Logitech webcam; this is what I got when I opened the box. They charged about £120 ($240) for this webcam. The other thing about the “Stargate” is that it needs a regular power supply. It should be solar or battery powered; it makes no sense to have a mains-powered unit in a wireless sensor network for remote monitoring. (True, I can adapt it.) At this price I should have just bought another laptop, and that’s what I recommend to anyone else.

There has not been one stage in this process when I have felt like I have had good service, either from Willow (the Xbow UK distributors – although Willow tried their best, their interest is making a sale by redirecting products that come from the USA with a 100 percent mark-up) or from Crossbow USA, who I have dealt with directly.

Although I am still pursuing this project and my research (I have started so I will finish), this is more of a warning. Really, I feel like I should just ask for my money back, as all this feels like sales misrepresentation.

You see, I actually went to the San Jose head offices from London and told them what I was going to do and what I wanted before I started to invest my time and money in this; this was early in 2004.

Maybe all the researchers use other people’s money so it doesn’t matter and nobody speaks up, but this is just a word to those that might want to invest in this. WARNING: look at any other wireless sensor platform and avoid Crossbow Motes; maybe Gumstix are better, or build your own. Anyway, you have been warned.

Oh, one more thing: I even bought housing to protect them. It does not fit properly, you have to break solder connections to get the motes inside, and it isn’t waterproof. This felt like more money wasted.

To conclude, you have to do a lot of your own software development to get real-time online visualization.

I have also been doing tests for continuous running, mainly battery life but also stability and data polling. So far thirty-six hours is the longest period I have had my sensor networks running without either a re-start or some other related health issue. Well, maybe there is a memory leak in MoteView 1.4. Also, during the last month of this endurance testing, two motes have stopped working in my system. This means that they are fairly unstable for anything beyond “play” development.

Some positive points

I have now written my own custom software, a mote proxy bridge in Java, to get real-time data online. I have tested this and I am making several online real-time visualizations. A mote proxy middleware version has been made; current version 1.23.

There is also a way to do this via PHP and PostgreSQL by opening ports. I have also done this, and an online PHP kit, version 0.9, has been developed.
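On the database side of that route, the idea is just to write each reading into a table that a web script can then read back out. Here is a sketch of that step, written in Java/JDBC to keep all the examples on this page in one language (the connection details, table and columns are my own placeholders, and the kit mentioned above is actually PHP, not Java):

    // Sketch of storing a reading in PostgreSQL so a web script can serve it later.
    // Connection URL, credentials, table and columns are illustrative placeholders;
    // requires the PostgreSQL JDBC driver on the classpath.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class StoreReadingSketch {
        public static void main(String[] args) throws Exception {
            try (Connection c = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/sensity", "user", "password");
                 PreparedStatement ps = c.prepareStatement(
                    "INSERT INTO readings (node_id, light, temp, taken_at) VALUES (?, ?, ?, now())")) {
                ps.setInt(1, 3);    // node id
                ps.setInt(2, 512);  // raw light reading
                ps.setInt(3, 498);  // raw temperature reading
                ps.executeUpdate(); // a PHP page (or anything else) can then read the latest rows
            }
        }
    }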

I am still trying to develop with this, for my Sensity and House projects. It’s part of my AHRC creative fellowship, and I have now set up my studio, i.e. my main operational headquarters, in the Digital Studios at Goldsmiths College, University of London.

Maybe Crossbow would like to support me by giving me a complete set of the new motes and sensor boards with GPS that can network over larger distances (Imotes and IRIS), so I can continue my research.

 

Wish list

  • Smaller, more stable motes.
  • Data over much larger distances.
  • A plug-in solar power cell to power them.
  • A Stargate with solar power or battery.
  • Some decent housing that is easy to clip on and fits well.
  • Much better technical support.
  • Easier set-up.
  • More sensors that can just clip on and piggyback on the set-up.
  • Also: a mote with Ethernet that just clips into a network port and configures itself to send data… that would be cool.

Sensity: The online interfacing of live real-time sensor networks allows communication with the environment, with real space, in the present.

January 10th, 2006

Sensity: Environments. The ‘environment’ in these projects is created from a wireless multi-nodal, multi-sensor network that is in place. The analogue is made digital, and the digital can be formed into a variety of output devices. The flow of the data can be set to affect the behaviour of the output environment. The data environment that is created is mapped on top of the space, a virtual data map of the real world. The environment is intelligent; it’s just that we don’t know how to communicate with this space yet.

Within Sensity there is now a loop from the real to the virtual and back to the real. This notion of playing with or manipulating a malleable form (data) is made possible as each stream, each node, each sensor, or even the entire network can be communicated with using this XML online gateway.

We have seen a rich shift in relational and responsive interactive works, and a move away from the gallery as a venue for art towards the use of architecture and public domain space, over the last twenty years.

In an age of global warming, so many artists are still using the architectural space as a coloured light bulb. As we burn more fossil fuels, the lights are flashing on and off.

Can Sensity be made more physical on output, to represent the growth of the city as an experience in the real world, away from the screen? A representation of the fabric of city space and the emerging patterns caused by these data flows.

An art city can be made where the data powers the wind turbines, the data changing the solar panels that change the lights. Loops of real time data change the meaning all the while changing the input and output which is (e)merging into a new space.
