Archive for June, 2008

Starry Night…Maybe we should all turn the lights on….

June 28th, 2008

This spring, a new sculpture by James Yamada entitled Our Starry Night will be on view at Doris C. Freedman Plaza at Fifth Avenue and 60th Street. Built from powder-coated aluminum and punctuated with 1,900 colored LED lights, Our Starry Night is a 12-foot-tall sculpture that acts as an interactive passageway to Central Park. As visitors to the park walk through the sculpture at all hours of the day and night, it will illuminate in response to each person individually.

When visitors walk through the portal in the piece, they trigger a metal detector hidden inside the structure’s casing. This activates the LED lights that perforate the exterior of the sculpture. Common everyday metal objects such as cell phones, keys, belts, jewelry, cameras, computers, and the like will trigger the lights; the luminosity and the light patterns seen in the piece will correspond to the quantity of metal detected. Our Starry Night is literally activated by the public, reinforcing the notion that art — and particularly public art — is dependent on the people around it. (Ref: http://publicartfund.org/pafweb/projects/08/yamada/yamada-08.html)
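The interaction model is simple enough to sketch. Below is a purely hypothetical illustration of the mapping described, with the number of lit LEDs and their brightness tracking how much metal the detector senses (all names and thresholds are mine, not the artist’s):

```python
import random

NUM_LEDS = 1900  # the sculpture is punctuated with 1,900 LEDs

def led_frame(metal_signal, max_signal=1.0):
    """Map a normalised metal-detector reading to per-LED brightness."""
    level = max(0.0, min(metal_signal / max_signal, 1.0))
    lit = random.sample(range(NUM_LEDS), int(level * NUM_LEDS))
    frame = [0.0] * NUM_LEDS
    for i in lit:
        frame[i] = level          # more metal: more LEDs, brighter LEDs
    return frame

# A visitor carrying keys, a phone and a camera might register ~0.4:
print(sum(1 for b in led_frame(0.4) if b), "of", NUM_LEDS, "LEDs lit")
```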

With interactive media/art, it seems that once the interactivity takes place, all one can do is make things spin or, in most cases, turn the LEDs on. The current fascination with turning lights on and off via various controlling devices (in this case metal detection) suggests we are about to be driven crazy by all sorts of public “entertainment” art. Sort of like the cheap fairground stalls of Blackpool or Coney Island… it all seems a bit cheap.

Surely the question is how we can find a more meaningful experience in these public interactions… turning the lights on and off ain’t the answer, even if it does look pretty.

It seems the only way to justify the recent run of works of this type is in terms of their “playfulness”… err, whatever.

Playful is the default mode of interactive media, i.e. when the work has no context or meaning, or the artist cannot place significant meaning around the work… saying it is “playful” seems to be enough.

Maybe we should all turn the lights on……..or off…..

Software to “hear” sounds.

June 25th, 2008

CCTV cameras which use artificial intelligence software are being developed to “hear” sounds like windows smashing, researchers have revealed.

University of Portsmouth scientists are working on adapting the software so it can also react to crowd noise.

Crimes would be captured on camera faster and response times improved.

The news comes after the BBC learned councils in southern England routinely used powers brought in to fight terrorism and crime to spy on people.

Figures obtained by BBC South showed the Regulation of Investigatory Powers Act (Ripa) was used more than 750 times by the councils in 2007/08. The new three-year surveillance study is being funded by the Engineering and Physical Sciences Research Council (EPSRC).

http://news.bbc.co.uk/1/hi/england/hampshire/7471140.stm

http://www.port.ac.uk/aboutus/newsandevents/frontpagenews/title,79126,en.html

The research team is now working on using the same software to ‘learn’ sounds and react to them by swinging the CCTV camera towards them at the same speed a person would turn their head if they heard someone scream, which is about 300 milliseconds.
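A hedged sketch of what that pan behaviour might look like in code, with a made-up camera interface (the 300 ms figure is from the article; everything else is assumed):

```python
import time

REACTION_TIME_S = 0.3  # ~300 ms, the human head-turn time quoted above

def pan_to_sound(camera, current_bearing_deg, sound_bearing_deg):
    """Slew a hypothetical PTZ camera to a sound's bearing in ~300 ms."""
    # Shortest signed angle from the current bearing to the sound.
    delta = (sound_bearing_deg - current_bearing_deg + 180) % 360 - 180
    speed = abs(delta) / REACTION_TIME_S   # degrees per second
    camera.pan(direction=1 if delta > 0 else -1, speed_deg_s=speed)
    time.sleep(REACTION_TIME_S)
    camera.stop()
```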

Dr David Brown, director of the Institute, said: “The visual-recognition software will be able to identify visual patterns but for the next stage we want to get the camera to pivot if it hears a certain type of sound. So, if in a car park someone smashes a window, the camera would turn to look at them and the camera operator would be alerted.

“The longer artificial intelligence is in the software the more it learns. Later versions will get cleverer as time goes on, perhaps eventually being able to identify specific words being said or violent sounds. We are only listening for specific words associated with violence, not full conversations.”

The software behind this research uses fuzzy logic to identify certain visual cues and sounds. Dr Brown said: “In identifying sound we are looking for the shapes of sound. In the same way, if you close your eyes, you can trace the shape of a physical object and ‘read’ its profile with your hand, we are developing shapes of sound so the software recognises them.

“The software will use an artificial intelligence template for the waveform of sound shapes and, if the shape isn’t an exact fit, use fuzzy logic to determine what the sound is. For example, different types of glass will all have slightly different waveforms of sound when they smash, but they will have the same generic shape, which can be read using fuzzy logic.

“It’s a very fast, real-time method of identifying sounds.”
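The “shapes of sound” idea can be illustrated with a toy template matcher: store a generic shape per sound class and score incoming sounds with a fuzzy membership instead of demanding an exact fit. This is a sketch of the general technique, not the Portsmouth team’s code; the shapes are invented:

```python
import numpy as np

# Generic "shapes of sound" per class (purely illustrative vectors).
templates = {
    "glass_smash": np.array([0.1, 0.9, 0.7, 0.3, 0.1]),
    "scream":      np.array([0.2, 0.4, 0.9, 0.8, 0.5]),
}

def fuzzy_match(shape):
    """Return (label, membership) for the best-fitting template."""
    scores = {}
    for label, tpl in templates.items():
        distance = np.linalg.norm(shape - tpl) / np.linalg.norm(tpl)
        scores[label] = max(0.0, 1.0 - distance)   # 1 = exact fit, 0 = no fit
    best = max(scores, key=scores.get)
    return best, scores[best]

# Different panes smash with slightly different waveforms, but all
# should score high against the generic shape:
label, score = fuzzy_match(np.array([0.15, 0.85, 0.65, 0.35, 0.1]))
print(label, round(score, 2))   # glass_smash, ~0.92
```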

The Cyborg Self And The Networked City. William J. Mitchell.

June 22nd, 2008

Image: Stanza, Spiral Jetty, 2007. Live data visualisation in the grounds of Goldsmiths College, London.

Copyright image: Stanza artwork.

Me++: The Cyborg Self and the Networked City. William J. Mitchell. The MIT Press, 2003.

Network connections and the interrelatedness of things that flow through our bodies, our buildings:

“Code is mobile. Code is everywhere. And code – for both people and machines who interact with them – is the law.”

Citysense passively “senses” the most popular places based on actual real-time activity and displays a live heat map.

June 18th, 2008

Stanza Artwork. Shanghai 2004.

Here is the sales pitch from Citysense, a system for gathering and representing real-time city data from San Francisco. A nice idea for a company.

Quoted:
Citysense is an innovative mobile application for local nightlife discovery and social navigation, answering the question, “Where is everybody?”

Citysense shows the overall activity level of the city, top activity hotspots, and places with unexpectedly high activity, all in real-time. Then it links to Yelp and Google to show what venues are operating at those locations. Citysense is a free demonstration of the Macrosense platform that everyone can enjoy.

Instead, it evolves searching to sensing. Citysense passively “senses” the most popular places based on actual real-time activity and displays a live heat map.
Location data is everywhere. Cars, buses, taxis, mobile phones, cameras, and personal navigation devices all beacon their locations thanks to network-connected positioning technologies such as GPS, WiFi and cell tower triangulation. Millions of consumers and businesses use location-enabled devices for finding nearby services, locating friends & family, navigating, asset- and pet-tracking, dispatching, sports, games, and hobbies.
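The heat map itself needs nothing exotic: bin anonymous (lat, lon) pings into grid cells and count them. A minimal sketch under my own assumptions about cell size and input format (not Sense Networks’ actual pipeline):

```python
from collections import Counter

def heat_map(pings, cell_deg=0.005):
    """Count pings per grid cell; the densest cells are the hotspots."""
    cells = Counter()
    for lat, lon in pings:
        cells[(round(lat / cell_deg), round(lon / cell_deg))] += 1
    return cells

pings = [(37.7749, -122.4194), (37.7750, -122.4190), (37.8044, -122.2711)]
print(heat_map(pings).most_common(1))   # the busiest cell right now
```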

These forces have lowered the cost of technology, ignited interest in location-enabled services, and resulted in the generation of significant amounts of historical and real-time streaming location information. Sense Networks was founded on the idea that these datasets could provide remarkable real-time insight into aggregate human activity trends.

Macrosense employs patent-pending technology to learn from these large-scale patterns of movement, and to identify distinct classes of behaviors in specific contexts, called “tribes.”

Once it’s known which tribes are where, by sampling the distribution of tribes at any given place and time, it’s possible to understand what it means when a user is there at that place and time.

For example: rock clubs and hip-hop clubs each retain distinct tribal distributions. When a user is out at night, Citysense learns their preferred tribe distribution from time spent in these places. When that user visits another city, they see hotspots recommended on the basis of this distribution and combined with overall activity information.

Users who go to rock clubs see rock club hotspots, users who frequent hip-hop clubs see hip-hop hotspots, and those who go to both see both. The question “where is everybody like me right now?” is thus answered for these users – even in a city they’ve never visited before.
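One plausible reading of that mechanism: represent each user and each venue as a distribution over tribes, then rank venues by similarity to the user’s distribution. The sketch below is my guess at the idea, with invented tribe names, not the patent-pending Macrosense method:

```python
import math

def cosine(a, b):
    """Cosine similarity between two tribe distributions (dicts)."""
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

user = {"rock": 0.7, "hiphop": 0.2, "lounge": 0.1}  # learned from nights out
venues = {
    "Venue A": {"rock": 0.8, "lounge": 0.2},
    "Venue B": {"hiphop": 0.9, "lounge": 0.1},
}
# Even in a city the user has never visited, rank its hotspots:
for name in sorted(venues, key=lambda v: -cosine(user, venues[v])):
    print(name, round(cosine(user, venues[name]), 2))
```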

Citysense is an application that operates on the Sense Networks Macrosense platform, which analyzes massive amounts of aggregate, anonymous location data in real-time. Macrosense is already being used by business people for things like selecting store locations and understanding retail demand. But we asked ourselves: with all this real-time data, what else could we do for a city? Nightlife enhancement was the obvious answer. This release is just a test, and we’re interested in your feedback on how to make the application better. You’ll find a feedback button in Citysense.

Principles…

People should own their own data
People should have full control over the use of any data that they generate. All data collection should be “opt-in,” and users should be able to easily remove themselves and their data from the system without questions or hassle. The system doesn’t “remember” a user for later, but completely deletes data at the user’s discretion.

People should receive a meaningful benefit in exchange for sharing data
Meaningful benefits include compelling applications to help manage life better, or personalized services based on anonymous learning from “users like me.” People should be able to enjoy the benefits of these services simply in exchange for their data.

We’re looking for additional common good uses of aggregate, anonymous location data. If you would like to submit a project for consideration, please contact us at ….
http://www.citysense.com/home.php

All of the above is quoted from their website.

From my Sensity projects.
Citysense… sounds like Sensity backwards…. Various types of data can be re-imagined within the context of city space and the environment. This includes pollution data recorded via sensors in the street, used to create audio files expressing the pain and suffering of the air as it pollutes. Weather and forecast data, acquired via weather station equipment, can be used to create ambient soundscapes and morphing visualisations as the wind shifts direction or the rain increases. Noise monitor levels and noise maps create a symphony of true urban sounds that can be used to make sound-reactive sculptures. The patterns we make, the forces we weave, are all being networked into retrievable data structures that can be re-imagined and sourced for information. These patterns all disclose new ways of seeing the world. The value of information will be a new currency as power changes. The central issue that will develop will be privilege and access to these data sources….
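In that spirit, a minimal sonification sketch: map a stream of pollution readings onto pitch and loudness so the state of the air becomes audible. The ranges and units are illustrative, not from the Sensity code:

```python
def sonify(readings, lo=0.0, hi=500.0):
    """Map sensor values to (frequency Hz, amplitude 0..1) pairs."""
    notes = []
    for value in readings:
        level = max(0.0, min((value - lo) / (hi - lo), 1.0))
        freq = 110 + level * (880 - 110)   # dirtier air, higher pitch
        amp = 0.2 + 0.8 * level            # ...and louder
        notes.append((freq, amp))
    return notes

# e.g. NO2 readings from a street sensor, in arbitrary units:
print(sonify([40.0, 180.0, 420.0]))
```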
I like their pitch about people owning their own data; couldn’t agree more. In fact, all royalties should be shared. It’s not just about privacy, it’s about ownership. Once you enter the grid, your body is externally giving away data and information. Companies are now rushing to harvest this information (information services), making new products for mobile devices. I think we are going to see a lot of this.

Motes technical. Temperature responses for MTS310 series on MICA2.

June 3rd, 2008

Stanza: Sensity… real-time city mash-ups, merging my online XML feeds from each city and making online visualisations.

Question:

The temperature sensors in MoteView on four of my motes show -273.15. The rest of the board, i.e. light and sound, is working. How do I get the temperature working, or reset it?

Response:

Did you “Reboot” into the new image (slot) that you OTAP’d? This is required in order for the nodes to start executing the new image. You may choose any of the available slots where you want to store the new image.

Please refer to the MoteConfig manual for details on OTAP.

http://www.xbow.com/Support/Support_pdf_files/MoteConfig_Users_Manual.pd

——————————————————


Response:

—————————————————————

It looks like you are able to query the nodes (i.e. they are running OTAPImage).

Which Motes are you trying to reprogram with the new code?

Which slot have you selected to OTAP?

Please check and make sure that you have at least 2.7V battery voltage on these nodes.

Response:

—————————————————————

Unless you had previously enabled the nodes to be OTAP’d, you won’t be able to OTAP them with a new application.

What I would suggest is to bring the nodes in, attach them to the MIB board, and then program them using MoteConfig’s Local Program tab. You need to check the OTAP enable box if you wish to OTAP them in the future.

—————————————————————

Response:

Maybe I wasn’t clear earlier. You can use both CA and CB boards in the same network. The main difference is that they use different power control lines for the temperature sensor. To get accurate temperature readings, you should program the Mote attached to the CA sensor board (without jumper wire) with “XMTS300CA__.exe” and program the Mote attached to the CB sensor board (with jumper wire) with “XMTS300CB__.exe”.

Response:

—————————————————————

It is quite likely that you have an MTS310CB board (look for a jumper wire on the bottom side of the sensor board).

The MTS310CA uses INT2 for temperature power control, whereas the CB version uses PW0. It sounds like you are using CA code on CB hardware (or vice versa), and hence the temperature sensor never gets turned on and returns 0.

If you have a CB board, then you need to use the CB version of the app.
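As an aside, the -273.15 reading is 0 K expressed in Celsius: if the sensor’s power line is never switched on, the ADC reads 0 and the thermistor conversion bottoms out at absolute zero. A rough sketch of that conversion (the coefficients follow the form given in Crossbow’s MTS sensor manual, but treat them as illustrative and check your board revision):

```python
import math

# Thermistor conversion of the MTS310 kind: raw ADC counts -> degrees C.
# r1, adc_fs and the Steinhart-Hart coefficients are assumptions here.
def mts310_temp_c(adc, adc_fs=1023, r1=10_000,
                  a=0.001307050, b=0.000214381, c=0.000000093):
    if adc <= 0:
        return -273.15                    # unpowered sensor returns raw 0
    r_thr = r1 * (adc_fs - adc) / adc     # thermistor resistance in ohms
    ln_r = math.log(r_thr)
    kelvin = 1.0 / (a + b * ln_r + c * ln_r ** 3)
    return kelvin - 273.15

print(mts310_temp_c(0))    # -273.15: the symptom described above
print(mts310_temp_c(500))  # ~24 C, a plausible room reading
```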

Question:

—————————————————————

We want the batteries to last longer. How do we do this?

How long should batteries last without a change?

If they are in low power mode, how long will they last? I know this is difficult to answer, but how do I get them to last longer?

Also, do you have a solar panel one can plug in to the motes for power?

Response:

—————————————————————

The high power (HP) versions of the apps don’t duty cycle the radio and hence would deplete the battery in a few days. The low power (LP) versions draw an average of 330 uA on the MICA2 platform and, when used with alkaline AA batteries, can easily deliver over 6 months of battery life.

In order to make the batteries in the kits last a long time, you need to use them in XMesh-LP mode.
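That six-month figure is easy to sanity-check with rough arithmetic (the AA capacity below is my assumption; actual capacity varies with load and temperature):

```python
# Rough battery-life estimate for a MICA2 running XMesh-LP,
# using the 330 uA average draw quoted above.
avg_current_ma = 0.330        # 330 uA average in low-power mode
capacity_mah = 2000           # conservative alkaline AA estimate (assumed)

hours = capacity_mah / avg_current_ma
print(round(hours), "hours, or about", round(hours / 24 / 30), "months")
# -> roughly 6061 hours, about 8 months: comfortably over the 6 quoted
```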

We do have a solar panel implementation in our next-generation eKo Pro series products, which can deliver a battery life of over 5 years.

http://www.xbow.com/Eko/index.aspx