Showing posts with label data. Show all posts

Tuesday, June 04, 2013

Figuring Data (Datascape Catalog Essay)

This essay was commissioned for the exhibition Datascape, at the Cube Gallery, QUT in April 2013. I should mention that since writing it I've discovered that Jer Thorp was way ahead of me on the "new oil" thing.

“Data is the new oil” - Ann Winblad, Hummer Winblad Venture Partners (source)

In the swirling chaos of twenty-first century capitalism, everybody wants to know what’s next. “Data is the new oil” is a pithy little announcement. It reminds us how we got here, powered by the long energetic boom of fossil fuels, now entering its closing stages. It announces a successor, a new wealth (and just in time). But in drawing the analogy, it also constructs data in a certain way: as a sort of amorphous but precious stuff, a resource for exploitation, and a sort of promising abundance. Similarly, The Economist trumpeted the “Data Deluge” on their February 2010 cover: a businessman catches falling data in an upside-down umbrella, funnelling it to water a growing flower whose leaves are hundred dollar bills.

We need not (and should not) accept this analogy; but it demonstrates how data is figured, or constructed, in our culture. Our everyday life and culture are traced, tangled and enabled by digital flows. We produce and consume data as never before. But what exactly is this data? What can it do, and what can we do with it? Who owns or controls it? How can we understand, appreciate, or even sense it? The construction of data as a cultural actor is vital because data itself is so abstract, so hard to pin down. We ought not leave it to the captains of industry, and their upside-down umbrellas. In Datascape we see artists working with data, applying and diverting it for their own ends, as well as offering their own figurations of its potentials and limits. In a culture increasingly built on data, these works provide moments of cultural introspection, reflections on this abstract stuff that is our new social medium.

Google, Facebook, Twitter and the rest make us - their users - into data. This makes us anxious about privacy and surveillance, but perhaps a more interesting question is what it’s like to be data. If we are all data subjects now, then what is data subjectivity? Jordan Lane’s Digital Native Archive imagines a new bureaucratic archive for the data subject, and immediately comes to the question of mortality. If we are data, and data can be faithfully preserved, are we now immortal? Or are we, instead, dead forever, entombed in a rationalised hierarchy of metadata, request protocols and archival record formats? Christopher Baker’s My Map (below) shows us what it might be to take charge of a personal archive, with a tool that reveals the patterns and relationships in email correspondence. This self-portrait suggests that one of the challenges of data subjectivity is simply knowing oneself: the scale of our personal data exceeds our grasp.

In two of the most prominent data art works from the mid 2000s, we mine these personal archives en masse. Golan Levin’s The Dumpster and Sep Kamvar and Jonathan Harris’ We Feel Fine scour the internet for “feelings” that are compiled into datasets, and in turn staged as dynamic visualisations. In turning our digital selves into swarming dots and bouncing balls, the artists animate us as members of a teeming throng. Data here is in part a new form of social realism, a way to represent the complex texture of life in the crowd; but these works also ask us to reflect on the limits of data-subjectivity. Can the intensity of our inner lives really be represented in cool, abstract data? Are we all so much alike? Aaron Koblin’s Sheep Market answers both yes and no; for we can see here both the comical diversity of the crowd (and its sheep avatars), and the uniformity that digital systems encourage.

The pathos of this contrast, between the coolness of the digital and the warm, messy intensity of humankind, emerges again in Luke du Bois’ Hard Data, where the tolls of war unfold as stark lists and map references. Du Bois’ soundtrack, generated from the same source data, acts as an emotional mediator, trying to return some of the tragic importance that the data fails to convey. Du Bois’ work pivots between the data-subject and what we might call the data-world. For if the world, too, is now data, then what might that feel like? How do we approach such a world?

In many works here the weather - a complex (and increasingly uncooperative) material flux - is a sort of proxy for the data-world: a field that is both easy to measure, and difficult to grasp. In Miebach’s Weather Scores, Viegas and Wattenberg’s Wind Map (above), and my own Measuring Cup, weather data is a source of aesthetic richness, as well as a pointer to the world beyond, the world that data traces. The weather - so much part of our everyday sensations - is abstracted here into numbers and symbols, only to be remade in new sensual forms. What if we could see the wind across an entire continent? Or hold a hundred years of temperature? Or hear the tides as music?

Here we get a glimpse of an alternative figuration of data itself. Rather than some kind of precious (but immaterial) stuff, or fuel for market speculation, data here is a relationship, a link between one part of the world with another, and a trace that can be endlessly reshaped. Of course, that trace is imperfect; a mediated pointer, not a pure reproduction. So Viegas and Wattenberg issue a disclaimer for their Wind Map: this is just an “art project”, they say; we "can't make any guarantees about the correctness of the data or our software.” Yet that connection remains; and art here plays the role that it always has. It transforms our understanding of the world, by representing it anew.


Sunday, June 06, 2010

Measuring Cup

Measuring Cup is a little dataform project I've been working on this year. It's currently showing in Inside Out, an exhibition of rapid-prototyped miniatures at Object gallery, Sydney.

This form presents 150 years of Sydney temperature data in a little cup-shaped object about 6cm high. The data comes from the UK Met Office's HadCRUT subset, released earlier this year; for Sydney it contains monthly average temperatures back to 1859.


The structure of the form is pretty straightforward. Each horizontal layer of the form is a single year of data; these layers are stacked chronologically bottom to top - so 1859 is at the base, 2009 at the lip. The profile of each layer is basically a radial line graph of the monthly data for that year. Months are ordered clockwise around a full circle, and the data controls the radius of the form at each month. The result is a sort of squashed ovoid, with a flat spot where winter is (July, here in the South).
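For anyone who wants to try the layering themselves, here's a minimal sketch of the idea. The actual form was generated in Processing; this is an illustrative Python version, and the function name, base radius and temperature scaling are all made-up values, not the ones used in the real Cup.

```python
import math

def layer_vertices(monthly_temps, z, base_radius=10.0, scale=0.5):
    """One horizontal layer of the cup: a radial line graph of 12 monthly
    temperatures. Angle encodes the month, radius encodes temperature."""
    verts = []
    for month, temp in enumerate(monthly_temps):
        # months ordered around a full circle (clockwise is just a sign flip)
        angle = 2 * math.pi * month / len(monthly_temps)
        r = base_radius + scale * temp
        verts.append((r * math.cos(angle), r * math.sin(angle), z))
    return verts

# stack one layer per year, 1859 at the base, 2009 at the lip:
# layers = [layer_vertices(temps, z=year - 1859) for year, temps in data]
```

The flat spot appears naturally: winter months have low temperatures, so the radius shrinks on one side of the ring.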


The data is smoothed using a moving average - each data point is the average of the past five years data for that month. I did this mainly for aesthetic reasons, because the raw year-to-year variations made the form angular and jittery. While I was reluctant to do anything to the raw values, moving average smoothing is often applied to this sort of data (though as always the devil is in the detail).
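The smoothing itself is simple; here's a rough Python version of the trailing average, applied to one month's values across the years (the function name is mine, and the handling of the first few years, where less history exists, is one choice among several):

```python
def smooth_month(series, window=5):
    """Trailing moving average: each point is the mean of up to the last
    `window` years' values for this month (fewer at the start of the
    record, where there is less history)."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        out.append(sum(series[lo:i + 1]) / (i + 1 - lo))
    return out

# e.g. smooth every January across 150 years:
# jan_smooth = smooth_month([year_data[0] for year_data in years])
```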


The punchline really only works when you hold it in your hand. The cup has a lip - like any good cup, it expands slightly towards the rim. It fits nicely in the hand. But this lip is, of course, the product of the warming trend of recent decades. So there's a moment of haptic tension there, between ergonomic (human centred) pleasure and the evidence of how our human-centredness is playing out for the planet as a whole.


The form was generated using Processing, exported to STL via superCAD, then cleaned up in Meshlab. The render above was done in Blender - it shows the shallow tick marks on the inside surface that mark out 25-year intervals. Overall the process was pretty similar to that for the Weather Bracelet. One interesting difference in this case is that consistently formatted global data is readily available, so it should be relatively easy to make a configurator that will let you print a Cup from your local data.


Wednesday, May 19, 2010

This is Data? Arguing with Data Baby

These IBM commercials are gorgeous, lavish examples of modern motion graphics from Motion Theory. Like some of the agency's earlier work, and a handful of other examples noted here, these ads show how code-literate design (could we call it the P factor?) is transforming this field. For all those reasons, I love this work; but it also really bothers me. I'll try to explain.


The opening line of this voiceover says it all, really. This is data. Making that call - defining what data is - is a powerful cultural gesture right now, because as I've argued before data as an idea or a figure is both highly charged and strangely abstract. It makes a lot of sense for a corporation like IBM to stake a claim on data; this stuff is somehow both blessing and curse, precious and ubiquitous, immaterial and material. IBM promises here to help with the wrangling, but also, most powerfully, to show us what data is.

So, what is data here? In these commercials data is first and foremost material. It is a physical stuff. In Data Baby it wraps a little infant like some kind of luminescent placenta, drifting away into the air, thrown off in shimmering waves as the child breathes. In Data Energy it trails like a cloud behind a tram, and spins with the blades of a wind turbine. A lot of the (beautiful) animation work here has been devoted to simulating behaviour, making this colorful, abstract stuff seem to be tightly embedded in the world with us. What that means is both coupling it tightly to real objects, and supplying it with immanent dynamics - making it drift, disperse or twirl.


The second interesting property of data here - related to the first - is that it just exists. Look again at Data Baby, and note that there is no visible sign of this data being gathered (or rather, made). No oxygen saturation meter, no wires, no tubes, no electrodes. Not a transducer in sight. Not until the closing wide shot do we even see a computer. (This is fascinating in itself; IBM (or their ad agency) gets it that the computer is no longer the right image, or metaphor, for "information technology". Neither is the network; now it's immanent, abundant data.) In other words data here is not gathered, measured, stored or transmitted - or not that we can see. It just is, and it seems to be inherent in the objects it refers to; Data Baby is "generating" data as easily as breathing.

Completing this visual data-portrait are some other related themes: data is multiplicitous and plentiful, it's diverse (many colours and shapes) but ultimately harmonious and beautiful - in Data Transportation it looks like an urban-scale 3d Kandinsky painting.



Several things bother me about this portrayal. The first is the same as the reason I love it: it's powerfully, seductively beautiful, and this amplifies all my other reservations. The vision of data as material, in the world, is also incredibly seductive; my concern is that we get such pleasure from seeing these rich dynamics play out - that the motes wafting from Data Baby's skin seem so right - that we overlook the gaps in the narrative. This vision of material data is also frustrating because it has all the ingredients of a far more interesting idea: data is material, or at least it depends on material substrates, but the relationship between data and matter is just that, a relationship, not an identity. Data depends on stuff; always in it, and moving transmaterially through it, but it is precisely not stuff in itself.

You could say that I'm quibbling about metaphors here, and you'd be right, but metaphors are crucially important because they shape what we think data is, and what it does. Related to data as stuff is this second attribute; data that just is, in the same way that matter is neither created nor destroyed, but just exists. This is crucially, maybe dangerously wrong. Data does not just happen; it is created in specific and deliberate ways. It is generated by sensors, not babies; and those sensors are designed to measure specific parameters for specific reasons, at certain rates, with certain resolutions. Or more correctly: it is gathered by people, for specific reasons, with a certain view of the world in mind, a certain concept of what the problem or the subject is. The people use the sensors, to gather the data, to measure a certain chosen aspect of the world.

If we come to accept that data just is, it's too easy to forget that it reflects a specific set of contexts, contingencies and choices, and that crucially, these could be (and maybe should be) different. Accepting data shaped by someone else's choices is a tacit acceptance of their view of the world, their notion of what is interesting or important or valid. Data is not inherent or intrinsic in anything: it is constructed, and if we are going to work intelligently with data we must remember that it can always be constructed some other way.

Collapsing the real, complex, human / social / technological processes around data into a cloud of wafting particles is a brilliant piece of visual rhetoric; it's a powerful and beautiful story, but it's full of holes. If IBM is right - and I think they probably are - about the dawning age of data everywhere, then we need more than a sort of corporate-sponsored data mythology. We need real, broad-based, practical and critical data skills and literacies, an understanding of how to make data and do things with it.


Saturday, December 12, 2009

Data Walks - a #climatedata proposal

In response to the UK Met Office's recent data release and Manuel Lima's call for visualisations, there's been a flurry of #climatedata activity in the last couple of days, including some revealing visualisations. Though I'm looking forward to playing with the data myself, this isn't a post about visualisation. It's a simpler proposal for a way to make the data tangible.


Global warming is ultimately a question about change in a single measurement - temperature - over time. One way or another, it can be boiled down to a line graph. How best to make that line tangible? Visualisation is great, but how else could we feel those changes, especially over time? One way would be to walk the data. We could make a kind of giant line graph, in the form of a path or road, then walk from 1850 to 2009. According to the Met Office's graph - remixed above with a picture of my local landscape - this would be a fairly undulating journey, but the last half especially would be a distinct and noticeable climb. Building this path at a walkable scale seems like hard work though. It would be much easier to use the paths we already have. So, here's a recipe for a #climatedata walk:

  1. Make a graph. There are all kinds of options here. The Met Office graph shows global difference from a long-term (1961-1990) average. You could for example use local data only, or use raw average temperatures rather than difference from average. You would also need to select a year range from the data - want to walk the whole century or just post-WW2? All the data choices should be made clear to any walkers.
  2. Fit to landscape. This is the tricky part. The idea would be to find a walkable route with changes in elevation that fit your line graph well. Finding a perfect fit will be very difficult, but finding an OK fit should be possible. This will involve some scaling questions: how long will the walk be, and how much elevation will it cover? Accessibility, ergonomics, experience design, affect - lots of juicy design decisions here. One crude but easy fitting procedure would be to begin with a route, find its elevation profile, then scale the graph to fit the start and end points of the graph to the route start and end, then note the points where the path and the graph intersect. Maybe some GIS / maps people could help with software tools here for route finding and fitting?
  3. Tick marks. Walk the route and mark it out in order to make the whole thing legible. Mark out years or decades, as well as temperature variation (elevation). One option for paths with an imperfect fit, would be to notate the difference between the path and the graph at certain points, as well as points where the path and the graph intersect.
  4. Walk. Again you can imagine many ways to do this, ranging from big organised public walks, to smaller private ones. Of course walking often leads to talking - and in a different way to, say, looking at a graph.
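The crude fitting procedure in step 2 can be sketched in a few lines. This is an illustrative Python version only (the function name and the linear-scaling choice are mine): it rescales the temperature series so that its first and last values coincide with the route's start and end elevations, leaving the intersections and mismatch notation for step 3.

```python
def fit_graph_to_route(graph, route_elev):
    """Crude fit: linearly scale the data series so its endpoints match
    the route's start and end elevations."""
    g0, g1 = graph[0], graph[-1]
    e0, e1 = route_elev[0], route_elev[-1]
    if g1 == g0:  # flat series: fall back to a straight ramp between endpoints
        return [e0 + (e1 - e0) * i / (len(graph) - 1) for i in range(len(graph))]
    scale = (e1 - e0) / (g1 - g0)
    return [e0 + (g - g0) * scale for g in graph]
```

Comparing the scaled series against the route's full elevation profile then shows where the path over- or under-shoots the data.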
I should emphasise that I haven't even tried this, yet, but I hope to - Canberrans, if you're interested in helping organise a walk here, let me know. Wherever you are, if you do try it, let me know - also feel free to adapt / refine / repurpose the procedure. Could be fun, even informative - at the very least, you'll walk up a hill.


Wednesday, October 07, 2009

Weather Bracelet - 3D Printed Data-Jewelry

Given my rantings about digital materiality and transduction, fabrication is a fairly obvious topic of interest. I posted earlier about an experiment with laser-cut generative forms and Ponoko - more recently I've been playing with 3d-printing via Shapeways, as well as trying out data-driven (or "transduced") forms. This post covers technical documentation as well as some more abstract reflections on this project - creating a wearable data-object, based on 365 days of local (Canberra) weather data.


Shapeways has good documentation on how to generate models using 3d-modelling software. Here I'll focus more on creating models using code-based approaches, and Processing specifically. The first challenge is simply building a 3d mesh. I began with this code from Marius Watz, which introduces a useful process: first, we create a set of 3d points which define the form; then we draw those points using beginShape() and vertex().

The radial form of the Weather Bracelet model shows how this works. The form is built from house-shaped slices, where the shape of each slice is based on temperature data from a single day. The width is static, the height of the peak is mapped to the daily maximum, and the height of the shoulder (or "eave") is mapped to the daily minimum. To create the radial form, we simply make one slice per day of data, rotating each slice around a central point. As the diagram below shows, this gets us a ring of slices, but not a 3d-printable form. As in Watz's sketch, I store each of the vertices in the mesh in an array - in this case I use an array of PVectors, since each PVector conveniently stores x, y and z coordinates. The array has 365 rows (one per day, for each slice) and 5 columns (one for each point in the slice). To make a 3d surface, we just work our way through the array, using beginShape(QUADS) to draw rectangular faces between the corresponding points on each of the slices.
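If you want to play with the slice-and-quad idea, here's a rough translation into Python. The real code is a Processing sketch; the profile coordinates, widths and radii here are invented, and the exact orientation of the house profile is a guess, so treat this as a geometric sketch rather than the bracelet's actual code.

```python
import math

def slice_points(t_max, t_min, angle, width=1.0, radius=10.0):
    """Five points of one 'house' profile, placed on a ring at the given
    angle. Peak height maps to the daily max, shoulder to the daily min."""
    profile = [(-width / 2, 0.0), (width / 2, 0.0), (width / 2, t_min),
               (0.0, t_max), (-width / 2, t_min)]
    ca, sa = math.cos(angle), math.sin(angle)
    # x offsets the point radially; y becomes the vertical (z) coordinate
    return [((radius + x) * ca, (radius + x) * sa, y) for x, y in profile]

def build_quads(days):
    """days: list of (t_max, t_min). Returns quad faces joining the
    corresponding edges of adjacent slices, wrapping to close the ring."""
    slices = [slice_points(tmax, tmin, 2 * math.pi * i / len(days))
              for i, (tmax, tmin) in enumerate(days)]
    quads = []
    for i in range(len(slices)):
        a, b = slices[i], slices[(i + 1) % len(slices)]
        for j in range(5):
            quads.append((a[j], a[(j + 1) % 5], b[(j + 1) % 5], b[j]))
    return quads
```

With 365 slices of 5 points each, this yields 365 × 5 quads - the ring of faces that beginShape(QUADS) draws in the Processing version.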


To save the geometry, I used Guillaume LaBelle's wonderful SuperCad library to write an .obj file. I then opened this in MeshLab, another excellent open source tool for mesh cleaning and analysis. Because of the way we draw the mesh, it contains lots of duplicate vertex information; in MeshLab we can easily remove duplicate vertices and cut the file size by 50%. MeshLab is also great for showing things like problems with normals - faces that are oriented the wrong way. When generating a mesh with Processing, the order in which vertices are drawn determines which way the face is ... er, facing... according to the right hand rule. Curl the fingers of your right hand, and stick up your thumb: if you order the vertices in the direction that your fingers are curling, the face normal will follow the direction of your thumb. Although Processing has a normal() function that is supposed to set the face normal, it doesn't seem to work with exported geometry. Anyhow, the right hand rule works, though it is guaranteed to make you look like a fool as you contort your arm to debug your mesh-building code.
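If arm contortion fails, the right hand rule is just a cross product of two edge vectors; here's a quick Python check (illustrative only - the function name is mine):

```python
def face_normal(p0, p1, p2):
    """Normal of a face from its first three vertices. Counter-clockwise
    vertex order (seen from outside) gives a normal pointing at you."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
```

Flipping the winding order flips the sign of every component, which is exactly the "face pointing the wrong way" problem MeshLab shows up.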

The next step in this process was integrating rainfall into the form. I experimented with presenting rainfall day-by-day, but the results were difficult to read; I eventually decided to use negative spaces - holes - to present rainfall aggregated into weeks. Because Shapeways charges by printed volume, this had the added attraction of making the model cheaper to print! The process here was to first generate the holes in Processing as cylindrical forms. Unlike the base mesh, each data point (cylinder) is a separate, simple form: this meant I could take a simpler approach to drawing the geometry. I wrote a function that would just generate a single cylinder, then using rotate() and scale() transformations made instances of that cylinder at the appropriate spots. Because I wanted the volume of each cylinder to map to rainfall, the radius of each cylinder is proportional to the square root of the aggregated weekly rainfall. As you can see in the grab below, the base mesh and the cylinders are drawn separately, but overlayed; they were also saved out as separate .obj files. The final step in the process was to bring both cleaned-up .obj files into Blender (more open source goodness) and run a Boolean operation to literally subtract the cylinders from the mesh. This took a while - Blender was completely unresponsive for a good few minutes - but worked flawlessly.
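The aggregation and square-root mapping are trivial, but for completeness, a Python sketch (the constant k is arbitrary; the real model's scaling was tuned by eye to this dataset):

```python
import math

def weekly_totals(daily_rain):
    """Aggregate daily rainfall into weekly sums."""
    return [sum(daily_rain[i:i + 7]) for i in range(0, len(daily_rain), 7)]

def hole_radius(weekly_rain, k=0.5):
    """Cylinder radius for one week's hole. At a fixed cylinder depth,
    volume is proportional to cross-sectional area, so radius goes with
    the square root of the rainfall."""
    return k * math.sqrt(weekly_rain)
```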





Finally, after checking the dimensions, exporting an STL file from MeshLab, and uploading to Shapeways, the waiting; then, the printed form. I ordered two prints, one in Shapeways' White, Strong and Flexible material, and the other in Transparent Detail. You can clearly see the difference between the materials in these photos. The very small holes tested the printing process in both materials; in the SWF print the smallest holes are completely closed; in the TD material they are open, but sometimes gummed up with residue from the printing process (which comes out readily enough). Overall I think the TD print is much more successful - I like the detail and the translucency of the material, as well as the cross-hatched "grain" that the printing process generates.






So, a year of weather data, on your wrist - as a proof of concept the object works, but as a wearable and as a data-form it needs some refinement. As a bracelet it's just functional - the sizing is about right, but the sharp corners of the profile are scratchy against the skin. As a data-form, it could do with some simple reference points to make the data more readable - I'm thinking of small tick-marks on the inner edge to indicate months, and perhaps some embossed text indicating the year and location. More post-processing work in Blender, I think.

Another line of development is to do versions with other datasets - and hey, if you'd like one for your city, get in touch. But that also raises some tricky questions of scaling and comparability. The data scaling in this form has been adjusted for this dataset; with another year's data, the same scaling might break the form - rain holes might eat into the temperature peaks, or overlap each other, for example. A single one-size-fits-all scaling would allow comparisons between datasets, but might make for less satisfying individual objects - and, finding that scaling requires more research.


What has been most enjoyable with this project, though, is the immediate reaction the object evokes in people. The significance and scale of the data it embodies seem to give it a sense of value - even preciousness - that has nothing to do with the cost of its production or the human effort involved. The bracelet makes weather data tangible, but also invites an intimate, tactile familiarity. People interpret the form with their fingers, recalling as they do the wet Spring, or that cold snap after the extreme heat of February; it mediates between memory and experience, and between public and private - weather data becomes a sort of shared platform on which the personal is overlayed. The form also shows how the generalising infrastructures of computing and fabrication can be brought back to a highly specific, localised point. This for me is the most exciting aspect of digital fabrication and "mass customisation" - not more choice or user-driven design (which are all fine, but essentially more of the same, in terms of the consumer economy) - but the potential for objects that are intensely and specifically local.


Friday, May 15, 2009

Landscape, Slow Data and Self-Revelation

This text was an invited contribution to Kerb 17: Is Landscape Architecture Dead? This looks like a rich volume with a sharp critical edge, and a swathe of interesting material spanning architecture, urbanism, art and landscape. Unfortunately my contribution was edited fairly severely; so here's the unabridged version. Redundancy warning for regular readers: there's a slight rehash of Watching the Sky in here; but afterwards there's fresh material on landscape / data projects by Driessens and Verstappen and Usman Haque.


Data is, we imagine, an immaterial thing; or at least ethereal, made of light and electricity, processed at superhuman speed, transmitted in real time. The everyday world we move in seems dense and slow by comparison. The landscape is slower again; thick, heavy and persistent. At the moment however those two domains, the fast lightness of data and the heavy slowness of the landscape, are urgently linked. We are faced with the prospect of momentous change in the landscape that is somehow both slow and fast; too slow for our real-time culture to grasp, and too fast for the living systems of the landscape to adapt to. This paper presents a handful of works that dwell in that disjunction, between landscape and data; not solving it at all, but at least forming links, complicating assumptions, and recasting the relationship between two terms that seem to neatly encapsulate our future.


In Watching the Sky a camera looks out my office window, at the sky and the landscape. A banal view over a university campus to a bushy ridge in Belconnen. The camera takes an image every three minutes; four hundred and eighty images in twenty four hours. Tethered to a computer, the camera records for weeks at a time; the computer accumulates thousands of images. I think of the images as data, traces of change in the world outside the office window. I visualise, or re-visualise, this image data in the simplest possible way; an automated process "cuts" a narrow vertical slit from the same location in each image, and compiles all these slits together (this is a digital imitation of an analog photographic technique known as "slit-scan"). In the rectangular visualisations the slices are tiled from left to right. In the radial visualisations slices are gradually rotated so that a twenty-four-hour period spans one complete revolution (the "seam" is at midnight).
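The slit-scan compilation is easy to sketch. Here's an illustrative pure-Python version of the rectangular case, treating each frame as a 2D list of pixel values (the actual process was an automated image pipeline; the radial case additionally rotates each slit before compositing):

```python
def slit_scan(frames, slit_x):
    """Rectangular slit-scan: take the vertical column at slit_x from each
    frame and tile the columns left to right, one column per frame."""
    height = len(frames[0])
    # result[row][i] = pixel at (row, slit_x) in the i-th frame
    return [[frame[row][slit_x] for frame in frames] for row in range(height)]
```

At one frame every three minutes, 480 frames of the day become a 480-pixel-wide strip, with each day's weather compressed into a readable band.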


In the resulting images the patterns of change within and between days are immediately visible. As I imagined, day and night, cloud and sky are obvious. The brief, delicate colour shifts of dawn and dusk were more surprising. Below the horizon, though, patterns appeared that complicated the work's nominal focus on the sky. It became clear that some of the richest and most revealing data here came from the landscape. In one of the earliest sketches I found small but distinct variations in the horizon line over the course of a day, and recurring on successive days. I eventually realised that this was caused by the afternoon breeze, shifting foliage by a few pixels within the frame. In other words, subtle changes in the material field of the landscape carried through to the image data. Moreover in many ways the landscape visualises its own internal structure: the trees blowing in the breeze are partly instruments, revealing material changes around them (the breeze); but also data, traceable as pixels. In many images the passage of a shadow across the ground appears as a recurring pattern, an enfolded or multiplexed representation of another set of material interactions; the landscape measures and reveals itself, but not as an object, image or view. It is a connective, dynamic, material system; what is revealed are the specific interactions of that system with itself. The image data acts as a kind of core sample, drilling through multiple spatial and material systems, but each is connected outwards, beyond the frame. The wind in the trees doesn't belong to this image, but like the angle of the sun revealed in the shadow, is an index of a wider system.

It also became clear that the landscape is densely packed with human, social data which is equally apparent in image data. In the rectangular visualisations presented here stripes of colour are visible towards the bottom of the frame. These are caused by cars, parked illegally under the trees; they form another ad-hoc graph that reflects cultural, institutional calendars and cycles, though again they are intermingled with other scales and structures.


Landscape is also cast as a self-revealing instrument in Driessens and Verstappen's Tschumi Tulips project. This landscape installation occupied the Tschumi Pavilion, in Hereplein, Groningen, during the northern Spring in 2008. The pavilion is a rectilinear glass container, rising at an angle from the surrounding park. In this installation the artists filled the base of this box with soil and planted over ten thousand white tulips. A matching array of tulips was planted outside, extending the line of the pavilion. Like scientists, the artists set up two identical subjects, but vary their environment: ten thousand tulips inside, ten thousand outside. A webcam reveals how these variations in environment are slowly materialised in the life of the tulips. The tulips inside grow, bloom and then, wonderfully, decay more rapidly than their twins outside. As in Watching the Sky, long time spans are compressed into human-scale time and space; and here too digital imaging plays a pragmatic role in that revelation. Deployed in rectangular masses we can easily read the flowers as abstract, sculptural materials; organic pattern and variation enframed and aestheticised. But at the same time the work has a kind of deadpan resonance, a rendition of life, and death, inside a greenhouse.


The Huey-Dewey-Louie Climate Clock, by Usman Haque and Robert Davis, addresses the long timescales of environmental change head-on in a proposal that further develops this articulation of slow data and landscape. The clock is a multi-layered system of autonomous machines and material processes. The "Huey" agent slowly builds "accretion mounds" using material extracted from the atmosphere and formed into accumulating conical stacks over the course of a year; like tree rings or geological strata these embed environmental materials directly into a designed representation. The "Dewey" element is a circular array of one hundred transparent containers, in which air and biomass samples are preserved year by year. Like Driessens and Verstappen's Tulips, Haque and Davis propose a biological instrument of one hundred genetically identical daffodils, which are sown and harvested each year, then entombed in the plinths - again a simple grid, a layer of invariance is imposed that allows the landscape to essentially represent itself, materially. Finally Louie, an autonomous solar-powered robot, gathers soil samples and compresses them into cubes, one per day. The surface of each cube is imprinted with some current data point - chosen by daily popular vote; perhaps oil price, or rainfall. So here fast, real-time, socially selected data comes to rest directly on the slow, material medium of the soil.

At one stage, not long ago, it may have seemed that we were leaving the landscape behind, or drafting it in only as a support or substrate for the flickering patterns of real-time culture. Even now, that seems possible: the monthly figure for new housing construction, a bellwether for economic growth, is imposed on the landscape by earthmovers and roadbuilders, underscored by raw mounds of earth. The works presented here suggest an alternative role, perhaps an alternative future for the landscape; as slow data and slow instrument, a complex material system that can be subtly designed into self-revelation.

Read More...

Wednesday, July 30, 2008

The Visible Archive

I signed the contracts this morning on a research project that I'm really excited about: a grant from the National Archives of Australia to develop interactive visualisations of their collection. That collection has over nine million items, grouped into some thirty thousand series (or sets); it's basically all of the Federal government's paperwork, but also includes photographs, AV material and other stuff. You can search the collection via the Archives site - and access digital copies of the original records in some cases.

The Visible Archive aims to do what the search interface doesn't: provide a sense of context and orientation, revealing structures and relations within the collection. The visualisations should be useful for both archivists and archive users; and the techniques developed should also be useful for other archives and collections.


The idea seems to have some currency - you may have seen Lev Manovich recently announce a project on Visualizing Cultural Patterns, working with collaborators including Noah Wardrip-Fruin.

Read more and follow the project at its own, freshly minted blog. And if you have any pointers to other related work in the visualisation of cultural datasets, especially archives, please send them along.

Read More...

Wednesday, July 16, 2008

Radiohead's Data Melancholy

In case you missed it, Radiohead have gone all data-aesthetic with their latest video, House of Cards. What's more, it's fully zeitgeist-compliant, with open access and a call for re-visualisations of a quite massive dataset: hundreds of megabytes of spatial data gathered with various 3d laser-scanning rigs. If the download stats and early signs are anything to go on, we will be seeing much more of this dataset.


As well as being technically cool, the project is yet another sign of the increasing cultural prominence of data as both material and idea - in that sense, after Design and the Elastic Mind and Wired's "Petabyte Age", this is more of the same. But it's also something different, it seems to me. Like any other visualisation, House of Cards doesn't only use data, it presents a certain sense of what data is, means, and (crucially) feels like; and this is where it's different. The dominant narrative of data visualisation at the moment is informed by the networked optimism of web 2.0, where the social sphere, and increasingly the world as a whole, is unproblematically digitised; where more is more and truth, beauty, and commercial success all are immanent in the teeming datacloud.

House of Cards, by contrast, is a manifestation of data melancholy. Data here is low res, with a sketchy looseness of detail that evokes the gaps, the un-sampled points. This data is also abject or corrupt, the scanner intentionally jammed with reflective material, a bit like the metallic chaff used to confuse missile guidance systems. These glitches are familiar devices in electronic music and video, including Kid A-era Radiohead. However here the errors are very much in the data; they have migrated out of the music, which is human, organic and more or less intact here. This disjunction between failed data and the emotional, human domain is what characterises the data melancholy; it's illustrated beautifully at the end of House of Cards, with the "party scene" (one of Thom Yorke's ideas for the clip), a social scene decimated into abstract clouds of points. This theme also resonates across In Rainbows, especially in the closing track, Videotape: "this is one for the good days / and I have it all here, in red blue green." Here image data is again a sort of failed trace of an emotional reality, all that remains of "the most perfect day I've ever seen."


Yorke's other motif for House of Cards was "vaporisation," which is clear enough in the clip; I think it's most effective in the final shots of the house; the earlier clips of Yorke disintegrating seem a bit languorous, with that undulating look of Perlin noise (is it, anyone?). The house shot in particular reminded me of Brandon Morse's Preparing for the Inevitable; Morse's work in general has a related feel about it, though the models seem to be synthesised rather than sampled. Again the poetics is one of cool, digital melancholy, where tragedy is stripped down to a set of vectors and forces (above: Collapse, from Flickr). Here though, rather than a failure of data (sampled representation) it's a failure of the procedural model, or perhaps failure with, or in, the model.

Read More...

Thursday, July 03, 2008

Image, Data and Environment: Notes on Watching the Sky

Watching the Sky is a data visualisation project I've been working on for the past six months or so. The work is almost ridiculously simple: slit-scan type visualisations of large image time-series, shot from the window of my Canberra office. All the images from this process are up on Flickr. Recently UK journal Photographies invited me to write an "image led" piece on the work for their forthcoming second issue. Here's the essay, which looks at how we interpret, and literally image, pattern and change in the environment, and the role of data in that process. The themes (data, materiality, aesthetics) and some of the examples will be familiar to regular visitors. New things include spatiotemporal imaging (and even photography) as data visualisation, weather vs climate, black cockatoos, a quick look at art using environmental data-sources, and an equally quick dig at Tufte's Wavefields. It's also the most autobiographical bit of writing I've done in years - make of that what you will.

A few related projects that I discovered in the course of things: Miska Knapek's 24 hour visualisations, Michael Surtees' 36 Days of New York Sky, William Gaver's Video Window (pdf) - thanks Karl for the link - and yesyesnono's Travelling Around images - beautiful radial time-slices at a smaller time scale.

05.07_540


My childhood home was near an air force base on the outskirts of Sydney, where the sky was host to a wonderful array of aircraft. Mostly big, droning transports; Caribou and Hercules, each with their signature profiles and engine notes. Jets and helicopters were rarer and more prized: Macchi trainers, F-111s, Iroquois, Sikorskys and Chinooks. Once, miraculously, a visiting Starlifter transport, an immense silver thing apparently suspended over the hobby farms and horse paddocks. Unasked-for, revelatory, literally out of the blue, the planes were also metonymic signs of a wider world, and an idealised high-tech future I could barely wait for. Living signs flew over us too; we loved to think that black cockatoos were harbingers of rain, and would count them to predict the number of wet days ahead (image: Beppie K). I discovered the UFO lore of the early 80s; in dreams I was visited by terrifying lights, and saw archaic aircraft disintegrating above the eucalypt gully behind our house.


I came to Canberra from Sydney in early 2001, and the sky changed, opening out into a brilliant dome bounded by hills. Soon after it changed again as the nostalgic motif of the gliding passenger jet was overlain with catastrophe. This was echoed by strange weather, a long drought. Safe in suburbia, I installed a water tank and began watching the sky more hopefully, tracking rain bands and storm cells on the weather bureau's website: running out to clear the downpipes, then back to the laptop, downloading the latest. Sky data, almost real-time, a new and better harbinger, and with more at stake this time - water in the tank, a four thousand litre buffer against the next dry stretch. Never far away, the question of when weather becomes climate; is this a "blip" or a trend, short term variability or long term change? Temperature and rainfall statistics become common currency, and every month brings new data, but the more we know the less certain we become; in fact the only consensus seems to suggest more uncertainty. Ocean temperature measurements feed supercomputer models whose simulations are distilled into enticing, oracular suggestions, indications, projections. We occupy an increasingly detailed graph of accumulated data, but remain trapped inevitably in the present, at its right hand edge.

Watching the clouds approaching and cross-checking the weather radar, it's impossible not to sense the gaps and disjunctions between the data - an authorised, centralised and objective account of what is - and the situation "on the ground." This patch of rain that should be on us now, and somehow is not. It seems to have eluded the radar's view, slipped between the pixels or time-steps, or vanished in the lag, the aporia of almost-real-time which is the time data itself takes: to gather, check, validate, compile, visualise, distribute. The weather stubbornly continues to occur in the present, and at full resolution. The rainfall figures always come (as any weather watcher will know) from elsewhere, a single, notionally representative monitoring point. We're always cheated, as a result; overstated or undermeasured. Rain carries such social charge, where I live, that locals call the radio station, reporting from their backyard rain gauges in pyjamas and gumboots. This is the only way of closing that gap, to measure the world locally and create data instead of just siphoning it down from the web. I make my own measurements, tapping on the side of the tank slowly, bottom to top, listening for the hollow ring of the air cavity, homing in on the water level: data sonification.


In contemporary networked culture we are constantly reminded of the scale, ubiquity and significance of data. Every search, message, document, image, social exchange is a data transaction. We seem to be couched in data; it is our new environment. We accept this much-heralded "information overload" with more or less equanimity, as our inboxes and hard drives steadily fill. It's not surprising that in recent years artists and designers working in this domain have begun to grapple with data as a material. As I've argued elsewhere this inevitably involves the construction of an idea of what data is, what it's for, and what it contains. This practice also confronts the pragmatic question of what to do with data, what to make from it and (if we accept the value of the term) a data aesthetics.

One of the dominant creative strategies in this field, and its main aesthetic trope, is multiplicity: displays in which the points and lines of simple graphs burgeon into clouds, fields or flows. The datasets, and their visual figures, reflect our overloaded data-environment. This aesthetics of scale has been theorised through the notion of the sublime, a figure historically associated with nature's beautiful and/or terrible expanses; once again data takes the place of environment (see for example Manovich 2002 (doc) and Jevbratt 2004 (13Mb pdf)).

The data sublime is aesthetically expedient, as well as culturally resonant. Sheer scale generates visual richness as well as revealing patterns within datasets; yet the data points we see here are meagre and unmysterious in themselves. Each is a small cluster of symbols and parameters generated through a (social, cultural) process of selection, filtering, quantification and categorisation, in order to grasp some specific slice of the world in a certain way. When data swarms and flows with apparently inherent dynamics, it's easy to forget how data is created, or even that it is created. This is especially true when the data source is the network itself; self-referentiality gives an impression of self-sufficiency, again a world in which data is given, rather than made.


Countering this tendency a number of works draw in data from the physical environment "outside", and direct our attention back towards a space that is more familiar and more uncomfortable than the digital realm. For example Andrea Polli's work brings data from large spatial and temporal scales into the realm of experience, often in close collaboration with scientists; her Atmospherics project (2004) renders meteorological data gathered from a severe storm as a complex spatial soundscape. Heat and the Heartbeat of the City (2004) sonifies temperature data for New York City, beginning with data gathered during the 1990s, and presenting projections for future decades based on climate change modeling. More recently Bonding Energy, by Douglas Repetto and LoVid (2007, above), gathers data from custom-made sculptural devices measuring solar energy levels, and displays changing levels from multiple measuring sites in an animated visualisation. These works use data reflectively, and show a commitment to the "outside" that is their ultimate data source. However they are also limited by the structure of their material, which measures the world through a single value — temperature or solar radiation level. This single point, as telling as it is, seems somehow overdetermined: too much what it is, too tightly bound to an existing set of meanings and stories.

Photographic imaging, by comparison, gathers large amounts of complex data from the environment: many millions of numerical values with a rich set of spatial interrelations. The notion that the camera reveals the otherwise invisible, as in the work of Muybridge for example, mirrors the aims of data visualisation; yet this also reveals an important difference in these two practices. The reduced data of measurements such as temperature go to great lengths to exclude the extraneous. On the other hand photography, if we regard it as a form of data visualisation, often seems to welcome the extraneous, to embrace incursions, unexpected interactions or extra layers. This is not to claim the photographic image has some kind of special relation to reality, or that it isn't just as selective, intentional, and conditional as a temperature measurement; it's more a slight opening out of the field of view. The photograph can operate something like a geological core sample, selective but inclusive, a piece of whatever happens to be within the frame.

In the emergent field of (what I will call) space-time imaging, artists exploit the digital photographic image to reconfigure representations of the world. This work has a pre-digital ancestry in slit-scan photography and cinematic effects, but with the digital image it has expanded and proliferated (see Levin). Artists have begun to approach the image as a two-dimensional data field; they treat time by extension as a third conceptual axis, forming a three dimensional volume. This abstract structure is literalised in projects such as Alvaro Cassinelli's Khronos Projector (2005), where we can "push" parts of the image back in time. While the experience of the work hinges on the fleshing out of a spatial metaphor, its operation can be understood as interactive data visualisation: a technique for selecting and presenting data points from the image series. Other work, such as that of Australian artist Daniel Crooks, can also be understood laterally, I would argue, as data visualisation or re-visualisation. Crooks works with digital video source material and explores the de- and re-composition of the image in ways that deform space and time, but also, like other data practices, reveal their subjects anew. In "time slice" work such as Train 6 (2004) Crooks samples small segments of the time/image stack, revealing their raw edges, rather than trying to smoothly reconstitute the image. The discourse around this work tends to emphasise its (broadly familiar) agenda: reconfiguring perception, breaking down conventions of representation, and so on (see for example Doropoulos). Like Dziga Vertov before him, Crooks' subject matter is deliberately everyday (public transport, urban spaces), drawing attention back to this reflexive project. Yet at its most poignant, Crooks' work also reveals real patterns of movement and change in the world that it samples. It re-visualises reality, and in doing so it demonstrates the richness of the photographic time-series as data set.
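The "three dimensional volume" reading can be made concrete: a Khronos-style interface is just an index into a time-stack of frames, where a per-pixel map decides which moment each pixel is drawn from. A toy sketch (not Cassinelli's code; frames here are plain nested lists, and the function name is mine):

```python
def sample_time_volume(stack, depth):
    """Compose one output frame from a time-stack of frames.

    stack: list of frames over time, each a list of rows of pixel values.
    depth: per-pixel map of frame indices; "pushing" a region of the
    image through time just means changing its depth values.
    """
    height, width = len(depth), len(depth[0])
    return [
        [stack[depth[y][x]][y][x] for x in range(width)]
        for y in range(height)
    ]

# Three 2x2 frames where every pixel simply records its frame number:
stack = [[[t, t], [t, t]] for t in range(3)]
depth = [[0, 2],
         [1, 0]]  # each pixel is drawn from the frame its depth value names
sample_time_volume(stack, depth)  # -> [[0, 2], [1, 0]]
```

Slit-scan work is the degenerate case of the same operation, where the depth map varies along only one axis.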

05.02_540_radial
Watching the Sky is a deliberately simple-minded experiment. It uses the most basic techniques of slit-scan photography and related digital space-time work. Using a static digital camera tethered to a computer, I take images at three minute intervals; four hundred and eighty per day. The camera is in my office, pointing out the window with an unremarkable view of the neighbouring building, some trees, power lines, and the sky over west Belconnen. A simple script extracts a narrow vertical slice from each image, at the same location in the frame; then compiles those slices into a new image. In the rectangular visualisations the slices are tiled from left to right. In the radial visualisations slices are gradually rotated so that a twenty-four-hour period spans one complete revolution (the "seam" is at midnight).
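The script itself isn't reproduced here, but the slicing step is simple enough to sketch. In this illustrative version (function names are mine, and toy nested lists stand in for camera images) each frame gives up one narrow vertical slice, and the slices are tiled left to right:

```python
def extract_slice(frame, x, width=1):
    """Take a narrow vertical slice (columns x..x+width) from one frame.
    A frame is a list of rows; each row is a list of pixel values."""
    return [row[x:x + width] for row in frame]

def tile_slices(slices):
    """Tile the slices left to right into a single composite image."""
    height = len(slices[0])
    return [
        [px for sl in slices for px in sl[y]]
        for y in range(height)
    ]

# Three toy 4x4 frames; pixel value encodes frame number and row.
frames = [[[f * 10 + y for x in range(4)] for y in range(4)]
          for f in range(3)]
slices = [extract_slice(frame, x=2) for frame in frames]
composite = tile_slices(slices)
# composite is 4 rows high and 3 columns wide: one column per frame,
# so time runs left to right across the image.
```

The radial versions follow the same logic, except that each slice is rotated by one four-hundred-and-eightieth of a turn before compositing, so a day closes the circle.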

Of course any number of other visualisation processes are possible. The digital space-time field illustrates many of the options, though this work often plays with the reconstitution of a transformed image, which was not my interest here. Slices are used as a simple way to compress days' worth of data into a single visual field, while preserving as much as possible the spatial relations within each frame. They also make for visualisations with a simple logic, readable as high density graphs.

In a strange inversion of this project, Edward Tufte, prominent theorist of information visualisation and design, recently called for a new generation of information graphics - "wavefields" - that match the data rate of high-definition video, showing "high-resolution, complex, multiple, animated statistical data-flows." Yet the video exemplars that Tufte uses to make this proposal are not "statistical data flows" but abstract shots from the physical environment: rippling reflections on water and undulating meadows. It's striking that Tufte turns to these environmental sources of visual pattern to mock up a more "intense" genre of abstract, statistical visualisation. Among other things, Watching the Sky attempts to demonstrate that this kind of informational density (and aesthetic intensity) is already immanent (it's just out the window).

I'm influenced here by the work of Lisa Jevbratt, an artist whose data visualisations have focused on the digital networks, but whose approach works against any simple notion of information. Here too density is increased to the point of saturation: with a large and multilayered dataset, Jevbratt's 1:1 (1999/2002) visualises the attributes of some 180,000 internet (IP) addresses sampled by the artist. The resulting images are startling and completely abstract, but not at all unstructured. Jevbratt describes the visualisations as "abstract reals", and "objects for interpretation, not interpretations." Instead of demonstrating the already known, or the answer to a preconceived question (information), Jevbratt's data works provoke, and perhaps answer, new questions; in the artist's words "hints, suggestions, and openings."


Although the data source in Watching the Sky is as tangible and unmysterious as possible, surprising hints and suggestions continue to appear. In one of the earliest sketches I found small but distinct variations in the "horizon" over the course of a day, and recurring on successive days. I eventually realised this was caused by the afternoon breeze, shifting foliage by a few pixels within the frame. The dataset here is a trace of a complex material field that in a sense visualises its own internal structure: the passage of a shadow across the ground appears as a recurring pattern, an enfolded or multiplexed representation of another set of material interactions. As a data source, the photographic image also cuts easily across categories and domains. In the rectangular visualisations presented here stripes of colour are visible towards the bottom of the frame. These are caused by cars, parked illegally under the trees; they form another ad-hoc graph that reflects human (cultural, institutional) calendars and cycles, though again they are intermingled with other scales and structures.

Time, and the perception of change, are central here. Like Jevbratt my hope is that these visualisations will be platforms for interpretation that can somehow augment our local, subjective, everyday practice of reading the environment. There's a yawning gap in our culture at the moment, between this experiential scale, and the long, slow-motion catastrophe we seem to be in. Weather watchers comment on the isobars, track the low pressure systems as they pass, speculate on ocean surface temperatures and the Southern Oscillation; like the black cockatoos each data point is an ambiguous sign that refers to a wider material system. This project is a straightforward response that proposes another way to image, and think, pattern and change in the environment.

This is a preprint of an article submitted for consideration in Photographies © 2008 Taylor and Francis; Photographies is available online here.

Read More...

Friday, March 07, 2008

Notes on Transmateriality

At the recent UTS symposium I gave a short presentation titled "After Inframedia: Presence and Transmateriality." The presence stuff I covered earlier, but the second idea - which I touched on very briefly in this 2003 paper - is much less developed. So here goes.

The relationship between matter and "information" or "the digital" has been a recurring theme in new media theory for more than a decade. We could sketch it very roughly as follows. In the early to mid 90s, as digital hype was gathering pace, artists and cultural theorists began to critique the apparent drive towards disembodiment in technoculture. Simon Penny's 1991 text "Virtual Reality as the end of the Enlightenment Project" is a good (and early) example, even if VR now looks a bit like a straw figure in these critiques. This critical project of grounding the digital in the material (and the body) has continued. In 2000 Felix Stalder wrote of the "ideology of immateriality" underpinning the so-called "new economy." Around the same time Katherine Hayles published a more complex investigation in How We Became Posthuman, asking "how information lost its body" but also considering the inevitably embodied effects of this supposedly immaterial stuff (this is well covered in her paper The Materiality of Informatics).

Hayles introduces a conceptual pair: inscription and incorporation. Inscription is "normalized and abstract ... a system of signs operating independently of any particular manifestation" [Posthuman 198]. Inscription refers to the properties of a text, for example, that can be transcribed without regard to its specific embodied manifestation - digital computation thus relies on inscription, in moving patterns of data through various substrates. Incorporation is its flip-side, referring to the inescapably embodied aspect of a sign. Both inscription and incorporation are verbs - practices or processes - rather than ontological states; and they oscillate, a bit like presence and meaning for Gumbrecht: "incorporating practices are in constant interplay with inscriptions that abstract the practices into signs" [199].


Today I came across a more recent paper by Matthew Kirschenbaum, who pursues this investigation into the materiality of the digital, and like Hayles is approaching it from the perspective of textuality. In “Every Contact Leaves a Trace” (pdf) Kirschenbaum critiques the neo-Romantic, screen-focused tendencies of digital textual theory that tend to emphasise ephemerality and instability. He uses digital forensics to move us from the screen to the hard drive, showing exactly how data is embodied (as in this image: a magnetic force microscopy image of a hard drive surface, from Pacific Nanotechnology). In the process he introduces another pair of concepts: formal and forensic materiality. Formal materiality refers to machine-readable data that reveals material specificities - in Kirschenbaum's paper, the use of a hex reader to discover traces of not-quite overwritten game code on an old Apple II floppy disc.

Forensic materiality refers to the material residues or byproducts that mark out one digital instantiation as different to another; for example the physical instantiation of copies of a file on two different hard drives will be different due to the material specificities of the drives - as when a misaligned write head again leaves traces of overwritten data. Yet these files are, for the computers concerned, formally identical. As Kirschenbaum writes, this shows how

"computers ... are material machines dedicated to propagating a behavioral illusion, or call it a working model, of immateriality."
This really nails it for me. It's exactly the functionality of this immateriality that earlier critiques of the disembodied digital overlook. It is an illusion, but it's an illusion that (mostly) works, and so is easily maintained: this is a hard-working model.
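Kirschenbaum's hex-reader move - scanning raw disk bytes for legible residue - is essentially what the Unix `strings` tool does. A minimal sketch of that first forensic pass (the sample bytes are invented, not from his Apple II disc):

```python
def printable_runs(blob, min_len=4):
    """Scan raw bytes for runs of printable ASCII - the crude first pass
    a forensic 'strings'-style tool makes over slack space."""
    runs, current = [], bytearray()
    for b in blob:
        if 32 <= b < 127:          # printable ASCII range
            current.append(b)
        else:
            if len(current) >= min_len:
                runs.append(current.decode("ascii"))
            current = bytearray()
    if len(current) >= min_len:
        runs.append(current.decode("ascii"))
    return runs

# A made-up sector where new data has not quite overwritten the old:
sector = b"\x00\x00NEW SAVE\x00\x07OLD GAME CODE\xff\x01ok"
printable_runs(sector)  # -> ['NEW SAVE', 'OLD GAME CODE']
```

The point of the sketch is the asymmetry Kirschenbaum identifies: to the filesystem only the current data "exists", while the residue is recoverable only by going beneath the formal level to the raw bytes.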

I'm developing an idea of transmateriality (sorry about the coinage), that draws on Hayles and describes exactly the "conundrum" that Kirschenbaum poses here; but that also has, I think, some wider implications, specifically for the media arts. Briefly, it proposes that the digital is, of course, always and inevitably embodied; that concepts like "data" are functional abstractions for describing the propagation of material patterns through material substrates. But that at the same time these material patterns - and here I mean everything from optical pulses to hard disk substrates, luminous screens and speakers pushing air - these material patterns, and the sensations and aesthetics that result are profoundly shaped by data acting as if it were symbolic and immaterial. Transmateriality is an attempt to "ground" the digital without losing sight of its (let's say) generative capacities. It also seems to resonate with a lot of current work in the media arts - but more of that later.

Read More...

Monday, September 10, 2007

Against Information - a Data Art Critique

Next week I'm off to Perth for DAC, where I'll be presenting a paper focusing on data art. It looks at a good handful of works from the last few years, including The Dumpster by Golan Levin with Kamal Nigam and Jonathan Feinberg, We Feel Fine by Jonathan Harris and Sepandar Kamvar, Alex Dragulescu's spam visualisations, Lisa Jevbratt's 1:1 and Infome Imager Lite, Brad Borevitz's State of the Union and some of Jason Salavon's abstraction and amalgamation works.

The paper develops the questions that I posted here a while ago, focusing on how artists construct a notion of data while they use it as a creative material. It especially considers the distinction between data and information, arguing that data art often works to defer, abstract or undermine information - in the sense of a formed or contextualised message - and instead offers us a more open or underdetermined experience of the data as abstract pattern and relation. The problem here is that we can't have unmediated access to the abstract data - it's always mapped to something, structured in ways extraneous to the dataset. And data itself is always extracted, made or constructed, not some kind of autonomous digital object.

The case studies are clumped around four data-figures: indexical data - data as a sign of something real - as in The Dumpster and We Feel Fine; abject data - data as empty and malleable, as in Dragulescu's work; Lisa Jevbratt's data material or Infome; and data as anti-content or "artist's squint" in Salavon's work and Borevitz's State of the Union.

Anyhow, here's the full paper (3.3Mb pdf). Feedback very welcome, of course.

(update: the pdf file was corrupt, sorry - fixed now)

Read More...

Monday, March 12, 2007

Lisa Jevbratt - Infome Imaging

Lisa Jevbratt has been doing data art for some time now. Her 1:1 (1999) was one of the first data-vis works to gain critical attention in new media art circles. I re-read some of Jevbratt's writing recently, and the artist pointed me to this 2005 paper, which in part sets out the concept of the Infome. Jevbratt seems to be changing direction - towards bio/eco practices - but her work remains significant, especially while data is the new code/black/whatever, for the Processing generation. The Infome idea is particularly interesting, because it creates a distinctive sense of just what data is.

Jevbratt's Infome is a kind of data cosmology - the Infome is an "all-encompassing network environment/organism that consists of all computers and code." Once you get past the biological analogy, the Infome offers a way to treat data as a kind of material that is concrete and self-sufficient, but also shaped by the (social, political, technological) forces outside it. Data is indexical, but not in the empirical sense of measurement or simple correspondence. Instead Jevbratt uses another material (geological) analogy; the Infome is a kind of landscape in which external forces and structures are overlaid and condensed. Another nice twist is that visualisation becomes recursive: "Images can now simultaneously be reality, since they are part of the Infome, and an imprint of that reality, as if the image produced by a potato stamp were also a potato."

Jevbratt's images of the Infome in 1:1 and Infome Imager Lite aspire to this kind of material directness, making a "slice" or "imprint" of the data. She describes the images as "real, objects for interpretation, not interpretations." This desire to present the data "in itself" closely resembles the "pure data" aesthetics of the audiovisual databenders I mentioned in "Hearing Pure Data" (2004). We can make the same critiques of Jevbratt's work - that we can't see the data in itself, only its specific mapping. Jevbratt does take great care to explain the mappings used, and best of all in IIL she encourages the user to experiment with changing mappings and datasets - an artistic precursor of the public data literacy now mentioned in relation to social data-vis services Swivel and ManyEyes. The critique still stands though; it's clearest in the way Jevbratt wraps all these visualisations around the rectangular picture plane - a structure that has no inherent relation to the data, but a significant relation to the art-world context that these works function in.

For Jevbratt these data-impressions allow us to "use our vision to think" - information and pattern arise from a perceptual process, rather than a computational analysis. Like much other data art, Jevbratt resists providing information, in the sense of meaning or message; instead she offers a substrate for information, a field of potential meaning. She writes of seeking "something unexpected," "hints, suggestions, and openings" that lead us into the Infome itself, its immanent, collective dynamics, even its emergent, distributed agency. It's a kind of data mysticism, but also an attempt to sense the real but otherwise imperceptible shapes of digital culture.

Read More...

Thursday, September 21, 2006

Data, Code & Performance

A few more thoughts and questions following on from the previous post, and responses to it. If data art isn't necessarily concerned with the (apparent) meaning of its datasets, or their empirical basis, then what is it concerned with? Perhaps one answer has something to do with performance. Whatever else it does, this work performs a process that is meaningful in itself. Whatever else it says, it also says, "watch what I do with this data." It displays a data literacy, an ability to acquire, munge, filter, process, map and render. Since it's primarily operating as art, rather than functional visualisation / sonification, it also demonstrates a process of translating or mediating between those domains. This isn't a criticism (necessarily), just trying to think through a few basics, and taking on those points from toxi and infosthetics re. the tension between art and visualisation here. If data art is partly self-referential performance, then what kind of cultural values exist / are constructed around that? Manovich refers to "data-subjectivity" - are data artists exploring / performing this "super-modern" state of being?


I'm sure there's a connection here somewhere with literal acts of data-performance. I saw some live coding performances at the Medi(t)ations conference in Adelaide (blogged earlier). Brisbane duo aa-cell (Andrew Sorensen and Andrew Brown) played a great set - two laptops, both running Sorensen's own Impromptu environment, with screens projected to show the accumulating code. Here too there was a kind of mediation between computational and cultural domains - a performance of (largely obscure) code structures that generated a sonic structure dense with musical references. It was partly the pulse of a synth kick drum (hand coded, of course) but I came away thinking of Kraftwerk - laptop live coding as the new "man machine."

Live coding has a transparency that a lot of data art lacks - the code structure is gradually constructed, giving an (expert) observer some chance of following the formal, generative structure. Most data art conceals its mapping and munging, offering only an artefact and a promise that yes, this is "the data." Live coding's transparency is itself pretty opaque, though. At least one audience member at the Adelaide performance had no idea that the displayed text bore any relation to the sound. Live coding looks like great fun for the performers (like most improv), but what about the audience? Is data-subjectivity a prerequisite?


Tuesday, August 22, 2006

Data Art - Some Questions

I'm working up a paper on data aesthetics and creative practice, looking especially at visualisation (a kind of companion to "Hearing Pure Data," a paper written a while ago focusing on sonification / audification). At this stage all I have is a collection of questions and semi-formed hunches - so make of it what you will, etc.


  • Are we talking about data or information? Lev Manovich uses the term "info-aesthetics" and connects these practices to the notion of an "information society". What if we move back a step, and look at the relationship between data and information? Data is the raw material, the individual datum or measurement; information is the message or meaning constructed from that data. Both terms get used (more or less interchangeably) around artworks doing visualisation, but I think we should maintain the distinction. Is this work concerned with rendering information - a known, formed message? On the surface at least it seems to be more interested in visual interfaces to data, downplaying or leaving open the interpretation of that data - its transformation into information.

  • As I argued in "Hearing Pure Data," presenting the data "in itself" is an impossible ideal; it is inevitably shaped, interpreted, formed, framed, etc., in any manifestation; in which case how does visual data art negotiate its own construction of information from the datasets it works with? Does it pass off its own interpretation and framing as "raw data"?

  • What about the constitution of the data itself? Data art seems to take a pragmatic and concrete approach - "the data is the data" - but any meaning constructed from that data must be inflected by the way the data itself was formed or gathered. This is stating the obvious to anyone working in the empirical sciences... how do data artists respond? In the wake of the AOL reSearch dataset affair, the issue of constructing information from data comes into sharp focus. It will be interesting to see how artists use this dataset (which, as Marius Watz recently observed, they no doubt will). The ethics of data art?

  • Data art treats its datasets as generative resources: sources of rich structure, pattern and complexity. It seems that often the appreciation of these formal qualities of the datasets (or their visualisations) exists in tension with the content or referentiality of the data. There's a continuum: TheyRule leans towards referentiality and meaning; Ben Fry's Valence is more concerned with pattern (it's an exploration of a visualisation technique after all); The Dumpster sits somewhere in the middle.

  • Toxi blogged a while ago on the issue of access to quality datasets for creative visualisation. As the comments on his post show, this raises a kind of cart-and-horse question. Tom Carden writes: "once you've got the info vis bug, you feel like a guy with a big shiny hammer, but nobody will give you a nail." This brings us back to the same question: is this work about data as an indexical link to the world, or data as a generative device? Or both?

  • On a related point, there's a clear crossover between generative and data-driven art; the artists are often one and the same; the same tools are used. How can we think about the relationship between these practices? They seem to be complementary approaches to similar goals (visual and aesthetic complexity, the joy of the unexpected, etc.): one builds a generative system from scratch, the other latches onto the most complex existing generative system (the world) and visualises that.
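The data/information distinction in the first question above is simple to make concrete. In this toy example (values invented), the same dataset yields two different "informations" depending on the framing chosen - a summary of the typical, or a report of the anomalous:

```python
# A minimal illustration of the data/information distinction:
# the same data supports different messages depending on framing.
# The values here are invented for illustration.

data = [3, 3, 4, 18, 3, 4, 3]   # the raw measurements

# Framing 1: central tendency - "the typical value is small"
mean = sum(data) / len(data)

# Framing 2: anomaly detection - "something unusual happened"
outliers = [v for v in data if v > 3 * mean]

print(mean)      # one message constructed from the data
print(outliers)  # a different message from the same data
```

Neither message is "in" the data; each is constructed from it.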


Responses to all this very welcome of course... stay tuned for more chunks of undersupported and undigested theorisation.
