Category Archives: Exhibition

Play Table: Iteration 3

Now that the dust of the summer holidays has settled, I’ve had a chance to edit some video of the Play Table R&D project as it appeared at FACT, Liverpool, in August.

Play Table: Iteration 2

Installed at The Minories Galleries, Colchester.

Data Flow at FACT

Data Flow is now up and running at the Foundation for Art and Creative Technology (FACT), Liverpool, as part of the Type Motion exhibition, which opens tomorrow evening (13/11/14) and doesn’t stop until 08/02/15. Just as I threw the switch to demonstrate the installed work to FACT people after some time spent tweaking, 50+ students turned up directly in front of the screens and immediately started jumping around and taking pictures. That was a good start! Here’s a short screen capture of the piece being put through its paces by a combination of FACT staff and passers-by.

6 Days in Liverpool

I was up in Liverpool again recently for a six-day residency as part of the art-tech, boundary-pushing Syndrome project, curated and produced by Nathan Jones.
‘Choros’ was a collaboration between myself and sound artist Stefan Kazassoglou of Kinicho. Artist and poet Steven Fowler gave a martial-arts-derived performance for the opening event. The venue was 24 Kitchen St, which is a fascinating project in its own right – a constantly evolving arts-centric space at the very beating heart of Liverpool’s Baltic Quarter, full of positivity and an admirable can-do attitude.
[Photo: 24 Kitchen St]

Inspired by a visit to the venue and at all times spurred on by Nathan Jones, Stefan Kazassoglou and I set ourselves the brief of creating a ‘roomstrument’ – an interactive space capable of responding visually and sonically to physical presence and movement, similar to the way in which a musical instrument responds to the physicality of being played. The roomstrument should be ‘playable’ by an individual or by a group. It would be an experimental piece designed to make performers of participants by stimulating and rewarding performative play. The name Choros came about as an anglicisation of the Greek word ‘Koros’, which happens to mean both dance and room – the synonymity appealed to us.

With the benefit of continued encouragement and support from Syndrome, we specified 3 projectors, 3 screens, 3 Kinect cameras and an 8-speaker ambisonic sound system to create a kind of open-sided cube that would form the basis of an interactive space. In the end this relatively complex rig was constructed with a minimum of fuss by the Syndrome team and suppliers, and we found ourselves facing a technically exacting set-up with plenty of calibrated communication required between the video and audio processing. After the inevitable teething issues, we kickstarted our creative workflow and got a basic interactive audiovisual framework up and running on day 4 of the residency, which left us only one day to tweak before the incredible opening performance featuring Steven Fowler. The following day we opened up the work to the wider public…
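The post doesn’t record exactly how the video and audio sides talked to each other, but as a rough illustration of the kind of ‘calibrated communication’ involved, here is a minimal sketch of passing trigger data from the Kinect/video side to a spatial audio engine over OSC. The protocol, addresses, ports and message format are all assumptions for illustration (using the python-osc package), not a record of the actual Choros set-up.

```python
# Hypothetical sketch: sending interaction data from the video side to the audio
# engine over OSC. Assumes the python-osc package; addresses and ports are made up.
from pythonosc.udp_client import SimpleUDPClient

AUDIO_ENGINE_IP = "127.0.0.1"   # assumed address of the machine running the audio engine
AUDIO_ENGINE_PORT = 9000        # assumed port the audio engine listens on

client = SimpleUDPClient(AUDIO_ENGINE_IP, AUDIO_ENGINE_PORT)

def send_trigger(screen_index, x_norm, y_norm, depth_m):
    """Tell the audio engine that a gesture was detected.

    screen_index -- which of the three screens (0, 1 or 2) registered the gesture
    x_norm, y_norm -- position on that screen, normalised to 0..1
    depth_m -- distance from the Kinect in metres
    """
    client.send_message("/choros/trigger", [screen_index, x_norm, y_norm, depth_m])
```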


At the centre of the installation was a lit ‘hotspot’ which we invited visitors to use as a starting point to explore the interactive space. By reaching out towards or approaching any of the 3 screens from this hotspot, a series of ‘control cubes’ was revealed, each of which triggered a sound in the corresponding region of the ambisonic audio space. Thus a control cube in the bottom-left part of a screen triggered an audio element spatially placed at the corresponding location. For the opening performance we used a series of Japanese martial arts-inspired vocal samples recorded by Steven Fowler himself.
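To make that mapping concrete, here is a small sketch of how a detected reach position might be quantised into a control cube and given a matching spatial position in the ambisonic field. The 3×3 grid, the screen azimuths and the angular spreads are all assumptions for illustration; the post doesn’t specify the values actually used.

```python
# Hypothetical mapping from a reach position on a screen to a control cube and an
# ambisonic placement. Grid size, azimuths and spreads are illustrative assumptions.
CUBE_GRID = (3, 3)                               # assumed 3x3 grid of control cubes per screen
SCREEN_AZIMUTH = {0: 0.0, 1: -120.0, 2: 120.0}   # assumed azimuth (degrees) of each screen's centre

def control_cube(x_norm, y_norm):
    """Quantise a normalised screen position (0..1) into a control-cube cell."""
    col = min(int(x_norm * CUBE_GRID[0]), CUBE_GRID[0] - 1)
    row = min(int(y_norm * CUBE_GRID[1]), CUBE_GRID[1] - 1)
    return col, row

def spatial_position(screen_index, x_norm, y_norm):
    """Place the triggered sound roughly where its cube sits on the screen:
    horizontal position offsets the screen's azimuth, vertical position sets elevation."""
    azimuth = SCREEN_AZIMUTH[screen_index] + (x_norm - 0.5) * 60.0   # +/- 30 degrees across the screen
    elevation = (0.5 - y_norm) * 40.0                                # +/- 20 degrees top to bottom
    return azimuth, elevation
```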

Subsequent sound sets featured cello samples specially recorded by Stefan Kazassoglou and vocal samples from the 20th-century philosopher and mathematician Alfred Korzybski discussing how abstraction creates an illusion of reality.

Moving around quickly inside the space triggered a rapid succession of audio elements, resulting in a dramatic if slightly overwhelming mix of near-simultaneous sounds issuing from various locations in 3D space. Conversely, moving more carefully and identifying the points at which audio elements triggered gave visitors the chance to learn how to ‘play’ the work more as an actual ‘roomstrument’.

This experimental piece was a great success judging by the feedback and comments from visitors. As an artistic collaboration, I am sure it has inspired each one of us involved, at both an individual and a collaborative level. Certainly Stefan Kazassoglou and I are keen to develop the ideas and techniques at the heart of Choros…

Massive thanks to all who helped and supported in spirit or in person.

Bye-bye Liverpool until next time…

The after-glow of Luminescence

Luminescence was installed at firstsite for 3 days in April. Due to the crescendo of effort required to pull the project together, followed soon after by the Easter holidays and the beginning of a new term, it has taken me a little while to get round to writing up my thoughts and observations.

Not long after I wrote the last post discussing my previous show Electricus, I finally got hold of a Kinect sensor. Having worked exclusively with webcam video input for the last year or more, it was a good time to cross the line into the world of the depth image and positional information. I’m particularly happy that I really ‘maxed out’ the webcam and learned so much from operating within the constraints of RGB video over an extended period before coming in from the cold. The way I see it, there are two main directions to go with the (version 1) Kinect – skeleton tracking, which provides fine control for a restricted number of participants, or exploitation of the depth image, which just gives z information in a grainy, low-resolution black-and-white feed. The first approach sounds way more groovy, right? But the second approach is to my mind essentially more open, in the sense that it can be used to set up multi-participant interactivity with the minimum of calibration or initialisation. I really like the idea of an ‘uninvigilated’ interactive space as opposed to the invigilated version in which perhaps only 2 people are allowed to approach the sensor at a time. But of course, it’s all a matter of fitness for purpose and I’m sure I’ll be maxing out the skeleton tracking functionality before long!

So what of the depth image? For a start, using a threshold-type filter it is simple to set up an active zone in front of the sensor, thereby knocking out the background or any other unwanted objects. For Luminescence, I developed functionality that locates objects/people and draws lines around them in an approximation of their silhouette. Because the lines are drawn using a cluster of points located on the edge of a person/object’s shape, it is straightforward enough to calculate the average position of a cluster, which roughly equates to the centre of a drawn shape. Once more than one shape occurs, the piece joins them together dramatically. So an individual might dwell at the edge of the active zone and experiment by putting only parts of the body (e.g. hands, face) into the active zone and seeing how these all connect up. A group of participants might just jump around in front of the screen and watch how the connections between their respective body shapes light the screen up.
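For readers who want the gist in code, the following is a minimal sketch of the approach described above – thresholding the depth image to an active zone, tracing silhouettes, averaging each silhouette’s edge points to get a centre, and joining the centres. It uses OpenCV and NumPy on a generic depth frame; the thresholds, minimum blob size and drawing style are assumptions, and the original piece was not necessarily built this way.

```python
# Sketch of the depth-image approach (OpenCV 4.x, NumPy); thresholds and sizes are assumptions.
import cv2
import numpy as np

NEAR_MM, FAR_MM = 800, 2500   # assumed bounds of the 'active zone' in millimetres

def process_depth_frame(depth_mm):
    """depth_mm: 2D uint16 array of depth values in millimetres (e.g. from a Kinect v1)."""
    # Threshold-type filter: keep only pixels inside the active zone, knocking out the background.
    mask = ((depth_mm > NEAR_MM) & (depth_mm < FAR_MM)).astype(np.uint8) * 255

    # Each remaining blob is a person/object; its contour approximates the silhouette.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    canvas = np.zeros((*mask.shape, 3), dtype=np.uint8)
    centres = []
    for c in contours:
        if cv2.contourArea(c) < 500:   # ignore small noise blobs
            continue
        # Draw the silhouette outline from the cluster of edge points.
        cv2.polylines(canvas, [c], isClosed=True, color=(255, 255, 255), thickness=2)
        # The average position of the edge points roughly equates to the shape's centre.
        centres.append(tuple(int(v) for v in c.reshape(-1, 2).mean(axis=0)))

    # Once more than one shape occurs, join the shapes together.
    for i in range(len(centres)):
        for j in range(i + 1, len(centres)):
            cv2.line(canvas, centres[i], centres[j], (0, 255, 255), thickness=2)
    return canvas
```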
Visitor response was very positive. I was in the installation space or close by for the duration of the event at firstsite, which gave me ample opportunity to observe participant interaction with the piece, chat with visitors and generally appreciate how people engaged with it.

[Photos: mother and son interacting with the piece]

From Electricus to Luminescence

Electricus is dead, long live Electricus! Well, not dead, just put to bed for now while I start to work on my next show ‘Luminescence’, to be exhibited at firstsite in April.

Putting on a show at Ipswich Town Hall was certainly a memorable experience. Although Gallery 3 feels slightly off the beaten track, it’s such a hugely impressive room – with enormous sash windows and ornate plasterwork, not to mention several large 19th-century maritime-themed oil paintings hanging high – that one cannot help but feel slightly in awe. Luckily the 4-metre-wide, 3-metre-high back-projection screen I had installed across one corner of the room was big enough to hold its own within the space.

The exhibition opened to a queue of people thanks to Sarah Jacques of University Campus Suffolk, who brought her Fine Art undergraduates along to have a look and ‘show some energy’. I gave an impromptu presentation, which was fun 😉 Saturday was the busiest day thanks to a great feature by Wayne Savage which appeared in the East Anglian Daily Times and Ipswich Star newspapers the previous day.

This was the first time I had worked with back rather than overhead projection, and it really does make things much easier in terms of positioning, provided that the necessary space behind the screen is available. In general, the show was a definite success, with lots of minor production lessons learned and quite a few in-depth conversations with visitors, which were really informative.

I’ve been working with outlines and sparks for some time now and will probably take a slightly different route forward. I hope to post samples of some of the ideas I’m working on for Luminescence over the coming weeks.


Electricus, Jamie Gledhill 2014

Spark Shower at Slack Space

As part of the ‘Lights in the Dark’ exhibition at Slack Space Colchester I exhibited ‘Spark Shower’, a development of an individual element of my 2013 masters project ‘Mirrornoise’. It was a positive experience, not least because it was the first time I have exhibited an interactive work in Colchester, where I am based. Slack Space is a very supportive organisation despite the meagre resources they have to hand, and I commend them for giving artists such as myself an opportunity to show new work.

I tried something a little different from previous shows, with the sound design cycling through a series of rhythmic sequences interspersed with periods of rest, all superimposed upon responsive sound elements such as spark noises. Each time a new rhythmic sequence began, a visual change occurred, such as a change of spark colour. Although pretty simple at heart, the transformative aspect of this approach seemed to work.
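As a rough illustration of that structure, here is a small sketch of the cycle – each new rhythmic sequence also switches the spark colour, and every sequence is followed by a rest. The sequence names, durations and colours are made up, and the actual piece was not necessarily written in Python.

```python
# Hypothetical sketch of the cycling structure: rhythmic sequences, rests between them,
# and a visual change (spark colour) whenever a new sequence starts.
import itertools
import random
import time

SEQUENCES = ["rhythm_a", "rhythm_b", "rhythm_c"]                  # assumed sequence names
SPARK_COLOURS = [(255, 180, 60), (80, 200, 255), (255, 80, 160)]  # assumed RGB colours
SEQUENCE_SECS, REST_SECS = 40, 15                                 # assumed durations

def run_cycle(play_sequence, set_spark_colour):
    """Cycle forever through the rhythmic sequences.

    play_sequence and set_spark_colour are callbacks into the (hypothetical)
    audio and visual layers; responsive elements such as spark noises would
    keep running independently underneath."""
    for name in itertools.cycle(SEQUENCES):
        set_spark_colour(random.choice(SPARK_COLOURS))  # visual change as the new sequence begins
        play_sequence(name)                             # start the rhythmic sequence (non-blocking)
        time.sleep(SEQUENCE_SECS)                       # let the sequence run
        time.sleep(REST_SECS)                           # period of rest before the next sequence
```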

Here is an excerpt of video shot of the screen during the private view.