Monday, October 24, 2011

vj labor: TouchOSC on iPad and VDMX

TouchOSC on iPad and VDMX from CPU on Vimeo. Getting ready for the Media Jam Friday night in Toronto - setting up the iPad for bidirectio...
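The excerpt cuts off, but "bidirectional" here just means OSC flowing both ways: the iPad sends control values to the laptop, and the laptop sends values back so the TouchOSC faders stay in sync with what VDMX is doing. VDMX speaks OSC natively, so there the round trip is configuration rather than code; as a stand-in, here is a minimal Processing/oscP5 sketch of the same two-way pattern. The IP address is a placeholder, and /1/fader1 plus ports 8000/9000 are just TouchOSC's usual defaults.

```java
// Minimal two-way OSC relay, standing in for the TouchOSC <-> VDMX setup described.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress ipad;

void setup() {
  osc = new OscP5(this, 8000);                   // listen on TouchOSC's default outgoing port
  ipad = new NetAddress("192.168.0.20", 9000);   // placeholder iPad address : TouchOSC incoming port
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/1/fader1")) {         // a default TouchOSC layout address
    float v = m.get(0).floatValue();             // value from the iPad fader
    println("fader1 -> " + v);
    // Echo the value back so the fader position is confirmed/updated on the iPad.
    OscMessage reply = new OscMessage("/1/fader1");
    reply.add(v);
    osc.send(reply, ipad);
  }
}

void draw() { }   // nothing to render; this sketch only relays OSC
```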

Tuesday, January 18, 2011

scoreLight synesthesia laser tracking


"scoreLight" is a prototype musical instrument capable of generating sound in real time from the lines of doodles as well as from the contours of three-dimensional objects nearby (hands, dancer's silhouette, architectural details, etc). There is no camera nor projector: a laser spot explores the shape as a pick-up head would search for sound over the surface of a vinyl record - with the significant difference that the groove is generated by the contours of the drawing itself.

Sound is produced and modulated according to the curvature of the lines being followed, their angle with respect to the vertical, as well as their color and contrast. Sound is also spatialized; panning is controlled by the relative position of the tracking spots, their speed and acceleration. "scoreLight" implements gesture, shape and color-to-sound artificial synesthesia; abrupt changes in the direction of the lines trigger discrete sounds (percussion, glitches), thus creating a rhythmic base (the length of a closed path determines the overall tempo).
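The scoreLight code itself isn't published here, so the sketch below is only a toy illustration of the mapping the paragraph describes: trace a path with the mouse (standing in for the laser spot following a doodle), estimate local curvature and position, and map them to pitch, pan and percussive triggers. It prints the mapped parameters rather than synthesizing audio.

```java
// Toy illustration only (not scoreLight's code): map the local geometry of a traced
// path to sound parameters: curvature to pitch, horizontal position to stereo pan,
// abrupt direction changes to percussive triggers.
ArrayList<PVector> path = new ArrayList<PVector>();

void setup() {
  size(600, 400);
}

void draw() {
  background(255);
  stroke(0);
  noFill();
  beginShape();
  for (PVector p : path) vertex(p.x, p.y);
  endShape();

  if (path.size() > 2) {
    PVector a = path.get(path.size() - 3);
    PVector b = path.get(path.size() - 2);
    PVector c = path.get(path.size() - 1);
    // Turning angle between the last two segments: a cheap local curvature measure.
    float turn = PVector.angleBetween(PVector.sub(b, a), PVector.sub(c, b));
    float freq = map(turn, 0, PI, 200, 2000);       // curvature to pitch
    float pan  = map(c.x, 0, width, -1, 1);         // spot position to stereo pan
    boolean hit = turn > radians(60);               // sharp corner: percussive trigger
    fill(0);
    text(nf(freq, 0, 1) + " Hz   pan " + nf(pan, 0, 2) + (hit ? "   HIT" : ""),
         10, height - 10);
  }
}

void mouseDragged() {
  // Only add a point if the "spot" actually moved, to avoid zero-length segments.
  if (path.isEmpty() || dist(mouseX, mouseY,
      path.get(path.size() - 1).x, path.get(path.size() - 1).y) > 2) {
    path.add(new PVector(mouseX, mouseY));
  }
}
```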

Elliot Woods, Kinect-Augmented Reality, as Projection Mapping Meets Depth Sensing


Kinect Hadouken from Elliot Woods on Vimeo.


Elliot Woods writes with an extraordinary proof of concept: it couples the depth-sensing capabilities of Microsoft’s Kinect with projection mapping to effectively “scan” a 3D scene. It’s almost Holodeck good, from the looks of the potential here.
Kinect hack + projection mapping = augmented reality (+ hadoukens): Using the Kinect camera, we scan a 3D scene in realtime. Using a video projector, we project onto a 3D scene in realtime. By combining these, we can reproject onto the geometry to directly overlay image data onto our surroundings, contextual to their shape and position.
As seen in the video, we can create a virtual light source which casts light onto the surrounding surfaces as a real light source would.
At Kimchi and Chips we are developing new tools so we can create new experiences today. We share these new techniques and tools through open source code, installations and workshops.

More on the Kimchi and Chips blog
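The core trick described above - turning a per-pixel depth buffer into shading you can send back out through the projector - can be sketched in a few lines. The following is a rough Processing illustration, not Kimchi and Chips' code: it estimates surface normals from depth gradients and lights each pixel from a virtual point light; a synthetic depth map stands in for real Kinect data, and the units (pixels vs. millimetres) are deliberately loose.

```java
// Rough illustration: shade a depth buffer from a virtual light, as you would before
// reprojecting the result onto the scanned surfaces. Synthetic depth stands in for Kinect data.
int w = 320;
int h = 240;
float[] depth = new float[320 * 240];   // depth in millimetres, one value per pixel

void setup() {
  size(320, 240);
  // Fake scene: a flat wall 2 m away with a bump in the middle.
  for (int y = 0; y < h; y++) {
    for (int x = 0; x < w; x++) {
      float d = dist(x, y, w / 2, h / 2);
      depth[y * w + x] = 2000 - max(0, 400 - d * 3);
    }
  }
}

void draw() {
  PVector lightPos = new PVector(mouseX, mouseY, 1200);   // virtual light follows the mouse
  loadPixels();
  for (int y = 1; y < h - 1; y++) {
    for (int x = 1; x < w - 1; x++) {
      int i = y * w + x;
      // Screen-space normal from neighbouring depth samples (points back towards the camera).
      float dzdx = depth[i + 1] - depth[i - 1];
      float dzdy = depth[i + w] - depth[i - w];
      PVector n = new PVector(dzdx, dzdy, -2);
      n.normalize();
      // Direction from this surface point towards the virtual light.
      PVector l = PVector.sub(lightPos, new PVector(x, y, depth[i]));
      l.normalize();
      float lambert = max(0, n.dot(l));   // simple diffuse shading
      pixels[i] = color(lambert * 255);
    }
  }
  updatePixels();
}
```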


Updated: The duo is Kimchi and Chips; it's a girl + guy, Seoul + Manchester team. I'm on it.


This video nicely shows some of the process of making this work:




createdigitalmotion

Tuesday, November 30, 2010

THE PERFECTION OF SUICIDE LIES IN AMBIGUITY









The suicide of Guy Debord on 30 November has led to the former Situationist being caught up in a number of discourses that he may, at one time, have viewed as distasteful. In the 'Guardian Weekend 1994: Review of the Year' (Guardian 31/12/94), Debord was name checked in the 'Those We Have Lost' column alongside two other suicides, rock singer Kurt Cobain and Great Train Robber Buster Edwards (other deaths noted included those of Derek Jarman, Richard Nixon, John Smith, Jackie Onassis, Dennis Potter, Kim Il Sung, Peter Cushing, Karl Popper and Keith Joseph). Clearly, Debord's timing was good because if he'd killed himself at the beginning of the year, the mainstream media may well have forgotten his suicide by its end.


Messages placed on the internet about the suicide included one from Edward A. Shanken who wrote: 'Guy Debord did not kill himself. He was murdered by the thoughtlessness and selfishness of so-called scholars (primarily trendy lit-criters) who colonized his brilliant ideas and transformed his radical politics into an academic status symbol not worth the pulp it's printed on...' This generated a few angry responses, the import of which was that Debord was not another Jim Morrison, Ian Curtis or Kurt Cobain who 'died for our sins.' Shanken didn't address the fact that Debord was utterly obsessed with the notion of 'recuperation,' and that as a consequence, he was to some degree responsible for all the uses made of his work. Debord's version of the Situationist International deposited a good deal of material with archives and museums precisely because it did not want to be forgotten by academia.


John Young used the Net as a soapbox from which to claim that Debord had worked for Mossad: ‘this dazzling and humbling association with real world power beyond the soft-minded literary and philosophical worlds totally mesmerized Debord... The elixitrate mix of sacred and profane literally made Guy drunk with intellectual stimulation and shared worldly risk... the intrigue and daring bond of high mind and base reality was an alchemic transformation of mental to physical like no head-wrought book could come near.' Unfortunately, the intertextual origins of this thesis were plainly evident in Young's claim that he'd learnt of Debord's spook activities from Philip Roth. Young even went as far as asserting that Debord had provided the model for the central character in Philip Roth's novel Operation Shylock: A Confession.


Meanwhile, Malcolm Imrie's obituary in the Guardian of 5 December 1994 absurdly claimed that 'with consummate irony, he (Debord) allowed his work to be republished by Gallimard, entering the pantheon of French literature, just as the pantheon was collapsing.' In the world the Situationists wished to create, such a panegyric would be viewed as supremely ironic. Suicide was an occupational hazard for the Dadaists and Surrealists, perhaps Debord hoped to realise and suppress this tradition by using death as a method of reintegrating himself into the avant-garde. In the meantime, death remains the ultimate commodity, a handy gimmick to help sell works of 'revolutionary theory' in an already over saturated market.



www.stewarthomesociety.org/debord.html

Dynamic, Projection-Mapped Topographies.



Reasons to be thankful: it seems we’re at the beginning of an explosion of projected digital imagery as medium, with the best yet to come. And some of the most compelling work right now deals with the most elemental qualities of this medium, how light and space interact.


Take the work of hc gilje. He shares some of his most recent projects, which include the elegant-looking theatrical projections at top:


I was invited by Trøndelag Teater to do a combined physical set design with video projections. It was an adaptation of the Norwegian literary classic “Fuglane” (“The Birds”) by Tarjei Vesaas, with Harry Guttormsen as director. I created an organic physical form, which combined with the videoprojection became a very dynamic landscape.


Cool as tech like Microsoft's Kinect is, I find myself drawn to work that focuses on the sparest elements, visual etudes in form and composition. I'm particularly interested in this, having been following the writing of John Maeda, whose thinking helped inspire Processing (and, by extension, OpenFrameworks and lots of other stuff).


The other work hc gilje shares fits, to me, in that same vein. How much can you do, in rhythm and space, using only a single line?



snitt (hc gilje 2010) from hc gilje on Vimeo.


An installation for Galleri 21 in Malmö.

A straight line moves slowly through the three rooms of the gallery space, cutting the space into different sections (snitt). The movement of the line, "attacking" the space from different angles, focuses the attention of the viewer on the physical qualities of the space.


The physical properties of the gallery space (the walls, ceiling, floor, door openings, light fixtures etc) modulate and break up the straight line into a continuously evolving pattern of line fragments, depending on the position of the viewer and the angle of the line in relation to the architecture.


The solo show explores the concept of “line-space” — a fascinating proposition, that the one-dimensional line can define a three-dimensional volume. More in his blog post:

Conversations with spaces: snitt


If you do happen to live in a town like Malmö, Sweden, or Oslo, I imagine you’ll have quite a lot of time this season for quiet reflection, and cause to do some projecting of light – the sun spending very little of the day getting in your way. Let us know what thoughts you have, and what light you project.



SOURCE: http://createdigitalmotion.com/

Kinect Hacking: Why It Matters


Interactive Puppet Prototype with Xbox Kinect from Theo Watson on Vimeo.

When Microsoft gobbled up vision technology and announced they were channeling their own research into a product for their game console, artists, researchers, and hackers lamented. It seemed the tech might be destined only for a handful of mainstream game titles.

Hours after the product launch, however, and one open source bounty later, it was clear the opposite was happening: Kinect was opening to new possibilities. Some of the world's leading visual experimenters, many of them regulars in this site's stories, were quickly pulling in data and reimagining what the device could do. And that's just in the first days: given its sophistication, the real potential lies ahead.

I pulled together a number of the artist-hackers to get their thoughts:

Phil Torrone talks to us from Adafruit Industries, who put up the bounty for the project and contributed to the EFF to protect the rights of hackers.
Theo Watson, OpenFrameworks co-originator, is one of the original hackers and has built Mac support.
Memo Akten, OpenFrameworks contributor, is building expressive and artistic applications of the tech.
Kyle McDonald, artist and visual researcher, is working with massive clouds of point data, building on his previous work in 3D scanning.
Dan Shiffman, Processing guru and NYU faculty, is working on tools to make this more accessible to Processing and Java coders.

Adafruit on the Competition to Hack Kinect


Phil Torrone of Adafruit explains what went on behind the scenes as Adafruit Industries offered a bounty to hack Kinect.
CDM: What was important about this particular project?
Torrone: The results speak for themselves, the creative potential was unlocked.
When did you actually make the decision to commit to this?
The day before the Kinect was launched in the USA.
Have you been surprised by anything that’s happened? Was this the pace of progress you anticipated?
We never underestimate the creativity and passion of people who do and love open source.
You of course gave some cash to the EFF and not just the winner … have you had conversations with EFF about how to protect artists working on the project, or the legality of the work?
We did not talk with EFF at all prior to this effort; we did let them know we were sending them $2k after we declared a winner.
With so much going on, what’s the best way for interested parties to keep track of what’s going on?
Likely the Google Group
How can someone best contribute?
There's the Google group, and there's GitHub, where we put our data dump and code.
https://github.com/adafruit/Kinect


Kyle McDonald reaches into 3D virtual space, represented by a massive point cloud, through his Kinect code. Photo (CC-BY-SA-NC) Kyle McDonald; used by permission.

Where should people go to learn more about this stuff?


Shiffman: In terms of doing Kinect with Processing, I think learning the basics of Processing first (duh), with a focus on image processing is probably good:
http://processing.org/learning/pixels/
Watson: There is an openFrameworks wrapper being developed:

ofxKinect @ GitHub

…and active forum threads:

libfreenect discussion @ GitHub

openFrameworks forums
Also the main libfreenect development is happening on GitHub:

libfreenect
There is a big cleanup coming to the API, so things might be in a bit of a state of flux for the next few days, but hopefully soon we will have super solid drivers/APIs for all platforms.
Akten: I’ll be posting my little demos (whenever I find a moment, which will be rare for the next few months unfortunately :S) at

https://github.com/memo/ofxKinect-demos and of course my blog memo.tv

You've probably seen the post on CAN [Creative Applications], which has a good summary of the early demos and the history of how the open source drivers came about (Hector etc.)

http://www.creativeapplications.net/news/kinect-opensource-news/
And I saw a tweet that someone had it working with Cinder.
McDonald: For my work in general, see http://kylemcdonald.net/
For my pre-Kinect 3d scanning work, see:
http://www.flickr.com/photos/kylemcdonald/sets/72157613657773217/

http://code.google.com/p/structured-light

https://groups.google.com/group/structured-light
With Kinect, everything I’ve done with 3d scanning for the last two years is starting to take on a new meaning…

The best places for following Kinect stuff are:
1. the openkinect Google group
2. #openkinect on freenode (super active discussion)
3. the GitHub wiki


Why hack the Kinect in the first place?


McDonald: It’s essential that we develop drivers and libraries for Kinect, because we have to decide what new technology means to us.
Kinect has taken a technology out of academic labs and defense agencies, and put it in our living room. Now we need to decide where we want to point the camera.
Shiffman: A cheap (relatively speaking) “3D” camera is killer technology for the interaction design / computational art community. This kind of tech has been around, but it’s either been too hard to find or prohibitively expensive. I think that you will see a ton of creative uses (in digital art, exhibition design, assistive tech, etc.) that you wouldn’t find if it was only used for console gaming.
Watson: It's a really amazing piece of hardware for a really affordable price. To put it in perspective, I currently have a commercial depth camera on loan which produces a similar quality depth image, and it retails for $7000! That is really way out of reach for most people who might be hobbyists, artists or researchers, but $150 is incredibly cheap for what the technology allows you to do.
Akten: First, check out Kyle’s little poem :)
For me, it’s very simple. I like to make things that know what you are doing, or understand what you are wanting to do, and act accordingly. There are many ways of implementing these ideas. You can strap accelerometers to your arms and wave them around, and have the accelerometer values drive sound or visuals. You can place various sensors in the environment, you can use camera(s) to track movement etc. Ultimately, you create an environment that ‘knows’ what is happening inside it, and responds as you designed and developed it to. What excites me is not the technology, but how you interpret that environment data, and make decisions as a result of it. How intuitive is the interface? You can randomly wire the environmental parameters (e.g. orientation of arm), to random parameters (e.g in audio and/or visuals), and it will be fun for a while, but I don’t think it will have longevity, it won’t be an *instrument* that you can ultimately learn to play and naturally express yourself with. In order to create an instrument, you first need to establish a language of interaction – which is the fun side of interaction design, but you always have the technical challenge of making sure you can create a system which can understand that language. It’s too common to design an interaction, but not have the technical capabilities to detect or implement it – then you have a system which reports incorrectly, and makes inaccurate assumptions resulting in confusing, non-intuitive interaction. So you need a smarter system, and the more data you have about the environment, the better you can understand it, and the smarter, more informed decisions you can make. You don’t *need* to use all the data all the time, but it is there if you need it.
Kinect is ultimately a depth-sensing camera. To put it simply, it returns a normal RGB image just like a webcam, but for every pixel in the image, it also returns a 'distance to camera'. This kind of tech has been around for a while, but very expensive (minimum thousands of dollars), and definitely not a consumer device, more for labs, robotics, military etc. That depth information is a ton of extra data. With that extra data, we are a lot more knowledgeable about what is happening in our environment, we can understand it more accurately, thus we can create smarter systems that respond more intuitively.
One point which is often overlooked – which is a very important point – is not only ‘what can you do with the Kinect that you couldn’t before’, but ‘how much simpler is it technically to do something with the Kinect, as opposed to using other consumer devices’. This really is a very important point. A simple example is the recent rough demo I posted of drawing in 3D with your hands.

ofxKinect 3D draw 001 from Memo Akten on Vimeo.
That is completely possible to do pre-Kinect. You would need two webcams, you would need to set up your lighting quite specifically. You would want control over your background and overall lighting of the space. And then you would need a lot of hairy maths and code. With the Kinect, you just plug it in, make sure there isn't any bright sunlight around, and with a few lines of code you have the information you need. So now that interaction is available for developers/artists of *all* levels, not just hardcore math geeks – and that is very important. Once you have loads of people playing with these kinds of interactions (who pre-Kinect would not have been able to) then we are bound to see loads of really innovative, fresh applications for it. Sure we'll get a ton of "pinch to zoom and rotate the photo" demos which will get sickening after a few thousand, but people will be developing ideas that you or I would never have thought of, but instantly love – which in turn will spark new ideas in us to go off and play with – which in turn will feed others.
It's still really early days yet; it's just been a case of getting the data off the Kinect into the computer, and then seeing what that data actually is, how reliable it is, how it performs, and what we can do with it. Once this gets out to the masses, that's when the fun will start pouring in :)
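As a concrete example of that "few lines of code" claim, here is a minimal sketch of the nearest-point tracking behind a 3D-drawing demo like Memo's: find the closest depth reading, treat it as the hand tip, and append it to a trail. It assumes a Processing Kinect wrapper in the spirit of the openkinect library discussed in this article; the method names (initDepth, getRawDepth) are that wrapper's and may differ between versions.

```java
// Minimal "plug it in and track" sketch: the nearest depth reading is treated as a hand tip.
// The Kinect wrapper calls below are assumed openkinect-style names and may vary by version.
import org.openkinect.processing.*;

Kinect kinect;
ArrayList<PVector> trail = new ArrayList<PVector>();

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  background(0);
  int[] depth = kinect.getRawDepth();      // one distance value per pixel
  int nearestIndex = -1;
  int nearest = Integer.MAX_VALUE;
  for (int i = 0; i < depth.length; i++) {
    int d = depth[i];
    if (d > 0 && d < nearest) {            // 0 means "no reading" on the Kinect
      nearest = d;
      nearestIndex = i;
    }
  }
  if (nearestIndex >= 0) {
    int x = nearestIndex % 640;
    int y = nearestIndex / 640;
    trail.add(new PVector(x, y, nearest)); // keep depth too, for a true 3D drawing
  }
  stroke(255);
  noFill();
  beginShape();
  for (PVector p : trail) vertex(p.x, p.y);  // a P3D renderer could also use p.z
  endShape();
}
```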

What might people do with these tools as artists?


Watson: There is quite a lot that it can be used for. For interactive installations, we are often dealing with trying to track people in a space. Typically this requires careful lighting and IR cameras and it can be quite a tricky issue, but with the Kinect the depth image allows us not only to track people but understand where they are in relation to each other in z-space. This is just one application, however; another really nice feature is that it has pixel-matched color and depth cameras, and this could allow for 'greenscreen-less' live greenscreening. And then of course there is its use as a 3D scanner, for building depth maps, understanding the space around us etc, and more possibilities than I probably realise.
Shiffman: All sorts of things I can’t possibly imagine! (Just the fact that having depth makes background removal so easy is killer for my students.)
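A sketch of the depth-keyed "greenscreen-less greenscreening" that Theo and Dan both point to: keep only the colour pixels whose matching depth value falls inside a near range, and knock everything else out. Same caveat as above - the Kinect calls are assumed openkinect-style names - and note that, as Kyle mentions further down, exact RGB/depth alignment was still on the to-do list at the time.

```java
// Depth-keyed compositing sketch: pixels nearer than a threshold keep their colour,
// everything else becomes transparent. Kinect wrapper calls are assumptions.
import org.openkinect.processing.*;

Kinect kinect;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
  kinect.initVideo();
}

void draw() {
  PImage rgb = kinect.getVideoImage();
  int[] depth = kinect.getRawDepth();
  rgb.loadPixels();
  PImage keyed = createImage(640, 480, ARGB);
  keyed.loadPixels();
  for (int i = 0; i < depth.length; i++) {
    // Raw Kinect depth is roughly 0..2047; treat anything nearer than ~700 as "subject".
    boolean inFront = depth[i] > 0 && depth[i] < 700;
    keyed.pixels[i] = inFront ? rgb.pixels[i] : color(0, 0, 0, 0);  // transparent elsewhere
  }
  keyed.updatePixels();
  background(0, 255, 0);   // stand-in backdrop; composite any footage you like here
  image(keyed, 0, 0);
}
```

Compositing the keyed layer over any other image gives a live "greenscreen" output without a green screen.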
McDonald: I’ve noticed tendencies to work at very different levels of abstraction.
Some people are most interested in the raw data, the inherent glitches, the aesthetic of 3D scanning.
Others are interested in slightly generalized data, maybe the idea of 'scenes' that are being captured and analyzed, reconstructed. Some people are interested in specific applications: object recognition, pose estimation, gestures. These are the most abstracted.
I expect work to come from all different levels, in every different medium.
Sculptors will record and build unusual models of spaces informed by 3D scanning; spatial mash-ups will be standard fare; 3D printing for 3D slit scanning; motion spaces, negative spaces, paths through space over time.
Sound artists and musicians will use the device to control standard audio parameters, or use the values as input parameters to complex synthesis environments and for controlling spatialized sound with large speaker arrays.
Photographers will work with long exposures in combination with 3d-reactive projection to augment layers of the space over time.
Interaction designers will invent new gestures and modes of interaction specifically targeted at the strengths of the sensor.
Interactive art will experience a minor renaissance as a variety of tasks that were previously very difficult become very simple (e.g., tracking someone against a background that is the same color, or even tracking someone against a moving background)
… etc., etc. :)

XBox Kinect running on OS X (with source code) from Theo Watson on Vimeo.

What’s technically possible with the libraries now; what’s coming?


Watson: At the moment, we can get back the depth image and color image from the two cameras, and access the motor, LED and the accelerometer of the device. Some developers are now working on accessing the four microphones, which allows for location of sounds in 3D space. Also, a big part of the Kinect as it relates to the Xbox is the full body skeletal tracking, which from a researcher or artist's perspective is a very valuable feature. This is implemented in software on the Xbox and is the result of many years' work by some of the top people in the field. A big part of the future research will be at the software level, developing tools that build off of and extend the functionality of the hardware, like open source implementations of the realtime skeletonization code.
McDonald: The general rundown is that Linux is fastest, OS X is 5-10 fps behind, and Windows is just starting to work.
ofxKinect was originally developed by Dan Wilcox and Theo Watson, with some minor contributions from me, and is now also being developed by Arturo Castro. It runs well on OS X and Arturo is still adding Linux support.
https://github.com/ofTheo/ofxKinect
Right now it’s only possible to get the RGB and depth images, and to get the depth image in centimeters (which is not what the sensor returns by default). Next will be alignment of the RGB and depth images, and of course making it cross platform. Other suggestions are on the OF forum.
Shiffman: Right now the library just returns two pixel arrays (640×480 RGB image and 640×480 image with depth mapped to grayscale). My to-do list is (a) make all the raw data available, (b) optimize for speed, and (c) add any little analysis tricks / features that might be particularly useful. Basically, anything people do with the openkinect project and OF, I’ll try to add as a feature for Java / Processing.
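Taking Dan's description at face value - two 640×480 pixel arrays, one RGB and one with depth mapped to grayscale - a consumer of that library might scatter them into a rough point cloud like this. The method names here (getVideoImage, getDepthImage) are placeholders standing in for whatever the wrapper actually exposes.

```java
// Rough point cloud from the two images Dan describes: colour from the RGB frame,
// depth from the grayscale depth frame. Kinect wrapper calls are placeholders.
import org.openkinect.processing.*;

Kinect kinect;
int skip = 4;   // sample every 4th pixel so the sketch stays interactive

void setup() {
  size(800, 600, P3D);
  kinect = new Kinect(this);
  kinect.initDepth();
  kinect.initVideo();
}

void draw() {
  background(0);
  PImage rgb = kinect.getVideoImage();
  PImage depthImg = kinect.getDepthImage();   // depth mapped to grayscale
  rgb.loadPixels();
  depthImg.loadPixels();
  translate(width / 2, height / 2, -500);
  rotateY(map(mouseX, 0, width, -PI, PI));    // move the mouse to orbit the cloud
  for (int y = 0; y < 480; y += skip) {
    for (int x = 0; x < 640; x += skip) {
      int i = y * 640 + x;
      float z = map(brightness(depthImg.pixels[i]), 0, 255, 600, -600);
      stroke(rgb.pixels[i]);
      point(x - 320, y - 240, z);
    }
  }
}
```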


Stay tuned to CDMotion for more… and let us know if you have specific comments or questions, or have seen work that inspires you. Ed.
More reading…
Fantastic round-up of what’s happened so far from our friend Creative Applications Network:

Kinect – OpenSource
Memo reflects on his blog…

Kinect – why it matters
And on Music, I’ve got more for anyone interested in MIDI or C#/.net:

Kinect with MIDI

SOURCE: http://createdigitalmotion.com

Archigram Archival Project


The Archigram Archival Project makes the work of the seminal architectural group Archigram available free online for public viewing and academic study. The project was run by EXP, an architectural research group at the University of Westminster. Archigram began life as a magazine produced at home by the members of the group, showing experimental work to a growing, global audience. Nine (and a half) seminal, individually designed, hugely influential, and now very rare magazines were produced between 1961 and 1974. The last 'half' was an update on the group's office work rather than a 'full' Archigram magazine. The six members of Archigram are Peter Cook, David Greene, Mike Webb, Ron Herron, Warren Chalk and Dennis Crompton. Cook, Greene and Webb met in 1961, collaborated on the first Archigram magazine, later inviting Herron, Chalk and Crompton to join them, and the magazine name stuck to them as a group.



More than 200 projects are included in the Archigram Archival Project. The AAP uses the group's mainly chronological numbering system and includes everything given an Archigram project number. This comprises projects done by members before they met, the Archigram magazines (grouped together at no. 100), the projects done by Archigram as a group between 1961 and 1974, and some later projects.

How it would be, if a house was dreaming...

By far my favourite facade projection. This one, 555 KUBIK, beats all the others.

555 KUBIK | facade projection from urbanscreen on Vimeo.

Digital Flesh


Making Future Magic: iPad light painting

Making Future Magic: iPad light painting from Dentsu London on Vimeo.


Funky Forest





Funky Forest - Interactive Ecosystem from Theo Watson on Vimeo.

Funky Forest is a wild and crazy ecosystem where children manage the resources to influence the environment around them. By using their bodies or pillow "rocks" and "logs", water flowing from the digital stream on the floor can be dammed and diverted to the forest to make different parts grow. If a tree does not receive enough water it withers away but by pressing their bodies into the forest children create new trees based on their shape and character. As children explore and play they discover that the environment is inhabited by a variety of sonic life forms and creatures who appear and disappear depending on the health of the forest.  As the seasons change the creatures also go through a metamorphosis.


Emergent Urbanism: MadDecent - MajorLazerApp - FaceTracking



"Emergent Urbanism: MadDecent - MajorLazerApp - FaceTracking: 'Here i
s a quick application Vik made using the OpenCv with processing (JAVA). ..."
Download it here!
http://emergenturbanism.blogspot.com/
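The post doesn't include Vik's code, so the following is only a guess at the basic shape of an OpenCV-in-Processing face tracker, assuming the hypermedia OpenCV wrapper that was common at the time; the cascade constant and detect() call belong to that library and are an assumption, not the MajorLazerApp source.

```java
// Sketch of a basic face tracker, assuming the hypermedia OpenCV wrapper for Processing.
import hypermedia.video.*;
import java.awt.Rectangle;

OpenCV opencv;

void setup() {
  size(320, 240);
  opencv = new OpenCV(this);
  opencv.capture(width, height);                    // grab frames from the webcam
  opencv.cascade(OpenCV.CASCADE_FRONTALFACE_ALT);   // load the frontal-face Haar cascade
}

void draw() {
  opencv.read();                                    // next camera frame
  image(opencv.image(), 0, 0);
  Rectangle[] faces = opencv.detect();              // one rectangle per detected face
  noFill();
  stroke(255, 0, 0);
  for (Rectangle f : faces) {
    rect(f.x, f.y, f.width, f.height);              // overlay graphics would go here
  }
}
```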

Monday, February 15, 2010

Cue Vj Hardware concept developed by



Cue Vj hardware concept

Cue, a software-hardware combo, turns your laptop into a complete VJ system. The main features are a 15.4″ touchscreen and a linear moving knob-fader. Certain areas of the screen represent information in context with the position of the knob-fader. Alterations with the knob-fader are displayed by movement, scaling and rotation of graphics. These elements gain complexity at higher zoom levels and become touch sensitive. The visual jockey is able to mix loops, apply effects in an intuitive way, and access a vast library.





Motion Pad for iPad





We're looking forward to testing this, and the fact that it will be available for iPad means in theory it should work for iPhone, although that would be a fiddle. It's obviously in its infancy in development, though, as the mixer refers to blending mode as 'brending mode'. A quick sketch-up? I think so.

Motion Pad http://blog.inokuman.in/?eid=63

ChairTV featured in IDN Magazine, thanks to Mayo



Thanks to Jason Mayo, we had our logo featured in IDN magazine.

new site preview http://www.jasonmayo.co.uk/chairtv/index.html

Jason Mayo site:

http://www.jasonmayo.co.uk/

Stoneshaper






http://stoneshaper.blogspot.com/

BBC iPlayer-Grabber




iPlayer-Grabber

Tuesday, March 10, 2009

SMD WAREHOUSE CLIP---LINKS TO MORE










Check the tube for more

Dottie!

Simian Mobile Disco - Synthesise

Simian Mobile Disco - Synthesise. Live Visual Performance. Directed by Kate Moross & Alex Sushon. !WARNING THIS VIDEO CONTAINS FLASHING IMAGES! Familiar Concept guys!

THE END (A LATE TRIBUTE)




Goodbye Baby
For every beginning there has to be an end.

The End has announced that the club will be closing those legendary metal doors for good in January 2009. The End & AKA will be closing in style, giving the venues the send-off they deserve, and the chance for the different DJs and nights to say goodbye. We will be open as normal through autumn, with September, October and November's parties rocking as usual. The farewell begins with The End's 13th birthday on December 6th, with long-time favourite Sven Vath headlining. There will follow a series of closing parties featuring The End's closest DJs and promoters, and then a grand closing weekend on January 23rd and 24th. We talked to The End's directors, Layo and Zoe Paskin, about the end of The End.

You’ve announced that The End & AKA are to close, after 13 years of parties. Can you explain a bit about the decision?

Layo - It’s partially a circumstantial decision, and partially a decision of choice. I started working on The End nearly two years before it opened - we’re approaching our 13th birthday, so nearly fifteen years. And one of the great things about The End & AKA is that the same team has remained in essence - Liam has been the manager of the club since The End opened, likewise Ty who now manages AKA. There are other members of staff who have been here for ten or eleven years, and we’ve got to the point where some of the key people are ready to move on. The End won’t feel the same if we aren’t doing it together. This has never been solely a business, it’s a labour of love, and a great part of the charm is the team, and running it with that team.

Zoe - We stopped and thought what The End meant to us around the 10th birthday – we did a lot to celebrate, including putting together a biography of the club. And after that, we started thinking about how we wanted to move on from there.

An opportunity arose that made us think even more deeply about what we wanted for The End in the future. Not just for today and tomorrow, but for several years moving forward. We discussed with the people that make the club what it is behind the scenes whether we wanted to carry the club on, and whether we could all continue to be as engaged as we had been. It was inevitable that people would eventually want to move on, and the chance to do this was in front of us.

Layo - Sometimes you just have to choose the moment.

So it feels like the right time in terms of what the club has achieved?

Layo - We wanted to do something very unique – and we’ve done that. Even though the past few years have arguably been our most successful as a club and as a business, I don’t particularly believe that there’s anything more that we can achieve.

The End & AKA have fundamentally been nothing but a success, and we’ve enjoyed the residencies, and working with the very best in clubland. The End isn’t a club where we over-market ourselves – we attract talent, fantastic DJs and promoters, and we work with that talent to promote certain styles of music and parties. And that’s why after all these years it’s kept such a cool core crowd, because it isn’t over exposed. The music varies through the month, different nights appeal to different people. We’ve had to compete 365 days a year, and we have to run it unbelievably well to compete – I think our standards here are fantastic, but equally, it does take a lot out of you. Plus the option to be able to close on a high appeals to our romantic nature.

Zoe - The End has always been about West Central Street, not loads of different things exploiting the brand. That’s always what we wanted to do, what we were good at, what we were interested in. We feel that we’ve done that, and that part of the success is about the attention to detail, the love that goes into the place, and the team. When we looked at it closely, we felt that our time had come. And so we started to think about ending on a high, giving the club the departure that it deserves.

Did you think about handing over the club to other people to run it for you?

Layo - I’d never believed that other people could run it in the same way. There’s a lot involved, you have to give more of yourself than just doing it as a job, anyone who’s going to run it has to commit for a fair amount of time. You can’t mess around, because it’s built on relationships, it’s built on understanding the scene. I never felt that someone else could – I may have been wrong, but that never felt a natural thing.

Zoe - It’s the first thing that friends ask when I’ve mentioned moving on – “What would happen? Who would do your job?” All of us in our different ways contribute to what The End is, but I can’t really picture The End without the core people that manage it. They really define the place, the environment, the ethos, the mood of working here, which then infiltrates into the whole culture and the creativity of what we do. I just couldn’t picture replacing any of us – not in an arrogant way, but I couldn’t see how you’d bring other people in and it still be The End as we know it.

Was it a hard decision to make?

Layo - It’s still not an easy decision, I’m going through a lot of different emotions. On the one level I want to do other things with my life, and it’s a good moment to start embarking on that – you can stay in something too long. But the loss will be huge on another level – it’s my life. Of all the things I do, there’s nothing that I get as much pleasure from as everything to do with The End. The idea of not working with everyone here, of not being involved with the creation, feels very odd. But I also feel it’s a good thing – I think it’s wonderful to be able to open a place, run it, and close it all on your own terms.

My father and I built this place alongside Mr. C, and my sister and I ran the place. The pride that I feel in doing this with them is beyond words. We did it as equals, but we all learnt from one another along the way. In a very traditional sense it was like a family business, but the business itself was anything but traditional.

Zoe - Of course it was a hard decision – I don’t really remember my life without The End in it. But I also feel quite excited - on a personal level. It’s been a very tough decision, I can’t yet really picture what it’s going to be to close the doors on the last day, it’s an impossible emotion to try and reach. At the moment it’s a really day by day thing.

I do know however that it has ended up being a family affair, by default not by design, and this is one of the things that will stay with me the most. Layo and I will always have that to share – it’s something very unique.

So tell us a bit more about the actual sale…

Zoe – Well it wasn’t a purely financial decision – it was about beginning again for us. We felt that we’d done what we could do here, and we wouldn’t want to be in a position where we’re just repeating ourselves. So the choice we were faced with was either how to develop The End, or how to let go.

Layo - We’ve had offers for the club before, and yes, this was the best one. But I wouldn’t say that it’s so good that it makes this a purely financial issue – it’s a circumstantial issue, coupled with financial. If it was seven or eight years ago, an offer like this wouldn’t have been accepted. It’s a lot to do with the timing.

Can you tell us anything about the closing parties?

Layo – Well you know us, and you know what we do. We’re speaking to all the key people and we’re not going to go out on anything other than the most massive bang. Because of the nature of how we run the club, with all the different nights and promoters, it will be a whole series of parties. They’ll begin with Sven Vath at the 13th birthday in December, and run through to the final party on 24th January. There will be different closing parties for different nights and promoters, with different DJs for different crowds, and then a very grand final weekend.

Zoe – We may be closing, but I see the final parties as one long celebration.

What are you going to miss?

Layo - I’m going to miss the warmth, the creativity and the humour of the office and the venues. I think that the working rapport will carry with me more than anything else, for the rest of my life. I will desperately miss playing at the club. God knows how many times I’ve played here – the first Saturday of the month for thirteen years. I will miss the feeling of being part of The End, with the DJs and promoters, that whole creative energy. I’ll miss working with my sister which I’ve enjoyed immeasurably. And I’ll miss the power to have a meeting, get ideas together with a group of people, and just decide that we’re going to do it. It’s not that easy to do in life. But I'll miss it all when it's gone.

Zoe - First and foremost, the people I work with. For me, without doubt, the thing I’ve enjoyed the most here is developing the team and the personal relationships. The rapport, respect and admiration that I have for people like Liam and Ty amongst others, I can’t really find words for what they’ve done for the venues, and the wider team. There are people who aren’t here now that still come back – when we had AKA’s 10th birthday recently people flew in from abroad. It’s very moving for me that it’s captured their heart so much.

I suppose I’ll miss everything – the dynamics, the banter with the security team, the spirit of the place, the street, the atmosphere, the whole culture. Turning the corner onto West Central Street, or standing on any of the dancefloors and seeing people having a fantastic time, time and again. It goes on and on.

Some people are going to find it a great loss to London…

Zoe – It’ll feel like a void but other things will come through though, new venues. The scene is always evolving but I imagine they’ll always remember The End & AKA.

Layo - The End is a very unique place. When The End is rocking, there’s very few clubs in the world that even get close to it. And that’s on a lot of nights, in a variety of different ways – from house and techno, to drum & bass, to nights like Trash and Durrr, to the afterhours parties. That’s very special – I don’t know many clubs that can do that, and I think it’ll be a loss to London.

I’ve only played in a few places around the world where I feel people put in the same level of love that’s put in it here, from reception to the managers, to the bar staff, to the sound guys. It’s very rare that you get that, and there’s almost nowhere in the UK that comes close. I do believe the old saying about nature abhorring a vacuum, and I do believe that someone else will create something. But that’s the nature of it – things come, things go, it’s always moving.

What would you say to the regulars from the club - whether they’ve been here from day one, or joined us in the last few years?

Zoe - We set this up as a business, but it's also about this mad culture of creating a lot of fun for people. The energy to do that comes from the public – we give them the platform in which to have the fun, and we've been very lucky, to this day, to have such a fantastic audience. So I hope that everyone will understand our reasons for going, and that The End has moved on. And of course… thank you.

Layo - There is nothing to say but thank you. I hope you will remember some of your time spent here, and that your memories give you as much pleasure as mine do for me.

Published: 1/09/2008

iPhone hacks



Step-by-Step Guide to Jailbreak iPhone 3G running Firmware 2.2.1 using QuickPwn 2.2.5 (Mac)



The Dev Team had warned iPhone 3G users against updating to firmware 2.2.1, as it also included a baseband update which breaks the "injection hole" they had used in their application yellowsn0w to modify the firmware, which had helped to successfully unlock iPhone 3G.

So if you want to update your iPhone 3G to firmware 2.2.1 but at the same time preserve its baseband so that you can also unlock it then you should follow our step-by-step guide to use Dev Team's PwnageTool.

If you don't want to unlock your iPhone 3G now or anytime in the future and only want to jailbreak it to install jailbreak iPhone apps then you can update your iPhone 3G with firmware 2.2.1 using iTunes and then use QuickPwn to jailbreak it.

So here is our updated step-by-step guide to jailbreak iPhone 3G using QuickPwn for Mac users updated for iPhone firmware 2.2.1.

MPTVOutWindow Hacks?

Confirmed by Ars Technica, a programming class in the iPhone SDK known as MPTVOutWindow does essentially what its name implies: it sends video out through the Dock Connector port to an outside source, such as an external screen.

The adaptation is primarily meant for movies but allows any program to export the current contents of the screen, theoretically allowing apps intended for screens larger than the 3.5-inch iPhone LCD.

After testing, however, well-known iPhone developer Erica Sadun notes that touch input is disabled in this mode with current implementations and so prevents using the iPhone as a regular controller for games or presentations. It does recognize accelerometer input for a basic level of control.

The code writer also observes that the programming call appears designed for the phone's landscape view rather than the upright portrait mode for most iPhone apps, forcing developers eager to use the new mode to adjust for the realignment.

But while these limitations have already been discovered, developers have already written basic code and plan to explore the MPTVOutWindow function more in the near future to see what it will do; an example of this is provided below.

Surely there is a way to modify your apps to display them fullscreen??


iPhone Video Out from Ars Technica on Vimeo.