Monday, 28 February 2011

Design Development

This week I have been trying to understand the structural system of my proposal in a little more depth. This is slightly trickier than I would have liked, firstly due to the scale of the proposal and secondly due to its form and tessellation. Anyway, to make sense of it I started to look at the main roofscape throughout the site and the connecting circulation ramps, as illustrated below.

In order to understand the structure of the individual buildings, I need to do more studies into the actual programme of each building and determine a layout which works in conjunction with the structural lattice detected on site. To do this I started to look at some of the key spaces in my building:
These perspectives are fairly indicative at this stage, but they illustrate the irregularities in the structural system that will be present throughout the building.

Wednesday, 23 February 2011

Tracking

The drawings below highlight the process undertaken for generating a structural composition on site, using a camera to carry out the 3D detection.
The point cloud data from the detection is then fed into Processing and converted into a 3D lattice using a particle-spring algorithm, which connects neighbouring nodes based on their distance.
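The sketch below is a rough illustration of this conversion step (not the exact code I'm running): it assumes the point cloud has already been reduced to a simple array of points, stood in for here with random positions, and it links any pair of nodes closer than a threshold with a spring before relaxing the lattice.

// Minimal sketch of the lattice-generation step, assuming the point cloud
// has already been reduced to an array of PVectors (here faked with random
// points standing in for the camera detection data).

int NUM_POINTS = 150;      // stand-in for the detected point cloud
float LINK_RADIUS = 60;    // connect nodes closer than this
float REST_LENGTH = 45;    // target length of each spring
float STIFFNESS = 0.05;    // spring constant
float DAMPING = 0.9;       // velocity damping per frame

PVector[] pos, vel;
ArrayList<int[]> springs = new ArrayList<int[]>();

void setup() {
  size(800, 600, P3D);
  pos = new PVector[NUM_POINTS];
  vel = new PVector[NUM_POINTS];
  for (int i = 0; i < NUM_POINTS; i++) {
    pos[i] = new PVector(random(width), random(height), random(-200, 200));
    vel[i] = new PVector();
  }
  // Connect neighbouring nodes based on their distance.
  for (int i = 0; i < NUM_POINTS; i++) {
    for (int j = i + 1; j < NUM_POINTS; j++) {
      if (PVector.dist(pos[i], pos[j]) < LINK_RADIUS) {
        springs.add(new int[] {i, j});
      }
    }
  }
}

void draw() {
  background(255);
  // Relax the lattice: each spring pulls or pushes its two nodes
  // towards the rest length.
  for (int[] s : springs) {
    PVector d = PVector.sub(pos[s[1]], pos[s[0]]);
    float stretch = d.mag() - REST_LENGTH;
    d.normalize();
    d.mult(stretch * STIFFNESS);
    vel[s[0]].add(d);
    vel[s[1]].sub(d);
  }
  for (int i = 0; i < NUM_POINTS; i++) {
    vel[i].mult(DAMPING);
    pos[i].add(vel[i]);
  }
  // Draw the resulting 3D lattice.
  stroke(0);
  for (int[] s : springs) {
    line(pos[s[0]].x, pos[s[0]].y, pos[s[0]].z,
         pos[s[1]].x, pos[s[1]].y, pos[s[1]].z);
  }
}

The linking radius and rest length are the two values that most affect how regular or irregular the resulting lattice reads, so they are the ones I expect to keep tuning against the detection data.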


Friday, 18 February 2011

Programme Strategy

Having analysed the site in a little more detail last week, I have decided to simplify the programme of the proposed development slightly. The drawing below illustrates a quick study of the proposed programme strategy.

The Augmented-Reality Pavilion will be the central exhibition space, which will feature the Archi-maton application along with other applications from the Apple App Store. The key aim of this space will be to provide public engagement and interaction with the pavilion through the use of mobile technology.
The AR pavilion also acts as the central plaza connecting the other spaces in the master plan. The location of the pavilion is defined by the intersection point between the main pedestrian routes and the key views through the site.
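As a side note, that plan logic can be reduced to a very small sketch, assuming the main pedestrian route and the key view line are idealised as straight segments in plan (the coordinates below are placeholders, not the real site geometry):

// A rough sketch of how the pavilion location could be derived in plan,
// assuming the main pedestrian route and the key view line are idealised
// as straight 2D segments (placeholder coordinates).

PVector routeA = new PVector(50, 400),  routeB = new PVector(750, 150);  // pedestrian route
PVector viewA  = new PVector(100, 100), viewB  = new PVector(650, 550);  // key view line

void setup() {
  size(800, 600);
  noLoop();
}

void draw() {
  background(255);
  stroke(0);
  line(routeA.x, routeA.y, routeB.x, routeB.y);
  line(viewA.x, viewA.y, viewB.x, viewB.y);

  PVector p = intersect(routeA, routeB, viewA, viewB);
  if (p != null) {
    fill(255, 0, 0);
    ellipse(p.x, p.y, 12, 12);   // candidate pavilion / plaza location
  }
}

// Standard line-line intersection; returns null if the lines are parallel.
PVector intersect(PVector p1, PVector p2, PVector p3, PVector p4) {
  float d = (p1.x - p2.x) * (p3.y - p4.y) - (p1.y - p2.y) * (p3.x - p4.x);
  if (abs(d) < 1e-6) return null;
  float t = ((p1.x - p3.x) * (p3.y - p4.y) - (p1.y - p3.y) * (p3.x - p4.x)) / d;
  return new PVector(p1.x + t * (p2.x - p1.x), p1.y + t * (p2.y - p1.y));
}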


The plan above highlights the location of the key spaces and routes through the site, based on the analysis conducted so far. I intend to take this back on site today, start tracking the main spaces in real time, and explore the potential of gestural design.

Wednesday, 2 February 2011

Gestural Design

In designing an application which can cater for both the design and construction of automated architectural solutions, a significant factor which can aid the design process is a gestural interface that goes beyond the multi-touch capabilities of mobile devices. In the quest for intuitive human-computer interaction, pervasive computing in the form of smartphones allows for experimentation with new interaction methods which can prove useful for architectural design. Nintendo's Wii and Apple's iPhone and iPod Touch have made gestural interfaces popular; with the release of the Xbox Kinect, however, this form of pervasive computing offers multiple advantages for a more intuitive design methodology.


Although in this project I probably won't go into the genius that is the Kinect, I will be incorporating some of its basic principles in the design stages of the building project. Pervasive computing through the use of mobile devices allows for real-time scanning and sketch design of buildings on-site, combined with gestural interactivity. This means that massing and sketch designs can be explored on site using the mobile device as a controller for drawing.
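A very loose sketch of what the "device as drawing controller" idea could look like is below, with the streamed device orientation stood in for by the mouse (the real sensor feed from the phone is assumed, not implemented here):

// The phone's pitch and yaw (simulated with mouseX/mouseY) steer a 3D
// cursor, and each frame of a "gesture" appends a point to an outline.

ArrayList<PVector> strokePoints = new ArrayList<PVector>();

void setup() {
  size(800, 600, P3D);
}

void draw() {
  background(255);
  // Stand-ins for the device orientation streamed from the phone.
  float yaw   = map(mouseX, 0, width,  -PI / 3, PI / 3);
  float pitch = map(mouseY, 0, height, -PI / 3, PI / 3);

  // Project the orientation onto a plane a fixed distance away,
  // as if pointing the phone at the site.
  float reach = 300;
  PVector cursor = new PVector(width / 2 + reach * sin(yaw),
                               height / 2 + reach * sin(pitch),
                               0);

  if (mousePressed) strokePoints.add(cursor);   // the "gesture" records the outline

  stroke(0);
  noFill();
  beginShape();
  for (PVector p : strokePoints) vertex(p.x, p.y, p.z);
  endShape();
  fill(255, 0, 0);
  ellipse(cursor.x, cursor.y, 8, 8);
}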
Among the technologies that mobile devices now provide is a three-axis gyroscope which, combined with the accelerometer, gives six-axis motion sensing far more precise than the accelerometer alone; the accelerometer itself supplies the reference to gravity. This, together with GPS positioning, could easily lay the foundations for gestural design capabilities.
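To illustrate the kind of sensor fusion this implies, the sketch below blends a gyroscope rate with an accelerometer tilt using a standard complementary filter; both readings are faked here with the mouse, since in practice they would be streamed from the phone.

// Fusing gyroscope and accelerometer readings into a stable tilt angle
// for gesture input (standard complementary filter; sensor values faked).

float angle = 0;      // fused pitch estimate, radians
float ALPHA = 0.98;   // how much the gyro is trusted over the accelerometer

void setup() {
  size(400, 400);
}

void draw() {
  float dt = 1.0 / frameRate;

  // Stand-ins for real sensor readings.
  float gyroRate   = map(mouseX, 0, width, -1, 1);               // rad/s about one axis
  float accelAngle = map(mouseY, 0, height, -HALF_PI, HALF_PI);  // tilt from gravity

  // The gyro integrates smoothly but drifts; the accelerometer's gravity
  // reference is noisy but drift-free. Blend the two.
  angle = ALPHA * (angle + gyroRate * dt) + (1 - ALPHA) * accelAngle;

  background(255);
  translate(width / 2, height / 2);
  rotate(angle);
  rect(-100, -10, 200, 20);   // visualise the fused orientation
}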