Dave Webb Portfolio 2019
...coding is loading
Until recently my career has been a long way from art, and only loosely connected to the sharp end of technology, but it has been seasoned with occasional creative projects.
More recently I have rediscovered my technical and creative muscles and have been working them hard to make up for lost time. I have invested a lot of energy learning creative coding and building many small sketches, plus a few more ambitious projects. Examples of each can be found on this page. I submit to exhibitions, competitions and grant opportunities, with occasional success. I code daily (in Processing and p5.js) and share something most days.
On one hand, I am excited by the purely self-serving exploration of technology in creative applications, and by the chance to see cultural and broader human benefits rather than purely commercial ones. But I am also keen to see where technology can be the hidden medium for some creative concept or experience. Both the means and the ends matter.
I am not fully an artist, nor a designer, nor a software developer, but I bring what I can of each element to my work.
I recognise, grudgingly, that there is a limit to how many technologies I can learn, let alone master, or even be aware of. For each of those technologies, and their many permutations, there are endless creative possibilities. What I can do is stay tuned in, observe and learn from others, and draw on them to create and build.
My current obsession is around the misunderstood human brain and mind, and our often self-defeating behaviour. As much as our minds are limited, if we fail to understand the way they work, we miss the chance to make them work better for ourselves, and for the world, and we leave ourselves open to exploitation. We build automation that exploits our weaknesses instead of enabling our better selves. I believe creative technology can help us zoom out from our constrained inner viewpoint to better understand our own workings and behaviour, to be more tolerant, and more skilful at driving these incredible machines we inhabit, and to make automation an enabler rather than a threat. Remented (see below) is a living journal of my exploration of this area, brought to life through animations and mini games.
Having developed a fascination with the human brain, mind and behaviour, I feel frustrated at how hard some of the psychology and neuroscience is to grasp and to relate to our own familiar experiences. Sensing a role for creative, interactive representations, I have been developing simple digital visualisations and games to illustrate the aspects of the mind and brain that I have researched. Remented is my platform for sharing my learnings and my animated maquettes. I have been using two dimensional visuals (developed in p5.js), but see a potential for more immersive experiences to place the viewer inside a simplified mind, from where they can relate their own experiences to a fun and abstracted simulacrum of behaviour and mind.
The site itself is here
Created for and exhibited in the University of Bath’s 2018 show “Visions of Science”. Equals Thought simulates the propagation of synaptic “firing” across a simple brain, following the connections between neurons and their predisposition to excite or inhibit one another, driven by sense data derived from its simulated “world”. The work cycles through three different visualisations of the behaviour.
The work is built in p5.js, using Pixi.js to offload some of the graphical heavy lifting to the GPU. Even so, performance limited me to around 1,000 neurons and 10,000 synaptic connections.
See the work in action: https://remented.com/equals-thought/
See a blog post explaining some of the development: http://crispysmokedweb.com/2018/07/p5-gpu-pixi-js/
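To illustrate the kind of propagation Equals Thought simulates, here is a minimal sketch in plain JavaScript. The names, thresholds and weights are my own illustrative assumptions, not the actual implementation:

```javascript
// Minimal sketch of synaptic propagation: each neuron sums weighted
// input from neighbours that fired last step. Excitatory synapses
// carry positive weights, inhibitory ones negative.
function stepNetwork(neurons, synapses) {
  const input = new Array(neurons.length).fill(0);
  // Accumulate signal from every neuron that fired on the previous step
  for (const { from, to, weight } of synapses) {
    if (neurons[from].fired) input[to] += weight;
  }
  // A neuron fires this step if its summed input reaches its threshold
  return neurons.map((n, i) => ({ ...n, fired: input[i] >= n.threshold }));
}

// Tiny example: a firing "sense" neuron 0 excites neuron 1,
// while a silent neuron 2 would inhibit it if active
let neurons = [
  { threshold: 0.5, fired: true },   // sense neuron, currently firing
  { threshold: 0.5, fired: false },
  { threshold: 0.5, fired: false },  // inhibitor, currently silent
];
const synapses = [
  { from: 0, to: 1, weight: 1.0 },   // excitatory connection
  { from: 2, to: 1, weight: -1.0 },  // inhibitory connection
];
neurons = stepNetwork(neurons, synapses);
console.log(neurons.map(n => n.fired)); // [false, true, false]
```

Cycling this step function and drawing each neuron's state is essentially what the visualisations animate, at a vastly larger scale.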
Submitted (but not accepted) for the 2019 Fringe Arts Bath festival, under the Hidden exhibition, this work superimposes a slightly grotesque slit-scanned capture of the viewer’s face on a shifting collage of data scraps.
The artistic intent is to consider the vulnerability and visibility of a true, individual identity in a digital world, where we are represented by data (our own, aggregated with millions of others’) and by the statistical assumptions algorithms build about us to serve our day-to-day interactions.
Technically the piece is built in p5.js, and (with some irony) uses a machine learning model (PoseNet, from the ml5.js library) to identify the position of the viewer’s face. Slices of the face are captured from the webcam and used to build the phantom self. The collage of data elements forming the backdrop is made of pictures of data sources from my own digital life (usage data, purchase history, fitness, DNA, position tracking, social feeds).
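The slit-scan idea can be sketched in a few lines of plain JavaScript: each incoming frame contributes one narrow vertical slice, centred on the detected face, to a composite built up over time. The frames here are simple 2D arrays; in the real piece they come from the webcam via p5.js and the face position comes from PoseNet. Function names and dimensions are illustrative assumptions:

```javascript
// Copy one vertical slice from around the detected face into the
// next slot of the slowly accumulating composite image.
function addSlice(composite, frame, faceX, sliceIndex, sliceWidth) {
  const height = frame.length;
  for (let y = 0; y < height; y++) {
    for (let dx = 0; dx < sliceWidth; dx++) {
      composite[y][sliceIndex * sliceWidth + dx] = frame[y][faceX + dx];
    }
  }
}

// 4x8 toy "frames" whose pixel value is the frame number, so each
// slice of the finished composite is identifiable by when it arrived
const composite = Array.from({ length: 4 }, () => new Array(8).fill(0));
for (let f = 0; f < 4; f++) {
  const frame = Array.from({ length: 4 }, () => new Array(8).fill(f + 1));
  addSlice(composite, frame, 3, f, 2); // face "detected" at x = 3
}
console.log(composite[0]); // [1, 1, 2, 2, 3, 3, 4, 4]
```

Because each slice is sampled at a different moment, any movement of the face between frames produces the distorted, phantom-like result.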
A highly articulated robot, simulated in Processing’s 3D renderer.
The robot is designed to traverse a 3D space, using its very complex kinematics to create 3D drawings.
The kinematics are so complex that I suspect a genetic or machine-learning algorithm is needed to master its motion in a more deliberate fashion. The robot mechanics and polygons are all built from vertices in code – no 3D models are used. This taught me more than I ever wanted to know about 3D geometry, and there are a few geometric flaws.
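The core of any such articulated chain is forward kinematics: each joint adds a rotation and a link length, and the pen position is the accumulated sum. A planar (2D) sketch in plain JavaScript, as a simplified stand-in for the robot's 3D articulation (the names and the 2D reduction are mine, not the project's code):

```javascript
// Forward kinematics for a planar chain of rotating joints.
// Rotations accumulate down the chain; each link advances the
// end-effector along the current heading.
function endEffector(joints) {
  let x = 0, y = 0, angle = 0;
  for (const { rotation, length } of joints) {
    angle += rotation;               // joint rotates relative to its parent
    x += length * Math.cos(angle);   // advance along the link
    y += length * Math.sin(angle);
  }
  return { x, y };
}

// Two unit-length links: first rotated 90 degrees up, second folded
// back 90 degrees, leaving the tip at (1, 1)
const tip = endEffector([
  { rotation: Math.PI / 2, length: 1 },
  { rotation: -Math.PI / 2, length: 1 },
]);
console.log(tip.x.toFixed(3), tip.y.toFixed(3)); // 1.000 1.000
```

Computing the tip position from joint angles is easy; the hard part – finding joint angles that put the tip where you want it – is the inverse problem that a genetic or learning approach might solve.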
BeHere was developed for, and longlisted in, the 2017 Lumen Prize for digital art (interactive category). It uses a browser on viewers’ mobile phones and a local Node.js server (running on a Raspberry Pi with a small Wi-Fi router) to join devices together into an aggregated digital display. A series of short visualisations created in p5.js appear to flow from one device’s screen to the next.
The artistic intent was to create a digital experience that united viewers in a shared experience that required presence, proximity and cooperation, using the devices that often leave us disengaged from our neighbours.
The technical challenges are documented here: http://crispysmokedweb.com/2017/05/behere-many-mobiles-making-massive-moving-motifs/
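The principle behind the aggregated display can be sketched simply: the server assigns each joined device an offset within one shared virtual canvas, and each browser draws only the slice it owns, so a shape moving across the canvas appears to flow from screen to screen. The device records and function names below are illustrative assumptions, not the actual protocol:

```javascript
// Map a point on the shared virtual canvas to one device's local
// coordinates, reporting whether that device should draw it at all.
function toLocal(globalX, device) {
  const localX = globalX - device.offsetX;
  const visible = localX >= 0 && localX < device.width;
  return { localX, visible };
}

// Three 360px-wide phones joined left-to-right
const devices = [
  { offsetX: 0, width: 360 },
  { offsetX: 360, width: 360 },
  { offsetX: 720, width: 360 },
];

// A shape at global x = 400 appears only on the middle phone, at x = 40
const results = devices.map(d => toLocal(400, d));
console.log(results.map(r => r.visible)); // [false, true, false]
```

Animating `globalX` over time, with each browser running the same mapping against its own offset, gives the illusion of a single display spanning every phone.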
Created as an interactive installation for a friend’s party in a barn, SeePilgrims used Processing and a Microsoft Kinect to capture depth data about viewers’ bodies in the room, and interpreted their body shapes and motion in different ways through a series of changing visualisations. This was projected onto a screen (approx. 4m x 2.5m) just above the Kinect.
A series of posts explaining the development process can be found here: http://crispysmokedweb.com/2015/08/seepilgrims-part1/
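The first step in working with Kinect depth data is typically to segment viewers from the room: keep only pixels whose depth falls inside a near/far band, which isolates bodies from the background. A sketch in plain JavaScript, with thresholds and the tiny "frame" as illustrative assumptions rather than the installation's actual values:

```javascript
// Segment bodies from a depth frame by keeping only readings
// (in millimetres) that fall between the near and far thresholds.
function segmentBodies(depths, near, far) {
  return depths.map(d => d >= near && d <= far);
}

// One row of depth readings: two people at ~1.5m and ~2m, wall at 4m
const row = [4000, 1500, 1480, 4000, 2050, 4000];
const mask = segmentBodies(row, 1000, 3000);
console.log(mask); // [false, true, true, false, true, false]
```

The resulting mask of "body" pixels is what each visualisation then reinterprets – as silhouettes, particles, distortions and so on.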
Click a sketch to launch its Codepen
A selection of very early, pre-digital art projects made over the years
Untitled. A suspended, mobile sculpture, responds to air movements. A gauze membrane hangs in ‘cells’ from a series of counterbalanced arms that allow independent movement but also propagation of energy between cells.
Approximately 1.8m x 1.8m x 1.5m drop.
A sculpture of approximately 100 suspended plaster elements, hung between two plywood discs (1.2m across, 2.4m apart). Created as part of an OCN Sculpture course at City of Bath College, c. 1999.
Involvement in local creative technology groups and activities
Creative Coding in Bath
Creative Coding in Bath is a monthly meetup of both experienced coders and the curious. We hold discussions on methods and practice, share and discuss our work, and run workshops (mainly targeted at beginners) in which an artistic theme is explored through some technical method.
Processing Community Day / You Make the Rules!
Processing Community Day Bristol was part of a global series of local events celebrating the crossover of art, code and community. The Bristol event was organised by a South West team of educators, artists and technologists (including myself) and was a great success. We had a day of hands-on exploration of code and art aimed at beginners, and rounded off the day with an Algorave. The event has spawned a new series of events, You Make the Rules!, aiming to bring art/code/community events to locations around Bristol.
Bath Machine Learning
The Bath Machine Learning group is made up of mathematicians, data scientists, developers and the curious. The group holds talks, workshops and hacks, and provides a friendly place for the uninitiated to get to grips with some of the concepts and tools. The image shows the result from a recent style transfer workshop, using Python and a pre-built style transfer model built in TensorFlow, running on IBM Watson.