Digital Exhibition


Patterns are formatted for Instagram’s augmented reality function
and can be viewed >>HERE<< via the Instagram app.




Where The Exhibition Began


The idea began at the start of the first lockdown, observing that some galleries opted to recreate their own bricks-and-mortar buildings online, replicating the gallery experience with works ‘hung’ as in a physical exhibition. Whilst I was impressed with the speed of mobilisation, I felt the results of this method were mixed. Some of the galleries wound up building what felt more like an architectural rendering, making the gallery the focal point of the exhibition and sidelining the impact of the works on show. The core of viewing artwork felt overlooked: the experience of art in a gallery rests mostly on how the viewer approaches the work, lets it fill their field of sight, and isn’t distracted by anything in their peripheral vision. The difficult job galleries had (and will have) is building an exhibition space online that feels unique and connected to their physical space without distracting from the work on show.
     The future work I want to create is geared more towards installation than a hung show, and I think of all the physical micro-interactions I have in any gallery space that would be difficult to replicate digitally: the journey I slowly trace when viewing an exhibition, the different air that stands between me and the art, the feeling of sharing a viewing experience with others wandering through the gallery.



The Process


Learning the software wasn’t as straightforward as I thought it would be, but Spark AR comes with a lot of templates to cannibalise. Using the World Object and 2D Sticker templates as a basis, I had somewhere to find the functions for placing shapes with a texture overlaid.

In Fig 1, the important part is the plane tracker function. This is where the user taps on their screen and the filter software recognises a flat surface in the real world, fixing the digital object to it. By adding a plane to this tracker, we have our blank canvas to place digitally in the world.



Where the software fails for image-based display is in the file size limit for JPEG or other image formats used as textures. Spark AR has an upload size limit of about 4 MB, and every JPEG added takes up a huge chunk of that allowance. Looking at a few of my iPhone photos, the average image size is about 0.8 MB, and that’s to be viewed on a 1080 x 1920 screen at 72 dpi. That works perfectly well as a flat image on a screen and, from Spark AR’s point of view, held at arm’s length by a user taking a selfie. This feels like the core function of Spark AR: creating face-altering filters that encourage engagement. With that functionality in mind, it makes sense that less detailed images are uploaded, and more emphasis is put on inserting 3D objects and simple textures into the viewer’s environment that don’t obscure the user’s face. With an AR-rendered art piece, however, there is more of an expectation that we can view and walk around an image in front of us, moving up close to it and appreciating it at the scale the image’s creator intended. High resolution and large image size are essential to this, and are at odds with the core function of Spark AR.

With my pattern work, the average size of one of my A1 print-quality files is 100–200 MB. Compressing down to a fraction of the original would leave the image quality too poor for display, or the image too small to convey any real sense of scale.
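The gap between print-quality files and the upload limit can be sketched with some rough arithmetic. The 300 dpi print resolution and 3 bytes per pixel (uncompressed 8-bit RGB) below are illustrative assumptions, not Spark AR specifics:

```python
# Rough arithmetic behind the file-size problem: an A1 sheet (594 x 841 mm)
# rendered at print resolution dwarfs Spark AR's ~4 MB upload budget.

MM_PER_INCH = 25.4
A1_MM = (594, 841)      # A1 sheet dimensions in millimetres
DPI = 300               # typical print resolution (assumption)
BYTES_PER_PIXEL = 3     # uncompressed 8-bit RGB (assumption)

width_px = round(A1_MM[0] / MM_PER_INCH * DPI)
height_px = round(A1_MM[1] / MM_PER_INCH * DPI)
uncompressed_mb = width_px * height_px * BYTES_PER_PIXEL / 1024 ** 2

print(f"A1 at {DPI} dpi: {width_px} x {height_px} px")
print(f"Uncompressed: {uncompressed_mb:.0f} MB")
print(f"Compression needed to fit 4 MB: {uncompressed_mb / 4:.0f}x")
```

That lands in the same 100–200 MB region as the real files, and shows why squeezing one image under 4 MB means throwing away most of the detail.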

This might have killed the exhibition idea dead; however, an interesting workaround exists within the software, in that any number of planes can be added as long as they all share the same texture file, which is actually ideal for pattern-based work. Here’s an example of the wind turbine pattern in Spark AR; that little wireframe pyramid is where the viewer’s screen is, spreading out to what the camera is capturing. This gives you an idea of the scale being built to.
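The layout logic of that workaround can be sketched in plain Python: one texture, many planes, each plane just an offset in a regular grid. Spark AR itself is driven through its visual editor, so this only illustrates the idea, not the actual Spark AR tooling; the function name and dimensions are hypothetical.

```python
# One shared texture, many planes: each plane is a centre offset
# in a regular cols x rows grid, centred on the origin.

def tile_positions(cols, rows, tile_w, tile_h):
    """Return (x, y) centre offsets for a cols x rows grid of planes,
    each plane tile_w x tile_h world units, centred on the origin."""
    positions = []
    for row in range(rows):
        for col in range(cols):
            x = (col - (cols - 1) / 2) * tile_w
            y = (row - (rows - 1) / 2) * tile_h
            positions.append((x, y))
    return positions

# A 3 x 4 wall of planes, each 0.5 units wide and 0.7 units tall,
# all of which would reference the same single texture file.
grid = tile_positions(cols=3, rows=4, tile_w=0.5, tile_h=0.7)
print(len(grid), "planes, one shared texture")
print("first plane offset:", grid[0])
```

Because every plane points at the same texture asset, the 4 MB budget is spent once, however large the tiled wall grows.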






It’s worth noting here that my pattern work is actually built as a series of overlapping tiles to find the repeat points (left image), so a large degree of rebuilding and manipulation had to take place to make them fit a regular, grid-like configuration (right image).



The Patterns In The Wild


What follows are a few in-screen photos of the finished AR patterns and how they appeared when placed. I was satisfied with the built pieces, especially how the sense of scale came across. It’s quite hard to judge, but with items in the background, the tallest patterns felt several storeys tall, and the deliberately wide ones had the panoramic feel of a cinema screen. I’ve not treated it as an exact science, mainly basing it on how much the viewer has to move their screen around to view the work.




On the day, the audience was made up mainly of Instagram followers and a couple of drifts over from Twitter. The patterns worked as expected; the function to test these out before uploading made it easy to smooth out any potential issues, and also meant the ‘attendees’ didn’t experience any problems viewing the work. Feedback was that the scale really came across, with a lot of panning around and looking up when viewing through the screen. I was pleased the audience got to experience that; a major part of future pattern work will be creating at bigger scales, and this felt like a way to get that across. Aspects such as ease of use were a little trickier: outside of viewing filters in Stories (such as modifying faces or creating quizzes), it’s not easy to explain where the filter part is. It also requires the attendee to go through several steps and be on the artist’s profile to view the work.

I definitely felt the loss of having a physical launch. I think this is a very personal matter of taste for artists, but I like having an event in a space to build some buzz around, and I also believe something sympathetic between the physical and digital can be achieved. I’ll be paying attention to how conferences are run when covid restrictions are eased; I expect a smart number will try to create a space for both, and seeing how they get digital and physical attendees to interact will be interesting.

Visibility for the exhibition was another issue that was hard to judge. The whole exhibition ran through Facebook’s proprietary products, including Spark AR, so it was wholly at the whim of Facebook’s algorithms.


Outcome and Plans For Future

After a few weeks of the work being on display, I’ve had the time to look into the software and finesse certain elements, and overall I felt happy with the exhibition run. I started this as a test of principle for future exhibited work: mainly how technically feasible it was, how roll-out would work, and how easy it was for an audience to access. Those questions were answered, and I’d be happy to make this format part of future exhibitions, running sympathetically alongside a physical show. I’m not sure I’d want to run a fully digital exhibition as the technology and its ubiquity stand at the moment... I feel there needs to be much broader acceptance and uptake of the AR format, and there still seems to be a lack of resolution in displaying the work, a limit of Instagram’s own size requirements but something to explore in other software.

Recently I’ve discovered Spark AR can generate QR codes that link to filters. These can be detected in the iPhone camera screen and link through to Instagram. Putting codes out in the real world, combined with good online promotion, would help the situation, giving the viewer an easier route and taking promotion away from being solely algorithm-decided.

Looking at much bigger institutions doing this, here’s a really good article on Manchester International Festival’s Virtual Factory as a space in Fortnite. Attracting around a million visitors, even given the caveat of a pandemic being on, that feels like a very impressive uptake for a digital event.

