Second Semester – Concept Iteration, Development, and Completion

Throughout the second semester, I mainly worked on developing the design concept I had presented at the end of semester 1, iterating on it so that I could present a final design at the end of the semester that reflected the aims and objectives I had for this project. In terms of research, not much changed this semester, considering I was working from the same high-level concepts throughout the whole project. After reviewing and editing the literature and contextual reviews, I only added one section, which detailed how physical representations can be used to represent data, to make sure all bases were covered in terms of original research. Finally, the rest of the semester consisted of reviewing and re-writing the honours exegesis document based on feedback from the first semester, as well as from throughout the second semester.

 

Project Artefact Development

The main thing I wanted to address from the previous prototype was the medium and design I would use to create the physical data visualisation. What I liked most about the previous concept was that it used actual water to represent rainfall data. However, what I didn’t like was that it was more or less a waste of water, and therefore had a negative impact on environmental sustainability. Furthermore, it was very similar to Nori Takagi’s “Silent Tides” project, which was discussed in the first semester process. Therefore, my first thought when designing a new concept for this project was to make a device that used purely physical materials, while keeping a structure that represented the rainfall metaphor. The sketches below (figures 1 and 2) introduce a new concept: a physical, table-top data visualisation piece that takes the form of a wave (a metaphor for rainfall) through the use of 3D bar graph columns. The data in this visualisation would detail the monthly rainfall averages for Australia from January 2017 (the start of the drought) to August 2020 (the time of writing), using the previously mentioned bar graph columns. In particular, the columns would be arranged so that each year had its own row (e.g. rainfall averages for 2017 in the back, rainfall averages for 2018 and 2019 in the middle, and rainfall averages for 2020 in the front).

Figure 1: Initial sketches of the new concept
Figure 2: Initial sketches of the new concept

To determine how tall each bar graph column should be, the data would be informed by the Bureau of Meteorology’s Climate Summaries Archive (2020), which reports each month’s area-average rainfall as a percentage relative to the long-term average. Firstly, this meant using an “average” column exactly 100mm high (representing 100%). Then, I would go through every month from January 2017 to August 2020, recording the percentage of area-average rainfall across Australia for that month. As an example, the average rainfall for Australia throughout August 2019 was 54% below average (see figure 3), meaning I would make the bar graph column for that month 54mm (54%) shorter than the average column (100mm/100%) mentioned earlier. This ultimately meant that the column for August 2019 would be 46mm in height. I used this “formula” for making the bar graph columns because I believed it would be the best way to compare the different months to each other.

Figure 3: A screenshot of the data used to inform this prototype
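
To summarise the method above as a single formula (my own restatement of it, not one taken from the Bureau’s documentation), each column’s height follows directly from that month’s percentage departure from the long-term average:

```latex
h_{\text{month}} = 100\,\mathrm{mm} \times \left(1 + \frac{d}{100}\right)
```

where d is the percentage departure from average for that month (negative when below average). For August 2019, d = -54, giving h = 100mm × 0.46 = 46mm, matching the worked example above.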

What I liked most about this new concept was that it represented a large amount of rainfall data using a physical “wave”, rather than actual water. Furthermore, this concept allowed the audience to view significant stages of the drought through the data. For example, the data showed how poor the rainfall had been in the crucial winter months, as well as across the whole of 2019. With this concept, I would also be able to use materials that are environmentally sustainable. Overall, I believed that this data visualisation concept would allow the audience to connect the evocative imagery to the extremely worrying data, creating a powerful visual data story.

 

To prototype this new concept, I started by using a material called corflute to construct an “average” column, which I made exactly 100mm in height and 20mm in width and length. I made this first column as a test to determine the best way to construct a complete lo-fidelity graph using this material. Figures 4 and 5 (below) show the first lo-fidelity column placed over the top of an LED that has been turned on after a button press.

Figures 4 and 5: The “average” column put over an LED that has been turned on

 

After making the initial corflute bar graph column, I went to work constructing columns for every month from January 2017 to August 2020, based on the data from the Bureau of Meteorology’s Climate Summaries Archive (2020). Figures 6 to 8 show the progression of the construction of this lo-fidelity prototype. The main reason I wanted to prototype this concept in physical 3D form so soon was that I wanted to see if this data visualisation would actually look like a wave (representing the metaphor of rainfall). Figure 8 shows the fully constructed prototype, with the data columns forming a physical wave, as expected. What I liked most about this prototype was that the data was easily readable. More specifically, I believed that the audience could understand the data being presented, and connect that data with the visual story.

Figure 6: The average national rainfall for 2017 and 2018 prototyped using corflute
Figure 7: The national rainfall average for 2019 prototyped using corflute
Figure 8: The national average rainfall for 2020 prototyped using corflute

This stage of the project saw me remain in the prototyping/building phase of the process, as I moved into working with Arduino technology, coding, and wiring. Firstly, I wanted to use Arduino for this concept because I had worked with it before, and knew it would be the best way to achieve the technical aims I had. In particular, I wanted to wire a blue LED (representing rainfall) into each column that had an image in the visual story. Furthermore, I wanted to code these LEDs with a button, so that when the user presses the button, the LEDs light up in a specific sequence while the visual story plays on a computer screen.

 

Firstly, I started playing around with simple Arduino technology and code, such as turning on a single LED with a button. Figures 4 and 5 show this action being performed with a lo-fidelity column placed over the top of the LED. From there, I tried connecting two LEDs to the Arduino, and attempted to write code which made the second LED turn on 5 seconds after the first. After a few attempts, this was also successful, with the help of some code from the internet that I adapted to suit my own technical needs.

The biggest challenge was then writing code that made the LEDs turn on in sequence, while also playing a video on the computer screen, when the button is pressed. To achieve this, I had to write two separate pieces of code: one in Arduino, and one in Processing. In the Arduino code, I had to open a serial port, which would allow the two programs (Arduino and Processing) to talk to each other. After doing the same thing on the Processing end, I was able to write code that allowed the video to play from Processing. With the code completed on both ends, I had a program which allowed me to play the video while turning on the LEDs at the push of a button. Over time, I would develop this code to include every LED, as well as timing when they turn on and off based on the complete visual story video. Figure 9 below shows a video being played on the computer screen in Processing, while the LEDs light up after the button has been pressed.

Figure 9: The first working version of the code used in this project

Figure 10 below also shows the original code which was used to make this prototype work. This is the code which would be developed into the final prototype over a few weeks. For reference, below are links to the online code resources that I adapted to help write my own code for this project:
http://youtu.be/2WwedCRwmgA (used to turn the LEDs on and off in a sequence)
https://www.youtube.com/watch?v=NhyB00J6PiM&t=4s (used to connect Arduino to Processing)
https://processing.org/reference/libraries/video/Movie.html (used to play the video)

Figure 10: The two codes used to make LEDs turn on while a video plays, when a button is pressed
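
As the code in figure 10 may be difficult to read at this scale, the following is a minimal sketch of the Arduino side of the approach described above (my own simplified reconstruction, not the exact code in figure 10): on a button press, the Arduino writes a byte to the serial port so that the Processing sketch knows to start the video, then steps through the LEDs on a timed sequence. The pin numbers and delays are placeholders, not the values used in the final device.

```cpp
// Minimal sketch of the button -> serial -> LED sequence approach.
// Pin numbers and timings are placeholders, not the final values.

const int buttonPin = 2;                 // momentary push button
const int ledPins[] = {3, 4, 5};         // blue LEDs in selected columns
const int ledCount = sizeof(ledPins) / sizeof(ledPins[0]);

void setup() {
  Serial.begin(9600);                    // serial port shared with Processing
  pinMode(buttonPin, INPUT_PULLUP);      // button wired between pin and ground
  for (int i = 0; i < ledCount; i++) {
    pinMode(ledPins[i], OUTPUT);
  }
}

void loop() {
  if (digitalRead(buttonPin) == LOW) {   // button pressed
    Serial.write('1');                   // tell Processing to start the video
    for (int i = 0; i < ledCount; i++) { // light each LED 5 seconds apart
      digitalWrite(ledPins[i], HIGH);
      delay(5000);
      digitalWrite(ledPins[i], LOW);
    }
  }
}
```

On the Processing side, the sketch opens the same serial port, waits for the byte, and starts playback through the Movie class from Processing’s video library (as in the third link above).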

After coding, wiring, and lo-fidelity construction had been completed, it was time to conduct the first round of playtesting with this concept. This was done on campus with a couple of peers, and it provided me with great insight into how this concept could be developed and improved.

 

Firstly, one minor and easily fixable problem users found with the device was that the timelapse in the visual story video went too quickly. Playtesters believed that, because of the speed of the timelapse, they couldn’t effectively comprehend the emotional images they were being shown. To combat this, I would simply slow down the timelapse so that the video would take roughly 5 minutes to complete, rather than 1 minute, as it previously did. While I did like that the drought-stricken landscapes in the longer video could be easily seen, I was still worried that the added length would cause the audience to lose interest. However, after conducting playtesting with the updated version of the timelapse, users told me that they very much liked being able to see the landscapes at a slower, more realistic pace, which eased my concerns.

 

Secondly, the next piece of feedback I took from playtesting was that I needed to provide further research into why it was better to use physical graphs (a physicalization), rather than just buttons and a screen (a visualisation). After doing some research into this area, I noted that while there are arguments for using a situated representation over an embedded representation, there don’t seem to be any major arguments for using a physicalization over a visualisation, or vice versa. Instead, I explained that I believed I had justifiable reasons for using a physicalization over a visualisation, reasons more closely related to the outback and the sustainability issue I was pursuing.

 

Thirdly, another key piece of feedback was that users found it fairly hard to tell what the data in the physical graph was. While I didn’t have the full visual story video ready for this first round of playtesting, I already had plans to add a screen at the start of the video, which would tell the user what the data was and what they had to do to start the visual story video.

 

Finally, the most important piece of feedback I received was that the physicalization would make more sense if the graph looked more chronological. What the user meant by this was that the data could be more easily understood if the bar graph columns were in one big chronological line, rather than having a separate row for each year of data. I immediately liked this piece of feedback, and was extremely excited by the possibility of constructing a prototype like this. Over the next few days, I would work on constructing this new iteration of the prototype.

 

The main piece of feedback I would be addressing from the first round of playtesting was how the actual prototype was presented. Throughout this stage of the process, I worked with corflute again, and constructed an iteration of the concept that placed the graph columns in one big chronological line, rather than using a new row for each year. Figure 11 below shows this new lo-fidelity iteration. What I liked most about this iteration was that the data was definitely more readable. Furthermore, I believed that this prototype still kept the shape of a wave, representing the metaphor of rainfall. However, one thing I didn’t like about this prototype was that I thought it would be more difficult to construct, considering the length of the presentation, which was 147cm (1.47m).

Figure 11: The second lo-fidelity prototype

The second round of playtesting was conducted on the second iteration of the lo-fidelity prototype (figure 11), and was done both at home and on campus. For this round of playtesting, I included the Arduino tech, and also built a single LED into one of the columns. Therefore, the user would be able to press the button themselves, see the LED light up in the column, and view the (still incomplete) visual story video on the computer screen. Figure 12 shows the device being set up for playtesting, while figure 13 shows the device in use during the playtesting at home.

The main piece of feedback I got from this round of playtesting was that users found the button too small to press, meaning I would have to source a bigger button. Overall, however, the feedback was mostly positive. The users expressed that they understood what the data was saying and what it was referring to. Furthermore, users also made it clear that they liked being able to see the imagery of the drought alongside the data in front of them. This made one crucial thing clear to me: the audience could make the connection between the physical presentation (the physical device, the data) and the physical referent (the Australian drought). Considering this connection is crucial in creating a situated visualisation, I was extremely happy with this feedback. With this stage of the process complete, I felt confident that I could move on to hi-fidelity prototyping with this concept.

Figure 12: The device set up for playtesting
Figure 13: Device in use throughout playtesting

The first step in constructing the hi-fidelity prototype was designing the pieces to be laser cut in Adobe Illustrator, and then constructing them. Figure 14 below shows the Illustrator file used to design the bar graph columns that would be laser cut from MDF board. Figure 15 shows the first iteration of the case design I made in Illustrator to be laser cut. After I realised this design wouldn’t work for this prototype, I made a new version, which made use of two modular pieces that are joined together when the final device is assembled (this can be seen in figure 16).

Figure 14: Illustrator file of bar graph columns
Figure 15: First iteration of case design in Illustrator
Figure 16: Final iteration of case design in Illustrator

I decided to use MDF board because I believed that its authentic look related to the authenticity of outback Australia, which is definitely what I liked most about using this material. Furthermore, I liked using this material because it was very practical and easy to work with, making this part of the process much easier than first expected.

After laser cutting all the pieces I needed for this device, I went to work putting them together and constructing the hi-fidelity prototype. To do this, I made jigsaw-like cuts on the edges of the MDF pieces so they could be glued together with ease. Figure 17 below shows these MDF pieces being put together, with some columns left unglued until the LEDs, wiring, and Arduino had been rigged up inside the case. This picture also shows the markings and measurements on the top of the case used to place the columns with precision, as well as the holes drilled in the top of the case to accommodate the LEDs and wiring. Once the appropriate pieces were glued together, the LEDs were rigged inside the case, which can be seen in figure 18.

After completing this first stage of construction, I found that it was much easier than I had expected. In particular, I was extremely happy to see that all the measurements were correct, and that the pieces fit together the first time I constructed the prototype. Furthermore, I was also very happy to see that the graph columns still had the look of a wave (representing rainfall) in real life. Overall, I was filled with confidence heading into the next stage of construction.

Figure 17: Gluing together the hi-fidelity prototype pieces
Figure 18: The first stage of construction completed

This next stage of constructing the hi-fidelity prototype was by far the most challenging part of the process. It included soldering blue LEDs, resistors, and wires together, and then wiring them up to the Arduino inside the device case. Furthermore, I also had to figure out a way to wire up a button so that it could sit in the middle of the device and be pressed from the top of the case.

Firstly, I did a soldering and wiring test on one LED, just to make sure that what I was doing would definitely work with the Arduino, before soldering all 10 LEDs. Figure 19 below shows this first test, which has a resistor soldered to the positive LED pin, a wire soldered to the resistor, and another wire soldered to the negative LED pin, allowing the LED to be grounded.

Figure 19: The first test of wiring and soldering with an LED

After I knew that this method would definitely work on a larger scale with the Arduino, I began soldering all 10 LEDs and wiring and inserting them inside the prototype case. The only difference when wiring the LEDs at this scale was that I had to connect all the ground wires (the wires on the negative LED pins) together into a chain. This was done not only because the Arduino board didn’t have enough ground pins, but also because creating a ground chain saved time and was easier to solder and put together. Figure 20 below shows half of the LEDs soldered together, placed permanently inside the top of the case, and connected to the Arduino. For reference, a short timelapse video showing me soldering some of the wires together can be found below.

Figure 20: The LEDs being soldered and rigged up to be placed inside the prototype case

While soldering all the wires to the LEDs was technically simple, it was very time consuming. However, the most challenging part of this stage of the process was figuring out a way to wire up the button so that it could be placed in the middle of the device for the user’s convenience. I tested a few concepts to see how this could be done. Firstly, I tried putting the average column in the middle of the device, with a stick running through the column down to the bottom of the case, where the button sat. This concept also required the average column to have a small square at the top that the user could press to push the button. At first, it seemed like this concept was working, but because of the modular nature of the device’s structure, only half the column could be glued onto the case, meaning it was unstable and fell off after a short period of time. Figures 21 and 22 show what this first button iteration looked like.

Figures 21 and 22: The first iteration of the button design

 

After I realised this wouldn’t work, I came up with another concept, which placed the average column on the far-left side of the device, with a cylindrical button on its own in the middle. This worked much better because the device kept a sturdy structure, and the button could be pressed very easily. This was once again a very time-consuming part of the process, mostly because it took me so long to figure out how to design this button to be practical and user friendly. However, once I had solved the problem and got it working, it was very rewarding.

The final part of creating this project artefact was making the visual story video using all the imagery I had collected over the previous 4 months (May to August). All that was needed to compile the images and edit them into an evocative video was Adobe Premiere Pro. Using this program, I was able to add all the images and videos, fade them in and out, add text to give the audience some context, and add some evocative, royalty-free music. After creating the video and finalising all the necessary timings for the images, I then had to take those timings and apply them to the code, so that the LEDs in certain columns would light up when their respective month was shown in the visual story video. A small timelapse showing the process of creating this video can be found below.
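
As an illustration of how these timings could be applied to the code (a hedged sketch of the approach, not my exact final code), an event table can pair each timestamp from the video with an LED pin and an on/off state, with the loop stepping through it using millis() so nothing blocks. All pin numbers and timestamps below are placeholders.

```cpp
// Sketch of mapping video timings onto the LED sequence.
// Pin numbers and timestamps are illustrative placeholders only.

struct LedEvent {
  unsigned long timeMs;  // milliseconds after the button press
  int pin;               // LED pin for that month's column
  bool on;               // true = turn the LED on, false = off
};

// Each LED turns on and off in step with its month in the video.
LedEvent events[] = {
  {2000, 3, true}, {9000, 3, false},   // hypothetical first column
  {9000, 4, true}, {16000, 4, false},  // hypothetical second column
};
const int eventCount = sizeof(events) / sizeof(events[0]);

const int buttonPin = 2;
unsigned long startTime = 0;
int nextEvent = 0;
bool running = false;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);  // button starts the sequence
  pinMode(3, OUTPUT);
  pinMode(4, OUTPUT);
}

void loop() {
  if (!running && digitalRead(buttonPin) == LOW) {
    startTime = millis();  // button pressed: restart the timeline
    nextEvent = 0;
    running = true;
  }
  // Fire each event once its timestamp has passed, without blocking.
  while (running && nextEvent < eventCount &&
         millis() - startTime >= events[nextEvent].timeMs) {
    digitalWrite(events[nextEvent].pin, events[nextEvent].on ? HIGH : LOW);
    nextEvent++;
  }
  if (running && nextEvent >= eventCount) {
    running = false;  // sequence finished until the next press
  }
}
```

Keeping all the timings in a single table like this makes it straightforward to update the code whenever the video edit changes.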

Overall, this was one of the easiest parts of the entire process. While it was time consuming because of the amount of content I had to sort through and edit, the work itself was straightforward, and I never ran into any problems while creating the video. What I liked most about this video was that I believed it did a good job of connecting the images to the data, thanks to the LEDs in the device, the powerful images, and the context given to the audience through the text in the video.

 

After all the iterating, developing, and construction was completed, I finally had a complete, working prototype ready for final playtesting and feedback. This working prototype showed the audience a graph detailing the monthly rainfall averages for Australia from January 2017 to August 2020 (the length of the drought), with a screen above the device telling the audience what they are looking at and what to do to start the device. Once the audience understands what to do, they press the button in the middle of the device, which triggers the timed LED sequence and the visual story video, creating a visual data story.

 

Because this final prototype was completed only a week before the honours class exhibition, the playtesting for this prototype was fairly brief, mostly because I wouldn’t have had time to make any major changes. Therefore, playtesting was completed in class with peers, and I made use of the big projector to show the visual story video on a large screen, replicating what it would look like in the exhibition. I got a couple of students to come up, look at the data and the screen, and then press the button and watch the video.

 

The feedback I got from this final round of playtesting was almost all positive. Most playtesters said that they liked the new inclusion of the screen at the start of the interaction, because it allowed them to understand more clearly what the data was before viewing the visual story video. Furthermore, users also expressed that they particularly liked the power of the images; however, they said that the timelapse of the bus ride from Roma to Wallumbilla didn’t give them the same powerful stillness they experienced with the still images.

Figure 23: The final prototype set up for playtesting

References

Bureau of Meteorology. (2020). Climate Summaries Archive. http://www.bom.gov.au/climate/current/statement_archives.shtml