{"id":830,"date":"2019-01-22T07:13:56","date_gmt":"2019-01-22T14:13:56","guid":{"rendered":"http:\/\/blogs.unsw.edu.au\/clic\/?page_id=830"},"modified":"2019-01-31T02:43:06","modified_gmt":"2019-01-31T09:43:06","slug":"experiment-two","status":"publish","type":"page","link":"https:\/\/blogs.unsw.edu.au\/clic\/archive\/ruth-shipman\/experiments\/experiment-two\/","title":{"rendered":"EXPERIMENT TWO"},"content":{"rendered":"<h1 style=\"text-align: center\"><strong>MAN CODE STRATOSPHERE<\/strong><\/h1>\n<p><iframe loading=\"lazy\" width=\"890\" height=\"400\" scrolling=\"no\" frameborder=\"no\" src=\"https:\/\/w.soundcloud.com\/player\/?visual=true&#038;url=https%3A%2F%2Fapi.soundcloud.com%2Ftracks%2F567527706&#038;show_artwork=true&#038;maxwidth=890&#038;maxheight=1000&#038;dnt=1\"><\/iframe><\/p>\n<p><iframe loading=\"lazy\" width=\"890\" height=\"400\" scrolling=\"no\" frameborder=\"no\" src=\"https:\/\/w.soundcloud.com\/player\/?visual=true&#038;url=https%3A%2F%2Fapi.soundcloud.com%2Ftracks%2F567524451&#038;show_artwork=true&#038;maxwidth=890&#038;maxheight=1000&#038;dnt=1\"><\/iframe><\/p>\n<p>The nature of the project itself (refining something down to a point and then using that point as a foundation for extrapolation\/creation) was quite inspiring. I became interested in using this somewhat meta reference to the assignment process as a conceptual starting point for a making process. That is, I wanted to examine <em>identity\u00a0<\/em>(one of the first things that came to mind when I discovered my word) as a concept by\u00a0recycling the process of refining then extrapolating, and apply it to my media.<\/p>\n<p>I felt this was particularly relevant given the media that was generated by my process. <i>Apocalypse Now Redux\u00a0<\/i>(2001) is an entity that is both entirely new and a facsimile of something else (<i>Apocalypse Now <\/i>(1979)). 
As mentioned above, when I found <i>MAN<\/i> as my word, some of the first things that came to mind were notions of identity (particularly surrounding gender and masculinity). I was interested in the performative nature of masculinity, and how those presenting as male navigated their identities and their surroundings, and, perhaps most importantly,\u00a0<em>the perceptions of that identity.<\/em><\/p>\n<p>Thinking about identity led me back to my thoughts about extrapolation\/creation, and how I might apply this to my word and media to generate an artwork. I chose to focus on the still image of the media, and to find a way to break that down and recreate something else. I thought an interesting way to do this would be through examining the pixels (the atoms\/foundation\/identity of the image), and seeing if I could use that data to make something else, or present something that would be viewed as something else.<\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1105 size-large\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy-1024x441.jpg\" alt=\"\" width=\"890\" height=\"383\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy-1024x441.jpg 1024w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy-300x129.jpg 300w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy-768x331.jpg 768w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUALCropped-Apocolypse-Now-Redux-COPY-RS-copy.jpg 2026w\" sizes=\"auto, (max-width: 890px) 100vw, 890px\" \/><\/a><\/p>\n<p>I wanted to create a new entity, so I used an online colour extractor (another serendipitous crossover here in terms of etymology and 
concept) to pull out the pixel colours present in the above image, which is the frame where Lieutenant General Corman utters the word <em>MAN<\/em>. There were several numerical variables to set in the extractor, so I decided to use the numbers that had repeatedly appeared in the earlier stages of this project \u2013 533 and 11. These determined how many colours were extracted from the image; the colours appear in the form of their hex codes (a six-digit code that represents a specific colour, usually based on RGB values, and is associated with HTML coding).<\/p>\n<p>After discussions with Paul and the group, I decided to explore a process that was suggested to further underline the connection between my developing concept, my media and my word, <i>MAN<\/i>. Paul suggested changing the identity of the media\/image further (i.e. changing it from a 2D image to sound) by adding the word <i>MAN<\/i> into the code that made up the image.<\/p>\n<p>A classmate from the group, Michael, who has (much) more of a background in video\/image\/media than I do, helped me find an online hex editor that would let me upload my image and convert it to code for me to change.\u00a0I decided to follow the initial process, where I would feed an image into a colour extractor; however, I added a step where I would use the online hex editor to edit the code of the image (randomly inserting MAN into the code) prior to feeding the image to the colour extractor.<\/p>\n<p>To again reference the numbers that were so significant to the beginning of this assignment, I added <i>MAN<\/i> to the code 11 times, in random places. 
The edited code generated this image:<\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUAL-MAN-code-image-for-colour-extractor.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1107 size-large\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUAL-MAN-code-image-for-colour-extractor-1024x443.jpg\" alt=\"\" width=\"890\" height=\"385\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUAL-MAN-code-image-for-colour-extractor-1024x443.jpg 1024w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUAL-MAN-code-image-for-colour-extractor-300x130.jpg 300w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/LOW-QUAL-MAN-code-image-for-colour-extractor-768x332.jpg 768w\" sizes=\"auto, (max-width: 890px) 100vw, 890px\" \/><\/a><\/p>\n<p>Using the online colour extractor on this image generated the below hex codes and their associated colours:<\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1109 size-large\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE-1024x710.jpg\" alt=\"\" width=\"890\" height=\"617\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE-1024x710.jpg 1024w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE-300x208.jpg 300w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE-768x533.jpg 768w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/JPEG-Colour-swatch-and-code-MAN-CODE.jpg 1750w\" sizes=\"auto, (max-width: 890px) 100vw, 890px\" \/><\/a><\/p>\n<p>To make the new identity\/perception,\u00a0<em>Man code stratosphere<\/em>, I isolated the codes and\u00a0used the keyboard option on GarageBand to assign each letter (a to f) and each number (0 to 9) a note. 
I left out the hashes for this first part of the experiment, to see if I could generate a more \u201cmusical\u201d soundscape, and I wanted to keep it as simple as possible, as I\u2019d never used the program before (plus, I\u2019m non-musical to the point of failing the recorder in Grade Three). I also limited the codes included to a sample of the first eight; this was a choice made with the presentation duration in mind! See the audio file at the top of this page for this experiment sample.<\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/IMG_3050.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-1899 size-large\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/IMG_3050-768x1024.jpg\" alt=\"\" width=\"768\" height=\"1024\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/IMG_3050-768x1024.jpg 768w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/IMG_3050-225x300.jpg 225w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/a><\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-16.01.32.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1915 size-large\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-16.01.32-1024x640.png\" alt=\"\" width=\"890\" height=\"556\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-16.01.32-1024x640.png 1024w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-16.01.32-300x188.png 300w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-16.01.32-768x480.png 768w\" sizes=\"auto, (max-width: 890px) 100vw, 890px\" \/><\/a><\/p>\n<p>I played around with different instruments on GarageBand before settling on the Stratosphere option in the Synthesiser Soundscape collection. 
As I mentioned, I&#8217;m profoundly <em>not<\/em> musical, but I wanted to distort the sound past a simple collection of piano notes, which weren&#8217;t very interesting (or particularly nice to listen to). I had an idea that I wanted the soundscape to blend and sound somewhat harmonious, but maintain the element of unsettling tunelessness that the original piano notes achieved.<\/p>\n<p>For the second, longer experiment, I used\u00a0<em>all<\/em> of the hex code data, for a full conversion from image to soundscape (see the full version at the top of this page). For this version, I put the hex code data in Word, and then converted each digit of code to its corresponding keyboard command (including the hash this time).<\/p>\n<p><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Keyboard-keyboard.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-1918 size-large aligncenter\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Keyboard-keyboard-1024x768.jpg\" alt=\"\" width=\"890\" height=\"668\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Keyboard-keyboard-1024x768.jpg 1024w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Keyboard-keyboard-300x225.jpg 300w, https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Keyboard-keyboard-768x576.jpg 768w\" sizes=\"auto, (max-width: 890px) 100vw, 890px\" \/><\/a><\/p>\n<div id=\"attachment_1919\" style=\"width: 518px\" class=\"wp-caption aligncenter\"><a href=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-15.10.43.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-1919\" class=\"wp-image-1919 size-full\" src=\"http:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-15.10.43.png\" alt=\"\" width=\"508\" height=\"964\" srcset=\"https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-15.10.43.png 508w, 
https:\/\/blogs.unsw.edu.au\/clic\/files\/2019\/01\/Screen-Shot-2019-01-31-at-15.10.43-158x300.png 158w\" sizes=\"auto, (max-width: 508px) 100vw, 508px\" \/><\/a><p id=\"caption-attachment-1919\" class=\"wp-caption-text\">Hex codes converted for input into GarageBand<\/p><\/div>\n<p>With both of these experiments, audiences experience the image in a very different way to simply looking at it. I have taken the data that makes up the image and used it to create a new way of &#8216;viewing&#8217; the image through sound. I feel that this process doesn&#8217;t so much change the\u00a0<em>identity<\/em> of the image, given the fundamental data was identical, but rather the\u00a0<em>perception<\/em> of the identity of the image, which is what I set out to achieve with this experiment.<\/p>\n<p>If I were to extend this experiment into more resolved works, which I am quite interested in doing, I&#8217;d like to create lengthy tracks that incorporate as many hex codes as possible, so the audience has the opportunity to &#8216;hear&#8217; a complete image. I&#8217;d like to display the tracks in a space that references some elements of sensory deprivation: for example, in a wall-to-wall cushioned room with no light or visual input.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>MAN CODE STRATOSPHERE The nature of the project itself (refining something down to a point and then using that point as a foundation for extrapolation\/creation) was quite inspiring. 
I became interested in using this somewhat meta reference to the assignment process as a&hellip;<\/p>\n","protected":false},"author":101332,"featured_media":0,"parent":470,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-830","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/pages\/830","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/users\/101332"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/comments?post=830"}],"version-history":[{"count":17,"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/pages\/830\/revisions"}],"predecessor-version":[{"id":1976,"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/pages\/830\/revisions\/1976"}],"up":[{"embeddable":true,"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/pages\/470"}],"wp:attachment":[{"href":"https:\/\/blogs.unsw.edu.au\/clic\/wp-json\/wp\/v2\/media?parent=830"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
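The digit-to-note process the post describes (assigning each hex character, 0 to 9 and a to f, its own note and playing a colour's six-digit code as a phrase) can be sketched in code. This is a minimal illustration only: the post used GarageBand's keyboard layout, whose exact pitch mapping isn't documented here, so the chromatic scale starting at middle C below is an assumption, as are the function names.

```python
# Sketch of the hex-digit-to-note idea: each of the 16 hex characters
# (0-9, a-f) gets its own pitch, so any colour's six-digit hex code
# becomes a short melodic phrase. The chromatic scale from middle C
# (MIDI 60) is an illustrative assumption, not the GarageBand mapping.

HEX_DIGITS = "0123456789abcdef"

# One MIDI note per hex digit: '0' -> 60 (middle C), '1' -> 61, ... 'f' -> 75.
DIGIT_TO_MIDI = {d: 60 + i for i, d in enumerate(HEX_DIGITS)}

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]


def midi_to_name(midi: int) -> str:
    """Convert a MIDI note number to a name like 'C4' (using the C4 = 60 convention)."""
    octave = midi // 12 - 1
    return f"{NOTE_NAMES[midi % 12]}{octave}"


def hex_colour_to_notes(code: str) -> list[str]:
    """Turn a hex colour code such as '#1a2b3c' into a six-note phrase."""
    digits = code.lower().lstrip("#")
    return [midi_to_name(DIGIT_TO_MIDI[d]) for d in digits]


if __name__ == "__main__":
    # Pure green becomes an alternating two-pitch phrase.
    print(hex_colour_to_notes("#00ff00"))  # ['C4', 'C4', 'D#5', 'D#5', 'C4', 'C4']
```

A full track in the spirit of the second experiment would simply concatenate the phrases for every extracted colour, optionally prefixing each with a dedicated "hash" note as the post does in its longer version.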