
Creating musical interfaces

The alarm bell, the characteristic chirp that unlocks a car's doors: we constantly interact with technology through sound. Yet many still hold on to the belief that an interface conveys information primarily through the screen, while the enormous potential of the language of sound goes ignored.



I have been designing for 16 years, and in my free time I write music. Having both of these skills convinced me that interfaces should combine, at minimum, sound and image. The team at Udemy, where I currently work, is developing a new approach to learning. During a brainstorming session, the idea came up of attaching audio signals to intermediate screens. I got carried away and began experimenting with synthesizer and MIDI samples to give the user audio feedback as they progress through and complete a course. We tried different instruments, chords, and tempos. The challenge was for the audio content to communicate progress intelligently while also expressing our company's values. What can sounds say about us? In the end, we settled on short, unobtrusive motifs in A major, played on a marimba and a harp.

After this experience, I wondered: what if, instead of using audio signals in interfaces as mere feedback for the user, we used harmonies, notes, and chords as symbols? What if we chose an instrument or set of instruments that fits our brand and is in tune with the "voice" of our product? What if music were applied in such a way that the user could read the message embedded in it?
Although hearing is one of our main channels of perception, most interfaces focus on the visual. Sound feedback can improve the user experience, yet with few exceptions developers rely only on what can be displayed. Audio feedback helps users by letting them look away from the device and do several things at once. It is also convenient because it signals when an action has been registered, processed, or completed without involving the screen. But designing with sound is not so easy. Many aspects need to be taken into account to make the experience enjoyable, meaningful, and practical.

I enjoyed this experience so much that I decided to create a collection of music recordings that others could use in their own work. The result is more than 200 audio samples: harmonies, sequences, sound effects, spoken phrases, and chord combinations performed on 8 different instruments.

The entire archive can be downloaded here. And if you are interested in my background, my advice on creating musical interfaces, or the story behind these samples, read on.



If a tree falls in the forest and no one is around, does the notification arrive?

Before we talk about music, let's first look at how we decode and, ultimately, create the meanings hidden in sounds. Audio content, even when it is not speech, is full of information that helps us better understand our surroundings; this process has long been part of our everyday life. Just by listening, we can tell that the batter hit the ball, that someone pulled open a Velcro fastener, or that the kettle has boiled. We rely on audio feedback in devices such as televisions, microwaves, cars, toys, and mobile phones. Sound interfaces can serve as a pleasant and useful addition to visual ones (or even a substitute, given the growing popularity of wrist devices).

When designing with sound, it is important to determine, at the very earliest stages of work, what meaning each sound will convey. A signal that carries important information should differ noticeably from those that simply accompany visual content. The visual and auditory channels of perception are fundamentally different, so sound can convey information that visual content cannot. Sound uniquely reinforces three basic principles of interaction design: visibility, feedback, and consistency.

Sound design can express different meanings: recurrence, the passage of time, a call to action, or a warning. The possibilities are endless, but that does not mean every interaction should include sound. Audio content should support interaction, not hinder, interfere with, or distract from it. To avoid annoying the user with monotonous signals, prefer short and simple sounds that are informative in their very form. That way, the sound conveys the meaning built into it from the start.
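To make this event-to-sound mapping concrete, here is a minimal sketch of how interface events might be wired to short audio cues in a web app. The event names and file paths are illustrative assumptions, not taken from the archive described later.

```typescript
// Minimal sketch: mapping interface events to short, preloaded audio cues.
// Event names and file paths are hypothetical.
type UiEvent = "success" | "error" | "warning" | "progress";

const cueFiles: Record<UiEvent, string> = {
  success: "/sounds/marimba-success.mp3",
  error: "/sounds/marimba-error.mp3",
  warning: "/sounds/harp-warning.mp3",
  progress: "/sounds/harp-progress.mp3",
};

// Preload once so playback is immediate and does not interrupt the interaction.
const players = new Map<UiEvent, HTMLAudioElement>(
  (Object.keys(cueFiles) as UiEvent[]).map(
    (e): [UiEvent, HTMLAudioElement] => [e, new Audio(cueFiles[e])]
  )
);

export function playCue(event: UiEvent, volume = 0.4): void {
  const audio = players.get(event);
  if (!audio) return;
  audio.volume = volume;  // keep cues quiet and unobtrusive
  audio.currentTime = 0;  // allow rapid re-triggering
  void audio.play();      // play() returns a promise in modern browsers
}
```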

Design and music should sing together

Design is my main passion, but music also holds a special place in my heart. My relationship with music unfolded in a not entirely traditional, yet fairly ordinary way. I started out playing in a punk band as a teenager (I played terribly, by the way), then switched to synth punk with MIDI and virtual studios, then moved on to nu-disco with synthesizers and arpeggiators (James Murphy wouldn't approve). For a while I "conquered" audiences with the charm of música sabrosa in a Latin American band, and then decided to master the "lost art" of DJing (my favorites are Mexican weddings).

Over the years that I have worked as a designer and written music for pleasure, I have come to the following conclusion: the creative process in both cases is almost the same. Whether you are writing a song, drawing a comic, or crafting a user experience, the goal is always the same: to tell a story. You follow the same universal structure: exposition, rising action, climax, falling action, and resolution. The trick is to capture and hold the audience's attention.

The similarity does not end with structure. The characteristics of sound (pitch, timbre, duration, volume, direction) parallel the elements of design (shape, color, size, texture, direction). The principles of composing music and of design also have much in common (composition, form, rhythm, texture, harmony, similarity and contrast).

Why am I telling you all this? Because I believe that in any interface, sound and visual elements should be integrated as one. For example, when creating a warning module, we can use the color red and an exclamation-mark icon; both of these symbols are familiar to users and evoke a sense of danger or risk. Similarly, a high, loud sound with an unusual timbre can be chosen as the warning signal. There should be a connection between the visual and audio content of an interface, whether through similarity or complementarity.
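As a rough illustration of pairing a visual warning with a matching audio cue, here is a minimal sketch using the Web Audio API; the CSS class name and tone parameters are assumptions for demonstration only.

```typescript
// Sketch: pairing a visual warning with a matching audio cue.
// The CSS class name and tone parameters are illustrative assumptions.
const warningCtx = new AudioContext();

function playWarningTone(): void {
  // A short, high, loud tone with an unusual (square) timbre reads as a warning.
  const osc = warningCtx.createOscillator();
  const gain = warningCtx.createGain();
  osc.type = "square";
  osc.frequency.value = 880; // A5: noticeably higher than typical UI cues
  gain.gain.setValueAtTime(0.5, warningCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, warningCtx.currentTime + 0.3);
  osc.connect(gain).connect(warningCtx.destination);
  osc.start();
  osc.stop(warningCtx.currentTime + 0.3);
}

export function showWarning(element: HTMLElement, message: string): void {
  element.classList.add("warning"); // red background and exclamation icon defined in CSS
  element.textContent = message;
  playWarningTone();                // the sound mirrors the visual urgency
}
```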

BlackBerry compares its visual interface language with its sound counterpart in its "Earconography":

"An envelope icon can come in different colors, with or without a stamp, tilted at a 25-degree angle; as long as it still looks like an envelope, users will understand what the icon means. The same goes for sounds."

Finding the right sound can get messy

The choice of a suitable sound design depends on the purpose of your product or service, as well as on its style. At the most basic level, an interface can use either speech or non-speech audio signals, so-called "earcons" (sound icons). Applications such as Facebook, TiVo, iPhone, and Skype use audio signals to create a sense of belonging to their cohesive ecosystems. Earcons help a product better represent its brand in the market or emphasize its personal style. Should the sound feel metallic or wooden? Synthetic or natural? Massive or small? Complex or simple? The answers to these questions help determine the material, the type of instrument (wind, percussion, string), and the overall theme.

The variability of sound has no limits. You can change any characteristic and get a completely different result with each new combination. Moreover, the characteristics of sound influence one another: volume affects perceived pitch, pitch can change perceived volume, and timbre and duration can also affect each other. Diving into all the technical details can be difficult, and hiring a sound engineer does not always fit the budget. So I would recommend experimenting a little and trusting your instincts when selecting the right sound design for your project. Or just hire a teenager from a punk band.

Ideally, musical interfaces should be partly ideographic and partly metaphorical. In other words, they should contain both standard sound attributes and abstract categories such as size, material, speed, or weight. I like to call these two flavors of sound design "flat" and "skeuomorphic". For example, when a dialog box closes in an application, you can play the natural sound of a closing door directly, or you can use a synthesized imitation (a skeuomorph) of that sound with adjusted timbre, speed, and intensity.
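As a rough sketch of the difference, here is how the two approaches to a "dialog closed" cue might look in code; the sample path, frequencies, and timings are assumptions for illustration.

```typescript
// Sketch contrasting a cue based on a recorded real-world sound with a
// short abstract synthesized tone. The sample path and tone parameters
// are illustrative assumptions.
const closeCtx = new AudioContext();

// Recorded: play a closing-door sample when a dialog closes.
async function playRecordedClose(): Promise<void> {
  const response = await fetch("/sounds/door-close.mp3"); // hypothetical asset
  const buffer = await closeCtx.decodeAudioData(await response.arrayBuffer());
  const source = closeCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(closeCtx.destination);
  source.start();
}

// Synthesized: a brief descending tone that only suggests "something closed".
function playSynthesizedClose(): void {
  const osc = closeCtx.createOscillator();
  const gain = closeCtx.createGain();
  osc.frequency.setValueAtTime(440, closeCtx.currentTime);
  osc.frequency.exponentialRampToValueAtTime(220, closeCtx.currentTime + 0.15);
  gain.gain.setValueAtTime(0.3, closeCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, closeCtx.currentTime + 0.15);
  osc.connect(gain).connect(closeCtx.destination);
  osc.start();
  osc.stop(closeCtx.currentTime + 0.15);
}
```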



Musical interfaces at their best

Most people have a general sense of music, even without hands-on experience or formal training. Experimenting with its various characteristics, such as rhythm, harmony, instrument, motif, or tempo, can help define the meaning and purpose behind each sound.

Among the applications that masterfully use music in dialogue with the user, I would name Monument Valley and Okey. It is no coincidence that both are games. Game designers have long explored the use of music in interfaces, and in my opinion developers can learn a lot from them. Chords can add depth to an interface in which tones of different pitches play continuously. Harmonic motion created by moving melodies can evoke associations with progress, success, or error. Other events, such as completion, departure (sending, uploading), or return (receiving, downloading), can be represented by a modulation from dominant to tonic and back.
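As a concrete example of that dominant-to-tonic idea, here is a minimal sketch of a "completion" cue as a V-to-I resolution in D major using the Web Audio API; the chord voicings, timbre, and timing are illustrative assumptions, not taken from the archive.

```typescript
// Sketch: a "completion" cue as a dominant-to-tonic resolution (V -> I in D major).
// Chord voicings and timing are illustrative assumptions.
const chordCtx = new AudioContext();

function playChord(frequencies: number[], start: number, duration: number): void {
  for (const freq of frequencies) {
    const osc = chordCtx.createOscillator();
    const gain = chordCtx.createGain();
    osc.type = "triangle";          // a soft, mallet-like timbre
    osc.frequency.value = freq;
    gain.gain.setValueAtTime(0.15, start);
    gain.gain.exponentialRampToValueAtTime(0.001, start + duration);
    osc.connect(gain).connect(chordCtx.destination);
    osc.start(start);
    osc.stop(start + duration);
  }
}

export function playCompletionCue(): void {
  const now = chordCtx.currentTime;
  playChord([220.0, 277.2, 329.6], now, 0.25);       // A major (V): A3, C#4, E4
  playChord([293.7, 370.0, 440.0], now + 0.25, 0.5); // D major (I): D4, F#4, A4
}
```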

Musical messages can also appeal to the emotions. In Western musical culture, major keys evoke positive feelings (just recall most pop music), while minor melodies are perceived as sad and melancholic (for example, "Love Will Tear Us Apart" by Joy Division or "New York, I Love You but You're Bringing Me Down" by LCD Soundsystem). Choosing a key can help give your product the right mood.

The archive I compiled uses D major. I created various sequences of notes, octaves, and chords that can be combined consonantly. In the future, I plan to update it with additional octaves.

Using the archive is minuet business

The archive was created by recording analog and digital synthesizers in Ableton Live. Eight instruments (bell, guitar, harp, marimba, piano, whistle, flute, xylophone), several sound effects (R2D2, The X-Files), and voices (both male and female) were used to make it.

Each instrument comes with 20 to 40 sounds. In various combinations they can represent a sequence of actions, success, error, warnings, and other simple interactions. I also added a few embellished chords in case you want to breathe new life into your product and give it some zest.

The folder structure is quite simple: "Root folder / Instrument / File". File names follow the pattern "instrument_concept_note_number.extension". I recommend sticking to one instrument's sounds across the different types of interactions. But if you want to break the rules, you can combine two instruments and see what happens.
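As a sketch of how that naming convention might be used in practice, here is a small helper that resolves a sample path and plays it; the concrete concept names, note labels, and file extension are assumptions and should be checked against the archive itself.

```typescript
// Sketch: resolving a sample path from the described structure
// ("Root folder / Instrument / File", names like "instrument_concept_note_number.extension").
// The concrete file names below are assumptions; verify them against the archive.
type Instrument =
  | "bell" | "guitar" | "harp" | "marimba"
  | "piano" | "whistle" | "flute" | "xylophone";
type Concept = "success" | "error" | "warning" | "sequence";

function samplePath(
  root: string,
  instrument: Instrument,
  concept: Concept,
  note: string,
  index: number
): string {
  return `${root}/${instrument}/${instrument}_${concept}_${note}_${index}.wav`;
}

// Example: use the same instrument for one family of interactions.
const successSound = new Audio(samplePath("/sounds", "marimba", "success", "D4", 1));

export function onTaskComplete(): void {
  successSound.currentTime = 0;
  void successSound.play();
}
```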

Concluding on a high note

Music can influence how we interact with visual interfaces. It helps the user dive deeper into the story and become immersed in it. Well-designed musical interfaces can enhance the experience and make a product feel personal, but used incorrectly, sounds become distracting and annoying (remember the Flash sites of the 2000s and those terrible blogs?). Audio is something very personal, and you have to be careful not to cross the line when communicating with users.

I hope that the archive I created will help you build richer experiences and inspire you to create amazing products.

Source: https://habr.com/ru/post/318800/

