Playing Guitar Tabs in Rust

If you've ever tried to learn guitar, chances are you're familiar with guitar tablatures.

This is a simple way to visualize guitar music, an alternative to sheet music, where ASCII symbols represent strings and frets.

For example, here are the first four bars of Deep Purple's Smoke on the Water:

e|-----------------|-----------------|-----------------|-----------------|
B|-----------------|-----------------|-----------------|-----------------|
G|-----3---5-------|---3---6-5-------|-----3---5-----3-|-----------------|
D|-5---3---5-----5-|---3---6-5-------|-5---3---5-----3-|---5-------------|
A|-5-------------5-|-----------------|-5---------------|---5-------------|
E|-----------------|-----------------|-----------------|-----------------|

This song is played in standard tuning (EADGBe), as indicated by the letters on the left, which show the tuning of each string. The numbers indicate where to place your fingers on the fretboard.
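As an aside (this is my own illustration, not code from the project), a string letter and fret number together determine a pitch: each open string has a base pitch, and each fret raises it by one semitone. A minimal sketch in Rust, assuming the standard MIDI note numbers for the open strings:

```rust
// MIDI note numbers of the open strings in standard tuning (EADGBe),
// from low E (E2 = 40) up to high e (E4 = 64).
const STANDARD_TUNING: [u8; 6] = [40, 45, 50, 55, 59, 64];

/// Pitch of `fret` pressed on `string` (0 = low E): one semitone per fret.
fn pitch(string: usize, fret: u8) -> u8 {
    STANDARD_TUNING[string] + fret
}

fn main() {
    // The riff's "5" on the A string is a D (MIDI note 50).
    println!("{}", pitch(1, 5)); // 50
}
```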

In addition to the text representation, the binary format used by the Guitar Pro software has become a de facto standard for rendering tablature and synthesizing its sound.

These binary files, depending on the software version, have the extension .gp3, .gp4, .gp5 or .gp6; they can easily be found online on websites such as Ultimate Guitar.

Although the software for playing tablature is proprietary, some versions of the file format are well documented, and there are even open source projects that can read them.

Probably the best open source tablature player is TuxGuitar; it has so many features that it is an amazing tool for learning guitar.

Because TuxGuitar is no longer supported and is written in Java, I decided it would be interesting to write my own tablature player in Rust.

I named my project Ruxguitar, combining the words Rust and guitar.

The project is still in its early stages, but I believe it is functional enough to announce to the world. That's why I wrote this post!

I won't describe the project's capabilities, but will simply show a video in which the tablature player plays a fairly complex composition:

Of course, the source code can be found on GitHub; prebuilt binaries for Linux, macOS and Windows are also available.

The first step in creating a tablature player is to parse the binary tablature file.

During my research I found the .gp4 file format specification on dguitar.

The file has approximately the following structure:

  1. File version to know which version of the file format is being used
  2. Information about the song (i.e. title, subtitle, artist, album, etc.)
  3. Lyrics of the composition
  4. Number of bars and tracks
  5. The bars for each track, in the following order:
    • bar 1/track 1
    • bar 1/track 2
    • bar 1/track m
    • bar 2/track 1
    • bar 2/track 2
    • bar 2/track m
    • bar n/track 1
    • bar n/track 2
    • bar n/track m
  6. In each measure we find the number of beats to be read
  7. In each beat we find the duration of the beat and the number of notes read
  8. In each note we find a string, a fret, a duration, an effect, and so on.
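This hierarchy maps naturally onto nested Rust structs. A simplified, illustrative sketch (the field names are mine, not the project's actual types):

```rust
// Illustrative, simplified model of the parsed hierarchy: a song holds
// tracks, a track holds measures, a measure holds beats, a beat holds notes.
struct Note { string: u8, fret: u8 }  // plus duration, effects, ...
struct Beat { notes: Vec<Note> }      // plus the beat duration
struct Measure { beats: Vec<Beat> }
struct Track { measures: Vec<Measure> }
struct Song { tracks: Vec<Track> }    // plus metadata, lyrics, tempo, ...

fn main() {
    let song = Song {
        tracks: vec![Track {
            measures: vec![Measure {
                beats: vec![Beat { notes: vec![Note { string: 4, fret: 5 }] }],
            }],
        }],
    };
    // One note in the first beat of the first measure of the first track.
    println!("{}", song.tracks[0].measures[0].beats[0].notes.len()); // 1
}
```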

To parse the tablature I decided to use the nom crate, because I had already worked with it when parsing a binary format.

Here's an overview of the code that drives the parser, to give a feel for what it looks like:

pub fn parse_gp_data(file_data: &[u8]) -> Result<Song, RuxError> {
    let (rest, base_song) = flat_map(parse_gp_version, |version| {
        map(
            tuple((
                parse_info(version),                                     // song info
                cond(version < GpVersion::GP5, parse_bool),              // triplet feel
                cond(version >= GpVersion::GP4, parse_lyrics),           // lyrics
                cond(version >= GpVersion::GP5_10, take(19usize)),       // master effect
                cond(version >= GpVersion::GP5, parse_page_setup),       // page setup
                cond(version >= GpVersion::GP5, parse_int_sized_string), // tempo name
                parse_int,                                               // tempo
                cond(version > GpVersion::GP5, parse_bool),              // hide tempo
                parse_signed_byte,                                       // key signature
                cond(version > GpVersion::GP3, parse_int),               // octave
                parse_midi_channels,                                     // midi channels
            )),
            move |(
                song_info,
                triplet_feel,
                lyrics,
                _master_effect,
                page_setup,
                tempo_name,
                tempo,
                hide_tempo,
                key_signature,
                octave,
                midi_channels,
            )| {
                // initialize the base song
                let tempo = Tempo::new(tempo, tempo_name);
                Song {
                    version,
                    song_info,
                    triplet_feel,
                    lyrics,
                    page_setup,
                    tempo,
                    hide_tempo,
                    key_signature,
                    octave,
                    midi_channels,
                    measure_headers: vec![],
                    tracks: vec![],
                }
            },
        )
    })(file_data)
    .map_err(|_err| {
        log::error!("Failed to parse GP data");
        RuxError::ParsingError("Failed to parse GP data".to_string())
    })?;
    // parse tracks and measures
    ...

The heavy lifting of parsing tracks and measures is done by another function, which I won't show for the sake of brevity.

Eventually I got tired of dealing with the different versions of the file format and decided to focus on the widely used .gp5 version.

To be honest, this part of the project turned out to be quite difficult because the file format is quite complex and the documentation is not always clear.

Luckily, I could study the parsers from TuxGuitar and the guitarpro crate to better understand the file format.

To ensure correctness, I wrote some unit tests for specific tablature files to make sure the parser works correctly.

This approach is useful when starting out, but it doesn't scale well, so I also validate some high-level invariants of the resulting Song structure against a folder containing several hundred tablatures.

Thanks to this, I found some bugs in the parser and am now confident that it works as it should.
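An invariant check of this kind can be sketched as follows (a hedged illustration with hypothetical names over a heavily simplified Song; the project's real type has many more fields):

```rust
// Hypothetical invariant: every track must contain exactly one measure
// per measure header parsed from the file.
struct MeasureHeader;
struct Track { measures: Vec<u32> } // measure payloads elided
struct Song { measure_headers: Vec<MeasureHeader>, tracks: Vec<Track> }

fn check_invariants(song: &Song) -> Result<(), String> {
    for (i, track) in song.tracks.iter().enumerate() {
        if track.measures.len() != song.measure_headers.len() {
            return Err(format!("track {i}: measure count mismatch"));
        }
    }
    Ok(())
}

fn main() {
    let song = Song {
        measure_headers: vec![MeasureHeader, MeasureHeader],
        tracks: vec![Track { measures: vec![0, 1] }],
    };
    println!("{}", check_invariants(&song).is_ok()); // true
}
```

Running such a check over a large folder of real-world tablatures catches parser bugs that hand-written unit tests would miss.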

So now we have a description of the tablature in memory, but no way to display it.

The user should not only be able to see the tablature, but also interact with it.

I wanted to use a native GUI library to make the app look and feel like a native app on all platforms.

The current state of GUI libraries in Rust required some research.

To handle playback synchronization, I needed an event-driven library that could also custom-render tablature onto some canvas abstraction.

Based on these conditions, I decided to try Iced.

Spoiler: I am very happy with my choice and have not tried any other libraries.

Iced

The Iced library is very well written, but it could use a bit more documentation. I recommend reading the source code of the examples to better understand how to use the library.

I started with the text editor example and gradually adapted it to my needs.

At some point I ran into a bug in version 0.12.0, which forced me to upgrade to the not-yet-released version 0.13.0.

Because of this, I had to depend on the main branch of the Iced repository, which was a little scary, but in the end everything worked out fine.

All the problems I found were related to the fact that Iced is under active development; I am very grateful to the maintainers for their hard work.

The library's architecture is based on messages and subscriptions that cause UI updates.

For example, I used these messages:

#[derive(Debug, Clone)]
pub enum Message {
    OpenFile, // open file dialog
    FileOpened(Result<(Vec<u8>, String), PickerError>), // file content and file name
    TrackSelected(TrackSelection), // track selection
    FocusMeasure(usize), // used when clicking on a measure in the tablature
    FocusTick(usize), // focus on a specific tick in the tablature
    PlayPause, // toggle play/pause
    StopPlayer, // stop playback
    ToggleSolo, // toggle solo mode
}

Here is a simplified entry point of the application:

impl RuxApplication {
    pub fn start(args: ApplicationArgs) -> iced::Result {
        iced::application(
            RuxApplication::title,
            RuxApplication::update,
            RuxApplication::view,
        )
        .subscription(RuxApplication::subscription)
        .theme(RuxApplication::theme)
        .font(ICONS_FONT)
        .centered()
        .antialiasing(true)
        .run()
    }
}

The application is built from functions orchestrated by the Iced runtime.

The update function has the signature Fn(&mut State, Message) -> C, where:

  • State is the mutable application state (here RuxApplication)
  • Message is the message to process
  • C is an output Task that can potentially produce new Messages

The view function has the signature Fn(&'a State) -> Widget and renders a Widget based on the current &State.

Drawing tablature

I started by writing code to neatly render a single measure on an iced::Canvas.
That is, the rendering code:

  • draws each string
  • for each beat, draws the notes on the strings and the potential beat effect (e.g. string muting)
  • for each note, adds a potential note effect (e.g. slide, legato, bend)
  • annotates the bar with additional information (e.g. bar number, tempo, section annotation, chord)

It took some tweaking to get the offsets right, but I like the end result.

Having collected a set of measure canvases, I assemble them into a responsive grid that displays the entire tablature, using the wrap widget from the iced-aw crate.

Bars can vary in length depending on the number of beats, making empty bars very short and bars with crazy guitar solos very long.

So now that we have the tablature in memory and a UI, it's time to start making sounds!

We want to turn every note of every beat of every bar of every track into a specific sound at the right time.

This can be achieved using a MIDI synthesizer, which is software that generates sounds based on MIDI events.

Synthesizing MIDI Events

There are different types of MIDI events, but the most important ones for us are NoteOn and NoteOff.

  • Note On: Indicates that a note is being played. Includes the note number (pitch) and velocity (the force with which the note is played).
  • Note Off: Indicates that the note is released.

For each note in the tablature, we can generate a pair of MIDI events annotated with the following data:

  • a timestamp, also called a tick, when the event should occur.
  • the track to which the event belongs.
pub enum MidiEventType {
    NoteOn(i32, i32, i16),  // midi channel, note, velocity
    NoteOff(i32, i32),      // midi channel, note
    ...
}

pub struct MidiEvent {
    pub tick: usize,
    pub event: MidiEventType,
    pub track: usize,
}
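To make the pairing concrete, here is a hedged sketch (the helper name and tick values are mine) that expands one note into its NoteOn/NoteOff pair:

```rust
#[derive(Debug, PartialEq)]
enum MidiEventType {
    NoteOn(i32, i32, i16), // midi channel, note, velocity
    NoteOff(i32, i32),     // midi channel, note
}

#[derive(Debug)]
struct MidiEvent { tick: usize, event: MidiEventType, track: usize }

/// Expand a single note into its NoteOn/NoteOff pair.
/// `start` and `duration` are in ticks; the name is illustrative.
fn note_events(channel: i32, key: i32, velocity: i16,
               start: usize, duration: usize, track: usize) -> [MidiEvent; 2] {
    [
        MidiEvent { tick: start, event: MidiEventType::NoteOn(channel, key, velocity), track },
        MidiEvent { tick: start + duration, event: MidiEventType::NoteOff(channel, key), track },
    ]
}

fn main() {
    // A quarter note (960 ticks) starting at tick 0: NoteOff lands at tick 960.
    let [on, off] = note_events(0, 50, 95, 0, 960, 0);
    println!("{} {}", on.tick, off.tick); // 0 960
}
```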

All these events are written into a single array, sorted by event tick.

This makes it possible to use binary search to find the next events to play at any given time.
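In std Rust terms, that lookup can be sketched with partition_point over the tick-sorted array (a simplified stand-in for the project's actual query code):

```rust
#[derive(Debug)]
struct MidiEvent { tick: usize }

/// Events whose tick falls in the half-open window [start, end),
/// located by binary search on the tick-sorted slice.
fn events_in_window(sorted: &[MidiEvent], start: usize, end: usize) -> &[MidiEvent] {
    let lo = sorted.partition_point(|e| e.tick < start);
    let hi = sorted.partition_point(|e| e.tick < end);
    &sorted[lo..hi]
}

fn main() {
    let events = vec![
        MidiEvent { tick: 0 },
        MidiEvent { tick: 960 },
        MidiEvent { tick: 1920 },
    ];
    // A window covering one quarter note starting at tick 960.
    println!("{}", events_in_window(&events, 960, 1920).len()); // 1
}
```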

These MidiEvents can be converted into an audio signal using a synthesizer before being sent to the audio output.

To implement the synthesizer I chose the rustysynth crate, which provides a convenient MIDI synthesizer.

Here is a simplified version of the code to play a MIDI event:

let synthesizer_settings = SynthesizerSettings::new(SAMPLE_RATE as i32);
let mut synthesizer = Synthesizer::new(&sound_font, &synthesizer_settings);

let midi_event = // find the next event to play
match midi_event.event {
    MidiEventType::NoteOn(channel, key, velocity) => {
        synthesizer.note_on(channel, key, velocity as i32);
    }
    MidiEventType::NoteOff(channel, key) => {
        synthesizer.note_off(channel, key);
    }
    ...
}

It is important to note that the synthesizer requires a sound font file to generate sound.

For simplicity's sake, I embed the TimGM6mb.sf2 sound font file into the binary at compile time.

const TIMIDITY_SOUND_FONT: &[u8] = include_bytes!("../../resources/TimGM6mb.sf2");

This makes the binary file slightly larger, but the user doesn't have to worry about finding the sound font file.

However, you can provide a different sound font via the --sound-font-file command line argument.

For example, I like to use FluidR3_GM.sf2, which is present on most systems and can easily be found online.

./ruxguitar --sound-font-file /usr/share/sounds/sf2/FluidR3_GM.sf2

Audio loop

The audio output stream is driven by a dedicated thread that produces sound at regular intervals.

For this I chose cpal, a cross-platform audio I/O crate.

Here is a simplified version of the code for setting up the audio loop:

let host = cpal::default_host();
let device = host.default_output_device().unwrap();

let config = device.default_output_config().unwrap();
let stream_config: cpal::StreamConfig = config.into();

let channels_count = stream_config.channels as usize;
assert_eq!(channels_count, 2);

// 4410 samples at 44100 Hz is 0.1 seconds
let mono_sample_count = 4410;

let mut left: Vec<f32> = vec![0_f32; mono_sample_count];
let mut right: Vec<f32> = vec![0_f32; mono_sample_count];

// create the audio loop
let stream = device.build_output_stream(
    &stream_config,
    move |output: &mut [f32], _: &cpal::OutputCallbackInfo| {
        let midi_events = // find the events to play
        for event in midi_events {
            // synthesize the events
            synthesizer.process(event)
        }

        // split the buffer into two channels (left and right)
        let channel_len = output.len() / channels_count;

        // render the audio signal
        synthesizer.render(&mut left[..channel_len], &mut right[..channel_len]);

        // interleave the left and right channels into the output buffer
        for (i, (l, r)) in left.iter().zip(right.iter()).take(channel_len).enumerate() {
            output[i * 2] = *l;
            output[i * 2 + 1] = *r;
        }
    },
    |err| log::error!("audio stream error: {err}"),
    None, // no timeout
);

// start the stream
let stream = stream.unwrap();
stream.play().unwrap();

At each iteration of the audio loop, the next time window to process can be computed from the following parameters:

  • the current timestamp of the audio player
  • the tempo of the current beat
  • how much time has passed since the previous iteration
const QUARTER_TIME: i32 = 960; // 1 quarter note = 960 ticks

fn tick_increase(tempo_bpm: i32, elapsed_seconds: f64) -> usize {
    let tempo_bps = tempo_bpm as f64 / 60.0;
    let bump = QUARTER_TIME as f64 * tempo_bps * elapsed_seconds;
    bump as usize
}
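The function above can be exercised directly; a self-contained check of the arithmetic:

```rust
const QUARTER_TIME: i32 = 960; // 1 quarter note = 960 ticks

fn tick_increase(tempo_bpm: i32, elapsed_seconds: f64) -> usize {
    let tempo_bps = tempo_bpm as f64 / 60.0;
    let bump = QUARTER_TIME as f64 * tempo_bps * elapsed_seconds;
    bump as usize
}

fn main() {
    // 120 BPM = 2 beats per second; 0.5 s elapsed = 1 quarter note = 960 ticks.
    println!("{}", tick_increase(120, 0.5)); // 960
    // A 0.1 s audio callback at 120 BPM advances 192 ticks.
    println!("{}", tick_increase(120, 0.1)); // 192
}
```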

The resulting tick increase allows one to use a binary search to query the MIDI event array to find the next events to play.

let tick_increase = tick_increase(tempo, elapsed_seconds);
let next_tick = self.current_tick + tick_increase;
// assume we already have a cursor into the events (i.e. the index of the last played event)
let start_index = self.current_cursor;
let end_index = match self.sorted_events[start_index..]
    .binary_search_by_key(&next_tick, |event| event.tick)
{
    Ok(next_position) => start_index + next_position,
    Err(next_position) => {
        if next_position == 0 {
            // no matching elements
            return Some(&[]);
        }
        // return the slice up to the last event
        start_index + next_position - 1
    }
};
// return the slice of events to play
return Some(&self.sorted_events[start_index..=end_index])

Now that we have a working audio loop, we can focus on integrating the audio player and UI.

Having perfect integration is essential for a user-friendly experience:

  • When you press the Play button, the tablature cursor should start moving and the notes should light up as they are played.
  • When you click on a measure, the player should jump to the corresponding position in the tablature and the right notes should be heard.
  • When you click on a different track, the entire tablature should update to reflect the new track, and the audio should update accordingly.
  • When you press the Solo button, all other tracks should be muted.
  • When you press the Stop button, the tablature cursor should move back to the beginning and playback should stop.

I think you get the idea.

The most important bridge between the audio player and the UI is implemented using the iced::Subscription mechanism.

Subscription is a way to listen to external events and publish them as messages to your application.

For example, here's how the app responds to pressing the space bar to toggle playback:

let keyboard_subscription = keyboard::on_key_press(|key, _modifiers| match key.as_ref() {
    keyboard::Key::Named(Space) => Some(Message::PlayPause),
    _ => None,
});

The update function doesn't care what triggered the message: a key press or the Play button.

Using a similar mechanism, the audio player can send messages to the application so that it updates the UI based on the current playback position.

The application holds the receiving end of a tokio::sync::watch channel containing the current timestamp published by the audio thread.

fn audio_player_beat_subscription(&self) -> impl Stream<Item = Message> {
    let beat_receiver = self.beat_receiver.clone();
    stream::channel(1, move |mut output| async move {
        let mut receiver = beat_receiver.lock().await;
        loop {
            // get the tick from the audio player
            let tick = *receiver.borrow_and_update();
            // publish it to the UI
            output
                .send(Message::FocusTick(tick))
                .await
                .expect("send failed");
            // wait for the next beat
            receiver.changed().await.expect("receiver failed");
        }
    })
}
...
// set up the subscription
Subscription::run_with_id("audio-player-beat", self.audio_player_beat_subscription());

The tablature handles the FocusTick message to update the current position within the measure and highlight the notes.

To keep up the illusion that everything is synchronized with the user's actions, the various parts needed some careful tuning.

The current version of Ruxguitar is essentially an MVP to get the project off the ground.

It is still very far from matching the functionality and convenience of TuxGuitar.

Here are some ideas for the future:

  • support for more file formats (currently only .gp5 is supported)
  • display more information about the tablature (e.g. rhythm, time signature, key signatures, etc.)
  • support for measure repeats
  • support for slowing down and speeding up playback

I have been working on Ruxguitar for the past year and am very pleased with the result.

During development I not only learned a lot of new things, but also built a complex piece of software that even seems to work.

Working alone on such a large project requires a lot of discipline; there were many times I was ready to give up after hitting strange bugs or spending weeks struggling with a single feature.

I couldn't have created Ruxguitar without TuxGuitar as a reference implementation, and I am very grateful for the work its development team has put in over the years.

I've spent so much time on the project that it's time to start playing guitar again instead of writing software for it!
