Overview of work this week

Everything I have done this week has been focused on adding support for the GPX format, the format used by Guitar Pro 6. I wrote a little last time about the “score.gpif” file, and much of this week was spent determining precisely how the structure of the file is represented and how we can extract the information we need into MuseScore. I’ll discuss this below.

The top-level node of score.gpif is <GPIF>, which has several children, namely <GPRevision>, <Score>, <MasterTrack>, <Tracks>, <MasterBars>, <Bars>, <Voices>, <Beats>, <Notes> and <Rhythms>.

  • <GPRevision> simply contains a number which represents version information.
  • <Score> contains meta-information about the score that has been created, such as Title, Artist, and Copyright.
  • <MasterTrack> contains information that applies to the parts as a whole. This includes the unique identifiers assigned to parts, the initial tempo, and other miscellaneous information.
  • <Tracks> describes each track, keyed by one of the unique identifiers listed in <MasterTrack>. This includes information such as the instrument reference string that Guitar Pro uses to initialise the correct instrument for playback.
  • <MasterBars> lists information about the bars, in order. Each child of <MasterBars> contains three nodes – <Key>, <Time>, and <Bars>. The <Key> node contains information such as a count of the number of accidentals (I suspect this can be used to detect changes in key signature). <Time> gives the time signature for the bars listed in <Bars>, and can be used to determine changes in time signature. <Bars> gives a list of bar identifiers, each of which applies to a different part, in the order the parts were defined in the <Tracks> node. This means that, across all the children of <MasterBars>, the number of identifiers specified in <Bars> should be the number of bars in the music multiplied by the number of parts.
  • <Bars> contains descriptions of individual bars, each of which has a unique identifier. Descriptions include the clef for that bar and a list of voice identifiers.
  • <Voices> contains, for each unique voice identifier, a list of the unique <Beat> identifiers for that voice. The beats listed are all of the beats that apply to this voice within the bar in which the voice identifier was specified.
  • <Beats> lists each individual <Beat>. Each has a unique identifier and details such as the dynamic of that beat (as a string: P for piano, MF for mezzo forte, etc.), a <Rhythm> identifier, a <Note> identifier, and some other miscellaneous properties for that beat. Note that the <Note> identifier may not be present; if it is missing, a rest should take place for the duration described in <Rhythm>.
  • <Rhythms> lists individual <Rhythm> values, each with its own unique identifier. Each contains a <NoteValue> node giving the duration of this <Rhythm> as a string – examples include “Quarter” and “Eighth”. In the test case I have, anything subdivided further than “Eighth” is represented as e.g. “16th” or “32nd”. In addition, the tag <AugmentationDot> indicates whether this rhythm is dotted; if a single dot is present, the duration of the note should be multiplied by 1.5. A count of the dots is also present here, so notes can be “double-dotted”.
  • <Notes> gives a list of individual <Note> values, each again with a unique identifier. Descriptions are given here of whether a note is tied, whether a bend applies to it, whether it is muted, etc. String and fret numbers are specified here, among various other note-specific properties. The sketch after this list shows how all of these identifiers chain together.

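To make the chain of identifiers above a little more concrete, here is a rough standalone sketch in Python (using xml.etree, not the actual MuseScore C++ importer code) of how the links could be followed from parts down to individual notes and rests. Anything the description above doesn’t pin down is an assumption on my part, purely for illustration: the exact child tag names, the “id” attribute on <Bar>, <Voice>, <Beat>, <Rhythm> and <Note>, the “ref” attribute linking a beat to its rhythm, the beat’s note references sitting in a <Notes> child, and identifier lists being space-separated text.

    # Rough sketch only: follow the id links in score.gpif from parts to notes.
    import sys
    import xml.etree.ElementTree as ET
    from fractions import Fraction

    # Assumed mapping of <NoteValue> strings to base durations (fractions of a whole note).
    NOTE_VALUES = {
        "Whole": Fraction(1, 1), "Half": Fraction(1, 2), "Quarter": Fraction(1, 4),
        "Eighth": Fraction(1, 8), "16th": Fraction(1, 16), "32nd": Fraction(1, 32),
    }

    def by_id(container, tag):
        # Index the children of a container node (e.g. <Bars>) by their "id" attribute.
        return {child.get("id"): child for child in container.findall(tag)}

    def id_list(node):
        # Split an identifier list, assumed to be space-separated text such as "0 1 2".
        return (node.text or "").split()

    def rhythm_duration(rhythm):
        # Base duration; each augmentation dot adds half of the previous increment,
        # so one dot multiplies by 1.5, two dots by 1.75, i.e. base * (2 - 2**-dots).
        base = NOTE_VALUES[rhythm.findtext("NoteValue")]
        dot = rhythm.find("AugmentationDot")
        dots = int(dot.get("count", "1")) if dot is not None else 0
        return base * (2 - Fraction(1, 2 ** dots))

    gpif = ET.parse(sys.argv[1]).getroot()           # the top-level <GPIF> node
    tracks  = gpif.find("Tracks").findall("Track")   # one per part, in definition order
    bars    = by_id(gpif.find("Bars"),    "Bar")
    voices  = by_id(gpif.find("Voices"),  "Voice")
    beats   = by_id(gpif.find("Beats"),   "Beat")
    notes   = by_id(gpif.find("Notes"),   "Note")
    rhythms = by_id(gpif.find("Rhythms"), "Rhythm")

    for measure_no, master_bar in enumerate(gpif.find("MasterBars")):
        # Each child of <MasterBars> lists one bar id per part, in the order the
        # parts were defined, so the id count here should match the number of parts.
        bar_ids = id_list(master_bar.find("Bars"))
        assert len(bar_ids) == len(tracks)
        for part_no, bar_id in enumerate(bar_ids):
            for voice_id in id_list(bars[bar_id].find("Voices")):
                voice = voices.get(voice_id)
                if voice is None:                    # unused voice slot
                    continue
                for beat_id in id_list(voice.find("Beats")):
                    beat = beats[beat_id]
                    rhythm_id = beat.find("Rhythm").get("ref")   # assumed "ref" attribute
                    duration = rhythm_duration(rhythms[rhythm_id])
                    note_refs = beat.find("Notes")
                    if note_refs is None:            # no <Note> reference means a rest
                        print(measure_no, part_no, "rest", duration)
                    else:
                        for note_id in id_list(note_refs):
                            note = notes[note_id]    # string/fret/tie details live here
                            print(measure_no, part_no, "note", note.get("id"), duration)

The duration helper encodes the dotting rule described above: one augmentation dot multiplies the base value by 1.5, and each additional dot adds half of the previous increment.
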
I am not considering absolutely all of the information specified in the file at once. My current goal with respect to supporting this format is to get something stable and working (such as displaying all the notes in the parts with no additional information, e.g. articulations), then open a pull request and build on top of that. At the start of the week my goal was to create the correct number of parts, with some basic meta-information, as they were described in the file, and this is now finished. I wrote a little last week about the problems I was encountering with that, where clicking on any given measure would result in a segmentation fault. My hypothesis was that this happened because there was no information in the measures, and this has indeed turned out to be the case. I verified it by first creating rests in every measure of every part, and then started playing around with the parts by adding notes etc., and things seem stable now.

After the issue with measures had been sorted out, I started to look at the “score.gpif” file and determine how all of that information fits together. The connection of bars to parts was a little awkward to make; I had expected something a little more rigorous than simply depending on the order in which the parts were defined. Once this was determined, I implemented functions to complete the parsing of the “score.gpif” file such that:

  • The link from parts to individual bars, then to voices, beats, and finally to rhythms and notes is followed, so my local copy of the implementation holds everything needed to add notes to parts;
  • Warnings are generated for any XML tags that I simply skip over for the moment, such as information about bends, since now is not the right time to consider that level of detail (a small sketch of this pattern follows this list);
  • Parsed information is available in the right place in the implementation. This is still a source of work, as I currently gather up information about individual notes and rhythms before the point at which it can actually be used. I do not wish to parse the file a second time, nor hold the necessary information in an intermediate data structure, as I believe that if things are structured correctly then neither will be required. This will be my starting point for next week.
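
As a concrete (and hypothetical) illustration of the second point, the warning pattern is essentially just checking each child tag against the set of tags currently handled; the tag names in this snippet are placeholders, not the real importer’s list.

    import xml.etree.ElementTree as ET

    # Placeholder set of <Note> children the importer currently understands.
    HANDLED_NOTE_TAGS = {"Properties", "Tie"}

    def parse_note(note_elem: ET.Element) -> None:
        for child in note_elem:
            if child.tag in HANDLED_NOTE_TAGS:
                pass  # handle the tags we do understand
            else:
                print("warning: skipping <%s> in <Note id=%s>"
                      % (child.tag, note_elem.get("id")))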

In addition to this, I have spent some time familiarizing myself with how, in the implementation, to add notes to specified measures and give them information such as duration. Given that this is done, and that information right down to the rhythm level has already been parsed as described above, once the structural issues are fixed and the necessary information is all in the right place with reasonable efficiency, there should not be much work remaining before everything can be looped over so that all notes are added to all parts, which is my current goal.

I haven’t spent any time this week on the other Guitar Pro formats, such as fixing the issue with the test file for bends described in my last entry. I might do that this week, depending on how progress goes on the GPX format. It’s likely I’ll prioritize GPX support for the time being, as I feel I’m making headway on that task and there is still more to be done.

Key tasks that stalled

The task that took longer than I expected was fully understanding the structure of the score.gpif file. Although most connections were immediately clear, such as the connection between bars, notes, and rhythms, the connection between parts and bars was not. Aside from this, nothing took particularly longer than expected. Work on the other formats, such as fixing up the bends test so that it can be committed into the repository, is something I haven’t looked at this week. It’s possible I’ll get to it this week; if not, then certainly by the week after, once some initial GPX support has been completed.

Upcoming tasks this week

As I’ve mentioned, I am likely going to be focusing on the GPX format. I would very much like all the notes in all of the parts to be displayed by the end of the week; then the most difficult part of the work associated with supporting the GPX format will be done and out of the way. At that point I’ll be able to clean up and document the implementation so that others can access it, and that will allow me to focus on individual features, such as ties, bends, voltas, and other information.
