Community project: multifunctional protocol router for open hardware

  • Everyone who has been following the discussion on “open source perceptions” might have seen that there are several ideas about developing an open source protocol router that can be used in conjunction with projects made by the rapidly evolving Open Source Hardware community.

    I started a wiki section where we can keep track of all information about this project:

    I'm pretty new to software development, but I understand some of the concepts involved and I'm keen to learn more. First of all, we should probably see if we can define a list of key features and functionalities - please add your thoughts and ideas to the list in the wiki so that we get a good overview of what needs to be done.

    What will be next? To quote jul from the previous topic:

    "Starting from there, we can start to find common points and factorize protocols. We can also start to define mapping techniques from serial to OSC. Following step is to write the associated libs in various languages. Ideally we should be able to map from anything to anything: e.g. midi note to keyboard strokes, or serial to midi, or OSC to key stroke. Maybe this is not needed initially. I would propose to focus on serial to OSC and see if we can extend the system later on, if the project goes well."

    Enough talk for now, let's get this project going!

  • rooker and i had a very long dialogue about this about a year ago.

    here are his notes:

  • /me hops on board.
    registered my basic skills on the wiki page

  • also, thought i'd throw JSON into the mix.

  • more coming as well.

  • Very interesting.. looks like I will be reading a lot today.. Is rooker still working on his thesis project?

  • not sure, i just sent him mail.

  • Similar idea:

    Just thought of another name for the project: CrossTalk (added it to the wiki)

  • I am still working on that project, although I must admit that there hasn't been any real programming progress so far. :(

    It's still my diploma thesis and I'm still eager to implement it - I've already done some more detailed studies about "element" mapping possibilities. You wouldn't believe how differently a simple button can be made to behave.

    I'll try to copy my paper-work into something digital/transferable so you can take a look at it.

  • Wow.... I've finished reading up on Skore's thread (#2025) - Great to see that much energy in there! Here are my 2 cents:

    == Proper separation ==
    There are too many good audio applications out there - it's the proper hardware interfaces / controls that are lacking.
    We shouldn't think about re-writing Max, but rather focus on a generic layer that can contain as much cross-application-shareable functionality as possible.

    If the abstraction is properly done you will have the following features:

    - Rapid prototyping, almost completely application independent - and thus reusable and shareable (!)

    - Usable by non-programmers (simply by re-combining UI elements - it's like playing with LEGO)

    - New devices can quickly be made available because they only have to be broken down into their elements *once*.

    - More complex functionality can be added by creating a new widget on demand. My experience with PD showed me that if you don't have the right object to do the job, you'll end up with a hell of a patch. I want to avoid that.

    == About the language choice ==

    I had a hard time choosing a language for serial-pyio when I started it, and the main reason for Python was that I was able to achieve incredible results in almost no time.

  • I'd like to comment on the contents of the "oscrouter" wiki page - but that would turn into a mess - so I guess discussions should happen here?

    The important thing I wanted to comment on there is the device settings, which are currently focused on serial/monome devices (thus mentioning a baud rate).

    Settings should be device-specific, thus the routing-framework would simply have the right class/object to handle a certain device - so individual settings are not framework related actually.

    Anyone interested in this thread/topic might want to take a look at serial-pyio's proxy engine, which is something of a monome-specific prototype based on the very idea we're discussing here (many thanks again to Julien for implementing it).

  • @rooker: great to hear you're still interested in pursuing this endeavor. Do you have a deadline set for your diploma thesis? Programming the whole thing is the easy part.. Defining functionality and creating the project outline will probably take the most time and effort...

    Feel free to move/add/delete any of the information in the wiki page, it's just a list/collection of things that have been mentioned in the discussion topics.

    I moved the wiki entry from the app category to a new frameworks category as tehn suggested a while ago in the previous topic (I should learn to read ;-)

    Point your browser to:

    I'm working on a flow diagram of how things could work, will post it when it's done.. A couple of the things you mention above can be directly translated to the hardware of the Prototyping Module project I'm working on, e.g. building blocks, lego and rapid prototyping.. -> Have you seen it yet?

  • (Un)fortunately there's no deadline for my diploma thesis - that's also the reason why company-work always got higher priority, since that usually *has* deadlines. :)

    Since I've had my mind on this subject for almost a year now, I propose that I send you my draft first, because I have to make one anyway - that way you (and others) can modify/add stuff instead of drawing a new graph.

    That prototype looks incredible - impressive!

    btw: Thanks for the link to junxion - I'm currently studying its manual, but I already think that my design has some valuable advantages over theirs:
    - *any* output protocol (not only MIDI)
    - thus, any data type (not only numeric)
    - combining elements (junxion only maps 1:1)

    There are lots of things about it in my head that haven't made it onto paper/harddisk yet.

  • I needed to learn more about this subject, so I started thinking about the functionality and features of the application.. but I'm looking forward to receiving your notes on this subject. We probably all have different approaches and ideas about it, so we might learn some things from each other.

    I've had a better look at Python today; it seems like a very nice platform to work with (being open source, compatible with lots of other languages and multi-platform), plus maybe we can build upon the work that jul did with serial-pyio...

    Attached is a pdf I made to get a better understanding of how this app could work.. it's very much work in progress, but let me know if it makes any sense at all ;-)

    I added a link to GlovePIE to the wiki page, you probably already know it, it's an app (pc only) that can be used to map I/O between devices and apps.

  • it seems we're all on a very similar track. keep me in the loop with any discussion. i'd be happy to throw in ideas. i think it would actually be constructive for us to mark up the wiki with questions and suggestions (as long as the formatting can keep ideas and comments separated?)

  • I think we'll end up dividing the wiki section into multiple pages, but until that's necessary we can use code blocks to separate comments

    Have a look at the bottom of the wiki page where I added an example.. Will that work?

  • Really interesting project, indeed.
    I added myself to the team, if I can be of any help.

    I think the best way to go for this project is to use C++ for its speed and efficiency. Today we could use something like Processing for making a serial router, but we don't know which projects will use this software, so we should pick the most powerful language. Correct me if I'm wrong - Java is great, but it's still not natively implemented on every OS. However, I added Processing to the languages list for prototyping purposes.

    A good example is Firmata. It's a protocol allowing communication between microcontrollers and a whole bunch of software (Flash, Max, PD, ...). There are libraries for both MCUs and the host computer, so you can exchange data easily.



  • @melka:
    You probably would be right regarding speed and C++ if we were doing some audio signal processing, but we're only going to route control signals.
    I'm also not comfortable with Java, but in Python you have the best of both worlds:

    - Comforts of an interpreted language
    - Easy to extend using C in case speed is a problem in a certain corner
    - Mostly instantly platform independent

    This framework is meant to be so accessible to use and extend for everyone that I would prefer a language allowing rapid "hop on - hop off" development, since we're going to see mostly modular code.

    For compiled languages, the hassle of getting a development environment up and running can be a no-go for a lot of people:
    Sometimes they lack the know-how to do so, and sometimes they just don't want to spend that much time getting into "yet another whatever..." just to adjust a tiny bit of code.

  • About "Firmata":

    Sounds like a protocol that will be used to speak to homegrown devices (e.g. Arduino based). I see it's currently not available for Python, but from what I've seen so far it's rather trivial to wrap a C library in Python.

  • I would also favor Python, for the very same reasons Rooker mentioned. It's always possible to write C modules for time-critical parts, but I doubt it will be needed.

    Anyway, I think we should avoid language discussion for now, and focus on the specification of the mapping. The specification must be language independent anyhow...


  • You're absolutely right Julien!
    I'll copy my paper draft of the framework design to e.g. OpenDraw so we can immediately get going on the nasty details. :)

  • Great! I'd love to see what you have made so far.. Once everybody's thoughts and ideas are online we can start discussing, combining and stripping the content to build a solid plan for the framework..

  • Here's a flow-chart draft (v0.1) of my idea:
    (the ODG source file is in the same folder - in case someone wants to improve the chart)

    A lot of details aren't in there, because they would make the graph almost unreadable. The design includes value pre/post-processing features like "numeric value scaling", "value ramping", "jitter-removal", etc...
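    As a rough sketch of what "value ramping" and "jitter-removal" might mean in code (everything here - names, thresholds, step sizes - is invented for illustration, not taken from the chart):

```python
def remove_jitter(new_value, last_value, threshold=2):
    # drop changes smaller than the threshold (e.g. a noisy potentiometer)
    return new_value if abs(new_value - last_value) >= threshold else last_value

def ramp(current, target, step=4):
    # move toward the target in fixed steps instead of jumping straight there
    if current < target:
        return min(current + step, target)
    return max(current - step, target)

remove_jitter(101, 100)  # small wiggle is ignored -> 100
ramp(0, 10)              # one ramp step -> 4
```

    Both would sit in the pre/post-processing stage, between a device element and whatever it is mapped to.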

  • Nice stuff.. I'll have to study it more closely tomorrow, but like tehn said, it seems like we're all on a very similar track!

  • this is very exciting! i wish i could work on it full time.

  • One important thing I want to mention here:

    I've done quite a few studies on the adoption of interfaces and the disturbance of "flow" with regard to electronic music instruments.

    Here's what my assumptions boil down to:
    - keep it simple
    - provide presets/examples

    Why I consider this important is that "keeping it simple" often means "losing functionality" - and coders (myself included) usually don't like that.

    But I guess it's more important to have less features - so people can focus on what's "there" instead of getting lost in "what would be possible".

    Imagine yourself in front of a modern Synthesizer keyboard without presets... That would frustrate and scare people right from the start - preventing its adoption.

    IMHO, progress in the field of user interfaces could be accelerated by making rapid prototyping available to non-coders, along with the ability to share their experiences.

    More sophisticated demands will emerge from these experiences - and if the framework is implemented properly, these things can *then* be added later on (by coders).

  • I think you're absolutely right.. how many hours haven't we spent trying to set up/install/configure devices like VCRs, microwaves and of course all kinds of audiovisual equipment.. There are many guidelines and usability tests available for digital products such as software and websites, but somehow manufacturers seem to get lost easily when it comes to device settings.. Trying to let several devices talk to each other can also be a hassle, like the BarBQ example in the O'Reilly article.. Sure, a minimalistic 2-button interface looks nice, but is it usable? :-)

    I think this project would certainly contribute to a better flow, but it would also open up lots of possibilities! Like you said, it also makes rapid prototyping available to non-coders (like myself). The fact that you can "patch" and connect pretty much any input to any output makes it really interesting!

    Translating sophisticated technical settings into well-ordered, hierarchical menus that everyone understands can be a big challenge.. This is probably not commercially interesting for big companies that are required to maximize profits for their shareholders.. All that matters is features, more features, hidden away in cryptic and complex menus...

    Hopefully communities like this can bring change.. after all we are the users/customers and we have to tell manufacturers how we like our instruments.. The open hardware scene is also evolving rapidly... partly because of the lack of flow.. people start building their own intuitive instruments and controllers...

    Now is probably a good time to start selling your roland/yamaha/korg shares.. ;-)

    Edit: I just realized that we should probably incorporate the word FLOW in the project name.. it's the keyword that says it all...

  • Unfortunately "uber-flow" is already taken (GPL particle engine) ;)

  • Yesterday, I've written down some plans for general/global features:
    (Post from Sep 09 - 2:27pm)

    Overview of suggested features:
    - keep-alive & auto-fire
    - ramping
    - transformation: scaling, offset and auto-calibration
    - threshold
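    To make the feature list above a bit more concrete, here's a hedged sketch of how "auto-calibration" and "threshold" could behave (class and function names are invented; none of this is agreed design):

```python
class AutoCalibrate:
    """Track the observed min/max of a raw sensor and rescale it to 0.0-1.0."""
    def __init__(self):
        self.lo = None
        self.hi = None

    def feed(self, raw):
        self.lo = raw if self.lo is None else min(self.lo, raw)
        self.hi = raw if self.hi is None else max(self.hi, raw)
        if self.hi == self.lo:
            return 0.0                      # not enough range observed yet
        return (raw - self.lo) / float(self.hi - self.lo)

def threshold(value, limit):
    # simple gate: only pass values at or above the limit
    return value if value >= limit else None

cal = AutoCalibrate()
cal.feed(10)   # first reading defines the range -> 0.0
cal.feed(20)   # new maximum -> 1.0
cal.feed(15)   # halfway into the observed range -> 0.5
```

    The nice thing about auto-calibration done this way is that the user never has to enter a sensor's raw range by hand - the framework learns it from use.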

  • hey, wanted to direct people to our abandoned first attempt at a similar project:

  • I mentioned mapd to my tutor as an example of how a GUI editor could be used to assign elements to widgets, when I first told him about the idea. :)

    I've even thought about improving it to be usable for different pieces of hardware, but I think the monome-specific design is too integrated into the source.

    How come it was abandoned?

  • I'm watching this thread and would be interested in helping with the coding when it gets to that point. Comp engineering major here.

  • I tried mapd a while ago; it's certainly in line with what we're trying to accomplish here, but as rooker said it's pretty monome-specific, and it's Mac only. However, there are probably some ideas that we can borrow from the app ;-)

    @dseaver: Great! Add yourself to the list on the wiki page..

  • @dseaver (and of course, others who are interested)
    I'd like to discuss some tricky design issues with experienced coders.

  • my diploma hand-in date is next tuesday. then i have to hold my defense the same week, but afterwards i'd be glad to join the discussion. what do you think about gathering all that information in a mindmap? MindMeister offers a great mindmapping service for that purpose.

    but.. well i really have to concentrate on polishing my thesis conclusion ;)

  • Using a mind-map is actually a good idea.

    About MindMeister: I'm not really a fan of browser-based apps - especially when I'm not certain about my data being locked up in there, sold, read, etc...

    Does anyone know about their privacy policies, as well as export possibilities?

    Since MindMeister is able to import "Freemind" maps, I'd prefer to go with Freemind for the beginning.

    Certainly, the online collaboration aspect is appealing and probably useful.
    (I've only used kdissert so far, and I've never used mind-mapping collaboratively)

  • Hi everyone, I just added myself to the list on the wiki. I'm a C# developer professionally but have done loads of other stuff. I made a serial communication API for the monome in C#.NET when I first got it just for the fun of it so I may be able to help with some of that. I haven't played with OSC too much but very eager to learn. I'm willing to have a crack at anything that's needed.

    Happy days

  • dseaver & crunchy_alligator:
    Welcome aboard! Happy to see so much interest.

    Don't worry about OSC - it's rather trivial. It mainly boils down to a "name, value" pair with some sort of "address" (e.g. "/40h/led_row") wrapped up in network packets.
    And thanks to a wide selection of libraries, we'll just have to deal with the content, not the protocol. Using callbacks to attach functionality is probably the hard part - but that's true regardless of the protocol.
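    To show how little there is to it, here's a hand-encoded OSC message in plain Python - standard library only; in practice one of the existing OSC libraries would do this for us:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated, padded out to a multiple of 4 bytes
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: int) -> bytes:
    # address pattern + type tag string (",i" = one int32) + big-endian value
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

packet = osc_message("/40h/led_row", 255)   # 24 bytes, ready to send
```

    Sending it is then just a matter of `socket.sendto(packet, (host, port))` over UDP.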

  • @strigi:
    sorry, I didn't see your attached pdf earlier. I just looked into it.

    I've also thought about a protocol mapping mechanism, but I think it's:
    - probably too complicated, and thus only usable by technically skilled people.
    - unable to map certain use cases which would require some functional logic (thus coding).

    This might sound like I'm trying to push my approach, but I've been thinking about this subject for almost 1 year now, and it turned out that the widget approach is probably able to cover most use cases while still staying "non-geek usable".

    Other opinions?

    My goal is to make the protocols and different hardware devices transparent to the user.

    Daaaaamn! I just had an idea!! I gotta write that down right away....

  • I've been reeeeaaaally busy recently (yes, 4 e's and 4 a's worth of busy), but it's great to get back and see such a large amount of discussion.

    I must admit that I was originally thinking of something along the lines of xndr's protocol mapper, but rooker's widget-based system is pretty intriguing.

    I guess the thing that I'm missing about the widget system is how to connect, say, a slider widget to a hardware slider, without having to standardize what comes out of a hardware slider. Would there need to be some kind of 'protocol mapper' to interface the hardware to the widget? How does the user specify the input to the widget?

  • in agreement with rooker, i don't think that configuring a new device serial map needs to be trivial-- if someone has written their own firmware for a device, they should be able to hack up a python include file (or something to that effect) given there is solid documentation. i'm not sure it's necessary to abstract the serial communication (bit shifting) to the degree that it'd have an xml parsed config. for one thing, the efficiency of the actual operation of the app would suffer.

    on the other hand, a "helper" app to generate a "device config" include file would be fine.
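    A "device config" include file in the spirit of tehn's suggestion might look something like this (entirely hypothetical - the dictionary keys and the 2-byte packet layout are invented for illustration):

```python
# hypothetical device description a firmware author could hack up themselves

DEVICE = {
    "name": "my_homebrew_box",
    "baudrate": 9600,
    "elements": {
        "knob_1": {"type": "fader", "min": 0, "max": 255},
        "button_1": {"type": "button"},
    },
}

def decode(packet):
    """Turn a raw 2-byte serial packet into an (element_name, value) pair."""
    kind, payload = packet[0], packet[1]
    if kind == 0x01:
        return ("knob_1", payload)
    return ("button_1", payload)

decode(bytes([0x01, 128]))  # -> ("knob_1", 128)
```

    The framework would only ever call `decode()`; all the bit-shifting stays in the file the device's author wrote, and a "helper" app could generate this skeleton.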

  • I'm about to write examples for each widget I've currently thought of - but for the impatient, here's the answer to the slider example:

    Device_A has a hardware slider.
    This is made available as a "fader" element, since a slider isn't anything other than something that outputs values between its min and max.
    Whether these values are discrete or continuous doesn't matter.

    That's the great thing about breaking devices into elements (fader, button, led)

    In your case (input=fader, output=fader), you just wrap the fader's current value into the right protocol.

    e.g. OSC: integer "/box1/slider1" "50"

    Features like value transformation apply here (search for "value transformation").

    Of course, value transformation should be able to handle floats, too - example:
    input: 0-1024
    output: 0-1
    type: float
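    That transformation is just a linear rescale; as a quick sketch in Python (the function name and defaults are made up):

```python
def transform(value, in_min=0, in_max=1024, out_min=0.0, out_max=1.0):
    # linear value transformation: map [in_min, in_max] onto [out_min, out_max]
    ratio = (value - in_min) / float(in_max - in_min)
    return out_min + ratio * (out_max - out_min)

transform(512)                           # mid-travel -> 0.5
transform(1024)                          # full travel -> 1.0
transform(512, out_min=0, out_max=127)   # the same fader as a MIDI-range value
```

    Integer outputs would just round the result; the float case above needs no extra handling.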

  • @tehn:
    A new device requires coding, of course. But that's done where it belongs: in the device layer (i.e. in Python). All that has to be done for a new device is to break it down into its elements *once*.

    Then it's accessible for non-coders and can be mapped to widgets.
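    A minimal sketch of that device-layer idea, with hypothetical class names, just to show where the device-specific code lives:

```python
class Fader:
    """A generic element - widgets talk to this, never to the device directly."""
    def __init__(self, name, minimum=0, maximum=1024):
        self.name, self.min, self.max = name, minimum, maximum
        self.value = minimum
        self.listeners = []              # callbacks fired on every change

    def update(self, raw):
        self.value = max(self.min, min(self.max, raw))
        for callback in self.listeners:
            callback(self.name, self.value)

class DeviceA:
    """Device-specific driver: written *once* by someone who knows the hardware."""
    def __init__(self):
        self.elements = {"fader_1": Fader("fader_1")}

    def handle_serial(self, raw):
        # parsing the wire format happens here, hidden from non-coders
        self.elements["fader_1"].update(raw)
```

    Once the driver exists, "fader_1" is simply an element that any widget can subscribe to via its listeners - no further coding required.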

  • good. we're in agreement then.

  • I'm sure I'm not thinking about this the right way, and that an aha moment will come sooner or later, but I guess I still don't quite get it.

    For example, Device_A's hardware slider. The values from this slider are making their way into the computer somehow (be it MIDI, OSC, some bits in a serial packet, or whatever else we can dream up). In the example you gave, input = fader, output = fader, I still don't get how the input side works exactly. Let's think of, say, Device_B, which connects to the computer via a serial line and has two faders, Fader_1 and Fader_2 (sorry, zero-based counting fans). Conceptually, I guess these appear as Device_B.Fader_1 and Device_B.Fader_2, and you can map either of them to some output message. However, how do you specify the input side of things - e.g. who is handling the incoming serial bytes from Device_B and associating them with the appropriate values for Fader_1 and Fader_2?

    I hope I'm making some sense ...

  • Oh, hang on, what I've been asking about is in the device layer (there's my 'aha' moment).

  • Good stuff.. Rooker, I thought about your attached diagram for a while.. I would like to propose a protocol layer between your driver and element layers... A protocol widget could represent a single device or a protocol such as MIDI/OSC/DMX/etc. This should be configured once (adding the list of available commands/calls/feedback/etc.). The options specified in this "widget" would then become available inside other widgets.

    The way I see it, a device has its own (OS-specific) driver that will output any of the following protocols: OSC/MIDI/DMX/serial (monome/Arduino)/UDP/TCP/etc. I think there should be some kind of configuration file containing all available commands (input/output), which would then become available in your element layer..

    Perhaps we should add some pages where we can add examples of how we think the app should work in terms of configuration (code/non code?), coding, configuring and adding widgets. There are probably a dozen ways to implement this, so it probably wouldn't hurt to get a bit more technical about this so we can learn how other people would solve it..

  • @ucacjbs:
    Sorry, I didn't see that you're thinking within a different layer. Sounds like it makes sense to you now. Any more questions?

    I must read your post a few more times, because I think you're going in the direction of the "aha" moment I just had today (see my previous post containing the word "Daaaaamn!"), but I'm not 100% sure yet.

    Until I've understood your suggestion completely, here's a rough sketch of my "geistesblitz":
    When I mentioned that I want the protocol to be transparent to the user connecting elements to applications, I realized that it isn't transparent in my current design. So what if you could break down applications into elements, too?
    Physical devices as well as apps have inputs *and* outputs. It's bi-directional anyway. I'll try to find a better description for my thought...

  • @xndr:
    I can't see why you're suggesting Protocol *between* driver and elements: "Driver >> Protocol >> Elements".

    Please tell me what you mean by "config file [...] containing all available commands".

    If I understand you correctly, you mean something which is already happening *within* the element layer, where each device is broken down into its elements - This is where talking to the device is happening.

    (but I must update my flow chart anyway)

  • @rooker: I think it's partly solved with your Aha moment..

    I think devices and applications can be accessed through a single "protocol" layer.. I can't quite make out what is happening inside your element layer, but since every device and application can be accessed through a particular protocol, I thought it would make sense to add that to your flowchart..

    In the application a user would first add a protocol widget, where they define all possible input and output commands.. these device specific commands will then become available inside all other widgets.

    Also, from your chart it isn't clear whether it's possible to communicate between protocols..

    Widgets could be divided into input (listen/receive/read) and output (talk/send/broadcast) types. These two widget types could be connected/patched in the app window.. It should also be possible to connect multiple input widgets to one output widget, or vice-versa.

    app1.sequencer > device1.led1 (indicator)
    app1.sequencer > app2.drums.hihat (sound)

    device1.button1 > app1.octave1.key1 (musician1's monome to pc1)
    device2.button8 > app1.octave1.key1 (musician2's keyboard to pc1)

    How would one map a range of buttons - say, the top 3 rows of 40h buttons to the first 2 octaves of a keyboard or synth?
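    One way that button-range question might look in code (a hypothetical sketch - the 8-column grid and MIDI base note 36 are assumptions, not agreed design):

```python
def grid_to_note(row, col, base_note=36, columns=8):
    # top 3 rows of an 8-wide grid = 24 buttons = exactly 2 octaves of notes
    if row > 2 or col >= columns:
        return None                  # buttons outside the mapped range
    return base_note + row * columns + col

grid_to_note(0, 0)   # first button -> note 36
grid_to_note(2, 7)   # last mapped button -> note 59 (36 + 23)
grid_to_note(3, 0)   # row 4 isn't mapped -> None
```

    In the framework being discussed, a "range" widget like this would map a whole group of button elements to note outputs in one go, instead of 24 individual patches.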

  • 1) I'll add the device protocol communication to the chart

    2) Do you really think it's necessary to make device specific commands available inside the widgets?
    Communication with devices/apps should be handled through the element abstraction layer. You never talk to a device - you're simply "talking" to an element.
    Each element is defined as either input or output.

    3) Every widget practically has inputs *and* outputs.
    Unfortunately, I think we're talking about the same thing but we're mixing up what we mean by inputs and outputs.

    *sigh* Text is so difficult... :)
    I'll see if I can improve my descriptions, by using your above example (next post)