iPad 2 Dreams

  • Hey there guys, thought it would be awesome to throw some cool ideas for apps around now that the specs for the new iPad 2 are on the interwebs.

    One of the biggest ideas that hit me was to develop a sort of Modul8 for the iPad, where you could whip out an awesome VJ set in full 1080p. I'm curious if the new A5 dual-core processor would be able to handle multiple threads of HD video. I think that would be awesome to see.

    Any other dreams of new software that would work great with the new iPad hardware? Share 'em here.

  • velocity sensitive

  • @KristofferLislegaard,

    Velocity's a tricky beast.

    I've got a few apps on my iPhone that achieve it pretty effectively via mic input or monitoring the accelerometer. It's a convincing illusion, but it's only as stable as the environment you're playing in.

    (A better solution is to use your XY coordinates more aggressively. I've seen a lot of "hitting the top of the pad is louder than hitting the bottom of the pad", which offers full control, but I think "the center of the pad is louder than the edges of the pad" is more intuitive.)
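
    (If you want to play with that mapping, here's a rough JavaScript sketch. The pad geometry and the velocity floor are invented for illustration; wire the result to whatever actually sends your notes.)

    ```javascript
    // Sketch: "center of the pad is louder than the edges."
    // Pad geometry and the 30..127 velocity range are arbitrary choices.
    var PAD = { width: 200, height: 200 };   // a hypothetical pad, in pixels

    function velocityFromTap(touchX, touchY) {
      // Normalized distance from center: 0 in the middle, ~1 at the corners.
      var dx = (touchX - PAD.width / 2)  / (PAD.width / 2);
      var dy = (touchY - PAD.height / 2) / (PAD.height / 2);
      var dist = Math.min(1, Math.sqrt(dx * dx + dy * dy));
      return Math.round(127 - dist * 97);    // 127 at center, 30 at the edge
    }
    ```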

    Regardless... Why? This is a pane of glass you're banging your fingers into. That doesn't seem like a great idea on any level.



    @MCDeltat,

    Think of the front camera as a new sensor, and see what capabilities that affords. Simple example: "the music doesn't have a fixed tempo; its metronome is driven by me bobbing my head." (of course, that probably assumes lighting conditions you won't have at a show either)

  • They were saying there's velocity sensitivity in GarageBand, using the accelerometer.

  • @thealphanerd,

    It's actually not an accelerometer, it's a gyroscope. Similar, but very different realities.


    @greater
    Holy! Thanks for kinda reminding me. I was so fixated on things I could do with the new full-HD out that I forgot about the camera.

    It would be really awesome to make a thingamapoop app for the iPad then, where you could just use any ol' LED or other flashing object to drive emulations of its oscillators.

  • Yes, they do say the drums in GarageBand are velocity sensitive, so the function is there, in one form or another =)

  • @KristofferLislegaard,

    Sure. But it was always there in one form or another, when app developers took the trouble to implement it. (GarageBand runs on the original iPad, remember.) So, that's sort of a leftover iPad 1 dream. =)

  • How sensitive is the touch sensor in an iPad? You could potentially do velocity like the Manta controller does, where louder ~ more finger contact... use the area of the finger press to control velocity?

  • It's actually not good for that -- the drivers deliberately blob thick touches together. But for continuous pressure, a simple "press to trigger, then move to modulate" model is intuitive enough.
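
    A loose sketch of that model in JavaScript, with the touch events and both send functions as made-up stand-ins:

    ```javascript
    // Sketch: "press to trigger, then move to modulate."
    // The event hookup and both send functions are stand-ins.
    var origin = null;

    function onTouchStart(x, y) {
      origin = { x: x, y: y };
      sendNoteOn(60, 100);                 // trigger on contact, fixed velocity
    }

    function onTouchMove(x, y) {
      if (!origin) return;
      // Vertical travel since the press becomes a 0-127 modulation amount.
      var amount = Math.max(0, Math.min(127, Math.round((origin.y - y) / 2)));
      sendCC(1, amount);
    }

    function onTouchEnd() {
      origin = null;
    }

    function sendNoteOn(note, velocity) { /* hand off to your MIDI/OSC layer */ }
    function sendCC(controller, value)  { /* likewise */ }
    ```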

  • Maybe this is a good place to ask this question, though it is slightly off topic.

    Does the monome actually send OSC messages? Or are they serial bits that are then converted to OSC by serialosc or monomeserial?

    The reason I ask is that I can't figure out why TouchOSC can't be used over USB. Apple only just introduced Core MIDI to their iOS API, so I doubt we'll see direct OSC support any time soon.

  • > or are they serial bits that are then converted by serialosc or monomeserial to OSC?

    ^^ this one.

  • hmmm. cool cool. Thanks soundcyst.

    That gives me high hopes of being able to work something up on the iPad that sends serial through the 30-pin cable and gets converted the same way.

    I'll have to dig through the source of monomeserial to see what's going on exactly.

  • > That gives me high hopes in being able to perhaps work something up on the iPad that sends serial through their 30 pin cable and is converted as well.

    unfortunately apple are complete jackasses about this.

    in order to access their hardware SDK from cocoa (i.e. using objective c 2.0 in your ipad app), you have to sign up for their Made for iPod program, which requires additional cash and a phone interview or two where you have to convince apple that your product is in line with their standards and goals.

    not exactly hacker/diy friendly.

  • Hmmmm. You sure about that? Just looked it up real fast, and it seems like the Made for iPod program is more about getting tech specs and the like that allow you to design accessories. It seems, though, that I can still use their standard 30-pin to USB cable to send messages with ye ol' developer program registration (proof is the screenshot below).

    I'm already registered as an Apple Developer for both OS X and iOS.

  • they may have changed their policy in the past couple of months, but last time i checked, you still need to register for MFi even to do bluetooth.

    interfacing a non-"computer" device (i.e. something that can't run iTunes) through USB is the same way. perhaps it's different to communicate with a laptop or desktop. just a heads up. i'd love to be proven wrong on this one.. it might actually convince me that it's worth it to buy an ipad.

  • I think you may have the wrong idea about my goal here: I was wondering if I could code an application similar to TouchOSC that sends data locally over the cable rather than via UDP over WiFi, which could theoretically eliminate some of the latency that occurs when high volumes of data are sent over WiFi.

    According to a document I just dug up in their reference library, you can use the External Accessory framework to communicate with any MFi device over bluetooth (which of course includes Mac computers). I'll have to look further, though, to see if they allow a serial protocol to be sent using their SDK in Cocoa.

    http://developer.apple.com/library/ios/#qa/qa2009/qa1657.html

  • If all else fails, there are several interfaces available for sending MIDI now. It's not OSC, but it's not wifi either.

    Likewise, you could go truly old-school and send data back and forth as an audio signal. (who wouldn't download an analog modem app?)

  • Also, I haven't had a chance to mess with it yet, but for a lot of our needs, I think this is more viable than TouchOSC:

    http://charlie-roberts.com/Control/

    If you're comfortable writing JavaScript, the new version essentially becomes a platform you can script apps for. (it won't generate audio or anything, but it can respond to MIDI or OSC messages from your computer by rewriting the interface.)

  • Hmmm. Control just looks like a messy TouchOSC. I know a little JavaScript, but I don't think it would be that friendly when it comes to writing new interfaces on the fly.

    I'm going to keep digging around, I guess. I probably won't code anything soon because I'm stuck on a couple of other projects first, but we'll see.

  • on the subject of communication over cable, has anyone seen this? It will take hacking to get it to send ethernet-style data, but the hardware exists, at least...


    http://www.engadget.com/2011/03/05/redpark-console-cable-gives-idevices-an-rj-45-connector-not-eth/

  • Step 1: hack the peripheral.

    Step 2: get something through the iTunes store that encourages hacking a peripheral.

    Step 2's a bit daunting.

  • Thankfully, there are other methods for getting applications onto an iPhone than the App Store. I would be surprised if this is not exploited.

  • A better solution is to use your XY coordinates more aggressively. I've seen a lot of "hitting the top of the pad is louder than hitting the bottom of the pad", which offers full control, but I think "the center of the pad is louder than the edges of the pad" is more intuitive.



  • Spambot's stealing my lines? Well, then! Two can play at this game.

    "Freelance writer."

    How do you like it, eh?




    EDIT:
    That was very old-world of me. Spambot's not stealing, it's remixing.
    Still, where's my attribution, Spambot?

  • I actually am a freelance writer and am a little worried that those spambots might be cheaper and faster. I don't wanna be replaced by a machine. Damn you WWW. Maybe I could become a freelance spambot.

  • he he he. These artificial lifeforms get more interesting all the time. Agreed, though: if it were using a Markov chain process to parse the conversation, find similar lines from other conversations, and interject those, with attribution, then suddenly spambots would become very useful and welcome. If someone wants to code one to trawl through molar help threads and point people to the FAQs, i would be very pleased.

  • on a really related note, the iPad and the Livid Code seem to be playing well together...

    http://www.synthtopia.com/content/2011/04/25/code-station/

  • I haven't been following developments in the app store. Has anything remotely close to our wish list come to pass yet?

  • So, I've been playing with Control a bit more.
    http://charlie-roberts.com/Control/

    The learning curve is indeed painful (in part because there's next to no documentation), but the potential is amazing.

    It's in no way a "messy TouchOSC". It is a complex one, though.

    TouchOSC is messy because your interfaces are forced to send every bit of data over the network regardless of whether your app needs it. Which is fine for controlling Ableton, but practically unusable as a platform to build apps around.

    TouchOSC lacks the ability to embed any intelligence about what to do with your controls once the user touches them. All of that processing has to occur in your app, which means all of that controller data must be sent.

    Simple example: You want a bank of buttons to trigger drum samples, and you want tilt data to control their velocity. Well, you'll need to send tilt as a CC (which you'll map to velocity in your app), and that "velocity" data will be transmitted constantly, even when no notes are playing. You'll also need to send your note data independently of that.

    Same challenge in Control... You make a bank of buttons, and assign them to run a JS function when pressed. That function checks your tilt data, and sends out a note with the appropriate velocity. If you trigger two drum hits, only two messages are sent. And they are, coincidentally, the exact two messages that you were hoping to receive.
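
    (Sketched out below. I'm not quoting Control's actual API here; the widget hookup, the tilt variable, and the send function are all assumptions. The shape of the logic is the point.)

    ```javascript
    // Sketch only: how a button's press handler can fold tilt into velocity.
    // None of these names are Control's documented API.
    var tilt = { x: 0, y: 0 };   // imagine the accelerometer updating this

    function onDrumPadPressed(padNumber) {
      // Read tilt at the moment of the hit; send exactly one message.
      var velocity = Math.max(1, Math.round(tilt.y * 127));
      sendNoteOn(36 + padNumber, velocity);
    }

    function sendNoteOn(note, velocity) { /* your MIDI/OSC bridge goes here */ }
    ```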


    Likewise, you could use an XY pad in TouchOSC to simulate velocity, with one corner louder than the others, or the edges quieter than the center. What this entails is sending X and Y data with every movement of your finger, and enabling an additional message (Z) to let your app know when you're no longer touching that control.

    You then have three choices:

    a) Trigger a note with every movement on the pad, so scraping along it is like a drum roll. This may not sound pretty, but at least the data sent is actually being used.

    b) Store the latest XY coordinates in your app, and when the user lifts their finger from a pad, trigger a note based on those last coordinates. That's pretty wasteful.

    c) Trigger a note based on the first XY coordinates to come in, then ignore subsequent data until the Z message re-opens your gate. That's also pretty wasteful, but might play better than b.

    Same challenge in Control... You build an XY pad, and tell it to execute a function when the value changes, or when the pad is pressed, or when the pad is released. Whichever you're after, it can then send note data (at the proper velocity) when you intend to trigger notes. Or no data when you don't intend to do anything.
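
    (Again as a sketch, with the event names and send function assumed rather than quoted from Control's docs:)

    ```javascript
    // Sketch: an XY pad that only speaks when a note is actually wanted.
    function onPadPressed(x, y) {          // x and y normalized to 0..1
      // "One corner louder than the others": corner (1,1) is loudest.
      var velocity = Math.max(1, Math.round(x * y * 127));
      sendNoteOn(38, velocity);            // one message, exactly when intended
    }

    function onPadMoved(x, y)  { /* deliberately silent: no wasted traffic */ }
    function onPadReleased()   { /* nothing to send here either */ }

    function sendNoteOn(note, velocity) { /* your MIDI/OSC bridge goes here */ }
    ```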


    This ability to optimize the data transmission of your interface is sorely needed in TouchOSC.

    But, yes. TouchOSC's ability to visually generate those simple effective controllers is also sorely needed in Control.

    (Have I mentioned that I'm struggling with it?)

    (...and that I write code in JavaScript professionally?)

  • and this is exactly why multi-note aftertouch is such a lacking feature in Live. Because that's the way forward, in my opinion: Z-on triggers a note with velocity based on the XY coordinates, then the XY coordinates continue to be used to control per-note aftertouch and pitchbend. Trouble is that Live ignores all that, so it's a no-go on that platform. Has anyone had success with Renoise or somesuch?

  • I'm curious about this as well. Hoping it's fixed in Live 9, but I may well jump ship if it isn't.

    Though, I'm sort of coming at it from the opposite angle; I suspect the per-note modulation limits are more a factor of your virtual instruments than your host. But not having access to the multi-note aftertouch data in my scripted devices is ridiculous. I'm sending in data from hardware. Max can read that data. Let me read it in Max For Live.

    Bidule is probably more flexible on that front.



    On a related note, Control does offer monome templates (40h and 128). Those have been discussed (and improved) by this forum in other threads. What might not be clear is...

    1) These send and receive OSC messages that follow monome's protocols. So they support all the optimized messages that update a whole row or column or 8x8 frame all at once, for example. TouchOSC-based monome emulators have to translate those commands into individual LED assignments, which is sluggish by comparison.

    2) They're open-source and extensible. Want to add support for per-button LED brightness? Can do. Want to add RGB support and turn it into an Octinct emulator? That could be done. Want to add continuous XY readings to follow every button press as Lokey outlined above? I'm sure it's possible.

    Point being, why stop at emulation?
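
    (To make point 1 concrete: a row message packs eight LED states into a single byte, so an emulator can unpack it in one pass instead of waiting on eight separate messages. Roughly, with a made-up display function:)

    ```javascript
    // Sketch: unpack a monome-style row message (one byte = eight LEDs).
    // Bit 0 maps to the leftmost LED in the row; setLED() is a stand-in
    // for whatever redraws the on-screen buttons.
    function handleRowMessage(xOffset, y, bitmask) {
      for (var i = 0; i < 8; i++) {
        setLED(xOffset + i, y, ((bitmask >> i) & 1) === 1);
      }
    }

    function setLED(x, y, on) { /* update the button at (x, y) */ }

    // e.g. handleRowMessage(0, 3, 0xF0) lights the right half of row 3.
    ```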

  • Dumb question, but couldn't you have the iPad detect "velocity" simply by reading the footprint of your fingertip? What I mean is, if you touch a surface very lightly, only the very tip makes contact. As you apply pressure, more finger surface comes into contact with the surface, thus more velocity. Exaggerate this if you tap with your finger a little more horizontal.

    Obviously it would need to be adjusted for sausage fingers.

  • that's the approach taken by the Snyderphonics Manta. seems to work well enough, but I think it has higher-resolution sensors and better detection/interpolation logic. another idea would be to use vertical or horizontal strips, using finger position within the strip to determine velocity. sketched below, for what it's worth.
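
    the strip idea is easy to sketch in javascript. geometry and send function are made up:

    ```javascript
    // sketch: a vertical strip where finger position sets velocity.
    var STRIP_HEIGHT = 300;   // hypothetical strip height, in pixels

    function onStripPressed(note, touchY) {
      // top of the strip = loudest, bottom = quietest
      var velocity = Math.max(1, Math.round(127 * (1 - touchY / STRIP_HEIGHT)));
      sendNoteOn(note, velocity);
    }

    function sendNoteOn(note, velocity) { /* hand off to your MIDI/OSC bridge */ }
    ```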

  • It's not a dumb question, though what you're both describing falls more under "poly aftertouch" than "velocity".

    Anyway... it depends on your OS as much as the device. That approach might actually work on an Android.

    iPhone and iPad do have very fine resolution control, but the OS deliberately clumps the footprint of your fingertip into a single touch, and treats anything below a certain size threshold as a false reading. (so, you're meant to use your big fat fingers instead of a stylus or your fingernail)

    To the best of my knowledge, it's not possible for an app to override that.


    I'm starting to build something related to that idea in Control right now. It's not quite the same thing, but tracking two fingers within a given area should let us extrapolate a lot more information than one finger was going to afford anyway. (i.e., the XY pad gains pinch and rotate controls)
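
    (The math for that is pleasantly simple. A sketch, with the touch plumbing assumed:)

    ```javascript
    // Sketch: derive pinch and rotate values from two touch points.
    // How the touches arrive is assumed; the geometry is the point here.
    function pinchAndRotate(t1, t2) {        // each touch: { x: ..., y: ... }
      var dx = t2.x - t1.x;
      var dy = t2.y - t1.y;
      return {
        pinch:  Math.sqrt(dx * dx + dy * dy),   // finger spread, in pixels
        rotate: Math.atan2(dy, dx)              // angle between fingers, radians
      };
    }

    // Track deltas between calls to turn these into relative controls:
    // spread growing = pinch out, angle drifting = rotation amount.
    ```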

  • @gtz, now that is an awesome idea, although i suspect it would need fairly large pads to get a range of finger widths...