On Mac OS X the trackpad supports several gestures; one is the two-finger swipe to scroll up, down, left, or right on a page. wxPython has a panel to help create scrolled widgets, wx.lib.scrolledpanel. However, it has no support for gestures, which is a real pain.
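For context, this is roughly the setup I have (simplified, with placeholder widget names):

import wx
import wx.lib.scrolledpanel as scrolled

class MyScrolledPanel(scrolled.ScrolledPanel):
    def __init__(self, parent):
        scrolled.ScrolledPanel.__init__(self, parent, -1)
        sizer = wx.BoxSizer(wx.VERTICAL)
        for i in range(50):
            sizer.Add(wx.StaticText(self, -1, "row %d" % i), 0, wx.ALL, 2)
        self.SetSizer(sizer)
        # scrollbars and keyboard scrolling work, trackpad gestures do not
        self.SetupScrolling()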
I have tried to modify the NSView directly, the way it would be done in a normal Objective-C application. However, the touch methods (touchesBeganWithEvent:, etc.) are designed to be overridden in a subclass, so the override itself is the notification and handler for the event, unlike the Bind calls in wxPython. That would still be fine if Objective-C allowed monkey patching, e.g.
def handleTouchBegin(event):
    print "Hey a touch event has begun!"
view.touchesBeganWithEvent_ = handleTouchBegin
but, as you can guess, PyObjC errors out (because Objective-C doesn't support monkey patching, or at least not in any clean and nice fashion) and I get the following error:
TypeError: cannot change a method
OK, I could do what Apple says and subclass NSView, but the object has already been created by wxPython, so how can I still capture the events? Of course there is also
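For reference, this is roughly what the subclassing route looks like in PyObjC when you create the view yourself (just a sketch; it assumes the view has opted in with setAcceptsTouchEvents_(True), which needs 10.6+):

from AppKit import NSView

class TouchView(NSView):
    # AppKit calls this when a touch sequence starts, but only on a view
    # that has had setAcceptsTouchEvents_(True) called on it
    def touchesBeganWithEvent_(self, event):
        print "Hey a touch event has begun!", event

    def touchesMovedWithEvent_(self, event):
        print "touch moved", event

That is useless to me here, though, because wxPython has already made the NSView for the panel.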
NSEvent addGlobalMonitorForEventsMatchingMask:handler:
and
NSEvent addLocalMonitorForEventsMatchingMask:handler:
but those also disappoint: the global monitor only sees events going to other applications, and the local monitor isn't scoped to a single NSView's events (or it is and I am misinformed).
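To be concrete, this is the sort of thing I tried with the local monitor (a sketch; I'm assuming NSEventMaskGesture / NSEventMaskSwipe are the right mask constants and that PyObjC accepts a plain Python function as the handler block):

from AppKit import NSEvent, NSEventMaskGesture, NSEventMaskSwipe

def gesture_monitor(event):
    # this sees matching events for the whole application,
    # not just the ones headed for my panel's NSView
    print "gesture event:", event
    return event  # return the event so normal dispatch continues

monitor = NSEvent.addLocalMonitorForEventsMatchingMask_handler_(
    NSEventMaskGesture | NSEventMaskSwipe, gesture_monitor)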
So how should I do this? Am I missing another option? I know I read something about NSResponder, but from what I gathered that is what NSView already is, an event responder, and you don't add one to an NSView.
Are there observers like in QTKit, for example the one for monitoring the load state changing ( https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/QTKitApplicationProgrammingGuide/AnatomyoftheQTKFramework/AnatomyoftheQTKFramework.html#//apple_ref/doc/uid/TP40008156-CH109-SW11 )?
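By "observers" I mean the notification-style pattern, roughly like the following (QTMovieLoadStateDidChangeNotification and the block-based addObserverForName_object_queue_usingBlock_ call are just the QTKit example I have in mind; I don't know whether anything equivalent exists for a view's touch events):

from Foundation import NSNotificationCenter

def load_state_changed(notification):
    print "load state changed for", notification.object()

NSNotificationCenter.defaultCenter().addObserverForName_object_queue_usingBlock_(
    "QTMovieLoadStateDidChangeNotification", None, None, load_state_changed)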