Dear developers,

First, I'd like to say I love your code, and it was easy to find where the changes would be made. I am working on a secondary school deployment of XO-1.5s and I would like to support USB microscopes within the Sugar environment. I am working at Ntugi School in Kenya (www.ntugischool.com); feel free to look us up.

My device can be addressed with GStreamer as /dev/video1, but it does not support frame caps, colorspace, or other driver controls. All I want to be able to do is see the live image and take a picture. I can make most of the GUI changes, but I'm stuck when it comes to glive.py: I do not know where the changes need to be made to support this device. I can take a picture with

    pipeline = gst.parse_launch('v4l2src device=/dev/video1 ! jpegenc ! filesink location=/tmp/microscope.jpg')

Any feedback would be great.

Adam Gordon
Upper Canada College
Toronto, ON, Canada
www.ntugischool.com

PS. I am in Kenya until the 21st, where implementation would be easy, though the communication between us is strong enough to implement this later as well.
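
PPS. To make the above concrete, here is a rough sketch of the live-view-plus-snapshot flow I have in mind, written pygst 0.10 style to match the gst.parse_launch call above. The element choices (ffmpegcolorspace, autovideosink) and the num-buffers approach are only my guesses for a device that will not negotiate caps on its own; I have not confirmed how this maps onto glive.py:

    #!/usr/bin/env python
    # Rough sketch only: not taken from glive.py.
    import gobject
    gobject.threads_init()
    import pygst
    pygst.require('0.10')
    import gst

    DEVICE = '/dev/video1'  # the USB microscope

    # Live view: no caps filter at all, let the sink take whatever v4l2src offers.
    live = gst.parse_launch(
        'v4l2src device=%s ! ffmpegcolorspace ! autovideosink' % DEVICE)
    live.set_state(gst.STATE_PLAYING)

    raw_input('Live view running; press Enter to take a picture...')

    # Stop the live pipeline so the device is free, then grab one frame as JPEG.
    live.set_state(gst.STATE_NULL)
    snap = gst.parse_launch(
        'v4l2src device=%s num-buffers=1 ! jpegenc ! '
        'filesink location=/tmp/microscope.jpg' % DEVICE)
    snap.set_state(gst.STATE_PLAYING)
    # Wait for end-of-stream (or an error) so the file is completely written.
    snap.get_bus().poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, 10 * gst.SECOND)
    snap.set_state(gst.STATE_NULL)
    print 'Saved /tmp/microscope.jpg'

If glive.py can be told to skip its caps/colorspace negotiation for a device like this and just build something along these lines, that would cover everything I need.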