Wednesday, April 28, 2010

Fixed one issue with too many participants

Well, today I tackled another known problem. When a third user joined an existing chat, the applications of the two other participants would crash. This happened because the pipeline was still in a running state when the code attempted to change it, which triggered a mutex error inside the gstreamer pipeline.

The first solution I thought of was to just set the max_participants property in sugar. Well, to my surprise, it doesn't actually enforce a maximum number of participants; instead it is used more like a boolean inside sugar's code. If the variable is found to have a value of 1, sharing is disabled, and any other value is ignored.

This was very annoying. After confirming on the sugar IRC channel that max_participants was useless for this purpose, I had to write my own system to prevent extra participants (ticket 45). This solution allows a third person to join the text chat only, with no video. A potential future implementation would let participants choose who they want to talk with from a list of connected users.
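For the curious, the guard is conceptually very simple. This is only a sketch; the names here are hypothetical and the real change lives in ticket 45:

```python
# Hypothetical sketch of limiting video to the first two participants.
# A third joiner still gets text chat, but no video pipeline is built
# or modified for them.

MAX_VIDEO_PARTICIPANTS = 2

def on_buddy_joined(participants, new_buddy):
    """Decide whether a newly joined buddy gets video or text-only chat."""
    participants.append(new_buddy)
    if len(participants) > MAX_VIDEO_PARTICIPANTS:
        # Crucially, we never touch the running pipeline here;
        # renegotiating it while it is playing is what caused the
        # mutex crash in the first place.
        return "text-only"
    return "video"
```

The point is that the decision happens before the pipeline is ever involved, so the two existing video participants are left alone.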

Monday, April 26, 2010

Theora + Gstreamer + RTP = Annoyed

Well, I am a bit annoyed at the moment. I have been trying all day to get rtp working in a gst pipeline with theora. I have been able to get rtp and h264 working together, but no luck with theora. I have been asking around on the gstreamer IRC channel on freenode with no real responses. My only remaining guess is that it has something to do with the caps filter on the receiving pipeline, or there is a bug.

I have been testing this outside of my code just using gst-launch. A shell script using h264 can be found in the git repo.
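For reference, here is roughly the shape of the pipelines I have been testing, written as gst-launch style descriptions (GStreamer 0.10 element names; the resolution, port, and caps string below are my own illustrative guesses, not the exact repo script). One caveat worth flagging: theora carries its codec setup in configuration headers, and with a hand-written caps string on the receiver those have to line up exactly, which may be why it only works some of the time:

```python
# Sketch of the send/receive pipeline descriptions for theora over rtp,
# built as strings for gst-launch or parse_launch. The receiver's caps
# filter is exactly the part I suspect is wrong or hitting a bug.

# Guessed rtp caps; theora normally also needs its configuration
# headers signalled, which a bare caps string like this does not carry.
RTP_CAPS = ("application/x-rtp,media=video,clock-rate=90000,"
            "encoding-name=THEORA,payload=96")

SEND = ("v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=15/1 "
        "! theoraenc ! rtptheorapay ! udpsink host=HOST port=5000")

RECV = ('udpsrc port=5000 caps="%s" '
        "! rtptheoradepay ! theoradec ! ffmpegcolorspace ! xvimagesink"
        % RTP_CAPS)
```

The h264 version in the repo has the same shape with the theora elements swapped for their h264 equivalents.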

In my opinion this is a blocker for releasing on activities.sugarlabs.org, as having it work only some of the time, with the workaround of retrying the connection until it succeeds, is unacceptable. Hopefully I can get this figured out soon.

In other news, we have been getting ready for Imagine this weekend.

Thursday, April 22, 2010

Still working on rtp and gstreamer

This week we have been working on a bunch of little things here and there. We are playing with different frame rates to see how the xo handles them. We currently have it running at 15 frames a second pretty well. We can push it to 20 frames a second, but then our preview frame takes a hit, dropping below a frame a second. This is because the preview uses ximagesink, which is not hardware accelerated.

We are currently working on getting RTP to work so that we can sync our header packets. We have been asking in the gstreamer irc chat room without much of a response. If we get this working, we will be able to merge both pipelines into one, and therefore use one xvimagesink, because there will be no race condition for the header packet. With that we may be able to push 25 frames a second, which would be a real accomplishment.

Tomorrow I am planning on packaging our activity. I really hope to have RTP working by then, but that might need to wait for a future release.

On another note, os build 120 was released the other day and fixed our screen tearing and xvimagesink issues with our code. This means slightly better frame rates and image quality. Kudos to everybody making this patch available!

Tuesday, April 20, 2010

Started Remote Testing

At our hackfest the other day, we were trying to test Open Video Chat across state lines. With some minor luck, we were able to see video from D.C. Unfortunately, we were not able to get our video to them. Today we gave it another try, and I found one of the problems we were having. Our code was querying eth0 for its associated ip address, which means it did not work when someone was using another device to connect to the net, like a USB Ethernet adapter. I was able to commit a fix that searches the device list for the first interface it can get an ip address for.
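The idea of the fix looks something like this. This is a modern-python sketch of the approach rather than the committed code, and it is Linux-only, which matches the xo environment:

```python
# Sketch of the fix: instead of hardcoding eth0, walk the interface
# list and return the first device we can get an ip address for.
# Linux-only (ioctl SIOCGIFADDR), matching the xo environment.

import fcntl
import socket
import struct

SIOCGIFADDR = 0x8915  # Linux ioctl that fetches an interface's address

def get_ip_for_device(ifname):
    """Return the ipv4 address bound to ifname, or None."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        packed = fcntl.ioctl(s.fileno(), SIOCGIFADDR,
                             struct.pack('256s', ifname[:15].encode()))
        # The address sits at offset 20 in the returned ifreq struct.
        return socket.inet_ntoa(packed[20:24])
    except OSError:
        return None  # interface has no address (or doesn't exist)
    finally:
        s.close()

def find_usable_ip():
    """First non-loopback address found on any interface."""
    for _, ifname in socket.if_nameindex():
        ip = get_ip_for_device(ifname)
        if ip and not ip.startswith('127.'):
            return ip
    return None
```

So a machine on a USB Ethernet adapter gets picked up just like one on eth0.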

This is not completely ideal, as it should just be able to use telepathy without having to deal with ip addresses at all. In the code's current state, it will still have problems behind a NAT, since its external ip address would be different from its associated address.

We have one more critical bug to work out, then later this week I expect to release a beta copy on activities.sugarlabs.org. So keep an eye out!

Monday, April 19, 2010

Boston Barcamp and Hackfest

Gnome-shell

Hackfest

We went to OLPC Headquarters for a hackfest.



At the hackfest we were able to get a few major goals completed, including a framerate of around 14 to 15 frames a second.

Here is a picture of Remy and Luke chatting! (just before we got it up to 14-15 frames a second)



Check out this post by our storytellers: FOSS@RIT goes to OLPC HQ.

We also have another short video of OLPC Headquarters.



Remy DeCausemaker had an interview with the head of HR.




BarCamp

The OVC team went to Barcamp Boston and presented about OVC and other foss initiatives at RIT. I feel our talk was decent; it felt much shorter than thirty minutes. We will be posting a video of our talk, as well as a few other talks we managed to record, later in the week.

Thursday, April 15, 2010

OVC is going to Boston

OVC To Barcamp Boston



OK, so we are going to Boston, and it will be fun! We couldn't resist making a little teaser video.

On a more serious note, today we got a suggestion from the community on a set of caps filters for our pipeline to improve performance, and it has helped. Today I also tried to get it to show the outgoing feed as well as the incoming feed, so the user can see how they are framed. I added a tee into the pipeline, which caused some issues. Now we know that we need to use a queue following the tee. For some reason, the xo still refuses to run the pipeline in the activity even though it works fine from the command line. I will hopefully tackle that problem tomorrow.
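The tee lesson in a nutshell, as a sketch of the pipeline description (element choices, resolution, and the HOST placeholder are illustrative, not our exact pipeline): every branch after a tee needs its own queue, otherwise the branches block each other and the pipeline stalls.

```python
# Sketch of a two-branch pipeline: a local preview plus the network
# feed, split with a tee. Each branch starts with its own queue so
# the branches are decoupled and can't deadlock each other.

PREVIEW_BRANCH = "t. ! queue ! ffmpegcolorspace ! ximagesink"

NETWORK_BRANCH = ("t. ! queue ! theoraenc ! rtptheorapay "
                  "! udpsink host=HOST port=5000")

PIPELINE = ("v4l2src ! video/x-raw-yuv,width=320,height=240 "
            "! tee name=t "
            + PREVIEW_BRANCH + " " + NETWORK_BRANCH)
```

The queues give each branch its own thread boundary, which is what makes the split work.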

Hope you enjoyed our teaser, next post will be in Boston!

Wednesday, April 14, 2010

XO Codec Testing

Well, yesterday we were playing with the xo's, seeing how they benchmark with existing video chat clients and the other video codecs we had access to. Using empathy's video chat system, we found that the xo's were unable to connect to non-xo's. We were able to connect xo to xo pretty well. The only problem is that it took a while to make the connection, and once connected, the frame rate was about a frame every two or three seconds at best.

We were looking into using other codecs, yet the xo's only seem to have access to two of them in a typical install: Theora and Smoke/jpeg. While playing with the gstreamer pipeline, we decided we should try writing our own filter to cut down the frame size by cutting out the color channels of the yuv frame before sending it into Theora. All the current greyscale options require a colorspace conversion, which seems too processor intensive for the xo's. By writing our own filter that works on native yuv, we should be able to compress the image further without the multiple colorspace conversions. Our filter would also shrink the resolution quickly, since it would be designed specifically for the xo camera's 640 by 480 resolution. The resizing element in gstreamer seems too generic, and therefore slower than we would like.
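The core idea of the filter can be sketched in pure python (the real thing would be written as a gstreamer plugin; this is just the concept). In an I420 yuv frame, the first width*height bytes are the luma plane, followed by two quarter-size chroma planes, so greyscale is just keeping Y and writing neutral chroma, with no colorspace conversion at any point:

```python
# Sketch of the planned filter's core idea. An I420 frame lays out
# w*h bytes of Y (luma), then w/2 * h/2 bytes of U, then the same of
# V. Greyscale means: keep Y untouched, overwrite U and V with the
# neutral value 128. No colorspace conversion is needed.

def greyscale_i420(frame, width, height):
    """Return a greyscale copy of an I420 frame given as bytes."""
    y_size = width * height
    chroma_size = y_size // 4          # each chroma plane is (w/2)*(h/2)
    luma = frame[:y_size]              # luma carries the greyscale image
    neutral_chroma = bytes([128]) * (2 * chroma_size)
    return luma + neutral_chroma
```

Since the chroma planes become constant, the encoder has much less to do there, which is where the extra compression would come from.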

We are also still having problems with using Farsight.

Friday, April 9, 2010

Week 4 Review

Hey, sorry for the delay in blog posts this week... It has been a bit busy, and unfortunately not as productive as I would like. Our barcamp took quite a few cycles out of our dev time, but it was worth it. We also visited Interlock twice this week. The second meeting was much more beneficial; we got into a couple of conversations with a few people there about our project and got some good ideas. I think I will have Fran take a look at writing a gstreamer plugin that cuts the video size in half efficiently, instead of the general-purpose one built into the current pipeline. That might reduce the bandwidth and size without killing our processing cycles on the xo. Otherwise, we didn't get as much dev work done as I had hoped.

We have also been spending time getting ready for a Boston Barcamp and a Gnome Python Hackfest which will be fun.

Our ASL is coming along. We are still having class twice a week and this week, we also visited the No Voice Zone at RIT. The No Voice Zone is a weekly event where hearing and non-hearing people get together and try to bridge the social barrier and teach/learn some American Sign Language.

I have also been trying to get telepathy and farsight working together... in its current state, the script isn't very reliable; I can only get it working part of the time, seemingly at random. So this weekend I will try to find some decent documentation and hopefully re-write the script.

Tuesday, April 6, 2010

Example Script Update

Well, we recently found an example script that allows us to do video chat the correct way with telepathy-farsight. Today we have been doing a little experimenting with it. While the results are promising on non-xo's, it is still not running as reliably as we would like. Our goal tomorrow is to figure out why it keeps having problems and to start adding its functionality to our activity.

I have committed one change that allows video to run both ways.

This will be very valuable in our Open Video Chat activity once we can work out the problems. After that, it is just a matter of optimizing the code for the xo.

Monday, April 5, 2010

Foss.rit.edu updates and test code

Today I did some work on our Foss.rit.edu landing page. I updated the news aggregator so that it now supports images, and added some new feed sources. After that I added an empire page that will hold our links on the net. It will be used as a central link list of all things foss at rit; some links include the accounts where we host media and the places we syndicate our stories.
I also added a blog feature to allow our members to have their own blogs. Forums and comments have been enabled as well. The goal is to increase activity and participation in foss events at RIT. The next step will be to open up the registration system on the website.

We also started playing with some test code we found last week. We set up the server with two test accounts for the master/slave setup the script uses. We got the audio somewhat working: a bit noisy, but it works both ways. The video is a bit temperamental; it seems to work sometimes, but other times it gives us problems. The example script can be found in our git repo.

Also, look forward to a possible hackfest! More information coming soon.

Sunday, April 4, 2010

Week 4 Review: Video Hack

Video Hack

  • We got video to work using a nasty hack that gstreamer provides.
  • This hack was a temporary solution to allow us to test the encoding and decoding of the video.
  • After updating our xo's, we were able to test it with xvimagesink. This sink gets about double the frame rate of ximagesink which will be helpful.
  • It is currently using the theora codec, which the xo systems seem to have trouble with. Future optimization will be needed.

Cleaner Video

  • We have been looking into streaming video the correct way. We plan to use farsight and telepathy to complete this task.
  • The example code we have from the original video chat activity is unusable. It relied on binary copies of telepathy, gstreamer, and farsight, which made it hard to understand and ugly.
  • We found some sample code that is understandable. We just need to get it working and then try to port it into our code.

Story Telling

Barcamp

  • On Saturday, we attended barcamp and gave a small talk about our project and other things that RIT does in the foss community.

Friday, April 2, 2010

Episode 4, Interlock, and xvimagesink

Well, today we brought over a couple of the 5Linx video phones to Interlock, giving them a few new toys to play with. Earlier, I re-flashed our xo's with os build 117, which gives us access to the shinier xvimagesink. This new xvimagesink in the gstreamer pipeline allows for a better frame rate, since changing the color system is no longer needed. While at Interlock, Fran gave the new sink a try, and it doubled our frame rate. This is huge, as the nasty hack we are using is killing our frame rate as it is.

Today I also found a promising code sample that shows how to use telepathy and farsight without the nasty binary hack in the original video chat code. That may be very useful.

On a great note, our Episode 4 is up!