Sunday, May 8, 2011

Writing a Custom Widget Using PyGTK

Article by Mark Mruss, originally posted on www.learningpython.com

One of the things that I wanted to add to my simple PyWine application was an easy way for people to rate their wine. There were lots of different ways to do it but since I was looking for a tutorial to write I decided that I wanted to do it the way that you rate songs in iTunes. If you've never used iTunes before, you can rate songs on a sliding scale from zero to five using stars. It basically functions like a slider or a Horizontal Scale except that when drawing it's not a line, it's a row of stars.


The full source for this tutorial can be downloaded here.

The three most useful links that I found on this subject were: A song for the lovers, the writing-a-widget tutorial on the PyGTK website, and the widget.py example in the PyGTK cvs.

The skeleton of the following code will be mostly based on the widget.py example, but since this example will try to accomplish a bit more there will be some extra code. In order to understand this tutorial better I suggest you give widget.py a couple of reads.

The starting point is a file named starhscale.py which starts off with some rather standard Python stuff:

#!/usr/bin/env python

try:
    import gtk
    import gobject
    from gtk import gdk
except:
    raise SystemExit

import pygtk
if gtk.pygtk_version < (2, 0):
    print "PyGtk 2.0 or later required for this widget"
    raise SystemExit


Not too much surprising there. Now it's time to create and initialize our class, which we'll call StarHScale:

class StarHScale(gtk.Widget):
    """A horizontal Scale Widget that attempts to mimic the star
    rating scheme used in iTunes"""

    def __init__(self, max_stars=5, stars=0):
        """Initialization, max_stars is the total number
        of stars that may be visible, and stars is the current
        number of stars to draw"""

        # Initialize the Widget
        gtk.Widget.__init__(self)

        self.max_stars = max_stars
        self.stars = stars

        # Init the list of star x positions
        self.sizes = []
        for count in range(0, self.max_stars):
            self.sizes.append((count * PIXMAP_SIZE) + BORDER_WIDTH)

So what's happening here? Well the first thing you see is the definition of our StarHScale widget that is a subclass of gtk.Widget, which is the base class for all widgets in PyGTK. Then we have a rather simple __init__ routine where we set some parameters (the max number of stars to show and the current number of stars to show) and initialize the parent class.

You'll also notice that at the end of the function there is a list created, this list maps the X (horizontal) position of each star. It might not make much sense now, but it will become clear when you see how it is used. PIXMAP_SIZE and BORDER_WIDTH are "globals" that are defined outside of the StarHScale class as follows:

BORDER_WIDTH = 5
PIXMAP_SIZE = 22
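To make the mapping concrete, here is what that list works out to with the defaults (a plain-Python sketch, independent of the widget):

```python
BORDER_WIDTH = 5
PIXMAP_SIZE = 22

# Left edge, in pixels, of each of the five default stars
sizes = [(count * PIXMAP_SIZE) + BORDER_WIDTH for count in range(5)]
# sizes == [5, 27, 49, 71, 93]
```

So star number three, for example, starts being drawn 49 pixels in from the left of the widget.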

The next function we will write is the do_realize() function. The do_realize() function is related to the gtk.Widget.realize() function and is called when a widget is supposed to allocate its GDK windowing resources.

It may seem a bit complicated, but the do_realize() function is simply where a widget creates the GDK windowing resources (most probably a gtk.gdk.Window) that it will eventually be drawn to. In order to fully understand this it may be helpful to understand what a gtk.gdk.Window is; here is an explanation from the PyGTK documentation:

gtk.gdk.Window is a rectangular region on the screen. It's a low-level object, used to implement high-level objects such as gtk.Widget and gtk.Window. A gtk.Window is a toplevel window, the object a user might think of as a "window" with a titlebar and so on. A gtk.Window may contain several gtk.gdk.Window objects since most widgets use a gtk.gdk.Window.

A gtk.gdk.Window object interacts with the native window system for input and events. Some gtk.Widget objects do not have an associated gtk.gdk.Window and therefore cannot receive events. To receive events on behalf of these "windowless" widgets a gtk.EventBox must be used.

So a gtk.gdk.Window is not a "window" as we normally think of one; it's basically a rectangular region on the screen that will be used for "drawing" of some sort. For our StarHScale widget, its gtk.gdk.Window will be the area where the stars will be drawn. If you have done programming with other toolkits or in other languages, it may be helpful to think of this as the "surface" that the widget draws on. Much of the do_realize() code is taken from the widget.py example:

def do_realize(self):
    """Called when the widget should create all of its
    windowing resources.  We will create our gtk.gdk.Window
    and load our star pixmap."""

    # First set an internal flag telling that we're realized
    self.set_flags(self.flags() | gtk.REALIZED)

    # Create a new gdk.Window which we can draw on.
    # Also say that we want to receive exposure events
    # and button click and button press events

    self.window = gdk.Window(
        self.get_parent_window(),
        width=self.allocation.width,
        height=self.allocation.height,
        window_type=gdk.WINDOW_CHILD,
        wclass=gdk.INPUT_OUTPUT,
        event_mask=self.get_events() | gdk.EXPOSURE_MASK
            | gdk.BUTTON1_MOTION_MASK | gdk.BUTTON_PRESS_MASK
            | gtk.gdk.POINTER_MOTION_MASK
            | gtk.gdk.POINTER_MOTION_HINT_MASK)

    # Associate the gdk.Window with ourselves; Gtk+ needs a reference
    # between the widget and the gdk window
    self.window.set_user_data(self)

    # Attach the style to the gdk.Window; a style contains colors and
    # GC contexts used for drawing
    self.style.attach(self.window)

    # The default color of the background should be what
    # the style (theme engine) tells us.
    self.style.set_background(self.window, gtk.STATE_NORMAL)
    self.window.move_resize(*self.allocation)

    # load the star xpm
    self.pixmap, mask = gtk.gdk.pixmap_create_from_xpm_d(
        self.window, self.style.bg[gtk.STATE_NORMAL], STAR_PIXMAP)

    # self.style is a gtk.Style object; self.style.fg_gc is
    # an array of graphic contexts used for drawing the foreground
    # colours
    self.gc = self.style.fg_gc[gtk.STATE_NORMAL]

    self.connect("motion_notify_event", self.motion_notify_event)

There is quite a bit of code here, so I'll take some time to explain it. The first step is to set a flag that lets us, and anyone else who wants to know, see that we have been realized - that we have a gtk.gdk.Window associated with ourselves.

The next step is to actually create the gtk.gdk.Window that will be associated with the StarHScale widget. When we create it we also set many of its attributes. You can read more about all the available attributes in the PyGTK documentation, but here are the ones that we are setting:

parent: a gtk.gdk.Window
width: the width of the window in pixels
height: the height of the window in pixels
window_type: the window type
event_mask: the bitmask of events received by the window
wclass: the class of window - either gtk.gdk.INPUT_OUTPUT or gtk.gdk.INPUT_ONLY

We add a few events to the event mask of the gtk.gdk.Window because this widget will be interacting with the mouse. Then we make the necessary connections between the gtk.gdk.Window, the widget, and the widget's style. Finally we set the background colour and move the window into the position that has been allocated for us (self.allocation).

The next step is where the do_realize() code begins to diverge from the widget.py example: this is where we create our star pixmap using the pixmap_create_from_xpm_d function:

# load the star xpm
self.pixmap, mask = gtk.gdk.pixmap_create_from_xpm_d(
    self.window,
    self.style.bg[gtk.STATE_NORMAL],
    STAR_PIXMAP)

Here is a description of what a gtk.gdk.Pixmap is:

A gtk.gdk.Pixmap is an offscreen gtk.gdk.Drawable. It can be drawn upon with the standard gtk.gdk.Drawable drawing primitives, then copied to another gtk.gdk.Drawable (such as a gtk.gdk.Window) with the draw_drawable() method. The depth of a pixmap is the number of bits per pixel. Bitmaps are simply gtk.gdk.Pixmap objects with a depth of 1. (That is, they are monochrome pixmaps - each pixel can be either on or off).

What we will use the pixmap for is drawing each of our stars. Since we want the widget to be portable without having an xpm file around, we simply load its data. To do so we define the STAR_PIXMAP "global" outside of our StarHScale class as follows:

STAR_PIXMAP = [
    "22 22 77 1",
    "         c None",
    ".        c #626260",
    "+        c #5E5F5C",
    "@        c #636461",
    "#        c #949492",
    "$        c #62625F",
    "%        c #6E6E6B",
    "&        c #AEAEAC",
    "*        c #757673",
    "=        c #61625F",
    "-        c #9C9C9B",
    ";        c #ACACAB",
    ">        c #9F9F9E",
    ",        c #61635F",
    "'        c #656663",
    ")        c #A5A5A4",
    "!        c #ADADAB",
    "~        c #646562",
    "{        c #61615F",
    "]        c #6C6D6A",
    "^        c #797977",
    "/        c #868684",
    "(        c #A0A19E",
    "_        c #AAAAA8",
    ":        c #A3A3A2",
    "<        c #AAAAA7",
    "[        c #9F9F9F",
    "}        c #888887",
    "|        c #7E7E7C",
    "1        c #6C6C69",
    "2        c #626360",
    "3        c #A5A5A3",
    "4        c #ABABAA",
    "5        c #A9A9A7",
    "6        c #A2A2A1",
    "7        c #A3A3A1",
    "8        c #A7A7A6",
    "9        c #A8A8A6",
    "0        c #686866",
    "a        c #A4A4A2",
    "b        c #A4A4A3",
    "c        c #A1A19F",
    "d        c #9D9D9C",
    "e        c #9D9D9B",
    "f        c #A7A7A5",
    "g        c #666664",
    "h        c #A1A1A0",
    "i        c #9E9E9D",
    "j        c #646461",
    "k        c #A6A6A4",
    "l        c #A0A09F",
    "m        c #9F9F9D",
    "n        c #A9A9A8",
    "o        c #A0A09E",
    "p        c #9B9B9A",
    "q        c #ACACAA",
    "r        c #60615E",
    "s        c #ADADAC",
    "t        c #A2A2A0",
    "u        c #A8A8A7",
    "v        c #6E6F6C",
    "w        c #787976",
    "x        c #969695",
    "y        c #8B8B8A",
    "z        c #91918F",
    "A        c #71716E",
    "B        c #636360",
    "C        c #686966",
    "D        c #999997",
    "E        c #71716F",
    "F        c #61615E",
    "G        c #6C6C6A",
    "H        c #616260",
    "I        c #5F605E",
    "J        c #5D5E5B",
    "K        c #565654",
    "L        c #5F5F5D",
    "                      ",
    "                      ",
    "          .           ",
    "          +           ",
    "         @#$          ",
    "         %&*          ",
    "        =-;>,         ",
    "        ';)!'         ",
    "  ~{{]^/(_:< [}|*1@,   ",
    "   23&4_5367895&80    ",
    "    2a4b:7c>def)g     ",
    "     2c4:h>id56j      ",
    "      {k8lmeln2       ",
    "      j8bmoppqr       ",
    "      {stusnd4v       ",
    "      ws;x@yq;/       ",
    "      zfAB {CmD{      ",
    "     rE{     FGH      ",
    "     IJ       KL      ",
    "                      ",
    "                      ",
    "                      "]

The star is based on a star found in the Art Libre set of the wonderful Tango Desktop Project. I simply darkened it a bit.

Then we make a quick reference to the normal-state foreground gtk.gdk.GC (graphic context) associated with our style. A gtk.gdk.GC is simply an object that "encapsulates information about the way things are drawn, such as the foreground color or line width. By using graphics contexts, the number of arguments to each drawing call is greatly reduced, and communication overhead is minimized, since identical arguments do not need to be passed repeatedly" (from the PyGTK docs). So it's basically a bunch of drawing settings encapsulated in one simple object.

Finally to finish off the do_realize() function we connect ourselves with the "motion_notify_event" which we will use to track when the user moves the mouse over our widget.

The next step in our widget creation is the do_unrealize() function, which is called when a widget should free all of its resources. The widget.py example calls:

self.window.set_user_data(None) 

But I got a TypeError running that, so instead I simply destroyed the window. I'm not entirely sure what the correct approach is, or if one even has to worry about clearing the resources; either way, this is the code that I used:

def do_unrealize(self):
    # The do_unrealize method is responsible for freeing the GDK resources
    # De-associate the window we created in do_realize with ourselves
    self.window.destroy()

The next two functions deal with the size of our widget. The first function do_size_request() is called by PyGTK so that PyGTK can figure out how large the widget wants to be. The second function, do_size_allocate() is called by PyGTK in order to tell the widget how large it should actually be:

def do_size_request(self, requisition):
    """From widget.py: The do_size_request method is Gtk+ asking
    the widget how large it wishes to be.  It's not guaranteed
    that Gtk+ will actually give this size to the widget.  So we
    will ask for the size needed for the maximum number of stars"""

    requisition.height = PIXMAP_SIZE
    requisition.width = (PIXMAP_SIZE * self.max_stars) + (BORDER_WIDTH * 2)


def do_size_allocate(self, allocation):
    """The do_size_allocate method is called when the actual size
    is known and the widget is told how much space it has actually
    been allocated.  We save the allocated space; the rest of the
    code is identical to the widget.py example"""

    self.allocation = allocation
    if self.flags() & gtk.REALIZED:
        self.window.move_resize(*allocation)

The next function is the do_expose_event() function, which is called when the widget should actually draw itself. For the StarHScale this function is actually pretty simple:

def do_expose_event(self, event):
    """This is where the widget must draw itself."""

    # Draw the correct number of stars.  Each time you draw another
    # star, move over by 22 pixels, which is the size of the star.
    for count in range(0, self.stars):
        self.window.draw_drawable(self.gc, self.pixmap, 0, 0,
                                  self.sizes[count], 0, -1, -1)

Basically we simply loop through the current number of stars (self.stars) and draw our star pixmap to the window using the draw_drawable function. We use the self.sizes list (which we calculated in the __init__ function) to determine the x position where we will draw the star.

Now comes the time where we actually need to let the user interact with the widget and show and hide the stars. To do so we need to pay attention to the "motion_notify_event" and the "button_press_event". One thing you may have noticed in the do_realize() function is that we pay attention to gtk.gdk.POINTER_MOTION_MASK and gtk.gdk.POINTER_MOTION_HINT_MASK; the reason for this is explained in the PyGTK documentation:

It turns out, however, that there is a problem with just specifying POINTER_MOTION_MASK. This will cause the server to add a new motion event to the event queue every time the user moves the mouse. Imagine that it takes us 0.1 seconds to handle a motion event, but the X server queues a new motion event every 0.05 seconds. We will soon get way behind the user's drawing. If the user draws for 5 seconds, it will take us another 5 seconds to catch up after they release the mouse button! What we would like is to only get one motion event for each event we process. The way to do this is to specify POINTER_MOTION_HINT_MASK.

When we specify POINTER_MOTION_HINT_MASK, the server sends us a motion event the first time the pointer moves after entering our window, or after a button press or release event. Subsequent motion events will be suppressed until we explicitly ask for the position of the pointer using the gtk.gdk.Window method:

x, y, mask = window.get_pointer()

Our motion_notify_event handler is as follows:

def motion_notify_event(self, widget, event):
    # if this is a hint, then let's get all the necessary
    # information; if not, the event already has all we need.
    if event.is_hint:
        x, y, state = event.window.get_pointer()
    else:
        x = event.x
        y = event.y
        state = event.state

    if (state & gtk.gdk.BUTTON1_MASK):
        # loop through the sizes and see if the
        # number of stars should change
        self.check_for_new_stars(x)

This function is pretty simple: first we check whether the event is a hint, and if it is, we ask GTK+ to get us the real pointer information. If it is not a hint, we just collect the information from the passed gtk.gdk.Event object.

Then we check the event's state to make sure that the left mouse button is down; if it is, we pass the x coordinate of the mouse pointer to the self.check_for_new_stars() function, which determines how many stars should be shown.

The other event that lets the user hide and show stars is the button press event, which we handle using the do_button_press_event() gtk.Widget virtual method; it is called when a mouse button is pressed on the widget:

def do_button_press_event(self, event):
    """The button press event virtual method"""

    # make sure it was the first button
    if event.button == 1:
        # check for new stars
        self.check_for_new_stars(event.x)
    return True

This function is very simple, first we check to make sure that it was the left button that fired the gtk.gdk.BUTTON_PRESS_EVENT, and if it was we pass event.x (the position the mouse was in at the time of the event) to the check_for_new_stars() function.

def check_for_new_stars(self, xPos):
    """This function will determine how many stars
    will be shown based on an x coordinate.  If the
    number of stars changes the widget will be invalidated
    and the new number drawn"""

    # loop through the sizes and see if the
    # number of stars should change
    new_stars = 0
    for size in self.sizes:
        if (xPos < size):
            # we've reached the star number
            break
        new_stars = new_stars + 1

    # set the new value
    self.set_value(new_stars)

check_for_new_stars() is a relatively straightforward function. It takes an x coordinate as a parameter and determines how many stars should be visible based on it. We loop through the self.sizes list and compare the pre-calculated starting point of each star with the passed-in x coordinate, adding a star for each starting position the coordinate has passed. Once the x coordinate is no longer larger than the starting position of the current star we stop, and call self.set_value() with the result to set the number of stars.
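The counting logic can be exercised on its own, without any GTK; this standalone sketch mirrors the loop in check_for_new_stars():

```python
PIXMAP_SIZE = 22
BORDER_WIDTH = 5
# Left edge of each of five stars, as computed in __init__
sizes = [(count * PIXMAP_SIZE) + BORDER_WIDTH for count in range(5)]

def stars_for_x(x_pos, sizes):
    """Count the stars whose starting position lies at or left of x_pos."""
    new_stars = 0
    for size in sizes:
        if x_pos < size:
            break
        new_stars += 1
    return new_stars
```

A click at x=30 falls inside the second star (which starts at x=27), so two stars are shown; anything left of x=5 shows none.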

def set_value(self, value):
    """Sets the current number of stars that will be
    drawn.  If the number is different from the current
    number the widget will be redrawn"""

    if (value >= 0):
        if (self.stars != value):
            self.stars = value
            # check for the maximum
            if (self.stars > self.max_stars):
                self.stars = self.max_stars
            # redraw the widget
            self.window.invalidate_rect(self.allocation, True)

set_value() is another simple function that performs a few validation checks and then sets the current number of stars. If the number of stars has changed, the widget will be redrawn.
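The checks amount to ignoring negative values and capping at the maximum; as a minimal non-GTK sketch (the helper name is mine, not part of the widget):

```python
def clamped_stars(requested, current, max_stars):
    """Return the star count set_value() would end up storing."""
    if requested < 0:
        return current  # negative requests are ignored
    return min(requested, max_stars)  # cap at the maximum
```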

Three functions remain, and they exist simply to make the widget more usable. They are pretty self-explanatory:

def get_value(self):
    """Get the current number of stars displayed"""

    return self.stars


def set_max_value(self, max_value):
    """Set the maximum number of stars"""

    if (self.max_stars != max_value):
        if (max_value > 0):
            self.max_stars = max_value
            # reinit the sizes list (should really be a separate function)
            self.sizes = []
            for count in range(0, self.max_stars):
                self.sizes.append((count * PIXMAP_SIZE) + BORDER_WIDTH)
            # do we have to change the current number of stars?
            if (self.stars > self.max_stars):
                self.set_value(self.max_stars)


def get_max_value(self):
    """Get the maximum number of stars that can be shown"""

    return self.max_stars

Now we finish off starhscale.py with a little bit of code that will create a window and add the StarHScale widget to it if someone executes the starhscale.py file directly:

if __name__ == "__main__":
    # register the class as a Gtk widget
    gobject.type_register(StarHScale)

    win = gtk.Window()
    win.resize(200, 50)
    win.connect('delete-event', gtk.main_quit)

    starScale = StarHScale(10, 5)

    win.add(starScale)
    win.show_all()
    gtk.main()

So if you run the file you should see a small window showing the first five of ten stars.


Whew! So that's it. I hope you found this tutorial useful; the next step (in the next tutorial) is to add the widget to a gtk.TreeView.

The full source can be downloaded here.

tutorial:dvd_to_avi [Avidemux DokuWiki]

DVD/MPEG-2 to AVI

This tutorial explains the process of converting an MPEG-1 or MPEG-2 DVD file into an AVI file containing MPEG-4 ASP video (this is often incorrectly called “DivX” or “Xvid” - see the Common myths article for an explanation of the words DivX and Xvid and the difference between software and format).

There are several programs available online which allow you to convert DVDs to MPEG-4 AVI, using tools such as the popular MEncoder. The difference between Avidemux and these other projects is that Avidemux allows you to edit the file before encoding it, and to do a visual check of what you are doing.

Understanding MPEG file types

  • m1v means “MPEG-1 Video”, i.e. a file which only contains a so called elementary video stream, without any audio.
  • m2v means “MPEG-2 Video”.
  • mpg is a program stream containing the multiplexed video and audio streams.
  • vob is a DVD system stream which contains video, audio and additional information; it is also a program stream.
  • vdr is a transport stream containing video and audio(s). It is supported but without sync correction.

Getting a usable VOB file using MPlayer

mplayer dvd://1 -dumpstream -dumpfile rippeddvd.vob

This will create a file in your working directory called rippeddvd.vob. This is an Avidemux compatible VOB file in MPEG format with the various audio streams included on the DVD.

Loading and indexing your MPEG file

Load this rippeddvd.vob file into Avidemux by either clicking the “Open” folder icon in the toolbar, or going File→Open.

You will now be presented with a dialog box with a dropdown list of audio stream choices. This is where you pick which audio track on your DVD you want to use for your video. Generally the default, or first in the list is the best choice. Click “OK” to begin indexing the MPEG. This may take a number of minutes depending on the speed of your machine.

Note: Avidemux does not read MPEG streams directly. It has been designed to read an MPEG stream index. An MPEG stream index is a plain text file containing a description of the MPEG and the location of frames throughout the stream. This file allows Avidemux to seek randomly and stay accurate. In other words, without the index, Avidemux cannot handle MPEG files.

Editing

NTSC versus FILM

Some DVDs are coded at 23.976 fps, aka FILM (most movies, actually). Others are coded at 29.97 fps (NTSC) - soap operas, for example. In the first case, the DVD player converts the video on the fly to NTSC format (telecine). So the MPEG header always says 29.97, as that will always be the final format.

Avidemux uses mpeg2dec to decode MPEG streams (with a little patch). mpeg2dec does not perform telecine on FILM movies (and that's better that way).

It means that Avidemux cannot tell the difference between FILM and NTSC. So if the MPEG looks progressive (not interlaced) and obvious desync appears (and gets worse and worse), use Video→Frame Rate and set it to 23.976.

For PAL MPEG, there is no problem, it is always 25 fps.

If audio is present, Avidemux will try to guesstimate if the video is 23.976 by comparing audio and video duration.

Cropping

Cropping removes the black borders along the top and bottom of the video in a widescreen format DVD. It allows more data to be used in encoding actual picture information. Without cropping, the sharp edges would radically reduce picture quality in these areas, as the MPEG based codecs do not handle them well. Always make sure you remove all borders completely, even if they're only half-black or unclean.

  1. To crop the video, we must use the video filters. Before selecting the filters, however, use the slider bar at the bottom to select a spot in the middle of the movie. The reason for this is that the auto-crop feature adjusts the cropping based on the current frame.
  2. Now press the Video Filters button to popup the video filter list.
  3. Add Transformation → Crop.
  4. Now click the “AutoCrop” button. You'll notice the black areas will now appear in green to show the areas where the video will be cropped.
  5. If you like the way it looks click “OK” and then close the video filters list.

Resizing

You probably want to resize the video to something smaller. Lower resolution means higher bits/pixel ratio, which may improve quality at lower bitrates that are typical for MPEG-4.

Also, AVI files do not contain aspect ratio information (but it can be stored inside the MPEG-4 bitstream), so you'll have to resize the video after cropping to get the correct aspect ratio. The aspect ratio is the shape of a pixel. On a PC it is mostly square, however, on a DVD it could be 4:3 or 16:9.

So, bring back up the video filter list.

  1. Select the input aspect ratio (16:9 is the most common for DVD) and target aspect ratio (1:1 for AVI).
  2. Then select the resizing method – bilinear is not as sharp as bicubic or lanczos, but is generally more compressible. So it depends on the bitrate (sharper methods for higher bitrates), source video compressibility and your taste (some people prefer sharper video even if it means more compression artifacts such as ringing).
  3. You will want to check the round-to-multiple-of-16 option to be sure the final width and height are multiples of 16.
  4. Then move the slider until you reach the desired width.
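As a sketch of the arithmetic behind these steps (the 480-line NTSC frame and 16:9 display aspect ratio below are example values, not something Avidemux prescribes):

```python
def square_pixel_size(height, dar_w, dar_h):
    """Compute a 1:1-pixel target size for a frame of the given height
    and display aspect ratio, rounding both dimensions to multiples of 16."""
    new_width = height * float(dar_w) / dar_h
    new_width = int(round(new_width / 16.0)) * 16  # round to multiple of 16
    new_height = (height // 16) * 16
    return new_width, new_height
```

A full-height 16:9 NTSC frame comes out at 848x480 square pixels; in practice you would crop first and then resize, as described above.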

More video filters and cutting

Depending on the source, you may want to add more filters (subtitling, denoiser, deinterlacer, IVTC etc.).

At this point, the video is ready for any editing/cutting you may wish to do.

Configuring video

Choosing an encoder

Now choose the video encoder from the drop down list. The two MPEG-4 encoders supported by Avidemux are Xvid and FFmpeg (libavcodec) MPEG-4. Both are very good and more or less comparable, so it's up to you which one you prefer. We'll use Xvid in this tutorial, as it's currently better supported in Avidemux and therefore recommended.

Configuring the encoder

Click the Configure button and you will be presented with the Xvid Configuration window. It is highly recommended to use 2-pass encoding so you can select your final file size. Don't worry about the final file size for now; set the other options as you would like to have them and click OK to save the settings.

Calculating video size

Now use the Calculator, make sure Format is AVI, select your medium (i.e. final file size), and click Apply. This will automatically fill in the file size in the Xvid Configuration window. The calculation results will be displayed in the window. You can close this window now.
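The calculator is essentially doing the following arithmetic (a sketch; the 700 MB target, 90-minute runtime and 128 kbps audio are illustrative numbers, not defaults):

```python
def video_bitrate_kbps(target_mb, duration_s, audio_kbps):
    """Spread the size budget left over after audio across the running time."""
    total_kbits = target_mb * 8 * 1024  # 1 MB = 8192 kilobits
    audio_kbits = audio_kbps * duration_s
    return int((total_kbits - audio_kbits) / duration_s)
```

For a 700 MB target and a 90-minute film with 128 kbps audio this leaves roughly 933 kbps for the video stream; container overhead would shave a little more off in practice.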

Configuring audio

Selecting audio track

The first step is to choose which audio track to encode. By default the first audio track is selected which on a DVD should be the main language of the DVD (i.e. English for Region 1). To choose another main audio track go to Audio→Main Track and use the drop down box to select the internal audio track you want, or if you desire you can choose an external audio source. If you want a second audio track to be encoded use Audio→Second Track.

Checking A/V sync

At this point you want to check the audio sync. Turn on the filtered output preview mode and make sure the audio is synced, if it's not, use the Shift feature to correct this.

Choosing audio encoder

Next you must decide which encoder to use. MP3 via LAME is the most supported format. Keeping the default AC3 track is also a very good option for 5.1 sound. Recompressing from an already lossy compression will never sound as good, the file size savings won't be too great in using another codec, and you will save time in not compressing the audio.

Audio filters

Finally you must decide if you need or want any filters.

Saving

Now select File→Save→Save Video and wait. Avidemux will do pass 1 then switch to pass 2 (including audio). The result will be an AVI file with MPEG-4 ASP video and MP3 audio inside.

Howto OpenBrowser - Mono

Howto OpenBrowser

Any version of Mono released since 2007 supports opening a URL by using Process.Start with the URL, for example:

Process.Start ("http://www.google.com");

Older versions of Mono did not support this, so you had to do it manually. Here is how it used to be done in the past; this copes with a few different operating systems in *older* versions of Mono:

using System;
using System.Diagnostics;

public static bool OpenLink(string address)
{
    try {
        int plat = (int) Environment.OSVersion.Platform;
        if ((plat != 4) && (plat != 128)) {
            // Use Microsoft's way of opening sites
            Process.Start(address);
            return true;
        } else {
            // We're on Unix; try gnome-open (used by GNOME), then open
            // (used by MacOS), then the Firefox or Konqueror browsers
            // (our last hope).
            string cmdline = String.Format("gnome-open {0} || open {0} || " +
                "firefox {0} || mozilla-firefox {0} || konqueror {0}", address);
            Process proc = Process.Start(cmdline);

            // Sleep some time to wait for the shell to return in case of error
            System.Threading.Thread.Sleep(250);

            // If the exit code is zero or the process is still running then
            // apparently we have been successful.
            return (!proc.HasExited || proc.ExitCode == 0);
        }
    } catch (Exception) {
        // We don't want any surprises
        return false;
    }
}

If your program is meant to be run only under GNOME, however, there is a better and easier solution. It is sufficient to call Gnome.Url.Show, as shown below:

using Gnome;
...
public void OpenMyProgramWebsite()
{
    Url.Show("http://websiteofmyproject/");
}

tutorial:projectx [Avidemux DokuWiki]

ProjectX

ProjectX tries its best to handle and repair many stream types and shows what went wrong on reception. This is very useful for repairing MPEG files, especially those with audio and video synchronization problems.

Installing Java

Java must be installed to run ProjectX. How this is done will vary; here are guides for installing Java on some specific operating systems.

Ubuntu/Kubuntu/Xubuntu

Enable the “universe” component in your apt sources list. From a root console or via sudo, use nano/vim/gedit or whatever text editor you like to open /etc/apt/sources.list and make sure “universe” is present on your repository lines.
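If you are unsure whether “universe” is already enabled, a quick grep makes it easy to check. This is only a sketch: the sample file below stands in for your real /etc/apt/sources.list, and the mirror URL and release name are examples.

```shell
# Create a stand-in for /etc/apt/sources.list (example line only):
cat > /tmp/sources.list.sample <<'EOF'
deb http://archive.ubuntu.com/ubuntu/ hardy main restricted universe multiverse
EOF

# Count active (non-commented) "deb" lines that mention "universe";
# anything greater than zero means the component is enabled:
grep -cE '^deb .*universe' /tmp/sources.list.sample
# prints: 1
```

Against your real system you would run the same grep on /etc/apt/sources.list itself.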

After you have made sure the “universe” repository is enabled in your apt-sources file, from a root console or via sudo run this command:

sudo apt-get update

Next we will install Java itself. Run this command from a console, either as root or via sudo:

sudo apt-get install sun-java6-bin

Downloading ProjectX

ProjectX is written in Java and can run on almost any platform. You can download executable .jar files from Doom9.org in their Downloads section.

Extracting ProjectX

After you have downloaded the ProjectX binary zip file, you need to extract it. How you do this will depend on your operating system. On most Linux/BSD systems it can be done with the unzip command, as in this example:

unzip ProjectX_0.90.4.zip

Running ProjectX

After you have downloaded ProjectX, you can simply run it. Exactly how you run the program will depend on your operating system.

Linux/BSD

On Linux and BSD variant systems that have a working console environment, you can run ProjectX from the command line. From a console, make sure your current working directory contains the ProjectX .jar file (or that the .jar file is in your system path), then run this command:

java -jar ProjectX.jar

ProjectX should now open.

Windows

If you are using Windows, usually you can simply double-click on the .jar file. Alternatively, you can right-click the file and select Open from the context menu. If Java is properly installed, it should run the application for you.

Problems

Ubuntu

If you are using Ubuntu and you get an error when you try to run ProjectX from the console, you may need to run the following commands to configure Java (either as root or using sudo):

sudo update-alternatives --install /usr/bin/java java /usr/local/java/bin/java 3
sudo update-alternatives --config java

If this does not fix your errors when executing, please refer to the Java install page here.

SourceForge.net

ProjectX is a SourceForge project. News and updates to the program can be found at http://sourceforge.net/projects/project-x. The site does not provide a pre-compiled binary, so you may have to build one yourself if you want to use their version. Usually the regular binary downloads are current with very recent code releases.

tutorial:editing_mpeg_capture [Avidemux DokuWiki]

Editing MPEG capture (DVB or IVTV)

This page tries to give hints about editing captured MPEG files. These captured MPEGs are generally from DVB S/T (in MPEG TS format) or from IVTV based cards or any other card with hardware MPEG-2 encoding (in MPEG PS format).

The problem

These captures often contain transmission errors which end up as missing or broken frames. A player (MPlayer, xine, VLC) will constantly resync the streams using the timing information embedded in the stream. Avidemux will not.

Apart from the constant shift, which is easily recoverable using the timeshift filter, this results in a growing synchronisation issue when encoding or transcoding. Even saving without re-encoding will be out of sync.

MythTV recordings are a prime example of this problem. If the process below is not followed, the audio will be offset by approximately -330 ms at the start of the recording and then drift throughout its duration. Please note that not all MythTV recordings have this problem, just some, depending on the software and hardware configuration.

The solution

The only 100% reliable way to do it is to use ProjectX (see the ProjectX tutorial). It takes a bit of time but is easily scriptable; an example script is available from here. Let's say you have a DVB-T capture in MPEG TS format called 2537_20060819203500.mpg.

First we will demultiplex the file into elementary streams (synced elementary streams):

projectx 2537_20060819203500.mpg

That will generate one file per elementary stream with the following extensions:

  • m2v: for video stream
  • mp2: for MPEG audio stream
  • ac3: for Dolby Digital aka AC3 stream

Now we must recombine these streams into PS format, for example:

mplex -f 8 -o output_file.mpg 2537_20060819203500.ac3 2537_20060819203500.m2v 2537_20060819203500.mp2

The resulting file (output_file.mpg) can be edited in Avidemux (to remove commercials, for example) or transcoded to any format without sync issues.
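Since the whole demux-and-remux procedure is scriptable, the two steps above can be wrapped in a small shell sketch. This is an illustration, not the example script the wiki links to: it assumes projectx and mplex are on your PATH, and by default it only prints the commands (set RUN=1 to actually execute them), so it is safe to try.

```shell
#!/bin/sh
# Sketch: demultiplex a DVB capture with ProjectX, then remultiplex
# the synced elementary streams into an MPEG PS file with mplex.
set -eu

INPUT="${1:-2537_20060819203500.mpg}"   # the capture from the tutorial
BASE="${INPUT%.mpg}"

# Print commands by default; execute them only when RUN=1.
run() {
    if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi
}

# Step 1: demux into synced elementary streams (.m2v, .mp2, .ac3)
run projectx "$INPUT"

# Step 2: remux into a PS file that Avidemux can edit without sync issues
run mplex -f 8 -o output_file.mpg "$BASE.ac3" "$BASE.m2v" "$BASE.mp2"
```

Run it as `RUN=1 ./remux.sh capture.mpg` once you have verified the printed commands look right for your capture.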

ProjectX is very reliable at resyncing the streams. For the record, it will duplicate missing audio frames or create empty video frames where frames were dropped.

Up-to-date versions of mplex for Win32 systems are hard to find; however, ImagoMPEG-Muxer is a perfect substitute.

Final touch

If you encode to another format (MPEG-4, …), you will probably want to go to Preferences and select Input→Use libavcodec MPEG decoder. libavcodec contains code that tries to hide decoding errors, so instead of a green block you will get a mostly correct but blurred block, which is much more pleasing to the eye.