To send an image every 50 ms, should I use TCP or UDP?

I am building a C# client-server application where the server sends an image (~100 KB) to the client through a socket every 50 ms. I was using TCP, but besides the overhead of this protocol, the client sometimes ended up with more than one image sitting in the socket buffer, and I still haven't thought of a clean mechanism to split the bytes of each image (actually, I only need the most recent one). I tried UDP, but came to the conclusion that I can't send 100 KB datagrams, only 64 KB ones. And even then I shouldn't exceed about 1500 bytes; otherwise the packet gets fragmented along the network and the chance of losing part of it grows.

So now I'm a bit confused. Should I keep using TCP and put some delimiter bytes at the end of each image so the client can separate them? Or should I use UDP, send 1500-byte datagrams, and come up with a mechanism for ordering and recovery? The key goal here is transmitting the images very fast. I don't mind losing some on the way as long as the client keeps receiving newer ones. Or should I use another protocol? Thanks in advance!
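For the TCP framing part of the question, the standard approach is a length prefix rather than delimiter or escape bytes: the server writes the image size before the image, and the client reads exactly that many bytes. A minimal sketch, assuming each side holds a `NetworkStream`; the `SendFrame`/`ReceiveFrame` names are illustrative, not from any library:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

static class Framing
{
    // Write a 4-byte big-endian length prefix, then the image bytes.
    public static void SendFrame(NetworkStream stream, byte[] image)
    {
        byte[] prefix = BitConverter.GetBytes(image.Length);
        if (BitConverter.IsLittleEndian) Array.Reverse(prefix); // network byte order
        stream.Write(prefix, 0, 4);
        stream.Write(image, 0, image.Length);
    }

    // Read the prefix, then exactly that many bytes of image data.
    public static byte[] ReceiveFrame(NetworkStream stream)
    {
        byte[] prefix = ReadExactly(stream, 4);
        if (BitConverter.IsLittleEndian) Array.Reverse(prefix);
        int length = BitConverter.ToInt32(prefix, 0);
        return ReadExactly(stream, length);
    }

    // TCP is a byte stream, so a single Read may return fewer bytes than
    // requested; loop until the full count has arrived.
    static byte[] ReadExactly(NetworkStream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0) throw new EndOfStreamException("Connection closed");
            offset += read;
        }
        return buffer;
    }
}
```

With framing in place, "more than one image on the socket" stops being a problem: the client can keep calling `ReceiveFrame` in a loop and simply display (or discard) frames as they complete.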

asked Apr 15, 2009 at 23:06 by Joao Oliveira

5 Answers

You should consider using Real-time Transport Protocol (aka RTP).

RTP runs over UDP, but it adds a layer on top that carries timestamps, sequence numbers, etc.
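To give a feel for what that layer adds, here is a sketch of the 12-byte RTP fixed header from RFC 3550 being assembled by hand. In practice a library would do this; the payload type and SSRC are values you pick for your stream, and everything below is just the spec's bit layout, not a complete sender:

```csharp
// Sketch of the 12-byte RTP fixed header (RFC 3550), big-endian fields.
static byte[] BuildRtpHeader(ushort sequenceNumber, uint timestamp,
                             uint ssrc, byte payloadType, bool marker)
{
    byte[] header = new byte[12];
    header[0] = 0x80;                           // version 2, no padding/extension/CSRC
    header[1] = (byte)((marker ? 0x80 : 0x00)   // marker bit: conventionally flags
                       | (payloadType & 0x7F)); //   the last packet of a frame
    header[2] = (byte)(sequenceNumber >> 8);    // sequence number: lets the client
    header[3] = (byte)sequenceNumber;           //   detect loss and reordering
    header[4] = (byte)(timestamp >> 24);        // timestamp: lets the client drop
    header[5] = (byte)(timestamp >> 16);        //   packets older than the frame
    header[6] = (byte)(timestamp >> 8);         //   it last displayed
    header[7] = (byte)timestamp;
    header[8] = (byte)(ssrc >> 24);             // synchronization source identifier
    header[9] = (byte)(ssrc >> 16);
    header[10] = (byte)(ssrc >> 8);
    header[11] = (byte)ssrc;
    return header;
}
```

Each 1500-byte chunk of a frame would carry one of these headers, so the client can reassemble a frame from consecutive sequence numbers and simply discard incomplete or stale frames, which matches your "only the most recent image matters" requirement.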

RTP is the main media transfer protocol used by VoIP and video-over-IP systems. I'd be quite surprised if you can't find existing C# implementations of the protocol.

Also, if your images are in JPEG format, you should be able to produce an RTP/MJPEG stream. Quite a few video viewers already have native support for receiving and displaying such a stream, since some IP webcams output in that format.