Capture from a webcam is done with the AForge.Video.DirectShow.VideoCaptureDevice class. It must be given the moniker of the device from which the capture will occur. You must also subscribe to the NewFrame event, which fires every time a new frame is received from the device; the frame is passed to the handler as a Bitmap object and can be processed right away:
private void VideoSourceNewFrame(object sender, AForge.Video.NewFrameEventArgs eventArgs)
{
    var img = (Image) eventArgs.Frame;
    using (var ms = new MemoryStream())
    {
        // Re-encode the frame as JPEG so it can be served in the MJPEG stream
        img.Save(ms, ImageFormat.Jpeg);
        _bufImage = ms.ToArray();
    }
}
Capture is started by calling the Start() method.
The list of available devices can be obtained with the FilterInfoCollection class, passing it the required device category as a parameter:
var videoDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
The moniker string of the selected device is then passed to the constructor of the VideoCaptureDevice class.
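Putting these steps together, the capture setup might look like the following sketch. It assumes the first available device is used and that _bufImage is a field shared with the streaming code; the StartCapture method name is for illustration only:

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using AForge.Video;
using AForge.Video.DirectShow;

// Latest JPEG-encoded frame, shared with the streaming action (assumption)
private static volatile byte[] _bufImage;
private VideoCaptureDevice _videoSource;

public void StartCapture()
{
    // Enumerate video input devices and take the first one (assumption)
    var videoDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
    if (videoDevices.Count == 0) return;

    // The moniker string identifies the chosen device
    _videoSource = new VideoCaptureDevice(videoDevices[0].MonikerString);
    _videoSource.NewFrame += VideoSourceNewFrame;
    _videoSource.Start();
}
```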
An MJPEG stream is an HTTP response of type multipart/x-mixed-replace, in which JPEG frames follow one another, separated by a boundary marker:

HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: multipart/x-mixed-replace; boundary=--myboundary
Expires: -1

--myboundary
Content-Type: image/jpeg
Content-Length: 96719

.....image.......
--myboundary
Content-Type: image/jpeg
Content-Length: 96720

.....next image.......
public ActionResult Video()
{
    Response.Clear();
    // Multipart response: each part replaces the previous one in the browser
    Response.ContentType = "multipart/x-mixed-replace; boundary=--myboundary";
    Response.Expires = 0;
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    var ae = new ASCIIEncoding();

    // Keep sending frames while the client is connected
    while (Response.IsClientConnected)
    {
        try
        {
            // _bufImage holds the latest frame, already encoded as jpeg
            var buf = _bufImage;
            // Part header: boundary, content type and length of the frame
            var boundary = ae.GetBytes("\r\n--myboundary\r\nContent-Type: image/jpeg\r\nContent-Length:" + buf.Length + "\r\n\r\n");
            Response.OutputStream.Write(boundary, 0, boundary.Length);
            Response.OutputStream.Write(buf, 0, buf.Length);
            Response.Flush();
            // Limit the frame rate to roughly 20 frames per second
            Thread.Sleep(50);
        }
        catch (Exception) { }
    }
    Response.End();
    return null;
}
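On the client side, a multipart/x-mixed-replace stream can be displayed with a plain img element pointing at the action; the browser keeps the connection open and swaps the picture as new parts arrive. The controller name Home below is an assumption:

```html
<!-- The Home controller name is hypothetical; adjust to your routing -->
<img src="/Home/Video" alt="webcam stream" />
```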
Source: https://habr.com/ru/post/177793/