
The browser will show the live video stream from the camera. Controls for focus, the IR LEDs, and image processing (cropping, mirroring, …) will be added in other examples. Below is a brief introduction to this example's code.

Packages used

Apart from standard library packages, we've used github.com/vladimirvivien/go4vl (packages device and v4l2) to access the camera through V4L2, and github.com/rs/zerolog for logging.
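As a quick, hypothetical illustration of how the two packages fit together (using the same device path /dev/video0 as the example further down; error handling reduced to a fatal log):

package main

import (
	"github.com/rs/zerolog/log"

	"github.com/vladimirvivien/go4vl/device"
)

func main() {
	// go4vl opens and manages the V4L2 device ...
	cam, err := device.Open("/dev/video0")
	if err != nil {
		// ... zerolog reports structured log messages.
		log.Fatal().Err(err).Msg("failed to open video device")
	}
	defer cam.Close()

	log.Info().Msg("video device /dev/video0 opened")
}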

func main()

main() opens and configures the video device and then starts the HTTP server:

  • line 64 opens the device /dev/video0, which is closed when the app terminates.

  • line 71 reads the pixel format of the device and stores width and height globally. See Components of interest for details on the sensor. Note the difference between the image width (720) and the number of bytes per line (736) for this sensor: each line is padded with empty bytes, so we have to work with the actual data size of 736 x 540 pixels (see the sketch after this list for one way the padding could be hidden when encoding). The sensor is set to deliver 8-bit values per pixel by default, so each byte in the frame represents one pixel.

  • line 82 starts the camera.

  • line 88 stores the channel for reading frames from the video device in a global variable.

  • line 92 registers func frameSrv() as the handler for the HTTP endpoint /stream on the default HTTP multiplexer.

  • line 93 starts listening on port 55553 for incoming requests. The server runs frameSrv in a new goroutine for each incoming request.
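The example streams the full 736 x 540 data area, padding included. As a minimal, hypothetical sketch of the cropping mentioned in the introduction (assuming the values from above: 720 visible pixels, 736 bytes per line, 540 lines), a raw frame could be wrapped in an image.Gray whose Stride is the padded line length while its Rect covers only the visible pixels; jpeg.Encode respects the stride, so the padding bytes at the end of each line never reach the encoded output:

package main

import (
	"image"
	"image/jpeg"
	"os"
)

func main() {
	const (
		bytesPerLine = 736 // padded line length (pixfmt.BytesPerLine)
		width        = 720 // visible pixels per line
		height       = 540 // lines per frame (pixfmt.Height)
	)

	// Stand-in for one raw frame as received from the frames channel.
	frame := make([]byte, bytesPerLine*height)

	// The stride tells the image where each new line starts, so the padding
	// bytes at the end of each line are simply skipped during encoding.
	img := &image.Gray{
		Pix:    frame,
		Stride: bytesPerLine,
		Rect:   image.Rect(0, 0, width, height),
	}

	// Encode only the visible 720 x 540 area (to stdout, just for illustration).
	if err := jpeg.Encode(os.Stdout, img, nil); err != nil {
		panic(err)
	}
}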

func frameSrv(w http.ResponseWriter, r *http.Request)

frameSrv is an HTTP handler that serves requests for the resource /stream on Merlin's port 55553.

It first creates an HTTP multipart/x-mixed-replace keep-alive stream and sets the content type of each part to image/jpeg. Each camera frame is then sent as a JPEG image to the http.ResponseWriter of this connection. (A sketch of a client reading this stream follows the list below.)

  • line 40 creates a buffer for frames from the video device.

  • line 41 creates an image struct used as a buffer to convert the frame to JPEG.

  • line 44 is a loop that waits for a new frame from the video device and

    • line 45 creates a new partWriter for the multipartWriter

    • line 51 copies the received frame from the video device into the prepared image

    • line 52 encodes the image as JPEG into the partWriter
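To illustrate what the handler produces, here is a minimal, hypothetical client sketch (the host name localhost is an assumption; replace it with Merlin's address). It requests /stream, takes the multipart boundary from the Content-Type header and decodes every part as a JPEG frame:

package main

import (
	"fmt"
	"image/jpeg"
	"log"
	"mime"
	"mime/multipart"
	"net/http"
)

func main() {
	resp, err := http.Get("http://localhost:55553/stream") // adjust the host as needed
	if err != nil {
		log.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()

	// The boundary ("frame" in the example) is announced in the Content-Type header.
	_, params, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
	if err != nil {
		log.Fatalf("unexpected content type: %v", err)
	}

	reader := multipart.NewReader(resp.Body, params["boundary"])
	for {
		part, err := reader.NextPart() // blocks until the next frame arrives
		if err != nil {
			log.Fatalf("stream ended: %v", err)
		}

		img, err := jpeg.Decode(part)
		if err != nil {
			log.Printf("skipping broken frame: %v", err)
			continue
		}
		fmt.Printf("received frame %v\n", img.Bounds())
	}
}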

The complete example code:

package main

import (
	"context"
	"fmt"
	"image"
	"image/jpeg"
	"mime/multipart"
	"net/http"
	"net/textproto"

	"github.com/rs/zerolog/log"

	"github.com/vladimirvivien/go4vl/device"
	"github.com/vladimirvivien/go4vl/v4l2"
)

const (
	PORT    = ":55553"      // Port number for the server
	DEVNAME = "/dev/video0" // Name of the video device
)

var (
	frames      <-chan []byte  // Channel to receive frames from the video device
	pixfmt      v4l2.PixFormat // Pixel format of the video stream
	frameWidth  int            // Width of the frame
	frameHeight int            // Height of the frame
)

// frameSrv is a handler that serves the frames from the video device
// as a multipart stream.
func frameSrv(w http.ResponseWriter, r *http.Request) {
	// --- Create a multipart writer and set the boundary ------------------------
	mimeWriter := multipart.NewWriter(w)
	mimeWriter.SetBoundary("frame")
	w.Header().Set("Content-Type", fmt.Sprintf("multipart/x-mixed-replace; boundary=%s", mimeWriter.Boundary()))
	partHeader := make(textproto.MIMEHeader)
	partHeader.Add("Content-Type", "image/jpeg")

	var frame []byte                                                // buffer for frames as slice of byte
	img := image.NewGray(image.Rect(0, 0, frameWidth, frameHeight)) // buffer for frame as image data type

	// --- read frames from video device and send them to the host ---------------
	for frame = range frames {
		partWriter, err := mimeWriter.CreatePart(partHeader) // create part to write to multipart writer
		if err != nil {
			log.Error().Msgf("Failed to create multipart writer part: %v", err)
			return
		}

		img.Pix = frame                         // copy the frame into the image
		err = jpeg.Encode(partWriter, img, nil) // encode the image as jpeg into the writer
		if err != nil {
			log.Error().Msgf("failed to encode frame to jpeg image: %v", err)
		}
	}
}

// main is the entry point of the program.
// It opens the video device, starts the device and serves the frames
// on the specified port.
func main() {
	// --- Open the video device -------------------------------------------------
	cam, err := device.Open(DEVNAME, device.WithBufferSize(1))
	if err != nil {
		log.Fatal().Msgf("Failed to open device %s: %v", DEVNAME, err)
	}
	defer cam.Close() // Close the device when the program exits

	// --- Get the pixel format of the video stream ------------------------------
	pixfmt, err = cam.GetPixFormat()
	if err != nil {
		log.Fatal().Msgf("Failed to get pixel format: %v", err)
	}

	// --- Get frame width/height and print the pixel format ---------------------
	frameWidth = int(pixfmt.BytesPerLine)
	frameHeight = int(pixfmt.Height)
	fmt.Printf("Video stream format: %d x %d\n %s\n", frameWidth, frameHeight, pixfmt)

	// --- Start the video device ------------------------------------------------
	err = cam.Start(context.TODO())
	if err != nil {
		log.Fatal().Msgf("Failed to start device: %v", err)
	}

	// --- Get the frames from the video device ----------------------------------
	frames = cam.GetOutput()

	// --- Serve the frames on the specified port --------------------------------
	log.Info().Msgf("Serving images on [%s/stream]", PORT)
	http.HandleFunc("/stream", frameSrv)
	err = http.ListenAndServe(PORT, nil)
	if err != nil {
		log.Fatal().Msgf("Failed to start server: %v", err)
	}
}

Connection details

image-20241227-181614.png