
I'm trying to capture images continuously so that I can send them over UDP. The goal is to implement a real-time video streaming program.

The code below captures images continuously and assigns each one to a QGraphicsScene so that I can test whether the images play back like a video. But when I run the program, my computer freezes after a few seconds, even though I delete the pointers. How can I fix this?

#include "mainwindow.h"
#include "ui_mainwindow.h"
#include <QThread>

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    scene = new QGraphicsScene(this);
    ui->graphicsView->setScene(scene);
    cam = new QCamera;
    cam->setCaptureMode(QCamera::CaptureStillImage);

    viewfinder = new QCameraViewfinder;
    viewfinder->show();
    QCameraImageCapture *cap = new QCameraImageCapture(cam);
    cap->setCaptureDestination(QCameraImageCapture::CaptureToBuffer);

    cam->setViewfinder(viewfinder);

    QObject::connect(cap, &QCameraImageCapture::imageCaptured, [=] (int id, QImage img) {

        while (true) {
            QByteArray *buf = new QByteArray;
            QBuffer *buffer = new QBuffer(buf);
            buffer->open(QIODevice::WriteOnly);
            img.save(buffer, "BMP");
            QPixmap *pixmap = new QPixmap();
            pixmap->loadFromData(buffer->buffer());
            scene->addPixmap(*pixmap);
            delete buf;
            delete buffer;
            delete pixmap;

            QThread::sleep(0.0416);
            cap->capture();
        }

    });

    QObject::connect(cap, &QCameraImageCapture::readyForCaptureChanged, [=] (bool state) {
        if (state == true) {
            cam->searchAndLock();
            cap->capture();
            cam->unlock();
        }
    });
    cam->start();
}

MainWindow::~MainWindow()
{
    delete ui;
}

2 Answers


You should use the imageAvailable() signal instead of imageCaptured.

Here is an example:

connect(cap, &QCameraImageCapture::imageAvailable, [=] (int id, QVideoFrame v) {

    if (v.isValid()) {
        if (v.map(QAbstractVideoBuffer::ReadOnly)) {

            QByteArray bitsVideo((char *) v.bits(), v.mappedBytes());

            // Call your "send raw data over UDP" function:
            // the datagram will contain the frame details, e.g.
            // [ width, height, bytesPerLine, format, rawdata ]
            sendDataOverUDP(v.width(), v.height(),
                            v.bytesPerLine(),
                            QVideoFrame::imageFormatFromPixelFormat(v.pixelFormat()),
                            bitsVideo);

            v.unmap();  // release the mapped frame memory
        }
    }
});
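
The sendDataOverUDP() function itself is left to you. A minimal sketch (not the answer's actual implementation; it assumes a QUdpSocket member named udpSocket, a hypothetical receiver at 192.168.1.10:45454, and #include <QUdpSocket> / <QDataStream>) could pack the frame details with QDataStream:

// Minimal sketch only: pack the frame details and raw bits into one
// datagram. "udpSocket" and the receiver address/port are hypothetical.
void MainWindow::sendDataOverUDP(int width, int height,
                                 int bytesPerLine,
                                 QImage::Format fmt,
                                 const QByteArray &bitsVideo)
{
    QByteArray datagram;
    QDataStream out(&datagram, QIODevice::WriteOnly);

    // Frame details first, then the raw pixel data.
    out << qint32(width) << qint32(height)
        << qint32(bytesPerLine) << qint32(fmt)
        << bitsVideo;

    // Note: a full raw frame is usually larger than the ~64 KB UDP
    // datagram limit, so real code would have to split it into chunks.
    udpSocket->writeDatagram(datagram, QHostAddress("192.168.1.10"), 45454);
}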

On the other side, the server or other client will build the image from the raw data it receives, like this:

void onDataImageReceived(int width, int height,
                         int bytesPerLine,
                         QImage::Format fmt,
                         QByteArray bitsVideo)
{
    QImage img((uchar *) bitsVideo.data(), width, height, bytesPerLine, fmt);
    // do something with img ...
}
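
How the data reaches onDataImageReceived() is up to your networking code. A sketch of a receiving slot (assuming the same QDataStream layout as the sending sketch above, a hypothetical Receiver class, and a QUdpSocket already bound to the chosen port) might look like:

// Sketch only: unpack one datagram per frame, using the same field order
// as the hypothetical sender above, then hand the data to the function
// from the answer.
void Receiver::onReadyRead()
{
    while (udpSocket->hasPendingDatagrams()) {
        QByteArray datagram;
        datagram.resize(int(udpSocket->pendingDatagramSize()));
        udpSocket->readDatagram(datagram.data(), datagram.size());

        QDataStream in(&datagram, QIODevice::ReadOnly);
        qint32 width, height, bytesPerLine, fmt;
        QByteArray bitsVideo;
        in >> width >> height >> bytesPerLine >> fmt >> bitsVideo;

        onDataImageReceived(width, height, bytesPerLine,
                            QImage::Format(fmt), bitsVideo);
    }
}

Keep in mind that the QImage constructor used in onDataImageReceived() wraps the buffer without copying it, so the image should be copied (e.g. with QImage::copy()) if it needs to outlive bitsVideo.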
answered 2018-12-17T00:53:37.017

I'm not familiar with QCamera and its associated classes, but the lambda you've connected to the QCameraImageCapture::imageCaptured signal looks incorrect. That signal is emitted when a single frame is ready for previewing. In your lambda, however...

while(true){
    QByteArray *buf = new QByteArray;
    QBuffer *buffer=new QBuffer(buf);
    buffer->open(QIODevice::WriteOnly);
    img.save(buffer, "BMP");
    QPixmap *pixmap = new QPixmap();
    pixmap->loadFromData(buffer->buffer());
    scene->addPixmap(*pixmap);
    delete buf;
    delete buffer;
    delete pixmap;

    QThread::sleep(0.0416);
    cap->capture();
}

the while loop never exits and blocks the Qt event loop. Also note that the code block...

QByteArray *buf = new QByteArray;
QBuffer *buffer=new QBuffer(buf);
buffer->open(QIODevice::WriteOnly);
img.save(buffer, "BMP");
QPixmap *pixmap = new QPixmap();
pixmap->loadFromData(buffer->buffer());
scene->addPixmap(*pixmap);
delete buf;
delete buffer;
delete pixmap;

is overkill and is (unless I'm mistaken) essentially equivalent to...

scene->addPixmap(QPixmap::fromImage(img));

So I think your lambda should be more like (untested)...

[=](int id, QImage img)
{
    scene->addPixmap(QPixmap::fromImage(img));
}
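
As for the QThread::sleep()/while(true) pattern, an untested sketch of a non-blocking alternative (reusing the cap and scene pointers from the question, and needing #include <QTimer>) is to drive the captures from a QTimer at roughly 24 fps:

// Sketch: request a capture roughly every 42 ms (~24 fps) without ever
// blocking the event loop. "cap" and "scene" are the pointers from the
// question.
QTimer *timer = new QTimer(this);
connect(timer, &QTimer::timeout, [=]() {
    if (cap->isReadyForCapture())
        cap->capture();
});

connect(cap, &QCameraImageCapture::imageCaptured, [=](int id, QImage img) {
    Q_UNUSED(id);
    scene->clear();                             // drop the previous frame
    scene->addPixmap(QPixmap::fromImage(img));  // show the new one
});

timer->start(42);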
answered 2018-12-16T19:24:12.923