
I am trying to develop an application that records video on the phone so that the video can then be viewed on a PC. I use Android on the phone and Java on the PC; the PC acts as the server side and the transfer is done over a socket. My problem seems to be that I can record the video, but the PC-side application cannot play back the video that is sent. Let me show you the code where I set up the MediaRecorder:

public void prepareVideoRecorder(Camera mCamera, ParcelFileDescriptor pfd,
        SurfaceHolder mHolder) {
    if (mCamera == null) {
        mCamera = safeCameraOpen(mCamera);
    }
    if (mMediaRecorder == null) {
        mMediaRecorder = new MediaRecorder();

        mCamera.stopPreview();
        // Step 1: unlock the camera and attach it to the MediaRecorder
        mCamera.unlock();
        mMediaRecorder.setCamera(mCamera);
    }

    // Step 2: Set sources:
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    //mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);


    // Step 3: Set a CamcorderProfile (API level 8 or higher)
    mMediaRecorder.setProfile(CamcorderProfile
            .get(CamcorderProfile.QUALITY_HIGH));

    // Step 4: Set output file
    mMediaRecorder.setOutputFile(pfd.getFileDescriptor());
    // Step 5: Set the preview output
    mMediaRecorder.setPreviewDisplay(mHolder.getSurface());
    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }

}
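
For context, the ParcelFileDescriptor passed into prepareVideoRecorder comes from the TCP connection to the PC, roughly along the lines of the sketch below (a simplified sketch, not the exact project code; the host, port and helper-method name are placeholders, and on a real device the socket has to be opened off the main thread):

import java.io.IOException;
import java.net.Socket;

import android.hardware.Camera;
import android.os.ParcelFileDescriptor;
import android.view.SurfaceHolder;

// Sketch only: open the socket to the PC and let MediaRecorder write straight
// into its file descriptor. Address and port are placeholders.
private void startRecordingToPc(Camera camera, SurfaceHolder holder) throws IOException {
    Socket socket = new Socket("192.168.1.10", 1234);       // placeholder PC address/port
    ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
    prepareVideoRecorder(camera, pfd, holder);
    mMediaRecorder.start();                                 // begins writing into the socket
}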

This seems to be correct. The PC side, which is supposed to play the video and is built with Xuggler, stops at this line:

if (container.open(inputstream, null) < 0) {
    throw new IllegalArgumentException("could not open inputstream");
}

It is part of the following Java class:

import java.awt.Graphics;
import java.awt.image.BufferedImage;
import java.io.InputStream;
import java.net.URL;

import javax.media.Player;
import javax.media.protocol.DataSource;
import javax.swing.JPanel;

import com.xuggle.xuggler.Global;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IContainerFormat;
import com.xuggle.xuggler.IPacket;
import com.xuggle.xuggler.IPixelFormat;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import com.xuggle.xuggler.IVideoPicture;
import com.xuggle.xuggler.IVideoResampler;
import com.xuggle.xuggler.Utils;

public class imagePnl extends JPanel {

URL medialocator = null;
BufferedImage image;
private Player player;
private DataSource ds = null;
private String mobileLocation = "socket://localhost:1234";
// private ByteArrayDataSource byteDs = null;
private InputStream inputStream = null;
IContainerFormat format;

public imagePnl() {
}

public void setVideo(InputStream inputstream) {
    // Let's make sure that we can actually convert video pixel formats.
    if (!IVideoResampler
            .isSupported(IVideoResampler.Feature.FEATURE_COLORSPACECONVERSION)) {
        throw new RuntimeException("you must install the GPL version"
                + " of Xuggler (with IVideoResampler support) for "
                + "this demo to work");
    }

    IContainer container = IContainer.make();

    if (container.open(inputstream, null) < 0) {
        throw new IllegalArgumentException("could not open inputstream");
    }
    // query how many streams the call to open found
    int numStreams = container.getNumStreams();
    // and iterate through the streams to find the first video stream
    int videoStreamId = -1;
    IStreamCoder videoCoder = null;
    for (int i = 0; i < numStreams; i++) {
        // Find the stream object
        IStream stream = container.getStream(i);
        // Get the pre-configured decoder that can decode this stream;
        IStreamCoder coder = stream.getStreamCoder();

        if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
            videoStreamId = i;
            videoCoder = coder;
            break;
        }
    }
    if (videoStreamId == -1) {
        throw new RuntimeException("could not find video stream");
    }
    /*
     * Now we have found the video stream in this file. Let's open up our
     * decoder so it can do work.
     */
    if (videoCoder.open() < 0) {
        throw new RuntimeException(
                "could not open video decoder for container");
    }
    IVideoResampler resampler = null;
    if (videoCoder.getPixelType() != IPixelFormat.Type.BGR24) {
        // if this stream is not in BGR24, we're going to need to
        // convert it. The VideoResampler does that for us.
        resampler = IVideoResampler.make(videoCoder.getWidth(),
                videoCoder.getHeight(), IPixelFormat.Type.BGR24,
                videoCoder.getWidth(), videoCoder.getHeight(),
                videoCoder.getPixelType());
        if (resampler == null) {
            throw new RuntimeException(
                    "could not create color space resampler.");
        }
    }
    /*
     * Now, we start walking through the container looking at each packet.
     */
    IPacket packet = IPacket.make();
    long firstTimestampInStream = Global.NO_PTS;
    long systemClockStartTime = 0;
    while (container.readNextPacket(packet) >= 0) {
        /*
         * Now we have a packet, let's see if it belongs to our video stream
         */
        if (packet.getStreamIndex() == videoStreamId) {
            /*
             * We allocate a new picture to get the data out of Xuggler
             */
            IVideoPicture picture = IVideoPicture.make(
                    videoCoder.getPixelType(), videoCoder.getWidth(),
                    videoCoder.getHeight());

            try {
                int offset = 0;
                while (offset < packet.getSize()) {
                    System.out
                            .println("VideoManager.decode(): decode one image");
                    /*
                     * Now, we decode the video, checking for any errors.
                     */
                    int bytesDecoded = videoCoder.decodeVideo(picture,
                            packet, offset);
                    if (bytesDecoded < 0) {
                        throw new RuntimeException(
                                "got error decoding video");
                    }
                    offset += bytesDecoded;

                    /*
                     * Some decoders will consume data in a packet, but will
                     * not be able to construct a full video picture yet.
                     * Therefore you should always check if you got a
                     * complete picture from the decoder
                     */
                    if (picture.isComplete()) {
                        System.out
                                .println("VideoManager.decode(): image complete");
                        IVideoPicture newPic = picture;
                        /*
                         * If the resampler is not null, that means we
                         * didn't get the video in BGR24 format and need to
                         * convert it into BGR24 format.
                         */
                        if (resampler != null) {
                            // we must resample
                            newPic = IVideoPicture
                                    .make(resampler.getOutputPixelFormat(),
                                            picture.getWidth(),
                                            picture.getHeight());
                            if (resampler.resample(newPic, picture) < 0) {
                                throw new RuntimeException(
                                        "could not resample video");
                            }
                        }
                        if (newPic.getPixelType() != IPixelFormat.Type.BGR24) {
                            throw new RuntimeException(
                                    "could not decode video as BGR 24 bit data");
                        }

                        /**
                         * We could just display the images as quickly as we
                         * decode them, but it turns out we can decode a lot
                         * faster than you think.
                         * 
                         * So instead, the following code does a poor-man's
                         * version of trying to match up the frame-rate
                         * requested for each IVideoPicture with the system
                         * clock time on your computer.
                         * 
                         * Remember that all Xuggler IAudioSamples and
                         * IVideoPicture objects always give timestamps in
                         * Microseconds, relative to the first decoded item.
                         * If instead you used the packet timestamps, they
                         * can be in different units depending on your
                         * IContainer, and IStream and things can get hairy
                         * quickly.
                         */
                        if (firstTimestampInStream == Global.NO_PTS) {
                            // This is our first time through
                            firstTimestampInStream = picture.getTimeStamp();
                            // get the starting clock time so we can hold up
                            // frames until the right time.
                            systemClockStartTime = System
                                    .currentTimeMillis();
                        } else {
                            long systemClockCurrentTime = System
                                    .currentTimeMillis();
                            long millisecondsClockTimeSinceStartofVideo = systemClockCurrentTime
                                    - systemClockStartTime;

                            // compute how long for this frame since the
                            // first frame in the stream.
                            // remember that IVideoPicture and IAudioSamples
                            // timestamps are always in MICROSECONDS,
                            // so we divide by 1000 to get milliseconds.
                            long millisecondsStreamTimeSinceStartOfVideo = (picture
                                    .getTimeStamp() - firstTimestampInStream) / 1000;
                            // give ourselves 50 ms of tolerance
                            final long millisecondsTolerance = 50;
                            final long millisecondsToSleep = (millisecondsStreamTimeSinceStartOfVideo - (millisecondsClockTimeSinceStartofVideo + millisecondsTolerance));
                            if (millisecondsToSleep > 0) {
                                try {
                                    Thread.sleep(millisecondsToSleep);
                                } catch (InterruptedException e) {
                                    // we might get this when the user
                                    // closes the dialog box, so just return
                                    // from the method.
                                    return;
                                }
                            }
                        }

                        // And finally, convert the BGR24 to an Java
                        // buffered image
                        BufferedImage javaImage = Utils
                                .videoPictureToImage(newPic);

                        // and display it on the Java Swing window
                        setImage(javaImage);
                        // if (listener != null) {
                        // listener.imageUpdated(javaImage);
                        // }
                    }
                } // end of while
            } catch (Exception exc) {
                exc.printStackTrace();
            }
        } else {
            /*
             * This packet isn't part of our video stream, so we just
             * silently drop it.
             */
            do {
            } while (false);
        }

    }
    /*
     * Technically since we're exiting anyway, these will be cleaned up by
     * the garbage collector... but because we're nice people and want to be
     * invited places for Christmas, we're going to show how to clean up.
     */
    if (videoCoder != null) {
        videoCoder.close();
        videoCoder = null;
    }
    if (container != null) {
        container.close();
        container = null;
    }

    // byteDs = new ByteArrayDataSource(bytes, "video/3gp");
    // ToolFactory.makere byteDs
    // .getOutputStream();
    // Manager.createPlayer(byteD);
    // Player mediaPlayer = Manager.createRealizedPlayer(new
    // MediaLocator(mobileLocation));
    // Component video = mediaPlayer.getVisualComponent();
    // Component control = mediaPlayer.getControlPanelComponent();
    // if (video != null) {
    // add(video, BorderLayout.CENTER);
    // }
    // add(control, BorderLayout.SOUTH);
    // mediaPlayer.start();
    // } catch (IOException | NoPlayerException | CannotRealizeException ex)
    // {
    // Logger.getLogger(imagePnl.class.getName()).log(Level.SEVERE, null,
    // ex);
    // }
    paint(getGraphics());
}

public void setImage(BufferedImage image) {
    this.image = (BufferedImage) image;

    paint(getGraphics());
}

@Override
public void paintComponent(Graphics g) {
    // super.paintComponent(g);
    // Graphics2D g2d = (Graphics2D) g;
    //
    // g2d.drawImage(image, 0, 0, null);
    // explicitly specify width (w) and height (h)
    g.drawImage(image, 10, 10, this.getWidth(), this.getHeight(), this);

}
}

When the application stops at that line no error is shown, but the application never displays the video on the PC side either.
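
For reference, the InputStream handed to setVideo comes from the TCP connection with the phone; a minimal sketch of that wiring (simplified, not the exact project code; port 1234 as in the mobileLocation field above, the class name VideoServer is just for illustration, error handling omitted) is:

import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

import javax.swing.JFrame;

// Sketch only: accept the phone's connection and hand its stream to the panel.
public class VideoServer {
    public static void main(String[] args) throws Exception {
        JFrame frame = new JFrame("video");
        imagePnl panel = new imagePnl();
        frame.add(panel);
        frame.setSize(640, 480);
        frame.setVisible(true);

        try (ServerSocket server = new ServerSocket(1234)) {
            Socket phone = server.accept();               // wait for the phone to connect
            panel.setVideo(phone.getInputStream());       // blocks while decoding frames
        }
    }
}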

I hope you can help me. The project is for learning purposes. Thanks in advance, Fran


1 Answer


If you want to stream video from Android, you should use a streaming protocol such as RTSP/RTP. Using a TCP socket will not work, because the header information is not available in the packets received over the socket. Take a look at Spydroid.
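
One quick way to confirm this on the PC side (just a diagnostic sketch, not the streaming approach above): copy everything received from the socket into a file and then try to open that file with Xuggler or a desktop player. If the dumped file cannot be opened either, the container written by MediaRecorder over the socket never contained a usable header.

import java.io.FileOutputStream;
import java.io.InputStream;

// Diagnostic sketch: dump the received bytes to dump.mp4 and try to play that
// file afterwards. File name and buffer size are arbitrary.
static void dumpToFile(InputStream in) throws Exception {
    try (FileOutputStream out = new FileOutputStream("dump.mp4")) {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
}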

Answered 2013-03-12T13:16:19.900