I'd like to record the user's interaction with my Java applet as a video to send (possibly stream) to my server, for uploading to YouTube (or similar). A high frame rate is not needed (a few frames per second is enough).
Minimizing the bandwidth used is preferred, so sending JPEG snapshots to the server and encoding them server-side is my last resort.
Are there any lightweight Java video encoding libraries that don't require native code?
I'm new to Java so don't take this too seriously :)
I guess a good start with video encoding in Java is the Java Media Framework. I haven't tried it, so I don't know how good its FLV encoding support is.
Since Flash Media Server is commercial, couldn't you use Red5? You would have a SWF, not an applet, but you would reach a broader percentage of viewers since Flash Player is pretty widespread.
And Alex has a good point: since you need to upload the video to YouTube, why not use their API?
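If you do go through their API, an upload with the YouTube Data API v3 Java client looks roughly like the sketch below. It assumes you have already obtained an OAuth Credential with the upload scope; the application name and metadata values are just placeholders:

import java.io.File;
import java.io.FileInputStream;

import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.http.InputStreamContent;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.youtube.YouTube;
import com.google.api.services.youtube.model.Video;
import com.google.api.services.youtube.model.VideoSnippet;
import com.google.api.services.youtube.model.VideoStatus;

public class YouTubeUploadSketch {
    // 'credential' must already hold an OAuth token authorized for YouTube uploads
    public static void upload(Credential credential, File mp4) throws Exception {
        YouTube youtube = new YouTube.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                credential)
                .setApplicationName("applet-recorder") // placeholder name
                .build();

        // Minimal metadata for the uploaded video
        Video meta = new Video();
        VideoSnippet snippet = new VideoSnippet();
        snippet.setTitle("Applet session recording");
        meta.setSnippet(snippet);
        VideoStatus status = new VideoStatus();
        status.setPrivacyStatus("unlisted");
        meta.setStatus(status);

        // Stream the MP4 file as the media body of the insert call
        InputStreamContent media = new InputStreamContent("video/mp4", new FileInputStream(mp4));
        Video uploaded = youtube.videos().insert("snippet,status", meta, media).execute();
        System.out.println("Uploaded video id: " + uploaded.getId());
    }
}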
hth
Xuggler can be used to encode pretty much any format from Java, but it requires a native component to be installed. There is no applet-friendly version available in the easy-to-use downloads, but some users have built custom versions of FFmpeg and Xuggler that they ship inside downloadable applications. Try asking on the xuggler-users group to see if others will help.
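Just as a sketch of what the Xuggler side looks like, assuming its mediatool API and H.264 output (the file name, frame size and frame source are placeholders, and the native FFmpeg/Xuggler libraries still have to be present on the machine):

import java.awt.image.BufferedImage;
import java.util.concurrent.TimeUnit;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;

public class XugglerSketch {
    public static void main(String[] args) {
        int width = 640, height = 480;

        // Writer that muxes an H.264 stream into recording.mp4 (placeholder name)
        IMediaWriter writer = ToolFactory.makeWriter("recording.mp4");
        writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, width, height);

        long startMs = System.currentTimeMillis();
        for (int i = 0; i < 10; i++) {
            BufferedImage frame = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
            // ... draw the applet's current state into 'frame' here ...
            writer.encodeVideo(0, frame, System.currentTimeMillis() - startMs, TimeUnit.MILLISECONDS);
        }
        writer.close();
    }
}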
You can do this in pure Java without any native code; just use JCodec (http://jcodec.org). Encoding your images into H.264/MP4 this way gives you output that is immediately suitable for web streaming, and to upload while you are still recording you can split the sequence into small chunks of, say, 25-100 images each and upload each chunk as a separate movie. Here is a handy class you can use (the imports shown assume an older JCodec 0.1.x release):
// NOTE: imports assume the early JCodec 0.1.x API used below (e.g. CompressedTrack);
// package and class names differ in later releases
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;

import org.jcodec.codecs.h264.H264Encoder;
import org.jcodec.codecs.h264.H264Utils;
import org.jcodec.common.NIOUtils;
import org.jcodec.common.SeekableByteChannel;
import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;
import org.jcodec.containers.mp4.Brand;
import org.jcodec.containers.mp4.MP4Packet;
import org.jcodec.containers.mp4.TrackType;
import org.jcodec.containers.mp4.muxer.CompressedTrack;
import org.jcodec.containers.mp4.muxer.MP4Muxer;
import org.jcodec.scale.AWTUtil;
import org.jcodec.scale.RgbToYuv420;

public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(BufferedImage bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion
        transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

        // Encode image into H.264 frame, the result is stored in '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }
}
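A rough sketch of how the class above might be driven from an applet: paint the applet into a BufferedImage a few times per second and write one small chunk at a time (the frame count, frame rate and file handling below are just illustrative assumptions):

import java.awt.Component;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class AppletRecorder {
    // Paint the current state of the applet (or any component) into an image
    static BufferedImage snapshot(Component c) {
        BufferedImage img = new BufferedImage(c.getWidth(), c.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        c.paint(g);
        g.dispose();
        return img;
    }

    // Record 'frames' snapshots at roughly 'fps' frames per second into one MP4 chunk
    static void recordChunk(Component c, File out, int frames, int fps) throws IOException, InterruptedException {
        SequenceEncoder enc = new SequenceEncoder(out);
        for (int i = 0; i < frames; i++) {
            enc.encodeImage(snapshot(c));
            Thread.sleep(1000 / fps);
        }
        enc.finish();
        // ... upload 'out' to the server here, then start the next chunk ...
    }
}

Note that SequenceEncoder stamps every frame at 25 fps (the timescale passed to addTrackForCompressed and the duration in each MP4Packet), so if you only capture a few frames per second you may want to lower those values to keep playback at real-time speed.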
Why do you need to send the images or video directly? Sounds like a big bandwidth expense. Just serialize and send the stream of UI events with timestamps, and reconstruct what the user should be seeing on your server later (some visual details may depend on the user's machine/setup, but your applet isn't going to be able to get at them decently anyway).
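If you take this route, capturing timestamped AWT events can look roughly like the sketch below; the event mask and the binary record format are just illustrative choices, and an unsigned applet may need to attach listeners to its own components instead of using a global AWTEventListener:

import java.awt.AWTEvent;
import java.awt.Toolkit;
import java.awt.event.AWTEventListener;
import java.awt.event.MouseEvent;
import java.io.DataOutputStream;
import java.io.IOException;

public class EventLogger {
    public static void start(final DataOutputStream out) {
        final long t0 = System.currentTimeMillis();
        Toolkit.getDefaultToolkit().addAWTEventListener(new AWTEventListener() {
            public void eventDispatched(AWTEvent e) {
                try {
                    // One record per event: relative timestamp, event id,
                    // and (for mouse events) the pointer coordinates
                    out.writeLong(System.currentTimeMillis() - t0);
                    out.writeInt(e.getID());
                    if (e instanceof MouseEvent) {
                        MouseEvent me = (MouseEvent) e;
                        out.writeInt(me.getX());
                        out.writeInt(me.getY());
                    } else {
                        out.writeInt(-1);
                        out.writeInt(-1);
                    }
                } catch (IOException ex) {
                    // sketch only: ignore or stop logging on write failure
                }
            }
        }, AWTEvent.MOUSE_EVENT_MASK | AWTEvent.MOUSE_MOTION_EVENT_MASK | AWTEvent.KEY_EVENT_MASK);
    }
}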