65

Like many people already seem to have (there are several threads on this subject here), I am looking for a way to create a video from a sequence of images.

I want to implement my functionality in C#!

This is what I want to do:

/*Pseudo code*/
void CreateVideo(List<Image> imageSequence, long durationOfEachImageMs, string outputVideoFileName, string outputFormat)
{
    // Info: imageSequence.Count will be > 30 000 images
    // Info: durationOfEachImageMs will be < 300 ms

    if (outputFormat == "mpeg")
    {
    }
    else if (outputFormat == "avi")
    {      
    }
    else
    {
    }

    // Save video file to disk
}

I know there is a project called Splicer (http://splicer.codeplex.com/), but I cannot find suitable documentation or clear examples to follow (these are the examples I did find).

The closest to what I want to do that I have found, on CodePlex, is: How can I create a video from a directory of images in C#?

I have also read a few threads about ffmpeg (for example: C# and FFmpeg preferably without shell commands? and: Convert image sequence using ffmpeg), but none of them helped me with my problem, and I don't think the ffmpeg command-line approach is the best solution for me (because of the number of images).

I believe I could use the Splicer project in some way(?)

In my case, there are roughly 30,000+ images, each of which should be shown for about 200 ms (in the video stream I want to create).

(What is the video about? Plants growing...)

Can anyone help me complete my function?


7 Answers

70

Well, this answer comes a bit late, but since I have noticed some activity on my original question lately (and the fact that a working solution was never provided), I would like to share what finally worked for me.

I will divide my answer into three parts:

  • Background
  • Problem
  • Solution

Background

(this part is not important for the solution)

My original problem was that I had a lot of images (a huge amount), stored individually in a database as byte arrays. I wanted to make a video sequence out of all these images.

My equipment setup was something like this general picture: [image: equipment setup]

The images depict tomato plants in different growth states. All images were taken once every minute during daylight.

/*pseudo code for taking and storing images*/
while (true)
{
    if (daylight)
    {
        //get an image from the camera
        //store the image as byte array to db
    }
    //wait 1 min
}

I had a very simple database for storing the images; there was only one table in it (the table ImageSet): [image: the ImageSet table]


Problem

I had read many articles about ffmpeg (please see my original question), but I could not find any that explained how to go from a collection of images to a video.


Solution

Finally, I got a working solution! The main part of it comes from the open source project AForge.NET. In short, you could say that AForge.NET is a computer vision and artificial intelligence library for C#. (If you want a copy of the framework, just grab it from http://www.aforgenet.com/)

In AForge.NET there is a class called VideoFileWriter (a class for writing video files with the help of ffmpeg). This does almost all of the work. (There is also a very good example here.)

This is the final class (simplified) that I used to fetch the image data from my image database and convert it into a video:

public class MovieMaker
{

    public void Start()
    {
        var startDate = DateTime.Parse("12 Mar 2012");
        var endDate = DateTime.Parse("13 Aug 2012");

        CreateMovie(startDate, endDate);
    }    
    

    /*THIS CODE BLOCK IS COPIED*/

    public Bitmap ToBitmap(byte[] byteArrayIn)
    {
        var ms = new System.IO.MemoryStream(byteArrayIn);
        var returnImage = System.Drawing.Image.FromStream(ms);
        var bitmap = new System.Drawing.Bitmap(returnImage);

        return bitmap;
    }

    public Bitmap ReduceBitmap(Bitmap original, int reducedWidth, int reducedHeight)
    {
        var reduced = new Bitmap(reducedWidth, reducedHeight);
        using (var dc = Graphics.FromImage(reduced))
        {
            // you might want to change properties like
            dc.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
            dc.DrawImage(original, new Rectangle(0, 0, reducedWidth, reducedHeight), new Rectangle(0, 0, original.Width, original.Height), GraphicsUnit.Pixel);
        }

        return reduced;
    }

    /*END OF COPIED CODE BLOCK*/


    private void CreateMovie(DateTime startDate, DateTime endDate)
    {
        int width = 320;
        int height = 240;
        var frameRate = 200;

        using (var container = new ImageEntitiesContainer())
        {
            //a LINQ-query for getting the desired images
            var query = from d in container.ImageSet
                        where d.Date >= startDate && d.Date <= endDate
                        select d;

            // create instance of video writer
            using (var vFWriter = new VideoFileWriter())
            {
                // create new video file
                vFWriter.Open("nameOfMyVideoFile.avi", width, height, framRate, VideoCodec.Raw);

                var imageEntities = query.ToList();

                // loop through all images in the collection
                foreach (var imageEntity in imageEntities)
                {
                    //what's the current image data?
                    var imageByteArray = imageEntity.Data;
                    var bmp = ToBitmap(imageByteArray);
                    var bmpReduced = ReduceBitmap(bmp, width, height);

                    vFWriter.WriteVideoFrame(bmpReduced);
                }
                vFWriter.Close();
            }
        }

    }
}

Update 2013-11-29 (how to) (hopefully this is what you asked for, @Kiquenet?)

  1. Download the AForge.NET Framework from the downloads page (download the full ZIP archive and you will find many interesting Visual Studio solutions with projects, such as Video, in the AForge.NET Framework-2.2.5\Samples folder...)
  2. Namespace: AForge.Video.FFMPEG (from the documentation)
  3. Assembly: AForge.Video.FFMPEG (in AForge.Video.FFMPEG.dll) (from the documentation) (you can find AForge.Video.FFMPEG.dll in the AForge.NET Framework-2.2.5\Release folder)

If you want to create your own solution, make sure you have a reference to AForge.Video.FFMPEG.dll in your project. Then it should be easy to use the VideoFileWriter class. If you follow the link to the class you will find a very good (and simple) example: there, they feed the VideoFileWriter with Bitmap images in a for-loop. A minimal sketch of that pattern is shown below.
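
For reference, here is a minimal sketch of that pattern (my own illustration, not code from the AForge documentation; it assumes a project referencing AForge.Video.FFMPEG.dll and System.Drawing, and the file name, frame size, frame rate and codec below are placeholder choices):

using System.Drawing;
using AForge.Video.FFMPEG;

public class VideoFileWriterSketch
{
    public static void Main()
    {
        int width = 320;
        int height = 240;
        int frameRate = 25;

        // create the writer and open a new video file (codec and name are arbitrary here)
        using (var writer = new VideoFileWriter())
        {
            writer.Open("sketch.avi", width, height, frameRate, VideoCodec.MPEG4);

            using (var frame = new Bitmap(width, height))
            using (var graphics = Graphics.FromImage(frame))
            {
                // write ~5 seconds of generated frames; in a real application
                // the Bitmap would come from your own images instead
                for (int i = 0; i < frameRate * 5; i++)
                {
                    graphics.Clear(Color.DarkGreen);
                    graphics.DrawString("frame " + i, SystemFonts.DefaultFont, Brushes.White, 10, 10);
                    writer.WriteVideoFrame(frame);
                }
            }

            writer.Close();
        }
    }
}

The WriteVideoFrame call is the same one used in the MovieMaker class above; only the source of the Bitmap differs.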


Answered 2012-09-11T19:08:04.080
11

I found this code in the Splicer samples; it looks pretty close to what you want:

string outputFile = "FadeBetweenImages.wmv";
using (ITimeline timeline = new DefaultTimeline())
{
    IGroup group = timeline.AddVideoGroup(32, 160, 100);
    ITrack videoTrack = group.AddTrack();
    IClip clip1 = videoTrack.AddImage("image1.jpg", 0, 2); // play first image for a little while
    IClip clip2 = videoTrack.AddImage("image2.jpg", 0, 2); // and the next
    IClip clip3 = videoTrack.AddImage("image3.jpg", 0, 2); // and another
    IClip clip4 = videoTrack.AddImage("image4.jpg", 0, 2); // and finally the last

    double halfDuration = 0.5;

    // fade out and back in
    group.AddTransition(clip2.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip2.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // again
    group.AddTransition(clip3.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip3.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // and again
    group.AddTransition(clip4.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip4.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // add some audio
    ITrack audioTrack = timeline.AddAudioGroup().AddTrack();

    IClip audio =
        audioTrack.AddAudio("testinput.wav", 0, videoTrack.Duration);

    // create an audio envelope effect, this will:
    // fade the audio from 0% to 100% in 1 second.
    // play at full volume until 1 second before the end of the track
    // fade back out to 0% volume
    audioTrack.AddEffect(0, audio.Duration,
                   StandardEffects.CreateAudioEnvelope(1.0, 1.0, 1.0, audio.Duration));

    // render our slideshow out to a windows media file
    using (
        IRenderer renderer =
            new WindowsMediaRenderer(timeline, outputFile, WindowsMediaProfiles.HighQualityVideo))
    {
        renderer.Render();
    }
}
Answered 2012-03-16T20:58:39.047
11

I could not get the example above to work. However, I did find another library that works really well. Try "accord.extensions.imaging.io" via NuGet; I then wrote the following little function:

private void makeAvi(string imageInputfolderName, string outVideoFileName, float fps = 12.0f, string imgSearchPattern = "*.png")
{
    // reads all images in folder
    VideoWriter w = new VideoWriter(outVideoFileName,
        new Accord.Extensions.Size(480, 640), fps, true);
    Accord.Extensions.Imaging.ImageDirectoryReader ir =
        new ImageDirectoryReader(imageInputfolderName, imgSearchPattern);
    while (ir.Position < ir.Length)
    {
        IImage i = ir.Read();
        w.Write(i);
    }
    w.Close();
}

It reads all the images from a folder and makes a video out of them.

If you want to make it nicer, you could probably read the image dimensions instead of hard-coding them, but you get the idea. A sketch of that follows below.
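
For example, a sketch of that refinement could look like this (untested, my own illustration; it assumes the same packages and usings as the function above, that the reader/writer API behaves as shown there, and that Accord.Extensions.Size takes width first — the hard-coded (480, 640) call above does not make the argument order obvious):

private void makeAviAutoSize(string imageInputfolderName, string outVideoFileName, float fps = 12.0f, string imgSearchPattern = "*.png")
{
    // probe the first matching image with System.Drawing to get the frame size
    string firstImage = System.IO.Directory.GetFiles(imageInputfolderName, imgSearchPattern)[0];
    int width, height;
    using (var probe = System.Drawing.Image.FromFile(firstImage))
    {
        width = probe.Width;
        height = probe.Height;
    }

    // assumption: Size(width, height) argument order, mirroring the call above
    VideoWriter w = new VideoWriter(outVideoFileName,
        new Accord.Extensions.Size(width, height), fps, true);
    ImageDirectoryReader ir = new ImageDirectoryReader(imageInputfolderName, imgSearchPattern);
    while (ir.Position < ir.Length)
    {
        IImage i = ir.Read();
        w.Write(i);
    }
    w.Close();
}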

Answered 2015-05-29T15:07:29.403
8

FFMediaToolkit is a good solution in 2020, with .NET Core support.

https://github.com/radek-k/FFMediaToolkit

FFMediaToolkit is a cross-platform .NET Standard library for creating and reading video files. It uses the native FFmpeg libraries via the FFmpeg.Autogen bindings.

The library's README has a good example that answers this question.

// You can set the codec, bitrate, frame rate and many other options here.
var settings = new VideoEncoderSettings(width: 1920, height: 1080, framerate: 30, codec: VideoCodec.H264);
settings.EncoderPreset = EncoderPreset.Fast;
settings.CRF = 17;
var file = MediaBuilder.CreateContainer(@"C:\videos\example.mp4").WithVideo(settings).Create();
while(file.Video.FramesCount < 300)
{
    file.Video.AddFrame(/*Your code*/);
}
file.Dispose(); // MediaOutput ("file" variable) must be disposed when encoding is completed. You can use `using() { }` block instead.
Answered 2020-11-15T09:47:30.937
5

Here is a solution for creating a video from an image sequence with C# in Visual Studio.

My starting point was the answer below from "Hauns TM", but my requirements were more basic than theirs, so this solution may suit less advanced users (like myself) better.

Libraries:

using System;
using System.IO;
using System.Drawing;
using Accord.Video.FFMPEG;

You can get the FFMPEG library by searching for FFMPEG under "Tools -> NuGet Package Manager -> Manage NuGet Packages for Solution...".

The variables I passed to the function are:

  • outputFileName = "C://outputFolder//outputMovie.avi"
  • inputImageSequence = ["C://inputFolder//image_001.png", "C://inputFolder//image_002.png", "C://inputFolder//image_003.png", "C://inputFolder//image_004.png"]

The function:

private void videoMaker( string outputFileName , string[] inputImageSequence)
{
  int width = 1920;
  int height = 1080;
  var frameRate = 25;

  using (var vFWriter = new VideoFileWriter())
  {
    // create new video file
    vFWriter.Open(outputFileName, width, height, frameRate, VideoCodec.Raw);

    foreach (var imageLocation in inputImageSequence)
    {
      Bitmap imageFrame = System.Drawing.Image.FromFile(imageLocation) as Bitmap;
      vFWriter.WriteVideoFrame(imageFrame);
    }
    vFWriter.Close();
  }
}
Answered 2018-10-26T12:07:00.857
3

It looks like a lot of these answers are a bit outdated in 2020, so I am adding my thoughts.

I have been working on the same problem and have published the .NET Core project Time Lapse Creator on GitHub: https://github.com/pekspro/TimeLapseCreator It shows how to add information on extra frames (a timestamp, for instance), background audio, a title screen, fading and more. ffmpeg is then used to do the rendering. This is done in the following function:

// Render video from a list of images, add background audio and a thumbnail image.
private async Task RenderVideoAsync(int framesPerSecond, List<string> images, string ffmpgPath,
        string audioPath, string thumbnailImagePath, string outPath,
        double videoFadeInDuration = 0, double videoFadeOutDuration = 0,
        double audioFadeInDuration = 0, double audioFadeOutDuration = 0)
{
    string fileListName = Path.Combine(OutputPath, "framelist.txt");
    var fileListContent = images.Select(a => $"file '{a}'{Environment.NewLine}duration 1");

    await File.WriteAllLinesAsync(fileListName, fileListContent);

    TimeSpan vidLengthCalc = TimeSpan.FromSeconds(images.Count / ((double)framesPerSecond));
    int coverId = -1;
    int audioId = -1;
    int framesId = 0;
    int nextId = 1;

    StringBuilder inputParameters = new StringBuilder();
    StringBuilder outputParameters = new StringBuilder();

    inputParameters.Append($"-r {framesPerSecond} -f concat -safe 0 -i {fileListName} ");

    outputParameters.Append($"-map {framesId} ");

    if(videoFadeInDuration > 0 || videoFadeOutDuration > 0)
    {
        List<string> videoFilterList = new List<string>();
        if (videoFadeInDuration > 0)
        {
            //Assume we fade in from first second.
            videoFilterList.Add($"fade=in:start_time={0}s:duration={videoFadeInDuration.ToString("0", NumberFormatInfo.InvariantInfo)}s");
        }

        if (videoFadeOutDuration > 0)
        {
            //Assume we fade out to last second.
            videoFilterList.Add($"fade=out:start_time={(vidLengthCalc.TotalSeconds - videoFadeOutDuration).ToString("0.000", NumberFormatInfo.InvariantInfo)}s:duration={videoFadeOutDuration.ToString("0.000", NumberFormatInfo.InvariantInfo)}s");
        }

        string videoFilterString = string.Join(',', videoFilterList);

        outputParameters.Append($"-filter:v:{framesId} \"{videoFilterString}\" ");
    }

    if (thumbnailImagePath != null)
    {
        coverId = nextId;
        nextId++;

        inputParameters.Append($"-i {thumbnailImagePath} ");

        outputParameters.Append($"-map {coverId} ");
        outputParameters.Append($"-c:v:{coverId} copy -disposition:v:{coverId} attached_pic ");
    }

    if (audioPath != null)
    {
        audioId = nextId;
        nextId++;

        inputParameters.Append($"-i {audioPath} ");
        outputParameters.Append($"-map {audioId} ");

        if(audioFadeInDuration <= 0 && audioFadeOutDuration <= 0)
        {
            // If no audio fading, just copy as it is.
            outputParameters.Append($"-c:a copy ");
        }
        else
        {
            List<string> audioEffectList = new List<string>();
            if(audioFadeInDuration > 0)
            {
                //Assume we fade in from first second.
                audioEffectList.Add($"afade=in:start_time={0}s:duration={audioFadeInDuration.ToString("0", NumberFormatInfo.InvariantInfo)}s");
            }

            if (audioFadeOutDuration > 0)
            {
                //Assume we fade out to last second.
                audioEffectList.Add($"afade=out:start_time={(vidLengthCalc.TotalSeconds - audioFadeOutDuration).ToString("0.000", NumberFormatInfo.InvariantInfo)}s:duration={audioFadeOutDuration.ToString("0.000", NumberFormatInfo.InvariantInfo)}s");
            }

            string audioFilterString = string.Join(',', audioEffectList);

            outputParameters.Append($"-filter:a \"{audioFilterString}\" ");
        }
    }

    int milliseconds = vidLengthCalc.Milliseconds;
    int seconds = vidLengthCalc.Seconds;
    int minutes = vidLengthCalc.Minutes;
    var hours = (int)vidLengthCalc.TotalHours;

    string durationString = $"{hours:D}:{minutes:D2}:{seconds:D2}.{milliseconds:D3}";

    outputParameters.Append($"-c:v:{framesId} libx264 -pix_fmt yuv420p -to {durationString} {outPath} -y ");
        
    string parameters = inputParameters.ToString() + outputParameters.ToString();

    try
    {
        await Task.Factory.StartNew(() =>
        {
            var outputLog = new List<string>();

            using (var process = new Process
            {
                StartInfo =
                {
                FileName = ffmpgPath,
                Arguments = parameters,
                UseShellExecute = false,
                CreateNoWindow = true,
                // ffmpeg sends everything to the error output; standard output is not used.
                RedirectStandardError = true
                },
                EnableRaisingEvents = true
            })
            {
                process.ErrorDataReceived += (sender, e) =>
                {
                    if (string.IsNullOrEmpty(e.Data))
                    {
                        return;
                    }

                    outputLog.Add(e.Data.ToString());
                    Console.WriteLine(e.Data.ToString());
                };

                process.Start();

                process.BeginErrorReadLine();

                process.WaitForExit();

                if (process.ExitCode != 0)
                {
                    throw new Exception($"ffmpeg failed error exit code {process.ExitCode}. Log: {string.Join(Environment.NewLine, outputLog)}");
                }
                Console.WriteLine($"Exit code: {process.ExitCode}");
            }
        });
    }
    catch (Win32Exception)
    {
        Console.WriteLine("Oh no, failed to start ffmpeg. Have you downloaded and copied ffmpeg.exe to the output folder?");
    }

    Console.WriteLine();
    Console.WriteLine("Video was successfully created. It is availible at: " + Path.GetFullPath(outPath));
}
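
For illustration, a hypothetical call to this method might look like the sketch below. All paths and parameter values are made up, and it assumes the caller lives in the same class (RenderVideoAsync also uses the OutputPath member) with System.IO and System.Linq in scope:

// hypothetical usage, illustration only
var images = Directory.GetFiles(@"C:\timelapse\frames", "*.png")
                      .OrderBy(f => f)
                      .ToList();

await RenderVideoAsync(
    framesPerSecond: 30,
    images: images,
    ffmpgPath: @"C:\tools\ffmpeg\ffmpeg.exe",
    audioPath: null,                // no background audio in this sketch
    thumbnailImagePath: null,       // no cover image in this sketch
    outPath: @"C:\timelapse\out.mp4",
    videoFadeInDuration: 1.0,
    videoFadeOutDuration: 1.0);
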
Answered 2020-07-12T17:48:55.797
1

This function is based on the Splicer.Net library. It took me ages to understand how that library works. Make sure your fps (frames per second) is correct; by the way, the standard is 24 f/s.

In my case, I had 15 images and needed a 7-second video, so fps = 2. Fps may vary depending on the platform... or on how the developer uses it.
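
In other words, the frame rate is simply the number of images divided by the desired video length in seconds. A quick sketch of that calculation with this answer's numbers:

// fps = number of images / desired video length in seconds
int imageCount = 15;
double targetSeconds = 7.0;
double fps = imageCount / targetSeconds;   // ≈ 2.14, rounded to fps = 2 above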

public bool CreateVideo(List<Bitmap> bitmaps, string outputFile, double fps)
{
    int width = 640;
    int height = 480;
    if (bitmaps == null || bitmaps.Count == 0) return false;
    try
    {
        using (ITimeline timeline = new DefaultTimeline(fps))
        {
            IGroup group = timeline.AddVideoGroup(32, width, height);
            ITrack videoTrack = group.AddTrack();

            int i = 0;
            double miniDuration = 1.0 / fps;
            foreach (var bmp in bitmaps)
            {
                IClip clip = videoTrack.AddImage(bmp, 0, i * miniDuration, (i + 1) * miniDuration);
                System.Diagnostics.Debug.WriteLine(++i);
            }
            timeline.AddAudioGroup();
            IRenderer renderer = new WindowsMediaRenderer(timeline, outputFile, WindowsMediaProfiles.HighQualityVideo);
            renderer.Render();
        }
    }
    catch { return false; }
    return true;
}

Hope this helps.

Answered 2018-04-05T15:59:00.550