
Implemented LogLevels and missing Arguments (BlackDetect & BlackFrame) #373

Merged
3 commits merged into rosenbjerg:master on Jan 31, 2023

Conversation

koskit
Contributor

@koskit koskit commented Nov 27, 2022

No description provided.

@koskit
Contributor Author

koskit commented Nov 27, 2022

This seems to be a partial solution to #212 for blackdetect.

@rosenbjerg rosenbjerg merged commit 3c31650 into rosenbjerg:master Jan 31, 2023
rosenbjerg added a commit that referenced this pull request Feb 5, 2023
Implemented LogLevels and missing Arguments (BlackDetect & BlackFrame)

Former-commit-id: 3c31650
@NickDrouin

@koskit , @rosenbjerg , I'm sorry to revive an old PR, but I can't seem to make anything work with BlackDetect or BlackFrame.

Can you post an example of using one of these?

My goal is to identify the frame-ranges which are black within a FromUrlInput.

@koskit
Contributor Author

koskit commented Jun 27, 2024

Hey @NickDrouin , this is the "barebones" code that I used 1.5 years ago (haven't tested it since). Hope it helps!

/// <summary>
/// The contract shared by the detection configurations below.
/// (Implied by the original snippet, which did not include it.)
/// </summary>
public interface IVideoAnalysisConfiguration
{
    /// <summary>
    /// Applies the configuration to the video filters.
    /// </summary>
    void Apply(VideoFilterOptions options);
}

/// <summary>
/// Configuration on how black frame detection behaves.
/// </summary>
public class BlackFrameDetectionConfiguration : IVideoAnalysisConfiguration
{
    /// <summary>
    /// The ratio of how much of the screen must be black pixels
    /// to be considered a black screen. 
    /// <para>
    /// It's a ratio of black_pixels/total_pixels.
    /// </para>
    /// <para>
    /// Values from 0 to 100.
    /// </para>
    /// </summary>
    public int ScreenBlackRatioThreshold = 98;

    /// <summary>
    /// The threshold below which a pixel will be considered black.
    /// <para>
    /// From 0 to 255 (guess, no official documentation).
    /// </para>
    /// </summary>
    public int PixelLuminanceRatioThreshold = 32;

    /// <summary>
    /// Applies the configuration to the video filters.
    /// </summary>
    /// <param name="options">The existing video filter options.</param>
    public void Apply(VideoFilterOptions options) =>
        options.BlackFrame(
            amount: ScreenBlackRatioThreshold,
            threshold: PixelLuminanceRatioThreshold);
}

/// <summary>
/// Configuration on how black screen detection behaves.
/// </summary>
public class BlackScreenDetectionConfiguration : IVideoAnalysisConfiguration
{
    /// <summary>
    /// The minimum detected black duration expressed in seconds. 
    /// It must be a non-negative floating point number.
    /// </summary>
    public double MinimumDuration = 2.0;

    /// <summary>
    /// The ratio of how much of the screen must be black pixels
    /// to be considered a black screen. It's a ratio of 
    /// black_pixels/total_pixels.
    /// </summary>
    public double ScreenBlackRatioThreshold = 0.98;

    /// <summary>
    /// The threshold below which a pixel will be considered black.
    /// <para>
    /// E.g. 0.0 is total black (#000000) and 1.0 is white (#FFFFFF).
    /// </para>
    /// </summary>
    public double PixelLuminanceRatioThreshold = 0.1;

    /// <summary>
    /// Applies the configuration to the video filters.
    /// </summary>
    /// <param name="options">The existing video filter options.</param>
    public void Apply(VideoFilterOptions options) =>
        options.BlackDetect(
            minimumDuration: MinimumDuration,
            pictureBlackRatioThreshold: ScreenBlackRatioThreshold,
            pixelBlackThreshold: PixelLuminanceRatioThreshold);
}

public class StreamAnalyzer
{
    /// <summary>
    /// Runs analysis on the stream and raises events based on the provided configurations.
    /// </summary>
    /// <param name="analysisConfigurations">The configurations of the analysis.</param>
    public static async Task AnalyzeStreamAsync(
        Uri streamUri, 
        params IVideoAnalysisConfiguration[] analysisConfigurations)
    {
        FFOptions ffmpegOptions = new()
        {
            BinaryFolder = "ffmpegBinaryPath",
            WorkingDirectory = "somePath\\_work",
            TemporaryFilesFolder = "somePath\\_temp",
            LogLevel = FFMpegLogLevel.Info
        };

        GlobalFFOptions.Configure(ffmpegOptions);

        // Dummy file used for ffmpeg output. Will not write anything.
        string dummyFile = "somePath\\_temp\\ffmpeg_dummy_file_.mp4";

        await FFMpegArguments.FromUrlInput(streamUri)
                             .OutputToFile(dummyFile, addArguments: options =>
                             {
                                 options.WithVideoFilters(v =>
                                 {
                                     analysisConfigurations.ToList()
                                                           .ForEach(c => c.Apply(v));
                                 });
                             })
                             .NotifyOnOutput(l => Console.WriteLine($"[Out] {l}"))
                             .NotifyOnError(l => Console.WriteLine($"[Err] {l}"))
                             .WithLogLevel(FFMpegLogLevel.Info)
                             .ProcessAsynchronously();
    }
}

and then use it like so:

// Using the defaults of the configurations
await StreamAnalyzer.AnalyzeStreamAsync(
    new Uri(""),
    new BlackFrameDetectionConfiguration(),
    new BlackScreenDetectionConfiguration());
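For anyone who wants to sanity-check the filters outside of FFMpegCore, the two configurations above (at their default values) should correspond roughly to the following raw ffmpeg invocations. This is a sketch based on ffmpeg's documented `blackframe` and `blackdetect` filter options (`amount`/`threshold` and `d`/`pic_th`/`pix_th`); `input.mp4` is a placeholder:

```shell
# blackframe: log frames where at least 98% of pixels are below luminance 32
ffmpeg -i input.mp4 -vf "blackframe=amount=98:threshold=32" -an -f null -

# blackdetect: log intervals of >= 2s where >= 98% of pixels are below 10% luminance
ffmpeg -i input.mp4 -vf "blackdetect=d=2:pic_th=0.98:pix_th=0.1" -an -f null -
```

Note that `-f null -` discards the encoded output, so no dummy output file is needed when invoking ffmpeg directly.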

@NickDrouin

Thanks, @koskit .

In case anyone searches and hits this, I went with:


        var output = new StringBuilder();

        await FFMpegArguments
            .FromUrlInput(new(inputUriString))
            .OutputToFile(dummyOutputPath, addArguments: options => options
                .WithVideoFilters(filterOptions => filterOptions
                    .BlackDetect()
                    //.BlackFrame()
                    ))
            .NotifyOnOutput(l => output.AppendLine(l))
            .NotifyOnError(l => output.AppendLine(l))
            .WithLogLevel(FFMpegLogLevel.Info)
            .ProcessAsynchronously();

        var ranges = Helpers.ParseBlackFrameRanges(output.ToString());

Where:

    // Requires: using System.Globalization; (for CultureInfo.InvariantCulture)
    public static List<BlackFrameRange> ParseBlackFrameRanges(string ffmpegOutput)
    {
        var blackFrameData = new List<BlackFrameRange>();

        string[] lines = ffmpegOutput.Split('\n');
        foreach (var line in lines)
        {
            if (line.Contains("black_start:") && line.Contains("black_end:") && line.Contains("black_duration:"))
            {
                // blackdetect logs the three fields as the last space-separated tokens of the line.
                var parts = line.Split(' ');
                // ffmpeg always prints '.' as the decimal separator, so parse with the
                // invariant culture; the default culture-sensitive Parse breaks on
                // locales that use ',' for decimals.
                var start = double.Parse(parts[parts.Length - 3].Split(':')[1], CultureInfo.InvariantCulture);
                var end = double.Parse(parts[parts.Length - 2].Split(':')[1], CultureInfo.InvariantCulture);
                var duration = double.Parse(parts[parts.Length - 1].Split(':')[1], CultureInfo.InvariantCulture);

                blackFrameData.Add(new BlackFrameRange
                {
                    Start = start,
                    End = end,
                    Duration = duration
                });
            }
        }

        return blackFrameData;
    }

// Requires: using System.Text.Json.Serialization; (for the JsonPropertyName attributes)
public class BlackFrameRange
{
    [JsonPropertyName("start")]
    public double Start { get; set; }
    
    [JsonPropertyName("end")]
    public double End { get; set; }

    [JsonPropertyName("duration")]
    public double Duration { get; set; }
}
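For reference, the `blackdetect` log lines that `ParseBlackFrameRanges` matches look like the following (the `[blackdetect @ ...]` prefix and the timestamps vary; this line is illustrative, not captured output):

```
[blackdetect @ 0x560d2c3f1a00] black_start:3.52 black_end:5.96 black_duration:2.44
```

Also note that `blackframe` (as opposed to `blackdetect`) logs per-frame `frame:`/`pblack:` entries rather than start/end ranges, so the parser above only works with `BlackDetect()`.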
