Is There a Better Way to use 24 bit Color in the Terminal Using C++?

I've been creating colored animations in the terminal using C++, but I've started to run into the performance hit that ANSI escape sequences bring. I want full color in the terminal while still maintaining a decent amount of speed. I currently use _write to write to the terminal. I suspect that ANSI increases waiting times because of its increased space usage: \x1b[48;2;000;000;000m plus the character itself is 20 bytes for one letter with a fully independent RGB background, 20 times more expensive than simply writing a single letter to the terminal.
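
(As a quick sanity check of that 20x figure, the snippet below just measures the byte length of the escape-plus-character literal; it is 20 bytes where an uncolored cell is 1 byte.)

#include <cstdio>
#include <cstring>

int main()
{
    // One colored cell: a 24-bit background SGR sequence plus the character.
    const char cell[] = "\x1b[48;2;000;000;000m ";
    // Prints 20: 19 bytes of escape sequence and 1 byte of payload.
    std::printf("%zu\n", std::strlen(cell));
    return 0;
}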

I have compared ANSI to plain text using the following code segments (yes, I'm aware that std::chrono::high_resolution_clock is not the best tool for profiling, but the difference is large enough that it doesn't matter):

#include <windows.h>
#include <iostream>
#include <iomanip>
#include <chrono>
#include <vector>
#include <io.h>      // _write
#include <unistd.h>  // dup, STDOUT_FILENO

int main()
{
    // ENABLE_VIRTUAL_TERMINAL_PROCESSING is an output mode, so it has to
    // be set on the output handle, not the input handle.
    HANDLE console = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD dwMode;
    GetConsoleMode(console, &dwMode);
    dwMode |= ENABLE_VIRTUAL_TERMINAL_PROCESSING;
    SetConsoleMode(console, dwMode);
    CONSOLE_SCREEN_BUFFER_INFO screen;
    GetConsoleScreenBufferInfo(console, &screen);
    short columns = screen.srWindow.Right - screen.srWindow.Left + 1;
    short rows = screen.srWindow.Bottom - screen.srWindow.Top + 1;
    int newfd = dup(STDOUT_FILENO);
    // One plain space per visible cell.
    std::vector<char> buffer(columns * rows, ' ');
    auto start = std::chrono::high_resolution_clock::now();
    for (int i = 0; i < 500; i++)
    {
        _write(newfd, buffer.data(), static_cast<unsigned>(buffer.size()));
    }
    auto finish = std::chrono::high_resolution_clock::now();
    std::cout << std::setprecision(20)
              << std::chrono::duration_cast<std::chrono::nanoseconds>(finish - start).count() / 1000000000.0;
    return 0;
}

0.40927010774612426758 seconds.

#include <windows.h>
#include <iostream>
#include <iomanip>
#include <chrono>
#include <cstring>
#include <vector>
#include <io.h>      // _write
#include <unistd.h>  // dup, STDOUT_FILENO

int main()
{
    // Same setup as above: enable ANSI processing on the output handle.
    HANDLE console = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD dwMode;
    GetConsoleMode(console, &dwMode);
    dwMode |= ENABLE_VIRTUAL_TERMINAL_PROCESSING;
    SetConsoleMode(console, dwMode);
    CONSOLE_SCREEN_BUFFER_INFO screen;
    GetConsoleScreenBufferInfo(console, &screen);
    short columns = screen.srWindow.Right - screen.srWindow.Left + 1;
    short rows = screen.srWindow.Bottom - screen.srWindow.Top + 1;
    int newfd = dup(STDOUT_FILENO);
    // 20 bytes per cell: a 19-byte SGR color sequence plus one space.
    std::vector<char> buffer(columns * rows * 20);
    for (int i = 0; i < columns * rows; i++)
    {
        // Copy exactly 20 bytes so the literal's terminating NUL is not
        // written one byte past the end of the last cell.
        std::memcpy(buffer.data() + i * 20, "\x1b[48;2;000;000;000m ", 20);
    }
    auto start = std::chrono::high_resolution_clock::now();
    for (int i = 0; i < 500; i++)
    {
        _write(newfd, buffer.data(), static_cast<unsigned>(buffer.size()));
    }
    auto finish = std::chrono::high_resolution_clock::now();
    std::cout << std::setprecision(20)
              << std::chrono::duration_cast<std::chrono::nanoseconds>(finish - start).count() / 1000000000.0;
    return 0;
}

5.0492200851440429688 seconds.

Is there an alternative to ANSI, or some way to make it faster?
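
One mitigation I can sketch is to emit the escape sequence only when the color actually changes between adjacent cells, so a run of identically colored cells pays the 19-byte escape cost once; it still uses ANSI, just much less of it. A rough sketch of that idea (the Cell struct is a hypothetical stand-in for a frame format, not anything I'm using verbatim):

#include <cstdio>
#include <string>
#include <vector>

// Hypothetical per-cell frame format: a background color and a character.
struct Cell { unsigned char r, g, b; char ch; };

// Build one output buffer, emitting the SGR sequence only when the
// background color differs from the previous cell's color.
std::string render(const std::vector<Cell>& cells)
{
    std::string out;
    int pr = -1, pg = -1, pb = -1; // previous color; invalid at start
    char esc[32];
    for (const Cell& c : cells)
    {
        if (c.r != pr || c.g != pg || c.b != pb)
        {
            std::snprintf(esc, sizeof(esc), "\x1b[48;2;%d;%d;%dm", c.r, c.g, c.b);
            out += esc;
            pr = c.r; pg = c.g; pb = c.b;
        }
        out += c.ch;
    }
    return out; // flush with a single _write, as in the benchmarks above
}

int main()
{
    // 80 cells of one color: one 16-byte escape + 80 characters = 96 bytes,
    // instead of 20 * 80 = 1600 bytes with an escape per cell.
    std::vector<Cell> row(80, Cell{30, 30, 30, '#'});
    std::printf("%zu bytes\n", render(row).size());
    return 0;
}

That collapses the overhead on frames dominated by large same-colored regions, but it doesn't help the worst case where every cell has a different color.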
