Accessing GL::Texture raw data on the CPU #624
-
Hi there, I am rendering a SceneGraph::Camera and then displaying the output frame using ImGui. So far so good:

```cpp
ImGuiIntegration::image(camera_controller->outputFrame(), frame_size);
```

The above works fine and displays my camera output. Now I am trying to get the raw pixel data from the GL::Texture2D returned by outputFrame(), but I'm not having any luck:

```cpp
PixelFormat format = Magnum::PixelFormat::RGBA8UI;
// round each row up to a multiple of 4 bytes to match GL's default pack alignment
std::size_t row_stride =
    4*((frame_size.x()*pixelFormatSize(format) + 3)/4);
Image2D image{format, frame_size,
    Containers::Array<char>{ValueInit, std::size_t(frame_size.y()*row_stride)}};
camera_controller->outputFrame().image(0, image);
// every value of image.data() is equal to 0
```

Should image.data() now contain my raw pixel data? Am I missing something about transferring the data from a texture into CPU memory? My goal is to get the raw data, move it into an OpenCV mat, run some analysis and draw onto it, then move it back into a GL::Texture2D. Any tips on moving data out of the texture and back in would be greatly appreciated. Thanks.
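(Side note for anyone debugging a similar all-zeros result: one quick check is to query the GL error state right after the download. A minimal sketch -- GL::Renderer::error() is standard Magnum API, the surrounding context is assumed:)

```cpp
#include <Corrade/Utility/Debug.h>
#include <Magnum/GL/Renderer.h>

using namespace Magnum;

// Query the GL error state right after the texture download; a pixel
// format mismatch typically shows up as Error::InvalidOperation
const GL::Renderer::Error err = GL::Renderer::error();
if(err != GL::Renderer::Error::NoError)
    Debug{} << "GL error after image():" << err;
```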
Replies: 2 comments
-
Ok I feel silly now! Just needed to walk away from the PC and look again after a break :) My pixel format was normalised floats, not unsigned ints.

Anyway! For anyone looking to convert a Magnum GL::Texture2D into an OpenCV cv::Mat and back... here's how I ended up doing it:

```cpp
#include <algorithm>
#include <cstdint>

#include <opencv2/core.hpp>

#include <Corrade/Containers/Array.h>
#include <Magnum/Image.h>
#include <Magnum/PixelFormat.h>
#include <Magnum/GL/Buffer.h>
#include <Magnum/GL/BufferImage.h>
#include <Magnum/GL/PixelFormat.h>
#include <Magnum/GL/Texture.h>

using namespace Magnum;
using namespace Corrade;

// Grab the image on the CPU -- the texture is RGBA8Unorm, not RGBA8UI
Image2D image = _input_texture->image(0, Image2D{PixelFormat::RGBA8Unorm});
// and construct an opencv type from it (wraps the pixels, no copy)
cv::Mat mat(_frame_size.y(), _frame_size.x(), CV_8UC4, image.data());
// run any opencv analysis...
// opencv drawing... this sets all pixels to red
mat.setTo(cv::Scalar(255, 0, 0, 255));
// move the resulting cv::Mat into an appropriate container
auto element_count = mat.total()*mat.channels();
Containers::Array<uint8_t> buffer_data{NoInit, element_count};
std::copy(mat.datastart, mat.dataend, buffer_data.data());
GL::BufferImage2D buffer{
    GL::PixelFormat::RGBA, GL::PixelType::UnsignedByte,
    _frame_size, buffer_data, GL::BufferUsage::StaticDraw};
// and finally into the GL texture
_output_texture->setSubImage(0, {}, buffer);
```
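As a side note, the intermediate copy into buffer_data could likely be skipped: an ImageView2D (from Magnum/ImageView.h) can wrap the OpenCV memory directly. A minimal sketch reusing the names above, assuming mat.isContinuous() holds and mat stays alive until the upload finishes:

```cpp
// Wrap the cv::Mat pixels in a view and upload straight from CPU memory,
// skipping the Containers::Array copy and the GL::BufferImage2D
ImageView2D view{PixelFormat::RGBA8Unorm, _frame_size,
    Containers::ArrayView<const void>{mat.data, mat.total()*mat.elemSize()}};
_output_texture->setSubImage(0, {}, view);
```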
-
Just to add a tip here -- the texture format mismatch would have generated a GL error. Assuming you use the builtin Application classes, you can see GL errors printed to the standard output if you run the application with `--magnum-gpu-validation on`.
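If the application isn't started through the builtin classes, roughly the same can be enabled programmatically. A sketch, assuming the context supports KHR_debug (desktop GL 4.3+):

```cpp
#include <Magnum/GL/DebugOutput.h>
#include <Magnum/GL/Renderer.h>

using namespace Magnum;

// Route GL debug messages, including errors, to the Magnum log
GL::Renderer::enable(GL::Renderer::Feature::DebugOutput);
GL::Renderer::enable(GL::Renderer::Feature::DebugOutputSynchronous);
GL::DebugOutput::setDefaultCallback();
```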