July 12, 2019
This post is an expansion of the previous page, which offered the basic code to get OpenGL rendering from OpenCV cameras. In this example, I’ll show you how to create a video of a rotating three-dimensional model, which is very effective for showing off your cool reconstructions in a variety of settings (as shown above). This code uses the same repository as the previous set of examples on Github – amy-tabb/OpenCV2OpenGL.
Back to Tips and Tricks Table of Contents
For this example, I altered the existing code in Example1 in the repository and added another version (1). The structure of this post is to quickly go over running the code and the parameters using the data in the repository, and then go into the details a bit. To call this version, you need the calibration information file like before (but with a few more optional parameters), an output directory, and the version number 1, like:
./OpenCV2OpenGL1 --input /home/username/git/OpenCV2OpenGL/Data2/CaliHorse.txt --output /home/username/git/OpenCV2OpenGL/Data2/HorseFolder --version 1
The calibration information file can contain the same kind of information as the version 0 example on the previous page, but with a few additional parameters. Examples of these files and their results are given in the Data2 folder in the amy-tabb/OpenCV2OpenGL repository. This is CaliHorse1.txt:
cols 640
rows 480
K
1000 0 320
0 1000 240
0 0 1
RTworld2cam
0 1 0 0
0 0 -1 0
1 0 0 5
Model
10 0 0 0
0 10 0 0
0 0 10 0
0 0 0 1
total-degrees 360
rotation-vector 1 0 0
degrees-per-step 5
first-file-number 72
write-camera 1
light-position -10 10 0.0
shininess 0.1
near 1
far 10
file /home/username/Data2/horse1.ply
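The code in the repository parses this file for you; purely as an illustration of how the rotation-related parameters fit together (the variable names below are mine, not necessarily those used in the repository), the mapping might look like:

// Illustration only: how the rotation-related entries above could map to
// loop variables, using the values from CaliHorse1.txt.
float total_degrees     = 360.0f;                 // total-degrees
float degrees_per_step  = 5.0f;                   // degrees-per-step
float rotation_axis[3]  = {1.0f, 0.0f, 0.0f};     // rotation-vector
int   first_file_number = 72;                     // first-file-number

// number of images written in this run: 360 / 5 = 72
int number_steps = int(total_degrees / degrees_per_step);

The value number_steps is the loop bound that reappears later in this post.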
To create the video above, I ran the code three times, with three different files. The files differ only in the rotation-vector and first-file-number parameters, and are all in the Data2 folder. For instance:
./OpenCV2OpenGL1 --input /home/username/git/OpenCV2OpenGL/Data2/CaliHorse.txt --output /home/username/git/OpenCV2OpenGL/Data2/HorseFolder --version 1
./OpenCV2OpenGL1 --input /home/username/git/OpenCV2OpenGL/Data2/CaliHorse1.txt --output /home/username/git/OpenCV2OpenGL/Data2/HorseFolder --version 1
./OpenCV2OpenGL1 --input /home/username/git/OpenCV2OpenGL/Data2/CaliHorse2.txt --output /home/username/git/OpenCV2OpenGL/Data2/HorseFolder --version 1
You’ll notice that the output directory is the same – the result is 216 image files of the horse: \(360\) degrees of rotation about each of three axes, with \(5\) degrees between consecutive images about the same axis.
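Concretely, each run produces \(360 / 5 = 72\) images, and \(3 \times 72 = 216\). Since all three runs write to the same directory, staggering first-file-number across the three calibration files (CaliHorse1.txt starts numbering at \(72\)) keeps one run from overwriting another’s files.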
To create a video, my choice is ffmpeg. From within the HorseFolder, the command is:
ffmpeg -r 30 -i %d.png -c:v libx264 -pix_fmt yuv420p horse30.mp4
for a .mp4 video at \(30\) frames per second. Creating a gif from the .mp4 is straightforward:
ffmpeg -i horse30.mp4 horse30.gif
ffmpeg has lots of options; you can tell by the massive numbers of StackOverflow questions and answers on the topic.
Recall that in the first version of this code, an OpenGL window is shown, and you can’t really do anything with your computer until you hit escape. That was because there was a while loop that held the OpenGL window open based on the condition below:
while (!glfwWindowShouldClose(window)){
In this version, the number of images to acquire is computed from the user-supplied parameters, and this value is number_steps. The loop below holds the OpenGL window open until it finishes. If you experiment, I encourage starting with only 1 or 2 iterations and double-checking the loop bounds. If you get stuck in an infinite loop, you’ll have to reboot or quit the windowing system!
//while (!glfwWindowShouldClose(window))
for (int step_counter = 0; step_counter < number_steps; step_counter++){
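Each pass through this loop renders the model at the current rotation and writes a numbered image to the output directory – that numbering is what the %d.png pattern in the ffmpeg command above relies on. The repository code handles the writing already; purely as a sketch of the idea (assuming variables cols, rows, step_counter, a starting file number, and an output path, with OpenCV and an OpenGL header available), the per-frame readback could look something like:

// Sketch only; assumes cols, rows, first_file_number, step_counter, and
// output_dir (a std::string) are defined elsewhere – the repository's actual
// variable names may differ. Requires <opencv2/opencv.hpp> and <string>.
cv::Mat frame(rows, cols, CV_8UC3);
glPixelStorei(GL_PACK_ALIGNMENT, 1);    // rows are tightly packed, no 4-byte padding
glReadPixels(0, 0, cols, rows, GL_BGR, GL_UNSIGNED_BYTE, frame.data);
cv::flip(frame, frame, 0);              // OpenGL's origin is bottom-left; image files are top-left
cv::imwrite(output_dir + "/" + std::to_string(first_file_number + step_counter) + ".png", frame);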
Within the loop described in the previous section, the rotation about an axis is achieved as follows:
current_angle_rad = float(step_counter)*degrees_per_step*0.0174533;
modelR = glm::rotate(modelR, current_angle_rad, glm::vec3(rotation_vector(0), rotation_vector(1), rotation_vector(2)));
model = modelR*model;
model is the modelview matrix from the previous page – in other words, the transformation applied to the object.
current_angle_rad is the angle of rotation in radians.
rotation_vector is the axis of rotation supplied by the user in the calibration information file.
modelR is the rotation matrix, which the glm library will compute for you.
Finally, we create a new modelview matrix by applying the current modelR to the transformation applied to the object at every step. In the case of the horse model, this is convenient, because I scale it using the model matrix.
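Incidentally, the constant 0.0174533 is just \(\pi / 180\). glm also provides a degrees-to-radians conversion, so an equivalent way to write the angle computation (an alternative, not what the repository uses) is:

current_angle_rad = glm::radians(float(step_counter)*degrees_per_step);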
I’m sure you can see there are many more variations you can add to this demo to get interesting visualizations.