
UCSD CSE 167 Assignment 3:

3D OpenGL Rendering

Figure 1: We will develop an interactive interface for inspecting 3D models in this homework.

As you can probably tell from the previous homeworks, rendering requires computing interactions between millions of pixels and billions of triangles. This leads to significant performance challenges, especially when we want to interact with the content in real time. To make things really fast, pioneers in computer graphics came up with the solution of using domain-specific hardware to speed up rendering. Instead of using a general-purpose computer to compute everything, we build chips that specialize in rendering. These processors are called Graphics Processing Units (GPUs). The idea of GPUs can be traced back more than 40 years: the first GPU, the Geometry Engine, was developed by Jim Clark and Marc Hannah in 1981. Jim Clark formed the company Silicon Graphics Inc. (SGI) in the same year, and SGI became one of the most important computer graphics companies in history. Nowadays, GPUs have proven general enough to handle a very wide range of computations, including deep learning and many scientific computing tasks, and they are indispensable to society. The GPU is one of the most successful examples of domain-specific hardware.

In this homework, we will write code to render things using the GPU on your computer. To command the GPU, we need to send commands to it through some sort of “Application Programming Interface” (API). These interfaces are collectively decided by the GPU companies and some other organizations, and each piece of hardware comes with “drivers” that actually implement these interfaces using the underlying hardware instructions. The most popular APIs are OpenGL, DirectX, Metal, Vulkan, and WebGPU. Among these, DirectX is Windows-only, Metal is macOS-only, WebGPU is only for browsers, and Vulkan is extremely low-level and very verbose because it provides fine-grained control (it takes literally a thousand lines to render a single triangle in Vulkan). Therefore, we will use OpenGL in this homework: even though DirectX, Metal, and Vulkan are more up to date (the latest version of OpenGL was released 6 years ago), OpenGL is still used in practice and supported by all major GPUs and OSes, and it is significantly easier to learn than the other, lower-level APIs. Just like with programming languages, it'll be a lot easier to learn other APIs once you've learned OpenGL.

In this homework, we will mostly follow an online tutorial, learnopengl.com, because they likely write significantly better tutorials than I do. We will implement what we did in the previous homework in OpenGL and hopefully see a significant speedup. We will also create a Graphical User Interface (GUI) and enable real-time interaction.

This homework is also more “open-ended” than the previous ones. We do not ask you to produce exactly the same output as we do. At this point, you should be familiar with the theory of rasterization; we're just wrangling with the hardware interface, so allowing a bit of creativity seems reasonable.


1 Creating a window (10 pts)

Our first task, instead of rendering a single triangle, is to create a window! Read the chapters of OpenGL,

Creating a window, and Hello Window in learnopengl.com to see how to create a window with OpenGL

context using GLFW. Pick your favorite background color. We have included GLFW and glad in balboa,

so you shouldn’t have to download them. We’re using OpenGL 3.3, but feel free to use the version you like.

Implement your code in hw_3_1 in hw3.cpp. Test it using

./balboa -hw 3_1

Once you are done, take a screenshot of the window you created and save it as outputs/hw_3_1.png.

2 Rendering a single 2D triangle (20 pts)

Yeah, it’s that time again! Read the Hello Triangle chapter and render a single triangle with constant color

(pick one that you like the most). Make sure you’ve become familiar with the ideas of shaders, VAO, VBO,

and EBO. Just to make things slightly different so that we are not just copying and pasting code, let the triangle

rotate in the image plane over time (it can be clockwise or counterclockwise, your choice). For the rotation,

you can do it whichever way you want, but I recommend you do it in the vertex shader. Read the Shaders

chapter and understand how to pass in a uniform variable, then you can use the uniform variable as the

rotation angle.

float vs. double. By default, balboa uses double-precision floats through the Real type, while GLSL defaults to single-precision floats. Be careful of this discrepancy. You can use Vector3f/Matrix3x3f to switch to float in balboa. Also feel free to use the glm library, which is used in the tutorial.

Implement your code in hw_3_2 in hw3.cpp. Test it using

./balboa -hw 3_2

This time, do a screen recording of your rotating triangle and save it as outputs/hw_3_2.mp4 (or whatever

encoding you are using).

3 Rendering 3D triangle meshes with transformations (35 pts)

Next, we'll use OpenGL to render the type of scenes we handled in the previous homework. Read the Transformations, Coordinate Systems, and Camera chapters; that should give you enough knowledge to render JSON scenes like the ones in the previous homeworks.

This part is a big jump from the previous parts. I would recommend doing things incrementally. E.g.,

handle two 2D triangles first, add projection matrix, add view matrix, add model matrix, handle multiple

triangle meshes, and finally add camera interaction.

Below are some notes and tips:

Clip space. In Homework 2, our projection matrix converted from camera space directly to screen space.

In OpenGL, the hardware expects the projection to convert from camera space to the clip space, which by

default ranges from −1 to 1 for x, y, and z axes. Everything outside of the clip space is clipped. Note that

the clipping happens at the far side of z as well – we use the z_far parameter in the camera in our JSON

scene to specify this. The difference in spaces means that we need to use a different projection matrix:

\[
P = \begin{pmatrix}
\frac{1}{as} & 0 & 0 & 0 \\
0 & \frac{1}{s} & 0 & 0 \\
0 & 0 & -\frac{z_{\mathrm{far}}}{z_{\mathrm{far}} - z_{\mathrm{near}}} & -\frac{z_{\mathrm{far}} z_{\mathrm{near}}}{z_{\mathrm{far}} - z_{\mathrm{near}}} \\
0 & 0 & -1 & 0
\end{pmatrix}, \qquad (1)
\]


where s is the scaling/film size parameter as before, and a is the aspect ratio. The first row and the second

row scale the x and y clipping plane to [−1, 1] respectively. The third row compresses z values from −znear

to −zfar to [−1, 1]. The fourth row is the perspective projection using homogeneous coordinates.

Depth test. By default, OpenGL does not reject triangles when they are occluded. Remember to turn

on depth testing using glEnable(GL_DEPTH_TEST) and clear the Z buffer (e.g., glClear(GL_COLOR_BUFFER_BIT

| GL_DEPTH_BUFFER_BIT)).

Vertex colors. In contrast to the learnopengl tutorial, balboa stores the vertex colors in a separate array. Therefore it's likely more convenient to create two VBOs:

unsigned int VBO_vertex;
glGenBuffers(1, &VBO_vertex);
glBindBuffer(GL_ARRAY_BUFFER, VBO_vertex);
glBufferData(GL_ARRAY_BUFFER, ...);
glVertexAttribPointer(0 /* layout index */,
    3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);

unsigned int VBO_color;
glGenBuffers(1, &VBO_color);
glBindBuffer(GL_ARRAY_BUFFER, VBO_color);
glBufferData(GL_ARRAY_BUFFER, ...);
glVertexAttribPointer(1 /* layout index */,
    3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(1);

You only need one VAO per mesh regardless.

Multiple meshes. To handle multiple meshes in a scene, create a VAO for each mesh.

Window resizing. We don’t require you to handle window resizing in this homework. It’s annoying

because you’ll need to regenerate the projection matrix every time the aspect ratio changes.

Gamma correction. When we save an image in balboa, we perform gamma correction by raising values to the power of 1/2.2. OpenGL does not do this by default. To enable gamma correction, use glEnable(GL_FRAMEBUFFER_SRGB). Read the Gamma Correction chapter in learnopengl.com to learn more.

Camera interaction. Like the tutorial, you should also implement a simple camera interaction scheme; see the Camera chapter. A simple WASD-style translation suffices. To obtain the camera direction and right vectors, you can look at the columns of the cam_to_world matrix.

As a bonus (15 pts), add camera rotation based on mouse input like the tutorial. Note that the rotation

in the tutorial assumes a particular camera frame and would not work for our case. I recommend doing the

following: 1) store yaw and pitch angles and the original cam_to_world matrix from the scene. 2) update the

yaw and pitch based on the mouse movement offsets like in the tutorial. 3) form a rotation matrix R based

on yaw and pitch, then form a new cam_to_world matrix by multiplying the original cam_to_world matrix

with R. (Don’t overwrite the original cam_to_world matrix!)

For rotation, it might be tempting to keep only one cam_to_world matrix and keep multiplying it by new rotation matrices. However, this will produce unintuitive behavior (try it!), since yaw and pitch rotations are not commutative: applying yaw first and then pitch produces a different result than applying pitch first and then yaw. As a result, when you chain together many pitch and yaw rotations, they will not represent the desired rotation. Yes, rotation is weird. This is why you should explicitly store the yaw and pitch angles and modify those instead.


Passing parameters in callback functions. If you dislike global variables as much as I do, you will like the functions glfwSetWindowUserPointer and glfwGetWindowUserPointer. You use them like this:

void mouse_callback(GLFWwindow* window, double xpos, double ypos) {
    StructIWanttoPasstoCallback *data_ptr =
        static_cast<StructIWanttoPasstoCallback*>(
            glfwGetWindowUserPointer(window));
}

GLFWwindow* window = glfwCreateWindow(width, height, "Balboa", NULL, NULL);
StructIWanttoPasstoCallback data = ...;
glfwSetWindowUserPointer(window, &data);
glfwSetCursorPosCallback(window, mouse_callback);

Note that glfwGetWindowUserPointer returns a void*, so the cast is needed in C++.

Debugging. Debugging programs that use OpenGL (or any other graphics API) is painful: if you do one thing wrong,

you’ll likely get a black screen. The learnopengl tutorial provides useful tips for debugging. To debug shaders,

it’s particularly useful to use a debugger such as renderdoc. Unfortunately, none of the existing OpenGL

debuggers work on MacOS anymore (Apple makes it extremely hard to develop OpenGL on MacOS because

they want people to use Metal). For MacOS users, a potential debugging strategy is to emulate the shader

on CPU: write the same code on CPU and print out the values, and see if it does what you expect. It’s going

to be painful regardless, I’m sorry. On the other hand, this is a fruitful research area that awaits innovation

to make things better!

For the 3D transformation, copy your Homework 2 code to the parse_transformation function in hw3_scenes.cpp.

Implement the rest in hw_3_3 in hw3.cpp.

Test your OpenGL rendering using the following commands:

./balboa -hw 3_3 ../scenes/hw3/two_shapes.json

./balboa -hw 3_3 ../scenes/hw3/cube.json

./balboa -hw 3_3 ../scenes/hw3/spheres.json

./balboa -hw 3_3 ../scenes/hw3/teapot.json

./balboa -hw 3_3 ../scenes/hw3/bunny.json

./balboa -hw 3_3 ../scenes/hw3/buddha.json

For two_shapes and cube, you should get the same images as in the previous homework (before you move the camera yourself). The rest are new scenes. (teapot.json is a higher-resolution version with 10 times more triangles!) Record a video of yourself moving the camera in each scene and save the videos as:

outputs/hw_3_3_two_shapes.mp4

outputs/hw_3_3_cube.mp4

outputs/hw_3_3_spheres.mp4

outputs/hw_3_3_teapot.mp4

outputs/hw_3_3_bunny.mp4

outputs/hw_3_3_buddha.mp4

Acknowledgement. The bunny model was scanned by Greg Turk and Marc Levoy back in 1994 at Stanford, so it is sometimes called the Stanford bunny. The texture of the bunny model was made by KickAir_8p, who posted the scene on blenderartists.org. The buddha texture was generated by Kun Zhou et al. for their TextureMontage paper.

Bonus: textures (15 pts). Read the Textures chapter of learnopengl.com and implement textures for

the shapes above. We have provided the UV maps for the models except two_shapes and cube. I have also

included the original textures I used to produce the vertex colors for teapot, bunny, and buddha.


4 Lighting (25 pts)

For this part, read the Colors and Basic Lighting chapters in the tutorial, and implement some basic lighting in our viewer. Be careful about the transformation of the normals! Use the vertex colors or texture colors as the equivalent of objectColor in the tutorial. Let's assume ambientStrength = 0.1, specularStrength = 0.5, and that lightDir is normalize(vec3(1, 1, 1)). Note that you can extract the camera position from the fourth column of cam_to_world.

The way the tutorial does the lighting requires defining vertex normals (an alternative is to use face

normals, but it often looks uglier). We have provided vertex normals for the following scenes:

./balboa -hw 3_4 ../scenes/hw3/spheres.json

./balboa -hw 3_4 ../scenes/hw3/teapot.json

./balboa -hw 3_4 ../scenes/hw3/bunny.json

./balboa -hw 3_4 ../scenes/hw3/buddha.json

Save your output as screenshots:

outputs/hw_3_4_spheres.png

outputs/hw_3_4_teapot.png

outputs/hw_3_4_bunny.png

outputs/hw_3_4_buddha.png

Bonus: lighting animation (10 pts). Add some animation to the light. Make it move the way you like,

and submit a video recording of the animation.

Bonus: different types of lights (10 pts). Our current light is a directional light. Implement point lights and spot lights (see the Light Casters chapter) in your renderer, and support multiple lights.

Bonus: shadow mapping (20 pts). Implement a basic shadow map; see the Shadow Mapping chapter in learnopengl. Supporting directional lights is good enough.

5 Design your own scenes (10 pts)

We’re at the fun part again. Design your own scene and render it using your new renderer!
