
Computer Graphics
COMP3421/9415 2021 Term 3 Lecture 3

What did we learn last week?
Graphics in a Nutshell
● History of Modern Computer Graphics
● What’s in the Course
● Graphics Hardware (monitors and graphics cards)
● Polygon Rendering overview
● Course coding platform

What are we covering today?
2D Graphics
● Continuing our learning about Polygon Rendering in 2D
● The OpenGL Pipeline
● Colouring shapes with shaders
● Textures

The OpenGL Pipeline

Going from Data to Pixels
Last week, we looked at the Polygon Rendering Process . . . Today, we go into more detail!
Image credit: learnopengl.com

A step by step process
A breakdown of the OpenGL Pipeline
1. Vertex Data is passed to OpenGL
2. Vertex Shader
3. Shape Assembly
4. Geometry Shader (not covered in this course)
5. Rasterization
6. Fragment Shader
7. Tests and Blending (we’ll look at this in later lectures)

Before the OpenGL Pipeline
What are our shapes?
● In our CPU code
● We will build up information first (like a vertex vector)
● . . . then pass it to OpenGL
● Each vertex can have a position vector (x, y, z coordinates)
● Also colours! (Red, Green, Blue)
● And more . . .
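As a rough sketch (not the course’s exact code), the CPU-side vertex data can be as simple as a flat C++ vector of floats, with each vertex contributing its position and colour in turn:

    #include <vector>

    // One triangle: each vertex is x, y, z followed by r, g, b (interleaved).
    // OpenGL doesn't know this layout yet; we describe it later with attributes.
    std::vector<float> vertices = {
        //   x      y     z     r     g     b
        -0.5f, -0.5f, 0.0f, 1.0f, 0.0f, 0.0f,   // bottom-left, red
         0.5f, -0.5f, 0.0f, 0.0f, 1.0f, 0.0f,   // bottom-right, green
         0.0f,  0.5f, 0.0f, 0.0f, 0.0f, 1.0f,   // top, blue
    };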

How does OpenGL receive our data?
Buffers and Arrays
● We give information as a big collection of vertices
○ This is very similar to an array in memory
● But we tend to dump it all in at once!
● How do we organise it into separate vertices?
● How much data is in one vertex? It varies!
● Vertex Buffer Object – can store many vertices
● Vertex Attributes – split up a single vertex into different information
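A minimal sketch of handing that data to OpenGL, assuming an OpenGL context and loader (e.g. glad) are already set up and the vertices vector from the earlier sketch is in scope:

    GLuint vbo;
    glGenBuffers(1, &vbo);                        // ask OpenGL for one buffer name
    glBindBuffer(GL_ARRAY_BUFFER, vbo);           // make it the current array buffer
    glBufferData(GL_ARRAY_BUFFER,                 // dump the whole vector in at once
                 vertices.size() * sizeof(float),
                 vertices.data(),
                 GL_STATIC_DRAW);                 // hint: we won't change this data often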

Vertex Attributes
Each Vertex takes up a certain amount of memory
Image credit: learnopengl.com
● Attributes are things like coordinates, colours and other information
● Each attribute is somewhere in the vertex’s memory
● We can tell OpenGL how big a vertex is and where in each vertex’s memory each attribute is
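For example, with the interleaved layout sketched earlier (3 position floats then 3 colour floats per vertex), telling OpenGL where the position attribute lives might look like this; the stride is the size of one whole vertex, and the last argument is the attribute’s offset inside it:

    // Attribute 0: position, 3 floats, starting at byte 0 of each vertex.
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE,
                          6 * sizeof(float),      // stride: one vertex = 6 floats
                          (void*)0);              // offset: position starts at byte 0
    glEnableVertexAttribArray(0);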

Vertex Array Object
We end up with a group of Vertex Attribute Pointers
● These allow us to reach each attribute in a vertex
● We’re also going to want to treat all the vertices in a buffer the same
● We end up with a Vertex Array Object which can be applied to every vertex in a particular Vertex Buffer Object
Image credit: learnopengl.com
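Putting the pieces together, a sketch of the usual setup order (reusing the vbo and layout from the earlier sketches): the VAO is bound first, so the attribute setup that follows is remembered by it and can be replayed with a single bind when drawing.

    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);                       // start recording vertex state

    glBindBuffer(GL_ARRAY_BUFFER, vbo);           // the buffer these attributes read from
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 6 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);

    glBindVertexArray(0);                         // done; bind vao again at draw time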

The Vertex Shader
Giving Shape Information to the Graphics Card
● The Vertex Shader works on one vertex at a time
● Each vertex will end up with a position (xyz coordinates)
● These might be different from what we provided (we’ll learn more about this later)
● Some processing of colour information will happen
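A minimal GLSL vertex shader along these lines (the attribute location is an assumption matching the earlier layout sketches, not the course’s exact shader):

    #version 330 core
    layout (location = 0) in vec3 aPos;   // position attribute from the VBO

    void main()
    {
        // Pass the position straight through for now; later lectures add the
        // transformations that make the output differ from the input.
        gl_Position = vec4(aPos, 1.0);
    }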

Shape Assembly
We never explicitly code edges between vertices
● Edges don’t exist, only vertices
● But how we connect them together is very important!
● OpenGL will take our list of vertices and convert it into triangles
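For instance, with the three-vertex buffer and VAO sketched earlier, a draw call like the one below asks OpenGL to assemble every 3 consecutive vertices into one triangle (the counts are illustrative):

    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);   // 3 vertices -> 1 triangle; 6 would give 2, etc.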

A Vector of Vertices
Is it enough to give a big list of vertices?
● Can you make shapes if all you have is a list of vertices?
● Technically yes?
● Is this a good idea?
● Let’s look at a simple example . . .

A Rectangle
I want to make a simple object
● Give a list of vertices to OpenGL so that it makes two triangles that form a rectangle
● {A,B,D,D,B,C}
● This works . . . we get two triangles
● But why do we have 6 vertices when there are obviously only four corners?
● This is wasting memory in our VBO
(Diagram: a rectangle with corners labelled A, B, C and D)
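Spelled out as data (positions only, with made-up coordinates purely for illustration), the duplication is easy to see: B and D each appear twice.

    // Two triangles, 6 vertices, even though the rectangle has only 4 corners.
    std::vector<float> rect = {
        // triangle 1: A, B, D
        -0.5f,  0.5f, 0.0f,   // A
        -0.5f, -0.5f, 0.0f,   // B
         0.5f,  0.5f, 0.0f,   // D
        // triangle 2: D, B, C
         0.5f,  0.5f, 0.0f,   // D again
        -0.5f, -0.5f, 0.0f,   // B again
         0.5f, -0.5f, 0.0f,   // C
    };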

Element Buffer Objects
Let’s reuse vertices instead of copying them
● An array of vertices: {A,B,C,D}
● A triangle is an array of three indices into this array
● Our two triangles: {0,1,3,3,1,2}
● This array is an Element Buffer Object
● Significant reduction in the number of vertices needed
● Allows shared vertices to only exist once
● The element buffer of ints is much cheaper than an array of vertices
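A sketch of the same rectangle done with an Element Buffer Object (same made-up coordinates as before, with the corner data uploaded to the VBO as usual and the VAO still bound): four unique vertices plus six small integer indices, and glDrawElements instead of glDrawArrays.

    std::vector<float> corners = {
        -0.5f,  0.5f, 0.0f,   // 0: A
        -0.5f, -0.5f, 0.0f,   // 1: B
         0.5f, -0.5f, 0.0f,   // 2: C
         0.5f,  0.5f, 0.0f,   // 3: D
    };
    std::vector<unsigned int> indices = { 0, 1, 3,   3, 1, 2 };   // triangles ABD and DBC

    GLuint ebo;
    glGenBuffers(1, &ebo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);            // remembered by the bound VAO
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices.size() * sizeof(unsigned int),
                 indices.data(), GL_STATIC_DRAW);

    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, nullptr);   // draw using the indices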

Rasterization
Conversion into grids of pixels
● Taking shapes built from vertices
● Turning them into fragments, which correspond to pixels on the screen
● But they have more information, like knowing which vertices make up their shape (nearly always a triangle)
Image credit: Nvidia

Fragment Shader
A fragment is the information necessary to create a pixel
● Calculates the final colour of a pixel
● Knows about vertex data in the shape
● But will also know things like lights in a scene (we’ll be spending weeks on this later!)
● This information all gets written to the Frame Buffer containing colours
● The Frame Buffer is like a 1:1 mapping to the pixels in the monitor
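A minimal GLSL fragment shader sketch: it writes one RGBA colour for its fragment, and that value is what ends up in the frame buffer.

    #version 330 core
    out vec4 FragColor;   // the colour written to the frame buffer for this fragment

    void main()
    {
        // Hard-coded orange for now; real shaders combine vertex data,
        // textures and lighting to decide this value.
        FragColor = vec4(1.0, 0.5, 0.2, 1.0);
    }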

Break Time
Assignment 1 has been released!
● Yes, it’s a test to see whether you’ve done all the tutorials 😛
● Also a chance to stretch your creativity with the techniques we’ve taught
● Due on the 1st October at 5pm

Colouring Shapes with Shaders

How do we decide the colour of a pixel?
We’re using our Shaders!
● Vertices can have a colour (a vector of floats using RGBA)
● Red, Green, Blue, Alpha (transparency, which we’re not using yet)
● Vertex Shaders can specify a colour output
● Fragment Shaders can take that input and use it
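As a sketch (names like vColour are illustrative, not the course’s exact code): the vertex shader reads the colour attribute and declares it as an output, and the fragment shader declares a matching input and uses it.

    // Vertex shader
    #version 330 core
    layout (location = 0) in vec3 aPos;
    layout (location = 1) in vec3 aColour;
    out vec3 vColour;                     // handed on to the fragment shader
    void main()
    {
        gl_Position = vec4(aPos, 1.0);
        vColour = aColour;
    }

    // Fragment shader
    #version 330 core
    in vec3 vColour;                      // interpolated across the triangle (see below)
    out vec4 FragColor;
    void main()
    {
        FragColor = vec4(vColour, 1.0);   // alpha fixed at 1 for now
    }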

Colour Attributes in Vertices
We’re adding information to Vertices
● This means each vertex needs attribute pointers
● One pointing to the 3-float position vector
● Another pointing to the 3-float set of RGB values for colour
Image credit: learnopengl.com
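Concretely, with 6 floats per vertex (3 for position, 3 for colour), the two attribute pointers might be set up as below; locations 0 and 1 are assumptions that match the shader sketches above.

    // Attribute 0: position, starts at byte 0 of each vertex.
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE,
                          6 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);

    // Attribute 1: colour, starts 3 floats into each vertex.
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE,
                          6 * sizeof(float), (void*)(3 * sizeof(float)));
    glEnableVertexAttribArray(1);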

Fragment Interpolation
Fragment Shaders and their tricks
● Each fragment exists somewhere between vertices
● Instead of just taking the colour from one of the vertices
● The fragment shader will interpolate values from all the vertices based on its position in the shape
Image credit: (using course example code)

Textures

Textures are Images!
Games before Polygon Rendering were often “sprite” based
● Sprites are images that can be moved around the screen
● It’s like putting an image on a rectangle in our rendering
Image credit: Nintendo
Image credit: Capcom (edited by Marc)

Textures on Surfaces in 3D
3D Objects can have images wrapped around them
● Shows surface detail that doesn’t need extra vertices or triangles
● We can show details like faces, or surfaces like grass or brick walls
● Having lots of vertices and triangles is expensive (computationally)
● Textures can be included in the render pipeline!
Images credit: id Software

Textures on Triangles
Starting with the basics
● We can provide OpenGL with a Texture (image file)
● We then “map” the vertices in our shape to coordinates in the image
● The fragment shader can interpolate each fragment’s position
● The colour from the texture is “sampled” to give the pixel its colour
● More on this next lecture . . .
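As a preview only (a sketch, not next lecture’s exact code): texture coordinates arrive in the fragment shader as another interpolated input, and the shader samples the bound texture at that point.

    #version 330 core
    in vec2 vTexCoord;        // interpolated texture coordinate for this fragment
    out vec4 FragColor;
    uniform sampler2D tex;    // the texture (image) bound by the CPU-side code

    void main()
    {
        FragColor = texture(tex, vTexCoord);   // sample the image at this point
    }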

What did we learn today?
Details on Rendering
● The OpenGL Pipeline (a first look)
● Some details on code constructs
○ Vertex Buffer Objects (VBO), Vertex Array Objects (VAO), Element Buffer Objects (EBO)
● Shaders in the pipeline
● An intro to Textures
