Another rendering update
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
I've seen Notch code before in a livestream. He's actually an amazing coder. I've seen him write entire engines from scratch for indie competitions without planning it out beforehand or looking up any of the algorithms he uses, and above all, he's fast. I've never seen anyone code that quickly and confidently for an extended period of time. And then the stuff he writes controls well and is actually fun to play. Once we start the content phase, you'll probably discover just how difficult that is to pull off. Notch gets full points in my book for coding skill.
What he's not, though, is a scientist. He likes to make new stuff, but doesn't like going back and redoing stuff he's already done unless it becomes necessary. Plus, his job is to make the game fun, and add fun stuff the average player is going to appreciate, not publish a paper on voxel rendering optimizations in a coarse-scale environment. My job, on the other hand, is to take what already exists and make it perform as efficiently as possible, and I've had over a year of planning and going over research papers and blog posts and Stack Exchange to get to where I'm at.
For comparison, at this point in the original timeline, Minecraft Alpha would have already been released, along with an early build of SMP.
*edit*
And we're coming up on the Golden Age of Seekret Friday Updates, in which major new features like minetracks and redstone were getting added on a weekly basis.
*edit*
Wait, the Friday updates started before SMP, not after. We're almost done with the Golden Age of Seekret Friday Updates.
Re: Another rendering update
I am becoming increasingly annoyed with Jeb as time goes on.
Tiel wrote:That's a nice way of putting it.
Last_Jedi_Standing wrote:Notch is a brilliant game designer, but not a great coder, which is why Minecraft runs so slowly. Jeb is a great coder but an atrocious game designer, which is why every Minecraft update for the past year has sucked more than the previous one.
hyperlite wrote:Well, I know Notch is credited with coding, but I always hear he is not that good at coding, and Minecraft is very inefficient and slow for the type of game.
He should be more credited with the idea, not the code itself.
;.'.;'::.;:".":;",,;':",;
(Kzinti script, as best as can be displayed in Human characters, translated roughly as "For the Patriarchy!")
-
- Vice Admiral
- Posts:2312
- Joined:Sun Dec 09, 2012 10:21 pm
- Affiliation:Strigiforme
- IGN:ACH0225
- Location:Cuuyth
Re: Another rendering update
I know Notch is busy with his new game, but he works in the same room as Jeb. Can't he just stick his head over, clack his beak in dissatisfaction, and say, "That idea doesn't flow well. Add this instead."
mfw brony images
fr0stbyte124 wrote:5 months from now, I will publish a paper on an efficient method for rendering millions of owls to a screen.
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
Re: Another rendering update
Oh, I should add one thing to the explanation of lighting. Lights don't have to be shadowed to be dynamic. In fact, it's only when they are not that deferred lighting really makes a difference. So if you don't mind your spotlights not casting shadows, or you have a special way of doing them outside of the normal pipeline, you could toss a hundred of them in the scene. Still probably wouldn't work for us, because we won't be able to tell what needs to cast shadows and what doesn't.
*edit*
Though it occurs to me that it could work as a modifier of sorts. Stick a colored point light on top of a regular torch, and then calculate the color from the ambient light with the magnitude of the regular light. If you had multiple colored lights, it wouldn't know how much each light was actually contributing, but it could make a decent approximation, and more importantly, it would be fast and lightweight.
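The modifier idea can be sketched in a few lines. This is a hypothetical illustration, not engine code: the function name, the linear falloff, and using the 0..15 vanilla block-light value as the brightness source are all assumptions made for the sketch.

```python
import math

def tint_from_colored_lights(vanilla_level, lights, pos, radius=7.0):
    """Blend nearby colored point lights into a tint, then scale it by
    the ordinary (vanilla) light magnitude at this spot.

    vanilla_level: 0..15 vanilla block-light value (drives brightness).
    lights: list of ((x, y, z), (r, g, b)) colored point lights.
    The per-light weights are only an approximation of each light's
    true contribution, as noted in the post above.
    """
    intensity = vanilla_level / 15.0
    weighted = []
    for lpos, rgb in lights:
        d = math.dist(pos, lpos)
        w = max(0.0, 1.0 - d / radius)  # linear falloff (an assumption)
        if w > 0.0:
            weighted.append((w, rgb))
    total = sum(w for w, _ in weighted)
    if total == 0.0:
        return (intensity,) * 3          # no colored light: neutral white
    blended = [0.0, 0.0, 0.0]
    for w, rgb in weighted:
        for i in range(3):
            blended[i] += (w / total) * rgb[i]
    return tuple(c * intensity for c in blended)
```

The point of the design is in the last line: the colored lights never change how bright anything is, only what color the existing brightness takes, which is what keeps it fast and lightweight.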
-
- Designer
- Posts:397
- Joined:Fri Dec 07, 2012 11:59 pm
- Affiliation:Alteran
- Location:In the Holy Citadel of Altera
Re: Another rendering update
You've said some pretty confusing things in your time, but that one tops them all. I don't know what the object of your post is...
fr0stbyte124 wrote:Oh, I should add one thing to the explanation of lighting. Lights don't have to be shadowed to be dynamic. In fact, it's only when they are not that deferred lighting really makes a difference. So if you don't mind your spotlights not casting shadows, or you have a special way of doing them outside of the normal pipeline, you could toss a hundred of them in the scene. Still probably wouldn't work for us because we won't be able to tell what needs to cast shadows and what doesn't.
*edit*
Though it occurs to me that it could work as a modifier of sorts. Stick a colored point light on top of a regular torch, and then calculate the color from the ambient light with the magnitude of the regular light. If you had multiple colored lights, it wouldn't know how much each light was actually contributing, but it could make a decent approximation, and more importantly, it would be fast and lightweight.
Anyway. Fr0st, if you believe that creating a better lighting system will take too much time, then please move on to Copernicus or rendering optimization.
This is a signature.
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
Re: Another rendering update
It's all interconnected. If we want to do something down the road, we need to make allowances for it now.
-
- Designer
- Posts:397
- Joined:Fri Dec 07, 2012 11:59 pm
- Affiliation:Alteran
- Location:In the Holy Citadel of Altera
Re: Another rendering update
While I see that for a lot of things, I fail to see how the lighting engine affects how an NPC walks down the street.
fr0stbyte124 wrote:It's all interconnected. If we want to do something down the road, we need to make allowances for it now.
This is a signature.
-
- Developer
- Posts:2968
- Joined:Fri Dec 07, 2012 1:25 am
- Affiliation:NSCD
- IGN:Currently:Small_Bear
- Location:Yes
Re: Another rendering update
I think we can stick with normal lighting; it's nice and simple, everyone knows it, and in my experience the simplest method is usually the best.
Unless there are some serious problems with the current lighting that will cause trouble later, in which case we need to fix them.
Mistake Not... wrote: This isn't rocket science, *!
-
- Vice Admiral
- Posts:2312
- Joined:Sun Dec 09, 2012 10:21 pm
- Affiliation:Strigiforme
- IGN:ACH0225
- Location:Cuuyth
Re: Another rendering update
It might have some problems with moving objects. The lighting is independent of entities, and ships might be entities.
mfw brony images
fr0stbyte124 wrote:5 months from now, I will publish a paper on an efficient method for rendering millions of owls to a screen.
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
Re: Another rendering update
Deferred lighting works in screen-space for every material that wants to use it, which in our case is every non-transparent surface. The algorithm doesn't care how the scene was generated.
If we do dynamic shadows at all, we should be using a deferred lighting system. Everything else after that we get very cheaply.
Oh, and the GLSL shader packs for Minecraft set up deferred lighting, so you can see what all that can get you.
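To make the "doesn't care how the scene was generated" point concrete, here's a toy deferred pass in plain Python. The record layouts are made up for illustration; a real implementation runs this per pixel in a fragment shader over G-buffer textures, but the logic is the same: the lighting stage only ever sees albedo, normal, and position per pixel.

```python
import math

def shade_deferred(gbuffer, lights):
    """One deferred lighting pass over a tiny software 'G-buffer'.

    gbuffer: list of per-pixel records {"albedo", "normal", "pos"} --
    the only inputs the pass reads, which is why it never cares whether
    the geometry came from polygons, raycasting, or anything else.
    lights: list of {"pos", "color", "radius"} unshadowed point lights,
    so the cost is per light per covered pixel and nothing more.
    """
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    image = []
    for px in gbuffer:
        total = [0.0, 0.0, 0.0]
        for lt in lights:
            to_light = sub(lt["pos"], px["pos"])
            dist = math.sqrt(dot(to_light, to_light))
            if dist == 0.0 or dist >= lt["radius"]:
                continue                      # outside falloff: no work
            ldir = tuple(c / dist for c in to_light)
            ndotl = max(0.0, dot(px["normal"], ldir))  # diffuse term
            falloff = 1.0 - dist / lt["radius"]
            for i in range(3):
                total[i] += px["albedo"][i] * lt["color"][i] * ndotl * falloff
        image.append(tuple(total))
    return image
```

Note that adding a hundred lights only grows the inner loop; no geometry is ever re-submitted, which is the whole appeal of deferring the lighting.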
-
- Designer
- Posts:397
- Joined:Fri Dec 07, 2012 11:59 pm
- Affiliation:Alteran
- Location:In the Holy Citadel of Altera
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
Re: Another rendering update
That's because it's tacked on top of the engine rather than being a part of it. The rendering pipeline is really delicate and can get clogged pretty easily if you're not careful.
-
- Designer
- Posts:397
- Joined:Fri Dec 07, 2012 11:59 pm
- Affiliation:Alteran
- Location:In the Holy Citadel of Altera
Re: Another rendering update
Does that make you a graphics-plumber?
fr0stbyte124 wrote:That's because it's tacked on top of the engine rather than being a part of it. The rendering pipeline is really delicate and can get clogged pretty easily if you're not careful.
This is a signature.
- fr0stbyte124
- Developer
- Posts:727
- Joined:Fri Dec 07, 2012 3:39 am
- Affiliation:Aye-Aye
Re: Another rendering update
That's right, Caboose, I'm a graphics plumber.
But back on topic, I just thought of something interesting.
We have these 3 specialized render passes just for drawing cube geometry with various levels of detail, and then some normal passes for the remaining non-cube geometry. Two of the special render passes are going to be fragment (pixel shading) heavy, because of all the texture loading and raycasting. The other passes are going to be vertex heavy (or rather, there's just more of them, and almost no fragment logic). With that in mind, I wonder if we can't interlace them and draw in a single pass...
The idea here is that the entire rendering process is pipelined, with throughput heavily prioritized over latency. Things feed into the vertex shader, get their points converted into screen coordinates, and go into another queue. A few rasterizing steps later, the fragment shader pulls individual pixels from another queue, does math on them, and spits them into yet another queue to get written to the screen buffer. Basically, no part of the shader program cares about what is going on in other parts of the GPU; it only cares whether it has stuff to do or not. And, unless you have a very balanced program, one of those queues is going to pass data slower than the others and hold up the process. In vanilla Minecraft, it's the vertex shader. You can tell because resizing your screen doesn't change the fps. Fragment-limited draws will get better fps as the screen size shrinks.
But another way to look at this, if fragment limiting is inevitable, is that you are getting a whole bunch of real estate in the vertex shading pipeline for free. So you could do something like vertex-transform a bounding box for a raycasting routine, then do a thousand triangles or whatever, filling up the pipeline while you wait for the fragment shader to get its act together and finish loading textures. Then once that's done, it breezes through the next thousand triangles, which don't have any shading or even texturing to do (because that is done later).
I'm not sure if this is a good idea or not. There are a number of things to consider. First, you need a way to identify which type of fragment shading you need to do, and the only place you can specify that is from the vertex shader, and the only place you can specify it there is from whatever data you've stored in your vertex buffer. Second, since you can't rebind resources in the middle of a draw call, every texture sampler and constant you use has to be accessible at the same time, and I'm not sure how that will affect resource consumption. Third, triangle rasterization is super cheap, as that's what GPUs are designed around drawing. So really the polygon impact is going to be negligible, especially since we already have vastly reduced poly counts. Also, drawing polygons first is beneficial, because anything occluding a raycaster sample means that sample will be automatically skipped. If you are indoors, your raycasting cost automatically becomes zero (*edit* unless you are doing dynamic shadows, which are always outside, but even then, there are tricks you can do to help). So really, polygons should all be drawn first, even if there is a chance they will be occluded later.
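The "indoors is free" observation is just early depth rejection. A toy sketch (all names hypothetical): before a pixel runs the raycaster, compare the depth the polygon passes already wrote against the depth where the ray would enter the volume; a nearer polygon means the sample never costs anything.

```python
def raycast_pass(depth_buffer, ray_entry_depth, pixels):
    """Count how many pixels would actually run the raycaster after the
    polygon passes have filled the depth buffer.

    depth_buffer: pixel -> nearest polygon depth already drawn.
    ray_entry_depth: pixel -> depth where the ray would enter the volume.
    Pixels whose polygon depth is nearer are rejected before any
    raycasting work happens -- fully indoors, the cost drops to zero.
    """
    cost = 0
    for p in pixels:
        if depth_buffer[p] <= ray_entry_depth[p]:
            continue  # occluded by geometry: the sample is skipped for free
        cost += 1     # the expensive raycasting routine would run here
    return cost
```

On real hardware this rejection happens automatically via the early depth test, which is exactly why drawing polygons first pays off.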
I guess, never mind on the interlacing render types thing. Proper ordering is going to make a bigger impact on the fragment shader output than packing the pipeline more efficiently. Still, it would be nice if there were some way to make better use of the vertex shader during those raycasting passes. They'll be doing a whole lot of nothing most of that time, especially if all the polygons have already been drawn.
It'll be really nice to be able to profile this. So much guesswork in the planning stages. That's the main reason I'm writing this stuff outside of Minecraft before porting it in.