Lexapro
True Bro
PSN: Lexa_pro
Posts: 1,066
|
Post by Lexapro on Feb 5, 2013 21:06:28 GMT -5
|
|
Will
True Bro
K/D below 1.0
Posts: 1,309
|
Post by Will on Feb 6, 2013 1:16:49 GMT -5
Best topic of 2013. Specifically because of the cartoons of Mousey attempting to knife lunge.
|
|
tiesieman
True Bro
mental lagger
Posts: 1,401
|
Post by tiesieman on Feb 6, 2013 4:38:00 GMT -5
nvm im a dingus
|
|
|
Post by corpsecreate on Feb 6, 2013 7:48:47 GMT -5
Uhh, anybody who has recorded lossless gameplay from the console versions of the game will tell you that yes, it does run at 60 fps. And by 60 fps, we actually mean 59.94 fps. I don't know how anyone can argue that it runs at a different framerate.
@mousey - From just playing the game, I agree with you that hitmarkers are determined server-side. On 1-bar connections (something I'm all too familiar with) there is a severe delay in when hitmarkers appear, but they are always synced to the death animation of my enemy.
|
|
mmacola
True Bro
the brazilian guy
Posts: 1,995
|
Post by mmacola on Feb 6, 2013 8:25:36 GMT -5
"@mousey - From just playing the game, I agree with you that hitmarkers are determined server-side. On 1-bar connections (something I'm all too familiar with) there is a severe delay in when hitmarkers appear, but they are always synced to the death animation of my enemy."

Me too. It's pretty noticeable with snipers and shotguns at higher pings. Shoot someone, do a 360, and they die.
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 6, 2013 9:42:48 GMT -5
it's client side. the server just accepts the client as true, looking at timestamps. shoot someone: you get the hitmarker in the typical frame time for input lag at your given frame rate. but to kill someone... that can get parsed out close to a second later under lag.

"Uhh, anybody who has recorded lossless gameplay from the console versions of the game will tell you that yes, it does run at 60 fps. And by 60 fps, we actually mean 59.94 fps. I don't know how anyone can argue that it runs at a different framerate."

how slow are you? you're not outputting a constant frame rate. your capture device is adding and subtracting frames to come up with its target framerate. not only does console COD not achieve 60fps, but the frame rate changes with the selected resolution...
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 6, 2013 10:02:48 GMT -5
i'd find that as probable as anything. once you've hit kill damage, you're shooting air. seems to be exactly my feeling on the matter going all the way back to cod4, now that you've gone down that road.
|
|
|
Post by otisman666 on Feb 6, 2013 11:48:39 GMT -5
A few questions I was hoping to get answered here:
Toys described a discussion with GHANDI regarding an increase to the updates per second. GHANDI's response includes one statement that I don't think anyone has really debunked: the console services (PSN or XBL) have a limit on the amount of traffic that can flow through their services. Is this right or wrong?
Is it done on volume (i.e., COD has 400k playing and they can only afford X amount of data transferring at one time), or is it done on a per-user basis?
2nd question, regarding FPS: BO and BOII are the only COD games that I have experienced this with on a consistent basis. My FPS will look normal when I am running around aimlessly, but as soon as I engage an enemy in a firefight, my FPS goes out the window. Sometimes it's really choppy, other times just a slight frame loss. I know when there is no frame loss that I have the outright advantage over my opponent (god-mode-like). Why is this, and is there anything I can be doing? Does this happen to anyone else?
|
|
|
Post by palladium on Feb 6, 2013 13:26:36 GMT -5
On the second part I would hazard a guess that it's just part of the 3arc engine. They run a slightly modified version of what IW uses, and for whatever reason one of the changes they made causes the console (does it happen on PC?) to have trouble processing everything at once in certain situations, leading to the game running choppy. More detail than that will be hard to get, I'm sure, unless someone with advanced knowledge of the engine or of how the console handles graphics chimes in.
Most likely they put as much shit in the game as possible, so they are right on the line of using all the memory and processing power of the consoles, which still doesn't make sense to me.
|
|
|
Post by otisman666 on Feb 6, 2013 13:46:45 GMT -5
@mouse - Thanks for the replies, but I guess my question was misunderstood. I have seen the results that Toys and a few others showed when they upped the update count. My question is whether we know if GHANDI's reply, saying that the console services place limits on the amount developers can use, is legit?
It seems that there isn't a limit set in the games or services themselves (not one that has been hit, anyways), but do we know if in their end-user agreements with MS / Sony there is a limit that is agreed upon? (Similar to how we agree to obey the speed limit even though our cars can go faster than that.) When I originally read GHANDI's reply, this was what I thought he meant.
Wouldn't this be the same for almost any game developer? So if we knew this was SOP for the services, any developer (even the guy that did Doritos Crash Course) could answer the question.
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 6, 2013 20:55:41 GMT -5
"2nd question, regarding FPS: BO and BOII are the only COD games that I have experienced this with on a consistent basis. My FPS will look normal when I am running around aimlessly, but as soon as I engage an enemy in a firefight, my FPS goes out the window. Sometimes it's really choppy, other times just a slight frame loss. I know when there is no frame loss that I have the outright advantage over my opponent (god-mode-like). Why is this, and is there anything I can be doing? Does this happen to anyone else?"

most of the time on x360 the game can run a quasi 60fps in multiplayer (across all the games). but then you pull that trigger and you're staring 52-54fps in the face. what's really inconsistent in all CODs is FPS while firing vs. weapon chosen. IMO the classic example of this is playing with LMGs in MW2. you pull out an HBAR or LSAT, even a HIGH ROF gun like the MG4 or M240, and you're fine! pull that relatively low-volume RPD out and your frame rate will insta-crash as soon as you pull that trigger. why? people have speculated it's the particles of the ejection cycle. i've no idea. needless to say, each COD has weapons that behave the same way. probably 80-90% of the weapons are all fine, but yeah, you run into the 1 in 10 that for whatever reason just blows the frame rate down.
|
|
richardj
True Bro
For the love of gawd, stop whining!
Posts: 230
|
Post by richardj on Feb 7, 2013 6:22:53 GMT -5
One thing I have to throw in for console players: what is your refresh rate? Manufacturers will claim a television has a refresh rate of, say, 120Hz; that's common now. 240Hz is not that common. Alright, so at 120Hz, every other line is refreshing, so divide it by 2. Now, realize that during the refresh it actually goes black and clears out, but you don't notice because of the definition and the light-to-dark transition. Now realize that the TV manufacturers count that blackout as a refresh, but it is not actually a refresh. Oops. That means we have to divide by 2 again. What do we get? 30. That means a 120Hz television runs 30fps.
At least the overhead past 30fps is good in assuring that the game won't get choppy during sh1tstorm moments on Ground War... Ghandi's stuff pissed me off, so I felt the need to point this out. Plus, PSN/XBL are partially to blame.
Great post Mousey, loved the added illustrations. Hopefully some people will find their CoD zen now and a few TVs/controllers can be saved!
|
|
|
Post by iw5000 on Feb 7, 2013 9:07:48 GMT -5
"Wow, even to a common finance guy like myself, this almost makes sense. Really nice write-up. Seriously, A+++. On the down side, I regret to inform you that I have a sneaking suspicion some of the Youtube heroes won't be sending you Christmas cards next year."

What are you implying there? That certain YouTube heroes kind of stack the odds in their favor when they play somehow?
|
|
|
Post by iw5000 on Feb 7, 2013 9:18:33 GMT -5
"Errr, I'm not sure if you have this in that huge post, but you could add one thing on that second picture of part 3.0. There was some video floating around this board that showed the broken player camera. Some guy tested this on split-screen and called it local lag compensation, so you should mention that is not the case. Essentially, the guy who came out of cover from crouch or behind a wall could shoot first without being seen. It could explain some corner bulldoo-doo and getting killed by invisible snipers."

"It's mentioned right in 2.4. That camera thing is a little wonky just because the delay you see is still plenty more than what you should expect on a local connection, but yeah, calling it a broken camera is stupid."

I remember watching that video. Question, then: are you saying the delay that video showed (the one blamed on faulty camera mechanics) was entirely explained by the 100ms interpolation using the 3-snapshot stuff you mentioned? I don't remember the video exactly, but the delay seemed to be more than 1/10th of a second... which would be what, around 5 to 6 frames? The delay seemed longer. Or was something else going on in that video, i.e. the artificial delay the game places on the host based on other people (the average)? It was my understanding, though, that the video only had two people playing, and even on local they should both have been sourcing from the same net connection in their building. Right? You can't have an artificial delay added onto that.
|
|
|
Post by iw5000 on Feb 7, 2013 9:33:54 GMT -5
"....he was joking about how this explains away most of what a lot of youtubers blame 3arc for (hurrr lag comp)."

Ok, missed that, reading in a hurry. Sometimes I do wonder, though, if some of these YouTube superstars aren't doing something fishy (net-wise). I've seen enough of these videos now to see that quite a lot of these players aren't doing anything special in terms of tactics. Most of their tactics (especially in Domination) are ridiculously predictable. But at least in the videos... they are winning encounter (gunfight) after encounter, sometimes 6 to 7 well-placed 1v1's in a row. That's hard to do consistently.
|
|
|
Post by iw5000 on Feb 7, 2013 9:51:23 GMT -5
"I remember watching that video. Question, then: are you saying the delay that video showed (the one blamed on faulty camera mechanics) was entirely explained by the 100ms interpolation using the 3-snapshot stuff you mentioned? I don't remember the video exactly, but the delay seemed to be more than 1/10th of a second... which would be what, around 5 to 6 frames? The delay seemed longer. Or was something else going on in that video, i.e. the artificial delay the game places on the host based on other people (the average)? It was my understanding, though, that the video only had two people playing, and even on local they should both have been sourcing from the same net connection in their building. Right? You can't have an artificial delay added onto that."

"1. Well, to be honest, 6 frames at 60 fps would be right on the money for 100ms. The video was about a 6-8 frame delay. That effectively adds up to interpolation + the innate artificial delay of 10-40ms that players experience (you can see the exact number in PC CoD4; there is a delay even when hosting a listen server)."

Ok. Thanks. I wasn't sure how long the delay was in the tin-foiler conspiracy video. When you run stuff in slow motion, from multiple angles, rewinding and forwarding... plus watching on YouTube... it gets hard to get a feel for what the actual delay was. The maker of that video kind of confuses things a bit. So I gots to ask: is this interpolation delay, whatever the snapshot rate is... is it a bit worse than in other CoD games? And if so, would this explain why in this game (BO2) it feels more important than ever to be the one initiating action on corners (pre-firing, jumping, etc.) rather than the one waiting for contact?
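To put rough numbers on that exchange, here's a minimal Python sketch that converts the delays quoted above into 60 fps frames. The 100ms interpolation window and the 10-40ms extra delay are just the figures from this thread, not confirmed engine constants.

```python
# Back-of-envelope check of the delay figures quoted above, at 60 fps.
# The 100 ms interpolation window and the 10-40 ms "innate" delay are
# numbers from the thread, not confirmed engine constants.

FRAME_MS = 1000.0 / 60.0  # one frame at 60 fps is ~16.67 ms

def ms_to_frames(delay_ms):
    """How many 60 fps frames a given delay spans."""
    return delay_ms / FRAME_MS

if __name__ == "__main__":
    for label, ms in [("interpolation only", 100),
                      ("interp + 10 ms innate delay", 110),
                      ("interp + 40 ms innate delay", 140)]:
        print(f"{label}: {ms} ms ~= {ms_to_frames(ms):.1f} frames")
    # prints roughly 6.0, 6.6 and 8.4 frames, in the same ballpark as the
    # 6-8 frame delay discussed for the video above
```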
|
|
|
Post by corpsecreate on Feb 7, 2013 11:45:04 GMT -5
What are you saying, bro? My capture card records 60 unique frames sequentially, one after another. It's not adding/removing anything. It would have been mighty difficult for me to calculate RPM for guns if it was adding/removing frames. The only time the framerate will change is when it drops straight down to 30 fps. It won't hit any other value in between because the game is v-synced on consoles. At that point, my capture card does exactly what you would expect and records each frame twice. I await your attempted counter-argument.
|
|
|
Post by Marvel4 on Feb 7, 2013 11:53:13 GMT -5
The delay is much worse in Black Ops II.
See this video for measurements and comparison to MW2:
|
|
|
Post by ElysMustache on Feb 7, 2013 12:41:51 GMT -5
"Uhh, anybody who has recorded lossless gameplay from the console versions of the game will tell you that yes, it does run at 60 fps. And by 60 fps, we actually mean 59.94 fps. I don't know how anyone can argue that it runs at a different framerate."

"how slow are you? you're not outputting a constant frame rate. your capture device is adding and subtracting frames to come up with its target framerate. not only does console COD not achieve 60fps, but the frame rate changes with the selected resolution..."

"What are you saying, bro? My capture card records 60 unique frames sequentially, one after another. It's not adding/removing anything. It would have been mighty difficult for me to calculate RPM for guns if it was adding/removing frames. The only time the framerate will change is when it drops straight down to 30 fps. It won't hit any other value in between because the game is v-synced on consoles. At that point, my capture card does exactly what you would expect and records each frame twice. I await your attempted counter-argument."

First, it is not actually 59.94. It is 60,000/1001 (AKA 59.94005994). So if you are going to nitpick, try to be correct. Second, my capture card (Blackmagic Intensity Pro) will not capture lossless video. If I check the "stop recording if frames drop" option, the video will quit recording within seconds.
|
|
|
Post by corpsecreate on Feb 7, 2013 22:12:24 GMT -5
Who said I'm nitpicking? I know it's 60,000/1001; one look at my formula for RPM should tell you that. Second, I had a Blackmagic Intensity Pro, and the reason it will drop frames and stop recording has nothing to do with the game dropping frames. The capture card doesn't care which frames it's getting so long as it's getting something. The reason it will stop recording is a bandwidth/power issue; it has nothing to do with which frame the game "decided to drop". The fact that you said this will only happen when you try to capture lossless video supports what I'm saying. Lossless will naturally require higher bandwidth. Try again.
|
|
|
Post by corpsecreate on Feb 7, 2013 22:16:59 GMT -5
Proof for what? EDIT: I just saw your edit on the previous post. My results were rounded to the nearest 5; that's why they don't exactly match what they should be at 60000/1001 fps.
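For anyone curious how that rounding comes into it: a minimal sketch of deriving RPM from frame-counted 60000/1001 fps footage. The frame counts below are made-up examples for illustration, not corpsecreate's actual data.

```python
# Sketch of deriving RPM from frame-counted footage captured at 60000/1001 fps.
# The shot intervals below are hypothetical examples, not measured data.

CAPTURE_FPS = 60000 / 1001  # ~59.94 fps NTSC capture rate

def rpm_from_frames(frames_between_shots):
    """Rounds per minute implied by a shot interval measured in capture frames."""
    seconds_per_shot = frames_between_shots / CAPTURE_FPS
    return 60.0 / seconds_per_shot

def round_to_nearest(value, step=5):
    return round(value / step) * step

if __name__ == "__main__":
    for frames in (4, 5, 6, 8):  # hypothetical frames between shots
        raw = rpm_from_frames(frames)
        print(f"{frames} frames/shot -> {raw:.1f} RPM raw, {round_to_nearest(raw)} rounded")
    # e.g. 4 frames/shot gives 899.1 RPM raw, which rounds to 900
```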
|
|
|
Post by corpsecreate on Feb 7, 2013 22:33:13 GMT -5
It's an option to stop the recording if a frame is dropped. At lossless, it's likely this will happen. If you have it set to stop when a dropped frame is detected, it will stop recording pretty quickly. If you change it from lossless, this might still happen, but it is significantly rarer. A rough calc of the bandwidth required would be:

1280*720 = 921,600 pixels per frame
921,600 * 1.5 = 1,382,400 bytes per frame
60,000/1001 * 1,382,400 = 82,861,138.86 bytes per second
82,861,138.86 / 1024^2 = approx 79 MB/s

That's a lot of bandwidth.
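The same calc as a small script, in case anyone wants to plug in other resolutions. The 1.5 bytes per pixel assumes 4:2:0 chroma subsampling, which is an assumption about the capture format here rather than anything the card specifies.

```python
# Rough capture-bandwidth estimate for uncompressed video at 60000/1001 fps.
# Assumes 4:2:0 chroma subsampling (1.5 bytes per pixel); other pixel
# formats would change the result.

def capture_bandwidth_mb_per_s(width, height, fps=60000 / 1001, bytes_per_pixel=1.5):
    bytes_per_frame = width * height * bytes_per_pixel
    bytes_per_second = bytes_per_frame * fps
    return bytes_per_second / 1024 ** 2

if __name__ == "__main__":
    print(f"720p:  {capture_bandwidth_mb_per_s(1280, 720):.0f} MB/s")   # ~79 MB/s
    print(f"1080p: {capture_bandwidth_mb_per_s(1920, 1080):.0f} MB/s")  # ~178 MB/s
```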
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 8, 2013 12:44:10 GMT -5
|
|
Zero IX
True Bro
༼ つ ◕_◕ ༽つ
Posts: 1,655
|
Post by Zero IX on Feb 9, 2013 0:33:10 GMT -5
Toys, any chance you can link to that private vid with the altered network refresh rates?
|
|
Erik
True Bro
Ring-Giver
Posts: 10,023
|
Post by Erik on Feb 9, 2013 0:38:30 GMT -5
By Odin's Eye! Our resident crowfeeder has thrown down the gauntlet!
|
|
|
Post by corpsecreate on Feb 9, 2013 1:44:21 GMT -5
I concede defeat, well played Toys.
However, not everything I said was incorrect; in fact, the only thing I said that was wrong was that the framerate is either 60 fps or 30 fps and nothing in between. The reason I thought it was this way is that the game is constantly v-synced. I have never seen tearing on the console versions of the game. To my knowledge, this means the framerate must be locked at either 30 fps or 60 fps. On PC, if you enable v-sync, you no longer have a variable framerate; instead, it is locked at a specific value.
I don't question your sources (I'll believe anything Digital Foundry says), but how is it possible for the game to have a variable framerate and still be v-synced? I ask so that I may be educated.
Furthermore, my capture card was not manufacturing frames in my tests, though this is likely because all my frame-by-frame analysis of this game was done in multiplayer by myself, where the framerate is likely a constant 60. I never looked extra close at recorded footage from a live multiplayer game where the framerate may have dropped. In that case, like you said, the capture card would duplicate frames to reach the target framerate of 60.
I'm glad I was wrong; it means I won't be spreading incorrect information to other people. You taught me something I didn't know, and at the same time you got to educate someone. This is a win-win situation.
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 9, 2013 4:35:53 GMT -5
no problem. v-sync has nothing to do with it, but you're more on track with the refresh rate, which is handled by other hardware after the system is done with it. console COD is hard-locked to 60fps, with vsync and a 1-frame flip queue. if the frame rate is above the refresh rate, vsync is disabled to improve performance (your refresh rate will vary depending on your viewing medium's standard; for this example we'll make it applicable to PC gamers too and just say you're running VGA/DVI/HDMI/DS and it's at 60Hz). at 60fps, vsync is disabled and you're simply running a one-frame flip queue. under this frame rate, double-buffering comes into play, where half the screen is held until the next half is transmitted, greatly reducing screen tearing but introducing another 16.67ms worth of lag (the frame transmission time of a 60Hz refresh rate).
the refresh rate always remains the same: *something* will be transmitted every refresh cycle. this is where screen tearing comes into play when your frame rate is under your refresh rate. you typically transmit alternating top & bottom halves of the image each refresh. without a good buffer solution (triple buffering, vsync, flip queues), the image will be transmitted in part. viewing devices also differ. CRTs do not buffer information; they will display ASAP. digital displays (LCD/plasma) can also do this, but are *normally* programmed either to display a half-frame at a time, or to buffer a half frame and display one complete image. (this is why screen tearing is typically not as bad on modern displays; the downside is you're waiting on some amount of information to buffer, another 16.67ms or two.) with that little detour out of the way, let's go back to screen tearing. take a display device that does not buffer + a 60Hz refresh rate + a 30fps frame rate + NO form of vsync, and have the view pan left or right. what you will see is alternating screen tears in the image, with breaks at roughly the 1/4 and 3/4 vertical positions on the screen. the video hardware doesn't have enough time to draw its full image (remember, by full image we're talking HALF the image at a time).
taking another side turn, this is why several years ago there was a large uproar from PC gamers when video card drivers quietly slipped in a default to run a frame flip queue without an option to change it (this impacts the buffers when the card is running above the refresh rate). what happened is they realized virtually everyone had LCDs now, and LCDs pretty much all do their own half- or full-frame buffering, so they took the liberty of pre-tweaking the option, to the ire of many people.
because the refresh rate never changes, the recorded file should never change the frame rate that it actually records. this is also why, if you record a console game with the usual suspects of hardware, it'll show that it records 59.94fps (stemming from NTSC's 29.97fps @ 59.94Hz). go record PAL off a SCART connector and it'll show 50fps. record the VGA output and it'll be 60fps. and so on & so forth.
|
|
|
Post by corpsecreate on Feb 9, 2013 6:58:03 GMT -5
Wow, that is certainly one of the most technical posts I've had to read on here...
So if I'm understanding this right, when the game is running below 60 fps, the screen will be updating more often than the game is being rendered. As a result, the hardware will "wait" (by means of buffering) until it can next display a full frame at the nearest refresh cycle.
If that's right, then that would mean that if the framerate dropped to, say, 40 fps, there would be 20 frames being displayed that are duplicates of previously drawn frames. These 20 frames are duplicates because the hardware was not yet ready to display the next frame, but the refresh rate means that something will always be displayed at a rate of 60Hz.
Am I understanding this right?
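A quick way to sanity-check that reading is to simulate a fixed 60Hz scanout showing whatever frame the game has most recently produced. This is an idealized sketch (steady render times, no buffering or flip-queue details), not a model of the actual hardware.

```python
# Idealized sketch: a fixed 60 Hz scanout showing whatever game frame is
# current at each refresh. Render times are steady constants, not measured
# values, and buffering/flip-queue behaviour is ignored.

REFRESH_HZ = 60

def frames_shown_in_one_second(render_fps):
    """Index of the game frame on screen at each of the 60 refreshes,
    assuming game frame i occupies the interval [i/render_fps, (i+1)/render_fps)."""
    shown = []
    for refresh in range(REFRESH_HZ):
        # integer arithmetic keeps the boundary cases exact
        shown.append(refresh * render_fps // REFRESH_HZ)
    return shown

if __name__ == "__main__":
    for fps in (60, 40, 30):
        shown = frames_shown_in_one_second(fps)
        unique = len(set(shown))
        print(f"render at {fps} fps: {unique} unique frames shown, "
              f"{REFRESH_HZ - unique} duplicated refreshes out of {REFRESH_HZ}")
    # at 40 fps this prints 40 unique / 20 duplicated, matching the estimate above
```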
|
|
jaykay
True Bro
bo2 emblem
Posts: 49
|
Post by jaykay on Feb 9, 2013 9:02:33 GMT -5
It obviously exists, because otherwise people would find out that they constantly have ridiculous pings. Talking about PC: all the CoDs that were simple console ports and had the hosting system had ridiculously bad pings, mainly due to the fixed settings of cl_maxpackets 30, snaps 20 (just 20 snapshots..) and com_maxfps 85 (aka 91). That is very bad for what you can do on a computer with today's connection standards. The CoDs that allow maxpackets 100, free fps values and snaps 30 play much more fluently in terms of hit registration. This is another point in which I deem the PC superior. Though whenever I watch console vids there seems to be no problem, admittedly. But when you compare the 30/20/85 standard to oldies like Counter-Strike on GoldSrc or Source, which always had good netcode, it just seems antiquated.
As for the FPS debate going on: the Quake engine which CoD uses always uses a divisor of 1000 as its fixed FPS value. This means that arbitrary fps values will always be rounded up or down to values within the com_maxfps limit, such as: 1000/5 = 200, 1000/6 = 166, 1000/7 = 142, 1000/8 = 125, 1000/9 = 111, 1000/10 = 100, 1000/11 = 91, 1000/12 = 83, 1000/13 = 76, 1000/14 = 71, 1000/15 = 66, 1000/16 = 62, 1000/17 = 58, 1000/18 = 55, etc.
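Assuming that divisor-of-1000 behaviour (whole-millisecond frame times), the attainable fps values fall straight out of a one-liner. A minimal sketch; the rounding direction is inferred from the "com_maxfps 85 (aka 91)" example above, not taken from engine source.

```python
# Sketch of the divisor-of-1000 behaviour described above: with whole-
# millisecond frame times, only fps values of the form 1000/ms are
# attainable. The rounding direction is inferred from the "85 aka 91"
# example in the thread, not from engine source.

def attainable_fps(com_maxfps):
    """fps actually reached for a given com_maxfps, assuming the engine
    truncates 1000/com_maxfps to a whole-millisecond frame time."""
    frame_ms = 1000 // com_maxfps
    return 1000 / frame_ms

if __name__ == "__main__":
    # the ladder of attainable values for 5 ms .. 18 ms frame times
    for ms in range(5, 19):
        print(f"{ms} ms frame time -> {1000 / ms:.1f} fps")
    for setting in (85, 100, 125, 250, 333):
        print(f"com_maxfps {setting} -> about {attainable_fps(setting):.0f} fps")
    # com_maxfps 85 lands on ~91 fps, as mentioned above
```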
|
|
toysrme
True Bro
"Even at normal Health, there's no other choice than the Vector" Den Kirson
Posts: 1,339
|
Post by toysrme on Feb 9, 2013 9:52:25 GMT -5
"Wow, that is certainly one of the most technical posts I've had to read on here... So if I'm understanding this right, when the game is running below 60 fps, the screen will be updating more often than the game is being rendered."

yes. the screen will draw *something* every refresh on its own. some displays will change the top & bottom halves of the image as they come in, others will buffer them together and draw the entire image at one time. to clear this up: the display will display something all the time, as described above. the video hardware side is really a two-step process: generate the picture, and output the picture to the display. the output will happen on time, all the time. generating the picture happens at its own pace. if it's slower, it's just slower; if it's faster, one of two (three) things will happen: it will HALT until that frame transmission has completed and start back afterwards, or continue rendering as many frames as the CPU can feed it data to work on (until it reaches a pre-defined number of frames ahead and THEN stops, typically 1, 3 or 5 frames ahead).

ya, pretty much. to be more accurate on my part: sometimes duplicates, sometimes it'll be a flip between new & old info and you'll get a ton of screen tearing. depends on how it's handled.
|
|