
Assassin's Creed Unity locked at 900p/30fps on next gen (now Evil Within too :C)

110 posts in this topic

FroZenToday

"To avoid debate," they said. Also, Jim Sterling and Daniel Hardcastle are laughing their asses off.

 

http://www.videogamer.com/ps4/assassins_creed_unity/news/assassins_creed_unity_is_900p_30fps_on_both_ps4_and_xbox_one.html

 

Thoughts?

 

Edit:

Bethesda has confirmed a 30 fps lock on PC for The Evil Within, for a "cinematic experience". And the VRAM requirement sits at 4 GB, thanks to next-gen and it being a direct port.

http://www.vg247.com/2014/10/09/the-evil-within-is-locked-at-30fps-on-pc/
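For anyone wondering what a "30 fps lock" actually does, here's a minimal sketch of a frame limiter in Python. It's purely illustrative: the update/render callbacks are hypothetical placeholders, not anything from Bethesda's or Ubisoft's actual engines.

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame at 30 fps

def run_capped(update, render):
    """Game loop capped at TARGET_FPS.

    `update` and `render` are hypothetical callbacks standing in for
    a real engine's simulation and drawing steps.
    """
    while True:
        start = time.perf_counter()
        update()
        render()
        elapsed = time.perf_counter() - start
        # If the frame finished early, sleep away the leftover budget.
        # This is the "lock": any spare headroom is simply discarded.
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```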

ProfessorOaked likes this


"We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

[image: "Oh wait, you're serious. Let me laugh even harder" meme]

 

I can understand the reasoning for them being limited by CPU power, seeing as both consoles have essentially the same CPU, so that part makes sense. The whole "to avoid debate" line is telling though: the actual truth is that both of them are equally inferior, and they don't want to piss off Microsoft or Sony (understandably). This is just hilarious after seeing so many larger gaming companies sell their games on the power of a new set of consoles and show off the huge boost in performance they bring, even though it truly isn't that impressive. Even if they had a more powerful CPU, the GPUs in these consoles couldn't run a modern game like this at 1080p/60fps, MAYBE 1080p/30fps.


> In other news, the PC port will not be optimized, but it will ultimately be the best version of the game.

You forget that Ubisoft also locked the framerate in the PC port of Watch_Dogs.

So, following that trend, they'll most certainly lock the PC version "to stop any further debate about which gaming platform is the most powerful".

Let me get my umbrella for the next volley of vids and editorials about Ubisoft's poor PC ports ^_^

ProfessorOaked likes this


Unbelievable. I can't believe Ubi actually gives a shit about saving face for the bone. They should only care about making their game the best it can be on either console; limiting the PS4 just because fanbois will be fanbois is flabbergasting.


Hmm

 

"We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

oh-wait-youre-serious-let-me-laugh-even-

 

I can understand the reasoning for them being limited by CPU power though, seeing as both consoles have the exact same CPU anyways, so that part makes sense. The whole "to avoid debate" line is true though, the actual truth is that both of them are equally inferior, and they don't want to piss off Microsoft and Sony (understandably). This is just hilarious after seeing so many larger gaming companies selling their games on the power of a new set of consoles and showing off the huge boost in performance it brings, even though it truly isn't that impressive. Even if they had a more powerful CPU, the GPU's in these consoles couldn't run a modern game like that at 1080p 60fps, MAYBE 1080p 30fps. 

Well, if what they're claiming about having up to 35,000 AI on screen at one time is true, something has to give.

 

Also, I never cared much for graphics, I just want a good game :)
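Quick back-of-the-envelope on why a crowd that size collides with the frame budget (the 1 microsecond per-agent cost below is a made-up illustrative number, not Ubisoft profiling data):

```python
PER_AGENT_MS = 0.001  # hypothetical 1 microsecond of CPU work per agent

for fps in (30, 60):
    budget_ms = 1000.0 / fps  # total CPU time available per frame
    max_agents = budget_ms / PER_AGENT_MS
    print(f"{fps} fps -> {budget_ms:.1f} ms/frame -> "
          f"~{max_agents:,.0f} agents if AI got the ENTIRE frame")

# 30 fps -> 33.3 ms -> ~33,333 agents; 60 fps -> only ~16,667.
# So 35,000 agents at even 1 us apiece would overrun a 30 fps frame
# before rendering, physics, or anything else ran at all.
```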

ProfessorOaked likes this


"We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

oh-wait-youre-serious-let-me-laugh-even-

 

I can understand the reasoning for them being limited by CPU power though, seeing as both consoles have the exact same CPU anyways, so that part makes sense. The whole "to avoid debate" line is true though, the actual truth is that both of them are equally inferior, and they don't want to piss off Microsoft and Sony (understandably). This is just hilarious after seeing so many larger gaming companies selling their games on the power of a new set of consoles and showing off the huge boost in performance it brings, even though it truly isn't that impressive. Even if they had a more powerful CPU, the GPU's in these consoles couldn't run a modern game like that at 1080p 60fps, MAYBE 1080p 30fps. 

The issue here is that they're limiting the PS4 because of Xbox fanbois, which is a new low. The PS4 could easily output 1080p/30fps; the bone can't, and as such everyone has to suffer.

Nya` likes this


Is it just me, or does Unity sound more half-assed by the day?

First there was that thing with no female characters in multiplayer, where they gave a stupidly lazy excuse to try and hide the fact that all of the multiplayer characters are just poor reskins of the same person.

And now this.


> The issue here is that they're limiting the PS4 because of Xbox fanbois, which is a new low. The PS4 could easily output 1080p/30fps; the bone can't, and as such everyone has to suffer.

But Ubi said it was a CPU issue. The Xbox One has a higher-clocked CPU than the PS4, in addition to using DDR3 RAM, which works better with CPUs than GDDR5. 900p/30 seems to be the max for consoles.


> The issue here is that they're limiting the PS4 because of Xbox fanbois, which is a new low. The PS4 could easily output 1080p/30fps; the bone can't, and as such everyone has to suffer.

Well, last I checked both consoles had the same CPU, with the Xbox having a 0.15 GHz boost over the PS4. If it truly is a CPU-bound problem, down to how many AIs they have running at once on the same screen, then both of those processors are going to perform very similarly, within about ten percent of each other. This actually seems like a very viable reason to limit both consoles equally, since they're both hitting the same wall.
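Checking that with the commonly reported clocks, 1.6 GHz for the PS4 and 1.75 GHz for the Xbox One:

```python
ps4_ghz, xbo_ghz = 1.6, 1.75  # commonly reported CPU clock speeds
advantage = (xbo_ghz - ps4_ghz) / ps4_ghz
print(f"Xbox One CPU clock advantage: {advantage:.1%}")  # -> 9.4%
```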

CQR Myles likes this


> The issue here is that they're limiting the PS4 because of Xbox fanbois, which is a new low. The PS4 could easily output 1080p/30fps; the bone can't, and as such everyone has to suffer.

Well, they locked the PS4 version of Watch-Dogs to match the Xbone too....

Later, they released the "definitive edition" with extra whatevers on PS4, but the lock was there initially.

 

> But Ubi said it was a CPU issue. The Xbox One has a higher-clocked CPU than the PS4, in addition to using DDR3 RAM, which works better with CPUs than GDDR5. 900p/30 seems to be the max for consoles.

Well, the Xbone's CPU is higher-clocked, but part of it is reserved to keep processing available for TV, Kinect, voice recognition, and whatever apps and services are open at the time.

My (admittedly amateur) appraisal is that both CPUs should be roughly equal, with no clear advantage for either console.


lol, Ubisoft really is trying hard to take over EA's #1 worst company spot.

They need to try harder.


> Well, the Xbone's CPU is higher-clocked, but part of it is reserved to keep processing available for TV, Kinect, voice recognition, and whatever apps and services are open at the time.

 

This is no longer the case for devs; that processing power was unlocked in the June SDK update.


Going to drop this because it's the statement of truth when it comes to this:

zawisz likes this


"We understand how Vincent’s quotes have been misinterpreted," reads the statement Ubisoft sent to IGN. "To set the record straight, we did not lower the specs for Assassin’s Creed Unity to account for any one system over the other.

Assassin’s Creed Unity has been engineered from the ground up for next-generation consoles. Over the past 4 years, we have created Assassin’s Creed Unity to attain the tremendous level of quality we have now achieved on Xbox One, PlayStation 4 and PC. It’s a process of building up toward our goals, not scaling down, and we’re proud to say that we have reached those goals on all SKUs.

At no point did we decide to reduce the ambitions of either SKU. Both benefited from the full dedication of all of our available optimization resources to help them reach the level of quality we have today with the core Assassin’s Creed Unity experience.

Final specs for Assassin’s Creed Unity aren’t cemented yet, but we can say we showed Assassin’s Creed Unity at 900p during our hands-on preview event last week. We’re confident that gamers will be thrilled with the gorgeous graphics and how Paris is brought to life in Assassin’s Creed Unity."

 


> Hmm
>
> Well, if what they're claiming about having up to 35,000 AI on screen at one time is true, something has to give.
>
> Also, I never cared much for graphics, I just want a good game :)

I'm quite sure they don't need to have 35k AIs on the screen. If they pre-script the AI or just make it a cutscene, it isn't a problem. The AI in AC games has always been terrible, so it's funny to even read "CPU limits our AI" when every single AI in their AC games has been almost like a Dynasty Warriors character. And Dynasty Warriors can have hundreds to thousands on screen.

 

If it really is a CPU limit, then I don't think the PS4 would run it any better. But that's only "if it really is". Considering the whole article, I'd say Ubisoft is lying and the PS4 COULD run it better, because it isn't a CPU problem but rather bad optimization. I mean, maybe the PS4 can't handle 1080p, but it can handle something like 980p; they can't reach 1080, so they shave some p's off, and since they can't optimize well enough for the Xbone they take some p's off the PS4 as well. Hell, if it really were a CPU bottleneck, they wouldn't have said "we'll lock them to the same specs to avoid all the debates and stuff". Why would you EVEN say that if it really were a CPU bottleneck?! If they had just said "the CPU is bottlenecking us, so yeah, same specs", I wouldn't have blinked an eye. But this is just a bad lie from Ubisoft.
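The pre-scripting point is essentially AI level-of-detail: fully simulate only the agents near the player and loop canned animation for the rest. Here's a minimal sketch of the idea; the thresholds, update rates, and agent methods are all hypothetical stand-ins, not anything from Ubisoft's actual crowd system.

```python
from math import dist

def update_crowd(agents, player_pos, frame):
    """Spend real AI time only on agents near the player.

    Each agent is assumed to have .pos, .full_update() and
    .play_scripted_loop() -- hypothetical stand-ins for engine calls.
    """
    NEAR, MID = 20.0, 60.0  # illustrative distance thresholds (metres)
    for i, agent in enumerate(agents):
        d = dist(agent.pos, player_pos)
        if d < NEAR:
            agent.full_update()         # pathfinding, reactions, combat
        elif d < MID and i % 4 == frame % 4:
            agent.full_update()         # mid-range agents at quarter rate
        else:
            agent.play_scripted_loop()  # canned animation, nearly free
```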

ProfessorOaked likes this


> You forget that Ubisoft also locked the framerate in the PC port of Watch_Dogs.
>
> So, following that trend, they'll most certainly lock the PC version "to stop any further debate about which gaming platform is the most powerful".
>
> Let me get my umbrella for the next volley of vids and editorials about Ubisoft's poor PC ports ^_^

Watch Dogs is not locked to 30; mine runs at 60, in fact. The PC version of Unity won't be locked either, if they do as usual; however, it'll be terribly unoptimized, and few computers will be able to run it at ultra while holding 60 fps.


The thing is, in all reality you won't notice that much of a difference between 900p and 1080p. I've been thinking that this console cycle there have been agreements between devs and publishers not to make one console look better than the other. From their perspective it makes a lot of sense, since they don't have to work at making the game look good across two consoles, which means less work to do. But yeah, their excuse is kinda stupid.
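For scale, here are the raw pixel counts behind the resolutions being argued about in this thread (the 792p entry assumes the commonly reported 1408x792 used by Watch Dogs on Xbox One):

```python
resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),   # as reported for Watch Dogs on Xbox One
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}
full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>5}: {px:>9,} px ({px / full_hd:.0%} of 1080p)")

# 900p pushes roughly 69% of the pixels of 1080p -- a real gap on
# paper, though how visible it is depends on screen size and distance.
```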


> Well, they locked the PS4 version of Watch-Dogs to match the Xbone too....
>
> Later, they released the "definitive edition" with extra whatevers on PS4, but the lock was there initially.
>
> Well, the Xbone's CPU is higher-clocked, but part of it is reserved to keep processing available for TV, Kinect, voice recognition, and whatever apps and services are open at the time.
>
> My (admittedly amateur) appraisal is that both CPUs should be roughly equal, with no clear advantage for either console.

No they didn't: the PlayStation 4 version ran at 900p and the Xbox One version at 792p.

Also, the Xbox One does not have a higher CPU frequency reserved for background tasks or anything of that nature. They overclocked it because the cooling on the Xbox One is better and they were able to do so without heat issues.

If this is an AI issue relating to the CPU, then technically the PlayStation 4 is holding the Xbox One back; the Xbox One's CPU is clocked nearly 10% higher.


> This is no longer the case for devs; that processing power was unlocked in the June SDK update.

That was a GPU reserve they unlocked, related to Kinect skeletal mapping, not CPU. The CPU was always 10% more capable.


> This is no longer the case for devs; that processing power was unlocked in the June SDK update.

Well, there are two problems with that.

First, we never really knew how much was reserved (rumors being rumors, not facts), and we don't know how much of that was released either. Or do the SDK patch notes state that all of the processing is available now?

Second, any process running in the background is going to draw resources from the machine. So maybe in theory all the power could be used for what the devs plan, but when the console runs the game, every other running process requires processor time and takes away from what the game can use.

 

> No they didn't: the PlayStation 4 version ran at 900p and the Xbox One version at 792p. [...] If this is an AI issue relating to the CPU, then technically the PlayStation 4 is holding the Xbox One back; the Xbox One's CPU is clocked nearly 10% higher.

Last time we talked, you escaped the technical facts by complaining about my long posts.

So, following my own words, I'll disengage from this thread again, in order to avoid it turning into yet another PS4 vs Xbone argument.

My disagreement with your math remains the same.

Recoveryanonymous likes this


> The thing is, in all reality you won't notice that much of a difference between 900p and 1080p. [...] But yeah, their excuse is kinda stupid.

I can tell the difference between 900p and 1080p. And so can a lot of people.

A while back, console people said "you can't tell the difference between 720p and 1080p anyway", then were overjoyed when they heard their consoles "would" be able to run 1080p (or at least would aim to, to no success I see). You can easily tell the difference. 900p vs 1080p might be a bit harder to spot if you're standing far from the TV, sure, but at the very least on a PC monitor you can easily tell, and I could also tell the difference between 1080p and 720p on the TV.

The format of your claim, "we won't notice the difference anyway", seems a lot like how people say "you can't tell the difference between 60fps and 120fps anyway", or "30fps and 60fps". One of those claims is more extreme than the other, indeed, but I'm confident I could tell the difference if you put those resolutions side by side, even on the TV.


> I can tell the difference between 900p and 1080p. And so can a lot of people. [...] I'm confident I could tell the difference if you put those resolutions side by side, even on the TV.

The difference between 900p and 1080p isn't that obvious to everyone. Heck, I don't really see it. I even have difficulty spotting the difference between 30 FPS and 60 FPS; sometimes I think it's 60, but then it turns out to be 40-something according to the framerate counter. But 30 to 120, that's an obvious difference.

 

I think being used to playing at 60 fps or higher is the reason. You'd notice it if you were always at 60 frames and then one day played something at 30 FPS. But then again, you'd think always being at 30 and suddenly switching to 60 or higher would also be noticeable.

 

I noticed the frame rate difference in the Hobbit movies compared to regular movies. But 720p vs 1080p I have difficulty seeing, even more so if the image is moving.

 

The difference isn't always like this for me:

[image: side-by-side resolution comparison]

480 to 720, pretty obvious. But 1080 and 720 on YouTube I don't always notice. In fact, honestly, unless the picture is really big or up close, 4K and 1080p look identical to me in this picture. So you can probably understand my difficulty seeing the difference between 900p and 792p, or 900p and 1080p.

 

So yeah, it could be, as you said, a distance or size thing. If the picture's right in someone's face, the resolution difference is very noticeable.
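Frame times help explain that asymmetry: each doubling of the frame rate halves the frame time, so the absolute improvement keeps shrinking. A quick illustration:

```python
for fps in (30, 60, 120):
    print(f"{fps:>3} fps = {1000 / fps:5.1f} ms per frame")

# 30 -> 60 fps shaves 16.7 ms off every frame; 60 -> 120 shaves only
# 8.3 ms more, which is one reason the first jump is easier to feel.
```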

 

> This is no longer the case for devs; that processing power was unlocked in the June SDK update.

> No they didn't: the PlayStation 4 version ran at 900p and the Xbox One version at 792p. [...] If this is an AI issue relating to the CPU, then technically the PlayStation 4 is holding the Xbox One back; the Xbox One's CPU is clocked nearly 10% higher.

Just wondering if this is the update you guys are talking about: http://news.xbox.com/2014/06/xbox-one-xdk-qa

 

I'm asking because it doesn't seem to mention anything about the CPU. It just says that developers can opt to use 10% more of the GPU.

 

Or did the overclocking happen before the system launched?

 

I do remember hearing that the X1's processor runs at 1.75 GHz and the PS4's only at 1.5 or 1.6, though.

DemonsColt likes this


> Last time we talked, you escaped the technical facts by complaining about my long posts.
>
> So, following my own words, I'll disengage from this thread again, in order to avoid it turning into yet another PS4 vs Xbone argument.
>
> My disagreement with your math remains the same.

 

That's not at all what happened, or the reasoning why. Your posts became so long it was impossible to make any kind of cohesive reply: 5000+ words... That's insane; that's the length of a game review, if not more. Anything said would have gotten lost in translation, and you would no doubt have come back with an even larger post... Don't give me that BS...

 

You said "Well, they locked the PS4 version of Watch-Dogs to match the Xbone too...."

 

That is factually untrue, and I corrected you: the PlayStation 4 version ran at 900p while the Xbox One version ran at 792p.

 

As for the nearly 10% overclock of the Xbox One's CPU from 1.6 GHz to 1.75 GHz, that was done because the system could still amply cool itself without an audible increase in fan noise. Like I said, this overclock had absolutely nothing to do with operating or managing system resources, or with anything being reserved; they did it simply because they could. The only reason they were able to achieve it is the large form factor of the case, how well it is ventilated, and how large the system fan is.

 

This is not about the Xbox One versus the PlayStation 4; this is about being correct, stopping people like you from spreading misinformation that will confuse other users, and giving them the proper information. You can't disagree with my math, because it's not my math; it's just math, plain and simple...

 

 

> Just wondering if this is the update you guys are talking about: http://news.xbox.com/2014/06/xbox-one-xdk-qa
>
> I'm asking because it doesn't seem to mention anything about the CPU. It just says that developers can opt to use 10% more of the GPU.
>
> Or did the overclocking happen before the system launched?
>
> I do remember hearing that the X1's processor runs at 1.75 GHz and the PS4's only at 1.5 or 1.6, though.

They overclocked the GPU from 800 MHz to 853 MHz and the CPU from 1.6 GHz to 1.75 GHz before the system launched. What you're posting is related to them freeing up the 10% of the GPU that was reserved for Kinect skeletal tracking. The PlayStation 4's CPU still operates at 1.6 GHz, because of the obvious cooling issues an overclock would present.

