The Tech Random Crap Thread

What is the current consensus on bang-for-buck video cards? Is it still the 1060, or has the 1050 Ti stolen that crown for certain levels of non-4K gaming? My gaming desktop is starting to feel its age a bit (2012-era), but before I replace the entire thing, I want to see if switching out the current card (a Radeon HD 7850) will do the trick.

In theory, I could afford a 1080 Ti, but I don't actually want one. Between work and work-travel schedules, I don't get many opportunities to sit at home and game, so anything above $300 is probably wasted, and less would not be taken amiss.

Primary need is to play recent-ish shooters and open-world A/A games at 1080p/60. I am unlikely to push for 4K anytime soon, and if I get a new monitor, I will prioritize things like G-Sync and 144 Hz far more than 4K. VR would be nice, but aside from the fact that it basically requires at least a 1080 for consistent performance, I just don't have the space for it without rearranging my entire living room, and that's just not a real priority given how I game at home.
 
What is the current consensus on bang-for-buck video cards? Is it still the 1060, or has the 1050 Ti stolen that crown for certain levels of non-4K gaming?
I've been researching the same thing. Looks like the 1050ti is the way to go for 1080p gaming on a budget. The miners have killed the market for anything higher. :(
 
See if you can find the 1080 (non-Ti). There are many variants out there (including this little darling), but most of them are out of stock due to Ethereum miners. According to Passmark, you should see some improvement:

[Passmark GPU benchmark comparison chart]


I know the 1070 is the price/performance darling in the GTX 10xx lineup right now, but the non-Ti 1080 is about 30% more card than the 1070 for around 20% more money ($550 vs. $450), which will really make a difference when you test the 4K waters or push refresh rates well above 120 Hz. The Tis are nice, but they're about $750 right now, which is silly unless you game or stream for a living and NEED the horsepower.

However, if you're set on 1080p and will never go higher, the 1060 is probably the best pure value. The 1050 Ti is the sort of card you get for business boxes that were never meant to game and don't have the power budget for a 1060 (i.e., no supplementary PCIe power connector). To put it another way, a non-Ti 1080 is roughly the equivalent of two 1060s, or just over three 1050 Tis.
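If it helps, here's a rough back-of-the-envelope sketch of that value-per-dollar math as a tiny Java program. The benchmark scores and the 1050 Ti / 1060 prices below are placeholder numbers, not exact Passmark or retail figures (only the $450/$550 come from the thread), so swap in current data before trusting the output:

```java
// Hypothetical value-per-dollar comparison; scores and most prices are illustrative only.
import java.util.LinkedHashMap;
import java.util.Map;

public class GpuValue {
    public static void main(String[] args) {
        // {approximate benchmark score, approximate street price in USD}
        Map<String, double[]> cards = new LinkedHashMap<>();
        cards.put("GTX 1050 Ti", new double[]{6000, 140});
        cards.put("GTX 1060 3GB", new double[]{8800, 200});
        cards.put("GTX 1070", new double[]{11000, 450});
        cards.put("GTX 1080", new double[]{13000, 550});

        for (Map.Entry<String, double[]> e : cards.entrySet()) {
            double score = e.getValue()[0], price = e.getValue()[1];
            System.out.printf("%-13s %6.0f pts  $%4.0f  %5.1f pts/$%n",
                    e.getKey(), score, price, score / price);
        }
    }
}
```

Run it and the cheaper cards unsurprisingly win on points per dollar; the real question is whether their absolute performance is enough for what you actually play.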

--Patrick
 

GasBandit

Staff member
What is the current consensus on bang-for-buck video cards?
The 1060 3 gig is still the king of bang for the buck. Well, other than the Radeon RX 560, but you want an actual DECENT card.

https://www.videocardbenchmark.net/gpu_value.html

I'm still using my 1060 3gig and have never had trouble with it, and always get 60+fps at 1080p. As long as you're not trying to do 4k or VR, I'd say there's no reason to spend on anything pricier.
 
I wouldn't say never going above 1080p, but more that the cost of getting to an acceptable level of 4K gaming is prohibitive, given that I don't play games that require my gaming PC that often (my work laptop is more than sufficient for the variety of indie Metroidvanias and RPGs that I have been playing lately). 1440p is probably on the table, but isn't really a goal.

The last few games that I played on my PC are:
  • PUBG (27 FPS with some tearing on very low settings at 1080p; playable, but painful after a couple of rounds)
  • Overwatch (60+ FPS on near-highest settings, 1080p)
  • Sunder (1080p/60 without the slightest problem, obviously)
  • Forza H3 (didn't have a counter running, but it was at least 45 FPS and smooth at 1080p on middle-to-good settings)
  • DAI (didn't have a counter running; 30-ish FPS at 1080p on good settings)
PUBG is obviously going to get better as they keep patching it, because it is probably the least-optimized popular game on Earth, and Overwatch is playable on a 2017-release ham sandwich. That being said, I have denied myself The Witcher 3 until I have a decent enough PC to run it, and I intend to play Destiny 2 on PC. I was going to play FO4 on PC after an upgrade, but then I got the PS4 disc as a giveaway at a party, so I'm just going to play it on PS4.
 

GasBandit

Staff member
Basically the 1060 3 gig has the same performance as a GTX 970 for 33% less money. If your aim is to stay at 1080 for the next couple years, I'd go with a 1060/3.
 
1060 3GB (ASUS DUAL-GTX1060-O3G) is what I put in both of our rigs at home, and it's been excellent bang for the buck. Got them for about $170 each on Jet around December, with discounts.
 
EFF withdraws from W3C with comments: An open letter to the W3C Director, CEO, team and membership
In 2013, EFF was disappointed to learn that the W3C had taken on the project of standardizing “Encrypted Media Extensions,” an API whose sole function was to provide a first-class role for DRM within the Web browser ecosystem. By doing so, the organization offered the use of its patent pool, its staff support, and its moral authority to the idea that browsers can and should be designed to cede control over key aspects from users to remote parties.

...

Today, the W3C bequeaths a legally unauditable attack-surface to browsers used by billions of people.

...

We will defend those who are put in harm's way for blowing the whistle on defects in EME implementations.

It is a tragedy that we will be doing that without our friends at the W3C, and with the world believing that the pioneers and creators of the web no longer care about these matters.

Effective today, EFF is resigning from the W3C.

Thank you,

Cory Doctorow
Advisory Committee Representative to the W3C for the Electronic Frontier Foundation
 
Is there a record of which members voted for the EME API? I didn't see it in the boingboing article.
Many of the commenters on that article were lamenting the exact same thing.
Really, if anyone wants an example of just how absurd this could get, there's the part where they explain how, if a website is crafted to use EME to deliver advertising content, blocking those ads would become illegal under federal law.

--Patrick
 
Why wouldn't you want the one with more VRAM? If you go with larger displays, you're going to want that extra VRAM.

--Patrick
 

GasBandit

Staff member
@PatrThom the "Sweet Spot" build is supposed to emphasize the most value per dollar. On that metric, the 3 gig card outperforms the 6 gig card because the 6 gig card costs $50 (25%) more for basically a 2% increase in benchmark performance. As long as you're not doing 4K or VR (i.e., staying at 1080p on a single monitor), you won't notice a difference between the 1060/3gig and the /6gig. Or even a 1070, 1080, or 1080 Ti, for that matter (which are the cards they correctly picked for their midrange, expensive, and no-expense-spared builds).
 
Ah, ok. I haven't actually read the entire rundown...mainly because they don't usually tell me anything I don't already know.

--Patrick
 

GasBandit

Staff member
Ah, ok. I haven't actually read the entire rundown...mainly because they don't usually tell me anything I don't already know.

--Patrick
Traditionally they do a few sample builds, and they generally call them some variation of:
1) The Budget Build - the cheapest build you could expect to game on
2) The Sweet Spot - the build with the highest value per dollar spent
3) The Grand Experiment - What you'd buy if you had $2500+ to drop on a new rig

They've monkeyed with it a little over the years (I note that what used to be called the "Sweet Spot" is now the "Middle Ground," and the new "Sweet Spot" is more expensive and less value/$), but there's always one build that emphasizes value per dollar, and that's usually the one I pay the most attention to.

I also think they put too much emphasis on number of cores, especially on their gaming-specific rigs. These days, CPUs are still far, far less important than GPUs to gaming, and the few times the CPU is needed, individual core strength still trumps number of cores, which means you want Intel. But they've had a massive hardon for all the AMD Ryzen stuff all year.

Of course, it's a different tune if you are gonna do a lot of video editing or other CPU-heavy activities.
 
Conventional wisdom holds that clock speed and IPC are what matter now, with 4 threads being plenty for just about everything, but recent testing suggests that if you're after an "LTS" gaming build (one you want to last for years), you should consider at least 6 cores (or at least 4 cores with SMT), because some games are starting to show an advantage when run on more than 4 cores.
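To make the core-count point a bit more concrete, here's a toy Java sketch that splits a fixed pile of dummy math across 1, 2, 4, ... worker threads and times each run. It's embarrassingly parallel busywork, nothing like a real game engine (games have serial sections, synchronization, and a GPU in the loop), so take it purely as an illustration of scaling:

```java
// Toy scaling test: fixed total work, divided across a growing number of threads.
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CoreScaling {
    static final long TOTAL_WORK = 400_000_000L; // arbitrary amount of math to grind through

    public static void main(String[] args) throws Exception {
        // Note: this reports logical (SMT) threads, not physical cores.
        int logical = Runtime.getRuntime().availableProcessors();
        for (int threads = 1; threads <= logical; threads *= 2) {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            long perThread = TOTAL_WORK / threads;
            CountDownLatch done = new CountDownLatch(threads);
            long start = System.nanoTime();
            for (int t = 0; t < threads; t++) {
                pool.submit(() -> {
                    double sink = 0;
                    for (long i = 0; i < perThread; i++) sink += Math.sqrt(i);
                    if (sink < 0) System.out.println(sink); // keep the JIT from discarding the loop
                    done.countDown();
                });
            }
            done.await();
            System.out.printf("%2d thread(s): %d ms%n", threads,
                    (System.nanoTime() - start) / 1_000_000);
            pool.shutdown();
        }
    }
}
```

On an ideal workload like this, the time drops almost linearly until you run out of physical cores and then SMT gives a smaller bump; real games fall well short of that, which is why the per-core-strength argument still has legs.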

The more I hear about this stuff, the more I start feeling a whooooole lot better about my venerable 1090t.

--Patrick
 

GasBandit

Staff member
Conventional wisdom holds that clock speed and IPC are what matter now, with 4 threads being plenty for just about everything, but recent testing suggests that if you're after an "LTS" gaming build (one you want to last for years), you should consider at least 6 cores (or at least 4 cores with SMT), because some games are starting to show an advantage when run on more than 4 cores.

The more I hear about this stuff, the more I start feeling a whooooole lot better about my venerable 1090t.

--Patrick
I really question the validity of benchmarks that base themselves on performance in PUBG, as what you're really measuring there is a computer's ability to deal with unoptimized alpha nonsense. Also note that even the worst-performing CPU in that article got you over 80 FPS - because of the video card they're using with the test - a 1080 Ti.
 
even the worst-performing CPU in that article got you over 80 FPS - because of the video card they're using with the test - a 1080 Ti.
Right, but that's because the test was specifically created to isolate the effect of different core counts, and using the 1080 Ti means their results won't be skewed by insufficient graphics horsepower.

--Patrick
 
Given their summaries of intent at the beginning and throughout, I think their "Sweet Spot" is meant to be able to do VR, thus the extra RAM. So I think your expectation of what it is differs from what they're offering.

I don't doubt they've had a terminology switch though. I agree "Sweet Spot" meant value/$ in the past. I do think that's shifted, but their nomenclature throughout the whole article is fairly consistent.
 
So I suppose you've all heard about that brand new launch today.
Yup. And it's about time...Java 9 has finally been released.

--Patrick
I look forward to the SCREAMS of people over having strings screwed with. Some "big hacks" will break with this. Normal people will only benefit (as the article says), but anybody doing "very hackish" things (using a String's chars as a byte array instead of copying into an actual byte array) will be affected badly.

Strings are always "fun" to deal with, no matter the language. Some are better, some are worse, but the CHANGE always "gets" people.
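For anyone wondering what's meant here: Java 9's compact strings (JEP 254) change the internal storage of Latin-1-only strings from char[] to byte[]. The public API is unchanged, so it's mostly code that reaches into a String's internal value array via reflection, or otherwise treats chars as a stand-in for raw bytes, that gets burned. Here's a made-up illustration (not from the article) of the char-as-byte shortcut versus the safe route:

```java
// Illustrative only: the "hackish" vs. the explicit way to get bytes out of a String.
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class StringBytes {
    public static void main(String[] args) {
        String payload = "raw \u00FF data"; // pretend this came off the wire

        // Hackish: treat each char as a byte. Works only if every char fits in 8 bits
        // and the String round-tripped your original bytes faithfully.
        byte[] hacky = new byte[payload.length()];
        for (int i = 0; i < payload.length(); i++) {
            hacky[i] = (byte) payload.charAt(i);
        }

        // Explicit: copy into an actual byte array via a named charset.
        byte[] proper = payload.getBytes(StandardCharsets.ISO_8859_1);

        System.out.println(Arrays.equals(hacky, proper)); // true here, but only because every char fits in Latin-1
    }
}
```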
 
I look forward to the SCREAMS of people over having strings screwed with. Some "big hacks" will break with this. Normal people will only benefit (as the article says), but anybody doing "very hackish" things (using a String's chars as a byte array instead of copying into an actual byte array) will be affected badly.

Strings are always "fun" to deal with, no matter the language. Some are better, some are worse, but the CHANGE always "gets" people.
oof. I recall dealing with java and doing byte processing. Byte array support (well, anything embedded - byte or bit level) is really terrible in that language. I still used 'em, but it was tempting to use the strings. Fortunately I wasn't confident they would work, so stuck with the byte arrays.
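For reference, a short sketch of what byte-level parsing typically looks like in Java with java.nio's ByteBuffer: you get explicit byte order and typed reads, at the cost of masking everything yourself because Java has no unsigned primitives. The record layout below is invented purely for illustration:

```java
// Hypothetical wire format: 2-byte id, 4-byte length, 1 flag byte, little-endian.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteParsing {
    public static void main(String[] args) {
        byte[] raw = {0x34, 0x12, (byte) 0xE8, 0x03, 0x00, 0x00, (byte) 0x80};

        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        int id     = buf.getShort() & 0xFFFF; // mask to read as unsigned
        int length = buf.getInt();
        int flags  = buf.get() & 0xFF;

        System.out.printf("id=0x%04X length=%d flags=0x%02X%n", id, length, flags);
    }
}
```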

Still don't like java. Just haven't used it enough, I guess.
 

GasBandit

Staff member
oof. I recall dealing with java and doing byte processing. Byte array support (well, anything embedded - byte or bit level) is really terrible in that language. I still used 'em, but it was tempting to use the strings. Fortunately I wasn't confident they would work, so stuck with the byte arrays.

Still don't like java. Just haven't used it enough, I guess.
They insisted on teaching us J++ instead of C++ when I went to A&M. I'm fairly confident this is what derailed my youthful fantasies of being a game designer.
 
They insisted on teaching us J++ instead of C++ when I went to A&M. I'm fairly confident this is what derailed my youthful fantasies of being a game designer.
Yeah, I think teaching java as a standard language is/was a mistake. Hindsight is 20/20, though, and there was a time many moons ago when everyone was betting on Java.
 