The Tech Random Crap Thread

Intel and AMD team up to combine Intel CPU with AMD GPU on single package. This is probably the first time they've really teamed up since the days of the 486.

On the one hand, this may take the wind out of AMD's new "Ryzen Mobile" part and relegate them to the lowest-budget end of the laptop spectrum.
On the other hand, NVIDIA may be very upset at this news.

On the third hand, everyone is talking about this as if it were a completely unexpected thing, but I'm wondering whether this is just foreshadowing for whatever Apple's next product may be (since Apple is currently using both Intel CPUs and AMD GPUs in their products). I mean, it's not the first time that a component manufacturer (Intel) has developed a custom package just for Apple.

--Patrick
 
I think Intel and AMD understand the writing is on the wall. Apple is progressing with both their in-house processor and their in-house graphics chipset at such a furious pace that they will legitimately be on a competitive level with Intel and AMD, particularly in the mobile space.

I expect Apple to ditch Intel and AMD within a few years on their low-end computers (particularly since they recently released the ARM-specific Darwin kernel bits) and simply start using their own chips.
 

GasBandit

Staff member
Oh FFS: Intel Q3’17 ME 11.x, SPS 4.0, and TXE 3.0 Security Review Cumulative Update

Translation: their "Management Engines" have security holes. Those are the "beyond hypervisor, we pwn your system" co-processors that are on Intel chipsets.

Edit: Reporting on it, with clearer explanations of how bad this is: Critical Flaws in Intel Processors Leave Millions of PCs Vulnerable

Local access by a running process now -> you are completely pwned.
That sucks. But at least my Core i5 is 4th gen, so I'm in the clear... (It affects gen 6, 7, and 8)
 
That sucks. But at least my Core i5 is 4th gen, so I'm in the clear... (It affects gen 6, 7, and 8)
3rd gen at home. Hilariously, my work computer should be affected, but my old home one isn't!
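For reference, you can read the generation straight off the model number. A quick sketch in Python (the core_generation helper is my own illustration; the rule only holds for mainstream 4- or 5-digit Core i3/i5/i7 names):

[CODE]
import re

def core_generation(model: str) -> int:
    # Everything before the last three digits is the generation:
    # i5-3470 -> 3 (Ivy Bridge), i7-7700K -> 7 (Kaby Lake).
    # Not valid for every SKU, so treat the result as a hint.
    match = re.search(r"i[3579]-(\d{4,5})", model)
    if not match:
        raise ValueError(f"unrecognized model string: {model!r}")
    return int(match.group(1)[:-3])

assert core_generation("i5-3470") == 3   # safe from this advisory
assert core_generation("i7-7700K") == 7  # affected
[/CODE]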


Btw, anybody know a good reference for processor speeds cross-generation? I mean on actual benchmarks, not just GHz comparisons.
 
Translation: their "Management Engines" have security holes. Those are the "beyond hypervisor, we pwn your system" co-processors that are on Intel chipsets.
Still not as bad as the one revealed in May.
anybody know a good reference for processor speeds cross-generation? I mean on actual benchmarks, not just GHz comparisons.
Just go through their front door at https://www.passmark.com/

—Patrick
 
The 2500K is also not on this list. :D
That's because it's Sandy Bridge.
This advisory is for Skylake (6xxx) and newer.

FWIW, I was looking for processors recently for a potential server build, and the Sandy/Ivy Bridge processors (2xxx/3xxx series) were inexplicably selling at a premium.
Now I guess we know why.

--Patrick
 
I generally use https://www.cpubenchmark.net/

But more often for their video card stats.
Thanks Gas. I found out that my 5-year-old i5-3470 @ 3.2GHz is "not bad IMO" for performance versus an i7-7700K @ 4.2GHz, considering generational changes: 6600 vs 12000, and given that the 7700K has a ~30% higher clock speed, that isn't terrible at all. If mine were clocked the same, it'd be in the 9000 range, so only ~33% higher per-clock performance four generations later? Yes, the 8000 series is out now, so probably 50% better per-clock (I haven't looked it up yet), but still... I think I'm doing fine! I have a GTX 960 that I put in two Black Fridays ago, and that's good enough considering I only have a 1080p monitor.
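To put numbers on that, here's the back-of-the-envelope math as a sketch (the scores are the rough PassMark figures above, and per_clock_delta is just a toy helper; my "~33%" comes from rounding the scaled score up to 9000, the unrounded figure is closer to +39%):

[CODE]
def per_clock_delta(score_a, clock_a, score_b, clock_b):
    # Percent improvement of CPU B over CPU A, normalized for clock speed.
    return ((score_b / clock_b) / (score_a / clock_a) - 1) * 100

i5_3470 = (6600, 3.2)    # multi-thread score, GHz
i7_7700K = (12000, 4.2)

print(f"{per_clock_delta(*i5_3470, *i7_7700K):+.0f}% per clock")  # ~ +39%
[/CODE]

(Caveat: the 7700K also has Hyper-Threading and the 3470 doesn't, so some of that multi-thread gain is extra threads rather than pure per-core improvement.)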
 
That's because it's Sandy Bridge.
This advisory is for Skylake (6xxx) and newer.

FWIW, I was looking for processors recently for a potential server build, and the Sandy/Ivy Bridge processors (2xxx/3xxx series) were inexplicably selling at a premium.
Now I guess we know why.

--Patrick
I've got an Ivy Bridge and two Sandy Bridge processors in those spare Dells I bought from Pitt surplus. I might want to hang onto them a little longer. :)
 
Thanks Gas. I found out that my 5-year-old i5-3470 @ 3.2GHz is "not bad IMO" for performance versus an i7-7700K @ 4.2GHz, considering generational changes: 6600 vs 12000, and given that the 7700K has a ~30% higher clock speed, that isn't terrible at all. If mine were clocked the same, it'd be in the 9000 range, so only ~33% higher per-clock performance four generations later? Yes, the 8000 series is out now, so probably 50% better per-clock (I haven't looked it up yet), but still... I think I'm doing fine! I have a GTX 960 that I put in two Black Fridays ago, and that's good enough considering I only have a 1080p monitor.
Like I said over in the BYO thread, this current generation (Coffee Lake - 8xxx) is the first real generational improvement in CPU power that isn't primarily due to faster clock speed, adding more cores, etc. Because otherwise, like I said here:
Once you account for the difference in clock speed, the single-thread (i.e., gaming) performance delta between the 2011 i5-2500k Sandy Bridge and the current 2017 Kaby Lake i5-7600k is merely +9.94%.
--Patrick
 

GasBandit

Staff member
Thanks Gas. I found out that my 5-year-old i5-3470 @ 3.2GHz is "not bad IMO" for performance versus an i7-7700K @ 4.2GHz, considering generational changes: 6600 vs 12000, and given that the 7700K has a ~30% higher clock speed, that isn't terrible at all. If mine were clocked the same, it'd be in the 9000 range, so only ~33% higher per-clock performance four generations later? Yes, the 8000 series is out now, so probably 50% better per-clock (I haven't looked it up yet), but still... I think I'm doing fine! I have a GTX 960 that I put in two Black Fridays ago, and that's good enough considering I only have a 1080p monitor.
Game stuff is almost all GPU dependent anyway. The only things I've found I need a beefy CPU for would be things like video editing (rendering the videos for HFA took all 4 of my cores to 100% for 20+ minutes straight). But most games barely even cause any CPU load at all. And the 960's a decent card.
 
most games barely even cause any CPU load at all
Weelllll that is finally changing. Games are finally starting to split the load because developers are starting to assume everyone is using at least a quad-core (or dual-core with some form of SMT) processor...or because they're FINALLY abandoning support for ye olde single-thread CPUs (consumer single-socket dual-core came out all the way back in 2005!). Starcraft II, for instance, is notorious for eating up CPU because it has to manage pathing and AI for potentially thousands of individual units.

But yes, developers these days tend to prioritize their resources towards eye candy, which means that GPUs bear the brunt of today's gaming.
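To picture that split, here's a toy sketch of fanning per-unit AI/pathing updates across a worker pool (everything here is made up for illustration; real engines do this with native job systems, and CPython's GIL means plain threads won't truly parallelize Python-level work):

[CODE]
from concurrent.futures import ThreadPoolExecutor
import os

def update_unit(unit_id):
    # Stand-in for per-unit pathing/AI work.
    return unit_id

units = range(10_000)

# One worker per hardware thread: the "assume at least quad-core" bet.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = list(pool.map(update_unit, units))
[/CODE]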

--Patrick
 

GasBandit

Staff member
Weelllll that is finally changing. Games are finally starting to split the load because developers are starting to assume everyone is using at least a quad-core (or dual-core with some form of SMT) processor...or because they're FINALLY abandoning support for ye olde single-thread CPUs (consumer single-socket dual-core came out all the way back in 2005!). Starcraft II, for instance, is notorious for eating up CPU because it has to manage pathing and AI for potentially thousands of individual units.

But yes, developers these days tend to prioritize their resources towards eye candy, which means that GPUs bear the brunt of today's gaming.

--Patrick
There are one or two exceptions, yes... I doubt Starcraft is all that taxing, TBH, but Space Engineers is one of the exceptions where a powerful CPU will definitely be helpful - it does all the physics calculations for the bajillion blocks/parts used/in motion/colliding at any given time on the CPU. Last I played it, though, it wasn't too good at multithreading. That may have changed.

But for 95% of games out there, the GPU (and RAM) are all that matters.
 
I doubt Starcraft is all that taxing, TBH
Starcraft Two.

(embedded video benchmarks comparing the two CPUs)

For reference, that first processor is the same one I'm using (though I don't have mine overclocked), a 6-core processor without SMT (6 threads total) that came out in 2010.
The second one is a quad-core Intel (that DOES have SMT, so 8 threads total) that came out just three years later AND is clocked 16% faster.

Sure wish he'd included a CPU usage display of some kind.
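Side note: if you want to check the cores-vs-threads split on your own box, a quick sketch using the third-party psutil package:

[CODE]
import psutil  # pip install psutil

cores = psutil.cpu_count(logical=False)   # physical cores
threads = psutil.cpu_count(logical=True)  # hardware threads
print(f"{cores} cores / {threads} threads")
# A 6-core chip without SMT prints "6 cores / 6 threads";
# a quad-core with SMT prints "4 cores / 8 threads".
[/CODE]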

--Patrick
 
There are one or two exceptions, yes... I doubt Starcraft is all that taxing, TBH, but Space Engineers is one of the exceptions where a powerful CPU will definitely be helpful - it does all the physics calculations for the bajillion blocks/parts used/in motion/colliding at any given time on the CPU. Last I played it, though, it wasn't too good at multithreading. That may have changed.

But for 95% of games out there, the GPU (and RAM) are all that matters.
I'm going to have to play around with this and see if I can get on the SE team... ;)

(Space Engineers is built in .NET, hence the search for an OpenCL .NET bindings library)
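Here's roughly what offloading a bulk physics step to the GPU looks like with OpenCL. I'm sketching it with Python's pyopencl rather than a .NET binding, and the integrate kernel is a made-up toy, not anything from Space Engineers:

[CODE]
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void integrate(__global float *pos,
                        __global const float *vel,
                        const float dt) {
    int i = get_global_id(0);
    pos[i] += vel[i] * dt;   /* one Euler step per element, in parallel */
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
prg = cl.Program(ctx, KERNEL).build()

n = 1_000_000
pos = np.random.rand(n).astype(np.float32)
vel = np.random.rand(n).astype(np.float32)
dt = np.float32(1.0 / 60.0)

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=vel)

prg.integrate(queue, (n,), None, pos_buf, vel_buf, dt)
cl.enqueue_copy(queue, pos, pos_buf)  # read the results back to the host
[/CODE]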
 
MS Patenting the obvious: Microsoft’s Edge browser may soon hide your adult browsing history automatically (patent)

Basically: go into "private browsing" mode upon detection of a certain set of websites. One big problem: Firefox's Containers feature has had this for most of the year (at least). Multi-Account Containers Also the help page: Containers

This add-on (it has support in FF itself to make it work as well) keeps cookies and such separated by container, and you can configure it to always open certain websites in certain containers. So it's basically exactly this patent. And considering that the patent was filed in June of this year, and this add-on pre-dates it by months (easily), methinks MS is screwed. It only went out for wide testing in the last few months, but the GitHub repo has records going back at LEAST 11 months, and it's been publicly accessible prior to the June date. Hopefully they won't try to be assholes with it, but this patent should be rejected.
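For what it's worth, the "invention" boils down to a site-to-container lookup. Something like this sketch (the rule table and helper are purely hypothetical):

[CODE]
# Hypothetical sketch of the routing rule both the add-on and the patent
# describe: decide which container (or private mode) a site opens in.
CONTAINER_RULES = {
    "bank.example.com": "Banking",
    "adult.example.com": "Private",  # the auto-private behavior MS filed on
}

def container_for(host: str, default: str = "Default") -> str:
    # Match the host or any parent domain against the rule table.
    parts = host.split(".")
    for i in range(len(parts) - 1):
        candidate = ".".join(parts[i:])
        if candidate in CONTAINER_RULES:
            return CONTAINER_RULES[candidate]
    return default

assert container_for("adult.example.com") == "Private"
assert container_for("news.example.org") == "Default"
[/CODE]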
 
I'm sure this has already been discussed on here but I can't find it, so I'm asking again :p
I'm looking for a reliable, free MOBILE ad blocker. I use ABP on my desktop, but I don't know if their mobile offering is equally good. I've gotten sick and tired of the 15-second ad videos before and after each YouTube video. I can live with an ad every 15 minutes, but when listening to a list of short snippets of music, losing almost as much time to ads as to actual music is just too much.
Anyone have a good suggestion?
 
I'm sure this has already been discussed on here but I can't find it, so I'm asking again :p
I'm looking for a reliable, free MOBILE ad blocker. I use ABP on my desktop, but I don't know if their mobile offering is equally good. I've gotten sick and tired of the 15-second ad videos before and after each YouTube video. I can live with an ad every 15 minutes, but when listening to a list of short snippets of music, losing almost as much time to ads as to actual music is just too much.
Anyone have a good suggestion?
Just to be clear, you want the ads removed from inside the YouTube app, or inside the browser's youtube player?
 
Just to be clear, you want the ads removed from inside the YouTube app, or inside the browser's youtube player?
I currently use the YT app, but I used to use it in the browser. Either's fine, probably easier if I go through the browser.
 

GasBandit

Staff member
Just so you know, there's no real way to accomplish the ad blocking that a full PC browser does without either using the Adblock Browser itself instead of your phone's regular browser, or rooting your phone and installing a third-party blocking app. I've got the Adblock Browser, and for the most part it's okay, but it has some other issues which always lead to me going back to Chrome. There's also another kind of ad blocker that sets itself up as a VPN, but I've gotten mixed results with that.
 
It’s really hard to do in-browser blocking with a mobile browser because you have less control over scripted content. Usually the best way is via VPN-based blockers, but then that drains your battery faster.
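Those VPN-style blockers mostly work by answering DNS themselves and null-routing anything on a blocklist. A minimal sketch of that decision (the hostnames and resolve_upstream are placeholders):

[CODE]
# Hypothetical blocklist entries; real blockers ship lists with
# hundreds of thousands of ad/tracker domains.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(hostname, resolve_upstream):
    # Null-route blocked hosts (and their subdomains); forward the rest.
    if hostname in BLOCKLIST or any(
        hostname.endswith("." + blocked) for blocked in BLOCKLIST
    ):
        return "0.0.0.0"
    return resolve_upstream(hostname)
[/CODE]

The battery drain comes from keeping the VPN tunnel up so every lookup passes through code like this.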

—Patrick
 
Mozilla vs Yahoo lawsuit: Yahoo sues Mozilla for breach of contract -- so Mozilla counter sues Yahoo

So that's a thing. It's about how Mozilla had Yahoo as their default search provider in exchange for money. Mozilla dropped Yahoo early, so Yahoo is suing for breach of contract. Mozilla is counter-suing both on technical grounds (Yahoo's performance wasn't "good enough" under the contract, according to Mozilla) and over lack of payment from Yahoo, or at least payment that wasn't on schedule.

I think regardless of any other factor, if you're not paying, the other party gets to terminate. That's one of the easier things to prove in court as breach of contract. The rest might be true, but that's the easy one to prove, and it's the "you can always get out" reason (generally, at least).
 
New vulnerability in TLS, but only for hosts using RSA encryption key exchange: https://robotattack.org/

The best part about that page though is this:
Can this attack be used against Bitcoin?
Bitcoin does not use RSA, instead it uses elliptic curve cryptography based on the curve secp256k1. Our attack cannot be directly applied to that. However if you transform a quantum key exchange to a supersingular Isogeny you can attack post-quantum RSA and thus apply our attack indirectly to secp256k1.

We believe the only way Bitcoin can defend against this is to immediately switch to Quantum Blockchains.
I give props to those researchers. That's a level of trolling that only works against a certain level of technical knowledge. Well done!
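For anyone wondering what the real bug is underneath the joke: ROBOT is a Bleichenbacher-style padding oracle, where the server leaks whether a decrypted RSA block has valid PKCS#1 v1.5 padding. A sketch of the single bit that gets leaked (illustrative only):

[CODE]
def pkcs1_v15_padding_looks_valid(block: bytes) -> bool:
    # PKCS#1 v1.5 encryption padding starts with 0x00 0x02. An attacker
    # sends many crafted ciphertexts and watches for any behavioral
    # difference (error message, timing) that reveals this one bit,
    # eventually recovering the plaintext.
    return len(block) >= 2 and block[0] == 0x00 and block[1] == 0x02
[/CODE]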
 